Entropy, as the term suggests (Greek en ‘in’ + tropē ‘a turning, a transformation’), denotes a system’s capacity to transform within itself. This is why it can be used to characterise transformations of thermal and information content and the directions in which they can proceed.
This concept, which captures the nature of change, measures order and disorder in multi-element systems. It was introduced in classical thermodynamics in the 19th century, when scientists realised that temperature arises from the motion of molecules in a medium. From there it took only a few steps to define the statistics of thermal motion, i.e. how uniform the motion is across a group of molecules and how this uniformity increases or decreases with heat transfer. Entropy measures this statistical average; understood as motional disorder, it decreases when the system releases heat.
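To make this concrete (the formulas below are the standard Clausius and Boltzmann definitions, not given in the text), the entropy change in a reversible heat exchange and its statistical counterpart are

\[
dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad S = k_B \ln W,
\]

where $T$ is the absolute temperature, $\delta Q_{\mathrm{rev}}$ the reversibly exchanged heat, $k_B$ Boltzmann’s constant, and $W$ the number of microscopic arrangements (microstates) compatible with the macroscopic state. Releasing heat means $\delta Q_{\mathrm{rev}} < 0$, so $S$ decreases, which is exactly the statement above.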
Later on we will see that the concept can be generalised to other systems as well. It thus plays a role in biology and, more importantly from our perspective, in Shannon’s information theory, since information can be received reliably only if it is not noisy (disordered).
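For reference (a standard formula, not stated here), Shannon’s entropy of a source emitting symbols with probabilities $p_1, \dots, p_n$ is

\[
H = -\sum_{i=1}^{n} p_i \log_2 p_i \quad \text{bits per symbol},
\]

so a perfectly predictable (fully ordered) source has $H = 0$, while a uniformly random (maximally disordered) one attains the maximum $H = \log_2 n$.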