Entropy

Introduced into classical thermodynamics by Rudolf J. Clausius (1822-1888), the term refers to a state of disorder or to a quantitative measure of it. According to the second law of thermodynamics, closed systems evolve spontaneously toward a state of maximum entropy or disorder. In such a disorganised state, energy is evenly distributed throughout the system, with the consequence that the system displays no structural differentiation. Living organisms do not evolve toward a state of disorder, but rather toward states of increasing structural differentiation and complexity. They escape the end state of maximum disorder dictated by the second law of thermodynamics because they are open systems that exchange energy, information and matter with their environments, and can thus constrain their entropy production. Formally speaking, the entropy of a system is a measure of the unavailability of its internal energy to do work in a cyclical process. Entropy (S) can be measured by the Shannon and Weaver information equation (viz., S = -Σpᵢ log₂ pᵢ, where pᵢ is the probability of a particular state). For a nominal scale, the equation is a summarising measure of probabilistic uncertainty, analogous to the standard deviation for an interval scale.
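
For illustration, the summation can be evaluated directly. The sketch below (a hypothetical helper, not part of the source entry) computes S for a nominal-scale distribution of state probabilities and shows that uncertainty is greatest when probability is spread evenly over the states and zero when one state is certain.

```python
# Minimal sketch (illustrative, not from the source): Shannon entropy of a
# nominal-scale probability distribution, S = -sum(p_i * log2(p_i)).
from math import log2

def shannon_entropy(probabilities):
    """Return S in bits, skipping zero-probability states (0 * log 0 = 0)."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# Maximum uncertainty: probability evenly distributed over four states.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
# No uncertainty: one state is certain.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0 bits
```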

See Classical thermodynamics, Closed system, Complexity, Information, Open system, Order, Organization, Second law of thermodynamics