
Entropy

The entropy of an object is a measure of the amount of energy that is unavailable to do work, expressed as the system's thermal energy per unit temperature that cannot be put to useful work. It is also a measure of the number of possible arrangements the atoms in a system can have. In this sense, entropy is a measure of uncertainty or randomness: the higher the entropy of an object, the more uncertain we are about the states of the atoms making up that object, because there are more possible arrangements for them to be in.
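A standard way to make the "number of possible arrangements" idea precise is Boltzmann's formula, a textbook result not stated above:

S = k_B \ln W

Here S is the entropy, k_B is Boltzmann's constant (about 1.38 × 10⁻²³ joules per kelvin), and W is the number of microscopic arrangements (microstates) consistent with the state we observe. The more arrangements there are, the larger W and the higher the entropy.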

It is an important concept in physics and chemistry, and it can also be applied to other disciplines, including cosmology and economics. The second law of thermodynamics says that it takes work to make the entropy of an object or system smaller; without work, entropy can never decrease. You could say that everything slowly drifts toward disorder (higher entropy). In physics, entropy is part of thermodynamics; in chemistry, it is a core concept in physical chemistry.
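As a small numerical sketch of that one-way tendency (the heat and temperature values here are invented for illustration and do not come from the text), consider heat flowing on its own from a hot block to a cold one. The hot block loses entropy Q/T_hot, the cold block gains Q/T_cold, and because T_cold is smaller than T_hot, the total entropy always rises:

```python
# Net entropy change when heat flows spontaneously from hot to cold.
# Temperatures in kelvin, heat in joules; values are illustrative only.

def entropy_change(q_joules: float, t_hot: float, t_cold: float) -> float:
    """Net entropy change (J/K) for heat q flowing from t_hot to t_cold.

    The hot body loses q / t_hot of entropy; the cold body gains
    q / t_cold. Since t_cold < t_hot, the net change is positive.
    """
    return q_joules / t_cold - q_joules / t_hot

# Example: 100 J of heat leaks from a 400 K block into a 300 K block.
delta_s = entropy_change(100.0, t_hot=400.0, t_cold=300.0)
print(f"Net entropy change: {delta_s:.4f} J/K")  # about +0.0833 J/K
```

Reversing the flow would make the result negative, which is exactly what the second law forbids unless work is supplied, as in a refrigerator.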

The concept of entropy provides deep insight into the direction of spontaneous change for many everyday phenomena. The word entropy came from the study of heat and energy in the period 1850 to 1900. Some very useful mathematical ideas about probability calculations emerged from the study of entropy. These ideas are now used in information theory, chemistry and other areas of study.

While temperature and pressure are easily measured, and the volume of a system is obvious, entropy cannot be observed directly: there are no entropy meters. Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the spreading of energy until it is distributed evenly. The meaning of entropy varies across fields. It can mean:

  • Information entropy, a measure of the uncertainty in, or average information carried by, messages from a source; it is central to the study of communication over noisy channels (see the sketch after this list).
  • Thermodynamic entropy, which is part of the science of heat energy; it is a measure of how organized or disorganized energy is in a system of atoms or molecules.
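
As a minimal sketch of the information-entropy idea (the probabilities are invented examples; the formula is Shannon's standard one), the entropy in bits of a set of outcome probabilities can be computed like this:

```python
import math

def shannon_entropy(probabilities: list[float]) -> float:
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit, maximal uncertainty
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.469 bits, more predictable
print(shannon_entropy([1.0]))       # certain outcome: 0.0 bits, no uncertainty
```

The fair coin is the hardest to predict and so has the highest entropy, matching the idea of entropy as a measure of uncertainty.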