By Robert Hazen, Ph.D., George Mason University
19th-century scientists discovered that entropy could be measured in terms of energy. German scientist Rudolf Julius Emmanuel Clausius devised the concept of entropy in order to quantify the natural tendency of systems to become more disordered with time.

Clausius’s Concept of Entropy
Rudolf Julius Emmanuel Clausius published the first clear statement of the two laws of thermodynamics in 1850. That original statement was not presented in any rigorous mathematical form; in it, Clausius simply noted the tendency of systems to become more disordered with time. Only gradually, over a period of many decades in the 19th century, did the concept of entropy come to be fully understood.
Clausius realized that for the second law to be quantitatively useful it demanded a new, rather abstract physical variable. He called this variable ‘entropy’—a word derived from the Greek word for transformation. He defined entropy purely in terms of heat and temperature.
Entropy, in fact, is the ratio of heat energy to temperature. This definition has very important applications in chemistry and engineering, where heat is used to do work. But for most people, this definition of entropy (heat divided by temperature) is not an intuitive concept.
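In symbols, the change in entropy when heat flows into a system can be sketched as

\[
\Delta S = \frac{Q}{T},
\]

where \(Q\) is the heat energy transferred and \(T\) is the absolute temperature at which the transfer takes place (assuming, as a simplification, that the temperature stays essentially fixed during the transfer).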
Clausius’s Study of Steam Engines

Clausius studied steam engines and observed their behavior. He realized that the ratio of heat to temperature must either remain constant or increase; it never decreases.
Stated another way, in any heat engine, the heat energy divided by the temperature of the cold reservoir is always greater than or equal to the heat energy divided by the temperature of the hot reservoir. Thus, the entropy of a closed system cannot decrease: it can stay constant, or it can increase.
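In symbols, with \(Q\) the heat passing through the engine, \(T_{\text{hot}}\) the temperature of the hot reservoir, and \(T_{\text{cold}}\) that of the cold reservoir, the observation can be sketched as

\[
\frac{Q}{T_{\text{cold}}} \ge \frac{Q}{T_{\text{hot}}},
\qquad\text{so that}\qquad
\Delta S = \frac{Q}{T_{\text{cold}}} - \frac{Q}{T_{\text{hot}}} \ge 0.
\]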
Every physical substance has a quantity of entropy, just as it has a quantity of energy. One can measure it and give it a unit. In principle, one can measure the entropy of a system by measuring its temperature and its total heat energy, although doing so is not trivial. The two laws of thermodynamics can thus be summarized in a single short statement: the total energy stays constant, while entropy tends to increase.
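That short summary can itself be written symbolically for an isolated system:

\[
\Delta E = 0 \quad (\text{the total energy stays constant}),
\qquad
\Delta S \ge 0 \quad (\text{the entropy never decreases}).
\]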
Entropy in Terms of Energy
A question that arises is: why should things become more disordered? One can try to understand this by thinking about the natural world at the scale of atoms. All matter is made of atoms, tiny particles that form chemical bonds and build up the solids, liquids, and gases one sees everywhere.
Thus, entropy can be identified in terms of the properties, particularly the kinetic energy, of individual atoms. Heat spreads out because atoms with more kinetic energy collide with atoms that have less, and eventually the kinetic energies average out. It is a physical process of vibrating atoms colliding with one another, whether in a gas, a liquid, or a solid. Concentrated heat, as a result, spreads out.
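The link between temperature and atomic motion can be made explicit. For an ideal monatomic gas (a standard textbook result, offered here as an aside rather than as part of the lecture), the average kinetic energy per atom is

\[
\bar{E}_{\text{kin}} = \tfrac{3}{2}\, k_B T,
\]

where \(k_B\) is Boltzmann's constant; so when the kinetic energies of colliding atoms average out, their temperatures equalize as well.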
Measuring the Order of Any System
It can be stated that the order of any system can be measured by the orderly arrangement of its atoms. For example, a lump of coal has a highly ordered arrangement of atoms, with a great deal of energy stored in its carbon-carbon bonds. The bonds between carbon and hydrogen, in particular, are very energetic. Burning the coal disrupts those bonds: the material becomes more disordered, and the entropy increases as that heat energy is released.
Meanwhile, one can also talk about the arrangement of atoms in a gas. Gas atoms have different temperatures and different velocities. Imagine, for example, two reservoirs of gas at different temperatures: one very hot, the other cold.
What happens if the two are mixed? The separation of the gas into two distinct populations is a more ordered state. As soon as the gases mix, the atoms of the hot gas start colliding with the colder atoms, and the temperature averages out as the average velocity of the gas atoms averages out. The entropy of the system, that is, its disorder, therefore increases.
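As a rough illustration of that averaging, here is a toy simulation sketch (not the lecture's own model; the function mix and all the numbers are invented for illustration). Randomly chosen pairs of atoms pool their kinetic energy and split it again, and the averages of the originally hot and originally cold populations drift toward one another while the total energy stays fixed:

```python
import random

def mix(energies, collisions=20000, seed=1):
    """Toy model of mixing: on each 'collision', two randomly chosen atoms
    pool their kinetic energy and split it at random, conserving the total."""
    rng = random.Random(seed)
    e = list(energies)
    for _ in range(collisions):
        i, j = rng.sample(range(len(e)), 2)
        total = e[i] + e[j]
        split = rng.random()
        e[i], e[j] = split * total, (1 - split) * total
    return e

# Two "reservoirs": 100 hot atoms (energy 10) and 100 cold atoms (energy 1).
gas = [10.0] * 100 + [1.0] * 100
mixed = mix(gas)

hot_avg = sum(mixed[:100]) / 100    # atoms that started out hot
cold_avg = sum(mixed[100:]) / 100   # atoms that started out cold
print(hot_avg, cold_avg)            # both drift toward the overall average of 5.5
print(sum(gas), sum(mixed))         # the total energy is unchanged
```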
This is a transcript from the video series The Joy of Science. Watch it now, on Wondrium.
Ludwig Boltzmann’s Concept of Entropy

The definition of entropy as a measure of the degree of disorder was placed on a firm quantitative footing in the late 19th century by the Austrian physicist Ludwig Boltzmann. Born in Vienna in 1844, Boltzmann studied at the University of Vienna, where he spent most of his professional life as a professor of theoretical physics.
Boltzmann used probability theory to demonstrate that, for any given configuration of atoms, the mathematical value of entropy is related to the number of different possible ways that configuration can be achieved. In mathematical form, entropy is proportional to the logarithm of the number of such configurations.
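In modern notation, Boltzmann's relation is usually written as

\[
S = k_B \ln W,
\]

where \(W\) is the number of microscopic arrangements consistent with a given state and \(k_B\) is the constant, now named after Boltzmann, that sets the units.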
To illustrate what Boltzmann meant, take the example of six balls numbered one through six. Three of them are yellow, and three of them are orange. Think about all the different ways to arrange these balls in a row. It turns out that there are 1 × 2 × 3 × 4 × 5 × 6 = 720 different arrangements, 720 different ways just to line up six balls.
If one goes through the math, it turns out that in exactly 36 of those arrangements the balls still form the ordered pattern of three yellow followed by three orange: the three yellow balls can be ordered among themselves in 6 ways, and the three orange balls in another 6, giving 6 × 6 = 36. That is 36 out of 720, so if one randomly threw down all six balls in a row, one would have only one chance in 20 of getting three yellow followed by three orange. All the other arrangements would mix the two colors together.
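As a quick check of this counting (a small sketch, not part of the lecture; the color-and-number labels are just hypothetical stand-ins for the six balls), one can simply enumerate every arrangement and count the ordered ones:

```python
from itertools import permutations

# Six distinct balls: numbers 1-3 are yellow, numbers 4-6 are orange.
balls = [("yellow", 1), ("yellow", 2), ("yellow", 3),
         ("orange", 4), ("orange", 5), ("orange", 6)]

arrangements = list(permutations(balls))
ordered = [a for a in arrangements
           if all(color == "yellow" for color, _ in a[:3])]  # three yellow, then three orange

print(len(arrangements))                  # 720 ways to line up six distinct balls
print(len(ordered))                       # 36 of them keep all the yellow balls first
print(len(ordered) / len(arrangements))   # 0.05, i.e. one chance in 20
```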
Common Questions about Entropy in Terms of Energy
Q: How did Clausius define entropy?
Rudolf Julius Emmanuel Clausius defined entropy purely in terms of heat and temperature.

Q: Why does heat spread out?
In the context of the kinetic energy of atoms, heat spreads out because some atoms with more kinetic energy collide with atoms that have less kinetic energy. Eventually, the kinetic energy of atoms averages out and, as a result, heat spreads out.

Q: What did Ludwig Boltzmann demonstrate about entropy?
Ludwig Boltzmann used probability theory to demonstrate that for any given configuration of atoms, the mathematical value of entropy is related to the number of different possible ways you can achieve a particular configuration.