In the argument between science and religion, the second law of thermodynamics is often cited to claim that it disproves evolution. The term entropy is a key element of this law. What is entropy? Do people who want to disprove evolution truly understand entropy? Let’s try to understand where their argument stems from, and whether it is a misconception or not.
It’s no secret that there has long been a tension in American society between religion and science. Arguments like this one are often advanced by people who believe they have found an airtight inconsistency in a scientific theory, and who claim that the inconsistency disproves science itself.
However, the argument that is put forth doesn’t lead to that conclusion. In fact, the argument itself is fatally flawed and the flaw arises from a misunderstanding of the laws of thermodynamics.
Physics Versus Evolution
The argument begins with one of the laws of thermodynamics—specifically the second one. This law says that the entropy of the universe can never decrease.
The way the argument is presented, one has to reject either evolution or physics. And since physics is the more analytic science, one can’t reject physics. And according to the second law of thermodynamics, entropy always increases. Therefore, the argument concludes, evolution is disproved.
Now, to understand how this seems like such a slam dunk against evolution, you need to understand it the way it’s usually understood—which is wrong, by the way. If you try to find a definition of entropy, you’ll find that it is considered to be roughly synonymous with the word disorder.
So, when someone says that entropy increases, they often mean that things get messier. And that just kind of makes sense.
Entropy and Disorder are not Synonymous
There are several misconceptions here that require correction. The first one is that entropy and disorder are synonymous. That’s not true.
We often think of evolution as a continuous increase in complexity. Roughly speaking, the idea is that evolution goes from less complex forms of life to more complex. Basically, the simple understanding is that bacteria became fish, which became amphibians, and then lizards, mammals, and then humans. We started with the simple and ended with the complex.
Now, if you take these two ideas—that the laws of thermodynamics require that things fall apart over time, and that evolution makes things more refined and more complicated over time—you can see where people might get the idea that the two statements are in contradiction.
Another flaw in this argument is the assumption that evolution implies an increase in complexity. It doesn’t. In fact, the most common form of life on the planet has always been single-celled organisms.
Earth and Life on Earth Are not Isolated Systems
Let’s look at another problem with how people commonly understand the second law of thermodynamics. The correct statement of the law is that “the total entropy of an isolated system can never decrease over time”.
It’s very important to emphasize the part about “isolated system”. Isolated, in this context, means no energy flows in or out. And that clearly isn’t true in the case of life.
Life constantly takes in energy; imagine what would happen to the animal life on the planet if every living thing stopped eating for six months. The world would be a very different place, with very few creatures still alive.
Taken at a larger level, there’s this giant ball of fire in the sky that keeps our planet from being a frozen, ice-covered rock. Every year, the Earth absorbs more energy than can be extracted from all of the fossil fuels and easily accessible uranium.
The bottom line is that the Earth is most definitely not an isolated environment. And, because of that simple fact, the whole claim that the second law of thermodynamics disproves evolution is simply wrong.
This is a transcript from the video series Understanding the Misconceptions of Science. Watch it now, on Wondrium.
What Is Entropy?
There’s another and subtler reason why the laws of thermodynamics are being mangled when they’re being used in this way. It’s because people are misusing the idea of entropy. Briefly, entropy really isn’t disorder in the sense that most people understand it.
It’s more correct to say that entropy is a measure of the number of ways something can exist and still look more or less the same. To try to make this more understandable, let’s talk about ten fair coins that can be heads or tails. And let’s lay them out next to one another.
There’s only a single way that all of the coins can be heads, and only a single way that all of the coins can be tails. The same is true of the specific pattern heads, tails, heads, tails, repeating across all ten coins. Each specific arrangement of the ten coins is equally likely, but because each of these patterns can be realized in only one way, they all have low entropy.
Now let’s ask ourselves, “How many ways can these coins show one head and nine tails?” Well, that’s pretty easy. The head can be the first coin, or the second coin, or the third, and so on. That means there are ten distinct configurations in which exactly one coin shows heads. So the entropy is higher for the one-head configuration than for all heads or all tails.
We can also figure out the number of different ways in which you can get two heads from ten coins. You can do it the hard way, by counting them: the two heads can be coins one and two, or one and three, or one and four, and so on, and you add them all up. Or you can use a branch of mathematics called combinatorics.
It turns out that while there is only one way to get zero heads, and ten ways to get one head, using combinatorics you can calculate that there are 45 ways to get two heads, 120 ways to get three heads, 210 ways to get four heads, and 252 ways to get five heads.
After five heads, the number of configurations goes back down, mirroring the climb: 210 ways for six heads, 45 ways for eight, and, at the end, only one way to get ten heads. The exact mathematics really isn’t important. What is important is the trend of the numbers.
There are simply more ways, more configurations of coins, to get three heads than there are to get no heads. And the most likely outcome is five heads: there are 252 ways to get five heads, compared with just one way to get none. This means that the entropy of the five-heads configuration is higher. There are simply more ways to do it.
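The coin counts above can be checked directly. Here’s a minimal Python sketch using the standard library’s math.comb, which computes the binomial coefficient, the number of ways to choose which k of the ten coins show heads:

```python
# Number of distinct coin arrangements for each possible head count.
# comb(10, k) counts the ways to choose which k of the 10 coins are heads.
from math import comb

for k in range(11):
    print(f"{k} heads: {comb(10, k)} ways")
```

This reproduces the numbers quoted above: 1, 10, 45, 120, 210, 252, and then back down symmetrically to 1, for a total of 2^10 = 1024 equally likely arrangements.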
This is what is meant by entropy in the second law of thermodynamics. An isolated system, which is to say one that energy neither flows into nor out of, tends to move toward configurations that look superficially the same but can be realized in more ways.
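One way to see this tendency is with a toy simulation (a sketch for illustration, not part of the original discussion): start the ten coins all heads, a pattern with only one arrangement, and flip a randomly chosen coin over and over. The head count quickly drifts toward five, the value with the most arrangements, and hovers near it:

```python
import random

random.seed(42)  # fixed seed so the run is repeatable
coins = [1] * 10  # 1 = heads; start in the single all-heads arrangement
head_counts = []

for _ in range(10_000):
    i = random.randrange(10)   # pick one of the ten coins at random
    coins[i] = 1 - coins[i]    # flip it
    head_counts.append(sum(coins))

# The long-run average head count sits near 5, the head count
# with the largest number of arrangements.
print(sum(head_counts) / len(head_counts))
```

Nothing forbids the system from wandering back to all heads; it’s just overwhelmingly unlikely, because only one arrangement out of 1024 looks like that.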
Increased Entropy Is Not Equal to Increased Disorder
It’s perhaps useful to try to tie this to the idea of disorder to see how people make the claim that increased entropy means increased disorder. To do that, let’s think about a college dorm room. In order for everything to be in its proper place, the clothes need to be folded and put in the proper drawers, the shoes need to be near the door, and the toothbrush needs to be in the designated holder in the bathroom. There’s only one proper configuration.
But if things aren’t in the right place, the clothes could be on the floor, in the bathroom, under the bed, on the bed, etc. The same thing is true for the shoes, the toothbrush, books, laptop, and so on. There are many, many ways to be disorganized, all of which are basically the same.
Thus, the messy room has a higher entropy than the neat one, which has only one acceptable configuration. This is the reason that some people say that entropy is a measure of disorder, but it’s not.
Entropy is a measure of the number of ways things can look like each other at the big picture level but be different at the detail level. And this misunderstanding of the meaning of the word is yet another reason why people who try to use the second law of thermodynamics to invalidate evolution are so far off.
Aside from misconceptions about the nature of evolution, they have completely neglected the fact that the Earth and the ecosphere aren’t isolated systems, and they’ve used an analogy for the word entropy rather than its precise mathematical formulation.
Common Questions about Thermodynamics, Entropy, and Evolution
Q: What does the second law of thermodynamics say?
The second law of thermodynamics says that the entropy of a system can never decrease; loosely, that things fall apart over time. However, this only holds true for an isolated system. Isolated, in this context, means no energy flows in or out.
Q: Is entropy the same as disorder?
Entropy isn’t disorder in the sense that most people understand it. Entropy is a measure of the number of ways something can exist and still look more or less the same: the number of ways things can look like each other at the big-picture level but be different at the detail level.
Q: What does entropy mean in the second law of thermodynamics?
An isolated system, in which energy neither flows in nor out, tends to move toward configurations that look superficially the same but can be realized in more ways. This is what entropy, specifically in the second law of thermodynamics, means.
Q: Does the second law of thermodynamics disprove evolution?
No. The second law applies only to isolated systems, with no inflow or outflow of energy. Earth, and life on Earth, are quite clearly not isolated systems. Hence, there’s no contradiction; the law simply doesn’t forbid evolution.