The Nature of Randomness

From the lecture series: What Are the Chances? Probability Made Clear

The goal of probability is to describe what is to be expected from randomness. But randomness can be confusing to the human mind because its expression is often quite different from what we expect it to be.

Imagine an experiment in randomness. Take a coin and flip it 200 times, and each time record whether it’s heads or tails, putting down H’s for the heads and T’s for the tails. Now, suppose you ask a person to just write down a random list of 200 H’s and T’s. You put up both lists on a blackboard, one made by actually flipping a coin, and the other made by a human. Even though they may both look like an ocean of H’s and T’s, there is a way to tell which one is truly random, and which is human-generated.

The thing to do is look for long runs where the same letter repeats. In the 200 H’s and T’s generated by actually flipping a coin, you might see at least four or five long runs: six H’s in a row here, five T’s in a row there, a lot of streaks of many in a row.
Now consider the list generated by the human being. How often will a person write four or more of the same letter in a row when they’re trying to be random? People resist doing this because it doesn’t feel random to them. They think you’ve got to alternate: H-T-H-T. In a human-generated list, you would see very few long runs of H’s or T’s.

This is a transcript from the video series What Are the Chances? Probability Made Clear.

When you flip a coin 200 times, the probability of getting at least one run of six or more H’s or T’s is roughly 96 percent, which is very likely. The probability of getting at least one run of five is 99.9 percent, which is essentially certain. You’d be very unlikely to flip a coin that many times without seeing such long runs, and a computer simulation bears this out.
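These run probabilities are easy to check empirically. Here is a minimal Python sketch (the function names are my own, not from the lecture) that estimates the chance of seeing at least one run of a given length in 200 flips:

```python
import random

def longest_run(flips):
    """Length of the longest run of identical outcomes in a sequence."""
    best = cur = 1
    for prev, nxt in zip(flips, flips[1:]):
        cur = cur + 1 if nxt == prev else 1
        best = max(best, cur)
    return best

def run_probability(run_len, n_flips=200, trials=10_000, seed=0):
    """Estimate P(at least one run of run_len or more in n_flips fair flips)."""
    rng = random.Random(seed)
    hits = sum(
        longest_run([rng.randrange(2) for _ in range(n_flips)]) >= run_len
        for _ in range(trials)
    )
    return hits / trials

print(run_probability(6))  # roughly 0.96, as claimed above
print(run_probability(5))  # roughly 0.999
```

With 10,000 simulated sequences, the estimates land close to the 96 percent and 99.9 percent figures from the lecture.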

Expectations of Randomness: An Experiment

One of the common misconceptions that a lot of people have about randomness is illustrated by the coin-flipping experiment. Let’s say that you flip a coin many times, and randomly it happened that 10 times in a row you got heads. Doesn’t it seem like the next time it’s more apt to be tails? It does to most people. The answer is that the coin doesn’t know what it’s just done. To the coin, every flip is a new flip, and after 10 heads in a row it’s just as likely to come up heads as tails, exactly as it was on the very first flip.
To demonstrate this, you can simulate the following experiment: flip a coin 11 times, and repeat that more than a million times. Obviously, you do this with a computer. To make the arithmetic easy, you run the 11-flip trial exactly 1,024,000 times, because the probability of getting 10 heads in a row is 1 in 1,024. In other words, across 1,024,000 trials of 11 flips each, you expect the first 10 flips to come up all heads about 1,000 times.

You run the computer simulation the first time, and the number of times you get 10 heads in the first simulation is 1,008: extremely close to 1,000. What happened to the 11th coin? Well, 521 times it turned out to be a head also, and 487 times it turned out to be a tail. There’s no memory. Approximately half the time heads, half the time tails.

If you do it again, the first 10 might be heads 983 times, and then the 11th flip is heads 473 times and tails 510 times. During a third experiment, 1,031 times it came out heads 10 times in a row, and of those, 502 had the next coin be heads, and 529 tails. The coin has no memory. After it’s gotten 10 heads in a row, it’s just as likely to be heads the next time as it was the first time you flipped that coin.

What Exactly Is Rare?

There is another counterintuitive aspect of probability: what is rare, and how do we view rarity in probability? Suppose you got dealt the following hand: the two of spades, the nine of spades, the jack of clubs, the eight of spades, and the five of hearts. It probably doesn’t strike you as an impressive hand, but it is. One out of 2,598,960—that’s the probability of getting that hand.

If you were dealt the ace, king, queen, jack, ten of spades—a royal flush—what’s the probability of getting this royal flush in spades? Exactly the same—1 out of 2,598,960—yet you would write home to your mother about this hand for sure. Your previous hand was just an average hand, and yet in your whole life of playing cards, you will probably never get that hand again, because its probability is almost zero—1 out of 2,598,960. This is one of the counterintuitive concepts of probability: that rare events happen all the time, but you may not recognize them as significant.
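The figure 2,598,960 is simply the number of ways to choose 5 cards from a 52-card deck, which Python’s standard library can confirm directly:

```python
import math

# Number of distinct 5-card hands from a 52-card deck: C(52, 5)
hands = math.comb(52, 5)
print(hands)     # 2598960
print(1 / hands) # probability of being dealt any one specific hand
```

Every specific five-card hand, impressive or not, has this same 1-in-2,598,960 probability.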

Rare events happen by chance alone. The most common rare event mentioned in the newspapers every day is the lottery. The probability of winning the Powerball, the big multistate lottery, is approximately 1 out of 146,000,000. That chance is so remote you’d think it would never happen; but it happens regularly. Why? Because a lot of people try. A lot of people buy random numbers, and occasionally some of them win. If you try something rare often enough, it will eventually come to pass.
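The effect of many people trying can be made concrete: if n independent tickets are in play, the chance that at least one of them wins is 1 − (1 − p)^n. A quick sketch, where the 100 million ticket count is an illustrative assumption rather than actual sales data:

```python
p = 1 / 146_000_000  # per-ticket probability of winning (Powerball figure above)
n = 100_000_000      # assumed number of independent tickets in play

# Chance that at least one of the n tickets wins
at_least_one_winner = 1 - (1 - p) ** n
print(round(at_least_one_winner, 3))  # about 0.496
```

Even though each ticket is a 1-in-146-million long shot, a single large drawing has nearly even odds of producing a winner somewhere.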

This concept, that rare things will happen if you repeat them often enough and look for them in enough places, was encapsulated in an observation made by the astronomer Sir Arthur Eddington in 1929, while describing some features of the second law of thermodynamics. He wrote the following:

If I let my fingers wander idly over the keys of a typewriter it might happen that my screed made an intelligible sentence. If an army of monkeys were strumming on typewriters they might write all the books in the British Museum. The chance of their doing so is decidedly more favourable than the chance of the molecules returning to one half of the vessel.

The Bible Code Hoax

However, you can find patterns in random writing, and in fact, an enterprising author made a lot of money a few years ago when he wrote The Bible Code. What the author did was take the Hebrew text of the Bible and look for words spelled out by reading every nth letter, skipping a fixed number of letters each time. One example was “Atomic holocaust Japan 1945.” He said that this was an example of how the Bible showed the future.

The truth is that this is just a matter of probability. If you take all possible sequences of different lengths, you can, by randomness alone, find surprising things. To demonstrate it, people debunking this analysis found patterns in War and Peace and so on. This is another challenging part of probability, namely that if you look for rare things but you have many places to look, you’ll tend to find them.
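The skip-letter search itself is simple to reproduce. Below is a toy Python version of an equidistant-letter search (my own sketch, not the book’s actual method); even a tiny contrived string turns out to “contain” a word that was never written contiguously:

```python
def skip_sequences(text, word, max_skip=50):
    """Find (start, skip) pairs where `word` appears by reading every
    `skip`-th letter of `text`. A toy version of the equidistant
    letter sequence search behind Bible-code claims."""
    letters = [c for c in text.lower() if c.isalpha()]
    hits = []
    for skip in range(1, max_skip + 1):
        for start in range(len(letters)):
            candidate = letters[start::skip][:len(word)]
            if "".join(candidate) == word:
                hits.append((start, skip))
    return hits

# "tot" is nowhere in this string, yet the skip search finds it
print(skip_sequences("xxtxxoxxt", "tot"))  # [(2, 3)]
```

Run the same search over any long text and such hidden words multiply, which is exactly why the debunkers found “prophecies” in War and Peace as well.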

These are some of the challenges of looking at and asking what is random in the world.

Common Questions About Randomness and Probability

Q: What is randomness in probability?

In probability, randomness refers to events that occur in no apparent order and are not causally related.

Q: What is true randomness?

True randomness means that something unfolds purely by chance rather than intentionality, free from human interference.

Q: What are random number generators used for?

Random number generators are used for cryptography, gambling, statistical sampling, and computer simulation.

Q: Are winning lottery numbers really random?

Many people claim that they can “outsmart” the lottery or predict winning combinations. People even sell tools to this end, but these tools are most likely a waste of money. To the best of anyone’s knowledge, the process of choosing the winning lottery numbers operates on the principle of randomness.