By Professor Edward B. Burger, PhD, Southwestern University
How do you multiply using Roman numerals? How would you write the number 10,030 without using zero? A compact, place-based (or positional) number system with a symbol for zero opens the floodgates for arithmetic calculations and the discovery of new numbers.

With only 10 symbols, we have the machinery to describe new numbers that grow beyond our imagination. Here, we’ll explore the origins of zero and the development of our modern decimal system. With a powerful positional number system in place, humankind was finally equipped with the tools necessary to begin the development of modern mathematics.
There was, however, a downside to the ancient additive systems. Most of them required the repetition of symbols. For example, the Roman numeral XXIII equals 23: one adds up the two Xs (10 each) and the three Is (1 each) to get 23. The Babylonians used dovetails and nails, which they would likewise add up. Although computation with additive systems was fast using tools such as the abacus, those systems required increasingly long strings of symbols to denote larger and larger numbers, and this was a problem in practice.
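To make the contrast concrete, here is a minimal sketch in Python (the function names additive_value and positional_value are illustrative, not from the lecture) comparing an additive reading of a Roman numeral, where symbol values are simply summed, with a positional reading of decimal digits, where each digit's contribution depends on the place it occupies.

```python
# Additive reading of a Roman numeral: the value is simply the sum of its symbols.
# (Subtractive forms such as IV are a later refinement and are ignored in this sketch.)
ROMAN_VALUES = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

def additive_value(numeral: str) -> int:
    return sum(ROMAN_VALUES[symbol] for symbol in numeral)

# Positional reading of decimal digits: each digit is weighted by a power of ten
# determined by its place, which is why a placeholder for "nothing here" is essential.
def positional_value(digits: str) -> int:
    total = 0
    for digit in digits:
        total = total * 10 + int(digit)
    return total

print(additive_value("XXIII"))    # 10 + 10 + 1 + 1 + 1 = 23
print(positional_value("10030"))  # the zeros keep the 1 and the 3 in their proper places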
This is a transcript from the video series Zero to Infinity: A History of Numbers.
Slow Progress for Heaps of Numbers

Additive systems made it difficult to tackle more arithmetically complicated questions and thus slowed the progress of the study of numbers. To move to what we call a positional system, a new number was needed. This inspired a philosophical question: How many items do you see in an empty box? Is your answer a number? That is the question of zero. In the Rhind Papyrus from 1650 BCE, the scribe Ahmes referred to numbers as “heaps.” This tradition continued through the Pythagoreans, who in the 6th century BCE viewed numbers as “a combination or heaping of units.”
Even Aristotle defined a number as an accumulation or heap. The word “three,” moreover, derives from the Anglo-Saxon word throp, again meaning “pile” or “heap.” Because we can’t have a heap of zero objects (with zero objects, there would be no heaping at all), zero was not viewed as a number. The notion of zero as a quantity made no sense to people who thought of numbers strictly in terms of quantity, and this lack of zero caused many challenges. A careless Sumerian scribe could create ambiguities because, in cuneiform, different spacing between symbols could represent different numbers. The Egyptian system, on the other hand, did not require a placeholder like zero, but its additive notation was cumbersome. As a result, over the 2,000 years of the Egyptian numeral system, little progress was made in arithmetic or, more generally, in mathematics. It is interesting to see how notation drives our understanding, our intuition, and our further exploration of numbers.
An Empty Placeholder Appears
Zero first appeared as an empty placeholder rather than as a number. The Babylonians had a symbol for zero by 300 BCE. It was a placeholder rather than a number because they still thought of numbers as heapings, but they needed a way to distinguish between numbers whose written forms would otherwise look alike. The Mayans, too, had an eye-shaped symbol for zero, which they used only as a placeholder. The evolution of the symbol for zero is difficult to chart. The modern symbol “0” may have arisen from the sand tables used for calculation, on which pebbles were placed and moved back and forth for addition and subtraction. When a pebble was removed, it left an indentation, or dimple, in the sand, which resembles the “0” we write today. Calculations performed on sand tables may even have led to the development of place-based number systems.
The Birth of the Zero

Later, in the 2nd century CE, Ptolemy used the Greek letter omicron, which looks like an “O,” to denote “nothing.” Here we recognize the symbol for zero, the circular “0” we still write today. Ptolemy did not view it as a number, merely as the idea of nothing, but we can see, again, that these ideas were slowly coming together. Zero as a number most likely first emerged in India.
By the 7th century, the Indian astronomer Brahmagupta offered a treatment of negative numbers and understood zero as a number, not just as a placeholder. He studied 0 divided by 0 and 1 divided by 0; he concluded, erroneously, that 0 divided by 0 equals 0, and he did not know what to conclude about 1 divided by 0.

Here again, we see a couple of things. First, we know today that we cannot divide by 0: as we learn in school, dividing by 0 does not yield a number, so we leave the realm of numbers altogether. But we also see a wonderful development. Brahmagupta, this important, great mind, had made a mistake, and that is something to be celebrated rather than a cause for embarrassment. While he didn’t get it quite right, his contributions were enormous. Finally, humankind had expanded its view of number to include and embrace zero.
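As a small aside, this convention is built into modern programming languages, which refuse to assign any number to a division by zero. A minimal, illustrative Python sketch:

```python
# Dividing by zero does not yield a number; Python signals this by raising an error,
# both for 1 / 0 and for the 0 / 0 case that Brahmagupta tried to settle.
for numerator in (1, 0):
    try:
        print(numerator / 0)
    except ZeroDivisionError:
        print(f"{numerator} / 0 is left undefined")
```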
From Empty to Nothing to Zero
A few words about this “nothing” number in terms of language: From the 6th to the 8th centuries, Sanskrit used “śūnya,” meaning “empty,” for zero as we think of it. By the 9th century, Arabic had “ṣifr.” In 13th-century Latin it became “zephirum,” in 14th-century Italian “zefiro,” and by the 15th century, English had “zero.” Here we can see the slow evolution of the word.
Because of zero’s power in computation, some viewed it as mysterious and nearly magical. As a result, the word zero has the same origins as another word that means “a hidden or mysterious code,” and that word, of course, is “cipher.” We can see that “cipher” actually came from the mysterious qualities that zero possessed in the eyes of our ancestors.
Common Questions About the Number Zero
Q: Who invented the number zero?
While zero was used as a placeholder for millennia beforehand, zero as a number is generally credited to Brahmagupta around the year 628, though this attribution remains partly a matter of scholarly conjecture.
Q: Is zero a number?
Zero certainly behaves as a number: it sits on the number line between 1 and -1, and it belongs to the whole numbers (and, in many conventions, to the natural numbers). However, because counting numbers answer the question “how many?” of something present, and zero counts nothing, traditions that defined number purely in terms of counting did not regard zero as a number at all.
Q: Is zero positive or negative?
Zero is neither positive nor negative: unlike 1, which is greater than zero, or -1, which is less than zero, zero is neither greater than nor less than itself. In the language of sets, zero belongs to the non-negative numbers but not to the positive numbers. Zero is unique.
Q: What value does zero hold?
Zero represents the absence of quantity rather than an amount of something. It is best thought of both as a placeholder in positional notation and as a number in its own right: a tool for extending mathematics.