Several centuries ago Shakespeare titled a play Much Ado About Nothing. If I gave these columns titles, that’s what I’d call this one–not because I think I write as well as Shakespeare, but because that’s what this column is about: nothing.
Nothing is very important. Um, what I mean is, the concept of nothing is very important: specifically, nothing as embodied in the simple circular form of the zero.
The zero was invented about 1500 years ago by a group of Indian astronomers. When its use became widespread in western mathematics in the 15th century, it had an impact comparable to that of the computer in this century–and, in fact, without zero, computers would be impossible.
The Indian astronomers invented zero because they were having a problem representing large numbers easily. On the surface it sounds unlikely that a notation representing nothing could be useful in writing numbers representing lots of something, but that’s the magic of the zero.
To understand how it works, consider a zero-less system of notation: Roman numerals. You remember: I stands for one, V for five, X for 10, L for 50, C for 100, D for 500 and M for 1000. If you place a smaller unit to the right of a larger unit, you add the two together; if the smaller unit goes to the left of the larger, you subtract the smaller from the larger. Thus, VI stands for six and IV for four. MCMLIX translates into a very important year, 1959–the year I was born.
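The additive and subtractive rules just described are mechanical enough to sketch in code. Here is a minimal Python sketch (the function name is my own); listing the subtractive pairs like CM and IV explicitly, largest first, captures the "smaller unit to the left means subtract" rule:

```python
def to_roman(n):
    """Convert a positive integer to Roman numerals."""
    # Ordered largest-first; subtractive forms (CM, CD, XC, XL, IX, IV)
    # are listed as units of their own so the greedy loop emits them.
    pairs = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
             (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
             (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
    out = []
    for value, symbol in pairs:
        while n >= value:  # take the largest unit that still fits
            out.append(symbol)
            n -= value
    return "".join(out)

print(to_roman(1959))  # MCMLIX
```

Running it on 1959 reproduces MCMLIX: an M (1000), a CM (900), an L (50), and an IX (9).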
The higher you go, the more symbols you need. And since there’s no end to numbers, eventually you’ll need an infinite number of symbols. This is not exactly practical.
The zero changed all this. Thanks to it, we can express very large numbers using only 10 symbols. That’s because the position of the symbols is just as important as the symbols themselves. A seven can mean seven, 70, 700, 7000, and so on. But study those numerals. Without the zero, how would you tell them apart? The places with nothing in them would just be blank spaces.
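This "position is everything" idea can be made concrete: reading a base-10 numeral is just repeatedly shifting left one place and adding the next digit. A small Python sketch (the function name is my own) shows how the same digit 7 yields seven, 70, or 700 depending on how many zeros pad out the places after it:

```python
def digits_to_number(digits):
    """Interpret a list of base-10 digits, most significant digit first."""
    n = 0
    for d in digits:
        n = n * 10 + d  # shift everything so far one place left, add new digit
    return n

print(digits_to_number([7]))        # 7
print(digits_to_number([7, 0]))     # 70
print(digits_to_number([7, 0, 0]))  # 700
```

Drop the zeros from those lists and all three collapse to the same value, seven; the zero is what holds the 7 in its place.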
The Babylonians had a place-value system similar to ours, and for centuries that’s what they did–left blank spaces for empty places. Trouble was, someone copying figures could easily miss that blank space, changing the value of the number. Eventually the Babylonians started using a dot to indicate empty places, but only within a numeral; places to the right were still just left empty. So is that number 25, or 2500?
The zero-inventing Indian astronomers knew about the Babylonian system. They also knew and used (as practically everyone did) the counting board, or abacus, whose rows correspond to numerical places. Their great invention was a notation to indicate an empty column on the abacus–at first a small dot, later a circle or cross.
This doesn’t sound particularly earth-shaking to us. But it enabled the astronomers to write down large (one might say astronomical) numbers much more easily, and also enabled them to perform complex calculations that were impossible on the abacus.
The system spread to Europe as a result of the rise of the Islamic empire, and thanks particularly to The Book of Addition and Subtraction According to the Hindu Calculation by the Islamic scientist Muhammad ibn Musa al-Khwarizmi, which appeared about 820.
The Arabic influence in spreading this form of calculation is reflected in the fact that we still refer to our system of notation as Arabic numerals. (By the way, al-Khwarizmi’s name was later corrupted into “algorithm,” which today is what we call any step-by-step approach to problem-solving. Computer programs, for example, are algorithms.)
The new system wasn’t exactly welcomed with open arms. In fact, it took centuries to displace Roman numerals. Zero, especially, was viewed with great suspicion. Sometimes it was nothing at all. Sometimes it multiplied the value of other numbers by 10. Sounds ominously magical and mysterious, doesn’t it? Could it be the work of . . . SATAN?
As late as the 15th century, a French writer called the zero “a sign which creates confusion and difficulties,” and another French writer scoffed, “Just as the rag doll wanted to be an eagle, the donkey a lion, and the monkey a queen, the cifra (zero) put on airs and pretended to be a digit.” One insult of the day was to call someone an “algorism-cipher.”
But, as is often the case, commercial interests were the driving force in acceptance of the new system. Merchants started using the new numbers because they were so much easier. There were still hold-outs who insisted the only proper way to calculate was with an abacus, but by the beginning of the 16th century, the battle was over. A couple of hundred years later the abacus was so completely forgotten that, when one of Napoleon’s generals brought one back from Russia (where they are still in common usage even today in remote areas), it was viewed as a great oddity.
All this fuss over, literally, nothing? It seems hard to believe. But mathematicians will tell you zero is a very special number indeed. If you add it to another number, nothing changes. If you subtract it from another number, nothing changes. If you multiply a number by it, you get zero. If you divide a number by zero, you get . . . well, if you’re using a computer, you get an error message. Your brain has a similar reaction. It goes, “Huh?” Zero is the only number you cannot divide by.
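You can check every one of those properties directly; in Python, for instance, dividing by zero raises exactly the kind of error message described above:

```python
n = 42
assert n + 0 == n   # adding zero changes nothing
assert n - 0 == n   # subtracting zero changes nothing
assert n * 0 == 0   # multiplying by zero gives zero

try:
    n / 0
except ZeroDivisionError as err:
    print("error:", err)  # the computer's version of "Huh?"
```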
It’s also the only number that is neither positive nor negative; but you have to have it for the “number line,” the line of numbers that stretches to infinity in both positive and negative directions. Without zero, that line has an unbridgeable gap in it.
But for most of us, the power of a zero is best summed up with a simple question: “Which would you rather have–$10 or $100?”
What’s the difference?
Just a zero–nothing, really.