Numbers

“1, 2, 3, 4! What are we all counting for?
“5, 6, 7, 8! Ain’t our number system great?”

All right, so maybe you won’t hear thousands of people chanting it at a street demonstration–it’s still an interesting question. (The question in the first line, that is; the question in the second is rhetorical.)

What are we all counting for? And what are these things called numbers that we’re all counting with?

Numbers are a system of symbols used to express quantities. Throw in a few rules about how to use them, and you’ve got a special language that makes all kinds of things possible–things such as counting (“997 sheep, 998 sheep, 999 sheep . . . “), comparing amounts (“My bank account’s bigger than yours is!”), performing calculations (“If I buy that Ferrari, I won’t have enough for the yacht . . .”), determining order (“Quit shoving, I was here first!”), making measurements (“You can’t fit a 44-inch waist in a 34-inch waistband!”), representing value (“That’ll be $1.8 billion, please–cash or credit card?”), setting limits (“You have three days to make up your mind–it’s me or The Sports Channel!”), coding information (“There are too many zeroes on this price tag!”) and transmitting data (“Beep-beep, beep-beep-beep…”).

See how important numbers are to us? Almost everything we do to keep our complex modern civilization running depends on effectively performing one or more of these functions. To take a very mundane example, you’d have a tough time balancing your chequebook if the only numbers you had to work with were “one,” “two” and “many,” even though that’s all our distant ancestors had. (Of course, a dollar went a lot further in Cro-Magnon days…)

The most primitive form of number system is the tally, where there’s only one symbol, used over and over again, once for each thing being counted. Eventually, though, someone got the bright idea of using a series of symbols, each representing the addition of one more to the tally, and numbers were born.

However, a number system with a unique symbol for each quantity quickly gets out of hand. The way to simplify it is to use a limited set of symbols and use them to indicate, not just single units, but also agreed-upon multiples of those units. We use the “decimal” or “base 10” system, which means that whenever we get 10 units of something, we lump them all together and count the lumps, too. Then, when we get 10 of those lumps, we lump them all together into an even bigger lump, and so on. To keep this all straight, we use “positional base notation,” which is a fancy way of saying the position of the digit indicates which size of lump it is counting: from right to left, ones, 10s, 100s, 1000s, etc.
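For anyone who wants the bookkeeping spelled out, here’s a rough sketch in Python of how positional notation turns a string of digits into a quantity (the function name and the example number are purely illustrative, not anything sacred):

```python
def decimal_value(digits):
    """Value of a base-10 numeral given as a string of digits."""
    total = 0
    # The rightmost digit counts ones, the next counts 10s, then 100s, and so on.
    for position, digit in enumerate(reversed(digits)):
        total += int(digit) * 10 ** position
    return total

# "4237" means 7 ones + 3 tens + 2 hundreds + 4 thousands
print(decimal_value("4237"))  # 4237
```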

The ancient Egyptians, Chinese, Cretans, Greeks, Hebrews and Romans all had versions of the decimal system, but the bulk of the credit for it goes to the Hindu-Arabic mathematicians of the eighth to 11th centuries A.D. No doubt it was born because we have 10 fingers, but other cultures have done quite well using different bases altogether.

For example, the Mayans used base 20, but then modified it to conform to their 360-day calendar. Instead of counting ones, 10s, 100s, etc., they counted ones, 20s, 360s, 7,200s, etc. (An unmodified base 20 system would count ones, 20s, 400s, 8,000s, and so on.)
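To make the comparison concrete, here’s a small sketch that evaluates the same three digits under pure base 20 and under the calendar-modified place values (the digit list is an arbitrary example of my own, not an actual Mayan inscription):

```python
def value(digits, place_values):
    """Multiply each digit by its place value (smallest place last) and add them up."""
    return sum(d * p for d, p in zip(reversed(digits), place_values))

pure_base_20 = [1, 20, 400, 8000]   # ones, 20s, 400s, 8,000s
mayan_style  = [1, 20, 360, 7200]   # ones, 20s, 360s, 7,200s

digits = [2, 3, 5]  # most-significant digit first
print(value(digits, pure_base_20))  # 2*400 + 3*20 + 5*1 = 865
print(value(digits, mayan_style))   # 2*360 + 3*20 + 5*1 = 785
```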

The Babylonians, for whatever reason, used base 60. They had positional base notation, which should have simplified things, but instead of using the 60 symbols you’d think they’d have to, they used only two, and therefore, as one reference book put it, their system “suffered from ambiguities.” (I should think so! Apparently you had to consider the context to figure out exactly which number was meant.)

But then, the decimal system suffered from plenty of ambiguity of its own until the revolutionary notion of zero, which signifies nothing but is all-important as a placeholder, came along, again courtesy of the Hindu-Arabic mathematicians. Modern decimal notation, in turn, is generally credited to Leonardo of Pisa, better known as Fibonacci, who introduced it to Europe in his 1202 book Liber abaci (The Book of Calculation).

The importance of positional base notation, especially with the inclusion of the zero, can’t be overstated. Before it was fully developed, operations such as multiplication and division were for experts only. But by the 1100s, algorists, using the new notation, were successfully challenging abacus users in tests of mathematical speed–and they had a written record of their calculations when they were done.

Such advanced calculation skills were critical in fields like astronomy, manufacturing and navigation, and led directly to logarithms, slide rules, mechanical calculators–and, eventually, computers.

That being the case, it’s interesting that computers themselves don’t use the decimal system. They find it much easier to use the binary, or base 2, system, where there are only two symbols, 1 and 0 (“on” and “off” in electronic terms), and instead of counting ones, 10s, 100s, and so on, you count ones, twos, fours, eights, 16s, etc. (Of course, they translate their binary code into decimal for us slow-witted humans.)
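Here’s a quick sketch of that counting in action; the conversion function is a toy of my own, though Python’s built-in int() really does read binary:

```python
def to_binary(n):
    """Peel off remainders by 2 to get the binary digits, lowest place first."""
    bits = ""
    while n > 0:
        bits = str(n % 2) + bits  # each remainder is the next binary digit
        n //= 2
    return bits or "0"

print(to_binary(13))    # "1101": one 8, one 4, no 2s, one 1
print(int("1101", 2))   # 13, translated back into decimal for us slow-witted humans
```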

(By the way, a 17th-century mathematician named Gottfried Leibniz would have been very pleased with computers; he advocated the binary system for religious reasons. To him, one represented God and zero represented the void, which would make a computer a kind of electronic prayer wheel.)

And so, in a sense, we’ve come full circle. Our ancestors started out counting “one,” “two,” “many,” and now our most sophisticated computers make do with even less–they’ve dropped the “many” and just use “one” and “two.”

But that accomplishment itself is a tribute to a lot of human thought–and a lot of human counting.
