Why Zero is the First Number


In the Olde Dayes there was no such thing as zero. It didn't exist. Nothing. Nada. Zip.

We outgrew those days. Blame the Arabs. They brought us zero (the idea actually came out of India, but it reached Europe by way of Arabic arithmetic).

At first the Arabs didn't really believe in zero, either. If you look at Arabic numerals (really, in Arabic, not the westernized numerals we only call "Arabic"), one is | just like ours and the Romans', a single stroke, and two is || like the Romans', except they don't pick up the pen between strokes (going right-to-left, the way the Arabs write), so it comes out more like this: ٢. Three is the same, but with three strokes: ٣. It doesn't take much thinking to realize that the Europeans, when they went to see these new Arabic numerals, were looking from the left side of the Arab who was demonstrating them, so we got our twos and threes sideways.

Like I was saying, the Arabs didn't really believe in zero. It was a Nothing, just an empty space. However, to make sure you understood that the space was really empty there, they put a little raised dot, the way the Hebrews and the Romans (and probably also the early Arabs) put dots between their words, instead of just blank space like our printers do. Well, dots are easy to miss, so to make it a little blacker, the scribe would wiggle his pen around a little, which sometimes left a small circle with a little hole in the middle. And that's where we got our zero.

If you go to an Arabic country today, where they use real Arabic numerals, sometimes you will see a little round circle for zero, but mostly they still use a dot. But if you see a little circle, you might confuse it with five, which is a real circle, like the palm of your hand. So somewhere along the way -- perhaps it was the Europeans, or their slightly off-center Arabic tutor -- somebody got the idea of putting a line over the top of the (big) five-circle, so you knew it was a five, not a zero. The Greeks did that to their big-Oh letter to distinguish it from their little-Oh letter, except they put the line under the circle instead of over it.

Now you need to realize that these scribes were pretty picky about how you formed your letters and numerals. The Greeks insisted that you start at the bottom, draw half of the line, then go around the circle, then finish out the rest of the line. That way you could do it without picking up your pen -- or failing to pick it up, leaving that ugly smear that gave us our connected twos and threes. If you got sloppy -- we all do -- the two halves of the line didn't connect, and what you got is omega, ω (that's Greek for "big Oh"). After centuries of sloppy writing, the two halves of the bar started to curl around, so it began to look more like the sides of our modern W. Well anyway, to do a five, you had to draw the circle in a clockwise motion from the top left, then pick up your pen and draw the top bar, again from the left. When I was in grade school, 500+ years later, they still told us to write our fives in two separate strokes like that. Me, I do it in one S-like stroke, and it looks like it.

But this is about zero. Like I said, the Arabs brought it to us, as a placeholder for nothing. Everybody knows that numbers start at one. There is no Year Zero. The year after 1BC is 1AD. Actually there is a good mathematical reason for that.

Numbers on a scale -- think of a ruler -- are like mathematical points; they have no size. The line representing 1 inch is infinitesimally narrow, no width at all. Of course they can't paint it that way on the ruler, but we pretend. If the ruler extended out in both directions, zero would also have no width. Days and years, however, are not dividers between chunks of time; they are the chunks of time. The first day of the week is a whole 24 hours long. A particular time in that day, say 8am, is infinitesimally small, with no duration, but the eighth hour refers to a whole sixty minutes. Year 2006 is not a divider between blocks of time, it is a whole 365 days long. It is the 2006th year after we started counting. The first year -- we call it Year One, but it is really the First Year -- starts at time = 0/0/0,00:00:00 and goes for 365 days. The year 1BC is actually the First Year counting backwards.
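If you like to see that in code, here is a minimal sketch in Python (my own illustration, not anything out of a calendar standard; the half-open-interval convention and the function name are my assumptions): each year label names a whole chunk of elapsed time, one year long, and the chunks butt right up against each other, leaving no room for a Year Zero.

    # Year N AD covers elapsed years [N-1, N); year N BC covers [-N, -(N-1)).
    # (The half-open-interval convention is my assumption for this illustration.)
    def year_span(n, era):
        """Start and end of a numbered year, measured in whole years elapsed
        since the instant the counting started (time zero)."""
        if era == "AD":
            return (n - 1, n)        # Year One starts at 0 and runs to 1
        else:                        # "BC": the First Year counting backwards
            return (-n, -(n - 1))

    print(year_span(1, "BC"))     # (-1, 0)
    print(year_span(1, "AD"))     # (0, 1) -- begins exactly where 1BC ends
    print(year_span(2006, "AD"))  # (2005, 2006) -- the 2006th chunk of time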

For the rest of this discussion, I will ignore numbered years and days and other chunks like that.

I'm into computers, and the numbers we use in computers also refer to chunks. So why don't they start at one, like years? There is a good mathematical reason for that.

Take a piece of paper and write down, in a column, all the numbers from 1 to 100. It would be better if you could go all the way to 1000, but that's too tedious. Here is a small segment of that list, the block encompassing the eighties: 80, 81, 82, 83, 84, 85, 86, 87, 88, 89. Now I call these the "eighties" because they are all the numbers with "eighty" in their name. Which is the first of the eighties? Not 81! There are ten eighties, and the first one has a zero in its last digit. It's as simple as that.

Now let's look at another decade, the tens (the teens). In English we have separate names for one-teen and two-teen (eleven and twelve), and (no surprise) zero-teen we call just plain ten. Again, the decade starts with zero in its last digit.

OK, what about the first decade, the numbers with an empty space in the next-to-last digit? There's really a zero there, but as we already observed, zero is nothing, so we don't even write it unless it's needed to keep larger digits from sliding over (that's where it would help to look at numbers larger than 100: 101 needs that zero in the middle, so it doesn't get confused with 11). How many eighties are there? Ten, starting with 80. How many teens are there? Again ten, starting with 10. So how many naughts are there in the first decade? Count 'em: there are ten. You only see nine? That's because zero is a nothing. It's really there, in the blank line before 01.
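If you would rather let a machine do the counting, here is a small Python sketch (mine, not part of the original argument) that groups 0 through 99 into decades by their tens digit. Every decade has ten members, and the first member of each one ends in zero -- including the first decade, whose first member is zero itself.

    # Group 0..99 into decades by tens digit; the count starts at zero.
    decades = {}
    for n in range(100):
        decades.setdefault(n // 10, []).append(n)

    print(decades[8][0], len(decades[8]))   # 80 10 -- ten eighties, first is 80
    print(decades[1][0], len(decades[1]))   # 10 10 -- ten teens, first is 10
    print(decades[0][0], len(decades[0]))   # 0 10  -- ten naughts, first is 0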

In computers this matters, because just like the space on your paper, the numbers take up space in the computer. Originally it was very expensive space, a couple hundred dollars for each digit. So if we wanted space for 100 different numbers, we could do that in two (not three) digits if we started at zero.
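To make that concrete, here is another tiny Python sketch (again my own illustration): two decimal digits are enough for 100 different numbers, but only if the count starts at zero.

    # Two-digit labels "00" through "99": exactly 100 of them, starting at zero.
    labels = [f"{n:02d}" for n in range(100)]
    print(len(labels))               # 100
    print(labels[0], labels[-1])     # 00 99
    # Start at one instead and you need a third digit to reach 100.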

Today computers count in binary. Every binary digit can be either zero or one, off or on, nothing else, not even empty spaces. To count to 100 takes seven of them, and if all seven digits are turned off, 0000000, that's zero. Seven binary digits (we call them bits, for short) can count up to 1111111, which is 127, not 99 or 100, but the idea is the same.
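The same exercise in binary, as one more quick Python sketch (mine): seven bits run from 0000000 up to 1111111, which comes to 128 different values, precisely because the first one is zero.

    # Every pattern of seven bits, from all off through all on.
    patterns = [format(n, "07b") for n in range(2 ** 7)]
    print(patterns[0])                 # 0000000 -- all bits off is zero, the first number
    print(patterns[-1], 2 ** 7 - 1)    # 1111111 127
    print(len(patterns))               # 128 values in all, starting at zero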

The bottom line is that zero comes before one. Zero is the first number. Every computer geek knows that. God does too: In the beginning God created the heavens and the earth out of nothing. That's zero. Zero was first. Don't you forget it.

Tom Pittman
2006 February 10