This chapter is about zero: not nothing, but zero, the number. It is curious that it took humans so long to include zero in the numbering system, given how central it is to our place-value notation. It is also a dangerous concept, because it introduces a point of discontinuity into our mathematical system.
The need for zero in the numbering system was ignored for millennia. That the ‘count’ values should be established and named is obvious to everyone. Fractional values were naturally added as well, though less as true numerical values than as expressions for use in speech. But apparently our forebears felt that ‘nothing’ was such an obvious value that it required no notation of its own.
This all changed when the modern numerical notation came into being, centuries after the fall of the Roman Empire. The Romans had their own system for enumeration. It started logically enough: 1 = I, 2 = II, and so forth. When they reached 5, they chose 5 = V, a somewhat logical choice, perhaps echoing the diagonal stroke drawn across four tally marks when counting by item. Somewhere along the line, it was determined that placing a lower-valued digit to the left of a higher one meant ‘subtract this value first’. Therefore IV = (5 – 1) = 4, IX = (10 – 1) = 9, etc.
The system loses its relationship to counting, however, right after ten: X = 10, L = 50, C = 100, D = 500 and M = 1000. These are the commonly used notational values. There were a few more symbols in common use, but the interesting point is that this system of notation is self-limiting; that is, you can only express numbers for which you have preordained symbols. The Romans had no standard symbol for a million (1,000,000), for example, although we do still use their numbering to count our Super Bowls. Furthermore, every number expression required a mathematical exercise: MCMLII = (1000 + (1000 – 100) + 50 + 2) = 1952. Not very convenient. The system becomes especially cumbersome when one attempts even simple arithmetic. Imagine trying to balance your checkbook. Calculating a square root would be practically out of the question.
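The rules above (fixed symbol values, plus the subtractive convention) can be sketched in a few lines of code. This is a minimal illustration, not from the text; the function name and structure are my own:

```python
# Symbol values for the commonly used Roman digits, as listed in the text.
VALUES = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

def roman_to_int(numeral: str) -> int:
    """Evaluate a Roman numeral using the subtractive rule."""
    total = 0
    for i, ch in enumerate(numeral):
        value = VALUES[ch]
        # A smaller digit left of a larger one means 'subtract this first':
        # IV = 4, IX = 9, CM = 900, and so on.
        if i + 1 < len(numeral) and VALUES[numeral[i + 1]] > value:
            total -= value
        else:
            total += value
    return total

print(roman_to_int("MCMLII"))  # 1952
```

Note how much machinery even this small evaluator needs; in place-value notation, reading off 1952 requires none of it.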
This is a very clear case of the language limiting the expression and the thinking. Because of this system, calculations that are now routinely taught to and performed by schoolchildren (or at least were, before calculators came along) were beyond the reach of all but the most highly trained mathematicians. Advanced computation was simply not possible, unless, of course, you could do it all in your head. And then there was that pesky detail of being unable to express any quantity for which a symbol had not already been assigned.
But that changed with the rise of the Hindu-Arabic numerals. A symbol for the value zero appears to have emerged in India around the seventh century, and Arabic mathematicians adopted it and carried the system westward, where it became a big hit. The notational system they developed is still in use today, and since it relies on an ordered set of repeating digits, 0 – 9, it can be easily interpreted and, more importantly, is without limits.
It was the inclusion of zero that allowed this fantastic advance. Grouping values by column, with the worth of each column agreed in advance, eliminated the multiple additions and subtractions previously required to assign value to a quantity. Few of us ever think about how hard things were before this notational system was created; yet in retrospect, why did it take so long?
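The column mechanism, and zero's role as a placeholder within it, can be shown in a tiny sketch (my own illustration, assuming base ten):

```python
def digits_to_value(digits):
    """Evaluate a list of decimal digits, most significant first.

    Each step shifts the running total one column left (times ten)
    before adding the next digit.
    """
    total = 0
    for d in digits:
        total = total * 10 + d
    return total

print(digits_to_value([1, 9, 5, 2]))  # 1952
print(digits_to_value([1, 0, 5, 2]))  # 1052: zero adds nothing, but
                                      # holds the hundreds column open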
However, as with almost any new invention, zero had its downside: its hidden other meaning. In the common understanding, zero does not just mean ‘no ones’ or ‘no tens’; it can also express the concept of ‘nothing at all’, and therein lies the problem.
If zero is an actual, notable value, then it should submit to all of the mathematical operations by which the other, countable values can be manipulated. Let’s check:
1 + 0 = 1. OK, check.
1 – 0 = 1. OK, check.
1 * 0 = ? This one is a little harder, but if you express it as “take 1, 0 times” the answer becomes obvious: 1 * 0 = 0. OK, we can buy that one.
1 ÷ 0 = ?
Uh, oh. How can this be solved? If 1 ÷ 0 has an answer, call it x, then reversing the operation demands that x times 0 equal 1.
What can be multiplied by 0 to equal 1? The answer, of course, is that there is no answer. It is a mathematical koan. Take a billion, 0 times, and you still have zero. The expression ‘one divided by zero’ is an operation on two perfectly acceptable numbers that poses a question with no logical answer. It exposes a flaw in the design of our mathematical system, and the operation has therefore been declared ‘illegal’ and left undefined. But that’s not the only one. As we shall soon see (in Number Theory, Part 2), there is another ‘bear trap’ lurking in our number system, one that is a specific result of our chosen geometry.
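Programming languages have settled the matter the same way mathematics has: the operation is simply refused. A small demonstration (Python's behavior for integers; my own illustration):

```python
# Dividing by zero is left undefined, so the interpreter raises an error
# rather than invent an answer.
try:
    result = 1 / 0
except ZeroDivisionError as err:
    print("undefined:", err)

# The inverse check from the argument above: no candidate x, however
# large, satisfies x * 0 == 1.
print(any(x * 0 == 1 for x in range(-1_000_000, 1_000_000)))  # False
```

The brute-force search is of course only suggestive; the real argument is the algebraic one: x * 0 is 0 for every x, so no x can yield 1.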
We’ll address this issue further when we discuss the continuity of the number system and what it means to the overall theory, but first we must pay a visit to the thoughts of René Descartes and his signature creation, Cartesian Geometry.