In the world of math, many strange results are possible when we change the rules. But there’s one rule that most of us have been warned not to break: don’t divide by zero. How can the simple combination of an everyday number and a basic operation cause such problems?

Normally, dividing by smaller and smaller numbers gives you bigger and bigger answers. Ten divided by two is five, by one is ten, by one-millionth is 10 million, and so on. So it seems like if you divide by numbers that keep shrinking all the way down to zero, the answer will grow to the largest thing possible. Then, isn’t the answer to 10 divided by zero actually infinity? That may sound plausible. But all we really know is that if we divide 10 by a number that tends towards zero, the answer tends towards infinity. And that’s not the same thing as saying that 10 divided by zero is equal to infinity. Why not?
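To make that distinction concrete, here is the claim written as a limit; this only restates the previous sentence in symbols:

\[
\lim_{x \to 0^{+}} \frac{10}{x} = \infty .
\]

A limit describes how \(10/x\) behaves as \(x\) gets close to zero; it assigns no value to \(10/0\) itself. (It’s also worth noting that approaching zero through negative numbers sends \(10/x\) towards negative infinity, so there isn’t even one obvious candidate value.)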
Well, let’s take a closer look at what division really means. Ten divided by two could mean, “How many times must we add two together to make 10,” or, “two times what equals 10?” Dividing by a number is essentially the reverse of multiplying by it, in the following way: if we multiply any number by a given number x, we can ask if there’s a new number we can multiply by afterwards to get back to where we started. If there is, the new number is called the multiplicative inverse of x. For example, if you multiply three by two to get six, you can then multiply by one-half to get back to three. So the multiplicative inverse of two is one-half, and the multiplicative inverse of 10 is one-tenth. As you might notice, the product of any number and its multiplicative inverse is always one.
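In symbols, that relationship (and the examples above) reads

\[
x \cdot \frac{1}{x} = 1, \qquad \text{e.g. } 2 \cdot \tfrac{1}{2} = 1 \ \text{ and } \ 10 \cdot \tfrac{1}{10} = 1,
\]

and dividing by a number amounts to multiplying by its inverse: \(10 \div 2 = 10 \cdot \tfrac{1}{2} = 5\).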
If we want to divide by zero, we need to find its multiplicative inverse, which should be one over zero. This would have to be a number that, when multiplied by zero, gives one. But because anything multiplied by zero is still zero, such a number is impossible, so zero has no multiplicative inverse.
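Spelled out, the obstacle is that for every number \(x\),

\[
0 \cdot x = 0 \neq 1,
\]

so nothing can play the role of \(\tfrac{1}{0}\).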
Does that really settle things, though? After all, mathematicians have broken rules before. For example, for a long time there was no such thing as taking the square root of negative numbers. But then mathematicians defined the square root of negative one as a new number called i, opening up a whole new mathematical world of complex numbers. So if they can do that, couldn’t we just make up a new rule, say, that the symbol infinity means one over zero, and see what happens?

Let’s try it, imagining we don’t know anything about infinity already. Based on the definition of a multiplicative inverse, zero times infinity must be equal to one. That means zero times infinity plus zero times infinity should equal two. Now, by the distributive property, the left side of the equation can be rearranged to (zero plus zero) times infinity. And since zero plus zero is definitely zero, that reduces down to zero times infinity. Unfortunately, we’ve already defined this as equal to one, while the other side of the equation is still telling us it’s equal to two. So, one equals two.
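Laid out as a chain of equations, the experiment above uses nothing beyond the proposed rule, the distributive property, and the fact that \(0 + 0 = 0\):

\[
\begin{aligned}
0 \cdot \infty &= 1 \quad \text{(the proposed rule)}, \\
0 \cdot \infty + 0 \cdot \infty &= 1 + 1 = 2, \\
(0 + 0) \cdot \infty &= 2 \quad \text{(distributive property)}, \\
0 \cdot \infty &= 2 \quad \text{(since } 0 + 0 = 0\text{)}, \\
1 &= 2 \quad \text{(since the left side was defined to be } 1\text{).}
\end{aligned}
\]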
that's not necessarily wrong; it's just not true
in our normal world of numbers. There’s still a way it could
be mathematically valid, if one, two, and every other number
were equal to zero. But having infinity equal to zero is ultimately not all that useful
to mathematicians, or anyone else. There actually is something called
the Riemann sphere that involves dividing by zero
by a different method, but that’s a story for another day. In the meantime, dividing by zero
in the most obvious way doesn’t work out so great. But that shouldn’t stop us
from living dangerously and experimenting
with breaking mathematical rules to see if we can invent
fun, new worlds to explore.