## Division by Zero: The Truth

There are many misconceptions about division by zero. Let's review them, then talk about what division by zero truly means.

**Myth: One over zero equals infinity.**

It is true that the limit of 1/n as n approaches zero *from the right* is infinity (from the left, it's negative infinity, so the two-sided limit doesn't even exist). But a limit and an actual calculation are not the same thing. You cannot divide a number by a number and get a non-number, and infinity is not a number.
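To make those one-sided limits concrete, here's a quick numeric sketch (my own example, in Ruby, the language used later in this post):

```ruby
# 1/n grows without bound as n shrinks toward zero from the right,
# and plunges toward negative infinity from the left -- so the
# two-sided limit of 1/n at zero doesn't exist.
[0.1, 0.001, 0.00001].each { |n| puts 1.0 / n }    # roughly 10, 1000, 100000
[-0.1, -0.001, -0.00001].each { |n| puts 1.0 / n } # roughly -10, -1000, -100000
```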

**Myth: Division by zero causes a computer to crash.**

Nope. The worst it will do is make a really unsophisticated program run forever. Here's some Ruby code for one such division algorithm:

```ruby
# Long division by repeated subtraction: peels off one decimal
# digit of the quotient at a time, for 20 digits of precision.
def divide(x, y)
  quotient = 0
  val = 1.0
  20.times do
    while x > y
      x -= y
      quotient += val
    end
    y /= 10.0
    val /= 10.0
  end
  quotient
end
```

This will effectively divide x by y. But if y = 0, the inner loop never terminates: subtracting zero from x changes nothing, so the program runs forever without crashing. Most real programs, though, won't let you divide by zero: they either raise an error or return Not a Number (NaN).
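In fact, you can see both behaviors in Ruby itself (my example, assuming a standard Ruby interpreter): integer division by zero raises an error, while floating-point division follows IEEE 754 and returns Infinity or NaN:

```ruby
# Ruby's built-in behavior: integer division by zero raises
# ZeroDivisionError; IEEE 754 floats return Infinity or NaN.
begin
  1 / 0
rescue ZeroDivisionError => e
  puts e.class          # the program handles it and keeps running
end

puts 1.0 / 0.0          # => Infinity
puts -1.0 / 0.0         # => -Infinity
puts 0.0 / 0.0          # => NaN
```

Either way, no crash and no infinite loop.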

**Why can’t we divide by zero?**

Let’s think of this geometrically.

How do you divide three by five? Like this: You have three pizzas and five people. How many pizzas does each person get? Well, three fifths. What if you have zero pizzas and five people? Each person gets zero pizzas: 0 divided by 5 equals 0. But what if you have five pizzas and zero people? How many pizzas does each person get? You could say zero, because no one gets any pizza. Or you could say infinity, since even a tiny fraction of a person would get all the pizza, and zero is like the tiniest fraction of all. But neither of these is really correct.

Here is a mathematical example of why you cannot divide by zero.

`x = 1 / 0`

Using simple algebra, we can multiply both sides by 0.

`x * 0 = 1`

We know that anything multiplied by zero equals zero. So we can simplify this.

`0 = 1`

Oh wait. That doesn’t work, does it? When we try to divide by zero, math breaks.

Plus, everyone knows that when you divide by zero you get over 9000.

What if you try to make division by zero actually work? Well, first, stop trying. You can’t. But secondly, and more importantly, if division by zero produced an actual number, you would break calculus. Oops. A derivative is the limit of the difference quotient (f(x + h) − f(x)) / h as h approaches zero; plug in h = 0 and you get 0/0, so for that limit to mean anything, division by zero *has to* be undefined. So go ahead and try to legitimately divide by zero. But don’t blame me when calculus breaks and then we can’t launch missiles and stuff. And then the commies win.

For example, if you want to take the derivative of y = x squared, instead of getting a nice pretty y = 2x, you’d get a completely crazy graph that makes no sense. Not good.
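Here's a sketch of what that means (my example, not from the post): computing the derivative of y = x squared at x = 3 with the difference quotient. Shrinking h homes in on 2x = 6, but plugging in h = 0 gives 0/0:

```ruby
# Difference quotient (f(x + h) - f(x)) / h for f(x) = x**2 at x = 3.
# As h shrinks it approaches the true derivative 2x = 6; at h = 0
# it collapses to 0.0 / 0.0, which is NaN.
f = ->(x) { x**2 }
x = 3.0
[0.1, 0.01, 0.001].each do |h|
  puts (f.(x + h) - f.(x)) / h   # approaches 6.0
end
puts (f.(x) - f.(x)) / 0.0       # => NaN
```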

And that’s why you can’t divide by zero, kids.

I like to think about it in terms of file compression:

Multiplying by zero is like compressing a file down to zero bytes, and dividing by zero is like trying to decompress that zero-byte file back into the original. How can you know what the original file (number) was if every file (number) gets compressed (multiplied) into the same thing?
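In code, the analogy looks like this (my sketch): multiplication by zero maps every input to the same output, so no operation can invert it:

```ruby
# Every number "compresses" to 0 when multiplied by zero, so an
# inverse "decompression" (division by zero) cannot exist.
puts 5 * 0   # => 0
puts 7 * 0   # => 0
# If `z / 0` undid `z * 0`, then 0 / 0 would have to be both 5 and 7.
```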

NOW you are really breaking the laws of math. How is y = x^2 the same as y = 2x (unless x = 2 or 0)?

Read carefully… you want to take the **derivative** of y = x squared, instead of getting a nice pretty y = 2x.

Notice the keyword DERIVATIVE. He never said that y = x^2 is the same as y = 2x.

If this is the algorithm to divide…

… doesn’t y /= 10 mean dividing y by 10? But this is supposed to define division. How does that work?