Does division by zero equal zero?
In computer programming, dividing a nonzero floating-point number by zero yields positive or negative infinity by default under the IEEE 754 floating-point standard, while dividing zero by zero yields a special not-a-number (NaN) value. However, depending on the programming environment and the type of number being divided by zero (e.g. an integer), the operation may instead raise an exception, generate an error message, or cause the program to terminate.
The answer is that as far as mathematics (and standards like IEEE
754) are concerned, dividing 0 by 0 is either a trap or a NaN. Not
zero. If Apple hardware is producing zero as the result of dividing
zero by zero, I'd consider that buggy hardware.
I do note from a Google search that at least the PowerPC 8360
does the correct thing with divide by zero, which breaks someone's
application that depended on different behaviour:
http://stackoverflow.com/questions/6460558/powerpc-how-to-make-div-0-return-zero-as-a-result
I do find amusing the comment about the original dependency on
1/0 producing 0:
I've inherited legacy code like this before & I feel your pain. You want to shake your fist at the people who installed such bone-headed behavior, but right now shaking your fist doesn't help you ship product. You need a solution. Good luck.