First, I must give κῦδος (kudos) to Arithmeticus Simplex for pointing out the practicality of operations with Roman numerals (it’s important to acknowledge that a Roman, provided with the argument of comparison, would conclude that her system was vastly superior…and a Kyalian well versed in balanced ternary would correctly conclude balanced ternary to be superior to both Roman and decimal notation).

On division by zero. I very much favor Abraham Robinson, his non-standard analysis.*

*(H. Jerome Keisler provided open access to his book, Elementary Calculus: An Infinitesimal Approach, under a Creative Commons BY-NC-SA license.

http://www.math.wisc.edu/~keisler/calc.html

The book provides a highly approachable explanation of non-standard analysis.)

Non-standard analysis (which will be a funny name if, at some point in the future, it becomes standard) defines a range of positive numbers that are greater than zero yet less than every positive real number. Their additive inverses give a negative range, and their multiplicative inverses give infinite results.
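The ordering just described can be sketched in code. This is a toy model under my own assumptions, not the hyperreal field (the class names `Eps` and `Inf` are mine): a formal positive infinitesimal compares greater than zero but less than every positive real, its negation mirrors that, and its inverse compares beyond every real.

```python
class Eps:
    """Formal infinitesimal: 0 < Eps(+1) < r for every positive real r."""
    def __init__(self, sign):
        self.sign = sign            # +1 for positive, -1 for negative

    def __neg__(self):
        return Eps(-self.sign)      # additive inverse: the negative range

    def __gt__(self, r):
        # A positive infinitesimal exceeds every real <= 0;
        # a negative one exceeds every real < 0.
        return r <= 0 if self.sign > 0 else r < 0

    def __lt__(self, r):
        return r > 0 if self.sign > 0 else r >= 0

    def inverse(self):
        return Inf(self.sign)       # multiplicative inverse: infinite result


class Inf:
    """Signed infinite value: Inf(+1) > r and Inf(-1) < r for every real r."""
    def __init__(self, sign):
        self.sign = sign

    def __gt__(self, r):
        return self.sign > 0

    def __lt__(self, r):
        return self.sign < 0
```

Note that the sign survives inversion, which is the whole point made below: the inverse of a positive infinitesimal is a positive infinity, so its sign is known.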

I prefer this method for dealing with infinity because, at least to me, it does not seem reasonable to think of 1 divided by zero as positive, negative, or even nonzero.

To me, defining division by zero as undefined is not a bug. I think the bug is believing that we know nothing well enough to assume that dividing by it should be defined (of course, if we do some day come to know nothing, then we may feel free to divide by it).

So for all practical intents and purposes, I like to use the inverses of positive and negative infinitesimal numbers to represent infinity, because I know the sign of those inverses. (I don’t like taking the integral between negative and positive infinity because I don’t think it’s reasonable to say that infinity has a sign or a nonzero value.)

On the other hand, I do think it may be reasonable to define zero divided by zero as 1, provided that the “1” thus generated is given its own universe of sets, not compared with the results of other divisions of zero by zero.

(Note: I am not familiar with set theory, so please correct and forgive any errors in my use of terminology. Specifically, I’m using the concept of Universe that I learned from Lewis Carroll [Dodgson], his Symbolic Logic.)

That is, 0/0 of set universe A = 1 of set universe B, but 1_a does not equal 1_b. And 0_b/0_b = 1_c, which equals neither 1_a nor 1_b; or rather, 1_a = 1_b is one possibility out of an infinite set of roughly equivalent possibilities and is therefore infinitely improbable.
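One way to make the universe-tagging idea concrete, again as a hypothetical sketch with names of my own invention (`TaggedOne`, `divide`): each evaluation of 0/0 mints a “1” stamped with a fresh universe, and two such ones compare equal only if they carry the same stamp.

```python
import itertools

# Each new universe gets a fresh id.
_universes = itertools.count()


class TaggedOne:
    """The result of 0/0: a '1' that lives in its own universe."""
    def __init__(self):
        self.universe = next(_universes)

    def __eq__(self, other):
        # Equal only to the very same tagged one, never across universes.
        return isinstance(other, TaggedOne) and self.universe == other.universe

    def __repr__(self):
        return f"1_{self.universe}"


def divide(a, b):
    if a == 0 and b == 0:
        return TaggedOne()   # 0/0 = 1, in a universe of its own
    return a / b             # ordinary division otherwise (1/0 still raises)
```

So `divide(0, 0) == divide(0, 0)` is false: each call yields a 1 from a different universe, matching the claim that 1_a = 1_b is only one possibility among infinitely many.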