A while back, I wrote a post on some unusual math notation I was playing with. I actually took it much further than that and drafted up a paper full of different ideas over the following months. At the time, I was hoping it might be publishable but then got discouraged and never did anything with it… Except show it to people in person and promise I’d put it online soon. In any event, it’s one of those things I’ve been meaning to put up for forever.

So here it is. I’m particularly proud of the quantifier stuff and the numerals.

Considering how complicated the ideas it must represent are, mathematical notation clearly does quite a good job. Despite this, the author believes that any claim that modern mathematical notation is the *best* should be met with extreme scepticism.

Mathematical notation is a natural language: no one sat down and constructed it; rather, it formed gradually as people made changes that were adopted. Most changes are not adopted; whether they are depends on a variety of factors, including mathematical utility, ease of adoption (the average individual doesn’t want to spend hours learning), dissemination, and sheer dumb luck. The first two of these qualities are associated with real properties of notation, forming the necessary selective pressure for evolution to occur.

Evolution is a blind watchmaker: the world around us is filled with examples of the stupidity of biological evolution (the classic example being the halibut). It is equally a blind language and notation designer. In particular, it is held back by a strong selective force against change, since people would need to adopt any new notation, and so evolution doesn’t effectively explore the full notation-space. This stasis means that even the most outrageous notations remain unchallenged by virtue of age.


Please keep in mind that I wrote this (except for the redaction of some sillier sections and some minor improvements) three years ago, when I was quite a bit younger, and it isn’t representative of my present abilities. That said, I do still think that the fundamental idea, that notation is important and should be explored, is very true.

(I’ve also put the paper on github, for those who are interested.)


Tags: math, notation

This entry was posted on March 20, 2013 at 03:58 and is filed under Uncategorized. You can follow any responses to this entry through the RSS 2.0 feed.

March 20, 2013 at 09:43 |

Reminds me of an amusing game from my childhood. For basic arithmetic, at least, your digit values don’t have to start at 0: you can, for example, build a perfectly functional base-7 system using -3, -2, -1, 0, 1, 2, and 3, at which point you no longer need a separate notation to indicate positive and negative values. Alternatively, you could have a base-10 system with digits representing 1–10, which works just fine until you need to represent 0.
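The balanced base-7 idea above is easy to mechanize. Here is a minimal Python sketch (the function name and digit convention are my own, not from the comment): divide by 7 as usual, but fold remainders 4–6 down into -3…-1 with a carry, so every integer gets a representation with digits in {-3, …, 3} and no sign symbol.

```python
def to_balanced_base7(n):
    """Convert an integer to balanced base-7 digits in {-3..3},
    most significant digit first. Negative numbers need no sign:
    the sign is carried by the leading digit itself."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 7          # remainder in 0..6 (Python's % is non-negative here)
        n //= 7
        if r > 3:          # fold 4..6 down to -3..-1, carrying 1 upward
            r -= 7
            n += 1
        digits.append(r)
    return digits[::-1]    # collected least-significant first, so reverse

def from_balanced_base7(digits):
    """Evaluate balanced base-7 digits back to an integer."""
    value = 0
    for d in digits:
        value = value * 7 + d
    return value
```

For example, 10 comes out as [1, 3] (one seven plus three), and -5 as [-1, 2] (minus seven plus two) — the same machinery handles both signs.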

March 20, 2013 at 22:05 |

You can also have the *base* be negative or imaginary. For example, in base -2: -2 = 10, -1 = 11 = -2 + 1, 0 = 0, 1 = 1, 2 = 110 = 4 - 2, 3 = 111 = 4 - 2 + 1… I had lots of fun with that around the same time as writing this paper.

March 20, 2013 at 18:28 |

Sounds like someone was reading “Worse is Better” while getting ideas for this paper….😉

March 20, 2013 at 22:01 |

I actually haven’t read it! But it seems like something I should read.