Probably due to the influence of Plato, mathematics is widely conceived of as universal. In fact, it is widely accepted that the constant π would be as well known to a reasonably advanced alien species as it is to us, albeit likely in a different base — an idea that is a recurring theme both in fiction and in serious discussion about communication with an alien species.

I’m sceptical. Not because I think that the value of π is subjective, but because it is not at all clear to me that aliens would share our abstraction of numbers.

Two things have brought me to this view: reading about how humans count, and reading about the development of numbers.

Experiment with how long it takes you to count the dots on a page and you will notice an odd pattern: once the number of dots increases past three or four, the time it takes to count them suddenly jumps, while accuracy falls and brain activity changes. (You can try the test in this BBC article.) While the small numbers can be determined almost instantly, with the larger ones people generally have to shift their focus onto small groups and increment a number in their head. This ability to rapidly recognize how many of something there are is called subitizing, and it is observed not just in adults, but in babies before they can speak and in non-human animals (though the threshold at which subitization stops occurring varies between species). This evidence for an intrinsic biological root (almost a numerical equivalent of Chomsky’s linguistic innateness hypothesis) casts doubt on the view that any reasoning species would necessarily develop numeracy.

However, the evidence against some sort of universal numeracy extends well beyond theoretical arguments: despite the advantage of subitizing, there are some very real counterexamples to numeracy. One need only look at the historical development of numbers: it was no trivial task for zero and negative numbers to be accepted as numbers, and people literally died over whether irrational numbers like e and π were numbers. And even in the modern world there are societies lacking our modern conception of numbers — the Pirahã lack words for precise numbers greater than two, and members of the tribe have difficulty working with even modest numbers.

Still, it is reasonable to be sceptical, as many people I describe this view to are, about whether a society could become advanced without numbers. Don’t we need them for trade? For chemistry, physics, engineering and much of biology? Where would computers be without numbers?

What alternative could there possibly be?

It is hard, of course, for me to come up with an alternative to numbers. Even if they weren’t, to some extent, hard-wired into my brain at birth, one of the most important parts of elementary education is the development of the numerical abstraction — it is ingrained so deeply into our thought process that we use it without consciously deciding to (and while I may not like how modern math education works, that ingraining is a good thing! Most of the utility and beauty of math is only available when it becomes part of the way you think rather than just something you know). One can easily imagine aliens with their own abstraction, as foreign to our thought process as numbers are to theirs.

However, while it is impossible for me to fathom how deeply *alien* an alien’s alternative to the numerical abstraction might be, it is possible to construct something. And so we might imagine a species of photosynthetic aliens with square bodies in a square environment. We shall dub them the Squariens. The Squariens spend their lives sitting in tightly packed grids, jumping up in the air and turning or flipping to face other neighbours. They have little concern for the amount of food they must eat, but are deeply concerned with the symmetries of the square — to us, the dihedral group of order 8. And just as we may occasionally muse about other bases, they occasionally spend their time imagining the symmetries of other shapes, of triangles and pentagons, and so on: other dihedral groups. Eventually the Squariens discover group theory — they find the integers mod n in the rotational subgroups. At some point, a Squarien mathematician thinks about the symmetries of something with infinitely many sides… and when they consider the rotational subgroup, they discover a group isomorphic to the integers! Except they don’t think of them like that, don’t think of them like we do: just as we write elements of dihedral groups in terms of numbers (what else is s¹r³?), they describe their integer-isomorphic group in terms of how they think of dihedral groups… But while these aliens might find our obsession with numbers odd (and our obsession with prime numbers even odder…), and we might find their obsession with symmetries strange, we’d still be able to understand each other quite easily.
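As a concrete aside (the representation and helper names below are my own sketch, not anything from the story), the dihedral group of order 8 and its rotational subgroup can be built in a few lines by treating each symmetry as a permutation of the square’s corners:

```python
# Model each symmetry of the square as a permutation of its corners 0..3,
# written as a tuple p where p[i] is the image of corner i.
IDENTITY = (0, 1, 2, 3)
ROT = (1, 2, 3, 0)    # quarter-turn rotation
FLIP = (3, 2, 1, 0)   # a reflection (reverses the corner cycle)

def compose(p, q):
    """Apply symmetry q first, then symmetry p."""
    return tuple(p[q[i]] for i in range(4))

# Generate the full group as the closure of the generators under composition.
group = {IDENTITY, ROT, FLIP}
while True:
    bigger = group | {compose(a, b) for a in group for b in group}
    if bigger == group:
        break
    group = bigger

assert len(group) == 8  # the dihedral group of order 8

# The rotations alone form a cyclic subgroup isomorphic to the integers mod 4:
# composing r^a with r^b gives r^((a+b) mod 4), i.e. addition mod 4.
powers = [IDENTITY]
for _ in range(3):
    powers.append(compose(ROT, powers[-1]))
for a in range(4):
    for b in range(4):
        assert compose(powers[a], powers[b]) == powers[(a + b) % 4]
```

Replacing the square with an n-gon gives the other dihedral groups the Squariens muse about, and letting n grow without bound is exactly the “infinitely many sides” thought experiment, whose rotational subgroup behaves like the integers.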

That is, admittedly, a rather silly example, and I’m afraid I can’t provide a better alternative to numbers. However, I believe I can provide a serious and satisfying alternative to polynomials. It is my hope that the reader might take this as a slight piece of evidence for the existence of alternatives to numbers.

When I was younger, I remember wondering why we were so concerned with polynomials. Why did we care about things of the form … rather than … ?

At that age, polynomials won me over when I realized that I could `push’ them into arbitrary shapes by using higher-degree terms to control the shape farther from zero. What I had discovered was that one can find polynomials arbitrarily close to any continuous function on a closed interval (for a large variety of meanings of close, in fact). But what I didn’t realize at the time was that this is very much not unique to polynomials — for example, it is also true for things of the form … (this follows easily from the Stone-Weierstrass Theorem).

A deeper and more satisfying answer lies in Taylor’s Theorem. If we know the derivatives of a function at a point k, the natural way to approximate it is:
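For reference, the standard degree-n Taylor approximation about the point k is:

```latex
f(x) \approx \sum_{j=0}^{n} \frac{f^{(j)}(k)}{j!}\,(x-k)^j
       = f(k) + f'(k)(x-k) + \frac{f''(k)}{2!}(x-k)^2 + \cdots + \frac{f^{(n)}(k)}{n!}(x-k)^n
```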

A polynomial! So polynomials are a natural result of the idea of a derivative. And while I have nothing more than anecdotal evidence for this, I’d suggest that, while many people may not know the formalities of calculus or the word derivative, everyone innately understands the idea. Certainly, young children understand ideas like speed. And the `rules of differentiation’ can be told to you by a lay person, if you ask in the right manner (sum rule: speed of a person walking on a ship, chain rule: speed of person in a movie when you fast-forward, and so on…).

But what is a derivative? One could just say that it is the rate at which something is changing and be done, but let’s go a little deeper. The definition of a derivative is usually:
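For reference, the usual limit definition:

```latex
\frac{dy}{dx} = \lim_{\Delta x \to 0} \frac{\Delta y}{\Delta x}
              = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}
```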

It’s the ratio between change in y and change in x. And so if the derivative is A, we write dy/dx = A, meaning that A is the ratio between dy (the change in y) and dx (the change in x). One may also write it in the `differential form,’ dy = Adx, meaning that the change in y is A times the change in x. This latter form leads to a natural way to describe the derivative: it, times the change in x, is the amount one needs to add to move forward.

But there’s another interesting question: how much does one need to multiply by to move forward? This question gives birth to what we now call multiplicative or geometric calculus (Wikipedia deleted the main article, sadly, but this one still has useful content).

In many circumstances, multiplicative calculus is highly natural; for example, the decay of radioactive materials and the unconstrained growth of a bacterial colony both have constant multiplicative derivatives.
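A small numerical sketch of this (the helper names are mine): the multiplicative derivative, taken here as the limit of (f(x+h)/f(x))^(1/h), is the same constant everywhere along an exponential decay curve, while the ordinary derivative keeps changing:

```python
def mult_derivative(f, x, h=1e-6):
    """Approximate the multiplicative derivative: lim (f(x+h)/f(x))**(1/h)."""
    return (f(x + h) / f(x)) ** (1.0 / h)

def add_derivative(f, x, h=1e-6):
    """Approximate the ordinary (additive) derivative, for comparison."""
    return (f(x + h) - f(x)) / h

# Radioactive decay with a half-life of 5 time units: N(t) = 100 * 2**(-t/5).
decay = lambda t: 100.0 * 2.0 ** (-t / 5.0)

# The multiplicative derivative is the constant 2**(-1/5) at every t ...
m0, m10 = mult_derivative(decay, 0.0), mult_derivative(decay, 10.0)
assert abs(m0 - 2.0 ** -0.2) < 1e-6 and abs(m10 - 2.0 ** -0.2) < 1e-6

# ... while the additive derivative at t=10 is a quarter of its value at t=0,
# because two half-lives have passed.
a0, a10 = add_derivative(decay, 0.0), add_derivative(decay, 10.0)
assert abs(a10 / a0 - 0.25) < 1e-3
```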

Just as ordinary derivatives naturally give rise to polynomials, multiplicative derivatives naturally give rise to things of the form … So you see, if we naturally thought of rates in terms of “how much do I need to multiply by to move forward” instead of “how much do I need to add to move forward”, things would be rather different. We live in a society with a very particular way of thinking; one might call us `polynomial centric’ or `linear approximation focused’.

Now, an argument can be made that multiplicative derivatives would make standard physics problems more challenging. While I think it is important to keep in mind that this is simply a tradeoff, and that there are other problems that are easier with the multiplicative approach, it is a valid point to consider. The easy solution would be for the aliens to measure distance and related values in exponential form, in which case the problems would become computationally isomorphic to our normal ones. Still, there’s a lot of anthropomorphism in the assumption that they’d have the same “standard problems” as us. We like to consider point sources and things moving at constant speeds because they are simple and intuitive cases, but what’s to say that those are the `simple’ cases they’d consider?

Regardless, there is clearly at least one serious contender as an alternative to polynomials that we humans can see. One of the things that really excites me about the possibility of Humanity contacting an alien species (even if they are at a similar level of technology to us and are too far away for us to ever physically interact) is that it seems very likely that they will have a profoundly, perhaps inconceivably, different perspective, and by extension different abstractions, from our own. I want to learn alien mathematics! And, if nothing else, I think that seeing another way of thinking will give us a lot of insight into our own. On a similar note, I think a lot of the value of thought experiments like this one is the insight they can give us into ourselves…

June 10, 2011 at 19:52

nah that’s a power series not a polynomial. The reason polynomials are so fundamental is because that’s what you get from adding and multiplying stuff together.

June 10, 2011 at 22:39

Sort of. If I approximate only to finitely many terms, it is a polynomial. The power series is the limit of a sequence of approximating polynomials.

Your explanation of why polynomials are so fundamental is also valid, but it leaves a number of questions. For example, it depends on what you start with in `adding and multiplying stuff together’ (what some would call a function algebra). In this case, it is the constant functions and the identity function.

June 11, 2011 at 15:19

“chain rule: speed of person in a movie when you fast-forward”

This is an extremely good explanation of the chain rule. I didn’t know it, and I’ve always found the chain rule to be slightly unintuitive, but not anymore.

After reading your post, I tried to justify to myself, why do polynomials seem more fundamental to me than uh… exponential sums, or whatever you call them. I couldn’t think of any good reason and even the argument you considered (and sorta rejected) – connection to Taylor series doesn’t sound very persuasive.

The best explanation of why polynomials are so important I could come up with is the one alex already gave: polynomials are easy to evaluate. To find the value of a polynomial at an arbitrary point, you only need to know how to add and to multiply, and these two operations are the easiest to carry out. At the same time, to evaluate an “exponential sum” you need to know how to raise numbers to arbitrary real powers, which seems more time-consuming to me. (Try comparing computing 2.71828^2 and 2^2.71828. I am pretty sure I could find the value of the first expression without the use of a computer, but I am not sure what to do with the second one.)
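The “only adding and multiplying” point can be made concrete with Horner’s rule, a standard evaluation scheme (not something from the comment itself) that computes a degree-n polynomial with just n multiplications and n additions:

```python
def horner(coeffs, x):
    """Evaluate a polynomial whose coefficients are listed from the highest
    degree down: horner([3, 0, -2, 5], x) computes 3x^3 - 2x + 5."""
    result = 0
    for c in coeffs:
        result = result * x + c  # one multiply and one add per coefficient
    return result

assert horner([3, 0, -2, 5], 2) == 25   # 3*8 - 4 + 5
assert horner([1, 0, 0], 7) == 49       # x^2 at x = 7
```

No finite scheme of this kind evaluates something like 2^x at an arbitrary real x, which is exactly the commenter’s point.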

Here’s my attempt at describing an alien civilization with a different math: imagine aliens who are extremely good at mental computations. Let’s say that they can mentally compute anything my computer can: multiply two thousand-digit numbers almost immediately, find 1,000,000! in a matter of minutes, etc.

These aliens would have no particular need for calculus. Instead, they would develop what we would call numerical methods.

They would have no concept of real numbers, instead, they would use rationals. They would think of functions either in terms of lists, or in terms of computational algorithms. Instead of derivatives they would have finite differences, instead of differential equations – huge systems of linear equations. Their geometry would be entirely “pixel-based”. Eventually, they would have to introduce reals though, but for them it would be a strange, paradoxical abstraction.
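A tiny sketch of that list-based, rationals-only world (the grid and function are my own example): the finite-difference operator plays the role of the derivative, and applying it twice to x² already yields a constant, just as the second derivative does:

```python
from fractions import Fraction

def differences(values):
    """One application of the forward-difference operator Δ."""
    return [b - a for a, b in zip(values, values[1:])]

# f(x) = x^2 "as a list": sampled on an evenly spaced grid of rationals.
grid = [Fraction(k, 2) for k in range(8)]   # 0, 1/2, 1, 3/2, ...
f = [x * x for x in grid]

d1 = differences(f)    # the discrete stand-in for the derivative
d2 = differences(d1)   # second differences of x^2 are constant, like f'' = 2

assert all(v == Fraction(1, 2) for v in d2)      # constant: 2 * step**2 = 1/2
assert all(isinstance(v, Fraction) for v in d1)  # everything stays rational
```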

June 12, 2011 at 04:37

>This is an extremely good explanation of the chain rule. I didn’t know it, and I’ve always found the chain rule to be slightly unintuitive, but not anymore.

Thanks! I was rather proud of myself when I came up with it. :) Though, I’m sure I’m not the first person to come up with it.

I’ve been spending a lot of my time going over basic math and thinking about what it really means. Basically, I think that one of the serious problems with modern mathematical education is that we become alienated from what the symbols represent… I’m planning to do a post on my results for single variable calculus soon.

>I couldn’t think of any good reason and even the argument you considered (and sorta rejected) – connection to Taylor series doesn’t sound very persuasive.

To be clear, I think that the Taylor series argument is a great reason for us Humans to use polynomials. They’re almost intrinsic to our way of thinking. Consider the familiar formula Δd = vᵢΔt + ½aΔt². It’s a polynomial describing a physical phenomenon, and it exists because we have the ideas of velocity and constant acceleration.

> (calculatory ease of Taylor Series)

I can see where you’re coming from. It’s a… dissatisfying answer (but this may reflect more on my desire for profound reasons than on reality), and I can’t help but wonder to what extent my difficulty in calculating floating point powers is because, unlike multiplying floating points, it wasn’t drilled into my head for years in elementary school.

> … They would have no concept of real numbers, instead, they would use rationals. They would think of functions either in terms of lists, or in terms of computational algorithms. Instead of derivatives they would have finite differences, instead of differential equations – huge systems of linear equations. Their geometry would be entirely “pixel-based”. Eventually, they would have to introduce reals though, but for them it would be a strange, paradoxical abstraction.

This is a very interesting idea. I’m rather sleepy at the moment and don’t think I can give an intelligent comment on it right now, but I’m going to sleep on it and get back to you.

Update:

I can imagine them developing the idea of real numbers very early on if they investigate geometry (because of π) but I can see them very much thinking of them as a limit of a sequence of rational numbers (as we typically construct them, but more explicit). They keep being more and more precise in their measurement of the area of a circle with radius one… And I agree that it would likely seem very strange, as it did for us.

Regarding calculus, wouldn’t they have to have an idea of an infinitesimally small difference that they were approximating? I completely agree that their approach to differential equations would focus on numerical methods, though.

With geometry, I can see them using pixels in any non-trivial scenario.

What’s really interesting about the species you’re describing, though, is that I can see Humanity becoming them over the course of the next few decades if we get computer implants that interact directly with our brains… Or even if computing just becomes so ubiquitous and so much a part of the way we live that we’re essentially always using a computer and it becomes part of our thought process. What will that do to mathematics, I wonder? Will we lose interest in our present techniques for finding precise solutions in favor of numerical methods?

June 12, 2011 at 16:26

“I’m planning to do a post on my results for single variable calculus soon. ”

Understanding what symbols and formal operations mean on a gut level is 90% of the difference between a good mathematician and a bad one. I am looking forward to your future posts.

What slightly bothers me about the “Taylor series argument”, as you named it, is that saying “polynomials are the most fundamental and important functions of all, because they are used in that one method of approximation” is a little bit like saying “the sine function is the most fundamental of all, because it is used in Fourier series” or even “the gamma function is the most important, because it is used in [obscure identity #45].” Taylor series are overwhelmingly important, but they are not *the* method of approximating something, and not even the most convenient. They require lots of smoothness, break down at the first sign of a singularity, real or imaginary, and then there are always such non-analytic monsters as exp(-1/x^2). Fourier series are in many ways more natural and more powerful.
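The monster really is monstrous: exp(-1/x²), extended by 0 at the origin, vanishes faster than every power of x, so all of its Taylor coefficients at 0 are zero and the Taylor series there is identically zero even though the function is not. A quick numerical check:

```python
import math

def monster(x):
    """exp(-1/x^2), extended continuously by 0 at the origin."""
    return math.exp(-1.0 / (x * x)) if x != 0 else 0.0

# Nonzero away from the origin ...
assert monster(1.0) == math.exp(-1.0)

# ... yet near 0 it is crushed below every power of x: f(x)/x**n -> 0 for
# every n, which forces every Taylor coefficient at 0 to vanish.
x = 0.05
for n in range(10):
    assert monster(x) / x ** n < 1e-100   # exp(-400) is astronomically small
```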

Besides, people started working with polynomials long before they came up with Taylor series.

“I can’t help but wonder to what extent my difficulty in calculating floating point powers is because, unlike multiplying floating points, it wasn’t drilled into my head for years in elementary school.”

I was under the impression that all the known algorithms for raising numbers to real powers are much more convoluted than the standard multiplication algorithm, but I am not sure. I thought you would know, since you work with computers and stuff.

“I can imagine them developing the idea of real numbers very early on if they investigate geometry (because of π) but I can see them very much thinking of them as a limit of a sequence of rational numbers”

Maybe my aliens wouldn’t feel the need to even think of any limits; they would only be interested in guaranteeing a certain high number of correct digits in the result of a computation. They’d say: “3.1415926, what more to ask for?” Yes, eventually they would be forced to introduce reals, if only to be able to prove the correctness of their difference schemes, but they would think of them as an abstract, theoretical construct: “It’s like a regular number, but we will pretend that the digits go on forever, and even though we don’t know what those digits are we will act like we do”.

“Will we lose interest in our present techniques for finding precise solutions in favor of numerical methods?”

You are aware that this is already happening, right? As computers become faster and faster, and numerical methods improve more and more, many people start to wonder if there is any point in trying to solve equations analytically when you can open MATLAB and find the numeric solution in a matter of seconds, especially in cases when you only need the precise solution in order to turn it into a column filled with numbers anyway. Mathematics has been human-oriented for 2000 years, and during the last 50 it has started to become machine-oriented. I can’t say I am not a little bit frightened.

June 24, 2011 at 12:53

> What slightly bothers me about the “Taylor series argument”, as you named it, is that saying “polynomials are the most fundamental and important functions of all, because they are used in that one method of approximation” is a little bit like saying “the sine function is the most fundamental of all, because it is used in Fourier series” or even “the gamma function is the most important, because it is used in [obscure identity #45].” Taylor series are overwhelmingly important, but they are not *the* method of approximating something, and not even the most convenient. They require lots of smoothness, break down at the first sign of a singularity, real or imaginary, and then there are always such non-analytic monsters as exp(-1/x^2). Fourier series are in many ways more natural and more powerful.

So, first of all, I should make sure that we both understand I’m not making an argument for polynomials being universal in some abstract Platonic way. I mean it in a very Human way.

You’re right that there isn’t really something mathematically superior about Taylor approximations, and that in many ways Fourier series are in fact preferable (tolerant of discontinuities on measure-zero sets, etc.). But they aren’t part of the way people think in the way the polynomials for Taylor series are. We naturally think in terms of derivatives, which are just a factorial away from being the coefficients of the Taylor series approximation polynomial.

If we naturally abstracted into sine Fourier series coefficients, I’d argue that they were deeply fundamental too.

(And speaking of Fourier series: while they aren’t natural in the same way Taylor polynomials are, they’re definitely more natural than Legendre polynomials. That has more to do with the Fourier Transform, though, and its connection to our hearing.)

> I was under the impression that all the known algorithms for raising numbers to real powers are much more convoluted than the standard multiplication algorithm, but I am not sure. I thought you would know, since you work with computers and stuff.

I don’t usually do anything that low-level and hadn’t really thought about it. The closest I got was when I was younger and spent weeks of summer vacation drawing a simple processor out of logic gates on paper. But I never implemented anything higher up the hyperoperation chain than multiplication, so I don’t know much about how one implements exponentiation.

Some research seems to suggest that you’re right, however.

>You are aware that this is already happening, right? As computers become faster and faster, and numerical methods improve more and more, many people start to wonder if there is any point in trying to solve equations analytically

Computers are definitely becoming more important, but I haven’t been doing math long enough to actually see trends. I was hoping that, as would seem natural, we were deemphasizing the importance of calculation and focusing instead on understanding.

>Mathematics has been human-oriented for 2000 years, and during the last 50 it has started to become machine-oriented. I can’t say I am not a little bit frightened

I’m actually not sure how to respond to this, even after spending a number of days reflecting on it. I’m not sure that what you say is happening is what is happening, and I’m not sure that what is happening is a bad thing.

Being able to calculate is essentially useless if you don’t understand what you’re calculating. (And understanding is deeply tied to knowing how to find exact solutions.) And, short of the singularity, computers aren’t going to be doing that for us. It just means that humans aren’t trying to be calculators anymore. And that doesn’t seem too bad.

But there’s definitely an extent to which culture misunderstands mathematics and doesn’t see anything beyond calculating. So it could end with society abandoning real mathematics as obsolete… I don’t know.

June 24, 2011 at 13:28

I thought about these things for a little while, and it occurred to me that there is indeed something very special about the three classes of the most frequently used functions: polynomials, exponential and trigonometric functions – they are each tied to the operation of differentiation in their own way. Polynomials form the kernel of (iterated) differentiation, and this is what makes Taylor series possible, while exponentials and combinations of sines are eigenfunctions of differentiation, which is why they are so important for differential equations and Fourier series. So, I think every alien civilization that uses our version of differentiation must also be very fond of these three classes of functions, but it may not be so if they use, for example, the “geometric calculus” you are such a big fan of, or some other modification like that.
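Both halves of that observation are easy to check concretely (a sketch with my own helper names): polynomials of degree at most n are annihilated by n+1 differentiations, and exp(kx) reproduces itself scaled by k:

```python
import math

def poly_diff(coeffs):
    """Differentiate a polynomial given as [a0, a1, a2, ...] meaning
    a0 + a1*x + a2*x**2 + ...; the result drops the degree by one."""
    return [k * c for k, c in enumerate(coeffs)][1:]

# Kernel: four derivatives of the degree-3 polynomial 5 - 2x + 3x^3 leave nothing.
p = [5, -2, 0, 3]
for _ in range(4):
    p = poly_diff(p)
assert p == []

# Eigenfunction: numerically, (d/dx) exp(k*x) = k * exp(k*x) at every x.
k, h = 2.0, 1e-6
for x in (0.0, 1.0, 3.0):
    slope = (math.exp(k * (x + h)) - math.exp(k * (x - h))) / (2 * h)
    assert abs(slope / math.exp(k * x) - k) < 1e-4
```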

“But they aren’t part of the way people think in the way the polynomials for Taylor series are. We naturally think in terms of derivatives, which are just a factorial away from being the coefficients of the Taylor series approximation polynomial.”

I agree with the basic idea, but if I wanted to play Devil’s advocate I’d say: “polynomials are also one factorial away from the coefficients of [obscure formula #56], and [obscure formula #56] uses gamma function, hence gamma function is the most important function of all! QED.”

The main reason I think that Fourier series (trigonometry-based or your-orthogonal-system-of-choice-based) are more natural than Taylor series is that they basically generalize the idea of coordinates to spaces of functions. Expanding a function as a linear combination of some other functions forming an orthogonal system is basically the same as writing a vector as a linear combination of the vectors forming a basis.
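The analogy can be made literal in finite dimensions (a toy example of mine): with an orthogonal basis, each “coefficient” is just an inner product divided by the basis vector’s squared length, which is exactly the shape of the Fourier coefficient formula:

```python
def dot(u, v):
    """Inner product of two vectors."""
    return sum(a * b for a, b in zip(u, v))

# An orthogonal basis of R^3 (pairwise inner products are zero).
basis = [(1, 1, 0), (1, -1, 0), (0, 0, 2)]
assert all(dot(basis[i], basis[j]) == 0
           for i in range(3) for j in range(3) if i != j)

v = (3, 5, 4)

# The "Fourier coefficients" of v: projections onto each basis vector.
coeffs = [dot(v, b) / dot(b, b) for b in basis]

# Summing coefficient * basis vector reconstructs v exactly, just as a
# Fourier series reconstructs a (nice enough) function from its coefficients.
recon = [sum(c * b[i] for c, b in zip(coeffs, basis)) for i in range(3)]
assert recon == [3.0, 5.0, 4.0]
```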

June 28, 2011 at 14:41

I think you’re just looking at it too much from a calculus perspective. Linear algebra, ring theory, field theory, Galois theory, even algebraic geometry: all involve polynomials. Polynomials are more immediate from the basic operations of multiplication and addition than what you are proposing. Whether there are polynomial-free fields worth studying that are being neglected by the idiosyncratic way in which humans view the world, who knows, though.

August 8, 2011 at 21:58

Your tale of a culture that is far more interested in the order-8 dihedral group than it is in the integers seems farfetched; I think most technological cultures could not help discovering the integers.

That having been said, it is uncanny that you picked D8: the Warlpiri people of central Australia do indeed have a far more advanced nomenclature concerning D8 than they do for Z. They have names for all the elements of D8 and can certainly compose them effortlessly in their heads, but they do not have a native word for 4. (They aren’t a very technological culture, though.) I think I’m going to let you have the pleasure of doing the research to find out why D8 is of interest to them.

August 8, 2011 at 22:17

That’s fascinating. I will indeed have to research that :)

May 3, 2012 at 23:11

Here is a quote from an article about a mathematical savant, Daniel Tammet:

“Daniel Tammet is able to see and feel numbers. In his mind’s eye, every digit from zero to 10,000 is pictured as a 3-dimensional shape with a unique color and texture. For example, he says, the number fifteen is white, yellow, lumpy and round.

Synesthesia occurs when regions of the brain associated with different abilities are able to form unusual connections. In most people’s brains, the recognition of colors, the ability to manipulate numbers, or language capacity all work differently in separate parts, and the information is generally kept divided to prevent information overload. But in synesthetes, the brain communicates between the regions.

Tammet doesn’t need a calculator to solve exponential math problems such as 27 to the 7th power — that’s 27 multiplied by itself seven times — he’ll come up with the answer, 10,460,353,203, in a few seconds.

Tammet visualizes numbers in their unique forms and then melds them together to create a new image for the solution. When asked to multiply 53 by 131, he explains the solution in shapes and textures: “Fifty-three, which is round, very round…and larger at the bottom. Then you’ve got another number 131, which is longer a little bit like an hourglass. And there’s a space that’s created in between. That shape is the solution. 6,943!”

Perhaps a better understanding of this phenomenon will shed some light on different ways of doing mathematics.