If the idea behind a normal derivative can be described as the amount one needs to add to move forward, then the idea behind a geometric derivative is the amount one needs to multiply by.

The definition of the geometric derivative is

$$f^*(x) = \lim_{h \to 0} \left(\frac{f(x+h)}{f(x)}\right)^{1/h} = e^{f'(x)/f(x)}$$
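As a sanity check, here's a minimal numerical sketch (the helper name `geometric_derivative` is mine, nothing standard) comparing the limit definition against the closed form $e^{f'(x)/f(x)}$ for $f(x) = e^{2x}$, whose geometric derivative is the constant $e^2$:

```python
import math

def geometric_derivative(f, x, h=1e-6):
    """Approximate f*(x) = lim_{h->0} (f(x+h)/f(x))**(1/h) with a small h."""
    return (f(x + h) / f(x)) ** (1.0 / h)

# f(x) = e^{2x}: f'(x)/f(x) = 2 everywhere, so f*(x) should be e^2 at any x.
approx = geometric_derivative(lambda x: math.exp(2 * x), 0.5)
print(approx, math.e ** 2)
```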

One might wonder whether we can use geometric derivatives to approximate functions, the way ordinary derivatives let us approximate functions with Taylor series. A quick Google search didn't reveal anything, but it isn't too hard to figure out.

Whereas Taylor series are a sum of terms $\frac{a}{n!}x^n$ (which have the constant $a$ as their $n$th derivative), our approximation should be a product of terms with the analogous property for geometric derivatives (i.e. the $n$th geometric derivative is a constant).

Since $f^*(x) = e^{\frac{d}{dx}\ln f(x)}$, the geometric derivative of $c^{x^n/n!}$ is $e^{\ln c \cdot x^{n-1}/(n-1)!} = c^{x^{n-1}/(n-1)!}$. By induction, the $n$th geometric derivative of $c^{x^n/n!}$ is the constant $c$.
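We can check one step of this induction numerically (a sketch with my own helper names; the choices $c = 2$, $n = 3$, $x_0 = 0.7$ are arbitrary) by confirming that one geometric differentiation of $c^{x^n/n!}$ gives $c^{x^{n-1}/(n-1)!}$:

```python
import math

def geom_deriv(f, x, h=1e-6):
    # Limit definition of the geometric derivative, with a small finite h.
    return (f(x + h) / f(x)) ** (1.0 / h)

c, n = 2.0, 3
f = lambda t: c ** (t**n / math.factorial(n))            # candidate nth term
g = lambda t: c ** (t**(n - 1) / math.factorial(n - 1))  # expected geometric derivative
x0 = 0.7
print(geom_deriv(f, x0), g(x0))  # the two values should agree closely
```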

So, our approximation about $0$ would be $f(x) \approx \prod_{n=0}^{\infty} c_n^{x^n/n!}$, where $c_n$ is the $n$th geometric derivative of $f$ at $0$. The approximation about a point $y$ will be $f(x) \approx \prod_{n=0}^{\infty} c_n^{(x-y)^n/n!}$, with $c_n$ the $n$th geometric derivative at $y$.
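As a rough illustration, here is the product truncated at $n = 4$ for $\sin$ about $y = 1$. The derivatives of $\ln \sin$ are worked out by hand, and `geometric_taylor` is my own name for the sketch, not anything standard:

```python
import math

# Derivatives of g(x) = ln(sin x) at y = 1, by hand:
# g' = cot, g'' = -csc^2, g''' = 2 csc^2 cot, g'''' = -2 csc^2 (2 cot^2 + csc^2)
s, co = math.sin(1.0), math.cos(1.0)
cot, csc2 = co / s, 1 / s**2
g_derivs = [math.log(s), cot, -csc2, 2 * csc2 * cot, -2 * csc2 * (2 * cot**2 + csc2)]

# The nth geometric derivative at 1 is c_n = exp(g^{(n)}(1)).
c_n = [math.exp(d) for d in g_derivs]

def geometric_taylor(x, y=1.0):
    """Truncated product approximation: prod c_n ** ((x - y)**n / n!)."""
    out = 1.0
    for n, cn in enumerate(c_n):
        out *= cn ** ((x - y) ** n / math.factorial(n))
    return out

x = 1.3
print(geometric_taylor(x), math.sin(x))  # should agree to a few decimal places
```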

As a test, let's try approximating $\sin$ about $1$:

Approximation of $\sin$ about $1$ by Taylor series and geometric derivatives -- blue is sine, red is Taylor, yellow is geometric.

Not bad!

Update: In my discussion with George G. below, I realised that we can understand the geometric Taylor series as the exponentiated Taylor series of the logarithm of the original function…
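Spelled out: if $c_n = e^{(\ln f)^{(n)}(y)}$ denotes the $n$th geometric derivative of $f$ at $y$, then

$$\prod_{n=0}^{\infty} c_n^{\frac{(x-y)^n}{n!}} = \exp\left(\sum_{n=0}^{\infty} (\ln f)^{(n)}(y)\,\frac{(x-y)^n}{n!}\right),$$

i.e. the geometric Taylor series is just the exponential of the ordinary Taylor series of $\ln f$ about $y$.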


Tags: geometric derivatives, math

This entry was posted on December 9, 2010 at 02:13 and is filed under Uncategorized.

December 9, 2010 at 16:40

Interesting. I’ve never heard about these derivatives before. I knew about logarithmic derivatives, but this is new to me.

Naturally, I am suspicious. The biggest problem with theories and generalizations like these is that they often turn out to be cute, but worthless, and can't be used to derive anything that you can't already derive with standard calculus. What makes me especially suspicious is how easily the new derivative can be expressed using the old one. (I also suspect these derivatives to be the work of Satan, as all the new things invariably are.)

On the other hand, it seems to me that your Taylor-esque product can probably be used to obtain some neat identities with infinite products, if nothing else, in the same way that Fourier series can be used to evaluate various non-trivial infinite sums.

It would also be very cool if your expansion turned out to have some good convergence properties – then it probably could be used when the regular Taylor sequence diverges or converges too slowly, which is pretty often.

Also, see this paper:

http://math.pugetsound.edu/~mspivey/ProdInt.pdf

December 9, 2010 at 18:33

> Naturally, I am suspicious. The biggest problem with theories and generalizations like these is that they often turn out to be cute, but worthless, and can't be used to derive anything that you can't already derive with standard calculus. What makes me especially suspicious is how easily the new derivative can be expressed using the old one. (I also suspect these derivatives to be the work of Satan, as all the new things invariably are.)

I should mention that one of my projects for the holidays is to research fractional calculus, the fractional Fourier transform, the fractional Laplace transform, etc. If you’re suspicious of this… 🙂

Also, in the defence of geometric derivatives: I think that one of the very important roles of mathematics is providing useful abstractions to think about things with. And just as the rate at which something is increasing in terms of how much one needs to add is a great abstraction for, say, kinematics, the rate at which something is increasing in terms of how much one needs to multiply is useful in situations like the growth of a bacterial colony. In fact, a lot of the ideas science has about exponential growth, doubling time, half-life and so on connect neatly with geometric derivatives.

> On the other hand, it seems to me that your Taylor-esque product can probably be used to obtain some neat identities with infinite products, if nothing else, in the same way that Fourier series can be used to evaluate various non-trivial infinite sums.

Hm… It might be a nice way to look at infinite products, but I'm not sure it would ultimately be that useful, since one could just write $f = e^{\ln f}$, apply Taylor's theorem to $\ln f$, and get similar results; no need to consider geometric derivatives. Actually, it occurs to me that that is exactly what my Taylor series thing is. This is somewhat disappointing. I'll update the post and add it though.

> It would also be very cool if your expansion turned out to have some good convergence properties – then it probably could be used when the regular Taylor sequence diverges or converges too slowly, which is pretty often.

In all the fair tests that I've done (i.e. not functions like $e^x$, for which the geometric expansion is trivially exact), Taylor's theorem seems to have geometric Taylor's theorem beat, both in terms of rate of convergence and radius. And powers are a lot less computationally expensive than exponentials. In fact, I'm pretty sure that the exponentials are implemented in terms of a Taylor series on most computers. 😦 On the other hand, I've only tested a couple functions, so who knows what more might turn up.
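For what it's worth, a small comparison along these lines can be sketched as follows (degree 4, $\sin$ expanded about $1$ and evaluated at $1.5$; all names and parameter choices are mine):

```python
import math

y, x, N = 1.0, 1.5, 4

# Ordinary Taylor polynomial of sin about y (derivatives cycle sin, cos, -sin, -cos).
derivs = [math.sin, math.cos, lambda t: -math.sin(t), lambda t: -math.cos(t)]
taylor = sum(derivs[n % 4](y) * (x - y)**n / math.factorial(n) for n in range(N + 1))

# Geometric Taylor product: c_n = exp(g^{(n)}(y)) with g = ln sin, derivatives by hand:
# g' = cot, g'' = -csc^2, g''' = 2 csc^2 cot, g'''' = -2 csc^2 (2 cot^2 + csc^2)
s, co = math.sin(y), math.cos(y)
cot, csc2 = co / s, 1 / s**2
g = [math.log(s), cot, -csc2, 2 * csc2 * cot, -2 * csc2 * (2 * cot**2 + csc2)]
geom = 1.0
for n in range(N + 1):
    geom *= math.exp(g[n]) ** ((x - y)**n / math.factorial(n))

true = math.sin(x)
print(abs(taylor - true), abs(geom - true))  # absolute errors of each expansion
```

On this particular function and point the ordinary Taylor polynomial comes out ahead, consistent with the observation above; of course one test point proves nothing general.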

>Also, see this paper: http://math.pugetsound.edu/~mspivey/ProdInt.pdf

Awesome! Thanks.

I figured someone else had already done this. It really was too trivial for it to be otherwise… Still, it was fun to figure out.

December 9, 2010 at 19:08

> I should mention that one of my projects for the holidays is to research fractional calculus, the fractional Fourier transform, the fractional Laplace transform, etc. If you're suspicious of this…

You bet I am. Actually, when it comes to Fourier and Laplace transforms, I am completely incompetent: I only have a vague, general idea of what those things are, but I’ve never used them so far, which is, of course, a shame. But fractional stuff… Is it actually useful or a generalization for the sake of generalization? (That’s an actual, honest question, not a rhetorical one.)

> Actually, it occurs to me that that is exactly what my Taylor series thing is. This is somewhat disappointing.

It's not very fun when something you've been working on suddenly trivializes, but at least it means that you have reached a new level of clarity and understanding of the subject.

December 9, 2010 at 22:45

>You bet I am. Actually, when it comes to Fourier and Laplace transforms, I am completely incompetent: I only have a vague, general idea of what those things are, but I’ve never used them so far, which is, of course, a shame. But fractional stuff… Is it actually useful or a generalization for the sake of generalization? (That’s an actual, honest question, not a rhetorical one.)

Fractional calculus is used in modelling, and the fractional Fourier transform is used in signal processing (according to some papers I've seen). I don't know to what extent they're used, or even really understand them at this point.

Everything I can find on them right now is at a very high level. I'm going to try to understand them and simplify the material so that one doesn't have to dedicate days (weeks?) to understanding it.