Valhalla Legends Forums Archive | Yoni's Math Forum | Fun Calculus Problem

Author | Message | Time
nslay
Prove the product rule of derivatives using the Taylor series.

I'll post the proof in a few days.

Note:  No Yonis allowed.  Any Yonis providing the proof will be attacked by a giant slug monster with 3 tentacles (this excludes giant slug monsters with more or fewer than 3 tentacles).
May 4, 2005, 7:42 AM
Rule
Let f and g both be functions of x,

d/dx (f*g) = gf'+fg'

f(x) = f(a) + f'(a)(x-a) + f''(a)(x-a)^2/2! + f'''(a)(x-a)^3/3! + f''''(a)(x-a)^4/4! + ...  (1)
g(x) = g(a) + g'(a)(x-a) + g''(a)(x-a)^2/2! + g'''(a)(x-a)^3/3! + ...  (2)

f*g = f(a)g(a) + f(a)g'(a)(x-a) + f(a)g''(a)(x-a)^2/2! + ... + f(a)g^(n)(a)(x-a)^n/n! + f'(a)g(a)(x-a) + f'(a)g'(a)(x-a)^2 + f'(a)g''(a)(x-a)^3/2! + ...
and continue multiplying the series out...
differentiate the product term by term to get (3).

Then differentiate f(x) in (1) and multiply by g(x) in (2), add to that f(x) from (1) multiplied by d/dx of g(x) from (2), and show that the result is equivalent to (3).

I started doing this but it seemed long-winded, so I stopped.  I'm pretty sure that will work, otherwise we've discovered a major problem with calculus :p.
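
For a quick machine check of that claim, here is a minimal sympy sketch; the symbols f0, f1, f2, g0, g1, g2 are stand-ins I'm introducing for f(a), f'(a), f''(a), g(a), g'(a), g''(a), and t stands for (x-a):

[code]
import sympy as sp

t = sp.symbols('t')                          # t stands for (x - a)
f0, f1, f2 = sp.symbols('f0 f1 f2')          # f(a), f'(a), f''(a)
g0, g1, g2 = sp.symbols('g0 g1 g2')          # g(a), g'(a), g''(a)

Tf = f0 + f1*t + f2*t**2/2                   # truncated series (1)
Tg = g0 + g1*t + g2*t**2/2                   # truncated series (2)

prod = sp.expand(Tf*Tg)
print(prod.coeff(t, 0))                      # f0*g0          -> (f*g)(a)
print(prod.coeff(t, 1))                      # f0*g1 + f1*g0  -> (f*g)'(a), the product rule
[/code]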

If you disagree, try proving this in a multivariable case. 

D(gf)(x) = gDf(x) + fDg(x)
<-- General product rule (I doubt this is in many multivariable calculus texts :P)


Hint: f(x) = f(a) + Df(a)h + (1/2!)*h^T * Hf(a) * h + ...

Where Hf(x) is the Hessian matrix and Df(x) is the derivative matrix; x and a are vectors, and h is the column vector of the components of the displacement x - a (i.e. x-a, y-b, ...).
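
For the multivariable statement, here is a sanity check in the same sympy sketch style, using an arbitrarily chosen pair of two-variable functions and a row matrix D as a stand-in for the derivative matrix:

[code]
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 * y                 # arbitrary example functions
g = sp.sin(x) + y**3

# Derivative matrix (row vector of partials) of a scalar function of (x, y).
D = lambda u: sp.Matrix([[sp.diff(u, v) for v in (x, y)]])

lhs = D(f*g)
rhs = g*D(f) + f*D(g)
print((lhs - rhs).applyfunc(sp.simplify))    # Matrix([[0, 0]])
[/code]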

Enjoy!

May 4, 2005, 4:26 PM
nslay
[quote author=Rule link=topic=11486.msg111040#msg111040 date=1115223969]
Let f and g both be functions of x,

d/dx (f*g) = gf'+fg'

f(x) = f(a) + f'(a)(x-a) + f''(a)(x-a)^2/2! + f'''(a)(x-a)^3/3! + f''''(a)(x-a)^4/4! + ...  (1)
g(x) = g(a) + g'(a)(x-a) + g''(a)(x-a)^2/2! + g'''(a)(x-a)^3/3! + ...  (2)

f*g = f(a)g(a) + f(a)g'(a)(x-a) + f(a)g''(a)(x-a)^2/2! + ... + f(a)g^(n)(a)(x-a)^n/n! + f'(a)g(a)(x-a) + f'(a)g'(a)(x-a)^2 + f'(a)g''(a)(x-a)^3/2! + ...
and continue multiplying the series out...
differentiate the product term by term to get (3).

Then differentiate f(x) in (1) and multiply by g(x) in (2), add to that f(x) from (1) multiplied by d/dx of g(x) from (2), and show that the result is equivalent to (3).

I started doing this but it seemed long-winded, so I stopped.  I'm pretty sure that will work, otherwise we've discovered a major problem with calculus :p.

If you disagree, try proving this in a multivariable case. 

D(gf)(x) = gDf(x) + fDg(x)
<-- General product rule (I doubt this is in many multivariable calculus texts :P)


Hint: f(x) = f(a) + Df(a)h + (1/2!)*h^T * Hf(a) * h + ...

Where Hf(x) is the Hessian matrix and Df(x) is the derivative matrix; x and a are vectors, and h is the column vector of the components of the displacement x - a (i.e. x-a, y-b, ...).

Enjoy!


[/quote]

uh huh ... now factor the (x-a) out of the terms that share it and you'll end up with f(x)g(x) = f(a)g(a) + (f'(a)g(a) + f(a)g'(a))(x-a) + ...
and as you can see,
for h(x) = f(x)g(x),
h(x) = h(a) + h'(a)(x-a) + h''(a)(x-a)^2/2! + ...
so h'(a) = f'(a)g(a) + f(a)g'(a), since the power series expansion of h about a is unique and the coefficients of (x-a) on both sides must match
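
As a concrete check of that coefficient matching, here is a sympy sketch with f = exp and g = sin chosen arbitrarily, expanding in t = x - a:

[code]
import sympy as sp

a, t = sp.symbols('a t')     # t stands for (x - a)

# Expand h(x) = exp(x)*sin(x) about x = a by substituting x = a + t.
series = sp.series(sp.exp(a + t)*sp.sin(a + t), t, 0, 2).removeO()

coeff = series.coeff(t, 1)                            # coefficient of (x - a)
expected = sp.exp(a)*sp.sin(a) + sp.exp(a)*sp.cos(a)  # f'(a)g(a) + f(a)g'(a)
print(sp.simplify(coeff - expected))                  # 0
[/code]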
May 4, 2005, 8:43 PM
