From: "David C. Ullrich"
Subject: Re: extremum points of real valued functions
Date: 25 Jun 2000 00:07:44 GMT
Newsgroups: sci.math

artur.steiner@usa.net wrote in article <8j36i4$b76$1@nnrp1.deja.com>...
> In a very good book on Real Analysis, by Robert Bartle, there's a proof
> of that famous theorem which says that, if f is a function from R to R
> and a is an element of its domain such that the first n-1 derivatives
> are zero and the derivative of order n is different from zero, then:
>
> If n is even and the nth derivative is >0, then f has a relative
> minimum at a;
> If n is even and the nth derivative is <0, then f has a relative
> maximum at a;
> If n is odd, then f has neither a minimum nor a maximum at a.
>
> The proof is based on Taylor's Theorem, and the author assumes the nth
> derivative of f exists and is continuous on a neighborhood of a. I'm
> not quite sure, but I think such assumptions are too strong; I don't
> think the existence of such a neighborhood is necessary. Isn't the mere
> existence of the nth derivative of f at a a sufficient condition for
> the theorem to be valid?

Yes. Actually you can give a version of Taylor's theorem that's true
under weaker hypotheses than are often stated. (I recall Chung saying
this and attributing it to Hardy(?) - I never bothered to check which
standard references include it. I really should; I could put up a
webpage on BAD calculus books...)

If f is just differentiable at 0 then

1. f(x) = a + bx + o(x),

where a = f(0), b = f'(0), and "o(phi(x))" denotes "some function g(x)
such that g(x)/phi(x) -> 0 as x -> wherever."

Equation (1) is an obvious consequence of Taylor's theorem as often
stated for continuously differentiable functions, but in fact it's valid
assuming only that f is differentiable at the origin (it's just the
definition of differentiability, rearranged).

The same thing works for n > 1, by induction: say
f(0) = f'(0) = f''(0) = 0.
Then f' and its derivative both vanish at 0, so applying the above with
f' in place of f shows that f'(x) = o(x). By the Mean Value Theorem this
immediately implies that

2. f(x) = o(x^2).

QED. (For x near 0 the MVT gives a point c between 0 and x with
f(x) = f(x) - f(0) = f'(c) x; since |c| <= |x| and f'(c) = o(c), we have
f'(c) = o(x), hence f(x) = o(x^2).)

(The same argument shows by induction that if f(0) = ... = f^(n)(0) = 0
then f(x) = o(x^n). This shows that in general if f^(n)(0) exists then
f has a Taylor expansion with error term o(x^n).)

In fact we didn't even need to assume that f' was continuous to get (2)
(which I thought I was going to be assuming when I started typing this,
actually). We assume literally nothing but the existence of f''(0)
(which of course implies that f' is defined at every point in some
neighborhood of 0, which is exactly enough to be able to apply the MVT).

Now about the extrema: if f^(n)(0) exists and is non-zero but all the
lower-order derivatives vanish, then f(x) = a x^n + o(x^n) with
a = f^(n)(0)/n! non-zero; hence if n is odd [etc].
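Both claims are easy to sanity-check numerically. The sketch below is my
own illustration, not from the original exchange: the function
x^2 sin(1/x) and the helper classify() are examples I'm supplying. Part
1 checks the o(x) remainder in (1) for a function whose derivative
exists at 0 but is not continuous there; part 2 classifies the origin
for x^4, -x^4 and x^3 by comparing nearby values.

```python
import math

# Part 1: f(x) = x^2 sin(1/x), with f(0) = 0. f is differentiable at 0
# with f'(0) = 0, but f' is NOT continuous at 0 -- yet (1) still holds:
# the remainder quotient (f(x) - f(0) - f'(0) x)/x tends to 0,
# since it equals x sin(1/x), which is bounded by |x|.
def f(x):
    return 0.0 if x == 0 else x * x * math.sin(1.0 / x)

for x in [0.1, 0.01, 0.001, 0.0001]:
    q = (f(x) - f(0) - 0.0 * x) / x   # here a = f(0) = 0, b = f'(0) = 0
    print(f"x = {x:<8g} remainder/x = {q: .2e}")

# Part 2: classify the origin for g by comparing g(0) with g(+/-h),
# shrinking h a couple of times to guard against an unlucky sample.
def classify(g, hs=(1e-1, 1e-2, 1e-3)):
    for h in hs:
        lo, mid, hi = g(-h), g(0.0), g(h)
        if mid < lo and mid < hi:
            return "minimum"
        if mid > lo and mid > hi:
            return "maximum"
    return "neither"

print(classify(lambda x: x ** 4))   # n = 4 even, f^(4)(0) > 0: minimum
print(classify(lambda x: -x ** 4))  # n = 4 even, f^(4)(0) < 0: maximum
print(classify(lambda x: x ** 3))   # n = 3 odd: neither
```

Of course this is only a numerical illustration; the MVT argument above
is what actually establishes the result.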