From: bruck@math.usc.edu (Ronald Bruck)
Newsgroups: sci.math
Subject: Re: Axioma or evidence?
Date: Tue, 17 Nov 1998 09:43:40 -0800

In article <36519359.E375B201@student.utwente.nl>, Wilbert Dijkhof wrote:

:Ronald Bruck wrote:
:
:Couldn't you have posted this a couple of months ago?
:
:> It can be done somewhat the way calculus books develop exp and log;
:> developing the arcsin first by DEFINING
:>
:>    arcsin(x) = \int_0^x 1/sqrt(1-t^2) dt
:>
:> for -1 < x < 1 (and extending to -1 and 1 by continuity); then defining
:> sin to be the inverse function of this.  That only gives you sin from
:> -pi/2 to pi/2, but you can build the whole function up from that.  In
:> this development, you show that y = sin x is a solution of y'' = -y,
:> and that all solutions of y'' = -y are linear combinations of sin and
:> D sin.  The addition formulas follow from this fact.
:
:A couple of months ago Frank Wappler and I tried to define sin/arcsin
:much as you do above.  We didn't succeed because we couldn't give a
:'good' proof of the addition formula (not to confuse things, we are
:talking about sin(x+y) = sin(x)cos(y) + sin(y)cos(x), right?).

Sorry.  Actually, I HAVE posted this before, at least a year ago, maybe
two.  Many years ago (late sixties) I taught this in a calculus course
at the University of Chicago, to a VERY good honors class.

The idea is to begin with the function defined by

   as(x) = \int_0^x 1/sqrt(1-t^2) dt,

defined at least for -1 < x < 1.  It's then extended to [-1,1] by taking
the limit (the improper integral is easily seen to converge).  We have
as'(x) = 1/sqrt(1-x^2) for -1 < x < 1, thus as is strictly increasing.
Its inverse function therefore exists and is C^1 (a fact which we don't
seem to prove anymore in elementary calculus).  Let's denote as^{-1} by
Sin.  Then implicitly differentiating Sin(as(x)) = x we get

   Sin'(as(x)) as'(x) = 1,

which means Sin'(as(x)) = sqrt(1-x^2).  Replacing x by Sin(t), therefore

   Sin'(t) = sqrt(1 - Sin(t)^2).
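The construction above -- define as(x) by the integral and take Sin to be
as^{-1} -- is concrete enough to check numerically.  A minimal sketch: the
midpoint rule, the step count n, and the bisection inverter are my own
illustrative choices, not part of the post's argument.

```python
import math

def as_(x, n=5_000):
    # as(x) = \int_0^x dt / sqrt(1 - t^2), midpoint rule on n subintervals
    h = x / n
    return h * sum(1.0 / math.sqrt(1.0 - ((k + 0.5) * h) ** 2)
                   for k in range(n))

def Sin(y):
    # Sin = as^{-1}: solve as(x) = y for x by bisection on (-1, 1)
    lo, hi = -0.999999, 0.999999
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if as_(mid) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# check Sin'(t) = sqrt(1 - Sin(t)^2) with a central difference at t = 0.7
t, h = 0.7, 1e-4
deriv = (Sin(t + h) - Sin(t - h)) / (2 * h)
```

The central difference agrees with sqrt(1 - Sin(t)^2) to several decimal
places, which is the identity the implicit differentiation delivers.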
Differentiating again, we see that Sin'' = -Sin.  And, of course, Sin'
also satisfies this DE.  It's clear that Sin(0) = 0, Sin'(0) = 1.

Now prove a uniqueness theorem: if y is twice differentiable on an
interval (-a,a) and satisfies y'' = -y, y(0) = 0, y'(0) = 0, then y = 0.
Proof: show that y'^2 + y^2 is constant by differentiating it and
discovering a zero derivative.

So far this is all standard, and very much in the spirit of the
currently-fashionable log-exp development.  The little bit I added was
how to extend Sin to the whole real line.  You see, we have Sin (and
Cos, too, when it's defined to be D Sin), but only on an interval
symmetric about zero, not the whole real line.  (Of course, the interval
is [-pi/2,pi/2], but we don't know anything about pi yet.)

I did the extension using the double-angle formula.  Put

   f(x) = 2 Sin(x/2) Cos(x/2).

I leave it to you to verify that f'' = -f, and that f is defined on an
interval TWICE as long as where Sin was; and f(0) = 0, f'(0) = 1.  Thus
by the uniqueness result, f is an extension of Sin.  Then extend f to an
interval twice as long; then extend that; etc.  You get a GLOBALLY
DEFINED sin : R --> R which satisfies sin'' = -sin, sin(0) = 0,
sin'(0) = 1.

The addition formulas follow from the uniqueness theorem.  Fix x, and
define

   g(t) = sin t cos x + cos t sin x - sin(t+x).

Show that g'' = -g, g(0) = 0, and g'(0) = 0, and you're done.

I would NEVER try this in a "modern" calculus course.

--Ron Bruck

==============================================================================
From: bruck@math.usc.edu (Ronald Bruck)
Newsgroups: sci.math
Subject: Re: Axioma or evidence?
Date: 17 Nov 1998 16:35:40 -0800

:I've just noticed Ron Bruck's interesting follow-up.
:AFAIU, he's implying and _using_
:
:   g'( 0 ) = cos( x ) - sin'( x ) = 0.
:
:I believe that I won't find this obvious even after contemplating
:it a little longer, and I'm going to ask him about it then ...
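The doubling extension in the first post above can be run numerically.
In this sketch, a few Taylor terms on |x| <= 0.5 stand in for the
inverse-integral Sin on its base interval (an assumption of the sketch,
not the post's construction); the extension to all of R comes from
repeated halving together with the double-angle pair sin(2u) = 2 sin u
cos u and, differentiating f = 2 Sin(x/2) Cos(x/2), cos(2u) = cos^2 u -
sin^2 u.

```python
import math

A = 0.5  # base half-interval; stands in for [-pi/2, pi/2]

def base(x):
    # stand-in for (Sin, Cos) on |x| <= A: truncated Taylor sums,
    # accurate to roughly 1e-10 there (an assumption of this sketch)
    s = x - x**3/6 + x**5/120 - x**7/5040 + x**9/362880
    c = 1 - x**2/2 + x**4/24 - x**6/720 + x**8/40320
    return s, c

def sin_cos(x):
    # the doubling extension: halve the argument until it lands in the
    # base interval, then rebuild with the double-angle formulas
    if abs(x) <= A:
        return base(x)
    s, c = sin_cos(x / 2)
    return 2 * s * c, c * c - s * s
```

Each halving doubles the interval on which the pair is defined, exactly
as in the post; the uniqueness theorem is what guarantees the successive
extensions agree where they overlap.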
Yes, I introduced the Sin function on the interval [-pi/2,pi/2] (w/o
specifying the value of pi), but DEFINED Cos to be Sin'.  It's direct
that Sin' = Cos and Cos' = -Sin from this definition.  Perhaps I forgot
to specify this definition; I don't have the post immediately at hand.

By extending with f(x) = 2 Sin(x/2) Cos(x/2) I get Sin extended to a
function Sin_1 on the interval [-pi,pi]; and by setting Cos_1 to be the
derivative of THAT, I get an extension of Cos.  And then do it again,
and again, and again...

Actually, there's no reason to ever use cos.  Just write sin' in all
your formulas, forget all tangents, secants, cosecants, etc., and you
have all of trigonometry.  Things do get a little messy...

--Ron Bruck

The World's Most Famous Mathematics Books.  And the winners are:

  "Popular" category:   "Once is Not Enough" (do it again, and again,
                        and again...)
  "Classical" category: "Les Measurables"

==============================================================================
From: bruck@math.usc.edu (Ronald Bruck)
Newsgroups: sci.math
Subject: Re: Axioma or evidence?
Date: 19 Nov 1998 09:18:20 -0800

In article <72th38$7ra@maenad.csc.albany.edu>, Frank Wappler wrote:
:
:Ron Bruck wrote:
:
:> I introduced the Sin function on the interval [-pi/2,pi/2]
:> (w/o specifying the value of pi)
:
:... isn't this implicit in calling
:
:   Int_[ dt: 0 ... p_x = 1 ]_( 1 / sqrt( 1 - t^2 ) ) == Pi/2 ?  ...
:
:> but DEFINED Cos to be Sin'.
:> It's direct that Sin' = Cos and Cos' = -Sin from this definition.
:
:And earlier:
:
:. Now prove a uniqueness theorem:
:. if y is differentiable on an interval (-a,a), and satisfies y'' = -y,
:. y(0) = 0, y'(0) = 0, then y = 0.
:. Proof: show that y'^2 + y^2 is constant by differentiating it and
:. discovering a zero derivative.
[...]
:
:. The addition formulas follow from the uniqueness theorem.
:. Fix x, and define
:
:.    g(t) = sin t cos x + cos t sin x - sin(t+x).
:
:. Show that g'' = -g, g(0) = 0, and g'(0) = 0 and you're done.
:
:O.k., this seems all very sensible and sound within the _open_
:interval (-pi/2, pi/2).
:
:But in order to determine Sin' _at_ +/- Pi/2, don't you have to
:take a limit "starting a little outside" that interval?
:Don't you therefore have to make _some_ assumption about the
:periodicity of Sin( x ), even if not, say,
:Sin( Pi - x ) = Sin( x ) right away?

========================================
No, that can be avoided; see the immediately-following paragraph:
========================================

:
:. I did the extension using the double-angle formulas.  Put
:
:.    f(x) = 2 Sin(x/2) Cos(x/2).
:
:. I leave it to you to verify that f'' = -f, and f is defined on an
:. interval TWICE as long as where Sin was; and f(0) = 0, f'(0) = 1.
:. Thus by the uniqueness result, f is an extension of Sin.
:. Then extend f to an interval twice as long; then extend that; etc.
:. You get a GLOBALLY DEFINED sin : R --> R which satisfies
:. sin'' = -sin, sin(0) = 0, sin'(0) = 1.
:
:Then I suppose that you should be able to prove (somehow more
:directly, please :) that the power expansion of the thus defined
:sin( x ), namely
:
:   Sum_[ k = 0, 1, 2, ... ]_( (-1)^k x^(2k+1) / (2k+1)! ),
:
:has at least one zero other than x = 0.  Can you do that?

==================================================
Sure, this is fairly standard; see the addition at the end of the post.
==================================================

:
:. I would NEVER try this in a "modern" calculus course.
:
:Then what _do_ they do in a "modern" calculus course!?

==================================================
Well, "modern" in the sense of "what's being taught now", not in the
sense of "new and better".  Almost all US textbooks use the geometric
approach.  You get sin x < x in the first quadrant by comparing arcs,
and x < tan x by comparing areas.
==================================================

Someone actually READ my post!  You're right, I sloughed off the
question of what pi is.
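Frank's question -- does the power series have a zero other than x = 0?
-- is easy to check numerically: the partial sums are positive at x = 3
and negative at x = 4, so by the intermediate value theorem the sum
vanishes in between, and bisection lands on pi.  A sketch (the term
count and the bracketing interval [3, 4] are my own choices):

```python
import math

def sin_series(x, terms=30):
    # partial sum of  sum_{k>=0} (-1)^k x^(2k+1) / (2k+1)!
    total, term = 0.0, x
    for k in range(terms):
        total += term
        term *= -x * x / ((2 * k + 2) * (2 * k + 3))
    return total

# sign change on [3, 4]: sin_series(3) > 0 > sin_series(4), so the
# series has a root in between; bisect to locate it
lo, hi = 3.0, 4.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if sin_series(mid) > 0.0:
        lo = mid
    else:
        hi = mid
root = 0.5 * (lo + hi)
```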
The post was already long, and this particular analysis is pretty
standard, so I ducked the question.

By using the open interval only, and NOT WORRYING about the improper
integral at all, we get a function Sin defined on an open interval
(-a,a) which satisfies y'' = -y, y(0) = 0, y'(0) = 1.  You extend that
to the interval (-2a,2a) by the double-angle formula,

   Sin1(x) := 2 Sin(x/2) Sin'(x/2),

and verify that also Sin1'' = -Sin1, Sin1(0) = 0, Sin1'(0) = 1.  By
uniqueness you get Sin1 = Sin on (-a,a).  Fine.  By continuing this
process we get a function sin defined on ALL OF R satisfying these
conditions.  We define cos = sin'.

Now what the heck is pi?  The usual definition--I'm not sure, but I
think this may be in "Baby Rudin"--is to set pi to be the smallest
POSITIVE root of sin x = 0.  Why must such a root exist?  Well, since
sin'(0) = 1 and sin(0) = 0, sin must be positive in some interval
(0,epsilon); and the set of roots in [0,infinity) is closed; thus the
set of roots there MINUS 0 is still closed (0 is isolated in the root
set, since sin > 0 on (0,epsilon)); hence there exists a closest point
in THAT set to 0.  Provided the set of positive roots is nonempty, of
course!

But if there are no positive roots of sin x = 0, this means sin > 0 on
(0,+infinity) (intermediate value theorem).  THAT means cos is
decreasing.  Yet if cos(x) were ever zero, we should have sin(2x) =
2 sin(x) cos(x) = 0, violating the assumption that sin has no positive
roots.  Thus cos(x) > 0 for all x > 0.  But that means sin(x) is
increasing.  Hence there are epsilon > 0 and R0 > 0 such that
sin(x) > R0 for all x > epsilon.  Since the derivative of -cos is sin,
this means

   -cos x > R0 (x - epsilon) - cos(epsilon),

which is mighty unfortunate, because the RHS can grow to be > 1.  Yet
sin^2 + cos^2 = 1 (this follows by differentiation), so always
|cos x| <= 1.  In other words, the usual general nonsense about
comparing solutions of ODEs.

Once you've got sin(pi) = 0 you easily get 2pi-periodicity,
sin(pi/2) = 1, etc.

But don't get me wrong.
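The "smallest positive root" definition of pi above can be realized
numerically: march the solution of y'' = -y, y(0) = 0, y'(0) = 1 forward
until y first changes sign, then interpolate inside the bracketing step.
A sketch; classical RK4 with a fixed step h is my own illustrative
choice, standing in for the exact ODE-defined sin.

```python
import math

def pi_from_ode(h=1e-3):
    # integrate y'' = -y as the first-order system (y, v)' = (v, -y),
    # starting from y(0) = 0, v(0) = 1, by classical RK4
    def step(y, v):
        k1y, k1v = v, -y
        k2y, k2v = v + 0.5 * h * k1v, -(y + 0.5 * h * k1y)
        k3y, k3v = v + 0.5 * h * k2v, -(y + 0.5 * h * k2y)
        k4y, k4v = v + h * k3v, -(y + h * k3y)
        return (y + h * (k1y + 2 * k2y + 2 * k3y + k4y) / 6,
                v + h * (k1v + 2 * k2v + 2 * k3v + k4v) / 6)

    x, y, v = 0.0, 0.0, 1.0
    while True:
        y2, v2 = step(y, v)
        if y > 0.0 and y2 <= 0.0:
            # first sign change after leaving 0: the smallest positive
            # root lies in [x, x+h]; locate it by linear interpolation
            return x + h * y / (y - y2)
        x, y, v = x + h, y2, v2
```

The quantity y'^2 + y^2 that drives the uniqueness proof is conserved by
the exact flow (and very nearly by the integrator), which is why the
march is stable all the way out to the first root.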
The EASIEST way to define sine is by power series.  (You still have to
construct pi.)  The easiest way to get the series is by using Taylor's
theorem, but you can also get there by repeatedly integrating the
inequalities -1 <= sin x <= 1 on [0,x] for x > 0.  The power-series
definition isn't usually done in the US, because the tools aren't
available that early.

I once taught a course at the U of Chicago--I think they put me only on
honors courses there, to shelter the REGULAR students ;-) --from
Ostrowski, in what I've always interpreted as the older European style.
That is, he introduces series VERY early--first chapter?--defines
continuity in terms of convergence of sequences, etc.  Students were
using series from the get-go.

What I have always found disappointing is that most of my students never
seem to get the big picture; for example, the problem recently posted of
finding the limit of

    1 - cos(x^2)
    ------------
        ???            (I've forgotten; something like 1/2 x^4 + O(x^6))

is TRIVIAL when you use power-series expansions, or even Taylor
polynomials with big-Oh truncation; I would hope that every engineer and
mathematician would know not to apply L'Hopital's rule four times.
Students just don't seem to think in these terms.  Is that my failure?
Yes.  But it's also the failure of the system, for putting this USEFUL
stuff near the end of a two-semester course.

--Ron Bruck

Perhaps, incorporating another thread, we ought to DEFINE sine as being
the unique C^infinity function f on R such that f'(0) = 1 and all
derivatives satisfy |f^(p)(x)| <= 1 (p >= 0, x in R).  ;-)
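On the limit example: the exact denominator is lost above (the "???"
stays lost), but for a limit of the same flavor -- say
(1 - cos(x^2)) / x^4 as x -> 0, a denominator chosen here purely for
illustration -- the series viewpoint makes the answer transparent:
1 - cos u = u^2/2 - u^4/24 + ..., so with u = x^2 the ratio is
1/2 - x^4/24 + ... and the limit is 1/2, with no L'Hopital at all.

```python
def one_minus_cos(u, terms=12):
    # partial sum of  1 - cos u = u^2/2 - u^4/24 + u^6/720 - ...
    total, term = 0.0, u * u / 2.0
    for k in range(1, terms + 1):
        total += term
        term *= -u * u / ((2 * k + 1) * (2 * k + 2))
    return total

# series viewpoint: (1 - cos(x^2)) / x^4 = 1/2 - x^4/24 + ...
x = 1e-2
ratio = one_minus_cos(x * x) / x ** 4
```

As a side benefit, evaluating 1 - cos(x^2) directly in floating point
suffers catastrophic cancellation for small x; the series form sidesteps
that, which is part of the point about thinking in expansions.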