From: lasmith@athena.mit.edu (Lones A Smith)
Newsgroups: sci.math
Subject: a very simple asymptotics question
Date: 14 Dec 1994 06:26:04 GMT

Hello,

This surely must be a basic question for asymptotics aficionados.
Let f be a non-negative function.  Then

	\int_0^1 f(x) dx < \infty  =>  for some a > -1, f(x) = O(x^a) (for x close to 0)

where O(x) is Hardy's notation.  Just to make clear, this means that
f(x)/x^a is bounded for x close to 0.

[I believe this, but a proof eludes me.  It is not immediately obvious,
for instance, that some function like

	f(x) = 1/[x*(stuff with log x in it)]

might not be integrable.]

As usual, I _do_ appreciate any and all help,
Thanks in advance,
Lones
--
Lones A. Smith, Economics, MIT E52-252C, Cambridge MA 02139
v-mail: (617) 253-0914
e-mail: lones@lones.mit.edu (NeXT mail or the garden variety stuff)
        lasmith@athena.mit.edu (if all else fails)
==============================================================================
From: clong@cnj.digex.net (Chris Long)
Newsgroups: sci.math
Subject: Re: a very simple asymptotics question
Date: 14 Dec 1994 10:28:46 -0500

In article <3cm35s$c5a@senator-bedfellow.mit.edu>, Lones Smith wrote:

> This surely must be a basic question for asymptotics aficionados.
> Let f be a non-negative function.  Then
>
> \int_0^1 f(x) dx < \infty  =>  for some a > -1, f(x) = O(x^a) (for x close to 0)
>
> where O(x) is Hardy's notation.
> Just to make clear, this means that f(x)/x^a is bounded for x close to 0.

I think you want that for some a > -1, f(x) = O(x^a) as x -> 0 *almost
everywhere*, since otherwise it isn't true (e.g. let f(x) = e^{1/x} for
x rational, 1 otherwise).  Or if you are only interested in "nice"
functions you could take f to be continuous except possibly at {0,1}.
--
Chris Long, 265 Old York Rd., Bridgewater, NJ 08807-2618

Score: 0, Diff: 1, clong killed by a Harvard Math Team on 1
==============================================================================
From: lasmith@athena.mit.edu (Lones A Smith)
Newsgroups: sci.math
Subject: Re: a very simple asymptotics question
Date: 14 Dec 1994 16:06:27 GMT

In article <3cn2ve$p59@cnj.digex.net> clong@cnj.digex.net (Chris Long) writes:
>In article <3cm35s$c5a@senator-bedfellow.mit.edu>, Lones Smith wrote:
>
>> This surely must be a basic question for asymptotics aficionados.
>> Let f be a non-negative function.  Then
>
>> \int_0^1 f(x) dx < \infty  =>  for some a > -1, f(x) = O(x^a) (for x close to 0)
>
>> where O(x) is Hardy's notation.
>> Just to make clear, this means that f(x)/x^a is bounded for x close to 0.
>
>I think you want that for some a > -1, f(x) = O(x^a) as x -> 0 *almost
>everywhere*, since otherwise it isn't true (e.g. let f(x) = e^{1/x} for
>x rational, 1 otherwise).  Or if you are only interested in "nice"
>functions you could take f to be continuous except possibly at {0,1}.
>--
>Chris Long, 265 Old York Rd., Bridgewater, NJ 08807-2618

Okay, I now qualify my assertion.  Is my assertion true for the following
cases:

1. f continuous (if not, how about uniformly so?)
2. f monotonic

If so, where is (or what is) the proof?

Thanks for pointing out my error,
Lones
--
Lones A. Smith, Economics, MIT E52-252C, Cambridge MA 02139
v-mail: (617) 253-0914
e-mail: lones@lones.mit.edu (NeXT mail or the garden variety stuff)
        lasmith@athena.mit.edu (if all else fails)
==============================================================================
From: clong@cnj.digex.net (Chris Long)
Newsgroups: sci.math
Subject: Re: a very simple asymptotics question
Date: 15 Dec 1994 16:02:57 -0500

In article <3cn563$o0v@senator-bedfellow.mit.edu>, Lones Smith wrote:

>Okay, I now qualify my assertion.  Is my assertion true for the following
>cases:
>
>1. f continuous (if not, how about uniformly so?)
It's obviously not true for f continuous.  Construct a continuous function
that has thin symmetric triangular spikes of height 2^n and base 2/4^n
(for an area of 1/2^n) centered at x = 1/2^n for n > 0, and which is 0
elsewhere.
--
Chris Long, 265 Old York Rd., Bridgewater, NJ 08807-2618

Score: 0, Diff: 1, clong killed by a Harvard Math Team on 1
==============================================================================
From: rusin@washington.math.niu.edu (Dave Rusin)
Newsgroups: sci.math
Subject: Re: a very simple asymptotics question
Date: 14 Dec 1994 16:45:04 GMT

In article <3cm35s$c5a@senator-bedfellow.mit.edu>, Lones A Smith asked
for verification:

>Let f be a non-negative function.  Then
>
>\int_0^1 f(x) dx < \infty  =>  for some a > -1, f(x) = O(x^a) (for x close to 0)
>
>[I believe this, but a proof eludes me.

Perhaps you have some additional hypotheses; another poster has commented
that continuity is necessary (if you want to avoid "almost-everywhere"
conclusions), but even more would be needed.  For example, do you know
that the function is decreasing?  Otherwise, counterexamples abound.
Basically all you'd have to do is make a function which gets really large
on really narrow intervals -- so narrow that the integral stays finite,
but large enough to defeat f(x) = O(x^a).

If you know for example that f is decreasing and differentiable on (0,1],
then using integration by parts we may rewrite your integral as

	\int_r^1 f(x) dx = (f(1) - r.f(r)) + \int_r^1 x.|f'(x)| dx

which, significantly, is the sum of two positive terms.  Thus the
boundedness of the left-hand side (as r -> 0) implies _both_ of the other
limits are finite, and in particular, f(r) < f(1) . r^(-1).

This isn't quite as strong as the result you want, but gets pretty close.
Moreover, you can see in this example why you need something like a
decreasing f, to keep the second integral tight.

(I must agree with the original poster, though: it seems like even if one
could prove, with assumptions like those above, that f(x) is
little-o(1/x), it ought to be possible to make a function along the lines
of ln(x)/x which fails to be big-O(x^a) for every a > -1.  None came to
mind as I typed this.)
dave
==============================================================================
From: rusin@washington.math.niu.edu (Dave Rusin)
Newsgroups: sci.math
Subject: Re: a very simple asymptotics question
Date: 14 Dec 1994 17:55:27 GMT

In article <3cn7eg$ms5@mp.cs.niu.edu>, Dave Rusin wrote:
>
>If you know for example that f is decreasing and differentiable on (0,1]
>then using integration by parts we may rewrite your integral as
>
>	\int_r^1 f(x) dx = (f(1) - r.f(r)) + \int_r^1 x.|f'(x)| dx
>
>which, significantly, is the sum of two positive terms.

Who wrote that?  Did I write that?  Who made me write that?

The first part of the post was right, and the trick above is good, but
misapplied.  I should have said: if f is differentiable and decreasing,
then for r > 0,

	r f(r) = f(1) - \int_r^1 f + \int_r^1 x |f'|

so that (since the original integral is given to be bounded) we will have
f(r) = big-O(1/r) iff the second integral is bounded.  However, I don't
see right now why the latter needs to be true.

Sorry if I've caused confusion.

dave
==============================================================================
From: israel@math.ubc.ca (Robert Israel)
Newsgroups: sci.math
Subject: Re: a very simple asymptotics question
Date: 14 Dec 1994 18:02:01 GMT

In article <3cm35s$c5a@senator-bedfellow.MIT.EDU> lasmith@athena.mit.edu
(Lones A Smith) writes:

> This surely must be a basic question for asymptotics aficionados.
> Let f be a non-negative function.  Then
>
> \int_0^1 f(x) dx < \infty  =>  for some a > -1, f(x) = O(x^a) (for x close to 0)
>
> [I believe this, but a proof eludes me.

It is false.

> It is not immediately obvious, for instance, that some function like
>
> f(x) = 1/[x*(stuff with log x in it)]
>
> might not be integrable.]

For example,

	f(x) = 1/(x*(log(x/2))^2)

or

	f(x) = 1/(x*log(x/2)*(log(-log(x/2)))^2).

It is true that if f is non-increasing, then f(x) = o(1/x) as x -> 0.

Proof: Suppose not.  Then there exist epsilon > 0 and a sequence x_n -> 0
such that f(x_n) > epsilon/x_n.
Passing to a subsequence, we may take the x_n to be strictly decreasing
(with x_0 < 1).  Since f is non-increasing, f(x) >= f(x_{n-1}) >
epsilon/x_{n-1} for x in [x_n, x_{n-1}], so

	int_{x_n}^{x_{n-1}} f(x) dx >= epsilon (x_{n-1}-x_n)/x_{n-1}
	                             = epsilon (1 - x_n/x_{n-1}).

These intervals are disjoint subsets of (0,1), so the integrals have a
finite sum, and hence

	sum_{n=1}^infty (1 - x_n/x_{n-1}) < infty.

In particular x_{n-1}/x_n -> 1, so x_{n-1}/x_n <= 2 for all large n, and
since t - 1 <= 2(1 - 1/t) whenever 1 < t <= 2, it follows that

	sum_{n=1}^infty (x_{n-1}/x_n - 1) < infty

as well.  Now ln t <= t - 1, so

	ln(x_0/x_N) = sum_{n=1}^N ln(x_{n-1}/x_n)
	           <= sum_{n=1}^infty (x_{n-1}/x_n - 1) < infty,

which would contradict x_N -> 0.
--
Robert Israel                                israel@math.ubc.ca
Department of Mathematics
University of British Columbia
Vancouver, BC, Canada V6T 1Y4
==============================================================================
From: rusin@washington.math.niu.edu (Dave Rusin)
Newsgroups: sci.math
Subject: Re: a very simple asymptotics question
Date: 14 Dec 1994 18:55:09 GMT

In article <3cm35s$c5a@senator-bedfellow.mit.edu>, Lones A Smith wrote:

>Let f be a non-negative function.  Then
>
>\int_0^1 f(x) dx < \infty  =>  for some a > -1, f(x) = O(x^a) (for x close to 0)

OK, I've got it now.  The assertion is false even if you assume f has the
kind of normal behaviour you're likely to assume in Smith's department.

Consider first the function g(x) = e/[x (ln x)^2] on the interval (e, oo)
(e = 2.7...).  Then

	\int_e^\infty g(x) dx = e \int_1^\infty du/u^2 = e

after making the substitution u = ln(x).  If you add in a (1)-by-(e)
rectangle at the left, you'll get a nice region with an area of 2e.  Flip
this region over the line y = x and you'll still have an area of 2e, now
bounded by the lines y = 0, x = 0, x = 1, and the curve x = g(y); that is,
2e is the integral of the inverse function f of g over the interval [0,1].
So f = g^(-1) is a function meeting the hypotheses of your conjecture.

Now assume f(x) < C x^a for all small x > 0, where a is a number > -1.
Apply the decreasing function g to this inequality and see x > g(C x^a).
This gives an inequality which is best presented as having the form

	| ln z + A | > B z^{(1+a)/2}

where A and B are constants, B > 0, and z = 1/x runs over all large
positive values.  Clearly for a > -1 this would imply that ln z grows at
least as fast as a positive power of z, which is false.  Therefore f fails
to meet the conclusion of your conjecture.

dave
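Israel's first counterexample can also be checked numerically.  The sketch
below (Python; the quadrature grid, the exponent a = -0.9, and the sample
points are all arbitrary illustrative choices, not from the thread)
verifies that \int_0^1 dx/(x (log(x/2))^2) = 1/log 2, while f(x)/x^a is
eventually unbounded:

```python
import math

# Robert Israel's first counterexample, numerically: f is non-negative and
# integrable on (0,1], yet f(x)/x^a is unbounded as x -> 0 for every a > -1.
def f(x):
    return 1.0 / (x * math.log(x / 2) ** 2)

# Closed form: d/dx[-1/log(x/2)] = f(x), so
#   int_eps^1 f = 1/log(2) + 1/log(eps/2)  ->  1/log(2) as eps -> 0.
def exact_tail(eps):
    return 1.0 / math.log(2) + 1.0 / math.log(eps / 2)

# Midpoint rule on a geometric grid over [eps, 1] (grid size is arbitrary).
def numeric_tail(eps, n=50000):
    ratio = (1.0 / eps) ** (1.0 / n)
    total, x = 0.0, eps
    for _ in range(n):
        x_next = x * ratio
        total += f((x + x_next) / 2) * (x_next - x)
        x = x_next
    return total

print(numeric_tail(1e-12), exact_tail(1e-12))   # both about 1.4074

# f(x) * x^0.9 (i.e. f(x)/x^a with a = -0.9) is eventually unbounded, but
# x^{-0.1} must beat (log(x/2))^2 -- hence the absurdly small sample points.
for x in (1e-2, 1e-100, 1e-300):
    print(x, f(x) * x ** 0.9)

# Yet f(x) * x -> 0, consistent with Israel's positive result that a
# non-increasing integrable f must be o(1/x).
print(f(1e-300) * 1e-300)   # about 2e-6
```

Note that the a = -0.9 ratio actually decreases at first: the power beats
the squared logarithm only extremely slowly, which is exactly why such
counterexamples are easy to overlook.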