From: "David R. MacIver" Subject: Moment Generating Functions Date: Sun, 17 Sep 2000 22:31:47 +0100 Newsgroups: sci.math Summary: [missing] Hi. Recently I've been teaching myself about moment generating functions of random variables. In the book it made an assumption which it didn't prove: That the m.g.f for has a unique corresponding distribution. I *think* I've proved it, but I'd like to check my logic. First of all from the expansion of M(t) the moment generating function of a r.v. X M(t)=1+t.E(X)+t^2/2.E(X^2).... Thus M(t) uniquely determines E(X^i) for all values of i from 1 to infinity (in fact, from 0 to infinity, but knowing that E(X^0)=1 isn't really terribly useful :-)). Now considering a collection of n values where the ith value is y(i) If I know all the values of the mean of y^r for r=1 to n then I have a sufficient number of simultaneous equations to, in principle, solve it. (In principle. In practice the equations will be absolutely horrendous to solve. But the point is that they uniquely determine all the values). Thus if we have the mean of y^r for r = 1 to infinity we can determine all the values for any value of n. If we then define p(z) as the number of times a value of z appears divided by n as tn tends to infinity we thus see that if we have all the values of E(x^r) from r=1 to infinity then all probabilities are uniquely determined. The m.g.f gives all values of E(X^r) and therefore gives all the probabilities and thus the probability distribution. Hence the m.g.f. Uniquely determines the probability distribution. Is this correct? Please e-mail me your answers as well as/instead of posting them, David ============================================================================== From: hrubin@odds.stat.purdue.edu (Herman Rubin) Subject: Re: Moment Generating Functions Date: 17 Sep 2000 20:29:20 -0500 Newsgroups: sci.math In article <39C53843.276A6D45@btinternet.com>, David R. MacIver wrote: >Hi. Recently I've been teaching myself about moment generating functions >of random variables. In the book it made an assumption which it didn't >prove: That the m.g.f for has a unique corresponding distribution. I >*think* I've proved it, but I'd like to check my logic. >First of all from the expansion of M(t) the moment generating function >of a r.v. X >M(t)=1+t.E(X)+t^2/2.E(X^2).... >Thus M(t) uniquely determines E(X^i) for all values of i from 1 to >infinity (in fact, from 0 to infinity, but knowing that E(X^0)=1 isn't >really terribly useful :-)). >Now considering a collection of n values where the ith value is y(i) >If I know all the values of the mean of y^r for r=1 to n then I have a >sufficient number of simultaneous equations to, in principle, solve it. >(In principle. In practice the equations will be absolutely horrendous >to solve. But the point is that they uniquely determine all the values). This argument is NOT quite valid. There are distributions, and not too difficult one, for which knowing all moments will not determine the distribution. It is true that if the moment generating function exists in an interval on both sides of 0, the solution is unique. The theory of this is not that well known, although it is quite old. A large number of intermediate steps needs to be done in the proof, more than a full chapter in a book. 
An easier approach is to use the characteristic function, which is the
moment generating function at purely imaginary arguments, and therefore
must exist; it is fully determined, by analytic-function arguments, from
the moment generating function on any neighborhood of 0. One can find an
explicit formula for the cumulative distribution function in terms of
the characteristic function, with proof, in the paper by Gurland in
_Annals of Mathematical Statistics_, 1947.

I do not know if your background has gone far enough for this argument,
which is simpler than proving uniqueness for the moment problem when the
moment generating function exists.
--
This address is for information only.  I do not claim that these views
are those of the Statistics Department or of Purdue University.
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN 47907-1399
hrubin@stat.purdue.edu         Phone: (765)494-6054   FAX: (765)494-0558

==============================================================================

From: allen_abrahamson@my-deja.com
Subject: Re: Moment Generating Functions
Date: Mon, 18 Sep 2000 03:28:23 GMT
Newsgroups: sci.math

In article <8q3r5g$2cms@odds.stat.purdue.edu>,
  hrubin@odds.stat.purdue.edu (Herman Rubin) wrote:

[snip]

> This argument is NOT quite valid. There are distributions, and not too
> difficult ones, for which knowing all moments will not determine the
> distribution.

One fairly well known example of such a distribution is the log-normal.

For distributions with finite range, the moments (and hence the mgf)
will always uniquely determine the distribution. For infinite-range
distributions, the moments will uniquely determine the distribution if
the appropriate one of the following series diverges:

    \sum_{j=1}^\infty 1/(\mu_{2j})^{1/(2j)} ,   for support -\infty < x < \infty

    \sum_{j=1}^\infty 1/(\mu_j)^{1/(2j)} ,      for support 0 < x < \infty

> It is true that if the moment generating function exists in an interval
> on both sides of 0, the solution is unique. The theory of this is not
> that well known, although it is quite old. A large number of
> intermediate steps needs to be done in the proof, more than a full
> chapter in a book.

The source of most of this study is work done independently by
Stieltjes, Hamburger, and Carleman. Most of the original is in French or
German, and tough to find anyway. But, depending on how far along you
are, you can check out Shohat and Tamarkin, _The Problem of Moments_.
(I just recently bought a copy on bn.com's Out of Print for 30 bucks!)

In any case, as Herman Rubin suggests, go with the characteristic
function as soon as you can.


Sent via Deja.com http://www.deja.com/
Before you buy.
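To make that closing advice concrete, here is a minimal numerical sketch
of recovering a cumulative distribution function from a characteristic
function. It uses the Gil-Pelaez form of the inversion integral, which is
in the same spirit as the Gurland formula cited earlier but is not taken
from that paper; plain Python with scipy, and the function names are mine:

    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import norm

    def cdf_from_cf(x, cf):
        # Illustrative helper (name is mine). Gil-Pelaez inversion:
        # F(x) = 1/2 - (1/pi) * int_0^inf Im(exp(-i t x) cf(t)) / t dt
        integrand = lambda t: np.imag(np.exp(-1j * t * x) * cf(t)) / t
        val, _ = quad(integrand, 0, np.inf, limit=200)
        return 0.5 - val / np.pi

    phi = lambda t: np.exp(-t ** 2 / 2)    # characteristic function of N(0,1)
    for x in (-1.0, 0.0, 1.0):
        # the inverted CDF and the exact normal CDF should agree
        print(x, cdf_from_cf(x, phi), norm.cdf(x))

For the standard normal characteristic function exp(-t^2/2), the inverted
values match norm.cdf to quadrature accuracy.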