From: Franz Lehner
Subject: Re: generating random numbers with a given distribution
Date: Thu, 10 Jun 1999 13:07:18 +0200
Newsgroups: sci.math.research
Keywords: Cumulants, Marcinkiewicz theorem

On Wed, 9 Jun 1999, Abraham Flaxman wrote:
> I am attempting to generate random numbers with arbitrary cumulants:
>
> If X is a random variable with probability density function f(x), then
> the characteristic function of X is E{exp(j*w*X)} and the second
> characteristic function of X is ln(E{exp(j*w*X)}). The n-th order
> cumulant of X is the n-th derivative of the second characteristic
> function evaluated at w=0.
>
> Ideally I want to specify the first n cumulants, and generate random
> variables from the distribution which has cumulants equal to zero for
> all higher orders,

That's impossible by Marcinkiewicz' theorem, which says that whenever a
probability distribution has only a finite number of non-vanishing
cumulants, it must be Gaussian, so that in fact every cumulant of order
>= 3 vanishes.

Hope that saves some time,
Franz

==============================================================================

From: Abraham Flaxman
Subject: Re: generating random numbers with a given distribution
Date: Thu, 10 Jun 1999 16:16:28 -0600
Newsgroups: comp.dsp,sci.math.research

Dan Grayson wrote:
> Abraham Flaxman wrote:
> >
> > I am attempting to generate random numbers with arbitrary cumulants:
> >
> > If X is a random variable with probability density function f(x), then
> > the characteristic function of X is E{exp(j*w*X)} and the second
> > characteristic function of X is ln(E{exp(j*w*X)}). The n-th order
> > cumulant of X is the n-th derivative of the second characteristic
> > function evaluated at w=0.
> >
> > Ideally I want to specify the first n cumulants, and generate random
> > variables from the distribution which has cumulants equal to zero for
> > all higher orders, but I'll settle for less -- even an algorithm to
> > generate distributions with a specified second and fourth cumulant
> > would be appreciated.
>
> I'm lost, but maybe by explaining things you'll figure out what
> you're after :-)
>
> E{} means estimate? is f(x) = exp(jwX)? n-th derivative with
> respect to X or w? (does the E{} integrate away w?)

Sorry for the cryptic statistics notation... I swear I didn't make it up.

E{} means expectation -- definition to come shortly... f(x) is the
probability density function of random variable X. In more words: the
probability that the random variable X is less than some number x is F(x),
where F(x) is a non-negative, non-decreasing function with limit zero at
negative infinity and limit one at positive infinity. The density function
f(x) is then defined as the derivative of F(x) with respect to x. f(x) is
interesting because it gives a relative idea of how likely it is to see
various values of X.

Getting back to expectation... the expectation of a continuous random
variable, E{X}, is defined to be the integral from negative infinity to
positive infinity of x*f(x) dx. So, to jump out of order, the expectation
operator actually integrates away x. E{exp(jwX)} is shorthand for the
Fourier transform of the probability density function of X. It takes a
little trickery to see this.

> I usually think of random variables as being uniform, and do
> a lot of work to make them uniform. Once you've got a uniform
> generator, you can mung the output to be any distribution you
> like. It seems like you're looking for something different
> than a given distribution; it's more like the relationship
> between different frequencies has to be fixed, and the
> amplitudes of the frequencies change to keep the relationships
> stationary(???).
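[The "munging" described here is usually called inverse-transform sampling: if U is uniform on (0,1) and F is the target cumulative distribution function, then F^{-1}(U) has distribution F. A minimal sketch for the exponential distribution, where F(x) = 1 - exp(-rate*x) inverts in closed form; the example distribution is my choice, not from the thread:]

```python
import math
import random

def sample_exponential(rate=1.0, rng=random):
    """Draw one Exp(rate) variate by inverting its CDF F(x) = 1 - exp(-rate*x)."""
    u = rng.random()                  # uniform on [0, 1)
    return -math.log(1.0 - u) / rate  # x = F^{-1}(u)

random.seed(0)
samples = [sample_exponential() for _ in range(100_000)]
mean = sum(samples) / len(samples)    # should be close to 1/rate = 1.0
```

[The same recipe works for any distribution whose CDF you can invert, numerically if need be; what it cannot do, per the theorem above, is hit an arbitrary prescribed list of cumulants.]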
I think you've got the right track here, except that if you describe the
output in terms of cumulants, it turns out you _can't_ mung the output to
be any distribution you like. As Franz Lehner pointed out on
sci.math.research, Marcinkiewicz' theorem proves that if a distribution
has only a finite number of non-zero cumulants, it is a normal
distribution. (J. Marcinkiewicz, Math. Z. 44, 612 (1939)).

> How about straightening me out now :-)

I hope that does more help than harm.

abie

==============================================================================

From: hrubin@stat.purdue.edu (Herman Rubin)
Subject: Re: generating random numbers with a given distribution
Date: 10 Jun 1999 15:56:06 -0500
Newsgroups: comp.dsp,sci.math.research

In article <375EDA15.8F3001E3@malleus.lanl.gov>,
Abraham Flaxman wrote:

>I am attempting to generate random numbers with arbitrary cumulants:

>If X is a random variable, with probability density function f(x), then
>the characteristic function of X is E{exp(j*w*X)} and the second
>characteristic function of X is ln(E{exp(j*w*X)}).

I have never seen this called the second characteristic function.

>The n-th order cumulant of X is the n-th derivative of the second
>characteristic function evaluated at w=0.

This is slightly wrong; it has to be divided by j^n.

>Ideally I want to specify the first n cumulants, and generate random
>variables from the distribution which has cumulants equal to zero for
>all higher orders,

There need not be any such distribution; I seem to recall a theorem that
a distribution cannot exist with a polynomial of degree greater than 2 as
the logarithm of its characteristic function.

>but I'll settle for less -- even an algorithm to generate distributions
>with a specified second and fourth cumulant would be appreciated.

There are lots of them, and no simple ones. The first attempt at this was
the Gram-Charlier series, which only SOMETIMES produces a valid
distribution function.
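[For concreteness, the first four cumulants can be written in terms of the raw moments m_n = E{X^n} via the standard moment-cumulant formulas. The checks below use the exponential distribution, whose raw moments are m_n = n! and whose cumulants are kappa_n = (n-1)!, and the standard normal, whose cumulants of order >= 3 vanish as the theorem above requires; these are textbook facts, not from the thread:]

```python
def cumulants_from_moments(m1, m2, m3, m4):
    """First four cumulants from the first four raw moments m_n = E{X^n}."""
    k1 = m1                          # mean
    k2 = m2 - m1**2                  # variance
    k3 = m3 - 3*m1*m2 + 2*m1**3
    k4 = m4 - 4*m1*m3 - 3*m2**2 + 12*m1**2*m2 - 6*m1**4
    return k1, k2, k3, k4

# Exponential(1): (m1, m2, m3, m4) = (1, 2, 6, 24)
print(cumulants_from_moments(1, 2, 6, 24))  # -> (1, 1, 2, 6)

# Standard normal: (m1, m2, m3, m4) = (0, 1, 0, 3)
print(cumulants_from_moments(0, 1, 0, 3))   # -> (0, 1, 0, 0)
```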
There are always discrete distributions matching any given finite set of
moments that corresponds to some distribution; for your case, it can be
done with a three-point distribution.
--
This address is for information only. I do not claim that these views are
those of the Statistics Department or of Purdue University.
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN 47907-1399
hrubin@stat.purdue.edu   Phone: (765) 494-6054   FAX: (765) 494-0558
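[Rubin's three-point construction can be made explicit. One convenient symmetric, zero-mean parametrization (my own sketch, not from the thread) puts mass p at each of -a and +a and mass 1-2p at 0; then kappa2 = 2*p*a^2 and kappa4 = 2*p*a^4 - 3*(2*p*a^2)^2, which solves in closed form for (a, p) whenever kappa2 > 0 and kappa4 >= -2*kappa2^2 (the minimum possible excess kurtosis):]

```python
import math
import random

def three_point(kappa2, kappa4):
    """Parameters (a, p) of the three-point law {-a: p, 0: 1-2p, +a: p}
    with second cumulant kappa2 and fourth cumulant kappa4."""
    if kappa2 <= 0 or kappa4 < -2 * kappa2**2:
        raise ValueError("no three-point law with these cumulants")
    p = kappa2**2 / (2 * (kappa4 + 3 * kappa2**2))  # from kappa4 = kappa2**2 * (1/(2p) - 3)
    a = math.sqrt(kappa2 / (2 * p))                 # from kappa2 = 2*p*a**2
    return a, p

def sample(a, p, rng=random):
    """Draw one variate from the three-point law."""
    u = rng.random()
    if u < p:
        return -a
    if u < 2 * p:
        return a
    return 0.0

# Example: kappa2 = 1, kappa4 = 1 gives a = 2, p = 1/8.
a, p = three_point(1.0, 1.0)
```

[As a sanity check for the example values: m2 = 2*(1/8)*4 = 1 and m4 = 2*(1/8)*16 = 4, so kappa4 = m4 - 3*m2^2 = 1, as specified.]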