From: "R.G. Vickson" Subject: Re: Probabity help please Date: Wed, 10 Mar 1999 18:34:17 -0500 Newsgroups: sci.math To: TheSeaGorn Keywords: Law of Large Numbers TheSeaGorn wrote: > Please help me explain something to a friend of mine. Let's say you have an > event that is 50/50, like tossing a coin. Regardless of the results of previous > flips, the odds are still 50/50 for either heads or tails on each additional > flip. My friend cocedes this. However, the greater the total number of flips, > the closer the totals will equal 50/50. (eg., after 1000 flips, you will have > very close to 500 heads and 500 tails.) > I concede this. But suppose you have a string of tails somewhere in this > series. Now, the balance is skewed. Eventually, my friend says, it will have to > even out. Therefore, heads are bound to come up more often in the future. > As absurd as this proposition is, my friend (who has had several statistics > classes, and says he learned this notion from a professor), I hope he misinterpreted what his professor said, since otherwise that's one very wrong stats prof. You are right: if heads are ahead by m points at the end of n > m tosses, the most probable situation at any point in the future is that heads are still ahead by m. The point, though is that with high probability, the head/tail imbalance after N >> n tosses will differ from m by a quantity of order sqrt(N), while the total number of heads will be of order .5N. Thus, for large N the _relative_ imbalance will go to zero. Even more can be said: the so-called Strong Law of Large Numbers states that with probability 1, the limiting frequency of heads and tails is 1/2. Your friend's error is quite a common one. He is apparently assuming that there is some mysterious "Law of Averages" that will cause things to even out in the end. This is just not so. In long coin-tossing games, the most probable situation is that heads are mostly ahead, or that tails are mostly ahead. In half the games heads dominate tails, while in the other half, tails dominate heads. A nice discussion of this is in Feller, "An Introduction to Probability Theory" Vol. I, Wiley, where he shows a simulated 10,000 toss coin game, showing that there are surprisingly long "winning streaks" or "losing streaks". The length of time one side is ahead of the other, and the number of equalizations that will occur in N tosses, are random quantities whose distributions are worked out in Feller (look up the so-calle "arc sine" law). Such misconceptions occur in managerial contexts, as has first been explained in New, C.C., "A Common Error in Production Scheduling", Operational Research Quarterly, Vol. 25 (1972), pp. 283-292. New's work led me to ask whether a manager should worry about and correct an off-schedule process, or just let nature take its course. My paper "Optimal Control of Production Sequences: a Continuous Parameter Analysis", Operations Research, Vol. 30 (1982), pp. 659-679 deals with this issue, by constructing optimally controlled diffusions to approximate the discrete process. RGV > swears by it. I > cannot make him see how it still doesn't matter. I tried to explain to him > that a coin tossing trial (whether it be 1000 or 1,000,000 tosses) is merely > one series in an infinite string of coin tosses. That is, you cannot expect > things to "even out" if you don't have a starting or an ending point in this > continuum. > Can anybody with creditials help me devise a better way to explain this? 
> My friend has taken to playing casino roulette, and I fear for his
> savings account!!

[reformatted -- djr]
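The simulation sketch mentioned above follows. It is not part of the
original post; the toss count N, the random seed, and the checkpoints are
arbitrary choices. It tosses a fair coin N times and prints the absolute
head/tail lead and the relative frequency of heads at a few checkpoints,
then the number of equalizations and the fraction of tosses during which
heads were strictly ahead.

# Rough illustration only; appended, not part of the original post.
# Toss a fair coin N times and track:
#   lead             = (# heads) - (# tails) so far
#   heads / n        = relative frequency of heads
#   equalizations    = number of times the lead returns to zero
#   time_heads_ahead = number of tosses at which heads were strictly ahead
# N, the seed, and the checkpoints are arbitrary choices.

import random

def simulate(N=100_000, seed=1999):
    rng = random.Random(seed)
    heads = 0
    lead = 0
    equalizations = 0
    time_heads_ahead = 0
    checkpoints = {100, 1_000, 10_000, N}

    for n in range(1, N + 1):
        if rng.random() < 0.5:      # heads
            heads += 1
            lead += 1
        else:                       # tails
            lead -= 1
        if lead == 0:
            equalizations += 1
        elif lead > 0:
            time_heads_ahead += 1
        if n in checkpoints:
            print("n = %7d   lead = %6d   heads/n = %.4f" % (n, lead, heads / n))

    print("equalizations:", equalizations)
    print("fraction of time heads were ahead: %.3f" % (time_heads_ahead / N))

if __name__ == "__main__":
    simulate()

On a typical run the lead at n = 100,000 is a few hundred in absolute
value (order sqrt(n)), while heads/n sits close to 0.5, the equalization
count is tiny compared with n, and one side is often ahead for a very
large fraction of the game, in line with the behavior described above.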