The Law of Averages

Colloquially, the Law of Averages says that what ought to happen eventually does happen, given enough repetitions. If a gambler plays at a casino for a very long time, he is almost certain to lose. If a weak sports team plays a stronger team in a multiple-game series, then it is almost certain to lose the series.

However, if the gambler plays in the casino for only a little while, then there is a realistic (though less than 50%) chance of coming out ahead. And a weak sports team may defeat a stronger one if only one game is played… hence the appeal of the NCAA basketball tournament and, on a larger scale, the knockout stages of the World Cup.

In my statistics class, I use a simple spreadsheet to illustrate that \hbox{SD}(K) = \sqrt{n p(1-p)} for a sample count, but \hbox{SD}(\hat{p}) = \displaystyle \sqrt{ \frac{p(1-p)}{n} } for a sample proportion. Here is one image from the spreadsheet:

[coinflip1: spreadsheet showing 10 simulated coin flips]

The user can change the bright green cell to be any positive integer up to 5000. This number represents the number of simulated coins that are flipped. In the above example, ten coins are flipped. Column B shows the results of the simulated coins, while column C shows a running count of the number of heads that have appeared. In the above example, 7 of the 10 flips are heads, for an observed error of +2 (two more heads than the expected number of 5) and a percentage error of +20%, since the error of +2 heads is 20% of the 10 flips (70% heads observed versus the expected 50%).
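Here is a minimal Python sketch of the same single run; columns B and C of the spreadsheet become the lists flips and running_count below (the variable names are mine, not part of the spreadsheet):

```python
import random

n_flips = 10   # the value in the bright green cell (any positive integer up to 5000)
p = 0.5        # a fair coin

# Column B: the simulated flips, recorded as 1 for heads and 0 for tails
flips = [1 if random.random() < p else 0 for _ in range(n_flips)]

# Column C: the running count of heads after each flip
running_count = []
heads_so_far = 0
for outcome in flips:
    heads_so_far += outcome
    running_count.append(heads_so_far)

heads = running_count[-1]
observed_error = heads - n_flips * p               # e.g. +2 when 7 of 10 flips are heads
percentage_error = 100 * observed_error / n_flips  # e.g. +20%

print(f"{heads} heads in {n_flips} flips: "
      f"observed error {observed_error:+.0f}, percentage error {percentage_error:+.1f}%")
```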

In class, I run the spreadsheet several times, and students see that the observed error is usually in the range of -2 to +2, and the percentage error is usually in the range of -20% to +20%.
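The two formulas above predict this typical range. With n = 10 and p = 1/2 (the rounding is mine):

\hbox{SD}(K) = \sqrt{10 \cdot \tfrac{1}{2} \cdot \tfrac{1}{2}} = \sqrt{2.5} \approx 1.58 \qquad \hbox{and} \qquad \hbox{SD}(\hat{p}) = \displaystyle \sqrt{ \frac{(1/2)(1/2)}{10} } = \sqrt{0.025} \approx 0.158.

So an observed error of \pm 2 heads is only about 1.3 standard deviations of the count, and a percentage error of \pm 20\% is likewise about 1.3 standard deviations of the proportion, which is why these ranges show up so often in class.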

By contrast, look what happens when the number of flips increases to a large number, like 5000.

[coinflip2: spreadsheet showing 5000 simulated coin flips]

There is now a much larger absolute error, in this case -28. Of course, an absolute error of that size is impossible with 10 coin flips or even 50 coin flips. At the same time, the percentage error is now significantly smaller (only -0.56%).
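The same formulas explain why. With n = 5000 and p = 1/2 (rounding again mine):

\hbox{SD}(K) = \sqrt{5000 \cdot \tfrac{1}{2} \cdot \tfrac{1}{2}} = \sqrt{1250} \approx 35.4 \qquad \hbox{and} \qquad \hbox{SD}(\hat{p}) = \displaystyle \sqrt{ \frac{(1/2)(1/2)}{5000} } = \sqrt{0.00005} \approx 0.0071.

So an error of -28 heads is less than one standard deviation of the count, and the corresponding percentage error of -0.56% is less than one standard deviation (about 0.71%) of the proportion: larger than before in absolute terms, yet smaller in relative terms.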

This example gives evidence for the counterintuitive result that the absolute error grows like \sqrt{n} while the relative error shrinks like 1/\sqrt{n}.
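For anyone who wants to replicate the experiment without the spreadsheet, here is a short Python sketch of my own (the function name and the choice of 1000 repetitions are mine, not part of the spreadsheet) that estimates the typical size of both errors for several values of n:

```python
import random

def average_errors(n, p=0.5, trials=1000):
    """Simulate `trials` runs of n coin flips and return the average
    absolute error (in heads) and the average percentage error."""
    total_abs, total_pct = 0.0, 0.0
    for _ in range(trials):
        heads = sum(1 for _ in range(n) if random.random() < p)
        error = heads - n * p                  # observed minus expected count
        total_abs += abs(error)
        total_pct += abs(error) / n * 100      # error as a percentage of the n flips
    return total_abs / trials, total_pct / trials

for n in (10, 100, 1000, 5000):
    avg_abs, avg_pct = average_errors(n)
    print(f"n = {n:5d}: typical |error| ~ {avg_abs:6.1f} heads, "
          f"typical |percentage error| ~ {avg_pct:5.2f}%")
```

Going from 10 flips to 1000 flips multiplies the typical absolute error by roughly \sqrt{100} = 10 while dividing the typical percentage error by roughly 10, matching the rates above.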
