Colloquially, the Law of Averages dictates that what ought to happen eventually does happen, given enough opportunities. If a gambler plays at a casino for a very long time, he is almost certain to lose. If a weak sports team plays a stronger team in a multiple-game series, then it is almost certain to lose the series.
However, if the gambler plays in the casino for only a little while, then there is a realistic (though less than 50%) chance of coming out ahead. And a weak sports team may defeat a stronger one if only one game is played… hence the appeal of the NCAA basketball tournament and, on a larger scale, the knockout stages of the World Cup.
In my statistics class, I use a simple spreadsheet to illustrate that this guarantee holds not for a sample count, but for a sample proportion. Here is one image from the spreadsheet:
The user can change the bright green cell to be any positive integer up to 5000. This number represents the number of simulated coins that are flipped. In the above example, ten coins are flipped. Column B shows the results of the simulated coins, while column C shows a running count of the number of heads that have appeared. In the above example, 7 of the 10 flips are heads, for an observed error of +2 (two more heads than the expected number of 5) and a percentage error of 20%.
In class, I run the spreadsheet several times, and students see that the observed error is usually in the range of -2 to +2, and the percentage error is usually in the range of -20% to +20%.
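The same demonstration is easy to replicate outside a spreadsheet. Here is a minimal Python sketch (the function name `flip_coins` is my own, not from the spreadsheet) that mimics columns B and C by tallying simulated fair-coin flips and reporting the observed and percentage errors:

```python
import random

def flip_coins(n, rng=None):
    """Flip n simulated fair coins; return (heads, observed error, percentage error).

    The observed error is heads minus the expected count n/2, and the
    percentage error expresses that difference as a percentage of n.
    """
    rng = rng or random.Random()
    heads = sum(rng.random() < 0.5 for _ in range(n))
    error = heads - n / 2
    return heads, error, 100 * error / n

# Re-running this a few times with n = 10 shows the observed error
# usually landing between -2 and +2 (that is, -20% to +20%).
heads, error, pct = flip_coins(10)
print(f"{heads} heads: observed error {error:+g}, percentage error {pct:+g}%")
```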
By contrast, look what happens when the number of flips increases to a large number, like 5000.
There is now a larger absolute error — in this case, -28. Of course, an absolute error of that size is impossible with 10 coin flips or even 50 coin flips. However, at the same time, the percentage error is now significantly smaller (only -0.56%).
This example gives evidence for the counter-intuitive result that the absolute error grows like $\sqrt{n}$ while the relative error decreases like $1/\sqrt{n}$.
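One way to see this scaling empirically: for $n$ flips of a fair coin, the standard deviation of the number of heads is $\sqrt{n}/2$, so the typical absolute error grows like $\sqrt{n}$ while the typical error in the proportion shrinks like $1/\sqrt{n}$. The Python sketch below (the helper name `average_errors` is mine) averages the absolute and relative errors over many simulated runs at several values of $n$:

```python
import random
import statistics

def average_errors(n, trials=200, seed=0):
    """Average |heads - n/2| and |heads - n/2| / n over repeated runs of n flips."""
    rng = random.Random(seed)
    abs_errors = []
    for _ in range(trials):
        heads = sum(rng.random() < 0.5 for _ in range(n))
        abs_errors.append(abs(heads - n / 2))
    mean_abs = statistics.mean(abs_errors)
    return mean_abs, mean_abs / n

for n in (10, 100, 1000, 5000):
    mean_abs, mean_rel = average_errors(n)
    print(f"n={n:5d}: avg absolute error {mean_abs:7.2f}, "
          f"avg relative error {100 * mean_rel:.3f}%")
```

Multiplying $n$ by 100 should multiply the average absolute error by roughly 10 while dividing the average relative error by roughly 10, matching the $\sqrt{n}$ and $1/\sqrt{n}$ behavior.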