Wednesday, October 8, 2008

"Unlikely" vs. "Highly Unlikely"


The Difference Between an “Unlikely Event” and a “Highly Unlikely Event”

I have mentioned elsewhere that “math” negates all systems. Or perhaps better put, the reality of systems (that they will not overcome the house edge) becomes apparent when one applies strict rules of mathematics.

But, now I’m looking at things a little differently:

To understand my new theory, let’s look at a very old one. Suppose we employ a very simple martingale system as follows (looking at a true 50/50 decision): we’ll bet one unit on the first decision; if we lose, we will double the bet (2 units) on the second decision, and then double again (4 units) after a second loss. This three-step martingale gives us three shots at making a one unit profit. If we lose all three stages, we are out 7 units. This “beginner” approach tends to make us feel comfortable because we “know” that in a 50/50 game we should be winning more often than one in three hands. Math tells us, on the other hand, that we will break even (no house edge in this hypothetical) because we “should” encounter the full-series loss of seven units once for every 7 single-unit wins. As players we know that “in the short run” the 50/50 probability of a win is not reliable, and that it would be well within the realm of possibility to encounter a “bad run” of full-series losses in any particular session.

Math helps us see that although we have a 50% chance of winning the very next decision, we also have a 50% chance of losing it. When looking at the proposition of winning at least one of our next three decisions, we see that the likelihood of losing all three is one in 8. This means that if we play 100 series of 3 hands, we should win 7/8ths of those series and lose one eighth. We will win 1 unit for each successful series of three and lose 7 units for every full-series loss (or failure of our system). If we encounter a session where we get the “expected” results, we can expect to break even.
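If you want to check those numbers yourself, here is a quick sketch (in Python, simply my choice for illustration) that works out the series probabilities for the hypothetical no-edge 50/50 game:

```python
# Three-step martingale on a hypothetical no-edge 50/50 game.
# Bets of 1, 2 and 4 units; winning any step nets +1, losing all three costs 7.

p_win = 0.5                        # chance of winning a single decision
p_lose_series = (1 - p_win) ** 3   # lose all three steps: 1/8
p_win_series = 1 - p_lose_series   # win at least one step: 7/8

ev_per_series = p_win_series * 1 + p_lose_series * (-7)

print(f"P(win a series)  = {p_win_series:.4f}")    # 0.8750
print(f"P(lose a series) = {p_lose_series:.4f}")   # 0.1250
print(f"EV per series    = {ev_per_series:+.4f}")  # +0.0000, i.e. break even
```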

The lure of this system is (in part) based on the mathematical fact that we begin each series of three decisions with an 87.5% chance of winning that particular series. We all would admit that an 87.5% chance of winning anything “sounds” good. We know, however, that our 87.5% chance of winning 1 unit is counter-balanced by a 12.5% chance of losing 7 units. NOTHING we can do will overcome this fact.

So, why bother?

Well, let’s look at this fact from a different angle. Suppose a typical session at a roulette wheel is 300 spins. How many times would we expect to be brought back to zero using the three-step martingale mentioned above? Three hundred spins will produce one hundred series of three. We “should” win 87.5 of those series and lose 12.5. If we play for 5 hours (at 60 spins per hour), we’d be losing, or “brought back to even,” 2.5 times per hour. If we only played 30 minutes, we could expect to be brought back to even 1.25 times, and if we played for only 15 minutes, theoretically, we would not encounter enough series of decisions to expect even one full-series loss.
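Those session figures fall straight out of the same arithmetic. The sketch below assumes 60 spins per hour (one spin per minute), which is just the pace I used above:

```python
# Expected full-series losses for different session lengths,
# still assuming the no-edge 50/50 game and 60 spins per hour.

p_lose_series = 0.5 ** 3                   # 1/8

for minutes in (300, 60, 30, 15):
    spins = minutes                        # one spin per minute
    series = spins / 3                     # each series is three decisions
    print(f"{minutes:3d} minutes: {series:5.1f} series, "
          f"{series * p_lose_series:.2f} expected full-series losses")
# 300 minutes: 100.0 series, 12.50 expected full-series losses
#  60 minutes:  20.0 series,  2.50
#  30 minutes:  10.0 series,  1.25
#  15 minutes:   5.0 series,  0.62
```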

[House Edge: Before you start writing me and saying that I left out the house edge, let me explain that I am trying to keep the math simple for the purpose of this discussion. Think of the game in my example as “no-edge” Roulette. At an American Roulette wheel (using the example above) you would expect to win 7 series and lose 1 series, but instead of ending up at a break-even point, you would, on average, end up down about 5.26% of everything you wagered, because of the house edge. If you are interested in the “real” odds on an American roulette wheel, then you need to begin with the fact that a bet of one unit on black has a 47.4% likelihood of winning (less than the 50% above), and if you set out to win at least one of the next three consecutive decisions, your likelihood of doing so would be about 85.4% (which is of course below the 87.5% in the example above).]
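For anyone who wants to reproduce the “real” American-wheel numbers from that note, they take only a couple of lines:

```python
# American roulette: a bet on black covers 18 of 38 pockets.
p_black = 18 / 38                        # about 47.4% on a single spin
p_win_at_least_one = 1 - (20 / 38) ** 3  # win at least one of three spins

print(f"P(black on one spin)   = {p_black:.4f}")             # 0.4737
print(f"P(>= 1 win in 3 spins) = {p_win_at_least_one:.4f}")  # 0.8542
```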

My thought is that we can reduce the “unlikely” to the “highly unlikely” by increasing coverage of the layout. For example: suppose we cover two-thirds of the layout instead of the 50% in the earlier example. We know that we cannot overcome the odds. We know that we will be brought back to even eventually. However, let’s look at the short run. Employing a three-step martingale, we are now betting that we will win at least one of the next three decisions, and we have two-thirds of the layout covered. Instead of losing one in eight series of three, we can now expect to lose only one in 27 series of three.

(Of course we would still expect, eventually, to lose 26 units and once again be brought back to zero. The math would go like this: our first bet would be 2 units, one on each of 2 dozens; after a loss, our bet would have to be increased to 6 units (3 on each of 2 dozens); and after two losses, we’d increase our bet to 18 units (9 on each of 2 dozens). This three-step martingale progression would produce a win of 1 unit for each successful series and a loss of 26 units for each full three-step failure.)
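Here is that progression stepped through in code, assuming the standard 2-to-1 payout on a dozen and ignoring the zeros (the same no-edge simplification as before):

```python
# Two-dozen, three-step progression: 1, 3, then 9 units on EACH of two dozens.
# A winning dozen pays 2:1; the goal at every step is to recover all prior
# losses plus one unit.

stakes = [1, 3, 9]       # units on each of the two dozens, per step
running_loss = 0
for per_dozen in stakes:
    total_bet = 2 * per_dozen
    step_win = 2 * per_dozen - per_dozen   # winning dozen pays 2:1, other dozen loses
    print(f"bet {total_bet:2d}: a win nets {step_win - running_loss:+d} for the series, "
          f"a loss leaves us down {running_loss + total_bet}")
    running_loss += total_bet
# bet  2: a win nets +1 for the series, a loss leaves us down 2
# bet  6: a win nets +1 for the series, a loss leaves us down 8
# bet 18: a win nets +1 for the series, a loss leaves us down 26
```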

Let’s go back to our 5-hour session of 300 decisions. Again we are looking at this session as 100 series of three decisions. How many of those series can we expect to lose? The answer is about 3.7. How many can we expect to lose in an hour? The answer is about 0.74.
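Again, nothing fancy behind those figures, just one-in-27 applied to the series counts:

```python
# Expected full-series losses with two dozens covered (zeros ignored).
p_lose_series = (1 / 3) ** 3      # all three steps lose: 1/27

print(f"per 300-spin session (100 series): {100 * p_lose_series:.2f}")  # ~3.70
print(f"per hour (20 series):              {20 * p_lose_series:.2f}")   # ~0.74
```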

Now, math tells us that if we play this game for one hour, we should expect, on average, fewer than one full-series loss. In an hour with no such loss we would see 60 decisions (20 series) and win 20 units. And one might say that we are “due” a loss around the corner.

Stopping While Ahead and Minimizing Volatility

Suppose you played a game where you always won 50% of your decisions in the short run (say, over 100 decisions), and suppose also that this game rarely produced streaks of more than 4 or 5 losses in a row. You could likely sit down at any session, play this sort of game flat-betting, and stop when up by one unit. Some sessions would be only one decision long and others might take 100 decisions, but under these circumstances you could almost certainly, eventually, hit a point where you were up by one single solitary unit. One might say that when you quit, you were due a loss on the very next decision. Being up by only one unit, you were due to be brought back to even. Nevertheless, with minimal volatility, you could always play until up by one unit. Eventually you’d get ahead.
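A small Monte Carlo sketch gives a feel for that claim. The fair-coin assumption and the 100-decision cap below are mine, chosen purely for illustration:

```python
# Flat-bet a fair 50/50 game, one unit per decision, and stop the moment
# we are up one unit (or give up after 100 decisions).
import random

def flat_bet_session(max_decisions=100):
    bankroll = 0
    for _ in range(max_decisions):
        bankroll += 1 if random.random() < 0.5 else -1
        if bankroll >= 1:
            return True      # walked away up one unit
    return False             # never got ahead this session

trials = 100_000
hits = sum(flat_bet_session() for _ in range(trials))
print(f"reached +1 within 100 decisions in {hits / trials:.1%} of sessions")
```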

Now shift back to the game where you are expected to lose 26 units in one series and win one unit in each of 26 other series. Using my theory of stopping while ahead and minimizing the volatility, you should have many opportunities to stop when ahead by less than 26 units. Of course, if you started out losing, you’d have a long row to hoe. But math tells us that each bet you place has a 2 in 3 chance of winning, and the likelihood of losing three such bets in a row is less than 4%. The chances of getting to a point where you are ahead by half of a full-series loss (a win stop of 13 units) are very good indeed.
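Here is a similar sketch for the two-dozen progression with a 13-unit win stop. The 100-series session length is an assumption, and as before the zeros are ignored:

```python
# Play the two-dozen, three-step progression until ahead by 13 units
# (half of a full-series loss) or until 100 series have been played.
import random

def dozens_session(win_stop=13, max_series=100):
    bankroll = 0
    for _ in range(max_series):
        if random.random() < 26 / 27:   # series succeeds: +1 unit
            bankroll += 1
        else:                           # full three-step loss: -26 units
            bankroll -= 26
        if bankroll >= win_stop:
            return True                 # hit the win stop
    return False

trials = 100_000
hits = sum(dozens_session() for _ in range(trials))
print(f"hit the +13 win stop within 100 series in {hits / trials:.1%} of sessions")
```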

Final Thoughts

As with all systems, anyone playing based on the ideas contained here needs to be well capitalized, and, as always, successful players need to be disciplined and patient.

[PS – No doubt, these ideas have been around in one form or another for many, many years. I do not claim to be presenting any sort of original breakthrough. These ideas merely represent my current thoughts on a better way for me to look at the sobering application of math to gambling systems.]