
Thursday, November 20, 2008

Systems 101 - A Primer

It dawned on me while reading a message board that there are a lot of beginner-level players out there searching the web for useful information, often unable to follow the threads because the more experienced players seem to be writing in code.

I thought I'd begin a thread designed for the entry-level gambler looking for a system to play. You will not find any systems in this thread, only basic information. You will also not find the rules of play here. If you check the other threads within this blog, I have recommended some books for the beginner (and intermediate) player.



BS v MM

I think a good place to start is the distinction between bet selection and money management. Simply put, Bet Selection systems (BS) tell the player where to place the next bet (like on red, or player, or passline for example). Money Management (MM) systems tell the player how much to bet on the next decision.

Most systems are mathematical templates that tell the player when to bet, where to bet and how much to bet. Typically systems are designed for Even Chance or 50/50 decisions (EC).


The House Edge

It should come as no surprise that ALL casino games have a built-in house edge. This is the amount of money one can expect to lose by playing the game. The edge is determined by the difference between the true odds and the payout. A simple example is playing one number straight up in roulette. If you place one unit on the 17 and the 17 hits, you are paid 35 to 1. You receive 36 units (comprised of your original bet and 35 of the house's chips; now might be a good time to stop playing). Because there are 38 numbers on an American roulette wheel, you have a one in 38 chance of winning a bet that pays 35 to 1. If true odds were paid, you would be paid 37 house units instead of 35. The house edge changes from game to game, but this simple illustration holds for every single bet you can place in the casino. Some people win, some people lose, but the casino ALWAYS wins; just look at the casino and this fact should be plainly evident.
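The edge described above can be computed directly. A minimal sketch of the expected-value arithmetic (my own illustration, not from the original post):

```python
# House edge on a straight-up American roulette bet:
# 1 winning pocket pays 35 to 1; the other 37 of 38 pockets lose.
p_win = 1 / 38
payout = 35  # units won on a hit, on top of the returned stake
expected_value = p_win * payout - (1 - p_win) * 1
# expected_value works out to -2/38, i.e. about a 5.26% loss
# per unit bet -- the familiar American-wheel house edge.
```

The same two-line calculation (true probability times payout, minus probability of losing times the stake) reproduces the edge for any fixed-odds casino bet.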


MATH and TESTING

It is widely claimed and generally accepted that math proves systems do not work. It seems that with thorough testing, all one can hope for is to break even (or, more accurately, to break even less the house edge).

There are system testers available for systems players to test their theories against actual casino results (see another thread in this blog for links to purchase testers). The idea behind testing is that if your system can beat the testers, then it should perform strongly under real circumstances. It should seem obvious that if you cannot beat the testers, your system has a good chance of failing in the casino.

Is there hope? If math brings you to the conclusion that systems will not win and the testers are nearly impossible to beat, is there hope for developing a successful strategy? Many successful players claim that some systems perform reliably in the short run and the key to winning in the long run is changing systems to respond to the game as you play it. Testing several systems which are triggered by (sometimes) subtle changes in the game is a very difficult task. Therefore, it is plausible that a successful systems player could win in the long term by making changes that would not be easily duplicated in the tester books (like leaving the table in search of a more lively one for example).

LONG RUN v. SHORT RUN

It is important to understand that series of random events tend to perform in accordance with their expected mathematical probabilities in the long run BUT rarely do in the short run. If an event has a near 50% likelihood of occurring (like the "player" winning a hand at baccarat, for example), then if you looked at a large sample (say, 1,000 decisions) you would probably find the event occurs very close to 50% of the time. This can be relied upon in the long run. However, it might be unwise to bet in anticipation of a 50% occurrence in the short run (like the next 6 decisions, for example).
There is another concept to throw into the mix: the "standard deviation." For this beginner-level primer, I do not think it is necessary to go into how it works. Just be aware that when looking at a set of decisions, they can be expected to perform "close to" their expectation, and there is a mathematical way to determine how "close to" the expectation is normal (or at what point the numbers become abnormal); this is called the standard deviation.

OTHER ABBREVIATIONS

FTL = Follow The Last. This is a bet selection system that simply means your next bet is that the last decision will repeat. If red hits on roulette, your next bet is on red.
OLD = Opposite Last Decision. This of course is the opposite of FTL, if red hits on Roulette, your next bet should be black.
DBL = Decision Before Last. Here you would bet the same as the decision before the last decision. This simple Bet Selection system has the benefit of breaking up streaks that could work against you. (I'll try to come back and present an example of this here later.)
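The three bet selections above can be sketched as simple functions of the decision history (function names and the "R"/"B" encoding are my own, just for illustration):

```python
# Each function takes the history of decisions (e.g. "R"/"B" for
# red/black) and returns where the next bet goes under that rule.

def ftl(history):
    # Follow The Last: bet that the last decision repeats
    return history[-1]

def old(history):
    # Opposite Last Decision: bet against the last decision
    return "B" if history[-1] == "R" else "R"

def dbl(history):
    # Decision Before Last: bet the same as the second-to-last decision
    return history[-2]

spins = ["R", "B", "B"]
# ftl(spins) -> "B", old(spins) -> "R", dbl(spins) -> "B"
```

Against a streak like R R R R, note that DBL keeps betting red (it matches the decision before last), while against a chop like R B R B it alternates; that is the sense in which it "breaks up" patterns that punish FTL or OLD.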


more later . . .

Friday, October 24, 2008

The "Cycle"

THE CYCLE
Part I

Lately I've been focusing my work on a concept I call the "Cycle."
I have posted elsewhere my idea of "going for half" and these two concepts work well together.
I have also written about mathematical expectancy and this is a good place to begin an explanation of the cycle.
All gambling propositions have a probability which can be described in the form of mathematical expectancy. A very simple example would be betting one number, 17 for example, straight up on an American roulette wheel. Since there are 38 numbers on an American roulette wheel, it is said that the probability of the number 17 hitting on the next spin is 1 in 38. The mathematical expectation is that we can "expect" a hit on the number 17 once in 38 spins. The "cycle" for this proposition therefore is 38 spins.
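One caution worth quantifying (my own aside): an expectation of one hit per 38-spin cycle does not mean a hit is guaranteed in any given cycle.

```python
# Chance that the number 17 hits at least once in one 38-spin "cycle"
p_miss_one_spin = 37 / 38
p_no_hit_in_cycle = p_miss_one_spin ** 38       # about 0.363
p_at_least_one_hit = 1 - p_no_hit_in_cycle      # about 0.637
# So roughly one cycle in three passes with no hit at all, even though
# the long-run average is one hit per 38-spin cycle.
```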
The simple example above is provided merely to illustrate the terminology. The concept becomes a little more complicated when we look at more complex bets, like betting 2 dozens and 2 columns for example, or using progressions.
We all know that the so-called even-money outside bets (like red/black for example) are close to 50/50 propositions. We also know that when you factor in the house edge, your chances of winning any particular "even-money" bet is a little less than 50%. In short, the "odds" are against you or in other words, you are more likely to lose this bet than to win this bet.
We also know that you can place bets that you are more likely to win but that the payoff is less than one to one. For example: Playing 2 columns gives you 24 of 38 chances to win, however the payoff is 1 to 2, you will be wagering two units in hopes of netting one.
My theory about "maximum coverage bets" (and I hope to come up with a better term than that) is that when you employ a progression, your chances of losing your series is drastically reduced.
NONE of this defeats the house edge I remind you. But, I accept that cold fact with all systems.
What I hope to develop here is a way of looking at cycles and maximum coverage bets to allow us to choose bets that will produce small but steady gains with rare losses (which will necessarily be large).

More later . . . .
The Cycle Within a Cycle
Using multiple levels of progression leads to bigger cycles containing smaller cycles. For example: Suppose your bet was a three-step martingale. You are betting on red and you bet one unit on your first bet, then double after a loss, and double again after a second loss. Your progression is 1 2 4. Each winning series results in a gain of 1 unit. Each losing series results in a loss of 7 units. We know that you can expect to lose a series about once in 8 series. Assume for this discussion that you are playing a true 50/50 game: a wheel with no house numbers, with exactly half of the numbers being red. Under these circumstances, you can expect to win 7 series and lose one. This is the first level of progression.
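A quick Monte Carlo of the fair-coin version above (assuming, as in the text, a true 50/50 game with no house numbers) confirms the +1/-7 series outcomes and the roughly one-in-eight losing series:

```python
import random

def play_series(rng):
    # Three-step martingale on a fair coin: bets of 1, 2, 4 units.
    # A win at any step nets +1 for the series; three straight
    # losses cost 1 + 2 + 4 = 7 units.
    for bet in (1, 2, 4):
        if rng.random() < 0.5:
            return 1
    return -7

rng = random.Random(17)  # seeded so the run is repeatable
results = [play_series(rng) for _ in range(100_000)]
loss_rate = results.count(-7) / len(results)  # close to 1/8
net_units = sum(results)                      # near zero on a fair game
```

The loss rate hovers near 12.5% and the net drifts around zero, which is the "break even on a no-edge game" expectation the post describes.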
Now suppose you decide to add another level of martingale. After a losing series, your first bet will be 2 and your progression will be 2 4 8. After a win, you will return to your original series.
Now look at the cycle. You have a cycle of 8 series where you can expect to win 7 series and lose one, breaking even for the cycle. This cycle can be expected to take 24 spins or decisions. By adding the 2nd-level martingale, you are increasing your net by +1. IF you experience the mathematical expectancy of a typical cycle, you will end your 24 spins up one unit instead of breaking even.
OF COURSE, there is another mathematical expectancy: that of losing back-to-back series. This other expectancy has another point in a larger cycle where you can expect to be brought back to zero or even (or to a negative amount equal to the house edge). In the original progression we saw that we can expect to lose one series out of eight. We then added a second level of progression, gambling that we would not encounter our one-in-eight losses back-to-back. How often will that happen? [I have notes elsewhere and I'll return to fill in this gap] This would be the bigger cycle. Eventually, you can expect to be brought back to even (or zero) when the bigger cycle runs its course. By adding yet greater levels of progression you are increasing the size of the cycle, and it is my theory that you are increasing the amount of time you can expect to be ahead of the game before being brought back to zero. AND MAYBE, if you have several tactics for stretching out the cycle of expectancy, then you can quit while ahead more often OR change strategies while ahead in the cycle.
More later . . .
The "No Lose" Expectancy
(Which, of course WILL Lose as expected)
As a general illustration of the discussion so far, I offer this example:
For this example, we are playing a wheel which produces 60 decisions an hour. We know that we can develop a system that has an expected loss of one time in 60 decisions. This one loss would be expected to eliminate all winnings from the cycle of 59 wins. If we play this game for only 30 minutes and IF we are ahead at 30 minutes, then we can quit under my notion of "going of half." The question then becomes, of all the 30 minutes sessions that we will play, how many will include the dreaded losing decision. Math would probably tell us half. Real play may show us something different. We know that IF we have one more winning 30 minute session than losing 30 minute session, then we'd be ahead in the big cycle. And if the sessions were kind enough to come evenly spread out, you'd always be only 30 minutes away from being ahead.
If we strip this theory down to its simplest form, it becomes WAY less attractive. Yet there is something about the more complicated version that I find appealing.
Here is the stripped down version: Suppose we are playing a true 50/50 game and the game produced its mathematically expected results with regularity in the short run. So that if you flat-bet and you encountered a win/loss series like this: W L W L W L W L W L, you would always be just one or two decisions away from a profit. Following through with our example above, you could always quit while ahead and it would be easy to do so.
We find this to be unappealing because we know that Roulette and Baccarat and other near 50/50 games do not produce reliable results in the short run. We know that the 50/50 game is very volatile and that it takes THOUSANDS of spins or decisions for the results to approach the expected 50/50 mark.
My theory is that the smaller the cycle, the more volatile and unpredictable the game. BUT on the other hand, the larger the cycle, the more predictable the game becomes.
I found an excellent article and example of a No Lose Expectancy System; I'll post a link here when I get my hands on it again.
More Later . . .

Wednesday, October 8, 2008

"Unlikely" vs. "Highly Unlikely"


The Difference Between an “Unlikely Event” and a “Very Unlikely Event”

I have mentioned elsewhere that “math” negates all systems. Or perhaps better put, the reality of systems (that they will not overcome the house edge) becomes apparent when one applies strict rules of mathematics.

But, now I’m looking at things a little differently:

To understand my new theory, let’s look at a very old one. Suppose we employ a very simple martingale system as follows (looking at a true 50/50 decision): We’ll bet one unit on the first decision and if we lose we will double the bet (2 units) on the second decision and then double again after a second loss. This three-step martingale gives us three shots at making a one unit profit. If we lose all three stages, we are out 7 units. This “beginner” approach tends to make us feel comfortable because we “know” that in a 50/50 game, we should be winning more often than one in three hands. Math tells us, on the other hand, that we will break even (no house edge in this hypothetical) because we “should” encounter the full-series loss of seven once for every 7 single-unit wins. As players we know that “in the short run” the 50/50 probability of a win is not reliable and that it would be well within the realm of possibilities to encounter a “bad run” of full-series losses in any particular session.

Math helps us see that although we have a 50% chance of winning the very next decision, we also have a 50% chance of losing. When looking at the proposition of winning at least one of our next three decisions, we see that the likelihood of losing all three is one in 8. This means that if we play 100 series of 3 hands, we should win 7/8ths of those series and lose one-eighth. We will win 1 unit for each successful series of three and lose 7 units for every full-series loss (or failure of our system). If we encounter a session where we get the "expected" results, we can expect to break even.
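The break-even bookkeeping can be tallied explicitly (my own sketch of the arithmetic in the paragraph above):

```python
# 100 series of three decisions on a fair 50/50 game, no house edge.
series = 100
p_series_win = 7 / 8                 # win at least one of three bets
winning_series = series * p_series_win   # 87.5 series, +1 unit each
losing_series = series - winning_series  # 12.5 series, -7 units each
net = winning_series * 1 + losing_series * (-7)
# net == 0.0: the 87.5 single-unit wins exactly cancel the
# 12.5 seven-unit losses, the break-even result described above.
```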

The lure of this system is (in part) based on the mathematical fact that we begin each series of three decisions with an 87.5% chance of winning that particular series. We all would admit that an 87.5% chance of winning anything "sounds" good. We know, however, that our 87.5% chance of winning 1 unit is counter-balanced with a 12.5% chance of losing 7 units. NOTHING we can do will overcome this fact.

So, why bother?

Well, let’s look at this fact from a different angle. Suppose a typical session at a roulette wheel is 300 spins. How many times would we expect to be brought back to zero using the three-step martingale mentioned above? Three hundred spins will produce one hundred series of three. We “should” win 87.5 of those series and lose 12.5. If we play for 5 hours (at 60 spins per hour), we’d be losing, or “brought back to even,” 2.5 times per hour. If we only played 30 minutes, we could expect to be brought back to even 1.25 times, and if we played for 15 minutes, theoretically, we would not encounter enough decisions (or series of decisions) to have one losing series.
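The per-session figures above follow from a small helper (the function name and defaults are my own, matching the post's assumptions of 60 spins per hour, 3-spin series, and a 1-in-8 series loss rate on the no-edge game):

```python
def expected_losing_series(minutes, spins_per_hour=60, p_loss=1/8):
    # Number of three-spin series played in the session,
    # times the chance each series ends in a full 7-unit loss.
    series = (minutes / 60) * spins_per_hour / 3
    return series * p_loss

# 5 hours  -> 12.5 losing series (2.5 per hour)
# 30 min   -> 1.25
# 15 min   -> 0.625 (less than one expected loss per session)
```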

[House Edge: Before you start writing me and saying that I left out the house edge, let me explain that I am trying to keep the math simple for the purpose of this discussion. Think of the game in my example as “no-edge” Roulette. At an American Roulette wheel (using the example above) you would expect to win 7 series and lose 1 series but instead of ending up at a break-even point, you would end up down 5.26% of a unit per unit wagered, because of the house edge. If you are interested in the “real” odds on an American roulette wheel, then you need to begin with the fact that a bet of one unit on black has a 47.4% (less than the 50% above) likelihood of winning, and if you set out to win at least one decision of the next three consecutive decisions, your likelihood of winning at least one of the next three would be 85.42% (which is of course below the 87.5% in the example above).]

My thought is that we can reduce the “likely” to the “highly unlikely” by increasing coverage of the layout. For example: Suppose we cover two-thirds of the layout instead of 50% as in the earlier example. We know that we cannot overcome the odds. We know that we will be brought back to even eventually. However, let’s look at the short run. Employing a three-step martingale, we are now betting that we will win at least one of the next three decisions and we have two-thirds of the layout covered. Instead of losing one in eight series of three, we now can expect to lose only one in 27 series of three.

(Of course we would expect to lose 26 units and once again be brought back to zero. The math would go like this: our first bet would be 2 units, one on each of 2 dozens for example; after a loss, our bet would have to be increased to 6 units (3 on each of 2 dozens); and after two losses, we’d increase our bet to 18 (or 9 on each of 2 dozens). This three-step martingale progression would produce a win of 1 unit for each successful series and lose 26 for each three-step loss.)
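The parenthetical progression can be verified step by step; a sketch of my own (note the third step totals 18 units, 9 on each dozen, which is what makes the 2 + 6 + 18 = 26-unit full-series loss add up):

```python
# Two-dozens martingale: each step bets equal units on two dozens.
# A winning dozen pays 2 to 1; the other dozen's stake is lost.
def series_outcomes(per_dozen_steps=(1, 3, 9)):
    # Returns (net result of a win at each step, total full-series loss)
    win_nets = []
    spent_so_far = 0
    for per_dozen in per_dozen_steps:
        # 2-to-1 payout on the winning dozen, minus the losing dozen's
        # stake, minus everything lost on earlier steps of the series.
        win_nets.append(per_dozen * 2 - per_dozen - spent_so_far)
        spent_so_far += per_dozen * 2
    return win_nets, spent_so_far

nets, total_loss = series_outcomes()
# nets == [1, 1, 1]: a win at any step nets exactly one unit
# total_loss == 26: three straight losses cost 2 + 6 + 18 units
```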

Let’s go back to our 5-hour session of 300 decisions. Again we are looking at this session as 100 series of three decisions. How many of those series can we expect to lose? The answer is about 3.7. How many can we expect to lose in an hour? The answer is about 0.74.

Now, math tells us that if we play this game for one hour, the odds are close to even that we will get through without a loss. In a loss-free hour we would see 60 decisions and win 20 units. And one might say that we are “due” a loss around the corner.

Stopping While Ahead and Minimizing Volatility

Suppose you played a game where you always won 50% of your decisions in the short run (like in 100 decisions); suppose also that this game rarely produced streaks of more than 4 or 5 losses in a row. It is likely that you could sit at any session and play this sort of game flat-betting and stop when up by one unit. Some sessions would only be one decision long and others might take 100 decisions, but under these circumstances you could almost certainly eventually hit a point where you were up by one single solitary unit. One might say that when you quit, you were due a loss on the very next decision. Being only up by one unit, you were due to be brought back to even. Nevertheless, with minimal volatility, you could always play until up by one unit. Eventually you’d get ahead.

Now shift back to the game where you are expected to lose 26 units once in every 27 series and win one unit in each of the other 26. Using my theory of stopping while ahead and minimizing the volatility, you should have many opportunities to stop when ahead by less than 26 units. Of course, if you started out losing, you’d have a long row to hoe. But math tells us that each bet you place has a 2 in 3 chance of winning, and the likelihood of losing three such bets in a row is less than 4%. The chances of getting to a point where you are ahead by half of a successful run (a win stop of 13 units) are very good indeed.
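The probabilities cited in this paragraph can be checked in a few lines (my own sketch, using the no-edge two-dozens coverage from the earlier example):

```python
# Covering 2 of 3 equally likely outcomes (no-edge version of
# betting two dozens): each bet wins with probability 2/3.
p_win_bet = 2 / 3
p_lose_three_straight = (1 - p_win_bet) ** 3   # 1/27, about 3.7%
p_win_a_series = 1 - p_lose_three_straight     # about 96.3%
# So a full three-step series loss is indeed "less than 4%" likely,
# and 26 of every 27 series are expected to end plus one unit.
```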

Final Thoughts

As with all systems, any play based on the ideas contained here needs to be played well-capitalized, and as always, successful players need to be disciplined and patient.

[PS – No doubt, these ideas have been around in one form or another for many, many years. I do not claim to be presenting any sort of original breakthrough. These ideas merely represent my current thoughts on a better way for me to look at the sobering application of math to gambling systems.]