*Ruby Nixson*

We all know someone who claims to have a foolproof gambling strategy, yet surprise surprise, they usually end up losing money. Often, this is because they haven’t stopped to think about the underlying maths…

Imagine you’re in a casino, and are lucky enough (you’ll see why soon) to find a game where on each individual turn, your chance of winning a point is 0.5, and your chance of losing a point is 0.5 – in other words the odds of winning or losing are 50:50. Suppose you start with 100 points and decide to stop when you reach 150 points (then you win), or 0 points (then you lose), whichever happens first. It would of course be great to know what the chance of losing is before you start playing, so you can make an informed decision about whether or not to play. So the question is, can we find this probability?

Luckily, being mathematicians, it turns out we can! Problems like these have been studied for centuries and are known as ‘**gambler’s ruin**’ problems.

We start by thinking about the possible outcomes of the first turn, of which there are 2:

- We can win a point, which has a 50% chance of happening. If this happens we’ll have 101 points and so we then need to know the probability of losing, now starting from 101 points.
- We can lose a point, which also has a 50% chance of happening, leaving us with 99 points. In this case we need the probability of losing, given that we start from 99 points.

If we say that P(100) is the probability of losing, starting from 100 points, and P(101), P(99) are similar for 101 and 99 starting points, then based on our observation of what can happen on the first turn, we need to solve the equation:

P(100) = (0.5 * P(99)) + (0.5 * P(101))

More generally, if we start with n points, this says:

P(n) = (0.5 * P(n-1)) + (0.5 * P(n+1))

(Note that if we start with 0 points, we’ve automatically lost, whereas starting with 150 points means we’ve automatically won and we don’t play.)
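Before solving the equation exactly, we can estimate the answer by simulation. Here is a quick Monte Carlo sketch (the function name and parameters are my own, for illustration only). The game is scaled down from start 100 / target 150 to start 10 / target 15 so it runs quickly; as the solution derived below will show, the fair game's loss probability depends only on the ratio of starting points to target, so the answer is the same.

```python
import random

def simulate_game(start=10, target=15, p_win=0.5, trials=30_000, seed=1):
    """Estimate the probability of losing (hitting 0 before `target`)
    by playing the game many times. A rough Monte Carlo sketch."""
    random.seed(seed)
    losses = 0
    for _ in range(trials):
        points = start
        while 0 < points < target:
            # each turn: win a point with probability p_win, else lose one
            points += 1 if random.random() < p_win else -1
        if points == 0:
            losses += 1
    return losses / trials

print(simulate_game())  # close to 1/3 for the fair game
```

The estimate hovers around 0.33, which is exactly what the analysis below predicts.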

To solve this equation we begin by looking at something called the ‘characteristic equation’ of the problem, which in this instance is x = 0.5 + 0.5x^{2}. (More details on where this comes from can be found here.)

The characteristic equation is a quadratic equation and can be solved (for example, by using the quadratic formula) to find that it has the repeated root x = 1. Because the root is repeated, the ‘general form’ of the solution picks up an extra factor of n:

P(n) = (A + B*n) * 1^{n}.

But 1^{n} means multiplying 1 by itself n times, which is always just 1. So our general solution is in fact:

P(n) = A + B*n

where we just need to work out what A and B are. These are calculated by ensuring the equation satisfies P(0) = 1, since we always lose if starting with 0 points, and P(150) = 0, since we always win when starting with 150 points and so the probability of losing is zero. To do this, we substitute in n = 0 and n = 150 and solve the system of simultaneous equations:

1 = P(0) = A + (B*0) = A

0 = P(150) = A + (B*150) = A + 150B

The first equation gives A = 1, and the second then gives B = -1/150. So our final solution is:

P(n) = 1 – n/150 and therefore, P(100) = 1 – 100/150 = 1 – 2/3 = 1/3.
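We can sanity-check this solution in a few lines of code (a sketch of my own, not from the article): the closed form should hit both boundary conditions and satisfy the recurrence at every interior point. Using exact fractions avoids any floating-point rounding.

```python
from fractions import Fraction

def p_lose(n, target=150):
    """Closed-form ruin probability for the fair game: P(n) = 1 - n/target."""
    return 1 - Fraction(n, target)

# boundary conditions: certain loss from 0 points, certain win from 150
assert p_lose(0) == 1 and p_lose(150) == 0

# the recurrence P(n) = 0.5*P(n-1) + 0.5*P(n+1) holds for every interior n
for n in range(1, 150):
    assert p_lose(n) == Fraction(1, 2) * (p_lose(n - 1) + p_lose(n + 1))

print(p_lose(100))  # 1/3
```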

So, we only have a 33.3% chance of losing! The odds look pretty good here, and we’d be tempted to play. But what happens if the chance of winning a point on a turn goes down to 47% (and the chance of losing a point increases to 53%)? It’s only a small change in the odds, so we expect only a small change in the probability that we lose, right? Let’s calculate in the same way as before:

P(100) = (0.53 * P(99)) + (0.47 * P(101))

(Note that the only things that have changed from the first equation are the probabilities of winning and losing.)

Proceeding as before we then have, more generally:

P(n)= (0.53 * P(n-1)) + (0.47 * P(n+1))

and we can then solve using the same method as above to get:

P(n) = [(0.53/0.47)^{n} – (0.53/0.47)^{150}] / [1 – (0.53/0.47)^{150}]

See here for more notes on how to solve equations like these, known as **recurrence relations**.

If we substitute in n = 100 to the solution above, we get P(100) = 0.9975. So, even though a 47% chance of winning a point doesn’t *appear* to be too much worse than our original 50% chance, the probability of losing is now 99.75%, which, I’m sure we can all agree, doesn’t look great.
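The arithmetic is easy to verify directly. This sketch (my own naming, not from the article) evaluates the closed form above with r = q/p = 0.53/0.47; note the formula only applies when p ≠ 0.5, since r = 1 would make it 0/0.

```python
def p_lose_biased(n, target=150, p=0.47):
    """Ruin probability for the biased game (requires p != 0.5):
    P(n) = (r**n - r**target) / (1 - r**target), where r = q/p."""
    r = (1 - p) / p
    return (r**n - r**target) / (1 - r**target)

# boundary checks: certain loss from 0, certain win from the target
assert p_lose_biased(0) == 1.0
assert p_lose_biased(150) == 0.0

print(round(p_lose_biased(100), 4))  # 0.9975
```

A three-percentage-point shift in the per-turn odds takes the chance of ruin from 1/3 to very nearly 1.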

In a game of this sort, casinos will **always** set the odds of winning to be in their favour – they know the maths behind the games and so will never offer a game where the gambler has the advantage. A gambler might be able to take the advantage in certain rounds, but will never have an overall advantage. If they did, every casino would be out of business and we wouldn’t even be considering this problem!

So then, why do some gamblers continue to play even after losing large sums of money? One possible reason relates to a phenomenon known as the “gambler’s fallacy” or the “Monte Carlo fallacy”. This is the belief that if an event has happened more frequently than is deemed ‘normal’ in the past, then the event is less likely to happen in the future. So after a string of losses, gamblers often believe they are due some “good luck” and so continue betting thinking that a win has to come soon.

This is in fact a terrible idea, largely due to a misunderstanding of how independence of events works in these games. For an example of truly independent events, consider rolling 2 dice. The value on one die has absolutely no effect on the value of the second die, so the rolls are said to be independent. Another example you might not have considered is the National Lottery: the sequence 1,2,3,4,5,6 is just as likely to come up as any other sequence of 6 numbers, but most people will choose a more jumbled sequence, believing (incorrectly) that it is more likely to appear. In the context of gambling, the number of wins and losses of a player in a random game are also independent – so no matter what has happened previously, the probability of winning and losing remains the same in each game.
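Independence is easy to demonstrate empirically. The sketch below (illustrative names and parameters of my own) simulates a long run of fair coin flips and measures how often a win follows a losing streak. If the gambler's fallacy were right, the win rate after a streak would be above 0.5; independence says it should stay at 0.5 regardless.

```python
import random

def win_rate_after_streak(streak=5, flips=200_000, seed=7):
    """Flip a fair coin `flips` times and record how often a win
    immediately follows a run of at least `streak` losses."""
    random.seed(seed)
    run = 0          # current length of the losing streak
    wins_after = 0   # wins observed right after a long streak
    total_after = 0  # flips observed right after a long streak
    for _ in range(flips):
        win = random.random() < 0.5
        if run >= streak:
            total_after += 1
            wins_after += win
        run = 0 if win else run + 1
    return wins_after / total_after

print(win_rate_after_streak())  # stays close to 0.5
```

However long the streak, the next flip is unmoved: no amount of past bad luck makes a win "due".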

The most famous example of the gambler’s fallacy in practice occurred in 1913 at the Monte Carlo Casino. During a roulette game, the ball landed on black 26 times in a row. As the streak of blacks continued, gamblers falsely believed that red was becoming more likely, and began placing large bets on that outcome. But, by the independence of each separate spin, the chance of a red result has nothing to do with the previous results, and by the time red finally came up, it is reported that millions of dollars had been gambled away.

We’ve all fallen victim to the gambler’s fallacy at some point, and most of the time it’s harmless, but it’s important to always think about the independence of events and not let the hope of good luck take your money. If in doubt, do the maths!
