This is an interesting example from decision theory (which is closely linked to game theory), known as the St. Petersburg paradox.

A (very generous!) casino offers you a game where the pot starts at $1 and on each turn a fair coin is tossed. If it comes up heads then the pot is doubled; if it comes up tails then you win whatever is in the pot.

How much would you pay to play this game?

Half the time a tail comes up on the first coin toss and you win $1. Half the time you get a head, the pot doubles to $2, and you get to toss the coin again.

On the second toss, half the time you get a tail and win the $2 and half the time you get a head again and the pot doubles to $4 and you get to toss again.

Overall you get the following pattern: half the time you win $1, a quarter of the time you win $2, an eighth of the time you win $4, and so on. In general, you have a 1 in 2^k chance of winning $2^(k−1). The amount you win gets bigger and bigger, but the chance of winning that amount gets smaller and smaller.
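This pattern is easy to check with a quick simulation. Here is a sketch in Python (the function name `play_once` is my own):

```python
import random

def play_once(rng=random):
    """Play one round: the pot starts at $1 and doubles on each head;
    the first tail ends the game and pays out the pot."""
    pot = 1
    while rng.random() < 0.5:  # heads: double the pot and toss again
        pot *= 2
    return pot  # tails: you win whatever is in the pot

# Roughly half of all plays pay $1, a quarter pay $2, an eighth pay $4, etc.
```

Every payout is a power of two, and the frequencies halve at each step, just as described above.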

This means that you should expect to win:

(1/2 x $1) + (1/4 x $2) + (1/8 x $4) + (1/16 x $8) …

Every term in this sum works out to 50 cents, and there are an infinite number of them, so the sum is

50c + 50c + 50c …

which is infinite.

So, by an expected value argument, you should be willing to pay any finite amount to play the game, because your expected winnings are infinite.
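A short numerical check of the series, sketched in Python with exact fractions (`expected_value` is a name I've chosen):

```python
from fractions import Fraction

def expected_value(n_terms):
    """Sum the first n_terms of the series: the k-th term is
    P(first tail on toss k) * payout = (1/2**k) * 2**(k-1)."""
    return sum(Fraction(1, 2**k) * 2 ** (k - 1) for k in range(1, n_terms + 1))

# Each term is exactly 1/2, so the partial sum is n_terms / 2
# and grows without bound as you add more terms.
```

No matter how many terms you take, the partial sum is just half the number of terms, so it never settles on a finite value.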

However, you almost certainly wouldn’t pay a massive amount to play the game. A number of arguments have been put forward to explain this:

Daniel Bernoulli in 1738 was the first to attempt to resolve the paradox. He suggested that because each extra bit of money means less to you, you won’t value the later, larger prizes as much (diminishing marginal utility). Basically, if you have nothing then $1,000 is a lot of money; if you have $100 million then an extra $1,000 is irrelevant.
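Bernoulli’s idea is usually formalised with a logarithmic utility function: a player who values a prize of $x at log(x) has a finite expected utility for this game even though the expected payout is infinite. A sketch under that assumption (the function names are my own):

```python
import math

def expected_log_utility(n_terms=60):
    """Expected log-utility of the game: sum over k of
    P(first tail on toss k) * log(payout) = (1/2**k) * log(2**(k-1)).
    Unlike the raw expected payout, this series converges (to log 2)."""
    return sum((1 / 2**k) * math.log(2 ** (k - 1)) for k in range(1, n_terms + 1))

# The certainty equivalent is exp(expected utility): the sure amount
# a log-utility player would value as much as one play of the game.
certainty_equivalent = math.exp(expected_log_utility())
```

Under this assumption the certainty equivalent comes out to about $2: a far cry from infinity, and much closer to what people actually say they would pay.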

Another argument is that people don’t imagine that such a long string of heads is possible. Most people instinctively feel that after a long run of heads a tail is more likely, even though it isn’t (the gambler’s fallacy). That’s why casinos display the recent numbers that have come up on a roulette table: people look for patterns to base their next bet on.

One more argument is that the casino only has a limited amount of money and cannot pay out an infinite amount. If the casino is only willing to risk $1,000,000 on the bet then the expected payout drops dramatically. It takes a run of 20 heads to take the payout over $1,000,000, so the expected payout drops to a bit over $10. The high expected value comes from some incredibly rare but extraordinarily large payouts; without these the value drops substantially.
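You can check the figure in this last argument directly. A sketch in Python (`capped_expected_value` is my own name):

```python
def capped_expected_value(bankroll):
    """Expected payout when the casino can pay at most `bankroll`.
    Winning on toss k pays min(2**(k-1), bankroll) with probability 1/2**k."""
    ev = 0.0
    k = 1
    while 2 ** (k - 1) < bankroll:  # full payout still affordable
        ev += (1 / 2**k) * 2 ** (k - 1)  # each of these terms is $0.50
        k += 1
    # every longer run of heads just pays out the whole bankroll
    ev += bankroll / 2 ** (k - 1)
    return ev
```

With a $1,000,000 bankroll the first 20 terms contribute $0.50 each ($10 in total), and all the capped terms beyond them add only about another dollar, so the game is worth roughly $11, not infinity.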

Which explanation do you prefer for the paradox, or do you have your own?

Image courtesy of marin / FreeDigitalPhotos.net