This week's brainteaser is about a very simple card game that works as follows:

You have a standard 52-card deck that is properly shuffled and stacked with all cards face down. The dealer is going to flip over one card at a time. Before the dealer flips each card, you can place a $100 bet. If the next card is red (a heart or a diamond), you win $100. If the next card is black (a club or a spade), you lose $100.

**Win or lose, the game ends at that point, and the deck is reset and reshuffled. If you don't bet before the second-to-last card is flipped over, then you are required to bet on the last card.**

What is the optimal strategy for this game? How much will you win on average if you play optimally?

This game doesn't relate directly to poker, but to solve it you need to understand backwards induction, a key game-theory concept needed to solve almost any real-world poker situation.

Full disclosure: I did not invent this brainteaser, but despite a fair amount of googling I can't track down who did; it was passed on to me via word of mouth years ago. If anyone knows the creator of this puzzle, please let me know, as I'd love to cite him or her.

This problem is being discussed on reddit here if you want to join the discussion. And if you missed last week's brainteaser, be sure to check it out here.

**Update: it's Monday, and the solution has been posted here.**

You can play this game (for pretend money, of course) below if you are using an up-to-date browser:
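If you'd rather let a computer do the backwards induction than grind through it by hand, here is a minimal Python sketch (my own, not the official solution; the state encoding and function names are assumptions). It models the state as (reds remaining, blacks remaining), assumes you may bet at most once, that the game ends after the bet, and that you must bet on the last card; it uses exact fractions to avoid rounding:

```python
# Backward induction over deck states for the red/black betting game.
# Assumes the rules stated above: one bet ends the game, and the bet
# is forced on the last card if you have not bet earlier.
from fractions import Fraction
from functools import lru_cache

@lru_cache(maxsize=None)
def value(r, b):
    """Expected profit with r red and b black cards left, playing optimally."""
    total = r + b
    bet_now = Fraction(100) * (r - b) / total  # +$100 on red, -$100 on black
    if total == 1:
        return bet_now  # forced to bet on the last card
    # Waiting burns one card: red with prob r/total, black with prob b/total.
    wait = Fraction(0)
    if r:
        wait += Fraction(r, total) * value(r - 1, b)
    if b:
        wait += Fraction(b, total) * value(r, b - 1)
    return max(bet_now, wait)

print(value(26, 26))  # expected profit of the full 52-card game
```

Memoization keeps this cheap: there are only 27 × 27 reachable states, so inducting through all 52 cards is nothing like enumerating the 52! shuffles.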

Here are my thoughts, which may be wrong due to a calculation error but seem intuitive:

Backwards induction for both a four- and six-card game suggests there is no difference between betting and not betting at the first card. The stipulation that one must bet on the final card removes the slight advantage that would otherwise come from trying for a more favorable ratio of colors, because it counteracts every potential positive outcome with a forced loss. It is unreasonable and impractical to conduct backwards induction for the 52-card game by hand, but the analysis can be extrapolated from the games of manageable size; it becomes clear that every game starting with an even ratio is composed of even subgames (with expected value zero) and offsetting uneven subgames. Therefore the optimal strategy is simply to bet on the first card every time, since you are not indifferent to the time it would take to progress through a game. Under this logic, of course, the real optimal strategy is not to play.

This is pretty good logic, and you are right in the 4- and 6-card cases. There are techniques for backwards inducting through all 52 cards, which I will show in the solution, and there are cases where things appear true for small numbers like 4 and 6 and then shift at some critical point. I won't give away the solution by saying whether this is one of those cases or not :)

It also seems true for eight and ten cards, so I'm a little skeptical that it shifts at a critical point. Unlike a game that does shift (say, the pirate game once you extend beyond 2*g), there is no numerical constraint that would alter the play. However, I'm incredibly rusty at game theory, such that I don't remember anything beyond basic backwards induction, so I could be missing something obvious. I look forward to seeing the more advanced techniques and the solution.

As an aside, please keep posting these. They're a nice way to practice game theory.

For the four-card game, I actually find that the EV is positive. If you bet with four cards, the EV is obviously 0. If you draw, then you either end up with [2R,1B] or [1R,2B]. For [2R,1B], betting gives EV = +100/3, but waiting gives an EV = +100/2 (0.5 chance of [2R,0B], which has EV = +100, and 0.5 chance of [1R,1B], which has EV = 0). Thus for [2R,1B] you should draw, with EV = +50. For [1R,2B], the EVs are reversed, with an EV of -100/3 for betting and -50 for drawing. Thus for [1R,2B] you should bet, with overall EV = -100/3. Overall, the EV for drawing in the 4-card game is: 0.5*EV(draw,[2R,1B]) + 0.5*EV(bet,[1R,2B]) = +25/3.

So the strategy for the 4-card game is to first draw a card. If the card is red, then bet on the next card being red. If the card is black, keep drawing until you draw the last black card from the deck or are forced to bet on the last card.

I don't think this is quite right. "For [2R,1B], betting gives EV = +100/3, but waiting gives an EV = +100/2 (0.5 chance of [2R,0B], which has EV = +100, and 0.5 chance of [1R,1B], which has EV = 0)." You have a 2/3 chance of going to [1R,1B] and a 1/3 chance of going to [2R,0B].

Actually, no, never mind; I think I screwed up the math on that (wrong probabilities for waiting on [2R,1B]). The Anon is correct.
