Comments on the GTORangeBuilder Blog

Sure, you can add that data to the post. I can give it a second check to be sure the figures are fine.
-- BlackLoter, 2014-04-14

You are right that the Nash equilibrium strategy has him playing scissors as often as we play rock, but he can lose less than a quarter of the time. If you tell me the exact strategy that you are proposing we play, I can tell you the counter-strategy for him that loses less than 1/4 of the time.
-- Anonymous (https://www.blogger.com/profile/01506290885754080155), 2014-04-14

Then our strategy should be to play rock as often as our opponent plays scissors, and paper the rest of the time. Then:
we win = r * (1 - s) + s * s
tie = p * (1 - s) + r * s
we lose = s * (1 - s)
Because r >= 1/2 and s <= 1/2, my opponent's strategy of matching our paper with scissors as much as possible would push him to play 1/2 scissors to counteract our strategy, and losing 1/4 of the time is the best he can do.
-- Anonymous, 2014-04-14

I'm not 100% sure I understand the strategy you are proposing, but if you are saying we should play 50% rock, 50% paper, our opponent could always play paper when
he is allowed to and effectively be playing 50% rock and 50% paper as well. We would then break even against him.
-- Anonymous (https://www.blogger.com/profile/01506290885754080155), 2014-04-14

I don't buy any of this. Please explain where I'm wrong.

Over time my opponent will know my strategy and I'll know his.

My opponent would just match his scissors % to my paper %, insofar as he could.

My paper % should always be >= 50%, as that is where my advantage lies.

Therefore my opponent can best counter that advantage by playing scissors 1/2 the time.

My only gain then is when I play rock to counteract the scissors.

That gives me:
1/4 P v R = +1/4
1/4 P v S = -1/4
1/4 R v S = +1/4
1/4 R v R = 0

That means I win 1/4 of the time.

If I play 2/3 paper, I only win 1/6 of the time vs. 1/2 R & 1/2 S.
-- Anonymous, 2014-04-14

Awesome, thanks for doing that. What program did you use, if I may ask?

The vs. Tight numbers are interesting, especially how giant the exploitative leak is. If it's okay with you, I'll edit the post to mention your data.
-- Anonymous (https://www.blogger.com/profile/01506290885754080155), 2014-04-14

I improved your analysis in the push/fold scenario using a better program that can compute the exact gains in each scenario.
I supposed the fish will employ the best ranges (i.e. the best 35% of hands he should call with, as opposed to a suboptimal 35% of hands), which means that in reality he may obviously do worse.
These are the results I got:

                 GTO bb/100   Max Expl. bb/100   GTO WR %   Expl. Leak
vs GTO               0               0           100.00%        0
vs Small Fish        0.39            0.81         48.15%       -1.26
vs Med. Fish         1.77            3.54         50.00%       -2.82
vs Huge Fish        21.18           26.04         81.34%       -2.64
vs Tight 20%         1.47            8.82         16.67%      -27.63
vs vTight 14%        5.07           28.14         18.02%      -41.79
-- BlackLoter, 2014-04-14

Actually, no, never mind. I think I screwed up the math on that (wrong probabilities for waiting on [2R,1B]). The Anon is correct.
-- Bryan (https://biophysicalchemistry.wordpress.com/), 2014-04-13

I don't think this is quite right: "For [2R,1B], betting gives EV = +100/3, but waiting gives an EV of +100/2 (0.5 chance of [2R,0B], which has EV = +100, and 0.5 chance of [1R,1B], which has EV = 0)." You have a 2/3 chance of going to [1R,1B] and a 1/3 chance of going to [2R,0B].
-- Anonymous (https://www.blogger.com/profile/01506290885754080155), 2014-04-13

For the four-card game, I actually find that the EV is positive. If you bet with four cards, the EV is obviously 0. If you draw, then you end up with either [2R,1B] or [1R,2B]. For [2R,1B], betting gives EV = +100/3, but waiting gives an EV of +100/2 (0.5 chance of [2R,0B], which has EV = +100, and 0.5 chance of [1R,1B], which has EV = 0).
Thus for [2R,1B] you should draw, with EV = +50. For [1R,2B], the EVs are reversed, with an EV of -100/3 for betting and -50 for drawing. Thus for [1R,2B] you should bet, with overall EV = -100/3. Overall, the EV for drawing in the 4-card game is: 0.5*EV(draw, [2R,1B]) + 0.5*EV(bet, [1R,2B]) = +25/3.

So the strategy for the 4-card game is to first draw a card. If the card is red, then bet on the next card being red. If the card is black, keep drawing until you draw the last black card from the deck or are forced to bet on the last card.
-- Bryan (https://biophysicalchemistry.wordpress.com/), 2014-04-13

It also seems true for eight and ten, so I'm a little skeptical that it shifts at a critical point. Unlike a game that does shift (say, the pirate game once you extend beyond 2*g), there is no numerical constraint that would alter the play. However, I'm incredibly rusty at game theory, to the point that I don't remember anything beyond basic backwards induction, so I could be missing something obvious. I look forward to seeing the more advanced techniques and the solution.

As an aside, please keep posting these. They're a nice way to practice game theory.
-- Anonymous, 2014-04-12

This is pretty good logic, and you are right in the 4- and 6-card cases. There are techniques for backwards inducting through all 52 cards, which I will show in the solution, and there are cases where things appear true for small numbers like 4 and 6 and then shift at some critical point.
I won't give away the solution by saying whether this is one of those cases or not. :)
-- Anonymous (https://www.blogger.com/profile/01506290885754080155), 2014-04-12

You are definitely correct, that is an error, sorry about that. It doesn't affect the solution, and I've corrected it above. Thanks for pointing it out!

Glad you like the brainteasers. :)
-- Anonymous (https://www.blogger.com/profile/01506290885754080155), 2014-04-12

I'm loving these brainteasers, and I really hope to see more of them.

One thing, though, that I'm a bit confused about: in the bonus puzzle solution, when we solve for player 1's EVs, why is the second-round EV represented by 50 * r2? Should it not be 50 * (1 - r2), since we only auto-win the second round if player 2 *doesn't* play rock? It's also inconsistent with one of the following paragraphs saying that if we plug in r2 = p2 = s2 = 1/3 we get an EV of $33.33, which we don't, since it simplifies to 50 * 1/3 = $16.67. Seems like an error to me...
-- Michael (https://www.blogger.com/profile/09725266426672372476), 2014-04-11

Here are my thoughts, which may be wrong due to calculation error but seem intuitive:

Backwards induction for both a four- and a six-card game suggests there is no difference between betting and not betting at the first card.
The stipulation that one must bet on the final card removes the slight advantage that would otherwise come from trying for a more favorable ratio of colors, because it counteracts every potential positive outcome with a forced loss. It is unreasonable and impractical to conduct backwards induction for the 52-card game, but the analysis can be extrapolated from the games of manageable size; it becomes clear that every game starting with an even ratio is composed of even subgames (with expected value zero) and offsetting uneven subgames. Therefore the optimal strategy is simply to bet on the first card every time, since you are not indifferent to the time it would take to progress through a game. Under this logic, of course, the real optimal strategy is not to play.
-- Anonymous, 2014-04-11

[This comment has been removed by the author.]

That's nice and clean math.
-- k43r, 2014-04-08

Repost of my first comment, which failed to appear:

Bonus:
The optimal strategy, if my opponent is intelligent and can predict my moves as I can predict his, would be for me to make a random choice (heads or tails) between paper and rock in the first round, and for him to do the same.
I would then gain $25 on average, which is what I would be willing to pay.
(By "on average" here I mean the mean of all possible results.
I know there are only two rounds.)

Explanation:
The only difficulty lies in the strategy for the first round. In the second round, my opponent will play rock if he hasn't played rock before, and a random choice of P/R/S if he has. I will play paper if he hasn't played rock before, and gain $50, or play a random selection of P/R/S and gain $0 on average if he has played rock before.

There are only 9 possible games for the first round:
If I play R and he plays R, I will gain $0 on average for the two rounds.
If I play R and he plays P, I will gain $0 on average for the two rounds.
If I play R and he plays S, I will gain $100 for the two rounds.

If I play P and he plays R, I will gain $50 on average for the two rounds.
If I play P and he plays P, I will gain $50 for the two rounds.
If I play P and he plays S, I will gain $0 for the two rounds.

If I play S and he plays R, I will lose $50 on average for the two rounds.
If I play S and he plays P, I will gain $100 on average for the two rounds.
If I play S and he plays S, I will gain $50 on average for the two rounds.

So if my opponent plays randomly in the first round, all options will give me the same gain on average.
But if I play randomly, the situation is very different for him, as he can minimize my gain to an average of zero by playing rock.
But of course I could predict that and play paper in the first round, gaining $50 if he has played rock.
Which he could then predict, and play scissors to beat me, which I could predict and play rock, which he could predict, which I could predict...
If I can predict his move I can counter him, and if he can predict mine he can counter me.
So my best choice is to be unpredictable, and that goes for him too, while I still keep in mind that playing rock first is the best strategy for him if I make a random choice between the three options.
So my best choice
is to make a random choice between paper and rock, thus optimizing my results against rock while remaining unpredictable.
If he can predict that, it will be best for him to avoid playing scissors, because against my rock or paper his scissors will gain me $50 on average, instead of $25 for his rock or paper against mine.
-- rickadin, 2014-04-07
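Taking the per-outcome totals in the comment above at face value, the first round reduces to a 3x3 matrix game, and the two proposed mixes can be checked directly. A minimal sketch (the matrix entries are simply the dollar totals listed above, so it inherits any error in that table):

```python
from fractions import Fraction as F

# First-round reduction of the two-round game, using the per-outcome
# totals listed above (rows: we play R/P/S, cols: he plays R/P/S;
# entries are our expected gain in dollars over both rounds).
M = [[F(0),   F(0),   F(100)],
     [F(50),  F(50),  F(0)],
     [F(-50), F(100), F(50)]]

def guarantee(our_mix):
    """Worst-case EV of our mix against any pure first-round reply."""
    return min(sum(our_mix[i] * M[i][j] for i in range(3)) for j in range(3))

coin_flip  = [F(1, 2), F(1, 2), F(0)]  # 50/50 rock/paper, as proposed above
two_thirds = [F(1, 3), F(2, 3), F(0)]  # 1/3 rock, 2/3 paper

print(guarantee(coin_flip))    # 25
print(guarantee(two_thirds))   # 100/3, i.e. about 33.33

# A uniform reply caps every one of our rows at 100/3, so under this
# table the value of the reduction is exactly 100/3.
for i in range(3):
    assert sum(F(1, 3) * M[i][j] for j in range(3)) <= F(100, 3)
```

So under this table the coin-flip mix guarantees only $25, while 1/3 rock, 2/3 paper guarantees $33.33.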
Follow-up to my earlier comment:
I got it wrong. I can still improve my strategy and pay as much as $33.33 to play if I throw a die and play paper on 1-4 and rock on 5-6: a 2/3 chance of paper and a 1/3 chance of rock, which of course just matches the result of the first problem.
In that case there is no optimal strategy for my opponent; he can play whatever he wants.
If I increase the odds that I play paper to more than 50%, then it becomes interesting for him to play scissors, which will give me a $0 gain against my paper, but it is balanced by the risk that he might lose $100 if I still play rock. A 2/3 chance of paper vs. a 1/3 chance of rock is the equilibrium. Whatever he chooses to play then will make me gain $33.33 on average, whether he plays scissors or not.
-- rickadin, 2014-04-07

I can see that several other people have come up with this answer already, but I will post my solution as well.

The expected value (EV) of regular RPS is 0 for both players, using the strategy of 1/3 for each possibility. The constrained player is unable to play this strategy, so EV(us) > 0.

Because the constrained player must play rock at least 1/2 of the time, if we play scissors with P > 0, then we will lose at least 50% of the time. If the game is to have positive EV, then we should be able to do better.

In any Nash equilibrium, neither player can do better by changing their strategy. After ruling out every pure strategy (which I will leave up to the reader), we can see that mixed strategies are required.
In any mixed-strategy equilibrium, players will be indifferent between the strategies that they play with non-zero probability (otherwise they could do better by adjusting the probabilities).

We want to find probabilities that will make our opponent indifferent between playing paper and scissors. Therefore:
Payoff(opponent, scissors) = P(us, paper) - P(us, rock)
and Payoff(opponent, paper) = P(us, rock) - P(us, scissors)
are equal. But P(us, scissors) = 0, so this simplifies to:

P(us, paper) - P(us, rock) = P(us, rock), or
P(us, paper) = 2 * P(us, rock).

Since P(us, paper) + P(us, rock) + P(us, scissors) = 1, we get:
3 * P(us, rock) = 1, or
P(us, rock) = 1/3, and
P(us, paper) = 2/3.

In an equilibrium, we must be indifferent between paper and rock. Using the same logic as above, we get P(opponent, scissors) = 2/3 and P(opponent, paper) = 1/3; otherwise our payoff would be higher under rock or paper.

From this we can calculate the EV of 100/6 = 16.66.
-- Anonymous, 2014-04-07
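The indifference conditions derived above can be checked with a short script. A minimal sketch; the $100-per-win stake is an assumption inferred from the 100/6 figure quoted in the thread:

```python
from fractions import Fraction as F

# One-throw payoff to us in dollars (rows: our R/P/S, cols: his R/P/S).
# Win = +100, loss = -100, tie = 0 (stake size assumed, see above).
A = [[F(0),    F(-100), F(100)],
     [F(100),  F(0),    F(-100)],
     [F(-100), F(100),  F(0)]]

us  = [F(1, 3), F(2, 3), F(0)]    # our mix: rock 1/3, paper 2/3, scissors 0
opp = [F(1, 2), F(1, 6), F(1, 3)]  # his overall mix: forced rock 1/2, then
                                   # paper 1/3 and scissors 2/3 of his free half

def our_ev_vs_pure(col):
    """Our EV if the opponent commits to column `col`."""
    return sum(us[i] * A[i][col] for i in range(3))

def our_ev_of_pure(row):
    """Our EV if we commit to row `row` against his overall mix."""
    return sum(opp[j] * A[row][j] for j in range(3))

# He is indifferent between his free options, paper (1) and scissors (2):
assert our_ev_vs_pure(1) == our_ev_vs_pure(2) == F(-100, 3)
# We are indifferent between rock (0) and paper (1), and scissors is worse:
assert our_ev_of_pure(0) == our_ev_of_pure(1) == F(100, 6)
assert our_ev_of_pure(2) < F(100, 6)

ev = sum(us[i] * opp[j] * A[i][j] for i in range(3) for j in range(3))
print(ev)  # 50/3, i.e. about $16.67 per play
```

Exact fractions are used rather than floats so the indifference checks are equalities, not approximate comparisons.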
[This comment has been removed by the author.]
Spoiler:

The most you should be willing to pay is 100/6 = $16.66.

I think the equilibrium strategy is for the unconstrained player to play paper 2/3 of the time and rock 1/3 of the time. The constrained player plays scissors 2/3 and paper 1/3 of the time he gets to choose.
-- Anonymous, 2014-04-06

That is entertaining. I like that a lot, thanks for sharing!
-- Anonymous (https://www.blogger.com/profile/01506290885754080155), 2014-04-06

Your explanation was not WLOG. Using your logic, choose paper 100%. Then WLOG fix R at 100%. You win every game! Wow, clever!

Let's look at a better strategy against yours. Suppose your opponent chooses scissors and rock just as often. Then, using your strategy, the expected number of wins is: .5(2/3 - 1/6) + .5(-2/3 + 1/6) = 0. You break even. If you never play scissors, this works out better.
-- Anonymous, 2014-04-06
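The back-and-forth above about counter-strategies can be settled numerically: given any mix for us, the constrained opponent's best response puts his forced half on rock and his free half on whichever pure move is worst for us (possibly rock again). A minimal sketch, again assuming $100 stakes per win as inferred from the 100/6 figure:

```python
from fractions import Fraction as F

MOVES = ["R", "P", "S"]
BEATS = {"R": "S", "P": "R", "S": "P"}  # key beats value

def payoff(ours, theirs):
    """Our payoff in dollars for one throw ($100 stake assumed)."""
    if ours == theirs:
        return F(0)
    return F(100) if BEATS[ours] == theirs else F(-100)

def best_response_ev(our_mix):
    """Our EV when the opponent best-responds under the rock >= 1/2 constraint.

    our_mix maps move -> probability. The opponent's EV is linear in his
    mix, so he puts the forced half on rock and the free half on whichever
    pure move is worst for us."""
    ev_vs = {m: sum(p * payoff(o, m) for o, p in our_mix.items()) for m in MOVES}
    return F(1, 2) * ev_vs["R"] + F(1, 2) * min(ev_vs.values())

fifty_fifty = {"R": F(1, 2), "P": F(1, 2)}  # the 50/50 rock/paper proposal
equilibrium = {"R": F(1, 3), "P": F(2, 3)}  # rock 1/3, paper 2/3

print(best_response_ev(fifty_fifty))   # 0
print(best_response_ev(equilibrium))   # 50/3
```

This matches the thread: against 50/50 rock/paper the constrained player can break even (he answers the free half with paper), while against 1/3 rock, 2/3 paper the best he can do still leaves us 100/6, about $16.67 per play.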