A Strange Game

Suppose that you go into an alien casino on the planet Zreebnorf, and you are offered the chance to play a game. It works like this. The casino will match you with another player at random. The players don’t know who they are playing with, so there is no way for them to coordinate their actions or reciprocate after the game. Both players secretly pick a number between 1 and 100. The outcome of the game is then calculated as follows.

If both players picked the same number, let X be the number they picked. Both players will receive X credits from the house and have to pay (100 − X) credits to the house.

If the players picked different numbers, let X be the smaller number. The player who chose X will receive (X + 2) credits, and the other player will pay (100 − X) credits to the house.

(Credits are standard Galactic currency worth approximately $1 USD each.)

Some examples:

  • Both players pick 50. They both pay 50 credits and receive 50 credits, for a net payout of 0.
  • Both players pick 100. They both receive 100 credits and pay 0, for a net payout of 100 credits each.
  • Player A picks 100 and player B picks 99. Player A pays 1 credit to the house, and player B receives 101 credits from the house.
  • Player A picks 1 and player B picks 2. Player A receives 3 credits and player B pays 99 credits.
  • Both players pick 1. They both pay 99 credits to the house and receive 1 credit, for a net payout of −98 credits each.
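The payoff rules can be sketched as a small Python function (a sketch; the name `payoff` is my own), which reproduces each of the worked examples above:

```python
def payoff(a, b):
    """Net payout (in credits) to the player who picked a,
    against an opponent who picked b."""
    if a == b:
        return a - (100 - a)       # receive a, pay 100 - a
    x = min(a, b)
    if a == x:
        return x + 2               # lower pick: receive x + 2
    return -(100 - x)              # higher pick: pay 100 - x

# The worked examples:
assert payoff(50, 50) == 0
assert payoff(100, 100) == 100
assert payoff(100, 99) == -1 and payoff(99, 100) == 101
assert payoff(1, 2) == 3 and payoff(2, 1) == -99
assert payoff(1, 1) == -98
```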

Would you play this game?

If so, what number would you choose?

Is there a Nash equilibrium?

Does the house make money from this game?

Comments

  1. I would either pick 100 or 99: 100 if I felt like being nice, 99 if I felt like being selfish for that extra credit. Of course, that choice may also be motivated by the anonymity here.

    Replies
    1. I think you're probably having the same misunderstanding that I had. If you pick 100, then the only possible way you could gain money is if your opponent also picks 100. If your opponent picks any number from 1 to 99, you will lose money, and your losses will be especially high if your opponent picks a low number.

  2. For a given choice X of player A, the incentive for Player B is always to go one lower. This works out in the end to a Nash equilibrium of (1, 1).
    Here the house gains 2 × 98 = 196 credits per round (each player pays 99 and receives 1), so the expected value for the players is negative.

    In the scenario you presented, I can't expect to be paired with a high-trust opponent. I would not play the game.

    I expect that if we were to play this game in a real-life setting, the house would lose money in the long run. In that case I would pick x = 100.

    Is this just the prisoner's dilemma with more choices?

    Replies
    1. There is a Nash equilibrium of (1, 1), in a sense.

      However, anyone who chooses to play the game will not expect the result to be (1, 1). So the Nash equilibrium is not predictive. Presumably, anyone playing the game is playing to win some money from the house. And, they will expect the other player they are paired with to have the same motivation. So it is rational to expect the other player not to select 1.
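      This can be checked exhaustively. A quick sketch (function names are my own) confirms that (1, 1) is the only pure-strategy Nash equilibrium:

```python
def payoff(a, b):
    """Net credits to the player who picked a, against b."""
    if a == b:
        return 2 * a - 100         # receive a, pay 100 - a
    x = min(a, b)
    return x + 2 if a == x else -(100 - x)

def best_responses(b):
    """All choices in 1..100 that maximize payoff against b."""
    best = max(payoff(a, b) for a in range(1, 101))
    return {a for a in range(1, 101) if payoff(a, b) == best}

# A profile (a, b) is a Nash equilibrium iff each choice is a
# best response to the other.
equilibria = [(a, b)
              for a in range(1, 101)
              for b in range(1, 101)
              if a in best_responses(b) and b in best_responses(a)]
print(equilibria)  # [(1, 1)]
```

      Against any b ≥ 2 the unique best response is b − 1 (undercutting pays b + 1, strictly more than matching or going lower), and the best response to 1 is 1, so only (1, 1) is mutually consistent.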

  3. Most people who are playing the game presumably want to earn as many credits as possible. So, I think that the Nash equilibrium would be (100,100), even if it requires cooperation with unknown strangers. In that case, everybody would want to play the game because the likelihood of high gains is high, and the probability of losses is low.

    Picking a number that's lower than your opponent's does hurt your opponent, but the returns for yourself aren't higher than (100,100) unless you pick 99 and the other player picks 100. The lower you pick, the more it hurts your opponent, and the less you have to gain.

    If you're only concerned about making a profit, then there isn't much to gain from hurting your opponent. It's also not clear why any player should try to hurt their opponent, unless one of the goals is to have more credits than the opponent by the end of the game. I believe that additional rule would be necessary for a Nash equilibrium of (1,1) to arise.

    However, these dynamics wouldn't make it reasonable for the casino house to offer such a game in the first place, since they would cause the casino to lose money most of the time. By contrast, most terrestrial casinos make a profit, while the average player loses money overall. If this weren't the case, then real world casinos would go out of business and cease to exist. So, I don't think this game would be possible in a realistic world scenario, unless one of the rules of the game were changed.

    By contrast, the prisoner's dilemma is a realistic game theoretic model that captures many different real world phenomena. Unlike the Zreebnorf casino, players in the prisoner's dilemma have the potential to make great gains at the expense of other players getting great losses.

    So, I think the reason why the Zreebnorf casino is not a real world scenario for casinos (and hence strange) is because the rewards for defecting players simply aren't very high. Likewise, the potential losses for cooperating when other players defect can be either high or low. When you pick 100, your potential losses for cooperating are entirely dependent on what number the opponent picks.

    On Zreebnorf, the rewards for cooperating are high, while the rewards for defecting are usually much lower (except in the rare case where Player A picks 100 and player B picks 99, and you are player B). These dynamics feel similar to the dynamics of cooperating and defecting in a modern society. The distribution of cooperators and defectors in the Zreebnorf casino would probably be similar to the distribution of law-abiding citizens and criminals in real life.

    Replies
    1. The Nash equilibrium is (1, 1). (100, 100) is not a Nash equilibrium, because each player is not playing the best choice given the other player's choice. (Each could do better by choosing 99.)

      There is a lot to gain from picking a lower number. It's true that picking 99 instead of 100 would only improve your result by 1 if your opponent picks 100. However, if your opponent picks 99, it would improve your result by 99, from -1 to 98. So, there is a strong incentive to pick a lower number.
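      Concretely, using the payoff rules from the post (a sketch; the `payoff` helper is my own naming):

```python
def payoff(a, b):
    """Net credits to the player who picked a, against b."""
    if a == b:
        return 2 * a - 100         # receive a, pay 100 - a
    x = min(a, b)
    return x + 2 if a == x else -(100 - x)

# Gain from switching 100 -> 99 against an opponent playing 100:
print(payoff(99, 100) - payoff(100, 100))  # 101 - 100 = 1
# Gain from switching 100 -> 99 against an opponent playing 99:
print(payoff(99, 99) - payoff(100, 99))    # 98 - (-1) = 99
```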

    2. The incentive to pick smaller numbers is to maximize your chances of winning money and/or not losing money.

      The incentive to pick bigger numbers is to maximize your potential earnings, if your opponent picks higher than you or the same number.

      In hindsight, I didn't pay enough attention to the fourth example, so I misunderstood an important reason to pick lower numbers. If player A picks 1, and player B picks any number from 2 to 100, then player A is guaranteed to win at least some money, while player B loses money. If player A picks 1, then the only way he could lose money is if player B also picks 1.

      So, I stand corrected. I now understand why the Nash equilibrium is (1,1), even though neither player would want such a scenario since it entails losing money.

      I revise my answer to the questions. If I could be a player, then I would not play this game because it's too unpredictable. And if I could be the house, then I would not offer this game because it's too unpredictable.

    3. Edit: Actually, I think the house would probably profit overall, since the Nash equilibrium is (1,1), not (100,100). While there are incentives to pick higher numbers, the randomness and anonymity of the opponent matching discourage people from picking higher numbers, so there is no effective way for players to organize cooperation against the house.

      Anybody who consistently picks high numbers would eventually learn that there's no way to coordinate with their opponent to pick the same high number and avoid losses for both sides, so they'd revise their strategy and pick lower, for selfish reasons. Thus, it would make sense for the house to offer this game.

    4. To be more precise, when the players pick different numbers the house nets 98 − 2X (where X is the smaller number), so it loses money only when both unequal numbers are at least 50. When the players pick the same number X, the house nets 200 − 4X, so it also loses when both players pick the same number above 50. Together that is 2,600 of the 10,000 possible ordered outcomes (26%).

      If both numbers were picked uniformly at random, the probability that both players pick the same number above 50 would be 0.5% (50 / 10,000). The house breaks even if both players pick 50, or if the smaller number is exactly 49 (103 of the 10,000 outcomes). In every other case, whenever the smaller number is 48 or below, the house makes a net profit.

      So, if both numbers are picked randomly, the house makes a profit about 73% of the time. Moreover, we discussed how there's also a Nash equilibrium of (1,1) and how both players have strong incentives to pick lower numbers in order to avoid losses. This would be a very profitable game for casinos.
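      These counts can be checked with a brute-force sweep over all 10,000 ordered pairs (a sketch; the `house_net` helper is my own naming):

```python
def house_net(a, b):
    """The house's net gain when the players pick a and b."""
    if a == b:
        return 2 * (100 - a) - 2 * a   # collects 100-a from each, pays a to each
    x = min(a, b)
    return (100 - x) - (x + 2)         # collects 100-x, pays out x+2

outcomes = [house_net(a, b) for a in range(1, 101) for b in range(1, 101)]
print(sum(n < 0 for n in outcomes))    # 2600 outcomes where the house loses (26%)
print(sum(n == 0 for n in outcomes))   # 103 break-even outcomes
print(sum(n > 0 for n in outcomes))    # 7297 outcomes where the house profits (~73%)
```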
