June 04, 2014

Between One and Ten

Here's some background information: I was a campus tour guide at Brown University, which meant that for one hour every week, I gave a tour to prospective students and parents who were interested in the university. There were quite a few of us, and we were each assigned individual time slots. In my slot (Monday, 3 PM) there were three other tour guides, for a total of four.

Now here's the scenario: one day there were only three tour guides (one was sick) and about four prospective students. Clearly, only a single tour guide was necessary. But how could we choose which of us three had to give the tour in a fair and diplomatic way? Do we play rock-paper-scissors? That could take ages.

Here's the method: each of the tour guides picks a number from one to ten (inclusive). Then a prospective student does the same. Whoever is closest to the student's number has to give the tour. The others picked three and five, while I picked four. The student's number was five, so I did not have to give the tour.

Here's the problem: On the walk back to our student center, the other tour guide who picked three said to me that my method was incredibly fair and she was very surprised at how quickly I had come up with it. This led me to think--is it really all that fair? Is there a way to pick a number so as to minimize my chances of giving the tour? I guess it's time for some economics.
One through ten
As you can see from above, there are exactly ten numbers to choose from. To start off, we assume that the prospective student is equally likely to pick each number, with a probability of 10%. We also stipulate that no two players may pick the same number. The game is therefore a sequential one--the tour guides take turns picking numbers, while the prospective student picks whatever number he/she wishes. Again, to reiterate the rules: the player closest to the prospective student's number must give the tour, so each guide wants to be as far from that number as possible. (Because college students are lazy.)

In the above scenario, there were three players in the game. Let's first simplify the problem and pretend that there were only two tour guides--player 1 (P1) and player 2 (P2)--along with the prospective student (PS).

How does the game play out?
  1. P1 chooses a number first.
  2. P2 chooses last.
  3. PS chooses a number independently.
Let's say P1 chooses the number 5 arbitrarily. What is the best response for P2? Intuitively we can see that the number 1 is the best choice to minimize the chances of giving the tour.
If P1 = 5, then P2 should play 1
What happens if PS picks 3? The two players are equidistant, so a fair coin would be used to decide who gives the tour. Therefore, as the above diagram depicts, there is a 75% chance that P1 gives the tour, whereas P2 only has a 25% chance: numbers 4 through 10 are closer to 5, numbers 1 and 2 are closer to 1, and the tie at 3 is split evenly.
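These percentages are easy to verify mechanically. Here's a quick Python sketch (my own, not part of the original post) that enumerates the prospective student's ten equally likely numbers, awards each to the closest guide, and splits ties with a fair coin:

```python
from fractions import Fraction

def tour_probabilities(picks):
    """Chance that each guide gives the tour, with the prospective
    student's number uniform on 1..10 and ties split by a fair coin."""
    probs = [Fraction(0)] * len(picks)
    for s in range(1, 11):                     # every possible PS number
        dists = [abs(p - s) for p in picks]
        closest = [i for i, d in enumerate(dists) if d == min(dists)]
        for i in closest:                      # split this 1/10 of the mass
            probs[i] += Fraction(1, 10 * len(closest))
    return probs

print(tour_probabilities([5, 1]))  # [Fraction(3, 4), Fraction(1, 4)]
```

Using exact fractions instead of floats means the 75%/25% split comes out exactly, with no rounding noise.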

Now let's say P1 knew all that, so he decides to choose 1 before P2 has the chance to. What is the best response for P2 now? We can easily calculate the probabilities for every other choice as follows:
Probabilities of both players if P1 first picks 1
We can see from the above chart that if P1 first picks 1, then P2 minimizes his/her chances of giving a tour by picking 10. Knowing these probabilities, how would the rational person behave? We could derive a large 10 by 10 payoff matrix listing all the possibilities, but that would be too much. Instead, we can intuitively see that, assuming both players are rational (i.e. economics majors), the Nash equilibrium is (1, 10) or (10, 1), with either player giving the tour with probability 50%. Not a very interesting conclusion.

But what if P1 is not rational? What if P1 isn't an economics major? What is the best response of P2 given any choice that P1 can make?
P1's choice, the best response of P2, and the respective probabilities
As you can see from the above chart, given any choice made by P1, the best response by P2 is either 1 or 10. But how is that determined? It's easy to see that P2 should pick the farthest number on the side with fewer numbers. So for example, if P1 picked 6, there are 4 numbers to its right and 5 numbers to the left. The farthest number on the right is 10, so P2 should pick 10 for the minimum probability of 25%. This rule holds except when P1 picks 1 or 10, in which case there is only one side to pick from.
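The chart above can be reproduced by brute force. This Python sketch (again mine, not from the original post) tries, for every possible P1 pick, each remaining reply and keeps the one that minimizes P2's chance of giving the tour:

```python
from fractions import Fraction

def p2_tour_chance(p1, p2):
    """P2's probability of giving the tour: PS uniform on 1..10,
    the closer guide gives it, ties split by a fair coin."""
    total = Fraction(0)
    for s in range(1, 11):
        d1, d2 = abs(p1 - s), abs(p2 - s)
        if d2 < d1:
            total += Fraction(1, 10)    # P2 strictly closer
        elif d2 == d1:
            total += Fraction(1, 20)    # tie: coin flip
    return total

# For each P1 pick, find P2's best reply and its probability
for p1 in range(1, 11):
    best = min((n for n in range(1, 11) if n != p1),
               key=lambda n: p2_tour_chance(p1, n))
    print(p1, "->", best, p2_tour_chance(p1, best))
```

Running it confirms the pattern: the best reply is always 1 or 10, bottoming out at a 25% tour chance when P1 sits near the middle.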

What have we learned so far? What are the valid conclusions?
  • If you are P1:
    • Is P2 rational?
      • Play 1 or 10
    • Is P2 irrational?
      • Cannot tell what P2 will pick, but still pick 1 or 10 to minimize probability
  • If you are P2:
    • Is P1 rational?
      • Play 1 or 10, whichever P1 did not pick
    • Is P1 irrational?
      • Play the farthest number from P1's choice on the side with fewer numbers

With that out of the way, let's introduce a new player, P3, to the mix. Now we have returned to the original scenario that got me thinking about this problem. How does it change things? Are the best responses of P1 and P2 the same, or are they different? Let's find out.

Let's start by assuming that the optimal choices for P1 and P2 are the same despite the introduction of the new player P3.
  • P1 picks 1
  • P2, being a rational economist, picks 10
  • What should P3 pick?
What should P3 pick?
Intuition from our previous findings tells us to pick the numbers farthest from both 1 and 10--in this case, 5 or 6. How do they differ? Should P3 pick 5, he/she has a 45% chance of giving the tour, while P1 only has a 25% chance and P2 has a 30% chance. Should P3 pick 6, he/she still has a 45% chance, but the percentages for P1 and P2 are swapped.
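The same tie-splitting enumeration handles any number of guides, so we can check the three-player chances directly. Here's a Python sketch (mine, repeated in full so it stands alone; it assumes, as before, that ties go to a fair coin flip):

```python
from fractions import Fraction

def tour_probabilities(picks):
    """Each guide's chance of giving the tour: PS uniform on 1..10,
    the closest guide gives it, ties split evenly by coin flip."""
    probs = [Fraction(0)] * len(picks)
    for s in range(1, 11):
        dists = [abs(p - s) for p in picks]
        winners = [i for i, d in enumerate(dists) if d == min(dists)]
        for i in winners:
            probs[i] += Fraction(1, 10 * len(winners))
    return probs

# P1 picks 1, P2 picks 10, then P3 crowds one side or the other
print(tour_probabilities([1, 10, 5]))  # [Fraction(1, 4), Fraction(3, 10), Fraction(9, 20)]
print(tour_probabilities([1, 10, 6]))  # [Fraction(3, 10), Fraction(1, 4), Fraction(9, 20)]
```

Note the sanity check built into the output: the three chances always sum to 100%. P3 gives the tour 45% of the time either way; picking 5 versus 6 only swaps which of P1 and P2 carries the larger share.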

So given all of the above information, what is the optimal choice you can make as a player in this made-up game? It seems quite obvious that if your motive is to avoid giving the tour and be lazy, then you should always pick first and pick either 1 or 10, the two ends of the number range. This is of course assuming that the other players are rational, lazy economists who also want to minimize their chances of giving a tour. But then they would also realize that picking first has an advantage over the other players. Would this trigger a pre-game mini-game? Perhaps a game of rock-paper-scissors to decide who picks the first number?

But what if the other players weren't rational? Or what if, god forbid, they wanted to give the tour? What is the optimal choice to make here as a lazy tour guide?

Below are some scenarios where picking first is not optimal, despite our above calculations.
P1 is screwed over by P3 favoring P2

P1 is screwed over by an irrational decision by P2 and an opportunist P3
So what does this mean? Is there truly an optimal solution in the real world? Can we ever know what the "correct" answer is? Herein lies the true problem of economics. Sure, we can factor out any variables that we want, but at the end of the day (as my childish example shows) we really don't know how the other players will behave. You could also tell me to suck it up and just give the damn tour, but then where else would I use all this expensive economics knowledge I learned in school?

Here's a quote from a previous economics post I made:
...come 2013 my parents will have spent roughly $212,544 (tuition per year times 4) just so that I can use economics to analyze the above scenario...
Sounds pretty good to me.

On a side note, click here to read my previous tongue-in-cheek economics post about crossing sidewalks.
