Monday, April 09, 2007

Switch to Win?

This was an actual game show back in the day; Monty Hall was the host, and he became the namesake of the logic problem. The setup is this (from Wikipedia):
A thoroughly honest game-show host has placed a car behind one of three doors. There is a goat behind each of the other doors. You have no prior knowledge that allows you to distinguish among the doors. "First you point toward a door," he says. "Then I'll open one of the other doors to reveal a goat. After I've shown you the goat, you make your final choice whether to stick with your initial choice of doors, or to switch to the remaining door. You win whatever is behind the door." You begin by pointing to door number 1. The host shows you that door number 3 has a goat.
The question is this: do you have a better chance of driving home instead of taking the bus if you switch to door 2? Most people reason that the odds are 50/50 (one remaining door hides a goat, the other hides the car), but that isn't the case. Plenty of sources offer the correct answer ("Yes") and a full explanation, but even after hearing or reading several of them, many people (including some very smart ones) still insist that the odds are even. The solution is confusing to some and infuriating to others (the legendary Paul Erdős resisted it adamantly, to no avail).
Earlier today, I was talking to my brother about this (he's the one who told me about ol' Paul's hatred of the thing). He had asked if I knew the Monty Hall problem, and I didn't know the name, but as he explained it, I struggled to remember a discussion about it from high school computer class. When he finished, there was still a bit of confusion lingering in the air, so I proposed this version of the explanation (which is pretty much as simple as it can be said¹):

Whichever door you first pick has a 2/3 chance of being a goat. When the host opens a goat door, your door still has a 2/3 chance of hiding a goat, which equates to a 1/3 chance of finding the car behind it.
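In other words, the one unopened door you didn't pick must be hiding the car 2/3 of the time, so switching is the better bet. If that still feels slippery, a quick simulation settles it empirically. This little script is just my own sketch (the names and the 100,000-trial count are arbitrary), but it plays the game exactly as described: the host always opens a goat door that isn't yours, and we tally wins for sticking versus switching.

```python
import random

def play_round(switch: bool) -> bool:
    """Play one round of the three-door game; return True if the player wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    first_pick = random.choice(doors)

    # The host opens a door that is neither the player's pick nor the car.
    host_opens = random.choice([d for d in doors if d != first_pick and d != car])

    if switch:
        # Switching means taking the one door that is neither the first pick nor the opened door.
        final_pick = next(d for d in doors if d != first_pick and d != host_opens)
    else:
        final_pick = first_pick

    return final_pick == car

trials = 100_000
stick_wins = sum(play_round(switch=False) for _ in range(trials))
switch_wins = sum(play_round(switch=True) for _ in range(trials))
print(f"stick:  {stick_wins / trials:.3f}")   # hovers around 0.333
print(f"switch: {switch_wins / trials:.3f}")  # hovers around 0.667
```

Run it and the stick column sits near 1/3 while the switch column sits near 2/3, no arguing required.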
Then we chatted a bit more about it and decided that this is just one of several ways to explain it (tracking the chance that the first door hides the car, for example), and he brought up the idea of considering the problem with more doors (not Mordors, one of those is scary enough). Imagine there are 100 doors, all lined up, and one door hides the car; a herd of sheep (cheaper than goats) has been borrowed from a local farmer to put one behind each of the other 99 doors. Pick a door. The chance that you picked the one door with the car is 1 in 100, which means there is a 99% chance that the car is behind one of the other doors. If the host then opens all but one of those other doors, always revealing sheep (he knows where the car is), that one remaining door has a 99% chance of hiding the car, because 1) there is still a 99% chance that one of the 99 doors you didn't pick is the winning door, and 2) you can now see that 98 of them are not winners. Thus, the one door you can't see behind has a 99% chance of being a winner, and your door keeps its 1% probability and its status as the likely loser.
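Here is that 100-door version as a simulation sketch too (again, the naming is all mine, and the sheep are implied rather than imported): the host opens every unpicked door except one, never revealing the car, and the player always switches to whatever stays closed.

```python
import random

def always_switch_wins(num_doors: int) -> bool:
    """One round with num_doors doors; the player always switches. Return True on a win."""
    car = random.randrange(num_doors)
    first_pick = random.randrange(num_doors)

    # The host opens every door except the player's pick and one other, and he
    # never reveals the car, so only one mystery door is left besides the pick.
    if first_pick == car:
        left_closed = random.choice([d for d in range(num_doors) if d != first_pick])
    else:
        left_closed = car

    # Switching means taking that one remaining closed door.
    return left_closed == car

trials = 100_000
wins = sum(always_switch_wins(100) for _ in range(trials))
print(f"switching with 100 doors wins {wins / trials:.3f} of the time")  # about 0.99
```

The switch-win rate comes out around 0.99, matching the argument above; change the 100 to 3 and it drops back to about 2/3.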
Increasing the scale simply helps to visualize the problem; it doesn't change the logic (if you have trouble seeing this, get out a piece of paper and try it with 99 doors, then 98, 97, and so on, until you get down to 3). Of course, newer game shows are more creative: they use multiple cases (instead of doors) holding various sums of money (instead of goats and cars), they offer several opportunities to switch, each preceded by another round of case openings, and they have a lot more show (girls) and hype to attract viewers. I wonder what percentage of the people who watch that show understand the math behind it. Maybe I should ask what the probability is that a randomly selected viewer is actually interested in the game and not just the show (girls).
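And if pencil and paper feel tedious for that exercise, here's a tiny loop (my own shortcut, nothing from any show) that prints the exact switch-win chance for each door count under the same rules, i.e. the host opens every losing door except one: your first pick is right 1 time in n, so switching wins the other (n - 1)/n of the time.

```python
# Exact switch-win probability when the host opens all but one of the other doors:
# the first pick is right 1/n of the time, so switching wins the remaining (n - 1)/n.
for n in range(100, 2, -1):  # 100 doors down to 3
    print(f"{n:3d} doors: switching wins {(n - 1) / n:.4f} of the time")
```

At 100 doors that's 0.99; by the time you get down to 3, it has fallen to about 0.6667, which is the original 2/3.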


¹ Some might contend that tracking the chance you picked the car is simpler, but I'd argue that it's exactly the same thing viewed from a different perspective. I stated it as written to maintain historical integrity.