This quarter at UC San Diego, I took a class on probability (MATH 180A) and absolutely loved it. One of the most interesting parts of MATH 180A was the counter-intuitive examples we went over in class and on homework. This blog post is a compilation of a few counter-intuitive results in probability that are relatively simple to understand and compute.

### The Monty Hall Problem

The Monty Hall problem is probably one of the most widely known and talked-about games in all of mathematics. The game is simple, and here’s the classic statement:

“Suppose you’re on a game show, and you’re given the choice of three doors: Behind one door is a car; behind the others, goats. You pick a door, say No. 1, and the host, who knows what’s behind the doors, opens another door, say No. 2, which has a goat. He then says to you, “Do you want to pick door No. 3?” Is it to your advantage to switch your choice?”

At first glance, one might think that once the game host opens the door with the goat, the remaining two doors have an equal 50-50 chance of having the car behind them, and hence there is no advantage in switching. However, the correct answer is that it is always better to switch: the door you initially chose has only a 1/3 chance of having the car behind it, while the other remaining door has a 2/3 chance. Here’s the mathematical explanation:

We will use Bayes’ theorem to prove this.

Answer: Let $B_i = \{$car is behind door $i\}$ and $A = \{$door 2 is opened$\}$. We know that $P(B_1) = P(B_2) = P(B_3) = \frac{1}{3}$. Also, $P(A \vert B_1) = \frac{1}{2}$, because if the car is behind door 1, then the host could choose either door 2 or door 3 with equal probability. And $P(A \vert B_2) = 0$, because if the car is behind door 2, the host will definitely not open door 2 - he will open door 1 or door 3. Lastly, $P(A \vert B_3) = 1$, since if the car is behind door 3 the host has no option but to open door 2. Now we have all the components required to use Bayes’ theorem. We are interested in calculating $P(B_3 \vert A) = \frac{P(A \vert B_3)P(B_3)}{P(A)} = \frac{1 \cdot \frac{1}{3}}{\frac{1}{2} \cdot \frac{1}{3} + 0 \cdot \frac{1}{3} + 1 \cdot \frac{1}{3}}$. Simplifying, $P(B_3 \vert A) = \frac{2}{3}$ - implying you should switch for a higher chance of winning the car.
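The Bayes' theorem argument above can also be checked empirically. Here's a minimal simulation sketch (the function name `monty_hall` is my own, not from the course) that estimates the win probability for each strategy:

```python
import random

def monty_hall(switch: bool, trials: int = 100_000) -> float:
    """Estimate the probability of winning the car under a fixed strategy."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)    # door hiding the car
        pick = random.randrange(3)   # contestant's initial pick
        # Host opens a goat door: one that is neither the pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Switch to the single remaining closed door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(monty_hall(switch=True))   # close to 2/3
print(monty_hall(switch=False))  # close to 1/3
```

With 100,000 trials, the switching strategy wins about two-thirds of the time, matching the Bayes' theorem calculation.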

### Family with three children with the same birthday

Question: Among one million three-child families, what is the probability that there is at least one family in which all three have the same birthday?

Again, at first, one might guess this probability to be somewhere between 0.5 and 0.8: it is indeed rare for three children to have the same birthday, but given a million families there seems to be a reasonable chance that such a family exists. Let’s calculate the exact chance of this happening.

Note that March 1, 2017 and March 1, 2004 are considered the same birthday for this problem, and we are ignoring the leap year case.

Answer: Let $A_i = \{$all three children in the $i$th family have the same birthday$\}$. For $j = 2, 3$, let $B_{i,j} = \{$in the $i$th family, the $j$th child has the same birthday as the 1st child$\}$. Then $P(A_i) = P(B_{i,2})P(B_{i,3}) = \left(\frac{1}{365}\right)^2 \approx 0.0000075$, which implies $P({A_i}^C) \approx 0.9999925$. We want to compute $1 - P({A_1}^C \cap {A_2}^C \cap \cdots \cap {A_{1000000}}^C) = 1 - (0.9999925)^{1000000} \approx 1 - 0.00055 = 0.99945$ - implying it's almost certain that at least one family out of a million will have three children with the same birthday.
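The calculation above is a one-liner to verify directly (variable names here are my own choice):

```python
# P(all three children in one family share a birthday), ignoring leap years
p_family = (1 / 365) ** 2

# P(no such family among a million independent three-child families)
p_none = (1 - p_family) ** 1_000_000

# P(at least one such family)
p_at_least_one = 1 - p_none
print(round(p_at_least_one, 5))  # 0.99945
```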

### The St. Petersburg Paradox

This paradox is different in nature from the traditional probability questions discussed above. This particular example deals with the expected value of a sum of discrete random variables.

Question: Suppose you flip a coin until you get heads. If heads first comes up on the $n$th toss, you win $2^n$ dollars. How much do you expect to win in this game?

Answer: Let $X$ be the random variable representing the number of dollars you win. Then note that $P(X = 2^n) = (1/2)^n$ and $E[X] = \sum_{n=1}^\infty 2^n P(X = 2^n) = \sum_{n=1}^\infty 1 = \infty$. Hence, theoretically, you should expect to win an infinite amount of money playing this game. Keep in mind, though, that any single play pays out a finite amount - the infinite expectation comes from astronomically large payouts that occur with vanishingly small probability.
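The divergence is easy to see numerically: each term $2^n \cdot (1/2)^n$ of the expectation contributes exactly one dollar, so the partial sums grow without bound. A quick sketch (the helper name `partial_expectation` is my own):

```python
def partial_expectation(N: int) -> float:
    """Sum of the first N terms of E[X] = sum over n of 2^n * (1/2)^n.

    Each term equals exactly 1, so the partial sum is N dollars.
    """
    return sum(2**n * 0.5**n for n in range(1, N + 1))

for N in (10, 100, 1000):
    print(N, partial_expectation(N))  # partial sums grow linearly: 10, 100, 1000
```

No matter how far out you sum, the next term still adds another dollar to the expectation - that is precisely why the series diverges.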

### Russian Roulette

Russian Roulette is a classic game in which $6$ people stand in a circle. There is a revolver with 6 chambers and a bullet in exactly one of them. One person is given the gun and asked to point it at the person standing to their right and shoot. If that person dies, the game is over; if not, the gun is passed on. The game continues until someone dies.

Question: Where would you want to stand in the circle to minimize your probability of death? (Assume that the barrel is never spun.)

Answer: It is not possible to do better than $1/6$. Let’s say Person A is the first one the gun is pointed at. (Many people, based on a brief survey I did, feel this is the worst position to be in.) It is clear that Person A's chance of death is $1/6$, since the bullet is in only one of the 6 chambers. Now, assume Person A survives and is handed the gun to point at Person B. What is Person B's probability of death? The most common answer is $1/5$: one chamber has already been tested (and was empty), so only 5 chambers remain, one of which must hold the bullet. However, this is only a conditional probability. The probability of Person B dying is $1/5$ *given* that Person A survived; more formally, $P($B dies $\vert$ A survives$) = 1/5$. But at the beginning of the game, we know Person B can die only if Person A survives first. That is, $P($B dies$) = (5/6) \cdot (1/5) = 1/6$. The same reasoning holds for every person in the circle. Even the person who shot first (Person F, the last to be shot at) dies only if the first five survive, and the last chamber then certainly holds the bullet, giving a death probability of $(5/6) \cdot (4/5) \cdot (3/4) \cdot (2/3) \cdot (1/2) \cdot 1 = 1/6$.
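The chain of conditional probabilities above can be computed exactly for all six positions. Here's a sketch using exact rational arithmetic (the variable names are my own):

```python
from fractions import Fraction

# For position k (0-indexed order of being shot at):
#   P(k dies) = P(first k chambers were empty) * P(chamber k+1 is loaded | that)
death_probs = []
survived_so_far = Fraction(1)          # P(everyone before this position survived)
for k in range(6):
    chambers_left = 6 - k              # untested chambers, one of which is loaded
    p_die = survived_so_far * Fraction(1, chambers_left)
    death_probs.append(p_die)
    survived_so_far *= Fraction(chambers_left - 1, chambers_left)

print(death_probs)  # every position comes out to exactly 1/6
```

Using `Fraction` instead of floats makes the equality exact rather than approximate, which is the cleanest way to see that no position in the circle has any advantage.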

### Summary

There are many more counter-intuitive results in probability theory, and this post only scratches the surface. However, the examples in this post do a good job of demonstrating the beauty and mystery of real-world applications of probability. Note: I’m taking Stochastic Processes (MATH 180B) in the upcoming quarter - so look out for some random-processes posts on the blog!

### References

[1] Professor Jason Schweinsberg at the UC San Diego mathematics department.