To encourage Elmer’s promising tennis career, his father offers him a prize if he wins (at least) two tennis sets in a row in a three-set series to be played with his father and the club champion alternately: father-champion-father or champion-father-champion, according to Elmer’s choice. The champion is a better player than Elmer’s father. Which series should Elmer choose?
Solution:
Since the champion plays better than the father, it seems reasonable that fewer sets should be played with the champion. On the other hand, the middle set is the key one, because Elmer cannot have two wins in a row without winning it. Let C stand for the champion, F for the father, and W and L for a win and a loss by Elmer. Let f be the probability of Elmer's winning any set from his father, and c the corresponding probability of winning a set from the champion. The table shows the only possible prize-winning sequences, together with their probabilities (given independence between sets), for the two choices.
Set with Father First (F C F):

F   C   F   Probability
W   W   W   fcf
W   W   L   fc(1-f)
L   W   W   (1-f)cf
            Total: fc(2-f)

Set with Champion First (C F C):

C   F   C   Probability
W   W   W   cfc
W   W   L   cf(1-c)
L   W   W   (1-c)fc
            Total: fc(2-c)
Since Elmer is more likely to beat his father than to beat the champion, f is larger than c; hence 2-f is smaller than 2-c, the total fc(2-f) is smaller than fc(2-c), and so Elmer should choose CFC. For example, for f=0.8 and c=0.4, the chance of winning the prize with FCF is 0.384, while that for CFC is 0.512. Thus the importance of winning the middle set outweighs the disadvantage of playing the champion twice.
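For readers who like to check such tables mechanically, here is a minimal Python sketch that enumerates all eight win/loss outcomes of a three-set series and adds up the probabilities of those containing two consecutive wins. The helper name prize_probability is just for illustration, and the values f = 0.8, c = 0.4 are the ones used above.

```python
from itertools import product

def prize_probability(order, f, c):
    """Probability that Elmer wins two sets in a row, given the order of
    opponents ('F' = father, 'C' = champion) and the per-set win
    probabilities f (vs. father) and c (vs. champion)."""
    p_win = {'F': f, 'C': c}
    total = 0.0
    # Enumerate all 2^3 win/loss outcomes for the three sets.
    for outcome in product([True, False], repeat=3):
        # The prize requires two consecutive wins, which forces a win
        # in the middle set.
        if (outcome[0] and outcome[1]) or (outcome[1] and outcome[2]):
            p = 1.0
            for opponent, won in zip(order, outcome):
                p *= p_win[opponent] if won else 1 - p_win[opponent]
            total += p
    return total

f, c = 0.8, 0.4
print(prize_probability("FCF", f, c))   # 0.384 = fc(2-f)
print(prize_probability("CFC", f, c))   # 0.512 = fc(2-c)
```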
Many of us have a tendency to suppose that the higher the expected number of successes, the higher the probability of winning a prize, and often this supposition is useful. But occasionally a problem has special conditions that destroy this reasoning by analogy. In our problem, the expected number of wins under CFC is 2c+f, which is less than the expected number of wins for FCF, 2f+c. In our example with f=0.8 and c=0.4, these means are 1.6 and 2.0 in that order. This opposition of answers gives the problem flavor.
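The opposition between "more expected wins" and "better chance at the prize" is also easy to see in a quick simulation. The sketch below is a rough Monte Carlo check under the same assumptions (independent sets, f = 0.8, c = 0.4); simulate is an illustrative name, and the estimates will wobble a little from run to run.

```python
import random

def simulate(order, f, c, trials=200_000):
    """Monte Carlo estimate of (expected number of wins, prize probability)
    for one ordering of opponents; 'F' = father, 'C' = champion."""
    p_win = {'F': f, 'C': c}
    total_wins = 0
    prizes = 0
    for _ in range(trials):
        results = [random.random() < p_win[o] for o in order]
        total_wins += sum(results)
        if (results[0] and results[1]) or (results[1] and results[2]):
            prizes += 1
    return total_wins / trials, prizes / trials

random.seed(0)
print(simulate("FCF", 0.8, 0.4))  # roughly (2.0, 0.384): more wins, smaller prize chance
print(simulate("CFC", 0.8, 0.4))  # roughly (1.6, 0.512): fewer wins, larger prize chance
```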
A drawer contains red socks and black socks. When two socks are drawn at random, the probability that both are red is 1/2.
a) How small can the number of socks in the drawer be?
b) How small if the number of black socks is even?
Solution:
Just to set the pattern, let us do a numerical example first. Suppose there were 5 red and 2 black socks; then the probability of the first sock's being red would be 5/(5+2). If the first were red, the probability of the second's being red would be 4/(4+2), because one red sock has already been removed. The product of these two numbers is the probability that both socks are red:

\frac{5}{7} \times \frac{4}{6} = \frac{10}{21} \approx 0.476.

This result is close to 1/2, but we need exactly 1/2. Now let us go at the problem algebraically.
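If you want to confirm that the example falls just short of 1/2 without any rounding, exact rational arithmetic (for instance with Python's fractions module) does it in two lines:

```python
from fractions import Fraction

# Exact check of the 5-red, 2-black example: close to 1/2, but not equal.
p_both_red = Fraction(5, 7) * Fraction(4, 6)
print(p_both_red, float(p_both_red))   # 10/21 0.476...
```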
Let there be r red and b black socks. The probability of the first sock's being red is \frac{r}{r+b}; and if the first sock is red, the probability of the second's being red, now that a red has been removed, is \frac{r-1}{r+b-1}. Then we require the probability that both are red to be \frac{1}{2}, or

\frac{r}{r+b} \times \frac{r-1}{r+b-1} = \frac{1}{2}.
One could just start with b = 1 and try successive values of r, then go to b = 2 and try again, and so on. That would get the answer quickly. Or we could play along with a little more mathematics. Notice that for b > 0

\frac{r}{r+b} > \frac{r-1}{r+b-1},

and since the product of these two fractions is \frac{1}{2}, the larger must exceed \frac{1}{\sqrt{2}} and the smaller must fall below it:

\frac{r}{r+b} > \frac{1}{\sqrt{2}} > \frac{r-1}{r+b-1}.

Cross-multiplying each of these inequalities and simplifying gives

(\sqrt{2}+1)b < r < (\sqrt{2}+1)b + 1,

so for each b there is exactly one candidate value of r. For b = 1, r must lie between 2.414 and 3.414, so the candidate is r = 3; and indeed \frac{3}{4} \times \frac{2}{3} = \frac{1}{2}, so the smallest possible drawer has 4 socks, answering (a). Going on to even values of b: b = 2 gives the candidate r = 5, but \frac{5}{7} \times \frac{4}{6} = \frac{10}{21} \ne \frac{1}{2}; b = 4 gives r = 10, which also fails; b = 6 gives r = 15, and \frac{15}{21} \times \frac{14}{20} = \frac{1}{2}. And so, 21 is the smallest number of socks when b is even. If we were to go on and ask for further values of r and b so that the probability of two red socks is 1/2, we would be wise to appreciate that this is a problem in the theory of numbers. It happens to lead to a famous result in Diophantine analysis obtained from Pell's equation. Try r = 85, b = 35.
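The brute-force search mentioned above is also easy to hand over to a machine. Here is a small sketch using exact fractions; the helper name both_red and the search limit of 199 socks are arbitrary choices for illustration.

```python
from fractions import Fraction

def both_red(r, b):
    """Exact probability that two socks drawn without replacement from
    r red and b black socks are both red."""
    return Fraction(r, r + b) * Fraction(r - 1, r + b - 1)

# Brute-force search over drawers of up to 199 socks for P(both red) = 1/2.
solutions = []
for total in range(2, 200):
    for b in range(1, total):
        r = total - b
        if both_red(r, b) == Fraction(1, 2):
            solutions.append((r, b))

print(solutions)   # [(3, 1), (15, 6), (85, 35)]
# 4 socks answers (a); (15, 6), i.e. 21 socks in all, answers (b).
```

The three hits below 200 socks are (r, b) = (3, 1), (15, 6), and (85, 35); notice that the ratio r/b creeps toward \sqrt{2} + 1, in line with the bounds derived above.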