Conditional Probability
[Lecture slides 1–18 (images 1.png–18.png); no notes were recorded for these parts.]

Here are the details of an example that was briefly sketched in lecture:
Suppose $n$ letters are stuffed into $n$ envelopes at random. What is the
probability that no letter ends up in its corresponding envelope?

Let $A_i$ be the event that letter $i$ ends up in envelope $i$. We wish to compute $1 - P[\cup_i A_i]$.

By inclusion-exclusion, $P[\cup_i A_i] = \sum_{k=1}^{n} (-1)^{k+1} \sum_{\{i_1, \ldots , i_k\}} P[A_{i_1} \cap \cdots \cap A_{i_k}]$.

But $A_{i_1} \cap \cdots \cap A_{i_k}$ is the event that these particular $k$ letters end up in their own envelopes. The number of ways this can happen is $(n-k)!$, since the remaining $n-k$ letters may be placed arbitrarily, and so the probability of this event is $\frac{(n-k)!}{n!}$. Since there are $n \choose k$ ways of picking the indices $i_1, \ldots , i_k$, the total contribution of these terms is ${\frac{(n-k)!}{n!}} {n \choose k} = {\frac{1}{k!}}$.
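As a quick sanity check of the algebra (a sketch, not part of the lecture), we can verify with exact rational arithmetic that $\frac{(n-k)!}{n!} \binom{n}{k} = \frac{1}{k!}$ for a range of small values of $n$ and $k$:

```python
from fractions import Fraction
from math import comb, factorial

# Verify that C(n, k) * (n-k)! / n! simplifies to 1/k!
# for all 1 <= k <= n over a few small test values of n.
for n in range(1, 10):
    for k in range(1, n + 1):
        term = Fraction(comb(n, k) * factorial(n - k), factorial(n))
        assert term == Fraction(1, factorial(k))
print("all terms equal 1/k!")
```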

Plugging this back into the formula above, we get that $1 - P[\cup_i A_i] = 1 - \sum_{k=1}^{n} \frac{(-1)^{k+1}}{k!} = \sum_{k=0}^{n} \frac{(-1)^k}{k!} \approx \frac{1}{e}$.
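The result can also be checked numerically. The sketch below (my own illustration; the function names and the choice of $n = 10$ are arbitrary) computes the inclusion-exclusion sum exactly and compares it against a Monte Carlo estimate obtained by shuffling the letters at random:

```python
import math
import random

def derangement_prob_exact(n):
    # Inclusion-exclusion result: sum_{k=0}^{n} (-1)^k / k!
    return sum((-1) ** k / math.factorial(k) for k in range(n + 1))

def derangement_prob_mc(n, trials=100_000, seed=0):
    # Estimate the probability empirically: shuffle letters 0..n-1
    # and count permutations with no fixed point (no letter in its
    # own envelope).
    rng = random.Random(seed)
    letters = list(range(n))
    hits = 0
    for _ in range(trials):
        perm = letters[:]
        rng.shuffle(perm)
        if all(perm[i] != i for i in range(n)):
            hits += 1
    return hits / trials

n = 10
print(derangement_prob_exact(n))          # very close to 1/e ≈ 0.367879
print(derangement_prob_mc(n))             # agrees up to sampling noise
```

Note how quickly the sum converges: already at $n = 10$ the exact probability differs from $1/e$ by less than $10^{-6}$, which is why the answer is essentially independent of $n$.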

[Lecture slide 19 (image 19.png); no notes were recorded.]

Unless otherwise stated, the content of this page is licensed under Creative Commons Attribution-ShareAlike 3.0 License