FA17 Lecture 5


Lecture 5: conditional probability

  • Reading: [[../handouts/cameron_prob_notes.pdf#page=31|Cameron chapter 2]]
  • [[../../2017sp/lectures/lec13-conditioning.html|Last semester's notes]]
  • Definitions: conditional probability
  • Modeling with conditional probability
  • Bayes's rule, Law of total probability

Conditional probability

Definition: If [math]A [/math] and [math]B [/math] are events, then the probability of [math]A [/math] given [math]B [/math], written [math]Pr(A|B) [/math], is defined by [math]Pr(A|B) := \frac{Pr(A \cap B)}{Pr(B)} [/math]. Note that [math]Pr(A|B) [/math] is only defined if [math]Pr(B) \neq 0 [/math].

Intuitively, [math]Pr(A|B) [/math] is the probability of [math]A [/math] in a new sample space created by restricting our attention to the subset of the sample space where [math]B [/math] occurs. We divide by [math]Pr(B) [/math] so that [math]Pr(B|B) = 1 [/math].

Note: [math]A|B [/math] is not defined, only [math]Pr(A|B) [/math]; this is an abuse of notation, but is standard.
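This definition is easy to check computationally. Here is a short Python sketch (not part of the original notes) that computes conditional probabilities over a finite, equiprobable sample space; the helper names `pr` and `pr_given` are made up for illustration:

```python
from fractions import Fraction

def pr(event, sample_space):
    """Probability of an event in a finite, equiprobable sample space."""
    return Fraction(len(event & sample_space), len(sample_space))

def pr_given(a, b, sample_space):
    """Pr(A|B) = Pr(A ∩ B) / Pr(B); undefined when Pr(B) = 0."""
    if pr(b, sample_space) == 0:
        raise ValueError("Pr(A|B) is undefined when Pr(B) = 0")
    return pr(a & b, sample_space) / pr(b, sample_space)

# Two fair dice: Pr(sum is 7 | first die shows 3)
s = {(i, j) for i in range(1, 7) for j in range(1, 7)}
a = {(i, j) for (i, j) in s if i + j == 7}
b = {(i, j) for (i, j) in s if i == 3}
print(pr_given(a, b, s))   # 1/6
```

Note that `pr_given(b, b, s)` returns 1, matching the intuition that we divide by [math]Pr(B) [/math] precisely so that [math]Pr(B|B) = 1 [/math].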

Probability trees

Using conditional probability, we can draw a tree to help compute the probabilities of various events. Each branching of the tree partitions a piece of the sample space into smaller pieces.

For example: suppose that it rains with probability 30%. Suppose that when it rains, I bring my umbrella 3/4 of the time, while if it is not raining, I bring my umbrella with probability 1/10. Given that I bring my umbrella, what is the probability that it is raining?

One way to model this problem is with the sample space

[math]S = \{\text{raining } (r), \text{not raining } (nr)\} \times \{\text{umbrella } (u), \text{no umbrella } (nu)\}

 = \{(r,u), (nr,u), (r,nu), (nr,nu)\}
[/math]

Let [math]R [/math] be the event "it is raining". Then [math]R = \{(r,u), (r,nu)\} [/math]. Let [math]U [/math] be the event "I bring my umbrella". Then [math]U = \{(r,u), (nr,u)\} [/math].

The problem tells us that [math]Pr(R) = 3/10 [/math]. It also states that [math]Pr(U|R) = 3/4 [/math] while [math]Pr(U|\bar{R}) = 1/10 [/math]. We can use the following fact:

Fact: [math]Pr(\bar{A}|B) = 1 - Pr(A|B) [/math]. Proof left as exercise.

to conclude that [math]Pr(\bar{U}|R) = 1/4 [/math] and [math]Pr(\bar{U}|\bar{R}) = 9/10 [/math].

We can draw a tree:

[[File:Lec05-tree.svg|Probability tree]] ([[lec05-tree.tex|LaTeX source]])

We can compute the probabilities of the events at the leaves by multiplying along the paths. For example, [math]Pr(\{(r,u)\}) = Pr(U \cap R) = Pr(R)Pr(U | R) = (3/10)(3/4) = (9/40) [/math]

To answer our question, we are interested in [math]Pr(R|U) = Pr(U \cap R)/Pr(U) [/math]. We know [math]U = \{(r,u), (nr,u)\} [/math]. We can compute [math]Pr(U) [/math] using the third axiom: [math]Pr(U) = Pr(\{(r,u)\}) + Pr(\{(nr,u)\}) = (3/10)(3/4) + (7/10)(1/10) = 59/200 [/math]. Plugging this in to the above formula gives [math]Pr(R|U) = (9/40)/(59/200) = 45/59 [/math].
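For concreteness (this code is not in the original notes), the whole umbrella calculation can be carried out with exact rational arithmetic:

```python
from fractions import Fraction

# Given quantities from the problem statement.
pr_r    = Fraction(3, 10)   # Pr(R): it is raining
pr_u_r  = Fraction(3, 4)    # Pr(U | R)
pr_u_nr = Fraction(1, 10)   # Pr(U | not R)

# Multiply along the tree paths to get the leaf probabilities.
pr_u_and_r  = pr_r * pr_u_r          # Pr(U ∩ R)      = (3/10)(3/4)
pr_u_and_nr = (1 - pr_r) * pr_u_nr   # Pr(U ∩ not R)  = (7/10)(1/10)

# Third axiom: Pr(U) is the sum over the disjoint leaves where U occurs.
pr_u = pr_u_and_r + pr_u_and_nr
print(pr_u)                # 59/200
print(pr_u_and_r / pr_u)   # Pr(R|U) = 45/59
```

Using `Fraction` rather than floating point keeps every intermediate probability exact, so the final answer comes out as a clean ratio.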

Note we could also answer this using Bayes's rule and the law of total probability (see below); it would amount to exactly the same calculation. The tree just helps organize all of the variables.

Bayes's Rule

Bayes's rule is a simple way to compute [math]Pr(A|B) [/math] from [math]Pr(B|A) [/math].

Claim: (Bayes's rule): [math]Pr(A|B) = Pr(B|A)Pr(A)/Pr(B) [/math].

Proof: Expand both sides using the definition of conditional probability; the details are left as an exercise.
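We can sanity-check Bayes's rule numerically on the umbrella example (this check is not in the original notes; the helper name `bayes` is made up). Here [math]A = R [/math], [math]B = U [/math], and [math]Pr(U) = (3/10)(3/4) + (7/10)(1/10) = 59/200 [/math], the sum computed in the tree section:

```python
from fractions import Fraction

def bayes(pr_b_given_a, pr_a, pr_b):
    """Bayes's rule: Pr(A|B) = Pr(B|A) Pr(A) / Pr(B)."""
    return pr_b_given_a * pr_a / pr_b

pr_u = Fraction(59, 200)  # Pr(U), evaluated from the umbrella example
print(bayes(Fraction(3, 4), Fraction(3, 10), pr_u))  # Pr(R|U) = 45/59
```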

Law of total probability

Claim: (law of total probability) If [math]A_1, \dots, A_n [/math] partition the sample space [math]S [/math] (that is, if [math]A_i \cap A_j = \emptyset [/math] for [math]i \neq j [/math] and [math]S = \cup A_i [/math]), then

[math]Pr(B) = \sum_{i} Pr(B|A_i)Pr(A_i) [/math]

Proof sketch: Write [math]B = \cup_{i} (B \cap A_i) [/math]. Since the [math]A_i [/math] are pairwise disjoint, so are the sets [math]B \cap A_i [/math]; apply the third axiom to conclude [math]Pr(B) = \sum_{i} Pr(B \cap A_i) [/math]. Finally, apply the definition of [math]Pr(B | A_i) [/math] to each term.
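As an illustration (not from the notes), applying the law of total probability to the two-set partition [math]\{R, \bar{R}\} [/math] recovers the [math]Pr(U) [/math] computation from the umbrella example; the helper name `total_probability` is made up:

```python
from fractions import Fraction

def total_probability(terms):
    """Pr(B) = sum_i Pr(B|A_i) Pr(A_i), given (Pr(B|A_i), Pr(A_i)) pairs
    for a partition A_1, ..., A_n of the sample space."""
    return sum(pr_b_given_ai * pr_ai for pr_b_given_ai, pr_ai in terms)

# Partition {R, not R}; B = U from the umbrella example.
pr_u = total_probability([
    (Fraction(3, 4),  Fraction(3, 10)),   # (Pr(U|R),     Pr(R))
    (Fraction(1, 10), Fraction(7, 10)),   # (Pr(U|not R), Pr(not R))
])
print(pr_u)   # 59/200
```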