In this lecture we will discuss conditional probability, apply it to probability trees, and look at some useful tools such as Bayes' rule.
Please come to class prepared with these definitions:
A probability measure on a sample space [math]S[/math] is a function [math]\Pr : 2^S \to \mathbb{R}[/math] satisfying the following three properties:
- For all events [math]E[/math], [math]\Pr(E) \geq 0[/math]
- [math]\Pr(S) = 1[/math]
- For all disjoint events [math]E_1[/math] and [math]E_2[/math], [math]\Pr(E_1 \cup E_2) = \Pr(E_1) + \Pr(E_2)[/math]
These three properties are referred to as the Kolmogorov axioms.
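As an illustrative check (a hypothetical example, not one from the lecture), consider rolling a fair six-sided die, so [math]S = \{1, 2, 3, 4, 5, 6\}[/math] and [math]\Pr(E) = |E|/6[/math] for every event [math]E \subseteq S[/math]. Each axiom holds: [math]\Pr(E) = |E|/6 \geq 0[/math] for all [math]E[/math], [math]\Pr(S) = 6/6 = 1[/math], and for disjoint events the counts add, for example

[math]\Pr(\{1,2\} \cup \{3\}) = \Pr(\{1,2\}) + \Pr(\{3\}) = \frac{2}{6} + \frac{1}{6} = \frac{1}{2}.[/math]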