Basic Probability #
Probability models uncertainty (what you don’t know yet) by quantifying it as a number between 0 and 1.
- 0 means: impossible
- 1 means: certain
Sample space and events #
Sample space \(S\): all possible outcomes
Example: a die roll → \(S=\{1,2,3,4,5,6\}\)
Event \(A\): a subset of outcomes
Example: “even number” → \(A=\{2,4,6\}\)
Axioms of Probability #
For any event \(A\):
- Non-negativity
\[ P(A)\ge 0 \]
- Normalisation
\[ P(S)=1 \]
- Additivity (mutually exclusive events)
If \(A\cap B=\emptyset\), then
\[ P(A\cup B)=P(A)+P(B) \]
These axioms are the rules of the game: everything else follows from them.
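To make these rules concrete, here is a minimal Python sketch (the names `sample_space` and `prob` are illustrative, not a standard API) that models a fair die and checks each axiom numerically:

```python
from fractions import Fraction

# Represent a fair die as a mapping outcome -> probability.
sample_space = {outcome: Fraction(1, 6) for outcome in range(1, 7)}

def prob(event):
    """P(event) = sum of the probabilities of its outcomes."""
    return sum(sample_space[o] for o in event)

assert all(p >= 0 for p in sample_space.values())       # non-negativity
assert prob(sample_space.keys()) == 1                    # normalisation: P(S) = 1
assert prob({1, 6}) == prob({1}) + prob({6})             # additivity for disjoint events

print(prob({2, 4, 6}))  # 1/2 for the "even number" event
```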
Definition of Probability #
Two useful interpretations:
Classical (equally likely outcomes) #
\[ P(A)=\frac{\text{number of outcomes in }A}{\text{number of outcomes in }S} \]
Example: rolling a 6 on a fair die
\[ P(\{6\})=\frac{1}{6} \]
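A short sketch of the classical rule, using a made-up helper `classical_probability` that simply counts outcomes:

```python
# Classical probability for equally likely outcomes: |A| / |S|.
def classical_probability(event, sample_space):
    return len(event & sample_space) / len(sample_space)

S = {1, 2, 3, 4, 5, 6}                       # fair die
print(classical_probability({6}, S))         # 0.1666... = 1/6
print(classical_probability({2, 4, 6}, S))   # 0.5, the "even number" event
```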
Empirical (long-run frequency) #
\[ P(A)\approx\frac{\text{count of }A}{\text{number of trials}} \]
Example: if “heads” appears 498 times in 1000 flips
\[ P(\text{heads})\approx 0.498 \]
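The empirical view maps directly onto simulation. A sketch using Python's standard `random` module (the exact estimate will vary from run to run and seed to seed):

```python
import random

# Estimate P(heads) as a long-run frequency over simulated coin flips.
random.seed(0)
flips = [random.choice(["heads", "tails"]) for _ in range(1000)]
estimate = flips.count("heads") / len(flips)
print(estimate)  # hovers around 0.5, like the 0.498 in the example above
```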
Mutually Exclusive vs Independent Events #
These are often confused, but they are different ideas: in fact, two mutually exclusive events that each have positive probability can never be independent.
Mutually exclusive (cannot happen together) #
Events \(A\) and \(B\) are mutually exclusive if:
\[ A\cap B=\emptyset \]
If mutually exclusive:
\[ P(A\cap B)=0 \]
Independent (one does not affect the other) #
Events \(A\) and \(B\) are independent if:
\[ P(A\cap B)=P(A)\,P(B) \]
```mermaid
flowchart LR
    A["Two events: A and B"] --> B{"Can they happen together?"}
    B -->|No| C["Mutually Exclusive: A ∩ B = ∅"]
    B -->|Yes| D{"Does A change B?"}
    D -->|No| E["Independent: P(A ∩ B) = P(A) P(B)"]
    D -->|Yes| F["Dependent: use conditional probability"]
```
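One way to see the difference is to enumerate two fair dice and compute probabilities by counting; the helper `P` below is illustrative only:

```python
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))   # 36 equally likely pairs of dice

def P(event):
    """Probability of an event given equally likely outcomes."""
    return sum(1 for o in outcomes if event(o)) / len(outcomes)

A = lambda o: o[0] % 2 == 0        # first die is even
B = lambda o: o[1] % 2 == 0        # second die is even
C = lambda o: o[0] == 1            # first die shows 1

# Independent: P(A ∩ B) equals P(A) P(B)
print(P(lambda o: A(o) and B(o)), P(A) * P(B))   # 0.25  0.25

# Mutually exclusive: A and C cannot co-occur, so P(A ∩ C) = 0
print(P(lambda o: A(o) and C(o)))                # 0.0
```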
Mini-check (self-test) #
- If \(P(A)=0.4\), \(P(B)=0.3\), and \(A,B\) are independent: what is \(P(A\cap B)\)?
- If \(A,B\) are mutually exclusive: what is \(P(A\cap B)\)?
- Which measure is more robust to outliers: mean or median?
Answers:
- \(0.4\times 0.3=0.12\)
- \(0\)
- Median
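If you want to check these answers programmatically, here is a small sketch using Python's standard `statistics` module (the data values are made up for illustration):

```python
import statistics

# Independent events: probabilities multiply.
print(0.4 * 0.3)   # 0.12

# Median vs mean robustness: one large outlier drags the mean, barely moves the median.
data = [1, 2, 3, 4, 5]
with_outlier = data + [1000]
print(statistics.mean(data), statistics.mean(with_outlier))       # 3 vs ~169.2
print(statistics.median(data), statistics.median(with_outlier))   # 3 vs 3.5
```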
What’s next #
Conditional Probability & Bayes’ Theorem
This is where “given what I already know…” becomes mathematics, and where Naïve Bayes begins.