<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Conditional Probability &amp; Bayes’ Theorem on Arshad Siddiqui</title><link>https://arshadhs.github.io/docs/ai/statistics/02_conditional-probability/</link><description>Recent content in Conditional Probability &amp; Bayes’ Theorem on Arshad Siddiqui</description><generator>Hugo</generator><language>en-us</language><lastBuildDate>Thu, 12 Mar 2026 00:00:00 +0000</lastBuildDate><atom:link href="https://arshadhs.github.io/docs/ai/statistics/02_conditional-probability/index.xml" rel="self" type="application/rss+xml"/><item><title>Conditional Probability</title><link>https://arshadhs.github.io/docs/ai/statistics/02_conditional-probability/021_conditional_prob/</link><pubDate>Thu, 12 Mar 2026 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/statistics/02_conditional-probability/021_conditional_prob/</guid><description>&lt;h1 id="conditional-probability">
 Conditional Probability
 
 &lt;a class="anchor" href="#conditional-probability">#&lt;/a>
 
&lt;/h1>
&lt;p>Conditional probability updates the probability of an event when new information is available.&lt;/p>
&lt;p>It shows up whenever a question says:&lt;/p>
&lt;ul>
&lt;li>“given that…”&lt;/li>
&lt;li>“among those who…”&lt;/li>
&lt;li>“out of the items that…”&lt;/li>
&lt;li>“if it does not fail immediately…”&lt;/li>
&lt;/ul>
&lt;blockquote class="book-hint info">
&lt;p>Key takeaway:
Conditional probability is always:&lt;/p>
&lt;p>joint probability ÷ probability of the condition.&lt;/p>
&lt;p>The condition must have nonzero probability; conditioning on an impossible event is undefined.&lt;/p>
&lt;/blockquote>
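&lt;p>The definition above can be sketched in Python. All counts here are invented for illustration, not taken from the text:&lt;/p>

```python
# Minimal sketch: conditional probability = joint probability / probability of the condition.
# The counts below are made-up illustrative data.

total = 100               # items inspected
defective = 20            # event B: item is defective
defective_and_late = 5    # event "A and B": defective AND shipped late

p_b = defective / total                  # P(B) = 0.2
p_a_and_b = defective_and_late / total   # P(A and B) = 0.05

# P(A | B) = P(A and B) / P(B), defined only when P(B) > 0
p_a_given_b = p_a_and_b / p_b
print(p_a_given_b)  # 0.25
```

&lt;p>Note the restriction in the code comment: the division is meaningful only when the condition has positive probability.&lt;/p>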
&lt;hr>
&lt;h2 id="prior-vs-posterior">
 Prior vs posterior
 
 &lt;a class="anchor" href="#prior-vs-posterior">#&lt;/a>
 
&lt;/h2>
&lt;ul>
&lt;li>
&lt;p>Prior probability:
probability with no condition (before new information)&lt;/p></description></item><item><title>Bayes’ Theorem</title><link>https://arshadhs.github.io/docs/ai/statistics/02_conditional-probability/022_bayes_theorem/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/statistics/02_conditional-probability/022_bayes_theorem/</guid><description>&lt;h1 id="bayes-theorem">
 Bayes’ Theorem
 
 &lt;a class="anchor" href="#bayes-theorem">#&lt;/a>
 
&lt;/h1>
&lt;h3 id="21-total-probability-needed-for-bayes">
 2.1 Total probability (needed for Bayes)
 
 &lt;a class="anchor" href="#21-total-probability-needed-for-bayes">#&lt;/a>
 
&lt;/h3>
&lt;p>Often we split the world into cases 
&lt;span>
 \( E_1,E_2,\dots,E_k \)
 &lt;/span>

 that:&lt;/p>
&lt;ul>
&lt;li>are mutually exclusive&lt;/li>
&lt;li>cover the whole sample space&lt;/li>
&lt;/ul>
&lt;p>Then for any event 
&lt;span>
 \( A \)
 &lt;/span>

:&lt;/p>
&lt;span style="color: red;">
 &lt;span>
 \[ 
P(A)=\sum_{i=1}^{k} P(A\mid E_i)\,P(E_i)
 \]
 &lt;/span>
&lt;/span>
&lt;p>Tree intuition:&lt;/p>


&lt;pre class="mermaid">
flowchart TD
 S[Start] --&amp;gt; E1[Case E1]
 S --&amp;gt; E2[Case E2]
 S --&amp;gt; E3[Case E3]
 E1 --&amp;gt; A1[&amp;#34;A happens&amp;#34;]
 E2 --&amp;gt; A2[&amp;#34;A happens&amp;#34;]
 E3 --&amp;gt; A3[&amp;#34;A happens&amp;#34;]
&lt;/pre>
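&lt;p>The tree above is just a weighted sum over the branches. A minimal Python sketch, with assumed (illustrative) priors and conditionals:&lt;/p>

```python
# Law of total probability: P(A) = sum over i of P(A|Ei) * P(Ei).
# The priors and conditionals below are assumed for illustration.

priors      = [0.5, 0.3, 0.2]     # P(E1), P(E2), P(E3) - a partition, sums to 1
p_a_given_e = [0.10, 0.40, 0.90]  # P(A|E1), P(A|E2), P(A|E3)

# Weight each branch's conditional by the probability of reaching that branch.
p_a = sum(pa * pe for pa, pe in zip(p_a_given_e, priors))
# For these numbers: 0.05 + 0.12 + 0.18 = 0.35
```

&lt;p>Each term is one root-to-leaf path in the tree: the probability of the case times the probability of A within that case.&lt;/p>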

&lt;hr>
&lt;h3 id="22-bayes-theorem-two-event-form">
 2.2 Bayes’ theorem (two-event form)
 
 &lt;a class="anchor" href="#22-bayes-theorem-two-event-form">#&lt;/a>
 
&lt;/h3>
&lt;p>Bayes&amp;rsquo; Theorem is a mathematical formula used to determine the &lt;strong>conditional probability of an event based on prior knowledge and new evidence&lt;/strong>.&lt;/p></description></item><item><title>Naïve Bayes</title><link>https://arshadhs.github.io/docs/ai/statistics/02_conditional-probability/023_naive_bayes/</link><pubDate>Thu, 12 Mar 2026 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/statistics/02_conditional-probability/023_naive_bayes/</guid><description>&lt;h1 id="naïve-bayes">
 Naïve Bayes
 
 &lt;a class="anchor" href="#na%c3%afve-bayes">#&lt;/a>
 
&lt;/h1>
&lt;p>Naïve Bayes is a &lt;strong>probabilistic classifier&lt;/strong>.&lt;/p>
&lt;ul>
&lt;li>It addresses a supervised learning problem.&lt;/li>
&lt;li>In binary classification, the target variable takes one of two classes (e.g. Yes/No).&lt;/li>
&lt;li>Each class label is a hypothesis you want to evaluate for a given instance.&lt;/li>
&lt;li>The prior probabilities of Yes and No are computed from the class frequencies in the training data.&lt;/li>
&lt;li>The posterior is the updated probability of each class once you study the instance&amp;rsquo;s features.&lt;/li>
&lt;li>The instance is assigned to the class (hypothesis) with the maximum posterior probability.&lt;/li>
&lt;/ul>
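&lt;p>The steps above can be sketched in Python with an invented single-feature toy dataset (no smoothing, so unseen feature values get zero likelihood):&lt;/p>

```python
# Minimal Naive Bayes sketch over one categorical feature.
# The tiny "play tennis"-style dataset is invented for illustration.
from collections import Counter

# training labels and a single feature (weather)
labels  = ["Yes", "Yes", "Yes", "No", "No"]
weather = ["Sunny", "Rain", "Sunny", "Sunny", "Rain"]

def classify(x):
    """Assign x to the class with the highest P(class) * P(x | class)."""
    n = len(labels)
    best_class, best_score = None, -1.0
    for c, count in Counter(labels).items():
        prior = count / n                               # P(c): class frequency
        in_class = [w for w, y in zip(weather, labels) if y == c]
        likelihood = in_class.count(x) / len(in_class)  # P(x | c)
        score = prior * likelihood                      # proportional to the posterior
        if score > best_score:
            best_class, best_score = c, score
    return best_class
```

&lt;p>For example, classify(&amp;quot;Sunny&amp;quot;) compares 0.6 × 2/3 against 0.4 × 1/2 and returns Yes, the class with the larger score.&lt;/p>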
&lt;p>It predicts a class label by computing:&lt;/p></description></item></channel></rss>