<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Statistics on Arshad Siddiqui</title><link>https://arshadhs.github.io/categories/statistics/</link><description>Recent content in Statistics on Arshad Siddiqui</description><generator>Hugo</generator><language>en-us</language><lastBuildDate>Thu, 12 Mar 2026 00:00:00 +0000</lastBuildDate><atom:link href="https://arshadhs.github.io/categories/statistics/index.xml" rel="self" type="application/rss+xml"/><item><title>Formula Sheet</title><link>https://arshadhs.github.io/docs/ai/statistics/00_formulas/</link><pubDate>Thu, 12 Mar 2026 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/statistics/00_formulas/</guid><description>&lt;h1 id="formula-sheet">
 Formula Sheet
 
 &lt;a class="anchor" href="#formula-sheet">#&lt;/a>
 
&lt;/h1>
&lt;p>This page is a quick reference of &lt;strong>definitions + formulas&lt;/strong>, grouped by module.&lt;/p>
&lt;hr>
&lt;h2 id="notation">
 Notation
 
 &lt;a class="anchor" href="#notation">#&lt;/a>
 
&lt;/h2>
&lt;ul>
&lt;li>Sample size: 
&lt;link rel="stylesheet" href="https://arshadhs.github.io/katex/katex.min.css" />
&lt;script defer src="https://arshadhs.github.io/katex/katex.min.js">&lt;/script>

 &lt;script defer src="https://arshadhs.github.io/katex/auto-render.min.js" onload="renderMathInElement(document.body, {
 &amp;#34;delimiters&amp;#34;: [
 {&amp;#34;left&amp;#34;: &amp;#34;$$&amp;#34;, &amp;#34;right&amp;#34;: &amp;#34;$$&amp;#34;, &amp;#34;display&amp;#34;: true},
 {&amp;#34;left&amp;#34;: &amp;#34;$&amp;#34;, &amp;#34;right&amp;#34;: &amp;#34;$&amp;#34;, &amp;#34;display&amp;#34;: false},
 {&amp;#34;left&amp;#34;: &amp;#34;\\(&amp;#34;, &amp;#34;right&amp;#34;: &amp;#34;\\)&amp;#34;, &amp;#34;display&amp;#34;: false},
 {&amp;#34;left&amp;#34;: &amp;#34;\\[&amp;#34;, &amp;#34;right&amp;#34;: &amp;#34;\\]&amp;#34;, &amp;#34;display&amp;#34;: true}
 ]
});">&lt;/script>

&lt;span>
 \( n \)
 &lt;/span>

 (sample), 
&lt;span>
 \( N \)
 &lt;/span>

 (population)&lt;/li>
&lt;li>Sample mean: 
&lt;span>
 \( \bar{x} \)
 &lt;/span>

, population mean: 
&lt;span>
 \( \mu \)
 &lt;/span>

&lt;/li>
&lt;li>Sample variance: 
&lt;span>
 \( s^2 \)
 &lt;/span>

, population variance: 
&lt;span>
 \( \sigma^2 \)
 &lt;/span>

&lt;/li>
&lt;li>Sample SD: 
&lt;span>
 \( s \)
 &lt;/span>

, population SD: 
&lt;span>
 \( \sigma \)
 &lt;/span>

&lt;/li>
&lt;li>Complement: 
&lt;span>
 \( A^c \)
 &lt;/span>

&lt;/li>
&lt;li>Intersection (“and”): 
&lt;span>
 \( A\cap B \)
 &lt;/span>

, union (“or”): 
&lt;span>
 \( A\cup B \)
 &lt;/span>

&lt;/li>
&lt;li>Conditional probability: 
&lt;span>
 \( P(A\mid B) \)
 &lt;/span>

&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h1 id="1-basic-probability--statistics">
 1. Basic Probability &amp;amp; Statistics
 
 &lt;a class="anchor" href="#1-basic-probability--statistics">#&lt;/a>
 
&lt;/h1>
&lt;h2 id="11-measures-of-central-tendency">
 1.1 Measures of Central Tendency
 
 &lt;a class="anchor" href="#11-measures-of-central-tendency">#&lt;/a>
 
&lt;/h2>
&lt;h3 id="arithmetic-mean">
 Arithmetic mean
 
 &lt;a class="anchor" href="#arithmetic-mean">#&lt;/a>
 
&lt;/h3>
&lt;p>Sample mean (ungrouped):&lt;/p></description></item><item><title>Stats Formula Sheet</title><link>https://arshadhs.github.io/docs/ai/statistics/ism-formula-sheet/</link><pubDate>Wed, 25 Feb 2026 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/statistics/ism-formula-sheet/</guid><description>&lt;h1 id="stats-formula-sheet">
 Stats Formula Sheet
 
 &lt;a class="anchor" href="#stats-formula-sheet">#&lt;/a>
 
&lt;/h1>
&lt;p>Keep this page as a quick reference of &lt;strong>definitions + formulas&lt;/strong>.&lt;/p>
&lt;hr>
&lt;h2 id="notation">
 Notation
 
 &lt;a class="anchor" href="#notation">#&lt;/a>
 
&lt;/h2>
&lt;ul>
&lt;li>Sample size: 
&lt;span>
 \( n \)
 &lt;/span>

 (sample), 
&lt;span>
 \( N \)
 &lt;/span>

 (population)&lt;/li>
&lt;li>Mean: 
&lt;span>
 \( \bar{x} \)
 &lt;/span>

 (sample), 
&lt;span>
 \( \mu \)
 &lt;/span>

 (population)&lt;/li>
&lt;li>Variance: 
&lt;span>
 \( s^2 \)
 &lt;/span>

 (sample), 
&lt;span>
 \( \sigma^2 \)
 &lt;/span>

 (population)&lt;/li>
&lt;li>Standard deviation: 
&lt;span>
 \( s \)
 &lt;/span>

 (sample), 
&lt;span>
 \( \sigma \)
 &lt;/span>

 (population)&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="module-1-basic-statistics">
 Module 1: Basic Statistics
 
 &lt;a class="anchor" href="#module-1-basic-statistics">#&lt;/a>
 
&lt;/h2>
&lt;h3 id="measures-of-central-tendency">
 Measures of Central Tendency
 
 &lt;a class="anchor" href="#measures-of-central-tendency">#&lt;/a>
 
&lt;/h3>
&lt;p>&lt;strong>Sample mean (ungrouped):&lt;/strong>&lt;/p></description></item><item><title>Basic Statistics</title><link>https://arshadhs.github.io/docs/ai/statistics/01_basic_statistics/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/statistics/01_basic_statistics/</guid><description>&lt;h1 id="basic-statistics">
 Basic Statistics
 
 &lt;a class="anchor" href="#basic-statistics">#&lt;/a>
 
&lt;/h1>
&lt;p>&lt;strong>Statistics&lt;/strong>: describes data (what you &lt;em>see&lt;/em>).&lt;br>
&lt;strong>Probability&lt;/strong>: models uncertainty (what you &lt;em>don’t know&lt;/em> yet).&lt;/p>
&lt;ul>
&lt;li>Summarise a dataset using central tendency and variability&lt;/li>
&lt;li>Explain core probability ideas using simple examples&lt;/li>
&lt;li>Apply the axioms of probability&lt;/li>
&lt;li>Distinguish mutually exclusive vs independent events&lt;/li>
&lt;/ul>
&lt;hr>


&lt;script src="https://arshadhs.github.io/mermaid.min.js">&lt;/script>

 &lt;script>mermaid.initialize({
 "flowchart": {
 "useMaxWidth":true
 },
 "theme": "default"
}
)&lt;/script>




&lt;pre class="mermaid">
flowchart TD
 A[Dataset] --&amp;gt; B[Central Tendency]
 A --&amp;gt; C[Variability]
 B --&amp;gt; B1[Mean]
 B --&amp;gt; B2[Median]
 B --&amp;gt; B3[Mode]
 C --&amp;gt; C1[Range]
 C --&amp;gt; C2[Variance]
 C --&amp;gt; C3[Standard Deviation]
 C --&amp;gt; C4[IQR]
&lt;/pre>
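The quantities in the chart above can be computed with Python's standard `statistics` module (a minimal sketch; the sample data is invented for illustration):

```python
import statistics

data = [4, 8, 6, 5, 3, 8, 9, 5, 8]

# Central tendency: where the "middle" of the data is
mean = statistics.mean(data)      # arithmetic average
median = statistics.median(data)  # middle value when sorted
mode = statistics.mode(data)      # most frequent value

# Variability: how spread out the data is
data_range = max(data) - min(data)
variance = statistics.variance(data)  # sample variance (n - 1 denominator)
stdev = statistics.stdev(data)        # sample standard deviation

q1, q2, q3 = statistics.quantiles(data, n=4)  # quartiles
iqr = q3 - q1                                 # interquartile range
```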

&lt;hr>
&lt;h2 id="measures-of-central-tendency">
 Measures of Central Tendency
 
 &lt;a class="anchor" href="#measures-of-central-tendency">#&lt;/a>
 
&lt;/h2>
&lt;p>Central tendency tells you where the “middle” of the data is.
It summarises a set of scores with a &lt;strong>single number&lt;/strong> that describes the overall &lt;strong>performance&lt;/strong> of the group.&lt;/p></description></item><item><title>Basic Probability</title><link>https://arshadhs.github.io/docs/ai/statistics/01_basic_probability/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/statistics/01_basic_probability/</guid><description>&lt;h1 id="basic-probability">
 Basic Probability
 
 &lt;a class="anchor" href="#basic-probability">#&lt;/a>
 
&lt;/h1>
&lt;p>Probability models uncertainty:
what you &lt;em>don’t know&lt;/em> yet, but want to reason about.&lt;/p>
&lt;blockquote class="book-hint info">
&lt;p>Key takeaway:
Probability is a number between &lt;strong>0 and 1&lt;/strong> that measures how likely an event is.
The whole topic is about defining &lt;strong>events&lt;/strong> clearly and applying a few core rules consistently.&lt;/p>
&lt;/blockquote>
&lt;p>Probability quantifies uncertainty: a number between 0 and 1.&lt;/p>
&lt;ul>
&lt;li>0 means: impossible&lt;/li>
&lt;li>1 means: certain&lt;/li>
&lt;/ul>
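For equally likely outcomes, this 0-to-1 scale can be made concrete with Python's `fractions` module (an illustrative sketch; the die example is ours, not from the text):

```python
from fractions import Fraction

# Sample space for one roll of a fair six-sided die
outcomes = {1, 2, 3, 4, 5, 6}

def prob(event):
    """Classical probability: favourable outcomes over total outcomes."""
    return Fraction(len(event), len(outcomes))

even = {2, 4, 6}
impossible = set()   # e.g. rolling a 7
certain = outcomes   # some face always shows

print(prob(even))        # 1/2
print(prob(impossible))  # 0, an impossible event
print(prob(certain))     # 1, a certain event
```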
&lt;hr>
&lt;h2 id="terminology">
 Terminology
 
 &lt;a class="anchor" href="#terminology">#&lt;/a>
 
&lt;/h2>
&lt;h3 id="random-experiment">
 Random experiment
 
 &lt;a class="anchor" href="#random-experiment">#&lt;/a>
 
&lt;/h3>
&lt;p>A random experiment is an action whose outcome is not known in advance.&lt;/p></description></item><item><title>Conditional Probability &amp; Bayes’ Theorem</title><link>https://arshadhs.github.io/docs/ai/statistics/conditional-probability/</link><pubDate>Thu, 12 Mar 2026 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/statistics/conditional-probability/</guid><description>&lt;h1 id="conditional-probability--bayes-theorem">
 Conditional Probability &amp;amp; Bayes’ Theorem
 
 &lt;a class="anchor" href="#conditional-probability--bayes-theorem">#&lt;/a>
 
&lt;/h1>
&lt;p>Probability often changes when we &lt;strong>learn new information&lt;/strong>.&lt;/p>
&lt;p>Conditional probability and Bayes’ theorem give a structured way to &lt;strong>update beliefs&lt;/strong> using evidence.&lt;/p>
&lt;blockquote class="book-hint info">
&lt;p>Conditional probability updates probabilities after observing an event.&lt;/p>
&lt;p>Bayes’ theorem lets you estimate a hidden cause from observed evidence.&lt;/p>
&lt;p>Naïve Bayes turns Bayes’ theorem into a practical classifier by assuming conditional independence of features given the class.&lt;/p>
&lt;/blockquote>
&lt;hr>


&lt;script src="https://arshadhs.github.io/mermaid.min.js">&lt;/script>

 &lt;script>mermaid.initialize({
 "flowchart": {
 "useMaxWidth":true
 },
 "theme": "default"
}
)&lt;/script>




&lt;pre class="mermaid">
flowchart TD

A[Conditional&amp;lt;br/&amp;gt;probability] --&amp;gt;|foundation| B[Bayes&amp;lt;br/&amp;gt;theorem]
D[Independent&amp;lt;br/&amp;gt;events] --&amp;gt;|implies| C[Independence]
C --&amp;gt;|simplifies| A

E[Prior] --&amp;gt;|with likelihood| B
F[Likelihood] --&amp;gt;|updates| H[Posterior]
G[Evidence] --&amp;gt;|normalises| B
B --&amp;gt;|yields| H

I[Naïve&amp;lt;br/&amp;gt;Bayes] --&amp;gt;|uses| B
J[Naïve&amp;lt;br/&amp;gt;assumption] --&amp;gt;|assumes| C
K[Features] --&amp;gt;|given class| J
L[Class] --&amp;gt;|conditions| J
I --&amp;gt;|predicts| M[Classification]
M --&amp;gt;|selects| L

style A fill:#90CAF9,stroke:#1E88E5,color:#000
style B fill:#90CAF9,stroke:#1E88E5,color:#000
style C fill:#90CAF9,stroke:#1E88E5,color:#000

style D fill:#CE93D8,stroke:#8E24AA,color:#000
style E fill:#CE93D8,stroke:#8E24AA,color:#000
style F fill:#CE93D8,stroke:#8E24AA,color:#000
style G fill:#CE93D8,stroke:#8E24AA,color:#000
style J fill:#CE93D8,stroke:#8E24AA,color:#000
style K fill:#CE93D8,stroke:#8E24AA,color:#000
style L fill:#CE93D8,stroke:#8E24AA,color:#000

style H fill:#C8E6C9,stroke:#2E7D32,color:#000
style I fill:#C8E6C9,stroke:#2E7D32,color:#000
style M fill:#C8E6C9,stroke:#2E7D32,color:#000

&lt;/pre>

&lt;hr>
&lt;h2 id="quick-summary">
 Quick summary
 
 &lt;a class="anchor" href="#quick-summary">#&lt;/a>
 
&lt;/h2>
&lt;ul>
&lt;li>Conditional probability:
updates probability after an event is known.&lt;/li>
&lt;li>Multiplication rule:
computes joint probability from conditional parts.&lt;/li>
&lt;li>Independence:
tested using 
&lt;link rel="stylesheet" href="https://arshadhs.github.io/katex/katex.min.css" />
&lt;script defer src="https://arshadhs.github.io/katex/katex.min.js">&lt;/script>

 &lt;script defer src="https://arshadhs.github.io/katex/auto-render.min.js" onload="renderMathInElement(document.body, {
 &amp;#34;delimiters&amp;#34;: [
 {&amp;#34;left&amp;#34;: &amp;#34;$$&amp;#34;, &amp;#34;right&amp;#34;: &amp;#34;$$&amp;#34;, &amp;#34;display&amp;#34;: true},
 {&amp;#34;left&amp;#34;: &amp;#34;$&amp;#34;, &amp;#34;right&amp;#34;: &amp;#34;$&amp;#34;, &amp;#34;display&amp;#34;: false},
 {&amp;#34;left&amp;#34;: &amp;#34;\\(&amp;#34;, &amp;#34;right&amp;#34;: &amp;#34;\\)&amp;#34;, &amp;#34;display&amp;#34;: false},
 {&amp;#34;left&amp;#34;: &amp;#34;\\[&amp;#34;, &amp;#34;right&amp;#34;: &amp;#34;\\]&amp;#34;, &amp;#34;display&amp;#34;: true}
 ]
});">&lt;/script>

&lt;span>
 \( P(A\cap B)=P(A)P(B) \)
 &lt;/span>

.&lt;/li>
&lt;li>Total probability:
breaks a probability into weighted cases.&lt;/li>
&lt;li>Bayes’ theorem:
reverses conditioning to infer causes from evidence.&lt;/li>
&lt;/ul>
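The rules in this summary chain together in a few lines (a hedged numeric sketch; the 1%-prevalence diagnostic-test numbers are invented for illustration):

```python
# Invented example: a test for a condition with 1% prevalence
p_d = 0.01          # prior P(D)
p_pos_d = 0.95      # likelihood P(+ | D)
p_pos_not_d = 0.05  # false-positive rate P(+ | not D)

# Total probability: P(+) = P(+|D)P(D) + P(+|not D)P(not D)
p_pos = p_pos_d * p_d + p_pos_not_d * (1 - p_d)

# Bayes' theorem: P(D | +) = P(+|D)P(D) / P(+)
p_d_pos = p_pos_d * p_d / p_pos

print(round(p_pos, 4))    # 0.059
print(round(p_d_pos, 3))  # 0.161
```

Even with a 95% sensitive test, the posterior is far below 95% because the prior is small; this is the belief update the summary describes.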
&lt;hr>
&lt;h2 id="whats-next">
 What’s next
 
 &lt;a class="anchor" href="#whats-next">#&lt;/a>
 
&lt;/h2>
&lt;p>Probability Distributions&lt;br>
Move from events to random variables and distributions.&lt;/p></description></item><item><title>Conditional Probability</title><link>https://arshadhs.github.io/docs/ai/statistics/conditional-probability/021_conditional_prob/</link><pubDate>Thu, 12 Mar 2026 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/statistics/conditional-probability/021_conditional_prob/</guid><description>&lt;h1 id="conditional-probability">
 Conditional Probability
 
 &lt;a class="anchor" href="#conditional-probability">#&lt;/a>
 
&lt;/h1>
&lt;p>Conditional probability updates the probability of an event when new information is available.&lt;/p>
&lt;p>It shows up whenever a question says:&lt;/p>
&lt;ul>
&lt;li>“given that…”&lt;/li>
&lt;li>“among those who…”&lt;/li>
&lt;li>“out of the items that…”&lt;/li>
&lt;li>“if it does not fail immediately…”&lt;/li>
&lt;/ul>
&lt;blockquote class="book-hint info">
&lt;p>Key takeaway:
Conditional probability is always:&lt;/p>
&lt;p>joint probability ÷ probability of the condition.&lt;/p>
&lt;p>The condition must not be an impossible event.&lt;/p>
&lt;/blockquote>
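The definition in the hint can be written down directly (a minimal sketch; the deck-of-cards numbers are our own example, not from the text):

```python
from fractions import Fraction

# Example: draw one card from a standard 52-card deck.
# A = "card is a king", B = "card is a face card" (J, Q, K)
p_b = Fraction(12, 52)       # probability of the condition
p_a_and_b = Fraction(4, 52)  # joint probability: every king is a face card

def conditional(p_joint, p_condition):
    """P(A | B) = P(A and B) / P(B); the condition must be possible."""
    if p_condition == 0:
        raise ValueError("cannot condition on an impossible event")
    return p_joint / p_condition

print(conditional(p_a_and_b, p_b))  # 1/3
```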
&lt;hr>
&lt;h2 id="prior-vs-posterior">
 Prior vs posterior
 
 &lt;a class="anchor" href="#prior-vs-posterior">#&lt;/a>
 
&lt;/h2>
&lt;ul>
&lt;li>
&lt;p>Prior probability:
probability with no condition (before new information)&lt;/p></description></item><item><title>Bayes’ Theorem</title><link>https://arshadhs.github.io/docs/ai/statistics/conditional-probability/022_bayes_theorem/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/statistics/conditional-probability/022_bayes_theorem/</guid><description>&lt;h1 id="bayes-theorem">
 Bayes’ Theorem
 
 &lt;a class="anchor" href="#bayes-theorem">#&lt;/a>
 
&lt;/h1>
&lt;h3 id="21-total-probability-needed-for-bayes">
 2.1 Total probability (needed for Bayes)
 
 &lt;a class="anchor" href="#21-total-probability-needed-for-bayes">#&lt;/a>
 
&lt;/h3>
&lt;p>Often we split the world into cases 
&lt;span>
 \( E_1,E_2,\dots,E_k \)
 &lt;/span>

 that:&lt;/p>
&lt;ul>
&lt;li>are mutually exclusive&lt;/li>
&lt;li>cover the whole sample space&lt;/li>
&lt;/ul>
&lt;p>Then for any event 
&lt;span>
 \( A \)
 &lt;/span>

:&lt;/p>
&lt;span style="color: red;">
 &lt;span>
 \[ 
P(A)=\sum_{i=1}^{k} P(A\mid E_i)\,P(E_i)
 \]
 &lt;/span>
&lt;/span>
&lt;p>Tree intuition:&lt;/p>


&lt;pre class="mermaid">
flowchart TD
 S[Start] --&amp;gt; E1[Case E1]
 S --&amp;gt; E2[Case E2]
 S --&amp;gt; E3[Case E3]
 E1 --&amp;gt; A1[&amp;#34;A happens&amp;#34;]
 E2 --&amp;gt; A2[&amp;#34;A happens&amp;#34;]
 E3 --&amp;gt; A3[&amp;#34;A happens&amp;#34;]
&lt;/pre>
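Summing over the tree branches above can be computed directly (an illustrative sketch; the three-case factory numbers are invented):

```python
# Invented cases E1, E2, E3 partitioning the sample space,
# e.g. three factories producing 50%, 30%, 20% of all items
priors = [0.5, 0.3, 0.2]          # P(E_i), must sum to 1
likelihoods = [0.01, 0.02, 0.05]  # P(A | E_i), e.g. defect rate per factory

# Law of total probability: P(A) is the sum of P(A | E_i) P(E_i)
p_a = sum(lk * pr for lk, pr in zip(likelihoods, priors))

print(round(p_a, 3))  # 0.021
```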

&lt;hr>
&lt;h3 id="22-bayes-theorem-two-event-form">
 2.2 Bayes’ theorem (two-event form)
 
 &lt;a class="anchor" href="#22-bayes-theorem-two-event-form">#&lt;/a>
 
&lt;/h3>
&lt;p>Bayes&amp;rsquo; Theorem is a mathematical formula used to determine the &lt;strong>conditional probability of an event based on prior knowledge and new evidence&lt;/strong>.&lt;/p></description></item><item><title>Naïve Bayes</title><link>https://arshadhs.github.io/docs/ai/statistics/conditional-probability/023_naive_bayes/</link><pubDate>Thu, 12 Mar 2026 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/statistics/conditional-probability/023_naive_bayes/</guid><description>&lt;h1 id="naïve-bayes">
 Naïve Bayes
 
 &lt;a class="anchor" href="#na%c3%afve-bayes">#&lt;/a>
 
&lt;/h1>
&lt;p>Naïve Bayes is a &lt;strong>probabilistic classifier&lt;/strong>.&lt;/p>
&lt;ul>
&lt;li>Supervised learning problem&lt;/li>
&lt;li>Binary classification: the target variable takes one of two classes&lt;/li>
&lt;li>The hypothesis is the class label you want to assign&lt;/li>
&lt;li>The prior (total probability) of each class is computed first from the data&lt;/li>
&lt;li>The posterior is what you obtain after conditioning on the observed features&lt;/li>
&lt;li>The instance is assigned to the class with the maximum posterior probability&lt;/li>
&lt;/ul>
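The workflow above can be sketched as a tiny count-based classifier (a hedged toy example; the one-feature weather/play dataset is our own stand-in, not from the text):

```python
from collections import Counter

# Toy training data: (feature, class); the class is the binary target
data = [("sunny", "No"), ("sunny", "No"), ("overcast", "Yes"),
        ("rainy", "Yes"), ("rainy", "Yes"), ("sunny", "Yes"),
        ("overcast", "Yes"), ("rainy", "No")]

classes = Counter(c for _, c in data)  # prior counts of Yes/No
total = sum(classes.values())

def posterior_score(feature, cls):
    """Unnormalised posterior: P(feature | class) * P(class)."""
    in_class = [f for f, c in data if c == cls]
    likelihood = in_class.count(feature) / len(in_class)
    prior = classes[cls] / total
    return likelihood * prior

def classify(feature):
    # Pick the hypothesis (class) with the maximum posterior score
    return max(classes, key=lambda cls: posterior_score(feature, cls))

print(classify("sunny"))  # No
```

With several features, Naïve Bayes multiplies one likelihood per feature, which is exactly where the conditional-independence assumption enters.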
&lt;p>It predicts a class label by computing:&lt;/p></description></item><item><title>Probability Distributions</title><link>https://arshadhs.github.io/docs/ai/statistics/probability_distributions/</link><pubDate>Sun, 22 Feb 2026 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/statistics/probability_distributions/</guid><description>&lt;h1 id="probability-distributions">
 Probability Distributions
 
 &lt;a class="anchor" href="#probability-distributions">#&lt;/a>
 
&lt;/h1>
&lt;p>Probability distributions are the bridge between:
real-world randomness and mathematical modelling.&lt;/p>
&lt;p>A random experiment produces outcomes.
A random variable turns those outcomes into numbers.
A probability distribution tells you how likely each number (or range of numbers) is.&lt;/p>
&lt;blockquote class="book-hint info">
&lt;p>Key takeaway:
A distribution is a complete “story” about uncertainty:
what values are possible, how likely they are, and how we summarise them (mean, variance).&lt;/p>
&lt;/blockquote>
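That "story" can be made concrete for a small discrete random variable (a minimal sketch; the fair-die example is ours):

```python
# A probability distribution as an explicit value/probability table:
# X = score of one roll of a fair six-sided die
pmf = {x: 1 / 6 for x in range(1, 7)}

# The pmf must sum to 1 over all possible values
assert round(sum(pmf.values()), 9) == 1.0

# Summaries of the distribution
mean = sum(x * p for x, p in pmf.items())                    # E[X]
variance = sum((x - mean) ** 2 * p for x, p in pmf.items())  # Var(X)

print(round(mean, 2))      # 3.5
print(round(variance, 2))  # 2.92
```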
&lt;hr>


&lt;script src="https://arshadhs.github.io/mermaid.min.js">&lt;/script>

 &lt;script>mermaid.initialize({
 "flowchart": {
 "useMaxWidth":true
 },
 "theme": "default"
}
)&lt;/script>




&lt;pre class="mermaid">
flowchart TD
	PD[&amp;#34;Probability&amp;lt;br/&amp;gt;distributions&amp;#34;] --&amp;gt; RV[&amp;#34;Random&amp;lt;br/&amp;gt;variables&amp;#34;]
	PD[&amp;#34;Probability&amp;lt;br/&amp;gt;distributions&amp;#34;] --&amp;gt; DS[&amp;#34;Common&amp;lt;br/&amp;gt;distributions&amp;#34;]

	style PD fill:#90CAF9,stroke:#1E88E5,color:#000
	style RV fill:#90CAF9,stroke:#1E88E5,color:#000
	style DS fill:#90CAF9,stroke:#1E88E5,color:#000
&lt;/pre>

&lt;hr>
&lt;h2 id="aiml-connection">
 AI/ML Connection
 
 &lt;a class="anchor" href="#aiml-connection">#&lt;/a>
 
&lt;/h2>
&lt;ul>
&lt;li>Many ML models are probabilistic:
they assume data (or errors) follow a distribution.&lt;/li>
&lt;li>Loss functions often come from distribution assumptions:
squared loss aligns with Gaussian noise.&lt;/li>
&lt;li>Naïve Bayes (from the previous module) becomes practical once you can model:

&lt;link rel="stylesheet" href="https://arshadhs.github.io/katex/katex.min.css" />
&lt;script defer src="https://arshadhs.github.io/katex/katex.min.js">&lt;/script>

 &lt;script defer src="https://arshadhs.github.io/katex/auto-render.min.js" onload="renderMathInElement(document.body, {
 &amp;#34;delimiters&amp;#34;: [
 {&amp;#34;left&amp;#34;: &amp;#34;$$&amp;#34;, &amp;#34;right&amp;#34;: &amp;#34;$$&amp;#34;, &amp;#34;display&amp;#34;: true},
 {&amp;#34;left&amp;#34;: &amp;#34;$&amp;#34;, &amp;#34;right&amp;#34;: &amp;#34;$&amp;#34;, &amp;#34;display&amp;#34;: false},
 {&amp;#34;left&amp;#34;: &amp;#34;\\(&amp;#34;, &amp;#34;right&amp;#34;: &amp;#34;\\)&amp;#34;, &amp;#34;display&amp;#34;: false},
 {&amp;#34;left&amp;#34;: &amp;#34;\\[&amp;#34;, &amp;#34;right&amp;#34;: &amp;#34;\\]&amp;#34;, &amp;#34;display&amp;#34;: true}
 ]
});">&lt;/script>

&lt;span>
 \( P(X\mid Y) \)
 &lt;/span>

 using suitable distributions.&lt;/li>
&lt;/ul>
&lt;blockquote class="book-hint warning">
&lt;p>In practice:
choosing a distribution is a modelling decision.
It affects:
prediction, uncertainty estimates, and what “rare” or “typical” means in your data.&lt;/p></description></item><item><title>Random Variables</title><link>https://arshadhs.github.io/docs/ai/statistics/probability_distributions/random-variables/</link><pubDate>Sun, 22 Feb 2026 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/statistics/probability_distributions/random-variables/</guid><description>&lt;h1 id="random-variables">
 Random Variables
 
 &lt;a class="anchor" href="#random-variables">#&lt;/a>
 
&lt;/h1>
&lt;p>A random variable is a way to attach numbers to outcomes of a random experiment.&lt;/p>
&lt;p>It lets us move from:
“what happened?”
to:
“what number should we analyse?”&lt;/p>
&lt;blockquote class="book-hint info">
&lt;p>Key takeaway:
A random variable is a &lt;em>function&lt;/em> from the sample space to real numbers.
Once you define the random variable clearly, the rest (pmf/pdf/cdf, mean, variance) becomes systematic.&lt;/p>
&lt;/blockquote>
&lt;hr>


&lt;script src="https://arshadhs.github.io/mermaid.min.js">&lt;/script>

 &lt;script>mermaid.initialize({
 "flowchart": {
 "useMaxWidth":true
 },
 "theme": "default"
}
)&lt;/script>




&lt;pre class="mermaid">
flowchart TD
PD[&amp;#34;Probability&amp;lt;br/&amp;gt;distributions&amp;#34;] --&amp;gt; RV[&amp;#34;Random&amp;lt;br/&amp;gt;variables&amp;#34;]

RV --&amp;gt; T[&amp;#34;Types&amp;#34;]
T --&amp;gt; RV1[&amp;#34;Discrete&amp;lt;br/&amp;gt;RVs&amp;#34;]
T --&amp;gt; RV2[&amp;#34;Continuous&amp;lt;br/&amp;gt;RVs&amp;#34;]

RV --&amp;gt; F[&amp;#34;PMF / PDF / CDF&amp;#34;]
RV --&amp;gt; S[&amp;#34;Mean / Variance&amp;lt;br/&amp;gt;Covariance&amp;#34;]
RV --&amp;gt; J[&amp;#34;Joint &amp;amp; Marginal&amp;lt;br/&amp;gt;distributions&amp;#34;]
RV --&amp;gt; X[&amp;#34;Transformations&amp;#34;]

style PD fill:#90CAF9,stroke:#1E88E5,color:#000
style RV fill:#90CAF9,stroke:#1E88E5,color:#000

style T fill:#CE93D8,stroke:#8E24AA,color:#000
style F fill:#CE93D8,stroke:#8E24AA,color:#000
style S fill:#CE93D8,stroke:#8E24AA,color:#000
style J fill:#CE93D8,stroke:#8E24AA,color:#000
style X fill:#CE93D8,stroke:#8E24AA,color:#000
style RV1 fill:#CE93D8,stroke:#8E24AA,color:#000
style RV2 fill:#CE93D8,stroke:#8E24AA,color:#000
&lt;/pre>

&lt;hr>
&lt;h2 id="1-definition">
 1) Definition
 
 &lt;a class="anchor" href="#1-definition">#&lt;/a>
 
&lt;/h2>
&lt;p>Random variable:
a rule that assigns a number to each outcome.&lt;/p></description></item><item><title>Common Probability Distributions</title><link>https://arshadhs.github.io/docs/ai/statistics/probability_distributions/common-distributions/</link><pubDate>Sun, 22 Feb 2026 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/statistics/probability_distributions/common-distributions/</guid><description>&lt;h1 id="common-probability-distributions">
 Common Probability Distributions
 
 &lt;a class="anchor" href="#common-probability-distributions">#&lt;/a>
 
&lt;/h1>
&lt;p>Once you can describe a random variable using a pmf or pdf, the next step is to use
named distributions that appear repeatedly in real data and in ML models.&lt;/p>
&lt;blockquote class="book-hint info">
&lt;p>Key takeaway:
Named distributions give you ready-made probability models for common patterns:
binary outcomes, counts, and measurement noise.&lt;/p>
&lt;/blockquote>
&lt;hr>


&lt;pre class="mermaid">
flowchart TD
PD[&amp;#34;Probability&amp;lt;br/&amp;gt;distributions&amp;#34;] --&amp;gt; DS[&amp;#34;Common&amp;lt;br/&amp;gt;distributions&amp;#34;]

DS --&amp;gt; DIS[&amp;#34;Discrete&amp;#34;]
DS --&amp;gt; CON[&amp;#34;Continuous&amp;#34;]

DIS --&amp;gt; D1[&amp;#34;Bernoulli&amp;#34;]
DIS --&amp;gt; D2[&amp;#34;Binomial&amp;#34;]
DIS --&amp;gt; D3[&amp;#34;Poisson&amp;#34;]

CON --&amp;gt; D4[&amp;#34;Normal&amp;lt;br/&amp;gt;(Gaussian)&amp;#34;]
CON --&amp;gt; D5[&amp;#34;t / Chi-square / F&amp;lt;br/&amp;gt;(intro)&amp;#34;]

style PD fill:#90CAF9,stroke:#1E88E5,color:#000
style DS fill:#90CAF9,stroke:#1E88E5,color:#000

style DIS fill:#CE93D8,stroke:#8E24AA,color:#000
style CON fill:#CE93D8,stroke:#8E24AA,color:#000

style D1 fill:#C8E6C9,stroke:#2E7D32,color:#000
style D2 fill:#C8E6C9,stroke:#2E7D32,color:#000
style D3 fill:#C8E6C9,stroke:#2E7D32,color:#000
style D4 fill:#C8E6C9,stroke:#2E7D32,color:#000
style D5 fill:#C8E6C9,stroke:#2E7D32,color:#000
&lt;/pre>

&lt;hr>
&lt;h2 id="1-bernoulli-distribution-binary">
 1) Bernoulli distribution (binary)
 
 &lt;a class="anchor" href="#1-bernoulli-distribution-binary">#&lt;/a>
 
&lt;/h2>
&lt;p>Use when:
one trial has two outcomes (success/failure).&lt;/p></description></item><item><title>Hypothesis Testing</title><link>https://arshadhs.github.io/docs/ai/statistics/04_hypothesis_testing/</link><pubDate>Thu, 12 Mar 2026 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/statistics/04_hypothesis_testing/</guid><description>&lt;h1 id="hypothesis-testing">
 Hypothesis Testing
 
 &lt;a class="anchor" href="#hypothesis-testing">#&lt;/a>
 
&lt;/h1>
&lt;p>Hypothesis testing is a structured way to decide:&lt;/p>
&lt;p>Is what we see in a sample just random variation,
or is there evidence of a real effect in the population?&lt;/p>
&lt;p>Hypothesis testing sits inside &lt;strong>inferential statistics&lt;/strong>:
we use a &lt;strong>sample&lt;/strong> to make a statement about a &lt;strong>population&lt;/strong>.&lt;/p>
&lt;ul>
&lt;li>Sampling (random and stratified)&lt;/li>
&lt;li>Sampling distribution and Central Limit Theorem&lt;/li>
&lt;li>Estimation (confidence intervals and confidence level)&lt;/li>
&lt;li>Testing hypotheses (mean, proportion, ANOVA)&lt;/li>
&lt;li>Maximum likelihood (MLE)&lt;/li>
&lt;/ul>
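The sample-vs-population logic can be seen in a one-sample test statistic (a hedged sketch of a z-statistic; the numbers are invented, and in practice a library such as scipy.stats would supply the p-value):

```python
import math

# Invented example: claimed population mean mu0 = 100, known sigma = 15.
# A sample of n = 36 observations gives a sample mean of 105.
mu0 = 100.0
sigma = 15.0
n = 36
x_bar = 105.0

# Test statistic: how many standard errors the sample mean sits from mu0
standard_error = sigma / math.sqrt(n)
z = (x_bar - mu0) / standard_error

print(round(z, 2))  # 2.0
```

A z of 2.0 exceeds the usual 1.96 cutoff at the 5% significance level, so this sample would count as evidence of a real effect.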
&lt;blockquote class="book-hint info">
&lt;p>Key takeaway:
The logic is always the same:&lt;/p></description></item><item><title>Prediction &amp; Forecasting</title><link>https://arshadhs.github.io/docs/ai/statistics/05_prediction_n_forecasting/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/statistics/05_prediction_n_forecasting/</guid><description>&lt;h1 id="prediction--forecasting">
 Prediction &amp;amp; Forecasting
 
 &lt;a class="anchor" href="#prediction--forecasting">#&lt;/a>
 
&lt;/h1>
&lt;h2 id="correlation">
 Correlation
 
 &lt;a class="anchor" href="#correlation">#&lt;/a>
 
&lt;/h2>
&lt;h2 id="regression">
 Regression
 
 &lt;a class="anchor" href="#regression">#&lt;/a>
 
&lt;/h2>
&lt;h2 id="time-series-analysis">
 Time Series Analysis
 
 &lt;a class="anchor" href="#time-series-analysis">#&lt;/a>
 
&lt;/h2>
&lt;h3 id="introduction-components-of-time-series-data">
 Introduction, Components of time series data
 
 &lt;a class="anchor" href="#introduction-components-of-time-series-data">#&lt;/a>
 
&lt;/h3>
&lt;h3 id="ma-model--basic-and-weighted-ma-model">
 MA model – basic and weighted MA model
 
 &lt;a class="anchor" href="#ma-model--basic-and-weighted-ma-model">#&lt;/a>
 
&lt;/h3>
&lt;h3 id="time-series-models">
 Time series models
 
 &lt;a class="anchor" href="#time-series-models">#&lt;/a>
 
&lt;/h3>
&lt;ul>
&lt;li>AR Model&lt;/li>
&lt;li>ARIMA Model&lt;/li>
&lt;li>SARIMA, SARIMAX, VAR, VARMAX&lt;/li>
&lt;li>Simple exponential smoothing model&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="reference">
 Reference
 
 &lt;a class="anchor" href="#reference">#&lt;/a>
 
&lt;/h2>
&lt;p>&lt;a href="">Prediction &amp;amp; Forecasting&lt;/a>&lt;/p>
&lt;hr>
&lt;p>&lt;a href="https://arshadhs.github.io/">Home&lt;/a> | &lt;a href="https://arshadhs.github.io/docs/ai/statistics/">
 Statistics
&lt;/a>&lt;/p></description></item><item><title>Gaussian Mixture model &amp; Expectation Maximization</title><link>https://arshadhs.github.io/docs/ai/statistics/06_prediction_n_forecasting/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/statistics/06_prediction_n_forecasting/</guid><description>&lt;h1 id="gaussian-mixture-model--expectation-maximization">
 Gaussian Mixture model &amp;amp; Expectation Maximization
 
 &lt;a class="anchor" href="#gaussian-mixture-model--expectation-maximization">#&lt;/a>
 
&lt;/h1>
&lt;hr>
&lt;h2 id="reference">
 Reference
 
 &lt;a class="anchor" href="#reference">#&lt;/a>
 
&lt;/h2>
&lt;p>&lt;a href="https://www.geeksforgeeks.org/machine-learning/gaussian-mixture-model/">Gaussian Mixture model&lt;/a>&lt;/p>
&lt;p>&lt;a href="https://www.geeksforgeeks.org/machine-learning/ml-expectation-maximization-algorithm/">Expectation Maximization&lt;/a>&lt;/p>
&lt;hr>
&lt;p>&lt;a href="https://arshadhs.github.io/">Home&lt;/a> | &lt;a href="https://arshadhs.github.io/docs/ai/statistics/">
 Statistics
&lt;/a>&lt;/p></description></item></channel></rss>