<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Vector Spaces on Arshad Siddiqui</title><link>https://arshadhs.github.io/tags/vector-spaces/</link><description>Recent content in Vector Spaces on Arshad Siddiqui</description><generator>Hugo</generator><language>en-us</language><atom:link href="https://arshadhs.github.io/tags/vector-spaces/index.xml" rel="self" type="application/rss+xml"/><item><title>Basis and Rank</title><link>https://arshadhs.github.io/docs/ai/maths/010-linear-algebra/02-vector-spaces/020-basis-and-rank/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/maths/010-linear-algebra/02-vector-spaces/020-basis-and-rank/</guid><description>&lt;h1 id="basis-and-rank">
 Basis and Rank
 
 &lt;a class="anchor" href="#basis-and-rank">#&lt;/a>
 
&lt;/h1>
&lt;p>A &lt;strong>basis&lt;/strong> is a minimal set of linearly independent vectors that spans a space.&lt;/p>
&lt;p>The &lt;strong>dimension&lt;/strong> of a space is the number of vectors in a basis.&lt;/p>
&lt;blockquote class="book-hint info">
&lt;p>Key Idea:
Basis = independence + spanning.
Rank tells us how many independent directions exist in a matrix.&lt;/p>
&lt;/blockquote>
&lt;p>A basis must satisfy two conditions: ⭐&lt;/p>
&lt;ol>
&lt;li>Vectors must be linearly independent&lt;/li>
&lt;li>Vectors must span the space&lt;/li>
&lt;/ol>
&lt;p>This means:&lt;/p>
&lt;ul>
&lt;li>No redundancy (independence)&lt;/li>
&lt;li>Full coverage (spanning)&lt;/li>
&lt;/ul>

&lt;span>
 \[ 
\text{Span}(v_1, v_2, \dots, v_k) = V
 \]
 &lt;/span>



&lt;span>
 \[ 
c_1 v_1 + \cdots + c_k v_k = 0 \;\Rightarrow\; c_1 = \cdots = c_k = 0
 \]
 &lt;/span>
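&lt;p>The two conditions above can be checked numerically: stack the candidate vectors as columns of a matrix and compare its rank with the number of vectors. A minimal NumPy sketch (my own illustrative example, not from the notes):&lt;/p>

```python
import numpy as np

# Candidate basis vectors of R^3, stacked as matrix columns
V = np.column_stack([[1, 0, 0],
                     [1, 1, 0],
                     [1, 1, 1]])

# The vectors form a basis exactly when the rank equals both
# the number of vectors and the dimension of the space.
rank = np.linalg.matrix_rank(V)
print(rank)  # 3: independent and spanning, hence a basis of R^3
```

&lt;p>Replacing the third column with the sum of the first two drops the rank to 2, signalling redundancy.&lt;/p>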


&lt;hr>
&lt;h2 id="why-basis-matters">
 Why Basis Matters
 
 &lt;a class="anchor" href="#why-basis-matters">#&lt;/a>
 
&lt;/h2>
&lt;ul>
&lt;li>Represents space efficiently&lt;/li>
&lt;li>Removes redundancy&lt;/li>
&lt;li>Helps define coordinates&lt;/li>
&lt;li>Used in ML for feature representation&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h1 id="dimension">
 Dimension
 
 &lt;a class="anchor" href="#dimension">#&lt;/a>
 
&lt;/h1>
&lt;p>Dimension is the number of vectors in a basis.&lt;/p></description></item><item><title>Norm</title><link>https://arshadhs.github.io/docs/ai/maths/010-linear-algebra/02-vector-spaces/030-norm/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/maths/010-linear-algebra/02-vector-spaces/030-norm/</guid><description>&lt;h1 id="norm">
 Norm
 
 &lt;a class="anchor" href="#norm">#&lt;/a>
 
&lt;/h1>
&lt;p>A &lt;strong>norm&lt;/strong> measures the length (magnitude) of a vector.&lt;/p>
&lt;ul>
&lt;li>the norm of a vector x measures the distance from the origin to the point x.&lt;/li>
&lt;/ul>
&lt;p>Common example: Euclidean norm.&lt;/p>

&lt;span>
 \[ 
\lVert \mathbf{x} \rVert_2 = \sqrt{x_1^2 + \cdots + x_n^2}
 \]
 &lt;/span>


&lt;blockquote class="book-hint info">
&lt;p>Key Idea:
Norm = measure of size or length of a vector.
It generalises the idea of distance in geometry to higher dimensions.&lt;/p>
&lt;/blockquote>
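&lt;p>The Euclidean norm formula translates directly into plain Python (an illustrative sketch of my own):&lt;/p>

```python
import math

x = [3.0, 4.0]

# Euclidean (L2) norm: square root of the sum of squared components
norm2 = math.sqrt(sum(xi ** 2 for xi in x))
print(norm2)  # 5.0, the distance from the origin to the point (3, 4)
```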
&lt;hr>
&lt;h2 id="common-norms">
 Common norms
 
 &lt;a class="anchor" href="#common-norms">#&lt;/a>
 
&lt;/h2>
&lt;ul>
&lt;li>L1 norm (Manhattan)&lt;/li>
&lt;li>L2 norm (Euclidean)&lt;/li>
&lt;li>Infinity norm (maximum absolute component)&lt;/li>
&lt;/ul>
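&lt;p>NumPy computes all three through a single function; a sketch using &lt;code>np.linalg.norm&lt;/code> (my own example):&lt;/p>

```python
import numpy as np

x = np.array([3.0, -4.0])

l1   = np.linalg.norm(x, 1)       # L1: |3| + |-4| = 7
l2   = np.linalg.norm(x)          # L2 (default): sqrt(9 + 16) = 5
linf = np.linalg.norm(x, np.inf)  # Infinity norm: max(|3|, |4|) = 4
print(l1, l2, linf)  # 7.0 5.0 4.0
```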
&lt;h2 id="why-it-matters">
 Why it matters
 
 &lt;a class="anchor" href="#why-it-matters">#&lt;/a>
 
&lt;/h2>
&lt;ul>
&lt;li>norms quantify the size of a vector&lt;/li>
&lt;li>norms underpin distance measures and regularisation&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h1 id="intuition-from-lectures">
 Intuition (From Lectures)
 
 &lt;a class="anchor" href="#intuition-from-lectures">#&lt;/a>
 
&lt;/h1>
&lt;p>From lecture discussions on analytic geometry:&lt;/p></description></item><item><title>Lengths and Distances</title><link>https://arshadhs.github.io/docs/ai/maths/010-linear-algebra/02-vector-spaces/050-lengths-and-distances/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/maths/010-linear-algebra/02-vector-spaces/050-lengths-and-distances/</guid><description>&lt;h1 id="lengths-and-distances">
 Lengths and Distances
 
 &lt;a class="anchor" href="#lengths-and-distances">#&lt;/a>
 
&lt;/h1>
&lt;p>The &lt;strong>length&lt;/strong> of a vector is given by its norm.&lt;/p>
&lt;p>The &lt;strong>distance&lt;/strong> between two points (vectors) is the norm of their difference.&lt;/p>
&lt;p>Distance quantifies &lt;strong>how far two vectors (data points) are&lt;/strong> from each other.&lt;/p>

&lt;span>
 \[ 
d(\mathbf{x},\mathbf{y}) = \lVert \mathbf{x} - \mathbf{y} \rVert
 \]
 &lt;/span>
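&lt;p>The formula above, sketched in NumPy (illustrative example of my own):&lt;/p>

```python
import numpy as np

x = np.array([1.0, 2.0])
y = np.array([4.0, 6.0])

# Distance = norm applied to the difference of the two vectors
d = np.linalg.norm(x - y)
print(d)  # 5.0, since the difference is (-3, -4)
```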


&lt;blockquote class="book-hint info">
&lt;p>Key Idea:
Length measures size of a single vector.
Distance measures separation between two vectors.
Distance = norm applied to difference.&lt;/p></description></item><item><title>Angles and Orthogonality</title><link>https://arshadhs.github.io/docs/ai/maths/010-linear-algebra/02-vector-spaces/060-angles-and-orthogonality/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/maths/010-linear-algebra/02-vector-spaces/060-angles-and-orthogonality/</guid><description>&lt;h1 id="angles-and-orthogonality">
 Angles and Orthogonality
 
 &lt;a class="anchor" href="#angles-and-orthogonality">#&lt;/a>
 
&lt;/h1>
&lt;p>Once we define an inner product, we can define the &lt;strong>angle between two vectors&lt;/strong>.&lt;/p>
&lt;p>Angles allow us to measure how aligned or different two vectors are in space.&lt;/p>
&lt;blockquote class="book-hint info">
&lt;p>Key Idea:
Angle measures similarity between vectors.
Orthogonality (zero inner product) means the vectors share no direction at all: complete dissimilarity.&lt;/p>
&lt;/blockquote>
&lt;h2 id="why-it-matters-in-machine-learning">
 Why It Matters in Machine Learning
 
 &lt;a class="anchor" href="#why-it-matters-in-machine-learning">#&lt;/a>
 
&lt;/h2>
&lt;ul>
&lt;li>PCA produces orthogonal components&lt;/li>
&lt;li>Orthogonal features reduce redundancy&lt;/li>
&lt;li>Gradient directions depend on angle&lt;/li>
&lt;/ul>
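&lt;p>The inner product gives the angle directly; a small NumPy sketch (my own example, assuming the standard dot product):&lt;/p>

```python
import numpy as np

x = np.array([1.0, 0.0])
y = np.array([1.0, 1.0])
z = np.array([0.0, 1.0])

# cos(theta) = (x . y) / (norm(x) * norm(y))
cos_theta = x @ y / (np.linalg.norm(x) * np.linalg.norm(y))
angle = np.degrees(np.arccos(cos_theta))
print(round(angle))  # 45

# Orthogonality: the dot product of x and z is zero
print(x @ z)  # 0.0
```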
&lt;hr>
&lt;h1 id="angle-formula">
 Angle Formula
 
 &lt;a class="anchor" href="#angle-formula">#&lt;/a>
 
&lt;/h1>
&lt;p>For vectors in n-dimensional space:&lt;/p></description></item><item><title>Orthonormal Basis</title><link>https://arshadhs.github.io/docs/ai/maths/010-linear-algebra/02-vector-spaces/070-orthonormal-basis/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/maths/010-linear-algebra/02-vector-spaces/070-orthonormal-basis/</guid><description>&lt;h1 id="orthonormal-basis">
 Orthonormal Basis
 
 &lt;a class="anchor" href="#orthonormal-basis">#&lt;/a>
 
&lt;/h1>
&lt;p>A basis is &lt;strong>orthonormal&lt;/strong> if its vectors are:&lt;/p>
&lt;ul>
&lt;li>orthogonal to each other&lt;/li>
&lt;li>each has unit length&lt;/li>
&lt;/ul>

&lt;span>
 \[ 
\langle \mathbf{e}_i, \mathbf{e}_j \rangle =
\begin{cases}
1 &amp; i=j \\
0 &amp; i\ne j
\end{cases}
 \]
 &lt;/span>


&lt;blockquote class="book-hint info">
&lt;p>Key Idea:
Orthonormal basis = mutually orthogonal + unit length.
This makes computations extremely simple and stable.&lt;/p>
&lt;/blockquote>
&lt;p>Why it matters: orthonormal bases make projections and computations simple.&lt;/p>
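&lt;p>One way to see the simplicity: for a matrix Q whose columns are orthonormal, the product of Q-transpose and Q is the identity. A NumPy sketch (my own illustrative example):&lt;/p>

```python
import numpy as np

# Rotation by 45 degrees: its columns form an orthonormal basis of R^2
theta = np.pi / 4
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Columns are orthonormal exactly when Q.T @ Q is the identity
print(np.allclose(Q.T @ Q, np.eye(2)))  # True
```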
&lt;hr>
&lt;h1 id="intuition-from-lectures">
 Intuition (From Lectures)
 
 &lt;a class="anchor" href="#intuition-from-lectures">#&lt;/a>
 
&lt;/h1>
&lt;p>From analytic geometry and vector space lectures:&lt;/p></description></item></channel></rss>