<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>SVD on Arshad Siddiqui</title><link>https://arshadhs.github.io/tags/svd/</link><description>Recent content in SVD on Arshad Siddiqui</description><generator>Hugo</generator><language>en-us</language><lastBuildDate>Wed, 18 Mar 2026 00:00:00 +0000</lastBuildDate><atom:link href="https://arshadhs.github.io/tags/svd/index.xml" rel="self" type="application/rss+xml"/><item><title>Singular Value Decomposition (SVD)</title><link>https://arshadhs.github.io/docs/ai/maths/010-linear-algebra/03-matrix-decomposition/050-singular-value-decomposition/</link><pubDate>Wed, 18 Mar 2026 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/maths/010-linear-algebra/03-matrix-decomposition/050-singular-value-decomposition/</guid><description>&lt;h1 id="singular-value-decomposition-svd">
 Singular Value Decomposition (SVD)
 
 &lt;a class="anchor" href="#singular-value-decomposition-svd">#&lt;/a>
 
&lt;/h1>
&lt;p>Singular Value Decomposition (SVD) is one of the most important matrix decomposition techniques in linear algebra and machine learning.&lt;/p>
&lt;p>It factorises any matrix into three simpler matrices that reveal its structure.&lt;/p>
&lt;blockquote class="book-hint info">
&lt;p>Key Idea:
SVD decomposes a matrix into a rotation, a scaling, and another rotation.
It tells us how a transformation stretches data along orthogonal directions.&lt;/p>
&lt;/blockquote>
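The factorisation A = U Σ Vᵀ (orthogonal U and V, diagonal Σ of singular values) can be verified numerically. A minimal sketch with NumPy's &lt;code>np.linalg.svd&lt;/code>; the matrix values here are purely illustrative:

```python
import numpy as np

# Illustrative 3x2 matrix (values chosen arbitrarily for this sketch).
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [1.0, 1.0]])

# Thin SVD: U is 3x2, S holds the singular values in descending order,
# Vt is 2x2. U and Vt have orthonormal columns/rows.
U, S, Vt = np.linalg.svd(A, full_matrices=False)

# Reconstructing U @ diag(S) @ Vt recovers A, confirming the factorisation.
print(np.allclose(A, U @ np.diag(S) @ Vt))  # True
```

With &lt;code>full_matrices=False&lt;/code> the "thin" form is returned, which is cheaper for tall or wide matrices and sufficient for reconstruction.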
&lt;hr>
&lt;h1 id="definition">
 Definition
 
 &lt;a class="anchor" href="#definition">#&lt;/a>
 
&lt;/h1>
&lt;p>For any real matrix

&lt;span style="color: green;">
 &lt;span>
 \[ 
A \in \mathbb{R}^{m \times n},
 \]
 &lt;/span>

&lt;/span>

the singular value decomposition factorises it as

&lt;span style="color: green;">
 &lt;span>
 \[ 
A = U \Sigma V^{T},
 \]
 &lt;/span>

&lt;/span>

where U is an m&#215;m orthogonal matrix, &#931; is an m&#215;n diagonal matrix of non-negative singular values, and V is an n&#215;n orthogonal matrix.&lt;/p></description></item></channel></rss>