<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Vector Calculus on Arshad Siddiqui</title><link>https://arshadhs.github.io/tags/vector-calculus/</link><description>Recent content in Vector Calculus on Arshad Siddiqui</description><generator>Hugo</generator><language>en-us</language><atom:link href="https://arshadhs.github.io/tags/vector-calculus/index.xml" rel="self" type="application/rss+xml"/><item><title>Differentiation of Univariate Functions</title><link>https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/010-univariate-differentiation/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/010-univariate-differentiation/</guid><description>&lt;h1 id="differentiation-of-univariate-functions">
 Differentiation of Univariate Functions
 
 &lt;a class="anchor" href="#differentiation-of-univariate-functions">#&lt;/a>
 
&lt;/h1>
&lt;p>Differentiation measures the rate of change of a function.&lt;/p>
&lt;p>For a function f(x), the derivative is defined as the limit of the difference quotient:&lt;/p>
&lt;span style="color: red;">
 [
f'(x) = \lim_{h \to 0} \frac{f(x+h)-f(x)}{h}
]
&lt;/span>
&lt;p>Interpretation:&lt;/p>
&lt;ul>
&lt;li>Slope of the tangent line&lt;/li>
&lt;li>Instantaneous rate of change&lt;/li>
&lt;/ul>
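&lt;p>The limit can be approximated numerically with a small step h. A minimal sketch (the helper &lt;code>derivative&lt;/code>, the test function, and the step size are illustrative assumptions, not from the text):&lt;/p>

```python
# Numerical derivative via a central difference quotient.
# The step size h = 1e-6 is an arbitrary illustrative choice.
def derivative(f, x, h=1e-6):
    # Central difference is more accurate than the one-sided quotient.
    return (f(x + h) - f(x - h)) / (2 * h)

# Example: d/dx of x^2 at x = 3 should be close to 6.
slope = derivative(lambda x: x * x, 3.0)
print(slope)
```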
&lt;hr>
&lt;p>&lt;a href="https://arshadhs.github.io/">Home&lt;/a> | &lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/">
 Vector Calculus
&lt;/a>&lt;/p></description></item><item><title>Partial Differentiation and Gradients</title><link>https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/020-partial-derivatives-and-gradients/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/020-partial-derivatives-and-gradients/</guid><description>&lt;h1 id="partial-differentiation-and-gradients">
 Partial Differentiation and Gradients
 
 &lt;a class="anchor" href="#partial-differentiation-and-gradients">#&lt;/a>
 
&lt;/h1>
&lt;p>For a multivariate function f(x1, x2, &amp;hellip;, xn), the partial derivative with respect to x_i holds all other variables fixed:&lt;/p>
&lt;span style="color: red;">
 [
\frac{\partial f}{\partial x_i}
]
&lt;/span>
&lt;p>Collecting all partial derivatives gives the gradient vector:&lt;/p>
&lt;span style="color: red;">
 [
\nabla f =
\begin{bmatrix}
\frac{\partial f}{\partial x_1} \\
\vdots \\
\frac{\partial f}{\partial x_n}
\end{bmatrix}
]
&lt;/span>
&lt;p>The gradient points in the direction of steepest ascent of f.&lt;/p>
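&lt;p>The gradient can be approximated component-wise with partial difference quotients. A sketch (the helper name, the sample function, and the step size are illustrative assumptions):&lt;/p>

```python
# Component-wise numerical gradient via central partial difference quotients.
def numerical_gradient(f, x, h=1e-6):
    grad = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h   # nudge component i up
        xm = list(x); xm[i] -= h   # nudge component i down
        grad.append((f(xp) - f(xm)) / (2 * h))
    return grad

# f(x1, x2) = x1^2 + 3*x2 has gradient [2*x1, 3].
g = numerical_gradient(lambda v: v[0] ** 2 + 3 * v[1], [2.0, 5.0])
print(g)
```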


&lt;script src="https://arshadhs.github.io/mermaid.min.js">&lt;/script>

 &lt;script>mermaid.initialize({
 "flowchart": {
 "useMaxWidth":true
 },
 "theme": "default"
}
)&lt;/script>




&lt;pre class="mermaid">
flowchart LR
 Input --&amp;gt; Function
 Function --&amp;gt; Gradient
 Gradient --&amp;gt; Optimisation
&lt;/pre>

&lt;hr>
&lt;p>&lt;a href="https://arshadhs.github.io/">Home&lt;/a> | &lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/">
 Vector Calculus
&lt;/a>&lt;/p></description></item><item><title>Gradients of Vector-Valued and Matrix Functions</title><link>https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/030-vector-and-matrix-gradients/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/030-vector-and-matrix-gradients/</guid><description>&lt;h1 id="gradients-of-vector-valued-and-matrix-functions">
 Gradients of Vector-Valued and Matrix Functions
 
 &lt;a class="anchor" href="#gradients-of-vector-valued-and-matrix-functions">#&lt;/a>
 
&lt;/h1>
&lt;p>Covers gradients when outputs or parameters are vectors/matrices.&lt;/p>
&lt;p>If f: R^n -&amp;gt; R^m, the derivative is the m &amp;times; n Jacobian matrix:&lt;/p>
&lt;span style="color: red;">
 [
J =
\begin{bmatrix}
\frac{\partial f_1}{\partial x_1} &amp;amp; \dots &amp;amp; \frac{\partial f_1}{\partial x_n} \\
\vdots &amp;amp; \ddots &amp;amp; \vdots \\
\frac{\partial f_m}{\partial x_1} &amp;amp; \dots &amp;amp; \frac{\partial f_m}{\partial x_n}
\end{bmatrix}
]
&lt;/span>
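&lt;p>The Jacobian can be built numerically one column at a time, nudging each input in turn. A sketch under assumed names and test values:&lt;/p>

```python
# Numerical Jacobian of f: R^n -> R^m, one column per input component.
# f and the test point are illustrative assumptions.
def numerical_jacobian(f, x, h=1e-6):
    fx = f(x)
    m, n = len(fx), len(x)
    J = [[0.0] * n for _ in range(m)]
    for j in range(n):
        xp = list(x); xp[j] += h
        fp = f(xp)
        for i in range(m):
            # Forward difference quotient for entry (i, j).
            J[i][j] = (fp[i] - fx[i]) / h
    return J

# f(x1, x2) = (x1 * x2, x1 + x2) has Jacobian [[x2, x1], [1, 1]].
J = numerical_jacobian(lambda v: [v[0] * v[1], v[0] + v[1]], [2.0, 3.0])
print(J)
```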
&lt;p>For a scalar-valued f(x), the matrix of second partial derivatives is the Hessian:&lt;/p>
&lt;span style="color: red;">
 [
H = \nabla^2 f
]
&lt;/span>
&lt;p>Hessian captures curvature.&lt;/p></description></item><item><title>Useful Gradient Identities</title><link>https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/050-gradient-identities/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/050-gradient-identities/</guid><description>&lt;h1 id="useful-gradient-identities">
 Useful Gradient Identities
 
 &lt;a class="anchor" href="#useful-gradient-identities">#&lt;/a>
 
&lt;/h1>
&lt;span style="color: red;">
 [
\nabla (a^T x) = a
]
&lt;/span>
&lt;span style="color: red;">
 [
\nabla (x^T A x) = (A + A^T)x
]
&lt;/span>
&lt;p>If A is symmetric:&lt;/p>
&lt;span style="color: red;">
 [
\nabla (x^T A x) = 2Ax
]
&lt;/span>
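&lt;p>The quadratic-form identity can be checked numerically against a difference quotient. A sketch; the matrix A and point x below are arbitrary illustrative choices:&lt;/p>

```python
# Check grad(x^T A x) = (A + A^T) x numerically on a small example.
def quad(A, x):
    # x^T A x written as a double sum.
    n = len(x)
    return sum(x[i] * A[i][j] * x[j] for i in range(n) for j in range(n))

A = [[1.0, 2.0], [0.0, 3.0]]   # deliberately non-symmetric
x = [1.5, -0.5]
h = 1e-6

# Numerical gradient of the quadratic form.
num = []
for i in range(len(x)):
    xp = list(x); xp[i] += h
    xm = list(x); xm[i] -= h
    num.append((quad(A, xp) - quad(A, xm)) / (2 * h))

# Analytic gradient (A + A^T) x.
n = len(x)
ana = [sum((A[i][j] + A[j][i]) * x[j] for j in range(n)) for i in range(n)]
print(num, ana)
```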
&lt;p>These are heavily used in &lt;strong>optimisation&lt;/strong>.&lt;/p>
&lt;hr>
&lt;p>&lt;a href="https://arshadhs.github.io/">Home&lt;/a> | &lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/">
 Vector Calculus
&lt;/a>&lt;/p></description></item><item><title>Backpropagation and Automatic Differentiation</title><link>https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/060-backpropagation/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/060-backpropagation/</guid><description>&lt;h1 id="backpropagation-and-automatic-differentiation">
 Backpropagation and Automatic Differentiation
 
 &lt;a class="anchor" href="#backpropagation-and-automatic-differentiation">#&lt;/a>
 
&lt;/h1>
&lt;p>Backpropagation applies the chain rule:&lt;/p>
&lt;ul>
&lt;li>efficiently, reusing intermediate values across a computational graph;&lt;/li>
&lt;li>repeatedly, once for each node from the loss back to the inputs.&lt;/li>
&lt;/ul>
&lt;p>Chain rule:&lt;/p>
&lt;span style="color: red;">
 [
\frac{dL}{dx} = \frac{dL}{dy} \cdot \frac{dy}{dx}
]
&lt;/span>


&lt;script src="https://arshadhs.github.io/mermaid.min.js">&lt;/script>

 &lt;script>mermaid.initialize({
 "flowchart": {
 "useMaxWidth":true
 },
 "theme": "default"
}
)&lt;/script>




&lt;pre class="mermaid">
flowchart LR
 x --&amp;gt; y
 y --&amp;gt; L
&lt;/pre>

&lt;p>Automatic differentiation computes exact derivatives efficiently using computational graphs.&lt;/p>
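&lt;p>A minimal chain-rule pass can be written by hand: run forward, record intermediates, then multiply local derivatives backwards. The concrete functions below (y = x&amp;sup2;, L = sin y) are illustrative assumptions:&lt;/p>

```python
import math

x = 2.0
# Forward pass: record intermediate values.
y = x ** 2          # y = x^2
L = math.sin(y)     # L = sin(y)

# Backward pass: dL/dx = dL/dy * dy/dx.
dL_dy = math.cos(y)   # local derivative of sin(y)
dy_dx = 2 * x         # local derivative of x^2
dL_dx = dL_dy * dy_dx

print(dL_dx)  # equals cos(4) * 4
```

&lt;p>Reverse-mode automatic differentiation mechanises exactly this bookkeeping over a whole graph.&lt;/p>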
&lt;hr>
&lt;p>&lt;a href="https://arshadhs.github.io/">Home&lt;/a> | &lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/">
 Vector Calculus
&lt;/a>&lt;/p></description></item><item><title>Higher-order derivatives</title><link>https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/070-higher-order-derivatives/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/070-higher-order-derivatives/</guid><description>&lt;h1 id="higher-order-derivatives">
 Higher-order derivatives
 
 &lt;a class="anchor" href="#higher-order-derivatives">#&lt;/a>
 
&lt;/h1>
&lt;hr>
&lt;p>&lt;a href="https://arshadhs.github.io/">Home&lt;/a> | &lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/">
 Vector Calculus
&lt;/a>&lt;/p></description></item><item><title>Taylor’s series</title><link>https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/080-taylors-series/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/080-taylors-series/</guid><description>&lt;h1 id="linearization-and-multivariate-taylors-series">
 Linearization and multivariate Taylor’s series
 
 &lt;a class="anchor" href="#linearization-and-multivariate-taylors-series">#&lt;/a>
 
&lt;/h1>
&lt;hr>
&lt;p>&lt;a href="https://arshadhs.github.io/">Home&lt;/a> | &lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/">
 Vector Calculus
&lt;/a>&lt;/p></description></item><item><title>Maxima and Minima</title><link>https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/090-maxima-and-minima/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/090-maxima-and-minima/</guid><description>&lt;h1 id="computing-maxima-and-minima-for-unconstrained-optimization">
 Computing maxima and minima for unconstrained optimization
 
 &lt;a class="anchor" href="#computing-maxima-and-minima-for-unconstrained-optimization">#&lt;/a>
 
&lt;/h1>
&lt;hr>
&lt;p>&lt;a href="https://arshadhs.github.io/">Home&lt;/a> | &lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/">
 Vector Calculus
&lt;/a>&lt;/p></description></item><item><title>Vector Calculus</title><link>https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/</guid><description>&lt;h1 id="vector-calculus">
 Vector Calculus
 
 &lt;a class="anchor" href="#vector-calculus">#&lt;/a>
 
&lt;/h1>
&lt;p>Vector calculus extends differentiation to multivariate and vector-valued functions.&lt;/p>
&lt;p>Gradients power learning. This section builds differentiation skills needed for backpropagation.&lt;/p>
&lt;hr>




&lt;ul>
 
 
 
 
 &lt;li>
 &lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/010-univariate-differentiation/">Differentiation of Univariate Functions&lt;/a>
 &lt;/li>
 
 
 
 
 &lt;li>
 &lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/020-partial-derivatives-and-gradients/">Partial Differentiation and Gradients&lt;/a>
 &lt;/li>
 
 
 
 
 &lt;li>
 &lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/030-vector-and-matrix-gradients/">Gradients of Vector-Valued and Matrix Functions&lt;/a>
 &lt;/li>
 
 
 
 
 &lt;li>
 &lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/050-gradient-identities/">Useful Gradient Identities&lt;/a>
 &lt;/li>
 
 
 
 
 &lt;li>
 &lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/060-backpropagation/">Backpropagation and Automatic Differentiation&lt;/a>
 &lt;/li>
 
 
 
 
 &lt;li>
 &lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/070-higher-order-derivatives/">Higher-order derivatives&lt;/a>
 &lt;/li>
 
 
 
 
 &lt;li>
 &lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/080-taylors-series/">Taylor’s series&lt;/a>
 &lt;/li>
 
 
 
 
 &lt;li>
 &lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/090-maxima-and-minima/">Maxima and Minima&lt;/a>
 &lt;/li>
 
 

 
 
&lt;/ul>


&lt;hr>


&lt;script src="https://arshadhs.github.io/mermaid.min.js">&lt;/script>

 &lt;script>mermaid.initialize({
 "flowchart": {
 "useMaxWidth":true
 },
 "theme": "default"
}
)&lt;/script>




&lt;pre class="mermaid">
flowchart TD

 %% Core Node
 PD[&amp;#34;Partial Derivatives&amp;#34;]

 %% Supporting Concepts
 DQ[&amp;#34;Difference Quotient&amp;#34;]
 JH[&amp;#34;Jacobian / Hessian&amp;#34;]
 TS[&amp;#34;Taylor Series&amp;#34;]

 %% Application Chapters
 CH6[&amp;#34;&amp;lt;br/&amp;gt;Probability&amp;#34;]
 CH7[&amp;#34;&amp;lt;br/&amp;gt;Optimization&amp;#34;]
 CH9[&amp;#34;&amp;lt;br/&amp;gt;Regression&amp;#34;]
 CH10[&amp;#34;&amp;lt;br/&amp;gt;Dimensionality Reduction&amp;#34;]
 CH11[&amp;#34;&amp;lt;br/&amp;gt;Density Estimation&amp;#34;]
 CH12[&amp;#34;&amp;lt;br/&amp;gt;Classification&amp;#34;]

 %% Relationships
 DQ --&amp;gt;|defines| PD
 PD --&amp;gt;|collected in| JH
 JH --&amp;gt;|used in| TS
 JH --&amp;gt;|used in| CH6
	
 PD --&amp;gt;|used in| CH7
 PD --&amp;gt;|used in| CH9
 PD --&amp;gt;|used in| CH10
 PD --&amp;gt;|used in| CH11
 PD --&amp;gt;|used in| CH12

 %% Styling (Your Soft Academic Palette)
 style PD fill:#90CAF9,stroke:#1E88E5,color:#000

 style DQ fill:#CE93D8,stroke:#8E24AA,color:#000
 style JH fill:#CE93D8,stroke:#8E24AA,color:#000
 style TS fill:#CE93D8,stroke:#8E24AA,color:#000
 style CH6 fill:#CE93D8,stroke:#8E24AA,color:#000
	
 style CH7 fill:#C8E6C9,stroke:#2E7D32,color:#000
 style CH9 fill:#C8E6C9,stroke:#2E7D32,color:#000
 style CH10 fill:#C8E6C9,stroke:#2E7D32,color:#000
 style CH11 fill:#C8E6C9,stroke:#2E7D32,color:#000
 style CH12 fill:#C8E6C9,stroke:#2E7D32,color:#000

&lt;/pre>

&lt;hr>
&lt;p>&lt;a href="https://arshadhs.github.io/">Home&lt;/a> | &lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/">
 Calculus
&lt;/a>&lt;/p></description></item></channel></rss>