<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Derivatives on Arshad Siddiqui</title><link>https://arshadhs.github.io/tags/derivatives/</link><description>Recent content in Derivatives on Arshad Siddiqui</description><generator>Hugo</generator><language>en-us</language><atom:link href="https://arshadhs.github.io/tags/derivatives/index.xml" rel="self" type="application/rss+xml"/><item><title>Calculus</title><link>https://arshadhs.github.io/docs/ai/maths/020-calculus/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/maths/020-calculus/</guid><description>&lt;h1 id="calculus">
 Calculus
 
 &lt;a class="anchor" href="#calculus">#&lt;/a>
 
&lt;/h1>
&lt;p>Calculus is:&lt;/p>
&lt;ul>
&lt;li>the mathematical framework for understanding and controlling how quantities change&lt;/li>
&lt;li>the mathematics of &lt;strong>change&lt;/strong> and &lt;strong>accumulation&lt;/strong>&lt;/li>
&lt;/ul>
&lt;p>It helps answer:&lt;/p>
&lt;ul>
&lt;li>What happens when inputs change &lt;strong>slightly&lt;/strong>?&lt;/li>
&lt;li>Where is something &lt;strong>maximum or minimum&lt;/strong>?&lt;/li>
&lt;/ul>
&lt;p>Above all, it answers two big questions (sketched in code below):&lt;/p>
&lt;ul>
&lt;li>&lt;strong>How fast is something changing right now?&lt;/strong> → derivatives (differentiation)&lt;/li>
&lt;li>&lt;strong>How much has accumulated over an interval?&lt;/strong> → integrals (integration)&lt;/li>
&lt;/ul>
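&lt;p>A minimal numerical sketch of those two questions, assuming Python and illustrative helper names (&lt;code>derivative&lt;/code> and &lt;code>integral&lt;/code> are not from this site): a central difference approximates the rate of change of f(x) = x² at a point, and a midpoint Riemann sum approximates its accumulation over an interval.&lt;/p>
&lt;pre>&lt;code class="language-python">def f(x):
    return x * x

def derivative(f, x, h=1e-5):
    # rate of change right now: central difference (f(x+h) - f(x-h)) / (2h)
    return (f(x + h) - f(x - h)) / (2 * h)

def integral(f, a, b, n=100_000):
    # accumulation over [a, b]: n thin midpoint rectangles of width (b - a) / n
    w = (b - a) / n
    return sum(f(a + (i + 0.5) * w) for i in range(n)) * w

print(derivative(f, 3.0))      # about 6.0, since d/dx x**2 = 2x
print(integral(f, 0.0, 1.0))   # about 0.3333, the area under x**2 on [0, 1]
&lt;/code>&lt;/pre>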
&lt;hr>


&lt;pre class="mermaid">
flowchart TD
 A[Calculus] --&amp;gt; B[Limits]
 B --&amp;gt; C[Continuity]
 B --&amp;gt; D[Derivatives]
 B --&amp;gt; E[Integrals]
 D --&amp;gt; F[Optimisation: maxima/minima]
 D --&amp;gt; G[ML: gradients &amp;amp; learning]
 E --&amp;gt; H[Accumulation: area/total change]
&lt;/pre>
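&lt;p>The &lt;em>ML: gradients &amp;amp; learning&lt;/em> branch above is where derivatives do the work: learning amounts to repeatedly stepping a parameter against its gradient. A minimal sketch under an assumed one-parameter loss, loss(w) = (w - 4)², with illustrative names &lt;code>grad&lt;/code> and &lt;code>lr&lt;/code>:&lt;/p>
&lt;pre>&lt;code class="language-python"># d(loss)/dw = 2 * (w - 4) points uphill, so stepping against it
# walks w toward the minimiser at w = 4.
def grad(w):
    return 2 * (w - 4)

w = 0.0     # initial parameter
lr = 0.1    # learning rate (step size)
for _ in range(50):
    w = w - lr * grad(w)

print(w)    # close to 4.0
&lt;/code>&lt;/pre>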

&lt;hr>




&lt;ul>
 &lt;li>
 &lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/">Vector Calculus&lt;/a>
 &lt;ul>
 &lt;li>&lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/010-univariate-differentiation/">Differentiation of Univariate Functions&lt;/a>&lt;/li>
 &lt;li>&lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/020-partial-derivatives-and-gradients/">Partial Differentiation and Gradients&lt;/a>&lt;/li>
 &lt;li>&lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/030-vector-and-matrix-gradients/">Gradients of Vector-Valued and Matrix Functions&lt;/a>&lt;/li>
 &lt;li>&lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/050-gradient-identities/">Useful Gradient Identities&lt;/a>&lt;/li>
 &lt;li>&lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/060-backpropagation/">Backpropagation and Automatic Differentiation&lt;/a>&lt;/li>
 &lt;li>&lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/070-higher-order-derivatives/">Higher-order derivatives&lt;/a>&lt;/li>
 &lt;li>&lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/080-taylors-series/">Taylor’s series&lt;/a>&lt;/li>
 &lt;li>&lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/04-vector-calculus/090-maxima-and-minima/">Maxima and Minima&lt;/a>&lt;/li>
 &lt;/ul>
 &lt;/li>
 &lt;li>
 &lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/05-optimisation/">Continuous Optimisation&lt;/a>
 &lt;ul>
 &lt;li>&lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/05-optimisation/gradient-descent/">Optimisation using Gradient Descent&lt;/a>&lt;/li>
 &lt;li>&lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/05-optimisation/constrained-optimisation/">Constrained Optimisation&lt;/a>&lt;/li>
 &lt;li>&lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/05-optimisation/lagrange-multipliers/">Lagrange Multipliers&lt;/a>&lt;/li>
 &lt;li>&lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/05-optimisation/convex-optimisation/">Convex Optimisation&lt;/a>&lt;/li>
 &lt;/ul>
 &lt;/li>
 &lt;li>
 &lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/06-nonlinear-optimisation/">Nonlinear Optimisation&lt;/a>
 &lt;ul>
 &lt;li>&lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/06-nonlinear-optimisation/optimisation-challenges/">Challenges in Gradient-Based Optimisation&lt;/a>&lt;/li>
 &lt;li>&lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/06-nonlinear-optimisation/stochastic-gradient-descent/">Stochastic Gradient Descent (SGD)&lt;/a>&lt;/li>
 &lt;li>&lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/06-nonlinear-optimisation/momentum-methods/">Momentum-Based Learning&lt;/a>&lt;/li>
 &lt;li>&lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/06-nonlinear-optimisation/adaptive-methods/">Adaptive Methods: AdaGrad, RMSProp, Adam&lt;/a>&lt;/li>
 &lt;li>&lt;a href="https://arshadhs.github.io/docs/ai/maths/020-calculus/06-nonlinear-optimisation/hyperparameter-tuning/">Tuning Hyperparameters and Preprocessing&lt;/a>&lt;/li>
 &lt;/ul>
 &lt;/li>
&lt;/ul>


&lt;hr>
&lt;div class="book-steps ">
&lt;ol>
&lt;li>
&lt;h2 id="differential-calculus-rates-of-change">
 Differential Calculus (Rates of Change)
 
 &lt;a class="anchor" href="#differential-calculus-rates-of-change">#&lt;/a>
 
&lt;/h2>
&lt;p>Studies &lt;strong>how things change&lt;/strong>: the derivative gives a function’s instantaneous rate of change at a point.&lt;/p>
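&lt;p>As a quick symbolic check (an illustrative aside using SymPy, not code from this site), the derivative of x² is 2x and its antiderivative is x³/3:&lt;/p>
&lt;pre>&lt;code class="language-python">import sympy

x = sympy.symbols("x")
print(sympy.diff(x**2, x))        # 2*x
print(sympy.integrate(x**2, x))   # x**3/3
&lt;/code>&lt;/pre></description></item></channel></rss>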