<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>ResNet on Arshad Siddiqui</title><link>https://arshadhs.github.io/tags/resnet/</link><description>Recent content in ResNet on Arshad Siddiqui</description><generator>Hugo</generator><language>en-us</language><lastBuildDate>Sun, 19 Apr 2026 00:00:00 +0000</lastBuildDate><atom:link href="https://arshadhs.github.io/tags/resnet/index.xml" rel="self" type="application/rss+xml"/><item><title>Deep CNN Architectures</title><link>https://arshadhs.github.io/docs/ai/deep-learning/065-deep-cnn-architectures/</link><pubDate>Sun, 19 Apr 2026 00:00:00 +0000</pubDate><guid>https://arshadhs.github.io/docs/ai/deep-learning/065-deep-cnn-architectures/</guid><description>&lt;h1 id="deep-cnn-architectures">
 Deep CNN Architectures
&lt;/h1>
&lt;p>Once the basic ideas of convolution, pooling, channels, and classifier heads are understood, the next step is to study how successful CNN architectures are designed in practice. The history of deep CNNs is not just a list of famous models. It is a progression of design ideas: smaller filters, more depth, better optimisation, bottlenecks, multi-scale processing, residual connections, and transfer learning.&lt;/p>
&lt;blockquote class="book-hint info">
&lt;p>&lt;strong>Key takeaway:&lt;/strong>&lt;br>
Deep CNN architectures evolved by solving specific problems one by one: &lt;strong>LeNet&lt;/strong> established the template, &lt;strong>AlexNet&lt;/strong> proved deep learning could dominate large-scale vision, &lt;strong>VGG&lt;/strong> simplified the design, &lt;strong>NiN&lt;/strong> introduced the powerful &lt;code>1 × 1&lt;/code> convolution, &lt;strong>GoogLeNet&lt;/strong> made multi-scale processing efficient, and &lt;strong>ResNet&lt;/strong> solved the optimisation problem of very deep networks with residual connections, sketched below.&lt;/p>
&lt;/blockquote>
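&lt;p>To make the residual connection concrete, here is a minimal sketch of a residual block. It assumes PyTorch; the &lt;code>ResidualBlock&lt;/code> class, channel count, and layer choices are illustrative, not taken from the original ResNet code.&lt;/p>
&lt;pre>&lt;code class="language-python">import torch
from torch import nn

class ResidualBlock(nn.Module):
    """Minimal residual block: output = relu(F(x) + x)."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # Skip connection: gradients flow unchanged through the identity
        # path, which is what keeps very deep stacks trainable.
        return torch.relu(out + x)

x = torch.randn(1, 64, 56, 56)     # one 64-channel feature map
print(ResidualBlock(64)(x).shape)  # torch.Size([1, 64, 56, 56])
&lt;/code>&lt;/pre></description></item></channel></rss>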