
This series is for those who want to move beyond "vibe coding" and build real engineering craftsmanship. We are stripping away the high-level libraries to understand the mathematical soul of neural networks. No specialized background is required to start: if you remember basic school math, you're ready.

***

### The Agenda

🚀 **The Geometric Foundation.** We'll start by brushing up on Linear Algebra and Calculus, but from a purely geometric perspective. Instead of dry formulas, we'll visualize how matrices transform space and how derivatives track change.
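To make the geometric view concrete, here is a minimal pure-Python sketch (the function names are illustrative, not from the notebook): a 2×2 matrix moves every point in the plane, and a derivative is just the slope of a function, which we can estimate numerically.

```python
import math

def matvec(M, v):
    """Apply a 2x2 matrix (list of rows) to a 2-vector: a transformation of the plane."""
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

# A rotation matrix: here, 90 degrees counter-clockwise.
theta = math.pi / 2
rotation = [[math.cos(theta), -math.sin(theta)],
            [math.sin(theta),  math.cos(theta)]]

# The point (1, 0) is carried (up to floating-point error) onto (0, 1).
rotated = matvec(rotation, [1.0, 0.0])

def derivative(f, x, h=1e-6):
    """Estimate the slope of f at x with a central finite difference."""
    return (f(x + h) - f(x - h)) / (2 * h)

# The derivative of x^2 at x = 3 should be close to 6.
slope = derivative(lambda x: x * x, 3.0)
```

Every matrix in a neural network is doing exactly this kind of geometric work, just in many more dimensions.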
🧠 **Deep Learning Architecture.** We will define the structural blueprints of a model. You'll learn how to implement layers, activation functions, and loss calculations from first principles.
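As a sketch of those three building blocks (with made-up weights; the notebook's actual layer sizes and names will differ), here is a dense layer, a sigmoid activation, and a mean-squared-error loss in plain Python:

```python
import math

def dense(x, W, b):
    """One fully connected layer: each output is a weighted sum of inputs plus a bias."""
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) + b_i
            for row, b_i in zip(W, b)]

def sigmoid(z):
    """Squash each value into (0, 1)."""
    return [1.0 / (1.0 + math.exp(-z_i)) for z_i in z]

def mse(pred, target):
    """Mean squared error between predictions and targets."""
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)

# A tiny 2-input, 2-output layer with illustrative weights.
W = [[0.5, -0.2],
     [0.1,  0.4]]
b = [0.0, 0.1]
x = [1.0, 2.0]

out = sigmoid(dense(x, W, b))   # forward pass through one layer
loss = mse(out, [1.0, 0.0])     # how wrong were we?
```

Stacking several such layer-plus-activation pairs, and measuring the result with a loss, is the whole architecture.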
📉 **Implementing Backpropagation.** This is the heart of the machine. We will code the backward pass by hand, calculating each gradient ourselves to see exactly how a network "learns" from its mistakes.
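The idea can be shown on the smallest possible "network", y = w·x + b with squared-error loss (a sketch, not the notebook's actual model): the chain rule walks the error backward from the loss to each parameter, and a finite-difference estimate confirms the hand-derived gradient.

```python
def forward(w, b, x):
    return w * x + b

def loss(w, b, x, t):
    return (forward(w, b, x) - t) ** 2

w, b, x, t = 0.5, -0.1, 2.0, 1.0

# Forward pass.
y = forward(w, b, x)

# Backward pass: apply the chain rule from the loss inward.
dL_dy = 2.0 * (y - t)   # d/dy of (y - t)^2
dL_dw = dL_dy * x       # y = w*x + b, so dy/dw = x
dL_db = dL_dy * 1.0     # dy/db = 1

# Sanity check: the gradient should match a finite-difference estimate.
h = 1e-6
numeric_dw = (loss(w + h, b, x, t) - loss(w - h, b, x, t)) / (2 * h)
```

This check-against-finite-differences habit is how you convince yourself a hand-written backward pass is actually correct.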
⚙️ **The Optimization Loop.** We'll build the engine that updates the weights, implementing Stochastic Gradient Descent (SGD) to see how small, calculated adjustments lead to intelligent behavior.
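A minimal SGD loop looks like this (a sketch with an invented toy dataset, fitting the line y = 2x + 1): shuffle the data, compute the gradient for one sample, and nudge each parameter a small step against it.

```python
import random

random.seed(42)

# Toy dataset: noiseless samples of y = 2x + 1 on [-1, 1].
data = [(x / 10.0, 2.0 * (x / 10.0) + 1.0) for x in range(-10, 11)]

w, b = 0.0, 0.0
lr = 0.1  # learning rate: the size of each calculated adjustment

for epoch in range(200):
    random.shuffle(data)            # "stochastic": visit samples in random order
    for x, t in data:
        y = w * x + b               # forward pass
        dL_dy = 2.0 * (y - t)       # gradient of the squared error
        w -= lr * dL_dy * x         # step each parameter against its gradient
        b -= lr * dL_dy
```

After training, w and b sit very close to the true values 2 and 1; the same loop, with more parameters, is what trains a deep network.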

GitHub repo: https://github.com/arpanpathak/DataScienceNotebooks/blob/main/Build%20Neural%20Network%20from%20Scratch.ipynb

Related topics

Events in Seattle, WA
Deep Learning
Machine Learning
Natural Language Processing
