Neural Network Coding Workshop


Details
Join us for our monthly hands-on coding workshop.
We're working through Andrej Karpathy's "Neural Networks: Zero to Hero" series to build a transformer architecture and reproduce GPT-2.
"We start with the basics of backpropagation and build up to modern deep neural networks, like GPT. Language models are an excellent place to learn deep learning, even if you intend to eventually go to other areas like computer vision, because most of what you learn will be immediately transferable. This is why we dive into and focus on language models."
This Month's Focus
We'll be covering the first two projects in the series:
- Building micrograd: A tiny autograd engine to implement backpropagation.
- Starting makemore: An introduction to language modeling with a character-level approach.
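To give a feel for what building micrograd involves, here is a minimal sketch of the core idea behind a scalar autograd engine (an illustrative simplification, not Karpathy's actual code): each value records its inputs and a local backward rule, and backward() applies the chain rule over the graph in reverse.

```python
# Minimal sketch of a micrograd-style autograd engine (illustrative only):
# each Value tracks its children and a local gradient rule; backward()
# topologically sorts the graph and applies the chain rule in reverse.

class Value:
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._prev = set(_children)
        self._backward = lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad       # d(a+b)/da = 1
            other.grad += out.grad      # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then run local rules in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for child in v._prev:
                    visit(child)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

a, b = Value(2.0), Value(3.0)
c = a * b + a          # c = 8.0
c.backward()
print(a.grad, b.grad)  # dc/da = b + 1 = 4.0, dc/db = a = 2.0
```

The real micrograd adds more operations (subtraction, powers, tanh, etc.), but the structure above is the whole trick: backpropagation is just the chain rule applied over a recorded computation graph.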
Preparation
Before the workshop, please:
- Watch the first three lessons of the "Neural Networks: Zero to Hero" series.
- Try to code up "micrograd" on your own.
- Start working on "makemore" as time allows.
Prerequisites
- Solid programming skills in Python
- Basic understanding of calculus (e.g., derivatives)
Everyone is welcome!
Don't hesitate to reach out if you have any questions.