Details

Join us for another session of our study group as we dive into the fascinating world of Large Language Models (LLMs) using the book Build a Large Language Model from Scratch from Manning. In this session, we will continue our progress through Chapter 3: Coding Attention Mechanisms to deepen our understanding of this key component of most large language models. This is probably the most challenging chapter of the book, so reading it before the meeting will help.
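If you'd like a preview of what Chapter 3 builds toward, here is a minimal sketch of simple (unscaled) self-attention using NumPy rather than the book's PyTorch, with made-up toy embeddings for three tokens; see the book for the full, authoritative development:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy 3-dimensional embeddings for three tokens (illustrative values only)
inputs = np.array([
    [0.43, 0.15, 0.89],
    [0.55, 0.87, 0.66],
    [0.57, 0.85, 0.64],
])

# Attention scores: dot product between every pair of token embeddings
scores = inputs @ inputs.T

# Normalize each row into attention weights that sum to 1
weights = softmax(scores)

# Each context vector is a weighted sum of all the input vectors
context = weights @ inputs
print(context.shape)  # (3, 3): one context vector per token
```

The chapter then extends this idea with trainable query, key, and value projections, scaling, causal masking, and multiple heads.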

This isn't just a lecture! Come ready to ask questions, share insights, and code along. Whether you're a beginner or have some experience, this is the perfect opportunity to build a strong foundation together. If you plan to work with the code on your own laptop during the session, try to set up the environment ahead of time (https://github.com/rasbt/LLMs-from-scratch/tree/main/setup).

We will be meeting in the meeting room of the library which is on the first floor. When you enter through the front door, make a left and go down the hall.