[Course] Mastering Transformers in 4 Weeks (LLM course #1)

![[Course] Mastering Transformers in 4 Weeks (LLM course #1)](https://secure.meetupstatic.com/photos/event/2/7/c/9/highres_516070185.webp?w=750)
Details
This is a paid event. Registration is through Eventbrite (a Meetup RSVP does not count as registration). Please register here: [https://transformersCourse.eventbrite.com](https://transformerscourse.eventbrite.com/)
Transformers have become the cornerstone of LLMs. By the end of this course, you will have a complete understanding of the transformer: its internal architecture and the process of building and training one. You will gain confidence in all the major concepts behind today’s LLMs and how to implement them.
Topics include (see details at the end):
- Neural network and deep learning fundamentals
- Word embedding
- Attention layer and layer normalization
- Building encoders and decoders
- Building a transformer step by step
This course is for those with little background in deep learning who want to go deeper into transformers. It gives you real-time interaction with the instructor and fellow students, and you will be part of a community where you can get help, learn, and grow together.
What you get from this workshop:
- 4 weeks of live classes with real-time interaction with the instructor
- 4 weekly Q&A sessions (separate from class time) to get all your questions answered
- 4 take-home Python notebooks that you can run and learn from
- A project group where you can learn and practice
- Membership in the community of AI builders
- A certificate upon completing the course
The class starts on October 29th and ends on November 23rd.
The live class meets on Sundays from 2 to 4 pm, and the live Q&A session is on Wednesdays from 7 to 8 pm. You can reach out to the instructor and other students at any time during the week.
Schedule details:
Week 1: Introduction to neural networks and deep learning.
(1) Introduction to neural network basics: activation functions, backpropagation, and optimization algorithms such as Adam and AdamW.
(2) Deep learning fundamentals, including dropout, layer normalization, residual networks, and GPU computing. Hands-on exercise in building a deep neural network.
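To give a flavor of that hands-on exercise, here is a minimal sketch (illustrative only, not the course notebook; the toy data, layer sizes, and learning rate are assumptions) of a tiny two-layer network trained with plain gradient descent in NumPy:

```python
# Illustrative sketch: a tiny two-layer network trained with plain gradient
# descent on a toy regression task, using only NumPy.
import numpy as np

rng = np.random.default_rng(0)

# Toy data (an assumption): learn y = sin(x) on 100 points.
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X)

# Parameters of a 1 -> 16 -> 1 network.
W1 = rng.normal(0, 0.5, size=(1, 16))
b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, size=(16, 1))
b2 = np.zeros(1)

lr = 0.05
for step in range(2000):
    # Forward pass with ReLU activation.
    h_pre = X @ W1 + b1           # (100, 16)
    h = np.maximum(h_pre, 0.0)    # ReLU
    pred = h @ W2 + b2            # (100, 1)
    loss = np.mean((pred - y) ** 2)

    # Backpropagation of the mean-squared-error loss.
    d_pred = 2 * (pred - y) / len(X)
    dW2 = h.T @ d_pred
    db2 = d_pred.sum(axis=0)
    d_h = d_pred @ W2.T
    d_h_pre = d_h * (h_pre > 0)   # ReLU gradient
    dW1 = X.T @ d_h_pre
    db1 = d_h_pre.sum(axis=0)

    # Plain gradient-descent update; optimizers like Adam and AdamW refine this step.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final training loss: {loss:.4f}")
```

This hand-written update is where the week-1 topics come in: Adam and AdamW replace the plain gradient step, and dropout, layer normalization, and residual connections make deeper networks trainable.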
Week 2: Sub-word tokenization and word embeddings
(1) Introduction to sub-word tokenization methods: BPE (byte pair encoding) and Unigram. Build a BPE tokenizer in a Python notebook.
(2) Word embeddings and positional embeddings: basic concepts and implementation.
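As a small preview of the BPE idea covered in week 2 (an illustrative sketch with a made-up four-word corpus, not the course notebook): repeatedly merge the most frequent adjacent symbol pair in the corpus.

```python
# Illustrative BPE sketch: count adjacent symbol pairs, merge the most
# frequent pair, and repeat.
from collections import Counter

corpus = ["low", "lower", "newest", "widest"]  # toy corpus, an assumption
# Start from characters, with an end-of-word marker.
words = [tuple(w) + ("</w>",) for w in corpus]

def pair_counts(words):
    counts = Counter()
    for w in words:
        for a, b in zip(w, w[1:]):
            counts[(a, b)] += 1
    return counts

def merge_pair(words, pair):
    merged = pair[0] + pair[1]
    out = []
    for w in words:
        new_w, i = [], 0
        while i < len(w):
            if i + 1 < len(w) and (w[i], w[i + 1]) == pair:
                new_w.append(merged)   # replace the pair with its merged symbol
                i += 2
            else:
                new_w.append(w[i])
                i += 1
        out.append(tuple(new_w))
    return out

merges = []
for _ in range(6):  # the number of merges is a hyperparameter
    counts = pair_counts(words)
    if not counts:
        break
    best = counts.most_common(1)[0][0]
    merges.append(best)
    words = merge_pair(words, best)

print("learned merges:", merges)
print("segmented corpus:", words)
```

Each merge adds one new symbol to the vocabulary; real tokenizers run thousands of merges over large corpora.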
Week 3: Attention layer, encoders and decoders
(1) Understand the attention operation, queries, keys and values, and attention heads. Write Python code to implement it.
(2) Understand the sublayers of the encoder and decoder, the feed-forward network (FFN), and masked attention. Build an encoder and a decoder in a Python notebook.
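The core computation of week 3 is scaled dot-product attention, softmax(QKᵀ/√d)V. Here is a minimal NumPy sketch (illustrative only; the toy sizes and the causal mask are assumptions, not the course notebook):

```python
# Illustrative sketch of scaled dot-product attention with an optional mask.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)   # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V, mask=None):
    """Q: (seq_q, d), K: (seq_k, d), V: (seq_k, d_v)."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)             # (seq_q, seq_k) similarity scores
    if mask is not None:
        scores = np.where(mask, scores, -1e9)  # block masked positions
    weights = softmax(scores, axis=-1)         # each row sums to 1
    return weights @ V                         # weighted mix of the values

# Toy example: 4 tokens, model dimension 8.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
Q, K, V = x @ Wq, x @ Wk, x @ Wv

# Causal (decoder-style) mask: position i may attend only to positions <= i.
causal = np.tril(np.ones((4, 4), dtype=bool))
out = attention(Q, K, V, mask=causal)
print(out.shape)  # (4, 8)
```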
Week 4: Build a transformer for a practical application.
(1) Create a transformer step by step, then train and test it on a real dataset.
(2) Preview the pipeline for building LLM applications.
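To illustrate how the pieces from the earlier weeks fit together, here is a sketch of a small transformer text classifier assembled from PyTorch's built-in layers. This is an assumption for illustration only: the sizes and the classification task are made up, and the course builds the attention and FFN sublayers step by step rather than relying on the built-in encoder.

```python
# Illustrative sketch: a small transformer classifier from PyTorch building blocks.
import torch
import torch.nn as nn

class TinyTransformerClassifier(nn.Module):
    def __init__(self, vocab_size, d_model=128, nhead=4, num_layers=2,
                 num_classes=2, max_len=256):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)   # token embeddings
        self.pos_emb = nn.Embedding(max_len, d_model)       # learned positional embeddings
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, num_classes)          # classification head

    def forward(self, token_ids):                            # token_ids: (batch, seq)
        pos = torch.arange(token_ids.size(1), device=token_ids.device)
        h = self.tok_emb(token_ids) + self.pos_emb(pos)      # (batch, seq, d_model)
        h = self.encoder(h)                                  # stacked attention + FFN sublayers
        return self.head(h.mean(dim=1))                      # pool over the sequence

# Toy usage: random token ids stand in for a real tokenized dataset.
model = TinyTransformerClassifier(vocab_size=1000)
logits = model(torch.randint(0, 1000, (8, 32)))
print(logits.shape)  # torch.Size([8, 2])
```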
Instructor: Junling Hu
Refund policy:
Risk-free purchase: 100% refundable if you are not happy with the class. Simply submit your refund request within 1 day after the first class.
Register here: [https://transformersCourse.eventbrite.com](https://transformerscourse.eventbrite.com/)
