
Details

A collaboration between AppliedAI and Nexus.

Ever wondered how ChatGPT, Claude, and other AI models actually work under the hood? Join us for an intensive one-day workshop where you’ll build a complete transformer model from scratch, then compete to see whose model performs best.

What You’ll Learn
We begin with a fast-paced journey through AI history, from early debates on probability and free will to the breakthroughs that led from statistics to machine learning, neural networks, and finally transformers. You’ll see why attention mechanisms succeeded where earlier approaches fell short.
Then it’s hands-on. You’ll implement every major component of a transformer in Python and PyTorch:

  • Tokenization and embeddings
  • Self-attention and multi-head attention
  • Positional encodings
  • Transformer blocks
  • Training loops and optimization
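To give a taste of the kind of component you’ll build, here is a minimal sketch of scaled dot-product attention, the core operation inside self-attention. It is written in plain Python (no PyTorch required) purely for illustration; in the workshop you’ll implement the real thing with PyTorch tensors.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention for a single head.

    Q, K, V are lists of d-dimensional vectors (lists of floats).
    Each output vector is a weighted average of the value vectors,
    where the weights come from how well the query matches each key.
    """
    d = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        weights = softmax(scores)
        # Blend the value vectors according to the attention weights.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

If two keys match a query equally well, the output is just the average of their values; that intuition, extended to many heads and learned projections, is what multi-head attention builds on.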

Who Should Attend

  • Software engineers curious about AI internals
  • Data scientists and ML practitioners who want to go beyond APIs
  • Anyone who’s used ChatGPT and thought, “How does this actually work?”

Prerequisites: Python experience, a laptop with internet access, and curiosity. Neural network basics help, but aren’t required.

What You’ll Walk Away With

  • A clear understanding of transformer architecture
  • A working transformer implementation you can extend
  • Real intuition for how modern AI models are trained and optimized

Schedule

  • Morning (9:00–12:30): From early statistics to transformers, with a deep dive into attention and training
  • Lunch provided (12:30–1:30)
  • Afternoon (1:30–5:00): Build your transformer, train and tune it, submit results

Location: Saint Paul, MN