In the deep end: Language & Large Language Models
Details
Fourth workshop: A computational model of a neuron; architecture
We will explore from scratch how large language models achieve what they achieve.
This is a slow and steady process spread across five interactive sessions.
Skills needed: an interest in language, curiosity
The rest of the series will cover:
- How LLMs are trained
- Bringing it all together
Bring along: paper and pen
Good to have: high-school maths
Laptop: not required
A bit about the facilitator
Farhan Sohail:
I work in data and visualization for corporations across Australia, and I am passionate about language and computing.
I ran sessions like these fortnightly for nearly two years in Perth before recently moving to Brisbane.
