The Intuitive Transformer/Attention Explained -- In-Person & Hybrid
Details
The Intuitive Transformer
Join us for a guided tour through the intuition behind transformers and attention mechanisms. The talk will be light on math and heavy on examples and visualizations. Many talks cover the specific operations within these models but not the reasoning or rationale behind them.
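To give a flavor of the topic, here is a minimal sketch (not taken from the talk, just a standard illustration) of scaled dot-product attention, the core operation the presentation builds intuition for. All variable names here are illustrative.

```python
import numpy as np

def attention(Q, K, V):
    """Each query produces a weighted average of the values V,
    weighted by how well the query matches each key."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # softmax over keys (subtract max for numerical stability)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # mix the values according to the weights

# Toy example: 2 queries attending over 3 key/value pairs, dimension 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = attention(Q, K, V)
print(out.shape)  # one mixed value vector per query: (2, 4)
```

The key intuition: attention is a soft lookup, where each query retrieves a blend of values rather than a single one.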
* Please make sure to read the instructions for joining the event below.
Agenda:
- 12:00 - 1:30 pm -- Presentation and discussion (both in person & on Zoom)
- Time permitting -- Additional Q&A, networking
Links to notes/slides and videos of prior meetups are available on the SDML GitHub repo https://github.com/SanDiegoMachineLearning/talks
Location:
We are meeting at our new location at Aquillius in Rancho Bernardo.
Please Note: There are two steps required to join the online meetup:
- You must go to our Slack community and ask for the password for the meeting. Link to join is below.
- You must have a Zoom login in order to join the event. A free Zoom account will work. If you get an error message when joining, please log in to your account on the Zoom website, then try again.
- Use this Zoom link: https://us06web.zoom.us/j/82891977558
Community:
Join our Slack community for questions and discussion about what's new in ML:
https://join.slack.com/t/sdmachinelearning/shared_invite/zt-34vyls6jn-3cREuo8EoPmo6AKwTEgGgA
