
Details

In this edition of the meetup we'll have a single talk, a great opportunity to catch up on or revisit developments in Machine Learning.

The Versatility of Attention-Based Autoregressive Models

In this talk, François will first introduce the attention operator and attention layers, which are key components of modern large-scale models such as GPTs or ViTs. François will illustrate the capabilities of these modules with toy examples, and then present some work done in his group that puts transformers to use in applications ranging from wind speed prediction to world models.
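For those who would like a quick refresher before the talk: the attention operator at the core of these models can be summarized as scaled dot-product attention, softmax(Q K^T / sqrt(d)) V, optionally with a causal mask for the autoregressive (GPT-style) setting. Below is a minimal, self-contained NumPy sketch of that operator; the function names, shapes, and toy data are purely illustrative and are not taken from the talk itself.

import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V, causal=False):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.
    d = Q.shape[-1]
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d)  # (batch, n_q, n_k)
    if causal:
        # Autoregressive (causal) mask: each position attends only to
        # itself and earlier positions.
        n_q, n_k = scores.shape[-2:]
        mask = np.triu(np.ones((n_q, n_k), dtype=bool), k=1)
        scores = np.where(mask, -np.inf, scores)
    weights = softmax(scores, axis=-1)  # attention weights
    return weights @ V                  # (batch, n_q, d_v)

# Toy example: batch of 2 sequences, 4 tokens, 8-dimensional embeddings.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4, 8))
K = rng.normal(size=(2, 4, 8))
V = rng.normal(size=(2, 4, 8))
out = attention(Q, K, V, causal=True)
print(out.shape)  # (2, 4, 8)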

Please note that our host requires you to provide a first and last name when signing up.

Related topics

Events in Zürich, CH
Machine Learning
Data Science
Open Source
Software Engineering
Database Applications

Sponsors

NumFOCUS

Promoting accessible and reproducible computing in science & technology
