Attention Is All You Need (well, actually you need a couple more things...)

Hosted By
Caetano

Details

In this meeting we will give a (mostly) non-technical description (to the best of our ability) of one of the core mechanisms of LLMs: Attention and its associated mechanisms (Self-Attention, Multi-Head Attention, Transformer networks, GPT, etc.).

How is the GPT architecture different from other neural network approaches, particularly ones designed for sequences (e.g. sequences of words) like Recurrent Neural Networks (RNNs) and their variants? And what do Transformers (which put the T in GPT) bring to the table through their Attention blocks?
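For anyone who wants a concrete picture before the meeting: the Attention blocks mentioned above boil down to the scaled dot-product formula softmax(QK^T / sqrt(d_k)) V from the "Attention Is All You Need" paper. Here is a minimal NumPy sketch of that formula; the function name, toy token count, and embedding size are illustrative choices, not taken from the event materials.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for a single attention head."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # similarity of each query with every key
    # Row-wise softmax (numerically stabilized) turns scores into weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V                # each output is a weighted sum of the values

# Toy example: a "sentence" of 4 tokens, each embedded in 8 dimensions.
rng = np.random.default_rng(0)
tokens = rng.normal(size=(4, 8))
# In self-attention the queries, keys, and values all come from the same tokens.
output = scaled_dot_product_attention(tokens, tokens, tokens)
print(output.shape)  # (4, 8): one updated representation per token
```

Unlike an RNN, which processes tokens one at a time, every token here attends to every other token in a single step, which is what makes Transformers easy to parallelize.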

Join us to understand and discuss the building blocks of LLMs and the improvements that allowed them to take all areas of human activity by storm in the last few years.

From the Math to the Masses
BRKLYN Torrefaction
Carrer de Bac de Roda, 79, Sant Martí, 08005 Barcelona