
Sequence Modeling: Recurrent Neural Nets, LSTMs and Transformers (+demos)

Hosted by Jaidev S.

Details

UPDATE:
This session on sequence modeling was originally scheduled for Saturday, June 20 at 7 pm IST.

It will now be held on Sunday, June 21 at 7 pm IST.
___________________________________________________________________________________

Hi everyone!
In this session of our NLP series, we'll be diving deeper into sequence modeling.

I'll start with a brief intro to Markov chains for those of you who need a refresher, then dive into the theory behind HMMs (Hidden Markov Models), RNNs, and LSTMs (Long Short-Term Memory networks). We'll cover a number of interesting use cases, ranging from multimodal stock prediction to applications in self-driving cars, but will ultimately home in on applications of sequence models to text.
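
If you'd like a taste of the Markov chain refresher before the session, here's a minimal sketch of a first-order chain for text generation. It's not from the session materials; the tiny corpus is purely illustrative. The idea is just that the next word is sampled based only on the current word:

```python
# Minimal first-order Markov chain for text generation (illustrative only).
import random
from collections import defaultdict

corpus = "the cat sat on the mat the cat ate the rat".split()

# Count transitions: for each word, collect the words that follow it.
transitions = defaultdict(list)
for current_word, next_word in zip(corpus, corpus[1:]):
    transitions[current_word].append(next_word)

# Generate text by repeatedly sampling a follower of the current word.
word = "the"
output = [word]
for _ in range(8):
    followers = transitions.get(word)
    if not followers:  # dead end: no observed follower for this word
        break
    word = random.choice(followers)
    output.append(word)

print(" ".join(output))
```

HMMs extend this same idea by treating the Markov states as hidden and only observing their emissions, which is where we'll pick up in the session.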

We'll cover the intuition behind transformers (the more advanced math can wait for another day), but focus on showcasing the impressive performance you can get with wrappers on top of transformer-based models like BERT, the Universal Sentence Encoder, and GPT-2 for tasks you might need in your own applications.
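
As a taste of what those wrappers look like, here's a minimal sketch assuming the Hugging Face transformers library; the tasks and model choices are illustrative, not necessarily the exact demos from the session:

```python
# Minimal sketch of wrappers around pretrained transformer models,
# using Hugging Face's pipeline API (models download on first run).
from transformers import pipeline

# Sentiment analysis with a pretrained BERT-family model, one line to set up.
classifier = pipeline("sentiment-analysis")
print(classifier("Sequence models make text tasks surprisingly approachable."))

# Text generation with GPT-2, also behind a single call.
generator = pipeline("text-generation", model="gpt2")
print(generator("Recurrent networks process sequences", max_length=20))
```

The point is that you rarely need to touch the model internals for common tasks; the wrapper handles tokenization, the forward pass, and decoding for you.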

-Jaidev

AMLAI - Ahmedabad Machine Learning and AI Group