DON'T FORGET: At the end of the meetup, someone should post the topic for next week to Slack (signup information below).
This group meets weekly to discuss papers and topics foundational to the field of Deep Learning.
This week, we will finish up our discussion of the Annotated Transformer and then discuss the BERT paper.
The Annotated Transformer, Alexander Rush, https://nlp.seas.harvard.edu/2018/04/03/attention.html
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, Devlin et al., https://arxiv.org/abs/1810.04805
Improving Language Understanding by Generative Pre-Training, Radford et al.
This event is hosted by Galvanize and the Boulder Data Science group.
To get onto Slack, sign up here: https://bds-slackin.herokuapp.com/ - we use the #online-courses channel.