In the second NLP paper reading session we will discuss the paper
"Neural Machine Translation by Jointly Learning to Align and Translate" (https://arxiv.org/abs/1409.0473),
which introduced the attention mechanism for sequence-to-sequence models. This architecture was a breakthrough in machine translation when it was published in 2015, and attention remains central to the current state-of-the-art models that build on it.
The concepts we will discuss include Machine Translation (MT), Recurrent Neural Networks, Encoders, Decoders, Attention, and performance metrics for MT.
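To get an intuition for the core idea before the session, here is a minimal sketch of Bahdanau-style additive attention with NumPy. The function name, variable names, and toy dimensions are illustrative, not from the paper's implementation: it scores each encoder hidden state against the current decoder state, normalizes the scores with a softmax, and returns the weighted context vector.

```python
import numpy as np

def additive_attention(dec_state, enc_states, W1, W2, v):
    """Additive attention: e_i = v^T tanh(W1 s + W2 h_i),
    softmaxed into weights a_i, returning context c = sum_i a_i h_i."""
    scores = np.tanh(dec_state @ W1 + enc_states @ W2) @ v  # (num_positions,)
    weights = np.exp(scores - scores.max())                 # numerically stable softmax
    weights /= weights.sum()                                # weights sum to 1
    context = weights @ enc_states                          # convex combination of h_i
    return context, weights

# Toy example: 5 source positions, hidden size 4 (dimensions are arbitrary)
rng = np.random.default_rng(0)
h = rng.normal(size=(5, 4))   # encoder hidden states h_i
s = rng.normal(size=(4,))     # previous decoder state s_{t-1}
W1 = rng.normal(size=(4, 4))
W2 = rng.normal(size=(4, 4))
v = rng.normal(size=(4,))
ctx, w = additive_attention(s, h, W1, W2, v)
```

The attention weights form a probability distribution over source positions, which is what lets the decoder "align" to relevant source words at each step.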
March 4 - March 9: Self-study and preparation.
March 10: Paper reading session.
Paper Reading Schedule:
12:50 (Optional) Meet for lunch.
14:00 - 16:00 Paper discussion.
Reading Guidelines: https://discuss.mltokyo.ai/t/paper-reading-guidelines/241
The Reading Guidelines include important points to consider while reading the paper, as well as related literature and concepts. They will also serve as a guide during the discussion.
Non-experts are also welcome. The only requirement is to read the paper and follow the reading guidelines.
Further Instructions and discussion:
* Unlike last time, there will be no presentation of the paper at the beginning of the session. We will jump straight into the discussion and address questions as they come up.