
Arun Raja on BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

Hosted By
Nate S. and 2 others

Details

NEW LOCATION! Meet us at H Bar on Queen St. West

Arun Raja will be presenting “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding” by Jacob Devlin, et al.

Bidirectional Encoder Representations from Transformers (BERT) is a language representation model from researchers at Google AI. BERT's key innovation is bidirectional training: its masked language modelling objective lets every token condition on context from both the left and the right, which earned it state-of-the-art results on a broad range of language understanding benchmarks.
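To give a feel for how bidirectional training is made possible, here is a minimal sketch of the paper's masked-LM input corruption: roughly 15% of tokens are chosen as prediction targets, and of those, 80% are replaced with a [MASK] token, 10% with a random token, and 10% left unchanged. The toy vocabulary and token lists below are illustrative, not from BERT's actual WordPiece vocabulary.

```python
import random

MASK = "[MASK]"
VOCAB = ["the", "a", "cat", "dog", "sat", "ran"]  # toy vocabulary for illustration

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """BERT-style masked-LM corruption (per the paper's recipe):
    each position is selected as a prediction target with probability
    mask_prob; a selected token becomes [MASK] 80% of the time, a
    random vocabulary token 10% of the time, and stays unchanged 10%."""
    rng = random.Random(seed)
    corrupted, targets = list(tokens), {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            targets[i] = tok  # the model must predict the original token here
            r = rng.random()
            if r < 0.8:
                corrupted[i] = MASK
            elif r < 0.9:
                corrupted[i] = rng.choice(VOCAB)
            # else: keep the original token unchanged
    return corrupted, targets
```

Because the model predicts the original token at each masked position from the full surrounding sentence, it learns representations conditioned on both directions at once, unlike a left-to-right language model.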

Paper: https://arxiv.org/pdf/1810.04805.pdf

Arun Raja is a National Science Scholar from Singapore and an undergraduate Computer Science student at the University of Edinburgh, currently spending his third year on exchange at the University of Toronto. He is passionate about research in Natural Language Processing.

Papers We Love - Toronto
Hugs and Sarcasm
859 Queen St W · Toronto, ON