Brent Spell on BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

Details

Full Title: BERT - Pre-training of Deep Bidirectional Transformers for Language Understanding

Description: BERT is an unsupervised deep language model that smashed benchmarks for many Natural Language Processing (NLP) tasks in 2018. Although it is no longer the state of the art for many of those tasks, BERT represents an evolutionary milestone in language understanding, and as a pre-trained model it has democratized access to deep NLP.
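For a sense of what "deep NLP as a pre-trained model" means in practice, here is a minimal sketch of loading pre-trained BERT and extracting contextual embeddings. It assumes the Hugging Face transformers library and the bert-base-uncased checkpoint, neither of which is named in the event description; any pre-trained BERT variant would work the same way.

    # Minimal sketch: download a pre-trained BERT and run one sentence through it.
    # Assumes: pip install torch transformers
    import torch
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")

    # Tokenize a sentence; BERT attends over the whole sequence at once,
    # which is what "bidirectional" refers to in the title.
    inputs = tokenizer("BERT smashed NLP benchmarks in 2018.", return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # One contextual embedding per token, ready to fine-tune on a downstream task.
    print(outputs.last_hidden_state.shape)  # (1, num_tokens, 768)

The point of the talk's "democratization" claim is visible here: the expensive pre-training has already been done, so a downstream user only fine-tunes or reads off embeddings.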

Brent is the chief technology officer of Pylon. When he isn't working on machine learning systems for speech, he can be found reading papers, building prototypes, and curating the best memes the internet has to offer.