Details

The year 2018 has been called an inflection point for Deep Learning applied to Natural Language Processing. It has been referred to as NLP’s ImageNet moment, referencing how a similar breakthrough years earlier accelerated progress in deep learning for Computer Vision tasks.
Our understanding of how best to represent words and sentences so that they capture underlying meanings and relationships is rapidly evolving. The NLP community has been putting forward incredibly powerful components that you can freely download and use in your own models and pipelines. The prime example of these components is Google’s language model BERT, which is based on the Transformer architecture.

In this session we will learn about BERT and its Transformer-based successors. We will also see how we can use these models in practice.

About Kurt: Kurt Janssens is co-founder and technical lead of Brainjar.
Brainjar, part of the Raccoons group, builds end-to-end machine learning applications as a service. Kurt graduated from KU Leuven as a mathematical engineer and has gathered seven years of experience as a data scientist. Over the last years he has gained expertise in machine learning, with a special interest in deep learning. He is proficient in both TensorFlow & Keras and PyTorch & fast.ai, and has an affinity for high-performance computing.
