
Dive into New Deep Learning Models for Natural Language Processing

Hi Deep Learning enthusiasts,

This month's meetup, sponsored by GoDataDriven, features two guest speakers who will delve deep into DL & NLP on March 29th!

-----
Tom Kenter, an NLP data scientist at Booking.com, will speak about his work at the company and about a recent research project with Google Research on byte-level machine reading across morphologically varied languages, built around the encoder-transformer-decoder model.

Dataiku data scientist Alex Combessie will present a translator that our data science team has developed using a new neural network architecture called the Transformer.

First Talk: "Byte-level Machine Reading across Morphologically Varied Languages"

The machine reading task, where a computer reads a document and answers questions about it, is important in artificial intelligence research. Word-level models, which have words as units of input and output, have proven to yield state-of-the-art results when evaluated on English datasets. But what about on datasets in languages with richer morphology?

In this talk, Tom will cover the major types of machine reading models, introduce a new seq2seq variant called the encoder-transformer-decoder, and discuss whether bytes are suitable as input units across morphologically varied languages.
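For intuition, here is a minimal sketch (illustrative only, not material from the talk) of the difference between word-level and byte-level input units, using Python's built-in UTF-8 encoding; the example word is our own choice:

```python
# Minimal sketch (illustrative only): word-level vs. byte-level input units
# for a long, morphologically complex word.
word = "onafhankelijkheidsverklaring"  # Dutch for "declaration of independence"

# A word-level model treats the whole word as one vocabulary index, so rare
# inflections and compounds easily fall out of vocabulary.
word_level_units = [word]

# A byte-level model consumes the raw UTF-8 bytes: a fixed vocabulary of at
# most 256 symbols that covers any language, at the cost of longer sequences.
byte_level_units = list(word.encode("utf-8"))

print(len(word_level_units))  # 1 unit
print(len(byte_level_units))  # 28 units
print(byte_level_units[:5])   # [111, 110, 97, 102, 104]
```

The trade-off is visible even in this toy example: bytes never go out of vocabulary, but the sequences a model must read become much longer.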

Speaker Bio: Tom Kenter is currently working as a Data Scientist – NLP at Booking.com in Amsterdam. He has a computational linguistics background and worked at several IR and data mining companies before starting his PhD at the University of Amsterdam, supervised by Maarten de Rijke. Before joining Booking.com, Tom did two internships at Google Research in Mountain View, where he worked on neural-network-based methods for making machines understand text.

-----
Second Talk: "A Novel Neural Network Architecture for Natural Language Processing"

Deep Learning in NLP has been dominated in recent years by recurrent and convolutional models. But new models are emerging that improve translation quality and performance.

Alex has developed a translator for his team and clients using a new neural network architecture called the Transformer. Unlike traditional translation models, it relies solely on attention instead of recurrence, yielding powerful NLP models in a fraction of the training time.

Alex will explain how the translator was built, give a live demo, and discuss how the Transformer overcomes the pitfalls of RNN models.
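As background for the demo, here is a minimal NumPy sketch of the scaled dot-product attention at the core of the Transformer architecture; the function and variable names are illustrative, not taken from Alex's implementation:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal sketch of the Transformer's core attention operation.

    Q, K, V: (seq_len, d_k) query, key, and value matrices. Every position
    attends to every other position in one matrix multiply, with no
    recurrence, so the whole sequence is processed in parallel.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # pairwise similarity, scaled
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the keys
    return weights @ V                               # weighted sum of values

# Toy example: 4 positions with 8-dimensional representations.
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (4, 8)
```

Because nothing in this computation depends on the previous time step, it parallelizes across the sequence, which is where the training-time savings over RNNs come from.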

Speaker Bio: Alex Combessie is a Data Scientist at Dataiku who designs and deploys data projects with Machine Learning from prototype to production. Prior to his time at Dataiku, he helped build the Data Science team of Capgemini Consulting in France. Having begun his career in economic analysis, he continues to work on interpretable models as a complement to Deep Learning. Alex is also a travel junkie who enjoys learning new things and making useful products.

As always, you can expect excellent talks, discussion, and complimentary pizza + beer for everyone!
See you all there!
