BERT (Bidirectional Encoder Representations from Transformers) Modeling for NLP

Details

Please note the location for this event - it is NOT the usual ScienceLogic location.

Speaker Profile:

Leonardo Apolonio is a machine learning engineer at Clarabridge, where he works on natural language processing (NLP) tasks such as detecting emotion, call reason, and expressed effort in the customer experience domain. He has experience maintaining and improving NLP pipelines that extract entities and topics from over 30 million websites daily, using the latest NLP and deep learning techniques. Leonardo has also built scalable analytics for anomaly detection on datasets with billions of events.

Brief abstract:

Today we have machine learning engineers, software engineers, and data scientists. The trend in deep learning is that models are becoming so powerful that there is little need to understand the details of a specific algorithm; instead, the models can be applied directly to custom use cases. Leonardo Apolonio explores how this trend is turning a machine learning engineer's job into a software engineering skill.

Outline:

· What is BERT?

· Why is BERT important?

· Fine-tune BERT on the AG News dataset

· Write a TensorFlow Serving client

· Build Docker containers for TensorFlow Serving and the TensorFlow Serving client

· Push the containers to Docker Hub

· Create a Kubernetes cluster on Google Cloud and deploy the containers
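The serving step in the outline can be sketched as a minimal Python client for TensorFlow Serving's REST predict endpoint. This is an illustrative sketch, not the speaker's code: the model name `bert_agnews`, the port, and the assumption that the exported SavedModel accepts raw-text instances and returns a four-way probability vector are all assumptions made here for the example. The AG News label set (World, Sports, Business, Sci/Tech) is standard, though the index order depends on how the model was trained.

```python
import json
from urllib import request

# AG News has four classes; the index order here is an assumption —
# it must match however the labels were encoded during fine-tuning.
AG_NEWS_LABELS = ["World", "Sports", "Business", "Sci/Tech"]


def build_predict_request(texts):
    """Build the JSON body for TensorFlow Serving's REST predict API.

    Assumes the exported serving signature takes a raw-text feature
    named "text"; a real deployment may instead expect pre-tokenized
    inputs (input_ids, attention_mask, etc.).
    """
    return json.dumps({"instances": [{"text": t} for t in texts]})


def decode_predictions(response_body):
    """Map each probability vector in a predict response to a label."""
    predictions = json.loads(response_body)["predictions"]
    return [
        AG_NEWS_LABELS[max(range(len(p)), key=p.__getitem__)]
        for p in predictions
    ]


def predict(texts, host="localhost", port=8501, model="bert_agnews"):
    """POST to /v1/models/<model>:predict and decode the result.

    8501 is TensorFlow Serving's default REST port; the model name
    "bert_agnews" is hypothetical.
    """
    url = f"http://{host}:{port}/v1/models/{model}:predict"
    req = request.Request(
        url,
        data=build_predict_request(texts).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return decode_predictions(resp.read())
```

Splitting request-building and response-decoding out of the network call keeps both pieces testable without a running server; the same payload shape also works with TensorFlow Serving's gRPC API if a protobuf client is preferred.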

Thank you, Clarabridge, for sponsoring this event.