Practical Tips and Tricks for Successful Transfer Learning (Partner Event)


Details
This is an online partner event with our friends at the Bay Area NLP Meetup.
For a link to the event, RSVP at https://www.meetup.com/Bay-Area-NLP/events/271843560/
Summary:
How do you train a model for complex tasks such as question answering or medical relation extraction when you only have a small amount of data? One area of research that tackles this problem is transfer learning, which focuses on training models to learn knowledge and skills from related tasks so that what they learn transfers to, and boosts performance on, the tasks of interest.
But what does transfer learning look like in practice? This talk will cover practical tips and tricks for setting up transfer learning experiments, as well as hard-won lessons about which tasks transfer well to others.
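For readers new to the topic, here is a minimal, illustrative sketch of the kind of two-stage (intermediate-task) transfer setup the talk is about, using the Hugging Face transformers library. The model name, datasets, and hyperparameters below are placeholder assumptions for illustration, not details from the talk.

# Illustrative two-stage transfer learning: fine-tune a pretrained encoder
# on a data-rich intermediate task, then continue fine-tuning on the small
# target task. The dataset objects are assumed to be tokenized HF Datasets.
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "roberta-base"  # assumed pretrained encoder
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

def fine_tune(model, train_dataset, output_dir):
    # One fine-tuning stage with simple, illustrative hyperparameters.
    args = TrainingArguments(output_dir=output_dir,
                             num_train_epochs=3,
                             per_device_train_batch_size=16)
    Trainer(model=model, args=args, train_dataset=train_dataset).train()
    return model

# Stage 1: intermediate task with plenty of labeled data (e.g., NLI).
# model = fine_tune(model, intermediate_task_dataset, "out/intermediate")
# Stage 2: the small target task (e.g., medical relation extraction).
# model = fine_tune(model, target_task_dataset, "out/target")

The hard part in practice, and the subject of the talk, is choosing an intermediate task that actually helps the target task rather than hurting it.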
Bio:
Yada Pruksachatkun (https://www.yadapruk.com/) is an incoming Applied Scientist at Amazon Alexa, where she will work on fairness and bias in NLU. She recently completed graduate school at NYU, where her research on transfer learning was presented at ACL 2020 (https://arxiv.org/abs/2005.00628).
Aside from machine learning, she has a deep interest in healthcare and has worked on a variety of projects applying machine learning to healthcare problems.
