6:30 - Arrive and mingle
7:00 - Talks begin
8:00 - Discussion
Language modeling is the foundation of most state-of-the-art NLP models, and language model pretraining has enabled a new wave of significant progress on many fundamental NLP benchmarks. Language modeling requires the model to develop a deep understanding of the text it is trained on, and because the task needs no labels, large amounts of training data are easy to source.
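To make the "unlabeled" point concrete, here is a minimal sketch of a toy bigram language model; the corpus and function names are illustrative, not from any talk, but they show how the next word itself serves as the training signal:

```python
from collections import Counter, defaultdict

# Toy bigram language model trained on raw, unlabeled text:
# the supervision comes from the text itself (each word predicts the next).
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_word_probs(word):
    """Conditional distribution P(next | word) from raw bigram counts."""
    total = sum(counts[word].values())
    return {w: c / total for w, c in counts[word].items()}

print(next_word_probs("the"))  # e.g. "cat", "mat", "dog", "rug" each at 0.25
```

Modern neural language models replace these counts with learned parameters, but the core objective — predict the next token from context — is the same.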
First up will be a lightning talk by Gabriel Mitchell titled "Fooling around with Transformers," showing how we can use Hugging Face's Transformers library to fine-tune models on interesting tasks.
Following that will be a talk by Stuart Axelbrooke about language modeling. This talk will discuss the basics of language modeling, recent progress, methods of evaluation, and novel applications, including pretraining for other downstream NLP tasks.
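On the evaluation side, perplexity is one of the standard metrics the talk's topic suggests; this is a minimal sketch with hypothetical per-token probabilities standing in for a trained model's predictions:

```python
import math

def perplexity(token_probs):
    """Exponentiated average negative log-likelihood over held-out tokens.

    Lower is better: a model that assigns high probability to the actual
    next tokens is less "perplexed" by the text.
    """
    nll = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(nll)

# A model that is uniform over 4 choices at every step has perplexity ~4.
print(perplexity([0.25, 0.25, 0.25, 0.25]))
```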
Thanks again to OnePieceWork (https://www.onepiecework.com/) for hosting us!
NLSea is a special interest group of PuPPy focused on applications of natural language processing (NLP). The event is for NLP practitioners as well as those looking to get into the field. We plan to cover modern applications of NLP, including project briefs and recent important research papers.