This event was canceled

AI Book Club: Natural Language Processing with Transformers | Week 4

Hosted By
Sage E.

Details

October's book: Natural Language Processing with Transformers, Revised Edition!

This is a casual-style event, not a structured presentation on set topics. Feel free to grab the mic and join the discussion even if you haven't read the book chapters! :)

For this LinkedIn Audio event, let's discuss our continued reading on NLP, transformers, and any other interesting AI topics that come up!

Want to discuss the contents during the reading week? Join the Robust & Responsible AI Slack Group: https://bit.ly/r2ai-slack
-------------------------------------------------
About the book:
Title: Natural Language Processing with Transformers, Revised Edition
Authors: Lewis Tunstall, Leandro von Werra, Thomas Wolf
Published: May 2022

Feel free to read at your own pace, but my goal is to cover these chapters each week:

Week one goal:
1. Hello Transformers
2. Text Classification
3. Transformer Anatomy

Week two goal:
4. Multilingual Named Entity Recognition
5. Text Generation
6. Summarization

Week three goal:
7. Question Answering
8. Making Transformers Efficient In Production
9. Dealing With Few To No Labels

Week four goal:
10. Training Transformers From Scratch
11. Future Directions

Since their introduction in 2017, transformers have quickly become the dominant architecture for achieving state-of-the-art results on a variety of natural language processing tasks. If you're a data scientist or coder, this practical book, now revised in full color, shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library.

Transformers have been used to write realistic news stories, improve Google Search queries, and even create chatbots that tell corny jokes. In this guide, authors Lewis Tunstall, Leandro von Werra, and Thomas Wolf, among the creators of Hugging Face Transformers, use a hands-on approach to teach you how transformers work and how to integrate them in your applications. You'll quickly learn a variety of tasks they can help you solve.

  • Build, debug, and optimize transformer models for core NLP tasks, such as text classification, named entity recognition, and question answering
  • Learn how transformers can be used for cross-lingual transfer learning
  • Apply transformers in real-world scenarios where labeled data is scarce
  • Make transformer models efficient for deployment using techniques such as distillation, pruning, and quantization
  • Train transformers from scratch and learn how to scale to multiple GPUs and distributed environments
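If you want a small taste of the book's hands-on style before the discussion, here is a minimal sketch (not taken from the book) of the Hugging Face Transformers pipeline API applied to text classification, one of the core tasks listed above. It assumes the transformers library is installed; the model checkpoint is just one common illustrative choice, and the book goes further by fine-tuning your own models.

# Minimal sketch: text classification with the transformers pipeline API.
# Assumes `pip install transformers torch`; the checkpoint below is an
# illustrative public sentiment model, not one prescribed by the book.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

result = classifier("Transformers have quickly become the dominant NLP architecture.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]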

Learn more about the book here:
https://www.oreilly.com/library/view/natural-language-processing/9781098136789/

Robust & Responsible AI - SF