Mastering LLM Serving and Management at Grammarly
Details
Join us on May 15 to hear Grammarly software engineer Christoph Stuber explore the tools and processes behind our ML infrastructure and discover how we manage and serve LLMs at scale.
✅ Registration: To attend the meetup, please register ➡️ here ⬅️
🔈 Christoph Stuber, Software Engineer at Grammarly
🚀 At Grammarly, we use a combination of third-party LLM APIs and in-house LLMs.
During this talk, we’ll:
- Talk about how LLMs play an essential role in our product offerings
- Give an overview of the different tools and processes we use in our ML infrastructure
- Discuss how we approach challenges like access, cost control, and load testing of LLMs
- And share our experience in optimizing and serving LLMs
✨ Who Will Be Interested: ML engineers, ML Infrastructure engineers, and anyone with knowledge of, or interest in, ML architecture and infrastructure.
This session will present a general overview of the topic, which will be of interest to enthusiasts and specialists at all levels. For the more senior members of our audience, we will briefly examine the practical aspects and associated challenges.
Agenda:
✨ 18:30 Doors open: Time for mingling and networking with fellow attendees; snacks and drinks will be served
✨ 19:00 Talk
✨ 20:00 More snacks, drinks, mingling, and networking
✨ 21:00 Meetup ends
✅ Where: In person, Grammarly Berlin hub
✅ When: Wednesday, May 15
✅ Language: English
✅ Use this link to register: **https://gram.ly/3JGmbrq**
The event is free, but registration is mandatory. Because seats are limited, invites will be sent to a limited number of registrants on a first-registered, first-invited basis. Please check your inbox for a confirmation email about your attendance.
