Mixtral MoE on HyperPod/SLURM + Fine-Tuning and Continued Pre-Training
Details
RSVP Webinar: https://www.eventbrite.com/e/webinar-generative-ai-on-aws-tickets-45852865154
Talk #0: Introduction
by Chris Fregly (Principal SA, Generative AI) and Antje Barth (Principal Developer Advocate, Generative AI)
Talk #1: Train the Mixtral MoE foundation model on SLURM with SageMaker HyperPod
by Ben Snyder, Applied Scientist @ AWS
Talk #2: Fine-Tuning and Continued Pre-training
by Antje Barth and Chris Fregly
Zoom link: https://us02web.zoom.us/j/82308186562
Related Links
Generative AI Free Course on DeepLearning.ai: https://bit.ly/gllm
O'Reilly Book: https://www.amazon.com/Generative-AWS-Context-Aware-Multimodal-Applications/dp/1098159225
Website: https://generativeaionaws.com
Meetup: https://meetup.generativeaionaws.com
GitHub Repo: https://github.com/generative-ai-on-aws/
YouTube: https://youtube.generativeaionaws.com
