
Details

Mixture-of-Experts (MoE) is a powerful approach that leverages the strengths of multiple specialized models to tackle complex problems, offering improved performance and efficiency in various machine learning applications. In this talk, Dr. Lin will cover the basics of MoE and its recent developments in large language models. He will also briefly introduce the theory behind MoE under different setups to better understand its performance in practice.
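To make the idea concrete, here is a minimal sketch of the core MoE mechanism described above: a gating network assigns softmax weights over a set of specialist "expert" functions, and the layer's output is the weighted combination of the experts' outputs. This is a toy illustration, not code from the talk; the two experts and the gating rule are hypothetical choices for demonstration.

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of gating scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Two toy experts, each "specializing" in a different transformation.
experts = [
    lambda x: 2.0 * x,   # expert 0: scaling
    lambda x: x + 1.0,   # expert 1: shifting
]

def gate(x):
    # Toy gating rule: route negative inputs toward expert 0,
    # positive inputs toward expert 1.
    return softmax([-x, x])

def moe_forward(x):
    # Output is the gate-weighted sum of expert outputs.
    weights = gate(x)
    return sum(w * f(x) for w, f in zip(weights, experts))

# For a large positive input the gate routes almost entirely to expert 1,
# so the output is close to expert 1's answer (x + 1).
print(round(moe_forward(10.0), 3))
```

In large language models this same pattern is applied per token inside transformer layers, typically with sparse routing (only the top-scoring experts are evaluated) so that capacity grows without a proportional increase in compute.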

This is a hybrid event. Join the event virtually at https://gdg.community.dev/events/details/google-gdg-houston-presents-mixture-of-experts-in-deep-learning-applications-and-theory/
or in person at

3551 Cullen Boulevard, Houston, TX 77004

Sponsors

O'Reilly: tech books for giveaways
JetBrains: JetBrains licenses
Educative: free and discounted courses on the Educative platform
