Exploring Kaito to streamline AI inference model deployment in Azure Kubernetes

Hosted By
Alixia B.

Details
About this session:
Roy Kim will be presenting Kaito, an operator streamlining AI/ML inference model deployment in Kubernetes. Discover how Kaito simplifies deployment of large open-source inference models like Falcon and LLAMA2. Learn its unique features: managing large model files with container images, preset GPU configurations, auto-provisioning GPU nodes, and hosting on Microsoft Container Registry (MCR). See how Kaito simplifies the workflow of onboarding large AI inference models in Kubernetes.
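To give a flavor of the workflow the session covers: Kaito is driven by a custom resource that declares the model preset and the GPU nodes to provision. The sketch below is a minimal, illustrative Workspace manifest based on Kaito's published examples; the instance type and label values are placeholders, so check the project's documentation for the presets and API version current for your cluster.

```yaml
# Illustrative Kaito Workspace: deploys the falcon-7b preset and asks the
# operator to auto-provision a GPU node pool for it. Values are examples only.
apiVersion: kaito.sh/v1alpha1
kind: Workspace
metadata:
  name: workspace-falcon-7b
resource:
  # GPU VM size to provision for inference (example value)
  instanceType: "Standard_NC12s_v3"
  labelSelector:
    matchLabels:
      apps: falcon-7b
inference:
  preset:
    # Preset name selects the prebuilt model image and tuned GPU config
    name: "falcon-7b"
```

Applying a manifest like this with `kubectl apply` lets the operator handle node provisioning and model-image pulls, which is the onboarding simplification the session demonstrates.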
Learn more and develop your skills in Azure Kubernetes Service with this Microsoft Learn training module:
https://aka.ms/IntroToAKSLearn3

Microsoft Reactor Toronto
Online event
This event has passed