Deploying AI Runtimes in Production
Details
In this session, we focus on how runtime deployments work in Kubernetes and OpenShift environments. We’ll explore how models are packaged, deployed, scaled, and monitored in production clusters. You’ll learn how container orchestration supports reliability, performance, and operational control, and how runtime management fits into modern AI platforms. Expect a practical overview grounded in real deployment scenarios.
We invite anyone who would like to take part as a Contributor to register through our Luma link and join our Luma channel. This allows us to welcome you into events with full participation access.
[https://luma.com/5y5z80ty](https://luma.com/5y5z80ty)
AI summary
By Meetup
A session on deploying AI runtimes in production with K8s/OpenShift; for developers and contributors to learn packaging, deployment, scaling and monitoring.
