The rise of LLM applications creates new infrastructure challenges around GPU resources, model serving, and cost management. This cutting-edge workshop demonstrates how to extend your internal developer platform (IDP) to support AI application teams with self-service infrastructure for AI-powered applications and agents.
You’ll learn to create templates for model serving, implement cost controls for GPU resources, and provide integration patterns for common use cases.
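As a taste of the template pattern, here is a minimal Pulumi TypeScript sketch of a self-service model-serving workload with a simple GPU cost control. It is illustrative only, not the workshop's actual code: the `modelImage` and `gpuLimit` config keys, the placeholder container image, and the per-namespace GPU quota are assumptions a platform team might adapt.

```typescript
import * as pulumi from "@pulumi/pulumi";
import * as k8s from "@pulumi/kubernetes";

// Hypothetical config values a platform team might expose to app teams.
const config = new pulumi.Config();
const modelImage = config.get("modelImage") ?? "ghcr.io/example/llm-server:latest"; // placeholder image
const gpuLimit = config.getNumber("gpuLimit") ?? 1;

// One namespace per AI application team.
const ns = new k8s.core.v1.Namespace("ai-team");

// Cost control: cap the number of GPUs the namespace can request.
const quota = new k8s.core.v1.ResourceQuota("gpu-quota", {
    metadata: { namespace: ns.metadata.name },
    spec: {
        hard: { "requests.nvidia.com/gpu": `${gpuLimit}` },
    },
});

// Model-serving workload that requests a single GPU.
const serving = new k8s.apps.v1.Deployment("model-serving", {
    metadata: { namespace: ns.metadata.name },
    spec: {
        replicas: 1,
        selector: { matchLabels: { app: "model-serving" } },
        template: {
            metadata: { labels: { app: "model-serving" } },
            spec: {
                containers: [{
                    name: "server",
                    image: modelImage,
                    resources: {
                        limits: { "nvidia.com/gpu": "1" },
                    },
                }],
            },
        },
    },
}, { dependsOn: quota });

export const namespaceName = ns.metadata.name;
```

Packaging this as a reusable template lets application teams request GPU-backed serving infrastructure without hand-writing Kubernetes manifests, while the quota keeps GPU spend bounded per team.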
Speaker:
- Engin Diri, Sr. Solutions Architect at Pulumi
Join us to learn:
- How to create self-service templates for LLM application infrastructure and GPU resources
- Security patterns for API key management and handling sensitive data in AI-powered applications (a minimal secrets sketch follows this list)
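For the API key bullet above, one common pattern is to read the provider key as a Pulumi secret and hand it to the serving workload through a Kubernetes Secret, so the value stays encrypted in Pulumi state. This is a hedged sketch under that assumption; the `openaiApiKey` config key and `OPENAI_API_KEY` variable name are hypothetical.

```typescript
import * as pulumi from "@pulumi/pulumi";
import * as k8s from "@pulumi/kubernetes";

const config = new pulumi.Config();
// Hypothetical secret config key, set with:
//   pulumi config set --secret openaiApiKey <value>
const openaiApiKey = config.requireSecret("openaiApiKey");

// Store the key in a Kubernetes Secret so the serving workload can consume it
// as an environment variable; Pulumi keeps the value encrypted in state.
const apiKeySecret = new k8s.core.v1.Secret("llm-api-key", {
    stringData: { OPENAI_API_KEY: openaiApiKey },
});

export const secretName = apiKeySecret.metadata.name;
```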
Where: BigMarker at https://pulumip.us/AI-Apps-Platform
Please register using the link above to attend and to receive links to the example code and the workshop recording.
Join our other upcoming workshops here: https://pulumip.us/UpcomingWorkshops