[Webinar] AI Inference Workloads: Solving Challenges Beyond Training Models

Details

Register here: https://info.datascience.salon/ai-inference-workloads-solving-challenges-beyond-training-models

In recent years, enterprise AI initiatives have made great strides in solving the challenges of training massive, distributed computational models. Data wrangling, experiment management, GPU resource allocation…these challenges have spawned an ever-growing market of new tools and considerable investment. Yet despite this progress, most enterprises still struggle to productize AI.

Our survey of more than 200 data scientists and MLOps/IT experts found that a majority of AI/ML models still aren't making it to production. AI/ML teams are now under pressure to optimize and manage AI inference workloads in production and deliver a return on investment.

In this webinar, we will walk through the distinct characteristics of each stage in the ML lifecycle and their computational requirements. We'll discuss solutions to improve throughput and reduce latency, and finally, we'll show how one organization built an efficient inference platform on top of Kubernetes to support its scaling AI initiatives.

“Rapid AI development is what this is all about for us. What Run:AI helps us do is to move from a company doing pure research, to a company with results in production.”

Siddharth Sharma, Sr. Research Engineer, Wayve

Hosted by Data Science Salon | South Florida