Scaling LLMs: from local to company-wide


Details
Talk
As the demand for LLMs grows, so does the need for scalable and efficient infrastructure to support them. But what does this infrastructure actually look like?
Given a reasonably fast PC, anyone can run an LLM themselves these days, but usually only the smaller, less powerful versions of a model. So what does it take to run the full-fledged, more powerful ones, ideally as a service for the whole company? Jonas tries to answer this question, speaking from his experience of building a cluster that runs the latest LLMs for a company of 900+ developers.
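To make the local-versus-service contrast concrete (an illustration, not material from the talk): a company-wide setup typically puts the model behind a shared, OpenAI-compatible HTTP endpoint that every developer's tools can call, instead of each person running a small model locally. A minimal Python sketch, where the gateway URL, API key, and model name are placeholders, not any real company's setup:

# Clients talk to a shared, OpenAI-compatible endpoint instead of a local model.
from openai import OpenAI

client = OpenAI(
    base_url="http://llm.internal.example.com/v1",  # hypothetical internal gateway
    api_key="not-needed-internally",                # placeholder; internal gateways often ignore it
)

response = client.chat.completions.create(
    model="some-large-model",  # placeholder model name
    messages=[{"role": "user", "content": "Summarize this merge request for me."}],
)
print(response.choices[0].message.content)

Because the endpoint speaks the same API as a local runtime, switching from a model on your own PC to the shared cluster is, from the client's point of view, just a change of base URL.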
Timeline
18:30–19:00 Doors Open
19:00–19:45 Talk: Scaling LLMs: from local to company-wide
19:45–... Food & Drinks, Networking Time
Hosted by TNG Technology Consulting GmbH
