Scaling LLMs: from local to company wide

Hosted By
Benjamin and Luise B.

Details

Talk

As the demand for LLMs grows, so does the need for scalable and efficient infrastructure to support them - but what does this infrastructure look like?
Given a reasonably fast PC, anyone can run an LLM for themselves by now. But these will mostly be the smaller, less powerful versions of a model. So what does it take to run the full-fledged, more powerful one - ideally as a service for the whole company? Jonas tries to answer this question, speaking from his experience of building a cluster that serves the most up-to-date LLMs for a company of more than 900 developers.

Timeline
18:30–19:00 Doors Open
19:00–19:45 Talk: Scaling LLMs: from local to company wide
19:45–... Food & Drinks, Networking Time
Hosted by TNG Technology Consulting GmbH

AI & Cloud Innovation Karlsruhe
TNG Technology Consulting GmbH
Amalienbadstraße 41a · Karlsruhe, BW