
Local Intelligence: Running LLMs Natively, No Cloud Required

Hosted by SSW

Details

Brisbane Full Stack User Group September: Local Intelligence: Running LLMs Natively, No Cloud Required with Jernej "JK" Kavka

Imagine an AI that runs entirely offline: fast, private, and directly connected to your application's logic. In this session, we'll explore how locally hosted large language models (LLMs) can interpret natural language prompts and invoke real-time functionality without ever touching the cloud. From smart environments to embedded devices, we'll demonstrate how this approach unlocks new possibilities for responsiveness, autonomy, and data sovereignty. We'll wrap up by showing how .NET 9's new AI-native method invocation makes this not just possible but practical.
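To give a flavour of the pattern JK will be demonstrating, here is a minimal sketch of local-model function calling in .NET. It assumes the Microsoft.Extensions.AI abstractions with the Ollama provider package and an Ollama server running on the default local port; the model name and the `SetLight` function are illustrative, not part of the talk's actual demo.

```csharp
using System;
using System.ComponentModel;
using Microsoft.Extensions.AI;

// A local function the model is allowed to invoke.
[Description("Turns a named light on or off.")]
static string SetLight(string room, bool on) =>
    $"Light in {room} is now {(on ? "on" : "off")}.";

// Chat client pointed at a locally hosted model -- no cloud round-trips.
IChatClient client =
    new OllamaChatClient(new Uri("http://localhost:11434"), "llama3.2")
        .AsBuilder()
        .UseFunctionInvocation() // lets the model call SetLight automatically
        .Build();

var options = new ChatOptions
{
    Tools = [AIFunctionFactory.Create(SetLight)]
};

// The model interprets the prompt, decides to call SetLight,
// and the middleware executes it and feeds the result back.
var response = await client.GetResponseAsync(
    "Please turn on the kitchen light.", options);
Console.WriteLine(response.Text);
```

Everything here runs on the developer's machine: the prompt, the model inference, and the invoked method, which is the data-sovereignty angle the session covers.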

Agenda:
6:00pm - Pizza and networking at SSW Brisbane
6:30pm - The Tech news with Eve and Adam Cogan, followed by
Local Intelligence: Running LLMs Natively, No Cloud Required with Jernej "JK" Kavka

Can't make it? Join us on the live stream! 🔴
➡️ https://www.ssw.com.au/live

Brisbane Full Stack User Group - Angular + React + .NET Core
FREE