
An Introduction to Running Generative AI (LLMs) Locally

Hosted By
Alfred E.

Details

Ever wondered how you can run Generative AI models (LLMs) locally on your computer? In this hands-on, beginner-friendly session, we’ll show you how to install and use Ollama, a powerful but simple tool that makes running local large language models easy.

No experience needed—we’ll guide you step by step:

  • What Ollama is and why local LLMs are exciting
  • How to install it on your laptop or desktop
  • How to load and use a variety of open-source models, including DeepSeek, LLaMA, and more
  • How to interact with these models and explore their capabilities (example commands below)
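
As a rough preview (assuming a typical macOS or Linux terminal with Ollama already installed; the model names below, such as llama3.2 and deepseek-r1, are just examples and may differ from the ones used in the session), the basic workflow looks like this:

  # Download a model from the Ollama library
  ollama pull llama3.2

  # See which models are installed on your machine
  ollama list

  # Start an interactive chat session right in the terminal
  ollama run llama3.2

Typing a question at the prompt sends it to a model running entirely on your own hardware, so nothing leaves your machine.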

By the end, you’ll have your own local AI assistant up and running—no cloud, no fees, and no coding required.

Generative AI for Absolute Beginners
Online event
FREE