We all love adding smart features to our apps — whether it’s chat, suggestions, or summarizing content. But what happens when the user has no internet? Do we just tell them, “Oops, try again later”?
In this talk, I want to show you that we don’t have to rely on cloud AI services for everything. Thanks to new lightweight models like Gemma, we can now bring AI directly onto the device — and yes, that includes Flutter apps!
I’ll walk you through what on-device AI means (in simple terms), why it matters — especially in regions with unreliable internet — and how to start building with it even if you’re not an AI expert.
We’ll build a small offline smart assistant app in Flutter together. It’ll be able to summarize notes or respond to prompts without needing to connect to any server. Everything happens locally, on the device.
If you’re curious about AI but find it overwhelming, this session is for you. I’ll break down the moving parts — Gemma (a family of lightweight open models) and local runtimes like llama.cpp and Ollama — explain how they fit into a Flutter app, and give you the confidence to start building your own offline AI features. No PhD required.
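To give a taste of what “everything happens locally” means in practice, here is a minimal sketch of how a client talks to a Gemma model served by Ollama on the device’s default local port. A Flutter app would make the same HTTP call with `package:http`; this sketch uses Python for brevity, and the model name and prompt wording are illustrative assumptions, not part of the talk’s demo code.

```python
import json
import urllib.request

# Once `ollama serve` is running, Ollama exposes a local HTTP API on
# port 11434 by default. No request ever leaves the machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_summarize_request(note: str, model: str = "gemma:2b") -> bytes:
    """Build the JSON body Ollama's /api/generate endpoint expects."""
    payload = {
        "model": model,  # illustrative: any locally pulled model tag works
        "prompt": f"Summarize this note in two sentences:\n{note}",
        "stream": False,  # ask for one JSON object instead of a token stream
    }
    return json.dumps(payload).encode("utf-8")

def summarize(note: str) -> str:
    """Send the prompt to the local model and return its reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_summarize_request(note),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

The key design point is that the “AI backend” is just a process on the same device, reached over localhost — swap the URL for a cloud endpoint and nothing else in the app changes, which is what makes the offline-first approach so approachable.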