🧙‍♂️ Vibe Coding from Your Terminal with Claude Code + Local Models
Ever wanted to just describe what you need and have working code appear? That's vibe coding, and now you can do it locally.
Claude Code is Anthropic's agentic coding CLI. By default it calls their cloud API, but Ollama now supports Anthropic API compatibility, so you can point Claude Code at your local models instead. All the polish of Anthropic's CLI, none of the cloud dependency.
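Concretely, the redirection is just a few environment variables. A minimal sketch, assuming Ollama is serving on its default port (11434) and you've already pulled `qwen3-coder`; `ANTHROPIC_BASE_URL` and `ANTHROPIC_MODEL` are Claude Code settings, and the auth token here is a dummy value since a local Ollama doesn't validate it:

```shell
# Point Claude Code at a local Ollama server instead of Anthropic's cloud.
# Assumes Ollama's default port (11434); the token is a placeholder.
export ANTHROPIC_BASE_URL="http://localhost:11434"
export ANTHROPIC_AUTH_TOKEN="ollama"
export ANTHROPIC_MODEL="qwen3-coder"
```

With those set, launching `claude` in a project directory should route requests to your local model instead of the cloud.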
🧙‍♂️ What We'll Cover
- Installing Claude Code (just needs Node.js)
- Running qwen3-coder locally through Ollama
- Pointing Claude Code at localhost instead of the cloud
- Basic workflow for generating, refactoring, and explaining code
- When vibe coding saves time vs. when it's faster to just write the code yourself
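As a preview, the setup boils down to two installs and a sanity check. The npm package name is Anthropic's published CLI and `qwen3-coder` is the model tag in Ollama's library; run the commented commands yourself, since the loop below only confirms what's already on your PATH:

```shell
# Setup sketch. Run the install steps yourself first:
#   npm install -g @anthropic-ai/claude-code
#   ollama pull qwen3-coder
# Then confirm both tools landed on your PATH:
for tool in claude ollama; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "ok: $tool"
  else
    echo "missing: $tool"
  fi
done
```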
🧙‍♂️ Who's This For?
Anyone comfortable with the terminal who wants AI-assisted coding without sending their code to someone else's servers. No cloud subscriptions required. Local AI doing useful things.
☕ What This Is (And Isn't)
This is a casual hobbyist demo, not a professional workshop. Think of it like showing a friend something cool you found over coffee. I'll walk through how I set this up on my own machine, share what worked, and we'll chat about it.
No curriculum. No certificates. No polished slides. Just a tech enthusiast sharing something interesting with other tech enthusiasts.
🌐 Tech Stuff Worth Knowing
I'll be demonstrating on an M4 Mac. This works on Linux too; I'm not a Windows user, so I can't speak to that setup.
If you want to follow along, you'll need:
- macOS or Linux (Windows untested)
- Node.js installed
- Ollama running locally
- Enough RAM to run a coding model (16GB minimum, 32GB+ recommended)
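If you're not sure how much RAM your machine has, here's a quick way to check from the terminal (macOS reports bytes via `sysctl`, Linux reports kilobytes in `/proc/meminfo`):

```shell
# Print installed RAM in GB on macOS or Linux.
if [ "$(uname)" = "Darwin" ]; then
  bytes=$(sysctl -n hw.memsize)
  echo "RAM: $((bytes / 1024 / 1024 / 1024)) GB"
else
  kb=$(awk '/MemTotal/ {print $2}' /proc/meminfo)
  echo "RAM: $((kb / 1024 / 1024)) GB"
fi
```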
We won't be troubleshooting individual setups during the session. This is a walkthrough, not hands-on tech support.
