
Coding LLM APIs — From Cloud to Local

Hosted by Alfred E.

Details

In this workshop, we’ll write real code to interact with both proprietary LLM APIs (from providers like OpenAI and Anthropic) and open-source models running locally (for example, via Ollama).

This is your chance to move from conceptual understanding to building working prototypes using Python and REST APIs.

1. Coding a Proprietary LLM API (OpenAI / Anthropic)

  • Set up API keys securely
  • Write Python code to send a prompt and receive a response (see the sketch after this list)
  • Handle parameters (temperature, max tokens, etc.)

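As a preview of this step, here is a minimal sketch using the official openai Python SDK (v1 and later). The model name gpt-4o-mini is a placeholder assumption; substitute any chat model your account can access. The same pattern (build a client, send messages, read back a completion) carries over to Anthropic's SDK with minor changes.

```python
import os

from openai import OpenAI

# The client reads OPENAI_API_KEY from the environment, so the key
# never has to appear in source code.
if "OPENAI_API_KEY" not in os.environ:
    raise RuntimeError("Set OPENAI_API_KEY before running this script.")

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; use any chat model you have access to
    messages=[{"role": "user", "content": "Explain REST in one sentence."}],
    temperature=0.7,      # higher values produce more varied output
    max_tokens=100,       # cap on the number of generated tokens
)

print(response.choices[0].message.content)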
2. Coding a Local LLM API with Ollama

  • Start an Ollama server and test locally
  • Use Python (requests or similar) to send prompts (see the sketch after this list)
  • Compare performance and output with proprietary APIs

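Here is a comparable sketch against Ollama's local REST API, assuming the server is running on its default port (11434) and that a model has already been pulled (e.g., ollama pull llama3; the model name is an assumption). Timing the proprietary and local calls the same way gives a crude first performance comparison.

```python
import time

import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

payload = {
    "model": "llama3",  # assumes `ollama pull llama3` has been run
    "prompt": "Explain REST in one sentence.",
    "stream": False,    # return the full response as a single JSON object
}

start = time.perf_counter()
resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
resp.raise_for_status()
elapsed = time.perf_counter() - start

print(resp.json()["response"])
print(f"Local round trip: {elapsed:.1f}s")  # compare against the cloud API's latency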
3. Bonus: Wrapping It in a Simple App

  • Create a basic CLI or web interface (e.g., Streamlit or Flask); a minimal Streamlit sketch follows below
  • Think about extending the code for your own projects
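
To illustrate the bonus step, here is a minimal Streamlit sketch that forwards a prompt to the local Ollama endpoint from the previous section. Save it under a name of your choosing (app.py here is hypothetical) and launch it with streamlit run app.py; the model name is again an assumption.

```python
import requests
import streamlit as st

st.title("Local LLM Playground")

prompt = st.text_area("Prompt", "Explain REST in one sentence.")

if st.button("Send"):
    # Forward the prompt to the local Ollama server from the previous section.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3", "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    st.write(resp.json()["response"])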
Vibe Code AI for Absolute Beginners
FREE