Let's Talk Ollama - Memphis (Virtual Python User Group)
Details
This month, we'll take a longer look at Ollama, the open-source runtime that lets you run large language models (LLMs) locally. Topics include:
- Installing Ollama
- To GPU or not to GPU
- Managing models
- Using the REST API
- Writing Python apps with Ollama
- Multimodal models
- Customizing models
- AI Toolkit for Visual Studio Code
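As a taste of the REST API and Python topics above, here is a minimal sketch of calling Ollama's local `/api/generate` endpoint using only the standard library. It assumes Ollama is running on its default port (11434) and that a model such as "llama3.2" has already been pulled; the model name and helper names here are illustrative, not part of the session materials.

```python
import json
import urllib.request

def build_payload(prompt, model="llama3.2"):
    """Build a non-streaming /api/generate request body."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt, model="llama3.2", host="http://localhost:11434"):
    """POST the prompt to a local Ollama server and return the response text."""
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(build_payload(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

if __name__ == "__main__":
    try:
        print(generate("Why is the sky blue? Answer in one sentence."))
    except OSError:
        # No server listening: Ollama is probably not installed or not running.
        print("Ollama does not appear to be running on localhost:11434")
```

The same request can be made with the official `ollama` Python package, which wraps this API; the session covers both approaches.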
The session will not be recorded, so everyone can feel comfortable interacting with the content.
Artificial Intelligence
Machine Learning
Python
Open Source
Software Development