ClePy & CLE PyLadies Collaboration: Self-Hosting LLMs


Details
We're back for the next session in the LLM workshop series collab with Cleveland PyLadies. This time we'll be talking about how to run LLMs locally on your own machine.
We'll go over the benefits of running LLMs locally, system requirements, and the different ways to do it, then walk through an example of installing Ollama and downloading and running a small LLM on a laptop.
Bring your laptops if you'd like to follow along!
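For a preview, here is a minimal sketch of the kind of thing we'll do in the workshop. It assumes Ollama is already installed and its local server is running, that you've pulled a small model at the command line (e.g. ollama pull llama3.2 — the model name here is just an illustration), and that you've installed the ollama Python client with pip install ollama.

    # Minimal sketch: chat with a locally running model via the ollama Python client.
    # Assumes the Ollama server is running and the model has already been pulled.
    import ollama

    response = ollama.chat(
        model="llama3.2",  # any small model you have pulled locally
        messages=[{"role": "user", "content": "In one sentence, why run an LLM locally?"}],
    )
    print(response["message"]["content"])

Everything stays on your machine: the script talks only to the local Ollama server, not to an external API.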
Agenda
6:00-6:30pm Social and Setup
6:30-8:00pm Self-Hosting LLMs
8:00-8:30pm Social and Clean-up
Skill Level: This event is better suited to intermediate or advanced programmers. Python knowledge is recommended but not required.
If you initially RSVP'd yes but decide not to come to the meetup, please change your response so we have an accurate headcount.
Want to present a talk? Let us know on Meetup or in the #clepy channel on Cleveland Tech Slack.
Want to present a Module of the Month? See here: https://github.com/CLEpy/CLEpy-MotM/blob/master/CONTRIBUTING.md
Join the Cleveland Tech Slack group here: https://cleveland-tech.vercel.app/
