Large language models (LLMs) like ChatGPT, Claude, Gemini, DeepSeek, and others are great at taking in natural language questions and producing complex but useful natural language replies. But by themselves, they can’t do anything except output text. Until recently, connecting an LLM to other systems meant writing custom integration code for each combination of LLM and online service.
Enter MCP, short for “Model Context Protocol.” Introduced to the world by Anthropic not too long ago, it’s an open source, open standard that lets AI models connect to external data sources and tools. You could describe MCP as “a universal API for AIs to interact with online services.”
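To make that “universal API” idea a little more concrete: under the hood, MCP clients and servers exchange JSON-RPC 2.0 messages. Here’s a sketch of what a client’s request to run a server-provided tool looks like; the tool name `get_weather` and its arguments are made-up placeholders for illustration, not part of any real server.

```python
import json

# MCP messages follow JSON-RPC 2.0. When an LLM's client wants a server
# to run one of its tools, it sends a "tools/call" request like this.
# The tool name and arguments below are hypothetical examples.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",
        "arguments": {"city": "Tampa"},
    },
}

# This is the JSON that actually travels over the wire.
print(json.dumps(request, indent=2))
```

Because every MCP server speaks this same message format, any MCP-aware client can discover and call its tools without custom glue code for each service.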
Join the Tampa Bay Artificial Intelligence Meetup and Tampa Bay Python on Wednesday, June 18 at Embarc Collective for an introduction to MCP! We’ll explain what it is, why it was created, and how it came about, and then show you actual working code by building a simple MCP server connected to an LLM. Joey will demonstrate a couple of examples of MCP in action — examples you can take home and experiment with!
Bring your laptop — you’ll have the opportunity to try out the MCP demos for yourself!
And yes, there’ll be food (most likely pizza), and water and soft drinks will be provided. If that doesn’t work for you, feel free to bring your own.
This event is possible thanks to the Tampa Bay Tech Meetup Scholarship, provided by Bank of America and Embarc Collective. Thanks, both of you; we couldn’t do this without you!