Scalable AI Pipelines: Integrating Apache Beam, LLMs, and MCP
Olá Porto! 👋
We are excited to announce our next technical deep dive, focusing on the future of scalable AI-powered data processing!
Save the date! 📅 Thursday, December 4th
⌚️ Check-in starts at 18:30
🕖 Presentation starts at 19:00
We are thrilled to host Davide Razzoli for a comprehensive talk on building robust, intelligent data pipelines:
➡️ Building Scalable AI-Powered Data Pipelines with Apache Beam, LLMs & MCP
"In this tech talk, we'll explore how to build scalable, AI-powered data processing pipelines using Apache Beam, Large Language Models (LLMs), and the Model Context Protocol (MCP).
I’ll start by introducing the core concepts behind these technologies, explaining how they enable efficient distributed data processing and seamless AI integration. Then, I’ll walk through a hands-on demo developed in Java.
Apache Beam is an open-source, unified programming model for building batch and streaming data pipelines. It provides powerful abstractions that hide low-level details of distributed data processing.
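To give a flavour of the Beam programming model ahead of the talk, here is a minimal word-count pipeline sketch in Java (assuming the Beam Java SDK and a runner are on the classpath; the file paths are illustrative):

```java
import java.util.Arrays;

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.FlatMapElements;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.TypeDescriptors;

public class WordCount {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create();

    pipeline
        // Read each line of the input file (path is illustrative).
        .apply(TextIO.read().from("input.txt"))
        // Split each line into individual words.
        .apply(FlatMapElements.into(TypeDescriptors.strings())
            .via((String line) -> Arrays.asList(line.split("\\s+"))))
        // Count the occurrences of each word.
        .apply(Count.perElement())
        // Format each result as "word: count".
        .apply(MapElements.into(TypeDescriptors.strings())
            .via((KV<String, Long> wc) -> wc.getKey() + ": " + wc.getValue()))
        // Write the results to output files.
        .apply(TextIO.write().to("word-counts"));

    pipeline.run().waitUntilFinish();
  }
}
```

The same pipeline code runs unchanged on any supported runner (Direct, Dataflow, Flink, Spark, ...), which is what makes the model portable across batch and streaming backends.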
MCP (Model Context Protocol) is an open standard for connecting AI applications to external systems such as databases, search engines, and other tools, enabling models to interact with real-world data.
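Under the hood, MCP clients and servers speak JSON-RPC 2.0. As a sketch of the protocol, this is roughly how a client discovers which tools a server exposes (message shape based on the MCP specification; the `get_weather` tool is a made-up example):

```json
// Client request:
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }

// Server response:
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "inputSchema": {
          "type": "object",
          "properties": { "city": { "type": "string" } },
          "required": ["city"]
        }
      }
    ]
  }
}
```

The model can then invoke a tool with a `tools/call` request, and the server returns the result for the model to use as context.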
By the end of this session, attendees will gain practical insights into:
- Building distributed data pipelines with Apache Beam
- Integrating LLMs for intelligent data processing
- Using MCP to connect AI models to external systems"
This is a must-attend for data engineers and developers working with large-scale data and AI!
After the talk, stay and connect! We'll have a casual networking session with 🍕 and drinks, the perfect opportunity to share insights with fellow professionals and our speaker.
The event is free to attend, but registration 🎫 is mandatory to secure your spot. See you there!
