Apache Kafka® x Apache Flink® x Apache Iceberg


Details
Join us for an Apache Kafka® x Apache Flink® x Apache Iceberg meetup on October 6th, starting at 6:00pm, hosted by meshIQ!
📍Venue:
BootUp World: Startup Ecosystem Co-Working, Office Suites & Event Space
585 Glenwood Ave, Menlo Park, CA 94025
2nd Floor
🗓 Agenda:
- 6:00pm: Doors open
- 6:00pm – 6:30pm: Welcome & Food/Drinks, Networking
- 6:30pm – 7:00pm: Matthew Seal, Principal Engineer, Confluent
- 7:00pm – 7:30pm: JB Onofre, Director of the ASF & OSPO Lead at Dremio
- 7:30pm – 8:00pm: Vishwanath Srinivasan, Manager, Solutions Engineering, Confluent
- 8:00pm – 8:30pm: Additional Q&A and Networking
💡 Speaker One:
Matthew Seal, Principal Engineer, Confluent
Title of Talk:
Real Time Streaming Development: Combining Copilot, Apache Kafka®, and Confluent Cloud for Apache Flink® into one VSCode Extension
Abstract:
We'll explore a tool that aims to be a force multiplier for streaming development, including how we added direct augmentations for Copilot to enable deeper, contextual AI answers and actions without leaving your IDE. In describing what this VSCode extension does, you'll get to explore the interfaces we built and the various ways to use them, with and without AI. Overall, we'll outline why developers need such an interface, show off what we've built, and share a success story of gathering user pain points in a domain and solving them quickly.
Bio:
Before his current role, Matthew was co-founder and CTO of a startup called Noteable, which built real-time collaborative notebooks before the company was sold to Confluent in 2023. His background spans Netflix and earlier startups, with a focus on big data ecosystems and unsupervised machine learning.
💡 Speaker Two:
JB Onofre, Director of the ASF & OSPO Lead at Dremio
Title of Talk:
How to leverage Apache ActiveMQ and Apache Kafka to inject data into Apache Iceberg with Apache Polaris (incubating)
Abstract:
After a quick update on Apache Iceberg and Polaris, we will explore the different ways to inject data into Iceberg tables: Kafka Connect, Apache Camel, and a custom "injector". What are the pros and cons of each approach?
And what challenges can we face along the way (snapshot management, schema evolution, ...)?
We will illustrate one of these approaches with a quick demo.
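For those curious about the Kafka Connect route ahead of the talk, here is a minimal sketch (in Python) of registering an Iceberg sink connector through the Kafka Connect REST API. The connector class and the iceberg.* property names are assumptions based on the Apache Iceberg Kafka Connect sink, and the topic, table, and catalog endpoint are placeholders, so treat this as an illustration rather than the speaker's demo.

# Sketch only: register an Iceberg sink connector via the Kafka Connect REST API.
# The connector class and iceberg.* property names are assumptions based on the
# Apache Iceberg Kafka Connect sink; check your connector version's docs for exact names.
import requests

connector = {
    "name": "orders-iceberg-sink",  # hypothetical connector name
    "config": {
        "connector.class": "org.apache.iceberg.connect.IcebergSinkConnector",
        "topics": "orders",                              # hypothetical source topic
        "iceberg.tables": "db.orders",                   # target Iceberg table
        "iceberg.catalog.type": "rest",                  # e.g. a REST catalog such as Polaris
        "iceberg.catalog.uri": "http://localhost:8181",  # hypothetical catalog endpoint
        "iceberg.tables.auto-create-enabled": "true",    # create the table if it does not exist
        "iceberg.tables.evolve-schema-enabled": "true",  # let the sink evolve the table schema
    },
}

# Kafka Connect workers expose a REST API (default port 8083) for creating connectors.
resp = requests.post("http://localhost:8083/connectors", json=connector, timeout=10)
resp.raise_for_status()
print(resp.json())

The table-creation and schema-evolution settings above are exactly where the challenges the abstract mentions tend to surface.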
Bio:
JB is a Director of the ASF and a PMC member on roughly 20 Apache projects. He also leads the OSPO at Dremio.
💡 Speaker Three:
Vishwanath Srinivasan, Manager, Solutions Engineering, Confluent
Title of Talk:
Flink through the mind of a Chess player
Abstract:
Streaming systems and chess have more in common than you think: both revolve around sequences, state, timing, and pattern recognition. This talk introduces Apache Flink through the lens of chess, using the familiar game to make real-time concepts more accessible to engineers and data practitioners of all levels.
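To give a small taste of the per-key state idea the analogy leans on (tracking each game's board separately), here is a minimal PyFlink sketch; the game/move data and names are purely illustrative and not taken from the talk.

# Sketch only: per-key (per-game) state in PyFlink, counting moves for each game.
from pyflink.common import Types
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.datastream.functions import KeyedProcessFunction, RuntimeContext
from pyflink.datastream.state import ValueStateDescriptor


class MoveCounter(KeyedProcessFunction):
    """Keeps one counter per game id, the way a player tracks one board at a time."""

    def open(self, runtime_context: RuntimeContext):
        self.count = runtime_context.get_state(ValueStateDescriptor("count", Types.INT()))

    def process_element(self, value, ctx):
        game_id, move = value
        current = (self.count.value() or 0) + 1  # state is scoped to the current key (game)
        self.count.update(current)
        yield game_id, move, current  # e.g. ("game1", "Nf3", 3)


env = StreamExecutionEnvironment.get_execution_environment()
moves = env.from_collection(
    [("game1", "e4"), ("game2", "d4"), ("game1", "e5"), ("game1", "Nf3")],
    type_info=Types.TUPLE([Types.STRING(), Types.STRING()]),
)
moves.key_by(lambda m: m[0]).process(MoveCounter()).print()
env.execute("chess_moves_sketch")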
Bio:
Vish is a Solutions Engineering Leader at Confluent, helping a number of tech companies in the Bay Area become more event-driven and process real-time events at scale. Before this, he spent over 10 years in the data/app integration space, working with connectors, REST APIs, and other data/middleware technologies.
