

What we’re about
New to Data Streaming? Confluent Developer is full of free resources to put you on the right path!
Get involved with an upcoming event. First-time speakers are welcome!
Interested in speaking or providing a venue? See our rules of engagement, then get in touch ([community@confluent.io](mailto:community@confluent.io)) or, to become a local champion for this group, see our Meetup in a Box Program.
This meetup is for anyone interested in Data Streaming. Whether you are totally new to the space or an accomplished Data Streaming Engineer, you are welcome! At Data Streaming events, talks will relate to Confluent, Apache Kafka®, Apache Flink®, stream processing, security, microservices, cloud topics, Kafka Streams, Apache Iceberg®, AI-related topics, and anything else adjacent to the world of data streaming!
This group is part of the Confluent Community: Find all our programs and spaces here
- Join over 45k members in our online Slack community
- See upcoming meetups, in person and online, and find recordings of past events!
- Meet our Community Catalysts (MVPs), nominate someone ideal or find out how to become one yourself!
- If you’re here, you should probably know about Current, the premier data streaming conference. Find out where it is going next!
Important:
Our group goal is to provide the opportunity for participants to learn, communicate, contribute and collaborate. Confluent’s Community Code of Conduct governs how we all participate and behave.
Apache Kafka, Kafka®, Apache Iceberg, Iceberg®, Apache Flink, Flink®, Apache®, the squirrel logo, and the Apache feather logo are either registered trademarks or trademarks of The Apache Software Foundation.
Upcoming events (1)
- IN-PERSON: Apache Kafka® x Apache Flink® Meetup at Autodesk, Dublin
Join us for an Apache Kafka® x Apache Flink® meetup on Wednesday October 1st from 6:00pm in Dublin hosted by Autodesk!
📍Venue:
Autodesk
1 Windmill Lane, 2nd Floor, Dublin D02 F206
NOTE: IF YOU RSVP HERE, YOU DON'T ALSO NEED TO RSVP AT Dublin Apache Kafka®
🗓 Agenda:
- 6:00pm: Doors open
- 6:00pm – 6:30pm: Food, Drinks & Networking
- 6:30pm - 7:15pm: Nitish Tiwari, Founder & CEO, Parseable
- 7:15pm - 8:00pm: John Watson, Principal Engineer, Autodesk & Petr Novicky, Principal Engineer, Autodesk
- 8:00pm - 9:00pm: Additional Q&A and Networking
💡 First Speaker:
Nitish Tiwari, Founder & CEO, Parseable
Title of Talk:
Making sense of Kafka metrics with Agentic design
Abstract:
In this talk, we will look at how to track and export metrics from a production Kafka cluster to an observability system like Parseable. We'll then dive deep into the metrics data and its implications. Finally, we'll look at an LLM-based, agentic-style workflow for predicting metric data points, setting up relevant alerts, and creating actionable insights from the data.
Bio:
Nitish is the Founder and CEO of Parseable Inc. At Parseable, Nitish and team are building the next-generation infrastructure and tooling for observability. Parseable is one of the fastest, purpose-built, full-stack observability platforms. With more than 15 years of experience in the software industry, Nitish has previously worked at data infrastructure organisations like MinIO and DataStax.
💡 Second Speakers:
John Watson, Principal Engineer, Autodesk & Petr Novicky, Principal Engineer, Autodesk
Title of Talk:
Lessons from the Journey: Evolving Autodesk’s Streaming Platform
Abstract:
Building a streaming platform at scale requires constant evolution. In this session, we'll dive deeper into the technical journey of building a reliable data streaming platform at Autodesk capable of processing billions of events per day. We will share some real-world lessons learnt while using Apache Flink and Kafka.
You'll learn about our key architectural shifts, including:
- Decomposed Flink jobs: We'll show you how we broke up a monolithic Flink job into a more flexible model, writing materialized entities to a Kafka backbone for decoupled consumption.
- Platform as a Product: We'll discuss how we're building a self-service platform with tools to give development teams more control over their data pipelines.
- KDS to embedded Debezium migration: We'll explain why we migrated our change data capture strategy from Kinesis Data Streams (KDS) to Debezium embedded in Flink, sharing the benefits and trade-offs we encountered.
Join us to learn practical strategies for building and evolving a resilient streaming platform.
Bios:
John is a Principal Engineer at Autodesk and a key member of the data streaming and processing platform team. With a deep passion for data streaming and backend development, he is currently building a highly scalable and reliable data streaming platform to handle complex data challenges.
Petr is a Principal Engineer at Autodesk and is part of the data streaming and processing platform team. As a passionate backend software engineer, he thrives in the dynamic world of technology, adeptly utilizing a variety of programming languages. His current focus is on leveraging Apache Flink for real-time data processing. With a commitment to continuous learning, Petr enjoys tackling complex problems and creating innovative, scalable solutions across diverse domains.
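The first talk is about turning Kafka metrics into alerts and insights. As a purely illustrative, self-contained sketch (not code from the talk, and with hypothetical sample data), here is one simple way to flag anomalous spikes in a consumer-lag series using a rolling mean-plus-k-sigma threshold:

```python
from statistics import mean, stdev

def lag_alerts(samples, window=5, k=3.0):
    """Flag indices whose value sits more than k standard deviations
    above the mean of the preceding `window` samples."""
    alerts = []
    for i in range(window, len(samples)):
        history = samples[i - window:i]
        mu, sigma = mean(history), stdev(history)
        # Floor sigma so a flat history doesn't alert on tiny jitter.
        if samples[i] > mu + k * max(sigma, 1.0):
            alerts.append(i)
    return alerts

# Hypothetical consumer-lag samples (messages behind, one per minute).
lag = [120, 115, 130, 125, 118, 122, 950, 119, 121]
print(lag_alerts(lag))  # the spike at index 6 is flagged: [6]
```

In practice such samples would be scraped from Kafka's JMX/metrics endpoints and shipped to an observability backend; the agentic workflow described in the abstract presumably layers LLM-driven prediction and triage on top of simple threshold rules like this.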
***
DISCLAIMER
We are unable to admit any attendees under the age of 18.
If you would like to speak at or host our next event, please let us know! community@confluent.io