Elastic x Apache Kafka Phoenix Meetup


Details
Join the Elastic Phoenix User Group and the Phoenix / Scottsdale Apache Kafka meetup for a joint meetup on Thursday, March 6th. We'll have exciting presentations from Viktor Gamov (Confluent) and Andreas Christoforides (Elastic), followed by networking, light bites, and refreshments.
Date: Thursday, March 6th
Time: 5:30 – 8:00 PM
Location: CO+HOOTS – 221 E Indianola Ave, Phoenix, AZ 85012 (Classroom)
Featured Speakers:
- Viktor Gamov (Confluent) – Data Inception: Processing Dreams Within Dreams with Flink and Kafka
- Andreas Christoforides (Elastic) – Introduction to Elastic Connectors
Parking: CO+HOOTS is located at 221 E. Indianola Ave., Phoenix, AZ 85012 (3rd St & Indianola). Feel free to park right under the building and walk the few steps to the middle set of stairs.
Check-in: A CO+HOOTS community assistant will be at the front desk to direct you.
Agenda:
- 5:30 PM – Doors open & attendee check-in
- 5:50 PM – Data Inception: Processing Dreams Within Dreams with Flink and Kafka – Viktor Gamov, Principal Developer Advocate, Confluent
- 6:30 PM – Introduction to Elastic Connectors – Andreas Christoforides, Principal Consulting Architect, Elastic
- 7:00 – 8:00 PM – Networking & refreshments
- 8:00 PM – Event ends
Talk Abstracts:
Data Inception: Processing Dreams Within Dreams with Flink and Kafka
Viktor Gamov, Principal Developer Advocate, Confluent
In today's data landscape, processing information after it's been stored is not only slow; it's also outdated. Your dashboards often lag behind reality, and your data pipelines struggle to meet real-time demands. But what if you could process data in motion the moment it's created?
Enter the powerful combination of Apache Kafka and Apache Flink, where familiar SQL meets stream processing. Think of Kafka as your nervous system, transmitting signals throughout your data architecture, while Flink serves as your brain, processing these continuous signals in real-time. We will start with Flink SQL, transforming traditional query patterns into continuous streaming computations over Kafka topics.
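To give a flavor of what that looks like, here is a minimal sketch (not material from the talk; the topic, field names, and broker address are purely illustrative) of registering a Kafka topic as a Flink table and running a continuous windowed aggregation over it with Flink's Table API, assuming the Flink Kafka SQL connector is on the classpath:
```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class KafkaFlinkSqlSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv =
            TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Declare a Kafka topic as a dynamic table (illustrative schema and topic name).
        tEnv.executeSql(
            "CREATE TABLE orders (" +
            "  order_id STRING," +
            "  amount DOUBLE," +
            "  ts TIMESTAMP(3)," +
            "  WATERMARK FOR ts AS ts - INTERVAL '5' SECOND" +
            ") WITH (" +
            "  'connector' = 'kafka'," +
            "  'topic' = 'orders'," +
            "  'properties.bootstrap.servers' = 'localhost:9092'," +
            "  'scan.startup.mode' = 'earliest-offset'," +
            "  'format' = 'json')");

        // A familiar-looking query that never terminates: the per-minute totals
        // keep updating as new events arrive on the Kafka topic.
        tEnv.executeSql(
            "SELECT window_start, window_end, SUM(amount) AS total " +
            "FROM TABLE(TUMBLE(TABLE orders, DESCRIPTOR(ts), INTERVAL '1' MINUTES)) " +
            "GROUP BY window_start, window_end").print();
    }
}
```
Unlike a batch query, the final SELECT runs continuously; its result is a stream of updates rather than a one-off answer.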
As we dive deeper, you'll learn how Flink's Table API combines SQL's simplicity with more sophisticated processing needs. You'll also see how the DataStream API addresses complex streaming scenarios while seamlessly integrating with Kafka's event streams.
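For comparison, here is a similarly small sketch (again illustrative, with hypothetical topic and consumer-group names) of consuming a Kafka topic as an unbounded stream with the DataStream API's KafkaSource:
```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class KafkaDataStreamSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Consume a Kafka topic as an unbounded stream of raw strings.
        KafkaSource<String> source = KafkaSource.<String>builder()
            .setBootstrapServers("localhost:9092")
            .setTopics("orders")
            .setGroupId("meetup-demo")
            .setStartingOffsets(OffsetsInitializer.earliest())
            .setValueOnlyDeserializer(new SimpleStringSchema())
            .build();

        DataStream<String> orders =
            env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-orders");

        // Arbitrary per-event logic would go here; printing stands in for real processing.
        orders.print();

        env.execute("kafka-datastream-sketch");
    }
}
```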
Through practical examples and real-world architectures, you will learn:
- How Flink transforms static SQL queries into dynamic computations over Kafka streams
- When to leverage the Table API and how to master the DataStream API for complex processing
- Essential patterns for managing state, handling time, and ensuring exactly-once processing
- How to build resilient streaming architectures that combine Kafka's durability with Flink's processing power
Whether you are building real-time analytics or processing sensor data, this session will give you a practical understanding of how Kafka and Flink work together. Join us to discover why companies worldwide choose this powerful combination and learn how to evolve from static queries to dynamic streams!
Introduction to Elastic Connectors
Andreas Christoforides, Principal Consulting Architect, Elastic
Elastic connectors are built using the Elastic Connector Framework and streamline the ingestion of diverse data sources into Elasticsearch. We'll explore the different connector types, deployment options, and how the connector framework integrates with the Elastic Stack.
