
Data Mash #2: Tackling Data in Real Time (Talks by Reply & Confluent)


Details

Hello everyone! Join us for a new meetup on October 4th from 6:00 pm, hosted by Data Reply in Munich!

🗓 Agenda:

  • 6:00pm - 6:30pm: Food, Drinks 🍻🥤 & Networking
  • 6:30pm - 7:15pm: Nassim Uhrmann, IT Consultant, Data Reply & Sahand Zarrinkoub, IT Consultant, Data Reply
  • 7:15pm - 8:00pm: Benedikt Linse, Staff Solutions Architect, Confluent
  • 8:00pm - 8:15pm: Wrap-up & Networking

***

💡 Speaker 1:
Nassim Uhrmann, IT Consultant, Data Reply

Bio:
Nassim is a software developer experienced in building microservice-based applications. His primary focus at Data Reply is to help customers gain valuable insights from their data by leveraging Apache Kafka’s event streaming capabilities.

💡 Speaker 2:
Sahand Zarrinkoub, IT Consultant, Data Reply

Bio:
Sahand is a data engineer who has worked at Data Reply since 2020. His responsibilities include the development, maintenance, and analysis of various ETL applications using both real-time and batch-based approaches.

Talk:
Building a Scalable Analytics Application With Real-Time PS4 F1 Data

Abstract:
We developed a real-time analytics application built around the popular PlayStation 4 game F1. Leveraging Kafka’s event streaming capabilities enables seamless telemetry data analysis, allowing analysts to gain valuable insights and improve operations based on data. We will explore the application’s architecture, including the Kafka Producer for data streaming, the responsive Kafka Consumer for data retrieval, and the use of OLAP databases for advanced analytical processing. This presentation highlights the potential of Kafka for enhancing real-time data visualization and analytics in gaming scenarios.
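
For readers unfamiliar with the producer side of such a pipeline, here is a minimal sketch in Java of how telemetry events could be published to Kafka. The topic name, record key, broker address, and payload fields are illustrative assumptions, not the actual setup used in the talk.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class TelemetryProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");               // assumed broker address
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Hypothetical topic, key, and payload; the real application feeds live PS4 telemetry.
                String telemetry = "{\"speedKph\": 287, \"gear\": 7, \"lap\": 12}";
                producer.send(new ProducerRecord<>("f1-telemetry", "car-44", telemetry));
            }
        }
    }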

-----

💡 Speaker 3:
Benedikt Linse, Staff Solutions Architect, Confluent

Talk:
Latest Developments in Kafka Streams

Abstract:
Kafka and the Kafka protocol have become the de facto basis for data-intensive streaming, publish-subscribe scenarios, Data Mesh implementations, and asynchronous microservice communication. Yet Kafka itself only provides the storage and data distribution layer of the Kafka ecosystem. When it comes to authoring high-throughput, low-latency data-intensive applications, Kafka Streams is a very popular choice. Kafka Streams supports different kinds of joins between streams and tables, data enrichment facilities, windowed and non-windowed real-time integration, and a high-level DSL for declaratively authoring streaming applications in JVM-based languages. As can be seen in the list of Kafka Improvement Proposals, Kafka Streams is very actively developed and extended. This talk discusses newly released and in-progress features of Kafka Streams and their relevance to real-world use cases.
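
To make the DSL concrete, here is a minimal Kafka Streams sketch in Java showing a stream-table join and a windowed count, two of the facilities mentioned above. The topic names, application id, and broker address are hypothetical placeholders, not taken from the talk.

    import java.time.Duration;
    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KTable;
    import org.apache.kafka.streams.kstream.TimeWindows;

    public class StreamsSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "streams-sketch");     // assumed application id
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // assumed broker address
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();

            // Hypothetical topics: a stream of events and a changelog-backed table of reference data.
            KStream<String, String> events = builder.stream("events");
            KTable<String, String> users = builder.table("users");

            // Stream-table join: enrich each event with the latest table row for its key.
            events.join(users, (event, user) -> event + " | " + user)
                  .to("enriched-events");

            // Windowed aggregation: count events per key in 5-minute tumbling windows.
            events.groupByKey()
                  .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(5)))
                  .count();

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
        }
    }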

Bio:
Benedikt is a Staff Solutions Architect at Confluent and has worked with a plethora of customers across Europe over the last five years. His favourite topics are streaming applications and developing in the cloud. Benedikt has given many use-case-oriented talks about Kafka Streams in the past, but this talk focuses on current developments in the Kafka Streams area. He also holds a PhD in data integration from the University of Munich.

Please register using this link to attend this event: https://www.meetup.com/apache-kafka-germany-munich/events/295880077/

COVID-19 safety measures

Event will be indoors
Munich Data Mash
Luise-Ullrich-Straße 14 · München, BY