Join us for an Apache Kafka® x Apache Flink® x Grafana meetup on Thursday, Oct 2nd, starting at 5:30pm, hosted at Improving!
📍Venue:
Improving
1515 Central Ave NE, Suite 100, Minneapolis, MN 55413
🗓 Agenda:
- 5:30pm: Doors open
- 5:30pm – 6:00pm: Welcome & Food/Drinks, Networking
- 6:00pm – 6:30pm: Chase Horvath, Staff Solutions Engineer, Confluent
- 6:30pm – 7:00pm: Ryan Belgrave, Staff Software Engineer I, WarpStream
- 7:00pm – 8:00pm: Additional Q&A and Networking
💡 Speaker One:
Chase Horvath, Staff Solutions Engineer, Confluent
Title of Talk:
Video game telemetry on a dime
Abstract:
Games generate massive data streams, and many game studios rely on Kafka to ingest them. Cost limits the granularity of data they can collect, yet more data leads to better decision-making and a better player experience. After collection, studios face further challenges preparing the data for use by apps and downstream analytics. This session details a cost-effective game telemetry pipeline from player actions to analytics, real-time visualizations, and actionable real-time events. We will cover:
- Real-time uses and impacts vs. high-latency solutions.
- Reducing Kafka costs with Confluent Cloud Freight clusters.
- Reducing player churn and increasing the success rate of real-time offers by applying Flink stream processing to player telemetry and marketing data to predict churn and generate personalized offers (a sketch follows this list).
- Visualizing telemetry in real time using Confluent Cloud's fully managed connectors to send data to Elastic Cloud.
- Exposing the data stored in Confluent Cloud in common Delta Lake and Iceberg formats for use in data lakes.
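To make the Flink bullet above concrete, here is a minimal PyFlink sketch of that style of stream processing. The table name, fields, connector options, and churn heuristic are all hypothetical placeholders, not the pipeline from the talk:

```python
# Hypothetical sketch: flagging at-risk players with PyFlink SQL.
# Table name, fields, connector options, and the churn heuristic
# are illustrative assumptions only.
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Assumes a Kafka-compatible topic of player telemetry events;
# broker address and topic name are placeholders.
t_env.execute_sql("""
    CREATE TABLE player_events (
        player_id STRING,
        event_type STRING,
        ts TIMESTAMP(3),
        WATERMARK FOR ts AS ts - INTERVAL '5' SECOND
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'player-telemetry',
        'properties.bootstrap.servers' = '<broker>',
        'format' = 'json'
    )
""")

# Toy heuristic: players with fewer than 3 sessions over a sliding
# one-day window become candidates for a retention offer.
at_risk = t_env.sql_query("""
    SELECT player_id, COUNT(*) AS sessions,
           HOP_END(ts, INTERVAL '1' HOUR, INTERVAL '1' DAY) AS window_end
    FROM player_events
    WHERE event_type = 'session_start'
    GROUP BY player_id, HOP(ts, INTERVAL '1' HOUR, INTERVAL '1' DAY)
    HAVING COUNT(*) < 3
""")
at_risk.execute().print()
```

In a production pipeline these flags would be written back to a Kafka topic consumed by an offer service rather than printed.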
Come see the next-generation technology studios are using to increase player engagement, monetize experiences, and build hit games.
Bio:
Chase brings a decade of experience with high-performance, real-time integration technologies to his customers. He has worked on many of the gaming industry's most scalable, high-performance data pipelines, which drive top player experiences and successful monetization strategies. He does it all in the name of fun.
💡 Speaker Two:
Ryan Belgrave, Staff Software Engineer I, WarpStream
Title of Talk:
From Ticker Tape to Trendlines: A Stream Processing Journey into Market Dynamics
Abstract:
Building a platform to analyze real-time market data can be a complex undertaking. This session details an end-to-end project for ingesting, processing, and visualizing data from an active and entirely digital marketplace. We'll focus on moving beyond simple metrics to uncover deeper economic trends and behaviors.
Here’s what I’ll be covering:
- Tapping the Data Firehose: I’ll explain how to use public APIs—the source of which might surprise you—to ingest a stream of real-time trading events, as well as how to effectively backfill years of historical data to get a complete market picture.
- Building a Pipeline with Bento and WarpStream: I'll demonstrate how to use Bento to seamlessly capture this trading data and publish it into WarpStream, a diskless, Kafka-compatible streaming platform designed for the cloud (a minimal client sketch follows this list).
- From Streams to Lakehouse with Tableflow and Iceberg: I'll showcase how WarpStream's Tableflow feature automatically materializes our streaming data directly into an Apache Iceberg data lakehouse.
- Visualizing the Market with Grafana: The processed data is then brought to life in Grafana. I will show how to build various dashboards to track price histories, perform complex aggregations, and create custom "market indexes" to gauge the overall health of the economy.
- The Benefits of a Modern Streaming Lakehouse: I’ll explain the technical reasons for choosing this specific stack and how its storage-based architecture helped build a powerful analytics system while minimizing complexity and cost.
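Since WarpStream speaks the Kafka wire protocol, any standard Kafka client can publish into it; Bento simply makes that ingest step declarative. Below is a minimal Python sketch of the publish step. The endpoint, topic name, and event shape are illustrative assumptions, not details from the talk:

```python
# Hypothetical sketch: publishing trade events to WarpStream with a
# standard Kafka client (WarpStream is Kafka-protocol-compatible).
# The endpoint, topic, and event shape are illustrative placeholders.
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "<your-warpstream-endpoint>:9092"})

def delivery_report(err, msg):
    # Report per-message delivery failures.
    if err is not None:
        print(f"delivery failed: {err}")

trade = {"item": "example-commodity", "price": 42.5, "quantity": 100}
producer.produce(
    "market-trades",                     # hypothetical topic name
    key=trade["item"].encode(),          # key by item for partition locality
    value=json.dumps(trade).encode(),
    callback=delivery_report,
)
producer.flush()  # block until the event is acknowledged
```

In the talk's stack, Bento would own this ingest step via configuration, and Tableflow would then land the topic in Iceberg without extra pipeline code.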
If you're interested in stream processing, data engineering, analytics, or are just curious to see what it takes to analyze a complex and fascinating market, this session is not one to miss!
Bio:
Ryan Belgrave is a Sr. Principal Engineer and has been working in the Distributed Data Platforms space at Optum since 2018. Before joining Optum, he worked at Target on their Public Cloud team, building the cloud application platform that runs Target.com. Ryan specializes in all things containers, Kubernetes, and cloud, and runs various CNCF software in his home lab. While he has only been working in the industry officially since 2016, he has been learning and working with Linux and cloud technologies since 2006.
***
DISCLAIMER
This event is for attendees aged 21 and over.
If you are interested in giving a talk or hosting a future meetup, please email community@confluent.io.