18:00 - 18:30: Networking, mingling & refreshments.
18:30 - 19:30: The Magical Consumer Group Protocol of Apache Kafka. Gwen Shapira is a principal data architect @ Confluent. Join Livestream on YouTube: http://bit.ly/2ReFYVf
19:30 - 20:00: Unlimited Kafka Messages. Maor Mordehay @ Alooma. Join Livestream on YouTube: http://bit.ly/2rQSDiN
*** All talks are delivered in English and live-streamed to YouTube (links above). ***
First session description:
Very few people know that inside Apache Kafka's binary protocol for
publishing and retrieving messages hides another protocol - a generic,
extensible protocol for managing work assignments between multiple
instances of a client application.
When multiple Kafka consumers in the same consumer group subscribe to
a set of topics, Kafka knows how to assign a subset of the topic
partitions to each consumer and how to handle failover automatically.
What is less known is that this assignment is determined by the
consumer client itself and that the same protocol can be used by any
application for both leader election and task assignment.
Let's dive into the internals of this little-known assignment protocol!
We’ll look in detail at how Kafka's Consumer, Connect, and Streams APIs
use this protocol for task management.
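To make the idea of client-side assignment concrete, here is a small illustrative sketch (not Kafka's actual source, and simplified from the real assignors) of how a round-robin style strategy might distribute topic partitions among group members. In Kafka, logic like this runs client-side in the elected group leader, which is exactly what makes the protocol reusable for generic task assignment:

```python
# Illustrative sketch of a round-robin partition assignor.
# Names and structure are hypothetical; Kafka's real assignors
# (range, round-robin, sticky) live in the client libraries.

def round_robin_assign(consumers, partitions):
    """Assign each partition to a consumer in round-robin order."""
    members = sorted(consumers)          # deterministic member order
    assignment = {c: [] for c in members}
    for i, partition in enumerate(sorted(partitions)):
        assignment[members[i % len(members)]].append(partition)
    return assignment

# Example: 3 consumers in one group sharing 6 partitions of topic "t"
result = round_robin_assign(
    ["c1", "c2", "c3"],
    [("t", p) for p in range(6)],
)
print(result)
# {'c1': [('t', 0), ('t', 3)], 'c2': [('t', 1), ('t', 4)], 'c3': [('t', 2), ('t', 5)]}
```

Because every member runs the same deterministic function over the same membership list, the leader can compute the assignment alone and the broker only has to relay it — the broker never needs to understand what is being assigned.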
Gwen Shapira is a principal data architect at Confluent, where she helps customers achieve success with their Apache Kafka implementation. She has 15 years of experience working with code and customers to build scalable data architectures, integrating relational and big data technologies. Gwen currently specializes in building real-time reliable data-processing pipelines using Apache Kafka. Gwen is an Oracle Ace Director, the coauthor of Hadoop Application Architectures, and a frequent presenter at industry conferences. She is also a committer on Apache Kafka and Apache Sqoop. When Gwen isn’t coding or building data pipelines, you can find her pedaling her bike, exploring the roads and trails of California and beyond.
Second session description:
As Kafka looks to turn 8 this year, we look to see how we can push its boundaries. Just how extensible can Kafka be?
It is a no-brainer that Kafka is the primary choice when looking for an open-source stream processing platform. Where it thrives is in handling relatively small messages. But what if you want to process large or very large messages?
At Alooma, we are heavy Kafka users but found ourselves needing and wanting more. So, we put our heads together to bring the community a Kafka Streams version that can handle Kafka messages of unlimited size. Join us for a sneak peek of our open source project.
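The talk will cover Alooma's actual approach; as a rough, hypothetical illustration of the general technique such projects rely on, an oversized payload can be split into chunks that fit under a broker's message size limit, tagged with ordering metadata, and reassembled on the consumer side:

```python
# Hypothetical sketch only — NOT Alooma's implementation.
# Split a payload into broker-sized chunks and reassemble them.

CHUNK_SIZE = 4  # bytes, tiny for demonstration; real brokers default to ~1 MB

def split_message(msg_id, payload, chunk_size=CHUNK_SIZE):
    """Return (msg_id, index, total, chunk) records for one payload."""
    chunks = [payload[i:i + chunk_size]
              for i in range(0, len(payload), chunk_size)]
    return [(msg_id, i, len(chunks), c) for i, c in enumerate(chunks)]

def reassemble(records):
    """Rebuild payloads from chunk records, tolerating out-of-order arrival."""
    by_id = {}
    for msg_id, index, total, chunk in records:
        by_id.setdefault(msg_id, [None] * total)[index] = chunk
    return {msg_id: b"".join(parts) for msg_id, parts in by_id.items()}

parts = split_message("m1", b"hello kafka")
assert reassemble(parts) == {"m1": b"hello kafka"}
```

A real version would also have to handle partitioning (keeping all chunks of one message together), incomplete messages, and consumer restarts — presumably some of what the session will dig into.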