Dataflow applications using event-first thinking, stream processing & serverless

This Meetup is past

77 people went

What we'll do

Join us for an Apache Kafka meetup on December 5th from 6:00pm, hosted by Deloitte. The address, agenda and speaker information can be found below. See you there!

-----

Agenda:
6:00pm: Doors open
6:00pm - 6:30pm: Networking, Pizza and Drinks
6:30pm - 7:15pm: Neil Avery, Confluent
7:15pm - 7:45pm: Additional Q&A and Networking

-----

Speaker:
Neil Avery

Bio:
Neil is a technologist in the Office of the CTO at Confluent, the company founded by the creators of Apache Kafka. In this role he acts as an industry expert on streaming, distributed systems and the next generation of technology as the world becomes real time. His work includes engaging with prominent customers, partnering with the product team to drive innovation into the Kafka ecosystem, and thought leadership on the next frontier of innovation. He has over 25 years of experience in distributed computing, messaging and stream processing, and has built or redesigned commercial messaging platforms.

Title:
The art of Dataflow applications using event-first thinking, stream processing and serverless

Abstract:
Have you ever imagined what it would be like to build a massively scalable streaming application on Kafka: the challenges, the patterns and the thought process involved? How much of the application can be reused? What patterns will you discover? How does it all fit together?

Depending on your use case and business, this can mean many things. Starting out with an ETL data pipeline is one thing, but evolving to a company-wide, business-critical real-time application that depends entirely on a streaming platform is a giant leap. Large-scale streaming applications are also called dataflow applications. They differ from classic data systems: they are viewed as a series of interconnected streams, topologically defined using stream processors, almost like a deconstructed database.
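To make the "deconstructed database" idea concrete, here is a minimal Kafka Streams sketch (an illustration only, not material from the talk): an event stream is continuously folded into a materialized table, and the table's changelog is itself a stream that other processors can consume. The topic names, key choices and String payloads are assumptions.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.kstream.Produced;

public class DeconstructedDatabase {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Raw events arrive as an immutable, append-only log (hypothetical topic).
        KStream<String, String> pageViews = builder.stream("page-view-events");

        // A stream processor continuously folds the log into a queryable,
        // materialized table keyed by page id: the "deconstructed database".
        KTable<String, Long> viewCounts = pageViews
                .groupByKey()
                .count(Materialized.as("page-view-counts"));

        // The table's changelog is itself a stream for downstream processors.
        viewCounts.toStream()
                .to("page-view-count-changelog", Produced.with(Serdes.String(), Serdes.Long()));

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "deconstructed-db-sketch");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        new KafkaStreams(builder.build(), props).start();
    }
}
```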

In this talk I step through the creation of dataflow systems, showing how they grow from raw events into something that can be adopted at scale. I will focus on event-first thinking, data models and the fundamentals of stream processors such as Kafka Streams, KSQL and serverless functions (FaaS).
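As a small taste of the event-first approach, the sketch below (again an illustration with assumed names, not the speaker's code) captures a bid as an immutable fact and publishes it to Kafka; what happens next is left to whichever stream processor, KSQL query or FaaS function subscribes to the topic. The topic name, key choice and delimited payload are hypothetical.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class BidEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // The event records what happened (a fact), not what should happen next.
            // A real system would use Avro or JSON with a schema; a delimited string
            // keeps this sketch short: auctionId|userId|amount.
            String auctionId = "auction-42";
            String bidPlaced = "auction-42|user-7|99.50";

            // Keying by auction id keeps all bids for one auction in the same
            // partition, so downstream processors see them in order.
            producer.send(new ProducerRecord<>("auction-bids", auctionId, bidPlaced));
            producer.flush();
        }
    }
}
```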

Building upon this, I explain how dataflow design can be used to build common business functionality, expressed through a use case applied to an auction system like eBay (see the sketch after this list). I will focus on:
- User registration
- User bidding event streams
- Payment processing
- FaaS stream processing
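To give a feel for how these pieces might compose, here is a highly simplified Kafka Streams sketch of such an auction dataflow (an illustration only, not the talk's reference implementation): registrations become a table, bids are joined against it, the highest bid per auction is tracked, and winning-bid changes are emitted as payment requests that a FaaS payment handler could consume. Topic names and the delimited payload format are assumptions.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;

public class AuctionDataflow {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // User registration events compacted into a table keyed by user id.
        KTable<String, String> registrations = builder.table("user-registrations");

        // Bid events ("auctionId|userId|amount") keyed by auction id.
        KStream<String, String> bids = builder.stream("auction-bids");

        // Re-key by user id and join each bid with the bidder's registration
        // record; bids from unregistered users are dropped by the inner join.
        KStream<String, String> enrichedBids = bids
                .selectKey((auctionKey, bid) -> userId(bid))
                .join(registrations, (bid, registration) -> bid + "|" + registration);

        // Re-key back by auction id and keep the highest bid seen so far.
        KTable<String, String> winningBids = enrichedBids
                .selectKey((userKey, bid) -> auctionId(bid))
                .groupByKey()
                .reduce((best, candidate) -> amount(candidate) > amount(best) ? candidate : best);

        // Every change to the winning bid becomes a payment request; a FaaS
        // function (or another stream processor) can subscribe and take payment.
        winningBids.toStream().to("payment-requests");

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "auction-dataflow-sketch");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        new KafkaStreams(builder.build(), props).start();
    }

    // Tiny parsers for the hypothetical delimited payload.
    private static String auctionId(String bid) { return bid.split("\\|")[0]; }
    private static String userId(String bid)    { return bid.split("\\|")[1]; }
    private static double amount(String bid)    { return Double.parseDouble(bid.split("\\|")[2]); }
}
```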

You will leave this talk with an understanding of how to model events using event-first thinking, how to work towards reusable streaming patterns and, most importantly, how it all fits together at massive scale.

-----

Don't forget to join our Community Slack Team! https://launchpass.com/confluentcommunity

If you would like to speak or host our next event please let us know! [masked]