- Dataflow applications using event-first thinking, stream processing & serverless
Join us for an Apache Kafka meetup on December 5th from 6:00pm, hosted by Deloitte. The address, agenda and speaker information can be found below. See you there!

-----

Agenda:
6:00pm: Doors open
6:00pm - 6:30pm: Networking, Pizza and Drinks
6:30pm - 7:15pm: Neil Avery, Confluent
7:15pm - 7:45pm: Additional Q&A and Networking

-----

Speaker: Neil Avery

Bio: Neil is a technologist in the Office of the CTO at Confluent, the company founded by the creators of Apache Kafka. His role is to be an industry expert in the world of streaming, distributed systems and the next generation of technology as the world becomes real-time. The role includes working with prominent customers, working with product to drive innovation into the Kafka ecosystem, and thought leadership on the next frontier of innovation. He has over 25 years of experience in distributed computing, messaging and stream processing, and has built or redesigned several commercial messaging platforms.

Title: The art of Dataflow applications using event-first thinking, stream processing and serverless

Abstract: Have you ever imagined what it would be like to build a massively scalable streaming application on Kafka: the challenges, the patterns and the thought process involved? How much of the application can be reused? What patterns will you discover? How does it all fit together?

Depending upon your use case and business, this can mean many things. Starting out with an ETL data pipe is one thing, but evolving to a company-wide real-time application that is business-critical and entirely dependent upon a streaming platform is a giant leap. Large-scale streaming applications are also called dataflow applications. They are classically different from other data systems; they are viewed as a series of interconnected streams that are topologically defined using stream processors - almost like a deconstructed database.
In this talk I step through the creation of dataflow systems, showing how they are developed from raw events and evolve into something that can be adopted at scale. I will focus on event-first thinking, data models and the fundamentals of stream processors such as Kafka Streams, KSQL and serverless (FaaS). Building upon this, I explain how dataflow design can be used to build common business functionality, or to express a use case applied to an auction system like eBay. I will focus on:
- User registration
- User bidding event streams
- Payment processing
- FaaS stream processing

You will leave this talk with an understanding of how to model events with event-first thinking, how to work towards reusable streaming patterns and, most importantly, how it all fits together at massive scale.

-----

Don't forget to join our Community Slack Team! https://launchpass.com/confluentcommunity

If you would like to speak or host our next event please let us know! [masked]
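The auction scenario in the abstract can be sketched as a minimal event-first model: the domain is captured as immutable events, and a stream processor folds them into state. This is an illustrative stdlib-only Python simulation, not the actual Kafka Streams or KSQL API; all event and function names here are invented:

```python
from dataclasses import dataclass

# Event-first thinking: model the facts of the domain as immutable events.
@dataclass(frozen=True)
class UserRegistered:
    user_id: str

@dataclass(frozen=True)
class BidPlaced:
    user_id: str
    item_id: str
    amount: float

def highest_bids(events):
    """Fold a stream of events into a table of the highest bid per item,
    ignoring bids from users who never registered."""
    registered = set()
    best = {}  # item_id -> (user_id, amount)
    for event in events:
        if isinstance(event, UserRegistered):
            registered.add(event.user_id)
        elif isinstance(event, BidPlaced) and event.user_id in registered:
            current = best.get(event.item_id)
            if current is None or event.amount > current[1]:
                best[event.item_id] = (event.user_id, event.amount)
    return best

stream = [
    UserRegistered("alice"),
    UserRegistered("bob"),
    BidPlaced("alice", "lamp", 10.0),
    BidPlaced("bob", "lamp", 12.5),
    BidPlaced("mallory", "lamp", 99.0),  # unregistered user: bid is dropped
]
print(highest_bids(stream))  # {'lamp': ('bob', 12.5)}
```

The same fold is what a Kafka Streams aggregation or a KSQL `GROUP BY` would perform continuously over an unbounded stream, rather than once over a list.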
- Join us for a Kafka talk with Tim Berglund and Servian
Join us for our next Apache Kafka® meetup on October 2nd from 5:45pm, hosted by Servian. The agenda, venue and speaker information can be found below. See you there!

-----

Agenda:
5:45pm: Doors open: Pizza & Drinks
6:15pm: Tim Berglund, Confluent
7:00pm: Jay Kim, Servian
7:45pm - 8:30pm: Drinks, Networking and additional Q&A

-----

Speaker: Tim Berglund, Confluent

Bio: Tim Berglund is a teacher, author, and technology leader with Confluent, where he serves as the Senior Director of Developer Experience. He can frequently be found speaking at conferences in the United States and all over the world. He is the co-presenter of various O’Reilly training videos on topics ranging from Git to Distributed Systems, and is the author of Gradle Beyond the Basics. He tweets as @tlberglund, blogs very occasionally at http://timberglund.com, is the co-host of the http://devrelrad.io podcast, and lives in Littleton, CO, USA with the wife of his youth and their youngest child, the other two having mostly grown up.

Title: Processing Streaming Data with KSQL

Abstract: Apache Kafka is a de facto standard streaming data processing platform, being widely deployed as a messaging system, and having a robust data integration framework (Kafka Connect) and stream processing API (Kafka Streams) to meet the needs that commonly attend real-time message processing. But there’s more! Kafka now offers KSQL, a declarative, SQL-like stream processing language that lets you define powerful stream-processing applications easily. What once took some moderately sophisticated Java code can now be done at the command line with a familiar and eminently approachable syntax. Come to this talk for an overview of KSQL with live coding on live streaming data.

-----

Speaker 2: Jay Kim

Bio: TBC

Title: Debezium, an open source change data capture (CDC) platform

Abstract: Jay Kim will be speaking about Debezium, an open source change data capture (CDC) platform.
Change data capture refers to the process in which changes to a data set are identified for later consumption by consumer applications. Most commonly seen in data warehousing environments, CDC is seeing increased relevance in the modern data landscape in the context of stream processing, serverless computing and scalability. The presentation will cover a conceptual overview of the technology, a technical demonstration and potential business applications of Debezium.

-----

Don't forget to join our Community Slack Team! https://launchpass.com/confluentcommunity

If you would like to speak or host our next event please let us know! [masked]

NOTE: We are unable to cater for any attendees under the age of 18. Please do not sign up for this event if you are under 18.
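The CDC idea described in the abstract can be sketched in a few lines: a source database emits change events carrying an operation code and a row image, and a downstream consumer replays them to keep a replica in sync. This is a stdlib-only Python illustration loosely inspired by Debezium's change-event envelope, not its actual wire format:

```python
# Simplified change events: each record carries an operation
# ("c"reate / "u"pdate / "d"elete) plus the row's after-image.
change_stream = [
    {"op": "c", "key": 1, "after": {"id": 1, "email": "a@example.com"}},
    {"op": "u", "key": 1, "after": {"id": 1, "email": "a@new.com"}},
    {"op": "c", "key": 2, "after": {"id": 2, "email": "b@example.com"}},
    {"op": "d", "key": 1, "after": None},
]

def apply_changes(replica, events):
    """Replay captured changes onto a downstream replica
    (e.g. a cache, a search index or a warehouse table)."""
    for event in events:
        if event["op"] in ("c", "u"):   # create / update: upsert the row
            replica[event["key"]] = event["after"]
        elif event["op"] == "d":        # delete: remove the row
            replica.pop(event["key"], None)
    return replica

print(apply_changes({}, change_stream))
# {2: {'id': 2, 'email': 'b@example.com'}}
```

Because events are replayed in order, the replica converges on the source's current state no matter when the consumer starts reading, which is what makes CDC a natural fit for stream processing.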
- Kafka Streams API and Kafka at Zendesk
Join us for an Apache Kafka meetup on July 16th from 6:00pm, hosted by Zendesk in Melbourne. The address, agenda and speaker information can be found below. See you there!

*Enter at 67 Queen Street (Side Entrance)*

-----

Agenda:
6:00pm: Doors open
6:00pm - 6:30pm: Pizza, Drinks and Networking
6:30pm - 7:15pm: Antony Stubbs, Confluent
7:15pm - 8:00pm: Tim Cuthbertson, Zendesk
8:00pm - 8:15pm: Additional Q&A & Networking

-----

Speaker: Antony Stubbs

Bio: Antony is a Solution Architect for Confluent and spends most of his working hours talking to customers all around the world about their Kafka usage, his favourite aspect being funky Kafka Streams use cases. Focusing on Java environment technologies, he has previously worked in the telecommunications, logistics, TV media and education industries. Antony enjoys learning deeply about different cultures, which has encouraged him to move from his hometown in New Zealand to The Netherlands, New York and, currently, London. twitter.com/#!/@psynikal github.com/astubbs stackoverflow.com/users/105741/antony-stubbs

Title: Beyond the DSL - Unlocking the Power of Kafka Streams with the Processor API

Abstract: Kafka Streams is a flexible and powerful framework. The Domain Specific Language (DSL) is an obvious place from which to start, but not all requirements fit the DSL model. Many people are unaware of the Processor API (PAPI) - or are intimidated by it because of sinks, sources, edges and stores - oh my! But most of the power of the PAPI can be leveraged simply through the DSL's `#process` method, which lets you attach the general-purpose `Processor` interface to your easy-to-use DSL topology, combining the best of both worlds. In this talk you'll get a look at the flexibility of the DSL's process method and the possibilities it opens up.
We'll use real-world use cases, borne from extensive experience in the field with multiple customers, to explore the power of direct write access to the state stores and how to perform range sub-selects. We'll also see the options that punctuators bring to the table, as well as opportunities for major latency optimisations.

Key takeaways:
* Understanding of how to combine the DSL and Processors
* Capabilities and benefits of Processors
* Real-world uses of Processors

-----

Speaker: Tim Cuthbertson, Zendesk

Bio: Tim is the Tech Lead and Staff Engineer in the Event Streaming team at Zendesk.

Title: Overview of Kafka at Zendesk

Abstract: Tim will talk about the use of Kafka at Zendesk, including Zendesk's open source change data capture system, Maxwell. http://maxwells-daemon.io/

-----

Don't forget to join our Community Slack Team (https://launchpass.com/confluentcommunity)!

If you would like to speak or host our next event please let us know! [masked]

NOTE: We are unable to cater for any attendees under the age of 18. Please do not sign up for this event if you are under 18.
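The patterns named in the abstract - a processor writing directly to its state store, range sub-selects over that store - can be sketched outside Kafka. This is a stdlib-only Python simulation of the pattern; the real Kafka Streams Processor API is Java, and its stores expose a similar `range(from, to)` scan via `ReadOnlyKeyValueStore`:

```python
import bisect

class CountingProcessor:
    """Simulates a Kafka Streams Processor: process() writes directly to
    a state store, and range_scan() performs a range sub-select over it."""

    def __init__(self):
        self.keys = []   # sorted key index, enabling range sub-selects
        self.store = {}  # the "state store": key -> count

    def process(self, key):
        if key not in self.store:
            bisect.insort(self.keys, key)
        self.store[key] = self.store.get(key, 0) + 1

    def range_scan(self, lo, hi):
        """Return all entries with lo <= key <= hi, like a store range query."""
        i = bisect.bisect_left(self.keys, lo)
        j = bisect.bisect_right(self.keys, hi)
        return {k: self.store[k] for k in self.keys[i:j]}

p = CountingProcessor()
for key in ["user:a", "user:b", "user:a", "order:1"]:
    p.process(key)
# Sub-select only the "user:" keyspace, ignoring other key prefixes.
print(p.range_scan("user:", "user:~"))  # {'user:a': 2, 'user:b': 1}
```

In a real topology this logic would sit behind the DSL's `process` method, with a punctuator periodically scanning the store and forwarding results downstream.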
- Streaming ETL with Apache Kafka and KSQL / Apache Metron
Join us for our next Melbourne Apache Kafka meetup on May 16th from 5:45pm, hosted by Servian. The agenda, venue and speaker information can be found below. See you there!

-----

Agenda:
5:45pm: Doors open: Pizza & Drinks
6:15pm: Ideen Faraji, Servian
7:00pm: Nick Dearden, Confluent
7:45pm - 8:30pm: Drinks, Networking and additional Q&A

-----

Speaker: Ideen Faraji, Servian

Bio: I am a software engineer with a strong focus on open source, currently working as a senior consultant at Servian. Within the past couple of years, I have been involved in two Big Data projects - "data at rest" and "data in motion" - in architecture, build and support capacities. Prior to joining Servian, I worked as a software developer in the Java and Cloud spaces.

Title: Apache Metron - Real-time Cybersecurity Analytics

Abstract: Apache Metron is a streaming analytics application for the cybersecurity context. It provides the capability to ingest, process and store security telemetry in order to detect cyber anomalies in real time. Metron is built on top of open source technologies such as Kafka and Storm. In this talk, we will see:
* An overview of the Apache Metron product
* The basics of Metron architecture
* Using a secondary Kafka cluster to prevent data loss during data ingestion

-----

Speaker: Nick Dearden, Confluent

Bio: Nick is a technology and product leader at Confluent, where he enjoys leveraging many years of experience in the world of data and analytic systems to help design and explain the power of a streaming platform for every business. Prior to Confluent, he led the data platform group for a leading online real-estate seller and was chief architect for a cloud-based financial analytics platform. His early career stretches all the way back through multiple data warehouse and business intelligence adventures to the green-screen days of mainframe banking systems.
Title: Streaming ETL with Apache Kafka and KSQL

Abstract: Companies new and old are all recognising the importance of a low-latency, scalable, fault-tolerant data backbone - in the form of the Apache Kafka streaming platform. With Kafka, developers can integrate multiple systems and data sources, enabling low-latency analytics, event-driven architectures and the population of multiple downstream systems. What's more, these data pipelines can be built using configuration alone. In this talk, we'll see how easy it is to capture a stream of data changes in real time from a database such as MySQL into Kafka using the Kafka Connect framework, then use KSQL to filter, aggregate and join it to other data, and finally stream the results from Kafka out into multiple targets such as Elasticsearch and MySQL. All of this can be accomplished without a single line of Java code!

-----

Special thanks to Servian, our local organisers for this event.

Don't forget to join our Community Slack Team (https://launchpass.com/confluentcommunity)!

If you would like to speak or host our next event please let us know! [masked]

NOTE: We are unable to cater for any attendees under the age of 18. Please do not sign up for this event if you are under 18.
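The filter/aggregate/join pipeline the abstract describes can be sketched without Kafka: treat the captured change stream as an iterable, keep a lookup table for the join side, and fold into an aggregate. A stdlib-only Python illustration; in KSQL the same three steps would be `WHERE`, `JOIN` and `GROUP BY` clauses. The table and field names here are invented for the example:

```python
from collections import defaultdict

# A lookup "table" to join against (e.g. materialised from a users topic).
users = {"u1": "AU", "u2": "NZ"}

# A change stream captured from a source database (e.g. via Kafka Connect).
orders = [
    {"user_id": "u1", "amount": 30.0},
    {"user_id": "u2", "amount": 5.0},
    {"user_id": "u1", "amount": 70.0},
]

def revenue_by_country(order_stream, user_table, min_amount=10.0):
    """Filter small orders, join each order to its user's country,
    and aggregate revenue per country."""
    totals = defaultdict(float)
    for order in order_stream:
        if order["amount"] < min_amount:            # filter  (KSQL: WHERE)
            continue
        country = user_table.get(order["user_id"])  # join    (KSQL: JOIN)
        if country is not None:
            totals[country] += order["amount"]      # aggregate (KSQL: GROUP BY + SUM)
    return dict(totals)

print(revenue_by_country(orders, users))  # {'AU': 100.0}
```

The streaming version differs in that the input never ends, so the result is a continuously updated table rather than a one-shot dictionary - but the shape of the computation is the same.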
- Our First Kafka Meetup with guest Speaker from Confluent
Join us for a pre-holiday Apache Kafka meetup on December 19th from 6:00pm - 8:30pm, hosted by Zendesk in Melbourne. The agenda and speaker information can be found below. See you there!

-----

Agenda:
6:00pm: Doors open
6:00pm - 6:30pm: Networking, Pizza and Drinks
6:30pm - 7:15pm: A Tour of Apache Kafka, Matt Howlett, Confluent
7:15pm - 7:30pm: Additional Q&A
7:30pm - 8:15pm: Additional Networking

-----

Speaker: Matt Howlett

Bio: As an early employee at Confluent, Matt has worked on many of Confluent's open source and enterprise products, including Confluent Control Center, Schema Registry, REST Proxy and Apache Kafka client libraries in various languages. Prior to joining Confluent, Matt spent many years developing materials tracking and optimization systems for large mining companies in Australia. His first exposure to distributed systems was in the computer games industry, where he worked on the server of a massively multiplayer online game engine.

Title: A Tour of Apache Kafka

Abstract: Apache Kafka is a scalable streaming platform that forms a key part of the infrastructure at many companies, including Netflix, Walmart, Uber, Goldman Sachs and LinkedIn. In this talk, Matt will give a technical overview of Kafka and how it differs from traditional messaging systems. He will then walk through some typical use cases, including how Kafka can be used as a backbone for building fault-tolerant microservices, how it can be used to connect applications and data stores together at scale, and how to process large quantities of data in real time using the Kafka Streams API and KSQL.

-----

Special thanks to Zendesk, who are hosting us for this event.

Don't forget to join our Community Slack Team (https://slackpass.io/confluentcommunity)!

If you would like to speak or host our next event please let us know! [masked]

NOTE: We are unable to cater for any attendees under the age of 18. Please do not sign up for this event if you are under 18.
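A key way Kafka differs from traditional messaging systems, as touched on in the abstract above, is that a topic is a set of partitioned, append-only logs: messages get stable offsets, keyed messages keep per-key ordering within a partition, and consumers can rewind and replay. A stdlib-only Python sketch of that model (not a Kafka client; names are illustrative):

```python
class Topic:
    """A topic modelled as a set of append-only partitioned logs, as in Kafka."""

    def __init__(self, partitions=3):
        self.logs = [[] for _ in range(partitions)]

    def produce(self, key, value):
        # Keyed messages hash to a fixed partition, preserving per-key order.
        partition = sum(key.encode()) % len(self.logs)
        self.logs[partition].append(value)
        return partition, len(self.logs[partition]) - 1  # (partition, offset)

    def consume(self, partition, offset):
        # Reading does not remove messages: unlike a traditional queue,
        # consumers track their own offsets and can rewind and replay.
        return self.logs[partition][offset:]

topic = Topic()
p, _ = topic.produce("order-42", "created")
topic.produce("order-42", "paid")
print(topic.consume(p, 0))  # ['created', 'paid']
```

Because the log is retained rather than drained, multiple independent consumers (a microservice, a connector, a stream processor) can each read the same topic at their own pace.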