  • From Zero to Hero with Kafka® Connect

    Coltbaan 4c

    Details

    Together with our friends from the Data Engineering NL Meetup group we are delighted to invite you to this event. The meetup is kindly hosted by BigData Republic.

    PLEASE REGISTER ON THIS LINK: https://www.meetup.com/Data-Engineering-NL/events/262440468/
    Please make sure you RSVP in the link above so the organizer has accurate numbers for catering. We have closed RSVPs here to avoid confusion.

    Kafka is becoming an increasingly dominant component in the architecture of many companies. We're excited to announce that Robin Moffatt will be in the Netherlands for our meetup on Kafka Connect!

    Agenda:
    17:30: Doors open
    18:00 - 18:30: Food, drinks and networking
    18:30 - 19:30: Robin Moffatt, Confluent, From Zero to Hero with Kafka Connect
    19:30 - 20:00: Q&A, drinks and networking

    Speaker: Robin Moffatt, Developer Advocate, Confluent
    Title of Talk: From Zero to Hero with Kafka Connect
    Abstract: Integrating Apache Kafka with other systems in a reliable and scalable way is often a key part of a streaming platform. Fortunately, Apache Kafka includes the Connect API, which enables streaming integration both into and out of Kafka. Like any technology, understanding its architecture and deployment patterns is key to successful use, as is knowing where to look when things aren't working.
    Bio: Robin is a Developer Advocate at Confluent, as well as an Oracle Groundbreaker Ambassador. His career has always involved data, from the old worlds of COBOL and DB2, through the worlds of Oracle and Hadoop, and into the current world with Kafka. His particular interests are analytics, systems architecture, performance testing and optimization. He blogs at http://cnfl.io/rmoff and http://rmoff.net/ and can be found tweeting grumpy geek thoughts as https://twitter.com/rmoff. Outside of work he enjoys drinking good beer and eating fried breakfasts, although generally not at the same time.

    KAFKA SUMMIT SF 2019: We are able to offer you a 25% discount on the standard-priced ticket for Kafka Summit San Francisco (September 30th & October 1st). To redeem it, please go to bit.ly/KSummitMeetupInvite, click 'register', select 'Conference Pass' and enter the community promo code "KS19Meetup".

    Don't forget to join our Community Slack Team! https://launchpass.com/confluentcommunity
    Want to speak or host? [masked]
    NOTE: Please do not sign up for this event if you are under 18.
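
    For readers new to the Connect API mentioned in the abstract: connectors are configured declaratively and managed through the Connect workers' REST interface rather than through custom integration code. The following is a minimal sketch, assuming a Connect worker on localhost:8083 and a purely hypothetical JDBC source that streams an "orders" table into Kafka; the connector name, table, topic prefix and connection URL are all illustrative.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class CreateConnectorExample {
        public static void main(String[] args) throws Exception {
            // Hypothetical JDBC source connector: streams new rows from an "orders"
            // table into the Kafka topic "jdbc-orders" (all names are illustrative).
            String connectorJson = """
                {
                  "name": "orders-source",
                  "config": {
                    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
                    "connection.url": "jdbc:postgresql://db:5432/shop",
                    "table.whitelist": "orders",
                    "mode": "incrementing",
                    "incrementing.column.name": "id",
                    "topic.prefix": "jdbc-"
                  }
                }""";

            // Connect workers expose a REST API; POST /connectors registers a new connector.
            HttpRequest request = HttpRequest.newBuilder(URI.create("http://localhost:8083/connectors"))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(connectorJson))
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + " " + response.body());
        }
    }

    In practice the same JSON is usually submitted with curl or managed by configuration tooling; the point is that no bespoke producer or consumer code is needed to move the data.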

  • Topic Management at Scale

    ING Amsterdamse Poort

    Details

    ** IMPORTANT: Please make sure you sign up with your full first name and last name and bring photo ID to ensure entrance to the building for security purposes **

    Join us for an Apache Kafka meetup on Tuesday, June 18th from 5:30 pm at ING in Amsterdam. The address, agenda and speaker information can be found below. See you there!

    -----

    Agenda:
    6:00 pm - 6:45 pm: Registration, networking, pizza and drinks
    6:45 pm - 7:20 pm: Using Kafka to Integrate DWH and Cloud-Based Big Data Systems: Mic Hussey, Systems Engineer, Confluent
    7:20 pm - 7:55 pm: Topic Management at Scale: Filip Yonov, Constantin Mota and Josephine Dik, ING
    7:55 pm - 8:30 pm: Real-Time Investment Alerts using Apache Kafka & Spring Kafka at ING Bank: Tim van Baarsen and Marcos Maia
    8:30 pm - 9:00 pm: Drinks and networking

    -----

    First Talk: Mic Hussey, Systems Engineer, Nordics and Netherlands, Confluent
    Title of the Talk: Using Kafka to Integrate DWH and Cloud-Based Big Data Systems
    Abstract: Data storage has become cheaper and cheaper over time. We're no longer scared of duplicating data many times if it helps optimize our analytics jobs. But what's involved when migrating from a traditional Data Warehouse to a Big Data system? For example, how do we get data stored in third normal form into a form usable by BigQuery?

    -----

    Second Talk: Filip Yonov, Constantin Mota and Josephine Dik, ING
    Title of the Talk: Topic Management at Scale
    Abstract: Kafka at ING has a long history. It all started in 2014, and in the following years we saw Kafka growing until 2018, when it took the spotlight at ING as the #1 searched-for technology with an unprecedented adoption curve. Suddenly what was a small trickle of niche use-cases became a flood of customers on-boarding for every imaginable usage pattern. Even more astonishing was that our single-cluster configuration saw an almost 700% load increase in 2018 alone. During this explosion our team was so busy with on-boarding, maintenance and ops that it wasn't clear whether we were supporting or sabotaging our long-term success. It was clear that we needed to challenge our view of how we used Kafka. We asked ourselves: given the demand, how can our clients easily manage their streams without having to care about scaling, clusters or technologies? In this talk we discuss our experience of running a single cluster with more than 750 topics and show how we are making Kafka self-service via our Stream Marketplace.

    -----

    Third Talk: Tim van Baarsen and Marcos Maia
    Title of the Talk: Real-Time Investment Alerts using Apache Kafka & Spring Kafka at ING Bank
    Abstract: Nowadays our customers expect real-time interactions with the investment products and services we deliver at ING Bank, especially because financial market fluctuations can have a direct impact on our customers' investment performance. In this talk, we present how we deal with the massive stream of price updates on stocks and how we leverage many features of Apache Kafka and Spring Kafka to improve and simplify our solutions. From simple producers and consumers up to advanced usage of streams, we rely heavily on Kafka to meet our functional and non-functional requirements, delivering high-quality, real-time stream processing software that is reliable, precise and extremely valuable to our customers.

    -----

    Don't forget to join our Community Slack Team! https://launchpass.com/confluentcommunity
    If you would like to speak or host our next event please let us know! [masked]
    NOTE: We are unable to cater for any attendees under the age of 18. Please do not sign up for this event if you are under 18.
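
    For the Spring Kafka talk above, the sketch below shows the general shape of a price-update consumer using Spring Kafka's @KafkaListener annotation. The topic name, consumer group, payload format and alerting rule are all hypothetical; this is an illustration of the listener programming model, not ING's actual implementation.

    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.stereotype.Component;

    @Component
    public class PriceUpdateListener {

        // Hypothetical topic and group id; in a real application these would come from configuration.
        @KafkaListener(topics = "stock-prices", groupId = "investment-alerts")
        public void onPriceUpdate(String message) {
            // The payload format is assumed here to be "SYMBOL:PRICE", e.g. "ASML:612.40".
            String[] parts = message.split(":");
            String symbol = parts[0];
            double price = Double.parseDouble(parts[1]);

            // Placeholder alerting rule: in practice this would compare against per-customer
            // thresholds and publish an alert event back to Kafka rather than print to stdout.
            if (price > 600.0) {
                System.out.printf("ALERT: %s crossed threshold at %.2f%n", symbol, price);
            }
        }
    }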

  • Kafka, Microservices and more from Confluent and Camunda

    Join us for an Apache Kafka meetup on April 24th from 6pm in Amsterdam. The address, agenda and speaker information can be found below. See you there!

    Agenda:
    6:00pm - 6:30pm: Pizza, drinks and networking
    6:30pm - 7:15pm: Kai Waehner - Introduction to Apache Kafka® as an Event Streaming Platform for Microservice Architectures
    7:15pm - 8:00pm: Bernd Rücker - Monitoring and Orchestration of Your Microservices Landscape with Kafka and Zeebe
    8:00pm - 8:15pm: Q&A

    -----

    Kai Waehner works as Technology Evangelist at Confluent. Kai's main areas of expertise lie within the fields of Big Data Analytics, Machine Learning / Deep Learning, Cloud / Hybrid Architectures, Messaging, Integration, Microservices, Stream Processing, Internet of Things and Blockchain. He is a regular speaker at international conferences such as JavaOne, O'Reilly Software Architecture or ApacheCon, writes articles for professional journals, and shares his experiences with new technologies on his blog (www.kai-waehner.de/blog). He also writes at https://cnfl.io/blog-kai-waehner. Contact and references: [masked] / @KaiWaehner / www.kai-waehner.de

    Title: Introduction to Apache Kafka as an Event Streaming Platform for Microservice Architectures
    Abstract: This session introduces Apache Kafka, an event-driven open source streaming platform. Apache Kafka goes far beyond scalable, high-volume messaging. In addition, you can leverage Kafka Connect for integration and the Kafka Streams API for building lightweight stream processing microservices in autonomous teams. The Confluent Platform adds further components such as a Schema Registry, REST Proxy, KSQL, clients for different programming languages and connectors for different technologies. The session discusses how tech giants like LinkedIn, eBay or Airbnb leverage Apache Kafka as an event streaming platform to solve a variety of business problems and how to create a scalable, flexible microservice architecture. A live demo shows how you can easily process and analyze streams of events using Apache Kafka and KSQL.

    -----

    Bernd Rücker: In 15+ years of software development, Bernd has helped to automate highly scalable core workflows at global companies including T-Mobile, Lufthansa and Zalando, and has contributed to various open source workflow engines. Bernd is co-founder and developer advocate at Camunda, an open source software company reinventing workflow automation. Bernd co-authored "Real-Life BPMN," a popular book about workflow modeling and automation, now in its fifth edition. Bernd regularly speaks at conferences and writes for industry publications. Bernd is currently focused on new workflow automation paradigms that fit into modern architectures around distributed systems, microservices, domain-driven design, event-driven architecture and reactive systems. https://bernd-ruecker.com/

    Title: Monitoring and Orchestration of Your Microservices Landscape with Kafka and Zeebe
    Abstract: In this talk, I'll demonstrate an approach based on real-life projects using the open source workflow engine zeebe.io to orchestrate microservices. Zeebe can connect to Kafka to coordinate workflows that span many microservices, providing end-to-end process visibility without violating the principles of loose coupling and service independence. Once an orchestration flow starts, Zeebe ensures that it is eventually carried out, retrying steps upon failure. In a Kafka architecture, Zeebe can easily produce events (or commands) and subscribe to events that will be correlated to workflows. Along the way, Zeebe facilitates monitoring and visibility into the progress and status of orchestration flows. Internally, Zeebe works as a distributed, event-driven and event-sourced system, making it not only very fast but also horizontally scalable and fault tolerant, and able to handle the throughput required to operate alongside Kafka in a microservices architecture. Expect not only slides but also fun little live-hacking sessions and real-life stories.
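
    As a companion to the Kafka introduction above, the following minimal sketch publishes a single event with the plain Java producer client. The broker address, topic name and payload are illustrative assumptions, not part of the talk.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class OrderProducerExample {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");              // illustrative broker address
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Key = order id, value = a simple JSON-ish payload (hypothetical event shape).
                producer.send(new ProducerRecord<>("orders", "order-42", "{\"item\":\"book\",\"qty\":1}"));
                producer.flush();
            }
        }
    }

    The same event could be consumed by a Kafka Streams application or a Connect sink without the producer knowing anything about those downstream consumers, which is the decoupling the session builds on.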

  • Apache Kafka, KSQL, Demos & Booking.com

    Booking.com (The Learning Center)

    ** IMPORTANT: Please make sure you sign up with your full first name and last name and bring photo ID to ensure entrance to the building for security purposes **

    -----

    Join us for an Apache Kafka meetup on March 21st from 6pm, hosted by Booking.com in Amsterdam. The address, agenda and speaker information can be found below. See you there!

    -----

    Agenda:
    17:30 - 18:00 - Guest arrival
    18:00 - 18:50 - Data Streaming Services and k8s by Sergey Belikov, Booking.com
    18:50 - 19:35 - Networking break and pizza/drinks
    19:35 - 20:35 - ATM Fraud Detection with Apache Kafka and KSQL by Robin Moffatt, Confluent
    20:35 - 20:55 - Demo by Mic Hussey, Confluent

    -----

    First Talk
    Speaker: Sergey Belikov
    Bio: Sergey joined Booking.com in 2016 as a backend developer in the Frontend department. Currently he works in the Core Infrastructure department within a team responsible for the development, support and 24/7 reliability management of large-scale internal data-streaming pipelines built on top of the Kafka ecosystem.
    Title: Data Streaming Services and k8s
    Abstract: Over the course of this talk I will share our experience of making data from a Kafka topic available for downstream processing as a Hive table. Kafka Connect plays a core role in our setup, so I will cover the challenges we faced along the way while building this platform. I will also review what kind of tooling and automation the team uses to stay sane during the day and sleep well during the night.

    -----

    Second Talk
    Speaker: Robin Moffatt
    Bio: Robin is a Developer Advocate at Confluent, the company founded by the original creators of Apache Kafka, as well as an Oracle Groundbreaker Ambassador and ACE Director (Alumnus). His career has always involved data, from the old worlds of COBOL and DB2, through the worlds of Oracle and Hadoop, and into the current world with Kafka. His particular interests are analytics, systems architecture, performance testing and optimization. He blogs at http://cnfl.io/rmoff and http://rmoff.net/ (and previously http://ritt.md/rmoff) and can be found tweeting grumpy geek thoughts as @rmoff. Outside of work he enjoys drinking good beer and eating fried breakfasts, although generally not at the same time.
    Title: ATM Fraud Detection with Apache Kafka and KSQL
    Abstract: Detecting fraudulent activity in real time can save a business significant amounts of money, but it has traditionally been an area requiring a lot of complex programming and frameworks, particularly at scale. Using KSQL, it's possible to use just SQL to build scalable real-time applications. In this talk, we'll look at what KSQL is, and how its ability to join streams of events can be used to detect possibly fraudulent activity based on a stream of ATM transactions. We'll also see how easy it is to integrate Kafka with other systems, both upstream and downstream, using Kafka Connect to stream from a database into Kafka, and from Kafka into Elasticsearch.

    -----

    Third Talk
    Speaker: Mic Hussey
    Bio: Mic is a Systems Engineer at Confluent, the company founded by the creators of Apache Kafka. He started out his career as a Civil Engineer in Dublin but fell into IT and moved to Stockholm at the height of the dot-com boom. He's been architecting and building event-based distributed systems for most of the time since, with a short detour into API Management along the way.
    Title: Demo
    Abstract: TBA

    -----
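
    The ATM fraud talk joins a stream of transactions with itself to spot suspicious pairs. KSQL expresses this in SQL; the sketch below shows a roughly equivalent idea using the Kafka Streams Java API instead, with hypothetical topic names ("atm_txns", "possible_fraud") and a made-up "amount@location" value format. It is only an illustration of the windowed self-join, not the demo's actual code.

    import java.time.Duration;
    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.JoinWindows;
    import org.apache.kafka.streams.kstream.KStream;

    public class AtmFraudExample {
        public static void main(String[] args) {
            StreamsBuilder builder = new StreamsBuilder();

            // Transactions keyed by account id; value assumed to be "amount@location" (illustrative).
            KStream<String, String> txns = builder.stream("atm_txns");

            // Self-join: pairs of transactions on the same account within 10 minutes of each other.
            KStream<String, String> pairs =
                    txns.join(txns, (left, right) -> left + " | " + right,
                              JoinWindows.of(Duration.ofMinutes(10)));

            // Flag pairs made at different locations as potentially fraudulent.
            pairs.filter((account, pair) -> {
                      String[] parts = pair.split(" \\| ");
                      String loc1 = parts[0].split("@")[1];
                      String loc2 = parts[1].split("@")[1];
                      return !loc1.equals(loc2);
                  })
                 .to("possible_fraud");

            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "atm-fraud-example");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            new KafkaStreams(builder.build(), props).start();
        }
    }

    In KSQL the equivalent would be a single windowed stream-stream join statement; the Streams code above just makes the moving parts explicit.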

  • Kafka, MQTT, Graph, KSQL and more!

    ING Acanthus

    *** PLEASE ENSURE YOU SIGN UP WITH YOUR FIRST NAME AND FAMILY NAME AND BRING YOUR PHOTO ID TO ENSURE ENTRANCE TO THE BUILDING FOR SECURITY PURPOSES ***

    Agenda:
    5:00pm: Doors open
    5:00pm - 6:00pm: Pizza, drinks and networking
    6:00pm - 6:45pm: Kai Waehner, Confluent - Processing IoT Data from End to End with MQTT and Apache Kafka
    6:45pm - 7:30pm: Will Bleker and Gary Stewart, ING - Pipelining the heroes with Kafka and Graph
    7:30pm - 7:45pm: Small break
    7:45pm - 8:30pm: Mic Hussey, Confluent - Apache Kafka and KSQL in Action: Let's Build a Streaming Data Pipeline!
    8:30pm - 9:30pm: Additional Q&A and networking

    -----

    Speaker: Kai Waehner, Technology Evangelist at Confluent ([masked] / @KaiWaehner / www.kai-waehner.de)
    Talk: Processing IoT Data from End to End with MQTT and Apache Kafka
    Abstract: This session discusses end-to-end use cases such as connected cars, smart homes or healthcare sensors, where you integrate Internet of Things (IoT) devices with enterprise IT using open source technologies and standards. MQTT is a lightweight messaging protocol for IoT. However, MQTT is not built for high scalability, long-term storage or easy integration with legacy systems. Apache Kafka is a highly scalable distributed streaming platform, which ingests, stores, processes and forwards high volumes of data from thousands of IoT devices. This session discusses the Apache Kafka open source ecosystem as a streaming platform to process IoT data. See a live demo of how MQTT brokers like Mosquitto or RabbitMQ integrate with Kafka, and how you can even integrate MQTT clients with Kafka without an MQTT broker. Learn how to analyze the IoT data either natively on Kafka with Kafka Streams/KSQL or on an external big data cluster like Spark, Flink or Elasticsearch, leveraging Kafka Connect.

    -----

    Speakers: Will Bleker (ING Chapter Lead & Middleware Engineer) and Gary Stewart (ING Platform Architect, Distributed Data)
    Talk: Kafka, MQTT, Graph, KSQL and more!
    Abstract: Filling the gap in our database offerings (e.g. graph), we were inspired by our past experience of bringing NoSQL into the financial industry. As with NoSQL, we started with two use-cases, one small and one big: small enough to learn, prove, share and deliver; big enough to challenge, imagine, impress and expand. With challenging requirements in availability, scalability and global reach we needed to reconsider our architecture. A strong starting principle for customer-facing services is that we adopt a masterless architecture where possible. In our use-case(s), availability is often more important than consistency; however, as time goes by and data quality degrades, consistency becomes more problematic. Imagine that one could design an architecture to remove throughput as a challenge, eliminate migrations and ensure consistency over time by means of re-deployments. We call this the cache cattle pipeline, and we started viewing our 'datastore' as multiple technologies, including Apache Kafka, that together provide a total solution meeting our demands for global reach. In our talk, we will share the use-case(s), an architectural overview and the paradigms we adopted to bring a graph database to life in our organisation.

    -----

    Speaker: Mic Hussey
    Bio: Mic is a Systems Engineer at Confluent.
    Talk: Apache Kafka and KSQL in Action: Let's Build a Streaming Data Pipeline!
    Abstract: Hopefully you're already familiar with Apache Kafka, the massively scalable technology that allows you to build robust data pipelines, and have come across the plethora of Kafka connectors that make it easy to join the ends of your data pipeline to other systems such as databases or Big Data stores. But what do you do in the middle? Would you like to be able to transform, enrich and filter the data as it moves along the pipeline? Just break out your compiler! Only kidding: using KSQL it's possible to achieve all this without opening Eclipse at all.
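
    The IoT talk above covers several integration options, from bridging MQTT brokers via Kafka Connect to connecting MQTT clients more directly. Purely to illustrate the underlying idea, the sketch below forwards MQTT messages into a Kafka topic using the Eclipse Paho client and the plain Kafka producer; broker addresses and topic names are made up, and this is not the connector-based setup shown in the live demo.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.ByteArraySerializer;
    import org.apache.kafka.common.serialization.StringSerializer;
    import org.eclipse.paho.client.mqttv3.MqttClient;

    public class MqttToKafkaBridge {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");              // illustrative Kafka broker
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", ByteArraySerializer.class.getName());
            KafkaProducer<String, byte[]> producer = new KafkaProducer<>(props);

            // Illustrative MQTT broker and topic filter, e.g. readings from connected devices.
            MqttClient mqtt = new MqttClient("tcp://localhost:1883", "kafka-bridge");
            mqtt.connect();
            mqtt.subscribe("sensors/#", (topic, message) ->
                    // Use the MQTT topic as the Kafka key and forward the raw payload unchanged.
                    producer.send(new ProducerRecord<>("iot-sensors", topic, message.getPayload())));

            System.out.println("Bridging MQTT sensors/# into Kafka topic iot-sensors...");
        }
    }

    A dedicated MQTT source connector or proxy would add buffering, retries and offset tracking; the bridge above only shows the direction of data flow.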

  • Sports Results and Music Videos, powered by Apache Kafka

    Together with our friends from the Utrecht Kafka Meetup (https://www.meetup.com/Kafka-Meetup-Utrecht/) group we are delighted to invite you to join us for an Apache Kafka meetup on December 7th from 6:30pm - 8:15pm, hosted by XITE in Amsterdam. The address is 28 Spijkerkade (https://maps.google.com/?q=28+Spijkerkade&entry=gmail&source=g). The agenda and speaker information can be found below. See you there!

    Agenda:
    6:00pm: Doors open
    6:30pm - 7:00pm: Networking, food and drinks
    7:00pm - 7:30pm: Presentation #1: Kafka as an Integration Platform, Roman Ivanov, XITE
    7:30pm - 8:15pm: Presentation #2: Kafka Pipeline for the Winter Olympics 2018, Jork Zijlstra and Casper Koning, Gracenote Sports
    8:15pm - 8:30pm: Additional Q&A and networking

    Abstract for XITE: "Besides using Kafka as a streaming data solution, XITE decided to go further and establish company-wide integration protocols. These protocols standardise micro-services communication and their integration with Business Intelligence and Data Lake systems. We will share our findings, applied patterns and thoughts about the future."

    Abstract for Gracenote Sports: "Gracenote Sports talk about their Kafka pipeline for the Winter Olympics 2018. Powering feeds and Olympic data widgets for a broadcaster's apps and website means delivering the official results faster than their competitors and social media. We will present an overview of our development using Kafka Streams to enhance our existing rich Olympic model and infrastructure." (speaker t.b.d.)

    ------

    Special thanks to XITE who are hosting us for this event.
    Don't forget to join our Community Slack Team (https://slackpass.io/confluentcommunity)!
    If you would like to speak or host our next event please let us know! [masked]
    NOTE: We are unable to cater for any attendees under the age of 18. Please do not sign up for this event if you are under 18.
