
Integrating Databases with Kafka / Swisscom Firehose (aka Kafka aaS @Swisscom)

Hosted By
Alice R.

Details

Join us for our next Zürich Apache Kafka meetup on May 24th from 6:00pm, hosted by IPT. The agenda, venue and speaker information can be found below. See you there!

----- Agenda:

6:00pm: Doors open
6:10pm - 6:45pm: Presentation #1: Integrating Databases and Kafka: The How and The Why - Robin Moffatt (Confluent)
6:45pm - 7:20pm: Presentation #2: Swisscom Data Lake - Thibaud Chardonnens & David Jacot (Swisscom)
7:20pm - 8:30pm: Networking, apéro and drinks

------

Speaker:
Robin Moffatt, Confluent

Bio:
Robin is a Developer Advocate at Confluent, the company founded by the creators of Apache Kafka, as well as an Oracle ACE Director and Developer Champion. His career has always involved data, from the old worlds of COBOL and DB2, through the worlds of Oracle and Hadoop, and into the current world with Kafka. His particular interests are analytics, systems architecture, performance testing and optimization. He blogs at https://www.confluent.io/blog/author/robin/ and http://rmoff.net/ (and previously http://ritt.md/rmoff ) and can be found tweeting grumpy geek thoughts as @rmoff. Outside of work he enjoys drinking good beer and eating fried breakfasts, although generally not at the same time.

Title:
Integrating Databases and Kafka: The How and The Why

Abstract:
Companies new and old are all recognising the importance of a low-latency, scalable, fault-tolerant data backbone in the form of the Apache Kafka® streaming platform. With Kafka, developers can integrate multiple sources and systems, which enables low-latency analytics, event-driven architectures and the population of multiple downstream systems.

In this talk we’ll look at one of the most common integration requirements - connecting databases to Kafka. We’ll consider the concept that all data is a stream of events, including that residing within a database. We’ll look at why we’d want to stream data from a database, including driving applications in Kafka from events upstream. We’ll discuss the different methods for connecting databases to Kafka, and the pros and cons of each. Techniques including Change-Data-Capture (CDC) and Kafka Connect will be covered, as well as an exploration of the power of KSQL for performing transformations such as joins on the inbound data.

Attendees of this talk will learn:

  • That all data is event streams; databases are just a materialised view of a stream of events.

  • The best ways to integrate databases with Kafka.

  • Anti-patterns of which to be aware.

  • The power of KSQL for transforming streams of data in Kafka.
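To make the stream/table duality above a little more concrete, here is a minimal sketch of a stream-table join using the Kafka Streams Java API. It assumes a "customers" topic fed by CDC from a database table and an "orders" event stream; the topic names, string serdes and join output are illustrative assumptions, not material from the talk (which focuses on KSQL and Kafka Connect).

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KTable;

    public class OrderEnrichmentSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-enrichment-sketch");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();

            // CDC feed of the customers table, materialised as a table: latest value per customer key.
            KTable<String, String> customers = builder.table("customers");

            // Stream of order events, keyed by customer id (assumed).
            KStream<String, String> orders = builder.stream("orders");

            // Enrich each order with the current state of the customer record and write downstream.
            orders.join(customers, (order, customer) -> order + " | " + customer)
                  .to("orders-enriched");

            new KafkaStreams(builder.build(), props).start();
        }
    }

The same join can be expressed declaratively in KSQL over a STREAM and a TABLE, which is the approach the talk explores.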

----------

Speakers:
Thibaud Chardonnens & David Jacot

Bios:
David Jacot is the head of big data infrastructure at Swisscom, where he has grown the big data platform from zero to a petabyte-sized production infrastructure while scaling the infrastructure team from 0 to 20 engineers.

Thibaud Chardonnens is a big data engineer at Swisscom, where he was one of the first engineers involved in the development of the big data platform. For the last five years he has mainly been building streaming applications and integrating the Kafka ecosystem within Swisscom.

Title:
Swisscom Firehose (aka Kafka as a Service @ Swisscom)

Abstract:
Kafka has been used for several years at Swisscom to stream data from various sources to sinks such as Hadoop. Providing Kafka as a Service to multiple teams in a large company presents governance, security and multi-tenancy challenges.

In this talk we will present how we have built our Swisscom Firehose platform which enables teams to use Kafka internally. We will explain how we have tackled these challenges by describing our governance model, our identity & ACLs management, and our self-service capabilities. We will also present how we leverage Kubernetes and how it simplifies our operations.
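To give a flavour of the kind of per-tenant access management such a platform has to automate, here is a small sketch using the standard Kafka AdminClient to grant a hypothetical team principal read access to its own topic prefix. The principal name, topic prefix and broker address are assumptions for illustration only; this does not represent Swisscom's actual tooling.

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.common.acl.AccessControlEntry;
    import org.apache.kafka.common.acl.AclBinding;
    import org.apache.kafka.common.acl.AclOperation;
    import org.apache.kafka.common.acl.AclPermissionType;
    import org.apache.kafka.common.resource.PatternType;
    import org.apache.kafka.common.resource.ResourcePattern;
    import org.apache.kafka.common.resource.ResourceType;

    public class GrantTeamReadAccess {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                // All topics whose names start with the team's prefix (hypothetical naming convention).
                ResourcePattern teamTopics =
                    new ResourcePattern(ResourceType.TOPIC, "team-a.", PatternType.PREFIXED);

                // Allow the team's principal to read those topics from any host.
                AccessControlEntry allowRead =
                    new AccessControlEntry("User:team-a", "*", AclOperation.READ, AclPermissionType.ALLOW);

                admin.createAcls(Collections.singleton(new AclBinding(teamTopics, allowRead)))
                     .all()
                     .get();
            }
        }
    }

In practice a self-service layer would wrap calls like this behind team-facing tooling, which is part of what the talk describes.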

--------

Don't forget to join our Community Slack Team (https://launchpass.com/confluentcommunity)!

If you would like to speak at or host our next event, please let us know: community@confluent.io

Zürich Apache Kafka® Meetup by Confluent