Experience at Ooyala and Klarna - Apache Kafka


Details
• What we'll do
Join us for an Apache Kafka meetup on March 15th from 5:30pm. The agenda and speaker information can be found below. See you there!
-----
Agenda:
5:30pm: Doors open
5:30pm - 6:00pm: Networking and Drinks
6:00pm - 6:40pm: Ooyala Adtech: Liberating Data Streams with Kafka and Friends, Stanislav Chizhov, Ooyala Adtech
6:40pm - 7:10pm: Q&A, Networking, Pizza and Drinks
7:10pm - 7:50pm: Kafka story @ Klarna, Ivan Dyachkov, Klarna
7:50pm - 8:30pm: Additional Q&A and Networking
Title:
Ooyala Adtech: Liberating Data Streams with Kafka and Friends
Abstract:
At Ooyala Adtech we collect 1B+ ad events per day and deliver them in near real time to tens of different consumers, including the ad server's decision engine itself, forecasting, reporting and more. We used to run this on RabbitMQ and a home-built blob storage, but that setup was not reliable enough, had high operational costs and made onboarding new data streams difficult. We have recently finished replacing it with a new Kafka-based solution that does not have these limitations. We have switched all the consumers, changed the message format to Avro, and now use quite a lot of the Kafka ecosystem: Kafka Streams, the Google Pub/Sub connector, the S3 connector, Schema Registry and MirrorMaker, with KSQL and Debezium coming soon. For some of the stateful consumers we had to implement backup and restore of event topics to/from S3.
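For a sense of what the Avro-plus-Schema-Registry approach mentioned above looks like in practice, here is a minimal Java sketch of a producer publishing Avro-encoded ad events. The topic name, schema, broker and registry addresses are illustrative assumptions for this post, not Ooyala's actual configuration.

import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AdEventProducer {
    public static void main(String[] args) {
        // Hypothetical Avro schema for an ad event (the real schemas are not public).
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"AdEvent\",\"fields\":["
            + "{\"name\":\"adId\",\"type\":\"string\"},"
            + "{\"name\":\"timestamp\",\"type\":\"long\"}]}");

        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");           // assumed broker address
        props.put("schema.registry.url", "http://localhost:8081");  // assumed Schema Registry address
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            GenericRecord event = new GenericData.Record(schema);
            event.put("adId", "ad-123");
            event.put("timestamp", System.currentTimeMillis());
            // The serializer registers the schema with Schema Registry and encodes the record as Avro.
            producer.send(new ProducerRecord<>("ad-events", "ad-123", event));
        }
    }
}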
Speaker:
Stanislav Chizhov
Bio:
Stanislav is a software engineer at Ooyala Adtech, where he develops a data delivery backbone and ecosystem with the ultimate goal of making the creation and processing of arbitrary data streams a commodity. He works mostly with the Kafka technology stack and Java.
-----
Speaker:
Ivan Dyachkov
Bio:
Ivan has led the Kafka initiative at Klarna since its start three years ago. He is currently the Team Lead of the Servicing Reliability team and is gradually transferring ownership of the Kafka infrastructure to another team.
Title:
Kafka story @ Klarna
Abstract:
Our love story with Kafka at Klarna is very similar. We felt the pain of running numerous RabbitMQ clusters throughout the organization and when Kafka caught our attention we decided to try it out.
Fast forward 2 years – Kafka has become a universal transport for pretty much everything. Events, metrics, database transactions – if your new service needs to produce or consume some data, the answer is always "just use Kafka".
We'll talk about our journey with Kafka, the challenges we faced, why we reinvented some tools, and our company-wide guidelines for message formats and Kafka usage.
--------
Special thanks to Ooyala Adtech http://www.ooyala.com/ who are hosting us for this event.
Don't forget to join our Community Slack Team!
If you would like to speak at or host our next event, please let us know: community@confluent.io
NOTE: We are unable to cater for any attendees under the age of 18. Please do not sign up for this event if you are under 18.
