
Apache Kafka® Fundamentals (with Cloud component)

Hosted By
Alice R.

Details

Hello Streamers!

Join us for a special Confluent Developer Live Apache Kafka® meetup on Monday, March 21st at 10:00 am GMT. Below you will find the details to join this free hands-on training, which includes a walkthrough exercise on Apache Kafka® Fundamentals (with Cloud component).

The session will be run by Renu Gokhale, Technical Instructor at Confluent, and will include a presentation on important technical fundamentals as well as live hands-on training.
As with all Confluent Meetups, it is free, purely technical, and purely educational.

******TWO IMPORTANT ACTIONS TO ATTEND THIS EVENT******
STEP 1: BETWEEN NOW AND THE EVENT:
Please sign up for Confluent Cloud using this link (it's free): https://cnfl.io/live-cloud. Feel free to use the promo code CC60COMM as well. This will ensure that you can follow along (for free) as Renu presents the hands-on lab, with guidance on setting up Confluent Cloud and simple producing and consuming. (A short producer sketch follows Step 2 below as a preview.)

STEP 2: WHEN THE EVENT BEGINS:
Join this meetup by going to the Zoom app and entering the following details:
Meeting ID: 992 3168 9961
Password: 138861
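
If you would like a preview of the kind of producing the hands-on lab walks through, here is a minimal sketch using the Kafka Java client against a Confluent Cloud cluster. The bootstrap server, API key/secret, and topic name are placeholders of our own, not values from this event; substitute the connection details from your own cluster's settings page.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class HelloCloudProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder connection details: copy the real values from your Confluent Cloud cluster.
        props.put("bootstrap.servers", "<BOOTSTRAP_SERVER>");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"<API_KEY>\" password=\"<API_SECRET>\";");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Write one message to a topic created beforehand in the Cloud UI or CLI (topic name assumed here).
            producer.send(new ProducerRecord<>("demo-topic", "key-1", "hello from the meetup lab"));
            producer.flush();
        }
    }
}

Nothing in this sketch is required before the event; the lab itself will walk through setup and simple producing and consuming step by step.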

******EVENT DETAILS******
Agenda (times below are in GMT):
10:00 AM - about 10:50 AM
- Presentation on Apache Kafka® Fundamentals (with Cloud component)
- Two embedded quick knowledge checks with instant feedback

about 10:50 AM - 11:30 AM
- Interactive discussion of one or more concrete scenario problems
- Demo & hands-on lab
- Q&A and additional discussion, time permitting
-----
Speaker:
Renu Gokhale, Technical Instructor, Confluent

Title:
Apache Kafka Fundamentals (with Cloud component)

Abstract:
Apache Kafka was built with the vision to become the central nervous system that makes real-time data available to all the applications that need to use it, with numerous use cases ranging from stock trading and fraud detection to transportation, data integration, and real-time analytics. Whether you're just getting started or have already built stream processing applications, you will find actionable insights in this series that will enable you to further derive business value from your data systems.

This session will get into the technical fundamentals of how Apache Kafka works. The presentation is broken down into five lessons:
1. Getting Started
2. How are Messages Organized?
3. How Do I Scale and Do More Things With My Data?
4. What’s Going On Inside Kafka?
5. Recapping and Going Further

After participating in the lessons and activities in this course, you will be able to do the following (a minimal consumer sketch after this list illustrates a few of these ideas):
* Describe the life cycle of a message
* Explain the concepts of topics and partitions
* Describe some properties of producers and consumers
* Explain two types of offsets
* Explain groups and their benefits
* List some responsibilities of brokers
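
As a rough illustration of several of these ideas (consumer groups, a record's offset within its partition, and the committed offset a group resumes from), here is a minimal consumer sketch using the Kafka Java client. The connection placeholders, topic, and group name are assumptions for the example, not part of the course material.

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class HelloCloudConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Same placeholder Confluent Cloud connection settings as the producer sketch above.
        props.put("bootstrap.servers", "<BOOTSTRAP_SERVER>");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"<API_KEY>\" password=\"<API_SECRET>\";");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        // Consumers sharing this group.id divide the topic's partitions among themselves.
        props.put("group.id", "meetup-lab-group");
        props.put("auto.offset.reset", "earliest");
        props.put("enable.auto.commit", "false");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("demo-topic"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                // record.offset() is the record's position within its partition; commitSync() below
                // stores the group's committed offset, which is where this group resumes next run.
                System.out.printf("partition=%d offset=%d value=%s%n",
                    record.partition(), record.offset(), record.value());
            }
            consumer.commitSync();
        }
    }
}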

Bio:
Renu Gokhale is a technical trainer at Confluent, based out of its Singapore office. She works as an instructor for the Admin, Developer, and Streams courses and conducts training programs across cultures and geographies spanning Asia and Europe. Prior to joining Confluent, Renu taught a variety of computer science courses at engineering institutes in India, Malaysia, and Singapore. She is passionate about translating her love of teaching into helping others learn.
-----
Online Meetup Etiquette:
• Please use the “Raise Hand” button and the Zoom chat feature to ask questions during the presentation!
• Please raise your hand when you have a question and you will be unmuted.
• Please arrive on time, as Zoom meetings can become locked for many reasons. We will begin promptly.
-----
Don't forget to join our Forum and Community Slack to ask any follow-up questions: https://cnfl.io/meetup-questions

If you would like to speak at or host our next event, please let us know at community@confluent.io.
