
Introduction to network telemetry using Apache Kafka in Confluent Cloud

Network event
66 attendees from 3 hosting groups
Hosted by Nick T. and Joel K.

Details

Telemetry is an area of increasing focus in IT operations, providing the raw data consumed by Machine Learning / Artificial Intelligence (ML/AI) algorithms for AIOps (Artificial Intelligence for IT Operations).

Network operators have long relied on SNMP and Syslog to monitor the network. Network telemetry (streaming data pushed to a collector) is replacing the polling of network devices. The push approach places less burden on the device CPU, delivers data promptly, and is initiated by the device when a state change is detected.

Open source tools exist to receive, store, visualize, and alert on telemetry data; but how should the network operator provide access to infrastructure telemetry data, in real time, at scale, across all technology stakeholders?

This session, presented by Joel King, illustrates publishing telemetry data from the Meraki SDK to Apache Kafka deployed in Confluent Cloud. Kafka is a distributed event store and stream-processing platform designed for big data and high throughput. Using the developer instance of Confluent Cloud and the Python SDK, we examine the ease with which a network operator can publish and consume telemetry data to implement an AIOps approach of their own.
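
As a rough sketch of the kind of pipeline the session covers, the snippet below polls device status from the Meraki Dashboard API and publishes each record to a Kafka topic in Confluent Cloud using the confluent-kafka Python client. The topic name, environment-variable credentials, and the getOrganizationDevicesStatuses call are illustrative assumptions, not the presenter's exact code.

    # Sketch: publish Meraki device status records to a Kafka topic in Confluent Cloud.
    # Assumes the 'meraki' and 'confluent-kafka' packages and placeholder credentials
    # supplied via environment variables.
    import json
    import os

    import meraki
    from confluent_kafka import Producer

    # Confluent Cloud connection details (placeholders).
    producer = Producer({
        "bootstrap.servers": os.environ["CC_BOOTSTRAP"],
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": os.environ["CC_API_KEY"],
        "sasl.password": os.environ["CC_API_SECRET"],
    })

    dashboard = meraki.DashboardAPI(
        api_key=os.environ["MERAKI_API_KEY"], suppress_logging=True
    )

    # Pull the current device statuses for an organization and publish each one,
    # keyed by device serial so updates for a device land in the same partition.
    statuses = dashboard.organizations.getOrganizationDevicesStatuses(
        os.environ["MERAKI_ORG_ID"]
    )
    for status in statuses:
        producer.produce(
            "meraki.device.status",        # illustrative topic name
            key=status.get("serial", ""),
            value=json.dumps(status),
        )

    producer.flush()  # block until all queued messages are delivered

On the consuming side, any stakeholder can subscribe to the same topic with the confluent-kafka Consumer class and feed the records into their own storage, visualization, or ML/AI tooling.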

RTP Programmability and Automation Meetup