Building Zero Data Loss Pipelines with Apache Kafka

Details
Title: Building Zero Data Loss Pipelines with Apache Kafka
Abstract:
Kafka plays an increasingly important role in messaging and streaming systems and is becoming the de facto messaging platform in many enterprises. Managing and maintaining Kafka deployments, and tuning data pipelines for high performance and scalability, can be a challenging task.
In this session, we will discuss lessons learned and best practices for building zero data loss pipelines.
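As background for the abstract only (and not necessarily the configuration the session will recommend), a minimal sketch of the producer-side settings commonly associated with avoiding data loss in Kafka is shown below. The class name and method are illustrative; the configuration keys are standard Kafka producer settings. Broker- and topic-side counterparts typically include a replication factor of at least 3, min.insync.replicas of at least 2, and unclean.leader.election.enable=false.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.common.serialization.StringSerializer;

    // Hypothetical helper: builds a producer tuned for durability over latency.
    public class DurableProducerConfig {
        public static KafkaProducer<String, String> create(String bootstrapServers) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            // Wait for all in-sync replicas to acknowledge each write.
            props.put(ProducerConfig.ACKS_CONFIG, "all");
            // Retry transient failures instead of silently dropping records.
            props.put(ProducerConfig.RETRIES_CONFIG, Integer.toString(Integer.MAX_VALUE));
            // Prevent duplicates introduced by those retries.
            props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");

            return new KafkaProducer<>(props);
        }
    }

On the consuming side, disabling auto-commit and committing offsets only after records have been fully processed is the usual complement to these settings.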
Speaker Bio: Avinash Ramineni is a principal at Clairvoyant and leads the engineering efforts in the big data space. Avinash is a passionate technologist with a drive to understand the bigger picture and vision and convert them into pragmatic, implementable solutions. Avinash has close to two decades of experience in engineering and architecting large-scale systems. He specializes in providing solutions in the areas of data platforms, data security, cloud, and service- and event-driven architectures. Prior to Clairvoyant, Avinash was a principal engineer at Apollo Group, where he was responsible for innovation and technical guidance across all product development efforts. He earned a master's degree in computer science from Arizona State University. Avinash is also CTO of Kogni, Clairvoyant's data security product.
