
Operating Apache Pinot @ Uber Scale

Hosted By
Navin S.

Details

Uber has a complex marketplace consisting of riders, drivers, eaters, restaurants, and so on. Operating that marketplace at a global scale requires real-time intelligence and decision making. For instance, identifying delayed Uber Eats orders or abandoned carts enables our community operations team to take corrective action. Having a real-time dashboard of events such as consumer demand, driver availability, or trips happening in a city is crucial for day-to-day operations, incident triaging, and financial intelligence.

Over the last few years, we’ve built a self-service platform to power these use cases, and many others, across different parts of Uber. The core building block of this platform is Apache Pinot – a distributed Online Analytical Processing (OLAP) system designed for low-latency analytical queries on terabyte-scale data. In this talk, we will first give an overview of Pinot. After that, we will present the details of this platform and how it fits into Uber’s ecosystem. We will then do a deep dive into a few use cases in which Uber engineers have built their analytics products on top of Pinot.
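To make "low-latency analytical queries" concrete, below is a minimal sketch of the kind of query a real-time dashboard or alerting job might send to a Pinot broker, such as counting Eats orders that have been open for more than 30 minutes. The broker URL, the eats_orders table, and its column names are hypothetical; the request and response shapes assume Pinot's standard SQL-over-HTTP broker endpoint (/query/sql), not anything specific to Uber's internal platform.

```python
import time
import requests

# Hypothetical broker address; real deployments expose the SQL endpoint on the broker.
BROKER_SQL_ENDPOINT = "http://pinot-broker.example.com:8099/query/sql"

# Compute the time threshold client-side and inline it as epoch millis.
now_ms = int(time.time() * 1000)
thirty_min_ago_ms = now_ms - 30 * 60 * 1000

# Orders still open after 30 minutes, grouped by city (table/columns are illustrative).
sql = f"""
    SELECT city, COUNT(*) AS delayed_orders
    FROM eats_orders
    WHERE order_status = 'OPEN'
      AND created_at_millis < {thirty_min_ago_ms}
    GROUP BY city
    ORDER BY delayed_orders DESC
    LIMIT 10
"""

# Pinot brokers accept SQL over HTTP as a JSON payload of the form {"sql": "..."}.
resp = requests.post(BROKER_SQL_ENDPOINT, json={"sql": sql}, timeout=5)
resp.raise_for_status()

# The SQL response carries a resultTable with a dataSchema and rows.
table = resp.json()["resultTable"]
for row in table["rows"]:
    print(dict(zip(table["dataSchema"]["columnNames"], row)))
```

Queries like this typically return in tens to hundreds of milliseconds because Pinot serves them from indexed, columnar segments distributed across servers, which is what makes operational dashboards on fresh event data practical.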

About the Speaker:
Ting Chen is a software engineer on Uber’s Data team. He is a tech lead on the stream analytics team, whose mission is to provide fast and reliable real-time insights to Uber products and customers. Ting is an Apache Pinot contributor.

Big Data, Analytics, and Machine Learning