Real-time clinical decision support with Kafka/Ensuring Data Quality in Kafka


Details
Hello Streamers!
Join us for an IN PERSON Apache Kafka® meetup on Thursday, July 18th from 5:30 pm, hosted by our friends at HCA Healthcare in Nashville!
*****VERY IMPORTANT***** Please complete the form below for entry to the venue: https://docs.google.com/forms/d/e/1FAIpQLSec5Zg0EFSm5_SPCZI2m_guKT3hQe8Z9i5jqiVgaHJmKsn5_A/viewform
📍 Venue:
HCA Healthcare
2555 Park Plaza
Room: Davidson1
Nashville, TN 37203
***🗓 Agenda:
5:30pm - 6:00pm: Doors Open, Networking, Pizza and Drinks
6:00pm - 6:50pm: Derek Schatzlein, Software Engineer, HCA Healthcare
6:50pm - 7:30pm: Sandon Jacobs, Sr. Developer Advocate, Confluent
7:30pm - 8:00pm: More networking, Q&A
***
💡 Speaker:
Derek Schatzlein, Software Engineer, HCA Healthcare
Title of Talk:
Real-time clinical decision support applications powered by the Confluent Cloud Kafka platform
Abstract:
With streaming real-time patient data from 180+ hospitals, data is plentiful at HCA — but it is also eclectic and ill-suited to the needs of data science and real-time decision support applications. Enter Sojourner, a streaming feature engineering platform built by Accelerated Technologies, backed by Kafka and GKE. Sojourner uses a lambda architecture of Clojure services to produce features for 40 million patient encounters per year, and it drives a variety of applications in our hospitals, from sepsis detection to data science. This talk will describe the design of the Sojourner system, how we define features and deliver them in Confluent, and the challenges of deploying a fleet of Clojure microservices at scale.
Bio:
Derek Schatzlein is a software engineer, distributed systems enthusiast, and amateur historian. He is the director of software engineering at Accelerated Technologies, where he guides the development of real-time applications such as NATE and SPOT, as well as CT&I data science endeavors.
-----
💡 Speaker:
Sandon Jacobs, Sr. Developer Advocate, Confluent
Title of Talk:
Ensuring Data Quality in Apache Kafka®: Great Power and Responsibility
Abstract:
The great philosopher Stan Lee once wrote: “with great power comes great responsibility.” Little did he know the lesson he was teaching about data stream quality. But who’s actually responsible for the quality and integrity of Apache Kafka-based data streams? Data contracts and schema evolution are central to building responsible Kafka producers. Let’s discuss the best practices involved in making producer applications good stewards of data streams for downstream consumers.
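To make the data-contract idea concrete, here is a minimal, hypothetical sketch (not taken from the talk) of a Java producer that serializes records with Avro through Confluent Schema Registry. The topic name, schema, field names, and endpoints are illustrative assumptions; the default on the added "source" field is the kind of choice that keeps schema evolution backward compatible for downstream consumers.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;
import io.confluent.kafka.serializers.KafkaAvroSerializer;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;

public class VitalsProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // assumed broker endpoint
        props.put("key.serializer", StringSerializer.class.getName());
        // The Avro serializer validates records against the registered schema,
        // enforcing the data contract at produce time.
        props.put("value.serializer", KafkaAvroSerializer.class.getName());
        props.put("schema.registry.url", "http://localhost:8081"); // assumed Schema Registry endpoint

        // Hypothetical schema; the defaulted "source" field illustrates a
        // backward-compatible evolution that existing consumers can ignore.
        String schemaJson = "{\"type\":\"record\",\"name\":\"VitalSign\",\"fields\":["
            + "{\"name\":\"patientId\",\"type\":\"string\"},"
            + "{\"name\":\"heartRate\",\"type\":\"int\"},"
            + "{\"name\":\"source\",\"type\":\"string\",\"default\":\"unknown\"}]}";
        Schema schema = new Schema.Parser().parse(schemaJson);

        GenericRecord vital = new GenericData.Record(schema);
        vital.put("patientId", "p-123");
        vital.put("heartRate", 72);
        vital.put("source", "bedside-monitor");

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("vitals", "p-123", vital));
        }
    }
}
```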
Bio:
Sandon Jacobs is a software engineer and avid golfer. He is a developer advocate at Confluent, the company creating the foundational platform for data in motion.
Twitter: @SandonJacobs
LinkedIn: linkedin.com/in/sandonjacobs
GitHub: https://github.com/sandonleejacobs and https://github.com/sandonjacobs
***DISCLAIMER: By attending this event in person, you acknowledge that the risk includes possible exposure to and illness from infectious diseases, including COVID-19, and you accept responsibility for this if it occurs. As the classroom is a mask-on setting, please be reminded that masks should be worn at all times unless actively eating or drinking. NOTE: We are unable to cater for any attendees under the age of 18.
