Stuttgart Apache Airflow® Meetup at Bosch!
Details
Join fellow Airflow enthusiasts and leaders at Bosch for an evening of engaging talks, great food and drinks, and exclusive swag!
PRESENTATIONS
Talk #1: The State of Airflow 2026
- Speaker: Sadok Ben Yahya, Senior Sales Engineer, Astronomer
Apache Airflow® continues to thrive as the world’s leading open-source data orchestration platform, with 30M downloads per month and over 3k contributors. 2025 marked a major milestone with the release of Airflow 3, which introduced DAG versioning, enhanced security and task isolation, assets, and more. These changes have reshaped how data teams build, operate, and govern their pipelines.
In this session, we will share insights from the State of Airflow 2026 report, including:
- Latest trends in how teams are using Airflow today
- What’s next for the project and ecosystem
- A discussion of emerging best practices and evolving use cases
Join us to hear directly from a leader in the community and discover how to get the most out of Airflow in the year ahead.
Talk #2: Operating Apache Airflow at Scale: Monitoring with Prometheus and Grafana
- Speakers: Daniel Wolf and Marco Küttelwesch, Developers, Bosch, Germany
Operating Apache Airflow at scale requires deep visibility into workflows, system components, and infrastructure. Effective monitoring is key to maintaining stability, performance, and fast root-cause analysis in production environments.
In this talk, you’ll learn how to collect and expose metrics from Apache Airflow using Prometheus and how its ecosystem of exporters helps monitor Airflow and its surrounding platform components. We’ll also cover how to deploy Prometheus alongside Airflow on Kubernetes using Helm, focusing on scalable and production-ready setups.
Finally, we’ll show how Grafana turns Prometheus metrics into actionable dashboards and alerts, enabling teams to detect issues early and operate Airflow reliably as workloads grow.
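As a minimal sketch of the metrics pipeline this talk describes (the host name below is an illustrative assumption, not the speakers' actual setup): Airflow can emit StatsD metrics, which a statsd_exporter running alongside it translates into a format Prometheus can scrape:

```ini
# airflow.cfg — enable StatsD metric emission
# (a common Airflow-on-Kubernetes monitoring setup)
[metrics]
statsd_on = True
statsd_host = statsd-exporter.monitoring.svc.cluster.local  # assumed service name
statsd_port = 9125
statsd_prefix = airflow
```

A Prometheus scrape job would then target the exporter's HTTP metrics port (9102 by default for statsd_exporter), and Grafana dashboards query Prometheus from there.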
Talk #3: Sneak Preview: Using Python asyncio natively with PythonOperator in Airflow 3.2
- Speaker: David Blain, BI Data Engineer, Infrabel, Belgium, Apache Airflow Committer
In Airflow 2, running asynchronous code was often as simple as wrapping an async function with asyncio.run().
With the introduction of Airflow 3 and the new Task SDK, this approach was no longer viable, as accessing connections and variables was not supported in an asynchronous context.
Starting with Airflow 3.2, this limitation is lifted. The PythonOperator can now execute asynchronous code natively, regardless of the executor being used, and developers can directly annotate async Python tasks.
This enhancement not only simplifies authoring async workflows, but also unlocks more efficient and scalable handling of I/O-bound work in Airflow data pipelines.
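As a rough illustration of the Airflow 2 pattern mentioned above (plain Python with no Airflow dependency; the function names are invented for this example), a synchronous python_callable would bridge into asyncio like this:

```python
import asyncio


async def fetch(name: str, delay: float) -> str:
    # stand-in for an async I/O call (e.g. an HTTP request)
    await asyncio.sleep(delay)
    return f"{name}: done"


async def fetch_all(names):
    # run all I/O-bound calls concurrently
    return await asyncio.gather(*(fetch(n, 0.01) for n in names))


def my_python_callable():
    # the Airflow 2 pattern: wrap the async entry point with asyncio.run()
    # inside an ordinary synchronous callable handed to PythonOperator
    return asyncio.run(fetch_all(["a", "b", "c"]))


print(my_python_callable())  # → ['a: done', 'b: done', 'c: done']
```

In Airflow 3.2, per the abstract, the wrapper becomes unnecessary: the async function itself can be the task.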
AGENDA
- 6:00-6:30 PM: Arrivals, networking, food & drinks
- 6:30-8:00 PM: Presentations
- 8:00-9:00 PM: Networking