Agentic Data Engineering: Accelerating dbt, Pipelines, and Warehouse Development
Details
Building and maintaining modern data platforms across dbt, orchestration tools, and cloud warehouses is still time-consuming, error-prone, and heavily dependent on tribal knowledge. While AI coding tools promise speed, they often lack the context needed to safely work across systems like Snowflake, Apache Airflow, and Dagster.
In this session, we’ll explore how “agentic” AI is changing the way data engineers build and deploy across the entire stack. Instead of simple copilots, these systems understand your schemas, lineage, dbt project structure, and pipeline dependencies—allowing them to generate, validate, and optimize code with real context.
We’ll walk through how this approach can:
· Accelerate dbt model development and refactoring
· Generate and validate SQL directly against warehouse schemas
· Debug failing pipelines in Airflow and Dagster with full context
· Automatically create tests, documentation, and lineage-aware changes
· Enforce guardrails for cost, data quality, and governance
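To make one of these capabilities concrete, the "generate and validate SQL against warehouse schemas" step can be sketched in a few lines: compile agent-generated SQL against an in-memory mirror of the schema before it ever touches production. This is a minimal illustration, not the product's implementation; SQLite stands in for Snowflake, and the table and column names are hypothetical.

```python
# Sketch of schema-aware SQL validation: "dry-run" generated SQL against a
# local copy of the warehouse schema. SQLite stands in for Snowflake here,
# and the schema/queries are illustrative, not from any real project.
import sqlite3


def validate_sql(schema_ddl: list[str], candidate_sql: str) -> tuple[bool, str]:
    """Compile candidate_sql against an in-memory copy of the schema.

    EXPLAIN forces SQLite to parse and plan the query without executing it,
    so bad table or column references surface as errors at zero cost.
    Returns (ok, message).
    """
    conn = sqlite3.connect(":memory:")
    try:
        for ddl in schema_ddl:
            conn.execute(ddl)
        conn.execute("EXPLAIN " + candidate_sql)
        return True, "ok"
    except sqlite3.Error as exc:
        return False, str(exc)
    finally:
        conn.close()


schema = ["CREATE TABLE orders (order_id INT, customer_id INT, amount REAL)"]

# A well-formed query compiles cleanly...
ok, _ = validate_sql(
    schema, "SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id"
)

# ...while a reference to a nonexistent column is caught before deployment.
bad, err = validate_sql(schema, "SELECT customer_email FROM orders")
```

An agentic system extends this idea with real warehouse metadata, lineage, and dbt project context, but the core loop — generate, compile against the schema, repair on error — is the same.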
The result: faster delivery across your data platform—without sacrificing trust in production systems.
Speaker:
Steven Johnson, Technical Marketing Manager at Altimate
Steven is a former math teacher who, after transitioning into a data science career, worked with a number of startups in the data space. He is currently the Technical Marketing Manager at Altimate AI, where he focuses on helping data teams understand and adopt modern data tooling. His work includes technical content, product education, and storytelling that connects complex data infrastructure with the people who use it.
