## No Python? No problem.

Delta Live Tables (DLT), now Lakeflow Declarative Pipelines, brings powerful, production-ready data pipelines to life with just SQL. In this session, we’ll show how analysts and engineers alike can declaratively define robust pipelines – from raw ingestion to analytics-ready data – all without writing a single line of Python.

Using a real-world business scenario and dataset, we’ll walk through how to (illustrative SQL sketches for each step follow the list):

  • Ingest data with schema inference and Auto Loader into a Bronze layer
  • Apply transformations and quality expectations in the Silver layer
  • Build a business-friendly Star Schema in the Gold layer
  • Add data quality tests directly into your pipeline
  • Compare deployment options in the UI and via Databricks Asset Bundles (DABs) – see the bundle sketch after the agenda
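
To give a flavor of the Bronze step, here is a minimal sketch of Auto Loader ingestion in DLT SQL. The table name, landing path, and file format are hypothetical stand-ins, not the session’s actual dataset.

```sql
-- Bronze: incremental ingestion with Auto Loader; the schema is inferred
-- from the files, so no DDL is needed up front.
CREATE OR REFRESH STREAMING TABLE orders_bronze
COMMENT "Raw orders landed as-is from cloud storage"
AS SELECT
  *,
  _metadata.file_path AS source_file,   -- keep provenance for debugging
  current_timestamp() AS ingested_at
FROM STREAM read_files(
  '/Volumes/demo/landing/orders/',      -- hypothetical landing location
  format => 'json'
);
```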
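
The Silver step pairs transformations with declarative quality expectations. A sketch under the same hypothetical schema:

```sql
-- Silver: typed, validated rows. Expectations are declared as constraints;
-- DROP ROW discards bad records, while the default action keeps the row
-- and only records the violation in pipeline metrics.
CREATE OR REFRESH STREAMING TABLE orders_silver (
  CONSTRAINT valid_order_id  EXPECT (order_id IS NOT NULL) ON VIOLATION DROP ROW,
  CONSTRAINT positive_amount EXPECT (amount > 0)
)
AS SELECT
  CAST(order_id AS BIGINT)       AS order_id,
  CAST(customer_id AS BIGINT)    AS customer_id,
  CAST(amount AS DECIMAL(10, 2)) AS amount,
  to_date(order_ts)              AS order_date
FROM STREAM(orders_bronze);
```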
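
For the Gold layer, dimensions and facts can be plain materialized views over Silver. This is deliberately thin – a real customer dimension would join in a proper customer source:

```sql
-- Gold: a small star schema; the pipeline keeps these views up to date,
-- refreshing incrementally where possible.
CREATE OR REFRESH MATERIALIZED VIEW dim_customer
AS SELECT DISTINCT customer_id
FROM orders_silver;

CREATE OR REFRESH MATERIALIZED VIEW fact_orders
AS SELECT order_id, customer_id, order_date, amount
FROM orders_silver;
```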
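
And “tests” in DLT are simply expectations set to fail the update, so a referential-integrity check can live inside the pipeline itself (hypothetical names again):

```sql
-- Test: fail the whole update if any fact row lacks a matching dimension row.
CREATE OR REFRESH MATERIALIZED VIEW test_fact_orders_integrity (
  CONSTRAINT fact_has_customer
    EXPECT (dim_customer_id IS NOT NULL) ON VIOLATION FAIL UPDATE
)
AS SELECT
  f.order_id,
  d.customer_id AS dim_customer_id
FROM fact_orders AS f
LEFT JOIN dim_customer AS d
  ON f.customer_id = d.customer_id;
```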

Whether you’re a data analyst taking your first steps into engineering or an engineer looking for faster, simpler ways to build pipelines, this session will give you the practical foundation you need to unlock value with DLT.

Agenda:

  • 5 min – Welcome & Business Context
  • 10 min – Why Delta Live Tables?
  • 5 min – Pipeline overview (sources, processing, presentation)
  • 10 min – Bronze layer: ingestion, schema inference, expectations
  • 5 min – Silver layer: transformations, quality rules
  • 5 min – Gold layer: star schema design
  • 5 min – Testing in DLT
  • 10 min – Deployment: UI vs DABs (bundle sketch below)
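
On the deployment comparison: the UI path is point-and-click, while a Databricks Asset Bundle captures the same pipeline as versionable configuration. A minimal, illustrative databricks.yml – names and paths are hypothetical, and the schema here is trimmed to the essentials:

```yaml
# databricks.yml – a stripped-down bundle defining one DLT pipeline.
bundle:
  name: dlt-sql-demo

resources:
  pipelines:
    orders_pipeline:
      name: orders-dlt-pipeline
      catalog: demo
      target: analytics          # schema where pipeline tables are published
      libraries:
        - file:
            path: ./src/orders_pipeline.sql

targets:
  dev:
    mode: development
    default: true
  prod:
    mode: production
```

From there, `databricks bundle validate` checks the config, `databricks bundle deploy -t dev` pushes the pipeline to a workspace, and `databricks bundle run orders_pipeline` triggers an update.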

Topics: Big Data · Business Intelligence · Data Analytics · Data Management · Database Professionals
