Workshop: Building a Data Pipeline in Azure Data Factory using Terraform

Hosted By
Roman G. and 2 others

Details

Hi All,

We are pleased to invite you to our next workshop.

Time: 10:00 AM - 11:30 AM (BST)

This workshop will teach you how to create a seamless Azure Data Factory (ADF) pipeline using Terraform. A data pipeline is crucial for efficiently orchestrating data movement, transformation, and loading tasks. By the end of this workshop, you'll be able to create a pipeline that copies data from one storage container to another within Azure Blob Storage.

A sample CSV file will be provided.
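
To give you a feel for the Terraform involved, here is a minimal sketch of the environment setup. It is illustrative only: the azurerm provider version, region, and resource names below are placeholder assumptions, not the exact workshop code.

  terraform {
    required_providers {
      azurerm = {
        source  = "hashicorp/azurerm"
        version = "~> 3.0"
      }
    }
  }

  provider "azurerm" {
    features {}
  }

  # Resource group holding everything created in the workshop
  resource "azurerm_resource_group" "workshop" {
    name     = "rg-adf-workshop"
    location = "UK South"
  }

  # Storage account that will hold the source and destination containers
  resource "azurerm_storage_account" "workshop" {
    name                     = "adfworkshopdemo" # must be globally unique
    resource_group_name      = azurerm_resource_group.workshop.name
    location                 = azurerm_resource_group.workshop.location
    account_tier             = "Standard"
    account_replication_type = "LRS"
  }

  # Data Factory instance that will run the copy pipeline
  resource "azurerm_data_factory" "workshop" {
    name                = "adf-workshop-demo" # must be globally unique
    location            = azurerm_resource_group.workshop.location
    resource_group_name = azurerm_resource_group.workshop.name
  }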

Prerequisites

Session Highlights

  • Set Up the Environment: Configure Azure resources and Terraform to define your infrastructure.
  • Create Storage Containers: Build source and destination containers within Azure Blob Storage.
  • Configure ADF: Define linked services to access Azure Blob Storage.
  • Define Datasets: Create custom datasets specifying data format, location, and structure.
  • Create a Pipeline: Use Terraform to orchestrate copy activities from source to destination (see the Terraform sketch after this list).
  • Execute the Pipeline: Trigger and perform the data copy operation.
  • Learn Best Practices: Tips for optimising ETL workflows with Terraform.
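
The sketch below fills in the remaining steps: containers, linked service, datasets, and the copy pipeline. Again, this is a hedged illustration that builds on the setup sketch above; names such as ls_blob, ds_source_csv, pl_copy_csv, and sample.csv are placeholders, and exact attribute names can vary with your azurerm provider version.

  # Source and destination containers in the same storage account
  resource "azurerm_storage_container" "source" {
    name                  = "source"
    storage_account_name  = azurerm_storage_account.workshop.name
    container_access_type = "private"
  }

  resource "azurerm_storage_container" "destination" {
    name                  = "destination"
    storage_account_name  = azurerm_storage_account.workshop.name
    container_access_type = "private"
  }

  # Linked service that lets ADF reach Azure Blob Storage
  resource "azurerm_data_factory_linked_service_azure_blob_storage" "blob" {
    name              = "ls_blob"
    data_factory_id   = azurerm_data_factory.workshop.id
    connection_string = azurerm_storage_account.workshop.primary_connection_string
  }

  # Datasets describing the sample CSV in each container
  resource "azurerm_data_factory_dataset_delimited_text" "source" {
    name                = "ds_source_csv"
    data_factory_id     = azurerm_data_factory.workshop.id
    linked_service_name = azurerm_data_factory_linked_service_azure_blob_storage.blob.name

    azure_blob_storage_location {
      container = azurerm_storage_container.source.name
      filename  = "sample.csv"
    }

    column_delimiter    = ","
    first_row_as_header = true
  }

  resource "azurerm_data_factory_dataset_delimited_text" "destination" {
    name                = "ds_destination_csv"
    data_factory_id     = azurerm_data_factory.workshop.id
    linked_service_name = azurerm_data_factory_linked_service_azure_blob_storage.blob.name

    azure_blob_storage_location {
      container = azurerm_storage_container.destination.name
      filename  = "sample.csv"
    }

    column_delimiter    = ","
    first_row_as_header = true
  }

  # Pipeline with a single Copy activity, defined as raw activities JSON
  resource "azurerm_data_factory_pipeline" "copy" {
    name            = "pl_copy_csv"
    data_factory_id = azurerm_data_factory.workshop.id

    activities_json = jsonencode([
      {
        name    = "CopyCsv"
        type    = "Copy"
        inputs  = [{ referenceName = azurerm_data_factory_dataset_delimited_text.source.name, type = "DatasetReference" }]
        outputs = [{ referenceName = azurerm_data_factory_dataset_delimited_text.destination.name, type = "DatasetReference" }]
        typeProperties = {
          source = { type = "DelimitedTextSource" }
          sink   = { type = "DelimitedTextSink" }
        }
      }
    ])
  }

After terraform apply, the pipeline can be triggered on demand from ADF Studio, or from the Azure CLI with az datafactory pipeline create-run (which requires the datafactory CLI extension); we will trigger it together during the session.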

Worried about billing? Don't be! We'll use the free tier settings in Azure. If you follow along correctly, you'll incur no charges.

Discover the power of cloud-based data orchestration and the automation capabilities of Terraform. Elevate your data integration and infrastructure-as-code skills in this workshop led by industry experts.

Abdulraouf Atia is a DevOps engineering professional with experience delivering value to cross-functional teams and integrating and automating applications in fast-paced environments.

Data Science and Engineering Club