Modular Data Stack — Build a Data Platform with Prefect, dbt and Snowflake (Part 4)
Scheduling, data ingestion, and backfilling implemented with modular building blocks and simple deployment patterns
This is a continuation of a series of articles about building a data platform with Prefect, dbt, and Snowflake. If you’re new to this series, check out this summary linking to the previous posts. This demo is hands-on and dives into scheduled and ad-hoc flow runs (and the difference between the two), local development with Prefect, data ingestion, and backfilling.
To make the demo easy to follow, the 🤖 emoji highlights sections that ask you to run or do something (rather than just read along). The code for the entire tutorial series is available in the prefect-dataplatform GitHub repository.
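As a taste of the first topic below, here is a minimal sketch of triggering an ad-hoc run from an existing deployment in Python with Prefect 2. The deployment name `ingestion/default` and the parameter are placeholders, not names defined in earlier parts of this series — substitute your own deployment:

```python
from prefect.deployments import run_deployment

# Trigger an ad-hoc run of an existing deployment.
# "ingestion/default" follows the "flow-name/deployment-name" convention;
# replace it with a deployment you actually created.
flow_run = run_deployment(
    name="ingestion/default",
    parameters={"start_date": "2022-12-01"},  # hypothetical flow parameter
    timeout=0,  # return immediately instead of waiting for the run to finish
)
print(f"Created flow run: {flow_run.name}")
```

The same ad-hoc trigger is available from the CLI (`prefect deployment run ingestion/default`) and from the UI, which is what we'll walk through first.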
Table of contents
· 🤖 Creating a flow run from deployment
· 🤖 Inspecting parent and child flow runs from the UI
· 🤖 Scheduling a deployment
∘ Scheduling is decoupled from execution
∘ 🤖 Scheduling from the UI
∘ 🤖 Scheduling from CLI
· Local development of data platform workflows
∘ 🤖 Getting started with data platform development
∘ 🤖 Coordinate Python with Prefect
· Ingestion flow
∘ What is data…