How to Make Your Data Pipelines More Dynamic Using Parameters in Prefect

How to pass runtime-specific parameter values to your data pipelines

Anna Geller
9 min read · Jan 25, 2022
Parametrization is one of the most critical features of any modern workflow orchestration solution. It allows you to dynamically overwrite parameter values for a given run without having to redeploy your workflow. Most orchestration frameworks offer rather limited functionality in that regard, such as only letting you override global variables. Prefect, however, provides a first-class abstraction for handling dynamic, parametrized workflows. Let’s look at it in more detail.

Table of contents:
· How to start an ad-hoc parametrized flow run
  1. Start a parametrized local flow run from a Python client
  2. Start a parametrized local flow run from a CLI
  3. Start a parametrized remote flow run from a CLI
  4. Start a parametrized remote flow run from an API call
  5. Start a parametrized remote flow run from the UI
  6. Start a parametrized remote child flow run from a parent flow
  Section summary
· How to schedule parametrized flows
  Setting Parameter defaults
· How to use Parameter values in a state handler
· Things to watch out for
  How to use dynamic values properly
  Avoid Zombie-Parameters
  Allowed parameter values
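To make this concrete, here is a minimal sketch of option 1 from the list above: a parametrized flow started from a Python client using the Prefect 1.x API (Parameter, task, Flow, and flow.run are part of that API). The flow name and the say_hello task are made up for illustration.

from prefect import Flow, Parameter, task

@task
def say_hello(name: str):
    print(f"Hello, {name}!")

with Flow("parametrized-hello") as flow:
    # Parameter is a special task whose value can be overridden at runtime
    name = Parameter("name", default="world")
    say_hello(name)

if __name__ == "__main__":
    # override the default Parameter value for this ad-hoc local run
    flow.run(parameters={"name": "Prefect"})

Running this script prints "Hello, Prefect!" instead of the default greeting. The other ways of starting a run covered below (CLI, API call, UI, child flows) accept parameter overrides in the same spirit.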

