Data Platform Teams
Orchestrate any Python workflow easily and efficiently with dynamic, right-sized infrastructure.
import os

from custom_tests import quality_test
from prefect import task, flow
from prefect.transactions import transaction


@task(retries=2, retry_delay_seconds=5)
def write_file(contents: str):
    with open("side-effect.txt", "w") as f:
        f.write(contents)


@write_file.on_rollback
def del_file(transaction):
    os.unlink("side-effect.txt")


@flow
def pipeline(contents: str):
    with transaction():
        write_file(contents)
        quality_test()


if __name__ == "__main__":
    pipeline.deploy(
        name="pipeline-deployment",
        work_pool_name="k8s-work-pool",
        image="pipeline-img:dev",
    )
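The transaction above groups the file write and the quality check so that a failure in either triggers the rollback hook, undoing the side effect. A minimal stdlib-only sketch of that rollback pattern (this is an illustration of the idea, not Prefect's implementation; all names here are made up for the example):

```python
import os
from contextlib import contextmanager


@contextmanager
def transaction(rollback_hooks):
    # Run the body; if anything raises, fire rollback hooks
    # in reverse order, then re-raise the error.
    try:
        yield
    except Exception:
        for hook in reversed(rollback_hooks):
            hook()
        raise


def write_file(path, contents):
    with open(path, "w") as f:
        f.write(contents)


def quality_test(path):
    # Illustrative check: an empty file fails the quality gate.
    if os.path.getsize(path) == 0:
        raise ValueError("quality check failed")


path = "side-effect.txt"
try:
    with transaction([lambda: os.unlink(path)]):
        write_file(path, "")  # writes an empty file
        quality_test(path)    # fails, so the rollback deletes it
except ValueError:
    pass

print(os.path.exists(path))  # False: the rollback removed the file
```

Prefect's `transaction` and `on_rollback` handle this bookkeeping for you, with the rollback tied to the task that produced the side effect rather than managed by hand.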
Trusted by Data and Platform Teams
Build Resilient Workflows You Can Trust
Transform your Python code into production-ready data pipelines. Prefect gives you the tools to build, monitor, and scale your critical data workflows with confidence and efficiency.
Why Prefect?
Prefect empowers data and platform teams to build trustworthy workflows quickly by combining Pythonic simplicity with a secure self-service framework. Ensure resilience and efficiency at scale, while reducing your infrastructure costs by up to 70% with Prefect.
Build Confidence
Complete visibility and automated recovery ensure your workflows deliver reliable results. From real-time monitoring to proactive alerts, get the tools you need to maintain system confidence.
Deploy Faster
Prefect's pure Python approach and independent workflows let you focus on solving problems, not fighting frameworks. Deploy to any infrastructure and update workflows individually without impacting others.
Grow Efficiently
Dynamic resource allocation and unified control let you efficiently manage workflows across your organization. Optimize costs while maintaining performance as your operations grow.
What Our Users Are Saying
The Data Engineering and MLOps teams were impressed by the elimination of retrofitting requirements. Switching from Astronomer to Prefect resulted in a 73.78% reduction in invoice costs alone.
Our job is to provide data analysts and data scientists the data they need to create data products that drive business value. And beyond that, we focus on enabling our data scientists by removing roadblocks and giving them powerful tools that make their jobs easier. Prefect is allowing us to achieve these objectives.
We use Prefect to orchestrate dbt Cloud jobs right alongside other data tools. It brings visibility to our entire pipeline and streamlines our deployments. By combining Prefect and dbt Cloud, you get the best of both worlds without sacrificing functionality, governance, or velocity.
We took all the Prefect features and designed an architecture that really works for our infrastructure provisioning and our organization.
With Prefect we can define our workflows precisely, using code that's under version control. Features like tasks, task dependencies & retries, and mapping make it easy to write robust data imports and data pipelines.
Prefect allows us to create a microservices-like architecture for our data pipelines, essentially acting as a contract between independent teams.
Prefect gives us the granular flexibility to build a custom platform that would work for our entire organization, without needing a bloated infra architecture.
Get Started With Prefect
Resources
How Endpoint Cut Costs by 73% After Switching From Airflow to Prefect