Flow is a tool for building, testing, and evolving pipelines that continuously capture, transform, and materialize data across all of your systems. It unifies today's batch and streaming paradigms so that your systems – current and future – are synchronized around the same data sets, updating in milliseconds.
Flow doesn’t replace your other systems; instead, it consolidates and simplifies your infrastructure.
Flow is built and versioned using a GitOps workflow, and runs on your machine using Docker and devcontainers. You'll work with the command-line interface in VS Code to test locally and to deploy data pipelines to the cloud. Future versions of Flow will include a user interface for running and monitoring your pipelines.
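Because pipelines are declared as files and versioned in Git, what you edit, test, and deploy is a catalog specification. As a rough sketch only, a minimal catalog declaring one collection might look like the following (the collection name and schema are illustrative assumptions, not taken from this page):

```yaml
# Hypothetical minimal catalog sketch (e.g. flow.yaml).
# The collection name "example/events" and its schema are assumptions
# for illustration; real catalogs also declare captures, derivations,
# and materializations.
collections:
  example/events:
    # Each document is keyed on its /id field.
    key: [/id]
    # Documents are validated against a JSON schema.
    schema:
      type: object
      properties:
        id: { type: string }
        ts: { type: string }
      required: [id]
```

A file like this lives in your repository alongside your other code, so changes to the pipeline go through the same review and version-control workflow.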
Wondering if Flow is right for you?
Read about Flow's technical benefits and how it compares with similar systems, from an engineering perspective.
Want to get up and running ASAP?
Use the Getting started documentation to set up your development environment and run through tutorials designed to quickly teach you the most essential skills and concepts.
Looking to understand the concepts behind Flow at a deeper level?
We recommend starting with the tutorials; after that, read the Concepts.
Trying to figure out an advanced workflow?
If you've mastered the basics but want to refine your pipeline, you'll find answers in our technical Reference.