Airflow is a data platform to manage workflows


About

Airflow is a data platform to programmatically author, schedule, and monitor workflows written in Python.

An Airflow workflow is a set of tasks whose dependencies describe execution order, not data flow: the dependency graph is an execution graph (a directed acyclic graph, or DAG).

The idea behind this approach is that when workflows are defined as code, they become more maintainable, versionable, testable, and collaborative.
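To make the execution-dependency idea concrete, here is a minimal pure-Python sketch (this is not Airflow's API; the task names and the use of the standard-library `graphlib` module are illustrative assumptions): tasks are declared as code, and a dependency graph, not the data, determines the order in which they run.

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical tasks: each one just records that it ran.
executed = []

def extract():
    executed.append("extract")

def transform():
    executed.append("transform")

def load():
    executed.append("load")

# Execution dependencies (not data dependencies):
# transform runs after extract; load runs after transform.
graph = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
}
tasks = {"extract": extract, "transform": transform, "load": load}

# Run every task in an order that respects the dependency graph.
for name in TopologicalSorter(graph).static_order():
    tasks[name]()

print(executed)  # ['extract', 'transform', 'load']
```

In Airflow itself, the same shape is expressed with a `DAG` object and operators, and dependencies are declared between tasks rather than inferred from the data they exchange.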

Docker

curl -LO https://raw.githubusercontent.com/bitnami/bitnami-docker-airflow/master/docker-compose.yml
docker-compose up

See: https://github.com/bitnami/bitnami-docker-airflow
