
Kedro-Airflow makes it easy to deploy Kedro projects to Airflow

Project description

Kedro-Airflow


Apache Airflow is a tool for orchestrating complex workflows and data processing pipelines. The Kedro-Airflow plugin can be used for:

  • Rapid pipeline creation in the prototyping phase. You can write Python functions in Kedro without worrying about schedulers, daemons, services or having to recreate the Airflow DAG file.
  • Automatic dependency resolution in Kedro. This allows you to bypass Airflow's need to specify the order of your tasks.
  • Distributing Kedro tasks across many workers. You can also enable monitoring and scheduling of the tasks' runtimes.

Installation

kedro-airflow is a Python plugin. To install it:

pip install kedro-airflow

Usage

You can use kedro-airflow to deploy a Kedro pipeline as an Airflow DAG by following these steps:

Step 1: Generate the DAG file

At the root directory of the Kedro project, run:

kedro airflow create

This command will generate an Airflow DAG file located in the airflow_dags/ directory in your project. You can pass a --pipeline flag to generate the DAG file for a specific Kedro pipeline and an --env flag to generate the DAG file for a specific Kedro environment.
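For example, to target a specific pipeline and environment at once (the pipeline name "data_processing" and environment "production" below are illustrative, not part of your project until you create them):

```shell
# Generate a DAG file for one registered pipeline in one Kedro environment
kedro airflow create --pipeline=data_processing --env=production
```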

Step 2: Copy the DAG file to the Airflow DAGs folder

For more information about the DAGs folder, see the Airflow documentation.
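As a sketch, assuming the default Airflow layout where the DAGs folder lives under `AIRFLOW_HOME` (which falls back to `~/airflow` when unset), the copy step might look like:

```shell
# Copy the generated DAG file(s) into Airflow's dags/ folder.
# Adjust the source path if your project keeps DAGs elsewhere.
cp airflow_dags/*.py "${AIRFLOW_HOME:-$HOME/airflow}/dags/"
```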

Step 3: Package and install the Kedro pipeline in the Airflow executor's environment

After generating and deploying the DAG file, you need to package and install the Kedro pipeline into the Airflow executor's environment. See the guide to deploying Kedro as a Python package for details.
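A minimal sketch of this step, assuming your project packages to a wheel named `my_kedro_project` (an illustrative name; yours will differ) and that the executor's environment is reachable via `pip`:

```shell
# Build a wheel from the Kedro project, then install it where the
# Airflow executor runs Python.
kedro package
pip install dist/my_kedro_project-0.1.0-py3-none-any.whl
```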

FAQ

What if my DAG file is in a different directory to my project folder?

By default the generated DAG file assumes it lives in the same directory as your project, as per this template. If your DAG file is located in a different directory from your project, you will need to adjust the generated file manually after running the kedro airflow create command.
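The exact contents depend on the template version, but as a hypothetical illustration (the variable name below is illustrative, not taken verbatim from the template), the generated DAG file typically derives the project path from its own location, and that is the line to adjust:

```python
# Hypothetical excerpt from a generated DAG file -- names are illustrative.
from pathlib import Path

# Default assumption: the Kedro project sits next to the DAG file.
project_path = Path.cwd()

# If the DAG file is moved elsewhere, point this at the real project root:
# project_path = Path("/opt/airflow/my-kedro-project")
```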

What if I want to use a different Jinja2 template?

You can use the additional command line argument --jinja-file (alias -j) to provide an alternative path to a Jinja2 template. Note that such templates must accept the same variables as the default Jinja2 template.

kedro airflow create --jinja-file=./custom/template.j2

Project details


Download files

Download the file for your platform. If you're not sure which to choose, see the Python Packaging User Guide for help installing packages.

Source Distribution

kedro-airflow-0.5.1.tar.gz (5.1 kB)

Uploaded Source

Built Distribution

kedro_airflow-0.5.1-py3-none-any.whl (5.4 kB)

Uploaded Python 3

File details

Details for the file kedro-airflow-0.5.1.tar.gz.

File metadata

  • Download URL: kedro-airflow-0.5.1.tar.gz
  • Upload date:
  • Size: 5.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.8.15

File hashes

Hashes for kedro-airflow-0.5.1.tar.gz

  • SHA256: 4eda0b7dccae36119f2c605fe057be11b1a82172e34237b7e53faf06f66b1179
  • MD5: a88c3bb1feccd667d0875237b36c79c7
  • BLAKE2b-256: 2aa881a04fd5f3eb3b59b4e2f13a65ceb74327336f85b68d7f1c67f77e6d4119

See the PyPI documentation for more details on using hashes.

File details

Details for the file kedro_airflow-0.5.1-py3-none-any.whl.

File hashes

Hashes for kedro_airflow-0.5.1-py3-none-any.whl

  • SHA256: f94dd4d1ed995570da668000ab13cf936a814a1a2b36915957ee25129a118942
  • MD5: cd61b8a4d5836bc40f13b3d12cb3c0ef
  • BLAKE2b-256: cb61a0a0f4845aed23187f342fcc0623ac8fb756687fc58ba984da12996a5991

See the PyPI documentation for more details on using hashes.
