
Kedro-Airflow makes it easy to deploy Kedro projects to Airflow


Kedro-Airflow


Apache Airflow is a tool for orchestrating complex workflows and data processing pipelines. The Kedro-Airflow plugin can be used for:

  • Rapid pipeline creation in the prototyping phase. You can write Python functions in Kedro without worrying about schedulers, daemons, services or having to recreate the Airflow DAG file.
  • Automatic dependency resolution in Kedro. This allows you to bypass Airflow's need to specify the order of your tasks.
  • Distributing Kedro tasks across many workers. You can also enable monitoring and scheduling of the tasks' runtimes.

How do I install Kedro-Airflow?

kedro-airflow is a Python plugin. To install it:

pip install kedro-airflow

How do I use Kedro-Airflow?

The Kedro-Airflow plugin adds a kedro airflow create CLI command that generates an Airflow DAG file in the airflow_dags folder of your project. At runtime, this file translates your Kedro pipeline into Airflow Python operators. This DAG object can be modified according to your needs and you can then deploy your project to Airflow by running kedro airflow deploy.

Prerequisites

The following conditions must be true for Airflow to run your pipeline:

  • Your project directory must be available to the Airflow workers at the path listed at the top of the DAG file.
  • Your source code must be on the Python path (by default the DAG file takes care of this).
  • All datasets must be explicitly listed in catalog.yml and reachable for the Airflow workers. Kedro-Airflow does not support MemoryDataSet or datasets that require Spark.
  • All local paths in configuration files (notably in catalog.yml and logging.yml) should be absolute paths and not relative paths.
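As an illustration of the last two points, a catalog.yml entry with an absolute path might look like the following. The dataset name and path are placeholders, and the CSVLocalDataSet type assumes a Kedro 0.15.x-era project; check your Kedro version for the correct dataset class:

```yaml
# catalog.yml -- every dataset listed explicitly, with an absolute filepath
example_table:
  type: CSVLocalDataSet
  filepath: /Users/<user-name>/new-kedro-project/data/01_raw/example_table.csv
```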

Process

  1. Run kedro airflow create to generate a DAG file for your project.
  2. If needed, customize the DAG file as described below.
  3. Run kedro airflow deploy which will copy the DAG file from the airflow_dags folder in your Kedro project into the dags folder in the Airflow home directory.

Note: The generated DAG file is copied to $AIRFLOW_HOME/dags/ when kedro airflow deploy is run, where AIRFLOW_HOME is an environment variable. If the variable is not defined, Kedro-Airflow creates ~/airflow and ~/airflow/dags (if required) and copies the DAG file there.
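Put together, a minimal deployment session could look like the sketch below, run from the Kedro project root. The DAG filename is a placeholder; the actual name is derived from your project:

```shell
# 1. Generate the DAG file into the airflow_dags/ folder of the project
kedro airflow create

# 2. (Optional) customize the generated file, e.g. airflow_dags/<project>_dag.py

# 3. Copy the DAG file into $AIRFLOW_HOME/dags/ (default: ~/airflow/dags)
kedro airflow deploy
```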

Customization

There are a number of items in the DAG file that you may want to customize, including:

  • Source location,
  • Project location,
  • DAG construction,
  • Default operator arguments,
  • Operator-specific arguments,
  • Airflow context and execution date.

The following sections guide you to the appropriate location within the file.

Source location

The line sys.path.append("/Users/<user-name>/new-kedro-project/src") enables Python and Airflow to find your project source.

Project location

The line project_path = "/Users/<user-name>/new-kedro-project" sets the location for your project directory. This is passed to your get_config method.

DAG construction

The construction of the actual DAG object can be altered as needed. You can learn more about how to do this by going through the Airflow tutorial.

Default operator arguments

The default arguments for the Airflow operators are contained in the default_args dictionary.
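As an illustration, the default_args dictionary follows the standard Airflow pattern sketched below; the exact keys and values in your generated file may differ:

```python
from datetime import datetime, timedelta

# Illustrative defaults applied to every operator in the DAG.
default_args = {
    "owner": "airflow",
    "depends_on_past": False,  # each run is independent of previous runs
    "start_date": datetime(2019, 1, 1),
    "retries": 1,  # retry each failed task once
    "retry_delay": timedelta(minutes=5),
}
```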

Operator-specific arguments

The operator_specific_arguments callback is called to retrieve any additional arguments specific to individual operators. It receives the Airflow task_id and should return a dictionary of extra arguments. For example, to change the number of retries on the node named analysis to 5, you might write:

def operator_specific_arguments(task_id):
    # Give the "analysis" task five retries; all other tasks keep the defaults.
    if task_id == "analysis":
        return {"retries": 5}
    return {}

The easiest way to find the correct task_id is to use Airflow's list_tasks command.

Airflow context and execution date

The process_context callback provides a hook for ingesting Airflow's Jinja context. It is called before every node runs, receives the Airflow context and the Kedro catalog, and must return a (possibly modified) catalog. A common use is to pick up the execution date and either insert it into the catalog or modify the catalog based on it.
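A minimal sketch of such a callback is shown below. It assumes the catalog object exposes an add_feed_dict method (as Kedro's DataCatalog does) and that "execution_date" is present in the Airflow context; the key name fed to the catalog is a hypothetical choice you should adapt to your project:

```python
def process_context(context, catalog):
    # "context" is Airflow's Jinja template context (a dict-like object).
    # Copy the execution date into the catalog so downstream nodes can read it.
    catalog.add_feed_dict({"execution_date": context["execution_date"]})
    return catalog
```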

The list of default context variables is available in the Airflow documentation.

What licence do you use?

Kedro-Airflow is licensed under the Apache 2.0 License.

Project details


Download files

Download the file for your platform.

Source Distribution

kedro-airflow-0.2.2.tar.gz (8.1 kB)


Built Distribution

kedro_airflow-0.2.2-py3-none-any.whl (11.7 kB)


File details

Details for the file kedro-airflow-0.2.2.tar.gz.

File metadata

  • Download URL: kedro-airflow-0.2.2.tar.gz
  • Size: 8.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/42.0.1.post20191125 requests-toolbelt/0.9.1 tqdm/4.39.0 CPython/3.6.9

File hashes

Hashes for kedro-airflow-0.2.2.tar.gz:

  • SHA256: 497a2a4926a3c2f054363576800c4b9bdeda59e0b778a4641eebff9e005dd7e7
  • MD5: 296eb37b6f739caffee3ca84ec03177b
  • BLAKE2b-256: bd3687e2f2542c87fb69a2d05a39e8f88288e0845563a5c4d50aab4710c0902d


File details

Details for the file kedro_airflow-0.2.2-py3-none-any.whl.

File metadata

  • Download URL: kedro_airflow-0.2.2-py3-none-any.whl
  • Size: 11.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/42.0.1.post20191125 requests-toolbelt/0.9.1 tqdm/4.39.0 CPython/3.6.9

File hashes

Hashes for kedro_airflow-0.2.2-py3-none-any.whl:

  • SHA256: a68719ed7329a441583c329661723c89179cf52d6b8ccf71550b803c4c9de678
  • MD5: 1d875f639e72240a090df0fc3420c5fd
  • BLAKE2b-256: 60905f961fa183f01222d88778fca11846b92cb38720cc36dfa94698c527f03f

