Apache Airflow API (Stable)

Project description

Apache Airflow Python Client

Requirements

Python >= 3.7

Installation & Usage

pip install

You can install directly using pip:

pip install apache-airflow-client

Setuptools

Or install from source via Setuptools:

git clone git@github.com:apache/airflow-client-python.git
cd airflow-client-python
python setup.py install --user

(or sudo python setup.py install to install the package for all users)

Then import the package:

import airflow_client.client
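
As a quick smoke test that the installation works and that the API server is reachable, you can list a single DAG right after importing the package. This is only a minimal sketch: the host, the admin credentials, and the use of basic authentication are assumptions matching the Getting Started example below.

import airflow_client.client
from airflow_client.client.api import dag_api

# NOTE: host and credentials below are placeholder assumptions; adjust them to your deployment.
configuration = airflow_client.client.Configuration(
    host="http://localhost:8080/api/v1",
    username="admin",
    password="admin",
)

with airflow_client.client.ApiClient(configuration) as api_client:
    # Request at most one DAG just to confirm connectivity and authentication.
    print(dag_api.DAGApi(api_client).get_dags(limit=1))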

Changelog

See CHANGELOG.md to keep track of what has changed in the client.

Getting Started

Please follow the installation procedure above and then run the following example Python script:

import uuid

import airflow_client.client
try:
    # If you have rich installed, you will have nice colored output of the API responses
    from rich import print
except ImportError:
    print("Output will not be colored. Please install rich to get colored output: `pip install rich`")
from airflow_client.client.api import config_api, dag_api, dag_run_api
from airflow_client.client.model.dag_run import DAGRun

# The client must use the authentication and authorization parameters
# in accordance with the API server security policy.
# An example using basic authentication is provided below; adapt it to the auth method
# that your deployment requires.
#
# In case of the basic authentication below, make sure that Airflow is
# also configured with basic_auth as an auth backend, in addition to the regular session backend
# needed by the UI. In the `[api]` section of your `airflow.cfg` set:
#
# auth_backend = airflow.api.auth.backend.session,airflow.api.auth.backend.basic_auth
#
# Make sure that your username/password are configured properly, using a user that has admin
# privileges in Airflow

# Configure HTTP basic authorization: Basic
configuration = airflow_client.client.Configuration(
    host="http://localhost:8080/api/v1",
    username='admin',
    password='admin'
)

# Make sure that the `load_examples` config in the [core] section of your airflow.cfg is set to True,
# or that the AIRFLOW__CORE__LOAD_EXAMPLES environment variable is set to True
DAG_ID = "example_bash_operator"

# Enter a context with an instance of the API client
with airflow_client.client.ApiClient(configuration) as api_client:

    errors = False

    print('[blue]Getting DAG list')
    dag_api_instance = dag_api.DAGApi(api_client)
    try:
        api_response = dag_api_instance.get_dags()
        print(api_response)
    except airflow_client.client.OpenApiException as e:
        print("[red]Exception when calling DagAPI->get_dags: %s\n" % e)
        errors = True
    else:
        print('[green]Getting DAG list successful')


    print('[blue]Getting Tasks for a DAG')
    try:
        api_response = dag_api_instance.get_tasks(DAG_ID)
        print(api_response)
    except airflow_client.client.exceptions.OpenApiException as e:
        print("[red]Exception when calling DagAPI->get_tasks: %s\n" % e)
        errors = True
    else:
        print('[green]Getting Tasks successful')


    print('[blue]Triggering a DAG run')
    dag_run_api_instance = dag_run_api.DAGRunApi(api_client)
    try:
        # Create a DAGRun object (no dag_id should be specified because it is a read-only property of DAGRun)
        # The dag_run_id is generated randomly to allow multiple executions of the script
        dag_run = DAGRun(
            dag_run_id='some_test_run_' + uuid.uuid4().hex,
        )
        api_response = dag_run_api_instance.post_dag_run(DAG_ID, dag_run)
        print(api_response)
    except airflow_client.client.exceptions.OpenApiException as e:
        print("[red]Exception when calling DAGRunAPI->post_dag_run: %s\n" % e)
        errors = True
    else:
        print('[green]Posting DAG Run successful')
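
    # Optional addition (not part of the original example): read back the DAG run that was
    # just triggered and print its current state via DAGRunApi.get_dag_run.
    try:
        triggered_run = dag_run_api_instance.get_dag_run(DAG_ID, dag_run.dag_run_id)
        print('[blue]State of the triggered DAG run: %s' % triggered_run.state)
    except airflow_client.client.exceptions.OpenApiException as e:
        print("[red]Exception when calling DAGRunApi->get_dag_run: %s\n" % e)
        errors = True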

    # Get the current configuration. Note that this is disabled by default in most installations.
    # You need to set `expose_config = True` in the Airflow configuration in order to retrieve it.
    conf_api_instance = config_api.ConfigApi(api_client)
    try:
        api_response = conf_api_instance.get_config()
        print(api_response)
    except airflow_client.client.OpenApiException as e:
        print("[red]Exception when calling ConfigApi->get_config: %s\n" % e)
        errors = True
    else:
        print('[green]Config retrieved successfully')

    if errors:
        print('\n[red]There were errors while running the script - see above for details')
    else:
        print('\n[green]Everything went well')
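
The script above hard-codes the host and the basic auth credentials. As a minimal variation (not part of the original example), the same Configuration can be filled from environment variables so that no secrets are embedded in the script; the variable names below are illustrative only:

import os

import airflow_client.client

# The environment variable names are only an example, not something the client itself defines.
configuration = airflow_client.client.Configuration(
    host=os.environ.get("AIRFLOW_API_HOST", "http://localhost:8080/api/v1"),
    username=os.environ.get("AIRFLOW_API_USERNAME", "admin"),
    password=os.environ.get("AIRFLOW_API_PASSWORD", "admin"),
)

# The configuration can then be used exactly as above:
# with airflow_client.client.ApiClient(configuration) as api_client: ...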

See README for full client API documentation.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

apache-airflow-client-2.6.1rc1.tar.gz (201.3 kB)

Built Distribution

apache_airflow_client-2.6.1rc1-py3-none-any.whl (1.4 MB)

File details

Details for the file apache-airflow-client-2.6.1rc1.tar.gz.

File hashes

Hashes for apache-airflow-client-2.6.1rc1.tar.gz
SHA256: a850157f94758e94211622e36f1a478234f1a696ddcfe990a15a35fb44a31642
MD5: cc7225b988b56fa04fb802a08788082f
BLAKE2b-256: 290238c1ddeb79fbd40096fbd5878599d333a9507294c3983a2a2f2107c2df64

See the pip documentation on hash-checking mode for more details on using hashes.
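
To check a downloaded file against the digests above, you can compute the SHA256 locally. The sketch below uses Python's hashlib, with the file name and expected digest copied from the listing above:

import hashlib

# Expected SHA256 for apache-airflow-client-2.6.1rc1.tar.gz, copied from the listing above.
EXPECTED_SHA256 = "a850157f94758e94211622e36f1a478234f1a696ddcfe990a15a35fb44a31642"

sha256 = hashlib.sha256()
with open("apache-airflow-client-2.6.1rc1.tar.gz", "rb") as f:
    # Read in chunks so large files do not need to fit in memory.
    for chunk in iter(lambda: f.read(8192), b""):
        sha256.update(chunk)

if sha256.hexdigest() == EXPECTED_SHA256:
    print("SHA256 matches the published digest")
else:
    print("SHA256 mismatch - do not install this file")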

File details

Details for the file apache_airflow_client-2.6.1rc1-py3-none-any.whl.

File hashes

Hashes for apache_airflow_client-2.6.1rc1-py3-none-any.whl
SHA256: f5f1b5450095e7f3bd60a0c162eb7e218c4644600051382450f0fbd83a19ede2
MD5: 96112d25be82aeca8b84fc06194847dd
BLAKE2b-256: 1376c4a8f923f673b36de8b3d0b9525d7102b0e3c807aebedcbd06d921a79c90

See the pip documentation on hash-checking mode for more details on using hashes.
