Configurable Airflow UI

This library provides a wrapper around Airflow that lets you add and remove DAGs (pipelines) via a web UI, based on a configuration defining the pipeline 'kinds' and the parameters each kind requires.

Screenshots: Pipeline Dashboard, Edit/New Pipeline, Pipeline Status.

Quickstart

  1. Create a folder containing:
  • A configuration.yaml file describing your pipeline kinds, e.g.:
{
    "kinds": [
        {
            "name": "kind1",
            "display": "Kind 1",
            "fields": [
                {
                    "name": "param1",
                    "display": "Parameter 1"
                },
                {
                    "name": "param2",
                    "display": "Parameter 2"
                }
            ]
        },
        {
            "name": "kind2",
            "display": "Kind 2",
            "fields": [
                {
                    "name": "param3",
                    "display": "Parameter 3"
                },
                {
                    "name": "param4",
                    "display": "Parameter 4"
                }
            ]
        }
    ],
    "schedules": [
        {
            "name": "monthly",
            "display": "Monthly"
        },
        {
            "name": "daily",
            "display": "Daily"
        }
    ]
}

(If schedules are not specified, a default schedules list will be used).
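
Since JSON is also valid YAML, the block above can typically be used in configuration.yaml as-is. For reference, a sketch of the same configuration in native YAML syntax (mirroring the kinds and schedules defined above):

kinds:
  - name: kind1
    display: Kind 1
    fields:
      - name: param1
        display: Parameter 1
      - name: param2
        display: Parameter 2
  - name: kind2
    display: Kind 2
    fields:
      - name: param3
        display: Parameter 3
      - name: param4
        display: Parameter 4
schedules:
  - name: monthly
    display: Monthly
  - name: daily
    display: Daily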

  • The Airflow DAGs Creator - a Python file that reads the pipeline configuration and creates your Airflow DAGs. Sample code:
import datetime
import logging
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.utils import dates
from etl_server.models import Models

etl_models = Models()

default_args = {
    'owner': 'Airflow',
    'depends_on_past': False,
    'start_date': dates.days_ago(1),
}

for pipeline in etl_models.all_pipelines():
    # pipeline looks like this:
    # {
    #   "id": "<identifier>",
    #   "name": "<English Name of Pipeline>",
    #   "kind": "<kind-name>",
    #   "schedule": "<schedule>",
    #   "params": {
    #      "field1": "value1",
    #      .. other fields, based on kind's fields in configuration
    #   }
    # }
    dag_id = pipeline['id']
    logging.info('Initializing DAG %s', dag_id)
    dag = DAG(dag_id, default_args=default_args, schedule_interval=datetime.timedelta(days=1))
    task = BashOperator(task_id=dag_id,
                        bash_command='echo "%s"; sleep 10 ; echo done' % pipeline['name'],
                        dag=dag)
    globals()[dag_id] = dag
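
The sample above hard-codes a daily interval and ignores the pipeline's schedule field. One way to honor it is to map the schedule names from the configuration to Airflow schedule_interval values; a minimal sketch (the mapping below is illustrative, not part of the library):

# Illustrative mapping from the schedule names defined in the configuration
# ("daily", "monthly") to Airflow schedule_interval values.
SCHEDULE_INTERVALS = {
    'daily': datetime.timedelta(days=1),
    'monthly': '@monthly',
}

# Inside the loop, pick the interval from the pipeline's schedule field,
# falling back to daily when the schedule is missing or unknown.
schedule = SCHEDULE_INTERVALS.get(pipeline.get('schedule'), datetime.timedelta(days=1))
dag = DAG(dag_id, default_args=default_args, schedule_interval=schedule)
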
  2. Use a docker-compose setup to run the server. An example docker-compose.yaml file:
version: "3"

services:

  db:
    image: postgres:12
    environment:
      POSTGRES_PASSWORD: postgres
      POSTGRES_USER: postgres
      POSTGRES_DB: etls
    expose:
      - 5432
    volumes: 
      - /var/lib/postgresql/data

  server:
    build: .
    image: akariv/airflow-config-ui
    environment:
      DATABASE_URL: postgresql://postgres:postgres@db/etls
      AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql://postgres:postgres@db/etls
    expose:
      - 5000
    ports:
      - 5000:5000
    depends_on: 
      - db
    volumes: 
      - /path/to/local/dags/folder/:/app/dags

After running docker-compose up -d server, open your browser at http://localhost:5000 to see the web UI.

Another option is to create a new Docker image which inherits from akariv/airflow-config-ui and replaces the contents of /app/dags/ with the configuration.json file and your DAG Python files.
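
A minimal sketch of such an image (the file names copied into /app/dags/ are illustrative; use whatever configuration and DAG files your folder contains):

# Illustrative Dockerfile: bake the configuration and DAG definitions into the image.
FROM akariv/airflow-config-ui
COPY configuration.json /app/dags/
COPY dags.py /app/dags/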

