
DuckDB (duckdb.org) provider for Apache Airflow

Project description

airflow-provider-duckdb

A DuckDB provider for Apache Airflow. It exposes a hook and connection type that return a native DuckDB connection.

Installation

pip install airflow-provider-duckdb

Connection

The connection type is duckdb. It supports the following parameter:

  • file (optional): path to the DuckDB database file. If not set, an in-memory database is used.

Example connection strings:

  • duckdb://:memory:
  • duckdb:///tmp/duckdb.db
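
The hook looks up an Airflow connection by its conn_id. As a minimal sketch (not part of the provider itself), one way to register such a connection is Airflow's standard AIRFLOW_CONN_<CONN_ID> environment variable; the conn_id duckdb_default matches the one used in the Usage example below.

import os

# Assumed setup, not part of the provider: Airflow resolves environment variables
# named AIRFLOW_CONN_<CONN_ID> into connections, so this registers a connection
# with conn_id "duckdb_default" backed by a DuckDB file.
# Use "duckdb://:memory:" instead for an in-memory database.
os.environ["AIRFLOW_CONN_DUCKDB_DEFAULT"] = "duckdb:///tmp/duckdb.db"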

Usage

import pandas as pd
import pendulum

from airflow.decorators import dag, task
from duckdb_provider.hooks.duckdb_hook import DuckDBHook


@dag(
    schedule=None,
    start_date=pendulum.datetime(2022, 1, 1, tz="UTC"),
    catchup=False,
)
def duckdb_transform():
    @task
    def create_df() -> pd.DataFrame:
        """
        Create a dataframe with some sample data
        """
        df = pd.DataFrame(
            {
                "a": [1, 2, 3],
                "b": [4, 5, 6],
                "c": [7, 8, 9],
            }
        )
        return df

    @task
    def simple_select(df: pd.DataFrame) -> pd.DataFrame:
        """
        Use DuckDB to select a subset of the data
        """
        hook = DuckDBHook.get_hook('duckdb_default')
        conn = hook.get_conn()

        # execute a simple query
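        # (DuckDB's Python client can scan the local pandas DataFrame `df` by name,
        # so the query below reads it without any explicit registration)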
        res = conn.execute("SELECT a, b, c FROM df WHERE a >= 2").df()

        return res

    @task
    def add_col(df: pd.DataFrame) -> pd.DataFrame:
        """
        Use DuckDB to add a column to the data
        """
        hook = DuckDBHook.get_hook('duckdb_default')
        conn = hook.get_conn()

        # add a column
        conn.execute("CREATE TABLE tb AS SELECT *, a + b AS d FROM df")

        # get the table
        return conn.execute("SELECT * FROM tb").df()

    @task
    def aggregate(df: pd.DataFrame) -> pd.DataFrame:
        """
        Use DuckDB to aggregate the data
        """
        hook = DuckDBHook.get_hook('duckdb_default')
        conn = hook.get_conn()

        # aggregate
        return conn.execute("SELECT SUM(a), COUNT(b) FROM df").df()

    create_df_res = create_df()
    simple_select_res = simple_select(create_df_res)
    add_col_res = add_col(simple_select_res)
    aggregate_res = aggregate(add_col_res)


duckdb_transform()
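
For quick local experimentation outside Airflow, note that hook.get_conn() returns a plain DuckDB connection, so the same queries can be run directly against the duckdb package. A minimal sketch, assuming only duckdb and pandas are installed:

import duckdb
import pandas as pd

df = pd.DataFrame({"a": [1, 2, 3], "b": [4, 5, 6], "c": [7, 8, 9]})

# An in-memory connection, equivalent to the duckdb://:memory: connection string
conn = duckdb.connect()

# DuckDB scans the local DataFrame `df` by name, just as in the tasks above
print(conn.execute("SELECT SUM(a) AS sum_a, COUNT(b) AS count_b FROM df").df())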

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

airflow-provider-duckdb-0.1.0.tar.gz (7.1 kB, source)

Built Distribution

airflow_provider_duckdb-0.1.0-py3-none-any.whl (8.0 kB, Python 3)

File details

Details for the file airflow-provider-duckdb-0.1.0.tar.gz.

File metadata

File hashes

Hashes for airflow-provider-duckdb-0.1.0.tar.gz

  • SHA256: 650b0c9527c897d581900bdfd521c34a9b094d2f83bf0aff042d38f3ef38203a
  • MD5: 83f04e6cbe0b41f563838636805f7c96
  • BLAKE2b-256: 8fde73327517fac2f1733018d87f0f2592dceb915b4d04f3d027610dee981d52

See more details on using hashes here.
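
To check a downloaded file against these digests, here is a minimal sketch using only the Python standard library (the filename is assumed to be in the current working directory):

import hashlib

# Compare the SHA256 of the downloaded sdist with the published digest above
expected = "650b0c9527c897d581900bdfd521c34a9b094d2f83bf0aff042d38f3ef38203a"
with open("airflow-provider-duckdb-0.1.0.tar.gz", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()
print("OK" if actual == expected else "hash mismatch")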

File details

Details for the file airflow_provider_duckdb-0.1.0-py3-none-any.whl.

File metadata

File hashes

Hashes for airflow_provider_duckdb-0.1.0-py3-none-any.whl

  • SHA256: 0cd3df91812863266dabeef2d4a7a51a329d273417b87d18c836f8608d6fb23d
  • MD5: 45a330ae0ce79c0bd3f9670c81c4277f
  • BLAKE2b-256: af9951289e9f5c1daf6cdc81f05e5cfd459fafa8ca4b7532a066578ca865df37

See more details on using hashes here.
