
DuckDB (duckdb.org) provider for Apache Airflow

Project description

airflow-provider-duckdb

A DuckDB provider for Apache Airflow. It exposes a hook and a connection type that return a native DuckDB connection.
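In short, once the provider is installed and a connection is configured (both covered below), the hook hands back a plain DuckDB connection object. A minimal sketch, assuming a connection with the id duckdb_default:

from duckdb_provider.hooks.duckdb_hook import DuckDBHook

hook = DuckDBHook.get_hook("duckdb_default")
conn = hook.get_conn()  # a regular DuckDB connection

print(conn.execute("SELECT 42").fetchall())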

Installation

pip install airflow-provider-duckdb

Connection

The connection type is duckdb. It supports the following parameter:

  • file (optional): The path to the DuckDB database file. If not set, operations will be done in-memory.

Example connection strings:

  • duckdb://:memory:
  • duckdb:///tmp/duckdb.db
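For example, the duckdb_default connection used in the Usage example below can be created with Airflow's standard AIRFLOW_CONN_<CONN_ID> environment-variable mechanism:

AIRFLOW_CONN_DUCKDB_DEFAULT=duckdb:///tmp/duckdb.db

Use duckdb://:memory: instead to work against an in-memory database.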

Usage

import pandas as pd
import pendulum

from airflow.decorators import dag, task
from duckdb_provider.hooks.duckdb_hook import DuckDBHook


@dag(
    schedule=None,
    start_date=pendulum.datetime(2022, 1, 1, tz="UTC"),
    catchup=False,
)
def duckdb_transform():
    @task
    def create_df() -> pd.DataFrame:
        """
        Create a dataframe with some sample data
        """
        df = pd.DataFrame(
            {
                "a": [1, 2, 3],
                "b": [4, 5, 6],
                "c": [7, 8, 9],
            }
        )
        return df

    @task
    def simple_select(df: pd.DataFrame) -> pd.DataFrame:
        """
        Use DuckDB to select a subset of the data
        """
        hook = DuckDBHook.get_hook('duckdb_default')
        conn = hook.get_conn()

        # execute a simple query
        res = conn.execute("SELECT a, b, c FROM df WHERE a >= 2").df()

        return res

    @task
    def add_col(df: pd.DataFrame) -> pd.DataFrame:
        """
        Use DuckDB to add a column to the data
        """
        hook = DuckDBHook.get_hook('duckdb_default')
        conn = hook.get_conn()

        # add a column
        conn.execute("CREATE TABLE tb AS SELECT *, a + b AS d FROM df")

        # get the table
        return conn.execute("SELECT * FROM tb").df()

    @task
    def aggregate(df: pd.DataFrame) -> pd.DataFrame:
        """
        Use DuckDB to aggregate the data
        """
        hook = DuckDBHook.get_hook('duckdb_default')
        conn = hook.get_conn()

        # aggregate
        return conn.execute("SELECT SUM(a), COUNT(b) FROM df").df()

    create_df_res = create_df()
    simple_select_res = simple_select(create_df_res)
    add_col_res = add_col(simple_select_res)
    aggregate_res = aggregate(add_col_res)


duckdb_transform()
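The queries above work because DuckDB can resolve the name df to the pandas DataFrame in the local Python scope. If you prefer to name the table explicitly, or want to materialize results in the database configured on the connection, you can register the DataFrame yourself. A minimal sketch (the names my_df and summary are illustrative, and duckdb_default is assumed to point at a file-backed database):

import pandas as pd

from duckdb_provider.hooks.duckdb_hook import DuckDBHook

df = pd.DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})

hook = DuckDBHook.get_hook("duckdb_default")
conn = hook.get_conn()

# expose the DataFrame to SQL under an explicit name
conn.register("my_df", df)

# materialize a result table; it persists if the connection points at a file
conn.execute("CREATE TABLE summary AS SELECT SUM(a) AS total_a, SUM(b) AS total_b FROM my_df")

print(conn.execute("SELECT * FROM summary").df())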

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

airflow-provider-duckdb-0.1.1a1.tar.gz (7.2 kB)


Built Distribution

airflow_provider_duckdb-0.1.1a1-py3-none-any.whl

File details

Details for the file airflow-provider-duckdb-0.1.1a1.tar.gz.

File hashes

Hashes for airflow-provider-duckdb-0.1.1a1.tar.gz:

  • SHA256: a7bec24ac0c6cf51de4c1361580115ed2dd046dc537f396d0c71ecf972960ead
  • MD5: 94a8633379c2d57e03545af7c242a973
  • BLAKE2b-256: e8c08e4a98e0485151f446e001e515168e9118eb9bf66c03772214e012ee9168

See more details on using hashes here.

File details

Details for the file airflow_provider_duckdb-0.1.1a1-py3-none-any.whl.

File hashes

Hashes for airflow_provider_duckdb-0.1.1a1-py3-none-any.whl:

  • SHA256: 8a1f76a3082915f13259cb9a719b86af823a41a63fba8178417b30f63d54dc4c
  • MD5: cc40fcef14eed75422a4568ebc894e7f
  • BLAKE2b-256: 88b997aab1117c7c4243bb53565d051bada7295feb8ad9433fc1d55e4fdc23da

See more details on using hashes here.
