
Provider for Apache Airflow, implementing the apache-airflow-providers-apache-druid package.

Project description

Package apache-airflow-providers-apache-druid

Release: 3.3.0

Apache Druid.

Provider package

This is a provider package for apache.druid provider. All classes for this provider package are in airflow.providers.apache.druid python package.

You can find package information and changelog for the provider in the documentation.
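For a quick orientation, here is a minimal sketch of a DAG that submits a Druid ingestion task with the provider's DruidOperator. The spec file path is a hypothetical placeholder; the connection id shown is the operator's default:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.apache.druid.operators.druid import DruidOperator

    # Submit a Druid ingestion task described by a JSON ingestion spec.
    with DAG(
        dag_id="druid_ingestion_example",
        start_date=datetime(2023, 1, 1),
        schedule_interval=None,
        catchup=False,
    ) as dag:
        submit_ingestion = DruidOperator(
            task_id="submit_druid_ingestion",
            json_index_file="ingestion_spec.json",        # hypothetical spec file
            druid_ingest_conn_id="druid_ingest_default",  # the operator's default connection id
        )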

Installation

You can install this package on top of an existing Airflow 2 installation (see Requirements below for the minimum Airflow version supported) via:

    pip install apache-airflow-providers-apache-druid

The package supports the following Python versions: 3.7, 3.8, 3.9, 3.10

Requirements

PIP package                           Version required
apache-airflow                        >=2.3.0
apache-airflow-providers-common-sql   >=1.3.0
pydruid                               >=0.4.1
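The pydruid and common-sql requirements come into play when querying Druid over SQL. A minimal sketch using the provider's DruidDbApiHook (the broker connection id shown is the hook's default; the "wikipedia" datasource is a hypothetical example):

    from airflow.providers.apache.druid.hooks.druid import DruidDbApiHook

    # Query the Druid SQL endpoint through the broker connection.
    # get_records() comes from the common-sql DbApiHook base class.
    hook = DruidDbApiHook(druid_broker_conn_id="druid_broker_default")
    rows = hook.get_records("SELECT COUNT(*) AS cnt FROM wikipedia")
    print(rows)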

Cross provider package dependencies

These are dependencies that may be needed in order to use all the features of the package. You need to install the specified provider packages in order to use those features.

You can install such cross-provider dependencies when installing from PyPI. For example:

    pip install apache-airflow-providers-apache-druid[apache.hive]

Dependent package                      Extra
apache-airflow-providers-apache-hive   apache.hive
apache-airflow-providers-common-sql    common.sql
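The apache.hive extra exists because the provider ships a Hive-to-Druid transfer operator that needs the Hive provider at runtime. A minimal sketch (the Hive query, datasource name, and timestamp column are hypothetical):

    from airflow.providers.apache.druid.transfers.hive_to_druid import HiveToDruidOperator

    # Extract rows from Hive and index them into a Druid datasource.
    load_events = HiveToDruidOperator(
        task_id="hive_to_druid",
        sql="SELECT * FROM staging.events",  # hypothetical Hive query
        druid_datasource="events",           # target Druid datasource
        ts_dim="event_time",                 # timestamp dimension column
    )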

Changelog

3.3.0

This release of the provider is only available for Airflow 2.3+ as explained in the Apache Airflow providers support policy.

Misc

  • Move min airflow version to 2.3.0 for all providers (#27196)

Bug Fixes

  • BugFix - Druid Airflow Exception to about content (#27174)

3.2.1

Misc

  • Add common-sql lower bound for common-sql (#25789)

3.2.0

Features

  • Move all "old" SQL operators to common.sql providers (#25350)

3.1.0

Features

  • Move all SQL classes to common-sql provider (#24836)

3.0.0

Breaking changes

Misc

  • chore: Refactoring and Cleaning Apache Providers (#24219)

2.3.3

Bug Fixes

  • Fix mistakenly added install_requires for all providers (#22382)

2.3.2

Misc

  • Add Trove classifiers in PyPI (Framework :: Apache Airflow :: Provider)

2.3.1

Misc

  • Support for Python 3.10

2.3.0

Features

  • Add more SQL template fields renderers (#21237)

2.2.0

Features

  • Add timeout parameter to DruidOperator (#19984)

2.1.0

Features

  • Add DruidOperator template_fields_renderers fields (#19420)

  • Add max_ingestion_time to DruidOperator docstring (#18693)

  • Add guide for Apache Druid operators (#18527)

2.0.2

Misc

  • Optimise connection importing for Airflow 2.2.0

2.0.1

Bug Fixes

  • Fix error in Druid connection attribute retrieval (#17095)

2.0.0

Breaking changes

  • Auto-apply apply_default decorator (#15667)

1.1.0

Features

  • Refactor SQL/BigQuery/Qubole/Druid Check operators (#12677)

Bug Fixes

  • Bugfix: DruidOperator fails to submit ingestion tasks (#14418)

1.0.1

Updated documentation and readme files.

1.0.0

Initial version of the provider.


File details

Details for the file apache-airflow-providers-apache-druid-3.3.0.tar.gz.

File metadata

File hashes

Hashes for apache-airflow-providers-apache-druid-3.3.0.tar.gz

Algorithm     Hash digest
SHA256        31e034de694a1aec1af73289a52b2faae98b7c112aa223e7228f76b967f4c391
MD5           f3247e619985673f0d97553f0322a54a
BLAKE2b-256   06a884a9d523399f88e829ff17209f76852e389f221d4ca6868c0700cb519e77

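To verify a downloaded artifact against the digests published above, a minimal sketch in Python (assuming the sdist was downloaded to the current directory):

    import hashlib

    EXPECTED_SHA256 = "31e034de694a1aec1af73289a52b2faae98b7c112aa223e7228f76b967f4c391"

    # Hash the downloaded source distribution and compare with the published digest.
    with open("apache-airflow-providers-apache-druid-3.3.0.tar.gz", "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()

    assert digest == EXPECTED_SHA256, f"hash mismatch: {digest}"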


File details

Details for the file apache_airflow_providers_apache_druid-3.3.0-py3-none-any.whl.

File metadata

File hashes

Hashes for apache_airflow_providers_apache_druid-3.3.0-py3-none-any.whl

Algorithm     Hash digest
SHA256        0b3230a9e0293d4d86872539544534f27481e7088816314793c6419ec79fcc8e
MD5           642cbe712a2df2524748b7e1ef2e1410
BLAKE2b-256   1178cfca601e9ce30e96c671c2736febe73a79e759df4d5855cb3a9cb407be2d


