Back-ported airflow.providers.databricks.* package for Airflow 1.10.*
Package apache-airflow-backport-providers-databricks
Release: 2020.5.20
Backport package
This is a backport providers package for the databricks provider. All classes for this provider package are in the airflow.providers.databricks Python package.
Only Python 3.6+ is supported for this backport package. While Airflow 1.10.* continues to support Python 2.7+, you need to upgrade to Python 3.6+ if you want to use this backport package.
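As a quick sanity check, the version requirement above can be enforced at import time. This guard is purely illustrative and not part of the package:

```python
import sys

# Illustrative guard: the backport packages require Python 3.6+,
# even though Airflow 1.10.* itself still supports Python 2.7.
if sys.version_info < (3, 6):
    raise RuntimeError(
        "apache-airflow-backport-providers-databricks requires Python 3.6+"
    )
```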
Installation
You can install this package on top of an existing Airflow 1.10.* installation via
pip install apache-airflow-backport-providers-databricks
Compatibility
For full compatibility and test status of the backport packages, check Airflow Backport Package Compatibility.
PIP requirements
| PIP package | Version required |
|---|---|
| requests | >=2.20.0, <3 |
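As a rough sketch of what the `>=2.20.0, <3` pin means, the comparison can be done with plain version tuples. This helper is a simplification for illustration only; real resolvers such as pip follow PEP 440, which also handles pre-releases, epochs, and local versions:

```python
def satisfies_requests_pin(version: str) -> bool:
    """Check a dotted numeric version against >=2.20.0, <3 (simplified sketch)."""
    parts = tuple(int(p) for p in version.split("."))
    # Tuple comparison mirrors the lower/upper bounds of the pin.
    return (2, 20, 0) <= parts < (3,)
```

For example, `satisfies_requests_pin("2.23.0")` is accepted, while `"2.19.1"` and `"3.0.0"` are rejected.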
Provider class summary
All classes in Airflow 2.0 are in the airflow.providers.databricks package.
Operators
Moved operators
| Airflow 2.0 operators: airflow.providers.databricks package | Airflow 1.10.* previous location (usually airflow.contrib) |
|---|---|
| operators.databricks.DatabricksRunNowOperator | contrib.operators.databricks_operator.DatabricksRunNowOperator |
| operators.databricks.DatabricksSubmitRunOperator | contrib.operators.databricks_operator.DatabricksSubmitRunOperator |
Hooks
Moved hooks
| Airflow 2.0 hooks: airflow.providers.databricks package | Airflow 1.10.* previous location (usually airflow.contrib) |
|---|---|
| hooks.databricks.DatabricksHook | contrib.hooks.databricks_hook.DatabricksHook |
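The renames in the two tables above are mechanical, so DAG imports can be migrated with a simple lookup. The mapping below is a hand-written transcription of those three entries, and `new_location` is a hypothetical helper for illustration, not part of the package:

```python
# Old Airflow 1.10 contrib paths -> backported airflow.providers locations,
# transcribed from the operator and hook tables above.
MOVED_CLASSES = {
    "airflow.contrib.operators.databricks_operator.DatabricksRunNowOperator":
        "airflow.providers.databricks.operators.databricks.DatabricksRunNowOperator",
    "airflow.contrib.operators.databricks_operator.DatabricksSubmitRunOperator":
        "airflow.providers.databricks.operators.databricks.DatabricksSubmitRunOperator",
    "airflow.contrib.hooks.databricks_hook.DatabricksHook":
        "airflow.providers.databricks.hooks.databricks.DatabricksHook",
}

def new_location(old_path: str) -> str:
    """Return the backported import path for a moved class (hypothetical helper).

    Unmoved paths are returned unchanged.
    """
    return MOVED_CLASSES.get(old_path, old_path)
```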
Releases
Release 2020.5.20
| Commit | Committed | Subject |
|---|---|---|
| 00642a46d | 2020-05-26 | Fixed name of 20 remaining wrongly named operators. (#8994) |
| f1073381e | 2020-05-22 | Add support for spark python and submit tasks in Databricks operator (#8846) |
| 375d1ca22 | 2020-05-19 | Release candidate 2 for backport packages 2020.05.20 (#8898) |
| 12c5e5d8a | 2020-05-17 | Prepare release candidate for backport packages (#8891) |
| f3521fb0e | 2020-05-16 | Regenerate readme files for backport package release (#8886) |
| 92585ca4c | 2020-05-15 | Added automated release notes generation for backport operators (#8807) |
| 649935e8c | 2020-04-27 | [AIRFLOW-8472]: PATCH for Databricks hook _do_api_call (#8473) |
| 16903ba3a | 2020-04-24 | [AIRFLOW-8474]: Adding possibility to get job_id from Databricks run (#8475) |
| 5648dfbc3 | 2020-03-23 | Add missing call to Super class in 'amazon', 'cloudant' & 'databricks' providers (#7827) |
| 3320e432a | 2020-02-24 | [AIRFLOW-6817] Lazy-load airflow.DAG to keep user-facing API untouched (#7517) |
| 4d03e33c1 | 2020-02-22 | [AIRFLOW-6817] remove imports from airflow/__init__.py, replaced implicit imports with explicit imports, added entry to UPDATING.MD - squashed/rebased (#7456) |
| 97a429f9d | 2020-02-02 | [AIRFLOW-6714] Remove magic comments about UTF-8 (#7338) |
| 83c037873 | 2020-01-30 | [AIRFLOW-6674] Move example_dags in accordance with AIP-21 (#7287) |
| c42a375e7 | 2020-01-27 | [AIRFLOW-6644][AIP-21] Move service classes to providers package (#7265) |