dataflows-ckan
CKAN integration for Dataflows.
Dataflows processors to work with CKAN.
Features
- dump_to_ckan processor
Getting Started
Installation
The package uses semantic versioning, which means that major versions can include breaking changes. It's recommended to specify a version range in your setup/requirements file, e.g. dataflows-ckan>=0.3,<0.4.
$ pip install dataflows-ckan
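If you declare the dependency in a setup.py, a minimal sketch of such a pin could look like the following (the project name and version range are illustrative, not prescribed by this package):

from setuptools import setup

setup(
    name='my-pipeline',                              # hypothetical downstream project
    install_requires=['dataflows-ckan>=0.3,<0.4'],   # pin to a compatible range, per the note above
)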
Examples
These processors have to be used as part of a data flow. For example:
from dataflows import Flow, load
from dataflows_ckan import dump_to_ckan

flow = Flow(
    load('data/data.csv'),
    dump_to_ckan(
        host,                               # URL of the target CKAN instance
        api_key,                            # API key of a user allowed to write to the organization
        owner_org,                          # CKAN organization that will own the dataset
        overwrite_existing_data=True,
        push_to_datastore=False,
        push_to_datastore_method='insert',
        **options,                          # any additional keyword options passed through to the processor
    ),
)

flow.process()
Documentation
dump_to_ckan
Saves the DataPackage to a CKAN instance.
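A minimal self-contained sketch of calling it; the environment variable names and the organization name are placeholder assumptions, not part of the library:

import os
from dataflows import Flow, load
from dataflows_ckan import dump_to_ckan

Flow(
    load('data/data.csv'),
    dump_to_ckan(
        os.environ['CKAN_HOST'],      # hypothetical env var holding the CKAN instance URL
        os.environ['CKAN_API_KEY'],   # hypothetical env var holding a CKAN API key
        'my-organization',            # placeholder owner_org
        overwrite_existing_data=True,
    ),
).process()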
Contributing
Create a virtual environment and install Poetry.
Then install the package in editable mode:
$ make install
Run the tests:
$ make test
Format your code:
$ make format
Changelog
0.2.0
- Full port to dataflows, with some refactoring and a basic integration test.
0.1.0
- An initial port from https://github.com/frictionlessdata/datapackage-pipelines-ckan, based on the great work of @brew and @amercader.