
# jsontableschema-bigquery-py

[![Travis](https://img.shields.io/travis/frictionlessdata/jsontableschema-bigquery-py/master.svg)](https://travis-ci.org/frictionlessdata/jsontableschema-bigquery-py)
[![Coveralls](http://img.shields.io/coveralls/frictionlessdata/jsontableschema-bigquery-py.svg?branch=master)](https://coveralls.io/r/frictionlessdata/jsontableschema-bigquery-py?branch=master)
[![PyPi](https://img.shields.io/pypi/v/jsontableschema-bigquery.svg)](https://pypi-hypernode.com/pypi/jsontableschema-bigquery)
[![SemVer](https://img.shields.io/badge/versions-SemVer-brightgreen.svg)](http://semver.org/)
[![Gitter](https://img.shields.io/gitter/room/frictionlessdata/chat.svg)](https://gitter.im/frictionlessdata/chat)

Generate and load BigQuery tables based on JSON Table Schema descriptors.

> Version `v0.3` contains breaking changes:
> - renamed `Storage.tables` to `Storage.buckets`
> - changed `Storage.read` to read into memory
> - added `Storage.iter` to yield row by row

## Getting Started

### Installation

```bash
pip install jsontableschema-bigquery
```

### Storage

The package implements the [Tabular Storage](https://github.com/frictionlessdata/jsontableschema-py#storage) interface.

To start using the Google BigQuery service:
- Create a new project - [link](https://console.developers.google.com/home/dashboard)
- Create a service key - [link](https://console.developers.google.com/apis/credentials)
- Download the JSON credentials and set the `GOOGLE_APPLICATION_CREDENTIALS` environment variable

We can create a storage instance this way:

```python
import io
import os
import json
from apiclient.discovery import build
from oauth2client.client import GoogleCredentials
from jsontableschema_bigquery import Storage

# Point the Google SDK at the downloaded service key
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '.credentials.json'

# Build an authenticated BigQuery API client
credentials = GoogleCredentials.get_application_default()
service = build('bigquery', 'v2', credentials=credentials)

# Read the project id from the same credentials file
project = json.load(io.open('.credentials.json', encoding='utf-8'))['project_id']

# Storage works within one BigQuery dataset; table names get the given prefix
storage = Storage(service, project, 'dataset', prefix='prefix')
```

Then we can interact with the storage:

```python
storage.buckets                       # list of bucket (table) names
storage.create('bucket', descriptor)
storage.delete('bucket')
storage.describe('bucket')            # returns the descriptor
storage.iter('bucket')                # yields rows one by one
storage.read('bucket')                # returns all rows, read into memory
storage.write('bucket', rows)
```
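For example, a minimal round trip could look like the sketch below. The bucket name `articles`, the descriptor, and the sample rows are illustrative, not part of the package:

```python
# A minimal sketch of a round trip; 'articles' and the data are illustrative
descriptor = {
    'fields': [
        {'name': 'id', 'type': 'integer'},
        {'name': 'title', 'type': 'string'},
    ]
}
rows = [
    [1, 'Hello'],
    [2, 'World'],
]

storage.create('articles', descriptor)   # creates the BigQuery table
storage.write('articles', rows)          # loads the rows
print(storage.describe('articles'))      # the stored descriptor
print(storage.read('articles'))          # all rows, read into memory
storage.delete('articles')               # drops the table
```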

### Mappings

```
schema.json -> bigquery table schema
data.csv -> bigquery table data
```
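As a rough sketch of what that mapping might produce (the exact BigQuery field types, modes, and naming conventions here are assumptions, not documented output):

```python
# Hypothetical illustration only: a JSON Table Schema descriptor...
source_schema = {
    'fields': [
        {'name': 'id', 'type': 'integer', 'constraints': {'required': True}},
        {'name': 'name', 'type': 'string'},
    ]
}

# ...could translate to a BigQuery table schema roughly like this:
# {'fields': [
#     {'name': 'id', 'type': 'INTEGER', 'mode': 'REQUIRED'},
#     {'name': 'name', 'type': 'STRING', 'mode': 'NULLABLE'},
# ]}
```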

### Drivers

The default Google BigQuery client is used - [docs](https://developers.google.com/resources/api-libraries/documentation/bigquery/v2/python/latest/).

## API Reference

### Snapshot

https://github.com/frictionlessdata/jsontableschema-py#snapshot

### Detailed

- [Docstrings](https://github.com/frictionlessdata/jsontableschema-py/tree/master/jsontableschema/storage.py)
- [Changelog](https://github.com/frictionlessdata/jsontableschema-bigquery-py/commits/master)

## Contributing

Please read the contribution guidelines:

[How to Contribute](CONTRIBUTING.md)

Thanks!

