
Google BigQuery API client library

Project description

Python idiomatic client for Google BigQuery


Quick Start

$ pip install --upgrade google-cloud-bigquery

For more information on setting up your Python development environment, such as installing pip and virtualenv on your system, refer to the Python Development Environment Setup Guide for Google Cloud Platform.

Authentication

With google-cloud-python we try to make authentication as painless as possible. Check out the Authentication section in our documentation to learn more. You may also find the authentication document shared by all the google-cloud-* libraries to be helpful.
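For example, in most environments the client library can authenticate via Application Default Credentials by pointing the GOOGLE_APPLICATION_CREDENTIALS environment variable at a service-account key file. A minimal sketch; the path below is a placeholder, not a real key:

```shell
# Point the Google Cloud client libraries at a service-account key file.
# "/path/to/keyfile.json" is a placeholder; substitute the path to your own key.
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/keyfile.json"
```

With this variable set, `bigquery.Client()` picks up the credentials automatically, with no explicit credential handling in code.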

Using the API

Querying massive datasets can be time-consuming and expensive without the right hardware and infrastructure. Google BigQuery (BigQuery API docs) solves this problem by enabling super-fast SQL queries against append-mostly tables, using the processing power of Google’s infrastructure.

Create a dataset

from google.cloud import bigquery
from google.cloud.bigquery import Dataset

client = bigquery.Client()

dataset_ref = client.dataset('dataset_name')
dataset = Dataset(dataset_ref)
dataset.description = 'my dataset'
dataset = client.create_dataset(dataset)  # API request

Load data from CSV

from google.cloud import bigquery
from google.cloud.bigquery import LoadJobConfig
from google.cloud.bigquery import SchemaField

client = bigquery.Client()

SCHEMA = [
    SchemaField('full_name', 'STRING', mode='required'),
    SchemaField('age', 'INTEGER', mode='required'),
]
table_ref = client.dataset('dataset_name').table('table_name')

load_config = LoadJobConfig()
load_config.skip_leading_rows = 1
load_config.schema = SCHEMA

# Contents of csv_file.csv:
#     Name,Age
#     Tim,99
with open('csv_file.csv', 'rb') as readable:
    load_job = client.load_table_from_file(
        readable, table_ref, job_config=load_config)  # API request
load_job.result()  # Waits for the load job to complete.
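The load example assumes csv_file.csv already exists. The sample file shown in the comment can be generated with the standard library:

```python
import csv

# Generate the sample csv_file.csv consumed by the load example above.
with open('csv_file.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['Name', 'Age'])  # header row; skipped via skip_leading_rows=1
    writer.writerow(['Tim', '99'])
```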

Perform a query

# Perform a query.
QUERY = (
    'SELECT name FROM `bigquery-public-data.usa_names.usa_1910_2013` '
    'WHERE state = "TX" '
    'LIMIT 100')
query_job = client.query(QUERY)  # API request
rows = query_job.result()  # Waits for query to finish

for row in rows:
    print(row.name)
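Row objects also support mapping-style access, so results can be materialized into plain dictionaries for downstream processing. A minimal sketch of that post-processing step, using ordinary dicts to stand in for Row objects so no API call is needed here:

```python
def rows_to_dicts(rows):
    """Materialize an iterable of mapping-like rows into plain dicts."""
    return [dict(row) for row in rows]

# Stand-ins for query results; real BigQuery Row objects behave like mappings.
sample_rows = [{'name': 'Mary'}, {'name': 'John'}]
print(rows_to_dicts(sample_rows))  # → [{'name': 'Mary'}, {'name': 'John'}]
```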

See the google-cloud-python API BigQuery documentation to learn how to connect to BigQuery using this Client Library.

Project details


Release history

This version

1.5.0

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

google-cloud-bigquery-1.5.0.tar.gz (150.2 kB)

Built Distribution

google_cloud_bigquery-1.5.0-py2.py3-none-any.whl (77.8 kB)

File details

Details for the file google-cloud-bigquery-1.5.0.tar.gz.

File metadata

  • Download URL: google-cloud-bigquery-1.5.0.tar.gz
  • Size: 150.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.11.0 pkginfo/1.4.2 requests/2.19.1 setuptools/40.0.0 requests-toolbelt/0.8.0 tqdm/4.24.0 CPython/3.6.0

File hashes

Hashes for google-cloud-bigquery-1.5.0.tar.gz

  • SHA256: 9116ef887519da060c90ca1645eecb3d8fe70229b5e7368db4d2e2c87db91bf0
  • MD5: 2b8c0a685ef0d0fb03a1f3d4197ab416
  • BLAKE2b-256: 5d367ff904283f9035ddff987a0cd1512e4d37e60153b77d393f89ed89b9682d


File details

Details for the file google_cloud_bigquery-1.5.0-py2.py3-none-any.whl.

File metadata

  • Download URL: google_cloud_bigquery-1.5.0-py2.py3-none-any.whl
  • Size: 77.8 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.11.0 pkginfo/1.4.2 requests/2.19.1 setuptools/40.0.0 requests-toolbelt/0.8.0 tqdm/4.24.0 CPython/3.6.0

File hashes

Hashes for google_cloud_bigquery-1.5.0-py2.py3-none-any.whl

  • SHA256: c9de858bdb259e3edc3a6e9af4e1d02a256af5a11306515b8953aa078b13f9a3
  • MD5: deccb2a2563f2a5f645b11d8a7521878
  • BLAKE2b-256: 99b9fbac0fcbdbf737a44998020c2c090ebbfa37dd89d475ce47a06a5a07418f

