
Google BigQuery API client library

Project description

Idiomatic Python client for Google BigQuery


Quick Start

$ pip install --upgrade google-cloud-bigquery

For more information on setting up your Python development environment, such as installing pip and virtualenv on your system, please refer to the Python Development Environment Setup Guide for Google Cloud Platform.
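For example, a typical setup (the environment name below is only a placeholder) installs the library inside a virtualenv:

$ pip install virtualenv
$ virtualenv <your-env>
$ source <your-env>/bin/activate
$ pip install --upgrade google-cloud-bigquery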

Authentication

With google-cloud-python we try to make authentication as painless as possible. Check out the Authentication section in our documentation to learn more. You may also find the authentication document shared by all the google-cloud-* libraries to be helpful.
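As a minimal sketch (the key file path below is a placeholder, not something this library ships), you can either rely on application default credentials or point the client at a service account key explicitly:

from google.cloud import bigquery

# Option 1: rely on GOOGLE_APPLICATION_CREDENTIALS / application default credentials.
client = bigquery.Client()

# Option 2: load credentials from a downloaded service account key file.
# (Placeholder path; adjust to your own key location.)
client = bigquery.Client.from_service_account_json(
    '/path/to/service_account_key.json')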

Using the API

Querying massive datasets can be time-consuming and expensive without the right hardware and infrastructure. Google BigQuery (BigQuery API docs) solves this problem by enabling super-fast SQL queries against append-mostly tables, using the processing power of Google’s infrastructure.

Create a dataset

from google.cloud import bigquery
from google.cloud.bigquery import Dataset

client = bigquery.Client()

dataset_ref = client.dataset('dataset_name')
dataset = Dataset(dataset_ref)
dataset.description = 'my dataset'
dataset = client.create_dataset(dataset)  # API request
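As a quick check (a sketch that goes slightly beyond the snippet above, reusing client and dataset_ref), the new dataset can be read back from the API:

dataset = client.get_dataset(dataset_ref)  # API request
print(dataset.dataset_id, dataset.description)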

Load data from CSV

from google.cloud import bigquery
from google.cloud.bigquery import LoadJobConfig
from google.cloud.bigquery import SchemaField

client = bigquery.Client()

SCHEMA = [
    SchemaField('full_name', 'STRING', mode='required'),
    SchemaField('age', 'INTEGER', mode='required'),
]
table_ref = client.dataset('dataset_name').table('table_name')

load_config = LoadJobConfig()
load_config.skip_leading_rows = 1
load_config.schema = SCHEMA

# Contents of csv_file.csv:
#     Name,Age
#     Tim,99
with open('csv_file.csv', 'rb') as readable:
    load_job = client.load_table_from_file(
        readable, table_ref, job_config=load_config)  # API request
load_job.result()  # Waits for the load job to complete.
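To confirm the load, a short follow-up sketch (reusing client and table_ref from above) is to re-fetch the table and inspect its row count:

table = client.get_table(table_ref)  # API request
print('Loaded {} rows into {}.'.format(table.num_rows, table_ref.table_id))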

Perform a query

from google.cloud import bigquery

client = bigquery.Client()

QUERY = (
    'SELECT name FROM `bigquery-public-data.usa_names.usa_1910_2013` '
    'WHERE state = "TX" '
    'LIMIT 100')
query_job = client.query(QUERY)  # API request
rows = query_job.result()  # Waits for query to finish

for row in rows:
    print(row.name)
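Each result is a Row; as a small follow-up sketch (re-running the query, since the result iterator can only be consumed once), a field can also be read by position:

rows = client.query(QUERY).result()  # API request; waits for the query.
for row in rows:
    # Both expressions refer to the same 'name' column.
    print(row.name, row[0])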

See the google-cloud-python BigQuery API documentation to learn how to connect to BigQuery using this client library.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

google-cloud-bigquery-1.0.0.tar.gz (133.3 kB)

Uploaded: Source

Built Distribution

google_cloud_bigquery-1.0.0-py2.py3-none-any.whl (73.2 kB)

Uploaded: Python 2, Python 3

File details

Details for the file google-cloud-bigquery-1.0.0.tar.gz.

File metadata

File hashes

Hashes for google-cloud-bigquery-1.0.0.tar.gz
Algorithm    Hash digest
SHA256       49a0eda0e41a236d3986db69d4a7bb31370908f98ee98c0ef16e72f6bd0e2e3b
MD5          c5315c5b1be223e767ca2863b496062d
BLAKE2b-256  368501f667fbd9daf06101c8a11be06ff44d06d9186b164c9444bd5786a7a1ba

See more details on using hashes here.
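As an illustrative sketch (not part of the original description; the path below assumes the archive was saved under its own name in the current directory), one way to verify a download against the SHA256 digest above is with Python's hashlib:

import hashlib

# Path to the downloaded sdist (adjust to wherever the file was saved).
path = 'google-cloud-bigquery-1.0.0.tar.gz'

sha256 = hashlib.sha256()
with open(path, 'rb') as f:
    for chunk in iter(lambda: f.read(8192), b''):
        sha256.update(chunk)

# The printed digest should match the SHA256 value listed above.
print(sha256.hexdigest())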


File details

Details for the file google_cloud_bigquery-1.0.0-py2.py3-none-any.whl.

File metadata

File hashes

Hashes for google_cloud_bigquery-1.0.0-py2.py3-none-any.whl
Algorithm    Hash digest
SHA256       593cc03a06f88cccd9c73a10610145e23229e07e2b97c613995968a2efbe7362
MD5          9f65b3ed6ff5233080e068559e105bcd
BLAKE2b-256  f546118110e0115628eef49577cb639bf6aaa545fbd860dfb336f5b0d81789d5

See more details on using hashes here.

