
Databricks SQL Connector for Python

Reason this release was yanked:

Incompatible with urllib3<=2.0.0

Project description

Databricks SQL Connector for Python


The Databricks SQL Connector for Python allows you to develop Python applications that connect to Databricks clusters and SQL warehouses. It is a Thrift-based client with no dependencies on ODBC or JDBC. It conforms to the Python DB API 2.0 specification and exposes a SQLAlchemy dialect for use with tools like pandas and alembic which use SQLAlchemy to execute DDL.
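For illustration, below is a minimal sketch of building a SQLAlchemy engine with the dialect. The databricks:// URL shape shown here (a token user plus http_path, catalog, and schema query parameters) is an assumption based on the 2.x dialect, and the catalog and schema values are placeholders; it reuses the DATABRICKS_* environment variables introduced in the Quickstart below.

import os

from sqlalchemy import create_engine, text

# Sketch only: the databricks:// URL form and its query parameters are an
# assumption based on the connector's 2.x SQLAlchemy dialect; catalog and
# schema are placeholders.
engine = create_engine(
    "databricks://token:{token}@{host}?http_path={http_path}"
    "&catalog=samples&schema=default".format(
        token=os.getenv("DATABRICKS_TOKEN"),
        host=os.getenv("DATABRICKS_HOST"),
        http_path=os.getenv("DATABRICKS_HTTP_PATH"),
    )
)

with engine.connect() as conn:
    print(conn.execute(text("SELECT 1")).fetchone())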

This connector uses Arrow as the data-exchange format, and supports APIs to directly fetch Arrow tables. Arrow tables are wrapped in the ArrowQueue class to provide a natural API to get several rows at a time.
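As a hedged sketch of the direct-Arrow path, the cursor exposes fetchall_arrow() (and fetchmany_arrow(n)) to return pyarrow Tables rather than lists of rows; connection details are read from the environment variables described in the Quickstart below.

import os
from databricks import sql

connection = sql.connect(
    server_hostname=os.getenv("DATABRICKS_HOST"),
    http_path=os.getenv("DATABRICKS_HTTP_PATH"),
    access_token=os.getenv("DATABRICKS_TOKEN"))

cursor = connection.cursor()
cursor.execute("SELECT * FROM RANGE(10)")

# fetchall_arrow() returns a pyarrow.Table instead of a list of Row objects;
# fetchmany_arrow(n) retrieves the result a chunk at a time.
table = cursor.fetchall_arrow()
print(table.num_rows, table.column_names)

cursor.close()
connection.close()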

You are welcome to file an issue on the project's issue tracker for general use cases. You can also contact Databricks Support.

Requirements

Python 3.7 or above is required.

Documentation

For the latest documentation, see the Databricks documentation for your cloud platform.

Quickstart

Install the library with pip install databricks-sql-connector

Note: Don't hard-code authentication secrets in your Python code. Use environment variables instead:

export DATABRICKS_HOST=********.databricks.com
export DATABRICKS_HTTP_PATH=/sql/1.0/endpoints/****************
export DATABRICKS_TOKEN=dapi********************************

Example usage:

import os
from databricks import sql

# Connection details come from the environment variables set above.
host = os.getenv("DATABRICKS_HOST")
http_path = os.getenv("DATABRICKS_HTTP_PATH")
access_token = os.getenv("DATABRICKS_TOKEN")

connection = sql.connect(
  server_hostname=host,
  http_path=http_path,
  access_token=access_token)

cursor = connection.cursor()

# Run a query and fetch the full result as a list of Row objects.
cursor.execute('SELECT * FROM RANGE(10)')
result = cursor.fetchall()
for row in result:
  print(row)

# Release the cursor and the underlying connection.
cursor.close()
connection.close()

In the above example:

  • server_hostname is the Databricks instance host name.
  • http_path is the HTTP path to either a Databricks SQL endpoint (e.g. /sql/1.0/endpoints/1234567890abcdef) or a Databricks Runtime interactive cluster (e.g. /sql/protocolv1/o/1234567890123456/1234-123456-slid123).
  • access_token is a Databricks personal access token for the account that will execute commands and queries.
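The connection and cursor objects also support the context-manager protocol, so a variant of the example above can rely on with blocks to close them automatically; a minimal sketch:

import os
from databricks import sql

# Both the connection and the cursor are closed automatically when the
# with blocks exit, even if the query raises an exception.
with sql.connect(
    server_hostname=os.getenv("DATABRICKS_HOST"),
    http_path=os.getenv("DATABRICKS_HTTP_PATH"),
    access_token=os.getenv("DATABRICKS_TOKEN"),
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT * FROM RANGE(10)")
        for row in cursor.fetchall():
            print(row)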

Contributing

See CONTRIBUTING.md

License

Apache License 2.0


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

databricks_sql_connector-2.9.1.tar.gz (287.5 kB)

Uploaded Source

Built Distribution

databricks_sql_connector-2.9.1-py3-none-any.whl (297.2 kB)

Uploaded Python 3

File details

Details for the file databricks_sql_connector-2.9.1.tar.gz.

File metadata

  • Download URL: databricks_sql_connector-2.9.1.tar.gz
  • Upload date:
  • Size: 287.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.10.2 Linux/5.15.0-1042-azure

File hashes

Hashes for databricks_sql_connector-2.9.1.tar.gz
Algorithm Hash digest
SHA256 fdfaa0653248c814f02f3cd265678bdbb28bf634e7dfa43ac881d83942625a1d
MD5 ed35cbcab7276d8dc2aa1c58bd19c1db
BLAKE2b-256 a84942dc8833c2d402e0685e0f838b7e96b1779310059f2f5172afd2a4fae061

See more details on using hashes here.
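As a quick illustration, the SHA256 digest published above can be checked against a locally downloaded sdist with a few lines of Python (the file name is assumed to match the download):

import hashlib

# Compare the downloaded sdist against the SHA256 digest published above.
expected = "fdfaa0653248c814f02f3cd265678bdbb28bf634e7dfa43ac881d83942625a1d"

with open("databricks_sql_connector-2.9.1.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "MISMATCH")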

File details

Details for the file databricks_sql_connector-2.9.1-py3-none-any.whl.

File metadata

File hashes

Hashes for databricks_sql_connector-2.9.1-py3-none-any.whl
Algorithm Hash digest
SHA256 8e57cc3578ccf51ea6336d0328a782bfbad9fdb47e8196a4e2be806fe29aece6
MD5 81452f63c449a840efbf249ca8392b99
BLAKE2b-256 866eb976f37f24a0e1c5986861200ab5aa417c749b63655912956263ddf1bbd0

See more details on using hashes here.
