
Microsoft Azure Data Tables Client Library for Python

Project description

Azure Data Tables client library for Python

Azure Data Tables is a NoSQL data storage service that can be accessed from anywhere in the world via authenticated calls using HTTP or HTTPS. Tables scale as needed to support the amount of data inserted and allow for the storage of data with simple access patterns. The Azure Data Tables client can be used to access Azure Storage or Azure Cosmos DB accounts.

Source code | Package (PyPI) | API reference documentation | Samples

Getting started

Azure Data Tables can be accessed using either an Azure Storage account or an Azure Cosmos DB account.

Prerequisites

Create a storage account

If you wish to create a new storage account, you can use Azure Portal, Azure PowerShell, or Azure CLI:

# Create a new resource group to hold the storage account -
# if using an existing resource group, skip this step
az group create --name MyResourceGroup --location westus2
# Create the storage account
az storage account create -n MyStorageAccount -g MyResourceGroup

Creating a Cosmos DB

If you wish to create a new Cosmos DB account, you can use the Azure CLI. The following commands create a Cosmos DB account named MyCosmosDBDatabaseAccount in the resource group MyResourceGroup under the subscription MySubscription, and a table named MyTableName in that account:

az cosmosdb create --name MyCosmosDBDatabaseAccount --resource-group MyResourceGroup --subscription MySubscription
az cosmosdb table create --name MyTableName --resource-group MyResourceGroup --account-name MyCosmosDBDatabaseAccount

Install the package

Install the Azure Data Tables client library for Python with pip:

pip install --pre azure-data-tables

Create the client

The Azure Data Tables client library for Python allows you to interact with two types of resources: the tables in your account, and the entities within the tables. Interaction with these resources starts with an instance of a client. To create a client object, you will need the account's table service endpoint URL and a credential that allows you to access the account:

from azure.data.tables import TableServiceClient
service = TableServiceClient(account_url="https://<myaccount>.table.core.windows.net/", credential=credential)

Types of credentials

The credential parameter may be provided in a number of different forms, depending on the type of authorization you wish to use:

Creating the client from a SAS token

To use a shared access signature (SAS) token, provide the token as a string. If your account URL includes the SAS token, omit the credential parameter. You can generate a SAS token from the Azure Portal under Shared access signature or use one of the generate_*_sas() functions to create a SAS token for the account or table:

    from datetime import datetime, timedelta
    from azure.data.tables import TableServiceClient, generate_account_sas, ResourceTypes, AccountSasPermissions

    sas_token = generate_account_sas(
        account_name="<account-name>",
        account_key="<account-access-key>",
        resource_types=ResourceTypes(service=True),
        permission=AccountSasPermissions(read=True),
        expiry=datetime.utcnow() + timedelta(hours=1)
    )

    table_service_client = TableServiceClient(account_url="https://<my_account_name>.table.core.windows.net", credential=sas_token)
Creating the client from a shared key

To use an account shared key (aka account key or access key), provide the key as a string. This can be found in the Azure Portal under the "Access Keys" section or by running the following Azure CLI command:

az storage account keys list -g MyResourceGroup -n MyStorageAccount

Use the key as the credential parameter to authenticate the client:

    from azure.data.tables import TableServiceClient
    service = TableServiceClient(account_url="https://<my_account_name>.table.core.windows.net", credential="<account_access_key>")
Creating the client from a connection string

Depending on your use case and authorization method, you may prefer to initialize a client instance with a connection string instead of providing the account URL and credential separately. To do this, pass the connection string to the client's from_connection_string class method:

    from azure.data.tables import TableServiceClient
    connection_string = "DefaultEndpointsProtocol=https;AccountName=xxxx;AccountKey=xxxx;EndpointSuffix=core.windows.net"
    service = TableServiceClient.from_connection_string(conn_str=connection_string)

The connection string to your account can be found in the Azure Portal under the "Access Keys" section or by running the following CLI command:

az storage account show-connection-string -g MyResourceGroup -n MyStorageAccount

Looking up the account URL

You can find the account's table service URL using the Azure Portal, Azure PowerShell, or Azure CLI:

# Get the table service URL for the account
az storage account show -n MyStorageAccount -g MyResourceGroup --query "primaryEndpoints.table"

Key concepts

Common uses of the Table service include:

  • Storing TBs of structured data capable of serving web scale applications
  • Storing datasets that do not require complex joins, foreign keys, or stored procedures and can be de-normalized for fast access
  • Quickly querying data using a clustered index
  • Accessing data using the OData protocol and LINQ filter expressions

The following components make up the Azure Data Tables Service:

  • The account
  • A table within the account, which contains a set of entities
  • An entity within a table, as a dictionary

The Azure Data Tables client library for Python allows you to interact with each of these components through the use of a dedicated client object.

Clients

Two different clients are provided to interact with the various components of the Table Service:

  1. TableServiceClient - this client represents interaction with the Azure account itself, and allows you to acquire preconfigured client instances to access the tables within. It provides operations to retrieve and configure the account properties as well as query, create, and delete tables within the account. To perform operations on a specific table, retrieve a client using the get_table_client method, as shown in the sketch after this list.
  2. TableClient - this client represents interaction with a specific table (which need not exist yet). It provides operations to create, delete, or update a table and includes operations to query, get, and upsert entities within it.
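
For example, a minimal sketch of acquiring a table-scoped client from the account-level client (the connection string is a placeholder):

from azure.data.tables import TableServiceClient

# Start from the account-level client
table_service_client = TableServiceClient.from_connection_string(conn_str="<connection_string>")

# Acquire a client scoped to a single table; the table itself
# does not need to exist yet.
table_client = table_service_client.get_table_client(table_name="myTable")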

Entities

Entities are similar to rows. An entity has a partition key, a row key, and a set of properties. A property is a name-value pair, similar to a column. The table client supports the following entity operations (each is illustrated in the sketch after this list):

  • Create - Adds an entity to the table.
  • Delete - Deletes an entity from the table.
  • Update - Updates an entity's information by either merging or replacing the existing entity.
  • Query - Queries existing entities in a table based on OData query options.
  • Get - Gets a specific entity from a table by partition and row key.
  • Upsert - Merges or replaces an entity in a table, or if the entity does not exist, inserts the entity.
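
As a rough sketch of how these operations map onto TableClient methods (assuming the method and keyword names shown, including the UpdateMode enum, and using a placeholder connection string):

from azure.data.tables import TableClient, UpdateMode

table_client = TableClient.from_connection_string(conn_str="<connection_string>", table_name="myTable")

entity = {"PartitionKey": "part", "RowKey": "row", "text": "Marker"}

# Create - add the entity to the table
table_client.create_entity(entity=entity)

# Get - retrieve it by partition and row key
received = table_client.get_entity(partition_key="part", row_key="row")

# Update - merge new property values into the existing entity
received["text"] = "Updated"
table_client.update_entity(entity=received, mode=UpdateMode.MERGE)

# Upsert - replace the entity, or insert it if it does not exist
table_client.upsert_entity(entity=received, mode=UpdateMode.REPLACE)

# Delete - remove the entity from the table
table_client.delete_entity(partition_key="part", row_key="row")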

Examples

The following sections provide several code snippets covering some of the most common Table tasks, including creating a table, creating entities, and querying entities.

Creating a table

Create a table in your account

from azure.data.tables import TableServiceClient
table_service_client = TableServiceClient.from_connection_string(conn_str="<connection_string>")
table_service_client.create_table(table_name="myTable")

Creating entities

Create entities in the table

from azure.data.tables import TableClient
my_entity = {'PartitionKey':'part','RowKey':'row'}
table_client = TableClient.from_connection_string(conn_str="<connection_string>", table_name="myTable")
entity = table_client.create_entity(entity=my_entity)

Querying entities

Querying entities in the table

from azure.data.tables import TableClient
my_filter = "text eq 'Marker'"
table_client = TableClient.from_connection_string(conn_str="<connection_string>", table_name="myTable")
entities = table_client.query_entities(filter=my_filter)
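
query_entities returns an iterable of matching entities rather than a single entity, so you would typically loop over the result. A minimal continuation of the snippet above:

# Iterate over the entities matching the filter
for entity in entities:
    print(entity["PartitionKey"], entity["RowKey"])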

Optional Configuration

Optional keyword arguments can be passed in at the client and per-operation level. The azure-core reference documentation describes available configurations for retries, logging, transport protocols, and more.

Retry Policy configuration

Use the following keyword arguments when instantiating a client to configure the retry policy (a short example follows the list):

  • retry_total (int): Total number of retries to allow. Takes precedence over other counts. Pass in retry_total=0 if you do not want to retry on requests. Defaults to 10.
  • retry_connect (int): How many connection-related errors to retry on. Defaults to 3.
  • retry_read (int): How many times to retry on read errors. Defaults to 3.
  • retry_status (int): How many times to retry on bad status codes. Defaults to 3.
  • retry_to_secondary (bool): Whether the request should be retried to the secondary endpoint, if able. This should only be enabled if RA-GRS accounts are used and potentially stale data can be handled. Defaults to False.
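
For example, a client could be constructed with a stricter retry policy using the keyword arguments listed above (the connection string is a placeholder):

from azure.data.tables import TableServiceClient

service_client = TableServiceClient.from_connection_string(
    conn_str="<connection_string>",
    retry_total=5,             # overall cap on retries
    retry_connect=2,           # connection-related errors
    retry_read=2,              # read errors
    retry_status=2,            # bad status codes
    retry_to_secondary=False,  # do not retry against a secondary endpoint
)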

Other client / per-operation configuration

Other optional configuration keyword arguments can be specified on the client or per operation (an example combining several of them follows the lists below).

Client keyword arguments:

  • connection_timeout (int): Optionally sets the connect and read timeout value, in seconds.
  • transport (Any): User-provided transport to send the HTTP request.

Per-operation keyword arguments:

  • raw_response_hook (callable): The given callback is called with the response returned from the service.
  • raw_request_hook (callable): The given callback is called with the request before it is sent to the service.
  • client_request_id (str): Optional user specified identification of the request.
  • user_agent (str): Appends the custom value to the user-agent header to be sent with the request.
  • logging_enable (bool): Enables logging at the DEBUG level. Defaults to False. Can also be passed in at the client level to enable it for all requests.
  • headers (dict): Pass in custom headers as key, value pairs. E.g. headers={'CustomValue': value}
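
A sketch combining a client-level and several per-operation keyword arguments from the lists above (the connection string and values are placeholders):

from azure.data.tables import TableClient

# Client-level keyword argument
table_client = TableClient.from_connection_string(
    conn_str="<connection_string>",
    table_name="myTable",
    connection_timeout=20,
)

# Per-operation keyword arguments
table_client.create_entity(
    entity={"PartitionKey": "part", "RowKey": "row"},
    headers={"CustomValue": "value"},
    user_agent="my-app/1.0",
    logging_enable=True,
)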

Troubleshooting

General

Azure Data Tables clients raise exceptions defined in Azure Core. When you interact with the Azure Data Tables library using the Python SDK, errors returned by the service correspond to the same HTTP status codes returned for REST API requests. Table service operations throw an HttpResponseError on failure, with helpful error codes.

For example, if you try to create a table that already exists, a 409 error is returned indicating "Conflict".

from azure.data.tables import TableServiceClient
from azure.core.exceptions import HttpResponseError
table_name = 'YourTableName'
connection_string = '<your_connection_string>'

service_client = TableServiceClient.from_connection_string(connection_string)

# Create the table if it does not already exist
tc = service_client.create_table_if_not_exists(table_name)

try:
    service_client.create_table(table_name)
except HttpResponseError:
    print("Table with name {} already exists".format(table_name))

Logging

This library uses the standard logging library for logging. Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO level.

Detailed DEBUG level logging, including request/response bodies and unredacted headers, can be enabled on a client with the logging_enable argument:

import sys
import logging
from azure.data.tables import TableServiceClient
# Create a logger for the 'azure' SDK
logger = logging.getLogger('azure')
logger.setLevel(logging.DEBUG)

# Configure a console output
handler = logging.StreamHandler(stream=sys.stdout)
logger.addHandler(handler)

# This client will log detailed information about its HTTP sessions, at DEBUG level
service_client = TableServiceClient.from_connection_string("your_connection_string", logging_enable=True)

Similarly, logging_enable can enable detailed logging for a single operation, even when it isn't enabled for the client:

service_client.create_entity(entity=my_entity, logging_enable=True)

Next steps

Get started with our Table samples.

Several Azure Data Tables Python SDK samples are available to you in the SDK's GitHub repository. These samples provide example code for additional scenarios commonly encountered while working with Tables.

Common Scenarios

These code samples show common scenario operations with the Azure Data tables client library. The async versions of the samples (the python sample files appended with _async) show asynchronous operations with Tables and require Python 3.5 or later.
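
As a rough illustration of the async flavor (assuming the async clients live under azure.data.tables.aio and mirror the synchronous constructors; the connection string is a placeholder and a table named myTable is assumed to exist):

import asyncio
from azure.data.tables.aio import TableClient

async def create_sample_entity():
    # The async client mirrors the sync API, but its operations are awaitable.
    async with TableClient.from_connection_string(
        conn_str="<connection_string>", table_name="myTable"
    ) as table_client:
        await table_client.create_entity(entity={"PartitionKey": "part", "RowKey": "row"})

asyncio.get_event_loop().run_until_complete(create_sample_entity())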

Additional documentation

For more extensive documentation on Azure Data Tables, see the Azure Data Tables documentation on docs.microsoft.com.

Known Issues

A list of currently known issues relating to Cosmos DB table endpoints can be found here.

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.


Release History

12.0.0b3 (2020-11-12)

  • Add support for transactional batching of entity operations.
  • Fixed deserialization bug in list_tables and query_tables where TableItem.table_name was an object instead of a string.
  • Fixed issue where unrecognized entity data fields were silently ignored. They will now raise a TypeError.
  • Fixed issue where query filter parameters were being ignored (#15094)

12.0.0b2 (2020-10-07)

  • Adds support for Enumerable types by converting the Enum to a string before sending to the service

12.0.0b1 (2020-09-08)

This is the first beta of the azure-data-tables client library. The Azure Tables client library can seamlessly target either Azure Table storage or Azure Cosmos DB table service endpoints with no code changes.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

azure-data-tables-12.0.0b3.zip (825.9 kB)

Built Distribution

azure_data_tables-12.0.0b3-py2.py3-none-any.whl (110.6 kB)

File details

Details for the file azure-data-tables-12.0.0b3.zip.

File metadata

  • Download URL: azure-data-tables-12.0.0b3.zip
  • Size: 825.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.25.0 setuptools/49.2.1 requests-toolbelt/0.9.1 tqdm/4.51.0 CPython/3.9.0

File hashes

Hashes for azure-data-tables-12.0.0b3.zip

  • SHA256: 6f0849ad95f9bd2b99be9fd279848837a2fbc241d545457a353712a1ea2fbb4d
  • MD5: 18f31ea996c3563d3eb947c7a20b993e
  • BLAKE2b-256: 95936d938ef1f3b567de031fef50439bd9b4d57fe10c1a8ef157008e044c7163

See more details on using hashes here.

File details

Details for the file azure_data_tables-12.0.0b3-py2.py3-none-any.whl.

File metadata

  • Download URL: azure_data_tables-12.0.0b3-py2.py3-none-any.whl
  • Size: 110.6 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.25.0 setuptools/49.2.1 requests-toolbelt/0.9.1 tqdm/4.51.0 CPython/3.9.0

File hashes

Hashes for azure_data_tables-12.0.0b3-py2.py3-none-any.whl

  • SHA256: 8a4ab19cc2eaccffd68b97800f661b390050e8a62b6f3b7b550aa9b76fc4a73d
  • MD5: 636529b5f4441db4f7058603f7264471
  • BLAKE2b-256: 7fa6c9ca5dd07101560535aa370e35c65779da00a461a6e4e1cb516f7ef2e47b

See more details on using hashes here.
