Azure Data Tables client library for Python
Azure Data Tables is a NoSQL data storage service that can be accessed from anywhere in the world via authenticated calls using HTTP or HTTPS. The service scales as needed to support the amount of data inserted and allows for simple, schema-less storage and retrieval of data. The Azure Data Tables client can be used to access Azure Storage or Cosmos DB accounts.
Source code | Package (PyPI) | API reference documentation | Samples
Getting started
The Azure Data Tables SDK can access an Azure Storage or Cosmos DB account.
Prerequisites
- Python 2.7 or 3.6 or later is required to use this package.
- You must have an Azure subscription and either
- an Azure Storage account or
- an Azure Cosmos DB account.
Create account
- To create a new storage account, you can use Azure Portal, Azure PowerShell, or Azure CLI:
- To create a new Cosmos DB account, you can use the Azure CLI or Azure Portal.
Install the package
Install the Azure Data Tables client library for Python with pip:
pip install azure-data-tables
Create the client
The Azure Data Tables library allows you to interact with two types of resources:
- the tables in your account
- the entities within those tables.
Interaction with these resources starts with an instance of a client. To create a client object, you will need the account's table service endpoint URL and a credential that allows you to access the account. The endpoint can be found on the page for your storage account in the Azure Portal under the "Access Keys" section or by running the following Azure CLI command:
# Get the table service URL for the account
az storage account show -n mystorageaccount -g MyResourceGroup --query "primaryEndpoints.table"
Once you have the account URL, it can be used to create the service client:
from azure.data.tables import TableServiceClient
service = TableServiceClient(endpoint="https://<my_account_name>.table.core.windows.net/", credential=credential)
For more information about table service URLs and how to configure custom domain names for Azure Storage, check out the official documentation.
Types of credentials
The credential parameter may be provided in a number of different forms, depending on the type of authorization you wish to use. The Tables library supports the following authorizations:
- Shared Key
- Connection String
- Shared Access Signature Token
Creating the client from a shared key
To use an account shared key (aka account key or access key), provide the key as a string. This can be found in your storage account in the Azure Portal under the "Access Keys" section or by running the following Azure CLI command:
az storage account keys list -g MyResourceGroup -n MyStorageAccount
Use the key as the credential parameter to authenticate the client:
from azure.core.credentials import AzureNamedKeyCredential
from azure.data.tables import TableServiceClient
credential = AzureNamedKeyCredential("my_account_name", "my_access_key")
service = TableServiceClient(endpoint="https://<my_account_name>.table.core.windows.net", credential=credential)
Creating the client from a connection string
Depending on your use case and authorization method, you may prefer to initialize a client instance with a connection string instead of providing the account URL and credential separately. To do this, pass the connection string to the client's from_connection_string class method. The connection string can be found in your storage account in the Azure Portal under the "Access Keys" section or with the following Azure CLI command:
az storage account show-connection-string -g MyResourceGroup -n MyStorageAccount
from azure.data.tables import TableServiceClient
connection_string = "DefaultEndpointsProtocol=https;AccountName=<my_account_name>;AccountKey=<my_account_key>;EndpointSuffix=core.windows.net"
service = TableServiceClient.from_connection_string(conn_str=connection_string)
Creating the client from a SAS token
To use a shared access signature (SAS) token, provide the token as a string. If your account URL includes the SAS token, omit the credential parameter. You can generate a SAS token from the Azure Portal under "Shared access signature" or use one of the generate_*_sas() functions to create a SAS token for the account or table:
from datetime import datetime, timedelta
from azure.data.tables import TableServiceClient, generate_account_sas, ResourceTypes, AccountSasPermissions
from azure.core.credentials import AzureNamedKeyCredential, AzureSasCredential
credential = AzureNamedKeyCredential("my_account_name", "my_access_key")
sas_token = generate_account_sas(
    credential,
    resource_types=ResourceTypes(service=True),
    permission=AccountSasPermissions(read=True),
    expiry=datetime.utcnow() + timedelta(hours=1),
)
table_service_client = TableServiceClient(endpoint="https://<my_account_name>.table.core.windows.net", credential=AzureSasCredential(sas_token))
Key concepts
Common uses of the Table service include:
- Storing TBs of structured data capable of serving web scale applications
- Storing datasets that do not require complex joins, foreign keys, or stored procedures and can be de-normalized for fast access
- Quickly querying data using a clustered index
- Accessing data using the OData protocol and LINQ filter expressions
The following components make up the Azure Data Tables Service:
- The account
- A table within the account, which contains a set of entities
- An entity within a table, as a dictionary
The Azure Data Tables client library for Python allows you to interact with each of these components through the use of a dedicated client object.
Clients
Two different clients are provided to interact with the various components of the Table Service:
TableServiceClient
- Get and set account settings.
- Query, create, and delete tables within the account.
- Get a TableClient to access a specific table using the get_table_client method.
TableClient
- Interacts with a specific table (which need not exist yet).
- Create, delete, query, and upsert entities within the specified table.
- Create or delete the specified table itself.
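To illustrate how the two clients relate, here is a minimal sketch; the connection string and table name are placeholders:
from azure.data.tables import TableServiceClient

# Placeholder connection string for illustration.
table_service_client = TableServiceClient.from_connection_string(conn_str="<connection_string>")

# Account-level operations use the TableServiceClient...
for table in table_service_client.list_tables():
    print(table.name)

# ...while table-level operations use a TableClient obtained from it.
table_client = table_service_client.get_table_client(table_name="myTable")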
Entities
Entities are similar to rows. An entity has a PartitionKey, a RowKey, and a set of properties. A property is a name-value pair, similar to a column. Not every entity in a table needs to have the same properties. An entity can be represented as a dictionary, for example:
entity = {
    'PartitionKey': 'color',
    'RowKey': 'brand',
    'text': 'Marker',
    'color': 'Purple',
    'price': '5'
}
- create_entity - Add an entity to the table.
- delete_entity - Delete an entity from the table.
- update_entity - Update an entity's information by either merging or replacing the existing entity (see the sketch after this list).
  - UpdateMode.MERGE will add new properties to an existing entity; it will not delete existing properties.
  - UpdateMode.REPLACE will replace the existing entity with the given one, deleting any existing properties not included in the submitted entity.
- query_entities - Query existing entities in a table using OData filters.
- get_entity - Get a specific entity from a table by partition and row key.
- upsert_entity - Merge or replace an entity in a table, or insert the entity if it does not exist.
  - UpdateMode.MERGE will add new properties to an existing entity; it will not delete existing properties.
  - UpdateMode.REPLACE will replace the existing entity with the given one, deleting any existing properties not included in the submitted entity.
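As a brief illustration of the two update modes, here is a minimal sketch; the connection string, table name, and entity values are placeholders:
from azure.data.tables import TableClient, UpdateMode

# Placeholder connection string and table name for illustration.
table_client = TableClient.from_connection_string(conn_str="<connection_string>", table_name="myTable")

# MERGE keeps properties already stored on the entity and adds or updates only those supplied here.
table_client.update_entity(mode=UpdateMode.MERGE, entity={'PartitionKey': 'color', 'RowKey': 'brand', 'price': '6'})

# REPLACE overwrites the stored entity with exactly the properties supplied here.
table_client.upsert_entity(mode=UpdateMode.REPLACE, entity={'PartitionKey': 'color', 'RowKey': 'brand', 'text': 'Marker', 'price': '6'})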
Examples
The following sections provide several code snippets covering some of the most common Table tasks, including:
Creating a table
Create a table in your account and get a TableClient to perform operations on the newly created table:
from azure.data.tables import TableServiceClient
table_service_client = TableServiceClient.from_connection_string(conn_str="<connection_string>")
table_name = "myTable"
table_client = table_service_client.create_table(table_name=table_name)
Creating entities
Create entities in the table:
from azure.data.tables import TableServiceClient
from datetime import datetime
PRODUCT_ID = u'001234'
PRODUCT_NAME = u'RedMarker'
my_entity = {
    u'PartitionKey': PRODUCT_NAME,
    u'RowKey': PRODUCT_ID,
    u'Stock': 15,
    u'Price': 9.99,
    u'Comments': u"great product",
    u'OnSale': True,
    u'ReducedPrice': 7.99,
    u'PurchaseDate': datetime(1973, 10, 4),
    u'BinaryRepresentation': b'product_name'
}
table_service_client = TableServiceClient.from_connection_string(conn_str="<connection_string>")
table_client = table_service_client.get_table_client(table_name="myTable")
entity = table_client.create_entity(entity=my_entity)
Querying entities
Querying entities in the table:
from azure.data.tables import TableClient
my_filter = "PartitionKey eq 'RedMarker'"
table_client = TableClient.from_connection_string(conn_str="<connection_string>", table_name="mytable")
entities = table_client.query_entities(my_filter)
for entity in entities:
    for key in entity.keys():
        print("Key: {}, Value: {}".format(key, entity[key]))
Optional Configuration
Optional keyword arguments can be passed in at the client and per-operation level. The azure-core reference documentation describes available configurations for retries, logging, transport protocols, and more.
Retry Policy configuration
Use the following keyword arguments when instantiating a client to configure the retry policy:
- retry_total (int): Total number of retries to allow. Takes precedence over other counts. Pass in retry_total=0 if you do not want to retry on requests. Defaults to 10.
- retry_connect (int): How many connection-related errors to retry on. Defaults to 3.
- retry_read (int): How many times to retry on read errors. Defaults to 3.
- retry_status (int): How many times to retry on bad status codes. Defaults to 3.
- retry_to_secondary (bool): Whether the request should be retried to secondary, if able. This should only be enabled if RA-GRS accounts are used and potentially stale data can be handled. Defaults to False.
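For instance, a minimal sketch that sets the retry policy at client construction (the connection string is a placeholder):
from azure.data.tables import TableServiceClient

# Placeholder connection string for illustration.
service_client = TableServiceClient.from_connection_string(
    conn_str="<connection_string>",
    retry_total=5,      # overall retry budget
    retry_connect=2,    # retries on connection-related errors
    retry_read=2,       # retries on read errors
    retry_status=2,     # retries on bad status codes
)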
Other client / per-operation configuration
The following optional configuration keyword arguments can be specified on the client or per-operation.
Client keyword arguments:
- connection_timeout (int): Optionally sets the connect and read timeout value, in seconds.
- transport (Any): User-provided transport to send the HTTP request.
Per-operation keyword arguments:
- raw_response_hook (callable): The given callback uses the response returned from the service.
- raw_request_hook (callable): The given callback uses the request before being sent to service.
- client_request_id (str): Optional user specified identification of the request.
- user_agent (str): Appends the custom value to the user-agent header to be sent with the request.
- logging_enable (bool): Enables logging at the DEBUG level. Defaults to False. Can also be passed in at the client level to enable it for all requests.
- headers (dict): Pass in custom headers as key, value pairs. E.g. headers={'CustomValue': value}
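As a rough sketch of combining client-level and per-operation keyword arguments (the connection string, table name, and header value are placeholders):
from azure.data.tables import TableClient

# Placeholder connection string and table name; connection_timeout applies to the whole client.
table_client = TableClient.from_connection_string(
    conn_str="<connection_string>",
    table_name="myTable",
    connection_timeout=20,
)

# Per-operation keyword arguments ride along with the individual call.
entity = table_client.get_entity(
    partition_key="color",
    row_key="brand",
    logging_enable=True,                       # DEBUG logging for just this request
    headers={"CustomHeader": "custom_value"},  # custom headers as key, value pairs
)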
Troubleshooting
General
Azure Data Tables clients raise exceptions defined in Azure Core.
When you interact with the Azure Data Tables library using the Python SDK, errors returned by the service correspond to the same HTTP status codes returned for REST API requests. Table service operations will throw an HttpResponseError on failure with helpful error codes. For example, if you try to create a table that already exists, a 409 error is returned indicating "Conflict".
from azure.data.tables import TableServiceClient
from azure.core.exceptions import HttpResponseError
table_name = 'YourTableName'
service_client = TableServiceClient.from_connection_string(conn_str="<connection_string>")
# Create the table if it does not already exist
tc = service_client.create_table_if_not_exists(table_name)
try:
    service_client.create_table(table_name)
except HttpResponseError:
    print("Table with name {} already exists".format(table_name))
Logging
This library uses the standard logging library for logging. Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO level.
Detailed DEBUG level logging, including request/response bodies and unredacted headers, can be enabled on a client with the logging_enable argument:
import sys
import logging
from azure.data.tables import TableServiceClient
# Create a logger for the 'azure' SDK
logger = logging.getLogger('azure')
logger.setLevel(logging.DEBUG)
# Configure a console output
handler = logging.StreamHandler(stream=sys.stdout)
logger.addHandler(handler)
# This client will log detailed information about its HTTP sessions, at DEBUG level
service_client = TableServiceClient.from_connection_string("your_connection_string", logging_enable=True)
Similarly, logging_enable can enable detailed logging for a single operation, even when it is not enabled for the client:
service_client.create_entity(entity=my_entity, logging_enable=True)
Next steps
Get started with our Table samples.
Several Azure Data Tables Python SDK samples are available to you in the SDK's GitHub repository. These samples provide example code for additional scenarios commonly encountered while working with Tables.
Common Scenarios
These code samples show common scenario operations with the Azure Data Tables client library. The async versions of the samples (the Python sample files appended with _async) show asynchronous operations with Tables and require Python 3.5 or later.
- Create and delete tables: sample_create_delete_table.py (async version)
- List and query tables: sample_query_tables.py (async version)
- Insert and delete entities: sample_insert_delete_entities.py (async version)
- Query and list entities: sample_query_table.py (async version)
- Update, upsert, and merge entities: sample_update_upsert_merge_entities.py (async version)
- Committing many requests in a single transaction: sample_batching.py (async version)
Additional documentation
For more extensive documentation on Azure Data Tables, see the Azure Data Tables documentation on docs.microsoft.com.
Known Issues
A list of currently known issues relating to Cosmos DB table endpoints can be found here.
Contributing
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.
When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
Release History
12.0.0 (2021-06-08)
Breaking
- EdmType.Binary data in entities will now be deserialized as bytes in Python 3 and str in Python 2, rather than an EdmProperty instance. Likewise on serialization, bytes in Python 3 and str in Python 2 will be interpreted as binary (this is unchanged for Python 3, but breaking for Python 2, where str was previously serialized as EdmType.String).
- TableClient.create_table now returns an instance of TableItem.
- All optional parameters for model constructors are now keyword-only.
- Storage service configuration models have now been prefixed with Table, including TableAccessPolicy, TableMetrics, TableRetentionPolicy, and TableCorsRule.
- All parameters for TableServiceClient.set_service_properties are now keyword-only.
- The credential parameter for all clients is now keyword-only.
- The method TableClient.get_access_policy will now return None where previously it returned an "empty" access policy object.
- Timestamp properties on TableAccessPolicy instances returned from TableClient.get_access_policy will now be deserialized to datetime instances.
Fixes
- Fixed support for Cosmos emulator endpoint, via URL/credential or connection string.
- Fixed table name from URL parsing in the TableClient.from_table_url classmethod.
- The account_name attribute on clients will now be pulled from an AzureNamedKeyCredential if one is used.
- Any additional OData metadata is returned in the entity's metadata.
- The timestamp in entity metadata is now deserialized to a timestamp.
- If the prefer header is added in the create_entity operation, the echo will be returned.
- Errors raised on a 412 if-not-match error will now be a specific azure.core.exceptions.ResourceModifiedError.
- EdmType.DOUBLE values are now explicitly typed in the request payload.
- Fixed de/serialization of list attributes on TableCorsRule.
12.0.0b7 (2021-05-11)
Breaking
- The account_url parameter in the client constructors has been renamed to endpoint.
- The TableEntity object now acts exclusively like a dictionary, and no longer supports key access via attributes.
- Metadata of an entity is now accessed via the TableEntity.metadata attribute rather than a method.
- Removed explicit LinearRetry and ExponentialRetry in favor of keyword parameters.
- Renamed the filter parameter in query APIs to query_filter.
- The location_mode attribute on clients is now read-only. This has been added as a keyword parameter to the constructor.
- TableItem.table_name has been renamed to TableItem.name.
- Removed the TableClient.create_batch method along with the TableBatchOperations object. Transactional batching is now supported via a simple Python list of tuples.
- TableClient.send_batch has been renamed to TableClient.submit_transaction.
- Removed the BatchTransactionResult object in favor of returning an iterable of batched entities with returned metadata.
- Removed batching context-manager behavior.
- EntityProperty is now a NamedTuple, and can be represented by a tuple of (entity, EdmType).
- Renamed EntityProperty.type to EntityProperty.edm_type.
- BatchErrorException has been renamed to TableTransactionError.
- The location_mode is no longer a public attribute on the clients.
- The only supported credentials are AzureNamedKeyCredential, AzureSasCredential, or authentication by connection string.
- Removed date and api_version from the TableItem class.
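As an illustrative sketch of the list-of-tuples transaction style described above (the connection string, table name, and entities are placeholders):
from azure.data.tables import TableClient

# Placeholder connection string and table name for illustration.
table_client = TableClient.from_connection_string(conn_str="<connection_string>", table_name="myTable")

operations = [
    ("create", {"PartitionKey": "color", "RowKey": "marker", "text": "Marker"}),
    ("upsert", {"PartitionKey": "color", "RowKey": "crayon", "text": "Crayon"}),
]

# All operations in the list are committed as a single transaction.
table_client.submit_transaction(operations)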
Fixes
- Fixed issue with Cosmos merge operations.
- Removed legacy Storage policies from pipeline.
- Removed unused legacy client-side encryption attributes from client classes.
- Fixed sharing of pipeline between service/table clients.
- Added support for Azurite storage emulator.
- Throws a RequestTooLargeError on transaction requests that return a 413 error code.
- Added support for Int64 and Binary types in query filters.
- Added support for the select keyword parameter to TableClient.get_entity().
- On update_entity and delete_entity, if no etag is supplied via kwargs, the etag in the entity will be used if present.
12.0.0b6 (2021-04-06)
- Updated deserialization of datetime fields in entities to support preservation of the service format with additional decimal place.
- Passing a string parameter into a query filter will now be escaped to protect against injection.
- Fixed bug in incrementing retries in async retry policy
12.0.0b5 (2021-03-09)
- This version and all future versions will require Python 2.7 or Python 3.6+; Python 3.5 is no longer supported.
- Adds SAS credential as an authentication option.
- Bumps minimum requirement of azure-core to 1.10.0.
- Bumped minimum requirement of msrest from 0.6.10 to 0.6.19.
- Adds support for datetime entities with milliseconds.
- Adds support for Shared Access Signature authentication.
12.0.0b4 (2021-01-12)
- Fixes an issue where the parameters kwarg of query_entities would not work with multiple parameters or with non-string parameters. This now works with multiple parameters and numeric, string, boolean, UUID, and datetime objects.
- Fixes an issue where delete_entity will return a ClientAuthenticationError when the '@' symbol is included in the entity.
12.0.0b3 (2020-11-12)
- Add support for transactional batching of entity operations.
- Fixed deserialization bug in list_tables and query_tables where TableItem.table_name was an object instead of a string.
- Fixed issue where unrecognized entity data fields were silently ignored. They will now raise a TypeError.
- Fixed issue where query filter parameters were being ignored (#15094).
12.0.0b2 (2020-10-07)
- Adds support for Enumerable types by converting the Enum to a string before sending to the service
12.0.0b1 (2020-09-08)
This is the first beta of the azure-data-tables client library. The Azure Tables client library can seamlessly target either Azure Table storage or Azure Cosmos DB table service endpoints with no code changes.