
Microsoft Azure Schema Registry Avro Encoder Client Library for Python

Project description

Azure Schema Registry Avro Encoder client library for Python

Azure Schema Registry is a schema repository service hosted by Azure Event Hubs, providing schema storage, versioning, and management. This package provides an Avro encoder capable of encoding and decoding payloads containing Schema Registry schema identifiers and Avro-encoded content.

Source code | Package (PyPI) | API reference documentation | Samples | Changelog

Disclaimer

Azure SDK Python packages' support for Python 2.7 ended on 01 January 2022. For more information and questions, please refer to https://github.com/Azure/azure-sdk-for-python/issues/20691

Getting started

Install the package

Install the Azure Schema Registry Avro Encoder client library and Azure Identity client library for Python with pip:

pip install azure-schemaregistry-avroencoder azure-identity

Prerequisites:

To use this package, you must have:

  • Python 3.6 or later.
  • An Azure subscription.
  • An Azure Schema Registry resource with a schema group.

Authenticate the client

Interaction with the Schema Registry Avro Encoder starts with an instance of the AvroEncoder class, which takes the schema group name and a SchemaRegistryClient instance. The client constructor takes the Event Hubs fully qualified namespace and an Azure Active Directory credential:

  • The fully qualified namespace of the Schema Registry instance should follow the format: <yournamespace>.servicebus.windows.net.

  • An AAD credential that implements the TokenCredential protocol should be passed to the constructor. There are implementations of the TokenCredential protocol available in the azure-identity package. To use the credential types provided by azure-identity, please install the Azure Identity client library for Python with pip:

pip install azure-identity

  • Additionally, to use the async API, you must first install an async transport, such as aiohttp:

pip install aiohttp

Create AvroEncoder using the azure-schemaregistry library:

from azure.schemaregistry import SchemaRegistryClient
from azure.schemaregistry.encoder.avroencoder import AvroEncoder
from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()
# Namespace should be similar to: '<your-eventhub-namespace>.servicebus.windows.net'
fully_qualified_namespace = '<< FULLY QUALIFIED NAMESPACE OF THE SCHEMA REGISTRY >>'
group_name = '<< GROUP NAME OF THE SCHEMA >>'
schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, credential)
encoder = AvroEncoder(client=schema_registry_client, group_name=group_name)
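
If you plan to use the async API (after installing aiohttp as described above), the encoder can be constructed from the async clients in the same way. Below is a minimal sketch, assuming the async classes live under the corresponding aio namespaces and support async context management, as in other Azure SDK packages:

import asyncio

from azure.identity.aio import DefaultAzureCredential
from azure.schemaregistry.aio import SchemaRegistryClient
from azure.schemaregistry.encoder.avroencoder.aio import AvroEncoder

async def main():
    # Assumed import paths: the aio submodules are expected to mirror the sync API above.
    credential = DefaultAzureCredential()
    fully_qualified_namespace = '<< FULLY QUALIFIED NAMESPACE OF THE SCHEMA REGISTRY >>'
    group_name = '<< GROUP NAME OF THE SCHEMA >>'
    schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, credential)
    encoder = AvroEncoder(client=schema_registry_client, group_name=group_name)
    # Assumes all three objects support "async with" for cleanup.
    async with credential, schema_registry_client, encoder:
        ...  # awaited encode/decode calls go here

asyncio.run(main())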

Key concepts

AvroEncoder

Provides an API to encode to and decode from Avro Binary Encoding, plus a content type containing the schema ID. Uses SchemaRegistryClient to get schema IDs from schema content or vice versa.

Supported message models

Support has been added to certain Azure Messaging SDK model classes for interoperability with the AvroEncoder. These models are subtypes of the MessageType protocol defined under the azure.schemaregistry.encoder.avroencoder namespace. Currently, the supported model classes are:

  • azure.eventhub.EventData for azure-eventhub==5.9.0b2
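
Any other message class can interoperate with the encoder as long as it satisfies the MessageType protocol. The following is a minimal sketch of a hypothetical custom class; it assumes, based on the protocol methods named in the release history below, that the encoder builds messages via a from_message_content classmethod and reads them back via __message_content__:

class CustomMessage:
    """Hypothetical message class sketching the MessageType protocol; not part of the library."""

    def __init__(self, content, content_type):
        self._content = content
        self._content_type = content_type

    @classmethod
    def from_message_content(cls, content, content_type, **kwargs):
        # Assumed to be called by AvroEncoder.encode with the Avro-encoded bytes
        # and the "avro/binary+<schema ID>" content type.
        return cls(content, content_type)

    def __message_content__(self):
        # Assumed to be called by AvroEncoder.decode to read the content back out.
        return {"content": self._content, "content_type": self._content_type}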

Message format

If a message type that follows the MessageType protocol is provided to the encoder, it will set the corresponding content and content type properties as follows:

  • content: Avro payload (in general, format-specific payload)

    • Avro Binary Encoding
    • NOT Avro Object Container File, which includes the schema and defeats the purpose of this encoder to move the schema out of the message payload and into the schema registry.
  • content type: a string of the format avro/binary+<schema ID>, where:

    • avro/binary is the format indicator
    • <schema ID> is the hexadecimal representation of the GUID, in the same format and byte order as the string from the Schema Registry service.

If a message type or callback function is not provided, then by default the encoder will create the following dict: {"content": <Avro encoded payload>, "content_type": 'avro/binary+<schema ID>' }
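
For illustration, a consumer that is not using the decoder could recover the format indicator and schema ID from this content type with ordinary string handling. The helper below is a hypothetical sketch, not part of the library:

# Hypothetical helper: splits "avro/binary+<schema ID>" into its two parts.
def split_content_type(content_type):
    format_indicator, _, schema_id = content_type.rpartition("+")
    return format_indicator, schema_id

fmt, schema_id = split_content_type("avro/binary+00000000000000000000000000000000")
assert fmt == "avro/binary"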

Examples

The following sections provide several code snippets covering some of the most common Schema Registry tasks, including:

  • Encoding
  • Decoding
  • Event Hubs Sending Integration
  • Event Hubs Receiving Integration

Encoding

Use the AvroEncoder.encode method to encode dict content with the given Avro schema. The method uses a schema previously registered to the Schema Registry service and keeps the schema cached for future encoding usage. It is also possible to skip pre-registering the schema and have the encode method register it automatically by instantiating the AvroEncoder with the keyword argument auto_register_schemas=True (see the sketch after the snippet below).

import os
from azure.schemaregistry import SchemaRegistryClient
from azure.schemaregistry.encoder.avroencoder import AvroEncoder
from azure.identity import DefaultAzureCredential
from azure.eventhub import EventData

token_credential = DefaultAzureCredential()
fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE']
group_name = "<your-group-name>"
name = "example.avro.User"
format = "Avro"

definition = """
{"namespace": "example.avro",
 "type": "record",
 "name": "User",
 "fields": [
     {"name": "name", "type": "string"},
     {"name": "favorite_number",  "type": ["int", "null"]},
     {"name": "favorite_color", "type": ["string", "null"]}
 ]
}"""

schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, token_credential)
schema_registry_client.register_schema(group_name, name, definition, format)
encoder = AvroEncoder(client=schema_registry_client, group_name=group_name)

with encoder:
    dict_content = {"name": "Ben", "favorite_number": 7, "favorite_color": "red"}
    event_data = encoder.encode(dict_content, schema=definition, message_type=EventData)

    # OR

    message_content_dict = encoder.encode(dict_content, schema=definition)
    event_data = EventData.from_message_content(message_content_dict["content"], message_content_dict["content_type"])
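
Alternatively, as noted above, constructing the AvroEncoder with auto_register_schemas=True removes the need to call register_schema beforehand. A short sketch reusing schema_registry_client, group_name, definition, and dict_content from the snippet above:

# With auto_register_schemas=True, the encoder registers the schema on first use.
auto_encoder = AvroEncoder(
    client=schema_registry_client,
    group_name=group_name,
    auto_register_schemas=True,
)

with auto_encoder:
    event_data = auto_encoder.encode(dict_content, schema=definition, message_type=EventData)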

Decoding

Use the AvroEncoder.decode method to decode bytes into dict content by either:

  • Passing in a message object that is a subtype of the MessageType protocol.
  • Passing in a dict with keys content (type bytes) and content_type (type string).

The method automatically retrieves the schema from the Schema Registry service and keeps the schema cached for future decoding usage.

import os
from azure.schemaregistry import SchemaRegistryClient
from azure.schemaregistry.encoder.avroencoder import AvroEncoder
from azure.identity import DefaultAzureCredential

token_credential = DefaultAzureCredential()
fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE']
group_name = "<your-group-name>"

schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, token_credential)
encoder = AvroEncoder(client=schema_registry_client, group_name=group_name)

with encoder:
    # event_data is an EventData object with Avro encoded body
    decoded_content = encoder.decode(event_data)

    # OR 

    encoded_bytes = b'<content_encoded_by_azure_schema_registry_avro_encoder>'
    content_type = 'avro/binary+<schema_id_of_corresponding_schema>'
    content_dict = {"content": encoded_bytes, "content_type": content_type}
    decoded_content = encoder.decode(content_dict)

Event Hubs Sending Integration

Integration with Event Hubs to send encoded Avro dict content as the body of EventData.

import os
from azure.eventhub import EventHubProducerClient, EventData
from azure.schemaregistry import SchemaRegistryClient
from azure.schemaregistry.encoder.avroencoder import AvroEncoder
from azure.identity import DefaultAzureCredential

token_credential = DefaultAzureCredential()
fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE']
group_name = "<your-group-name>"
eventhub_connection_str = os.environ['EVENT_HUB_CONN_STR']
eventhub_name = os.environ['EVENT_HUB_NAME']

definition = """
{"namespace": "example.avro",
 "type": "record",
 "name": "User",
 "fields": [
     {"name": "name", "type": "string"},
     {"name": "favorite_number",  "type": ["int", "null"]},
     {"name": "favorite_color", "type": ["string", "null"]}
 ]
}"""

schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, token_credential)
avro_encoder = AvroEncoder(client=schema_registry_client, group_name=group_name, auto_register_schemas=True)

eventhub_producer = EventHubProducerClient.from_connection_string(
    conn_str=eventhub_connection_str,
    eventhub_name=eventhub_name
)

with eventhub_producer, avro_encoder:
    event_data_batch = eventhub_producer.create_batch()
    dict_content = {"name": "Bob", "favorite_number": 7, "favorite_color": "red"}
    event_data = avro_encoder.encode(dict_content, schema=definition, message_type=EventData)
    event_data_batch.add(event_data)
    eventhub_producer.send_batch(event_data_batch)

Event Hubs Receiving Integration

Integration with Event Hubs to receive EventData and decode the raw bytes into Avro dict content.

import os
from azure.eventhub import EventHubConsumerClient
from azure.schemaregistry import SchemaRegistryClient
from azure.schemaregistry.encoder.avroencoder import AvroEncoder
from azure.identity import DefaultAzureCredential

token_credential = DefaultAzureCredential()
fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE']
group_name = "<your-group-name>"
eventhub_connection_str = os.environ['EVENT_HUB_CONN_STR']
eventhub_name = os.environ['EVENT_HUB_NAME']

schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, token_credential)
avro_encoder = AvroEncoder(client=schema_registry_client, group_name=group_name)

eventhub_consumer = EventHubConsumerClient.from_connection_string(
    conn_str=eventhub_connection_str,
    consumer_group='$Default',
    eventhub_name=eventhub_name,
)

def on_event(partition_context, event):
    decoded_content = avro_encoder.decode(event)

with eventhub_consumer, avro_encoder:
    eventhub_consumer.receive(on_event=on_event, starting_position="-1")

Troubleshooting

General

Azure Schema Registry Avro Encoder raises exceptions defined in Azure Core.
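
For example, a failed request to the service can be handled with the shared azure-core exception types. The snippet below is a minimal sketch reusing the encoder and variables from the encoding example above; the specific exception caught is illustrative, not exhaustive:

from azure.core.exceptions import HttpResponseError

try:
    event_data = encoder.encode(dict_content, schema=definition, message_type=EventData)
except HttpResponseError as exc:
    # Raised for error responses returned by the Schema Registry service.
    print(f"Encoding failed: {exc.message}")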

Logging

This library uses the standard logging library for logging. Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO level.

Detailed DEBUG level logging, including request/response bodies and unredacted headers, can be enabled on a client with the logging_enable argument:

import sys
import logging
from azure.schemaregistry import SchemaRegistryClient
from azure.schemaregistry.encoder.avroencoder import AvroEncoder
from azure.identity import DefaultAzureCredential

# Create a logger for the SDK
logger = logging.getLogger('azure.schemaregistry')
logger.setLevel(logging.DEBUG)

# Configure a console output
handler = logging.StreamHandler(stream=sys.stdout)
logger.addHandler(handler)

credential = DefaultAzureCredential()
schema_registry_client = SchemaRegistryClient("<your-fully_qualified_namespace>", credential, logging_enable=True)
# This client will log detailed information about its HTTP sessions, at DEBUG level
encoder = AvroEncoder(client=schema_registry_client, group_name="<your-group-name>")

Similarly, logging_enable can enable detailed logging for a single operation, even when it isn't enabled for the client:

encoder.encode(dict_content, schema=schema_definition, logging_enable=True)

Next steps

More sample code

Please find further examples in the samples directory demonstrating common Azure Schema Registry Avro Encoder scenarios.

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

Release History

1.0.0b2 (2022-03-09)

Features Added

  • request_options has been added to encode and decode on AvroEncoder as an optional parameter to be passed into client requests.
  • The size of the current schema/schema ID caches will be logged at an info level when a new entry has been added.

Breaking Changes

  • MessageMetadataDict has been renamed MessageContent.
  • data in MessageContent has been renamed content.
  • The data parameter in encode and decode on the sync and async AvroEncoder has been renamed content.
  • The from_message_data method in the MessageType protocol has been renamed from_message_content. The data parameter in from_message_content has been renamed content.
  • The __message_data__ method in the MessageType protocol has been renamed __message_content__.

Other Changes

  • This beta release will be backward compatible for decoding data that was encoded with the AvroSerializer.
  • The encode and decode methods on AvroEncoder support the following message models:
    • azure.eventhub.EventData in azure-eventhub==5.9.0b2

1.0.0b1 (2022-02-09)

This version and all future versions will require Python 3.6+. Python 2.7 is no longer supported.

Features Added

  • This package is meant to replace the azure-schemaregistry-avroserializer.
  • APIs have been updated to allow for encoding directly to and decoding from message type objects, where the data value is the Avro encoded payload.
  • The content type of the message will hold the schema ID and record format indicator.

Other Changes

  • This beta release will be backward compatible for decoding data that was encoded with the AvroSerializer.
  • The encode and decode methods on AvroEncoder support the following message models:
    • azure.eventhub.EventData in azure-eventhub==5.9.0b1

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

azure-schemaregistry-avroencoder-1.0.0b2.zip (68.8 kB)

Uploaded Source

Built Distribution

azure_schemaregistry_avroencoder-1.0.0b2-py3-none-any.whl (27.3 kB)

File details

Details for the file azure-schemaregistry-avroencoder-1.0.0b2.zip.

File metadata

  • Download URL: azure-schemaregistry-avroencoder-1.0.0b2.zip
  • Upload date:
  • Size: 68.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/33.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.8 tqdm/4.63.0 importlib-metadata/4.11.2 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.10.2

File hashes

Hashes for azure-schemaregistry-avroencoder-1.0.0b2.zip
Algorithm Hash digest
SHA256 3b633709831a40a17f4c2f5632b8cda099b36b1967d3488d518764e35603508a
MD5 9f5ef11e7549c560d5f8fc6fe732ac47
BLAKE2b-256 290c66ff8afd7ce55606e7ac7f2a17e82ba80c183ad84650635317e4cc00626c

See more details on using hashes here.

File details

Details for the file azure_schemaregistry_avroencoder-1.0.0b2-py3-none-any.whl.

File metadata

  • Download URL: azure_schemaregistry_avroencoder-1.0.0b2-py3-none-any.whl
  • Upload date:
  • Size: 27.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/33.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.8 tqdm/4.63.0 importlib-metadata/4.11.2 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.10.2

File hashes

Hashes for azure_schemaregistry_avroencoder-1.0.0b2-py3-none-any.whl
Algorithm Hash digest
SHA256 b292dc14cedb15fdfa42de7e88624046b95cd5d8299afcab2f41ecbbb8e5f9de
MD5 8a744d2512557d2006b083012591dd12
BLAKE2b-256 b63794b862babf7da29f78149bdb31f9ef16ea610a49f9a7523120f3b467f9b2

See more details on using hashes here.
