Microsoft Azure IoT Models Repository Library
Azure IoT Models Repository client library for Python
The Azure IoT Models Repository Library for Python provides functionality for working with the Azure IoT Models Repository.
Getting started
Install package
Install the Azure IoT Models Repository library for Python with pip:
pip install azure-iot-modelsrepository
Prerequisites
- A models repository following Azure IoT conventions
- The models repository can be hosted on the local filesystem or hosted on a webserver
- Azure IoT hosts the global Azure IoT Models Repository, which the client will use if no custom location is provided
Publishing Models
Follow the guide to publish models to the global Azure IoT Models Repository.
If using a custom local or remote repository, you can add your model files to a directory structure under the repository location, e.g. dtmi/com/example/thermostat-1.json
Authentication
Currently, no authentication mechanisms are supported. The global endpoint is not tied to an Azure subscription and does not support authentication. All models published are meant for anonymous public consumption.
Key concepts
The Azure IoT Models Repository enables builders to manage and share digital twin models. The models are JSON-LD documents defined using the Digital Twins Definition Language (DTDL).
The repository defines a pattern to store DTDL interfaces in a directory structure based on the Digital Twin Model Identifier (DTMI). You can locate an interface in the repository by converting the DTMI to a relative path. For example, the DTMI dtmi:com:example:Thermostat;1 translates to /dtmi/com/example/thermostat-1.json.
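As a rough illustration of this convention (the dtmi_conventions module described below handles it for you), the mapping amounts to lowercasing the DTMI, replacing ':' with '/' and ';' with '-'. The helper below is a hypothetical sketch, not part of the library:
def dtmi_to_path(dtmi):
    # Illustrative only: lowercase, ':' -> '/', ';' -> '-', then add '.json'
    return "/" + dtmi.lower().replace(":", "/").replace(";", "-") + ".json"

print(dtmi_to_path("dtmi:com:example:Thermostat;1"))
# Prints: "/dtmi/com/example/thermostat-1.json"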
Examples
The following sections provide several snippets covering common Models Repository tasks:
Initializing the ModelsRepositoryClient
Repository Location
When no repository location is provided during instantiation, the Azure IoT Models Repository global endpoint (https://devicemodels.azure.com/) is used:
from azure.iot.modelsrepository import ModelsRepositoryClient

client = ModelsRepositoryClient()
Alternatively, you can provide a custom location for where your repository is located via the optional repository_location keyword. The client accepts the following location formats:
- Web URL - e.g. "https://contoso.com/models/"
- Local Filesystem URI - e.g. "file:///path/to/repository/"
- POSIX filepath - e.g. "/path/to/repository/"
- Drive letter filepath - e.g. "C:/path/to/repository/"
client = ModelsRepositoryClient(repository_location="https://contoso.com/models/")
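The same keyword works for local repositories; for example, a client pointed at a repository on disk (the path here is just a placeholder):
client = ModelsRepositoryClient(repository_location="/path/to/repository/")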
Dependency Resolution Mode
The client can be configured with an optional dependency_resolution mode at instantiation, using one of the following values:
- 'disabled' - The client will not resolve model dependencies
- 'enabled' - The client will resolve any model dependencies
- 'tryFromExpanded' - The client will attempt to resolve models using an expanded model definition (falling back on 'enabled' mode if not possible)
client = ModelsRepositoryClient(dependency_resolution="enabled")
If the dependency_resolution mode is not specified:
- Clients configured for the Azure IoT Models Repository global endpoint will default to using 'tryFromExpanded'
- Clients configured for a custom location (remote or local) will default to using 'enabled'
Additional Options
If you need to override default pipeline behavior from the azure-core library, you can provide various keyword arguments during instantiation.
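For instance, the standard azure-core pipeline keyword arguments such as retry and logging settings can typically be passed straight through (a sketch assuming those common azure-core keywords apply to this client):
from azure.iot.modelsrepository import ModelsRepositoryClient

# Assumed: standard azure-core pipeline options accepted as keyword arguments
client = ModelsRepositoryClient(
    retry_total=5,        # total number of retries for failed requests
    logging_enable=True,  # emit verbose HTTP logging at DEBUG level
)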
Client cleanup
When you are finished with your client, make sure to call .close() in order to free up resources:
client = ModelsRepositoryClient()
# Do things
client.close()
To avoid having to do this, it is recommended that you use your client from within a context manager whenever possible, which will close it for you automatically:
with ModelsRepositoryClient() as client:
# Do things
ModelsRepositoryClient - Get Models
Note that you must first publish models to your repository before you can fetch them. The following examples assume you are using the global Azure IoT Models Repository.
Calling .get_models() will fetch the model at the provided DTMI and potentially its dependencies (depending on the dependency resolution mode). It will return a dict that maps DTMIs to model definitions.
dtmi = "dtmi:com:example:TemperatureController;1"
with ModelsRepositoryClient() as client:
models = get_models(dtmi)
print("{} resolved in {} interfaces".format(dtmi, len(models)))
If you provide multiple DTMIs to the method, you can retrieve multiple models (and potentially their dependencies) at once:
dtmis = ["dtmi:com:example:TemperatureController;1", "dtmi:com:example:azuresphere:sampledevice;1"]
with ModelsRepositoryClient() as client:
    models = client.get_models(dtmis)
    print("{} DTMIs resolved in {} interfaces".format(len(dtmis), len(models)))
By default, the client will use whichever dependency resolution mode it was configured with at instantiation when retrieving models. However, this behavior can be overridden by passing any of the valid options as an optional keyword argument to .get_models():
dtmi = "dtmi:com:example:TemperatureController;1"
with ModelsRepositoryClient(dependency_resolution="disabled") as client:
    models = client.get_models(dtmi, dependency_resolution="enabled")
DTMI Conventions
The package contains a module called dtmi_conventions, which, when imported, provides a series of utility operations for working with DTMIs:
from azure.iot.modelsrepository import dtmi_conventions

# Returns True - this is a valid DTMI
dtmi_conventions.is_valid_dtmi("dtmi:com:example:Thermostat;1")

# Returns False - this is NOT a valid DTMI
dtmi_conventions.is_valid_dtmi("dtmi:com:example:Thermostat")
dtmi = "dtmi:com:example:Thermostat;1"
# Local repository example
repo_uri = "file:///path/to/repository"
print(dtmi_conventions.get_model_uri(dtmi, repo_uri))
# Prints: "file:///path/to/repository/dtmi/com/example/thermostat-1.json"
print(dtmi_conventions.get_model_uri(dtmi, repo_uri, expanded=True))
# Prints: "file:///path/to/repository/dtmi/com/example/thermostat-1.expanded.json"
# Remote repository example
repo_uri = "https://contoso.com/models/"
print(dtmi_conventions.get_model_uri(dtmi, repo_uri))
# Prints: "https://contoso/com/models/dtmi/com/example/thermostat-1.json"
print(dtmi_conventions.get_model_uri(dtmi, repo_uri, expanded=True))
# Prints: "https://contoso/com/models/dtmi/com/example/thermostat-1.expanded.json"
Troubleshooting
Logging
This library uses the standard logging library for logging. Information about HTTP sessions (URLs, headers, etc.) is logged at DEBUG level.
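For example, a minimal configuration that surfaces those DEBUG-level messages on the console (how you route them is up to you):
import logging

# Route DEBUG-level output, including HTTP session details, to the console
logging.basicConfig(level=logging.DEBUG)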
Exceptions
Models Repository APIs may raise exceptions defined in azure-core.
Additionally, they may raise exceptions defined in azure-iot-modelsrepository:
- ModelError - Indicates an error occurred while trying to parse/resolve a model definition. This generally means that there is a malformed model that does not comply with the DTDL specification.
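A sketch of handling both kinds of errors when fetching a model (the specific azure-core exception caught here, AzureError, is an assumption about the base type raised on service/transport failures):
from azure.core.exceptions import AzureError
from azure.iot.modelsrepository import ModelsRepositoryClient, ModelError

dtmi = "dtmi:com:example:Thermostat;1"
with ModelsRepositoryClient() as client:
    try:
        models = client.get_models(dtmi)
    except ModelError:
        # The model (or one of its dependencies) was malformed or could not be resolved
        print("Malformed model definition for {}".format(dtmi))
    except AzureError as error:
        # Assumed base class for azure-core errors (e.g. HTTP/service failures)
        print("Request failed: {}".format(error))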
Provide Feedback
If you encounter bugs or have suggestions, please open an issue.
Next steps
Samples
Additional samples are available in the samples repository.
Contributing
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.
When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
File details
Details for the file azure-iot-modelsrepository-1.0.0b1.zip.
File metadata
- Download URL: azure-iot-modelsrepository-1.0.0b1.zip
- Upload date:
- Size: 32.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.9.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | 86ebfaa37684ad751b425b42666aadf518be805a39df96fd0e770b7484c286fd
MD5 | b50f329ee9cead98134b33a23ad62539
BLAKE2b-256 | 719db460013a7e295110c7d418e5590accefb87976646ccab999dd8806dabe40
File details
Details for the file azure_iot_modelsrepository-1.0.0b1-py2.py3-none-any.whl.
File metadata
- Download URL: azure_iot_modelsrepository-1.0.0b1-py2.py3-none-any.whl
- Upload date:
- Size: 14.0 kB
- Tags: Python 2, Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.9.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | 9f6a72a008a0cb93706a3a64117b376de6f4fa2bbb109510d4bbb2ee536f80d7
MD5 | 85d899d2e284d99f3e8473d8e1e0e3f0
BLAKE2b-256 | 696cc51df36f3b819734f3c19bfb615018b2e134481fca62e2d8e3e24676dfa6