Azure Machine Learning Inference HTTP Server (azureml-inference-server-http)
Why use Azure Machine Learning Inference HTTP Server?
Enables Local Development
The local inference server allows users to quickly debug their scoring script. If the scoring script contains a bug, the server will fail to initialize or serve requests and will instead surface an exception along with the location in the code where the issue occurred.
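For reference, a scoring script for this server typically exposes an `init()` and a `run()` function. The sketch below is illustrative only; the `AZUREML_MODEL_DIR` environment variable, model file name, and payload shape are assumptions, not details from this page:

```python
# score.py -- minimal scoring script sketch (adapt to your model).
import json
import os

model = None


def init():
    # Called once when the server starts. AZUREML_MODEL_DIR is assumed to
    # point at the directory passed via model_dir; load your real model here.
    global model
    model_dir = os.getenv("AZUREML_MODEL_DIR", ".")
    model = os.path.join(model_dir, "model.pkl")  # placeholder "load"


def run(raw_data):
    # Called for each scoring request with the raw request body; the return
    # value is serialized back to the client.
    data = json.loads(raw_data)
    return {"echo": data}
```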
Enables CI/CD Integration
The local inference server makes it easy to create consistent validation gates in CI/CD pipelines: start the server with the candidate scoring script and run the test suite against the local endpoint, as shown in the sketch below.
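For example, after starting the server with `azmlinfsrv --entry_script score.py` (see the FAQ below), a gate could run a test like the following against the local endpoint. This is a sketch that assumes the default port and a `run()` function that accepts the JSON payload shown:

```python
# test_scoring.py -- run against a locally started azmlinfsrv instance.
import requests

BASE_URL = "http://127.0.0.1:5001"


def test_score_endpoint_returns_200():
    # Payload shape depends on your run() implementation; this is illustrative.
    payload = {"data": [[1, 2, 3]]}
    response = requests.post(f"{BASE_URL}/score", json=payload, timeout=10)
    assert response.status_code == 200
```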
Server Routes
Here's a list of the available routes:
| Name | Route |
|---|---|
| Liveness Probe | 127.0.0.1:5001/ |
| Score | 127.0.0.1:5001/score |
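As an illustration, a deployment or test script might poll the liveness route before sending traffic. This sketch assumes the default port shown above:

```python
import time

import requests


def wait_until_live(url="http://127.0.0.1:5001/", retries=30, delay=1.0):
    # Poll the liveness probe until the server answers or we give up.
    for _ in range(retries):
        try:
            if requests.get(url, timeout=2).ok:
                return True
        except requests.ConnectionError:
            pass
        time.sleep(delay)
    return False
```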
Server Arguments
Here's a list of the parameters:
| Parameter | Required | Default | Description |
|---|---|---|---|
| entry_script | True | N/A | The relative or absolute path to the scoring script. |
| model_dir | False | N/A | The relative or absolute path to the directory holding the model used for inferencing. |
| port | False | 5001 | The serving port of the server. |
| worker_count | False | 1 | The number of worker threads which will process concurrent requests. |
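These parameters can also be supplied when launching the server from a script, for example via `subprocess`. The sketch below assumes the CLI flags mirror the parameter names in the table; only `--entry_script` appears verbatim elsewhere on this page:

```python
import subprocess

# Start the server; flag names other than --entry_script are assumed to
# match the parameter names in the table above.
server = subprocess.Popen([
    "azmlinfsrv",
    "--entry_script", "score.py",
    "--model_dir", "./model",
    "--port", "5001",
    "--worker_count", "1",
])

# ... run tests against http://127.0.0.1:5001/score ...

server.terminate()
server.wait()
```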
FAQ
Do I need to reload the server when changing the score script?
After changing your scoring script (`score.py`), stop the server with `Ctrl + C`, then restart it with `azmlinfsrv --entry_script score.py`.
Which OS is supported?
The Azure Machine Learning inference server runs on both Windows and Linux operating systems.