
Azure Machine Learning inferencing server.

Project description

Azure Machine Learning Inference HTTP Server (azureml-inference-server-http)

Why use Azure Machine Learning Inference HTTP Server?

Enables Local Development

The local inference server allows users to quickly debug their scoring script. If the scoring script has a bug, the server fails to initialize or serve requests and instead raises an exception that reports the line of code where the issue occurred.
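
For example, a minimal scoring script exposes the init() and run() functions that the server loads. The sketch below is illustrative: the payload shape and the echo logic are assumptions, and run() is assumed to receive the raw request body as a string.

    import json

    model = None

    def init():
        # Called once at server startup; load the model here.
        global model
        model = "placeholder"  # hypothetical: replace with real model loading

    def run(raw_data):
        # Called for each request to /score.
        data = json.loads(raw_data)  # raw_data: request body as a JSON string
        return {"echo": data}

If init() or run() raises, the server surfaces the failure at startup or per request rather than silently serving a broken endpoint.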

Enables CI/CD Integration

The local inference server makes it easy to create validation gates in CI/CD pipelines: start the server with the candidate scoring script and run the test suite against the local endpoint.
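
A validation gate can be as simple as a smoke test against the local endpoint, for example with the requests library (a sketch; the payload shape depends on your scoring script):

    import requests

    # Assumes the server is already running, e.g.:
    #   azmlinfsrv --entry_script score.py
    resp = requests.post(
        "http://127.0.0.1:5001/score",
        json={"data": [[1, 2, 3]]},  # hypothetical payload
        timeout=10,
    )
    assert resp.status_code == 200, resp.text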

Server Routes

Here's a list of the available routes:

Name             Route
Liveness Probe   127.0.0.1:5001/
Score            127.0.0.1:5001/score
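
For example, from the command line (the scoring payload is a hypothetical example; match it to your scoring script):

    curl http://127.0.0.1:5001/
    curl -X POST -H "Content-Type: application/json" -d '{"data": [[1, 2, 3]]}' http://127.0.0.1:5001/score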

Server Arguments

Here's a list of the parameters:

Parameter      Required   Default   Description
entry_script   True       N/A       The relative or absolute path to the scoring script.
model_dir      False      N/A       The relative or absolute path to the directory holding the model used for inferencing.
port           False      5001      The serving port of the server.
worker_count   False      1         The number of worker threads which will process concurrent requests.
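
For example, assuming each parameter maps to a command-line flag of the same name (as --entry_script does elsewhere in this document), a fully specified invocation might look like this, with placeholder paths:

    azmlinfsrv --entry_script ./score.py --model_dir ./model --port 8080 --worker_count 2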

FAQ

Do I need to reload the server when changing the score script?

Yes. After changing your scoring script (score.py), stop the server with Ctrl+C, then restart it with azmlinfsrv --entry_script score.py.

Which operating systems are supported?

The Azure Machine Learning inference server runs on Windows and Linux.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distribution

File details

Details for the file azureml_inference_server_http-0.1.5-py3-none-any.whl.

File metadata

  • Download URL: azureml_inference_server_http-0.1.5-py3-none-any.whl
  • Upload date:
  • Size: 37.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.11.0 pkginfo/1.7.0 requests/2.22.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.8.5

File hashes

Hashes for azureml_inference_server_http-0.1.5-py3-none-any.whl
Algorithm    Hash digest
SHA256       25fd523cc46b030810f8e7797652c0a8a9109593d900c01f52aa49df32d04524
MD5          2c044a7a80bd3a763142befd7da7ec72
BLAKE2b-256  6efdec2318369317d8ffc6421095cdb9354f269c7d2b8cbf736a00c55c0ea715

See more details on using hashes here.
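
For example, to check a downloaded wheel against the SHA256 digest above using Python's standard hashlib:

    import hashlib

    path = "azureml_inference_server_http-0.1.5-py3-none-any.whl"
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    print(digest)  # should match the SHA256 value in the table above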
