Azure Machine Learning inferencing server.

Project description

Azure Machine Learning Inference HTTP Server (azureml-inference-server-http)

Why use Azure Machine Learning Inference HTTP Server?

Enables Local Development

The local inference server allows users to quickly debug their scoring script. If the scoring script has a bug, the server will fail to initialize or serve and will instead report an exception along with the line of code where the issue occurred.
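
For reference, here is a minimal sketch of a scoring script the server can load, assuming the standard Azure Machine Learning init()/run() contract; the toy model and payload shape are illustrative stand-ins, not part of this package's API.

    # score.py - a minimal scoring script following the init()/run() contract.
    import json

    model = None

    def init():
        # Runs once at server startup. A bug raised here fails initialization,
        # and the server reports the exception and the offending line.
        global model
        model = lambda xs: [x * 2 for x in xs]  # stand-in for a real model load

    def run(raw_data):
        # Runs on every request to /score; raw_data is the raw request body.
        data = json.loads(raw_data)["data"]
        return {"result": model(data)}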

Enables CI/CD Integration

The local inference server makes it easy to create validation gates in a CI/CD pipeline: boot up the server with the candidate scoring script and run the test suite against the local endpoint (see the sketch after the routes table below).

Server Routes

Here's a list of the available routes:

Name             Route
Liveness Probe   127.0.0.1:5001/
Score            127.0.0.1:5001/score
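
As a sketch of such a validation gate, the snippet below exercises both routes with the requests library against a locally running server; the JSON payload shape is an assumption, since the expected body depends entirely on the scoring script.

    # test_endpoint.py - a minimal validation gate for a server started with:
    #   azmlinfsrv --entry_script score.py
    # The payload shape is an assumption; match it to your scoring script.
    import json
    import requests

    BASE = "http://127.0.0.1:5001"

    def test_liveness():
        # The root route doubles as the liveness probe.
        assert requests.get(BASE + "/").status_code == 200

    def test_score():
        payload = json.dumps({"data": [1, 2, 3]})
        resp = requests.post(
            BASE + "/score",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        assert resp.status_code == 200

Run these with pytest once the server is up; a failing scoring script then blocks the pipeline before deployment.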

Server Arguments

Here's a list of the parameters:

Parameter      Required   Default   Description
entry_script   True       N/A       The relative or absolute path to the scoring script.
model_dir      False      N/A       The relative or absolute path to the directory holding the model used for inferencing.
port           False      5001      The serving port of the server.
worker_count   False      1         The number of worker threads that will process concurrent requests.
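
Putting the parameters together, a full invocation might look like the line below; --entry_script is confirmed by the FAQ further down, while the spellings of the other flags are assumed to mirror the parameter names above.

    azmlinfsrv --entry_script score.py --model_dir ./model --port 5001 --worker_count 1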

FAQ

Do I need to reload the server when changing the score script?

Yes. After changing your scoring script (score.py), stop the server with Ctrl+C, then restart it with azmlinfsrv --entry_script score.py.

Which OS is supported?

The Azure Machine Learning inference server runs on Windows- and Linux-based operating systems.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distribution

File details

Details for the file azureml_inference_server_http-0.1.10-py3-none-any.whl.

File metadata

  • Download URL: azureml_inference_server_http-0.1.10-py3-none-any.whl
  • Upload date:
  • Size: 39.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.11.0 pkginfo/1.7.0 requests/2.22.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.61.0 CPython/3.8.5

File hashes

Hashes for azureml_inference_server_http-0.1.10-py3-none-any.whl
Algorithm     Hash digest
SHA256        f63c7aac2cc77b799461aeb1320c15129a33a9c4971fd240c6475f96deb4ecfc
MD5           27be9d5f9765ce510e3cfc939bdb6339
BLAKE2b-256   320700e75b3dfaf693a341cf2a3168bc48e3dd65e3079f4bd04e39a3032b8110
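
As a quick sketch of how these digests can be used, the script below recomputes the SHA256 of a locally downloaded wheel and compares it against the published value; the local filename is an assumption about where you saved the file.

    # verify_wheel.py - check a downloaded wheel against the published SHA256.
    import hashlib

    EXPECTED = "f63c7aac2cc77b799461aeb1320c15129a33a9c4971fd240c6475f96deb4ecfc"
    WHEEL = "azureml_inference_server_http-0.1.10-py3-none-any.whl"  # assumed local path

    with open(WHEEL, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()

    assert digest == EXPECTED, "hash mismatch: " + digest
    print("OK: wheel matches the published SHA256 digest")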

