
Azure Machine Learning inferencing server.

Project description

Azure Machine Learning Inference HTTP Server (azureml-inference-server-http)

Why use the Azure Machine Learning Inference HTTP Server?

Enables Local Development

The local inference server lets users quickly debug their scoring script. If the script has a bug, the server will fail to initialize or to serve requests, and will instead raise an exception that points to the line of code where the issue occurred.
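For illustration, here is a minimal sketch of a scoring script, assuming the usual Azure ML contract of an init() function called once at startup and a run() function called per request (the echo response is just a placeholder):

    import json

    def init():
        # Called once when the server starts; load the model here.
        # (When --model_dir is supplied, its path is typically exposed
        # through the AZUREML_MODEL_DIR environment variable.)
        pass

    def run(data):
        # Called for every request to /score; `data` is assumed here
        # to be the raw request body.
        payload = json.loads(data)
        return {"echo": payload}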

Enables CI/CD Integration

The local inference server makes it easy to build consistent validation gates into CI/CD pipelines: start the server with the candidate scoring script and run the test suite against the local endpoint.
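As a sketch of such a gate (the tests/ directory and the 30-second startup budget are assumptions for illustration):

    import subprocess
    import sys
    import time

    import requests

    # Start the server with the candidate scoring script.
    server = subprocess.Popen(["azmlinfsrv", "--entry_script", "score.py"])
    try:
        # Poll the liveness route until the server is ready.
        for _ in range(30):
            try:
                if requests.get("http://127.0.0.1:5001/").ok:
                    break
            except requests.ConnectionError:
                time.sleep(1)
        else:
            sys.exit("Server did not become live in time.")

        # Run the test suite against the local endpoint.
        result = subprocess.run([sys.executable, "-m", "pytest", "tests/"])
        sys.exit(result.returncode)
    finally:
        server.terminate()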

Server Routes

Here's a list of the available routes:

Name            Route
Liveness Probe  127.0.0.1:5001/
Score           127.0.0.1:5001/score
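For example, with the server running locally (the request payload is arbitrary; its shape depends on what your run() function expects):

    import requests

    # Liveness probe: a 2xx response means the server is up.
    print(requests.get("http://127.0.0.1:5001/").status_code)

    # Score: the request body is handed to run() in the scoring script.
    resp = requests.post("http://127.0.0.1:5001/score", json={"data": [1, 2, 3]})
    print(resp.status_code, resp.text)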

Server Arguments

Here's a list of the parameters:

Parameter     Required  Default  Description
entry_script  True      N/A      The relative or absolute path to the scoring script.
model_dir     False     N/A      The relative or absolute path to the directory holding the model used for inferencing.
port          False     5001     The serving port of the server.
worker_count  False     1        The number of worker threads that will process concurrent requests.
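For example, assuming the parameters are passed as flags in the same form as --entry_script (the ./model directory and port 8080 are placeholders):

    azmlinfsrv --entry_script score.py --model_dir ./model --port 8080 --worker_count 2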

FAQ

Do I need to reload the server when changing the score script?

After changing your scoring script (score.py), stop the server with Ctrl+C, then restart it with azmlinfsrv --entry_script score.py.

Which OS is supported?

The Azure Machine Learning inference server runs on Windows and Linux.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files available for this release. See tutorial on generating distribution archives.

Built Distribution

File details

Details for the file azureml_inference_server_http-0.1.9-py3-none-any.whl.

File metadata

  • Download URL: azureml_inference_server_http-0.1.9-py3-none-any.whl
  • Upload date:
  • Size: 38.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.11.0 pkginfo/1.7.0 requests/2.22.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.61.0 CPython/3.8.5

File hashes

Hashes for azureml_inference_server_http-0.1.9-py3-none-any.whl

Algorithm    Hash digest
SHA256       948e49e0ca5090c02e267bed24dcbc3063680af8a921b0ecd791720c253ab17a
MD5          825e091bba94f0b02b9162607c3db33a
BLAKE2b-256  c390b45234cf490cb6a6694565595a89204d20dc7ac552f5fe240a0430f8663c

See more details on using hashes here.
