
Azure Machine Learning Inference HTTP Server

Project description

Check our official documentation here.

Python 3.6 Deprecation

  • Python 3.6 support on Windows is dropped starting with azureml-inference-server-http v0.4.12 in order to pick up waitress v2.1.1, which contains the security fix for CVE-2022-24761.

  • Python 3.6 support on Mac, Linux, and WSL2 is not affected by this change for now.

  • Python 3.6 support on all platforms will be dropped in December 2022 (previously: June 15, 2022).

CORS support

Cross-origin resource sharing (CORS) is a way to allow resources on a webpage to be requested from another domain. CORS works via HTTP headers sent with the client request and returned with the service response. For more information on CORS and valid headers, see the Cross-origin resource sharing article on Wikipedia.

Users can specify the domains allowed to access the service through the AML_CORS_ORIGINS environment variable, as a comma-separated list of domains, such as www.microsoft.com, www.bing.com. While discouraged, it can also be set to * to allow access from all domains. CORS is disabled if this environment variable is not set.
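
For example, a minimal sketch of enabling CORS for two domains when launching the server locally (the scoring script name score.py is illustrative; azmlinfsrv is the server's CLI entry point):

    # Allow cross-origin requests from these two domains only.
    export AML_CORS_ORIGINS="www.microsoft.com, www.bing.com"

    # Start the server against a scoring script (path is illustrative).
    azmlinfsrv --entry_script score.py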

The existing approach of using @rawhttp to set CORS headers is not affected, and can still be used if you need more granular control of CORS (such as the need to specify other CORS headers). See here for an example.
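
As a sketch of that more granular approach (the allowed origin and response payload are placeholders, and the azureml.contrib.services imports assume the azureml-contrib-services package is installed):

    from azureml.contrib.services.aml_request import AMLRequest, rawhttp
    from azureml.contrib.services.aml_response import AMLResponse

    def init():
        pass

    @rawhttp
    def run(request: AMLRequest):
        # Build the response, then attach whichever CORS headers you need.
        response = AMLResponse({"result": "ok"}, 200, json_str=True)
        response.headers["Access-Control-Allow-Origin"] = "https://www.microsoft.com"
        response.headers["Access-Control-Allow-Headers"] = "Content-Type"
        return response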

Changelog

0.7.4 (2022-07-29)

Fixes

  • Fixed an issue where the server would require arguments that have default values in run().
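
For context, this applies to scoring scripts whose run() declares parameters with default values, for example (a hypothetical signature; how request fields map to parameters depends on your inference-schema decorators):

    # score.py (illustrative)
    def init():
        pass

    # "threshold" has a default value, so requests are no longer required to supply it.
    def run(data, threshold=0.5):
        return {"received": data, "threshold": threshold}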

0.7.3 (2022-07-18)

Features

  • CORS can be enabled with the environment variable AML_CORS_ORIGINS. Refer to the README for detailed usage.

  • The server can now be started with python -m azureml_inference_server_http in addition to azmlinfsrv.

  • OPTIONS calls now return 200 OK instead of the previous 405 Method Not Allowed.

  • Users can bring their own swagger files by placing swagger2.json and swagger3.json in AML_APP_ROOT (see the sketch below).
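
As a rough sketch of the launch options and the bring-your-own-swagger layout above (the directory /var/azureml-app is only an assumed value of AML_APP_ROOT, score.py is illustrative, and the module entry point is assumed to accept the same arguments as azmlinfsrv):

    # Assumed AML_APP_ROOT layout with user-supplied swagger files:
    #   /var/azureml-app/score.py
    #   /var/azureml-app/swagger2.json
    #   /var/azureml-app/swagger3.json

    # Either entry point starts the same server.
    azmlinfsrv --entry_script score.py
    python -m azureml_inference_server_http --entry_script score.py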

Enhancements

  • Swaggers are now always generated, regardless of whether the user’s run() function is decorated with inference-schema.

  • The x-request-id and x-client-request-id headers are now limited to 100 characters.

Fixes

  • Fixed an issue that prevented the server from cleanly exiting when the scoring script could not be initialized. When AppInsights was not enabled, this surfaced as AttributeError: 'AppInsightsClient' object has no attribute 'logger'.

0.7.2 (2022-06-06)

Enhancements

  • Added support for Flask 2.1.

  • The server now responds with a 400 Bad Request when it receives invalid input.

0.7.1 (2022-05-10)

Deprecation

  • The “x-ms-request-id” header is deprecated and is being replaced by “x-request-id”. Until “x-ms-request-id” is removed, the server will accept either header and respond with both headers set to the same request id. Providing two different request ids through these headers is not allowed and will result in a 400 Bad Request.
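
A quick way to observe this, sketched with the requests library against a locally running server (the default local port 5001, the /score path, and the request body are assumptions about your deployment):

    import requests

    # Send a client-chosen request id; the GUID below is just an example.
    headers = {"x-request-id": "11111111-2222-3333-4444-555555555555"}
    resp = requests.post("http://127.0.0.1:5001/score",
                         json={"data": [1, 2, 3]}, headers=headers)

    # During the deprecation period both headers come back with the same id.
    print(resp.headers.get("x-request-id"))
    print(resp.headers.get("x-ms-request-id"))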

Enhancements

  • Added support for Flask 2.0. A compatibility layer is introduced to ensure this upgrade doesn’t break users who use @rawhttp, as the methods on the Flask request object have changed slightly. Specifically,

    • request.headers.has_keys() was removed

    • request.json throws an exception if the content-type is not “application/json”. Previously it returned None.

    The compatibility layer restores these behaviors. However, it will be removed at a future date, and users are encouraged to audit their score scripts today. To see whether your score script is ready for Flask 2, run the server with the environment variable AML_FLASK_ONE_COMPATIBILITY set to false. A sketch of a Flask 2-ready score script follows this list.

    Flask’s full changelog can be found here: https://flask.palletsprojects.com/en/2.1.x/changes/

  • Added support for the “x-request-id” and “x-client-request-id” headers. A new GUID is generated for “x-request-id” if one is not provided. These values are echoed back to the client in the response headers.
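
For the Flask 2 changes above, a @rawhttp score script can avoid both deprecated patterns like this (a sketch; the header name x-custom-header and the response shapes are illustrative):

    from azureml.contrib.services.aml_request import AMLRequest, rawhttp
    from azureml.contrib.services.aml_response import AMLResponse

    def init():
        pass

    @rawhttp
    def run(request: AMLRequest):
        # Flask 2 removed request.headers.has_keys(); membership tests still work.
        if "x-custom-header" not in request.headers:
            return AMLResponse("missing x-custom-header", 400)

        # Flask 2 raises when request.json is used with a non-JSON content type;
        # get_json(silent=True) returns None instead, like the old behavior.
        body = request.get_json(silent=True)
        if body is None:
            return AMLResponse("expected an application/json body", 400)

        return AMLResponse(body, 200, json_str=True)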



Download files


Source Distributions

No source distribution files are available for this release.

Built Distribution

File details

Details for the file azureml_inference_server_http-0.7.4-py3-none-any.whl.

File metadata

  • Download URL: azureml_inference_server_http-0.7.4-py3-none-any.whl
  • Upload date:
  • Size: 56.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.11.0 pkginfo/1.8.3 requests/2.22.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.64.0 CPython/3.8.10

File hashes

Hashes for azureml_inference_server_http-0.7.4-py3-none-any.whl

  • SHA256: 97d1a9f97829a5babd4f651caa37936a006da070f23dff308e95ff6abd3c1447
  • MD5: 3eedbb05a8e1d47838a5f718887d8ee6
  • BLAKE2b-256: 3106fc89ba8570632451872b4cd57f9f4c998799f8c36460532797bb41ac29b6

