Azure Machine Learning inferencing server.

Project description

Check our official documentation here.

Python 3.6 Deprecation

  • Python 3.6 support on Windows is dropped starting with azureml-inference-server-http v0.4.12 in order to pick up waitress v2.1.1, which contains the security fix for CVE-2022-24761.

  • Python 3.6 support on Mac, Linux, and WSL2 is not affected by the above change for now.

  • Python 3.6 support on all platforms will be dropped in December 2022 (previously June 15, 2022).

Changelog

0.7.2 (2022-06-06)

Enhancements

  • Added support for Flask 2.1.

  • The server now responds with a 400 Bad Request when it receives invalid input.
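    As a client-side illustration only, one example of an invalid input is a body that claims to be JSON but cannot be parsed; against a locally running server (assumed here to be on the default port 5001 with the /score route) such a request now comes back as a 400:

        import requests

        # Body declares application/json but is not parseable JSON.
        resp = requests.post(
            "http://127.0.0.1:5001/score",
            data="{not valid json",
            headers={"Content-Type": "application/json"},
        )

        print(resp.status_code)  # expected: 400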

0.7.1 (2022-05-10)

Deprecation

  • The “x-ms-request-id” header is deprecated and is being replaced by “x-request-id”. Until “x-ms-request-id” is removed, the server accepts either header and responds with both headers set to the same request id. Providing two different request ids through these headers is not allowed and is rejected with a 400 Bad Request.
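    A client-side sketch of the transition behavior, assuming a server running locally on the default port 5001 with the /score route and a score script that accepts the hypothetical payload below:

        import uuid

        import requests

        request_id = str(uuid.uuid4())

        # During the deprecation window the legacy header is still accepted.
        resp = requests.post(
            "http://127.0.0.1:5001/score",
            json={"data": [[1, 2, 3]]},  # hypothetical payload; depends on your score script
            headers={"x-ms-request-id": request_id},
        )

        # Both headers come back set to the same request id.
        print(resp.headers.get("x-ms-request-id"))
        print(resp.headers.get("x-request-id"))

    Sending both headers with two different ids is the case the server rejects with a 400.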

Enhancements

  • Added support for Flask 2.0. A compatibility layer is introduced to ensure this upgrade doesn’t break users of @rawhttp, as the methods on the Flask request object have changed slightly. Specifically:

    • request.headers.has_key() was removed.

    • request.json throws an exception if the content-type is not “application/json”. Previously it returned None.

    The compatibility layer restores these functionalities to their previous behaviors. However, the compatibility layer will be removed at a future date, and users are encouraged to audit their score scripts today; a Flask 2-ready sketch is shown after this list. To check whether your score script is ready for Flask 2, run the server with the environment variable AML_FLASK_ONE_COMPATIBILITY set to false.

    Flask’s full changelog can be found here: https://flask.palletsprojects.com/en/2.1.x/changes/
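    As a rough, non-authoritative sketch of what such an audit produces, the following score script is written directly against the Flask 2 behaviors listed above, so it works the same with or without the compatibility layer. The init/run structure and the azureml.contrib.services imports follow the usual @rawhttp pattern; the model loading and scoring logic are placeholders.

        import json

        from azureml.contrib.services.aml_request import rawhttp
        from azureml.contrib.services.aml_response import AMLResponse


        def init():
            # Placeholder: load the model here.
            pass


        @rawhttp
        def run(request):
            # Flask 2: request.headers.has_key() is gone; use "in" instead.
            if "x-my-custom-header" in request.headers:
                pass  # placeholder for any header-specific handling

            # Flask 2: request.json raises when the content type is not
            # "application/json"; get_json(silent=True) returns None instead,
            # matching the old request.json behavior.
            payload = request.get_json(silent=True)
            if payload is None:
                return AMLResponse("Expected a JSON request body", 400)

            # Placeholder scoring logic: echo the payload back.
            return AMLResponse(json.dumps({"echo": payload}), 200)

    To exercise the script without the compatibility shim, start the server with the variable mentioned above, for example AML_FLASK_ONE_COMPATIBILITY=false azmlinfsrv --entry_script score.py, and confirm the behavior is unchanged.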

  • Added support for the “x-request-id” and “x-client-request-id” headers. A new GUID is generated for “x-request-id” if one is not provided. These values are echoed back to the client in the response headers.
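    A short sketch of the new headers from the client’s side, under the same local-server assumptions as above: no x-request-id is sent, so the server generates a GUID for it, while the supplied x-client-request-id is echoed back verbatim.

        import requests

        resp = requests.post(
            "http://127.0.0.1:5001/score",
            json={"data": [[1, 2, 3]]},  # hypothetical payload; depends on your score script
            headers={"x-client-request-id": "my-trace-id-001"},
        )

        print(resp.headers.get("x-request-id"))         # server-generated GUID
        print(resp.headers.get("x-client-request-id"))  # "my-trace-id-001"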

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distribution

File details

Details for the file azureml_inference_server_http-0.7.2-py3-none-any.whl.

File metadata

  • Download URL: azureml_inference_server_http-0.7.2-py3-none-any.whl
  • Upload date:
  • Size: 55.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.11.0 pkginfo/1.8.2 requests/2.22.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.64.0 CPython/3.8.10

File hashes

Hashes for azureml_inference_server_http-0.7.2-py3-none-any.whl

  • SHA256: 412bab9647b06ade8bd3a38d70ebde6c604dcc1dbabbfc05a260e8e79a86cede
  • MD5: e4aa861fead9810378f088580bc826e5
  • BLAKE2b-256: 618800256ff90b9c739ebeb342a405fe87e2ac6b04e773393a3d35a0364d04e2

See more details on using hashes here.
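To verify a downloaded wheel against the SHA256 digest above, a quick standard-library check along these lines works (the filename assumes the wheel sits in the current directory):

    import hashlib

    expected = "412bab9647b06ade8bd3a38d70ebde6c604dcc1dbabbfc05a260e8e79a86cede"

    with open("azureml_inference_server_http-0.7.2-py3-none-any.whl", "rb") as f:
        actual = hashlib.sha256(f.read()).hexdigest()

    # Prints True when the local file matches the published SHA256 digest.
    print(actual == expected)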
