ONNX Runtime is a runtime accelerator for Machine Learning models

Project description

OpenVINO™ Execution Provider for ONNX Runtime is a product designed for ONNX Runtime developers who want to get started with OpenVINO™ in their inferencing applications. This product delivers OpenVINO™ inline optimizations which enhance inferencing performance with minimal code modifications.

OpenVINO™ Execution Provider for ONNX Runtime accelerates inference across many AI models on a variety of Intel® hardware such as:
  • Intel® CPUs

  • Intel® integrated GPUs

  • Intel® Movidius™ Vision Processing Units (VPUs)

Installation

Requirements

  • Ubuntu 18.04 or 20.04, RHEL (CPU only), or Windows 10 (64-bit)

  • Python 3.7, 3.8, or 3.9 on Linux; Python 3.9 only on Windows

This package supports:
  • Intel® CPUs

  • Intel® integrated GPUs

  • Intel® Movidius™ Vision Processing Units (VPUs).

Please note: for VAD-M, use the Docker installation or build from source on Linux.

pip3 install onnxruntime-openvino==1.13.1
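After installing the wheel above, a quick sanity check confirms that the OpenVINO™ Execution Provider is registered with ONNX Runtime. This is a sketch: the helper function name is illustrative, and it simply returns False when the package is not installed.

```python
# Post-install check (a sketch): True when onnxruntime-openvino is installed
# and the OpenVINO execution provider is registered with ONNX Runtime.
def openvino_available():
    try:
        import onnxruntime as ort
    except ImportError:
        return False  # onnxruntime-openvino is not installed
    return "OpenVINOExecutionProvider" in ort.get_available_providers()

print(openvino_available())
```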

On Windows, please install the OpenVINO™ PyPI package separately. For Windows installation instructions, refer to OpenVINO™ Execution Provider for ONNX Runtime for Windows.

The OpenVINO™ Execution Provider for ONNX Runtime Linux wheels come with pre-built OpenVINO™ 2022.2.0 libraries, eliminating the need to install OpenVINO™ separately. These libraries are prebuilt with the CXX11_ABI flag set to 0.

The package also includes a module used by torch-ort-inference to accelerate inference for PyTorch models with the OpenVINO™ Execution Provider. See torch-ort-inference for more details.

For more details on build and installation please refer to Build.

Usage

By default, inference runs on the Intel® CPU. You can instead target an Intel® integrated GPU or Intel® VPU by setting the device type argument in the provider configuration.

For more API calls and environment variables, see Usage.

Samples

To see what you can do with OpenVINO™ Execution Provider for ONNX Runtime, explore the demos in Examples.

License

OpenVINO™ Execution Provider for ONNX Runtime is licensed under MIT. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.

Support

Please submit your questions, feature requests and bug reports via GitHub Issues.

How to Contribute

We welcome community contributions to OpenVINO™ Execution Provider for ONNX Runtime. If you have an idea for an improvement, please share it through GitHub Issues or a pull request.

Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distributions

  • onnxruntime_openvino-1.13.1-cp39-cp39-win_amd64.whl (4.6 MB) - CPython 3.9, Windows x86-64

  • onnxruntime_openvino-1.13.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (41.8 MB) - CPython 3.9, manylinux: glibc 2.17+ x86-64

  • onnxruntime_openvino-1.13.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (41.8 MB) - CPython 3.8, manylinux: glibc 2.17+ x86-64

  • onnxruntime_openvino-1.13.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (41.8 MB) - CPython 3.7m, manylinux: glibc 2.17+ x86-64

File hashes

onnxruntime_openvino-1.13.1-cp39-cp39-win_amd64.whl
  SHA256      5920b43fe9d18567433e20e134ed969875f489ccae835d47e480776bd832dc5f
  MD5         b44ed02f4733c21884e1e4f8c08fdbeb
  BLAKE2b-256 349966aacc18d54cdc4d843697aa3a098bc492bcd66419f495128c2132d4c85d

onnxruntime_openvino-1.13.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  SHA256      6946b2790e8e713825284fef88b3e455708ff4d04ce1172066adeccf907d9f1c
  MD5         db375eeef026097d09d6711f2da6cf37
  BLAKE2b-256 f18efd8ae3bd50cad5c03b3faa7d895bb997dec7d2c345f6bf0b297fca825482

onnxruntime_openvino-1.13.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  SHA256      016673faa926ded928a0bf9a6ac3e48ea082d1309d35886472be0e432633a744
  MD5         c1242ed00f5fe1d2fb6a299584cd67bf
  BLAKE2b-256 3a140deb12815f1343e973dbaf29021360465bc1f9f3d3a09bb7c392ee79a964

onnxruntime_openvino-1.13.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  SHA256      f4cf4a805801496938adb85056c6c3b7c035aa8682b3ccd9df981e4407c973c3
  MD5         aa55e4186e1af4736b5b71a848848c59
  BLAKE2b-256 3fb6788abc314bf3e5e6bdf070d3d4b4edfaff0d23c79295d9648b028eacf13a
