
ONNX Runtime is a runtime accelerator for machine learning models

Project description

OpenVINO™ Execution Provider for ONNX Runtime is a product designed for ONNX Runtime developers who want to get started with OpenVINO™ in their inferencing applications. It delivers OpenVINO™ inline optimizations that enhance inference performance with minimal code modifications.

OpenVINO™ Execution Provider for ONNX Runtime accelerates inference across many AI models on a variety of Intel® hardware such as:
  • Intel® CPUs

  • Intel® integrated GPUs

  • Intel® Movidius™ Vision Processing Units (VPUs).

Installation

Requirements

  • Ubuntu 18.04 or 20.04, RHEL (CPU only), or Windows 10, all 64-bit

  • Python 3.7, 3.8, or 3.9 for Linux; only Python 3.9 for Windows

This package supports:
  • Intel® CPUs

  • Intel® integrated GPUs

  • Intel® Movidius™ Vision Processing Units (VPUs).

Please note: for VAD-M, use the Docker installation or build from source for Linux.

pip3 install onnxruntime-openvino==1.12.0

For Windows, please install the OpenVINO™ PyPI package separately. For installation instructions on Windows, please refer to OpenVINO™ Execution Provider for ONNX Runtime for Windows.
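For example, assuming you want the OpenVINO™ release that matches these wheels (2022.1.0; confirm the exact version in the Windows instructions linked above):

pip3 install openvino==2022.1.0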

The OpenVINO™ Execution Provider for ONNX Runtime Linux wheels come with pre-built libraries of OpenVINO™ version 2022.1.0, eliminating the need to install OpenVINO™ separately. The OpenVINO™ libraries are prebuilt with the CXX11_ABI flag set to 0.

The package also includes a module that is used by torch-ort-inference to accelerate inference for PyTorch models with the OpenVINO™ Execution Provider. See torch-ort-inference for more details.
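As an illustration, a minimal sketch of wrapping a PyTorch model with torch-ort-inference (the ORTInferenceModule and OpenVINOProviderOptions names follow the torch-ort-inference documentation; check that project for the exact API and option values):

import torch
from torch_ort import ORTInferenceModule, OpenVINOProviderOptions

model = torch.nn.Linear(4, 2)  # any torch.nn.Module works here
# backend/precision values below are examples, not the only options.
provider_options = OpenVINOProviderOptions(backend="CPU", precision="FP32")
model = ORTInferenceModule(model, provider_options=provider_options)
output = model(torch.randn(1, 4))  # inference now runs through OpenVINO™ EP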

For more details on building and installation, please refer to Build.
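After installation, a quick sanity check is to confirm that the provider is registered, using the standard ONNX Runtime Python API:

import onnxruntime as ort

# The list should include 'OpenVINOExecutionProvider' if the wheel installed correctly.
print(ort.get_available_providers())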

Usage

By default, Intel® CPU is used to run inference. However, you can change the default option to either the Intel® integrated GPU or an Intel® VPU for AI inferencing. To change the hardware on which inferencing runs, set the device type argument in the provider configuration.
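For illustration, a minimal sketch that targets an Intel® integrated GPU instead of the default CPU (model.onnx is a placeholder for your model; the full list of supported device_type values, such as CPU_FP32, GPU_FP32, GPU_FP16, and MYRIAD_FP16, is described in Usage):

import onnxruntime as ort

# Route inference through the OpenVINO Execution Provider and select
# the integrated GPU; omit provider_options to keep the CPU default.
session = ort.InferenceSession(
    "model.onnx",
    providers=["OpenVINOExecutionProvider"],
    provider_options=[{"device_type": "GPU_FP32"}],
)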

For more API calls and environment variables, see Usage.

Samples

To see what you can do with OpenVINO™ Execution Provider for ONNX Runtime, explore the demos in Examples.

Docker Support

The latest OpenVINO™ EP Docker image can be downloaded from DockerHub. For more details, see the Docker ReadMe.
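For example (the image name below is an assumption based on the project's DockerHub organization; confirm the exact repository and tag on DockerHub before pulling):

docker pull openvino/onnxruntime_ep_ubuntu18:latest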

Prebuilt Images

  • Prebuilt Docker images for Intel® CPU and Intel® iGPU are available on the OpenVINO™ Execution Provider Release Page.

License

OpenVINO™ Execution Provider for ONNX Runtime is licensed under MIT. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.

Support

Please submit your questions, feature requests and bug reports via GitHub Issues.

How to Contribute

We welcome community contributions to OpenVINO™ Execution Provider for ONNX Runtime. If you have an idea for an improvement, please open a GitHub issue to discuss it.

Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distributions

onnxruntime_openvino-1.12.0-cp39-cp39-win_amd64.whl (4.6 MB): CPython 3.9, Windows x86-64

onnxruntime_openvino-1.12.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (39.7 MB): CPython 3.9, manylinux (glibc 2.17+), x86-64

onnxruntime_openvino-1.12.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (39.7 MB): CPython 3.8, manylinux (glibc 2.17+), x86-64

onnxruntime_openvino-1.12.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (39.7 MB): CPython 3.7m, manylinux (glibc 2.17+), x86-64

File hashes

onnxruntime_openvino-1.12.0-cp39-cp39-win_amd64.whl
  SHA256: 28e53070d8640348aeb737e2feb23a549b0d29bdc87010085c5739a760b3641f
  MD5: 5109d64a8302b34e44707dacec1b42d7
  BLAKE2b-256: 5b5c991eb514441263bf8f89243cd17b221ced7c19e4a97d137ad12fb33c7cf2

onnxruntime_openvino-1.12.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  SHA256: d1b3df0ea13456cd84bd72171badb2adad2db327a25753ec151ead24ea7e8a68
  MD5: dcb22d43948bb471787c489a2c02d29c
  BLAKE2b-256: 6bd0219b9ffb6fac8ac1a5063a8555e8f0ea4422db8d4ee779d0ea6366584ff4

onnxruntime_openvino-1.12.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  SHA256: 99cce6b2c90de4d08199b4225a49ddfee78797e00c95af485b264bd7ef169a44
  MD5: 37b0ca88a6665eb11f7f30dba6c9cb7b
  BLAKE2b-256: 65d3704c492cf95a274808b954c01487b217c51a2123c4ff8ab45e866bf65754

onnxruntime_openvino-1.12.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  SHA256: f988854ed4391531788188c96c4231aff86f5b677e804daca3fd440a15190c29
  MD5: 4037c62e91f9e08084a604f7049899ab
  BLAKE2b-256: e383550bccf148f5c84939e5f2b4c07f755fb2d32594e7ac4b7965cc481e52b1
