
ONNX Runtime is a runtime accelerator for Machine Learning models

Project description

OpenVINO™ Execution Provider for ONNX Runtime is a product designed for ONNX Runtime developers who want to get started with OpenVINO™ in their inferencing applications. This product delivers OpenVINO™ inline optimizations which enhance inferencing performance with minimal code modifications.

OpenVINO™ Execution Provider for ONNX Runtime accelerates inference across many AI models on a variety of Intel® hardware such as:
  • Intel® CPUs

  • Intel® integrated GPUs

  • Intel® discrete GPUs

Installation

Requirements

  • Ubuntu 18.04 or 20.04, RHEL (CPU only), or Windows 10 (64-bit)

  • Python 3.8, 3.9, or 3.10 on Linux; Python 3.10 only on Windows

This package supports:
  • Intel® CPUs

  • Intel® integrated GPUs

  • Intel® discrete GPUs

pip3 install onnxruntime-openvino

On Windows, install the OpenVINO™ PyPI package separately. For Windows installation instructions, refer to OpenVINO™ Execution Provider for ONNX Runtime for Windows.

The OpenVINO™ Execution Provider for ONNX Runtime Linux wheels come with pre-built OpenVINO™ 2023.0.0 libraries, eliminating the need to install OpenVINO™ separately. The OpenVINO™ libraries are prebuilt with the CXX11_ABI flag set to 0.

For more details on build and installation please refer to Build.

Usage

By default, inference runs on the Intel® CPU. To run on an Intel® integrated or discrete GPU instead, pass the device type argument in the provider configuration when creating the inference session.

For more API calls and environment variables, see Usage.

Samples

To see what you can do with OpenVINO™ Execution Provider for ONNX Runtime, explore the demos in the Examples section.

License

OpenVINO™ Execution Provider for ONNX Runtime is licensed under MIT. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.

Support

Please submit your questions, feature requests and bug reports via GitHub Issues.

How to Contribute

We welcome community contributions to OpenVINO™ Execution Provider for ONNX Runtime, and we would love to hear your ideas for improvement.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distributions

  • onnxruntime_openvino-1.16.0-cp310-cp310-win_amd64.whl (5.5 MB): CPython 3.10, Windows x86-64

  • onnxruntime_openvino-1.16.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (49.7 MB): CPython 3.10, manylinux (glibc 2.17+), x86-64

  • onnxruntime_openvino-1.16.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (49.7 MB): CPython 3.9, manylinux (glibc 2.17+), x86-64

  • onnxruntime_openvino-1.16.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (49.7 MB): CPython 3.8, manylinux (glibc 2.17+), x86-64

File hashes

Hashes for onnxruntime_openvino-1.16.0-cp310-cp310-win_amd64.whl:
  • SHA256: 014ba013d9cb07146bfe4dd76da631235eb5caac57845071ac3eb0263686ad1c
  • MD5: 8d2fde07ea36797062471ca0f1e273cd
  • BLAKE2b-256: 98f79bf89bf01af188016a8da344aa975f44a190203497b02e1c8d0c116421e9

Hashes for onnxruntime_openvino-1.16.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:
  • SHA256: 7537ec4d6f444fbc6c5deaa73192ac7297b5184da183623f61b9f9dbc740eaa7
  • MD5: 6dfbcbe5d0992d7fcb846eaeffa6eac6
  • BLAKE2b-256: 77f15015998f7ce2efbb9e7612221043fc260b632d076c090dfb9a31b26f87f1

Hashes for onnxruntime_openvino-1.16.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:
  • SHA256: 9ad290f3618758044a9d05f649f63f1dee01cbb179fa61443fbaaec16a5db903
  • MD5: 52a21860323e86c5c020c65f9e12bc0f
  • BLAKE2b-256: a6b508dc9ebf641cacc6d365c1c3a2f33e5f83b06df06bbb90525e80b7fd3e5b

Hashes for onnxruntime_openvino-1.16.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:
  • SHA256: d67415d1f0bcfff1aec4d4402a3c5effde94c21173af84bc45dc9d6d4eec717c
  • MD5: a9997444d6818cc17f31782fe378782c
  • BLAKE2b-256: d8343a2ef0467aad65df3f0b1e0e2103b1f46a182db688f8ddefd105854f375c
