ONNX Runtime is a runtime accelerator for machine learning models.
Project description
OpenVINO™ Execution Provider for ONNX Runtime is a product designed for ONNX Runtime developers who want to get started with OpenVINO™ in their inferencing applications. This product delivers OpenVINO™ inline optimizations which enhance inferencing performance with minimal code modifications.
OpenVINO™ Execution Provider for ONNX Runtime accelerates inference across many AI models on a variety of Intel® hardware, such as:
- Intel® CPUs
- Intel® integrated GPUs
- Intel® Movidius™ Vision Processing Units (VPUs)
Installation
Requirements
- Ubuntu 18.04, 20.04, RHEL (CPU only), or Windows 10 (64-bit)
- Python 3.7, 3.8, or 3.9

This package supports:
- Intel® CPUs
- Intel® integrated GPUs
- Intel® Movidius™ Vision Processing Units (VPUs)

Please note: for VAD-M, use the Docker installation or build from source for Linux.
```
pip3 install onnxruntime-openvino==1.11.0
```
The Windows release supports only Python 3.9. Please install the OpenVINO™ PyPI package separately on Windows. For installation instructions on Windows, please refer to OpenVINO™ Execution Provider for ONNX Runtime for Windows.
The OpenVINO™ Execution Provider for ONNX Runtime Linux wheels come with pre-built libraries of OpenVINO™ version 2022.1.0, so you do not have to install OpenVINO™ separately. The pre-built OpenVINO™ libraries are compiled with CXX11_ABI set to 0.
For more details on build and installation please refer to Build.
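After installing the wheel, you can confirm that the OpenVINO™ provider was registered. This is a minimal sketch using the standard ONNX Runtime Python API; the import is guarded so the check degrades gracefully when the package is absent.

```python
def openvino_ep_available():
    """Return True if the OpenVINO execution provider is registered.

    Safe to call even when onnxruntime-openvino is not installed: the
    guarded import simply makes the check return False in that case.
    """
    try:
        import onnxruntime as ort
    except ImportError:
        return False
    return "OpenVINOExecutionProvider" in ort.get_available_providers()

print(openvino_ep_available())
```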
Usage
By default, the Intel® CPU is used to run inference. However, you can change the default to either an Intel® integrated GPU or an Intel® VPU by selecting the target device when the inference session is created. For more API calls and environment variables, see Usage.
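As a sketch, device selection can be expressed through the provider options of the inference session. The `device_type` values below are assumptions based on the OpenVINO™ EP documentation for the 1.11 release, and `model.onnx` is a placeholder path; check the Usage docs for the authoritative list.

```python
import os

# Assumed device_type values for onnxruntime-openvino 1.11: "CPU_FP32",
# "GPU_FP32", "GPU_FP16", "MYRIAD_FP16" (VPU). "model.onnx" is a placeholder.
DEVICE = "GPU_FP32"

def make_session(model_path, device=DEVICE):
    """Create an inference session pinned to the OpenVINO provider."""
    import onnxruntime as ort
    return ort.InferenceSession(
        model_path,
        providers=["OpenVINOExecutionProvider"],
        provider_options=[{"device_type": device}],
    )

# Only build a session when a model file is actually present.
if os.path.exists("model.onnx"):
    session = make_session("model.onnx")
```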
Samples
To see what you can do with OpenVINO™ Execution Provider for ONNX Runtime, explore the demos located in the Examples.
Docker Support
The latest OpenVINO™ EP Docker image can be downloaded from Docker Hub. For more details, see the Docker README.
Prebuilt Images
Prebuilt Docker images for Intel® CPU and Intel® iGPU are available on the OpenVINO™ Execution Provider Release Page.
License
OpenVINO™ Execution Provider for ONNX Runtime is licensed under MIT. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.
Support
Please submit your questions, feature requests and bug reports via GitHub Issues.
How to Contribute
We welcome community contributions to OpenVINO™ Execution Provider for ONNX Runtime. If you have an idea for improvement:
- Share your proposal via GitHub Issues.
- Submit a pull request.
Project details
Release history
Download files
Download the file for your platform.
Built Distributions
File details
Details for the file onnxruntime_openvino-1.11.0-cp39-cp39-win_amd64.whl.
File metadata
- Download URL: onnxruntime_openvino-1.11.0-cp39-cp39-win_amd64.whl
- Upload date:
- Size: 4.3 MB
- Tags: CPython 3.9, Windows x86-64
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.0 CPython/3.8.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 236556428cafec13121860df763b1f2aa6446f2b3707868239ddf6c3ee8b29ce
MD5 | 7a1f58adc43d41aebc2f8eb47572eeca
BLAKE2b-256 | 3e3767225aa8eb3f58f6fc16b6a230d9656a58e86b6b9ebd1d8084fa82d7eafa
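You can verify a downloaded wheel against the digests above using Python's standard hashlib; here is a small sketch (the chunked read keeps memory use flat for the ~38 MB Linux wheels):

```python
import hashlib

def file_hashes(path):
    """Compute the SHA256, MD5, and BLAKE2b-256 digests that PyPI lists."""
    digests = {
        "SHA256": hashlib.sha256(),
        "MD5": hashlib.md5(),
        "BLAKE2b-256": hashlib.blake2b(digest_size=32),  # 256-bit BLAKE2b
    }
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB chunks
            for h in digests.values():
                h.update(chunk)
    return {name: h.hexdigest() for name, h in digests.items()}
```

Compare the SHA256 entry of the downloaded file with the table above before installing the wheel.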
File details
Details for the file onnxruntime_openvino-1.11.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.
File metadata
- Download URL: onnxruntime_openvino-1.11.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- Upload date:
- Size: 37.6 MB
- Tags: CPython 3.9, manylinux: glibc 2.17+ x86-64
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.0 CPython/3.8.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 1a6ef223f4764198f22bf93d216519ef28d31da29cf66e01a441aa2c6dbb996b
MD5 | 09d9c2da4dc0cabfd79b365847d355d0
BLAKE2b-256 | 93de098d46e42284ccd63d339024c33992caa84f77d32effea554fa287141bc2
File details
Details for the file onnxruntime_openvino-1.11.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.
File metadata
- Download URL: onnxruntime_openvino-1.11.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- Upload date:
- Size: 37.6 MB
- Tags: CPython 3.8, manylinux: glibc 2.17+ x86-64
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.0 CPython/3.8.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | b14bd86da547e047ba40e7e4fc3786ef71fd087204966105de8287920c210cb4
MD5 | b2bff2aff4fd1ed260f43a0bd0e843aa
BLAKE2b-256 | 68922c2308c27bba87278a42c75642e7f22eb0842fc6d2cf0e40ff67faed50fd
File details
Details for the file onnxruntime_openvino-1.11.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.
File metadata
- Download URL: onnxruntime_openvino-1.11.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- Upload date:
- Size: 37.6 MB
- Tags: CPython 3.7m, manylinux: glibc 2.17+ x86-64
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.0 CPython/3.8.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | e043234003e982bb538f84648a584d013c962f960ebf2f77b8bbc38b2caa8a14
MD5 | 9fb59ed1c9b0d0ee97eba0753587c080
BLAKE2b-256 | 0cf371ab3629e28c4cb8b2f87752199dd8bcf9e689f3d9512abc5b4026535eb9