
ONNX Runtime is a runtime accelerator for Machine Learning models

Project description

OpenVINO™ Execution Provider for ONNX Runtime is a product designed for ONNX Runtime developers who want to get started with OpenVINO™ in their inferencing applications. This product delivers OpenVINO™ inline optimizations which enhance inferencing performance with minimal code modifications.

OpenVINO™ Execution Provider for ONNX Runtime accelerates inference across many AI models on a variety of Intel® hardware such as:
  • Intel® CPUs

  • Intel® integrated GPUs

  • Intel® discrete GPUs

  • Intel® integrated NPUs (Windows only)

Installation

Requirements

  • Ubuntu 18.04, 20.04, RHEL (CPU only), or Windows 10 (64-bit)

  • Python 3.9, 3.10, or 3.11 for Linux; Python 3.10 or 3.11 for Windows

This package supports:
  • Intel® CPUs

  • Intel® integrated GPUs

  • Intel® discrete GPUs

  • Intel® integrated NPUs (Windows only)

pip3 install onnxruntime-openvino

On Windows, the OpenVINO™ PyPI package must be installed separately. For Windows installation instructions, please refer to OpenVINO™ Execution Provider for ONNX Runtime for Windows.

The OpenVINO™ Execution Provider for ONNX Runtime Linux wheels come with pre-built OpenVINO™ 2024.1.0 libraries, eliminating the need to install OpenVINO™ separately.

For more details on build and installation please refer to Build.

Usage

By default, inference runs on the Intel® CPU. You can change this to an Intel® integrated GPU, discrete GPU, or integrated NPU (Windows only) by setting the provider's device type configuration option, which selects the hardware on which inference is done.
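As a minimal sketch of the above, the snippet below builds a provider entry that selects the target device. The `device_type` values ("CPU", "GPU", "NPU") and the model filename are assumptions for illustration; check the Usage documentation for the values supported by your release.

```python
# Hedged sketch: selecting the target device for the OpenVINO(TM)
# Execution Provider. device_type values and the model path are
# illustrative assumptions, not guaranteed API facts.

def openvino_provider(device_type="CPU"):
    # Returns a (provider name, options) pair in the form accepted by
    # the `providers` argument of onnxruntime.InferenceSession.
    return ("OpenVINOExecutionProvider", {"device_type": device_type})

provider = openvino_provider("GPU")

# Creating the session requires onnxruntime-openvino and a model file,
# so it is shown commented out:
# import onnxruntime as ort
# sess = ort.InferenceSession("model.onnx", providers=[provider])
```

If no `device_type` is given, the provider entry defaults to the CPU, matching the package's default behavior.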

For more API calls and environment variables, see Usage.

Samples

To see what you can do with OpenVINO™ Execution Provider for ONNX Runtime, explore the demos in the Examples section.

License

OpenVINO™ Execution Provider for ONNX Runtime is licensed under MIT. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.

Support

Please submit your questions, feature requests and bug reports via GitHub Issues.

How to Contribute

We welcome community contributions to OpenVINO™ Execution Provider for ONNX Runtime.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distributions

onnxruntime_openvino-1.18.0-cp311-cp311-win_amd64.whl (6.0 MB)

Uploaded CPython 3.11 Windows x86-64

onnxruntime_openvino-1.18.0-cp311-cp311-manylinux_2_28_x86_64.whl (42.0 MB)

Uploaded CPython 3.11 manylinux: glibc 2.28+ x86-64

onnxruntime_openvino-1.18.0-cp310-cp310-win_amd64.whl (6.0 MB)

Uploaded CPython 3.10 Windows x86-64

onnxruntime_openvino-1.18.0-cp310-cp310-manylinux_2_28_x86_64.whl (42.0 MB)

Uploaded CPython 3.10 manylinux: glibc 2.28+ x86-64

onnxruntime_openvino-1.18.0-cp39-cp39-manylinux_2_28_x86_64.whl (42.0 MB)

Uploaded CPython 3.9 manylinux: glibc 2.28+ x86-64

File details

Details for the file onnxruntime_openvino-1.18.0-cp311-cp311-win_amd64.whl.


File hashes

Hashes for onnxruntime_openvino-1.18.0-cp311-cp311-win_amd64.whl
Algorithm Hash digest
SHA256 874a1e263dd86674593e5a879257650b06a8609c4d5768c3d8ed8dc4ae874b9c
MD5 c4c709c45823e64a6e52becf61bac0a4
BLAKE2b-256 88d9ca0bfd7ed37153d9664ccdcfb4d0e5b1963563553b05cb4338b46968feb2

See more details on using hashes here.
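As a hedged sketch of how the digests above can be used, the helper below computes a file's SHA256 and can be compared against the published hash. The function name is an illustrative assumption; the expected digest in the commented usage is the one listed above for the cp311 Windows wheel.

```python
# Hedged sketch: verifying a downloaded wheel against a published
# SHA256 digest. sha256_of is an illustrative helper, not part of
# any package API.
import hashlib

def sha256_of(path):
    """Hash the file in chunks so large wheels are not read into memory at once."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Usage (requires the downloaded wheel to be present):
# expected = "874a1e263dd86674593e5a879257650b06a8609c4d5768c3d8ed8dc4ae874b9c"
# assert sha256_of("onnxruntime_openvino-1.18.0-cp311-cp311-win_amd64.whl") == expected
```

A matching digest confirms the download is byte-for-byte identical to the file published on PyPI; a mismatch indicates corruption or tampering.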

File details

Details for the file onnxruntime_openvino-1.18.0-cp311-cp311-manylinux_2_28_x86_64.whl.


File hashes

Hashes for onnxruntime_openvino-1.18.0-cp311-cp311-manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 eb1723d386f70a8e26398d983ebe35d2c25ba56e9cdb382670ebbf1f5139f8ba
MD5 7c7fb54109d7b9be804e5c12e77ed01b
BLAKE2b-256 7ed38299b7285dc8fa7bd986b6f0d7c50b7f0fd13db50dd3b88b93ec269b1e08


File details

Details for the file onnxruntime_openvino-1.18.0-cp310-cp310-win_amd64.whl.


File hashes

Hashes for onnxruntime_openvino-1.18.0-cp310-cp310-win_amd64.whl
Algorithm Hash digest
SHA256 7f1931060f710a6c8e32121bb73044c4772ef5925802fc8776d3fe1e87ab3f75
MD5 29a8f689a1eeff4ca268d139a3ea8f55
BLAKE2b-256 347db75913bce58f4ee9bf6a02d1b513b9fc82303a496ec698e6fb1f9d597cb4


File details

Details for the file onnxruntime_openvino-1.18.0-cp310-cp310-manylinux_2_28_x86_64.whl.


File hashes

Hashes for onnxruntime_openvino-1.18.0-cp310-cp310-manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 565b874d21bcd48126da7d62f57db019f5ec0e1f82ae9b0740afa2ad91f8d331
MD5 03a2bc8f34d626f40a2f0de09f1c5978
BLAKE2b-256 b357e9a080f2477b2a4c16925f766e4615fc545098b0f4e20cf8ad803e7a9672


File details

Details for the file onnxruntime_openvino-1.18.0-cp39-cp39-manylinux_2_28_x86_64.whl.


File hashes

Hashes for onnxruntime_openvino-1.18.0-cp39-cp39-manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 597eb18f3de7ead69b08a242d74c4573b28bbfba40ca2a1a40f75bf7a834808e
MD5 3672373482c1508743f9ab8e9d6d7878
BLAKE2b-256 17af97c6f34f07c14eb83844c5f1f63f9cb5981aed8b086a44350838ff8070ca

