Polygraphy Trtexec: Extension to run on trtexec backend
Extending polygraphy run to support trtexec

Introduction

polygraphy run allows you to run inference with multiple backends, including TensorRT and ONNX-Runtime, and compare their outputs. This extension adds support for running inference with trtexec.
Installation

Follow the steps below to install the extension module. After installation, you should see the trtexec options in the help output of polygraphy run:
- Build the wheel using setup.py:

  python3 setup.py bdist_wheel
- Install the wheel: the built wheel is placed in the dist directory. Install it by running the following command:

  python3 -m pip install dist/polygraphy_trtexec-*.whl \
      --extra-index-url https://pypi.ngc.nvidia.com

  NOTE: You may have to update the above command to install the appropriate version of the wheel.
- After installation, you can run inference on the trtexec backend by passing the --trtexec flag:

  polygraphy run sample.onnx --trtexec
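For example, the trtexec backend can be compared against another backend in a single command. The sketch below assumes TensorRT, trtexec, and onnxruntime are installed, and that sample.onnx is a placeholder model in the working directory; --onnxrt is Polygraphy's built-in ONNX-Runtime flag:

```shell
# Run the same model through the trtexec backend (from this extension)
# and ONNX-Runtime, then compare the outputs of the two runs.
# NOTE: sample.onnx is a hypothetical model path.
polygraphy run sample.onnx --trtexec --onnxrt
```

When two or more backends are selected, polygraphy run compares their outputs and reports whether they match within tolerance.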