Python code to analyze ONNX graph data types in order to solve the error "Unsupported ONNX data type: UINT8 (2)"
""" | |
What: | |
Upon converting my .onnx model to TensorRT .engine I got this error | |
Unsupported ONNX data type: UINT8 (2) | |
So I created this short python function to check which inputs cause the error. | |
After some googling I found out that uint8 is not supported in TensorRT. | |
I plan on replacing the uint8's with something else so that I'm able to | |
convert the model successfully. | |
""" | |
import onnx | |
def main(): | |
model = onnx.load("onnx_model.onnx") | |
try: | |
onnx.checker.check_model(model) | |
except onnx.checker.ValidationError as e: | |
print("The model is invalid: %s" % e) | |
""" | |
# 1 = float32 | |
# 2 = uint8 | |
# 3 = int8 | |
# 4 = uint16 | |
# 5 = int16 | |
# 6 = int32 | |
# 7 = int64 | |
""" | |
inputs = model.graph.input | |
for input in inputs: | |
dtype = input.type.tensor_type.elem_type | |
if dtype == 2: | |
print(input.name, ">>>", input.type.tensor_type.elem_type) | |
main() |
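As a small follow-up, the hard-coded type code 2 can be replaced with the named constants that the onnx package exposes. A minimal sketch (assuming the same onnx_model.onnx file) that prints a readable type name for every graph input:

import onnx
from onnx import TensorProto

model = onnx.load("onnx_model.onnx")

for graph_input in model.graph.input:
    elem_type = graph_input.type.tensor_type.elem_type
    # TensorProto.DataType is a protobuf enum, so Name() turns e.g. 2 into "UINT8".
    type_name = TensorProto.DataType.Name(elem_type)
    marker = "  <-- not supported by TensorRT" if elem_type == TensorProto.UINT8 else ""
    print(f"{graph_input.name}: {type_name}{marker}")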
The error may just as well result from the outputs. You can, in theory, change the inputs and outputs with code like the sketch below, but that alone doesn't solve the underlying issue: basically, the whole model must be built without uint8 for it to be compatible with TensorRT.
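For illustration, a minimal sketch of that kind of rewrite (the file names are placeholders): it re-declares uint8 graph inputs and outputs as float32 so the uint8 conversion has to happen in pre/post-processing instead. This only changes the graph's declared I/O types; if operators inside the graph still produce or consume uint8 tensors, the model remains incompatible, which is exactly the point above.

import onnx
from onnx import TensorProto

model = onnx.load("onnx_model.onnx")

# Re-declare uint8 graph inputs and outputs as float32.
for value_info in list(model.graph.input) + list(model.graph.output):
    tensor_type = value_info.type.tensor_type
    if tensor_type.elem_type == TensorProto.UINT8:
        tensor_type.elem_type = TensorProto.FLOAT

onnx.checker.check_model(model)
onnx.save(model, "onnx_model_no_uint8_io.onnx")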