
Format error with ONNX quantized CNN #101

Open
mbelda opened this issue Oct 5, 2022 · 1 comment


mbelda commented Oct 5, 2022

Hello,
I am trying to run a particular version of the MobileNetV2 CNN. I have quantized it, and netron.app shows that it is correctly quantized in the QDQ format.

But when I try to execute the systolic_runner example, minimally modified (just the size of the labels array), on FireSim, I keep getting the following error:

terminate called after throwing an instance of 'Ort::Exception'
  what():  Unexpected input data type. Actual: (tensor(float)) , expected: (tensor(uint8))

I have tried to run the original MobileNetV2 quantized model from the onnx repository and it works fine, but I have noticed that it is quantized in the operator-oriented format.

I attach my network.
MobileNetV2_0p5_all_quant.zip

Thank you in advance.
Kind regards, MªJosé.


hngenc commented Nov 23, 2022

Can you also share the image that you tried to run the Mobilenet inference on?

The MobileNet models we provide expect float inputs, but looking at your ONNX model, it seems you're expecting int8 inputs instead?
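For what it's worth, that mismatch would produce exactly the `Ort::Exception` above: a graph input declared as `tensor(uint8)` means the host-side preprocessing must affine-quantize the float pixels before calling `Run()`. A minimal numpy sketch; the scale and zero-point here are illustrative, the real values come from the model's first QuantizeLinear node:

```python
import numpy as np


def quantize_input(x: np.ndarray, scale: float, zero_point: int) -> np.ndarray:
    """Affine-quantize a float tensor to uint8: q = round(x / scale) + zp,
    clamped to [0, 255]. scale/zero_point must match the model's own
    input quantization parameters."""
    q = np.round(x / scale) + zero_point
    return np.clip(q, 0, 255).astype(np.uint8)


# Example: a normalized float image in [-1, 1], with illustrative parameters.
img = np.array([[-1.0, 0.0, 1.0]], dtype=np.float32)
print(quantize_input(img, scale=1.0 / 128, zero_point=128))
# -> [[  0 128 255]], dtype uint8
```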
