[LibTorch] Expected Tensor but got None with inception v3 #2526
Comments
Are you able to run your TorchScript model using the PyTorch Python API with the same input data?
@deadeyegoodwin With the PyTorch Python API I can run this without any errors. Code:

import torch
import numpy as np
import torchvision

# load the pre-trained inception_v3 from torch hub
model = torch.hub.load('pytorch/vision:v0.6.0', 'inception_v3', pretrained=True)
model.eval()

# random FP32 input with the 1x3x299x299 shape inception_v3 expects
input0_data = np.random.rand(1, 3, 299, 299)
input0_data_fp32 = input0_data.astype(np.float32)
input_data = torch.from_numpy(input0_data_fp32)

out = model.forward(input_data)
print('Model {0}'.format(out.size()))

# scripted version of the same model; it returns a tuple, hence the out[0] indexing below
ts_model = torch.jit.script(model)
ts_model.eval()
out = ts_model.forward(input_data)
print('JIT Model {0}'.format(out[0].size()))

Outputs:
@Ursanon If you look at the server logs you will see the cause. The source looks akin to https://github.com/pytorch/vision/blob/master/torchvision/models/inception.py#L208
Note how there is a scripting-specific condition there that causes this. I would recommend tracing the model instead.
Output from the scripted model: ts_model(input_data)
Output from the native PyTorch model: model(input_data)
@Ursanon I tested tracing instead of scripting and it worked fine.
I replaced the inception_v3.pt file and the model runs in Tritonserver as expected.
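For reference, a minimal sketch of what the traced export might look like (the example input shape and output file name are assumptions, not taken from this thread):

import torch

# load the same pre-trained inception_v3 used above and switch to eval mode
model = torch.hub.load('pytorch/vision:v0.6.0', 'inception_v3', pretrained=True)
model.eval()

# tracing records only the eval-mode code path, so the scripting-only branch
# in torchvision's Inception3 forward is not baked into the exported model
example_input = torch.rand(1, 3, 299, 299)  # shape assumed from the repro code above
traced_model = torch.jit.trace(model, example_input)
traced_model.save('inception_v3.pt')  # file name assumed; matches the file mentioned above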
Thanks for your work @CoderHam, will check tracing.
Description
When I run a simple gRPC demo client with the PyTorch model loaded, I get an error.
Triton Information
nvidia/tritonserver:20.12-py3
To Reproduce
Export pre-trained model:
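(The export snippet is not preserved in this thread; based on the discussion it appears to have used torch.jit.script, roughly along these lines, with the output file name being an assumption:)

import torch

model = torch.hub.load('pytorch/vision:v0.6.0', 'inception_v3', pretrained=True)
model.eval()

# scripting keeps torchvision's scripting-only return path, which appears to be
# what leads to the "Expected Tensor but got None" error reported above
scripted_model = torch.jit.script(model)
scripted_model.save('model.pt')  # Triton's LibTorch backend looks for model.pt by default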
Add it to Triton with the following config:
config.pbtxt:
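(The actual config.pbtxt content is not preserved here; a plausible sketch for a LibTorch inception_v3 model, with the tensor names, dims, and batch size being assumptions based on the 1x3x299x299 input used above:)

name: "inception_v3"
platform: "pytorch_libtorch"
max_batch_size: 1
input [
  {
    name: "INPUT__0"
    data_type: TYPE_FP32
    dims: [ 3, 299, 299 ]
  }
]
output [
  {
    name: "OUTPUT__0"
    data_type: TYPE_FP32
    dims: [ 1000 ]
  }
]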