Panoptic Segmentation Cannot Be Exported to ONNX #4354
Comments
Meanwhile, I have tried …
I met the same error! Have you solved it?
There are some pull requests that will be merged in the next few days to solve these issues.
No, it has not been solved.
Thanks a lot! I will close this issue once it is solved; I am focusing on this problem.
We have a couple of projects of our own broken because of this. Would be nice to know when we can expect a fix.
Not a solution. Steps: …
Still getting: …
Having the same problem with the PointRend model, any solution?
Maybe @ppwwyyxx can give us some tips. This also doesn't work for the Detectron2 Mask R-CNN R50-FPN 3x in TensorRT ONNX export.
I am having the same problem with Detic: the traceback ends at python3.9/site-packages/torch/onnx/symbolic_opset9.py", line 2065, in to. These are the warnings I get when exporting the Detic model: [08/02 12:46:11 detectron2]: Command line arguments: Namespace(format='onnx', export_method='tracing', config_file='configs/Detic_LbaseI_CLIP_SwinB_896b32_4x_ft4x_max-size.yaml', sample_image=None, run_eval=False, output='models/', opts=['MODEL.WEIGHTS', 'models/Detic_LbaseI_CLIP_R5021k_640b64_4x_ft4x_max-size.pkl', 'MODEL.DEVICE', 'cpu'])
Any updates on this? I'm still facing this problem and am wondering when it will be fixed.
No, it has not been fixed.
It will never be fixed! Obviously, the repo is dying! No one will provide any help!
I've tried using an older commit of detectron2, but it doesn't seem to work either. Anyone got any ideas on how to fix this?
@ppwwyyxx, please can you help with this? Failing to export to ONNX; I tried a dummy input of 1,3,224,224 and still get the error.
Any updates yet? How's the progress coming on those PRs, @FrancescoMandru?
I'm not a Meta employee; I collaborated on a general ONNX merge for detectron2 in #4291.
Oh, didn't know that, because I'm still dealing with bugs and whatnot. I'm now downgrading to the v0.6 release instead of the newest and seeing how that goes.
Hello, I've compiled and installed pytorch 1.12.1 from source with the BUILD_CAFFE2=1 flag. Traceback (most recent call last): … Any ideas?
I will not leave, and will keep focusing on this until the problem has been solved.
Will look into this one today.
**Summary**

After looking into this issue, I've managed to export it to ONNX using both 1) standard ONNX tracing (aka the `tracing` export method) and 2) the Caffe2 ONNX path (with the workaround below).

**Official standard ONNX**

Install pytorch 1.12.1 (no need to build from source, as Caffe2 is not used), torchvision 0.13 and the latest detectron2 master branch. Next, cd into the directory containing `export_model.py` and run the export. The exported standard ONNX graph using tracing is below: (graph screenshot not captured here)

**Workaround for Caffe2 ONNX**

Edit https://github.com/facebookresearch/detectron2/blob/main/detectron2/export/caffe2_modeling.py#L416-L419 and change

```python
META_ARCH_CAFFE2_EXPORT_TYPE_MAP = {
    "GeneralizedRCNN": Caffe2GeneralizedRCNN,
    "RetinaNet": Caffe2RetinaNet,
}
```

to

```python
META_ARCH_CAFFE2_EXPORT_TYPE_MAP = {
    "PanopticFPN": Caffe2GeneralizedRCNN,
    "GeneralizedRCNN": Caffe2GeneralizedRCNN,
    "RetinaNet": Caffe2RetinaNet,
}
```

I have done that because `PanopticFPN` is not registered in the map, and `Caffe2GeneralizedRCNN` can handle its detection subpart (see the caveat later in the thread that only the Mask R-CNN part is exported this way). Build pytorch master from source using something like …
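If editing the installed caffe2_modeling.py is inconvenient and you drive the export from your own Python script, the same mapping change can probably be applied at runtime by mutating the dict in place. A minimal sketch, assuming the module layout of the file linked above; this is a workaround, not an official detectron2 API:

```python
# Minimal sketch: register PanopticFPN in the Caffe2 export map at runtime instead of
# editing the installed caffe2_modeling.py. Assumes the module layout linked above;
# run this before invoking the Caffe2 export from your own script.
from detectron2.export import caffe2_modeling

caffe2_modeling.META_ARCH_CAFFE2_EXPORT_TYPE_MAP["PanopticFPN"] = (
    caffe2_modeling.Caffe2GeneralizedRCNN
)
```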
If you are not using libcaffe2 to run the ONNX model, do not export using `caffe2_tracing`. Also, I could not repro the error reported above. For recent pytorch versions, you may need to update fairseq as shown below: …
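To run the standard (tracing) export without libcaffe2, ONNX Runtime is the usual choice. A minimal sketch, assuming the export produced a file named model.onnx that takes a single CHW float32 image tensor; the actual input names, dtypes and shapes depend on how the model was exported, so inspect them first:

```python
# Minimal sketch: run a tracing-based detectron2 ONNX export with ONNX Runtime.
# Assumptions: the exported file is "model.onnx" and expects one CHW float32 image;
# check get_inputs() for the real names and shapes of your export.
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
for i in sess.get_inputs():
    print(i.name, i.shape, i.type)  # inspect what the graph actually expects

image = np.random.rand(3, 800, 800).astype(np.float32)  # dummy data for a smoke test
outputs = sess.run(None, {sess.get_inputs()[0].name: image})
print([o.shape for o in outputs])
```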
I will push a PR that handles the `PanopticFPN` export, and maybe some updates to deployment.md. Any suggestions on this last part?
Thanks @thiagocrepaldi for looking at this. To add some corrections:
(As open source doesn't feel like a big priority for this project any more, if anyone's passionate about it, it might be easier to start a separate repo to host deploy-specific parts (basically a more powerful version of …).)
Thank you, @ppwwyyxx, I will look into the docstring and propose something to get these folks going. There is some discussion about Meta proposing that Microsoft help support the ONNX export part of detectron2 (I think we briefly discussed that some time ago). Let's see what comes from it. If there is no deal, we can certainly start such a detectron2 ONNX zoo (another idea within Microsoft, but for any model, not just detectron2).
#4520 can be used to experiment with standard onnx export |
Hi, I'm running:

```
python export_model.py --sample-image tm-onnx.png --config-file ~/detectron2/configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml --export-method caffe2_tracing --format onnx --output ./output MODEL.WEIGHTS model_final_f10217.pkl MODEL.DEVICE cuda
```

and it fails. Any ideas?
@vincedupuis I seem to be getting this consistently: `ImportError: cannot import name 'STABLE_ONNX_OPSET_VERSION' from 'detectron2.export'`. Did you run across this error? Any idea what it could be?
Make sure you are using the main branch and you should be good to go.
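If the import still fails after reinstalling, it may help to confirm which detectron2 build Python actually picks up; a quick sketch:

```python
# Quick check: which detectron2 is installed, and does it expose STABLE_ONNX_OPSET_VERSION?
import detectron2
print(detectron2.__version__, detectron2.__file__)

try:
    from detectron2.export import STABLE_ONNX_OPSET_VERSION
    print("STABLE_ONNX_OPSET_VERSION =", STABLE_ONNX_OPSET_VERSION)
except ImportError:
    print("This build predates the symbol; reinstall detectron2 from the main branch.")
```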
@thiagocrepaldi thanks for the instructions. |
As @ppwwyyxx mentioned before, this hack for Caffe2 only exports the "Mask R-CNN subpart" of the PanopticFPN, and #4520 raises an exception to let users know about this limitation. Regarding "where the `If` came from", append …
@thiagocrepaldi thanks for your tip regarding …
Full export for non-caffe2 mode.
@thiagocrepaldi thanks, I managed to export the PanopticFPN using your branch. Regarding the `If` node: it points to this line https://github.com/pytorch/vision/blob/07ae61bf9c21ddd1d5f65d326aa9636849b383ca/torchvision/ops/boxes.py#L89. After disabling this line in the source code (for debugging purposes), the `If` node is gone. Curiously, when I export the nms in a standalone network (a simple pure pytorch network), there is no `If` node. In the network, the …
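For anyone tracing the same thing, one way to see which `If` nodes made it into the exported graph is to walk the ONNX protobuf directly; a minimal sketch, assuming the exported file is named model.onnx:

```python
# Minimal sketch: list the If nodes in an exported ONNX graph to see where they came from.
import onnx

model = onnx.load("model.onnx")
for node in model.graph.node:
    if node.op_type == "If":
        print(node.name, "inputs:", list(node.input))
```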
I am glad the PR worked out for you, but I am not sure I understood what you meant by "After disabling this line in the source code (for debugging purposes), the …"
Hi ghost, have you solved the problem yet? I met the same error: "KeyError: 'UNKNOWN_SCALAR'". I tried a lot for a long time, but still didn't find a way to solve it.
I met a similar issue when I use the exported CUDA ONNX model for inference on CUDA. Can you help me with my problem?
Instructions To Reproduce the 🐛 Bug:
```
export DETECTRON2_DATASETS=/data/datasets/
python3 export_model.py \
    --config-file ../../configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.yaml \
    --output /data/output/ \
    --export-method caffe2_tracing \
    --format onnx \
    MODEL.WEIGHTS /data/model/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x/139514544/model_final_dbfeb4.pkl \
    MODEL.DEVICE cpu
```
I have already prepared the COCO dataset in `/data/datasets/`.
Expected behavior:
I expected that the official export_model.py script would export the ONNX model for me.
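For context, here is a rough sketch of what the tracing-based path of export_model.py boils down to. The real script does more (output flattening, sample-image preprocessing), and the TracingAdapter usage here is an assumption based on detectron2's export utilities rather than a verbatim copy of the script:

```python
# Rough sketch of a tracing-based ONNX export for PanopticFPN; config/weight paths
# are taken from the reproduction command above. The real export_model.py also
# flattens outputs and handles preprocessing.
import torch
from detectron2.checkpoint import DetectionCheckpointer
from detectron2.config import get_cfg
from detectron2.export import STABLE_ONNX_OPSET_VERSION, TracingAdapter
from detectron2.modeling import build_model

cfg = get_cfg()
cfg.merge_from_file("configs/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x.yaml")
cfg.MODEL.WEIGHTS = (
    "/data/model/COCO-PanopticSegmentation/panoptic_fpn_R_50_1x/139514544/model_final_dbfeb4.pkl"
)
cfg.MODEL.DEVICE = "cpu"

model = build_model(cfg)
DetectionCheckpointer(model).load(cfg.MODEL.WEIGHTS)
model.eval()

# detectron2 models consume a list of dicts; TracingAdapter flattens that into tensors.
image = torch.zeros(3, 800, 800)  # dummy input; use a real preprocessed image in practice
adapter = TracingAdapter(model, [{"image": image}])

torch.onnx.export(
    adapter,
    adapter.flattened_inputs,
    "panoptic_fpn.onnx",
    opset_version=STABLE_ONNX_OPSET_VERSION,
)
```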
Environment:
Provide your environment information using the following command:
If your issue looks like an installation issue / environment issue,
please first try to solve it yourself with the instructions in
https://detectron2.readthedocs.io/tutorials/install.html#common-installation-issues
Finally, thanks for your efforts in contributing such an excellent repo!