When adding dropout layers with class inheritance, they are always inserted at the end of the architecture (which is not what was programmed!) #2691
Unanswered
pieterblok asked this question in Q&A
Replies: 1 comment 1 reply
-
This is how nn.Sequential works; see also pytorch/pytorch#43876. You might need to remove all the modules after the insertion location and re-add them.
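In detectron2 terms, a minimal sketch of that suggestion might look like the following, assuming a version where FastRCNNConvFCHead is built through @configurable/from_config with the signature (input_shape, *, conv_dims, fc_dims, conv_norm=""). The subclass name, the fixed dropout probability, and the "dropout before every nn.Linear" rule are illustrative; the parent registers its layers as usual, and the subclass then re-registers them with an nn.Dropout in front of each linear layer:

```python
import torch.nn as nn

from detectron2.config import configurable
from detectron2.modeling.roi_heads.box_head import ROI_BOX_HEAD_REGISTRY, FastRCNNConvFCHead


@ROI_BOX_HEAD_REGISTRY.register()
class FastRCNNConvFCHeadDropout(FastRCNNConvFCHead):  # hypothetical name
    @configurable
    def __init__(self, input_shape, *, conv_dims, fc_dims, conv_norm="", dropout_p=0.5):
        # Let the parent build and register the conv/fc layers as usual.
        super().__init__(
            input_shape, conv_dims=conv_dims, fc_dims=fc_dims, conv_norm=conv_norm
        )
        # Remove every registered child, then re-add them in the original order
        # with an nn.Dropout immediately before each nn.Linear. The original
        # names ("fc1", "fc2", ...) are kept, so pretrained weights still load.
        children = list(self.named_children())
        for name, _ in children:
            delattr(self, name)
        for name, module in children:
            if isinstance(module, nn.Linear):
                self.add_module("{}_dropout".format(name), nn.Dropout(p=dropout_p))
            self.add_module(name, module)
```

Pointing cfg.MODEL.ROI_BOX_HEAD.NAME at the new class name would then select it, since FastRCNNConvFCHead's forward simply runs the registered modules in order.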
-
Hi guys, sorry for this noob question (and for taking up so much space with the code blocks). I promise it reads fast!
I followed the tutorial https://detectron2.readthedocs.io/en/latest/tutorials/write-models.html to rewrite a part of Faster R-CNN.
Specifically, I want to add two dropout layers before the linear layers in FastRCNNConvFCHead (./detectron2/modeling/roi_heads/box_head.py).
With this code it works perfectly (it's just a copy-paste of the standard FastRCNNConvFCHead with two additional dropout layers before the linear layers):
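The full copy-pasted class is not reproduced here; a stripped-down sketch of just the fc part, with the conv layers and the detectron2 config plumbing left out and with illustrative names and defaults, shows where the extra dropout registrations go:

```python
import numpy as np
import torch.nn as nn


# Stripped-down stand-in for the fc part of FastRCNNConvFCHead, which subclasses
# nn.Sequential and registers its layers with add_module() inside __init__.
class DropoutConvFCHead(nn.Sequential):
    def __init__(self, input_shape=(256, 7, 7), fc_dims=(1024, 1024), dropout_p=0.5):
        super().__init__()
        self._output_size = input_shape
        self.add_module("flatten", nn.Flatten())
        for k, fc_dim in enumerate(fc_dims):
            # The addition: a Dropout registered *before* each Linear, which puts
            # a dropout layer directly in front of fc1 and fc2.
            self.add_module("fc_dropout{}".format(k + 1), nn.Dropout(p=dropout_p))
            self.add_module("fc{}".format(k + 1), nn.Linear(int(np.prod(self._output_size)), fc_dim))
            self.add_module("fc_relu{}".format(k + 1), nn.ReLU())
            self._output_size = fc_dim


print(DropoutConvFCHead())  # fc_dropout1 comes before fc1, fc_dropout2 before fc2
```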
When I print the resulting architecture, everything is perfect: the dropout layers sit right before the linear layers.
As you can imagine, a full copy of the class is rather big for two additional lines of code, so I was thinking of adding the dropout layers by class inheritance instead:
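That code block is not reproduced here either, but the attempt presumably boils down to something like this toy version (class names are guesses), which reproduces the behavior: anything registered after super().__init__() has run is simply appended to the end of the nn.Sequential:

```python
import torch.nn as nn


class BaseHead(nn.Sequential):          # toy stand-in for FastRCNNConvFCHead
    def __init__(self):
        super().__init__()
        self.add_module("flatten", nn.Flatten())
        self.add_module("fc1", nn.Linear(256 * 7 * 7, 1024))
        self.add_module("fc_relu1", nn.ReLU())
        self.add_module("fc2", nn.Linear(1024, 1024))
        self.add_module("fc_relu2", nn.ReLU())


class DropoutHead(BaseHead):            # the inheritance attempt
    def __init__(self):
        super().__init__()
        # These run after the parent has already registered its layers, so
        # nn.Sequential appends them at the very end of the module list.
        self.fc_dropout1 = nn.Dropout(p=0.5)
        self.fc_dropout2 = nn.Dropout(p=0.5)


print(DropoutHead())  # fc_dropout1/2 show up after fc_relu2, not before fc1/fc2
```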
When I print the resulting architecture, the dropout layers are unfortunately inserted at the end, which is not what I programmed. Maybe I missed something with the class inheritance. What am I doing wrong?
Is this maybe caused by nn.Sequential (which FastRCNNConvFCHead inherits from)? How can I properly insert the dropout layers before the linear layers while still using class inheritance?
Thanks in advance, Pieter