[Bug] Fix the bug when the params in shared modules do not require grad #903
Conversation
Codecov Report
Additional details and impacted files

@@ Coverage Diff @@
##            main     #903   +/-   ##
=======================================
  Coverage       ?   78.12%
=======================================
  Files          ?      132
  Lines          ?    10031
  Branches       ?     2004
=======================================
  Hits           ?     7837
  Misses         ?     1853
  Partials       ?      341
Flags with carried forward coverage won't be shown.

☔ View full report at Codecov.
@@ -576,6 +579,30 @@ def test_default_optimizer_constructor_bypass_duplicate(self):
        self._check_sgd_optimizer(optim_wrapper.optimizer, model,
                                  **paramwise_cfg)

        model = ExampleDuplicateModel(duplicate_model_require_grad=False)
Suggested change (add an explanatory comment above the new test line):

        model = ExampleDuplicateModel(duplicate_model_require_grad=False)

becomes

        # `DefaultOptimWrapperConstructor` can build an optimizer when the
        # model has duplicated and non-grad parameters.
        model = ExampleDuplicateModel(duplicate_model_require_grad=False)
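For context, the name ExampleDuplicateModel suggests a test model whose attributes share a submodule. Below is a minimal sketch of such a model; the layer sizes and attribute names are assumptions, and the real definition in the test suite may differ.

import torch.nn as nn


class ExampleDuplicateModel(nn.Module):
    """Sketch of a model with duplicated (shared) submodules.

    `conv3` points to the same module object as `conv2`, so its parameters
    appear twice when iterating over the model's submodules. When
    `duplicate_model_require_grad` is False, the shared parameters are
    frozen, which is the case that used to trigger the ValueError below.
    """

    def __init__(self, duplicate_model_require_grad=True):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 4, kernel_size=1, bias=False)
        self.conv2 = nn.Conv2d(4, 2, kernel_size=1)
        self.conv3 = self.conv2  # shared module -> duplicated parameters
        if not duplicate_model_require_grad:
            for param in self.conv3.parameters():
                param.requires_grad = False

    def forward(self, x):
        return self.conv2(self.conv1(x))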
How to reproduce this bug
When the parameters of a shared (duplicated) module have requires_grad=False, building the optimizer fails with:
ValueError: some parameters appear in more than one parameter group
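A minimal sketch of how the constructor can be exercised to hit this error. The optimizer settings are assumptions, and ExampleDuplicateModel refers to the sketch above rather than the exact test fixture.

from mmengine.optim import DefaultOptimWrapperConstructor

# A model with duplicated submodules whose shared parameters are frozen.
model = ExampleDuplicateModel(duplicate_model_require_grad=False)

optim_wrapper_cfg = dict(
    type='OptimWrapper',
    optimizer=dict(type='SGD', lr=0.01, momentum=0.9, weight_decay=0.0001))
paramwise_cfg = dict(bias_lr_mult=2, bypass_duplicate=True)

constructor = DefaultOptimWrapperConstructor(optim_wrapper_cfg, paramwise_cfg)
# Before this fix, the frozen shared parameters were appended to two
# parameter groups, so the call below raised:
#   ValueError: some parameters appear in more than one parameter group
optim_wrapper = constructor(model)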
Modification
The only change is in mmengine/optim/optimizer/default_constructor.py.
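The diff body is not reproduced on this page, so the following is only a sketch of the kind of change implied by the PR title and the error message: in DefaultOptimWrapperConstructor.add_params, the bypass_duplicate check has to cover parameters that do not require grad as well, so a frozen parameter reached through two module attributes is skipped instead of being appended to a second group. The standalone functions below are a simplified illustration, not the verbatim mmengine code.

import warnings


def _is_in(param_group, param_group_list):
    # Two groups are duplicates when they share a parameter object.
    param = set(param_group['params'])
    param_set = set()
    for group in param_group_list:
        param_set.update(set(group['params']))
    return not param.isdisjoint(param_set)


def add_params(params, module, bypass_duplicate=False, prefix=''):
    """Collect parameter groups for `module` (simplified sketch)."""
    for name, param in module.named_parameters(recurse=False):
        param_group = {'params': [param]}
        # Key point of the fix: check for duplicates before any
        # requires_grad shortcut, so frozen parameters of a shared module
        # are also bypassed instead of landing in a second group.
        if bypass_duplicate and _is_in(param_group, params):
            warnings.warn(f'{prefix}{name} is duplicate. It is skipped since '
                          f'bypass_duplicate={bypass_duplicate}')
            continue
        # paramwise settings (lr_mult, decay_mult, ...) would be applied
        # here in the real constructor; omitted in this sketch.
        params.append(param_group)
    for child_name, child_module in module.named_children():
        add_params(params, child_module, bypass_duplicate,
                   prefix=f'{prefix}{child_name}.')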