[Fix] fix a bug when module is missing in low version of bitsandbytes #1388
Conversation
@@ -146,7 +146,7 @@ def register_bitsandbytes_optimizers() -> List[str]:
         'PagedAdamW8bit', 'LAMB8bit', 'LARS8bit', 'RMSprop8bit',
         'Lion8bit', 'PagedLion8bit', 'SGD8bit'
     ]:
-        _optim = getattr(bnb.optim, module_name)
+        _optim = getattr(bnb.optim, module_name, None)
Registering a None optimizer seems weird; maybe we should skip registering the optimizer if it does not exist.
If _optim is None, the result of inspect.isclass(_optim) is False, and the registration will be skipped:
mmengine/mmengine/optim/optimizer/builder.py
Lines 149 to 153 in 2135d37
    _optim = getattr(bnb.optim, module_name, None)
    if inspect.isclass(_optim) and issubclass(_optim,
                                              torch.optim.Optimizer):
        OPTIMIZERS.register_module(module=_optim)
        dadaptation_optimizers.append(module_name)
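As a quick sanity check of that claim, the sketch below reproduces the guard against a stand-in namespace (FakeOptimizer and fake_bnb_optim are hypothetical stand-ins for torch.optim.Optimizer and bnb.optim, since an old bitsandbytes is not assumed to be installed): inspect.isclass(None) is False, so a missing attribute falls through without being registered.

```python
import inspect
import types

class FakeOptimizer:  # stand-in for torch.optim.Optimizer
    pass

class AdamW8bit(FakeOptimizer):
    pass

# Stand-in for bnb.optim in an old release: PagedAdamW8bit does not exist.
fake_bnb_optim = types.SimpleNamespace(AdamW8bit=AdamW8bit)

registered = []
for module_name in ['AdamW8bit', 'PagedAdamW8bit']:
    _optim = getattr(fake_bnb_optim, module_name, None)
    # None fails inspect.isclass, so the missing optimizer is skipped.
    if inspect.isclass(_optim) and issubclass(_optim, FakeOptimizer):
        registered.append(module_name)

print(registered)  # → ['AdamW8bit']
```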
From the perspective of code readability, explicitly skipping registration is preferable to implicitly skipping it by passing in None.
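A sketch of that explicit-skip style (collect_optimizers and the Fake* names are hypothetical, not mmengine's actual API) might look like this:

```python
import inspect
import types

class FakeOptimizer:  # stand-in for torch.optim.Optimizer
    pass

class SGD8bit(FakeOptimizer):
    pass

def collect_optimizers(namespace, names, base):
    """Hypothetical helper: register only the optimizers present in namespace."""
    registered = []
    for module_name in names:
        if not hasattr(namespace, module_name):
            continue  # explicitly skip optimizers missing in this version
        _optim = getattr(namespace, module_name)
        if inspect.isclass(_optim) and issubclass(_optim, base):
            registered.append(module_name)
    return registered

# Stand-in for bnb.optim in an old release: PagedLion8bit does not exist.
fake_bnb_optim = types.SimpleNamespace(SGD8bit=SGD8bit)
found = collect_optimizers(fake_bnb_optim, ['SGD8bit', 'PagedLion8bit'],
                           FakeOptimizer)
print(found)  # → ['SGD8bit']
```

The continue makes the "this optimizer may not exist in old versions" intent visible at the point of the lookup, rather than relying on the isclass check downstream.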
BTW, I'm not familiar with the optimizers in bitsandbytes; why don't we refer to the implementation here
mmengine/mmengine/optim/optimizer/builder.py
Line 120 in 2135d37
    for module_name in dir(Sophia):
to register the optimizer?
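For reference, a dir()-based sweep in the style of the Sophia branch avoids maintaining a hard-coded name list entirely: every class in the module that subclasses the optimizer base gets registered, so missing optimizers simply never appear. This is a standalone sketch with stand-in objects (FakeOptimizer, fake_bnb_optim), not the actual mmengine code.

```python
import inspect
import types

class FakeOptimizer:  # stand-in for torch.optim.Optimizer
    pass

class Lion8bit(FakeOptimizer):
    pass

# Stand-in for bnb.optim; dunder attributes and non-class members
# are filtered out by the isclass/issubclass checks below.
fake_bnb_optim = types.SimpleNamespace(Lion8bit=Lion8bit,
                                       __version__='0.41.0')

registered = []
for module_name in dir(fake_bnb_optim):
    _optim = getattr(fake_bnb_optim, module_name)
    if inspect.isclass(_optim) and issubclass(_optim, FakeOptimizer):
        registered.append(module_name)

print(registered)  # → ['Lion8bit']
```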
Me neither. I just attempted to fix the bug with minimal changes. I'm uncertain whether modifying the builder function for bitsandbytes to resemble Sophia's could cause any issues.
Thanks for your contribution and we appreciate it a lot. The following instructions would make your pull request more healthy and more easily get feedback. If you do not understand some items, don't worry, just make the pull request and seek help from maintainers. By the way, if you're not familiar with how to use pre-commit to fix lint issues or add unit tests, please refer to Contributing to OpenMMLab.
Motivation
In register_bitsandbytes_optimizers, some optimizer modules are missing from low versions of bitsandbytes, resulting in a bug for users with those versions (e.g. open-mmlab/mmpose#2747).
Modification
getattr(bnb.optim, module_name) is replaced with getattr(bnb.optim, module_name, None), so an optimizer module missing in a low version of bitsandbytes yields None and fails the inspect.isclass check instead of raising AttributeError.
BC-breaking (Optional)
Does the modification introduce changes that break the backward-compatibility of the downstream repos?
If so, please describe how it breaks the compatibility and how the downstream projects should modify their code to keep compatibility with this PR.
Use cases (Optional)
If this PR introduces a new feature, it is better to list some use cases here, and update the documentation.
Checklist