[Fix] fix a bug when module is missing in low version of bitsandbytes #1388

Merged 2 commits on Oct 31, 2023
Changes from 1 commit
2 changes: 1 addition & 1 deletion mmengine/optim/optimizer/builder.py
@@ -146,7 +146,7 @@ def register_bitsandbytes_optimizers() -> List[str]:
'PagedAdamW8bit', 'LAMB8bit', 'LARS8bit', 'RMSprop8bit',
'Lion8bit', 'PagedLion8bit', 'SGD8bit'
]:
- _optim = getattr(bnb.optim, module_name)
Collaborator

Registering a None optimizer seems weird; maybe we should skip registering the optimizer if it does not exist.

Contributor Author

If _optim is None, inspect.isclass(_optim) returns False, so registration is skipped.
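
A minimal standalone sketch of that behavior (an illustration only, not code from this PR; torch.optim.AdamW stands in for a bitsandbytes optimizer class, and the missing attribute mimics an older bitsandbytes release):

```python
import inspect

import torch


class _FakeBnbOptim:
    """Stand-in for bnb.optim on an older bitsandbytes release."""
    AdamW8bit = torch.optim.AdamW  # present in this version
    # PagedAdamW8bit is intentionally absent, as on old releases.


for module_name in ['AdamW8bit', 'PagedAdamW8bit']:
    _optim = getattr(_FakeBnbOptim, module_name, None)
    # For the missing name, _optim is None; inspect.isclass(None) is False,
    # so the `and` short-circuits and registration is skipped.
    if inspect.isclass(_optim) and issubclass(_optim, torch.optim.Optimizer):
        print(f'would register {module_name}')
    else:
        print(f'skipped {module_name}')
```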

+ _optim = getattr(bnb.optim, module_name, None)
if inspect.isclass(_optim) and issubclass(_optim,
torch.optim.Optimizer):
OPTIMIZERS.register_module(module=_optim)
dadaptation_optimizers.append(module_name)

Collaborator

From the perspective of code readability, explicitly skipping registration is preferable to implicitly skipping it by passing in None.

BTW, I'm not familiar with the optimizers in bitsandbytes. Why don't we refer to the implementation here

for module_name in dir(Sophia):

to register the optimizers?
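
For concreteness, the explicit skip could look roughly like the sketch below (an illustration of the suggestion, not the code that was merged; it reuses bnb, OPTIMIZERS and dadaptation_optimizers from the surrounding register_bitsandbytes_optimizers function):

```python
# Hypothetical explicit-skip variant: missing optimizers are skipped with
# an explicit check instead of relying on inspect.isclass(None) being False.
for module_name in [
        'AdamW8bit', 'Adam8bit', 'Adagrad8bit', 'PagedAdam8bit',
        'PagedAdamW8bit', 'LAMB8bit', 'LARS8bit', 'RMSprop8bit',
        'Lion8bit', 'PagedLion8bit', 'SGD8bit'
]:
    if not hasattr(bnb.optim, module_name):
        # Optimizer not provided by this bitsandbytes version; skip it.
        continue
    _optim = getattr(bnb.optim, module_name)
    if inspect.isclass(_optim) and issubclass(_optim,
                                              torch.optim.Optimizer):
        OPTIMIZERS.register_module(module=_optim)
        dadaptation_optimizers.append(module_name)
```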

Contributor Author
@Ben-Louis, Oct 18, 2023

Me neither. I just attempted to fix the bug with minimal changes, and I'm uncertain whether modifying the builder function for bitsandbytes to resemble Sophia's could cause any issues.
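
For reference, a dir()-based scan in the style of the Sophia register function might look like the sketch below (again an illustration, not part of this PR; it assumes the same bnb, OPTIMIZERS and dadaptation_optimizers names as above). The uncertainty mentioned here is that dir(bnb.optim) may also expose internal Optimizer subclasses that the hard-coded name list deliberately avoids registering.

```python
# Hypothetical dir()-based variant, mirroring the Sophia registration loop.
# Caveat: any extra torch.optim.Optimizer subclass exposed under bnb.optim
# would also be registered, unlike with the explicit name list.
for module_name in dir(bnb.optim):
    _optim = getattr(bnb.optim, module_name)
    if inspect.isclass(_optim) and issubclass(_optim,
                                              torch.optim.Optimizer):
        OPTIMIZERS.register_module(module=_optim)
        dadaptation_optimizers.append(module_name)
```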
