[Fix] fix a bug when module is missing in low version of bitsandbytes #1388

Merged · 2 commits · Oct 31, 2023
22 changes: 12 additions & 10 deletions mmengine/optim/optimizer/builder.py
@@ -132,6 +132,10 @@ def register_sophia_optimizers() -> List[str]:
 def register_bitsandbytes_optimizers() -> List[str]:
     """Register optimizers in ``bitsandbytes`` to the ``OPTIMIZERS`` registry.
 
+    In the `bitsandbytes` library, optimizers that have the same name as the
+    default optimizers in PyTorch are prefixed with ``bnb_``. For example,
+    ``bnb_Adagrad``.
+
     Returns:
         List[str]: A list of registered optimizers' names.
     """
@@ -141,16 +145,14 @@ def register_bitsandbytes_optimizers() -> List[str]:
     except ImportError:
         pass
     else:
-        for module_name in [
-                'AdamW8bit', 'Adam8bit', 'Adagrad8bit', 'PagedAdam8bit',
-                'PagedAdamW8bit', 'LAMB8bit', 'LARS8bit', 'RMSprop8bit',
-                'Lion8bit', 'PagedLion8bit', 'SGD8bit'
-        ]:
-            _optim = getattr(bnb.optim, module_name)

Collaborator:
Registering a None optimizer seems weird; maybe we should skip registering the optimizer if it does not exist.

Contributor Author:
If `_optim` is None, the result of `inspect.isclass(_optim)` is False, so the registration is skipped.
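
A minimal sketch of that behaviour (illustrative only, not part of the PR): a missing attribute falls back to `None`, and `inspect.isclass(None)` is `False`, so `issubclass` is never reached.

```python
import inspect
import types

# Stand-in for an old bnb.optim module that lacks some optimizer classes.
fake_bnb_optim = types.SimpleNamespace()

_optim = getattr(fake_bnb_optim, 'PagedLion8bit', None)  # missing -> None
print(inspect.isclass(_optim))  # False, so the register branch is skipped
```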

The change under discussion (the `None` default added by this PR):

            _optim = getattr(bnb.optim, module_name, None)
            if inspect.isclass(_optim) and issubclass(_optim,
                                                      torch.optim.Optimizer):
                OPTIMIZERS.register_module(module=_optim)
                dadaptation_optimizers.append(module_name)

Collaborator:
From the perspective of code readability, explicitly skipping registration is preferable to implicitly skipping it by passing in None.

BTW, I'm not familiar with the optimizers in bitsandbytes, so why don't we refer to the implementation here

    for module_name in dir(Sophia):

to register the optimizers?

Contributor Author (@Ben-Louis, Oct 18, 2023):
Me neither. I just attempted to fix the bug with minimal changes, and I'm uncertain whether modifying the builder function for bitsandbytes to resemble Sophia's could cause any issues.
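
For reference, the dir()-based registration mentioned above follows roughly this pattern (a reconstruction of the referenced Sophia helper, assuming it mirrors mmengine's builder; not the exact source):

```python
# Sketch of a dir()-based registration, analogous to register_sophia_optimizers.
import inspect
from typing import List

import torch

from mmengine.registry import OPTIMIZERS


def register_sophia_like_optimizers() -> List[str]:
    optimizers = []
    try:
        import Sophia
    except ImportError:
        pass
    else:
        for module_name in dir(Sophia):
            _optim = getattr(Sophia, module_name)
            if inspect.isclass(_optim) and issubclass(
                    _optim, torch.optim.Optimizer):
                OPTIMIZERS.register_module(module=_optim)
                optimizers.append(module_name)
    return optimizers
```

The final version of this PR goes in a similar direction, but uses `inspect.getmembers` with a predicate instead of iterating over `dir()`.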

-            if inspect.isclass(_optim) and issubclass(_optim,
-                                                      torch.optim.Optimizer):
-                OPTIMIZERS.register_module(module=_optim)
-                dadaptation_optimizers.append(module_name)
+        optim_classes = inspect.getmembers(
+            bnb.optim, lambda _optim: (inspect.isclass(_optim) and issubclass(
+                _optim, torch.optim.Optimizer)))
+        for name, optim_cls in optim_classes:
+            if name in OPTIMIZERS:
+                name = f'bnb_{name}'
+            OPTIMIZERS.register_module(module=optim_cls, name=name)
+            dadaptation_optimizers.append(name)
     return dadaptation_optimizers
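
For context, an illustrative usage sketch (not part of this diff, and assuming `bitsandbytes` is installed and the registrations above have run): a registered 8-bit optimizer is built from a config dict like any other entry in the `OPTIMIZERS` registry, with name clashes resolved by the `bnb_` prefix described in the docstring.

```python
import torch.nn as nn

from mmengine.registry import OPTIMIZERS

model = nn.Linear(8, 2)

# 'AdamW8bit' has no torch.optim counterpart, so it keeps its own name;
# a clashing name such as 'Adagrad' is exposed as 'bnb_Adagrad' instead.
optimizer = OPTIMIZERS.build(
    dict(type='AdamW8bit', params=model.parameters(), lr=1e-3))
```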

