
prepare returns AcceleratedScheduler only if python version != 3.8 in torch==2.0 #1200

Closed
younesbelkada opened this issue Mar 16, 2023 · 4 comments
Labels: bug (Something isn't working)
younesbelkada (Contributor) commented Mar 16, 2023:

Not sure if this is related to the new release of accelerate + torch==2.0, but prepare returns an AcceleratedScheduler only if the Python version is not 3.8 when using PyTorch 2.0.

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • One of the scripts in the examples/ folder of Accelerate or an officially supported no_trainer script in the examples folder of the transformers repo (such as run_no_trainer_glue.py)
  • My own task or dataset (give details below)

Reproduction

import torch
import torch.nn as nn
from torch.optim.lr_scheduler import LinearLR

from accelerate import Accelerator

accelerator = Accelerator()

dummy_model = nn.Linear(1, 1)
dummy_optimizer = torch.optim.Adam(dummy_model.parameters(), 1.41e-5)

scheduler = LinearLR(dummy_optimizer, 1.41e-5)

dummy_model, dummy_optimizer, scheduler = accelerator.prepare(dummy_model, dummy_optimizer, scheduler)
# True when prepare() has wrapped the scheduler in an AcceleratedScheduler
print(hasattr(scheduler, "scheduler"))

Expected behavior

This should always print True regardless of the Python version, but it prints False on Python 3.8.
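
A slightly more explicit version of the check, as a sketch: it assumes AcceleratedScheduler is importable from accelerate.scheduler, which is where recent accelerate releases define the wrapper.

from accelerate.scheduler import AcceleratedScheduler

# After accelerator.prepare(...), the returned object should be the wrapper,
# which keeps the original LinearLR around as its `scheduler` attribute.
print(isinstance(scheduler, AcceleratedScheduler))  # expected: True
print(type(scheduler.scheduler).__name__)           # expected: LinearLR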

cc @muellerzr @sgugger

younesbelkada (Contributor, Author) commented:
After digging a bit, it seems that the check here

elif isinstance(obj, LRScheduler):

returns False only for python=3.8 and torch==2.0.
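
For context, PyTorch 2.0 made LRScheduler the public base class and kept _LRScheduler only for backward compatibility, so built-in schedulers such as LinearLR no longer pass an isinstance check against the old _LRScheduler name. Below is a minimal sketch of the kind of version-aware alias that keeps such a check working on both sides of the rename; it is illustrative only (the helper name is made up), not necessarily accelerate's exact code.

import torch
from packaging import version

# On torch >= 2.0, check against the new public base class; on older torch,
# fall back to the legacy _LRScheduler name.
if version.parse(torch.__version__).release >= (2, 0):
    from torch.optim.lr_scheduler import LRScheduler
else:
    from torch.optim.lr_scheduler import _LRScheduler as LRScheduler


def is_lr_scheduler(obj) -> bool:
    # Roughly what the `elif isinstance(obj, LRScheduler):` branch decides:
    # only objects recognized as schedulers get wrapped in AcceleratedScheduler.
    return isinstance(obj, LRScheduler)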

@younesbelkada younesbelkada changed the title prepare returns AcceleratedScheduler only if python version != 3.8 prepare returns AcceleratedScheduler only if python version != 3.8 in torch==2.0 Mar 16, 2023
@muellerzr muellerzr self-assigned this Mar 16, 2023
@muellerzr muellerzr added the bug Something isn't working label Mar 16, 2023
muellerzr (Collaborator) commented:
@younesbelkada I'm not seeing this.

import torch
import torch.nn as nn
from torch.optim.lr_scheduler import LinearLR

from accelerate import Accelerator

accelerator = Accelerator()

dummy_model = nn.Linear(1, 1)
dummy_optimizer = torch.optim.Adam(dummy_model.parameters(), 1.41e-5)

scheduler = LinearLR(dummy_optimizer, 1.41e-5)

dummy_model, dummy_optimizer, scheduler = accelerator.prepare(dummy_model, dummy_optimizer, scheduler)
print(hasattr(scheduler, "scheduler"))

Accelerate env:

Copy-and-paste the text below in your GitHub issue

- `Accelerate` version: 0.17.1
- Platform: Linux-5.15.0-58-generic-x86_64-with-glibc2.17
- Python version: 3.8.16
- Numpy version: 1.23.0
- PyTorch version (GPU?): 2.0.0+cu117 (True)
- `Accelerate` default config:
        Not found

I just ran this via python test.py.

muellerzr (Collaborator) commented:
Can you tell me more about your system? (accelerate env should suffice)

younesbelkada (Contributor, Author) commented:
Argh, sorry, I had accelerate==0.16 installed in that env, which explains why it was failing. Everything seems to be working fine now on the trl CIs! Thanks a lot for double-checking, @muellerzr!
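
Since the root cause was a stale accelerate==0.16 in the environment, a quick sanity check of the installed versions (a sketch, equivalent in spirit to running accelerate env) can save a round-trip:

import sys

import accelerate
import torch

# Print the versions that matter for this bug: per the thread, an accelerate
# release that predates support for torch 2.0's renamed LRScheduler base class
# (accelerate==0.16 here) is what caused the False result.
print("accelerate:", accelerate.__version__)
print("torch:     ", torch.__version__)
print("python:    ", sys.version.split()[0])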
