Add scheduler #85
Conversation
flowvision/scheduler/plateau_lr.py
Outdated
param_group['lr'] = self.restore_lr[i]
self.restore_lr = None

self.lr_scheduler.step(metric, epoch)  # step the base scheduler
The step function of flow.optim.lr_scheduler.ReduceLROnPlateau is missing an epoch parameter in its signature; it should be aligned with the corresponding torch signature.
ReduceLROnPlateau in the oneflow repository also needs to be changed. Better to fix that first and then add this, otherwise it will raise an error.
When a feature is added, it must be guaranteed to run.
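Until the upstream step signature is aligned, one hedged workaround is a small dispatch helper that inspects the base scheduler's step signature and only passes epoch when it is accepted. This is a sketch, not flowvision code: step_compat and the two stand-in scheduler classes below are illustrative assumptions, not part of oneflow or flowvision.

```python
import inspect


class _PlateauNoEpoch:
    """Stand-in for a ReduceLROnPlateau whose step() lacks an epoch arg."""

    def __init__(self):
        self.metrics = []

    def step(self, metric):
        self.metrics.append(metric)


class _PlateauWithEpoch:
    """Stand-in for a torch-style scheduler whose step() accepts epoch."""

    def __init__(self):
        self.calls = []

    def step(self, metric, epoch):
        self.calls.append((metric, epoch))


def step_compat(scheduler, metric, epoch=None):
    """Call scheduler.step, forwarding epoch only if the signature accepts it."""
    params = inspect.signature(scheduler.step).parameters
    if epoch is not None and "epoch" in params:
        scheduler.step(metric, epoch)
    else:
        scheduler.step(metric)  # epoch silently dropped


old_style = _PlateauNoEpoch()
step_compat(old_style, 0.5, epoch=3)   # epoch dropped, no TypeError

new_style = _PlateauWithEpoch()
step_compat(new_style, 0.7, epoch=4)   # epoch forwarded
```

This keeps the wrapper usable against both the current oneflow signature and a torch-aligned one, at the cost of a signature inspection per call.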
Draw LRScheduler Picture

# from timm.scheduler.cosine_lr import CosineLRScheduler
import oneflow as flow
from matplotlib import pyplot as plt

from flowvision.models import ModelCreator
from flowvision.scheduler import MultiStepLRScheduler

model = ModelCreator.create_model("alexnet")
optimizer = flow.optim.SGD(model.parameters())


def get_lr_per_epoch(scheduler, num_epoch):
    # Alternative: query the schedule without stepping the optimizer
    lr_per_epoch = []
    for epoch in range(num_epoch):
        lr_per_epoch.append(scheduler.get_epoch_values(epoch))
    return lr_per_epoch


num_epoch = 100
scheduler = MultiStepLRScheduler(
    optimizer,
    decay_t=[30, 60, 90],
    decay_rate=0.1,
    warmup_lr_init=0.00001,
    warmup_t=10,
)

lr_per_epoch = []
for i in range(num_epoch):
    scheduler.step(i)
    lr_per_epoch.append(optimizer.param_groups[0]["lr"])

plt.plot(range(num_epoch), lr_per_epoch)
plt.savefig("./test_multi_step.jpg")
TODO
- Add more schedulers
- PlateauLRScheduler (needs fix: OneFlow's ReduceLROnPlateau.step does not have an epoch arg)