
# CyclicLR_Scheduler_PyTorch

A PyTorch implementation of Cyclical Learning Rates.

Please refer to [Cyclical Learning Rates for Training Neural Networks](https://arxiv.org/abs/1506.01186) for more details.

## Usage

```python
from cyclic_lr_scheduler import CyclicLR

optimizer = ...  # any PyTorch optimizer, e.g. torch.optim.SGD(model.parameters(), lr=0.0001)

scheduler = CyclicLR(optimizer, base_lr=0.0001, max_lr=0.01, step_size=10, mode='triangular')
```

- `mode` accepts three decay strategies: `'triangular'`, `'triangular2'`, `'exp_range'`
- `step_size` denotes a number of epochs rather than iterations
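For intuition, the `'triangular'` policy from Smith's paper ramps the learning rate linearly from `base_lr` up to `max_lr` over `step_size` epochs, then back down. A minimal sketch of that schedule (a standalone illustration of the formula, not this repo's implementation):

```python
import math

def triangular_lr(epoch, base_lr=0.0001, max_lr=0.01, step_size=10):
    """Triangular cyclical LR: rises from base_lr to max_lr over
    step_size epochs, then falls back, repeating every 2*step_size."""
    cycle = math.floor(1 + epoch / (2 * step_size))          # which cycle we are in
    x = abs(epoch / step_size - 2 * cycle + 1)               # position within the cycle
    return base_lr + (max_lr - base_lr) * max(0.0, 1 - x)

# One full cycle: LR climbs for 10 epochs, then descends for 10
for epoch in (0, 5, 10, 15, 20):
    print(epoch, triangular_lr(epoch))
```

With the defaults above, the LR starts at `base_lr`, peaks at `max_lr` at epoch 10, and returns to `base_lr` at epoch 20. The `'triangular2'` and `'exp_range'` modes shrink the peak each cycle (halving it, or decaying it exponentially).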