HyperGradientDescent

This repo contains a TensorFlow implementation of the Adam-HD and SGDN-HD algorithms described in the paper Online Learning Rate Adaptation with Hypergradient Descent.
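As a rough illustration of the update rule, here is a minimal NumPy sketch of Adam-HD with a single global learning rate. This is not the TensorFlow code from this repository; the function name, argument names, and default values are illustrative only.

```python
import numpy as np

def adam_hd(grad_fn, theta, alpha=1e-3, beta=1e-7,
            b1=0.9, b2=0.999, eps=1e-8, steps=1000):
    """Adam whose learning rate alpha is itself adapted by gradient descent.

    grad_fn : function returning the gradient of the loss at theta.
    beta    : hypergradient learning rate (illustrative default).
    """
    m = np.zeros_like(theta)
    v = np.zeros_like(theta)
    u_prev = np.zeros_like(theta)   # previous Adam update direction
    for t in range(1, steps + 1):
        g = grad_fn(theta)
        # Hypergradient: estimate of d(loss)/d(alpha) from the current
        # gradient and the previous update direction.
        h = np.dot(g, u_prev)
        alpha = alpha - beta * h    # gradient descent step on the learning rate
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g * g
        m_hat = m / (1 - b1 ** t)
        v_hat = v / (1 - b2 ** t)
        u_prev = -m_hat / (np.sqrt(v_hat) + eps)
        theta = theta + alpha * u_prev
    return theta, alpha
```

For example, `adam_hd(lambda x: 2 * x, np.ones(3))` minimizes a simple quadratic while adapting the learning rate online.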

The implementation supports two regimes: a one-dimensional global learning rate (as described in the paper and used in the original Adam) and a per-parameter vector of learning rates, as sketched below.
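The difference between the two regimes is only in how the hypergradient is formed. A small self-contained sketch, assuming `g` is the current gradient and `u_prev` is the previous update direction (hypothetical names, not taken from this repository's code):

```python
import numpy as np

g = np.array([0.3, -0.1, 0.7])         # current gradient (toy values)
u_prev = np.array([-0.2, 0.05, -0.4])  # previous update direction (toy values)

h_global = np.dot(g, u_prev)   # scalar hypergradient -> one global learning rate
h_per_param = g * u_prev       # elementwise hypergradient -> per-parameter learning rates

# In the per-parameter regime the learning rate has the same shape as the
# parameters and is updated elementwise: alpha -= beta * h_per_param.
```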

We also reproduce some of the results presented in the paper.

Below are the results of training a CNN on CIFAR-10 with batch size 128 for about 100 epochs (50,000 iterations) using the Adam algorithm with several different learning rates.

(Figure: training loss on CIFAR-10 with Adam for different learning rates)

Below are the results for the Adam-HD algorithm with different initial learning rates. The hypergradient learning rate is fixed to the same value as in the paper.

(Figure: training loss on CIFAR-10 with Adam-HD for different initial learning rates)

We also plot the evolution of the learning rate during training. As described in the paper, the learning rate increases during the first stage of training and then decreases again.

(Figure: learning rate adaptation during training)
