Is it possible to specify different lr for different layers or fix some layers? #325
-
Hello! As stated in the title, is it possible to specify a different lr for different layers, or to fix some layers, when retraining a model? I've read the documentation, but it seems this is not possible.
-
Hello @funan-jhc-lee

DeepMD-kit does not support a different lr in different layers. You can fix the parameters of the embedding net by setting `trainable` to `false`, or fix the parameters of the hidden and output layers of the fitting net by providing a list of booleans to `trainable`. Please refer to https://github.com/deepmodeling/deepmd-kit/blob/master/doc/train-input-auto.rst (`model/descriptor[se_a]/trainable` and `model/fitting_net[ener]/trainable`) for more information.
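
As a minimal sketch, the relevant fragments of the training input JSON might look like the following. This assumes an `se_a` descriptor and an `ener` fitting net with three hidden layers; the `neuron` values are placeholders, and all unrelated keys are omitted:

```json
{
  "model": {
    "descriptor": {
      "type": "se_a",
      "trainable": false
    },
    "fitting_net": {
      "type": "ener",
      "neuron": [240, 240, 240],
      "trainable": [false, false, false, true]
    }
  }
}
```

Here `"trainable": false` on the descriptor freezes the whole embedding net, while the list under the fitting net gives one boolean per hidden layer plus one for the output layer, so the example above would train only the output layer. Check the linked `train-input-auto.rst` for the exact semantics in your DeepMD-kit version.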