About the Weight Initialization in PL #5816
-
Hi, I am trying to use BERT for a project. The pretrained BERT model is part of my model, and I am wondering how PL will initialize the model weights. Will it overwrite the pretrained BERT weights? Thanks.
-
Lightning doesn't do any magic like this under the hood. You control all the weights and what gets initialized.
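For what it's worth, here is a minimal sketch of that point, assuming the Hugging Face `transformers` `BertModel` and a made-up `BertClassifier` module name: the pretrained weights are simply whatever `from_pretrained` loads in `__init__`, and Lightning leaves them alone.

```python
import torch
import torch.nn as nn
import pytorch_lightning as pl
from transformers import BertModel


class BertClassifier(pl.LightningModule):
    """Wraps a pretrained BERT; Lightning does not touch its weights."""

    def __init__(self, num_classes: int = 2):
        super().__init__()
        # from_pretrained loads the checkpoint weights; Lightning never
        # re-initializes them, so they stay exactly as loaded here.
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        # Only this new head starts from PyTorch's default random init.
        self.classifier = nn.Linear(self.bert.config.hidden_size, num_classes)

    def forward(self, input_ids, attention_mask=None):
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        # Classify from the pooled [CLS] representation.
        return self.classifier(outputs.pooler_output)

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=2e-5)
```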
-
I see. So where should I do the weight initialization step if I want to follow the PL design idea? In the …?
-
This is up to you, and you should follow standard PyTorch guidelines.
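For example, a common pattern, sketched here under the assumption that only the newly added head should be re-initialized (the helper name `init_new_layers` is made up for illustration):

```python
import torch.nn as nn


def init_new_layers(module: nn.Module) -> None:
    # Standard PyTorch-style init, applied only to layers created alongside
    # the pretrained backbone; the BERT submodule itself is left untouched.
    if isinstance(module, nn.Linear):
        nn.init.xavier_uniform_(module.weight)
        if module.bias is not None:
            nn.init.zeros_(module.bias)


# Inside the LightningModule's __init__, after building the new head:
#     self.classifier.apply(init_new_layers)  # only the head, not self.bert
```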