
Excluding AdamWeightDecayOptimizer internal variables from restoring #16

Merged
1 commit merged into huggingface:master on Nov 13, 2018

Conversation

donatasrep (Contributor)

I tried to use the convert_tf_checkpoint_to_pytorch.py script to convert my pretrained model, but in order to do so I had to make some minor tweaks. I thought I would share them in case you find them useful.
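For context, a minimal sketch of the kind of tweak the title describes (not the exact diff from this PR): while iterating over the TensorFlow checkpoint variables, the AdamWeightDecayOptimizer slot variables (names containing e.g. adam_m / adam_v) can simply be skipped, since they have no counterpart in the PyTorch model. The helper name and the exact set of skipped name parts below are assumptions for illustration.

```python
import tensorflow as tf

def iter_model_weights(tf_checkpoint_path):
    """Yield (name, array) pairs for model weights, skipping optimizer internals.

    Hypothetical helper; the real conversion script walks the checkpoint in a
    similar way before mapping names onto the PyTorch modules.
    """
    for name, shape in tf.train.list_variables(tf_checkpoint_path):
        # Slot variables such as ".../adam_m" and ".../adam_v" belong to
        # AdamWeightDecayOptimizer state, not to the model, so there is
        # nothing in the PyTorch model to restore them into.
        if any(part in ("adam_v", "adam_m", "AdamWeightDecayOptimizer", "global_step")
               for part in name.split("/")):
            print(f"Skipping optimizer variable {name}")
            continue
        yield name, tf.train.load_variable(tf_checkpoint_path, name)
```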

thomwolf (Member)

Is your pre-trained model a TensorFlow model?

donatasrep (Contributor, Author)

Yes

thomwolf merged commit 5cd8d7a into huggingface:master on Nov 13, 2018
thomwolf (Member)

Nice, thanks for that!

qwang70 pushed a commit to DRL36/pytorch-pretrained-BERT that referenced this pull request on Mar 2, 2019: "Excluding AdamWeightDecayOptimizer internal variables from restoring"
HongyanJiao mentioned this pull request on Sep 19, 2019
amathews-amd referenced this pull request in ROCm/transformers on Aug 6, 2021: "…module" (hack so RoBERTa can run with ORTModule)
rraminen pushed a commit to rraminen/transformers that referenced this pull request on Oct 27, 2022: "…update_hf_training" (removed hardcoded warmup steps)
jameshennessytempus pushed a commit to jameshennessytempus/transformers that referenced this pull request on Jun 1, 2023
SangbumChoi added a commit to SangbumChoi/transformers that referenced this pull request on Jan 13, 2025: "…rmat_processing" (make style and add postprocessing for instance segmentation compatible with Triton)