Port distilbert transformer checkpoint #1736
Conversation
looks good overall! couple comments.
transformers_config = load_config(preset, HF_CONFIG_FILE)
keras_config = convert_backbone_config(transformers_config)
backbone = cls(**keras_config)
print(backbone.summary())
remove prints!
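For reference, a minimal sketch of the loader with the debug print dropped; the function name load_distilbert_backbone is assumed here, while load_config, HF_CONFIG_FILE, and convert_backbone_config come from the snippet above:

def load_distilbert_backbone(cls, preset):
    # Read the Hugging Face config for the preset and map it to
    # KerasNLP constructor arguments before instantiating the backbone.
    transformers_config = load_config(preset, HF_CONFIG_FILE)
    keras_config = convert_backbone_config(transformers_config)
    # No debug printing here; callers can inspect backbone.summary() themselves.
    return cls(**keras_config)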
def load_distilbert_tokenizer(cls, preset):
    return cls(
        get_file(preset, "vocab.txt"),
should we handle do_lower_case like we do for bert?
Yeah. Fixed.
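Roughly what that fix could look like, mirroring the BERT converter; the HF_TOKENIZER_CONFIG_FILE constant, the do_lower_case key, and the lowercase keyword argument are assumptions here, not read from this diff:

def load_distilbert_tokenizer(cls, preset):
    # Hypothetical sketch: pull do_lower_case from the HF tokenizer config
    # so cased and uncased checkpoints tokenize correctly.
    tokenizer_config = load_config(preset, HF_TOKENIZER_CONFIG_FILE)
    return cls(
        get_file(preset, "vocab.txt"),
        lowercase=tokenizer_config.get("do_lower_case", True),
    )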
"vocabulary_size": transformers_config["vocab_size"], | ||
"num_layers": transformers_config["n_layers"], | ||
"num_heads": transformers_config["n_heads"], | ||
"hidden_dim": transformers_config["hidden_dim"] // 4, |
shouldn't this just be transformers_config["dim"]?
Right. Fixed.
"num_layers": transformers_config["n_layers"], | ||
"num_heads": transformers_config["n_heads"], | ||
"hidden_dim": transformers_config["hidden_dim"] // 4, | ||
"intermediate_dim": transformers_config["dim"] * 4, |
shouldn't this just be transformers_config["hidden_dim"]?
Right. Fixed.
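Putting the two fixes together: in the Hugging Face DistilBERT config, "dim" is the model width and "hidden_dim" is the feed-forward width, so the conversion should read roughly as below (a sketch of the discussed change, not the final merged code; remaining fields are omitted):

def convert_backbone_config(transformers_config):
    # Map Hugging Face DistilBERT config keys to KerasNLP backbone kwargs.
    return {
        "vocabulary_size": transformers_config["vocab_size"],
        "num_layers": transformers_config["n_layers"],
        "num_heads": transformers_config["n_heads"],
        # Model width: HF DistilBERT exposes this as "dim".
        "hidden_dim": transformers_config["dim"],
        # Feed-forward width: HF DistilBERT exposes this as "hidden_dim".
        "intermediate_dim": transformers_config["hidden_dim"],
    }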
LGTM! Running a quick end-to-end test; will merge if all looks good.
Hi @mattdangerw @ariG23498,
Ported the DistilBERT transformers checkpoint to KerasNLP. Please check. Thank you!
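For reviewers, a quick usage sketch of what this enables, assuming the hf:// preset scheme used by the other converter PRs and the distilbert-base-uncased checkpoint id (illustrative only, not taken from this PR):

import keras_nlp

# Load a DistilBERT backbone and tokenizer straight from a Hugging Face
# transformers checkpoint; the config and weights are converted on the fly.
backbone = keras_nlp.models.DistilBertBackbone.from_preset(
    "hf://distilbert-base-uncased"
)
tokenizer = keras_nlp.models.DistilBertTokenizer.from_preset(
    "hf://distilbert-base-uncased"
)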