Update transformers requirement from <3.6,>=3.4 to >=3.4,<4.1 #4831
Conversation
Updates the requirements on [transformers](https://github.com/huggingface/transformers) to permit the latest version.
- [Release notes](https://github.com/huggingface/transformers/releases)
- [Commits](huggingface/transformers@v3.4.0...v4.0.0)

Signed-off-by: dependabot[bot] <[email protected]>
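For reference, a minimal sketch (not part of this PR; the version strings below are illustrative assumptions) of which releases the updated specifier permits, using the packaging library:

```python
# Check which transformers versions the updated specifier from setup.py allows.
# Hedged example: the probed version strings are illustrative, not from the PR.
from packaging.specifiers import SpecifierSet

spec = SpecifierSet(">=3.4,<4.1")
print("4.0.0" in spec)  # True  -> newly permitted by this PR
print("3.5.1" in spec)  # True  -> already permitted before
print("4.1.0" in spec)  # False -> still excluded by the upper bound
```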
@@ -64,7 +64,8 @@
     "scikit-learn",
     "scipy",
     "pytest",
-    "transformers>=3.4,<3.6",
+    "transformers>=3.4,<4.1",
+    "sentencepiece",
As per the release notes, sentencepiece is not required as a dependency in transformers by default. But we use certain tokenizers from HF that require it.
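As a rough illustration of the reviewer's point (the model name below is an assumed example for demonstration, not something taken from this PR), sentencepiece-backed slow tokenizers fail to load without the package:

```python
# Sketch: some HF slow tokenizers are backed by sentencepiece models and
# raise an ImportError if the `sentencepiece` package is not installed,
# which is why it stays in install_requires.
from transformers import T5Tokenizer  # T5 chosen as an illustrative example

tokenizer = T5Tokenizer.from_pretrained("t5-small")
print(tokenizer.tokenize("sentencepiece keeps this tokenizer working"))
```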
@@ -99,7 +99,7 @@ def test_transformers_vocab_sizes(self, model_name):

     def test_transformers_vocabs_added_correctly(self):
         namespace, model_name = "tags", "roberta-base"
-        tokenizer = cached_transformers.get_tokenizer(model_name)
+        tokenizer = cached_transformers.get_tokenizer(model_name, use_fast=False)
Why use_fast=False?
RobertaTokenizerFast does not have the attribute encoder, which we use in this test case.
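A small sketch of the difference being described (assuming transformers 4.x behaviour; this is an illustration, not code from the PR):

```python
# Only the slow RobertaTokenizer exposes the `encoder` dict (token -> id);
# the fast Rust-backed tokenizer does not, hence use_fast=False in the test.
from transformers import AutoTokenizer

slow = AutoTokenizer.from_pretrained("roberta-base", use_fast=False)
fast = AutoTokenizer.from_pretrained("roberta-base", use_fast=True)

print(hasattr(slow, "encoder"))  # True
print(hasattr(fast, "encoder"))  # False -- use fast.get_vocab() instead
```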
Co-authored-by: Evan Pete Walsh <[email protected]>
LGTM!
A newer version of transformers exists, but since this PR has been edited by someone other than Dependabot I haven't updated it. You'll get a PR for the updated version as normal once this PR is merged.
Updates the requirements on transformers to permit the latest version.
Release notes
Sourced from transformers's releases.
... (truncated)
Commits
- c781171 Release: v4.0.0
- ab597c8 Remove deprecated evalutate_during_training (#8852)
- e72b4fa Add a direct link to the big table (#8850)
- dc0dea3 Correct docstring. (#8845)
- 4d8f5d1 add xlnet mems and fix merge conflicts
- 710b010 Migration guide from v3.x to v4.x (#8763)
- 87199de fix mt5 config (#8832)
- 6887947 Big model table (#8774)
- 8c5a2b8 [Flax test] Add require pytorch to flix flax test (#8816)
- 911d848 [FlaxBert] Fix non-broadcastable attention mask for batched forward-passes (#...

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.

Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- @dependabot rebase will rebase this PR
- @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
- @dependabot merge will merge this PR after your CI passes on it
- @dependabot squash and merge will squash and merge this PR after your CI passes on it
- @dependabot cancel merge will cancel a previously requested merge and block automerging
- @dependabot reopen will reopen this PR if it is closed
- @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually