This repository has been archived by the owner on Dec 16, 2022. It is now read-only.

Update transformers requirement from <2.5.0,>=2.4.0 to >=2.4.0,<2.8.0 #4003

Conversation

dependabot-preview[bot]
Contributor

Updates the requirements on transformers to permit the latest version.
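
In requirement-specifier terms, the change widens the upper bound to admit the 2.7.x line. A sketch as a generic pip spec (the file that actually declares the dependency in this repository is not reproduced here):

    # before
    transformers>=2.4.0,<2.5.0
    # after
    transformers>=2.4.0,<2.8.0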

Release notes

Sourced from transformers' releases.

T5 Model, BART summarization example and reduced memory, translation pipeline

T5 Model (@patrickvonplaten, @thomwolf)

T5 is a powerful encoder-decoder model that casts every NLP problem into a text-to-text format. It achieves state-of-the-art results on a variety of NLP tasks (summarization, question answering, ...).

Five sets of pre-trained weights (pre-trained on a multi-task mixture of unsupervised and supervised tasks) are released, in ascending order of size from 60 million to 11 billion parameters:

t5-small, t5-base, t5-large, t5-3b, t5-11b

T5 can now be used with the translation and summarization pipeline.
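
A minimal sketch of the text-to-text interface, using the v2.7-era Auto classes (the task prefix and generation arguments below are illustrative, not prescribed by the release notes):

    from transformers import AutoModelWithLMHead, AutoTokenizer

    # Load the smallest of the five released checkpoints.
    tokenizer = AutoTokenizer.from_pretrained("t5-small")
    model = AutoModelWithLMHead.from_pretrained("t5-small")

    # T5 casts every task as text-to-text: prepend a task prefix to the input.
    input_ids = tokenizer.encode(
        "translate English to German: The house is wonderful.", return_tensors="pt"
    )
    output_ids = model.generate(input_ids, max_length=40)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))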

Big thanks to the original authors, especially @craffel who helped answer our questions, reviewed PRs and tested T5 extensively.

New BART checkpoint: bart-large-xsum (@sshleifer)

These weights come from BART fine-tuned on the XSum abstractive summarization task, which encourages shorter (more abstractive) summaries. The checkpoint achieves state-of-the-art results on XSum.
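
A sketch of loading the new checkpoint through the summarization pipeline (the bare checkpoint name follows the hub naming of the time; the example text is a stand-in for a real article):

    from transformers import pipeline

    summarizer = pipeline("summarization", model="bart-large-xsum")
    article = "The tower is 324 metres tall, about the same height as an 81-storey building, and the tallest structure in Paris."
    print(summarizer(article)[0]["summary_text"])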

BART summarization example with pytorch-lightning (@acarrera94)

A new example: BART for summarization, using pytorch-lightning. It trains on CNN/DM and runs evaluation.

Translation pipeline (@patrickvonplaten)

A new translation pipeline is available, leveraging the T5 model; T5 was added to the summarization pipeline as well.
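
A minimal sketch of the new pipeline (the en-to-de task name mirrors T5's training mixture; relying on the pipeline's default model choice is an assumption here):

    from transformers import pipeline

    translator = pipeline("translation_en_to_de")
    print(translator("The house is wonderful.")[0]["translation_text"])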

Memory improvements with BART (@sshleifer)

In an effort to reduce the memory footprint and computing power necessary to run inference on BART, several improvements have been made to the model (a sketch of the pad-trimming item appears after the list):

  • Remove the LM head and use the embedding matrix instead (~200MB)
  • Call encoder before expanding input_ids (~1GB)
  • SelfAttention only returns weights if config.output_attentions (~500MB)
  • Two separate, smaller decoder attention masks (~500MB)
  • Drop columns that consist exclusively of pad_token_id from input_ids in the evaluate_cnn example.
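
A hedged sketch of that last item as a standalone helper (trim_pad_columns is a hypothetical name, not the library's actual implementation):

    import torch

    def trim_pad_columns(input_ids, pad_token_id):
        # Keep only columns where at least one row has a real (non-pad) token.
        keep = (input_ids != pad_token_id).any(dim=0)
        return input_ids[:, keep]

    batch = torch.tensor([[5, 6, 0, 0],
                          [7, 8, 9, 0]])
    print(trim_pad_columns(batch, pad_token_id=0))
    # tensor([[5, 6, 0],
    #         [7, 8, 9]])

For right-padded batches this only removes trailing all-pad columns, shrinking the sequence length (and the attention computation) without touching any real tokens.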

New model: XLMForTokenClassification (@sakares)

A new head was added to XLM: XLMForTokenClassification.
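
A minimal sketch of the new head (the checkpoint name and label count are illustrative assumptions):

    import torch
    from transformers import XLMForTokenClassification, XLMTokenizer

    tokenizer = XLMTokenizer.from_pretrained("xlm-mlm-en-2048")
    model = XLMForTokenClassification.from_pretrained("xlm-mlm-en-2048", num_labels=9)

    input_ids = tokenizer.encode("Hello from New York", return_tensors="pt")
    logits = model(input_ids)[0]                # shape: (batch, seq_len, num_labels)
    predictions = torch.argmax(logits, dim=-1)  # one label id per token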

Commits
  • 6f5a12a Release: v2.7.0
  • 296252c fix lm lables in docstring (#3529)
  • 75ec6c9 [T5] make decoder input ids optional for t5 training (#3521)
  • 5b44e0a [T5] Add training documenation (#3507)
  • 33ef700 [Docs] examples/summarization/bart: Simplify CNN/DM preprocessi… (#3516)
  • f6a23d1 [BART] add bart-large-xsum weights (#3422)
  • 601ac5b [model_cards]: use MIT license for all dbmdz models
  • 17dceae Fix circle ci flaky fail of wmt example (#3485)
  • 00ea100 add summarization and translation to notebook (#3478)
  • b08259a run_ner.py / bert-base-multilingual-cased can output empty tokens (#2991)
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
  • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
  • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
  • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
  • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language
  • @dependabot badge me will comment on this PR with code to add a "Dependabot enabled" badge to your readme

Additionally, you can set the following in your Dependabot dashboard:

  • Update frequency (including time of day and day of week)
  • Pull request limits (per update run and/or open at any time)
  • Out-of-range updates (receive only lockfile updates, if desired)
  • Security updates (receive only security updates, if desired)

@dependabot-preview dependabot-preview bot added the dependencies label (Pull requests that update a dependency file) on Mar 30, 2020
@dirkgr
Member

dirkgr commented Apr 4, 2020

Closed in favor of #4018.

@dirkgr dirkgr closed this Apr 4, 2020
@dependabot-preview
Contributor Author

OK, I won't notify you again about this release, but will get in touch when a new version is available. If you'd rather skip all updates until the next major or minor version, let me know by commenting @dependabot ignore this major version or @dependabot ignore this minor version.

If you change your mind, just re-open this PR and I'll resolve any conflicts on it.

@dependabot-preview dependabot-preview bot deleted the dependabot/pip/transformers-gte-2.4.0-and-lt-2.8.0 branch April 4, 2020 00:28