
[feat] Add reset_forward_cache to Metric #260

Merged
5 commits merged into master from reset_forward_cache on May 31, 2021

Conversation

@tchaton (Contributor) commented May 25, 2021

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?

What does this PR do?

This PR adds a reset_forward_cache option to Metric.

When a Metric is wrapped inside another Metric, the inner metric's forward_cache gets overridden. This option makes it possible to skip that.

This would be used in the logging refactor in Lightning: Lightning-AI/pytorch-lightning#7631
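
For illustration, here is a minimal sketch of the nesting scenario described above. The SumMetric and WrappedSum classes are hypothetical, and the exact semantics of the new option are not shown; only the Metric base class, add_state, update, compute, and the internal _forward_cache attribute referenced by this PR come from torchmetrics itself.

```python
import torch
from torchmetrics import Metric


class SumMetric(Metric):
    """Minimal inner metric (hypothetical, for illustration only)."""

    def __init__(self):
        super().__init__()
        self.add_state("total", default=torch.tensor(0.0), dist_reduce_fx="sum")

    def update(self, x: torch.Tensor) -> None:
        self.total += x.sum()

    def compute(self) -> torch.Tensor:
        return self.total


class WrappedSum(Metric):
    """Hypothetical wrapper metric that calls another Metric inside update()."""

    def __init__(self):
        super().__init__()
        self.inner = SumMetric()
        self.add_state("batches", default=torch.tensor(0.0), dist_reduce_fx="sum")

    def update(self, x: torch.Tensor) -> None:
        self.batches += 1
        # Calling the inner metric's forward() recomputes and overwrites the
        # inner metric's internal forward cache -- the behaviour this PR's
        # option is meant to let callers skip.
        self.inner(x)

    def compute(self) -> torch.Tensor:
        return self.inner.compute()


wrapped = WrappedSum()
wrapped(torch.tensor([1.0, 2.0]))    # outer forward() -> update() -> inner forward()
print(wrapped.inner._forward_cache)  # per-batch value cached on the inner metric
```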

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

codecov bot commented May 25, 2021

Codecov Report

Merging #260 (ca20eac) into master (571ac15) will decrease coverage by 0.02%.
The diff coverage is 100.00%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #260      +/-   ##
==========================================
- Coverage   96.83%   96.81%   -0.02%     
==========================================
  Files         184       92      -92     
  Lines        6026     3013    -3013     
==========================================
- Hits         5835     2917    -2918     
+ Misses        191       96      -95     
| Flag | Coverage Δ |
| --- | --- |
| Linux | 78.95% <100.00%> (ø) |
| Windows | 78.95% <100.00%> (ø) |
| cpu | 96.81% <100.00%> (ø) |
| gpu | ? |
| macOS | 96.81% <100.00%> (ø) |
| pytest | 96.81% <100.00%> (-0.02%) ⬇️ |
| python3.6 | 95.75% <100.00%> (ø) |
| python3.8 | 96.81% <100.00%> (ø) |
| python3.9 | 96.71% <100.00%> (ø) |
| torch1.3.1 | 95.75% <100.00%> (ø) |
| torch1.4.0 | 95.88% <100.00%> (ø) |
| torch1.8.1 | 96.71% <100.00%> (ø) |

Flags with carried forward coverage won't be shown.

| Impacted Files | Coverage Δ |
| --- | --- |
| torchmetrics/metric.py | 95.56% <100.00%> (ø) |
| .../1/s/torchmetrics/regression/explained_variance.py | |
| .../1/s/torchmetrics/functional/classification/roc.py | |
| ...s/torchmetrics/classification/matthews_corrcoef.py | |
| __w/1/s/torchmetrics/classification/roc.py | |
| __w/1/s/torchmetrics/utilities/distributed.py | |
| ...torchmetrics/functional/classification/__init__.py | |
| ...rchmetrics/functional/retrieval/reciprocal_rank.py | |
| ...1/s/torchmetrics/functional/classification/dice.py | |
| __w/1/s/torchmetrics/functional/nlp.py | |

... and 83 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 571ac15...ca20eac.

@Borda Borda added API / design enhancement New feature or request labels May 25, 2021
@Borda Borda added this to the v0.4 milestone May 25, 2021
Review comment on torchmetrics/metric.py (outdated, resolved)
@Borda Borda requested a review from SkafteNicki May 27, 2021 08:23
@Borda Borda added the ready label May 27, 2021
@Borda Borda requested a review from maximsch2 May 27, 2021 08:25
@ananthsub (Contributor) commented May 27, 2021

@tchaton from a metrics standpoint, forward_cache is an internal implementation detail. Is there any way we can avoid this dependency on forward cache in the Lightning trainer?
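
For context, the coupling in question looks roughly like the sketch below. This is only an illustration: MeanSquaredError is an arbitrary stand-in metric, and reading the private _forward_cache attribute (rather than using the value forward() returns) is the dependency being discussed.

```python
import torch
from torchmetrics import MeanSquaredError

metric = MeanSquaredError()
preds = torch.tensor([2.5, 0.0, 2.0])
target = torch.tensor([3.0, -0.5, 2.0])

batch_value = metric(preds, target)  # forward() returns the value for this batch...
# ...and also stores it internally; a trainer that later reads this private
# attribute instead of the returned value is coupled to an implementation detail.
assert torch.equal(batch_value, metric._forward_cache)
```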

@justusschock (Member) commented:
I also think this shouldn't be part of this package but part of Lightning, since torchmetrics is supposed to be Lightning-agnostic.

@SkafteNicki SkafteNicki merged commit b3f6d54 into master May 31, 2021
@SkafteNicki SkafteNicki deleted the reset_forward_cache branch May 31, 2021 08:29