
Add ignore_index to Accuracy metric #155

Closed
wants to merge 14 commits into from

Conversation

avinashsai

Added ignore_index to Accuracy metric. Fixes #61
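For context, the idea behind `ignore_index` is to mask out every position whose target label equals the given index before accuracy is computed. A minimal sketch of that semantics in plain torch follows; the function name and exact masking rule are illustrative, not the PR's actual implementation:

```python
import torch

def accuracy_with_ignore_index(preds, target, ignore_index=None):
    # Drop every position whose target label equals ignore_index,
    # then compute plain accuracy on the remaining positions.
    if ignore_index is not None:
        keep = target != ignore_index
        preds, target = preds[keep], target[keep]
    return (preds == target).float().mean().item()

preds = torch.tensor([0, 1, 2, 1])
target = torch.tensor([0, 1, 2, 2])
# Without masking: 3 of 4 positions match -> 0.75
# With ignore_index=2 the last two positions are dropped -> 2/2 -> 1.0
```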

@codecov

codecov bot commented Apr 3, 2021

Codecov Report

Merging #155 (e383af2) into master (953b621) will decrease coverage by 16.40%.
The diff coverage is 66.66%.


@@             Coverage Diff             @@
##           master     #155       +/-   ##
===========================================
- Coverage   96.18%   79.77%   -16.41%     
===========================================
  Files         180       90       -90     
  Lines        5582     2784     -2798     
===========================================
- Hits         5369     2221     -3148     
- Misses        213      563      +350     
Flag Coverage Δ
Linux 79.77% <66.66%> (-0.02%) ⬇️
Windows 79.77% <66.66%> (-0.02%) ⬇️
cpu 79.77% <66.66%> (-16.39%) ⬇️
gpu ?
macOS 79.77% <66.66%> (-16.39%) ⬇️
pytest 79.77% <66.66%> (-16.41%) ⬇️
python3.6 ?
python3.8 ?
python3.9 ?
torch1.3.1 ?
torch1.4.0 ?
torch1.8.1 ?

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
torchmetrics/functional/classification/accuracy.py 67.74% <60.00%> (-32.26%) ⬇️
torchmetrics/classification/accuracy.py 88.46% <100.00%> (-11.54%) ⬇️
torchmetrics/utilities/distributed.py 22.85% <0.00%> (-74.29%) ⬇️
torchmetrics/classification/auc.py 47.61% <0.00%> (-52.39%) ⬇️
torchmetrics/functional/classification/auroc.py 46.15% <0.00%> (-40.01%) ⬇️
torchmetrics/metric.py 55.51% <0.00%> (-39.64%) ⬇️
torchmetrics/functional/regression/psnr.py 60.60% <0.00%> (-36.37%) ⬇️
...rics/functional/classification/confusion_matrix.py 66.66% <0.00%> (-33.34%) ⬇️
torchmetrics/utilities/checks.py 60.00% <0.00%> (-32.78%) ⬇️
...chmetrics/functional/classification/stat_scores.py 63.07% <0.00%> (-30.77%) ⬇️
... and 134 more


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 953b621...e383af2.

@avinashsai
Author

@SkafteNicki Please review this.

@SkafteNicki
Member

Hi @avinashsai,
The code is looking great, but could you add some kind of test showing it works as intended?

@avinashsai
Author

I haven't added test cases, but I have verified all the cases externally. Should I add test cases in test_accuracy.py?

@SkafteNicki
Member

Yes please do so :]

Contributor

@tchaton tchaton left a comment


Please add tests with different numbers of ignore_index occurrences in preds and targets.
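A test along the lines the reviewer asks for might compare a reference masking computation across inputs with zero, one, and two ignored positions. This is a hedged sketch in plain torch; the helper name and the specific cases are made up for illustration, and the actual PR tests live in test_accuracy.py:

```python
import torch

def accuracy_ignoring(preds, target, ignore_index):
    # Reference computation: mask out positions whose target equals
    # ignore_index, then take plain accuracy over what remains.
    keep = target != ignore_index
    return (preds[keep] == target[keep]).float().mean().item()

# Cases with zero, one, and two positions carrying the ignored label (2)
cases = [
    (torch.tensor([0, 1, 1]), torch.tensor([0, 1, 0])),        # no ignored positions
    (torch.tensor([0, 1, 1]), torch.tensor([0, 1, 2])),        # one ignored position
    (torch.tensor([2, 1, 2, 0]), torch.tensor([2, 1, 2, 0])),  # two ignored positions
]
results = [accuracy_ignoring(p, t, ignore_index=2) for p, t in cases]
```

In a real test suite each case would be a `pytest.mark.parametrize` entry checked against the metric under test rather than a loop.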

@Borda Borda changed the title Add ignore_index to Accuracy metric (#61) Add ignore_index to Accuracy metric Apr 6, 2021
@Borda Borda added the enhancement New feature or request label Apr 6, 2021
Member

@Borda Borda left a comment


Add tests for the usage and for the newly raised error.
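Testing "the newly raised error" presumably means asserting that an invalid `ignore_index` raises. The validation rule, function name, and error message below are assumptions for illustration only; the actual check added in the PR may differ:

```python
def check_ignore_index(num_classes, ignore_index):
    # Hypothetical validation: ignore_index, if given, must be a valid class label
    if ignore_index is not None and not 0 <= ignore_index < num_classes:
        raise ValueError(
            f"Argument ignore_index={ignore_index} is not valid for inputs "
            f"with {num_classes} classes"
        )

# In a test suite the error path would be checked with
# pytest.raises(ValueError, match="is not valid").
```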

torchmetrics/classification/accuracy.py (outdated review thread, resolved)
@avinashsai
Author

@SkafteNicki I don't know why the tests are failing.

@Borda
Member

Borda commented Apr 13, 2021

@SkafteNicki I don't know why the tests are failing.

It seems that the outputs do not meet expectations:
https://github.com/PyTorchLightning/metrics/pull/155/checks?check_run_id=2331922213

@Borda
Member

Borda commented Apr 15, 2021

@avinashsai mind checking the failing tests?

@Borda Borda mentioned this pull request Apr 15, 2021
4 tasks
@Borda Borda closed this in #166 Apr 16, 2021
Labels
enhancement New feature or request
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Add ignore_index option to Accuracy metric
4 participants