
Allow Accuracy to return metric per class #99

Closed
SkafteNicki opened this issue Mar 16, 2021 · 6 comments · Fixed by #166

@SkafteNicki
Member

🚀 Feature

Implement the average argument like in Precision and Recall, such that the accuracy metric can return the score per class label.
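
For illustration, a rough sketch of the requested usage (the `average` argument on `Accuracy` is the proposed addition, not the current API):

```python
import torch
from torchmetrics import Accuracy

preds  = torch.tensor([0, 2, 1, 1])
target = torch.tensor([0, 1, 1, 2])

# current behaviour: a single scalar accuracy over all samples
acc = Accuracy()
print(acc(preds, target))  # tensor(0.5000)

# proposed: average='none' would return one accuracy value per class
acc_per_class = Accuracy(num_classes=3, average='none')
print(acc_per_class(preds, target))  # tensor of shape [num_classes]
```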

Motivation

Sometimes it may be beneficial to look at the accuracy per label, especially when working with highly imbalanced datasets.

Pitch

Alternatives

Additional context

SkafteNicki added the enhancement and help wanted labels Mar 16, 2021
SkafteNicki added this to the 0.3 milestone Mar 16, 2021
Borda added the good first issue label Mar 24, 2021
@arv-77
Contributor

arv-77 commented Apr 4, 2021

Dear @SkafteNicki, can I work on this?

@SkafteNicki
Member Author

@arvindmuralie77 yes, I will assign you. Thanks for wanting to contribute :]
Please make sure that the feature follows the same pattern as precision, recall, etc. by adding an average argument, which normally can take the values 'micro', 'macro', 'weighted', 'none' and None, where the last two correspond to returning the score per class.
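
For reference, a minimal sketch of the existing pattern on Precision that the new argument would mirror (assuming the torchmetrics API at the time of writing):

```python
import torch
from torchmetrics import Precision

preds  = torch.tensor([0, 2, 1, 1])
target = torch.tensor([0, 1, 1, 2])

# 'macro': unweighted mean of the per-class scores -> scalar
print(Precision(num_classes=3, average='macro')(preds, target))

# 'none' (or None): no reduction -> tensor of shape [num_classes]
print(Precision(num_classes=3, average='none')(preds, target))
```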

@arv-77
Contributor

arv-77 commented Apr 5, 2021

Sure, will do. Thanks @SkafteNicki

@arv-77
Contributor

arv-77 commented Apr 6, 2021

I was thinking of changing the parent class from Metric to StatScores in Accuracy. This way, I can use tp, fp, tn, and fn to calculate accuracy for all values of average. Would that be alright or should I try without changing the base class, @SkafteNicki?
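
For illustration, a rough sketch of how the reduction could look once the per-class tp/fp/tn/fn counts are available from the StatScores machinery (the helper name and exact formulas below are assumptions, not the final implementation):

```python
import torch
from torch import Tensor
from typing import Optional


def _accuracy_from_stats(tp: Tensor, fp: Tensor, tn: Tensor, fn: Tensor,
                         average: Optional[str]) -> Tensor:
    """Hypothetical helper: reduce per-class stat scores (shape [num_classes]) to accuracy.

    fp/tn are not needed for these particular reductions but come along with the StatScores state.
    """
    if average == "micro":
        # pool all classes first: overall fraction of correctly classified samples
        return tp.sum().float() / (tp.sum() + fn.sum()).float()

    # per-class accuracy: fraction of samples of each class that were classified correctly
    per_class = tp.float() / (tp + fn).float()

    if average in ("none", None):
        return per_class
    if average == "macro":
        return per_class.mean()
    if average == "weighted":
        support = (tp + fn).float()
        return (per_class * support / support.sum()).sum()
    raise ValueError(f"Unsupported average: {average}")
```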

@SkafteNicki
Member Author

@arvindmuralie77 Fine by me, that is basically also what I would do :]

arv-77 mentioned this issue Apr 9, 2021
@arv-77
Contributor

arv-77 commented Apr 9, 2021

Hey @SkafteNicki, I have created a draft PR for this issue. When pytest is run for accuracy, 23 out of 102 tests fail because the mdmc_average parameter does not have a default value. If it is set to 'global', all test cases pass.
I think new test cases will have to be added, since there are new parameters.
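
For context, a minimal sketch of the parameter in question, assuming the draft mirrors the `mdmc_average` argument from Precision/Recall (whether it should get a default value is exactly the open question here):

```python
import torch
from torchmetrics import Accuracy  # draft version with the new arguments

# multi-dimensional multi-class input: [batch, extra_dim]
preds  = torch.randint(0, 3, (8, 4))
target = torch.randint(0, 3, (8, 4))

# 'global' flattens the extra dimension before computing the stat scores
acc = Accuracy(num_classes=3, average='none', mdmc_average='global')
print(acc(preds, target))  # one accuracy value per class, shape [3]
```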
