GH-1344: Prioritizing classes #1345

Merged 6 commits into flairNLP:master on Jan 22, 2020

Conversation

@klasocki (Contributor)

Adds the option to include class weights in the cross-entropy loss (when not using a CRF) and to use F-beta as the evaluation metric.
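For illustration, a minimal usage sketch, assuming the option is exposed as a `loss_weights` dictionary on `SequenceTagger` (the name follows this PR's discussion; check your Flair version for the exact API):

```python
from flair.datasets import CONLL_03
from flair.embeddings import WordEmbeddings
from flair.models import SequenceTagger

corpus = CONLL_03()
tag_dictionary = corpus.make_tag_dictionary(tag_type="ner")

tagger = SequenceTagger(
    hidden_size=256,
    embeddings=WordEmbeddings("glove"),
    tag_dictionary=tag_dictionary,
    tag_type="ner",
    use_crf=False,  # per this PR, weights apply only to the cross-entropy path
    loss_weights={"B-PER": 10.0, "I-PER": 10.0},  # unlisted tags keep weight 1.0
)
```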

@djstrong (Contributor)

Weights could also be added to TextClassifier.
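A hypothetical sketch of what the analogous option could look like on `TextClassifier`, assuming a `loss_weights` parameter mirroring the tagger's (this is a suggestion, not part of this PR):

```python
from flair.datasets import TREC_6
from flair.embeddings import WordEmbeddings, DocumentRNNEmbeddings
from flair.models import TextClassifier

corpus = TREC_6()
label_dictionary = corpus.make_label_dictionary()

classifier = TextClassifier(
    document_embeddings=DocumentRNNEmbeddings([WordEmbeddings("glove")]),
    label_dictionary=label_dictionary,
    loss_weights={"LOC": 5.0},  # up-weight a rare label; others default to 1.0
)
```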

@klasocki force-pushed the prioritizing-classes branch from 7bdafb5 to 3a51e68 on January 13, 2020, 12:49
Add weights to state_dict for loading and saving models
Remove unnecessary isinstance calls while logging by overriding __str__
Minor refactor
@klasocki force-pushed the prioritizing-classes branch from 3a51e68 to 25e7614 on January 13, 2020, 13:29
@djstrong (Contributor)

Class weighting is important for macro metrics: macro averaging gives every class equal weight, so errors on rare classes hurt the score as much as errors on frequent ones, and up-weighting rare classes in the loss counteracts that.
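The underlying mechanism is the standard `weight` argument of PyTorch's cross-entropy loss, which scales each class's contribution:

```python
import torch
import torch.nn as nn

# Two classes; the rare class (index 1) counts 10x in the loss.
criterion = nn.CrossEntropyLoss(weight=torch.tensor([1.0, 10.0]))

logits = torch.tensor([[2.0, 0.5], [0.2, 1.5]])  # (batch, num_classes)
targets = torch.tensor([1, 1])
loss = criterion(logits, targets)  # mistakes on class 1 dominate the loss
```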

@alanakbik (Collaborator)

@klasocki thanks for adding this, and please excuse the delay in checking this PR! I've run some local tests with different parameters and it works well for prioritizing classes and controlling the precision/recall trade-off, so lots of people will surely find this useful!

@alanakbik (Collaborator)

👍

@alanakbik merged commit f7dd29d into flairNLP:master on Jan 22, 2020
@klasocki (Contributor, Author)

> @klasocki thanks for adding this, and please excuse the delay in checking this PR! I've run some local tests with different parameters and it works well for prioritizing classes and controlling the precision/recall trade-off, so lots of people will surely find this useful!

No problem whatsoever! Thank you for caring about the quality of Flair. And I'm very glad you found it useful 😄

@BasLaanPAM

Thank you for adding this feature; it is very helpful. I was wondering if there is a specific reason it was only added to the cross-entropy loss and not the Viterbi loss (when using a CRF)? Having had a quick look at the code, it seems we should be able to multiply this loss by weights as well.
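One way this could work (a sketch of the idea, not Flair's actual implementation): since the CRF loss is a single negative log-likelihood per sentence rather than a per-token cross-entropy, the weights would have to be applied at the sequence level, for example by scaling each sentence's loss by the weights of its gold tags:

```python
import torch

def weighted_crf_loss(nll, gold_tags, tag_weights):
    """Hypothetical scheme: scale each sentence's CRF negative
    log-likelihood by the mean weight of its gold tags.

    nll:         (batch,) per-sentence NLL from the CRF forward algorithm
    gold_tags:   list of 1-D LongTensors of gold tag indices, one per sentence
    tag_weights: (num_tags,) float tensor of per-class weights
    """
    sentence_weights = torch.stack([tag_weights[t].mean() for t in gold_tags])
    return (sentence_weights * nll).mean()
```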

@mauryaland mentioned this pull request on Mar 16, 2023