GH-1344: Prioritizing classes #1345
Conversation
- Add logging of new training parameters - weights and beta
- Fix name errors in F-beta metric
- Log new parameter values only for SequenceTagger instances
- Weights can be added also to …
Force-pushed from 7bdafb5 to 3a51e68
- Add weights to state_dict for loading and saving models
- Remove unnecessary isinstance calls while logging by overriding __str__
- Minor refactor
Force-pushed from 3a51e68 to 25e7614
Class weighting is important in macro metrics.
@klasocki thanks for adding this, and please excuse the delay in checking this PR! I've run some local tests with different parameters, and it works well for prioritizing classes and controlling the precision/recall trade-off, so lots of people will surely find this useful!
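For context, the `beta` parameter trades precision against recall in the F-beta score: beta > 1 favors recall, beta < 1 favors precision. A minimal, self-contained sketch (the numbers are illustrative only):

```python
def f_beta(precision: float, recall: float, beta: float) -> float:
    """F-beta score: (1 + beta^2) * P * R / (beta^2 * P + R)."""
    if precision == 0.0 and recall == 0.0:
        return 0.0
    return (1 + beta ** 2) * precision * recall / (beta ** 2 * precision + recall)

# With P = 0.8 and R = 0.4:
print(f_beta(0.8, 0.4, beta=1.0))  # 0.533... (plain F1)
print(f_beta(0.8, 0.4, beta=2.0))  # 0.444... (pulled toward recall)
print(f_beta(0.8, 0.4, beta=0.5))  # 0.666... (pulled toward precision)
```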
👍
No problem whatsoever! Thank you for caring about the quality of Flair. And I'm very glad you found it useful 😄
Thank you for adding this feature, it is very helpful. I was wondering if there is a specific reason it was only added to the cross-entropy loss and not the Viterbi loss (when using a CRF)? Having had a quick look at the code, I think we should be able to multiply that loss function by the weights as well.
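A rough sketch of that idea, assuming the per-sentence CRF negative log-likelihood is already available as a tensor; the function and tensor names here are hypothetical, not Flair's API, and scaling by the mean gold-tag weight is just one possible weighting scheme:

```python
import torch

def weighted_crf_loss(sentence_nll: torch.Tensor,
                      gold_tag_ids: torch.Tensor,
                      class_weights: torch.Tensor) -> torch.Tensor:
    """Scale each sentence's CRF negative log-likelihood by the mean
    class weight of its gold tags, then average over the batch.

    sentence_nll:  (batch,)          per-sentence Viterbi/CRF loss
    gold_tag_ids:  (batch, seq_len)  gold tag index for each token
    class_weights: (num_tags,)       weight per tag class
    """
    # Look up each token's class weight, average per sentence.
    per_sentence_weight = class_weights[gold_tag_ids].mean(dim=1)
    return (per_sentence_weight * sentence_nll).mean()
```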
Add the possibility of including class weights in the cross-entropy loss (when not using a CRF) and of using F-beta as the evaluation metric.
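A usage sketch of the feature, assuming the parameter names suggested by the commits above (`beta` and a per-class weight dictionary, here called `loss_weights`); the exact names in the merged API may differ:

```python
from flair.datasets import CONLL_03
from flair.embeddings import WordEmbeddings
from flair.models import SequenceTagger

# Requires the CoNLL-03 data to be available locally.
corpus = CONLL_03()
tag_dictionary = corpus.make_tag_dictionary(tag_type="ner")

tagger = SequenceTagger(
    hidden_size=256,
    embeddings=WordEmbeddings("glove"),
    tag_dictionary=tag_dictionary,
    tag_type="ner",
    use_crf=False,                 # weights apply to cross-entropy, not the CRF loss
    beta=2.0,                      # evaluate with F-beta, favoring recall
    loss_weights={"B-PER": 10.0},  # up-weight a rare or important class
)
```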