Prioritizing classes and precision/recall for sequence tagger #1344
Comments
klasocki added commits to klasocki/flair that referenced this issue:

Jan 11, 2020:
- Add logging of new training parameters - weights and beta
- Fix name errors in F-beta metric
- Log new parameter values only for SequenceTagger instances

Jan 13, 2020:
- Add weights to state_dict for loading and saving models
- Remove unnecessary isinstance calls while logging by overriding __str__
- Minor refactor
A new version of flair has been released that contains this feature - thanks again @klasocki!
Following @apohllo's idea, I implemented two ways of prioritizing precision/recall during training:

1. Using F-beta as the evaluation metric, which also affects logging and the annealing schedule.
2. Adding class weights to the cross-entropy loss, enabling class prioritization (e.g. less weight on errors for the 'O' tag if we want more recall). This works only with the standard loss, not with the CRF.
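For context, the F-beta score generalizes F1 by weighting recall beta times as much as precision (beta > 1 favours recall, beta < 1 favours precision). A minimal, framework-free sketch of the formula - the function name here is illustrative, not flair's actual API:

```python
def f_beta(precision: float, recall: float, beta: float = 1.0) -> float:
    """F-beta score: (1 + beta^2) * P * R / (beta^2 * P + R).

    beta=1 reduces to the usual F1; larger beta emphasizes recall.
    Returns 0.0 when both precision and recall are zero.
    """
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)


# With precision 0.5 and recall 1.0, raising beta rewards the high recall:
print(f_beta(0.5, 1.0, beta=1.0))  # 0.666...
print(f_beta(0.5, 1.0, beta=2.0))  # 0.833...
```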
I'll submit a pull request in case you'd like to include these changes in your library :)
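The class-weighting idea in point 2 can be sketched without any deep-learning framework. In flair this would amount to passing a per-tag weight tensor to PyTorch's cross-entropy loss; the helper below is purely illustrative and assumes per-token log-probability dicts rather than tensors:

```python
import math


def weighted_nll(log_probs, targets, weights):
    """Class-weighted negative log-likelihood, averaged with weight normalization.

    log_probs: list of dicts mapping tag -> log-probability for each token.
    targets:   list of gold tags, one per token.
    weights:   dict mapping tag -> loss weight (missing tags default to 1.0);
               a small weight on 'O' makes errors on 'O' cheaper, pushing
               the model toward higher recall on the entity tags.
    """
    total = 0.0
    norm = 0.0
    for lp, tag in zip(log_probs, targets):
        w = weights.get(tag, 1.0)
        total += -w * lp[tag]
        norm += w
    return total / norm if norm else 0.0


# A token whose gold tag is 'O' but which the model gets badly wrong,
# followed by a well-predicted 'B-PER' token:
lp = [{"O": math.log(0.1)}, {"B-PER": math.log(0.9)}]
gold = ["O", "B-PER"]

# Down-weighting 'O' shrinks the contribution of the 'O' mistake:
print(weighted_nll(lp, gold, {}))           # unweighted loss
print(weighted_nll(lp, gold, {"O": 0.2}))   # smaller: 'O' errors cost less
```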