Describe the feature and the current behavior/state.
Callbacks such as EarlyStopping and ReduceLROnPlateau monitor a specified metric (usually the validation loss) to activate their functionality. However, the metric value across training epochs is often quite noisy, especially with small datasets or models with stochastic layers (such as models with dropout, variational autoencoders, ...). This can cause EarlyStopping and ReduceLROnPlateau to be triggered prematurely, even with a large patience, due to the presence of outliers. In fact, this is why TensorBoard already implements smoothing of the displayed metrics.
My feature request is to add a metric wrapper that can smooth/filter any other metric. EarlyStopping and ReduceLROnPlateau could then monitor the smoothed metric. A possible implementation can be found in this Stack Overflow answer. What do you think about this?
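For concreteness, here is a minimal sketch of one possible approach (it is not the linked Stack Overflow implementation, and the class name is hypothetical): a callback that applies TensorBoard-style exponential moving average smoothing to a logged metric at the end of every epoch and writes the result back into `logs` under a new key. Assuming Keras passes the same `logs` dict to every callback in the list, any callback placed after it (such as EarlyStopping) can then monitor the smoothed key.

```python
import tensorflow as tf


class MetricSmoothingCallback(tf.keras.callbacks.Callback):
    """Hypothetical sketch: exponential moving average of a logged metric.

    The smoothed value is written back into ``logs`` under
    ``f"{monitor}_smoothed"``, so EarlyStopping / ReduceLROnPlateau can
    monitor it instead of the noisy raw value.
    """

    def __init__(self, monitor="val_loss", alpha=0.3):
        super().__init__()
        self.monitor = monitor
        self.alpha = alpha          # smaller alpha -> stronger smoothing
        self.smoothed = None

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        current = logs.get(self.monitor)
        if current is None:
            return
        if self.smoothed is None:
            self.smoothed = float(current)
        else:
            self.smoothed = (self.alpha * float(current)
                             + (1.0 - self.alpha) * self.smoothed)
        logs[self.monitor + "_smoothed"] = self.smoothed
```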
Relevant information
Are you willing to contribute it (yes/no): yes, but I am new to contributing so I will need some help.
If you wish to contribute, then read the requirements for new contributions in CONTRIBUTING.md.
Are you willing to maintain it going forward? (yes/no): yes
Is there a relevant academic paper? (if so, where): no
Does the relevant academic paper exceed 50 citations? (yes/no):
Is there already an implementation in another framework? (if so, where): no
Was it part of tf.contrib? (if so, where): no
Which API type would this fall under (layer, metric, optimizer, etc.)
Metric, although it needs an auxiliary callback.
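Continuing the hypothetical sketch above (where the smoothing is, for simplicity, done entirely in the auxiliary callback), the wiring might look like this, with the smoothing callback placed before the built-in callbacks so that they see the smoothed key:

```python
callbacks = [
    MetricSmoothingCallback(monitor="val_loss", alpha=0.3),  # must run first
    tf.keras.callbacks.EarlyStopping(monitor="val_loss_smoothed", patience=10),
    tf.keras.callbacks.ReduceLROnPlateau(monitor="val_loss_smoothed", patience=5),
]
model.fit(x_train, y_train, validation_data=(x_val, y_val),
          epochs=100, callbacks=callbacks)
```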
Who will benefit with this feature?
Anyone using the EarlyStopping or ReduceLROnPlateau callbacks who wants to avoid premature or false triggers.
Any other info.
TensorFlow Addons is transitioning to a minimal maintenance and release mode. New features will not be added to this repository. For more information, please see our public messaging on this decision: TensorFlow Addons Wind Down
Please consider sending feature requests / contributions to other repositories in the TF community with a similar charter to TFA: Keras, Keras-CV, Keras-NLP.