Fix linting.
Frightera committed Mar 9, 2023
1 parent c1a4725 commit 3c33117
Showing 2 changed files with 29 additions and 15 deletions.
21 changes: 14 additions & 7 deletions keras/backend.py
@@ -5585,26 +5585,33 @@ def categorical_focal_crossentropy(
     from_logits=False,
     axis=-1,
 ):
-    """Categorical focal crossentropy (alpha balanced) between an output tensor and a target tensor.
+    """Computes the alpha balanced focal crossentropy loss between
+    the labels and predictions.
     According to [Lin et al., 2018](https://arxiv.org/pdf/1708.02002.pdf), it
     helps to apply a focal factor to down-weight easy examples and focus more on
     hard examples. By default, the focal tensor is computed as follows:
     It has pt defined as:
     pt = p, if y = 1 else 1 - p
     The authors use alpha-balanced variant of focal loss in the paper:
     FL(pt) = −α_t * (1 − pt)^gamma * log(pt)
     Extending this to multi-class case is straightforward:
-    FL(pt) = α_t * (1 − pt)^gamma * CE, where minus comes from negative log-likelihood and included in CE.
-    `modulating_factor` is (1 − pt)^gamma,
-    where `gamma` is a focusing parameter. When `gamma` = 0, there is no focal
-    effect on the categorical crossentropy.
+    FL(pt) = α_t * (1 − pt)^gamma * CE, where minus comes from
+    negative log-likelihood and included in CE.
+    `modulating_factor` is (1 − pt)^gamma, where `gamma` is a focusing
+    parameter. When `gamma` = 0, there is no focal effect on the categorical
+    crossentropy. And if alpha = 1, at the same time the loss is equivalent
+    to the categorical crossentropy.
     Args:
         target: A tensor with the same shape as `output`.
         output: A tensor.
         alpha: A weight balancing factor for all classes, default is `0.25` as
             mentioned in the reference. It can be a list of floats or a scalar.
-            In the multi-class case, alpha may be set by inverse class frequency by
-            using `compute_class_weight` from `sklearn.utils`.
+            In the multi-class case, alpha may be set by inverse class
+            frequency by using `compute_class_weight` from `sklearn.utils`.
         gamma: A focusing parameter, default is `2.0` as mentioned in the
             reference. It helps to gradually reduce the importance given to
             simple examples in a smooth manner.
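The formulas in the docstring above translate almost line for line into code. The following is a minimal NumPy sketch of the multi-class focal loss, not the Keras implementation itself; the epsilon clipping constant is an assumption for numerical safety:

```python
import numpy as np

def focal_crossentropy(target, output, alpha=0.25, gamma=2.0, axis=-1):
    # Sketch of alpha-balanced categorical focal crossentropy.
    # `target` and `output` hold per-class probabilities with shape
    # [batch_size, num_classes]; `output` should sum to 1 along `axis`.
    eps = 1e-7  # assumed clipping constant to keep log() finite
    output = np.clip(output, eps, 1.0 - eps)
    ce = -target * np.log(output)                 # per-class cross-entropy
    modulating_factor = (1.0 - output) ** gamma   # (1 - pt)^gamma
    return np.sum(alpha * modulating_factor * ce, axis=axis)

# With a one-hot target only the true class contributes, so pt = p_true:
y_true = np.array([[0.0, 1.0, 0.0]])
y_pred = np.array([[0.1, 0.8, 0.1]])
loss = focal_crossentropy(y_true, y_pred)  # small: this example is "easy"
```

With `gamma=0.0` and `alpha=1.0` the modulating factor and the balancing weight both become 1, so the result reduces to plain categorical crossentropy, matching the docstring's claim.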
23 changes: 15 additions & 8 deletions keras/losses.py
@@ -924,20 +924,27 @@ def __init__(

 @keras_export("keras.losses.CategoricalFocalCrossentropy")
 class CategoricalFocalCrossentropy(LossFunctionWrapper):
-    """Computes the alpha balanced focal crossentropy loss between the labels and predictions.
+    """Computes the alpha balanced focal crossentropy loss between
+    the labels and predictions.
     According to [Lin et al., 2018](https://arxiv.org/pdf/1708.02002.pdf), it
     helps to apply a focal factor to down-weight easy examples and focus more on
     hard examples. By default, the focal tensor is computed as follows:
     It has pt defined as:
     pt = p, if y = 1 else 1 - p
     The authors use alpha-balanced variant of focal loss in the paper:
     FL(pt) = −α_t * (1 − pt)^gamma * log(pt)
     Extending this to multi-class case is straightforward:
-    FL(pt) = α_t * (1 − pt)^gamma * CE, where minus comes from negative log-likelihood and included in CE.
-    `modulating_factor` is (1 − pt)^gamma,
-    where `gamma` is a focusing parameter. When `gamma` = 0, there is no focal
-    effect on the categorical crossentropy. And if alpha = 1, at the same time the loss is
-    equivalent to the categorical crossentropy.
+    FL(pt) = α_t * (1 − pt)^gamma * CE, where minus comes from
+    negative log-likelihood and included in CE.
+    `modulating_factor` is (1 − pt)^gamma, where `gamma` is a focusing
+    parameter. When `gamma` = 0, there is no focal effect on the categorical
+    crossentropy. And if alpha = 1, at the same time the loss is equivalent to
+    the categorical crossentropy.
     In the snippet below, there is `# classes` floating pointing values per
     example. The shape of both `y_pred` and `y_true` are
     `[batch_size, num_classes]`.

@@ -969,8 +976,8 @@ class CategoricalFocalCrossentropy(LossFunctionWrapper):
     Args:
         alpha: A weight balancing factor for all classes, default is `0.25` as
             mentioned in the reference. It can be a list of floats or a scalar.
-            In the multi-class case, alpha may be set by inverse class frequency by
-            using `compute_class_weight` from `sklearn.utils`.
+            In the multi-class case, alpha may be set by inverse class
+            frequency by using `compute_class_weight` from `sklearn.utils`.
         gamma: A focusing parameter, default is `2.0` as mentioned in the
             reference. It helps to gradually reduce the importance given to
             simple (easy) examples in a smooth manner.
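Both docstrings suggest setting `alpha` by inverse class frequency via `compute_class_weight` from `sklearn.utils`. A sketch of that "balanced" heuristic in plain NumPy, using the same `n_samples / (n_classes * count)` formula; the label array is hypothetical:

```python
import numpy as np

# Hypothetical integer labels for an imbalanced 3-class training set.
labels = np.array([0, 0, 0, 0, 1, 1, 2, 2, 2, 2, 2, 2])

classes, counts = np.unique(labels, return_counts=True)
# "Balanced" weighting, the formula sklearn's compute_class_weight uses:
# weight_c = n_samples / (n_classes * count_c)
alpha = labels.size / (classes.size * counts)
# Rarer classes get a larger alpha, so their errors weigh more in the loss.
```

The resulting per-class weights could then be passed to the loss as a list, e.g. `CategoricalFocalCrossentropy(alpha=list(alpha))`.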
