Commit acf4374: fix comment

fsx950223 committed Oct 25, 2019
1 parent 41f1633 commit acf4374
Showing 2 changed files with 3 additions and 1 deletion.
4 changes: 3 additions & 1 deletion tensorflow_addons/activations/rrelu.py
@@ -31,7 +31,7 @@ def rrelu(x, lower=0.125, upper=0.3333333333333333, training=None, seed=None):
    """rrelu function.
    Computes rrelu function:
-   `x if x > 0 else random(lower,upper) * x` or
+   `x if x > 0 else random(lower, upper) * x` or
    `x if x > 0 else x * (lower + upper) / 2`
    depending on whether training is enabled.
@@ -44,13 +44,15 @@ def rrelu(x, lower=0.125, upper=0.3333333333333333, training=None, seed=None):
        upper: `float`, upper bound for random alpha.
        training: `bool`, indicating whether the `call`
            is meant for training or inference.
        seed: `int`, this sets the operation-level seed.
    Returns:
        result: A `Tensor`. Has the same type as `x`.
    """
    x = tf.convert_to_tensor(x)
    if training is None:
        training = tf.keras.backend.learning_phase()
        training = bool(tf.keras.backend.get_value(training))
    # TODO: get rid of v1 API
    seed1, seed2 = tf.compat.v1.random.get_seed(seed)
    result, _ = _activation_ops_so.addons_rrelu(x, lower, upper, training,
                                                seed1, seed2)
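For reference, the two branches described in the docstring can be sketched in pure NumPy. This is only an illustration of the documented semantics, not the actual implementation (the real op is the compiled `addons_rrelu` kernel); the function name `rrelu_sketch` and its `rng` parameter are hypothetical.

```python
import numpy as np

def rrelu_sketch(x, lower=0.125, upper=1 / 3, training=False, rng=None):
    """Sketch of rrelu semantics from the docstring above.

    Training:  x if x > 0 else random(lower, upper) * x
    Inference: x if x > 0 else x * (lower + upper) / 2
    """
    x = np.asarray(x, dtype=float)
    if training:
        if rng is None:
            rng = np.random.default_rng()
        # Random negative-side slope, drawn per element.
        alpha = rng.uniform(lower, upper, size=x.shape)
    else:
        # Deterministic inference uses the mean of the slope range.
        alpha = (lower + upper) / 2
    return np.where(x > 0, x, alpha * x)

# Inference is deterministic: -2 * (0.125 + 1/3) / 2
out = rrelu_sketch([-2.0, 3.0])
```

Passing a seeded `np.random.default_rng` in training mode mirrors the role of the `seed` argument: it makes the randomly drawn slope reproducible across calls.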
tensorflow_addons/activations/rrelu_test.py: file mode changed 100755 → 100644 (contents unchanged).
