
Commit
self.self.activation_dropout -> self.activation_dropout (#8611)
(one line typo)
ratthachat authored Nov 18, 2020
1 parent cdf1b7a commit 0df91ee
1 changed file, 1 addition and 1 deletion: src/transformers/models/bart/modeling_tf_bart.py
@@ -267,7 +267,7 @@ def call(self, x, encoder_padding_mask, training=False):
     if self.normalize_before:
         x = self.final_layer_norm(x)
     x = self.activation_fn(self.fc1(x))
-    x = tf.nn.dropout(x, rate=self.self.activation_dropout if training else 0)
+    x = tf.nn.dropout(x, rate=self.activation_dropout if training else 0)
     x = self.fc2(x)
     x = tf.nn.dropout(x, rate=self.dropout if training else 0)
     x = residual + x
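Why this one-character-class typo breaks at runtime: the layer object has no attribute named `self`, so `self.self.activation_dropout` raises an `AttributeError` the first time the forward pass executes with `training=True`. A minimal sketch (a stand-in class, not the actual TFBart encoder layer) illustrating the buggy and fixed attribute lookups:

```python
# Hypothetical stand-in for the encoder layer; `activation_dropout`
# mirrors the attribute set in the real layer's __init__.
class EncoderLayerSketch:
    def __init__(self, activation_dropout=0.1):
        self.activation_dropout = activation_dropout

    def buggy_rate(self, training=True):
        # Old line: `self.self.activation_dropout` -- there is no
        # attribute `self` on the instance, so this raises AttributeError.
        return self.self.activation_dropout if training else 0

    def fixed_rate(self, training=True):
        # Fixed line: a single `self.` lookup works as intended.
        return self.activation_dropout if training else 0


layer = EncoderLayerSketch(activation_dropout=0.1)
print(layer.fixed_rate(training=True))   # the configured dropout rate
print(layer.fixed_rate(training=False))  # 0: dropout disabled at inference

try:
    layer.buggy_rate(training=True)
except AttributeError as e:
    print("buggy line raised:", e)
```

Note that the bug only surfaces when the branch is actually evaluated, which is why it could slip through until the TF BART encoder ran in training mode.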
