Conversation
Does check_numeric_gradient() pass?
Yes, it does pass the numeric gradient check.
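For context, this is roughly how that check is invoked elsewhere in this test file; a minimal sketch assuming the mxnet.test_utils helpers and the softsign symbol this PR registers:

import mxnet as mx
import numpy as np
from mxnet.test_utils import check_numeric_gradient

shape = (3, 4)
x = mx.symbol.Variable("x")
y = mx.sym.softsign(x)  # unary operator registered by this PR
xa = np.random.uniform(low=-1.0, high=1.0, size=shape)
# Compares the symbolic gradient against a finite-difference estimate.
check_numeric_gradient(y, [xa], numeric_eps=1e-3)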
@@ -486,6 +486,20 @@ def fsigmoid(a):
    check_symbolic_forward(y, [xa], [ya])
    check_symbolic_backward(y, [xa], [np.ones(shape)], [ya * (1 - ya)])

def test_softsign():
add @with_seed()
* Add SoftSign Activation function
@@ -106,6 +106,23 @@ The storage type of ``sigmoid`` output is always dense

MXNET_OPERATOR_REGISTER_BINARY_WITH_SPARSE_CPU(_backward_sigmoid,
                                               unary_bwd<mshadow_op::sigmoid_grad>);
// softsign
MXNET_OPERATOR_REGISTER_UNARY(softsign)
MXNET_ADD_SPARSE_OP_ALIAS(softsign)
This operator doesn't have a sparse implementation. Sparse op alias should not be registered.
@@ -168,6 +173,10 @@ void ActivationGradComputeImpl(const ActivationParam &param, const OpContext &ctx
      ActivationBackward<xpu, mshadow_op::softrelu, mshadow_op::softrelu_grad, DType>(
          ctx, out_grad, out_data, req, output);
      break;
    case activation::kSoftSign:
Are you sure this is correct? ActivationGradComputeImpl takes out_grad and out_data to calculate output. In other words, for y = activation(x), it calculates dx = _backward_activation(dy, y), not dx = _backward_activation(dy, x).
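A minimal numpy sketch (not MXNet code; the helper names below are illustrative) makes the reviewer's point concrete: the softsign gradient 1/(1+|x|)^2 can be rewritten purely in terms of the output y, which is what a backward pass of the form _backward_activation(dy, y) requires:

import numpy as np

def softsign(x):
    return x / (1.0 + np.abs(x))

def softsign_grad_from_x(x):
    # d/dx [x / (1 + |x|)] = 1 / (1 + |x|)^2
    return 1.0 / (1.0 + np.abs(x)) ** 2

def softsign_grad_from_y(y):
    # Since |y| = |x| / (1 + |x|), we have 1 - |y| = 1 / (1 + |x|),
    # so the gradient equals (1 - |y|)^2 as a function of y alone.
    return (1.0 - np.abs(y)) ** 2

x = np.linspace(-3.0, 3.0, 7)
y = softsign(x)
assert np.allclose(softsign_grad_from_x(x), softsign_grad_from_y(y))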
@@ -486,6 +486,21 @@ def fsigmoid(a):
    check_symbolic_forward(y, [xa], [ya])
    check_symbolic_backward(y, [xa], [np.ones(shape)], [ya * (1 - ya)])

@with_seed()
def test_softsign():
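For reference, a hedged sketch of what the full test body might look like, following the fsigmoid pattern visible in this hunk and assuming the test file's existing imports (the exact body merged in the PR may differ):

@with_seed()
def test_softsign():
    def fsoftsign(a):
        return a / (1.0 + np.abs(a))

    def fsoftsign_grad(a):
        # Gradient in terms of the input x.
        return 1.0 / np.square(1.0 + np.abs(a))

    shape = (3, 4)
    x = mx.symbol.Variable("x")
    y = mx.sym.softsign(x)
    xa = np.random.uniform(low=-1.0, high=1.0, size=shape)
    ya = fsoftsign(xa)
    check_numeric_gradient(y, [xa], numeric_eps=1e-3)
    check_symbolic_forward(y, [xa], [ya])
    check_symbolic_backward(y, [xa], [np.ones(shape)], [fsoftsign_grad(xa)])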
Is there a unit test for Activation(act_type='softsign')?
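If not, a minimal sketch of one, assuming this PR also wires softsign into the generic Activation operator (as the activation::kSoftSign case above suggests); the test name and body are hypothetical:

@with_seed()
def test_softsign_activation():
    # Exercises softsign via act_type rather than the standalone
    # mx.sym.softsign operator.
    shape = (3, 4)
    x = mx.symbol.Variable("x")
    y = mx.sym.Activation(data=x, act_type="softsign")
    xa = np.random.uniform(low=-1.0, high=1.0, size=shape)
    ya = xa / (1.0 + np.abs(xa))
    check_numeric_gradient(y, [xa], numeric_eps=1e-3)
    check_symbolic_forward(y, [xa], [ya])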
* Add SoftSign Activation function
Description
Add SoftSign Activation function

Checklist
Essentials
- Passed code style checking (make lint)

Changes
Comments