
Softsign op #9851

Merged
merged 4 commits into apache:master on Feb 23, 2018

Conversation

nswamy
Member

@nswamy nswamy commented Feb 21, 2018

Description

Add Softsign Activation function
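
For reference, the standard softsign activation is f(x) = x / (1 + |x|), with derivative f'(x) = 1 / (1 + |x|)^2. A minimal NumPy sketch of the math this operator implements (the helper names are illustrative, not part of the PR):

import numpy as np

def softsign(x):
    # f(x) = x / (1 + |x|): smooth, bounded to (-1, 1)
    return x / (1.0 + np.abs(x))

def softsign_grad(x):
    # f'(x) = 1 / (1 + |x|)^2, expressed in terms of the input x
    return 1.0 / np.square(1.0 + np.abs(x))

x = np.linspace(-5.0, 5.0, 11)
print(softsign(x))
print(softsign_grad(x))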

Checklist

Essentials

  • Passed code style checking (make lint)
  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage:
  • Unit tests are added for small changes to verify correctness (e.g. adding a new operator)
  • Nightly tests are added for complicated/long-running ones (e.g. changing distributed kvstore)
  • Build tests will be added for build configuration changes (e.g. adding a new build option with NCCL)
  • Code is well-documented:
  • For user-facing API changes, API doc string has been updated.
  • For new C++ functions in header files, their functionalities and arguments are documented.
  • For new examples, README.md is added to explain what the example does, the source of the dataset, expected performance on the test set, and a reference to the original paper if applicable
  • To the best of my knowledge, examples are either not affected by this change, or have been fixed to be compatible with this change

Changes

  • Feature1, tests, (and when applicable, API doc)
  • Feature2, tests, (and when applicable, API doc)

Comments

  • If this change is a backward incompatible change, why must this change be made.
  • Interesting edge cases to note here

@nswamy nswamy requested a review from cjolivier01 as a code owner February 21, 2018 18:57
@nswamy
Member Author

nswamy commented Feb 21, 2018

@cjolivier01 @anirudh2290

@cjolivier01
Member

does check_numeric_gradient() pass?

@nswamy
Member Author

nswamy commented Feb 22, 2018

Yes, it does pass the numeric_gradient check.
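
For context, a hedged sketch of how that check is typically invoked with MXNet's test utilities (the tolerances here are placeholders, not the values used in this PR):

import numpy as np
import mxnet as mx
from mxnet.test_utils import check_numeric_gradient

shape = (3, 4)
x = mx.symbol.Variable("x")
y = mx.sym.softsign(x)  # the operator name registered by this PR
xa = np.random.uniform(low=-1.0, high=1.0, size=shape)
# compares the symbolic backward pass against a finite-difference estimate
check_numeric_gradient(y, [xa], numeric_eps=1e-3, rtol=1e-2)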

@@ -486,6 +486,20 @@ def fsigmoid(a):
check_symbolic_forward(y, [xa], [ya])
check_symbolic_backward(y, [xa], [np.ones(shape)], [ya * (1 - ya)])

def test_softsign():
Member

add @with_seed()
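
That is, roughly (a sketch; with_seed is the seeding decorator from the unit-test helpers):

from common import with_seed  # test helper that fixes and logs the RNG seed

@with_seed()
def test_softsign():
    ...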

@szha szha mentioned this pull request Feb 22, 2018
@nswamy nswamy merged commit 6ebec87 into apache:master Feb 23, 2018
dabraude pushed a commit to dabraude/incubator-mxnet that referenced this pull request Feb 24, 2018
* Add SoftSign Activation function
@@ -106,6 +106,23 @@ The storage type of ``sigmoid`` output is always dense

MXNET_OPERATOR_REGISTER_BINARY_WITH_SPARSE_CPU(_backward_sigmoid,
unary_bwd<mshadow_op::sigmoid_grad>);
// softsign
MXNET_OPERATOR_REGISTER_UNARY(softsign)
MXNET_ADD_SPARSE_OP_ALIAS(softsign)
Member

This operator doesn't have a sparse implementation. Sparse op alias should not be registered.

@@ -168,6 +173,10 @@ void ActivationGradComputeImpl(const ActivationParam &param, const OpContext &ct
ActivationBackward<xpu, mshadow_op::softrelu, mshadow_op::softrelu_grad, DType>(
ctx, out_grad, out_data, req, output);
break;
case activation::kSoftSign:
Member

Are you sure this is correct?

ActivationGradComputeImpl takes out_grad and out_data to calculate output. In other words,
for y = activation(x), it calculates dx = _backward_activation(dy, y), not dx = _backward_activation(dy, x).
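
For softsign specifically, the gradient written in terms of the input x is 1 / (1 + |x|)^2, while written in terms of the output y = x / (1 + |x|) it is (1 - |y|)^2; a backward pass that only receives y must use the latter form. A small NumPy sketch of that equivalence (illustrative only):

import numpy as np

x = np.random.uniform(-3.0, 3.0, size=(5,))
y = x / (1.0 + np.abs(x))  # forward: softsign output

grad_from_input = 1.0 / np.square(1.0 + np.abs(x))  # f'(x) in terms of x
grad_from_output = np.square(1.0 - np.abs(y))       # same value, in terms of y

# the two agree, so a backward kernel that is handed y can still be exact
assert np.allclose(grad_from_input, grad_from_output)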

@@ -486,6 +486,21 @@ def fsigmoid(a):
check_symbolic_forward(y, [xa], [ya])
check_symbolic_backward(y, [xa], [np.ones(shape)], [ya * (1 - ya)])

@with_seed()
def test_softsign():
Member

Is there a unit test for Activation(act_type='softsign')?
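
A sketch of what such a test could look like (hedged; it assumes act_type='softsign' is accepted by Activation after this PR, and reuses the check helpers from the existing tests):

import numpy as np
import mxnet as mx
from mxnet.test_utils import check_symbolic_forward

shape = (3, 4)
x = mx.symbol.Variable("x")
y = mx.sym.Activation(data=x, act_type="softsign")  # fused Activation path
xa = np.random.uniform(low=-1.0, high=1.0, size=shape)
ya = xa / (1.0 + np.abs(xa))  # reference softsign values
check_symbolic_forward(y, [xa], [ya])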

rahul003 pushed a commit to rahul003/mxnet that referenced this pull request Jun 4, 2018
* Add SoftSign Activation function
zheng-da pushed a commit to zheng-da/incubator-mxnet that referenced this pull request Jun 28, 2018
* Add SoftSign Activation function