Conversation
Thanks for all the changes. There are a couple of minor issues.
@@ -26,7 +26,6 @@ In this article, I will cover how to create a new layer from scratch, how to use

To create a new layer in the Gluon API, one must create a class that inherits from the [Block](https://github.com/apache/incubator-mxnet/blob/master/python/mxnet/gluon/block.py#L123) class. This class provides the most basic functionality, and all pre-defined layers inherit from it directly or via other subclasses. Because each layer in Apache MXNet inherits from `Block`, the words "layer" and "block" are used interchangeably within the Apache MXNet community.

- MXNet [7b24137](https://github.com/apache/incubator-mxnet/commit/7b24137ed45df605defa4ce72ec91554f6e445f0). See Instructions in [Setup and Installation]({{'/get_started'|relative_url}}).
The only instance method that needs to be implemented is [forward(self, x)](https://github.com/apache/incubator-mxnet/blob/master/python/mxnet/gluon/block.py#L415), which defines exactly what your layer does during forward propagation. Notice that you don't need to specify what the block should do during back propagation; the back propagation pass for blocks is done by Apache MXNet for you.
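For illustration, a minimal custom layer along these lines might look like the sketch below. The class name and the normalization logic are made up for the example; only the `Block` subclassing and the `forward` override come from the tutorial text.

```python
import mxnet as mx
from mxnet.gluon import nn

class NormalizationLayer(nn.Block):
    """Toy custom layer: scales inputs to the [0, 1] range."""

    def __init__(self, **kwargs):
        super(NormalizationLayer, self).__init__(**kwargs)

    def forward(self, x):
        # Only forward propagation is defined here; MXNet's autograd
        # derives the backward pass automatically.
        return (x - x.min()) / (x.max() - x.min() + 1e-12)

layer = NormalizationLayer()
print(layer(mx.nd.array([1, 2, 3, 4, 5])))
```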
Could you point to a specific tagged commit when linking to code for `forward(self, x)`, `hybridize`, etc.?
@@ -160,7 +160,7 @@ In classification, we often apply the
softmax operator to the predicted outputs to obtain prediction probabilities,
and then apply the cross entropy loss against the true labels:

- $$ \begin{align}\begin{aligned}p = \softmax({pred})\\L = -\sum_i \sum_j {label}_j \log p_{ij}\end{aligned}\end{align}
+ $$ \begin{align}\begin{aligned}p = softmax({pred})\\L = -\sum_i \sum_j {label}_j \log p_{ij}\end{aligned}\end{align}
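For context, this formula is what Gluon's built-in `SoftmaxCrossEntropyLoss` computes. A minimal sketch of its use, with made-up numbers:

```python
import mxnet as mx
from mxnet.gluon.loss import SoftmaxCrossEntropyLoss

loss_fn = SoftmaxCrossEntropyLoss()  # applies softmax to pred internally

pred = mx.nd.array([[2.0, 0.5, 0.3],
                    [0.1, 1.5, 0.2]])  # raw scores (logits), one row per sample
label = mx.nd.array([0, 1])            # true class indices

# L = -sum_j label_j * log(softmax(pred)_ij), one value per sample i
print(loss_fn(pred, label))
```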
LaTeX-wise, I think using `\text{softmax}` is better than plain "softmax".
Isn't softmax the name of the function, not a LaTeX command? I made this change because a ticket pointed out that there was a LaTeX rendering issue here: https://mxnet.apache.org/api/python/docs/tutorials/packages/gluon/loss/loss.html#Cross-Entropy-Loss-with-Softmax
What you did works. Previously it was `\softmax`, which caused the rendering issue; enclosing it in `\text{softmax}` tells LaTeX to properly render it as text.
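Concretely, with that change the equation would read:

```latex
$$
\begin{aligned}
p &= \text{softmax}({pred}) \\
L &= -\sum_i \sum_j {label}_j \log p_{ij}
\end{aligned}
$$
```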
Modulo Ehsan's comments, approved.
@TEChopra1000 please update the title of the PR. These titles end up in the release notes, and they should be properly descriptive... like "documentation update to fix LaTeX rendering" (or similar).
Addressed @ehsanmok's comments:
- updated GitHub links
- updated LaTeX syntax
* fixing links for auto-gluon api index page
* doc: fixing link rendering issues and other small doc fixes
* typo fix in custom_layer and fixing links in datasets gluon tutorial
* fixing link
* Update LaTex syntax
* updated github link
* updated github link to use the most recent stable release tag
Description
Testing
I tested the doc-build locally and previewed the changes.