This repository has been archived by the owner on Nov 17, 2023. It is now read-only.
Fp16 support for layernorm #14073
Currently I propose to solve the issue in the following two steps:
Once #14616 is merged, we can simply switch LayerNorm's reduction to the safe version to achieve the first step @sxjscience proposed. Then we can explore the implementation of the second step later. @eric-haibin-lin What do you think?
@haojin2 Yes, we should first try to directly change the reduction to the safe version.
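As a rough illustration of the "safe" reduction being discussed, here is a minimal NumPy sketch (not MXNet internals; `layernorm_fp16_safe` is a hypothetical name) of a LayerNorm forward pass that accepts fp16 input but carries the mean/variance reductions in fp32 before casting the result back down:

```python
import numpy as np

def layernorm_fp16_safe(x, gamma, beta, eps=1e-5):
    """LayerNorm over the last axis: fp16 in/out, fp32 reductions."""
    x32 = x.astype(np.float32)                  # upcast before reducing
    mean = x32.mean(axis=-1, keepdims=True)     # reduction done in fp32
    var = x32.var(axis=-1, keepdims=True)       # reduction done in fp32
    y32 = (x32 - mean) / np.sqrt(var + eps)
    y32 = y32 * gamma.astype(np.float32) + beta.astype(np.float32)
    return y32.astype(np.float16)               # downcast only the result

x = np.random.randn(2, 8).astype(np.float16)
gamma = np.ones(8, dtype=np.float16)
beta = np.zeros(8, dtype=np.float16)
y = layernorm_fp16_safe(x, gamma, beta)
```

Each output row is normalized to roughly zero mean and unit variance, and only the final store rounds back to fp16, so the reductions themselves never accumulate in half precision.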
This can probably be closed now that #15002 is merged.
Currently, given fp16 inputs, nd.LayerNorm/sym.LayerNorm perform the reduction in fp16, which loses precision. The reduction should be done in fp32 instead. @sxjscience
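To see why an fp16 reduction loses precision, here is a small standalone NumPy demonstration (an illustration of the general problem, not MXNet's actual kernel): once the running sum grows large, fp16 has too little mantissa to absorb each small addend, while an fp32 accumulator stays accurate.

```python
import numpy as np

x = np.full(4096, 0.1, dtype=np.float16)

# Naive fp16 reduction: every partial sum is rounded back to fp16,
# so the accumulator eventually stops absorbing the small addends.
acc16 = np.float16(0.0)
for v in x:
    acc16 = np.float16(acc16 + v)
mean_fp16 = float(acc16) / x.size

# "Safe" reduction: accumulate in fp32, as proposed for LayerNorm.
acc32 = np.float32(0.0)
for v in x:
    acc32 += np.float32(v)
mean_fp32 = float(acc32) / x.size

print(mean_fp16, mean_fp32)  # the fp16 mean drifts far from 0.1
```

The fp32 accumulator recovers the mean to within fp16's own representation error of 0.1, while the fp16 accumulator is off by a large margin, which is exactly the failure mode this issue reports for the mean/variance reductions inside LayerNorm.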