New alternatives to layer norm #1114
Merged
Changes from 21 commits
Commits (29 total)
c9f262b
layer norm variants
04739da
fixed default
8dbdb5d
license, changelog and test
gdevos010 41d1055
Merge branch 'master' into altLayerNorm
gdevos010 45596cb
powerNorm
gdevos010 7d103ea
arxiv link
gdevos010 c9b5a1f
typos
gdevos010 7b8bd77
typo
gdevos010 16f946a
adding PowerNorm to docs
gdevos010 c68a4a6
PR comments
gdevos010 99a739b
PR comments
gdevos010 c465102
PR comments
gdevos010 89314d0
Merge branch 'master' into altLayerNorm
gdevos010 b0d6246
Merge branch 'master' into altLayerNorm
hrzn 0e7d381
PR comments
gdevos010 13c58c1
Merge branch 'master' into altLayerNorm
gdevos010 0892842
removed ScaleNorm. merged files
c4b82bd
Merge branch 'master' into altLayerNorm
gdevos010 52831de
Merge branch 'master' into altLayerNorm
gdevos010 ef9e9e6
Merge branch 'master' into altLayerNorm
hrzn 0000e2d
added LayerNormNoBias and removed powernorm
9dc066f
Merge branch 'master' into altLayerNorm
gdevos010 5354977
custom norm_type
476d666
new norm layers for transformer model
e5b2943
fix test
98b0771
Update darts/models/forecasting/transformer_model.py
gdevos010 f3094e9
Update darts/models/forecasting/tft_model.py
gdevos010 4518995
layer norm test
2692eac
Merge branch 'master' into altLayerNorm
gdevos010
@@ -0,0 +1,56 @@
"""
MIT License

Copyright (c) 2020 Phil Wang

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
"""

import torch
import torch.nn as nn


class RMSNorm(nn.Module):
    """An alternative to layer normalization, without mean centering and the learned bias [1]

    References
    ----------
    .. [1] Zhang, Biao, and Rico Sennrich. "Root mean square layer normalization." Advances in Neural Information
       Processing Systems 32 (2019).
    """

    def __init__(self, dim, eps=1e-8):
        super().__init__()
        self.scale = dim**-0.5
        self.eps = eps
        self.g = nn.Parameter(torch.ones(dim))

    def forward(self, x):
        # Rescale by the RMS over the last dimension (||x|| / sqrt(dim)), clamped for stability
        norm = torch.norm(x, dim=-1, keepdim=True) * self.scale
        return x / norm.clamp(min=self.eps) * self.g


class LayerNormNoBias(nn.LayerNorm):
    # Standard layer normalization without the learned elementwise affine parameters
    def __init__(self, input_size, **kwargs):
        super().__init__(input_size, elementwise_affine=False, **kwargs)


class LayerNorm(nn.LayerNorm):
    def __init__(self, input_size, **kwargs) -> None:
        super().__init__(input_size, **kwargs)
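For reviewers, a minimal usage sketch of the new layers (plain PyTorch; the tensor shapes and the direct use of the classes defined above are illustrative assumptions, not part of the PR):

import torch

# Hypothetical smoke test for the layers above (shapes chosen for illustration).
x = torch.randn(32, 10, 64)  # (batch, time, d_model)

rms = RMSNorm(dim=64)
ln_no_bias = LayerNormNoBias(64)

# Both preserve the input shape: RMSNorm only rescales by the RMS over the last
# dimension, while LayerNormNoBias also centers the mean but learns no affine parameters.
assert rms(x).shape == x.shape
assert ln_no_bias(x).shape == x.shape

Per the later commits ("custom norm_type", updates to transformer_model.py and tft_model.py), these layers appear to be selectable on the forecasting models through a norm-type option; the exact parameter name and accepted values are defined in those model changes.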
How about renaming to layer_norm_variant? To make it clear that it touches the layer norm part of the transformer.