Normalization layer - Swappable? #12
CCranney started this conversation in Base Code improvements
Replies: 0 comments
Currently, the normalization step is hard-coded to Layer Normalization, as described in Attention Is All You Need. I've heard, however, that this too is something that can be adjusted to improve performance.
This would require more research, but I'm thinking it would be a good idea to update the SublayerUnit class so that it can swap out LayerNormalization for other normalization methods.
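As a rough illustration (not the actual class in this repo, whose signature may differ), a minimal sketch of a SublayerUnit that accepts an injected normalization module, with RMSNorm shown as one example alternative:

```python
from typing import Optional
import torch
import torch.nn as nn


class RMSNorm(nn.Module):
    """Root-mean-square normalization, a commonly used LayerNorm alternative."""

    def __init__(self, d_model: int, eps: float = 1e-6):
        super().__init__()
        self.eps = eps
        self.scale = nn.Parameter(torch.ones(d_model))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Normalize by the root mean square over the feature dimension.
        rms = x.pow(2).mean(dim=-1, keepdim=True).add(self.eps).sqrt()
        return self.scale * x / rms


class SublayerUnit(nn.Module):
    """Residual wrapper around a sublayer, with a swappable norm module.

    Hypothetical interface: the real SublayerUnit in the codebase may take
    different constructor arguments.
    """

    def __init__(self, d_model: int, dropout: float = 0.1,
                 norm: Optional[nn.Module] = None):
        super().__init__()
        # Default to LayerNorm; any module mapping (..., d_model) -> same shape
        # can be injected instead.
        self.norm = norm if norm is not None else nn.LayerNorm(d_model)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x, sublayer):
        # Pre-norm residual form: x + Dropout(Sublayer(Norm(x))).
        # (The original paper uses post-norm; this is just one common variant.)
        return x + self.dropout(sublayer(self.norm(x)))


# Swapping normalization methods then becomes a one-line change at construction:
# unit = SublayerUnit(d_model=512, norm=RMSNorm(512))
```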