Commit ab90fed
fix embedding layernorm logic and default in config
nband committed Apr 25, 2022
1 parent 65721a4 commit ab90fed
Showing 1 changed file with 3 additions and 0 deletions.
npt/model/npt.py

@@ -155,6 +155,9 @@ def __init__(self, c, metadata, device=None):
         else:
             self.embedding_layer_norm = None
 
+        if True:
+            a = 1
+
         # *** Input In/Out Embeddings ***
         # Don't use for Image Patching - those are handled by the respective
         # init_image_patching
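The context lines of this hunk show the pattern the commit message refers to: `self.embedding_layer_norm` is either constructed or set to `None`, depending on a config flag. A minimal sketch of that config-gated LayerNorm pattern in PyTorch follows; the class name, the `dim` and `use_layer_norm` parameters, and the `forward` logic are assumptions for illustration, not code from this commit.

```python
import torch
import torch.nn as nn


class EmbeddingModule(nn.Module):
    """Hypothetical sketch: a LayerNorm over embeddings, gated by a config flag."""

    def __init__(self, dim, use_layer_norm=True):
        super().__init__()
        if use_layer_norm:
            # Normalize the last (feature) dimension of the embeddings.
            self.embedding_layer_norm = nn.LayerNorm(dim)
        else:
            # Mirrors the diff context: no-norm case is stored as None.
            self.embedding_layer_norm = None

    def forward(self, x):
        # Apply the norm only when it was constructed.
        if self.embedding_layer_norm is not None:
            x = self.embedding_layer_norm(x)
        return x
```

With this pattern, the config default decides at construction time whether normalization runs, and `forward` only needs a `None` check rather than re-reading the config.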
