Fix transformer LM arg upgrade logic (#1717)
Summary: Pull Request resolved: fairinternal/fairseq-py#1717

Reviewed By: alexeib

Differential Revision: D27156919

Pulled By: myleott

fbshipit-source-id: af3c2e41464c04a7808f40894e7b0106798e4822
Myle Ott authored and facebook-github-bot committed Mar 21, 2021
1 parent b8d15f7 commit 82473be
Showing 1 changed file with 0 additions and 3 deletions: fairseq/models/transformer_lm.py
```diff
@@ -230,9 +230,6 @@ def __init__(self, decoder):
     def build_model(cls, args, task):
         """Build a new model instance."""

-        # make sure all arguments are present in older models
-        base_lm_architecture(args)
-
         if args.decoder_layers_to_keep:
             args.decoder_layers = len(args.decoder_layers_to_keep.split(","))
```
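The deleted call re-applied architecture defaults during model construction. A minimal sketch of why that can interfere with args restored from an older checkpoint (hypothetical helper bodies, not fairseq's actual upgrade code; only the names `base_lm_architecture`, `build_model`, `decoder_layers`, and `decoder_layers_to_keep` come from the diff):

```python
from argparse import Namespace

def base_lm_architecture(args):
    # Hypothetical defaults-applier: fills in any field that is missing.
    # Running this at build time risks clobbering fields that the
    # checkpoint-upgrade path has already resolved.
    args.decoder_layers = getattr(args, "decoder_layers", 6)
    args.decoder_embed_dim = getattr(args, "decoder_embed_dim", 512)

def build_model(args):
    # Mirrors the post-fix shape of build_model: it no longer calls
    # base_lm_architecture(args); upgrade logic elsewhere is responsible
    # for making sure all arguments are present in older models.
    if getattr(args, "decoder_layers_to_keep", None):
        args.decoder_layers = len(args.decoder_layers_to_keep.split(","))
    return args

# Args as restored from a pruned checkpoint that kept 4 decoder layers.
restored = Namespace(decoder_layers_to_keep="0,1,2,3")
built = build_model(restored)
print(built.decoder_layers)  # 4
```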
