Currently, you can set the parameters bidirectional and use_first_representation independently. Since it does not make sense to train an LSTM bidirectionally and then use only the last representation, we should concatenate the last and first representations whenever bidirectional is set. This would let us remove the use_first_representation parameter entirely.
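A minimal sketch of the intended behaviour (illustrative only, with made-up tensor names and sizes, not the actual implementation): when bidirectional is set, the forward direction's last output is concatenated with the backward direction's first output, so use_first_representation becomes redundant.

# Illustrative sketch of the proposed behaviour, not the real implementation.
import torch
import torch.nn as nn

embedding_dim, hidden_size, bidirectional = 100, 128, True
lstm = nn.LSTM(embedding_dim, hidden_size, bidirectional=bidirectional, batch_first=True)

# token_embeddings: (batch, seq_len, embedding_dim) -- dummy input for illustration
token_embeddings = torch.randn(2, 10, embedding_dim)
outputs, _ = lstm(token_embeddings)  # (batch, seq_len, num_directions * hidden_size)

if bidirectional:
    last_forward = outputs[:, -1, :hidden_size]   # forward direction, last time step
    first_backward = outputs[:, 0, hidden_size:]  # backward direction, first time step
    document_embedding = torch.cat([last_forward, first_backward], dim=1)
else:
    document_embedding = outputs[:, -1, :]        # unidirectional: last output only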
Additionally, we need to register the token embeddings as modules to avoid CUDA/CPU device conflicts:
# register each token embedding as a submodule so its parameters move with .to(device)
for i, embedding in enumerate(embeddings):
    self.add_module('token_embedding_{}'.format(i), embedding)
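(An alternative with the same effect would be to store the embeddings in an nn.ModuleList, which registers each entry as a submodule automatically.)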