
Update DocumentLSTMEmbeddings #150

Closed
tabergma opened this issue Oct 18, 2018 · 0 comments · Fixed by #156
Labels: enhancement (Improving of an existing feature)

Comments

tabergma (Collaborator) commented Oct 18, 2018

Currently, the parameters bidirectional and use_first_representation can be set independently. Since it does not make sense to train a bidirectional LSTM and take only the last representation, we should concatenate the last and first representations whenever bidirectional is set. The parameter use_first_representation can then be removed.
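
For illustration, the selection logic could look roughly like the sketch below. The function name and the assumption that outputs follow PyTorch's nn.LSTM layout (forward states in the first half of the last dimension, backward states in the second half) are illustrative, not Flair's actual code:

import torch

def extract_document_embedding(outputs: torch.Tensor, bidirectional: bool) -> torch.Tensor:
    # outputs: (seq_len, batch, num_directions * hidden_size), as returned by nn.LSTM
    if bidirectional:
        hidden_size = outputs.size(2) // 2
        # forward direction at the last token, backward direction at the first token
        forward_last = outputs[-1, :, :hidden_size]
        backward_first = outputs[0, :, hidden_size:]
        return torch.cat([forward_last, backward_first], dim=1)
    # unidirectional: the last output is the document representation
    return outputs[-1]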

Additionally, we need to register the token embeddings as submodules to avoid CUDA/CPU device conflicts:

# register each token embedding as a submodule of DocumentLSTMEmbeddings
for i, embedding in enumerate(embeddings):
    self.add_module('token_embedding_{}'.format(i), embedding)
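
With the embeddings registered via add_module, a call such as document_embeddings.to(torch.device('cuda')) moves them together with the rest of the model, so the wrapped token embeddings can no longer remain on the CPU while the LSTM sits on the GPU.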
tabergma added the enhancement label Oct 18, 2018
tabergma self-assigned this Oct 18, 2018
alanakbik pushed a commit that referenced this issue Oct 18, 2018