Using One Hot Embeddings fails on gpu #1141

Closed
carljohanrehn opened this issue Sep 22, 2019 · 1 comment · Fixed by #1147
Labels
question (Further information is requested)

Comments

carljohanrehn commented Sep 22, 2019

One Hot Embedding fails when printing embeddings: it works if I use the CPU, but not with the GPU. I think I found an issue with the use of torch.nn.Embedding(...) in embeddings.py, in OneHotEmbeddings.__init__(...).

I have,

import flair, torch

# flair.device = torch.device('cpu')
flair.device = torch.device('cuda:0')

from flair.embeddings import OneHotEmbeddings

word_embedding = OneHotEmbeddings(corpus)  # corpus: a flair Corpus built earlier in the script
sentence = corpus.train[0]
word_embedding.embed(sentence)

for token in sentence:
    print(token)
    print(token.embedding)

If I explicitly move the embedding layer to CUDA, everything works as expected, but it fails otherwise. Thus,

# model architecture
self.embedding_layer = torch.nn.Embedding(
    len(self.vocab_dictionary), self.__embedding_length
)
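# workaround: move the layer's weights onto the GPU so they match the device of the index tensor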
if torch.cuda.is_available() and flair.device.type != 'cpu':
    self.embedding_layer.cuda()

Without the "fix" I get this error message:

Traceback (most recent call last):
  File ".../examples/flaircorpus.py", line 291, in <module>
    word_embedding.embed(sentence)
  File ".../github/flair/flair/embeddings.py", line 522, in _add_embeddings_internal
    self._add_embeddings_internal(sentences)
  File ".../github/flair/flair/embeddings.py", line 522, in _add_embeddings_internal
    embedded = self.embedding_layer.forward(one_hot_sentences)
  File ".../lib/python3.6/site-packages/torch/nn/modules/sparse.py", line 117, in forward
    self.norm_type, self.scale_grad_by_freq, self.sparse)
  File ".../lib/python3.6/site-packages/torch/nn/functional.py", line 1506, in embedding
    return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)
RuntimeError: Expected object of backend CPU but got backend CUDA for argument #3 'index'
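
For reference, the same backend mismatch can be reproduced outside of flair with a few lines of plain PyTorch (a minimal sketch, assuming a CUDA-capable machine; this is not part of the original script):

import torch

# embedding weights live on the GPU, but the index tensor stays on the CPU,
# so the lookup raises a device/backend mismatch error like the one above
embedding_layer = torch.nn.Embedding(num_embeddings=10, embedding_dim=4).cuda()
indices = torch.tensor([1, 2, 3])      # CPU tensor
embedded = embedding_layer(indices)    # RuntimeError: CPU vs CUDA mismatch
# moving either side fixes it, e.g. embedding_layer(indices.cuda())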
carljohanrehn added the question label Sep 22, 2019
alanakbik (Collaborator) commented:

@carljohanrehn thanks for reporting this! I was able to reproduce it and found it's a bug in the constructor of the OneHotEmbeddings class. Will put in a PR to fix this!
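
A minimal sketch of what such a constructor-level fix could look like (an assumption about the fix, not the actual contents of the linked PR): move the module onto flair.device when it is built, so the embedding weights end up on the same device as the index tensors created in _add_embeddings_internal.

# hypothetical end of OneHotEmbeddings.__init__ (sketch only)
self.embedding_layer = torch.nn.Embedding(
    len(self.vocab_dictionary), self.__embedding_length
)
# putting the whole module on flair.device keeps the embedding weights and the
# one-hot index tensors on the same device, with no explicit .cuda() call needed
self.to(flair.device)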
