
Does openspeech.modules.conv2d_subsampling's forward method make a mistake while calculating the output_lengths? #76

Closed
wl-junlin opened this issue Aug 16, 2021 · 4 comments · Fixed by #78
Assignees
Labels
BUG Something isn't working DONE QUESTION Further information is requested

Comments


wl-junlin commented Aug 16, 2021

❓ Questions & Help

def forward(self, inputs: torch.Tensor, input_lengths: torch.Tensor) -> Tuple[torch.Tensor, torch.Tensor]:
    outputs, input_lengths = super().forward(inputs, input_lengths)
    output_lengths = input_lengths >> 2
    output_lengths -= 1
    return outputs, output_lengths

I believe lines 3 and 4 of the above forward method are redundant and wrong.
In line 2, input_lengths has already been updated to the proper output_lengths.

Details

e.g.:
inputs.shape ==> [batch_size, time, features] == [2, 1000, 80]
input_lengths ==> [1000, 587]

then after line 2 executes:
outputs.shape ==> [2, 249, num_out_features]
input_lengths ==> [249, 146]

input_lengths has been updated to the proper output_lengths,
but lines 3 and 4 then reassign output_lengths to input_lengths divided by 4 minus 1, which I believe leads to an error.
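The length arithmetic in the report can be checked with plain integers. This is a minimal sketch, assuming the subsampling module stacks two conv layers with kernel size 3, stride 2, and no padding (a common Conv2dSubsampling configuration; the exact hyperparameters are an assumption here, not taken from the thread):

```python
def conv2d_subsampling_length(length: int) -> int:
    """Hypothetical sketch: sequence length after two stacked conv layers
    with kernel_size=3, stride=2, no padding (assumed configuration)."""
    for _ in range(2):  # two conv layers
        length = (length - 3) // 2 + 1  # standard conv output-size formula
    return length

# Matches the example in the report: [1000, 587] -> [249, 146],
# i.e. super().forward already returns the reduced lengths.
print(conv2d_subsampling_length(1000))  # -> 249
print(conv2d_subsampling_length(587))   # -> 146

# Applying the extra ">> 2" and "-= 1" on top of an already-reduced
# length shrinks it again, which is the reported bug:
print((249 >> 2) - 1)  # -> 61, not 249
```

So the two extra lines divide an already-subsampled length by 4 a second time, which is why deleting them fixes the mismatch.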

@sooftware sooftware added QUESTION Further information is requested BUG Something isn't working labels Aug 20, 2021
@sooftware sooftware self-assigned this Aug 20, 2021
@sooftware
Member

@wl-junlin Oh, thank you!
I'll check. It seems to me that there is a problem too.

@sooftware
Member

sooftware commented Aug 20, 2021

@hasangchun I think our code is wrong. What do you think?

@upskyy
Member

upskyy commented Aug 20, 2021

@sooftware I think so. I think we can delete the two lines below, but I'll test it when I have time. [link]

def forward(self, inputs: torch.Tensor, input_lengths: torch.Tensor) -> Tuple[torch.Tensor, torch.Tensor]:
    outputs, output_lengths = super().forward(inputs, input_lengths)
    # output_lengths = input_lengths >> 2
    # output_lengths -= 1
    return outputs, output_lengths

@sooftware
Member

@hasangchun Great. Please add a test file too.
@wl-junlin thanks for letting us know.

upskyy added a commit that referenced this issue Aug 24, 2021
@upskyy upskyy mentioned this issue Aug 24, 2021
sooftware added a commit that referenced this issue Aug 24, 2021
@sooftware sooftware added the DONE label Aug 24, 2021