Use argmax when topk=1 #419
Conversation
Hello @SkafteNicki! Thanks for updating this PR. There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻 Comment last updated at 2021-08-03 10:52:15 UTC
for more information, see https://pre-commit.ci
Codecov Report
@@ Coverage Diff @@
## master #419 +/- ##
==========================================
- Coverage 96.08% 96.00% -0.08%
==========================================
Files 126 126
Lines 4083 4085 +2
==========================================
- Hits 3923 3922 -1
- Misses 160 163 +3
LGTM !
What does this PR do?
Fixes #417
`torch.topk` has terrible performance on CPU. This PR exchanges the call to `torch.topk` for `torch.argmax` when `topk=1`, which gives better performance (much better on CPU, the same on GPU):

Using `torch.topk`:
Timing cpu: 0.03081380844116211 ± 0.006388481400645466
Timing cuda: 0.0037766695022583008 ± 0.0007294738042081203

Using `torch.argmax`:
Timing cpu: 0.0041289663314819335 ± 0.0004889978751252193
Timing cuda: 0.0038677167892456053 ± 0.002895336883318616
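The change can be sketched as follows. This is a minimal illustration of the idea rather than the actual patch; the function name and the `dim`/`keepdim` choices are assumptions:

```python
import torch

def top1_indices(preds: torch.Tensor, k: int = 1) -> torch.Tensor:
    # For k == 1, torch.argmax produces the same indices as torch.topk
    # but is much faster on CPU; fall back to topk for k > 1.
    if k == 1:
        return preds.argmax(dim=-1, keepdim=True)
    return preds.topk(k, dim=-1).indices

preds = torch.randn(8, 10)
# Both paths select the index of the largest value along the last dim.
assert torch.equal(top1_indices(preds), preds.topk(1, dim=-1).indices)
```

With `keepdim=True`, `argmax` returns the same shape as `topk(1).indices`, so the two are drop-in replacements for each other in the `k=1` case (ties aside, which are vanishingly unlikely for float inputs).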
Script:
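The benchmark script itself is not reproduced above. A minimal sketch of how such a timing comparison could look (the tensor shape and repeat count are assumptions, not the author's actual settings):

```python
import statistics
import time

import torch

def bench(fn, n_repeats=20):
    # Time fn over several repeats; return mean and stdev in seconds.
    times = []
    for _ in range(n_repeats):
        start = time.perf_counter()
        fn()
        times.append(time.perf_counter() - start)
    return statistics.mean(times), statistics.stdev(times)

preds = torch.randn(100, 50, 1000)  # assumed input shape
mean_topk, std_topk = bench(lambda: preds.topk(1, dim=-1))
mean_argmax, std_argmax = bench(lambda: preds.argmax(dim=-1))
print(f"topk:   {mean_topk}+-{std_topk}")
print(f"argmax: {mean_argmax}+-{std_argmax}")
```

On CUDA, a fair comparison would also call `torch.cuda.synchronize()` before reading the clock, since GPU kernels launch asynchronously.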
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃