
Clean cuda.empty_cache usage #8199

Merged — 4 commits merged into master from refactor/clean-empty-cache-usage on Jun 30, 2021
Conversation

@carmocca (Contributor) commented Jun 29, 2021

What does this PR do?

The empty_cache function already checks if is_initialized:

https://github.com/pytorch/pytorch/blob/v1.4.1/torch/cuda/memory.py#L34-L35

We also don't need to switch to the root device, since the default device was already set earlier.
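
As a rough sketch of the resulting simplification (hypothetical names, not the exact diff in this PR), the explicit availability guard and root-device context manager can be dropped:

```python
import torch

root_device = torch.device("cuda", 0)  # hypothetical; Lightning sets this per process

# Before (hypothetical pattern): guard the call and switch to the root device
if torch.cuda.is_available():
    with torch.cuda.device(root_device):
        torch.cuda.empty_cache()

# After: empty_cache() already returns early while CUDA is uninitialized,
# and it acts on the current device, which was set earlier during setup.
torch.cuda.empty_cache()
```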

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • [n/a] Did you make sure to update the documentation with your changes? (if necessary)
  • [n/a] Did you write any new necessary tests? (not for typos and docs)
  • [n/a] Did you verify new and existing tests pass locally with your changes?
  • [n/a] Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

@carmocca carmocca added the distributed (Generic distributed-related topic) and refactor labels Jun 29, 2021
@carmocca carmocca added this to the v1.4 milestone Jun 29, 2021
@carmocca carmocca self-assigned this Jun 29, 2021
@codecov codecov bot commented Jun 29, 2021

Codecov Report

Merging #8199 (948dd59) into master (87b1b86) will decrease coverage by 1%.
The diff coverage is 81%.

@@          Coverage Diff           @@
##           master   #8199   +/-   ##
======================================
- Coverage      93%     92%   -1%     
======================================
  Files         212     212           
  Lines       13588   13664   +76     
======================================
- Hits        12626   12619    -7     
- Misses        962    1045   +83     

@Borda Borda force-pushed the refactor/clean-empty-cache-usage branch from 26ca1d2 to ff38b10 Compare June 29, 2021 21:15
@mergify mergify bot removed the has conflicts label Jun 29, 2021
@awaelchli (Contributor) left a comment

fingers crossed ;)

@carmocca carmocca enabled auto-merge (squash) June 30, 2021 11:03
@Borda (Member) left a comment

just curious what was the reason we did it before only for root?

@carmocca carmocca merged commit 74eb6cc into master Jun 30, 2021
@carmocca carmocca deleted the refactor/clean-empty-cache-usage branch June 30, 2021 11:04
@carmocca (Contributor, Author) replied:

> just curious what was the reason we did it before only for root?

We set the default device:

https://github.com/PyTorchLightning/pytorch-lightning/blob/74eb6cc7e90b4b06c0136504101d6f5c343e93dc/pytorch_lightning/plugins/training_type/ddp.py#L373

So the context manager was unnecessary.
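
To illustrate (a minimal sketch, assuming the linked line sets the process's default CUDA device during DDP setup):

```python
import torch

# During DDP setup, each process makes its root device the current device.
torch.cuda.set_device(0)  # e.g. the local rank for this process

# From here on, CUDA calls such as empty_cache() already target that device,
# so wrapping them in `with torch.cuda.device(root_device):` changes nothing.
torch.cuda.empty_cache()
```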

Labels: distributed (Generic distributed-related topic), refactor

5 participants