
gc.collect when spilling on demand #771

Merged

Conversation

@madsbk (Member) commented Nov 5, 2021

When spilling on demand, we now call gc.collect().
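
For readers unfamiliar with the mechanism: spilling on demand means that when a device allocation fails, a handler frees device memory by moving buffers to host and then lets the allocation be retried. Running gc.collect() in that handler also releases device objects that are kept alive only by uncollected reference cycles. A minimal sketch of that shape (the names oom_handler and spill_device_buffers are hypothetical stand-ins, not the dask-cuda API):

```python
import gc


def spill_device_buffers(nbytes: int) -> int:
    """Stand-in for the real spilling logic: move device buffers to
    host memory and return the number of bytes freed."""
    return 0  # placeholder


def oom_handler(nbytes: int) -> bool:
    """Hypothetical out-of-memory callback, invoked when a device
    allocation of `nbytes` bytes fails. Returning True asks the
    allocator to retry the allocation."""
    # Run the garbage collector first, so device buffers held only by
    # uncollected reference cycles are released before we spill.
    gc.collect()
    freed = spill_device_buffers(nbytes)
    return freed >= nbytes
```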

@madsbk added the 2 - In Progress, improvement, and non-breaking labels on Nov 5, 2021
@github-actions bot added the python label on Nov 5, 2021
@madsbk marked this pull request as ready for review November 5, 2021 11:57
@madsbk requested a review from a team as a code owner November 5, 2021 11:57
@madsbk added the 3 - Ready for Review label and removed the 2 - In Progress label on Nov 5, 2021
@pentschev (Member) left a comment

LGTM, thanks @madsbk !

@pentschev (Member) commented:

I'm not sure whether the segfault in https://gpuci.gpuopenanalytics.com/job/rapidsai/job/gpuci/job/dask-cuda/job/prb/job/dask-cuda-gpu-test/CUDA=11.0,GPU_LABEL=gpu-a100,LINUX_VER=centos7,PYTHON=3.7/558/display/redirect is due to this PR or not; see below:

12:53:20 collecting ... [2040802eb7b1:346  :�:346] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x385f6)
12:53:23 ci/gpu/build.sh: line 114:   346 Segmentation fault      (core dumped) DASK_CUDA_TEST_SINGLE_GPU=1 UCXPY_IFNAME=eth0 UCX_WARN_UNUSED_ENV_VARS=n UCX_MEMTYPE_CACHE=n pytest -vs -Werror::DeprecationWarning -Werror::FutureWarning --cache-clear --basetemp="$WORKSPACE/dask-cuda-tmp" --junitxml="$WORKSPACE/junit-dask-cuda.xml" --cov-config=.coveragerc --cov=dask_cuda --cov-report=xml:"$WORKSPACE/dask-cuda-coverage.xml" --cov-report term dask_cuda/tests/

It looks like it happened before the actual testing started, so let's rerun and see.

@pentschev closed this Nov 5, 2021
@pentschev reopened this Nov 5, 2021
@codecov-commenter commented Nov 5, 2021

Codecov Report

❗ No coverage uploaded for pull request base (branch-21.12@984263c).
The diff coverage is n/a.

@@               Coverage Diff               @@
##             branch-21.12     #771   +/-   ##
===============================================
  Coverage                ?   57.60%           
===============================================
  Files                   ?       21           
  Lines                   ?     2944           
  Branches                ?        0           
===============================================
  Hits                    ?     1696           
  Misses                  ?     1248           
  Partials                ?        0           

Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 984263c...11fd5c3.

@pentschev (Member) commented:

Apologies, I clicked the wrong button and closed this by accident.

@madsbk (Member, Author) commented Nov 5, 2021

rerun tests

@pentschev (Member) commented:

rerun tests

1 similar comment
@pentschev (Member) commented:

rerun tests

@pentschev (Member) commented:

@gpucibot merge

@rapids-bot bot merged commit e4b8e19 into rapidsai:branch-21.12 Nov 5, 2021
@pentschev (Member) commented:

Thanks @madsbk !

@madsbk deleted the gc_collect_when_spilling_on_demand branch November 8, 2021 08:09
Labels
3 - Ready for Review (Ready for review by team), improvement (Improvement / enhancement to an existing function), non-breaking (Non-breaking change), python (python code needed)
3 participants