
Add config for Optuna #111

Merged
merged 7 commits into from
Dec 20, 2022
Conversation


@KKIEEK KKIEEK commented Dec 20, 2022

Modification

I added a config for Optuna. Please refer to these docs.

Limitation

/usr/local/lib/python3.7/site-packages/optuna/distributions.py:427: UserWarning: Choices for a categorical distribution should be a tuple of None, bool, int, float and str for persistent storage but contains segformer_mit_b0 which is of type ImmutableContainer.

However, we have already overridden the `__eq__` special method for `ImmutableContainer`, so this warning is probably not a critical problem.
Refer to Optuna's code and ours for details.
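To illustrate why the warning is benign: Optuna matches a sampled categorical value against the stored choices with `==`, so a wrapper with value-based equality behaves like the primitive it wraps. The sketch below is illustrative only, not siatune's actual implementation; the class body and field names are assumptions.

```python
class ImmutableContainer:
    """Minimal stand-in for a config wrapper with value-based equality.

    Hypothetical sketch; not siatune's actual class. Optuna compares a
    sampled value against the stored choices with `==` when restoring from
    persistent storage, so value-based __eq__/__hash__ keep that comparison
    consistent even though the choice is not a plain primitive.
    """

    def __init__(self, data, alias=None):
        self._data = data
        self.alias = alias

    @property
    def data(self):
        return self._data

    def __eq__(self, other):
        if isinstance(other, ImmutableContainer):
            return self.data == other.data
        return self.data == other

    def __hash__(self):
        # Hash by the wrapped value's repr so equal containers hash equally.
        return hash(repr(self._data))


a = ImmutableContainer({"model": "segformer_mit_b0"}, alias="segformer_mit_b0")
b = ImmutableContainer({"model": "segformer_mit_b0"}, alias="segformer_mit_b0")
assert a == b  # equality by wrapped value, so stored choices still match
```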

Result

MedianStoppingRule w/ Optuna

Current time: 2022-12-20 06:53:00 (running for 00:01:14.86)
Memory usage on this node: 40.4/1771.8 GiB 
Using MedianStoppingRule: num_stopped=0.
Resources requested: 4.0/92 CPUs, 4.0/4 GPUs, 0.0/1574.54 GiB heap, 0.0/186.26 GiB objects
Result logdir: /some/path/siatune/work_dirs/mmseg_median_optuna/DataParallelTrainer_2022-12-20_06-51-45
Number of trials: 5/8 (1 PENDING, 4 RUNNING)
+------------------------------+----------+------------------+------------------------+------------------------+------------------------+
| Trial name                   | status   | loc              |   train_loop_config/da | train_loop_config/mo   | train_loop_config/op   |
|                              |          |                  |     ta.samples_per_gpu | del                    | timizer                |
|------------------------------+----------+------------------+------------------------+------------------------+------------------------|
| DataParallelTrainer_c490a0fe | RUNNING  | 172.17.0.7:56624 |                      2 | upernet_swin           | adam                   |
| DataParallelTrainer_cf4645ee | RUNNING  | 172.17.0.7:56901 |                      3 | pspnet_r50_d8          | rms                    |
| DataParallelTrainer_d9311f70 | RUNNING  | 172.17.0.7:57446 |                      5 | segformer_mit_b0       | adam                   |
| DataParallelTrainer_e39e0d1a | RUNNING  | 172.17.0.7:58279 |                      6 | deeplabv3plus_r50_d8   | rms                    |
| DataParallelTrainer_ee1b810a | PENDING  |                  |                      6 | pspnet_r50_d8          | sgd                    |
+------------------------------+----------+------------------+------------------------+------------------------+------------------------+
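For reference, the run above (Optuna sampling over `data.samples_per_gpu`, `model`, and `optimizer`, scheduled by `MedianStoppingRule`) could be sketched as an mmcv-style config fragment like the one below. The field names and schema are assumptions inferred from the log, not necessarily the PR's actual config:

```python
# Hypothetical config sketch (field names inferred from the trial table
# above; not necessarily siatune's actual schema).
space = dict(
    data=dict(samples_per_gpu=dict(type='Randint', lower=2, upper=8)),
    model=dict(
        type='Choice',
        categories=[
            'upernet_swin', 'pspnet_r50_d8', 'segformer_mit_b0',
            'deeplabv3plus_r50_d8'
        ]),
    optimizer=dict(type='Choice', categories=['adam', 'rms', 'sgd']),
)
searcher = dict(type='OptunaSearch')        # Optuna-backed search algorithm
trial_scheduler = dict(type='MedianStoppingRule')  # early-stops poor trials
num_samples = 8
```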

@KKIEEK KKIEEK changed the title Test Optuna Support Optuna Dec 20, 2022

codecov-commenter commented Dec 20, 2022

Codecov Report

❗ No coverage uploaded for pull request base (main@796e6da).
Patch has no changes to coverable lines.

Additional details and impacted files
@@           Coverage Diff           @@
##             main     #111   +/-   ##
=======================================
  Coverage        ?   72.79%           
=======================================
  Files           ?       59           
  Lines           ?     1632           
  Branches        ?      235           
=======================================
  Hits            ?     1188           
  Misses          ?      345           
  Partials        ?       99           
Flag        Coverage Δ
unittests   72.79% <0.00%> (?)

Flags with carried forward coverage won't be shown.



@KKIEEK KKIEEK requested review from yhna940 and nijkah December 20, 2022 05:03
@KKIEEK KKIEEK changed the title Support Optuna Add config for Optuna Dec 20, 2022

@yhna940 yhna940 left a comment


LGTM 💡

@KKIEEK KKIEEK merged commit 598b8a7 into main Dec 20, 2022
@KKIEEK KKIEEK deleted the feat/optuna branch December 23, 2022 09:41
@yhna940 yhna940 mentioned this pull request Jan 7, 2023