
[Feature] Support HyperBand and BOHB scheduler #101

Merged · 36 commits from v2.1.0/scheduler-cfg into main · Dec 19, 2022

Conversation

@KKIEEK (Contributor) commented Dec 16, 2022

Modification

  • Support HyperBand
  • Support HyperBandForBOHB
  • Refactor builder for SEARCHERS and TRIAL_SCHEDULERS

Refer to this documentation for details.
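As a rough sketch of how the three schedulers could be specified through the refactored TRIAL_SCHEDULERS builder (the `dict(type=...)` keys below follow the usual MM registry convention and are an assumption, not the verified siatune API; the type names and constructor arguments are real Ray Tune scheduler classes from `ray.tune.schedulers`):

```python
# Hypothetical registry-style configs (a sketch; exact siatune keys may differ).
# The type names and arguments are real Ray Tune scheduler classes.
asha = dict(                          # "Async HB" run below
    type='AsyncHyperBandScheduler',
    time_attr='training_iteration',
    max_t=16,                         # longest rung; surviving trials above ran 16 iters
    grace_period=1,                   # every trial gets at least one iteration
    reduction_factor=4,
)
hyperband = dict(                     # "HB" run below: synchronous brackets
    type='HyperBandScheduler',
    time_attr='training_iteration',
    max_t=16,
)
bohb = dict(                          # "BOHB" run below; paired with the TuneBOHB searcher
    type='HyperBandForBOHB',
    time_attr='training_iteration',
    max_t=16,
)
```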

Example

Async HB

+------------------------------+------------+---------------------+------------------------+------------------------+------------------------+--------+------------------+------------------------+------------------------+---------------------+
| Trial name                   | status     | loc                 |   train_loop_config/da | train_loop_config/mo   | train_loop_config/op   |   iter |   total time (s) |   train/decode.loss_ce |   train/decode.acc_seg |   train/aux.loss_ce |
|                              |            |                     |     ta.samples_per_gpu | del                    | timizer                |        |                  |                        |                        |                     |
|------------------------------+------------+---------------------+------------------------+------------------------+------------------------+--------+------------------+------------------------+------------------------+---------------------|
| DataParallelTrainer_1c12a62e | RUNNING    | 10.244.28.73:850033 |                      2 | deeplabv3plus_r50_d8   | adamw                  |        |                  |                        |                        |                     |
| DataParallelTrainer_b98f53c6 | TERMINATED | 10.244.28.73:838522 |                      4 | deeplabv3plus_r50_d8   | adamw                  |     16 |          31.2932 |                2.59718 |                7.51988 |            1.04215  |
| DataParallelTrainer_d22babd2 | TERMINATED | 10.244.28.73:841482 |                      6 | pspnet_r50_d8          | rms                    |      2 |          21.4354 |                2.33154 |                2.80937 |            0.932751 |
| DataParallelTrainer_ea98f0d0 | TERMINATED | 10.244.28.73:845325 |                      7 | pspnet_r50_d8          | adam                   |      2 |          21.5254 |                2.51206 |                3.1459  |            0.995834 |
| DataParallelTrainer_bcc66fde | ERROR      | 10.244.28.73:840642 |                      5 | segformer_mit_b0       | adamw                  |        |                  |                        |                        |                     |
| DataParallelTrainer_da805af8 | ERROR      | 10.244.28.73:843367 |                      3 | deeplabv3plus_r50_d8   | adam                   |      1 |          22.0902 |                2.6276  |                1.36758 |            1.05352  |
| DataParallelTrainer_fb4c9e7c | ERROR      | 10.244.28.73:847215 |                      3 | deeplabv3plus_r50_d8   | adamw                  |      1 |          22.2709 |                2.0389  |                1.93737 |            0.81018  |
| DataParallelTrainer_0b625c34 | ERROR      | 10.244.28.73:849181 |                      4 | fpn_r50                | rms                    |        |                  |                        |                        |                     |
+------------------------------+------------+---------------------+------------------------+------------------------+------------------------+--------+------------------+------------------------+------------------------+---------------------+

HB

+------------------------------+------------+---------------------+------------------------+------------------------+------------------------+--------+------------------+------------------------+------------------------+---------------------+
| Trial name                   | status     | loc                 |   train_loop_config/da | train_loop_config/mo   | train_loop_config/op   |   iter |   total time (s) |   train/decode.loss_ce |   train/decode.acc_seg |   train/aux.loss_ce |
|                              |            |                     |     ta.samples_per_gpu | del                    | timizer                |        |                  |                        |                        |                     |
|------------------------------+------------+---------------------+------------------------+------------------------+------------------------+--------+------------------+------------------------+------------------------+---------------------|
| DataParallelTrainer_bfa37ef8 | RUNNING    | 10.244.28.73:865977 |                      7 | upernet_swin           | adam                   |        |                  |                        |                        |                     |
| DataParallelTrainer_d814d7c0 | PENDING    |                     |                      6 | segformer_mit_b0       | sgd                    |        |                  |                        |                        |                     |
| DataParallelTrainer_542da87e | TERMINATED | 10.244.28.73:853457 |                      7 | pspnet_r50_d8          | rms                    |     16 |          33.2772 |                1.7473  |              37.4854   |            0.768425 |
| DataParallelTrainer_57e6f61e | TERMINATED | 10.244.28.73:855683 |                      5 | upernet_swin           | sgd                    |     16 |          28.8432 |                2.07158 |               5.04539  |            0.844206 |
| DataParallelTrainer_95ea01ea | TERMINATED | 10.244.28.73:861581 |                      5 | deeplabv3plus_r50_d8   | adamw                  |     16 |          33.5505 |                1.93402 |               2.68645  |            0.75626  |
| DataParallelTrainer_a601eb38 | TERMINATED | 10.244.28.73:863830 |                      4 | deeplabv3plus_r50_d8   | adamw                  |     16 |          31.2407 |                2.12421 |               2.73684  |            0.83207  |
| DataParallelTrainer_6e9271cc | ERROR      | 10.244.28.73:857771 |                      6 | upernet_swin           | adamw                  |      1 |          21.7337 |                2.28166 |               0.824149 |            0.889275 |
| DataParallelTrainer_8538e8fc | ERROR      | 10.244.28.73:859686 |                      2 | pspnet_r50_d8          | adam                   |      1 |          20.8052 |                2.55568 |               0.919884 |            1.01873  |
+------------------------------+------------+---------------------+------------------------+------------------------+------------------------+--------+------------------+------------------------+------------------------+---------------------+

BOHB

+------------------------------+------------+---------------------+------------------------+------------------------+------------------------+--------+------------------+------+------------------------+------------------------+---------------------+
| Trial name                   | status     | loc                 |   train_loop_config/da | train_loop_config/mo   | train_loop_config/op   |   iter |   total time (s) |   ts |   train/decode.loss_ce |   train/decode.acc_seg |   train/aux.loss_ce |
|                              |            |                     |     ta.samples_per_gpu | del                    | timizer                |        |                  |      |                        |                        |                     |
|------------------------------+------------+---------------------+------------------------+------------------------+------------------------+--------+------------------+------+------------------------+------------------------+---------------------|
| DataParallelTrainer_ff2d4d6c | RUNNING    | 10.244.28.73:829315 |                      2 | upernet_swin           | adamw                  |      3 |          39.7579 |    0 |                2.72033 |               9.35506  |            1.11077  |
| DataParallelTrainer_d5f9948c | PAUSED     | 10.244.28.73:827289 |                      7 | deeplabv3plus_r50_d8   | sgd                    |      6 |          46.7037 |    0 |                2.29587 |               3.97511  |            0.924609 |
| DataParallelTrainer_1a0aca2e | PENDING    | 10.244.28.73:823490 |                      4 | pspnet_r50_d8          | adam                   |      2 |          20.916  |      |                1.46375 |               2.46173  |            0.573465 |
| DataParallelTrainer_27202c68 | PENDING    | 10.244.28.73:825381 |                      5 | pspnet_r50_d8          | adam                   |      2 |          21.5591 |      |                2.2028  |               3.57602  |            0.878709 |
| DataParallelTrainer_d91cf17c | TERMINATED | 10.244.28.73:816132 |                      5 | deeplabv3plus_r50_d8   | sgd                    |      2 |          21.1208 |      |                2.28296 |               5.48833  |            0.91002  |
| DataParallelTrainer_0784bcde | TERMINATED | 10.244.28.73:821535 |                      4 | deeplabv3plus_r50_d8   | sgd                    |      2 |          21.0901 |      |                2.49598 |               0.576004 |            0.973098 |
| DataParallelTrainer_e6bbddde | ERROR      | 10.244.28.73:818008 |                      2 | segformer_mit_b0       | rms                    |        |                  |      |                        |                        |                     |
| DataParallelTrainer_f6d4129a | ERROR      | 10.244.28.73:818819 |                      3 | segformer_mit_b0       | sgd                    |        |                  |      |                        |                        |                     |
+------------------------------+------------+---------------------+------------------------+------------------------+------------------------+--------+------------------+------+------------------------+------------------------+---------------------+
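The BOHB scheduler only manages the HyperBand brackets; candidate configurations come from the `TuneBOHB` searcher, so the two must be used together. A minimal standalone sketch against the plain Ray 2.1 API (independent of the siatune registries; `train_fn` is a placeholder standing in for the DataParallelTrainer above):

```python
from ray import tune
from ray.air import session
from ray.tune.schedulers import HyperBandForBOHB
from ray.tune.search.bohb import TuneBOHB  # requires ConfigSpace and hpbandster


def train_fn(config):
    # Dummy trainable: report a decreasing loss for up to 16 iterations.
    for step in range(16):
        session.report({'loss': config['lr'] / (step + 1)})


tuner = tune.Tuner(
    train_fn,
    param_space={'lr': tune.loguniform(1e-4, 1e-1)},
    tune_config=tune.TuneConfig(
        metric='loss',
        mode='min',
        num_samples=8,
        scheduler=HyperBandForBOHB(time_attr='training_iteration', max_t=16),
        search_alg=TuneBOHB(),  # inherits metric/mode from TuneConfig
    ),
)
tuner.fit()
```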

KKIEEK and others added 28 commits December 1, 2022 02:33
Signed-off-by: Junhwa Song <[email protected]>
Co-authored-by: Hakjin Lee <[email protected]>
Signed-off-by: Junhwa Song <[email protected]>
Co-authored-by: Hakjin Lee <[email protected]>
Signed-off-by: Junhwa Song <[email protected]>
Co-authored-by: Hakjin Lee <[email protected]>
Signed-off-by: Junhwa Song <[email protected]>
* Support custom trainer and backend

* Add comment

* Add assertion

* Fix typo

* Update siatune/ray/config.py

* Apply lint

* Fix test code

Co-authored-by: Hakjin Lee <[email protected]>
Co-authored-by: Hakjin Lee <[email protected]>
Signed-off-by: Junhwa Song <[email protected]>
* Update class signature

* Update mmseg

* Update mmdet

* Update mmcls

* Update configs

* Fix test code
* Fix blocking issue at test_tasks.py

* Support single GPU tuning

* Bump FLAML to v1.0.14 to avoid a deprecation warning
@KKIEEK requested review from nijkah and yhna940 on December 16, 2022 16:00
@KKIEEK force-pushed the v2.1.0/scheduler-cfg branch from 29ba7d2 to 5296f0b on December 16, 2022 16:50
@KKIEEK force-pushed the v2.1.0/scheduler-cfg branch from 5296f0b to 7c1cf9a on December 16, 2022 16:55
@codecov-commenter commented Dec 16, 2022

Codecov Report

❗ No coverage uploaded for pull request base (main@54dc8f1).
Patch has no changes to coverable lines.

Additional details and impacted files
@@           Coverage Diff           @@
##             main     #101   +/-   ##
=======================================
  Coverage        ?   72.79%           
=======================================
  Files           ?       60           
  Lines           ?     1632           
  Branches        ?      235           
=======================================
  Hits            ?     1188           
  Misses          ?      345           
  Partials        ?       99           
Flag        Coverage Δ
unittests   72.79% <0.00%> (?)

Flags with carried forward coverage won't be shown.

☔ View full report at Codecov.

@KKIEEK changed the title from "Support additional scheduler" to "Support additional schedulers" on Dec 16, 2022
@KKIEEK force-pushed the v2.1.0/scheduler-cfg branch from 4fb2e83 to 1f917fc on December 16, 2022 18:45
@KKIEEK changed the title from "Support additional schedulers" to "Support HyperBand and BOHB scheduler" on Dec 17, 2022
Base automatically changed from ray/v2.1.0 to main on December 19, 2022 02:42
@nijkah changed the title from "Support HyperBand and BOHB scheduler" to "[Feature] Support HyperBand and BOHB scheduler" on Dec 19, 2022
@nijkah merged commit 835efa1 into main on Dec 19, 2022
@nijkah deleted the v2.1.0/scheduler-cfg branch on December 19, 2022 04:25
@nijkah mentioned this pull request on Dec 19, 2022