Commit

Add config for Optuna (#111)
KKIEEK authored Dec 20, 2022
1 parent 796e6da commit 598b8a7
Showing 3 changed files with 14 additions and 2 deletions.
5 changes: 3 additions & 2 deletions README.md
@@ -19,9 +19,10 @@ SIATune is an open-source deep learning model hyperparameter tuning toolbox espe
 - **Support hyperparameter search algorithms**

   We provide hyperparameter search algorithms such as below;
-  - [x] [Nevergrad](https://github.com/facebookresearch/nevergrad)
-  - [x] [HyperOpt](https://github.com/hyperopt/hyperopt)
   - [x] [FLAML](https://github.com/microsoft/FLAML)
+  - [x] [HyperOpt](https://github.com/hyperopt/hyperopt)
+  - [x] [Nevergrad](https://github.com/facebookresearch/nevergrad)
+  - [x] [Optuna](https://github.com/optuna/optuna)
   - [ ] [Adaptive Experimentation (AX)](https://ax.dev/)
   - [ ] [Scikit-optimize](https://github.com/scikit-optimize/scikit-optimize)
1 change: 1 addition & 0 deletions configs/_base_/searcher/optuna.py
@@ -0,0 +1 @@
+searcher = dict(type='OptunaSearch')
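The new config is a single dict whose `type` key names the searcher class to build. As a hedged sketch of how such a dict-based config is typically resolved (SIATune's actual `build_searcher` is not shown in this commit; the registry below is hypothetical), the pattern looks like:

```python
# Hypothetical registry-based builder; SIATune's real implementation
# may differ, but the dict(type=...) convention usually works like this.
SEARCHERS = {}

def register(cls):
    """Map a class into the registry under its own name."""
    SEARCHERS[cls.__name__] = cls
    return cls

@register
class OptunaSearch:
    def __init__(self, **kwargs):
        self.kwargs = kwargs

def build_searcher(cfg):
    cfg = dict(cfg)                    # copy so the caller's config is untouched
    cls = SEARCHERS[cfg.pop('type')]   # 'type' selects the registered class
    return cls(**cfg)                  # remaining keys become constructor kwargs

searcher = build_searcher(dict(type='OptunaSearch'))
```

With this convention, adding a searcher to the toolbox only requires registering the class and shipping a one-line config file like the one above.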
10 changes: 10 additions & 0 deletions tests/test_hyper_optim/test_searchers.py
@@ -89,3 +89,13 @@ def test_nevergrad(trainable, config):
             dict(type='NevergradSearch', optimizer='PSO', budget=1)),
         num_samples=2,
         config=config)
+
+
+def test_optuna(trainable, config):
+    tune.run(
+        trainable,
+        metric='mean_loss',
+        mode='min',
+        search_alg=build_searcher(dict(type='OptunaSearch')),
+        num_samples=2,
+        config=config)
