🐛 Bug
Issue when running:
fast_dev_run=True
"TypeError: log_hyperparams() takes 2 positional arguments but 3 were given"
To Reproduce
When using the following, where self.hp_metrics is a list of strings, each naming a metric that is being logged, e.g. "accuracy/val":
def on_train_start(self):
    if self.logger:
        self.logger.log_hyperparams(self.hparams, {metric: 0 for metric in self.hp_metrics})
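A minimal, self-contained sketch of the reproduction (my reconstruction, not the original module: the ReproModule name, the random dataset, and the contents of hp_metrics/hparams are placeholders chosen only to trigger the same call path):

import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class ReproModule(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)
        # Hypothetical list of metric names to pre-register in the
        # TensorBoard hparams tab.
        self.hp_metrics = ["accuracy/val"]
        self.save_hyperparameters({"lr": 1e-3})

    def on_train_start(self):
        if self.logger:
            # Pass hparams plus a dict of initial metric values, as in the
            # snippet above.
            self.logger.log_hyperparams(
                self.hparams, {metric: 0 for metric in self.hp_metrics}
            )

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.hparams["lr"])


if __name__ == "__main__":
    dataset = TensorDataset(torch.randn(64, 32), torch.randint(0, 2, (64,)))
    model = ReproModule()
    # Runs cleanly with fast_dev_run=False; with True it raises the
    # TypeError quoted above.
    trainer = pl.Trainer(fast_dev_run=True, max_epochs=1)
    trainer.fit(model, DataLoader(dataset, batch_size=8))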
Expected behavior
I assume the unit test is wrong, since the documentation says that self.logger.log_hyperparams takes one positional argument plus a dictionary of metrics. The code runs fine without fast_dev_run=True and everything is logged correctly to TensorBoard.
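For reference, the two-argument form documented for TensorBoardLogger, used standalone, looks roughly like this (the save_dir, hparam values, and metric name are placeholders):

from pytorch_lightning.loggers import TensorBoardLogger

logger = TensorBoardLogger("lightning_logs", name="repro")
# TensorBoardLogger.log_hyperparams accepts the hparams plus an optional
# metrics dict that seeds the hparams tab in TensorBoard.
logger.log_hyperparams({"lr": 1e-3}, {"accuracy/val": 0})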
Environment
pytorch_lightning 1.2.2