Avoid duplicate logging of hyperparameters with Wandb #13899
Comments
Hi @remivanbel! Engineer from W&B here. I think adding a flatten option like this could possibly lead to some unexpected behaviour and might cause changes to other workflows, because we would be manually changing the config dict keys. I think for your use case, adding …
Hi @manangoel99, it is an honor that my issue got noticed by a W&B engineer ;) I agree that letting wandb handle the flattening is a more elegant solution (also if, for example, in some distant future wandb would use another delimiter). This does make me wonder why lightning still does all these conversions to …
Whether … The reason I would like to see a change in the … if I also want to have access to … Many thanks already btw for your suggested solution, it's quick and simple, which is always nice!
This issue has been automatically marked as stale because it hasn't had any recent activity. This issue will be closed in 7 days if no further activity occurs. Thank you for your contributions - the Lightning Team!
🚀 Feature
Change the flattening format of the hyperparameter dicts in the WandbLogger so that the keys match the ones wandb itself uses when logging its configuration.
Motivation
At the moment, I have the problem that certain hyperparameters are logged twice in wandb.
Because I want to use the hyperparameter sweep capabilities of wandb, I have to first get my configs from wandb and then feed them to my lightning trainer. However, with this setup the hyperparameters are saved twice: once during initialization of the WandbLogger, and once during initialization of the Trainer (save_hyperparameters), as sketched below.
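For illustration, here is a minimal sketch of that setup. It assumes a sweep parameter nested as {"optimizer": {"lr": ...}}; the module and parameter names are made up for the example:

```python
import wandb
import pytorch_lightning as pl
from pytorch_lightning.loggers import WandbLogger


class LitModel(pl.LightningModule):
    def __init__(self, optimizer: dict):
        super().__init__()
        # The nested hparams are stored here and later forwarded to the logger,
        # which flattens them with "/" as delimiter (e.g. "optimizer/lr").
        self.save_hyperparameters()


def train():
    # Inside a sweep agent, wandb.init() fills wandb.config with the sweep
    # parameters, e.g. {"optimizer": {"lr": 1e-3}}, shown as "optimizer.lr".
    run = wandb.init()
    cfg = dict(run.config)

    # WandbLogger attaches to the same run, so the sweep parameters are
    # already present in the run's config under dot-delimited keys.
    logger = WandbLogger()

    model = LitModel(optimizer=cfg.get("optimizer", {}))
    trainer = pl.Trainer(logger=logger, max_epochs=1)
    # trainer.fit(model, ...) then triggers logger.log_hyperparams(), which
    # uploads the same values again under "optimizer/lr" -> duplicate entries.
```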
Pitch
I think the most elegant fix would be to make the delimiter a '.' when flattening the hyperparameters. That way the keys would be the same as the ones already used by Wandb:
params = _flatten_dict(params, delimiter=".")
https://github.com/Lightning-AI/lightning/blob/511875e5675b0543d89e2aae3950a7834b35238e/src/pytorch_lightning/loggers/wandb.py#L374-L379
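To make the effect of the delimiter concrete, here is a toy stand-in for the flattening (not Lightning's actual `_flatten_dict`, just an illustrative re-implementation):

```python
def flatten(params: dict, delimiter: str, parent_key: str = "") -> dict:
    """Recursively flatten a nested dict, joining keys with `delimiter`."""
    out = {}
    for key, value in params.items():
        new_key = f"{parent_key}{delimiter}{key}" if parent_key else str(key)
        if isinstance(value, dict):
            out.update(flatten(value, delimiter, new_key))
        else:
            out[new_key] = value
    return out


params = {"optimizer": {"lr": 1e-3, "name": "adam"}}

# Current behaviour: keys differ from the dot-delimited ones wandb already holds.
print(flatten(params, delimiter="/"))  # {'optimizer/lr': 0.001, 'optimizer/name': 'adam'}

# Proposed behaviour: keys line up with wandb's notation, so the existing
# entries are simply updated instead of duplicated.
print(flatten(params, delimiter="."))  # {'optimizer.lr': 0.001, 'optimizer.name': 'adam'}
```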
Alternatives
The alternative would be not to log your hyperparameters to the logger using
self.save_hyperparameters(logger=False)
because all hyperparameters should already be present in the config provided to the WandbLogger; a minimal sketch of this alternative is shown below. However, it is more fail-safe to also log any additional hyperparameters that are present.

cc @awaelchli @morganmcg1 @borisdayma @scottire @parambharat @AyushExel @manangoel99
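Assuming a module like the hypothetical one below (the argument names are illustrative), the alternative would look like this: the hyperparameters stay available via `self.hparams` and in checkpoints, but are not forwarded to the logger:

```python
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self, lr: float = 1e-3, hidden_dim: int = 128):
        super().__init__()
        # logger=False keeps the hyperparameters in self.hparams and in the
        # checkpoint, but skips logger.log_hyperparams(), so nothing is sent
        # to wandb a second time.
        self.save_hyperparameters(logger=False)
```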