Use ArgumentParser.add_argument_group for clarity of presentation of flags #4119
Comments
Hi! Thanks for your contribution, great first issue!
@mauvilsa, @ananyahjha93 I agree these issues are related, but I don't think they are duplicates. My suggestion was to always use an argument group for the Trainer arguments, rather than to make the Trainer argument parser behavior more customizable; I was thinking along the lines of "one right way to do it". Perhaps it would make sense to merge the discussion of these various suggestions. @williamFalcon I will give it a try. The change I suggested should not be hard to make in itself; I am just wondering what kind of testing would make sense for it.
@odedbd I only said that #3076 is a duplicate, and I still think it is. Quoting from it:
Essentially this means always having the Trainer arguments in an argument group, the same as what you are saying. Also, please look at the comments in these two issues, #3076 and #3720, and the new code written by @nateraw (pytorch-lightning-bolts/arguments.py) before developing anything, so that we don't conflict. Or let's first discuss and agree on a way forward.
@mauvilsa Thank you for your feedback. I must be misunderstanding something. The desired solution in #3076 is as follows:
As far as I understand, this solution puts the choice (and responsibility) of adding the Trainer arguments into an argument group on the developer. If the developer passes an argument group to pl.Trainer.add_argparse_args, the arguments will be added to that group; if the developer passes the parser directly (not a group), the arguments will sit outside any group. I don't think that's necessarily a bad idea; I was just thinking along a more authoritative direction, of forcing the arguments to be part of an argument group. Perhaps my misunderstanding may be that you referred to the problem addressed as being a duplicate, whereas I was thinking of this from the point of view of a feature request; the feature (solution) I was suggesting is not a duplicate of the one suggested in those issues. This is my first time writing an issue for PL, sorry if it takes me time to get it. Regardless, I am fine with waiting for more feedback before developing anything.
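The two behaviors being contrasted here can be illustrated with plain `argparse` (argument names below are illustrative, not Lightning's actual Trainer flags, though `--max_epochs` and `--gpus` do exist in its Trainer):

```python
import argparse

parser = argparse.ArgumentParser(prog="train.py")
parser.add_argument("--data_dir", type=str, default="./data")

# If Trainer flags are added to a group, --help shows them under
# their own heading instead of mixed in with everything else.
trainer_group = parser.add_argument_group("pl.Trainer")
trainer_group.add_argument("--max_epochs", type=int, default=1000)
trainer_group.add_argument("--gpus", type=int, default=0)

# Parsing behaves identically with or without the group; only the
# presentation of --help changes.
args = parser.parse_args(["--max_epochs", "5"])
print(args.max_epochs)
```

The discussion is about who creates that group: the developer (by passing a group into `add_argparse_args`) or Lightning itself (always grouping its own flags).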
@odedbd I am not sure there is a standard definition of what makes something a duplicate. Certainly I was not considering the differences in the proposed solutions, since a solution depends on the discussion anyway. I would also vote for the arguments being added to a group without this being the responsibility of the user. It would be great if @nateraw gives his opinion on this. @carmocca, you could also give your opinion here.
The solution I proposed is just the easiest step towards supporting ArgumentGroups, since it requires minimal code changes (just removing this line). However, I agree it would be better if Lightning took care of it. Also, I don't mind choosing a more complete solution instead (e.g. jsonargparse or our own bolts solution).
Right! I went with a looser, more abstract definition. That being said, I think we can make it more strict for DMs and Models, and implicitly add Trainer by default, which would probably be a better API. Maybe something like:

```python
from pytorch_lightning.utils import LightningArgumentParser

from my_models import LitModel
from my_datamodules import MyDM

parser = LightningArgumentParser()
parser.add_datamodule_args(MyDM)
parser.add_model_args(LitModel)
args = parser.parse_args()
```

What do you think?
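A minimal sketch of how such a parser could back each `add_*_args` call with an argument group, so `--help` stays organized. Everything here is hypothetical: `LightningArgumentParser` as defined below is not Lightning's real class, and the stub `LitModel`/`MyDM` classes and flags stand in for real modules.

```python
import argparse


class LightningArgumentParser(argparse.ArgumentParser):
    """Hypothetical sketch, not Lightning's actual API.

    Each add_*_args call places a component's flags in its own
    argument group, so they render under a separate --help heading.
    """

    def add_model_args(self, model_cls):
        group = self.add_argument_group(f"{model_cls.__name__} options")
        # A real implementation would delegate to the class, e.g.
        # model_cls.add_model_specific_args(group); flags are inlined here.
        group.add_argument("--learning_rate", type=float, default=1e-3)
        return group

    def add_datamodule_args(self, dm_cls):
        group = self.add_argument_group(f"{dm_cls.__name__} options")
        group.add_argument("--batch_size", type=int, default=32)
        return group


# Stand-in classes for the imports in the proposal above.
class LitModel: ...
class MyDM: ...


parser = LightningArgumentParser()
parser.add_datamodule_args(MyDM)
parser.add_model_args(LitModel)
args = parser.parse_args(["--batch_size", "64"])
```

Under this design, grouping is no longer the user's responsibility: it happens as a side effect of registering each component.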
🚀 Feature
Add Trainer arguments to an argument group, to allow for clearer command-line help.
Motivation
The Trainer class already has a large number of arguments. Once the arguments for a Model, a DataModule, and program-level arguments are added, the list of flags in the command-line help becomes too long and cluttered to be useful.
Pitch
The Trainer arguments should be grouped within an argument group using add_argument_group. In addition, the documentation for add_model_specific_args and similar functions should recommend using add_argument_group to group the arguments of each part of the code in a sensible way.
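As a sketch of the second recommendation, a model's argument hook could add its flags to a group on the parent parser. `LitModel` and its flags are hypothetical here; `add_model_specific_args` is the conventional hook name referred to above, and `add_argument_group` is standard `argparse`:

```python
import argparse


class LitModel:
    @staticmethod
    def add_model_specific_args(parent_parser):
        # Put this model's flags in their own group so --help shows
        # them under a "LitModel" heading, separate from Trainer flags.
        group = parent_parser.add_argument_group("LitModel")
        group.add_argument("--hidden_dim", type=int, default=128)
        group.add_argument("--dropout", type=float, default=0.1)
        return parent_parser


parser = argparse.ArgumentParser()
parser = LitModel.add_model_specific_args(parser)
args = parser.parse_args(["--hidden_dim", "256"])
```

With Trainer arguments in their own group as well, each section of `--help` maps to one part of the program.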
Alternatives
I have explored switching to using a configuration file, specifically using Hydra, but I find that I like the ArgumentParser approach better for my use cases.