Test fewer models in trainers to avoid exceeding RAM #1377
Conversation
From #1376:
We want to make sure our weights actually load in a model correctly. All of them. Every time someone adds a new one.
We certainly don't need to check the cross product of pretrained weights and trainers, as that would mean a lot of duplicate work (e.g. we have several different sets of ResNet50 weights). We need to check that the pretrained weights are valid, and separately check that our trainers work with ResNets, ViTs, etc.
Right, but the BYOL trainer is not the right place for this. We can check that the weights load into the model correctly with a direct model-level test.
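As a minimal sketch of what such a model-level test could look like (not the exact snippet from the thread; it assumes torchgeo's `ResNet50_Weights` enum, timm-compatible state-dict keys, and an `in_chans` entry in each weight's `meta` dict, with the actual download typically monkeypatched in CI):

```python
import pytest
import timm

from torchgeo.models import ResNet50_Weights


# Illustrative sketch: iterate over every member of one weight enum and load
# each state dict into a bare timm backbone, with no trainer involved.
@pytest.mark.parametrize("weights", ResNet50_Weights)
def test_resnet50_weights_load(weights: ResNet50_Weights) -> None:
    model = timm.create_model("resnet50", in_chans=weights.meta["in_chans"])
    state_dict = weights.get_state_dict(progress=False)
    model.load_state_dict(state_dict)
```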
However, I would also say we should change …
Can you make the same change to the other trainers?
Not only does this decrease memory usage, it also shaves off 1/3 of the time our tests take to run!
It's unclear whether this is a real solution or whether we're just kicking the can down the road. If we double the number of models, we may find that the model tests start to fail instead of the trainer tests. I guess we'll find out when we add all of our Landsat weights.
If we are using dummy models, does it even make sense to test that every set of model weights works in a trainer, rather than just testing that the weights load properly into the specified backbone?
At a bare minimum, we need these tests for test coverage. But we also want to make sure that enums, strings, and paths all work correctly as ways of specifying weights.
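A sketch of how a single trainer test could cover all three spellings, assuming a `ClassificationTask`-style trainer whose `weights` argument accepts a `WeightsEnum`, its string name, or a local file path (and assuming a plain state-dict file is acceptable for the path case); the enum member, channel count, and argument names are illustrative:

```python
from pathlib import Path

import torch

from torchgeo.models import ResNet18_Weights
from torchgeo.trainers import ClassificationTask

# One small set of weights is enough to cover every spelling of ``weights``.
WEIGHTS = ResNet18_Weights.SENTINEL2_ALL_MOCO


def test_weights_enum_str_and_path(tmp_path: Path) -> None:
    # Save a local copy so the checkpoint-path spelling can be exercised
    # offline (the exact on-disk format the trainer expects may differ).
    ckpt = tmp_path / "resnet18.pth"
    torch.save(WEIGHTS.get_state_dict(progress=False), ckpt)

    # The same trainer should accept the enum, its string name, and a path.
    for weights in (WEIGHTS, str(WEIGHTS), str(ckpt)):
        ClassificationTask(
            model="resnet18", weights=weights, in_channels=13, num_classes=10
        )
```

Only one small set of weights is involved, so memory stays bounded no matter how many pretrained weights the library ships.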
We don't need to test the BYOL trainer with every set of pretrained weights (we don't actually need to involve pretrained weights at all).
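A sketch of what that could look like for BYOL, with argument names assumed to match the other trainers:

```python
from torchgeo.trainers import BYOLTask


def test_byol_without_pretrained_weights() -> None:
    # weights=None means no pretrained state dict is downloaded or loaded;
    # the test exercises the trainer's own logic with one small backbone.
    BYOLTask(model="resnet18", weights=None, in_channels=3)
```

A fuller version would also run a `fast_dev_run` training step against a small datamodule, but constructing the task is typically what exercises the weight-handling code path.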