Inconsistent API for on_predict_epoch_end #8479
Comments
I'd like to work on this — could you assign this issue to me?

So I just want to follow up on the standard way of returning data from the predict step now. Would it be something like this? It seems less intuitive to me compared to the older methods and adds a lot of boilerplate code.

That works. You could also do the combination after

So there is no way to overload how the trainer.predict() method combines the output of each batch? Will it always return as a list?

Are you asking about the predictions returned?

Exactly. By default it returns a list of whatever was returned by the model's

You would have to define your own
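To make the default behaviour discussed above concrete, here is a minimal framework-free sketch. `ToyModel`, `toy_predict`, and the batch data are hypothetical stand-ins, not Lightning APIs: the point is only that `trainer.predict()` returns a list with one entry per batch of whatever `predict_step` returned, so any combining happens after the call.

```python
# Stand-in for Trainer.predict's collection behaviour (NOT the real
# Lightning implementation): each batch's predict_step output is
# appended to a list, and the whole list is returned to the caller.
from itertools import chain

class ToyModel:
    def predict_step(self, batch, batch_idx):
        # Hypothetical per-batch prediction: double every element.
        return [x * 2 for x in batch]

def toy_predict(model, dataloader):
    # Mirrors the default: a list with one entry per batch.
    return [model.predict_step(batch, i) for i, batch in enumerate(dataloader)]

batches = [[1, 2], [3, 4]]
outputs = toy_predict(ToyModel(), batches)  # one entry per batch: [[2, 4], [6, 8]]
flat = list(chain.from_iterable(outputs))   # combine after the fact: [2, 4, 6, 8]
```

This is why combining per-batch outputs is the caller's job today: the loop itself only accumulates, it never merges.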
🚀 Feature
Background
We are auditing the Lightning components and APIs to assess opportunities for improvement. One item that came up was `on_predict_epoch_end()`, defined on the ModelHooks mixin. It accepts an `outputs: List[Any]` argument. However, this is inconsistent with the other model hooks of the same type: `on_train_epoch_end`, `on_validation_epoch_end`, and `on_test_epoch_end`.
Motivation
API consistency with the other epoch-end model hooks.
Pitch
- Add a `prediction_epoch_end` hook to the LightningModule
- Deprecate the `outputs` argument of `on_predict_epoch_end` in v1.5 and remove it entirely in v1.7
- Cache prediction outputs only if `prediction_epoch_end` is implemented
- Users can optionally avoid this entirely and cache their prediction outputs as needed by implementing this logic in `predict_step` directly

Alternatives
Keep as is?
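The pitched behaviour can be sketched without Lightning at all. This is a hypothetical, framework-free loop (only the hook names `predict_step` and `prediction_epoch_end` come from the issue; everything else is an assumption): outputs are cached only when the module implements `prediction_epoch_end`, mirroring how the train/val/test epoch-end hooks let the module, not the loop, decide how to combine per-batch results.

```python
# Framework-free sketch of the pitch — NOT the real Lightning loop.
def run_predict_loop(model, dataloader):
    # Cache per-batch outputs only if the module opts in by
    # implementing the (proposed) prediction_epoch_end hook.
    wants_outputs = hasattr(model, "prediction_epoch_end")
    cached = []
    for i, batch in enumerate(dataloader):
        out = model.predict_step(batch, i)
        if wants_outputs:
            cached.append(out)
    if wants_outputs:
        # The module decides how to combine per-batch outputs,
        # consistent with the other *_epoch_end hooks.
        return model.prediction_epoch_end(cached)
    return None

class SummingModule:
    def predict_step(self, batch, batch_idx):
        return sum(batch)

    def prediction_epoch_end(self, outputs):
        return sum(outputs)

result = run_predict_loop(SummingModule(), [[1, 2], [3, 4]])  # 3 + 7 = 10
```

A module without `prediction_epoch_end` would pay no caching cost, which is the memory argument behind making the hook opt-in.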
Additional context
cc @Borda @tchaton @justusschock @awaelchli @carmocca @ninginthecloud @daniellepintz @rohitgr7