Gap in IndexBatchSamplerWrapper check for dataloaders used for prediction #9425
🐛 Bug
Carrying forward the discussion here: #9380 (comment)
The Trainer is currently inconsistent in its checks for batch samplers. This code first checks whether the dataloader has a batch sampler before applying the wrapper for prediction: https://github.com/PyTorchLightning/pytorch-lightning/blob/8407238d66df14c4476f880a6e6260b4bfa83b40/pytorch_lightning/trainer/data_loading.py#L161-L171
But this code in the prediction loop accesses the wrapper's attribute unconditionally: https://github.com/PyTorchLightning/pytorch-lightning/blob/8407238d66df14c4476f880a6e6260b4bfa83b40/pytorch_lightning/loops/epoch/prediction_epoch_loop.py#L163-L164
This means that if for any reason the IndexBatchSamplerWrapper wasn't applied, this access will fail.

Proposal for fix:
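Since the proposal is left open here, below is a minimal sketch of one possible guard. The helper name, the `seen_batch_indices` attribute, and the import path are assumptions for illustration and should be checked against the actual wrapper implementation in the permalinked version:

```python
from typing import List, Optional

from torch.utils.data import DataLoader

from pytorch_lightning.overrides.distributed import IndexBatchSamplerWrapper


def _get_batch_indices(dataloader: DataLoader) -> Optional[List[List[int]]]:
    # Only read the recorded indices when the wrapper was actually applied.
    # Dataloaders without a batch sampler, or with a plain BatchSampler,
    # return None instead of raising AttributeError.
    batch_sampler = getattr(dataloader, "batch_sampler", None)
    if isinstance(batch_sampler, IndexBatchSamplerWrapper):
        return batch_sampler.seen_batch_indices
    return None
```

With a guard like this, code paths that skip the wrapping (e.g. iterable-style dataloaders) would degrade gracefully in the prediction loop instead of raising an AttributeError.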
cc @ninginthecloud as this might occur when using iterables with the trainer
To Reproduce
Expected behavior
Environment
Additional context