
add sequence fill support for ElasticTransform #7141

Merged: 4 commits into main from elastic-fill on Jan 27, 2023
Conversation

@pmeier (Collaborator) commented Jan 27, 2023

According to our docstring, we already support sequence fills:

fill (sequence or number): Pixel fill value for the area outside the transformed
    image. Default is ``0``. If given a number, the value is used for all bands respectively.

However, we do

if not isinstance(fill, (int, float)):
    raise TypeError(f"fill should be int or float. Got {type(fill)}")
self.fill = fill

which will fail for sequences.
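
For illustration, a minimal reproduction sketch against the pre-fix constructor (the per-channel fill value is just an arbitrary example):

from torchvision import transforms

# On the check quoted above, a sequence fill is rejected and raises:
#   TypeError: fill should be int or float. Got <class 'list'>
transforms.ElasticTransform(fill=[0.0, 0.0, 0.0])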

cc @vfdev-5

Comment on lines -1542 to -1543
Only number is supported for torch Tensor.
Only int or str or tuple value is supported for PIL Image.
@pmeier (Collaborator, Author) commented:
Drive-by since I was looking into the fill support. This seems to be a copy-paste error. Internally we just convert PIL images to tensors:

t_img = img
if not isinstance(img, torch.Tensor):
    if not F_pil._is_pil_image(img):
        raise TypeError(f"img should be PIL Image or Tensor. Got {type(img)}")
    t_img = pil_to_tensor(img)

and then call the tensor kernel:

output = F_t.elastic_transform(
    t_img,
    displacement,
    interpolation=interpolation.value,
    fill=fill,
)

Meaning, there is no difference between the two types.
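
As a small illustration (the displacement shape and the fill values below are just example assumptions), the functional accepts the same fill for both input types, because a PIL input is routed through the tensor kernel anyway:

import torch
from PIL import Image

import torchvision.transforms.functional as F

img_pil = Image.new("RGB", (32, 32), color=(128, 64, 32))
img_tensor = F.pil_to_tensor(img_pil)
displacement = torch.zeros(1, 32, 32, 2)  # identity displacement field of shape (1, H, W, 2)

# Both calls dispatch to the same tensor kernel, so the fill handling is
# identical for PIL images and tensors.
out_pil = F.elastic_transform(img_pil, displacement, fill=[0.0, 0.0, 0.0])
out_tensor = F.elastic_transform(img_tensor, displacement, fill=[0.0, 0.0, 0.0])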

@NicolasHug (Member) left a comment:
Thanks Philip, some minor comments below. Should we add a non-regression test?

if isinstance(fill, (int, float)):
    fill = [float(fill)]
elif isinstance(fill, (list, tuple)):
    fill = [float(f) for f in fill]
@NicolasHug (Member) commented:
Do we actually need to convert to float?

@pmeier (Collaborator, Author) commented Jan 27, 2023:
Unfortunately, we do, due to JIT 🥲 Removing the float conversion from L2110 gives us:

import torch.jit

from torchvision import transforms

torch.jit.script(transforms.ElasticTransform(fill=[1]))
[...]
Expected a value of type 'Optional[List[float]]' for argument 'fill' but instead found type 'List[int]'.
[...]

I know this is ugly AF and far from being Pythonic, but given that it is on v1 I really don't want to deal with this any more than I have to.

@pmeier (Collaborator, Author) commented Jan 27, 2023:

I just realized that we don't have any JIT tests for ElasticTransform. Adding some revealed that ElasticTransform is not scriptable at all.

  1. If you use anything that is not a scalar int or float, the regular constructor will fail.
  2. If you use an int (default) or float, JIT will scream at you:
import torch.jit

from torchvision import transforms

torch.jit.script(transforms.ElasticTransform(fill=0.0))
[...]
Expected a value of type 'Optional[List[float]]' for argument 'fill' but instead found type 'float'.
[...]
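
Relatedly, a sketch of the kind of scriptability check this suggests; assuming the scalar-and-sequence normalization to List[float] from this PR is in place, all of these should script cleanly:

import torch.jit

from torchvision import transforms

# With fill normalized to a list of floats in the constructor, each of these
# should be scriptable.
for fill in (0, 0.0, [1], (1.0, 2.0, 3.0)):
    torch.jit.script(transforms.ElasticTransform(fill=fill))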

@NicolasHug (Member) left a comment:
Thanks Philip, just nits, LGTM anyway

@pmeier pmeier added the bug label Jan 27, 2023
@pmeier pmeier merged commit 71073cb into main Jan 27, 2023
@pmeier pmeier deleted the elastic-fill branch January 27, 2023 12:32
facebook-github-bot pushed a commit that referenced this pull request Feb 8, 2023
Reviewed By: vmoens

Differential Revision: D43116102

fbshipit-source-id: 4ea89bf9faf5612d4e168483d0f236e9d1f569ef