
[feature request] Make transforms.functional_tensor functions differentiable w.r.t. their parameters #5000

Open
ain-soph opened this issue Nov 28, 2021 · 1 comment

Comments

@ain-soph
Contributor

ain-soph commented Nov 28, 2021

🚀 The feature

Make operations in torchvision.transforms.functional_tensor differentiable w.r.t. their hyper-parameters, which is helpful for Faster AutoAugment search (the hyper-parameters become learnable parameters updated via backward), while keeping backward compatibility with existing code.

Some operations are not differentiable at all (e.g., Posterize), which currently forces users to write their own implementations; a sketch of one possible relaxation is shown below.
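For illustration only, here is a minimal sketch (assuming float images in [0, 1]) of what differentiable magnitudes could look like. `adjust_brightness_diff` and `posterize_ste` are hypothetical helpers, not part of torchvision, and the straight-through relaxation of Posterize is just one possible approximation:

```python
import torch

def adjust_brightness_diff(img: torch.Tensor, factor: torch.Tensor) -> torch.Tensor:
    # Taking the magnitude as a Tensor (instead of a plain float) lets autograd
    # track gradients of the loss w.r.t. the magnitude itself.
    return (img * factor).clamp(0.0, 1.0)

def posterize_ste(img: torch.Tensor, bits: torch.Tensor) -> torch.Tensor:
    # Posterize is piecewise constant, so its exact gradient is zero everywhere.
    # One straight-through style trick: return the hard result in the forward
    # pass, but make the backward pass act as the identity w.r.t. the input and
    # as a linear function of `bits`.
    levels = 2.0 ** bits.detach().round().clamp(1, 8)
    hard = torch.floor(img * levels) / levels
    return img + (hard - img).detach() + (bits - bits.detach())

img = torch.rand(3, 32, 32)
factor = torch.tensor(1.2, requires_grad=True)
bits = torch.tensor(4.0, requires_grad=True)
out = posterize_ste(adjust_brightness_diff(img, factor), bits)
out.mean().backward()
print(factor.grad, bits.grad)  # both magnitudes now receive (approximate) gradients
```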

Motivation, pitch

The main motivation is research. Faster AutoAugment proposes to search for augmentation policies with a DARTS-like framework in which all magnitudes and weights are trainable parameters, so every operation needs a gradient w.r.t. its magnitude. This yields a faster search than state-of-the-art AutoAugment policy-search algorithms.
This work is maintained in autoalbument and, according to its documentation, has been applied in some industrial scenarios.

I think supporting backward w.r.t. magnitudes would make these transforms more convenient to use and would support future research as well.

Alternatives

No response

Additional context

Linked PR: #4995

cc @vfdev-5 @datumbox

@datumbox
Contributor

Probably the biggest issue we will face here is JIT scriptability. If we manage to achieve it without BC-breaking changes, we could implement this.
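For concreteness, one possible shape such a BC-compatible signature could take (purely a sketch, not the actual torchvision API) that the scriptability concern would apply to:

```python
import torch
from typing import Union

def adjust_brightness_bc(img: torch.Tensor, factor: Union[float, torch.Tensor]) -> torch.Tensor:
    # A plain float keeps today's behaviour (BC); a Tensor magnitude keeps the
    # op differentiable w.r.t. `factor`. Whether torch.jit.script can handle a
    # signature like this without BC-breaking changes is the open question here.
    if isinstance(factor, torch.Tensor):
        f = factor.to(dtype=img.dtype, device=img.device)
    else:
        f = torch.tensor(factor, dtype=img.dtype, device=img.device)
    return (img * f).clamp(0.0, 1.0)
```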
