
Fixes observer attachment to model based on config for wanda sparsifier #1265

Merged
merged 4 commits into pytorch:main on Dec 18, 2024

Conversation

agrawal-aka
Contributor

@agrawal-aka agrawal-aka commented Nov 12, 2024

This PR fixes https://github.com/pytorch/ao/issues/1133 by attaching observers only to the layers listed in the sparse config.
Previously, the observer was attached to every layer of the model regardless of which layers the sparse config defined, leading to the runtime errors described in the issue.
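The gist of the fix can be shown with a minimal, self-contained sketch (illustrative names only, not the actual torchao implementation): observers are registered as forward pre-hooks only on the modules named in the sparse config, rather than on every layer of the model.

```python
import torch
import torch.nn as nn

class ActivationNormObserver:
    """Toy observer: accumulates squared per-feature activation norms,
    the statistic the WANDA pruning criterion needs."""
    def __init__(self):
        self.norm_sq = None

    def __call__(self, module, args):
        x = args[0].detach()
        sq = x.reshape(-1, x.shape[-1]).pow(2).sum(dim=0)
        self.norm_sq = sq if self.norm_sq is None else self.norm_sq + sq

def attach_observers(model, sparse_config):
    """Attach an observer only to the layers named in sparse_config
    (before the fix, every layer in the model got one)."""
    modules = dict(model.named_modules())
    observers = {}
    for entry in sparse_config:
        name = entry["tensor_fqn"].rsplit(".weight", 1)[0]  # "fc1.weight" -> "fc1"
        obs = ActivationNormObserver()
        modules[name].register_forward_pre_hook(obs)
        observers[name] = obs
    return observers

class MLP(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(8, 8)
        self.fc2 = nn.Linear(8, 4)

    def forward(self, x):
        return self.fc2(self.fc1(x))

model = MLP()
sparse_config = [{"tensor_fqn": "fc1.weight"}]  # only fc1 is configured
observers = attach_observers(model, sparse_config)
model(torch.randn(2, 8))
print(sorted(observers))  # only 'fc1' was observed; fc2 is untouched
```

The `tensor_fqn` entries follow the `torch.ao.pruning` config convention; the key point of the fix is iterating over the config instead of over `model.modules()`.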

cc: @jcaip @HDCharles @msaroufim


pytorch-bot bot commented Nov 12, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/ao/1265

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit f2e8f82 with merge base 53d2486:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot

Hi @agrawal-aka!

Thank you for your pull request and welcome to our community.

Action Required

In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.

Process

In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA signed. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.

If you have received this in error or have any questions, please contact us at [email protected]. Thanks!


pytorch-bot bot commented Nov 26, 2024

Didn't find following labels among repository labels: bug fix

@agrawal-aka
Contributor Author

@pytorchbot label "topic: bug fix"

@pytorch-bot pytorch-bot bot added the topic: bug fix Use this tag for PRs that fix bugs label Nov 26, 2024
@agrawal-aka agrawal-aka force-pushed the akash/wanda-observer-fix branch 3 times, most recently from 15f8e40 to 366261e Compare November 26, 2024 10:44
@agrawal-aka
Contributor Author

Hi @jcaip @HDCharles @msaroufim, could you please review this PR when you get some time?

@jcaip
Contributor

jcaip commented Nov 26, 2024

Hi @agrawal-aka, thanks for the fix. Looks good, but can you add a test for this case here:

import torch

@agrawal-aka agrawal-aka force-pushed the akash/wanda-observer-fix branch 2 times, most recently from b1f1c61 to 63eeb7e Compare November 27, 2024 06:44
@agrawal-aka
Contributor Author

agrawal-aka commented Nov 27, 2024

Hi @jcaip, thanks for reviewing. I have added a test case in test_wanda.py that sparsifies layers based on the config. Let me know if anything needs to be changed.
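For context on what such a test exercises: the per-layer observer statistics feed WANDA's pruning score, |W_ij| * ||X_j||, which can be sketched in plain PyTorch (an illustration of the method from the WANDA paper, not the torchao code or the actual test in test_wanda.py):

```python
import torch

def wanda_mask(weight, act_norm, sparsity=0.5):
    """Keep-mask from the WANDA score |W_ij| * ||X_j||: within each
    output row, prune the lowest-scoring fraction of weights."""
    score = weight.abs() * act_norm          # act_norm broadcasts over rows
    k = int(weight.shape[1] * sparsity)      # weights to prune per row
    _, idx = torch.topk(score, k, dim=1, largest=False)
    mask = torch.ones_like(weight, dtype=torch.bool)
    mask.scatter_(1, idx, False)
    return mask

w = torch.randn(4, 8)
act_norm = torch.rand(8) + 0.1   # per-input-feature activation norms
mask = wanda_mask(w, act_norm, sparsity=0.5)
print(mask.sum(dim=1))  # each row keeps exactly 4 of its 8 weights
```

Because the score depends on activation norms, a layer with no attached observer has no statistics to prune with, which is why observer attachment must follow the config.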

@agrawal-aka agrawal-aka force-pushed the akash/wanda-observer-fix branch from ce020eb to b04b587 Compare December 4, 2024 07:02
@agrawal-aka
Contributor Author

Hi @jcaip, I rebased the PR with the latest main, could you retrigger the build pipelines?

Contributor

@jcaip jcaip left a comment


lgtm once the lint check passes - ty for the fixes!

@agrawal-aka
Contributor Author

Hi @jcaip, the lint CI seems to be failing on a file that was not edited in this PR. Do you know what the fix is?

@agrawal-aka
Contributor Author

agrawal-aka commented Dec 4, 2024

@jcaip I think there was a lint issue on the main branch that was fixed by #1379 a while back; I have rebased again with the latest main. Could you retrigger the build pipelines? That should clear this failure for this PR as well.

@agrawal-aka agrawal-aka force-pushed the akash/wanda-observer-fix branch from b04b587 to f2e8f82 Compare December 4, 2024 18:50
@agrawal-aka
Contributor Author

@jcaip, all checks have passed. If everything looks good to you, I think you can go ahead and merge the PR and close the issue as well.

yanbing-j pushed a commit to yanbing-j/ao that referenced this pull request Dec 9, 2024
@maajidkhann

@pytorchbot rebase


pytorch-bot bot commented Dec 18, 2024

You don't have permissions to rebase this PR since you are a first time contributor. If you think this is a mistake, please contact PyTorch Dev Infra.

@agrawal-aka
Contributor Author

@pytorchbot rebase


pytorch-bot bot commented Dec 18, 2024

You don't have permissions to rebase this PR since you are a first time contributor. If you think this is a mistake, please contact PyTorch Dev Infra.

@agrawal-aka
Contributor Author

@jcaip @jerryzh168 @msaroufim, could you help merge this PR? The changes have already been reviewed and approved.

@jerryzh168 jerryzh168 merged commit 33d57af into pytorch:main Dec 18, 2024
18 checks passed
amdfaa pushed a commit that referenced this pull request Jan 10, 2025
Fixes observer attachment to model based on config for wanda sparsifier (#1265)

* Fixes observer attachment to model based on config for wanda sparsifier

* handles case when no config is specified

* Added test case in test_wanda.py for custom config

* lint fix
Labels
CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. topic: bug fix Use this tag for PRs that fix bugs