
[Test Fix] Quant model reload #974

Merged
merged 8 commits into main on Jan 10, 2025

Conversation

@horheynm horheynm commented Dec 11, 2024

~~Contingent on merge of huggingface/transformers#34719~~
~~^ has been merged, not yet released~~
^ has been released

SUMMARY:
Update the test to load models with AutoModelForCausalLM instead of manually instantiating a compressor and decompressing. When AutoModelForCausalLM recognizes a quantization_config in the checkpoint, it runs the same decompression automatically on load.

TEST PLAN:
Ran the test using transformers main
Must pass: tests/llmcompressor/transformers/sparsification/test_compress_tensor_utils.py
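The load-time behavior the updated test relies on can be sketched as follows. This is a hypothetical stand-in for illustration only: `load_causal_lm` and `decompress_tensor` are invented names, not the actual transformers or compressed-tensors internals.

```python
# Illustrative sketch (assumed names, not the real transformers API):
# if a checkpoint's config carries a recognized quantization_config,
# AutoModelForCausalLM-style loading decompresses the weights itself,
# so the test no longer needs to build a compressor manually.

def decompress_tensor(packed, quant_cfg):
    """Stand-in for real dequantization: rescale the packed values."""
    scale = quant_cfg.get("scale", 1.0)
    return [x * scale for x in packed]

def load_causal_lm(config: dict, state_dict: dict) -> dict:
    """Return a decompressed state dict, mimicking decompression on load."""
    quant_cfg = config.get("quantization_config")
    if quant_cfg and quant_cfg.get("quant_method") == "compressed-tensors":
        # Same decompression path the test previously invoked by hand.
        return {k: decompress_tensor(v, quant_cfg) for k, v in state_dict.items()}
    # No recognized quantization config: weights pass through untouched.
    return state_dict
```

With this dispatch in the loader, the test only has to call the high-level entry point and assert on the reloaded weights, rather than driving the compressor API directly.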

@horheynm horheynm marked this pull request as ready for review December 11, 2024 19:59

👋 Hi! Thank you for contributing to llm-compressor. Please add the ready label when the PR is ready for review.

@horheynm
Collaborator Author

/ready

@dsikka dsikka marked this pull request as draft December 12, 2024 17:00
@horheynm horheynm changed the title fix test - use automodelforcausallm decompress [Test Fix] Decompression Dec 16, 2024
@horheynm horheynm changed the title [Test Fix] Decompression [Test Fix] Sparse model reload Dec 16, 2024
@horheynm horheynm marked this pull request as ready for review December 23, 2024 14:09
@horheynm horheynm marked this pull request as draft December 23, 2024 15:02
@horheynm horheynm changed the title [Test Fix] Sparse model reload [Test Fix] Quanti model reload Jan 9, 2025
kylesayrs
kylesayrs previously approved these changes Jan 9, 2025
@horheynm horheynm changed the title [Test Fix] Quanti model reload [Test Fix] Quant model reload Jan 9, 2025
@horheynm horheynm marked this pull request as ready for review January 10, 2025 13:49

@rahul-tuli (Collaborator) left a comment


Could we make the description a little clearer with details of what was going on and why this change is needed?

@horheynm
Collaborator Author

Could we make the description a little clearer with details of what was going on and why this change is needed?

Done!

@dsikka dsikka merged commit 0535613 into main Jan 10, 2025
6 of 7 checks passed
@dsikka dsikka deleted the fix-test_compress-tensors-utils branch January 10, 2025 21:26
kylesayrs pushed a commit that referenced this pull request Jan 15, 2025
Signed-off-by: Kyle Sayers <[email protected]>

4 participants