lint fix
agrawal-aka committed Dec 4, 2024
1 parent bea60b7 commit b04b587
Showing 1 changed file with 2 additions and 1 deletion: torchao/sparsity/wanda.py
@@ -65,7 +65,8 @@ def prepare(self, model: nn.Module, config: List[Dict]) -> None:
             # Apply the qconfig directly to the module if it exists
             if module is not None:
                 module.qconfig = QConfig(
-                    activation=PerChannelNormObserver, weight=default_placeholder_observer
+                    activation=PerChannelNormObserver,
+                    weight=default_placeholder_observer,
                 )  # type: ignore[assignment]
         torch.ao.quantization.prepare(model, inplace=True)
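For context, the assignment being reformatted attaches a `QConfig` to individual modules so that `torch.ao.quantization.prepare` inserts observers for them. A minimal runnable sketch of that pattern, using plain PyTorch only: `MinMaxObserver` stands in for torchao's `PerChannelNormObserver` (an assumption, so the example does not depend on torchao being installed), and the model is a hypothetical two-layer toy.

```python
import torch
import torch.nn as nn
from torch.ao.quantization import QConfig, MinMaxObserver, default_placeholder_observer

# Hypothetical toy model; wanda.py operates on user-supplied models.
model = nn.Sequential(nn.Linear(4, 8), nn.Linear(8, 2))

for module in model.modules():
    if isinstance(module, nn.Linear):
        # Same shape as the diff: activation observer + placeholder weight observer.
        # MinMaxObserver is a stand-in for torchao's PerChannelNormObserver.
        module.qconfig = QConfig(
            activation=MinMaxObserver,
            weight=default_placeholder_observer,
        )  # type: ignore[assignment]

# prepare() walks the model and inserts an activation observer after each
# module that carries a qconfig.
torch.ao.quantization.prepare(model, inplace=True)

# A forward pass lets the observers record activation statistics.
model(torch.randn(2, 4))
```

The trailing comma added by the lint fix is conventional Black/ruff style: each argument on its own line with a magic trailing comma keeps future diffs to one line per changed argument.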

