
Commit c0c1168

Make passing the IP Adapter mask to the attention mechanism optional (#10346)
Make passing the IP Adapter mask to the attention mechanism optional when there is no need to apply it to a given IP Adapter.
elismasilva authored Dec 24, 2024
1 parent 6dfaec3 commit c0c1168
Showing 1 changed file with 4 additions and 0 deletions.

src/diffusers/models/attention_processor.py
@@ -4839,6 +4839,8 @@ def __call__(
                 )
             else:
                 for index, (mask, scale, ip_state) in enumerate(zip(ip_adapter_masks, self.scale, ip_hidden_states)):
+                    if mask is None:
+                        continue
                     if not isinstance(mask, torch.Tensor) or mask.ndim != 4:
                         raise ValueError(
                             "Each element of the ip_adapter_masks array should be a tensor with shape "
@@ -5056,6 +5058,8 @@ def __call__(
                 )
             else:
                 for index, (mask, scale, ip_state) in enumerate(zip(ip_adapter_masks, self.scale, ip_hidden_states)):
+                    if mask is None:
+                        continue
                     if not isinstance(mask, torch.Tensor) or mask.ndim != 4:
                         raise ValueError(
                             "Each element of the ip_adapter_masks array should be a tensor with shape "
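With this change, the list passed as cross_attention_kwargs["ip_adapter_masks"] may contain None entries: the validation loop (in what appear to be the two IP Adapter attention processor classes, IPAdapterAttnProcessor and IPAdapterAttnProcessor2_0) skips those entries instead of raising the shape ValueError, and the corresponding adapter is applied without spatial masking. Below is a minimal usage sketch; it is not part of the commit, and the model IDs, weight names, image paths, and resolution are illustrative assumptions.

import torch
from diffusers import AutoPipelineForText2Image
from diffusers.image_processor import IPAdapterMaskProcessor
from diffusers.utils import load_image

pipe = AutoPipelineForText2Image.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Two IP Adapters: one for global style, one for a face region.
pipe.load_ip_adapter(
    "h94/IP-Adapter",
    subfolder="models",
    weight_name=["ip-adapter_sd15.safetensors", "ip-adapter-plus-face_sd15.safetensors"],
)
pipe.set_ip_adapter_scale([0.5, 0.7])

style_image = load_image("style.png")    # hypothetical input images
face_image = load_image("face.png")
face_mask = load_image("face_mask.png")  # hypothetical binary mask

# Preprocess a mask only for the face adapter; the result is the 4D tensor
# shape that the attention processor validates.
mask_tensor = IPAdapterMaskProcessor().preprocess([face_mask], height=512, width=512)

image = pipe(
    prompt="a portrait in a sunlit garden",
    ip_adapter_image=[style_image, face_image],
    cross_attention_kwargs={
        # One entry per adapter; None now means "no mask for this adapter",
        # i.e. the style adapter contributes everywhere, unmasked.
        "ip_adapter_masks": [None, mask_tensor],
    },
    height=512,
    width=512,
).images[0]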
