Commit 18bb5f8

some reference-freeing I missed from @neonsecret's optimized attention CompVis#177
plus some extras from me (del v, del h)
1 parent 91d29c2 commit 18bb5f8

File tree

1 file changed: +3 lines added, 0 removed


ldm/modules/attention.py (+3)
@@ -191,9 +191,12 @@ def forward(self, x, context=None, mask=None):
 
         # attention, what we cannot get enough of
         attn = sim.softmax(dim=-1)
+        del sim
 
         out = einsum('b i j, b j d -> b i d', attn, v)
+        del attn, v
         out = rearrange(out, '(b h) n d -> b n (h d)', h=h)
+        del h
         return self.to_out(out)
 
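For context, here is a minimal standalone sketch of the reference-freeing pattern this commit applies. The attention() function, its arguments, and the example shapes below are illustrative only, not the actual CompVis CrossAttention class. Dropping a tensor's last Python reference with del lets PyTorch's caching allocator reuse that memory before the next large allocation, which is what lowers peak memory around the big softmax/einsum steps.

# Illustrative sketch only: mirrors the del statements added in this commit.
# q, k, v are per-head tensors of shape (batch*heads, n, d); h is the head count.
import torch
from torch import einsum
from einops import rearrange

def attention(q, k, v, h, scale):
    sim = einsum('b i d, b j d -> b i j', q, k) * scale

    # attention, what we cannot get enough of
    attn = sim.softmax(dim=-1)
    del sim          # similarity scores are no longer needed once attn exists

    out = einsum('b i j, b j d -> b i d', attn, v)
    del attn, v      # attention weights and values have been consumed

    out = rearrange(out, '(b h) n d -> b n (h d)', h=h)
    del h            # h is a plain int, so this only drops the local name
    return out

# Example usage with hypothetical shapes: batch=1, heads=8, n=64, d=40.
q, k, v = (torch.randn(8, 64, 40) for _ in range(3))
out = attention(q, k, v, h=8, scale=40 ** -0.5)
print(out.shape)  # torch.Size([1, 64, 320])

The del v and del h lines are the "extras" mentioned in the commit message: del v does release a reference to the value tensor, while del h has no real memory effect since h is just the head count.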