Commit b4ca05f

DN6 and patrickvonplaten committed

Fix Basic Transformer Block (#5683)

* fix

* Update src/diffusers/models/attention.py

Co-authored-by: Patrick von Platen <[email protected]>

---------

Co-authored-by: Patrick von Platen <[email protected]>

1 parent a1d33fc commit b4ca05f

File tree

1 file changed: +1 −1 lines changed


src/diffusers/models/attention.py

Lines changed: 1 addition & 1 deletion

@@ -287,7 +287,7 @@ def forward(
            else:
                raise ValueError("Incorrect norm")

-           if self.pos_embed is not None and self.use_ada_layer_norm_single is None:
+           if self.pos_embed is not None and self.use_ada_layer_norm_single is False:
                norm_hidden_states = self.pos_embed(norm_hidden_states)

            attn_output = self.attn2(
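
The diff above changes the guard from `is None` to `is False`. A minimal standalone sketch of why that matters, assuming (as in `BasicTransformerBlock`) that `use_ada_layer_norm_single` is always a boolean: a bool is never `None`, so the old condition could never be true and the positional embedding was silently skipped before the cross-attention. The names below mirror the diff but the values are stand-ins, not the real modules.

```python
# Stand-ins for the attributes referenced in the diff (hypothetical values):
# in diffusers, use_ada_layer_norm_single is set to True or False, never None.
use_ada_layer_norm_single = False
pos_embed = object()  # placeholder for a positional-embedding module

# Old guard: "is None" never matches a boolean, so this is always False
old_guard = pos_embed is not None and use_ada_layer_norm_single is None

# New guard: fires exactly when the flag is the boolean False
new_guard = pos_embed is not None and use_ada_layer_norm_single is False

print(old_guard, new_guard)  # False True
```

In short, the fix makes the `pos_embed` branch reachable again for blocks that do not use the single AdaLayerNorm variant, without changing behavior when the flag is `True`.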
