
Commit 080081b

hisushanta, sayakpaul, and DN6
authored
Remove the redundant line from the adapter.py file. (#5618)
* I added a new docstring to the class, making it easier for other developers to understand what it does and where it is used.
* Update src/diffusers/models/unet_2d_blocks.py: applied the change suggested by the maintainer. Co-authored-by: Sayak Paul <[email protected]>
* Update src/diffusers/models/unet_2d_blocks.py: added the suggested text. Co-authored-by: Sayak Paul <[email protected]>
* Update unet_2d_blocks.py: changed the "Parameter" heading to "Args".
* Update unet_2d_blocks.py: set proper indentation in this file.
* Update unet_2d_blocks.py: made a small change to the act_fun argument line.
* Ran black to reformat the code style.
* Update unet_2d_blocks.py: added a docstring similar to the one in the original diffusion repository.
* Removed the dummy variable defined in both the encoder and the decoder.
* Ran black again to reformat the file.
* Removed the redundant line from the adapter.py file.
* Reformatted the file with black.
---------
Co-authored-by: Sayak Paul <[email protected]>
Co-authored-by: Dhruv Nair <[email protected]>
1 parent dd9a5ca commit 080081b

File tree

1 file changed: +4 −6 lines changed


src/diffusers/models/adapter.py

Lines changed: 4 additions & 6 deletions
@@ -456,9 +456,8 @@ def forward(self, x: torch.Tensor) -> torch.Tensor:
         This method takes input tensor x and applies a convolutional layer, ReLU activation, and another convolutional
         layer on the input tensor. It returns addition with the input tensor.
         """
-        h = x
-        h = self.block1(h)
-        h = self.act(h)
+        h = self.act(self.block1(x))
         h = self.block2(h)

         return h + x
@@ -578,9 +577,8 @@ def forward(self, x: torch.Tensor) -> torch.Tensor:
         This function takes input tensor x and processes it through one convolutional layer, ReLU activation, and
         another convolutional layer and adds it to input tensor.
         """
-        h = x
-        h = self.block1(h)
-        h = self.act(h)
+        h = self.act(self.block1(x))
         h = self.block2(h)

         return h + x
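Both hunks apply the same refactor: the intermediate `h = x` assignment is redundant, so the first two statements collapse into a single call chain while the residual output stays identical. A minimal plain-Python sketch of the equivalence (note: `act`, `block1`, and `block2` below are hypothetical scalar stand-ins for the ReLU and convolution layers in adapter.py, not the real modules):

```python
def act(v):
    # Scalar stand-in for ReLU
    return max(v, 0.0)

def block1(v):
    return v * 2.0  # placeholder for the first conv layer

def block2(v):
    return v + 1.0  # placeholder for the second conv layer

def forward_before(x):
    # Shape of the forward pass before the commit
    h = x
    h = block1(h)
    h = act(h)
    h = block2(h)
    return h + x  # residual connection

def forward_after(x):
    # Shape of the forward pass after the commit:
    # the redundant `h = x` line is gone, the rest is unchanged
    h = act(block1(x))
    h = block2(h)
    return h + x
```

Since `x` is never mutated before the final residual addition, dropping the aliasing assignment cannot change the result; the two versions agree on every input.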
