
Commit 38eba56 (1 parent: b720b41)

Fix torch.clamp() issue #237

File tree

1 file changed: 1 addition, 1 deletion


models/swin_transformer_v2.py

Lines changed: 1 addition & 1 deletion
@@ -153,7 +153,7 @@ def forward(self, x, mask=None):

         # cosine attention
         attn = (F.normalize(q, dim=-1) @ F.normalize(k, dim=-1).transpose(-2, -1))
-        logit_scale = torch.clamp(self.logit_scale, max=torch.log(torch.tensor(1. / 0.01))).exp()
+        logit_scale = torch.clamp(self.logit_scale, max=torch.log(torch.tensor(1. / 0.01)).item()).exp()
         attn = attn * logit_scale

         relative_position_bias_table = self.cpb_mlp(self.relative_coords_table).view(-1, self.num_heads)
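
Context for the one-line change: older PyTorch releases accept only a Python number for the max= argument of torch.clamp, so passing a tensor (the result of torch.log(torch.tensor(1. / 0.01))) raises an error there; calling .item() extracts the scalar as a float and keeps the cap working across versions. Below is a minimal, self-contained sketch of the patched cosine-attention scaling; the logit_scale initialization follows Swin Transformer V2's window attention, while the tensor sizes are made-up example values.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Example sizes (hypothetical, for illustration only).
num_heads, seq_len, head_dim = 4, 16, 32
q = torch.randn(1, num_heads, seq_len, head_dim)
k = torch.randn(1, num_heads, seq_len, head_dim)

# Learnable per-head temperature, initialized as in SwinV2's WindowAttention.
logit_scale = nn.Parameter(torch.log(10 * torch.ones((num_heads, 1, 1))))

# Cosine attention: dot products of L2-normalized queries and keys.
attn = F.normalize(q, dim=-1) @ F.normalize(k, dim=-1).transpose(-2, -1)

# The fix: .item() converts the tensor cap to a Python float, so the max=
# argument also works on PyTorch versions that reject tensor bounds.
scale = torch.clamp(logit_scale, max=torch.log(torch.tensor(1. / 0.01)).item()).exp()
attn = attn * scale  # (num_heads, 1, 1) broadcasts over (1, num_heads, N, N)

print(attn.shape)  # torch.Size([1, 4, 16, 16])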

Comments (0)