
Commit ef2cd47 — "Minor fix"

Committed by Amit Raj
Signed-off-by: Amit Raj <[email protected]>
1 parent: 75d951b

File tree: 1 file changed (+1, −1 lines)


QEfficient/transformers/models/t5/modeling_t5.py

Lines changed: 1 addition & 1 deletion
@@ -177,7 +177,7 @@ def forward(
             output_attentions=output_attentions,
             cache_position=cache_position,
         )
-        hidden_states = hidden_states * self.scaling_factor + self.dropout(attention_output[0])  # Modified by patch
+        hidden_states = hidden_states * 1.0 + self.dropout(attention_output[0])  # Modified by patch
         outputs = (hidden_states,) + attention_output[1:]  # add attentions if we output them
         return outputs
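The changed line is the residual connection after self-attention: the patched code previously multiplied `hidden_states` by `self.scaling_factor` before adding the (dropout-regularized) attention output, and this commit hard-codes that factor to `1.0`, reducing it to a plain residual add. A minimal sketch of that arithmetic, with a standalone `residual_add` helper (a hypothetical name, not part of the repository) standing in for the module attributes and dropout omitted:

```python
def residual_add(hidden, attn_out, scaling_factor=1.0):
    """Element-wise hidden * scaling_factor + attn_out.

    Sketch of the patched residual step; dropout on attn_out is omitted.
    With scaling_factor=1.0 (this commit) it is an ordinary residual add.
    """
    return [h * scaling_factor + a for h, a in zip(hidden, attn_out)]

# After the commit the factor is fixed at 1.0:
print(residual_add([1.0, 2.0], [0.5, 0.5]))  # plain residual: [1.5, 2.5]
```

With any `scaling_factor != 1.0` the pre-commit behavior is recovered, which is presumably why the literal `1.0` makes the change a behavioral no-op relative to stock T5 while keeping the patched line in place.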

0 commit comments