cleaned the scaled-dot attention path
l-k-11235 committed May 28, 2024
1 parent e9edb12 commit 506b355
Showing 1 changed file with 0 additions and 4 deletions.
onmt/modules/multi_headed_attn.py (0 additions, 4 deletions)
@@ -733,10 +733,6 @@ def forward(
             attn_output.add_(relative_matmul(drop_attn, relations_values, False))
 
         context = unshape(attn_output)
-        if key_pad_mask is not None:
-            if key_pad_mask.size(0) > 1 and context.size(1) > 1:
-                x = key_pad_mask.squeeze(1).unsqueeze(2).expand(-1, -1, context.size(2))
-                context = context.masked_fill(x, 0)
 
         if self.layer_cache[0]:
             attn_output = self.final_linear(context)
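For readers skimming the diff, here is a minimal standalone sketch of what the deleted block did, using hypothetical tensor shapes (batch, seq_len, model_dim); the shapes and setup below are illustrative assumptions, not taken from the repository. The block broadcast the boolean key_pad_mask across the model dimension and zeroed the context vectors at padded positions after unshape.

    import torch

    # Illustrative shapes (assumed, not from the repo): batch=2, seq_len=4, dim=8.
    batch, seq_len, dim = 2, 4, 8
    context = torch.randn(batch, seq_len, dim)  # stands in for unshape(attn_output)

    # Boolean padding mask, True at padded key positions; shape (batch, 1, seq_len).
    key_pad_mask = torch.zeros(batch, 1, seq_len, dtype=torch.bool)
    key_pad_mask[0, 0, -1] = True  # pretend the last position of sample 0 is padding

    # The deleted lines: broadcast the mask over the model dimension
    # and zero out the context rows at padded positions.
    if key_pad_mask is not None:
        if key_pad_mask.size(0) > 1 and context.size(1) > 1:
            x = key_pad_mask.squeeze(1).unsqueeze(2).expand(-1, -1, context.size(2))
            context = context.masked_fill(x, 0)

    print(context[0, -1])  # all zeros: the padded position was filled with 0

This zeroing is redundant when padded key positions are already masked out of the attention scores and downstream layers never read those rows, which is presumably why the commit message frames the removal as cleaning the scaled-dot attention path.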
