
Commit ae8f06f

kit1980 authored and facebook-github-bot committed
Fix require_grad typo (#1771)
Summary: Fix require_grad typos (should be requires_grad). Before the fix, the code didn't raise any errors, but it also didn't do what it was supposed to do. Fixed with TorchFix: https://github.com/pytorch/test-infra/tree/main/tools/torchfix

Upstream PR: codertimo/BERT-pytorch#104

Pull Request resolved: #1771

Reviewed By: xuzhao9

Differential Revision: D47531187

Pulled By: kit1980

fbshipit-source-id: 738b1866cc5cd3fedfa878cc40827236717f6f27
1 parent e8c1cf0 commit ae8f06f

File tree

1 file changed, +2 −1 lines changed

  • torchbenchmark/models/BERT_pytorch/bert_pytorch/model/embedding

torchbenchmark/models/BERT_pytorch/bert_pytorch/model/embedding/position.py (+2 −1)

@@ -10,7 +10,8 @@ def __init__(self, d_model, max_len=512):
 
         # Compute the positional encodings once in log space.
         pe = torch.zeros(max_len, d_model).float()
-        pe.require_grad = False
+        # Changed from upstream, see https://github.com/codertimo/BERT-pytorch/pull/104
+        pe.requires_grad = False
 
         position = torch.arange(0, max_len).float().unsqueeze(1)
         div_term = (torch.arange(0, d_model, 2).float() * -(math.log(10000.0) / d_model)).exp()
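The typo is easy to miss because `torch.Tensor` objects accept arbitrary Python attribute assignment: writing `pe.require_grad = False` simply attaches a new attribute named `require_grad` to the tensor, raises no error, and leaves the real autograd flag `requires_grad` untouched. A minimal sketch of the silent-failure mode (the tensor name `pe` follows the patched code; the rest is illustrative):

```python
import torch

pe = torch.zeros(4, 8)

# Typo: this creates a plain Python attribute on the tensor object.
# PyTorch raises no error, and autograd never sees it.
pe.require_grad = True
print(pe.requires_grad)  # the real autograd flag is unchanged: False

# Correct spelling: this actually toggles gradient tracking.
pe.requires_grad = True
print(pe.requires_grad)  # True
```

This is exactly the class of bug TorchFix is designed to flag, since the misspelled assignment is valid Python and only fails behaviorally.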

0 commit comments