Commit f0768a0

ge0405 authored and facebook-github-bot committed
Fix false alarm of Adagrad (#2601)
Summary:
Pull Request resolved: #2601

Some non-Ads PyPer models use "Adagrad" for the sparse optimizer. However, this path is not supported. The reason it works with the current TTK sharder is that there are no sparse parameters at all. To have the TorchRec sharder honor that, this diff re-examines this path and makes the behavior on par with the TTK sharder.

Differential Revision: D65957418

fbshipit-source-id: 70368b8a072b46a139834a9928f0e5f8f06a7fe0
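The summary's reasoning can be made concrete with a small sketch. Everything below is hypothetical illustration (the function, the supported-optimizer name, and the error message are invented, not the actual sharder code); it only shows the intended behavior: an unsupported sparse-optimizer name such as "Adagrad" is tolerated when the model has no sparse parameters, and rejected otherwise.

from typing import Dict

import torch


def check_sparse_optimizer(
    optimizer_name: str,
    sparse_params: Dict[str, torch.Tensor],
) -> None:
    # Hypothetical set of sparse optimizers the sharder supports.
    supported = {"exact_rowwise_adagrad"}
    if optimizer_name.lower() in supported:
        return
    if sparse_params:
        # There are sparse parameters the unsupported optimizer would have
        # to handle, so this is a real configuration error.
        raise ValueError(
            f"Sparse optimizer '{optimizer_name}' is not supported; "
            f"affected parameters: {sorted(sparse_params)}"
        )
    # No sparse parameters at all: the setting is a harmless no-op, which
    # matches the TTK sharder's behavior described above.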
1 parent 99162b5 commit f0768a0

File tree

1 file changed: +1 −1


torchrec/distributed/model_parallel.py

+1 −1

@@ -598,7 +598,7 @@ def named_buffers(
             yield key, param

     @property
-    def fused_optimizer(self) -> KeyedOptimizer:
+    def fused_optimizer(self) -> CombinedOptimizer:
         return self._optim

     @property
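For readers outside the codebase, here is a minimal sketch of what the narrowed annotation buys, assuming CombinedOptimizer, KeyedOptimizer, and KeyedOptimizerWrapper from torchrec.optim.keyed (CombinedOptimizer subclasses KeyedOptimizer). The stub class, its parameter name, and the learning rate are hypothetical, not the actual model_parallel.py code.

import torch
from torchrec.optim.keyed import CombinedOptimizer, KeyedOptimizer, KeyedOptimizerWrapper


class ShardedModelStub:
    """Hypothetical stand-in for the module wrapper that owns self._optim."""

    def __init__(self) -> None:
        weight = torch.nn.Parameter(torch.zeros(4))
        # KeyedOptimizerWrapper adapts a plain torch optimizer to the
        # KeyedOptimizer interface.
        keyed: KeyedOptimizer = KeyedOptimizerWrapper(
            {"dense.weight": weight},
            lambda params: torch.optim.Adagrad(params, lr=0.01),
        )
        # CombinedOptimizer aggregates (name, KeyedOptimizer) pairs behind a
        # single KeyedOptimizer-compatible interface.
        self._optim = CombinedOptimizer([("dense", keyed)])

    @property
    def fused_optimizer(self) -> CombinedOptimizer:
        # Annotating the concrete CombinedOptimizer rather than the
        # KeyedOptimizer base class lets callers rely on anything specific to
        # CombinedOptimizer without an isinstance check or cast.
        return self._optim

Since CombinedOptimizer is itself a KeyedOptimizer, existing callers written against the old annotation keep working; the change only makes the returned type more precise.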
