This repository was archived by the owner on Aug 7, 2024. It is now read-only.

Commit 9f3074a

add no grad

1 parent 55e6a28 commit 9f3074a

File tree: 1 file changed (+1, -1 lines)

float8_experimental/float8_linear_utils.py

Lines changed: 1 addition & 1 deletion
@@ -134,7 +134,7 @@ def get_float8_layers(model: torch.nn.Module):
 
     return fp8_layers
 
-
+@torch.no_grad()
 def sync_float8_amax_and_scale_history(model: torch.nn.Module, fp8_layers=None) -> None:
     """
     Manages the float8 amax and scale bookkeeping. In detail, it does the
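This one-line change decorates sync_float8_amax_and_scale_history with @torch.no_grad(), so the in-place amax and scale bookkeeping it performs is not recorded by autograd. Below is a minimal sketch of that pattern; the function name, buffer names, and update rule are illustrative assumptions, not the actual float8_experimental implementation.

import torch

@torch.no_grad()
def sync_amax_and_scale_sketch(model: torch.nn.Module) -> None:
    # Refresh any "amax"/"scale" bookkeeping buffers in place. Because of
    # @torch.no_grad(), these writes create no autograd graph nodes.
    # The name filter and clamp update below are hypothetical stand-ins
    # for the real amax/scale history logic.
    for name, buf in model.named_buffers():
        if "amax" in name or "scale" in name:
            buf.copy_(buf.clamp(min=1e-12))

# Usage: call once per training iteration, outside the forward/backward pass.
model = torch.nn.Linear(8, 8)
sync_amax_and_scale_sketch(model)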
