
Commit b870038

Authored by Ahmed Shuaibi, committed by facebook-github-bot
feat: add OptimizationTechnique.NONE for non-technique-specific events (meta-pytorch#4025)
Summary: Add a NONE value to the OptimizationTechnique enum so that general planner events can indicate that no specific technique is active.

Differential Revision: D99529682
1 parent 401702a commit b870038

2 files changed: 7 additions & 1 deletion


torchrec/distributed/logging_handlers.py

Lines changed: 6 additions & 1 deletion
@@ -10,7 +10,7 @@
 import logging
 from collections import defaultdict
 from enum import Enum
-from typing import Any, Callable, Dict, Generator, Optional, TypeVar
+from typing import Any, Callable, Dict, Generator, List, Optional, TypeVar

 from torchrec.distributed.logging_utils import (
     EventLoggingHandlerBase,
@@ -202,4 +202,9 @@ def log_context(
     yield


+def detect_technique(items: List) -> OptimizationTechnique:  # type: ignore[type-arg]
+    """Detect if EMO is active. No-op OSS stub."""
+    return OptimizationTechnique.NONE
+
+
 _log_handlers: dict[str, logging.Handler] = defaultdict(logging.NullHandler)

torchrec/distributed/logging_utils.py

Lines changed: 1 addition & 0 deletions
@@ -37,6 +37,7 @@ class StackLayer(Enum):
 class OptimizationTechnique(Enum):
     """Training optimization techniques."""

+    NONE = "none"
     EMO = "emo"
     ITEP = "itep"
     ALBT = "alb t"
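To illustrate how the new NONE member fits together with the detect_technique stub, here is a minimal self-contained sketch. The OptimizationTechnique enum and detect_technique mirror the commit's additions; the log_planner_event helper is a hypothetical caller, not part of the torchrec API.

```python
from enum import Enum
from typing import List


class OptimizationTechnique(Enum):
    """Training optimization techniques (as added in logging_utils.py)."""

    NONE = "none"
    EMO = "emo"
    ITEP = "itep"
    ALBT = "albt"


def detect_technique(items: List) -> OptimizationTechnique:  # type: ignore[type-arg]
    """Detect if EMO is active. No-op OSS stub, as in logging_handlers.py."""
    return OptimizationTechnique.NONE


def log_planner_event(items: List) -> str:  # hypothetical helper, for illustration only
    """Tag a general planner event with whatever technique was detected."""
    technique = detect_technique(items)
    return f"planner_event technique={technique.value}"


print(log_planner_event([]))  # the OSS stub always reports NONE
```

Because the OSS stub unconditionally returns OptimizationTechnique.NONE, general planner events logged through such a path carry technique="none" rather than falsely claiming a specific technique is active.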
