Update AC pass use_reentrant message (pytorch#134472)
Pull Request resolved: pytorch#134472
Approved by: https://github.com/albanD
soulitzer authored and pytorchmergebot committed Aug 26, 2024
1 parent dbef2b0 commit a23dae2
Showing 1 changed file with 4 additions and 4 deletions.
8 changes: 4 additions & 4 deletions torch/utils/checkpoint.py
@@ -433,7 +433,7 @@ def checkpoint(
         use_reentrant(bool):
             specify whether to use the activation checkpoint variant that
             requires reentrant autograd. This parameter should be passed
-            explicitly. In version 2.4 we will raise an exception if
+            explicitly. In version 2.5 we will raise an exception if
             ``use_reentrant`` is not passed. If ``use_reentrant=False``,
             ``checkpoint`` will use an implementation that does not require
             reentrant autograd. This allows ``checkpoint`` to support additional
@@ -464,7 +464,7 @@ def checkpoint(
     if use_reentrant is None:
         warnings.warn(
             "torch.utils.checkpoint: the use_reentrant parameter should be "
-            "passed explicitly. In version 2.4 we will raise an exception "
+            "passed explicitly. In version 2.5 we will raise an exception "
             "if use_reentrant is not passed. use_reentrant=False is "
             "recommended, but if you need to preserve the current default "
             "behavior, you can pass use_reentrant=True. Refer to docs for more "
@@ -533,7 +533,7 @@ def checkpoint_sequential(functions, segments, input, use_reentrant=None, **kwargs):
         use_reentrant(bool):
             specify whether to use the activation checkpoint variant that
             requires reentrant autograd. This parameter should be passed
-            explicitly. In version 2.4 we will raise an exception if
+            explicitly. In version 2.5 we will raise an exception if
             ``use_reentrant`` is not passed. If ``use_reentrant=False``,
             ``checkpoint`` will use an implementation that does not require
             reentrant autograd. This allows ``checkpoint`` to support additional
@@ -553,7 +553,7 @@ def checkpoint_sequential(functions, segments, input, use_reentrant=None, **kwargs):
         warnings.warn(
             "torch.utils.checkpoint.checkpoint_sequential: the use_reentrant "
             "parameter should be passed explicitly. "
-            "In version 2.4 we will raise an exception if use_reentrant "
+            "In version 2.5 we will raise an exception if use_reentrant "
             "is not passed. use_reentrant=False is "
             "recommended, but if you need to preserve the current default "
             "behavior, you can pass use_reentrant=True. Refer to docs for more "
