
[DO NOT REVIEW][SMOKE TEST] Skip prims shape in grad transform #1704

Draft
wants to merge 21 commits into base: main

Conversation

jjsjann123 (Collaborator)

No description provided.

Base automatically changed from backward_transform_dependency_fix to main January 28, 2025 21:37
```diff
@@ -123,7 +123,7 @@ def keep_or_swap(p):
     if not isinstance(p, NumberProxyInterface):
         return p
     if p.name in seen:
-        return p.value  # don't make it a duplicate
+        return None  # don't make it a duplicate
```
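As a minimal sketch of what this dedup is doing (illustrative only: `NumberProxy` and `dedup` are simplified stand-ins, not thunder's actual code), returning `None` lets the caller drop a repeated proxy instead of substituting its concrete value:

```python
# Stand-in for thunder's NumberProxyInterface (illustrative, not thunder's code).
class NumberProxy:
    def __init__(self, name, value):
        self.name = name
        self.value = value

def dedup(proxies):
    seen = set()

    def keep_or_swap(p):
        if not isinstance(p, NumberProxy):
            return p
        if p.name in seen:
            return None  # drop the duplicate instead of baking in its value
        seen.add(p.name)
        return p

    kept = (keep_or_swap(p) for p in proxies)
    return [p for p in kept if p is not None]

proxies = [NumberProxy("i0", 3), NumberProxy("i0", 3), NumberProxy("i1", 5)]
print([p.name for p in dedup(proxies)])  # ['i0', 'i1']
```

Substituting the value (the old behavior) would turn a proxy into a constant, which is what the diff avoids.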
jjsjann123 (Collaborator, Author):

Note to self: I needed this to avoid an error in the rematerialization pass. I'll open a separate PR with a repro once I have one.

jjsjann123 (Collaborator, Author):

Hmm, this one doesn't seem to be working anymore; I'm seeing the assert in rematerialization.py again.

jjsjann123 (Collaborator, Author):

```python
import torch
import thunder

def foo(a, b):
    return a + b

a = torch.randn(1, 32, 232, 232)
a.requires_grad_()
b = torch.randn(1, 1, 232, 232)
# b.requires_grad_()

jfoo = thunder.jit(foo, cache="symbolic values")

out = jfoo(a, b)
```

It took me a while to get a repro here.
I think the issue is that saving-for-backward doesn't properly identify which gradient path isn't required, so after DCE kicks in, the saved_for_backward is inconsistent. Now I suspect I just missed a DCE pass somewhere.
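The mismatch described here can be shown with a toy sketch (purely illustrative; `dce_saved_for_backward` and its inputs are my names, not thunder's internals): if DCE removes the backward computation for an input that doesn't require grad (`b` in the repro above), the saved-for-backward list must be pruned to match, or the forward saves values the backward no longer unpacks.

```python
def dce_saved_for_backward(saved, backward_consumed):
    """Keep only the saved values that the (DCE'd) backward trace still consumes."""
    return [s for s in saved if s in backward_consumed]

saved = ["a", "b", "t0"]          # the forward wants to save all of these
backward_consumed = {"a", "t0"}   # after DCE, the grad path through `b` is gone
print(dce_saved_for_backward(saved, backward_consumed))  # ['a', 't0']
```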

jjsjann123 (Collaborator, Author):

trying my luck with #1725

```diff
 else:
     assert isinstance(new, ProxyInterface), (old, new)
-    swap_map[variableify(new)] = old
+    if variableify(old) != variableify(new):
+        swap_map[variableify(new.primal)] = old
```
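A toy sketch of the swap-map pattern in this hunk (names mirror the diff, but `Proxy`, `variableify`, and `record_swap` here are simplified stand-ins for thunder's machinery): a hashable identity keys the map, and only genuinely different proxies get a swap entry.

```python
class Proxy:
    def __init__(self, name, primal=None):
        self.name = name
        self.primal = primal  # e.g. the primal half of a forward/grad pair

def variableify(p):
    # A hashable identity for a proxy so it can key a dict (simplified).
    return p.name

def record_swap(swap_map, old, new):
    # Record a swap only when the proxies actually differ, mapping the new
    # proxy's primal back to the old proxy, as in the hunk above.
    if variableify(old) != variableify(new):
        swap_map[variableify(new.primal)] = old

swap_map = {}
old = Proxy("t0")
new = Proxy("t1", primal=Proxy("t1_primal"))
record_swap(swap_map, old, new)
print(list(swap_map))  # ['t1_primal']
```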
jjsjann123 (Collaborator, Author):

Note to self: this is a separate change; break it out into its own PR.
