
Add Context Parallel tutorial #3319


Merged: 32 commits into pytorch:main on Apr 18, 2025

Conversation

@XilunWu (Contributor) commented Apr 9, 2025


pytorch-bot bot commented Apr 9, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/tutorials/3319

Note: Links to docs will display an error until the docs builds have been completed.

❌ 1 New Failure

As of commit 5872433 with merge base 7cb6915:

NEW FAILURE - The following job has failed:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@XilunWu XilunWu requested a review from wconstab April 10, 2025 23:03
import torch.nn.functional as F  # assumed imported earlier in the tutorial

# reference output computed without context parallel; qkv (query, key, value
# tensors) is assumed to be defined earlier in the example
out = F.scaled_dot_product_attention(*qkv, is_causal=True)

# make a clean copy of QKV for output comparison
cp_qkv = [t.detach().clone() for t in qkv]
A contributor left an inline comment:

Ahh, so this is not even needed for CP, just for the reference?

I wonder if it's better to delete the reference. That's more appropriate for a unit test; for an example, people usually want something minimal and copyable, and this line might distract them.
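If the reference path were dropped, a minimal CP-only example might look roughly like the sketch below. This is a hedged sketch, not the tutorial's final code: it assumes a torchrun launch, a 1-D DeviceMesh, and the experimental context_parallel context manager from torch.distributed.tensor.experimental (an experimental API whose surface may change); shapes and names are illustrative.

import os
import torch
import torch.distributed as dist
import torch.nn.functional as F
from torch.distributed.device_mesh import init_device_mesh
from torch.distributed.tensor.experimental import context_parallel

# assumes a torchrun launch, so RANK / WORLD_SIZE / LOCAL_RANK are set
dist.init_process_group(backend="nccl")
torch.cuda.set_device(int(os.environ["LOCAL_RANK"]))
mesh = init_device_mesh("cuda", (dist.get_world_size(),))

# illustrative shapes: (batch, heads, seq_len, head_dim); seq_len should be
# divisible by 2 * world_size for the default load-balanced sharding
qkv = [torch.rand(2, 8, 4096, 64, device="cuda") for _ in range(3)]

# shard the sequence dimension (dim 2) of Q, K, and V across the mesh,
# then run SDPA inside the context manager so the ring-attention path is used
with context_parallel(mesh, buffers=tuple(qkv), buffer_seq_dims=(2, 2, 2)):
    cp_out = F.scaled_dot_product_attention(*qkv, is_causal=True)

dist.destroy_process_group()

This would be launched with something like torchrun --nproc_per_node=4 example.py.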

@AlannaBurke (Contributor) left a comment


Just some minor formatting fixes.

@fegin left a comment


There is an incorrect statement about pass-KV; we should change that.
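For context on the pass-KV discussion: the experimental context parallel implementation exposes a way to select how K/V shards are communicated during ring attention. A hedged sketch of selecting the rotation approach follows; note that the set_rotate_method helper currently lives in a private module (torch.distributed.tensor.experimental._attention) and may move or change.

# choose how K/V shards are exchanged across ranks during ring attention
from torch.distributed.tensor.experimental._attention import set_rotate_method

set_rotate_method("allgather")   # all-gather based pass-KV
# set_rotate_method("alltoall")  # rotate KV shards peer-to-peer between steps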

@XilunWu XilunWu requested review from AlannaBurke and fegin April 14, 2025 21:37
@fegin left a comment


Two final comments. LGTM

@svekars svekars changed the base branch from 2.7-RC-TEST to main April 15, 2025 18:41
@svekars svekars changed the base branch from main to 2.7-RC-TEST April 15, 2025 18:41
@svekars (Contributor) commented Apr 15, 2025

Can you please rebase the base branch onto main?

@XilunWu XilunWu changed the base branch from 2.7-RC-TEST to main April 15, 2025 20:11
@XilunWu XilunWu changed the base branch from main to 2.7-RC-TEST April 16, 2025 19:00
@XilunWu XilunWu changed the base branch from 2.7-RC-TEST to main April 16, 2025 19:09
@svekars svekars merged commit aebeff4 into pytorch:main Apr 18, 2025
18 of 19 checks passed
8 participants