Remove libOMP linking for experimental kernels #1822
Conversation
Helpful links: see artifacts and rendered test results at hud.pytorch.org/pr/pytorch/ao/1822
Note: links to docs will display an error until the docs builds have completed. This comment was automatically generated by Dr. CI and updates every 15 minutes.
@metascroy @manuelcandales Is there CI that automatically checks for this? How should we test this? Edit: Hmmm, interesting — Run TorchAO Experimental Tests / test (macos-14) passed on main.
It looks like it fails with a linking error if you don't make the link explicit. It might have to do with the version of PyTorch being used in the test. Do you know which version has OMP bundled?
I think you're right; it was updated circa pytorch/pytorch#145870. I'll bump the pinned PyTorch version.
lgtm, feel free to merge once CI passes
Remade as PR #1828; this one got flooded by a bad rebase.
As mentioned in pytorch/torchchat#1493, we no longer need to explicitly link against the OMP libs.
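The change can be sketched as a CMake fragment. This is an illustrative assumption, not the repo's actual build files: the target name `torchao_experimental_kernels` is hypothetical, and the variables `TORCH_LIBRARIES`/`OpenMP::OpenMP_CXX` are the standard names exported by libtorch's CMake config and CMake's FindOpenMP module.

```cmake
# Before (hypothetical): the experimental kernels located and linked
# libomp explicitly.
#
#   find_package(OpenMP REQUIRED)
#   target_link_libraries(torchao_experimental_kernels
#       PRIVATE OpenMP::OpenMP_CXX)

# After: rely on the OpenMP runtime already bundled with the pinned
# PyTorch (circa pytorch/pytorch#145870). Linking libtorch transitively
# provides the OMP symbols, so no explicit libomp link line is needed.
target_link_libraries(torchao_experimental_kernels
    PRIVATE ${TORCH_LIBRARIES})
```

One reason to avoid the explicit link once PyTorch bundles its own runtime: loading two copies of an OpenMP runtime into one process (e.g. a separately linked libomp plus the one shipped with libtorch) can cause crashes or undefined behavior at runtime, which is a known hazard on macOS in particular.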