Add autoloading tutorial #3037


Merged: 50 commits (Oct 9, 2024)

Commits
ee33c2d
docs: add autoload
shink Sep 5, 2024
6215e6e
update
shink Sep 5, 2024
9f23f16
update
shink Sep 5, 2024
37c72ee
update
shink Sep 5, 2024
67ad7ff
update
shink Sep 5, 2024
22ddca1
Merge branch 'main' into docs/autoload
svekars Sep 5, 2024
891753e
update
shink Sep 6, 2024
54dd911
Merge branch 'main' into docs/autoload
shink Sep 6, 2024
5c9d78b
update
shink Sep 6, 2024
aa47a21
update
shink Sep 6, 2024
7caaad8
Merge branch 'pytorch:main' into docs/autoload
shink Sep 9, 2024
b4e884d
update
shink Sep 9, 2024
523d289
update
shink Sep 9, 2024
b3766ed
update
shink Sep 9, 2024
5d9adfb
update
shink Sep 9, 2024
5d36b03
Update advanced_source/python_extension_autoload.rst
shink Sep 9, 2024
9123d82
Update advanced_source/python_extension_autoload.rst
shink Sep 9, 2024
e726565
Update advanced_source/python_extension_autoload.rst
shink Sep 9, 2024
84879c6
update
shink Sep 10, 2024
6ac2d2e
update
shink Sep 10, 2024
5a0f00e
update
shink Sep 10, 2024
a85ebed
update
shink Sep 10, 2024
2db4cee
update
shink Sep 10, 2024
4d44a78
Merge branch 'main' into docs/autoload
shink Sep 10, 2024
d5fe718
add authors
shink Sep 14, 2024
b6281bf
Update advanced_source/python_extension_autoload.rst
shink Sep 14, 2024
a4ace51
Update advanced_source/python_extension_autoload.rst
shink Sep 14, 2024
3980ab7
Update advanced_source/python_extension_autoload.rst
shink Sep 14, 2024
dcb5fd3
Update advanced_source/python_extension_autoload.rst
shink Sep 14, 2024
93087be
Update advanced_source/python_extension_autoload.rst
shink Sep 14, 2024
a48cfc4
Update advanced_source/python_extension_autoload.rst
shink Sep 14, 2024
d1217dc
Update advanced_source/python_extension_autoload.rst
shink Sep 14, 2024
23cfef4
Update advanced_source/python_extension_autoload.rst
shink Sep 14, 2024
3c0c1e0
Update advanced_source/python_extension_autoload.rst
shink Sep 14, 2024
dda22c4
Update advanced_source/python_extension_autoload.rst
shink Sep 14, 2024
0a47d48
Update advanced_source/python_extension_autoload.rst
shink Sep 14, 2024
bcbe0f6
Update advanced_source/python_extension_autoload.rst
shink Sep 14, 2024
80c8683
Update advanced_source/python_extension_autoload.rst
shink Sep 14, 2024
2ba51d0
Update advanced_source/python_extension_autoload.rst
shink Sep 14, 2024
5ea5a36
Merge branch 'main' into docs/autoload
shink Sep 14, 2024
f8365e8
Update advanced_source/python_extension_autoload.rst
shink Sep 14, 2024
0cc9850
Update advanced_source/python_extension_autoload.rst
shink Sep 14, 2024
33c60cc
Update advanced_source/python_extension_autoload.rst
shink Sep 14, 2024
ee5c353
Update advanced_source/python_extension_autoload.rst
shink Sep 14, 2024
e425fe9
update
shink Sep 14, 2024
f1018e3
Merge branch 'main' into docs/autoload
svekars Sep 23, 2024
ebfcbff
Merge branch 'main' into docs/autoload
shink Sep 24, 2024
0b52b02
move to prototype_source
shink Sep 24, 2024
9a1b2f7
Merge branch 'main' into docs/autoload
svekars Sep 25, 2024
d45477c
Merge branch 'main' into docs/autoload
svekars Oct 9, 2024
Binary file added _static/img/python_extension_autoload_impl.png
The diagram seems missing a pointer from import torch to the entry points to load - the loading of entry points is triggered by import torch, right?

Contributor Author:
yes, I will make some changes to this diagram

Contributor Author:
@jgong5 updated, please have a look

116 changes: 116 additions & 0 deletions advanced_source/python_extension_autoload.rst
@@ -0,0 +1,116 @@
Out-of-tree extension autoloading in Python
===========================================

What is it?
-----------

The extension autoloading mechanism enables PyTorch to automatically
load out-of-tree backend extensions without explicit import statements.
This benefits users in two ways. First, it improves the user experience:
users can follow the familiar PyTorch device programming model without
needing to explicitly load or import device-specific extensions. Second,
existing PyTorch applications can be adopted on out-of-tree devices with
zero code changes. For more information,
see `[RFC] Autoload Device Extension <https://github.com/pytorch/pytorch/issues/122468>`_.

.. note::

   This feature is enabled by default and can be disabled with
   ``export TORCH_DEVICE_BACKEND_AUTOLOAD=0``.
   If you get an error such as "Failed to load the backend extension",
   the error is unrelated to PyTorch itself: disable this feature and ask
   the maintainer of the out-of-tree extension for help.

How to apply this mechanism to out-of-tree extensions?
------------------------------------------------------

Suppose you have a backend named ``foo`` and a corresponding package named
``torch_foo``. Make sure your package is based on PyTorch 2.5 or later and
includes the following in its ``__init__.py``:

.. code-block:: python

   def _autoload():
       print("No need to import torch_foo anymore! You can run torch.foo.is_available() directly.")

Then the only thing you need to do is add an entry point to your Python
package:

.. code-block:: python

   setup(
       name="torch_foo",
       version="1.0",
       entry_points={
           "torch.backends": [
               "torch_foo = torch_foo:_autoload",
           ],
       }
   )
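
The entry point value string above follows the standard ``name = module:attribute``
syntax from the Python packaging entry-points specification. As a small aside,
the following standard-library sketch (reusing the hypothetical ``torch_foo``
names from the example) shows how such a string is modeled and parsed:

```python
# Sketch: how an entry point like "torch_foo = torch_foo:_autoload" is
# modeled by the standard library. "torch_foo" is the hypothetical
# package from the example above, not a real distribution.
from importlib.metadata import EntryPoint

ep = EntryPoint(name="torch_foo",
                value="torch_foo:_autoload",
                group="torch.backends")

# .module and .attr are derived from the "module:attribute" value string.
print(ep.module)  # the module to import
print(ep.attr)    # the callable to invoke after import
```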

Now the ``torch_foo`` module is imported automatically when running ``import torch``:

.. code-block:: python

   >>> import torch
   No need to import torch_foo anymore! You can run torch.foo.is_available() directly.
   >>> torch.foo.is_available()
   True

Examples
^^^^^^^^

`habana_frameworks.torch`_ is a Python package that enables users to run
PyTorch programs on Intel Gaudi via the PyTorch ``HPU`` device key.
``import habana_frameworks.torch`` is no longer necessary after this mechanism
is applied.

.. _habana_frameworks.torch: https://docs.habana.ai/en/latest/PyTorch/Getting_Started_with_PyTorch_and_Gaudi/Getting_Started_with_PyTorch.html

.. code-block:: diff

   import torch
   import torchvision.models as models
   - import habana_frameworks.torch # <-- extra import
   model = models.resnet50().eval().to("hpu")
   input = torch.rand(128, 3, 224, 224).to("hpu")
   output = model(input)

Contributor Author:

@jgong5 should the implementation example of this mechanism be added here?


Yes, please leave it here for now. @bsochack is the habana bridge ready for autoloading?


It is implemented and verified. Not yet released.

Contributor Author:

@bsochack we need an implementation example here, can you provide one? Thanks!


Any suggestion for how it should be provided?

The only challenge we had was circular dependencies. We solved it as shown below; other than that, it worked out of the box.

``habana_frameworks/__init__.py``:

import os

is_loaded = False  # A member variable of habana_frameworks module to track if our module has been imported

def __autoload():
    # This is an entrypoint for pytorch autoload mechanism
    # If the following condition is true, our backend has already been loaded, either explicitly
    # or by the autoload mechanism, and importing it again should be skipped to avoid circular imports
    global is_loaded
    if is_loaded:
        return
    import habana_frameworks.torch

``habana_frameworks/torch/__init__.py``:

import os

# This is to prevent torch autoload mechanism from causing circular imports
import habana_frameworks

habana_frameworks.is_loaded = True

and one addition in setup.py:

entry_points={
        "torch.backends": [
            "device_backend = habana_frameworks:__autoload",
        ],
},

Contributor Author:

@bsochack thanks so much for your quick reply!

The circular dependency issue is worth noting, I will state this in this doc.

I noticed habana_frameworks is not an open-source project, so are there any public links about the autoloading mechanism? It's OK if not.


Right, habana_frameworks is not open source and there is no official release with the above change.
Btw, it requires PT 2.5 (not yet released) so it is hard to provide a real example.

Contributor Author:

@bsochack OK it's alright

Contributor Author:

@bsochack updated, please take a look at the HPU section


lgtm

`torch_npu`_ enables users to run PyTorch programs on Huawei Ascend NPUs. It
leverages the ``PrivateUse1`` device key and exposes the device name
as ``npu`` to end users.
``import torch_npu`` is likewise no longer needed after applying this mechanism.

.. _torch_npu: https://github.com/Ascend/pytorch

.. code-block:: diff

   import torch
   import torchvision.models as models
   - import torch_npu # <-- extra import
   model = models.resnet50().eval().to("npu")
   input = torch.rand(128, 3, 224, 224).to("npu")
   output = model(input)

How it works
------------

.. image:: ../_static/img/python_extension_autoload_impl.png
   :alt: Autoloading implementation
   :align: center

This mechanism is implemented on top of Python's `entry points
<https://packaging.python.org/en/latest/specifications/entry-points/>`_
specification. During ``import torch``, PyTorch discovers and loads, in
``torch/__init__.py``, all entry points that out-of-tree extensions have
defined. The implementation is in `[RFC] Add support for device extension
autoloading <https://github.com/pytorch/pytorch/pull/127074>`_.

Conclusion
----------

This tutorial has guided you through the out-of-tree extension autoloading
mechanism, including its usage and implementation.
8 changes: 8 additions & 0 deletions index.rst
@@ -509,6 +509,13 @@ Welcome to PyTorch Tutorials
:link: advanced/privateuseone.html
:tags: Extending-PyTorch,Frontend-APIs,C++

.. customcarditem::
   :header: Out-of-tree extension autoloading in Python
   :card_description: Learn how to achieve seamless integration of out-of-tree extensions with PyTorch using the autoloading mechanism.
   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
   :link: advanced/python_extension_autoload.html
   :tags: Extending-PyTorch

.. customcarditem::
   :header: Custom Function Tutorial: Double Backward
   :card_description: Learn how to write a custom autograd Function that supports double backward.
@@ -1110,6 +1117,7 @@ Additional Resources
   advanced/dispatcher
   advanced/extend_dispatcher
   advanced/privateuseone
   advanced/python_extension_autoload

.. toctree::
   :maxdepth: 2