Add autoloading tutorial #3037

Merged
merged 50 commits into from Oct 9, 2024

50 commits:

ee33c2d docs: add autoload (shink, Sep 5, 2024)
6215e6e update (shink, Sep 5, 2024)
9f23f16 update (shink, Sep 5, 2024)
37c72ee update (shink, Sep 5, 2024)
67ad7ff update (shink, Sep 5, 2024)
22ddca1 Merge branch 'main' into docs/autoload (svekars, Sep 5, 2024)
891753e update (shink, Sep 6, 2024)
54dd911 Merge branch 'main' into docs/autoload (shink, Sep 6, 2024)
5c9d78b update (shink, Sep 6, 2024)
aa47a21 update (shink, Sep 6, 2024)
7caaad8 Merge branch 'pytorch:main' into docs/autoload (shink, Sep 9, 2024)
b4e884d update (shink, Sep 9, 2024)
523d289 update (shink, Sep 9, 2024)
b3766ed update (shink, Sep 9, 2024)
5d9adfb update (shink, Sep 9, 2024)
5d36b03 Update advanced_source/python_extension_autoload.rst (shink, Sep 9, 2024)
9123d82 Update advanced_source/python_extension_autoload.rst (shink, Sep 9, 2024)
e726565 Update advanced_source/python_extension_autoload.rst (shink, Sep 9, 2024)
84879c6 update (shink, Sep 10, 2024)
6ac2d2e update (shink, Sep 10, 2024)
5a0f00e update (shink, Sep 10, 2024)
a85ebed update (shink, Sep 10, 2024)
2db4cee update (shink, Sep 10, 2024)
4d44a78 Merge branch 'main' into docs/autoload (shink, Sep 10, 2024)
d5fe718 add authors (shink, Sep 14, 2024)
b6281bf Update advanced_source/python_extension_autoload.rst (shink, Sep 14, 2024)
a4ace51 Update advanced_source/python_extension_autoload.rst (shink, Sep 14, 2024)
3980ab7 Update advanced_source/python_extension_autoload.rst (shink, Sep 14, 2024)
dcb5fd3 Update advanced_source/python_extension_autoload.rst (shink, Sep 14, 2024)
93087be Update advanced_source/python_extension_autoload.rst (shink, Sep 14, 2024)
a48cfc4 Update advanced_source/python_extension_autoload.rst (shink, Sep 14, 2024)
d1217dc Update advanced_source/python_extension_autoload.rst (shink, Sep 14, 2024)
23cfef4 Update advanced_source/python_extension_autoload.rst (shink, Sep 14, 2024)
3c0c1e0 Update advanced_source/python_extension_autoload.rst (shink, Sep 14, 2024)
dda22c4 Update advanced_source/python_extension_autoload.rst (shink, Sep 14, 2024)
0a47d48 Update advanced_source/python_extension_autoload.rst (shink, Sep 14, 2024)
bcbe0f6 Update advanced_source/python_extension_autoload.rst (shink, Sep 14, 2024)
80c8683 Update advanced_source/python_extension_autoload.rst (shink, Sep 14, 2024)
2ba51d0 Update advanced_source/python_extension_autoload.rst (shink, Sep 14, 2024)
5ea5a36 Merge branch 'main' into docs/autoload (shink, Sep 14, 2024)
f8365e8 Update advanced_source/python_extension_autoload.rst (shink, Sep 14, 2024)
0cc9850 Update advanced_source/python_extension_autoload.rst (shink, Sep 14, 2024)
33c60cc Update advanced_source/python_extension_autoload.rst (shink, Sep 14, 2024)
ee5c353 Update advanced_source/python_extension_autoload.rst (shink, Sep 14, 2024)
e425fe9 update (shink, Sep 14, 2024)
f1018e3 Merge branch 'main' into docs/autoload (svekars, Sep 23, 2024)
ebfcbff Merge branch 'main' into docs/autoload (shink, Sep 24, 2024)
0b52b02 move to prototype_source (shink, Sep 24, 2024)
9a1b2f7 Merge branch 'main' into docs/autoload (svekars, Sep 25, 2024)
d45477c Merge branch 'main' into docs/autoload (svekars, Oct 9, 2024)

Binary file added _static/img/python_extension_autoload_impl.png

Reviewer comment:

The diagram seems to be missing a pointer from import torch to the entry points to load - the loading of entry points is triggered by import torch, right?

Contributor Author:

Yes, I will make some changes to this diagram.

Contributor Author:

@jgong5 Updated, please have a look.

170 changes: 170 additions & 0 deletions advanced_source/python_extension_autoload.rst
@@ -0,0 +1,170 @@
Out-of-tree extension autoloading in Python
===========================================

What is it?
-----------

The extension autoloading mechanism enables PyTorch to automatically
load out-of-tree backend extensions without explicit import statements. This
mechanism is very useful for users. On the one hand, it improves the user
experience by letting users follow the familiar PyTorch device programming
model without having to explicitly load or import device-specific extensions.
On the other hand, it enables existing PyTorch applications to run on
out-of-tree devices with zero code changes. For further details, refer to the
`[RFC] Autoload Device Extension <https://github.com/pytorch/pytorch/issues/122468>`_.

.. note::

    This feature is enabled by default and can be disabled using
    ``export TORCH_DEVICE_BACKEND_AUTOLOAD=0``.
    If you get an error like "Failed to load the backend extension",
    the error is unrelated to PyTorch itself; disable this feature
    and ask the maintainer of the out-of-tree extension for help.

How to apply this mechanism to out-of-tree extensions?
------------------------------------------------------

For instance, suppose you have a backend named ``foo`` and a corresponding package named ``torch_foo``. Ensure that
your package is compatible with PyTorch 2.5+ and includes the following snippet in its ``__init__.py`` file:

.. code-block:: python

    def _autoload():
        print("No need to import torch_foo anymore! Check things are working with `torch.foo.is_available()`.")

Then the only thing you need to do is define an entry point within your Python package:

.. code-block:: python

    setup(
        name="torch_foo",
        version="1.0",
        entry_points={
            "torch.backends": [
                "torch_foo = torch_foo:_autoload",
            ],
        }
    )

Now the ``torch_foo`` module is imported automatically when you run ``import torch``:

.. code-block:: python

    >>> import torch
    No need to import torch_foo anymore! Check things are working with `torch.foo.is_available()`.
    >>> torch.foo.is_available()
    True
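
If the message does not show up, a quick sanity check is to confirm that the entry point
is visible to Python at all. This check is optional and not part of the mechanism itself;
the expected output below assumes the ``torch_foo`` example above, and
``entry_points(group=...)`` requires Python 3.10 or newer:

.. code-block:: python

    >>> from importlib.metadata import entry_points
    >>> [ep.name for ep in entry_points(group="torch.backends")]
    ['torch_foo']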

You may encounter issues with circular imports; the following examples show how to address them.

Examples
^^^^^^^^

Here we take Intel Gaudi HPU and Huawei Ascend NPU as examples to show how to
integrate an out-of-tree extension with PyTorch using the autoloading mechanism.

`habana_frameworks.torch`_ is a Python package that enables users to run
PyTorch programs on Intel Gaudi via the PyTorch ``HPU`` device key.

.. _habana_frameworks.torch: https://docs.habana.ai/en/latest/PyTorch/Getting_Started_with_PyTorch_and_Gaudi/Getting_Started_with_PyTorch.html

Since ``habana_frameworks.torch`` is a submodule of ``habana_frameworks``, we add an entry point for
``__autoload()`` in ``habana_frameworks/setup.py``:

.. code-block:: diff

    setup(
        name="habana_frameworks",
        version="2.5",
    +   entry_points={
    +       'torch.backends': [
    +           "device_backend = habana_frameworks:__autoload",
    +       ],
    +   }
    )

In ``habana_frameworks/__init__.py``, we use a global variable to track whether our module has been loaded:

.. code-block:: python

    import os

    is_loaded = False  # A member variable of habana_frameworks module to track if our module has been imported

    def __autoload():
        # This is an entrypoint for pytorch autoload mechanism
        # If the following condition is true, that means our backend has already been loaded, either explicitly
        # or by the autoload mechanism and importing it again should be skipped to avoid circular imports
        global is_loaded
        if is_loaded:
            return
        import habana_frameworks.torch

In ``habana_frameworks/torch/__init__.py``, we prevent circular imports by updating the state of the global variable:

.. code-block:: python

    import os

    # This is to prevent torch autoload mechanism from causing circular imports
    import habana_frameworks

    habana_frameworks.is_loaded = True
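
With both pieces in place, the backend is ready as soon as PyTorch is imported. The
snippet below only illustrates the intended user experience and assumes a machine with
the Gaudi software stack installed; it is not part of the integration itself:

.. code-block:: python

    # habana_frameworks.torch is loaded automatically by the entry point
    import torch

    # Illustrative only: tensors can target the HPU device without an explicit
    # `import habana_frameworks.torch` in user code.
    x = torch.randn(2, 3, device="hpu")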

`torch_npu`_ enables users to run PyTorch programs on Huawei Ascend NPU. It
leverages the ``PrivateUse1`` device key and exposes the device name
``npu`` to end users.

.. _torch_npu: https://github.com/Ascend/pytorch

We define an entry point in `torch_npu/setup.py`_:

.. _torch_npu/setup.py: https://github.com/Ascend/pytorch/blob/master/setup.py#L618

.. code-block:: diff

    setup(
        name="torch_npu",
        version="2.5",
    +   entry_points={
    +       'torch.backends': [
    +           'torch_npu = torch_npu:_autoload',
    +       ],
    +   }
    )

Unlike ``habana_frameworks``, ``torch_npu`` uses the environment variable ``TORCH_DEVICE_BACKEND_AUTOLOAD``
to control the autoloading process. For example, setting it to ``0`` disables autoloading, which prevents circular imports:

.. code-block:: python

    import os

    # Disable autoloading before running 'import torch'
    os.environ['TORCH_DEVICE_BACKEND_AUTOLOAD'] = '0'

    import torch
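
For reference, here is a minimal sketch of how the ``torch_npu`` side could tie the
environment-variable guard and the entry point together in its ``__init__.py``. This is
a simplified, hypothetical illustration, not the actual ``torch_npu`` implementation:

.. code-block:: python

    # torch_npu/__init__.py (simplified, hypothetical sketch)
    import os

    # Guard against circular imports: if a user imports torch_npu directly,
    # the nested `import torch` below must not autoload torch_npu again.
    os.environ['TORCH_DEVICE_BACKEND_AUTOLOAD'] = '0'

    import torch


    def _autoload():
        # Entry point called by torch's autoload mechanism; importing this
        # module (above) is all that is needed to register the NPU backend.
        pass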

How it works
------------

.. image:: ../_static/img/python_extension_autoload_impl.png
    :alt: Autoloading implementation
    :align: center

This mechanism is built on Python's `entry points
<https://packaging.python.org/en/latest/specifications/entry-points/>`_
mechanism. In ``torch/__init__.py``, we discover and load all entry points
in the ``torch.backends`` group that out-of-tree extensions have defined.
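
For illustration, the discovery step can be approximated with ``importlib.metadata``.
The following is a simplified sketch of the idea, not the exact code that ships in
``torch/__init__.py``, and ``entry_points(group=...)`` requires Python 3.10 or newer:

.. code-block:: python

    import os
    from importlib.metadata import entry_points

    def _autoload_backend_extensions():
        # Respect the opt-out switch described above
        if os.getenv("TORCH_DEVICE_BACKEND_AUTOLOAD", "1") == "0":
            return
        # Discover every entry point registered under the "torch.backends" group
        for backend in entry_points(group="torch.backends"):
            try:
                hook = backend.load()  # imports the extension's module
                hook()                 # calls the registered function, e.g. _autoload()
            except Exception as err:
                raise RuntimeError(
                    f"Failed to load the backend extension: {backend.name}"
                ) from err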

As shown in the ``torch_foo`` example above, once the package is installed, loading
the entry point imports your Python module, and calling the registered function gives
you a place to perform any setup your backend needs.

See the implementation in this pull request: `[RFC] Add support for device extension autoloading
<https://github.com/pytorch/pytorch/pull/127074>`_.

Contributor:

I'm not sure if we should link to a pull request. Maybe you can link to the file - it looks like the changes have landed already.

Contributor Author:

This section talks about the implementation of autoloading, so I think it makes sense to link to a PR here. The PR lets users see what has changed and follow the discussion about the implementation.

Conclusion
----------

This tutorial has guided you through the out-of-tree extension autoloading
mechanism, including its usage and implementation.
8 changes: 8 additions & 0 deletions index.rst
@@ -509,6 +509,13 @@ Welcome to PyTorch Tutorials
:link: advanced/privateuseone.html
:tags: Extending-PyTorch,Frontend-APIs,C++

.. customcarditem::
:header: Out-of-tree extension autoloading in Python
:card_description: Learn how to seamlessly integrate out-of-tree extensions with PyTorch using the autoloading mechanism.
:image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
:link: advanced/python_extension_autoload.html
:tags: Extending-PyTorch,Frontend-APIs

.. customcarditem::
:header: Custom Function Tutorial: Double Backward
:card_description: Learn how to write a custom autograd Function that supports double backward.
@@ -1110,6 +1117,7 @@ Additional Resources
advanced/dispatcher
advanced/extend_dispatcher
advanced/privateuseone
advanced/python_extension_autoload

.. toctree::
:maxdepth: 2