Fix dask #89

Merged 12 commits on Feb 27, 2024.
Changes from all commits
3 changes: 3 additions & 0 deletions .gitignore
@@ -127,3 +127,6 @@ dmypy.json

# Pyre type checker
.pyre/

# macOS specific files
.DS_Store
28 changes: 26 additions & 2 deletions README.md
@@ -126,11 +126,11 @@ part of the specification but which are useful for using the array API:
[`x.device`](https://data-apis.org/array-api/latest/API_specification/generated/signatures.array_object.array.device.html)
in the array API specification. Included because `numpy.ndarray` does not
include the `device` attribute and this library does not wrap or extend the
- array object. Note that for NumPy, `device(x)` is always `"cpu"`.
+ array object. Note that for NumPy and dask, `device(x)` is always `"cpu"`.

- `to_device(x, device, /, *, stream=None)`: Equivalent to
[`x.to_device`](https://data-apis.org/array-api/latest/API_specification/generated/signatures.array_object.array.to_device.html).
- Included because neither NumPy's, CuPy's, nor PyTorch's array objects
+ Included because neither NumPy's, CuPy's, Dask's, nor PyTorch's array objects
include this method. For NumPy, this function effectively does nothing since
the only supported device is the CPU, but for CuPy, this method supports
CuPy CUDA
@@ -241,6 +241,30 @@ Unlike the other libraries supported here, JAX array API support is contained
entirely in the JAX library. The JAX array API support is tracked at
https://github.com/google/jax/issues/18353.

## Dask

If you're using dask with numpy, many of the same limitations that apply to numpy
will also apply to dask. In addition, dask is missing sort functionality (no `sort`
or `argsort`) and has only limited support for the optional `linalg` and `fft`
extensions.

In particular, the `fft` namespace is not compliant with the array API spec. Any functions
that you find under the `fft` namespace are the original, unwrapped functions under
[`dask.array.fft`](https://docs.dask.org/en/latest/array-api.html#fast-fourier-transforms),
which may or may not be Array API compliant. Use at your own risk!

> **Review comment (Member):** Out of interest, is the plan to implement
> `array_api_compat.dask.{fft, linalg}` or wait for support from dask itself? A similar
> question w.r.t. JAX.
>
> **Reply (Contributor Author):** I haven't attempted to wrap fft yet; I'm waiting on #78
> to do so. Linalg can only be partially supported by us, since there are missing methods
> in dask.

For `linalg`, several methods are missing, for example:
- `cross`
- `det`
- `eigh`
- `eigvalsh`
- `matrix_power`
- `pinv`
- `slogdet`
- `matrix_norm`
- `matrix_rank`

Other methods may only be partially implemented or return incorrect results at times.

The minimum supported Dask version is 2023.12.0.
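
As a short illustration (a sketch, not part of this PR itself), the wrapped dask
namespace can be obtained and used like any other array API namespace:

```python
# Minimal sketch, assuming array-api-compat and dask are installed.
import dask.array as da
from array_api_compat import array_namespace

x = da.ones((4, 4), chunks=(2, 2))
xp = array_namespace(x)  # resolves to the wrapped dask.array namespace

# Computation stays lazy until .compute() is called, as usual with dask
y = x + 1
print(xp.mean(y).compute())  # 2.0
```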

## Vendoring

This library supports vendoring as an installation method. To vendor the
23 changes: 21 additions & 2 deletions array_api_compat/common/_helpers.py
@@ -159,7 +159,16 @@ def _check_device(xp, device):
    if device not in ["cpu", None]:
        raise ValueError(f"Unsupported device for NumPy: {device!r}")

-# device() is not on numpy.ndarray and to_device() is not on numpy.ndarray
+# Placeholder object to represent the dask device
+# when the array backend is not the CPU.
+# (since it is not easy to tell which device a dask array is on)
+class _dask_device:
+    def __repr__(self):
+        return "DASK_DEVICE"
+
+_DASK_DEVICE = _dask_device()
+
+# device() is not on numpy.ndarray or dask.array and to_device() is not on numpy.ndarray
# or cupy.ndarray. They are not included in array objects of this library
# because this library just reuses the respective ndarray classes without
# wrapping or subclassing them. These helper functions can be used instead of
@@ -181,7 +190,17 @@ def device(x: Array, /) -> Device:
"""
if is_numpy_array(x):
return "cpu"
Copy link
Member

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

As I noted on the other PR, it would probably be better to use some kind of basic DaskDevice object here instead of the string "cpu", given that CPU isn't necessarily an accurate description of the device dask is running on. See https://github.com/data-apis/array-api-strict/blob/main/array_api_strict/_array_object.py#L43-L49 for example.

Copy link
Contributor Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

I return cpu now only if the type of the array backing the dask array is a ndarray.

The rest of the time, I return a DaskDevice.

Is this something close to what you wanted?

(We might be able to do this for cupy, but it's tricky for e.g. multigpu cases I guess)

if is_jax_array(x):
elif is_dask_array(x):
# Peek at the metadata of the jax array to determine type
try:
import numpy as np
if isinstance(x._meta, np.ndarray):
# Must be on CPU since backed by numpy
return "cpu"
except ImportError:
pass
return _DASK_DEVICE
elif is_jax_array(x):
# JAX has .device() as a method, but it is being deprecated so that it
# can become a property, in accordance with the standard. In order for
# this function to not break when JAX makes the flip, we check for
Expand Down
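
To illustrate the new branch above, a quick sketch of calling the helper (not part of
the PR; the output comments just follow the logic shown in the diff):

```python
# Sketch, assuming dask and array-api-compat are installed.
import dask.array as da
from array_api_compat import device

x = da.ones(5)    # backed by numpy.ndarray, so x._meta is an np.ndarray
print(device(x))  # -> "cpu"

# A dask array backed by another library (e.g. cupy) would instead yield
# the _DASK_DEVICE placeholder, which prints as DASK_DEVICE.
```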
Empty file.
5 changes: 3 additions & 2 deletions array_api_compat/dask/array/_aliases.py
@@ -36,7 +36,8 @@
from typing import TYPE_CHECKING
if TYPE_CHECKING:
    from typing import Optional, Union
-    from ...common._typing import ndarray, Device, Dtype
+
+    from ...common._typing import Device, Dtype, Array

import dask.array as da

@@ -60,7 +61,7 @@ def _dask_arange(
    dtype: Optional[Dtype] = None,
    device: Optional[Device] = None,
    **kwargs,
-) -> ndarray:
+) -> Array:
    _check_device(xp, device)
    args = [start]
    if stop is not None:
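
For illustration, the wrapped creation functions accept the standard `device` keyword
and route it through `_check_device`; a small sketch (the `"gpu"` value is just an
example of an unsupported device):

```python
# Sketch, assuming dask and array-api-compat are installed.
import array_api_compat.dask.array as xp

a = xp.arange(5, device="cpu")  # "cpu" and None are accepted
print(a.compute())              # [0 1 2 3 4]

# Anything else is rejected by _check_device:
# xp.arange(5, device="gpu")  # would raise ValueError
```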
15 changes: 11 additions & 4 deletions array_api_compat/dask/array/linalg.py
@@ -1,6 +1,5 @@
from __future__ import annotations

-from dask.array.linalg import svd
from ...common import _linalg
from ..._internal import get_xp

@@ -16,8 +15,7 @@

from typing import TYPE_CHECKING
if TYPE_CHECKING:
-    from typing import Union, Tuple
-    from ...common._typing import ndarray
+    from ...common._typing import Array

# cupy.linalg doesn't have __all__. If it is added, replace this with
#
@@ -39,7 +37,16 @@
matrix_rank = get_xp(da)(_linalg.matrix_rank)
matrix_norm = get_xp(da)(_linalg.matrix_norm)


+# Wrap svd so that full_matrices is not passed through to dask:
+# reduced SVD is dask's only behavior, and dask doesn't have the
+# full_matrices keyword.
+def svd(x: Array, full_matrices: bool = True, **kwargs) -> SVDResult:
+    if full_matrices:
+        raise ValueError("full_matrices=True is not supported by dask.")
+    return da.linalg.svd(x, **kwargs)
+
-def svdvals(x: ndarray) -> Union[ndarray, Tuple[ndarray, ...]]:
+def svdvals(x: Array) -> Array:
    # TODO: can't avoid computing U or V for dask
    _, s, _ = svd(x, full_matrices=False)
    return s
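
Since the wrapper above rejects the default `full_matrices=True`, callers have to opt
into the reduced SVD explicitly. A short sketch (shape and chunking are illustrative,
chosen tall-and-skinny to satisfy dask's SVD requirements for chunked arrays):

```python
# Sketch, assuming dask and array-api-compat are installed.
import dask.array as da
from array_api_compat.dask.array import linalg

x = da.random.random((100, 10), chunks=(50, 10))  # tall-and-skinny
u, s, vh = linalg.svd(x, full_matrices=False)     # reduced SVD (dask's only mode)
print(s.compute().shape)                          # (10,)

# linalg.svd(x)  # the default full_matrices=True would raise ValueError
```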
16 changes: 15 additions & 1 deletion dask-xfails.txt
@@ -78,6 +78,20 @@ array_api_tests/test_linalg.py::test_tensordot
# probably same reason for failing as numpy
array_api_tests/test_linalg.py::test_trace

# AssertionError: out.dtype=uint64, but should be uint8 [tensordot(uint8, uint8)]
array_api_tests/test_linalg.py::test_linalg_tensordot

# AssertionError: out.shape=(1,), but should be () [linalg.vector_norm(keepdims=True)]
array_api_tests/test_linalg.py::test_vector_norm

# ZeroDivisionError in dask's normalize_chunks/auto_chunks internals
array_api_tests/test_linalg.py::test_inv
array_api_tests/test_linalg.py::test_matrix_power

# did not raise error for invalid shapes
array_api_tests/test_linalg.py::test_matmul
array_api_tests/test_linalg.py::test_linalg_matmul

# Linalg - these don't exist in dask
array_api_tests/test_signatures.py::test_extension_func_signature[linalg.cross]
array_api_tests/test_signatures.py::test_extension_func_signature[linalg.det]
@@ -88,6 +102,7 @@ array_api_tests/test_signatures.py::test_extension_func_signature[linalg.pinv]
array_api_tests/test_signatures.py::test_extension_func_signature[linalg.slogdet]
array_api_tests/test_linalg.py::test_cross
array_api_tests/test_linalg.py::test_det
array_api_tests/test_linalg.py::test_eigh
array_api_tests/test_linalg.py::test_eigvalsh
array_api_tests/test_linalg.py::test_pinv
array_api_tests/test_linalg.py::test_slogdet
@@ -112,7 +127,6 @@ array_api_tests/test_linalg.py::test_solve
# missing full_matrices kw
# https://github.com/dask/dask/issues/10389
# also only supports 2-d inputs
-array_api_tests/test_signatures.py::test_extension_func_signature[linalg.svd]
array_api_tests/test_linalg.py::test_svd

# Missing dlpack stuff
2 changes: 1 addition & 1 deletion setup.py
@@ -8,7 +8,7 @@
setup(
    name='array_api_compat',
    version=array_api_compat.__version__,
-    packages=find_packages(include=['array_api_compat*']),
+    packages=find_packages(include=["array_api_compat*"]),
    author="Consortium for Python Data API Standards",
    description="A wrapper around NumPy and other array libraries to make them compatible with the Array API standard",
    long_description=long_description,
3 changes: 0 additions & 3 deletions tests/test_common.py
@@ -31,9 +31,6 @@ def test_is_xp_array(library, func):

@pytest.mark.parametrize("library", ["cupy", "numpy", "torch", "dask.array", "jax.numpy"])
def test_device(library):
-    if library == "dask.array":
-        pytest.xfail("device() needs to be fixed for dask")
-
    xp = import_(library, wrapper=True)

    # We can't test much for device() and to_device() other than that