Fix CI on windows and mac #312

Merged
merged 3 commits on Feb 21, 2025
12 changes: 10 additions & 2 deletions .github/workflows/nocache.yaml
@@ -27,11 +27,19 @@ jobs:
cache: true
activate-environment: true

# https://github.com/xarray-contrib/xarray-tutorial/issues/311
- name: Configure graphviz
if: matrix.runs-on == 'macos-latest'
run: |
dot -c

- name: Build JupyterBook
id: jb-build
continue-on-error: true
run: |
jupyter-book build ./ --warningiserror --keep-going

- name: Dump Build Logs
if: always()
if: steps.jb-build.outcome == 'failure'
run: |
if (test -a _build/html/reports/*log); then cat _build/html/reports/*log ; fi
cat _build/html/reports/**/*.log
3 changes: 3 additions & 0 deletions .github/workflows/qaqc.yaml
@@ -1,6 +1,7 @@
name: QualityControl

on:
workflow_dispatch:
pull_request:
branches:
- main
@@ -46,8 +47,10 @@ jobs:
with open('./_config.yml', 'w') as f:
yaml.dump(data, f)

# Checking links is flaky, so continue-on-error: true
- name: Check External Links
timeout-minutes: 5
continue-on-error: true
if: always()
run: |
jupyter-book build ./ --builder linkcheck
4 changes: 2 additions & 2 deletions advanced/apply_ufunc/example-interp.ipynb
@@ -562,7 +562,7 @@
"source": [
"`np.vectorize` is a very convenient function but is unfortunately slow. It is only marginally faster than writing a for loop in Python. A common way to get around this is to write a base interpolation function that can handle nD arrays in a compiled language like Fortran and then pass that to `apply_ufunc`.\n",
"\n",
"Another option is to use the numba package which provides a very [convenient `guvectorize` decorator](https://numba.pydata.org/numba-doc/latest/user/vectorize.html#the-guvectorize-decorator). Any decorated function gets compiled and will loop over any non-core dimension in parallel when necessary. \n",
"Another option is to use the numba package which provides a very [convenient `guvectorize` decorator](https://numba.readthedocs.io/en/stable/user/vectorize.html#the-guvectorize-decorator). Any decorated function gets compiled and will loop over any non-core dimension in parallel when necessary. \n",
"\n",
"We need to specify some extra information:\n",
"\n",
@@ -606,7 +606,7 @@
"tags": []
},
"source": [
"The warnings are about [object-mode compilation](https://numba.pydata.org/numba-doc/latest/user/performance-tips.html#no-python-mode-vs-object-mode) relating to the `print` statement. This means we don't get much speed up. We'll keep the `print` statement temporarily to make sure that `guvectorize` acts like we want it to."
"The warnings are about [object-mode compilation](https://numba.readthedocs.io/en/stable/user/performance-tips.html) relating to the `print` statement. This means we don't get much speed up. We'll keep the `print` statement temporarily to make sure that `guvectorize` acts like we want it to."
]
},
{
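For reference, a minimal sketch of the `guvectorize` + `apply_ufunc` pattern these cells describe. It is not the tutorial's exact interpolation function: the function name, layout string, and test data are illustrative, and it assumes `numba`, `numpy`, and `xarray` are installed. Keeping the body free of unsupported Python (no `print`) avoids the object-mode warnings mentioned above.

```python
import numpy as np
import xarray as xr
from numba import guvectorize, float64

# Hypothetical example: linear interpolation along one core dimension.
# "(n),(n),(m)->(m)" marks the core dimensions; numba loops over everything else.
@guvectorize([(float64[:], float64[:], float64[:], float64[:])],
             "(n),(n),(m)->(m)", nopython=True)
def interp_1d(x, y, x_new, out):
    out[:] = np.interp(x_new, x, y)

x = np.linspace(0, 1, 10)
da = xr.DataArray(np.sin(x), dims="x", coords={"x": x})
x_new = xr.DataArray(np.linspace(0, 1, 25), dims="x_new")

# apply_ufunc only moves core dimensions to the end and restores metadata;
# numba does the compiled looping.
interped = xr.apply_ufunc(
    interp_1d,
    da["x"], da, x_new,
    input_core_dims=[["x"], ["x"], ["x_new"]],
    output_core_dims=[["x_new"]],
)
print(interped.sizes)  # -> x_new: 25
```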
2 changes: 1 addition & 1 deletion advanced/apply_ufunc/numba-vectorization.ipynb
@@ -25,7 +25,7 @@
"\n",
"Another option is to use the [numba package](https://numba.pydata.org/) which provides two very convenient decorators to build [numpy universal functions or ufuncs](https://numba.readthedocs.io/en/stable/user/vectorize.html):\n",
"1. [`vectorize`](https://numba.readthedocs.io/en/stable/user/vectorize.html#the-vectorize-decorator) for functions that act on scalars, and \n",
"2. [`guvectorize`](https://numba.pydata.org/numba-doc/latest/user/vectorize.html#the-guvectorize-decorator) for functions that operates on subsets of the array along core-dimensions. Any decorated function gets compiled and will loop over the loop dimensions in parallel when necessary. \n",
"2. [`guvectorize`](https://numba.readthedocs.io/en/stable/user/vectorize.html#the-guvectorize-decorator) for functions that operate on subsets of the array along core dimensions. Any decorated function gets compiled and will loop over the loop dimensions in parallel when necessary. \n",
"\n",
"For `apply_ufunc` the key concept is that we must provide `vectorize=False` (the default) when using Numba vectorized functions. \n",
"Numba handles the vectorization (or looping) and `apply_ufunc` handles converting Xarray objects to bare arrays and handling metadata."
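Likewise, a minimal sketch of the `vectorize` half of this (hypothetical scalar function and data; assumes `numba` and `xarray` are installed). Because the decorated function is already a true NumPy ufunc, `apply_ufunc` is left at its default `vectorize=False` and numba does the looping.

```python
import numpy as np
import xarray as xr
from numba import vectorize, float64

# Hypothetical scalar function compiled into a NumPy ufunc by numba.
@vectorize([float64(float64, float64)])
def squared_diff(a, b):
    return (a - b) ** 2

da = xr.DataArray(np.random.rand(4, 3), dims=("time", "space"))

# vectorize=False (the default): numba already handles broadcasting and looping,
# apply_ufunc just converts the xarray objects and reattaches metadata.
result = xr.apply_ufunc(squared_diff, da, da.mean("time"))
print(result.dims)  # ('time', 'space')
```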
4 changes: 2 additions & 2 deletions intermediate/remote_data/remote-data.ipynb
@@ -132,7 +132,7 @@
"The `open_dataset()` method is our entry point to n-dimensional data with xarray. The first argument we pass indicates what we want to open; xarray uses it to pick the right backend, and the backend in turn uses it to open the file locally or remotely. The types accepted by xarray are:\n",
"\n",
"\n",
"* **str**: \"my-file.nc\" or \"s3:://my-zarr-store/data.zarr\"\n",
"* **str**: `my-file.nc` or `s3://my-zarr-store/data.zarr`\n",
"* **os.PathLike**: a POSIX-compatible path; most of the time this is a pathlib cross-OS compatible path.\n",
"* **BufferedIOBase**: some xarray backends can read data from a buffer; this is key for remote access.\n",
"* **AbstractDataStore**: This is the generic store that backends should subclass; if they do, we can pass a \"store\" to xarray, as in the case of Opendap/Pydap.\n",
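As a rough illustration of the first three argument types (the filename comes from the cell's own example; the buffer route typically needs the `h5netcdf` engine installed):

```python
import io
from pathlib import Path
import numpy as np
import xarray as xr

# Write a tiny file so the three open styles below have something to read.
xr.Dataset({"t": ("x", np.arange(5.0))}).to_netcdf("my-file.nc", engine="h5netcdf")

ds_from_str = xr.open_dataset("my-file.nc")             # str
ds_from_path = xr.open_dataset(Path("my-file.nc"))      # os.PathLike
with open("my-file.nc", "rb") as f:
    buffer = io.BytesIO(f.read())
ds_from_buffer = xr.open_dataset(buffer, engine="h5netcdf")  # file-like object
```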
@@ -178,7 +178,7 @@
"id": "11",
"metadata": {},
"source": [
"xarray iterated through the registered backends and netcdf4 returned a `\"yes, I can open that extension\"` see: [netCDF4_.py#L618 ](https://github.com/pydata/xarray/blob/6c2d8c3389afe049ccbfd1393e9a81dd5c759f78/xarray/backends/netCDF4_.py#L618). However, **the backend doesn't know how to \"talk\" to a remote store** and thus it fails to open our file.\n",
"xarray iterated through the registered backends and netcdf4 returned a `\"yes, I can open that extension\"` see: [netCDF4_.py#L618](https://github.com/pydata/xarray/blob/6c2d8c3389afe049ccbfd1393e9a81dd5c759f78/xarray/backends/netCDF4_.py#L618). However, **the backend doesn't know how to \"talk\" to a remote store** and thus it fails to open our file.\n",
"\n"
]
},
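One common workaround, sketched below with a hypothetical S3 URL: let `fsspec` handle the remote I/O and hand `open_dataset()` a file-like object instead of a bare URL. This assumes `fsspec`/`s3fs` and `h5netcdf` are installed and is not necessarily the exact approach the notebook takes next.

```python
import fsspec
import xarray as xr

# Hypothetical remote netCDF file; anon=True is passed through to s3fs.
url = "s3://some-public-bucket/some-file.nc"

# fsspec turns the URL into a Python file object; the h5netcdf backend
# can read from file-like objects, unlike the netcdf4 backend.
with fsspec.open(url, mode="rb", anon=True) as f:
    ds = xr.open_dataset(f, engine="h5netcdf")
    print(ds)
```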