diff --git a/.github/workflows/nocache.yaml b/.github/workflows/nocache.yaml
index c2baa2a9..a33d0b22 100644
--- a/.github/workflows/nocache.yaml
+++ b/.github/workflows/nocache.yaml
@@ -27,11 +27,19 @@ jobs:
           cache: true
           activate-environment: true
 
+      # https://github.com/xarray-contrib/xarray-tutorial/issues/311
+      - name: Configure graphviz
+        if: matrix.runs-on == 'macos-latest'
+        run: |
+          dot -c
+
       - name: Build JupyterBook
+        id: jb-build
+        continue-on-error: true
         run: |
           jupyter-book build ./ --warningiserror --keep-going
 
       - name: Dump Build Logs
-        if: always()
+        if: steps.jb-build.outcome == 'failure'
         run: |
-          if (test -a _build/html/reports/*log); then cat _build/html/reports/*log ; fi
+          cat _build/html/reports/**/*.log
diff --git a/.github/workflows/qaqc.yaml b/.github/workflows/qaqc.yaml
index 72bb21bd..1b22eb99 100644
--- a/.github/workflows/qaqc.yaml
+++ b/.github/workflows/qaqc.yaml
@@ -1,6 +1,7 @@
 name: QualityContol
 
 on:
+  workflow_dispatch:
   pull_request:
     branches:
       - main
@@ -46,8 +47,10 @@ jobs:
           with open('./_config.yml', 'w') as f:
               yaml.dump(data, f)
 
+      # Checking links is flaky, so continue-on-error: true
      - name: Check External Links
        timeout-minutes: 5
+        continue-on-error: true
        if: always()
        run: |
          jupyter-book build ./ --builder linkcheck
diff --git a/advanced/apply_ufunc/example-interp.ipynb b/advanced/apply_ufunc/example-interp.ipynb
index 2cfe45d1..d895b9aa 100644
--- a/advanced/apply_ufunc/example-interp.ipynb
+++ b/advanced/apply_ufunc/example-interp.ipynb
@@ -562,7 +562,7 @@
    "source": [
     "`np.vectorize` is a very convenient function but is unfortunately slow. It is only marginally faster than writing a for loop in Python and looping. A common way to get around this is to write a base interpolation function that can handle nD arrays in a compiled language like Fortran and then pass that to `apply_ufunc`.\n",
     "\n",
-    "Another option is to use the numba package which provides a very [convenient `guvectorize` decorator](https://numba.pydata.org/numba-doc/latest/user/vectorize.html#the-guvectorize-decorator). Any decorated function gets compiled and will loop over any non-core dimension in parallel when necessary. \n",
+    "Another option is to use the numba package which provides a very [convenient `guvectorize` decorator](https://numba.readthedocs.io/en/stable/user/vectorize.html#the-guvectorize-decorator). Any decorated function gets compiled and will loop over any non-core dimension in parallel when necessary. \n",
     "\n",
     "We need to specify some extra information:\n",
     "\n",
@@ -606,7 +606,7 @@
     "tags": []
   },
   "source": [
-    "The warnings are about [object-mode compilation](https://numba.pydata.org/numba-doc/latest/user/performance-tips.html#no-python-mode-vs-object-mode) relating to the `print` statement. This means we don't get much speed up. We'll keep the `print` statement temporarily to make sure that `guvectorize` acts like we want it to."
+    "The warnings are about [object-mode compilation](https://numba.readthedocs.io/en/stable/user/performance-tips.html) relating to the `print` statement. This means we don't get much speed up. We'll keep the `print` statement temporarily to make sure that `guvectorize` acts like we want it to."
    ]
   },
   {
diff --git a/advanced/apply_ufunc/numba-vectorization.ipynb b/advanced/apply_ufunc/numba-vectorization.ipynb
index 6bcbf30f..5cff94b1 100644
--- a/advanced/apply_ufunc/numba-vectorization.ipynb
+++ b/advanced/apply_ufunc/numba-vectorization.ipynb
@@ -25,7 +25,7 @@
    "\n",
    "Another option is to use the [numba package](https://numba.pydata.org/) which provides two very convenient decorators to build [numpy universal functions or ufuncs](https://numba.readthedocs.io/en/stable/user/vectorize.html):\n",
    "1. [`vectorize`](https://numba.readthedocs.io/en/stable/user/vectorize.html#the-vectorize-decorator) for functions that act on scalars, and \n",
-    "2. [`guvectorize`](https://numba.pydata.org/numba-doc/latest/user/vectorize.html#the-guvectorize-decorator) for functions that operates on subsets of the array along core-dimensions. Any decorated function gets compiled and will loop over the loop dimensions in parallel when necessary. \n",
+    "2. [`guvectorize`](https://numba.readthedocs.io/en/stable/user/vectorize.html#the-guvectorize-decorator) for functions that operates on subsets of the array along core-dimensions. Any decorated function gets compiled and will loop over the loop dimensions in parallel when necessary. \n",
    "\n",
    "For `apply_ufunc` the key concept is that we must provide `vectorize=False` (the default) when using Numba vectorized functions. \n",
    "Numba handles the vectorization (or looping) and `apply_ufunc` handles converting Xarray objects to bare arrays and handling metadata."
diff --git a/intermediate/remote_data/remote-data.ipynb b/intermediate/remote_data/remote-data.ipynb
index c6783a41..c2b54ba5 100644
--- a/intermediate/remote_data/remote-data.ipynb
+++ b/intermediate/remote_data/remote-data.ipynb
@@ -132,7 +132,7 @@
    "The `open_dataset()` method is our entry point to n-dimensional data with xarray, the first argument we pass indicates what we want to open and is used by xarray to get the right backend and in turn is used by the backend to open the file locally or remote. The accepted types by xarray are:\n",
    "\n",
    "\n",
-    "* **str**: \"my-file.nc\" or \"s3:://my-zarr-store/data.zarr\"\n",
+    "* **str**: `my-file.nc` or `s3:://my-zarr-store/data.zarr`\n",
    "* **os.PathLike**: Posix compatible path, most of the times is a Pathlib cross-OS compatible path.\n",
    "* **BufferedIOBase**: some xarray backends can read data from a buffer, this is key for remote access.\n",
    "* **AbstractDataStore**: This one is the generic store and backends should subclass it, if we do we can pass a \"store\" to xarray like in the case of Opendap/Pydap\n",
@@ -178,7 +178,7 @@
    "id": "11",
    "metadata": {},
    "source": [
-    "xarray iterated through the registered backends and netcdf4 returned a `\"yes, I can open that extension\"` see: [netCDF4_.py#L618 ](https://github.com/pydata/xarray/blob/6c2d8c3389afe049ccbfd1393e9a81dd5c759f78/xarray/backends/netCDF4_.py#L618). However, **the backend doesn't know how to \"talk\" to a remote store** and thus it fails to open our file.\n",
+    "xarray iterated through the registered backends and netcdf4 returned a `\"yes, I can open that extension\"` see: [netCDF4_.py#L618](https://github.com/pydata/xarray/blob/6c2d8c3389afe049ccbfd1393e9a81dd5c759f78/xarray/backends/netCDF4_.py#L618). However, **the backend doesn't know how to \"talk\" to a remote store** and thus it fails to open our file.\n",
     "\n"
    ]
   },
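For reference, the notebook cells touched above describe passing a `guvectorize`-decorated function to `apply_ufunc` with `vectorize=False` (the default). The sketch below is not part of the patch; the function name `add_pairs` and the toy data are illustrative only, and it assumes numba and xarray are installed.

```python
import numpy as np
import xarray as xr
from numba import float64, guvectorize


# Operate over one core dimension "n"; numba compiles the body and loops
# over any remaining (loop) dimensions, in parallel when necessary.
@guvectorize([(float64[:], float64[:], float64[:])], "(n),(n)->(n)")
def add_pairs(a, b, out):
    for i in range(a.shape[0]):
        out[i] = a[i] + b[i]


da = xr.DataArray(np.arange(6.0).reshape(2, 3), dims=("y", "x"))

# vectorize=False (the default): numba handles the looping, apply_ufunc
# handles converting the xarray objects to bare arrays and restoring metadata.
result = xr.apply_ufunc(
    add_pairs,
    da,
    xr.ones_like(da),
    input_core_dims=[["x"], ["x"]],
    output_core_dims=[["x"]],
)
print(result)
```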
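Likewise, the remote-data cells note that the netCDF4 backend matches the `.nc` extension but cannot talk to a remote store on its own. One common workaround, sketched here under the assumption that `fsspec`/`s3fs` and `h5netcdf` are available, is to hand `open_dataset` a file-like buffer; the bucket and object names are placeholders, not from the patch.

```python
import fsspec
import xarray as xr

# Placeholder URL: substitute a real, reachable netCDF object.
url = "s3://some-bucket/some-file.nc"

# fsspec exposes the remote object as a file-like buffer, which the
# h5netcdf backend can read without knowing anything about S3.
with fsspec.open(url, anon=True) as f:
    ds = xr.open_dataset(f, engine="h5netcdf")
    print(ds)
```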