The API reference (generated from docstrings) and the Jupyter Notebook tutorials are automatically built by a GitHub action (see .github/workflows/deploy.yaml) on each commit to develop and published to https://secondmind-labs.github.io/GPflux/.
If you want to run the Jupyter Notebook tutorials interactively, install additional dependencies in the docs directory:

    pip install -r docs_requirements.txt

...and then run the appropriate Notebook:

    jupyter-notebook notebooks/<name-of-notebook>
If you want to create a new Notebook tutorial for inclusion in the doc set, see notebooks/README.md.
If you want to build the documentation locally:

1) Make sure you have a Python 3.7 virtualenv and gpflux is installed as per the instructions in ../README.md
2) In the docs directory, install dependencies: pip install -r docs_requirements.txt

   If pandoc does not install via pip, or step 3) fails with a 'Pandoc' error, download and install Pandoc separately from https://github.com/jgm/pandoc/releases/ (e.g. pandoc-<version>-amd64.deb for Ubuntu), and try running step 2) again.
3) Compile the documentation: make html
4) Run a web server: python -m http.server
5) Check the documentation locally by opening http://localhost:8000/ in a browser (8000 is the default port used by python -m http.server) and navigating to the built HTML pages.
TensorFlow and TensorFlow Probability have their own custom API docs system, so we have manually produced (web-scraped) intersphinx inventories to be able to cross-reference tf and tfp classes and functions. These inventories are hosted in the GPflow/tensorflow-intersphinx repository on GitHub.
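For reference, here is a minimal sketch of how such inventories can be wired into Sphinx through intersphinx_mapping in docs/conf.py. The raw.githubusercontent.com URLs and branch name are assumptions for illustration and may not match what GPflux's conf.py actually uses:

```python
# docs/conf.py (sketch): cross-reference TF/TFP objects via the scraped inventories.
# The raw.githubusercontent.com URLs and branch below are assumptions for
# illustration; adjust them to the actual layout of GPflow/tensorflow-intersphinx.
extensions = ["sphinx.ext.intersphinx"]

intersphinx_mapping = {
    # name: (base URL the generated links point at, location of the inventory file)
    "tensorflow": (
        "https://www.tensorflow.org/api_docs/python",
        "https://raw.githubusercontent.com/GPflow/tensorflow-intersphinx/master/tf2_py_objects.inv",
    ),
    "tensorflow_probability": (
        "https://www.tensorflow.org/probability/api_docs/python",
        "https://raw.githubusercontent.com/GPflow/tensorflow-intersphinx/master/tfp_py_objects.inv",
    ),
}
```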
TF and TFP provide many aliases for accessing objects, but the intersphinx inventories only contain some of them. You can find the correct way of referencing an object by looking through the list generated by

    python -m sphinx.ext.intersphinx tf2_py_objects.inv

or

    python -m sphinx.ext.intersphinx tfp_py_objects.inv

Note that this requires you to have local copies of the inventory files, e.g. downloaded from the GPflow/tensorflow-intersphinx repository on GitHub.
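To illustrate how a resolved name is then used, here is a hypothetical docstring (the function and the exact reference targets are made up for this sketch; use whichever names actually appear in the inventory listing):

```python
import tensorflow as tf


def as_double(x):
    """Cast ``x`` to a float64 tensor.

    The cross-references below are resolved through the intersphinx inventory,
    so each target must match an entry listed by
    ``python -m sphinx.ext.intersphinx tf2_py_objects.inv`` (which may differ
    from the alias you would normally write in code).

    :param x: a :class:`tf.Tensor`, or anything convertible to one
    :return: a :class:`tf.Tensor` with dtype float64
    """
    return tf.cast(tf.convert_to_tensor(x), tf.float64)
```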