
Commit eddad0d

Author: Adrian Gonzalez-Martin
Add initial documentation (#224)
1 parent 9becdd1 commit eddad0d

File tree: 99 files changed (+489, -112 lines)


.gitignore (+3)

@@ -42,3 +42,6 @@ _bundle
 # Conda-packed environment
 old-sklearn.tar.gz
 mlruns
+
+# Sphinx documentation
+docs/_build/

.readthedocs.yaml (+20)

@@ -0,0 +1,20 @@
+# .readthedocs.yaml
+# Read the Docs configuration file
+# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
+
+# Required
+version: 2
+
+# Build documentation in the docs/ directory with Sphinx
+sphinx:
+  configuration: docs/conf.py
+
+# Optionally build your docs in additional formats such as PDF
+formats:
+  - pdf
+
+# Optionally set the version of Python and requirements required to build your docs
+python:
+  version: 3.7
+  install:
+    - requirements: docs/requirements.txt
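
The same Sphinx build that Read the Docs will run from this config can be exercised locally before pushing. This is a minimal sketch, assuming a fresh virtual environment and that `docs/requirements.txt` pins Sphinx plus the theme and extensions referenced by `docs/conf.py`:

```
# Sketch: build the docs locally, mirroring the Read the Docs config above.
# Assumes docs/requirements.txt lists sphinx, sphinx-material and myst-parser.
python -m venv .venv && . .venv/bin/activate
pip install -r docs/requirements.txt

# Read the Docs points at docs/conf.py, so build from the docs/ source dir.
sphinx-build -b html docs docs/_build/html
```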

Makefile (+1 -1)

@@ -62,7 +62,7 @@ lint: generate
 		mypy $$_runtime; \
 	done
 	mypy ./benchmarking
-	mypy ./examples
+	mypy ./docs/examples
 	# Check if something has changed after generation
 	git \
 		--no-pager diff \
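
With this change, the repository-level lint target type-checks the example code under `docs/examples` instead of the old `examples/` folder. As a rough sketch, the relevant part of what `make lint` now runs (the per-runtime loop shown in the hunk above is elided):

```
# Sketch: the mypy portion of `make lint` after this commit.
mypy ./benchmarking
mypy ./docs/examples   # was: mypy ./examples
```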

README.md (+29 -32)

@@ -1,8 +1,6 @@
 # MLServer
 
-An open source inference server to serve your machine learning models.
-
-> :warning: **This is a Work in Progress**.
+An open source inference server for your machine learning models.
 
 ## Overview
 
@@ -32,55 +30,54 @@ pip install mlserver-sklearn
 ```
 
 For further information on how to use MLServer, you can check any of the
-[available examples](#Examples).
+[available examples](#examples).
 
 ## Inference Runtimes
 
 Inference runtimes allow you to define how your model should be used within
 MLServer.
+You can think of them as the **backend glue** between MLServer and your machine
+learning framework of choice.
+You can read more about [inference runtimes in their documentation
+page](./docs/runtimes/index.md).
+
 Out of the box, MLServer comes with a set of pre-packaged runtimes which let
-you interact with a subset of common ML frameworks.
+you interact with a subset of common frameworks.
 This allows you to start serving models saved in these frameworks straight
 away.
 
-To avoid bringing in dependencies for frameworks that you don't need to use,
-these runtimes are implemented as independent optional packages.
-This mechanism also allows you to rollout your [own custom runtimes]( very easily.
-
-To pick which runtime you want to use for your model, you just need to make
-sure that the right package is installed, and then point to the correct runtime
-class in your `model-settings.json` file.
-
-The included runtimes are:
+Out of the box, MLServer provides support for:
 
-| Framework    | Package Name        | Implementation Class              | Example                                              | Source Code                                                      |
-| ------------ | ------------------- | --------------------------------- | ---------------------------------------------------- | ---------------------------------------------------------------- |
-| Scikit-Learn | `mlserver-sklearn`  | `mlserver_sklearn.SKLearnModel`   | [Scikit-Learn example](./examples/sklearn/README.md) | [`./runtimes/sklearn`](./runtimes/sklearn)                        |
-| XGBoost      | `mlserver-xgboost`  | `mlserver_xgboost.XGBoostModel`   | [XGBoost example](./examples/xgboost/README.md)      | [`./runtimes/xgboost`](./runtimes/xgboost)                        |
-| Spark MLlib  | `mlserver-mllib`    | `mlserver_mllib.MLlibModel`       | Coming Soon                                          | [`./runtimes/mllib`](./runtimes/mllib)                            |
-| LightGBM     | `mlserver-lightgbm` | `mlserver_lightgbm.LightGBMModel` | [LightGBM example](./examples/lightgbm/README.md)    | [`./runtimes/lightgbm`](./runtimes/lightgbm)                      |
-| Tempo        | `tempo`             | `tempo.mlserver.InferenceRuntime` | [Tempo example](./examples/tempo/README.md)          | [`github.com/SeldonIO/tempo`](https://github.com/SeldonIO/tempo)  |
-| MLflow       | `mlserver-mlflow`   | `mlserver_mlflow.MLflowRuntime`   | [MLflow example](./examples/mlflow/README.md)        | [`./runtimes/mlflow`](./runtimes/mlflow)                          |
+| Framework    | Supported | Documentation                                                     |
+| ------------ | --------- | ----------------------------------------------------------------- |
+| Scikit-Learn | :+1:      | [MLServer SKLearn](./runtimes/sklearn)                            |
+| XGBoost      | :+1:      | [MLServer XGBoost](./runtimes/xgboost)                            |
+| Spark MLlib  | :+1:      | [MLServer MLlib](./runtimes/mllib)                                |
+| LightGBM     | :+1:      | [MLServer LightGBM](./runtimes/lightgbm)                          |
+| Tempo        | :+1:      | [`github.com/SeldonIO/tempo`](https://github.com/SeldonIO/tempo)  |
+| MLflow       | :+1:      | [MLServer MLflow](./runtimes/mlflow)                              |
 
 ## Examples
 
-On the list below, you can find a few examples on how you can leverage
-`mlserver` to start serving your machine learning models.
+To see MLServer in action, check out [our full list of
+examples](./docs/examples/index.md).
+You can find below a few selected examples showcasing how you can leverage
+MLServer to start serving your machine learning models.
 
-- [Serving a `scikit-learn` model](./examples/sklearn/README.md)
-- [Serving a `xgboost` model](./examples/xgboost/README.md)
-- [Serving a `lightgbm` model](./examples/lightgbm/README.md)
-- [Serving a `tempo` pipeline](./examples/tempo/README.md)
-- [Serving a custom model](./examples/custom/README.md)
-- [Multi-Model Serving with multiple frameworks](./examples/mms/README.md)
-- [Loading / unloading models from a model repository](./examples/model-repository/README.md)
+- [Serving a `scikit-learn` model](./docs/examples/sklearn/README.md)
+- [Serving a `xgboost` model](./docs/examples/xgboost/README.md)
+- [Serving a `lightgbm` model](./docs/examples/lightgbm/README.md)
+- [Serving a `tempo` pipeline](./docs/examples/tempo/README.md)
+- [Serving a custom model](./docs/examples/custom/README.md)
+- [Multi-Model Serving with multiple frameworks](./docs/examples/mms/README.md)
+- [Loading / unloading models from a model repository](./docs/examples/model-repository/README.md)
 
 ## Developer Guide
 
 ### Versioning
 
 Both the main `mlserver` package and the [inference runtimes
-packages](./runtimes) try to follow the same versioning schema.
+packages](./docs/runtimes/index.md) try to follow the same versioning schema.
 To bump the version across all of them, you can use the
 [`./hack/update-version.sh`](./hack/update-version.sh) script.
 For example:
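
As context for the runtime table in the README hunk above: most framework rows map to an optional `mlserver-*` package installed alongside the core server. A minimal sketch of the workflow, assuming the scikit-learn runtime, MLServer's `mlserver start` CLI entrypoint, and a placeholder model folder that already contains the model artefact and its settings:

```
# Sketch: install the core server plus the scikit-learn runtime package.
pip install mlserver mlserver-sklearn

# Serve a model from a local folder; "./models/my-sklearn-model" is a
# hypothetical path, assumed to hold model.joblib and model-settings.json.
mlserver start ./models/my-sklearn-model
```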

docs/Makefile (+23)

@@ -0,0 +1,23 @@
+# Minimal makefile for Sphinx documentation
+#
+
+# You can set these variables from the command line, and also
+# from the environment for the first two.
+SPHINXOPTS    ?=
+SPHINXBUILD   ?= sphinx-build
+SOURCEDIR     = .
+BUILDDIR      = _build
+
+# Put it first so that "make" without argument is like "make help".
+help:
+	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
+
+.PHONY: help Makefile install-dev
+
+# Catch-all target: route all unknown targets to Sphinx using the new
+# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
+%: Makefile
+	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
+
+install-dev:
+	pip install -r ./requirements.txt
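
The catch-all target above forwards any unknown target to `sphinx-build -M`, so the usual Sphinx targets work unchanged. A brief sketch of the expected workflow from within `docs/`:

```
# Install the documentation dependencies declared in docs/requirements.txt.
make install-dev

# Any standard Sphinx target is routed through the catch-all rule,
# e.g. build the HTML site into docs/_build/html.
make html
```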

docs/_static/css/custom.css (+20)

@@ -0,0 +1,20 @@
+
+/* Hide first elem of nav bar */
+.md-tabs__list > li:first-child {
+  display: none;
+}
+
+dt {
+  display: table;
+  margin: 6px 0;
+  margin-top: 6px;
+  font-size: 90%;
+  line-height: normal;
+  background: #e7f2fa;
+  color: #2980B9;
+  border-top: solid 3px #6ab0de;
+  padding: 6px;
+  position: relative;
+}
+
+

docs/assets/architecture.svg (+1)

docs/conf.py (+180)

@@ -0,0 +1,180 @@
+# Configuration file for the Sphinx documentation builder.
+#
+# This file only contains a selection of the most common options. For a full
+# list see the documentation:
+# https://www.sphinx-doc.org/en/master/usage/configuration.html

+# -- Path setup --------------------------------------------------------------

+# If extensions (or modules to document with autodoc) are in another directory,
+# add these directories to sys.path here. If the directory is relative to the
+# documentation root, use os.path.abspath to make it absolute, like shown here.
+#
+# import os
+# import sys
+# sys.path.insert(0, os.path.abspath('.'))


+# -- Project information -----------------------------------------------------
+import sphinx_material

+project = "MLServer"
+copyright = "2021, Seldon Technologies"
+html_title = "MLServer"
+author = "Seldon Technologies"

+# The full version, including alpha/beta/rc tags
+release = "0.4.0.dev1"


+# -- General configuration ---------------------------------------------------

+# Add any Sphinx extension module names here, as strings. They can be
+# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
+# ones.
+extensions = [
+    # "sphinx.ext.autodoc",
+    # Creates .nojekyll config
+    # "sphinx.ext.githubpages",
+    # Converts markdown to rst
+    "myst_parser",
+    # "sphinx.ext.napoleon",
+    # automatically generate API docs
+    # see https://github.com/rtfd/readthedocs.org/issues/1139
+    # "sphinxcontrib.apidoc",
+]

+# Add any paths that contain templates here, relative to this directory.
+templates_path = ["_templates"]

+# List of patterns, relative to source directory, that match files and
+# directories to ignore when looking for source files.
+# This pattern also affects html_static_path and html_extra_path.
+exclude_patterns = ["_build", "Thumbs.db", ".DS_Store"]

+# apidoc settings
+apidoc_module_dir = "../mlserver"
+apidoc_output_dir = "api"
+apidoc_excluded_paths = ["**/*test*"]
+apidoc_module_first = True
+apidoc_separate_modules = True
+apidoc_extra_args = ["-d 6"]

+# mock imports
+autodoc_mock_imports = [
+    "pandas",
+    "sklearn",
+    "skimage",
+    "requests",
+    "cv2",
+    "bs4",
+    "keras",
+    "seaborn",
+    "PIL",
+    "tensorflow",
+    "spacy",
+    "numpy",
+    "tensorflow_probability",
+    "scipy",
+    "matplotlib",
+    "creme",
+    "cloudpickle",
+    "fbprophet",
+    "dask",
+    "transformers",
+]


+# Napoleon settings
+napoleon_google_docstring = True
+napoleon_numpy_docstring = True
+napoleon_include_init_with_doc = True
+napoleon_include_private_with_doc = False
+napoleon_include_special_with_doc = True
+napoleon_use_admonition_for_examples = False
+napoleon_use_admonition_for_notes = False
+napoleon_use_admonition_for_references = False
+napoleon_use_ivar = False
+napoleon_use_param = True
+napoleon_use_rtype = False


+# -- Options for HTML output -------------------------------------------------

+# The theme to use for HTML and HTML Help pages. See the documentation for
+# a list of builtin themes.
+#
+# Chosen Themes:
+# * https://github.com/bashtage/sphinx-material/
+# * https://github.com/myyasuda/sphinx_materialdesign_theme
+html_theme = "sphinx_material"

+if html_theme == "sphinx_material":
+    html_theme_options = {
+        "google_analytics_account": "",
+        "base_url": "https://mlserver.readthedocs.io",
+        "color_primary": "teal",
+        "color_accent": "light-blue",
+        "repo_url": "https://github.com/SeldonIO/MLServer/",
+        "repo_name": "MLServer",
+        "globaltoc_depth": 2,
+        "globaltoc_collapse": False,
+        "globaltoc_includehidden": True,
+        "repo_type": "github",
+        "nav_links": [
+            {
+                "href": "https://docs.seldon.io",
+                "internal": False,
+                "title": "🚀 Our Other Projects & Products:",
+            },
+            {
+                "href": "https://docs.seldon.io/projects/seldon-core/en/latest/",
+                "internal": False,
+                "title": "Seldon Core",
+            },
+            {
+                "href": "https://docs.seldon.io/projects/alibi/en/stable/",
+                "internal": False,
+                "title": "Alibi Explain",
+            },
+            {
+                "href": "https://docs.seldon.io/projects/alibi-detect/en/stable/",
+                "internal": False,
+                "title": "Alibi Detect",
+            },
+            {
+                "href": "https://tempo.readthedocs.io/en/latest/",
+                "internal": False,
+                "title": "Tempo SDK",
+            },
+            {
+                "href": "https://deploy.seldon.io/",
+                "internal": False,
+                "title": "Seldon Deploy (Enterprise)",
+            },
+            {
+                "href": (
+                    "https://github.com/SeldonIO/seldon-deploy-sdk#seldon-deploy-sdk"
+                ),
+                "internal": False,
+                "title": "Seldon Deploy SDK (Enterprise)",
+            },
+        ],
+    }

+    extensions.append("sphinx_material")
+    html_theme_path = sphinx_material.html_theme_path()
+    html_context = sphinx_material.get_html_context()

+html_sidebars = {
+    "**": ["logo-text.html", "globaltoc.html", "localtoc.html", "searchbox.html"]
+}

+# Add any paths that contain custom static files (such as style sheets) here,
+# relative to this directory. They are copied after the builtin static files,
+# so a file named "default.css" will overwrite the builtin "default.css".
+html_static_path = ["_static"]

+html_css_files = [
+    "css/custom.css",
+]
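
The `apidoc_*` settings above describe how API pages would be generated from the `mlserver` package, although `sphinxcontrib.apidoc` is currently commented out in `extensions`. A hedged sketch of a roughly equivalent manual `sphinx-apidoc` invocation, in case the extension is left disabled:

```
# Sketch: approximate the apidoc_* settings from docs/conf.py by hand.
# --module-first / --separate mirror apidoc_module_first / apidoc_separate_modules,
# -d 6 mirrors apidoc_extra_args, and the trailing pattern excludes test modules.
sphinx-apidoc --module-first --separate -d 6 -o docs/api ./mlserver "**/*test*"
```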
File renamed without changes.
File renamed without changes.

examples/conda/README.ipynb renamed to docs/examples/conda/README.ipynb (+4 -4)

@@ -4,10 +4,10 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-   "# Custom environments in MLServer\n",
+   "# Custom Conda environments in MLServer\n",
    "\n",
    "It's not unusual that model runtimes require extra dependencies that are not direct dependencies of MLServer.\n",
-   "This is the case when we want to use [custom runtimes](../custom), but also when our model artifacts are the output of older versions of a toolkit (e.g. models trained with an older version of SKLearn).\n",
+   "This is the case when we want to use [custom runtimes](../custom/README), but also when our model artifacts are the output of older versions of a toolkit (e.g. models trained with an older version of SKLearn).\n",
    "\n",
    "In these cases, since these dependencies (or dependency versions) are not known in advance by MLServer, they **won't be included in the default `seldonio/mlserver` Docker image**.\n",
    "To cover these cases, the **`seldonio/mlserver` Docker image allows you to load custom environments** before starting the server itself.\n",
@@ -76,7 +76,7 @@
    "We can now train and save a Scikit-Learn model using the older version of our environment.\n",
    "This model will be serialised as `model.joblib`.\n",
    "\n",
-   "You can find more details of this process in the [Scikit-Learn example](../sklearn)."
+   "You can find more details of this process in the [Scikit-Learn example](../sklearn/README)."
   ]
  },
  {
@@ -257,7 +257,7 @@
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
-   "version": "3.7.0"
+   "version": "3.7.8"
  }
 },
 "nbformat": 4,
