Commit 78d72a9 (parent: 6af0292)

Update CLI and most docs to use multi-model-server


41 files changed: +261 −250 lines

.github/PULL_REQUEST_TEMPLATE.md (+1 −1)

@@ -6,6 +6,6 @@ Before or while filing an issue please feel free to join our [<img src='../docs/
 
 ## Testing done:
 
-**To run CI tests on your changes refer [README.md](https://github.com/awslabs/mxnet-model-server/blob/master/ci/README.md)**
+**To run CI tests on your changes refer [README.md](https://github.com/awslabs/multi-model-server/blob/master/ci/README.md)**
 
 By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

PyPiDescription.rst (+11 −11)

@@ -9,7 +9,7 @@ Use the MMS Server CLI, or the pre-configured Docker images, to start a
 service that sets up HTTP endpoints to handle model inference requests.
 
 Detailed documentation and examples are provided in the `docs
-folder <https://github.com/awslabs/mxnet-model-server/blob/master/docs/README.md>`__.
+folder <https://github.com/awslabs/multi-model-server/blob/master/docs/README.md>`__.
 
 Prerequisites
 -------------
@@ -60,24 +60,24 @@ Installation
 
 ::
 
-    pip install mxnet-model-server
+    pip install multi-model-server
 
 Development
 -----------
 
 We welcome new contributors of all experience levels. For information on
 how to install MMS for development, refer to the `MMS
-docs <https://github.com/awslabs/mxnet-model-server/blob/master/docs/install.md>`__.
+docs <https://github.com/awslabs/multi-model-server/blob/master/docs/install.md>`__.
 
 Important links
 ---------------
 
 - `Official source code
-  repo <https://github.com/awslabs/mxnet-model-server>`__
+  repo <https://github.com/awslabs/multi-model-server>`__
 - `Download
-  releases <https://pypi.org/project/mxnet-model-server/#files>`__
+  releases <https://pypi.org/project/multi-model-server/#files>`__
 - `Issue
-  tracker <https://github.com/awslabs/mxnet-model-server/issues>`__
+  tracker <https://github.com/awslabs/multi-model-server/issues>`__
 
 Source code
 -----------
@@ -86,24 +86,24 @@ You can check the latest source code as follows:
 
 ::
 
-    git clone https://github.com/awslabs/mxnet-model-server.git
+    git clone https://github.com/awslabs/multi-model-server.git
 
 Testing
 -------
 
 After installation, try out the MMS Quickstart for
 
-- `Serving a Model <https://github.com/awslabs/mxnet-model-server/blob/master/README.md#serve-a-model>`__
-- `Create a Model Archive <https://github.com/awslabs/mxnet-model-server/blob/master/README.md#model-archive>`__.
+- `Serving a Model <https://github.com/awslabs/multi-model-server/blob/master/README.md#serve-a-model>`__
+- `Create a Model Archive <https://github.com/awslabs/multi-model-server/blob/master/README.md#model-archive>`__.
 
 Help and Support
 ----------------
 
-- `Documentation <https://github.com/awslabs/mxnet-model-server/blob/master/docs/README.md>`__
+- `Documentation <https://github.com/awslabs/multi-model-server/blob/master/docs/README.md>`__
 - `Forum <https://discuss.mxnet.io/latest>`__
 
 Citation
 --------
 
 If you use MMS in a publication or project, please cite MMS:
-https://github.com/awslabs/mxnet-model-server
+https://github.com/awslabs/multi-model-server

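A rename of this breadth is easy to leave incomplete. As a sanity check (not part of this commit), a `grep` pass along these lines could list any files that still mention the old package name; the `docs/` path here is illustrative:

```shell
# List files under docs/ that still reference the old package name;
# grep -rl prints only matching file names, recursively.
grep -rl 'mxnet-model-server' docs/ || echo 'no stale references'
```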
README.md (+8 −8)

@@ -24,7 +24,7 @@ Join our [<img src='docs/images/slack.png' width='20px' /> slack channel](https:
 
 ## Other Relevant Documents
 * [Latest Version Docs](docs/README.md)
-* [v0.4.0 Docs](https://github.com/awslabs/mxnet-model-server/blob/v0.4.0/docs/README.md)
+* [v0.4.0 Docs](https://github.com/awslabs/multi-model-server/blob/v0.4.0/docs/README.md)
 * [Migrating from v0.4.0 to v1.0.0](docs/migration.md)
 
 ## Quick Start
@@ -96,11 +96,11 @@ pip install mxnet-cu92mkl
 **Step 3:** Install or Upgrade MMS as follows:
 
 ```bash
-# Install latest released version of mxnet-model-server
-pip install mxnet-model-server
+# Install latest released version of multi-model-server
+pip install multi-model-server
 ```
 
-To upgrade from a previous version of `mxnet-model-server`, please refer [migration reference](docs/migration.md) document.
+To upgrade from a previous version of `multi-model-server`, please refer [migration reference](docs/migration.md) document.
 
 **Notes:**
 * A minimal version of `model-archiver` will be installed with MMS as dependency. See [model-archiver](model-archiver/README.md) for more options and details.
@@ -111,14 +111,14 @@ To upgrade from a previous version of `mxnet-model-server`, please refer [migrat
 Once installed, you can get MMS model server up and running very quickly. Try out `--help` to see all the CLI options available.
 
 ```bash
-mxnet-model-server --help
+multi-model-server --help
 ```
 
 For this quick start, we'll skip over most of the features, but be sure to take a look at the [full server docs](docs/server.md) when you're ready.
 
 Here is an easy example for serving an object classification model:
 ```bash
-mxnet-model-server --start --models squeezenet=https://s3.amazonaws.com/model-server/model_archive_1.0/squeezenet_v1.1.mar
+multi-model-server --start --models squeezenet=https://s3.amazonaws.com/model-server/model_archive_1.0/squeezenet_v1.1.mar
 ```
 
 With the command above executed, you have MMS running on your host, listening for inference requests. **Please note, that if you specify model(s) during MMS start - it will automatically scale backend workers to the number equal to available vCPUs (if you run on CPU instance) or to the number of available GPUs (if you run on GPU instance). In case of powerful hosts with a lot of compute resoures (vCPUs or GPUs) this start up and autoscaling process might take considerable time. If you would like to minimize MMS start up time you can try to avoid registering and scaling up model during start up time and move that to a later point by using corresponding [Management API](docs/management_api.md#register-a-model) calls (this allows finer grain control to how much resources are allocated for any particular model).**
@@ -170,9 +170,9 @@ Now you've seen how easy it can be to serve a deep learning model with MMS! [Wou
 ### Stopping the running model server
 To stop the current running model-server instance, run the following command:
 ```bash
-$ mxnet-model-server --stop
+$ multi-model-server --stop
 ```
-You would see output specifying that mxnet-model-server has stopped.
+You would see output specifying that multi-model-server has stopped.
 
 ### Create a Model Archive

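The bulk of this commit is a mechanical string substitution across docs and CLI text. A rename of this shape is typically produced with `sed`; the snippet below demonstrates the substitution on a single sample line (it is a sketch of the transformation, not the command used by the committer):

```shell
# Rewrite the old package name to the new one in a sample line.
echo 'pip install mxnet-model-server' \
  | sed 's/mxnet-model-server/multi-model-server/g'
```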
benchmarks/README.md (+4 −4)

@@ -26,10 +26,10 @@ The benchmarking script requires the following to run:
 
 ## Models
 
-The pre-loaded models for the benchmark can be mostly found in the [MMS model zoo](https://github.com/awslabs/mxnet-model-server/blob/master/docs/model_zoo.md). We currently support the following:
-- [resnet: ResNet-18 (Default)](https://github.com/awslabs/mxnet-model-server/blob/master/docs/model_zoo.md#resnet-18)
-- [squeezenet: SqueezeNet V1.1](https://github.com/awslabs/mxnet-model-server/blob/master/docs/model_zoo.md#squeezenet_v1.1)
-- [lstm: lstm-ptb](https://github.com/awslabs/mxnet-model-server/blob/master/docs/model_zoo.md#lstm-ptb)
+The pre-loaded models for the benchmark can be mostly found in the [MMS model zoo](https://github.com/awslabs/multi-model-server/blob/master/docs/model_zoo.md). We currently support the following:
+- [resnet: ResNet-18 (Default)](https://github.com/awslabs/multi-model-server/blob/master/docs/model_zoo.md#resnet-18)
+- [squeezenet: SqueezeNet V1.1](https://github.com/awslabs/multi-model-server/blob/master/docs/model_zoo.md#squeezenet_v1.1)
+- [lstm: lstm-ptb](https://github.com/awslabs/multi-model-server/blob/master/docs/model_zoo.md#lstm-ptb)
 - [noop: noop-v1.0](https://s3.amazonaws.com/model-server/models/noop/noop-v1.0.model) Simple Noop model which returns "Hello world" to any input specified.
 - [noop_echo: noop_echo-v1.0](https://s3.amazonaws.com/model-server/models/noop/noop_echo-v1.0.model) Simple Noop model which returns whatever input is given to it.

ci/README.md (+1 −1)

@@ -22,7 +22,7 @@ Following files in this folder is used to create the docker images
 To make it easy for developer debug build issue locally, MMS support AWS codebuild local.
 Developer can use following command to build MMS locally:
 ```bash
-$ cd mxnet-model-server
+$ cd multi-model-server
 $ ./run_ci_tests.sh
 ```

ci/buildspec.yml (+1 −1)

@@ -25,7 +25,7 @@ phases:
 - cd model-archiver/ && python -m pytest model_archiver/tests/unit_tests && cd ../
 - cd model-archiver/ && python -m pytest model_archiver/tests/integ_tests && cd ../
 - cd serving-sdk/ && mvn clean deploy && cd ../
-# integration test is broken: https://github.com/awslabs/mxnet-model-server/issues/437
+# integration test is broken: https://github.com/awslabs/multi-model-server/issues/437
 #- python -m pytest mms/tests/integration_tests
 - pylint -rn --rcfile=./mms/tests/pylintrc mms/.
 - cd model-archiver/ && pylint -rn --rcfile=./model_archiver/tests/pylintrc model_archiver/. && cd ../
