|
25 | 25 | <li>Docker Desktop: <a href="vespa-quick-start.html">Install and run Vespa locally</a></li> |
26 | 26 | <li>Docker Desktop: <a href="vespa-quick-start-java.html">Install and run Vespa locally, with Java components</a></li> |
27 | 27 | </ul> |
28 | | - <p>The <a href="/en/developer-guide.html">developer guide</a> is an intro to developing, testing and deploying applications.</p> |
| 28 | + <p>The <a href="/en/developer-guide.html">developer guide</a> is an intro to developing, testing, and deploying applications.</p> |
29 | 29 | <p>Until you add multiple nodes, an application can be deployed both in the cloud and locally with no modifications.</p>
30 | 30 | </td> |
31 | 31 | </tr> |
|
42 | 42 | </li> |
43 | 43 |
|
44 | 44 | <li><a href="tutorials/hybrid-search.html">Tutorial: Hybrid Text Search</a>. |
45 | | - A search tutorial and introduction to hybrid text ranking with Vespa combining BM25 with text embedding models. |
| 45 | + A search tutorial and introduction to hybrid text ranking with Vespa, combining BM25 with text embedding models. |
46 | 46 | </li> |
47 | 47 |
|
48 | 48 | <li><a href="tutorials/text-search-ml.html">Tutorial: Improving Text Search with Machine Learning</a>. |
|
74 | 74 | <strong>ML Model Serving</strong> |
75 | 75 | <p> |
76 | 76 | Learn how to use Vespa for ML model serving in <a href="stateless-model-evaluation.html">Stateless Model Evaluation</a>. |
77 | | - Vespa has support for running inference with models from many popular ML frameworks which can be used |
78 | | - for ranking, query classification, question answering, multi-modal retrieval and more. |
| 77 | + Vespa supports running inference with models from many popular ML frameworks, which can be used |
| 78 | + for ranking, query classification, question answering, multi-modal retrieval, and more. |
79 | 79 | </p> |
80 | 80 | <ul> |
81 | 81 | <li><a href="onnx.html">Ranking with ONNX models</a>. Export models from |
82 | 82 | popular deep learning frameworks such as <a href="https://pytorch.org/docs/stable/onnx.html">PyTorch</a> |
83 | 83 | to <a href="https://onnx.ai/">ONNX</a> format for serving in Vespa. Vespa integrates with |
84 | 84 | <a href="https://blog.vespa.ai/stateful-model-serving-how-we-accelerate-inference-using-onnx-runtime/">ONNX-Runtime</a> |
85 | | - for <a href="https://blog.vespa.ai/stateless-model-evaluation/">accelerated inference</a>. Many ML framework |
86 | | - support exporting model to ONNX, including <a href="http://onnx.ai/sklearn-onnx/">sklearn</a>. |
| 85 | + for <a href="https://blog.vespa.ai/stateless-model-evaluation/">accelerated inference</a>. Many ML frameworks |
| 86 | + support exporting models to ONNX, including <a href="http://onnx.ai/sklearn-onnx/">sklearn</a>. |
87 | 87 | </li> |
88 | 88 | <li><a href="lightgbm.html">Ranking with LightGBM models</a></li> |
89 | 89 | <li><a href="xgboost.html">Ranking with XGBoost models</a></li> |
|
92 | 92 |
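<p>
  For illustration, here is a minimal sketch of the export step described in the ONNX ranking item above;
  the model, feature count, and file name are hypothetical placeholders, not part of any sample application:
</p>
<pre>
# Minimal sketch: export a small PyTorch ranking model to ONNX for serving in Vespa.
# The model, feature count and file name are hypothetical placeholders.
import torch


class TinyRanker(torch.nn.Module):
    def __init__(self, num_features: int = 10):
        super().__init__()
        self.linear = torch.nn.Linear(num_features, 1)

    def forward(self, features):
        return torch.sigmoid(self.linear(features))


model = TinyRanker()
model.eval()
dummy_input = torch.randn(1, 10)  # one example with 10 ranking features

torch.onnx.export(
    model,
    dummy_input,
    "ranker.onnx",                 # file to place under models/ in the application package
    input_names=["features"],
    output_names=["score"],
)
</pre>
<p>
  The exported file would typically be placed under the application package's models/ directory
  and referenced from a rank profile, as covered in the ONNX ranking guide linked above.
</p>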
|
93 | 93 | <strong>Embedding Model Inference</strong> |
94 | 94 | <p> |
95 | | - Vespa supports integrating <a href="embedding.html">embedding</a> models, this avoids transferring large amounts of embedding vector data |
| 95 | + Vespa supports integrating <a href="embedding.html">embedding</a> models, which avoids transferring large amounts of embedding vector data |
96 | 96 | over the network and allows for efficient serving of embedding models. |
97 | 97 | <ul> |
98 | 98 | <li><a href="embedding.html#huggingface-embedder">Huggingface Embedder</a>: Use single-vector embedding models from Hugging Face</li>
|
110 | 110 |
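<p>
  As a hedged sketch of what query-time embedding looks like from a client, assuming a schema with a
  vector field named embedding, a query tensor q, and a configured embedder (all names hypothetical):
</p>
<pre>
# Minimal sketch: let Vespa embed the query text at serving time.
# Assumes a schema with a vector field "embedding", a query tensor "q"
# and a configured embedder; all names are hypothetical.
import requests

response = requests.post(
    "http://localhost:8080/search/",
    json={
        "yql": "select * from doc where {targetHits: 10}nearestNeighbor(embedding, q)",
        "input.query(q)": "embed(@text)",
        "text": "how to serve embedding models",
    },
)
print(response.json())
</pre>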
|
111 | 111 | <strong>E-Commerce Search</strong> |
112 | 112 | <p>The <a href="use-case-shopping.html">e-commerce shopping sample application</a> demonstrates Vespa grouping, |
113 | | - true in-place partial updates, custom ranking and more.</p> |
| 113 | + true in-place partial updates, custom ranking, and more.</p> |
114 | 114 |
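<p>
  For illustration, a minimal sketch of a true in-place partial update through the /document/v1 API;
  the namespace, document type, document id, and field name are hypothetical:
</p>
<pre>
# Minimal sketch: a true in-place partial update through the /document/v1 API.
# Namespace, document type, document id and field name are hypothetical.
import requests

response = requests.put(
    "http://localhost:8080/document/v1/shopping/product/docid/p123",
    json={"fields": {"price": {"assign": 129}}},
)
print(response.json())
</pre>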
|
115 | 115 | <strong>Examples and starting sample applications</strong> |
116 | 116 | <p> |
117 | 117 | There are many examples and starting applications on |
118 | | -<a href="https://github.com/vespa-engine/sample-apps/">GitHub</a> and <a href="https://pyvespa.readthedocs.io/en/latest/examples.html">PyVespa examples</a>. |
| 118 | +<a href="https://github.com/vespa-engine/sample-apps/">GitHub</a> and <a href="https://vespa-engine.github.io/pyvespa/index.html">Pyvespa examples</a>. |
119 | 119 | </p> |
120 | 120 | </td> |
121 | 121 | </tr> |
|
144 | 144 | <th class="p-t-10">Custom component development</th> |
145 | 145 | <td class="p-t-10"> |
146 | 146 | <p> |
147 | | - Vespa applications can contain custom components to be run by Vespa, for example on receiving queries or documents. |
| 147 | + Vespa applications can contain custom components that are run by Vespa, for example, when receiving queries or documents. |
148 | 148 | These components must be able to run on a JVM.
149 | | - While all the built-in behavior of Vespa can be invoked by an YQL query, |
| 149 | + While all the built-in behavior of Vespa can be invoked by a YQL query, |
150 | 150 | advanced applications often choose to use plugin components to build queries from frontend requests,
151 | 151 | as doing this closer to the data is faster and simpler. |
152 | 152 | </p> |
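<p>
  For illustration, a minimal sketch of the plain YQL route that such a component can replace, with a frontend
  building the query itself and sending it to the query API (endpoint, document type, and parameters are hypothetical):
</p>
<pre>
# Minimal sketch: the plain YQL route that a custom query component can replace.
# The frontend builds the query itself and sends it to the query API;
# endpoint, document type and parameters are hypothetical.
import requests

response = requests.post(
    "http://localhost:8080/search/",
    json={
        "yql": "select * from doc where userQuery()",
        "query": "vespa custom components",
        "hits": 5,
    },
)
print(response.json())
</pre>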
|