fix ml inference parameters description #9246
Conversation
Signed-off-by: Mingshi Liu <[email protected]>
Thank you for submitting your PR. The PR states are In progress (or Draft) -> Tech review -> Doc review -> Editorial review -> Merged. Before you submit your PR for doc review, make sure the content is technically accurate. If you need help finding a tech reviewer, tag a maintainer. When you're ready for doc review, tag the assignee of this PR. The doc reviewer may push edits to the PR directly or leave comments and editorial suggestions for you to address (let us know in a comment if you have a preference). The doc reviewer will arrange for an editorial review.
@@ -53,7 +53,7 @@ The following table lists the required and optional parameters for the `ml-infer
|:--- | :--- | :--- | :--- |
| `model_id` | String | Required | The ID of the ML model used by the processor. |
| `function_name` | String | Optional for externally hosted models<br/><br/>Required for local models | The function name of the ML model configured in the processor. For local models, valid values are `sparse_encoding`, `sparse_tokenize`, `text_embedding`, and `text_similarity`. For externally hosted models, the valid value is `remote`. Default is `remote`. |
- | `model_config` | Object | Optional | Custom configuration options for the ML model. For more information, see [The `model_config` object]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/model-apis/register-model/#the-model_config-object). |
+ | `model_config` | Object | Optional | Custom configuration options for the ML model. For externally hosted models, if set, this overrides the default connector parameters. For local models, `model_config` can be added to `model_input` to override the model configuration set during registration. For more information, see [The `model_config` object]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/model-apis/register-model/#the-model_config-object). |
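The new wording can be illustrated with a sketch of an `ml_inference` search response processor that overrides a connector default for an externally hosted model. The pipeline name, field names, and the `temperature` parameter are placeholders for illustration, not taken from this PR:

```json
PUT /_search/pipeline/my_pipeline
{
  "response_processors": [
    {
      "ml_inference": {
        "model_id": "<externally hosted model ID>",
        "input_map": [
          { "context": "review_text" }
        ],
        "output_map": [
          { "summary": "response" }
        ],
        "model_config": {
          "temperature": 0.2
        }
      }
    }
  ]
}
```

Here, `model_config.temperature` would override the default `temperature` defined in the model's connector for this pipeline only.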
"For local models, it can be added to model_input to override the model configuration set during registration" Does this mean we can't use model_config
for local model ?
We can add `model_config` for local models, but the `model_config` values need to be added to `model_input`.
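A hedged sketch of what passing `model_config` values through `model_input` might look like for a local model. The processor shape follows the `ml-inference` parameter table above, but the `model_input` template syntax, field names, and the `return_number` option are assumptions for illustration only:

```json
{
  "response_processors": [
    {
      "ml_inference": {
        "model_id": "<local model ID>",
        "function_name": "text_embedding",
        "model_config": {
          "return_number": true
        },
        "model_input": "{ \"text_docs\": [\"${input_map.text_docs}\"], \"return_number\": ${model_config.return_number} }",
        "input_map": [
          { "text_docs": "review_text" }
        ],
        "output_map": [
          { "embedding": "$.inference_results[0].output[0].data" }
        ]
      }
    }
  ]
}
```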
Thank you, @mingshl!
Resolved review comments (now outdated) on:
_search-plugins/search-pipelines/ml-inference-search-request.md
_search-plugins/search-pipelines/ml-inference-search-response.md
Signed-off-by: kolchfa-aws <[email protected]>
@kolchfa-aws @mingshl Please see my changes and let me know if you have any questions. Thanks!
Co-authored-by: Nathan Bower <[email protected]> Signed-off-by: kolchfa-aws <[email protected]>
* fix ml inference parameters description
  Signed-off-by: Mingshi Liu <[email protected]>
* address leftover comment from https://github.com/opensearch-project/documentation-website/pull/9213/files#r1962257484
  Signed-off-by: Mingshi Liu <[email protected]>
* Apply suggestions from code review
  Signed-off-by: kolchfa-aws <[email protected]>
* Apply suggestions from code review
  Co-authored-by: Nathan Bower <[email protected]>
  Signed-off-by: kolchfa-aws <[email protected]>
---------
Signed-off-by: Mingshi Liu <[email protected]>
Signed-off-by: kolchfa-aws <[email protected]>
Co-authored-by: kolchfa-aws <[email protected]>
Co-authored-by: Nathan Bower <[email protected]>
(cherry picked from commit 4865e9d)
Signed-off-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Description
Per @prasadnu's request, this PR makes the `model_config` description more descriptive. It also fixes the descriptions of `ignoreMissing` and `ignoreFailure` for the search processors.
For search, a partial prediction might lead to incorrect search results, so the processor is skipped when a required field is missing.
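To make the skip behavior concrete, here is a minimal sketch of where these flags appear in a processor definition. Field names are placeholders, and the snake_case parameter names are an assumption based on the form used in processor configurations:

```json
{
  "ml_inference": {
    "model_id": "<model ID>",
    "input_map": [
      { "context": "review_text" }
    ],
    "output_map": [
      { "summary": "response" }
    ],
    "ignore_missing": true,
    "ignore_failure": false
  }
}
```

With `ignore_missing` set to `true`, the processor is skipped when a mapped input field is absent from the search results, rather than producing a partial prediction.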
Issues Related
Closes #[delete this text, including the brackets, and replace with the issue number]
Version
2.14, 2.15, 2.16, 2.17, 2.18, 2.19
Checklist
For more information on following Developer Certificate of Origin and signing off your commits, please check here.