fix ml inference parameters description #9246

Conversation

@mingshl (Contributor) commented Feb 19, 2025

Description

Per @prasadnu's request, this PR makes the `model_config` description more descriptive. It also fixes the descriptions of `ignoreMissing` and `ignoreFailure` for the search processors.

For search, a partial prediction might lead to incorrect search results, so the processor is skipped when a required field is missing.

Issues Related

Closes #[delete this text, including the brackets, and replace with the issue number]

Version

2.14, 2.15, 2.16, 2.17, 2.18, 2.19

Checklist

  • By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license and subject to the Developers Certificate of Origin.
    For more information on following the Developer Certificate of Origin and signing off your commits, please check here.


Thank you for submitting your PR. The PR states are In progress (or Draft) -> Tech review -> Doc review -> Editorial review -> Merged.

Before you submit your PR for doc review, make sure the content is technically accurate. If you need help finding a tech reviewer, tag a maintainer.

When you're ready for doc review, tag the assignee of this PR. The doc reviewer may push edits to the PR directly or leave comments and editorial suggestions for you to address (let us know in a comment if you have a preference). The doc reviewer will arrange for an editorial review.

@@ -53,7 +53,7 @@ The following table lists the required and optional parameters for the `ml-infer
|:--- | :--- | :--- | :--- |
| `model_id` | String | Required | The ID of the ML model used by the processor. |
| `function_name` | String | Optional for externally hosted models<br/><br/>Required for local models | The function name of the ML model configured in the processor. For local models, valid values are `sparse_encoding`, `sparse_tokenize`, `text_embedding`, and `text_similarity`. For externally hosted models, valid value is `remote`. Default is `remote`. |
| `model_config` | Object | Optional | Custom configuration options for the ML model. For more information, see [The `model_config` object]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/model-apis/register-model/#the-model_config-object). |
| `model_config` | Object | Optional | Custom configuration options for the ML model. For remote models, if set, this configuration overrides the default parameters defined in the connector. For local models, it can be added to `model_input` to override the model configuration set during registration. For more information, see [The `model_config` object]({{site.url}}{{site.baseurl}}/ml-commons-plugin/api/model-apis/register-model/#the-model_config-object). |
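
For context, a rough sketch of what the updated description implies for an externally hosted model: a `model_config` block in the `ml_inference` processor overrides the default parameters defined in the connector. The pipeline name, model ID, field names, and parameter values below are illustrative placeholders and are not taken from this PR.

```json
PUT /_search/pipeline/nlp_search_pipeline
{
  "response_processors": [
    {
      "ml_inference": {
        "model_id": "<externally_hosted_model_id>",
        "function_name": "remote",
        "input_map": [
          {
            "inputs": "passage_text"
          }
        ],
        "output_map": [
          {
            "passage_summary": "response"
          }
        ],
        "model_config": {
          "temperature": 0.1,
          "max_tokens": 256
        },
        "ignore_missing": true,
        "ignore_failure": false
      }
    }
  ]
}
```

Without the `model_config` block, the processor falls back to the parameter values set in the connector. The `ignore_missing` and `ignore_failure` flags, whose descriptions this PR also touches, roughly control whether the processor is skipped when a mapped field is missing or when inference fails.
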
Contributor

"For local models, it can be added to model_input to override the model configuration set during registration" Does this mean we can't use model_config for local model ?

Contributor Author

We can add `model_config` for a local model, but the `model_config` needs to be added to `model_input`. Here is the example: model_config
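
The linked example isn't reproduced here, but a minimal sketch of the idea might look like the following, assuming a local text embedding model. The `model_input` template syntax, the map keys, and the JSON path in `output_map` are illustrative placeholders; the exact template depends on the input format the local model expects.

```json
PUT /_search/pipeline/local_model_pipeline
{
  "response_processors": [
    {
      "ml_inference": {
        "model_id": "<local_model_id>",
        "function_name": "text_embedding",
        "input_map": [
          {
            "text_docs": "passage_text"
          }
        ],
        "output_map": [
          {
            "passage_embedding": "$.inference_results[0].output[0].data"
          }
        ],
        "model_input": "{ \"text_docs\": ${input_map.text_docs}, \"return_number\": true, \"target_response\": [\"sentence_embedding\"] }"
      }
    }
  ]
}
```

In this sketch, the configuration the local model needs at prediction time is embedded in the `model_input` template alongside the mapped field rather than passed as a standalone `model_config` object, which is how the override described in the comment above would be expressed.
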

@kolchfa-aws (Collaborator) left a comment

Thank you, @mingshl!

@natebower (Collaborator) left a comment

@kolchfa-aws @mingshl Please see my changes and let me know if you have any questions. Thanks!

@kolchfa-aws merged commit 4865e9d into opensearch-project:main on Feb 27, 2025
5 checks passed
opensearch-trigger-bot (bot) pushed a commit that referenced this pull request on Feb 27, 2025
* fix ml inference parameters description

Signed-off-by: Mingshi Liu <[email protected]>

* address leftover comment from https://github.com/opensearch-project/documentation-website/pull/9213/files#r1962257484

Signed-off-by: Mingshi Liu <[email protected]>

* Apply suggestions from code review

Signed-off-by: kolchfa-aws <[email protected]>

* Apply suggestions from code review

Co-authored-by: Nathan Bower <[email protected]>
Signed-off-by: kolchfa-aws <[email protected]>

---------

Signed-off-by: Mingshi Liu <[email protected]>
Signed-off-by: kolchfa-aws <[email protected]>
Co-authored-by: kolchfa-aws <[email protected]>
Co-authored-by: Nathan Bower <[email protected]>
(cherry picked from commit 4865e9d)
Signed-off-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>