Adds input_type parameter to POST inference docs at the root level (#4846)
* Adds input_type parameter to POST inference docs at the root level.
* Update specification/inference/inference/InferenceRequest.ts
Co-authored-by: David Kyle <[email protected]>
---------
Co-authored-by: David Kyle <[email protected]>
Diff of specification/inference/inference/InferenceRequest.ts:

  * > Inference endpoints for the `completion` task type currently only support a single string as input.
  */
 input: string|Array<string>
+/**
+ * Specifies the input data type for the text embedding model. The `input_type` parameter only applies to Inference Endpoints with the `text_embedding` task type. Possible values include:
+ * * `SEARCH`
+ * * `INGEST`
+ * * `CLASSIFICATION`
+ * * `CLUSTERING`
+ * Not all services support all values. Unsupported values will trigger a validation exception.
+ * Accepted values depend on the configured inference service, refer to the relevant service-specific documentation for more info.
+ *
+ * > info
+ * > The `input_type` parameter specified on the root level of the request body will take precedence over the `input_type` parameter specified in `task_settings`.
+ */
+input_type?: string
 /**
  * Task settings for the individual inference request.
  * These settings are specific to the task type you specified and override the task settings specified when initializing the service.
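For illustration only, here is a minimal sketch of a request body that matches the updated interface, with `input_type` set at the root level of a `text_embedding` inference request. This example is not part of the commit; the endpoint id `my-e5-endpoint`, the local Elasticsearch URL, and the API key are placeholder assumptions.

```ts
// Sketch: POST an inference request with a root-level `input_type`.
// Assumptions (not from the commit): the inference endpoint id "my-e5-endpoint",
// the Elasticsearch URL, and the API-key value are placeholders.

interface InferenceRequest {
  input: string | Array<string>
  input_type?: string
  task_settings?: Record<string, unknown>
}

const body: InferenceRequest = {
  input: ['first passage to embed', 'second passage to embed'],
  // The root-level input_type takes precedence over any input_type in task_settings.
  input_type: 'INGEST',
}

async function postInference(): Promise<void> {
  const res = await fetch(
    'http://localhost:9200/_inference/text_embedding/my-e5-endpoint',
    {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: 'ApiKey <your-api-key>',
      },
      body: JSON.stringify(body),
    }
  )
  console.log(await res.json())
}

postInference().catch(console.error)
```

Note that whether `INGEST` (or any other value) is accepted depends on the configured inference service, as described in the doc comment above.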