Hi, I am trying out aisuite and was wondering if there is any interest in standardizing LLM params like temperature, top_k, etc. I notice that currently every provider has its own independent set of params:
Bedrock - aisuite/aisuite/providers/aws_provider.py (line 13 in cf9df9a):

```python
INFERENCE_PARAMETERS = ["maxTokens", "temperature", "topP", "stopSequences"]
```
Ollama -

```python
def chat_completions_create(self, model, messages, **kwargs):
```

etc.
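For illustration, here is a minimal sketch of what a shared parameter-normalization layer could look like. Everything here is hypothetical, not part of aisuite's current API: the `PARAM_MAP` tables and `normalize_params` helper are made up for this example, and the native parameter names would need to be checked against each provider's SDK docs.

```python
# Hypothetical sketch: map standardized parameter names to each
# provider's native names. Not part of aisuite's current API.

# Per-provider translation tables (illustrative only; verify the
# native names against each provider's documentation).
PARAM_MAP = {
    "aws": {
        "temperature": "temperature",
        "top_p": "topP",
        "max_tokens": "maxTokens",
        "stop": "stopSequences",
    },
    "ollama": {
        "temperature": "temperature",
        "top_p": "top_p",
        "top_k": "top_k",
        "max_tokens": "num_predict",
        "stop": "stop",
    },
}


def normalize_params(provider: str, **kwargs) -> dict:
    """Translate standardized kwargs to a provider's native names.

    Unknown keys pass through unchanged, so provider-specific
    extras still work alongside the standardized names.
    """
    mapping = PARAM_MAP.get(provider, {})
    return {mapping.get(key, key): value for key, value in kwargs.items()}


# Example: the same call site works for both providers.
print(normalize_params("aws", temperature=0.7, max_tokens=256))
# {'temperature': 0.7, 'maxTokens': 256}
print(normalize_params("ollama", temperature=0.7, max_tokens=256))
# {'temperature': 0.7, 'num_predict': 256}
```

Something like this would let callers write one set of kwargs and have each provider adapter translate them, while still allowing provider-specific options to pass through untouched.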