The documentation makes no reference to max_tokens, and ModelSettings does not accept a max_tokens parameter.
This is a problem especially with Anthropic models, since they do not assume a default max tokens value and require one to be passed explicitly.
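For reference, here is a minimal sketch of how a request could look once `ModelSettings` accepts `max_tokens`. The `Agent`/`Runner`/`ModelSettings` names follow the SDK's documented usage; the `max_tokens` field itself is the addition requested in this issue, and the agent name, instructions, and values are placeholders.

```python
from agents import Agent, Runner, ModelSettings

# Sketch only: max_tokens is the field this issue asks for; everything
# else follows the SDK's standard Agent/Runner pattern.
agent = Agent(
    name="Assistant",
    instructions="You are a helpful assistant.",
    model_settings=ModelSettings(
        max_tokens=1024,   # proposed: explicit cap on generated tokens,
                           # which Anthropic-style APIs require
        temperature=0.2,
    ),
)

result = Runner.run_sync(agent, "Say hello in one short sentence.")
print(result.final_output)
```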
I am working on a fix for this issue.
Fixes openai#71: Added support for max_tokens in ModelSettings (a1b4dbc)
Apologies, I didn't see this issue/PR in time and implemented it myself via #105.
No worries, getting that fixed was the whole point.