
max_tokens is not an accepted parameter #71

Closed
s44002 opened this issue Mar 12, 2025 · 3 comments
Labels
bug Something isn't working

Comments


s44002 commented Mar 12, 2025

There is no reference to max_tokens anywhere in the documentation, and ModelSettings does not accept a max_tokens parameter.

This is a problem especially when using Anthropic models, since they don't assume a default max tokens value and require one to be passed explicitly.
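To illustrate the gap, here is a minimal sketch of a ModelSettings-style dataclass that carries max_tokens alongside the other sampling parameters and forwards only explicitly set values to the request. The names and structure are illustrative assumptions modeled on the issue, not the SDK's actual implementation.

```python
from dataclasses import dataclass, fields
from typing import Optional


# Hypothetical sketch: a ModelSettings-like container that includes
# max_tokens, the parameter this issue asks for. Field names are
# assumptions for illustration only.
@dataclass
class ModelSettings:
    temperature: Optional[float] = None
    top_p: Optional[float] = None
    max_tokens: Optional[int] = None

    def to_request_kwargs(self) -> dict:
        # Forward only the values the caller actually set, so providers
        # that supply their own defaults are left untouched.
        return {
            f.name: getattr(self, f.name)
            for f in fields(self)
            if getattr(self, f.name) is not None
        }


# Anthropic-style models need an explicit max tokens value:
settings = ModelSettings(max_tokens=1024)
print(settings.to_request_kwargs())  # {'max_tokens': 1024}
```

With a field like this in place, the value can be merged into the provider request the same way temperature and top_p already are.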

@s44002 s44002 added the bug Something isn't working label Mar 12, 2025

s44002 commented Mar 12, 2025

I am working on a fix for this.

rm-openai (Collaborator) commented

Apologies, I didn't see this issue/PR in time and implemented it myself via #105


s44002 commented Mar 13, 2025

No worries, getting that fixed was the whole point.
