Use max_completion_tokens param for OpenAI Chat Completion API #679


Open · wants to merge 1 commit into base: main

Conversation

miphreal

The current implementation uses the max_tokens param for the OpenAI Chat Completions API; however, according to the docs this parameter is deprecated, and max_completion_tokens should be used instead.

[Screenshot: OpenAI API reference marking max_tokens as deprecated]

Changes

  • use the max_completion_tokens param to set the max output tokens for OpenAI Chat Completions
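A minimal sketch of the change described above. The helper name `build_chat_request` and the local parameter `max_output_tokens` are hypothetical (not from this PR); the point is that the kwargs passed to `client.chat.completions.create()` use `max_completion_tokens` rather than the deprecated `max_tokens`:

```python
def build_chat_request(model: str, messages: list, max_output_tokens: int) -> dict:
    """Assemble kwargs for client.chat.completions.create().

    `max_output_tokens` is a hypothetical local name here; it maps onto
    the API's `max_completion_tokens` parameter, which supersedes the
    deprecated `max_tokens`.
    """
    return {
        "model": model,
        "messages": messages,
        "max_completion_tokens": max_output_tokens,  # was: "max_tokens"
    }

request = build_chat_request(
    "gpt-4o",
    [{"role": "user", "content": "Hello"}],
    256,
)
```

Since the two parameters serve the same purpose (capping generated tokens), the change is a straight rename of the key in the request payload.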
