
Support o3-mini model #245


Closed
Piotr1215 opened this issue Jan 31, 2025 · 3 comments · Fixed by #246

Comments

@Piotr1215

With the release of o3-mini, we need to add support for it, similar to the o1 models.

Crude implementation idea:

if provider == "openai" and (model.model:sub(1, 2) == "o1" or model.model == "o3-mini") then

In this line
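A slightly more general form of that check could use a prefix match so the o1 family and o3-mini are covered without enumerating every variant. This is a hedged sketch; the helper name `is_reasoning_model` is hypothetical, and only the `o1`/`o3-mini` names come from this thread:

```lua
-- Sketch only: prefix check for OpenAI reasoning models.
-- The helper name is illustrative, not part of gp.nvim's API.
local function is_reasoning_model(name)
  local prefix = name:sub(1, 2)
  return prefix == "o1" or prefix == "o3"
end

print(is_reasoning_model("o3-mini")) --> true
print(is_reasoning_model("o1"))      --> true
print(is_reasoning_model("gpt-4o"))  --> false
```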

@QuestionableAntics
Contributor

Being able to set the reasoning_effort property for reasoning models would also be nice.
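For illustration, `reasoning_effort` is a parameter OpenAI's chat completions API accepts for reasoning models; how gp.nvim would plumb it through is an assumption here, so this is just a sketch of where the field could sit in a request payload:

```lua
-- Sketch only: the gp.nvim plumbing around this table is assumed,
-- not actual plugin API. reasoning_effort is a real OpenAI request
-- parameter for reasoning models.
local payload = {
  model = "o3-mini",
  reasoning_effort = "medium", -- accepted values: "low" | "medium" | "high"
  messages = {
    { role = "user", content = "Summarize this buffer." },
  },
}
```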

@QuestionableAntics
Contributor

I created an MR to add o3 support.

@ctrlplusb

ctrlplusb commented Feb 3, 2025

You can monkey-patch the fix that @Piotr1215 suggested via:

require('gp').setup {
  -- your config
}

-- Monkey-patch the dispatcher after setup: wrap prepare_payload so
-- requests to o3 models have the parameters reasoning models reject
-- stripped out before they are sent.
local dispatcher = require 'gp.dispatcher'
local original_prepare_payload = dispatcher.prepare_payload
dispatcher.prepare_payload = function(messages, model, provider)
  local output = original_prepare_payload(messages, model, provider)
  if provider == 'openai' and model.model:sub(1, 2) == 'o3' then
    -- o3 models reject the system role, so drop those messages.
    -- Iterate backwards so table.remove does not skip entries.
    for i = #messages, 1, -1 do
      if messages[i].role == 'system' then
        table.remove(messages, i)
      end
    end
    -- Reasoning models do not accept these sampling parameters.
    output.max_tokens = nil
    output.temperature = nil
    output.top_p = nil
    output.stream = true
  end
  return output
end
