Description
Basic checks
- I searched existing issues - this hasn't been reported
- I can reproduce this consistently
- This is a RubyLLM bug, not my application code
What's broken?
Setting a custom tool_choice param for OpenAI is overridden and forced to tool_choice: "auto", which prevents using the tool_choice: "required" feature.
This is due to ruby_llm/lib/ruby_llm/providers/openai/chat.rb, lines 24 to 27 in aceeaf2:

```ruby
if tools.any?
  payload[:tools] = tools.map { |_, tool| tool_for(tool) }
  payload[:tool_choice] = 'auto'
end
```
This is related to #303 ("dangerously" setting params), but I think the hard-coded tool_choice in render_payload should probably just be removed: the OpenAI docs say tool_choice defaults to "auto" when tools are present.
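A minimal sketch of what I have in mind for render_payload, assuming nothing else relies on the hard-coded default:

```ruby
if tools.any?
  payload[:tools] = tools.map { |_, tool| tool_for(tool) }
  # No hard-coded payload[:tool_choice] = 'auto' here: OpenAI already
  # defaults to "auto" when tools are present, and leaving the key out
  # lets a caller-supplied tool_choice (e.g. "required") pass through.
end
```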
How to reproduce
- Configure RubyLLM with OpenAI and a model that supports tools, e.g. gpt-4o-mini.
- Enable debug logging to see the request: export RUBYLLM_DEBUG=true
- Create a tool and attach it to the chat instance.
- Set a custom tool_choice: 'required' param.
- Call ask (a sketch of these steps follows below).
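Putting the steps together, a minimal reproduction could look like this. The with_params call is my assumption for how the custom param gets injected (per the params-override work in #303), and the Weather tool is hypothetical:

```ruby
require 'ruby_llm'

RubyLLM.configure do |config|
  config.openai_api_key = ENV['OPENAI_API_KEY']
end

# Hypothetical tool, only here so the payload includes a tools array.
class Weather < RubyLLM::Tool
  description 'Returns a canned weather report'
  param :location, desc: 'City name'

  def execute(location:)
    "It is sunny in #{location}."
  end
end

chat = RubyLLM.chat(model: 'gpt-4o-mini')
              .with_tool(Weather)
              .with_params(tool_choice: 'required') # assumption, see note above

chat.ask('What is the weather in Berlin?')
# With RUBYLLM_DEBUG=true, the logged request body still shows
# "tool_choice":"auto" because render_payload overwrites the custom value.
```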
Expected behavior
tool_choice should be "required" in the payload.
What actually happened
tool_choice is set to "auto".
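Concretely, the relevant fragment of the rendered payload I expected vs. what the debug log shows (hash shapes are illustrative; the tools array is omitted):

```ruby
# Expected: the caller-supplied value survives rendering
expected_fragment = { model: 'gpt-4o-mini', tool_choice: 'required' }

# Actual: render_payload overwrites it with the hard-coded default
actual_fragment   = { model: 'gpt-4o-mini', tool_choice: 'auto' }
```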
Environment
Ruby version: 3.2.3
RubyLLM version: 1.4.0