
michaelmdeng

Normally, llm prompt ... calls go through Model.prompt(), which accepts model options as keyword arguments: Model.prompt(..., **options). However, llm prompt ... calls that use tools (or other configurations that require a conversation) go through Conversation.chain(), which instead takes model options as a single parameter: Conversation.chain(..., options: Optional[dict] = None). Because the CLI passes options as keyword arguments in both cases, tool-using calls fail with an "unexpected keyword argument" error whenever custom model options are provided.

An easy way to trigger this error is via a custom provider option for OpenRouter models via the llm-openrouter plugin.

$ llm -m openrouter/deepseek/deepseek-chat-v3-0324 -T llm_version "what version of llm is this?" -o provider '{"only": ["deepseek"]}'

Error: Conversation.chain() got an unexpected keyword argument 'provider'

There is no equivalent error when tools are not used:

$ llm -m openrouter/deepseek/deepseek-chat-v3-0324 "what version of llm is this?" -o provider '{"only": ["deepseek"]}'
<response>...
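The signature mismatch above can be sketched in isolation. This is not the actual llm source, just a minimal reproduction of the pattern: a prompt method that forwards **options works with arbitrary option names, while a chain method that only declares an explicit options dict raises TypeError when the same keyword arguments are forwarded to it.

```python
# Minimal sketch of the mismatch (names mirror llm's Model.prompt() /
# Conversation.chain(), but the bodies are illustrative, not the real code).

class Model:
    def prompt(self, text, **options):
        # Arbitrary model options such as provider={...} are accepted.
        return f"prompt ok, options={options}"


class BrokenConversation:
    def chain(self, text, options=None):
        # Only an explicit `options` dict parameter exists, so forwarding
        # provider=... as a keyword argument raises TypeError.
        return f"chain ok, options={options}"


class FixedConversation:
    def chain(self, text, **options):
        # After the fix: accept options the same way Model.prompt() does.
        return f"chain ok, options={options}"


opts = {"provider": {"only": ["deepseek"]}}

print(Model().prompt("hi", **opts))            # works
try:
    BrokenConversation().chain("hi", **opts)   # fails
except TypeError as e:
    print("Error:", e)                         # ... unexpected keyword argument 'provider'
print(FixedConversation().chain("hi", **opts)) # works after the fix
```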

With this fix, we can successfully prompt with the tool call:

$ llm -m openrouter/deepseek/deepseek-chat-v3-0324 -T llm_version "what version of llm is this?" -o provider '{"only": ["deepseek"]}'

The installed version of LLM is 0.26.

The fix updates Conversation.chain() to support custom model options the same way Model.prompt() does.
@michaelmdeng

Resolved in #1233.
