fix: convert litellm response with reasoning content to openai message #1098


Open · wants to merge 2 commits into main from fix/reasoning-content

Conversation

zkllll2002

1. Description

This PR fixes an issue where reasoning content from models accessed via LiteLLM was not being correctly parsed into the ChatCompletionMessage format. This was particularly noticeable when using reasoning models.

2. Context

I am using the openai-agents-python library in my project, and it has been incredibly helpful. Thank you for building such a great tool!

My setup uses litellm to interface with gemini-2.5-pro. I noticed that while the agent could receive a response, the reasoning (thinking) content from the Gemini model was lost during the conversion from the LiteLLM response format to the OpenAI ChatCompletionMessage object.

I saw that PR #871 made progress on a similar issue, but it seems the specific response structure from LiteLLM still requires a small adaptation. This fix adds the necessary logic to ensure that these responses are handled.

Relates to: #871

3. Key Changes

  • LitellmConverter.convert_message_to_openai: add reasoning_content
  • Converter.items_to_messages: pass the reasoning item through unchanged
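The conversion change can be sketched as follows. The classes below are minimal stand-ins for the real LiteLLM and OpenAI message types (which are pydantic models in the actual libraries), so this only illustrates the shape of the fix, not the PR's exact code:

```python
from dataclasses import dataclass
from typing import Optional

# Stand-ins for the real LiteLLM / OpenAI message types, used only to
# illustrate the conversion; field names match the real attributes.
@dataclass
class LitellmMessage:
    role: str
    content: Optional[str] = None
    reasoning_content: Optional[str] = None  # emitted by reasoning models via LiteLLM

@dataclass
class ChatCompletionMessageLike:
    role: str
    content: Optional[str] = None
    reasoning_content: Optional[str] = None

def convert_message_to_openai(message: LitellmMessage) -> ChatCompletionMessageLike:
    # Forward reasoning_content instead of silently dropping it.
    return ChatCompletionMessageLike(
        role=message.role,
        content=message.content,
        reasoning_content=getattr(message, "reasoning_content", None),
    )
```

The `getattr` default keeps the conversion safe for providers whose messages carry no reasoning field at all.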

@ericlu88

@seratch Gentle ping on this PR. We've been using a fork of the repo to get this fix in. Would love to get back to mainline.

@seratch
Member

seratch commented Jul 23, 2025

The essential change should be good to go but before reviews, could you fix the mypy errors?

uv run mypy .
src/agents/extensions/models/litellm_model.py:371: error: Unexpected keyword argument "reasoning_content" for "ChatCompletionMessage"  [call-arg]
Found 1 error in 1 file (checked 261 source files)
make: *** [Makefile:20: mypy] Error 1
Error: Process completed with exit code 2.
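The error arises because `ChatCompletionMessage` does not declare a `reasoning_content` field, so passing it as a keyword fails type checking. One common way to make such an extra field type-check (a sketch with stand-in types and a hypothetical subclass name, not necessarily the exact approach taken in this PR) is to declare it on a subclass and construct that instead:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ChatCompletionMessage:
    # Stand-in for openai.types.chat.ChatCompletionMessage (a pydantic
    # model in the real library); shown only to illustrate the pattern.
    role: str
    content: Optional[str] = None

@dataclass
class InternalChatCompletionMessage(ChatCompletionMessage):
    # Hypothetical subclass: declaring the extra field here means
    # constructing it with reasoning_content=... type-checks, while the
    # object still passes anywhere a ChatCompletionMessage is expected.
    reasoning_content: Optional[str] = None
```

Because the subclass is a proper subtype, downstream code that only knows about `ChatCompletionMessage` keeps working unchanged.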

@zkllll2002 zkllll2002 force-pushed the fix/reasoning-content branch from 3f688bb to af9aba4 Compare July 23, 2025 04:07
@zkllll2002
Author

> The essential change should be good to go but before reviews, could you fix the mypy errors?
>
> uv run mypy .
> src/agents/extensions/models/litellm_model.py:371: error: Unexpected keyword argument "reasoning_content" for "ChatCompletionMessage"  [call-arg]
> Found 1 error in 1 file (checked 261 source files)
> make: *** [Makefile:20: mypy] Error 1
> Error: Process completed with exit code 2.

@seratch thanks, done

Member

@seratch left a comment


Looks good to me; @rm-openai any thoughts?

@seratch seratch requested a review from rm-openai July 25, 2025 00:28