
Conversation

@grs (Contributor) commented Aug 14, 2025

What does this PR do?

Handles MCP tool calls in a previous response

Closes #3105

Test Plan

Made a call to create a response that included an MCP tool call, then made a second call with the first linked through previous_response_id. Did not get an error.

Also added a unit test.
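
For reference, a minimal sketch of that two-call flow against the OpenAI-compatible Responses API. The base_url, port, model name, and MCP server URL below are illustrative assumptions, not values taken from this PR.

# Minimal sketch of the test plan: two Responses API calls chained via
# previous_response_id. base_url, model, and the MCP server URL are assumed.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8321/v1/openai/v1", api_key="none")

tools = [
    {
        "type": "mcp",
        "server_label": "localmcp",
        "server_url": "http://localhost:8000/sse",  # assumed test MCP server
        "require_approval": "never",
    }
]

# First call: the response includes MCP tool listing / tool call items.
first = client.responses.create(
    model="openai/gpt-4o",
    input="What tools do you have available?",
    tools=tools,
)

# Second call: chained to the first. Before this change, re-processing the MCP
# tool items stored in the previous response raised an error (#3105).
second = client.responses.create(
    model="openai/gpt-4o",
    input="Use one of those tools to answer: what is the boiling point of water?",
    tools=tools,
    previous_response_id=first.id,
)
print(second.output_text)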

The meta-cla bot added the CLA Signed label on Aug 14, 2025.
@ashwinb (Contributor) commented Aug 14, 2025

Could you run and modify the integration tests as appropriate too? These are in tests/integration/non_ci/responses -- I am working on making them recordable and part of our regular CI runs also.

@grs (Contributor, Author) commented Aug 14, 2025

> Could you run and modify the integration tests as appropriate too? These are in tests/integration/non_ci/responses -- I am working on making them recordable and part of our regular CI runs also.

Two concerns/questions to run by you on this:

(1) I see that the current function call integration test is being skipped as "Tool calling is not reliable."[1]
(2) I would need to run a test MCP server of some kind to actually get a successful tool call invocation, which would likely bring in some dependencies. Any thoughts or advice on how best to do that? For example: import fastmcp libs directly in the test and run something in the background for the test to connect to (a rough sketch of that option follows the footnote below), or use some existing MCP server and start it as a separate process when kicking off the integration test run, or something else?

[1]

@pytest.mark.skip(reason="Tool calling is not reliable.")
def test_function_call_output_response(openai_client, client_with_models, text_model_id):
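
To make the first option above concrete, here is a rough sketch of an in-process test MCP server. It assumes the mcp Python package's FastMCP API; the tool, port, and transport are made up for illustration.

# Rough sketch of the "import fastmcp libs directly in test" option:
# run a small MCP server in a background thread for the test to connect to.
# Assumes the `mcp` package's FastMCP API; tool, port, and transport are made up.
import threading

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("test-server", port=8000)

@mcp.tool()
def get_boiling_point(liquid_name: str) -> str:
    """Canned tool answer so the test can assert on a known value."""
    return "100C" if liquid_name == "water" else "unknown"

def start_test_mcp_server() -> threading.Thread:
    # SSE transport listens on http://localhost:8000/sse; a daemon thread lets
    # the test process exit without an explicit shutdown.
    thread = threading.Thread(target=lambda: mcp.run(transport="sse"), daemon=True)
    thread.start()
    return thread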

@ashwinb (Contributor) commented Aug 14, 2025

@grs if you use the library client for testing, then all of this is already possible. See https://github.com/llamastack/llama-stack/blob/main/tests/integration/non_ci/responses/test_responses.py#L556

You can execute this test like so:

pytest -s -v tests/integration/non_ci/responses/ \
   --stack-config=starter \
   --text-model openai/gpt-4o \
   --embedding-model=sentence-transformers/all-MiniLM-L6-v2 \
   -k "multi_turn_tool and streaming and client_with_models"

There are a couple of important things in that command line:

  • Use --stack-config=starter and not --stack-config=server:starter, because the MCP server needs to be reachable from the stack instance. The MCP server is started by the test itself, so we must use the library client; hence --stack-config=starter.
  • Use "client_with_models" in the test filter. This makes sure you don't pick up the openai_client fixture, which won't work with the library client (that fixture is designed for use against the server).
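
For what it's worth, a rough sketch of the shape such a test might take under the library client. The fixture names mirror the ones mentioned above, but the tool payload and assertions are assumptions rather than the repo's actual test code.

# Rough shape of a multi-turn MCP test run via the library client. Assumes the
# client_with_models fixture exposes the OpenAI-compatible Responses API and
# that the test has already started an MCP server at the given URL.
import pytest

@pytest.mark.parametrize("stream", [False, True])
def test_mcp_tool_call_in_previous_response(client_with_models, text_model_id, stream):
    tools = [{
        "type": "mcp",
        "server_label": "localmcp",
        "server_url": "http://localhost:8000/sse",  # assumed in-test MCP server
        "require_approval": "never",
    }]
    first = client_with_models.responses.create(
        model=text_model_id,
        input="List your tools and call one of them.",
        tools=tools,
    )
    # Chaining through previous_response_id is exactly what #3105 reported as broken.
    second = client_with_models.responses.create(
        model=text_model_id,
        input="Now summarize what the tool returned.",
        tools=tools,
        previous_response_id=first.id,
        stream=stream,
    )
    if stream:
        # Streaming returns an event iterator; draining it without an exception
        # is the main point of the regression check.
        assert len(list(second)) > 0
    else:
        assert second.id != first.id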

Let me know if this helps.

@grs (Contributor, Author) commented Aug 14, 2025

@ashwinb Thank you, that does indeed help! Sorry also that I missed the non_ci dir in the path the first time round and so was looking at the wrong tests. Thanks again for your help!

@ashwinb (Contributor) commented Aug 15, 2025

@grs I moved a bunch of things around and refactored. Your changes should be relatively easy to rebase though.

@grs force-pushed the issue-3105 branch 3 times, most recently from bceb88f to 7ad5ef2 on August 15, 2025 at 18:45.
@ashwinb (Contributor) left a comment

lgtm

@ashwinb merged commit 14082b2 into llamastack:main on Aug 20, 2025 (24 checks passed).
franciscojavierarceo pushed a commit to franciscojavierarceo/llama-stack that referenced this pull request Aug 22, 2025