
Conversation

@rakotomandimby
Contributor

This is work in progress.

  • The /responses endpoint is queried
  • The streamed response is received chunk by chunk
  • The response is displayed incrementally in the chat
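The streaming flow above can be sketched as a small SSE parser. This is an illustrative sketch, not the PR's actual code: the event names `response.output_text.delta` and `response.completed` come from the commit messages, but the function names and the `{"delta": ...}` payload shape are assumptions.

```python
import json

def parse_sse_stream(lines):
    """Yield (event, payload) pairs from raw SSE lines.

    A blank line terminates each event, per the SSE wire format.
    """
    event, data = None, []
    for line in lines:
        line = line.rstrip("\n")
        if line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data.append(line[len("data:"):].strip())
        elif line == "" and event is not None:
            yield event, (json.loads("\n".join(data)) if data else None)
            event, data = None, []

def collect_text(lines):
    """Accumulate output-text deltas until the response completes."""
    parts = []
    for event, payload in parse_sse_stream(lines):
        if event == "response.output_text.delta":
            parts.append(payload["delta"])  # assumed payload field
        elif event == "response.completed":
            break
    return "".join(parts)
```

In the real plugin each delta would be pushed to the chat buffer as it arrives rather than collected into a list.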

…
- Detect models that only support /responses and mark as responses_only
- Send requests with {model, stream, instructions, input} format
- Parse /responses SSE events (response.output_text.delta, response.completed)
- Keep backward compatibility with /chat/completions
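The routing between the two request formats can be sketched as follows. The field set `{model, stream, instructions, input}` and the `responses_only` flag are taken from the commit messages; the function name and the metadata dictionary shape are hypothetical.

```python
def build_request(model_meta, system_prompt, user_input):
    """Return (endpoint, payload) for a model, choosing the /responses
    format when the model is marked responses_only."""
    if model_meta.get("responses_only"):
        # New /responses request shape from the PR description
        return "/responses", {
            "model": model_meta["name"],
            "stream": True,
            "instructions": system_prompt,
            "input": user_input,
        }
    # Legacy /chat/completions shape, kept for backward compatibility
    return "/chat/completions", {
        "model": model_meta["name"],
        "stream": True,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_input},
        ],
    }
```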
Mihamina RKTMB added 2 commits October 11, 2025 19:31
- Detect supported_endpoints and toggle use_responses_api; route to /responses endpoints
- Add responses input/output paths, including streaming event parsing and overwrite semantics
- Accumulate tool calls by id/index and safely concatenate arguments; allow string|number ids
- Avoid empty on_progress updates; support skip_progress, content_overwrite, reasoning_overwrite
- Fix RESOURCE_SHORT_FORMAT placeholder bug; expand generate_resource_block args
- Respect top_p from options; propagate supported_endpoints in model metadata
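The tool-call accumulation described above ("accumulate tool calls by id/index and safely concatenate arguments; allow string|number ids") can be sketched like this. The chunk field names (`id`, `index`, `name`, `arguments`) are assumptions modeled on typical streamed tool-call fragments, not taken from the PR's code.

```python
def accumulate_tool_calls(chunks):
    """Merge streamed tool-call fragments, keyed by id (string or number)
    with index as a fallback; argument fragments are concatenated."""
    calls = {}
    for chunk in chunks:
        key = chunk.get("id")
        if key is None:
            key = chunk.get("index")
        key = str(key)  # normalize: ids may arrive as string|number
        entry = calls.setdefault(key, {"name": None, "arguments": ""})
        if chunk.get("name"):
            entry["name"] = chunk["name"]
        # "safely concatenate": treat a missing/None fragment as empty
        entry["arguments"] += chunk.get("arguments") or ""
    return calls
```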
@rakotomandimby rakotomandimby deleted the feat/1442-responses-api-gpt5codex branch October 11, 2025 16:54