feat(provider/openai): Set the annotations from the Responses API to doStream #10253
Conversation
Co-authored-by: vercel[bot] <35613825+vercel[bot]@users.noreply.github.com>
I have reviewed this pull request (self-review).

@tsuzaki430 Can confirm this works well for Azure. Tested the script in azure-responses-code-interpreter.ts. Merging this would allow Streamdown rendering of code-interpreter download links during streaming, as discussed in the community session (with the start and end index). @gr2m

Thank you for testing and showing us the result.

I also reviewed this, great work @tsuzaki430
…0370)

## Background
Web Search Preview has become available on Microsoft Azure OpenAI. We enabled the use of `web_search_preview` from `@ai-sdk/openai` within `@ai-sdk/azure`.
[https://learn.microsoft.com/en-us/azure/ai-foundry/openai/how-to/web-search?view=foundry-classic](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/how-to/web-search?view=foundry-classic)

## Summary
Using Next.js, it is now possible to perform web searches with both `useChat` and `streamText`.
* We enabled Azure's webSearchPreview tool by routing through the OpenAI internal implementation.
* In `doGenerate`, the Zod type definition for `url_citation` was updated to support annotations.
* In `doStream`, there were some missing pieces when switching between `web_search` and `web_search_preview`.
* Added a description of `web_search_preview` to the Azure provider documentation page.

## Manual Verification
* Added verification for `web_search_preview` in Azure's CI/CD tests.
  `packages/azure/src/__fixtures__/azure-web-search-preview-tool.1.chunks.txt`
  `packages/azure/src/__fixtures__/azure-web-search-preview-tool.1.json`
* Added tests for `generateText` and `streamText` in `examples/ai-core`. `generateText` correctly includes the `url_citation` annotation, while `streamText` currently does not output it. This is expected to be resolved once PR #10253 is merged.
  `examples/ai-core/src/generate-text/azure-responses-web-search-preview.ts`
  `examples/ai-core/src/stream-text/azure-responses-web-search-preview.ts`
* Added a working Next.js example demonstrating web search in `examples/next-openai`.
  `http://localhost:3000/test-azure-web-search-preview`
  `examples/next-openai/app/test-azure-web-search-preview/page.tsx`

<img width="908" height="1032" alt="image" src="https://github.com/user-attachments/assets/b51bf4b8-ffb2-4106-9859-a3b017128aae" />

## Future Work
Since `streamText` does not currently output annotations, we need to confirm that this is resolved once PR #10253 is merged.

## Related Issues
#10253

Co-authored-by: tsuzaki430 <[email protected]>
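For orientation only, here is a minimal sketch of how the web search preview tool could be exercised against the Azure Responses API after this change. The deployment name and prompt are placeholders, and the tool is taken from `openai.tools.webSearchPreview` (the OpenAI implementation that the Azure provider routes through); exact export names in `@ai-sdk/azure` may differ.

```ts
// Minimal sketch, not from this PR: stream a web-search-assisted answer
// through the Azure Responses API. Deployment name and prompt are placeholders.
import { azure } from '@ai-sdk/azure';
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

async function main() {
  const result = streamText({
    model: azure.responses('gpt-4o'), // placeholder deployment name
    tools: {
      // routed through the OpenAI provider's web_search_preview implementation
      web_search_preview: openai.tools.webSearchPreview({}),
    },
    prompt: "Summarize this week's AI announcements with sources.",
  });

  for await (const textPart of result.textStream) {
    process.stdout.write(textPart);
  }
}

main().catch(console.error);
```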
gr2m
left a comment
Thank you Azure team 💙
    openai: {
      itemId: value.item.id,
      ...(ongoingAnnotations.length > 0 && {
        annotations: ongoingAnnotations,
    - annotations: ongoingAnnotations,
    + annotations: [...ongoingAnnotations],
The ongoingAnnotations array is passed by reference (not copied) when creating the text-end event. This causes a bug where if another message arrives and clears the array before the consumer processes the text-end event, the annotations will appear empty.
Analysis
Array reference causes annotations to be lost in text-end event
What fails: In OpenAIResponsesLanguageModel.doStream(), when multiple messages are streamed, annotations from the first message become empty in the text-end event if a second message arrives before the event is consumed.
How to reproduce:
Stream a request with multiple messages where the first message contains annotations. The second message will arrive and clear the `ongoingAnnotations` array before the first message's `text-end` event is fully processed, causing the array reference to point to an empty array.
This occurs in the TransformStream's synchronous transform() method where:
1. Message 1's text-end event is created with direct reference to ongoingAnnotations (line 1474)
2. Message 2 arrives and immediately clears the array with splice() (line 984)
3. The queued text-end event now references an empty array
Result: The text-end event's providerMetadata.openai.annotations field is empty even though annotations were present in the stream.
Expected: Each text-end event should contain the annotations that were present for that specific message, regardless of when subsequent messages arrive.
Fix: Create a copy of the array when passing it to the event using the spread operator: annotations: [...ongoingAnnotations] instead of annotations: ongoingAnnotations (line 1474 of openai-responses-language-model.ts)
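To make the failure mode concrete, here is a small self-contained sketch (illustrative only, with a hypothetical Annotation type; not the provider source). A queued event that holds a reference to the live array sees later mutations, while a spread copy taken at enqueue time does not:

```ts
// Illustrative sketch of the reference-vs-copy issue (not the provider code).
type Annotation = { type: string; start_index: number; end_index: number };

const ongoingAnnotations: Annotation[] = [];
const queuedEvents: Array<{ type: 'text-end'; annotations: Annotation[] }> = [];

// Message 1: an annotation arrives, then its text-end event is queued.
ongoingAnnotations.push({ type: 'url_citation', start_index: 0, end_index: 5 });

// Buggy: the event stores a reference to the live array.
queuedEvents.push({ type: 'text-end', annotations: ongoingAnnotations });
// Fixed: the event stores a snapshot taken at enqueue time.
queuedEvents.push({ type: 'text-end', annotations: [...ongoingAnnotations] });

// Message 2 starts and clears the accumulator before the consumer runs.
ongoingAnnotations.splice(0, ongoingAnnotations.length);

console.log(queuedEvents[0].annotations.length); // 0 (annotations lost)
console.log(queuedEvents[1].annotations.length); // 1 (snapshot preserved)
```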
…s API to doStream (#10419)

This is an automated backport of #10253 to the release-v5.0 branch.

Co-authored-by: tsuzaki430 <[email protected]>
Co-authored-by: Gregor Martynus <[email protected]>
    > = {};

    // set annotations in 'text-end' part providerMetadata.
    const ongoingAnnotations: Array<
this is potentially buggy. it needs to be indexed by the text part id to avoid conflicts. this may not be an issue now but could lead to bugs in the future.
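A sketch of the indexing this comment suggests (illustrative, with a hypothetical Annotation type and item ids; not the provider source): keep one annotation buffer per text part id so that concurrent or interleaved parts cannot clobber each other.

```ts
// Illustrative sketch: one annotation buffer per output item / text part id
// instead of a single shared array (hypothetical types, not the provider source).
type Annotation = { type: string; start_index: number; end_index: number };

const annotationsByItemId = new Map<string, Annotation[]>();

function addAnnotation(itemId: string, annotation: Annotation): void {
  const list = annotationsByItemId.get(itemId) ?? [];
  list.push(annotation);
  annotationsByItemId.set(itemId, list);
}

function takeAnnotations(itemId: string): Annotation[] {
  // Return a snapshot for the text-end event and drop the buffer,
  // so annotations from one part can never leak into another.
  const list = annotationsByItemId.get(itemId) ?? [];
  annotationsByItemId.delete(itemId);
  return [...list];
}
```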
      }
    } else if (
      isResponseOutputItemDoneChunk(value) &&
      value.item.type === 'message'
reverted the if-else split in #10586
How are file annotations related to the text parts? I feel this PR is potentially problematic because it sets a precedent of adding unrelated information to text parts, and that information becomes part of the stored history, so we have to maintain backwards compatibility forever. I would have preferred to wait until we properly implement citation support.
## Background
#10253 split the if-else statement unnecessarily.
## Summary
Revert the if-else split.
## Verification
Refactoring. Unit tests passing without changes.

Background

In OpenAI, after obtaining a `file_id` through Code Interpreter, the results included in the `text` of a `TextUIPart` depend on the attached annotations. Because of this, we needed to adjust the implementation so that annotations could be properly attached at the end of the `doStream` process.

Summary

We have now made it possible for `doStream`, just like `doGenerate`, to set Responses API annotations into `providerMetadata`. By using an `ongoingAnnotations` variable in `doStream`, we accumulate the annotation array until processing is completed at the final `text-end` event. As a result, in next-openai, `useChat` can utilize both annotations and Streamdown's Markdown transformation to produce clickable text content. This allows us to significantly improve the user experience.
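As a rough illustration of that accumulation (a minimal sketch with hypothetical chunk and part shapes, not the actual code in openai-responses-language-model.ts): annotations observed while streaming are buffered and then attached to the `text-end` part's `providerMetadata`, with a copy taken so that resetting the buffer for the next message cannot empty an already-enqueued event.

```ts
// Simplified sketch of the accumulation idea (hypothetical chunk/part shapes,
// not the actual provider implementation).
type Annotation = { type: string; start_index: number; end_index: number };
type ResponsesChunk =
  | { type: 'output_text.annotation.added'; annotation: Annotation }
  | { type: 'output_text.delta'; delta: string }
  | { type: 'output_item.done'; itemId: string };

type StreamPart =
  | { type: 'text-delta'; text: string }
  | {
      type: 'text-end';
      providerMetadata: { openai: { itemId: string; annotations?: Annotation[] } };
    };

const ongoingAnnotations: Annotation[] = [];

const transform = new TransformStream<ResponsesChunk, StreamPart>({
  transform(chunk, controller) {
    if (chunk.type === 'output_text.annotation.added') {
      ongoingAnnotations.push(chunk.annotation); // buffer until the text part ends
    } else if (chunk.type === 'output_text.delta') {
      controller.enqueue({ type: 'text-delta', text: chunk.delta });
    } else {
      controller.enqueue({
        type: 'text-end',
        providerMetadata: {
          openai: {
            itemId: chunk.itemId,
            // copy, so clearing the buffer for the next message cannot empty
            // an already-enqueued event (see the review discussion above)
            ...(ongoingAnnotations.length > 0 && {
              annotations: [...ongoingAnnotations],
            }),
          },
        },
      });
      ongoingAnnotations.length = 0; // reset for the next message
    }
  },
});
```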
Below is a comparison of plain text vs. markdown rendering (Streamdown).

When the `container_file_citation` annotation is applied, users can click to download the file directly, improving both functionality and UX.

Manual Verification
In our CI/CD tests, we verified that annotations added via doStream were correctly applied as expected, and updated the implementation accordingly.
`packages/openai/src/responses/openai-responses-language-model.test.ts`
`packages/azure/src/azure-openai-provider.test.ts`

After running the Code Interpreter via the `streamText` function, we confirmed that the file's `container_file_citation` annotation was successfully received.

`examples/ai-core/src/stream-text/openai-responses-code-interpreter.ts`
`examples/ai-core/src/stream-text/azure-responses-code-interpreter.ts`

We updated the Next.js sample so that the result returned by `useChat` can embed file-download links directly into the text output.

`http://localhost:3000/test-openai-code-interpreter`

Checklist

Added a changeset (run `pnpm changeset` in the project root)

Future Work
#10255
Related Issues
#10255