
Conversation

@tsuzaki430 tsuzaki430 commented Nov 15, 2025

Background

In the OpenAI Responses API, after obtaining a `file_id` through Code Interpreter, the results included in the text of a `TextUIPart` depend on the attached annotations.
Because of this, we needed to adjust the implementation so that annotations are properly attached at the end of the `doStream` process.

Summary

`doStream` can now, just like `doGenerate`, set Responses API annotations in `providerMetadata`.

Using an `ongoingAnnotations` variable in `doStream`, we accumulate the annotation array until processing completes at the final `text-end` event.

As a result, in `next-openai`, `useChat` can combine the annotations with Streamdown's Markdown rendering to produce clickable text content, significantly improving the user experience.
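The accumulation pattern described above can be sketched as follows. This is a simplified illustration: the `Annotation` and `StreamPart` types and the event shapes are stand-ins for demonstration, not the SDK's actual internals.

```typescript
// Hypothetical sketch: annotation events are collected while a text part
// streams, then attached to providerMetadata on the 'text-end' event.
type Annotation = {
  type: string;
  file_id?: string;
  start_index: number;
  end_index: number;
};

type StreamPart =
  | { type: 'annotation'; annotation: Annotation }
  | { type: 'text-delta'; delta: string }
  | {
      type: 'text-end';
      providerMetadata?: { openai: { annotations?: Annotation[] } };
    };

function attachAnnotations(parts: StreamPart[]): StreamPart[] {
  const ongoingAnnotations: Annotation[] = [];
  return parts.map(part => {
    if (part.type === 'annotation') {
      // accumulate until the current text part ends
      ongoingAnnotations.push(part.annotation);
      return part;
    }
    if (part.type === 'text-end') {
      const out: StreamPart = {
        type: 'text-end',
        providerMetadata: {
          openai: {
            // copy the array so later mutations don't affect this event
            ...(ongoingAnnotations.length > 0 && {
              annotations: [...ongoingAnnotations],
            }),
          },
        },
      };
      ongoingAnnotations.length = 0; // reset for the next text part
      return out;
    }
    return part;
  });
}
```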


Below is a comparison of plain-text vs. Markdown rendering (Streamdown).
When the container_file_citation annotation is applied, users can click to download the file directly, improving both functionality and UX.
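As an illustration of how such annotations can become clickable links, here is a hypothetical helper that rewrites the cited character spans as Markdown links, which a renderer such as Streamdown then displays as clickable downloads. The download route and field names are assumptions for demonstration, not the example app's actual code.

```typescript
// Hypothetical shape of a Code Interpreter file citation annotation.
type FileCitation = {
  type: 'container_file_citation';
  container_id: string;
  file_id: string;
  filename: string;
  start_index: number;
  end_index: number;
};

// Rewrite each cited span [start_index, end_index) as a Markdown link.
function linkifyCitations(text: string, citations: FileCitation[]): string {
  // apply from the end so earlier indices stay valid after replacement
  return [...citations]
    .sort((a, b) => b.start_index - a.start_index)
    .reduce((acc, c) => {
      const label = acc.slice(c.start_index, c.end_index);
      // hypothetical download route serving the container file
      const url = `/api/files/${c.container_id}/${c.file_id}`;
      return (
        acc.slice(0, c.start_index) +
        `[${label}](${url})` +
        acc.slice(c.end_index)
      );
    }, text);
}
```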

Manual Verification

  • In CI tests, we verified that annotations added via `doStream` are applied as expected, and updated the following test files:
    packages/openai/src/responses/openai-responses-language-model.test.ts
    packages/azure/src/azure-openai-provider.test.ts

  • After running Code Interpreter via the `streamText` function, we confirmed that the file-container_file_citation annotation was successfully received:
    examples/ai-core/src/stream-text/openai-responses-code-interpreter.ts
    examples/ai-core/src/stream-text/azure-responses-code-interpreter.ts

  • We updated the Next.js sample so that the result returned by `useChat` embeds file-download links directly in the text output:
    http://localhost:3000/test-openai-code-interpreter

Checklist

  • Tests have been added / updated (for bug fixes / features)
  • Documentation has been added / updated (for bug fixes / features)
  • A patch changeset for relevant packages has been added (for bug fixes / features - run pnpm changeset in the project root)
  • I have reviewed this pull request (self-review)

Future Work

#10255

Related Issues

#10255

@tsuzaki430 tsuzaki430 marked this pull request as ready for review November 15, 2025 12:25
Co-authored-by: vercel[bot] <35613825+vercel[bot]@users.noreply.github.com>
@tsuzaki430
Collaborator Author

I have reviewed this pull request (self-review)

@jephal
Collaborator

jephal commented Nov 17, 2025

@tsuzaki430 I can confirm this works well for Azure. I tested the script in azure-responses-code-interpreter.ts.

Merging this would allow Streamdown rendering of Code Interpreter download links during streaming, as discussed in the community session (with the start and end index). @gr2m

@tsuzaki430
Collaborator Author

@jephal

Thank you for testing and sharing the result.
Yes, this feature provides the URL link via useChat and Streamdown 😊

@rahulbhadja
Collaborator

I reviewed this as well. Great work, @tsuzaki430!

gr2m pushed a commit that referenced this pull request Nov 20, 2025
…0370)

## Background
Web Search Preview has become available on Microsoft Azure OpenAI.
We enabled the use of `web_search_preview` from `@ai-sdk/openai` within
`@ai-sdk/azure`.

[https://learn.microsoft.com/en-us/azure/ai-foundry/openai/how-to/web-search?view=foundry-classic](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/how-to/web-search?view=foundry-classic)

## Summary

Using Next.js, it is now possible to perform web searches with both
`useChat` and `streamText`.

* We enabled Azure's `webSearchPreview` tool by routing through the OpenAI
internal implementation.
* In `doGenerate`, the Zod type definition for `url_citation` was
updated to support annotations.
* In `doStream`, we filled in missing handling when switching between
`web_search` and `web_search_preview`.
* Added a description of `web_search_preview` to the Azure provider documentation page.


## Manual Verification
* Added verification for `web_search_preview` in Azure’s CI/CD tests.

  `packages/azure/src/__fixtures__/azure-web-search-preview-tool.1.chunks.txt`
  `packages/azure/src/__fixtures__/azure-web-search-preview-tool.1.json`
* Added tests for `generateText` and `streamText` in `examples/ai-core`.
`generateText` correctly includes the `url_citation` annotation, while
`streamText` currently does not output it.
  This is expected to be resolved once PR #10253 is merged.

`examples/ai-core/src/generate-text/azure-responses-web-search-preview.ts`

`examples/ai-core/src/stream-text/azure-responses-web-search-preview.ts`

* Added a working Next.js example demonstrating web search in
`examples/next-openai`.
  `http://localhost:3000/test-azure-web-search-preview`
  `examples/next-openai/app/test-azure-web-search-preview/page.tsx`
<img width="908" height="1032" alt="image"
src="https://github.com/user-attachments/assets/b51bf4b8-ffb2-4106-9859-a3b017128aae"
/>


## Future Work

Since `streamText` does not currently output annotations, we need to
confirm that this is resolved once PR #10253 is merged.

## Related Issues

#10253

---------

Co-authored-by: tsuzaki430 <[email protected]>

@gr2m gr2m left a comment


Thank you Azure team 💙

@gr2m gr2m enabled auto-merge (squash) November 20, 2025 18:08
```ts
openai: {
  itemId: value.item.id,
  ...(ongoingAnnotations.length > 0 && {
    annotations: ongoingAnnotations,
```
Contributor


Suggested change:

```diff
-    annotations: ongoingAnnotations,
+    annotations: [...ongoingAnnotations],
```

The ongoingAnnotations array is passed by reference (not copied) when creating the text-end event. This causes a bug where if another message arrives and clears the array before the consumer processes the text-end event, the annotations will appear empty.

Analysis

Array reference causes annotations to be lost in text-end event

What fails: In OpenAIResponsesLanguageModel.doStream(), when multiple messages are streamed, annotations from the first message become empty in the text-end event if a second message arrives before the event is consumed.

How to reproduce:

Stream a request with multiple messages where the first message contains annotations. The second message will arrive and clear the `ongoingAnnotations` array before the first message's `text-end` event is fully processed, causing the array reference to point to an empty array.

This occurs in the TransformStream's synchronous transform() method where:
1. Message 1's text-end event is created with direct reference to ongoingAnnotations (line 1474)
2. Message 2 arrives and immediately clears the array with splice() (line 984)
3. The queued text-end event now references an empty array

Result: The text-end event's providerMetadata.openai.annotations field is empty even though annotations were present in the stream.

Expected: Each text-end event should contain the annotations that were present for that specific message, regardless of when subsequent messages arrive.

Fix: Create a copy of the array when passing it to the event using the spread operator: annotations: [...ongoingAnnotations] instead of annotations: ongoingAnnotations (line 1474 of openai-responses-language-model.ts)
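The hazard can be reproduced in isolation. This standalone sketch (independent of the SDK) shows why the spread copy preserves the annotations while the direct reference loses them once the accumulator is cleared:

```typescript
// Shared-reference hazard: an event holding the live array sees it
// emptied by a later splice; a spread copy is unaffected.
const ongoingAnnotations: string[] = ['citation-1'];

const byReference = { annotations: ongoingAnnotations };
const byCopy = { annotations: [...ongoingAnnotations] };

// a later message clears the accumulator before the event is consumed
ongoingAnnotations.splice(0, ongoingAnnotations.length);

console.log(byReference.annotations.length); // 0 — annotations lost
console.log(byCopy.annotations.length); // 1 — annotations preserved
```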

@gr2m gr2m merged commit 423ba08 into vercel:main Nov 20, 2025
17 of 18 checks passed
vercel-ai-sdk bot pushed a commit that referenced this pull request Nov 20, 2025
@vercel-ai-sdk vercel-ai-sdk bot removed the backport label Nov 20, 2025
@vercel-ai-sdk
Contributor

vercel-ai-sdk bot commented Nov 20, 2025

⚠️ Backport to release-v5.0 created but has conflicts: #10419

gr2m added a commit that referenced this pull request Nov 20, 2025
…s API to doStream (#10419)

This is an automated backport of #10253 to the release-v5.0 branch.

---------

Co-authored-by: tsuzaki430 <[email protected]>
Co-authored-by: Gregor Martynus <[email protected]>
@tsuzaki430 tsuzaki430 deleted the tsuz/openai-responses-annotations-dostream branch November 21, 2025 12:14
```ts
> = {};

// set annotations in 'text-end' part providerMetadata.
const ongoingAnnotations: Array<
```
Collaborator


This is potentially buggy: it needs to be indexed by the text part id to avoid conflicts. It may not be an issue now, but it could lead to bugs in the future.
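A minimal sketch of the direction this reviewer suggests, assuming the accumulator is keyed by a text part id. The names and the `Map`-based shape are hypothetical, not the SDK's actual code:

```typescript
// Per-part annotation accumulator: interleaved text parts cannot
// claim each other's annotations because each has its own bucket.
const annotationsByPartId = new Map<string, string[]>();

// Called when an annotation event arrives for a given text part.
function addAnnotation(partId: string, annotation: string): void {
  const list = annotationsByPartId.get(partId) ?? [];
  list.push(annotation);
  annotationsByPartId.set(partId, list);
}

// Called at that part's 'text-end': returns and consumes its bucket.
function takeAnnotations(partId: string): string[] {
  const list = annotationsByPartId.get(partId) ?? [];
  annotationsByPartId.delete(partId);
  return list;
}
```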

```ts
  }
} else if (
  isResponseOutputItemDoneChunk(value) &&
  value.item.type === 'message'
```
Collaborator


Reverted the if-else split in #10586.

@lgrammel
Collaborator

lgrammel commented Nov 25, 2025

How are file annotations related to the text parts? I feel this PR can be problematic because it sets a precedent of adding unrelated information to text parts, and that information becomes part of the stored history, so we would have to maintain backwards compatibility forever. I would have preferred to wait until we properly implement citation support.

@tsuzaki430
Collaborator Author

@lgrammel

When I originally wrote the code, text-end was executed before the annotations were captured, so I wasn’t able to retrieve them.
With #10586, I understand that this workaround is no longer necessary.
I apologize for the odd implementation and any inconvenience it may have caused.

lgrammel added a commit that referenced this pull request Nov 26, 2025
## Background

#10253 split the if-else statement unnecessarily.

## Summary

Revert the if-else split.

## Verification

Refactoring. Unit tests passing without changes.
