Why Isn’t My LlamaIndex Tool Streaming Output with streamText() from the AI SDK? #2228
Unanswered · nikolailehbrink asked this question in Q&A
I’m using the following setup to integrate `@llamaindex/vercel` with the AI SDK.
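Roughly, it looks like this (simplified; the model, prompt, and tool description are placeholders rather than my exact code):

```ts
import { openai } from "@ai-sdk/openai";
import { llamaindex } from "@llamaindex/vercel";
import { streamText } from "ai";

// `index` is an existing LlamaIndex index over my documents
// (e.g. a VectorStoreIndex); setup details omitted here.
const result = streamText({
  model: openai("gpt-4o"),
  prompt: "What are Nikolais hobbies?",
  tools: {
    queryTool: llamaindex({
      index,
      description: "Query my personal documents",
      // options: { fields: [...] }, // the option referenced below; commented out by default
    }),
  },
});

// Stream the response text to stdout as it arrives.
for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}
```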
The tool returns the following result:

```js
..., {
  type: 'tool-result',
  toolCallId: 'call_example123',
  toolName: 'queryTool',
  input: { query: 'Nikolais hobbies' },
  providerExecuted: undefined,
  providerMetadata: { openai: { itemId: 'example...' } },
  output: "Nikolai's hobbies include... and a lot of other text."
}
```

The issue:
The tool output isn’t streamed; it only appears once the generation is fully complete.
When I uncomment the `options.fields` array, the response becomes a structured object instead of plain text, and I noticed one of its properties is `stream: false`. So the question is: how can I make the LlamaIndex tool stream its output continuously rather than returning it all at once?
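For comparison, this is roughly what streaming looks like when querying the index directly (a sketch assuming a plain LlamaIndex.TS query engine, following its streaming examples), and what I’d like the tool to do internally:

```ts
// Sketch: streaming from a LlamaIndex.TS query engine directly,
// assuming `index` is the same index used by the tool above.
const queryEngine = index.asQueryEngine();

const stream = await queryEngine.query({
  query: "Nikolais hobbies",
  stream: true, // the flag that appears to be `false` inside the tool
});

for await (const chunk of stream) {
  // Assumes each chunk exposes its text on `.response`,
  // as in the LlamaIndex.TS streaming examples.
  process.stdout.write(chunk.response);
}
```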