Commit 1bb33a2

Apply suggestions from code review
Signed-off-by: Nathan Bower <[email protected]>
1 parent eaf158a · commit 1bb33a2

File tree

1 file changed (+11, -11)

_posts/2025-10-16-Introducing-Real-Time-Streaming-for-AI-Models-and-Agents-in-OpenSearch.md

Lines changed: 11 additions & 11 deletions
@@ -155,7 +155,7 @@ POST /_plugins/_ml/models/yFT0m5kB-SbOBOkMDNIa/_predict/stream
 
 #### Sample response
 
-The streaming format uses Server-Sent Events (SSE), with each chunk containing a portion of the model's response. Each data line represents a separate chunk transmitted in real-time as the model generates output:
+The streaming format uses Server-Sent Events (SSE), with each chunk containing a portion of the model's response. Each data line represents a separate chunk transmitted in real time as the model generates output:
 
 ```json
 data: {"inference_results":[{"output":[{"name":"response","dataAsMap":{"content":"#","is_last":false}}]}]}
@@ -175,8 +175,8 @@ data: {"inference_results":[{"output":[{"name":"response","dataAsMap":{"content"
 
 Each chunk has the following key elements:
 
-* `content` - The text fragment generated in this chunk (for example, a word, or phrase)
-* `is_last` - A Boolean flag indicating whether this is the final chunk (`false` for intermediate chunks, `true` for the last one)
+* `content` -- The text fragment generated in this chunk (for example, a word, or phrase).
+* `is_last` -- A Boolean flag indicating whether this is the final chunk (`false` for intermediate chunks, `true` for the last one).
 
 ### Step 2: Set up agent streaming
 
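In the passage changed above, each SSE `data:` line carries a `content` fragment and an `is_last` flag. As a minimal client-side sketch (not taken from the post: the host, security settings, and request payload are assumptions, and the model ID is copied from the hunk header), the Predict Stream response could be consumed like this in Python:

```python
import json

import requests

# Assumptions: a local cluster with security disabled and a placeholder prompt payload.
# The model ID matches the endpoint shown in the hunk header above.
URL = "http://localhost:9200/_plugins/_ml/models/yFT0m5kB-SbOBOkMDNIa/_predict/stream"
BODY = {"parameters": {"prompt": "Explain streaming in one sentence."}}  # placeholder

with requests.post(URL, json=BODY, stream=True) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines(decode_unicode=True):
        if not line or not line.startswith("data: "):
            continue  # skip blank separators between SSE events
        chunk = json.loads(line[len("data: "):])
        data = chunk["inference_results"][0]["output"][0]["dataAsMap"]
        print(data["content"], end="", flush=True)  # append each fragment as it arrives
        if data["is_last"]:
            break  # final chunk received
```
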
@@ -291,7 +291,7 @@ POST /_plugins/_ml/agents/37YmxZkBphfsuvK7qIj4/_execute/stream
 
 #### Sample response
 
-The streaming format uses SSE, with each chunk containing a portion of the agent's response. Each data line represents a separate chunk transmitted in real-time as the agent generates output.
+The streaming format uses SSE, with each chunk containing a portion of the agent's response. Each data line represents a separate chunk transmitted in real time as the agent generates output:
 
 ```json
 data: {"inference_results":[{"output":[{"name":"memory_id","result":"LvU1iJkBCzHrriq5hXbN"},{"name":"parent_interaction_id","result":"L_U1iJkBCzHrriq5hXbs"},{"name":"response","dataAsMap":{"content":"[{\"index\":0.0,\"id\":\"call_HjpbrbdQFHK0omPYa6m2DCot\",\"type\":\"function\",\"function\":{\"name\":\"RetrieveIndexMetaTool\",\"arguments\":\"\"}}]","is_last":false}}]}]}
@@ -327,10 +327,10 @@ data: {"inference_results":[{"output":[{"name":"memory_id","result":"LvU1iJkBCzH
 
 Each chunk has the following key elements:
 
-* `content` - The text or data fragment generated in this chunk (for example, a word or phrase).
-* `is_last` - A Boolean flag indicating whether this is the final chunk (`false` for intermediate chunks, `true` for the last one).
-* `memory_id` - A unique identifier for the conversation memory session.
-* `parent_interaction_id` - An identifier linking related interactions in the conversation.
+* `content` -- The text or data fragment generated in this chunk (for example, a word or phrase).
+* `is_last` -- A Boolean flag indicating whether this is the final chunk (`false` for intermediate chunks, `true` for the last one).
+* `memory_id` -- A unique identifier for the conversation memory session.
+* `parent_interaction_id` -- An identifier linking related interactions in the conversation.
 
 ## Conclusion
 
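Because agent chunks also carry `memory_id` and `parent_interaction_id`, a client can record the conversation identifiers while streaming the text. A similar sketch (again an assumption-laden illustration: host, security settings, and question payload are placeholders, and the agent ID is copied from the hunk header) might look like this:

```python
import json

import requests

# Assumptions: a local cluster with security disabled and a placeholder question payload.
# The agent ID matches the endpoint shown in the hunk header above.
URL = "http://localhost:9200/_plugins/_ml/agents/37YmxZkBphfsuvK7qIj4/_execute/stream"
BODY = {"parameters": {"question": "How many indexes are in my cluster?"}}  # placeholder

memory_id = None
with requests.post(URL, json=BODY, stream=True) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines(decode_unicode=True):
        if not line or not line.startswith("data: "):
            continue  # skip blank separators between SSE events
        outputs = json.loads(line[len("data: "):])["inference_results"][0]["output"]
        by_name = {entry["name"]: entry for entry in outputs}
        if memory_id is None and "memory_id" in by_name:
            memory_id = by_name["memory_id"]["result"]  # conversation memory session ID
        response = by_name.get("response", {}).get("dataAsMap", {})
        print(response.get("content", ""), end="", flush=True)
        if response.get("is_last"):
            break  # final chunk received

print(f"\nConversation memory: {memory_id}")
```
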
@@ -340,6 +340,6 @@ Streaming capabilities in OpenSearch represent a significant step forward in del
 
 ## What's next?
 
-* Explore the official documentation for [Predict Stream](https://docs.opensearch.org/latest/ml-commons-plugin/api/train-predict/predict-stream/) and [Execute Stream Agent](https://docs.opensearch.org/latest/ml-commons-plugin/api/agent-apis/execute-stream-agent/) API references
-* Share your feedback on [OpenSearch forum](https://forum.opensearch.org/)
-* Stay tuned for updates as streaming support expands in future releases
+* Explore the [Predict Stream](https://docs.opensearch.org/latest/ml-commons-plugin/api/train-predict/predict-stream/) and [Execute Stream Agent](https://docs.opensearch.org/latest/ml-commons-plugin/api/agent-apis/execute-stream-agent/) API references.
+* Share your feedback on the [OpenSearch forum](https://forum.opensearch.org/).
+* Stay tuned for updates as streaming support expands in future releases.
