Commit b95426f

gustavocidornelas authored and whoseoyster committed
OpenAI LLM monitor streams data instead of publishing batches
1 parent 9861278 commit b95426f

File tree

1 file changed: +3 additions, -2 deletions

openlayer/llm_monitors.py

Lines changed: 3 additions & 2 deletions

```diff
@@ -316,8 +316,9 @@ def _handle_data_publishing(self) -> None:
         If `publish` is set to True, publish the latest row to Openlayer.
         """
         if self.publish:
-            self.inference_pipeline.publish_batch_data(
-                batch_df=self.df.tail(1), batch_config=self.data_config
+            self.inference_pipeline.stream_data(
+                stream_data=self.df.tail(1).to_dict(orient="records"),
+                stream_config=self.data_config,
             )

     def start_monitoring(self) -> None:
```
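The change above swaps a batch publish (which took a DataFrame) for a streaming call that takes a list of row dicts. A minimal sketch of the behavior, using a hypothetical stub in place of Openlayer's real inference pipeline (`StubInferencePipeline` and the `Monitor` class here are illustrations, not the library's API):

```python
import pandas as pd


class StubInferencePipeline:
    """Hypothetical stand-in for Openlayer's inference pipeline."""

    def __init__(self):
        self.received = []

    def stream_data(self, stream_data, stream_config):
        # Streaming accepts a list of row dicts rather than a DataFrame batch.
        self.received.extend(stream_data)


class Monitor:
    """Sketch of the monitor's publishing step after this commit."""

    def __init__(self, publish=True):
        self.publish = publish
        self.df = pd.DataFrame(columns=["prompt", "completion"])
        self.data_config = {}  # placeholder config
        self.inference_pipeline = StubInferencePipeline()

    def _handle_data_publishing(self):
        if self.publish:
            # Stream only the latest row, converted from a one-row
            # DataFrame into a list of record dicts.
            self.inference_pipeline.stream_data(
                stream_data=self.df.tail(1).to_dict(orient="records"),
                stream_config=self.data_config,
            )


monitor = Monitor()
monitor.df = pd.DataFrame(
    {"prompt": ["hi", "bye"], "completion": ["hello", "goodbye"]}
)
monitor._handle_data_publishing()
print(monitor.inference_pipeline.received)
# [{'prompt': 'bye', 'completion': 'goodbye'}]
```

Note that `DataFrame.tail(1).to_dict(orient="records")` yields a one-element list of dicts, which is why `stream_data` receives per-row records instead of a DataFrame.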
