Conversation

@bai-uipath bai-uipath commented Oct 17, 2025

This PR adds a file-based caching system for both LLM and input mocker responses used in evals. The hash of the prompt and model parameters is used as the cache key, and the LLM response is the cached value, stored in a hierarchical folder structure under .uipath/eval_cache.
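A minimal sketch of the caching scheme described above. The exact hash algorithm, file layout, and function names here are assumptions for illustration, not the PR's actual implementation; only the key derivation (hash of prompt + model parameters) and the cache root (`.uipath/eval_cache`) come from the description.

```python
import hashlib
import json
from pathlib import Path

# Cache root named in the PR description.
CACHE_ROOT = Path(".uipath/eval_cache")


def cache_key(prompt: str, model_params: dict) -> str:
    """Derive a stable key from the prompt and model parameters.

    sort_keys makes the JSON canonical so the same params always hash alike.
    """
    payload = json.dumps({"prompt": prompt, "params": model_params}, sort_keys=True)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()


def cache_path(key: str) -> Path:
    # Hypothetical hierarchical layout: first two hex chars of the key
    # become a subfolder, avoiding one huge flat directory.
    return CACHE_ROOT / key[:2] / f"{key}.json"


def get_cached_response(prompt: str, model_params: dict):
    """Return the cached response, or None on a cache miss."""
    path = cache_path(cache_key(prompt, model_params))
    if path.exists():
        return json.loads(path.read_text())
    return None


def store_response(prompt: str, model_params: dict, response: dict) -> None:
    """Write the response under its content-addressed path."""
    path = cache_path(cache_key(prompt, model_params))
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(response))
```

With this shape, a mock LLM call first checks `get_cached_response` and only hits the real model (or recorder) on a miss, then persists the result with `store_response`.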

@github-actions github-actions bot added test:uipath-langchain Triggers tests in the uipath-langchain-python repository test:uipath-llamaindex Triggers tests in the uipath-llamaindex-python repository labels Oct 17, 2025
@bai-uipath bai-uipath force-pushed the bai/caching-for-mocks branch from 2d3c169 to abcebb4 on October 17, 2025 at 20:46
@bai-uipath bai-uipath marked this pull request as ready for review October 17, 2025 20:46
@bai-uipath bai-uipath requested a review from akshaylive October 17, 2025 20:49
@bai-uipath bai-uipath force-pushed the bai/caching-for-mocks branch from abcebb4 to 2a843b4 on October 17, 2025 at 21:00
@bai-uipath bai-uipath force-pushed the bai/caching-for-mocks branch from 33a6e6b to 16ceb62 on October 20, 2025 at 20:29