feat: Updating files/content response to return additional fields #3054
Conversation
Force-pushed from 251c900 to 552b14b
Signed-off-by: Francisco Javier Arceo <[email protected]>
Force-pushed from a67efca to a151f59
@ashwinb any chance you can take a look at this PR? Do you have any info on running the
```python
assert len(file_contents.content) > 0

for content_item in file_contents.content:
    if isinstance(compat_client, LlamaStackClient):
```
This is where we are incompatible with OpenAI, so the test currently handles it here.
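One way a test can absorb this incompatibility is to normalize both response shapes to plain dicts before asserting. The sketch below is illustrative only: the `FakeVectorStoreContent` dataclass and its fields are hypothetical stand-ins, not the real llama-stack schema.

```python
from dataclasses import dataclass, asdict
from typing import Any


# Hypothetical stand-in for the typed content model a LlamaStackClient
# response might carry; the real model and its fields differ.
@dataclass
class FakeVectorStoreContent:
    type: str
    text: str


def as_dict(item: Any) -> dict[str, Any]:
    """Coerce a content item to a plain dict so the same assertions work
    whether the client returned typed objects or OpenAI-style dicts."""
    return item if isinstance(item, dict) else asdict(item)


# Both shapes normalize to the same dict, so one assertion path suffices.
typed_item = FakeVectorStoreContent(type="text", text="hello")
dict_item = {"type": "text", "text": "hello"}
assert as_dict(typed_item) == as_dict(dict_item)
```

This keeps the test body free of `isinstance` branches at every assertion site.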
What does this PR do?

- Updates the `VectorStoreContent` objects returned by `openai/v1/vector_stores/{vector_store_id}/files/{file_id}/content` to include additional fields.
- Note: this also returns the `VectorStoreContent` class in `VectorStoreFileContentsResponse`, which is inconsistent with OpenAI's client, which just returns `dict[str, Any]`. I will provide a follow-up PR to fix that, and we'll have to update the client in the next client release.
- Updates `.github/actions/setup-test-environment/action.yml` to set the OLLAMA_URL during replay mode, which was causing failures in the new integration test.

Test Plan

Added unit test and integration test.
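The PR description mentions updating `.github/actions/setup-test-environment/action.yml` to set OLLAMA_URL during replay mode. A hedged sketch of what such a composite-action step could look like is below; the `inference-mode` input name, the step layout, and the default Ollama port are assumptions, and the real file differs.

```yaml
# Hypothetical fragment of a composite action; input names and
# step structure are illustrative, not the actual action.yml.
runs:
  using: "composite"
  steps:
    - name: Set OLLAMA_URL for replay mode
      if: ${{ inputs.inference-mode == 'replay' }}
      shell: bash
      run: echo "OLLAMA_URL=http://localhost:11434" >> "$GITHUB_ENV"
```

Writing to `$GITHUB_ENV` makes the variable visible to subsequent steps in the calling workflow, which is the usual way a setup action exports environment configuration.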