🚀 Describe the new functionality needed
We should implement `openai/v1/vector_stores/{vector_store_id}/files/{file_id}/chunks/{chunk_id}` so that users have feature parity with the existing vector_io precomputed-embeddings feature. Additionally, supporting read operations over `openai/v1/vector_stores/{vector_store_id}/files/{file_id}/chunks` would let the Admin UI render chunks nicely, which will be very useful for debugging and investigation.
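For illustration, here is a rough sketch of what reading chunks through the proposed endpoints could look like from a client. None of this exists yet: the base URL, the OpenAI-style list envelope, and the `id` field on each chunk are assumptions, not a final API shape.

```python
# Hypothetical client-side sketch of the proposed read endpoints.
# The endpoints, response envelope, and field names are assumptions for
# illustration only; they are not implemented yet.
import requests

BASE_URL = "http://localhost:8321"  # assumed local Llama Stack server
vector_store_id = "vs_123"          # placeholder IDs
file_id = "file_456"

# List all chunks for a file in a vector store (proposed endpoint).
resp = requests.get(
    f"{BASE_URL}/openai/v1/vector_stores/{vector_store_id}/files/{file_id}/chunks"
)
resp.raise_for_status()
chunks = resp.json().get("data", [])  # assumed OpenAI-style list envelope

# Retrieve each chunk individually (proposed endpoint).
for chunk in chunks:
    chunk_id = chunk["id"]  # assumed field name
    detail = requests.get(
        f"{BASE_URL}/openai/v1/vector_stores/{vector_store_id}"
        f"/files/{file_id}/chunks/{chunk_id}"
    )
    detail.raise_for_status()
    print(detail.json())
```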
💡 Why is this needed? What if we don't build it?
This is needed to support the ingestion of precomputed embeddings, similar to what VectorIO offers today.
See this example in use at Red Hat: https://github.com/opendatahub-io/rag/tree/main/demos/kfp/docling/pdf-conversion
I enabled the ingestion of precomputed embeddings in #2317, and a number of our customers have used it via VectorIO.insert(). This proposal would give us feature parity while staying consistent with OpenAI's naming conventions.
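For context, here is a rough sketch of how precomputed embeddings are ingested through vector_io today. It assumes the llama-stack-client Python SDK and that each chunk may carry an optional `embedding` field alongside its content and metadata (as enabled in #2317); treat the exact parameter and field names as assumptions rather than the definitive API.

```python
# Rough sketch of the existing ingestion path via vector_io with precomputed
# embeddings. Parameter and field names are assumptions based on the
# llama-stack-client SDK; verify against the current API before relying on them.
from llama_stack_client import LlamaStackClient

client = LlamaStackClient(base_url="http://localhost:8321")  # assumed server

client.vector_io.insert(
    vector_db_id="my-vector-db",  # placeholder vector DB id
    chunks=[
        {
            "content": "Docling-extracted text for this chunk...",
            "metadata": {"document_id": "doc-001"},
            # Precomputed embedding supplied by the caller instead of being
            # computed server-side.
            "embedding": [0.12, -0.08, 0.33],  # truncated for illustration
        }
    ],
)
```

The proposed chunk read endpoints would expose this same data back out through the OpenAI-compatible vector store API.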
Other thoughts
N/A