Does RAG not work for the Anthropic provider? #1449
Hi, I’m trying to implement a simple RAG setup with .context() and .dynamic_context(), using the example code here: https://book.rig.rs/playbook/rag.html#using-rag-with-agents I use OpenAI for the embeddings and the vector store. When I use OpenAI for the agent, it works, but when I use Anthropic for the agent, none of the documents get included, neither static nor dynamic. I also implemented my own vector store index to confirm that top_n does in fact get called and returns data, but the results never get included in the context when the request is made. Is this expected, is it a known issue, or am I doing something wrong?
Replies: 1 comment
Ah. It seems the issue isn't actually the Anthropic provider itself, but the model I was using: MiniMax's Anthropic-compatible endpoint doesn't appear to support documents in multi-part messages: https://platform.minimax.io/docs/api-reference/text-anthropic-api#supported-parameters
So this has nothing to do with Rig 😅 (In case anyone else is trying to get this to work, using the OpenAI chat completions API does work.)