Context
Currently, the `pkg/llm` package works natively with OpenAI-compatible APIs. This covers OpenAI, Ollama, LocalAI, vLLM, and Groq.
I would like to expand native support to include other major providers that use different API schemas, specifically Anthropic (Claude) and Google Gemini (Vertex AI).
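For context, OpenAI-compatible providers can share a single client because they differ essentially only in base URL and API key. A minimal sketch (the `chatCompletionsURL` helper is hypothetical, not from `pkg/llm`; the endpoints are the providers' documented OpenAI-compatible defaults):

```go
package main

import "fmt"

// chatCompletionsURL is a hypothetical helper illustrating why one client
// covers all OpenAI-compatible providers: the request path is identical.
func chatCompletionsURL(baseURL string) string {
	return baseURL + "/v1/chat/completions"
}

func main() {
	// Documented default base URLs for a few OpenAI-compatible providers.
	bases := map[string]string{
		"openai": "https://api.openai.com",
		"ollama": "http://localhost:11434",
		"groq":   "https://api.groq.com/openai",
	}
	for name, base := range bases {
		fmt.Printf("%s -> %s\n", name, chatCompletionsURL(base))
	}
}
```

Anthropic and Gemini break this assumption: they use different request/response schemas and auth headers, hence the need for dedicated adapters.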
The Task
This issue seeks a contributor to implement specific client adapters for these providers.
- Create Adapters: Add new client implementations in `pkg/llm/` (e.g., `claude_client.go`, `gemini_client.go`).
- Update Interface: Ensure they implement the generic `Client` interface used by the Proxy and Vectorizer.
- Config: Update `pkg/llm/config.go` to handle provider-specific settings if necessary.
Note for Contributors
I currently do not have access to paid API keys for Claude or Gemini to verify the integration.
I am looking for a contributor who uses these services and can implement the adapters and verify that the integration works correctly with the RAG pipeline.