feat: Add support for Anthropic (Claude) and Google Gemini providers #1

@sanonone

Description

Context

Currently, the pkg/llm package works natively with OpenAI-compatible APIs. This covers OpenAI, Ollama, LocalAI, vLLM, and Groq.

I would like to expand native support to include other major providers that use different API schemas, specifically Anthropic (Claude) and Google Gemini (Vertex AI).

The Task

This issue seeks a contributor to implement dedicated client adapters for these providers.

  1. Create Adapters: Add new client implementations in pkg/llm/ (e.g., claude_client.go, gemini_client.go).
  2. Update Interface: Ensure they implement the generic Client interface used by the Proxy and Vectorizer.
  3. Config: Update pkg/llm/config.go to handle provider-specific settings if necessary.

Note for Contributors

I currently do not have access to paid API keys for Claude or Gemini, so I cannot verify the integration myself.
I am looking for a contributor who already uses these services and can implement and verify that the adapters work correctly with the RAG pipeline.
