Description
Basic checks
- [x] I searched existing issues - this hasn't been reported
- [x] I can reproduce this consistently
- [x] This is a RubyLLM bug, not my application code
What's broken?
Ollama cannot be configured as the sole provider because it depends on the OpenAI provider.

This could be because the Ollama provider extends the OpenAI provider, which in turn requires an API key; see `extend OpenAI` in the Ollama provider and `RubyLLM::Models.fetch_from_providers`.
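The dependency is easiest to see with a toy, self-contained sketch (the modules and names below are stand-ins written for this report, not RubyLLM's actual code): if fetching models iterates every registered provider and each provider validates its own credentials, then configuring only Ollama still trips the OpenAI check.

```ruby
# Stand-in config object; RubyLLM's real configuration has more fields.
Config = Struct.new(:openai_api_key, :ollama_api_base, keyword_init: true)

# Stand-in providers: each refuses to list models unless its own setting is present.
module FakeOpenAI
  def self.list_models(config)
    raise "openai provider is not configured" unless config.openai_api_key
    ["gpt-4o-mini"]
  end
end

module FakeOllama
  def self.list_models(config)
    raise "ollama provider is not configured" unless config.ollama_api_base
    ["llama3"]
  end
end

PROVIDERS = [FakeOpenAI, FakeOllama].freeze

# If the refresh step asks every registered provider for its models...
def fetch_from_providers(config)
  PROVIDERS.flat_map { |provider| provider.list_models(config) }
end

config = Config.new(ollama_api_base: "http://localhost:11434/v1")
fetch_from_providers(config) # ...this raises even though only Ollama is configured
```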
How to reproduce
On a fresh installation:

```ruby
RubyLLM.configure do |config|
  config.ollama_api_base = "http://localhost:11434/v1"
end

RubyLLM.chat.ask
```
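For convenience, the same reproduction as a single standalone script; the `require` line and the prompt string are additions for completeness, everything else is exactly as above:

```ruby
require "ruby_llm"

RubyLLM.configure do |config|
  config.ollama_api_base = "http://localhost:11434/v1"
end

# Expected: a chat response from the local Ollama server.
# Actual:   RubyLLM::ConfigurationError complaining that the openai provider is not configured.
RubyLLM.chat.ask("Hello") # the prompt text is arbitrary
```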
Expected behavior
Chat response.
What actually happened
```
RubyLLM::ConfigurationError: openai provider is not configured. Add this to your initialization: (RubyLLM::ConfigurationError)
```
Environment
- Ruby (3.4.3)
- ruby_llm (1.31)
- Ollama as the sole configured provider