
[BUG] Ollama Wouldn’t Get Configured as the Only Provider #284

@o0nj

Description


Basic checks

  • I searched existing issues - this hasn't been reported
  • I can reproduce this consistently
  • This is a RubyLLM bug, not my application code

What's broken?

Ollama cannot be configured as the sole provider because it depends on the OpenAI provider.

This could be because the Ollama provider extends the OpenAI provider, which in turn requires an API key, at RubyLLM::Models.fetch_from_providers.
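
If it helps triage, the same code path seems reachable directly through a registry refresh. This is a sketch based on my reading of the backtrace (I'm assuming RubyLLM.models.refresh! goes through Models.fetch_from_providers; I haven't verified that in the source):

RubyLLM.configure do |config|
  # Only Ollama is configured; deliberately no OpenAI key.
  config.ollama_api_base = "http://localhost:11434/v1"
end

# If the registry refresh queries every provider regardless of configuration,
# this should raise the same ConfigurationError about the openai provider.
RubyLLM.models.refresh!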

How to reproduce

On a fresh installation,

RubyLLM.configure do |config|
  config.ollama_api_base = "http://localhost:11434/v1"
end

RubyLLM.chat.ask("Hello")
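
For completeness, this is the call shape I would expect for a sole-Ollama setup, with an explicit model and provider. The model name llama3.2 is just a placeholder for whatever has been pulled locally, and assume_model_exists: true is my guess at bypassing the registry lookup; I haven't confirmed this avoids the error:

# Explicitly target the local Ollama provider instead of the default model.
chat = RubyLLM.chat(model: "llama3.2", provider: :ollama, assume_model_exists: true)
chat.ask("Hello")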

Expected behavior

A chat response from the local Ollama model.

What actually happened

RubyLLM::ConfigurationError: openai provider is not configured. Add this to your initialization: (RubyLLM::ConfigurationError)

Environment

  • Ruby (3.4.3)
  • ruby_llm (1.3.1)
  • Ollama as the sole configured provider
