
Fixing Ollama Provider to match api endpoint and response structure as of today #300


Closed
wants to merge 3 commits into from

Conversation

DVG

@DVG DVG commented Jul 27, 2025


What this does

When testing with Ollama and DeepSeek-R1, both the completion URL and the expected response structure in the provider were wrong as of the latest Ollama versions.
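For reference, a minimal sketch of what the current native Ollama chat API looks like, assuming a default local install. The constant and method names here are illustrative, not RubyLLM's actual provider code: Ollama's non-streaming chat endpoint is `POST /api/chat`, and the reply is wrapped in a top-level `"message"` object rather than an OpenAI-style `"choices"` array.

```ruby
require "json"
require "uri"

# Default base URL for a local Ollama server (illustrative constant).
OLLAMA_BASE = "http://localhost:11434"

# Ollama's native chat completion endpoint is POST /api/chat.
def ollama_chat_uri(base = OLLAMA_BASE)
  URI.join(base, "/api/chat")
end

# Non-streaming responses look like:
#   {"model": "...", "message": {"role": "assistant", "content": "..."}, "done": true}
# so the assistant text lives under message.content, not choices[0].message.content.
def parse_ollama_chat_response(body)
  data = JSON.parse(body)
  data.fetch("message").fetch("content")
end
```

A response body like `{"model":"deepseek-r1","message":{"role":"assistant","content":"Hello"},"done":true}` would therefore yield `"Hello"` from `parse_ollama_chat_response`.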

Type of change

  • Bug fix
  • New feature
  • Breaking change
  • Documentation
  • Performance improvement

Scope check

  • I read the Contributing Guide
  • This aligns with RubyLLM's focus on LLM communication
  • This isn't application-specific logic that belongs in user code
  • This benefits most users, not just my specific use case

Quality check

  • I ran overcommit --install and all hooks pass
  • I tested my changes thoroughly
  • I updated documentation if needed
  • I didn't modify auto-generated files manually (models.json, aliases.json)

API changes

  • Breaking change
  • New public methods/classes
  • Changed method signatures
  • No API changes

Related issues

@crmne
Owner

crmne commented Jul 30, 2025

@DVG I downloaded the latest version of Ollama and I couldn't reproduce the error with qwen3. Is it specific to deepseek-r1? Do you have a link to the documentation?

@crmne
Owner

crmne commented Aug 6, 2025

Not necessary.

@crmne crmne closed this Aug 6, 2025

2 participants