feat: add Ollama Cloud as a declarative provider #8189
vincenzopalazzo wants to merge 1 commit into aaif-goose:main
Conversation
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 66bfb71079
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
💡 Codex Review (reviewed commit: 0cc95fd9d8)
Force-pushed 0cc95fd to 1936bd3
💡 Codex Review (reviewed commit: 1936bd3d1b)
💡 Codex Review (reviewed commit: 6f09f43c0b)
On second thought, I think this should be a declarative provider that directly connects to ollama.com, possibly as an OpenAI-compatible provider rather than Ollama. Adding API keys to Ollama itself seems confusing.
DOsinga
left a comment
Should this be a declarative provider instead?
vincenzopalazzo
left a comment
Good call -- fixed in 2e6236d. Ollama Cloud supports the OpenAI-compatible /v1/chat/completions endpoint, so a declarative provider is the right approach. Reverted the auth/URL complexity from ollama.rs and toolshim.rs and added a simple ollama_cloud.json declarative provider with engine: "openai". Much simpler.
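For readers who haven't seen Goose's declarative providers, a file like ollama_cloud.json is roughly this shape. This is a sketch, not the actual file from the PR: only `engine: "openai"`, the `OLLAMA_CLOUD_API_KEY` variable, the `https://ollama.com/v1` base, and model discovery via `/v1/models` come from this thread; the remaining field names are guesses.

```json
{
  "name": "ollama_cloud",
  "display_name": "Ollama Cloud",
  "engine": "openai",
  "base_url": "https://ollama.com/v1",
  "api_key_env": "OLLAMA_CLOUD_API_KEY",
  "models_endpoint": "/v1/models"
}
```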
Force-pushed 1e830d3 to df25df7
Force-pushed 4094c17 to 1f1f583
💡 Codex Review (reviewed commit: 1f1f583403)
michaelneale
left a comment
This looks good to me. @vincenzopalazzo, do you mind updating to main? To keep it clean, it shouldn't need the changes to goose hints or String utils, etc. But I like this.
Could it be as simple as this? #8335 Or am I missing something? Thanks! Ollama Cloud is pretty neat.
Force-pushed ae8d69f to be36210
OK, this is built on top of the review that @michaelneale did (thanks, BTW), and be36210 should be the final version. Please let me know if there is anything else to do!
Add ollama_cloud.json using the OpenAI-compatible engine with dynamic model discovery via /v1/models.

Co-developed-by: Michael Neale <michael@michaelneale.com>
Force-pushed be36210 to 5420bda
OK, so to be clear: this means that if someone wants to use Ollama Cloud, they don't need Ollama desktop installed? (It is so weird to me that Ollama does that, but I get it!)
Yes, correct. I found this pretty cool for beginners who are moving from Claude to something else, or for people who are just starting out and don't want (or can't afford) to use a paid model. Free open-source models hosted via Ollama Cloud are a great on-ramp. This feature actually came out of a conversation with some friends who were using the free tier of Claude and hitting their quota way too fast. For their use case, Goose plus an open-source model was more than enough, but they didn't have the disk space to download local models either. That's when it clicked: Goose should just ship this out of the box.

P.S. Honestly, this use case deserves a blog post too 😄
@DOsinga @michaelneale, your "requesting changes" reviews block merging this. Can you take a look and switch them to "approve" if you're happy with @vincenzopalazzo's new declarative approach?
Summary
- Add a new declarative provider for Ollama Cloud (ollama_cloud.json)
- Dynamic model discovery via /v1/models

Use Case
Run Goose against Ollama Cloud by setting OLLAMA_CLOUD_API_KEY and selecting any cloud-hosted model — no local Ollama daemon required. Simplified approach co-developed with @michaelneale (see #8335).
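Because the provider goes through the OpenAI-compatible engine, any OpenAI-style client can exercise the same endpoint. A minimal sketch of what that request looks like, using only the Python standard library; the chat endpoint path and the Bearer-token auth follow the OpenAI convention this PR relies on, while the model name "gpt-oss:120b" is a placeholder, not something the PR pins down:

```python
import json
import os
import urllib.request

# Build an authenticated request against Ollama Cloud's OpenAI-compatible
# chat endpoint. The model name below is a placeholder.
payload = {
    "model": "gpt-oss:120b",
    "messages": [{"role": "user", "content": "Hello from Goose!"}],
}
req = urllib.request.Request(
    "https://ollama.com/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {os.environ.get('OLLAMA_CLOUD_API_KEY', '')}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would actually send it; that needs a real key.
print(req.full_url)
```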
Verification
- `source bin/activate-hermit && cargo clippy --all-targets -- -D warnings` passes
- https://ollama.com/v1/models returns a valid OpenAI-compatible model list
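The check that https://ollama.com/v1/models returns an OpenAI-compatible model list can be sketched as a shape test. The sample payload below is illustrative (the model ids are placeholders); a real run would fetch the endpoint with the API key and feed the parsed JSON through the same check:

```python
def is_openai_model_list(payload: dict) -> bool:
    """Loosely validate the OpenAI /v1/models response shape:
    {"object": "list", "data": [{"id": ..., "object": "model"}, ...]}"""
    if payload.get("object") != "list":
        return False
    data = payload.get("data")
    if not isinstance(data, list):
        return False
    return all(isinstance(m, dict) and "id" in m for m in data)

# Illustrative payload in the shape the endpoint is expected to return;
# the model ids here are made up.
sample = {
    "object": "list",
    "data": [
        {"id": "gpt-oss:120b", "object": "model"},
        {"id": "deepseek-v3.1:671b", "object": "model"},
    ],
}
print(is_openai_model_list(sample))  # True
```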