
Conversation

@otterammo

Changes

These changes resolve the linked issue (Local model base_url not working).

Core

  • mod.rs
    • Updated the configuration merging logic to use insert instead of or_insert. This ensures that user-defined providers in config.toml correctly override the built-in defaults.
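The effect of the change can be sketched as follows. This is a minimal, hypothetical model of the merge step, using a plain name → base_url map rather than the real codex config types in mod.rs:

```rust
use std::collections::HashMap;

// Hypothetical sketch of the fix: providers are modeled as a plain
// name -> base_url map, not the actual codex config structures.
fn merge_providers(
    builtins: HashMap<String, String>,
    user: HashMap<String, String>,
) -> HashMap<String, String> {
    let mut merged = builtins;
    for (name, base_url) in user {
        // Before the fix: merged.entry(name).or_insert(base_url);
        // `or_insert` keeps the existing built-in entry and silently
        // discards the user's value. `insert` lets the user-defined
        // provider from config.toml override the built-in default.
        merged.insert(name, base_url);
    }
    merged
}

fn main() {
    let builtins = HashMap::from([(
        "ollama".to_string(),
        "http://localhost:11434/v1".to_string(),
    )]);
    let user = HashMap::from([(
        "ollama".to_string(),
        "http://my-server:11434/v1".to_string(),
    )]);
    let merged = merge_providers(builtins, user);
    // The user-configured base_url now wins over the built-in default.
    println!("{}", merged["ollama"]);
}
```

With `or_insert`, the loop above would leave the built-in localhost URL in place, which matches the reported symptom of a user-set base_url being ignored.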

Verification

I have added a reproduction test case, test_user_config_overrides_builtin_provider, to core/src/config/mod.rs.

  • This test verifies that a user-defined ollama provider in config.toml correctly overrides the built-in one.
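The test might take roughly this shape. This is an illustrative, self-contained sketch; the real test in core/src/config/mod.rs exercises the actual config-loading path and types:

```rust
use std::collections::HashMap;

// Illustrative stand-in for the built-in provider table; names and URLs
// here are assumptions, not the real defaults.
fn builtin_providers() -> HashMap<String, String> {
    HashMap::from([(
        "ollama".to_string(),
        "http://localhost:11434/v1".to_string(),
    )])
}

// Merge user-defined providers over the built-ins; user entries win.
fn merge(
    mut providers: HashMap<String, String>,
    user: HashMap<String, String>,
) -> HashMap<String, String> {
    for (name, base_url) in user {
        providers.insert(name, base_url);
    }
    providers
}

#[test]
fn test_user_config_overrides_builtin_provider() {
    let user = HashMap::from([(
        "ollama".to_string(),
        "http://my-server:11434/v1".to_string(),
    )]);
    let merged = merge(builtin_providers(), user);
    // The user-defined ollama provider replaces the built-in one.
    assert_eq!(merged["ollama"], "http://my-server:11434/v1");
}
```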

@github-actions

github-actions bot commented Nov 22, 2025

All contributors have signed the CLA ✍️ ✅
Posted by the CLA Assistant Lite bot.

@otterammo
Author

I have read the CLA Document and I hereby sign the CLA

@etraut-openai
Collaborator

Thanks for the contribution. Before I have someone on the codex team review this, please fix the CI failures. Looks like a lint issue (cargo clippy).

@etraut-openai
Collaborator

@codex review

@etraut-openai etraut-openai added the needs-response Additional information is requested label Nov 24, 2025
@chatgpt-codex-connector
Contributor

Codex Review: Didn't find any major issues. 🚀

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".


Development

Successfully merging this pull request may close these issues.

Local model base_url not working