fix(openrouter): strip 'openrouter/' prefix in chat transform_request#24275

Open
NIK-TIGER-BILL wants to merge 1 commit into BerriAI:main from NIK-TIGER-BILL:fix/openrouter-strip-prefix-in-chat-transform

Conversation

@NIK-TIGER-BILL

Summary

Fixes #24234

Problem

OpenRouter chat completions are broken when the model is passed with the openrouter/ provider prefix (e.g. openrouter/mistralai/mistral-7b-instruct). The full string — including openrouter/ — was being sent to the OpenRouter API, which does not recognise it and returns an error.

Root cause

get_llm_provider_logic.py returns early at L166 when it detects custom_llm_provider == "openrouter" and model.startswith("openrouter/"), preserving the prefix in the model string. OpenrouterConfig.transform_request() then forwarded this unparsed string directly to the upstream API.
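The failure mode can be sketched in isolation. The functions below are simplified, hypothetical stand-ins for the real litellm code paths (get_llm_provider_logic.py and OpenrouterConfig.transform_request), not the actual implementation:

```python
# Illustrative sketch of the pre-fix flow; both functions are simplified
# stand-ins, not litellm's real code.
def get_llm_provider(model, custom_llm_provider=None):
    # Early return when the provider is already known: the model string is
    # handed back unchanged, so the "openrouter/" prefix survives.
    if custom_llm_provider == "openrouter" and model.startswith("openrouter/"):
        return model, "openrouter"
    return model, custom_llm_provider

def transform_request(model):
    # Pre-fix behavior: the unparsed model string goes straight into the body.
    return {"model": model}

model, provider = get_llm_provider(
    "openrouter/mistralai/mistral-7b-instruct", custom_llm_provider="openrouter"
)
body = transform_request(model)
# body["model"] is "openrouter/mistralai/mistral-7b-instruct"
```

The prefixed string left in body["model"] is exactly what OpenRouter rejects.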

Fix

Strip the openrouter/ prefix at the start of OpenrouterConfig.transform_request() — consistent with the approach already used in OpenrouterConfig for embeddings (litellm/llms/openrouter/embedding/transformation.py, lines 112–113).

# Before (broken)
model = "openrouter/mistralai/mistral-7b-instruct"  # sent as-is

# After (fixed)
model = "mistralai/mistral-7b-instruct"  # prefix stripped
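As a runnable sketch, the fix amounts to one guard at the top of the transform. This is a simplified stand-in for OpenrouterConfig.transform_request, which in reality also builds messages and optional params:

```python
def transform_request(model):
    # The fix: drop the provider prefix before building the request body.
    if model.startswith("openrouter/"):
        model = model[len("openrouter/"):]
    return {"model": model}

body = transform_request("openrouter/mistralai/mistral-7b-instruct")
# body["model"] == "mistralai/mistral-7b-instruct"
```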

Testing

Manually verified that the transform now produces the correct model string. Unit tests pass.

Fixes BerriAI#24234

The embedding transformer already strips the 'openrouter/' prefix before
sending the model name to the API (see
litellm/llms/openrouter/embedding/transformation.py:112-113). The chat
transformer was missing the same guard: when litellm receives a model
string like 'openrouter/mistralai/mistral-7b-instruct' it internally
preserves the 'openrouter/' prefix (get_llm_provider_logic.py returns
early at L166), so the full string was being forwarded to the OpenRouter
API, which does not recognise it and returns a 404/invalid-model error.

Fix: strip the 'openrouter/' prefix at the top of
OpenrouterConfig.transform_request() — consistent with the approach
already used in the embedding transformer.
@vercel

vercel bot commented Mar 21, 2026

The latest updates on your projects.

Project   Deployment  Actions           Updated (UTC)
litellm   Ready       Preview, Comment  Mar 21, 2026 7:11am


@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.


NIK-TIGER-BILL seems not to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you have already a GitHub account, please add the email address used for this commit to your account.
You have signed the CLA already but the status is still pending? Let us recheck it.

@greptile-apps

greptile-apps bot commented Mar 21, 2026

Greptile Summary

This PR fixes a bug in OpenrouterConfig.transform_request() where the openrouter/ provider prefix was forwarded verbatim to the OpenRouter API, causing the API to reject the request with an unrecognised-model error. The one-line fix strips the prefix before constructing the request body, mirroring the identical pattern already in the OpenRouter embedding transformation.

Key points:

  • The fix is correct and minimal — the stripped model name is now passed to super().transform_request(), which places it in the "model" field of the outgoing JSON body.
  • _supports_cache_control_in_content is called after the strip, which is fine: it matches substrings like "claude" / "gemini" that are still present in the unprefixed name.
  • The implementation uses model[len("openrouter/"):] while the embedding transformation uses model.replace("openrouter/", "", 1) — both produce identical output, but the inconsistency is a minor readability nit.
  • No new test is added that explicitly asserts the "model" field in the transformed request dict is stripped. The existing tests pass "openrouter/..." models but only assert other fields (provider, messages, usage), so there is no direct regression guard for this specific fix.
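The slice-versus-replace point above is easy to verify; given the startswith guard, the two idioms are interchangeable:

```python
model = "openrouter/mistralai/mistral-7b-instruct"

sliced = model[len("openrouter/"):]             # idiom used in this PR
replaced = model.replace("openrouter/", "", 1)  # idiom used by the embedding transformer

assert sliced == replaced == "mistralai/mistral-7b-instruct"
```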

Confidence Score: 4/5

  • Safe to merge — the fix is correct, minimal, and consistent with the existing embedding pattern; the only gaps are a style nit and a missing regression test.
  • The change is a single-hunk, easy-to-reason-about prefix strip that directly mirrors the already-working embedding transformation. No backward-incompatible changes are introduced. The minor deduction is for the absence of a dedicated unit test that asserts transformed_request["model"] is stripped — a gap that slightly reduces traceability of the fix.
  • No files require special attention beyond the noted missing test case in tests/test_litellm/llms/openrouter/chat/test_openrouter_chat_transformation.py.

Important Files Changed

Filename Overview
litellm/llms/openrouter/chat/transformation.py Strips the openrouter/ prefix from the model string at the start of transform_request(), mirroring the same logic already in the embedding transformation. The fix is correct and minimal, though no new test explicitly asserts the stripped model value in the resulting request dict.

Sequence Diagram

sequenceDiagram
    participant Caller as Caller (litellm)
    participant Logic as get_llm_provider_logic
    participant Chat as OpenrouterConfig.transform_request
    participant API as OpenRouter API

    Caller->>Logic: model="openrouter/mistralai/mistral-7b-instruct"
    Logic-->>Caller: returns early, model unchanged

    Note over Caller,Chat: Before fix — prefix is forwarded as-is
    Caller->>Chat: model="openrouter/mistralai/mistral-7b-instruct"
    Chat->>API: { "model": "openrouter/mistralai/mistral-7b-instruct" }
    API-->>Chat: ❌ 400 — model not recognised

    Note over Caller,Chat: After fix — prefix is stripped in transform_request
    Caller->>Chat: model="openrouter/mistralai/mistral-7b-instruct"
    Chat->>Chat: model.startswith("openrouter/") → strip prefix
    Chat->>API: { "model": "mistralai/mistral-7b-instruct" }
    API-->>Chat: ✅ 200 OK

Last reviewed commit: "fix(openrouter): str..."

Comment on lines +164 to +165
if model.startswith("openrouter/"):
model = model[len("openrouter/"):]

P2 Minor style inconsistency with embedding transformation

The analogous logic in the embedding transformation (litellm/llms/openrouter/embedding/transformation.py, line 113) uses model.replace("openrouter/", "", 1), while this PR uses model[len("openrouter/"):]. Both produce identical results given the startswith guard, but keeping the same idiom makes the codebase easier to maintain.

Suggested change:

      if model.startswith("openrouter/"):
-         model = model[len("openrouter/"):]
+         model = model.replace("openrouter/", "", 1)

Note: If this suggestion doesn't match your team's coding style, reply to this and let me know. I'll remember it for next time!

Comment on lines +164 to +165
if model.startswith("openrouter/"):
model = model[len("openrouter/"):]

P2 Missing regression test for the bug fix

The existing tests in tests/test_litellm/llms/openrouter/chat/test_openrouter_chat_transformation.py pass model="openrouter/..." to transform_request() but none of them assert that the resulting transformed_request["model"] has the prefix stripped. A dedicated test case would directly verify the fix and guard against regression:

from litellm.llms.openrouter.chat.transformation import OpenrouterConfig

def test_openrouter_transform_request_strips_provider_prefix():
    """Model field sent to the API must not contain the 'openrouter/' prefix."""
    config = OpenrouterConfig()

    transformed_request = config.transform_request(
        model="openrouter/mistralai/mistral-7b-instruct",
        messages=[{"role": "user", "content": "Hello"}],
        optional_params={},
        litellm_params={},
        headers={},
    )

    assert transformed_request["model"] == "mistralai/mistral-7b-instruct"


@codspeed-hq

codspeed-hq bot commented Mar 21, 2026

Merging this PR will not alter performance

✅ 16 untouched benchmarks


Comparing NIK-TIGER-BILL:fix/openrouter-strip-prefix-in-chat-transform (ebd31bc) with main (d8e4fc4)

Open in CodSpeed


Development

Successfully merging this pull request may close these issues.

[Bug]: OpenRouter endpoints broken (not stripping "openrouter/" from model) since v1.82.3
