[Bug]: Failed to proxy vllm rerank model by UI dashboard #13793

@NiuBlibing

Description

What happened?

I'm adding a vllm rerank model via the dashboard, but it fails. I tried openai/Qwen3-Reranker-8B, openai/hosted_vllm/Qwen3-Reranker-8B, and hosted_vllm/Qwen3-Reranker-8B; none of them works.
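To isolate whether the problem is the vLLM server or LiteLLM's provider routing, the rerank endpoint can be exercised directly, bypassing the dashboard. A minimal sketch follows; the base URL, endpoint path, and served model name are assumptions for illustration, not taken from this report:

```python
import json
import urllib.request


def vllm_rerank(base_url: str, model: str, query: str, documents: list) -> dict:
    """POST a rerank request to a vLLM server (endpoint path assumed here)."""
    payload = {"model": model, "query": query, "documents": documents}
    req = urllib.request.Request(
        f"{base_url}/v1/rerank",  # adjust the path if your vLLM version differs
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Build the request payload without sending it, to show the expected shape:
payload = {
    "model": "Qwen3-Reranker-8B",  # hypothetical served model name
    "query": "What is vLLM?",
    "documents": ["vLLM is an inference engine.", "Unrelated text."],
}
print(json.dumps(payload))
```

If this direct call succeeds against the server, the failure is confined to how LiteLLM dispatches the rerank call for the chosen provider prefix, which matches the "Unsupported provider" error below.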

Relevant log output

litellm.APIConnectionError: APIConnectionError: OpenAIException - Unsupported provider: openai
stack trace: Traceback (most recent call last):
  File "/usr/lib/python3.13/site-packages/litellm/rerank_api/main.py", line 369, in rerank
    raise ValueError(f"Unsupported provider: {_custom_llm_provider}")
ValueError: Unsupported provider: openai

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.13/site-packages/litellm/main.py", line 5556, in ahealth_check
    _response = await mode_handlers[mode]()
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 1586, in wrapper_async
    raise e
  File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 1437, in wrapper_async
    result = await original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.13/site-packages/litellm/rerank_api/main.py", line 69, in arerank
    raise e
  File "/usr/lib/python3.13/site-packages/litellm/rerank_api/main.py", line 61, i

Are you a ML Ops Team?

No

What LiteLLM version are you on?

1.74.15-stable

Twitter / LinkedIn details

No response
