What happened?
I'm adding a vLLM rerank model via the dashboard, but it fails. I tried `openai/Qwen3-Reranker-8B`, `openai/hosted_vllm/Qwen3-Reranker-8B`, and `hosted_vllm/Qwen3-Reranker-8B`; none of them works.
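
For reference, this is a minimal sketch of how I would expect the model to be registered in the proxy's `config.yaml`, assuming the standard LiteLLM `model_list` format (the `api_base` URL below is a placeholder, not the real server address):

```yaml
model_list:
  - model_name: Qwen3-Reranker-8B
    litellm_params:
      # hosted_vllm/ prefix should route rerank calls to the vLLM handler
      model: hosted_vllm/Qwen3-Reranker-8B
      # placeholder: replace with the actual vLLM server URL
      api_base: http://localhost:8000
```

Adding the same model through the dashboard produces the `Unsupported provider` error shown below.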
Relevant log output
litellm.APIConnectionError: APIConnectionError: OpenAIException - Unsupported provider: openai
stack trace: Traceback (most recent call last):
File "/usr/lib/python3.13/site-packages/litellm/rerank_api/main.py", line 369, in rerank
raise ValueError(f"Unsupported provider: {_custom_llm_provider}")
ValueError: Unsupported provider: openai
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.13/site-packages/litellm/main.py", line 5556, in ahealth_check
_response = await mode_handlers[mode]()
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 1586, in wrapper_async
raise e
File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 1437, in wrapper_async
result = await original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.13/site-packages/litellm/rerank_api/main.py", line 69, in arerank
raise e
File "/usr/lib/python3.13/site-packages/litellm/rerank_api/main.py", line 61, i
Are you a ML Ops Team?
No
What LiteLLM version are you on?
1.74.15-stable
Twitter / LinkedIn details
No response