
[Bug] LiteLLM Ollama provider fails with 'str' object has no attribute 'get' when called via Anthropic endpoint #13429

@Moep90

Description


What happened?

When routing Anthropic-style requests from Claude Code to an Ollama model through LiteLLM, completions fail with a 500 caused by an AttributeError: 'str' object has no attribute 'get'. The root cause appears to be LiteLLM's Ollama prompt-template handler, which assumes messages[*].content is a list of content blocks and breaks when it is a plain string.
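The failure mode can be reproduced without LiteLLM at all. The sketch below is hypothetical code (not LiteLLM's actual implementation) that mimics a converter assuming `content` is a list of blocks: iterating a plain string yields single characters, and calling `.get()` on a character raises exactly this error.

```python
def convert_content_list_to_str_sketch(message: dict) -> str:
    """Mimics a handler that assumes message["content"] is a list of blocks."""
    texts = []
    for c in message["content"]:  # iterating a str yields single characters
        texts.append(c.get("text") or "")  # a str has no .get -> AttributeError
    return "".join(texts)

# Anthropic-style block content works:
ok = convert_content_list_to_str_sketch(
    {"role": "system", "content": [{"type": "text", "text": "You are helpful."}]}
)

# OpenAI-style plain-string content fails:
try:
    convert_content_list_to_str_sketch(
        {"role": "system", "content": "You are helpful."}
    )
except AttributeError as e:
    err = str(e)
```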

Additional context:
The same error occurs with several other models:

  1. qwen3:8b
  2. qwen3:0.6b
  3. qwen2.5-coder:1.5b
  4. qwen2.5-coder:3b
  5. qwen2.5-coder:7b
  6. llama3-groq-tool-use:8b
  7. hhao/qwen2.5-coder-tools:7b
  8. granite3-dense:2b

Workstation config:

$ env | grep ANTHROPIC
ANTHROPIC_BASE_URL=http://<litellm url>:4000
ANTHROPIC_API_KEY=your-expected-anthropic-api-key
ANTHROPIC_AUTH_TOKEN=sk-1234567890

$ cat litellm/config.yaml

---
model_list:
  - model_name: "*"
    litellm_params:
      model: ollama/qwen2.5-coder:7b
      keep_alive: "8m"
      api_base: http://<ollama url>:<ollama port>
      model_info:
        supports_function_calling: true

litellm_settings:
  master_key: "sk-1234567890"

Relevant log output

Workstation Claude log:


> /init is analyzing your codebase…
  ⎿  API Error (500 {"error":{"message":"Error calling litellm.acompletion for non-Anthropic model: litellm.APIConnectionError: 'str' object has no attribute 'get'\nTraceback (most recent call last):\n  File \"/usr/lib/python3.13/site-packages/litellm/main.py\", line 3086, in completion\n    response = base_llm_http_handler.completion(\n        model=model,\n    ...<13 lines>...\n        client=client,\n    )\n  File \"/usr/lib/python3.13/site-packages/litellm/llms/custom_httpx/llm_http_handler.py\", line 329, in completion\n    data = provider_config.transform_request(\n        model=model,\n    ...<3 lines>...\n        headers=headers,\n    )\n  File \"/usr/lib/python3.13/site-packages/litellm/llms/ollama/completion/transformation.py\", line 342, in transform_request\n    modified_prompt = ollama_pt(model=model, messages=messages)\n  File \"/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/prompt_templates/factory.py\", line 222, in ollama_pt\n    system_content_str, msg_i = _handle_ollama_system_message(\n                                ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^\n        messages, prompt, msg_i\n        ^^^^^^^^^^^^^^^^^^^^^^^\n    )\n    ^\n  File \"/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/prompt_templates/factory.py\", line 180, in _handle_ollama_system_message\n    msg_content = convert_content_list_to_str(messages[msg_i])\n  File \"/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/prompt_templates/common_utils.py\", line 132, in convert_content_list_to_str\n    text_content = c.get(\"text\")\n                   ^^^^^\nAttributeError: 'str' object has no attribute 'get'\n","type":"None","param":"None","code":"500"}}) · Retrying in 1 seconds… (attempt 1/10)


LiteLLM log:


litellm-1    | 16:01:16 - LiteLLM Router:INFO: router.py:2779 - ageneric_api_call_with_fallbacks(model=hhao/qwen2.5-coder-tools:7b) Exception Error calling litellm.acompletion for non-Anthropic model: litellm.APIConnectionError: 'str' object has no attribute 'get'
litellm-1    | Traceback (most recent call last):
litellm-1    |   File "/usr/lib/python3.13/site-packages/litellm/main.py", line 3086, in completion
litellm-1    |     response = base_llm_http_handler.completion(
litellm-1    |         model=model,
litellm-1    |     ...<13 lines>...
litellm-1    |         client=client,
litellm-1    |     )
litellm-1    |   File "/usr/lib/python3.13/site-packages/litellm/llms/custom_httpx/llm_http_handler.py", line 329, in completion
litellm-1    |     data = provider_config.transform_request(
litellm-1    |         model=model,
litellm-1    |     ...<3 lines>...
litellm-1    |         headers=headers,
litellm-1    |     )
litellm-1    |   File "/usr/lib/python3.13/site-packages/litellm/llms/ollama/completion/transformation.py", line 342, in transform_request
litellm-1    |     modified_prompt = ollama_pt(model=model, messages=messages)
litellm-1    |   File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/prompt_templates/factory.py", line 222, in ollama_pt
litellm-1    |     system_content_str, msg_i = _handle_ollama_system_message(
litellm-1    |                                 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
litellm-1    |         messages, prompt, msg_i
litellm-1    |         ^^^^^^^^^^^^^^^^^^^^^^^
litellm-1    |     )
litellm-1    |     ^
litellm-1    |   File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/prompt_templates/factory.py", line 180, in _handle_ollama_system_message
litellm-1    |     msg_content = convert_content_list_to_str(messages[msg_i])
litellm-1    |   File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/prompt_templates/common_utils.py", line 132, in convert_content_list_to_str
litellm-1    |     text_content = c.get("text")
litellm-1    |                    ^^^^^
litellm-1    | AttributeError: 'str' object has no attribute 'get'
litellm-1    |
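The traceback bottoms out in `convert_content_list_to_str`, which iterates `message["content"]` and calls `.get("text")` on each element. A defensive version would branch on the content type first; the following is a sketch of a possible fix, not the actual patch:

```python
from typing import Any

def convert_content_to_str(message: dict[str, Any]) -> str:
    """Flatten message content to a string, accepting both the
    OpenAI-style plain string and the Anthropic-style list of blocks."""
    content = message.get("content")
    if content is None:
        return ""
    if isinstance(content, str):  # plain string: return as-is
        return content
    texts = []
    for block in content:  # list of content blocks
        if isinstance(block, str):  # tolerate bare strings inside the list too
            texts.append(block)
        elif block.get("type") == "text":
            texts.append(block.get("text") or "")
    return "".join(texts)
```

With this shape-tolerant flattening, both the string and the block-list message forms produce the same result instead of raising.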

Are you an ML Ops Team?

Yes

What LiteLLM version are you on?

1.74.15

Twitter / LinkedIn details

No response
