
LiteLLM issue summary - 2026-03-26 #32

@arielb1-sun-security

Description

Open Issues from berriai/litellm Repository

Retrieved on: 2026-03-26 16:00 UTC
Total Open Issues: 30


Issue List

#24638 - fix: use redis_kwargs host/port in cache ping health check (#24636)


#24637 - Fix overlong S3 logging object keys

  • Type: Pull Request
  • Status: Open
  • Created: 2026-03-26T15:35:17Z
  • URL: BerriAI/litellm#24637
  • Author: raashish1601
  • Description: Fixes #24628. Cap the final S3 filename component to a filesystem-safe length inside get_s3_object_key, preserve the readable time/id prefix while appending a deterministic hash suffix for overlong keys, and add regression coverage.
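The approach described in this PR can be sketched as follows. Note that the function name, cap value, and hash length below are hypothetical stand-ins; the actual change lives inside litellm's get_s3_object_key:

```python
import hashlib

MAX_KEY_COMPONENT_LEN = 200  # hypothetical cap, safely below the common 255-byte filesystem limit


def cap_s3_key_component(component: str, max_len: int = MAX_KEY_COMPONENT_LEN) -> str:
    """Truncate an overlong S3 key component, keeping the readable
    time/id prefix and appending a deterministic hash suffix so the
    capped key stays unique and stable across retries."""
    if len(component) <= max_len:
        return component
    suffix = hashlib.sha256(component.encode("utf-8")).hexdigest()[:16]
    prefix = component[: max_len - len(suffix) - 1]
    return f"{prefix}-{suffix}"
```

Because the suffix is derived from the full original component, two different overlong keys cannot collapse to the same capped key unless their hashes collide.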

#24636 - Cache health check returns None for host and port despite working Redis connection


#24635 - feat(volcengine): add image generation support for Ark/Seedream


#24634 - fix(security): add SSRF protection to custom code guardrail HTTP primitives


#24633 - Litellm fix opus tests


#24632 - Fix tests


#24631 - feat(models): add openrouter/minimax/minimax-m2.7 model pricing


#24629 - feat(models): add openrouter/minimax/minimax-m2.7 to model pricing registry


#24628 - [Bug]: S3 filename too long

  • Type: Issue
  • Status: Open
  • Created: 2026-03-26T12:53:04Z
  • URL: BerriAI/litellm#24628
  • Author: rodriciru
  • Labels: bug, proxy
  • Description: Storing logs on a dockerized S3-compatible rustfs image fails with 500 errors because the generated object key exceeds the 255-character filename limit of the underlying Windows/Linux host filesystem.

#24627 - [Bug]: Pass-through multipart audio transcription endpoint returns UnicodeDecodeError


#24626 - [Bug]: Gemini file retrieval fails: Error parsing file retrieve response


#24625 - fix(mcp): block arbitrary command execution via stdio transport


#24624 - fix(proxy): sanitize user_id input and block dangerous env var keys


#24623 - fix docker documentation MASTER_KEY -> LITELLM_MASTER_KEY


#24622 - fix(vertex_ai): Gemini tool-use prompt tokens ignored, causing wrong usage counts


#24621 - [Bug]: Cannot generate 2K images with Gemini 3.1 Flash Image Preview (stuck at 1K) - extra_body is stripped


#24620 - fix: add openrouter/minimax/minimax-m2.7 model pricing


#24618 - fix(ollama): preserve image_url blocks in ollama_chat multimodal requests


#24617 - fix: Bedrock internalServerException mapping, AuthError no-retry, xai drop_params, SSE error handling


#24616 - fix(router): 429 routing — cooldown bypass, providers.json mapping, Anthropic credit balance fallback


#24615 - fix(ollama): preserve image_url blocks in ollama_chat multimodal requests


#24613 - Feature/add hpc ai provider

  • Type: Pull Request
  • Status: Open
  • Created: 2026-03-26T07:08:36Z
  • URL: BerriAI/litellm#24613
  • Author: lioZ129
  • Description: Adds HPC-AI as an OpenAI-compatible provider with slug hpc_ai and default base URL https://api.hpc-ai.com/inference/v1.
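If this PR lands, registering a model through the proxy config would look roughly like the fragment below. The model alias, model id, and env var name are placeholders; only the slug and base URL come from the PR description:

```yaml
model_list:
  - model_name: hpc-ai-example              # placeholder alias
    litellm_params:
      model: hpc_ai/<model-id>              # provider slug from the PR
      api_base: https://api.hpc-ai.com/inference/v1
      api_key: os.environ/HPC_AI_API_KEY    # hypothetical env var name
```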

#24612 - fix(model): add supports_reasoning for gemini-3.1-flash-image-preview


#24611 - feat(router): order-based fallback across deployment priority levels


#24610 - feat(gemini): Lyria 3 preview models in cost map and docs


#24609 - [Bug]: No Error Handling in /v1/messages Path

  • Type: Issue
  • Status: Open
  • Created: 2026-03-26T05:52:56Z
  • URL: BerriAI/litellm#24609
  • Author: urainshah
  • Labels: bug, proxy, llm translation
  • Description: The async_sse_wrapper function has no try/except block, so when Bedrock raises an InternalServerException mid-stream, the raw Bedrock error passes through the proxy unhandled.
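The fix this report implies is to wrap the stream iteration in a try/except and emit a structured error event instead of letting the exception escape. A minimal sketch — the function name safe_sse_wrapper and the error-event shape are illustrative, not litellm's actual code:

```python
import json


async def safe_sse_wrapper(stream):
    """Yield chunks from an upstream SSE stream, converting any
    provider exception into a well-formed SSE error event rather
    than letting it propagate unhandled to the proxy client."""
    try:
        async for chunk in stream:
            yield chunk
    except Exception as exc:  # e.g. a Bedrock InternalServerException
        payload = {"error": {"type": type(exc).__name__, "message": str(exc)}}
        yield f"event: error\ndata: {json.dumps(payload)}\n\n"
```

The client still sees every chunk produced before the failure, followed by a single terminal error event it can parse.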

#24608 - [Bug]: [Bedrock] internalServerException mid-stream error incorrectly mapped to BadRequestError (400) instead of internalServerException (500)


#24606 - Fix Ollama model info URL normalization


#24605 - MCP Server: TiOLi AGENTIS — AI Agent Exchange (23 tools, SSE)


Summary Statistics

  • Pull Requests: 23
  • Issues: 7
  • Bug Reports: 5 (labeled as bug)
  • Security-related: 2
  • Proxy-related: 8
  • LLM Translation: 4

This report was automatically generated by the LiteLLM Issue Summary bot.
