
[BUG]Improve UX for integrating Private/Local LLMs (e.g., vLLM, Ollama) #389

Triggered via issue · February 11, 2026 06:13
Status: Success
Total duration: 7s
Artifacts: –

Job: Add issue to project (3s)