# LiteLLM Integration Guide

This guide explains how to use LiteLLM with OpenCode, including environment variable configuration and model setup.

## Overview

LiteLLM is a proxy that allows you to use 100+ LLM providers through a unified OpenAI-compatible API. OpenCode supports LiteLLM through both environment variables and configuration files.

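If you don't already have a proxy running, you can start one locally. This is a minimal sketch, assuming the `litellm` Python package with its proxy extras; the upstream key and model name are illustrative:

```bash
# Install LiteLLM with proxy support and start it on the default port (4000).
# The upstream key and model name below are assumptions; use your own.
pip install 'litellm[proxy]'
export OPENAI_API_KEY="sk-upstream-key"
litellm --model gpt-4 --port 4000
```
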
## Quick Start with Environment Variables

The simplest way to get started is using environment variables:

```bash
# Set your LiteLLM proxy URL (defaults to http://localhost:4000)
export LITELLM_BASE_URL="http://localhost:4000"

# Set your API key (optional, leave empty if not required)
export LITELLM_API_KEY="sk-your-api-key"

# Automatically add a model
export LITELLM_MODEL_NAME="gpt-4"
```

Then run OpenCode. The model will be automatically available.

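Because LiteLLM exposes an OpenAI-compatible API, you can sanity-check the proxy before launching OpenCode. A quick sketch using the variables set above, assuming the standard OpenAI-style `/v1/models` listing route:

```bash
# List the models the proxy serves; a JSON model list confirms the
# proxy is reachable and the key (if any) is accepted.
curl "$LITELLM_BASE_URL/v1/models" \
  -H "Authorization: Bearer $LITELLM_API_KEY"
```
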
## Environment Variables

### `LITELLM_BASE_URL`

The URL of your LiteLLM proxy server.

- **Default**: `http://localhost:4000`
- **Example**: `export LITELLM_BASE_URL="https://litellm.example.com"`

### `LITELLM_API_KEY`

Your LiteLLM API key. Optional if your proxy doesn't require authentication.

- **Default**: a dummy key, suitable for a local proxy that doesn't require authentication
- **Example**: `export LITELLM_API_KEY="sk-1234567890"`

### `LITELLM_MODEL_NAME`

Automatically adds a model to OpenCode without needing to configure `opencode.json`.

- **Example**: `export LITELLM_MODEL_NAME="gpt-4"`
- **Example**: `export LITELLM_MODEL_NAME="claude-3-5-sonnet-20241022"`

When `LITELLM_MODEL_NAME` is set, OpenCode will automatically create a model configuration with sensible defaults:
- Context limit: 128,000 tokens
- Output limit: 4,096 tokens
- Tool calling: enabled
- Temperature: enabled
- Text input/output: enabled

**Note**: Models defined in `opencode.json` take precedence over the `LITELLM_MODEL_NAME` environment variable.

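For reference, setting `LITELLM_MODEL_NAME="gpt-4"` is roughly equivalent to declaring the model yourself with those defaults. This is an illustrative sketch, not the plugin's literal output:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "litellm": {
      "models": {
        "gpt-4": {
          "limit": { "context": 128000, "output": 4096 },
          "tool_call": true,
          "temperature": true
        }
      }
    }
  }
}
```
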
## Configuration Priority

The authentication configuration follows this priority order:

1. **Environment variables** (`LITELLM_BASE_URL`, `LITELLM_API_KEY`)
2. **Stored authentication** (configured via the `/connect` command)
3. **Defaults** (`http://localhost:4000` with a dummy key)

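In practice this means exported variables silently override anything saved via `/connect`. To fall back to the stored authentication, clear them:

```bash
# Environment variables take priority over stored auth,
# so unset them to use the /connect configuration again.
unset LITELLM_BASE_URL LITELLM_API_KEY
```
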
## Interactive Configuration

You can also configure LiteLLM interactively using the `/connect` command:

1. Run `/connect` in OpenCode
2. Select "LiteLLM" from the provider list
3. Enter your base URL and API key when prompted

This stores the configuration persistently.

## Advanced Configuration with `opencode.json`

For more control over models and their capabilities, create or update your `opencode.json` file:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "litellm": {
      "baseURL": "http://localhost:4000",
      "apiKey": "{env:LITELLM_API_KEY}",
      "models": {
        "gpt-4": {
          "name": "GPT-4 (via LiteLLM)",
          "limit": {
            "context": 128000,
            "output": 8192
          },
          "tool_call": true,
          "temperature": true,
          "reasoning": false,
          "attachment": false
        },
        "claude-3-5-sonnet-20241022": {
          "name": "Claude 3.5 Sonnet (via LiteLLM)",
          "limit": {
            "context": 200000,
            "output": 8192
          },
          "tool_call": true,
          "temperature": true,
          "reasoning": true,
          "attachment": false
        },
        "gpt-3.5-turbo": {
          "name": "GPT-3.5 Turbo (via LiteLLM)"
        }
      }
    }
  }
}
```

### Configuration Options

- **`baseURL`**: The URL of your LiteLLM proxy server
- **`apiKey`**: Your LiteLLM API key. Use `{env:VARIABLE_NAME}` to reference environment variables
- **`models`**: Object defining available models with the following optional properties:
  - `name`: Display name for the model in the UI
  - `limit.context`: Maximum input tokens (default: 128000)
  - `limit.output`: Maximum output tokens (default: 4096)
  - `tool_call`: Whether the model supports function calling (default: true)
  - `temperature`: Whether the model supports the temperature parameter (default: true)
  - `reasoning`: Whether the model has reasoning capabilities (default: false)
  - `attachment`: Whether the model supports file attachments (default: false)

## Implementation Details

The LiteLLM integration is implemented in [`packages/opencode/src/plugin/litellm.ts`](packages/opencode/src/plugin/litellm.ts). The plugin:

- Checks environment variables first (`LITELLM_BASE_URL`, `LITELLM_API_KEY`, `LITELLM_MODEL_NAME`)
- If `LITELLM_MODEL_NAME` is set, automatically adds it as an available model with sensible defaults
- Only adds the env model if it is not already configured in `opencode.json` (user config takes precedence)
- Supports interactive configuration via custom prompts
- Stores authentication as a single `baseURL|apiKey` composite string

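A hypothetical illustration of that composite format; how the plugin actually serializes and parses it is defined in the source file above:

```bash
# Illustrative only: split a "baseURL|apiKey" composite value.
stored="http://localhost:4000|sk-1234567890"
baseURL="${stored%%|*}"   # everything before the first '|'
apiKey="${stored#*|}"     # everything after the first '|'
echo "baseURL=$baseURL apiKey=$apiKey"
```
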
## Learn More

- [LiteLLM Documentation](https://docs.litellm.ai/)
- [OpenCode Providers Documentation](packages/web/src/content/docs/providers.mdx)
- [LiteLLM Plugin Source](packages/opencode/src/plugin/litellm.ts)