diff --git a/CHANGELOG.md b/CHANGELOG.md
index dfcedda9..1abf96b6 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -7,6 +7,9 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
## [Unreleased]
+### Added
+- [ask sb] Added OpenAI Compatible Language Model Provider. [#424](https://github.com/sourcebot-dev/sourcebot/pull/424)
+
## [4.6.2] - 2025-07-31
### Changed
diff --git a/docs/docs/configuration/language-model-providers.mdx b/docs/docs/configuration/language-model-providers.mdx
index d372c83d..1b442a43 100644
--- a/docs/docs/configuration/language-model-providers.mdx
+++ b/docs/docs/configuration/language-model-providers.mdx
@@ -3,6 +3,12 @@ title: Language Model Providers
sidebarTitle: Language model providers
---
+import LanguageModelSchema from '/snippets/schemas/v3/languageModel.schema.mdx'
+
+
+Looking to self-host your own model? Check out the [OpenAI Compatible](#openai-compatible) provider.
+
+
To use [Ask Sourcebot](/docs/features/ask) you must define at least one Language Model Provider. These providers are defined within the [config file](/docs/configuration/config-file) you
provide Sourcebot.
@@ -45,7 +51,7 @@ For a detailed description of all the providers, please refer to the [schema](ht
### Amazon Bedrock
-[Vercel AI SDK Amazon Bedrock Docs](https://v5.ai-sdk.dev/providers/ai-sdk-providers/amazon-bedrock)
+[Vercel AI SDK Amazon Bedrock Docs](https://ai-sdk.dev/providers/ai-sdk-providers/amazon-bedrock)
```json wrap icon="code" Example config with Amazon Bedrock provider
{
@@ -70,7 +76,7 @@ For a detailed description of all the providers, please refer to the [schema](ht
### Anthropic
-[Vercel AI SDK Anthropic Docs](https://v5.ai-sdk.dev/providers/ai-sdk-providers/anthropic)
+[Vercel AI SDK Anthropic Docs](https://ai-sdk.dev/providers/ai-sdk-providers/anthropic)
```json wrap icon="code" Example config with Anthropic provider
{
@@ -91,7 +97,7 @@ For a detailed description of all the providers, please refer to the [schema](ht
### Azure OpenAI
-[Vercel AI SDK Azure OpenAI Docs](https://v5.ai-sdk.dev/providers/ai-sdk-providers/azure)
+[Vercel AI SDK Azure OpenAI Docs](https://ai-sdk.dev/providers/ai-sdk-providers/azure)
```json wrap icon="code" Example config with Azure AI provider
{
@@ -114,7 +120,7 @@ For a detailed description of all the providers, please refer to the [schema](ht
### Deepseek
-[Vercel AI SDK Deepseek Docs](https://v5.ai-sdk.dev/providers/ai-sdk-providers/deepseek)
+[Vercel AI SDK Deepseek Docs](https://ai-sdk.dev/providers/ai-sdk-providers/deepseek)
```json wrap icon="code" Example config with Deepseek provider
{
@@ -135,7 +141,7 @@ For a detailed description of all the providers, please refer to the [schema](ht
### Google Generative AI
-[Vercel AI SDK Google Generative AI Docs](https://v5.ai-sdk.dev/providers/ai-sdk-providers/google-generative-ai)
+[Vercel AI SDK Google Generative AI Docs](https://ai-sdk.dev/providers/ai-sdk-providers/google-generative-ai)
```json wrap icon="code" Example config with Google Generative AI provider
{
@@ -159,7 +165,7 @@ For a detailed description of all the providers, please refer to the [schema](ht
If you're using an Anthropic model on Google Vertex, you must define a [Google Vertex Anthropic](#google-vertex-anthropic) provider instead
The `credentials` parameter here expects a **path** to a [credentials](https://console.cloud.google.com/apis/credentials) file. This file **must be in a volume mounted by Sourcebot** for it to be readable.
-[Vercel AI SDK Google Vertex AI Docs](https://v5.ai-sdk.dev/providers/ai-sdk-providers/google-vertex)
+[Vercel AI SDK Google Vertex AI Docs](https://ai-sdk.dev/providers/ai-sdk-providers/google-vertex)
```json wrap icon="code" Example config with Google Vertex provider
{
@@ -185,7 +191,7 @@ For a detailed description of all the providers, please refer to the [schema](ht
The `credentials` parameter here expects a **path** to a [credentials](https://console.cloud.google.com/apis/credentials) file. This file **must be in a volume mounted by Sourcebot** for it to be readable.
-[Vercel AI SDK Google Vertex Anthropic Docs](https://v5.ai-sdk.dev/providers/ai-sdk-providers/google-vertex#google-vertex-anthropic-provider-usage)
+[Vercel AI SDK Google Vertex Anthropic Docs](https://ai-sdk.dev/providers/ai-sdk-providers/google-vertex#google-vertex-anthropic-provider-usage)
```json wrap icon="code" Example config with Google Vertex Anthropic provider
{
@@ -208,7 +214,7 @@ For a detailed description of all the providers, please refer to the [schema](ht
### Mistral
-[Vercel AI SDK Mistral Docs](https://v5.ai-sdk.dev/providers/ai-sdk-providers/mistral)
+[Vercel AI SDK Mistral Docs](https://ai-sdk.dev/providers/ai-sdk-providers/mistral)
```json wrap icon="code" Example config with Mistral provider
{
@@ -229,7 +235,7 @@ For a detailed description of all the providers, please refer to the [schema](ht
### OpenAI
-[Vercel AI SDK OpenAI Docs](https://v5.ai-sdk.dev/providers/ai-sdk-providers/openai)
+[Vercel AI SDK OpenAI Docs](https://ai-sdk.dev/providers/ai-sdk-providers/openai)
```json wrap icon="code" Example config with OpenAI provider
{
@@ -248,9 +254,36 @@ For a detailed description of all the providers, please refer to the [schema](ht
}
```
+### OpenAI Compatible
+
+[Vercel AI SDK OpenAI Compatible Docs](https://ai-sdk.dev/providers/openai-compatible-providers)
+
+The OpenAI Compatible provider lets you use any model served behind an OpenAI-compatible [Chat Completions API](https://platform.openai.com/docs/api-reference/chat). This includes self-hosted tools like [Ollama](https://ollama.ai/) and [llama.cpp](https://github.com/ggerganov/llama.cpp).
+
+```json wrap icon="code" Example config with OpenAI Compatible provider
+{
+ "$schema": "https://raw.githubusercontent.com/sourcebot-dev/sourcebot/main/schemas/v3/index.json",
+ "models": [
+ {
+ "provider": "openai-compatible",
+ "baseUrl": "BASE_URL_HERE",
+ "model": "YOUR_MODEL_HERE",
+ "displayName": "OPTIONAL_DISPLAY_NAME",
+ "token": {
+ "env": "OPTIONAL_API_KEY"
+ }
+ }
+ ]
+}
+```
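+
+For example, a minimal config pointing Sourcebot at a local [Ollama](https://ollama.ai/) server could look like the sketch below. The model and display names are placeholders — use whatever `ollama list` reports — and `token` is omitted since Ollama does not require an API key by default (its OpenAI-compatible endpoint is served at `/v1` on port `11434`).
+
+```json wrap icon="code" Example config with a local Ollama server
+{
+    "$schema": "https://raw.githubusercontent.com/sourcebot-dev/sourcebot/main/schemas/v3/index.json",
+    "models": [
+        {
+            "provider": "openai-compatible",
+            "baseUrl": "http://localhost:11434/v1",
+            "model": "llama3.1",
+            "displayName": "Llama 3.1 (Ollama)"
+        }
+    ]
+}
+```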
+
+
+- When using [llama.cpp](https://github.com/ggml-org/llama.cpp), if you hit "Failed after 3 attempts. Last error: tools param requires --jinja flag", add the `--jinja` flag to your `llama-server` command.
+
+
### OpenRouter
-[Vercel AI SDK OpenRouter Docs](https://v5.ai-sdk.dev/providers/community-providers/openrouter)
+[Vercel AI SDK OpenRouter Docs](https://ai-sdk.dev/providers/community-providers/openrouter)
```json wrap icon="code" Example config with OpenRouter provider
{
@@ -271,7 +304,7 @@ For a detailed description of all the providers, please refer to the [schema](ht
### xAI
-[Vercel AI SDK xAI Docs](https://v5.ai-sdk.dev/providers/ai-sdk-providers/xai)
+[Vercel AI SDK xAI Docs](https://ai-sdk.dev/providers/ai-sdk-providers/xai)
```json wrap icon="code" Example config with xAI provider
{
@@ -288,4 +321,14 @@ For a detailed description of all the providers, please refer to the [schema](ht
}
]
}
-```
\ No newline at end of file
+```
+
+
+## Schema reference
+
+
+[schemas/v3/languageModel.json](https://github.com/sourcebot-dev/sourcebot/blob/main/schemas/v3/languageModel.json)
+
+<LanguageModelSchema />
+
\ No newline at end of file
diff --git a/docs/snippets/schemas/v3/index.schema.mdx b/docs/snippets/schemas/v3/index.schema.mdx
index 0b284164..1078c42d 100644
--- a/docs/snippets/schemas/v3/index.schema.mdx
+++ b/docs/snippets/schemas/v3/index.schema.mdx
@@ -1785,6 +1785,69 @@
],
"additionalProperties": false
},
+ "OpenAICompatibleLanguageModel": {
+ "type": "object",
+ "properties": {
+ "provider": {
+ "const": "openai-compatible",
+ "description": "OpenAI Compatible Configuration"
+ },
+ "model": {
+ "type": "string",
+ "description": "The name of the language model."
+ },
+ "displayName": {
+ "type": "string",
+ "description": "Optional display name."
+ },
+ "token": {
+        "description": "Optional API key. If specified, adds an `Authorization` header to request headers with the value Bearer <token>.",
+ "anyOf": [
+ {
+ "type": "object",
+ "properties": {
+ "secret": {
+ "type": "string",
+ "description": "The name of the secret that contains the token."
+ }
+ },
+ "required": [
+ "secret"
+ ],
+ "additionalProperties": false
+ },
+ {
+ "type": "object",
+ "properties": {
+ "env": {
+ "type": "string",
+ "description": "The name of the environment variable that contains the token. Only supported in declarative connection configs."
+ }
+ },
+ "required": [
+ "env"
+ ],
+ "additionalProperties": false
+ }
+ ]
+ },
+ "baseUrl": {
+ "type": "string",
+ "format": "url",
+ "pattern": "^https?:\\/\\/[^\\s/$.?#].[^\\s]*$",
+ "description": "Base URL of the OpenAI-compatible chat completions API endpoint.",
+ "examples": [
+ "http://localhost:8080/v1"
+ ]
+ }
+ },
+ "required": [
+ "provider",
+ "model",
+ "baseUrl"
+ ],
+ "additionalProperties": false
+ },
"OpenRouterLanguageModel": {
"type": "object",
"properties": {
@@ -2528,6 +2591,69 @@
],
"additionalProperties": false
},
+ {
+ "type": "object",
+ "properties": {
+ "provider": {
+ "const": "openai-compatible",
+ "description": "OpenAI Compatible Configuration"
+ },
+ "model": {
+ "type": "string",
+ "description": "The name of the language model."
+ },
+ "displayName": {
+ "type": "string",
+ "description": "Optional display name."
+ },
+ "token": {
+          "description": "Optional API key. If specified, adds an `Authorization` header to request headers with the value Bearer <token>.",
+ "anyOf": [
+ {
+ "type": "object",
+ "properties": {
+ "secret": {
+ "type": "string",
+ "description": "The name of the secret that contains the token."
+ }
+ },
+ "required": [
+ "secret"
+ ],
+ "additionalProperties": false
+ },
+ {
+ "type": "object",
+ "properties": {
+ "env": {
+ "type": "string",
+ "description": "The name of the environment variable that contains the token. Only supported in declarative connection configs."
+ }
+ },
+ "required": [
+ "env"
+ ],
+ "additionalProperties": false
+ }
+ ]
+ },
+ "baseUrl": {
+ "type": "string",
+ "format": "url",
+ "pattern": "^https?:\\/\\/[^\\s/$.?#].[^\\s]*$",
+ "description": "Base URL of the OpenAI-compatible chat completions API endpoint.",
+ "examples": [
+ "http://localhost:8080/v1"
+ ]
+ }
+ },
+ "required": [
+ "provider",
+ "model",
+ "baseUrl"
+ ],
+ "additionalProperties": false
+ },
{
"type": "object",
"properties": {
diff --git a/docs/snippets/schemas/v3/languageModel.schema.mdx b/docs/snippets/schemas/v3/languageModel.schema.mdx
index 2df94d6f..92b05568 100644
--- a/docs/snippets/schemas/v3/languageModel.schema.mdx
+++ b/docs/snippets/schemas/v3/languageModel.schema.mdx
@@ -623,6 +623,69 @@
],
"additionalProperties": false
},
+ "OpenAICompatibleLanguageModel": {
+ "type": "object",
+ "properties": {
+ "provider": {
+ "const": "openai-compatible",
+ "description": "OpenAI Compatible Configuration"
+ },
+ "model": {
+ "type": "string",
+ "description": "The name of the language model."
+ },
+ "displayName": {
+ "type": "string",
+ "description": "Optional display name."
+ },
+ "token": {
+        "description": "Optional API key. If specified, adds an `Authorization` header to request headers with the value Bearer <token>.",
+ "anyOf": [
+ {
+ "type": "object",
+ "properties": {
+ "secret": {
+ "type": "string",
+ "description": "The name of the secret that contains the token."
+ }
+ },
+ "required": [
+ "secret"
+ ],
+ "additionalProperties": false
+ },
+ {
+ "type": "object",
+ "properties": {
+ "env": {
+ "type": "string",
+ "description": "The name of the environment variable that contains the token. Only supported in declarative connection configs."
+ }
+ },
+ "required": [
+ "env"
+ ],
+ "additionalProperties": false
+ }
+ ]
+ },
+ "baseUrl": {
+ "type": "string",
+ "format": "url",
+ "pattern": "^https?:\\/\\/[^\\s/$.?#].[^\\s]*$",
+ "description": "Base URL of the OpenAI-compatible chat completions API endpoint.",
+ "examples": [
+ "http://localhost:8080/v1"
+ ]
+ }
+ },
+ "required": [
+ "provider",
+ "model",
+ "baseUrl"
+ ],
+ "additionalProperties": false
+ },
"OpenRouterLanguageModel": {
"type": "object",
"properties": {
@@ -1366,6 +1429,69 @@
],
"additionalProperties": false
},
+ {
+ "type": "object",
+ "properties": {
+ "provider": {
+ "const": "openai-compatible",
+ "description": "OpenAI Compatible Configuration"
+ },
+ "model": {
+ "type": "string",
+ "description": "The name of the language model."
+ },
+ "displayName": {
+ "type": "string",
+ "description": "Optional display name."
+ },
+ "token": {
+          "description": "Optional API key. If specified, adds an `Authorization` header to request headers with the value Bearer <token>.",
+ "anyOf": [
+ {
+ "type": "object",
+ "properties": {
+ "secret": {
+ "type": "string",
+ "description": "The name of the secret that contains the token."
+ }
+ },
+ "required": [
+ "secret"
+ ],
+ "additionalProperties": false
+ },
+ {
+ "type": "object",
+ "properties": {
+ "env": {
+ "type": "string",
+ "description": "The name of the environment variable that contains the token. Only supported in declarative connection configs."
+ }
+ },
+ "required": [
+ "env"
+ ],
+ "additionalProperties": false
+ }
+ ]
+ },
+ "baseUrl": {
+ "type": "string",
+ "format": "url",
+ "pattern": "^https?:\\/\\/[^\\s/$.?#].[^\\s]*$",
+ "description": "Base URL of the OpenAI-compatible chat completions API endpoint.",
+ "examples": [
+ "http://localhost:8080/v1"
+ ]
+ }
+ },
+ "required": [
+ "provider",
+ "model",
+ "baseUrl"
+ ],
+ "additionalProperties": false
+ },
{
"type": "object",
"properties": {
diff --git a/packages/schemas/src/v3/index.schema.ts b/packages/schemas/src/v3/index.schema.ts
index d1a29dc7..70fb23cf 100644
--- a/packages/schemas/src/v3/index.schema.ts
+++ b/packages/schemas/src/v3/index.schema.ts
@@ -1784,6 +1784,69 @@ const schema = {
],
"additionalProperties": false
},
+ "OpenAICompatibleLanguageModel": {
+ "type": "object",
+ "properties": {
+ "provider": {
+ "const": "openai-compatible",
+ "description": "OpenAI Compatible Configuration"
+ },
+ "model": {
+ "type": "string",
+ "description": "The name of the language model."
+ },
+ "displayName": {
+ "type": "string",
+ "description": "Optional display name."
+ },
+ "token": {
+        "description": "Optional API key. If specified, adds an `Authorization` header to request headers with the value Bearer <token>.",
+ "anyOf": [
+ {
+ "type": "object",
+ "properties": {
+ "secret": {
+ "type": "string",
+ "description": "The name of the secret that contains the token."
+ }
+ },
+ "required": [
+ "secret"
+ ],
+ "additionalProperties": false
+ },
+ {
+ "type": "object",
+ "properties": {
+ "env": {
+ "type": "string",
+ "description": "The name of the environment variable that contains the token. Only supported in declarative connection configs."
+ }
+ },
+ "required": [
+ "env"
+ ],
+ "additionalProperties": false
+ }
+ ]
+ },
+ "baseUrl": {
+ "type": "string",
+ "format": "url",
+ "pattern": "^https?:\\/\\/[^\\s/$.?#].[^\\s]*$",
+ "description": "Base URL of the OpenAI-compatible chat completions API endpoint.",
+ "examples": [
+ "http://localhost:8080/v1"
+ ]
+ }
+ },
+ "required": [
+ "provider",
+ "model",
+ "baseUrl"
+ ],
+ "additionalProperties": false
+ },
"OpenRouterLanguageModel": {
"type": "object",
"properties": {
@@ -2527,6 +2590,69 @@ const schema = {
],
"additionalProperties": false
},
+ {
+ "type": "object",
+ "properties": {
+ "provider": {
+ "const": "openai-compatible",
+ "description": "OpenAI Compatible Configuration"
+ },
+ "model": {
+ "type": "string",
+ "description": "The name of the language model."
+ },
+ "displayName": {
+ "type": "string",
+ "description": "Optional display name."
+ },
+ "token": {
+          "description": "Optional API key. If specified, adds an `Authorization` header to request headers with the value Bearer <token>.",
+ "anyOf": [
+ {
+ "type": "object",
+ "properties": {
+ "secret": {
+ "type": "string",
+ "description": "The name of the secret that contains the token."
+ }
+ },
+ "required": [
+ "secret"
+ ],
+ "additionalProperties": false
+ },
+ {
+ "type": "object",
+ "properties": {
+ "env": {
+ "type": "string",
+ "description": "The name of the environment variable that contains the token. Only supported in declarative connection configs."
+ }
+ },
+ "required": [
+ "env"
+ ],
+ "additionalProperties": false
+ }
+ ]
+ },
+ "baseUrl": {
+ "type": "string",
+ "format": "url",
+ "pattern": "^https?:\\/\\/[^\\s/$.?#].[^\\s]*$",
+ "description": "Base URL of the OpenAI-compatible chat completions API endpoint.",
+ "examples": [
+ "http://localhost:8080/v1"
+ ]
+ }
+ },
+ "required": [
+ "provider",
+ "model",
+ "baseUrl"
+ ],
+ "additionalProperties": false
+ },
{
"type": "object",
"properties": {
diff --git a/packages/schemas/src/v3/index.type.ts b/packages/schemas/src/v3/index.type.ts
index b22b88ec..0ca4467f 100644
--- a/packages/schemas/src/v3/index.type.ts
+++ b/packages/schemas/src/v3/index.type.ts
@@ -21,6 +21,7 @@ export type LanguageModel =
| GoogleVertexLanguageModel
| MistralLanguageModel
| OpenAILanguageModel
+ | OpenAICompatibleLanguageModel
| OpenRouterLanguageModel
| XaiLanguageModel;
@@ -791,6 +792,40 @@ export interface OpenAILanguageModel {
*/
baseUrl?: string;
}
+export interface OpenAICompatibleLanguageModel {
+ /**
+ * OpenAI Compatible Configuration
+ */
+ provider: "openai-compatible";
+ /**
+ * The name of the language model.
+ */
+ model: string;
+ /**
+ * Optional display name.
+ */
+ displayName?: string;
+ /**
+   * Optional API key. If specified, adds an `Authorization` header to request headers with the value Bearer <token>.
+ */
+ token?:
+ | {
+ /**
+ * The name of the secret that contains the token.
+ */
+ secret: string;
+ }
+ | {
+ /**
+ * The name of the environment variable that contains the token. Only supported in declarative connection configs.
+ */
+ env: string;
+ };
+ /**
+ * Base URL of the OpenAI-compatible chat completions API endpoint.
+ */
+ baseUrl: string;
+}
export interface OpenRouterLanguageModel {
/**
* OpenRouter Configuration
diff --git a/packages/schemas/src/v3/languageModel.schema.ts b/packages/schemas/src/v3/languageModel.schema.ts
index 4bf2a82e..63937fdb 100644
--- a/packages/schemas/src/v3/languageModel.schema.ts
+++ b/packages/schemas/src/v3/languageModel.schema.ts
@@ -622,6 +622,69 @@ const schema = {
],
"additionalProperties": false
},
+ "OpenAICompatibleLanguageModel": {
+ "type": "object",
+ "properties": {
+ "provider": {
+ "const": "openai-compatible",
+ "description": "OpenAI Compatible Configuration"
+ },
+ "model": {
+ "type": "string",
+ "description": "The name of the language model."
+ },
+ "displayName": {
+ "type": "string",
+ "description": "Optional display name."
+ },
+ "token": {
+        "description": "Optional API key. If specified, adds an `Authorization` header to request headers with the value Bearer <token>.",
+ "anyOf": [
+ {
+ "type": "object",
+ "properties": {
+ "secret": {
+ "type": "string",
+ "description": "The name of the secret that contains the token."
+ }
+ },
+ "required": [
+ "secret"
+ ],
+ "additionalProperties": false
+ },
+ {
+ "type": "object",
+ "properties": {
+ "env": {
+ "type": "string",
+ "description": "The name of the environment variable that contains the token. Only supported in declarative connection configs."
+ }
+ },
+ "required": [
+ "env"
+ ],
+ "additionalProperties": false
+ }
+ ]
+ },
+ "baseUrl": {
+ "type": "string",
+ "format": "url",
+ "pattern": "^https?:\\/\\/[^\\s/$.?#].[^\\s]*$",
+ "description": "Base URL of the OpenAI-compatible chat completions API endpoint.",
+ "examples": [
+ "http://localhost:8080/v1"
+ ]
+ }
+ },
+ "required": [
+ "provider",
+ "model",
+ "baseUrl"
+ ],
+ "additionalProperties": false
+ },
"OpenRouterLanguageModel": {
"type": "object",
"properties": {
@@ -1365,6 +1428,69 @@ const schema = {
],
"additionalProperties": false
},
+ {
+ "type": "object",
+ "properties": {
+ "provider": {
+ "const": "openai-compatible",
+ "description": "OpenAI Compatible Configuration"
+ },
+ "model": {
+ "type": "string",
+ "description": "The name of the language model."
+ },
+ "displayName": {
+ "type": "string",
+ "description": "Optional display name."
+ },
+ "token": {
+          "description": "Optional API key. If specified, adds an `Authorization` header to request headers with the value Bearer <token>.",
+ "anyOf": [
+ {
+ "type": "object",
+ "properties": {
+ "secret": {
+ "type": "string",
+ "description": "The name of the secret that contains the token."
+ }
+ },
+ "required": [
+ "secret"
+ ],
+ "additionalProperties": false
+ },
+ {
+ "type": "object",
+ "properties": {
+ "env": {
+ "type": "string",
+ "description": "The name of the environment variable that contains the token. Only supported in declarative connection configs."
+ }
+ },
+ "required": [
+ "env"
+ ],
+ "additionalProperties": false
+ }
+ ]
+ },
+ "baseUrl": {
+ "type": "string",
+ "format": "url",
+ "pattern": "^https?:\\/\\/[^\\s/$.?#].[^\\s]*$",
+ "description": "Base URL of the OpenAI-compatible chat completions API endpoint.",
+ "examples": [
+ "http://localhost:8080/v1"
+ ]
+ }
+ },
+ "required": [
+ "provider",
+ "model",
+ "baseUrl"
+ ],
+ "additionalProperties": false
+ },
{
"type": "object",
"properties": {
diff --git a/packages/schemas/src/v3/languageModel.type.ts b/packages/schemas/src/v3/languageModel.type.ts
index 88034678..9e99042f 100644
--- a/packages/schemas/src/v3/languageModel.type.ts
+++ b/packages/schemas/src/v3/languageModel.type.ts
@@ -10,6 +10,7 @@ export type LanguageModel =
| GoogleVertexLanguageModel
| MistralLanguageModel
| OpenAILanguageModel
+ | OpenAICompatibleLanguageModel
| OpenRouterLanguageModel
| XaiLanguageModel;
@@ -363,6 +364,40 @@ export interface OpenAILanguageModel {
*/
baseUrl?: string;
}
+export interface OpenAICompatibleLanguageModel {
+ /**
+ * OpenAI Compatible Configuration
+ */
+ provider: "openai-compatible";
+ /**
+ * The name of the language model.
+ */
+ model: string;
+ /**
+ * Optional display name.
+ */
+ displayName?: string;
+ /**
+   * Optional API key. If specified, adds an `Authorization` header to request headers with the value Bearer <token>.
+ */
+ token?:
+ | {
+ /**
+ * The name of the secret that contains the token.
+ */
+ secret: string;
+ }
+ | {
+ /**
+ * The name of the environment variable that contains the token. Only supported in declarative connection configs.
+ */
+ env: string;
+ };
+ /**
+ * Base URL of the OpenAI-compatible chat completions API endpoint.
+ */
+ baseUrl: string;
+}
export interface OpenRouterLanguageModel {
/**
* OpenRouter Configuration
diff --git a/packages/web/package.json b/packages/web/package.json
index 11d8522d..774ac778 100644
--- a/packages/web/package.json
+++ b/packages/web/package.json
@@ -20,6 +20,7 @@
"@ai-sdk/google-vertex": "3.0.0",
"@ai-sdk/mistral": "2.0.0",
"@ai-sdk/openai": "2.0.0",
+ "@ai-sdk/openai-compatible": "^1.0.0",
"@ai-sdk/react": "2.0.0",
"@ai-sdk/xai": "2.0.0",
"@auth/prisma-adapter": "^2.7.4",
diff --git a/packages/web/src/app/api/(server)/chat/route.ts b/packages/web/src/app/api/(server)/chat/route.ts
index 54325599..834bdc63 100644
--- a/packages/web/src/app/api/(server)/chat/route.ts
+++ b/packages/web/src/app/api/(server)/chat/route.ts
@@ -17,6 +17,7 @@ import { createVertex } from '@ai-sdk/google-vertex';
import { createVertexAnthropic } from '@ai-sdk/google-vertex/anthropic';
import { createMistral } from '@ai-sdk/mistral';
import { createOpenAI, OpenAIResponsesProviderOptions } from "@ai-sdk/openai";
+import { createOpenAICompatible } from "@ai-sdk/openai-compatible";
import { LanguageModelV2 as AISDKLanguageModelV2 } from "@ai-sdk/provider";
import { createXai } from '@ai-sdk/xai';
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
@@ -444,6 +445,19 @@ const getAISDKLanguageModelAndOptions = async (config: LanguageModel, orgId: num
},
};
}
+ case 'openai-compatible': {
+ const openai = createOpenAICompatible({
+ baseURL: config.baseUrl,
+ name: config.displayName ?? modelId,
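+                // The API key is optional for OpenAI-compatible servers; when configured, resolve it from the secret or environment variable.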
+ apiKey: config.token
+ ? await getTokenFromConfig(config.token, orgId, prisma)
+ : undefined,
+ });
+
+ return {
+ model: openai.chatModel(modelId),
+ }
+ }
case 'openrouter': {
const openrouter = createOpenRouter({
baseURL: config.baseUrl,
diff --git a/packages/web/src/features/chat/components/chatBox/modelProviderLogo.tsx b/packages/web/src/features/chat/components/chatBox/modelProviderLogo.tsx
index b04d50ee..535cc79d 100644
--- a/packages/web/src/features/chat/components/chatBox/modelProviderLogo.tsx
+++ b/packages/web/src/features/chat/components/chatBox/modelProviderLogo.tsx
@@ -13,6 +13,7 @@ import deepseekLogo from "@/public/deepseek.svg";
import mistralLogo from "@/public/mistral.svg";
import openrouterLogo from "@/public/openrouter.svg";
import xaiLogo from "@/public/xai.svg";
+import { Box, LucideIcon } from "lucide-react";
interface ModelProviderLogoProps {
provider: LanguageModelProvider;
@@ -23,12 +24,12 @@ export const ModelProviderLogo = ({
provider,
className,
}: ModelProviderLogoProps) => {
- const { src, className: logoClassName } = useMemo(() => {
+ const { src, Icon, className: logoClassName } = useMemo((): { src?: string, Icon?: LucideIcon, className?: string } => {
switch (provider) {
case 'amazon-bedrock':
return {
src: bedrockLogo,
- className: 'w-3.5 h-3.5 dark:invert'
+ className: 'dark:invert'
};
case 'anthropic':
return {
@@ -38,23 +39,20 @@ export const ModelProviderLogo = ({
case 'azure':
return {
src: azureAiLogo,
- className: 'w-3.5 h-3.5'
};
case 'deepseek':
return {
src: deepseekLogo,
- className: 'w-3.5 h-3.5'
};
case 'openai':
return {
src: openaiLogo,
- className: 'dark:invert w-3.5 h-3.5'
+ className: 'dark:invert'
};
case 'google-generative-ai':
case 'google-vertex':
return {
src: geminiLogo,
- className: 'w-3.5 h-3.5'
};
case 'google-vertex-anthropic':
return {
@@ -64,30 +62,40 @@ export const ModelProviderLogo = ({
case 'mistral':
return {
src: mistralLogo,
- className: 'w-3.5 h-3.5'
};
case 'openrouter':
return {
src: openrouterLogo,
- className: 'dark:invert w-3.5 h-3.5'
+ className: 'dark:invert'
};
case 'xai':
return {
src: xaiLogo,
- className: 'dark:invert w-3.5 h-3.5'
+ className: 'dark:invert'
+ };
+ case 'openai-compatible':
+ return {
+ Icon: Box,
+ className: 'text-muted-foreground'
};
}
}, [provider]);
- return (
+ return src ? (
- )
+ ) : Icon ? (
+
+ ) : null;
}
diff --git a/schemas/v3/languageModel.json b/schemas/v3/languageModel.json
index c0ac8e40..65807e91 100644
--- a/schemas/v3/languageModel.json
+++ b/schemas/v3/languageModel.json
@@ -351,6 +351,42 @@
],
"additionalProperties": false
},
+ "OpenAICompatibleLanguageModel": {
+ "type": "object",
+ "properties": {
+ "provider": {
+ "const": "openai-compatible",
+ "description": "OpenAI Compatible Configuration"
+ },
+ "model": {
+ "type": "string",
+ "description": "The name of the language model."
+ },
+ "displayName": {
+ "type": "string",
+ "description": "Optional display name."
+ },
+ "token": {
+ "$ref": "./shared.json#/definitions/Token",
+        "description": "Optional API key. If specified, adds an `Authorization` header to request headers with the value Bearer <token>."
+ },
+ "baseUrl": {
+ "type": "string",
+ "format": "url",
+ "pattern": "^https?:\\/\\/[^\\s/$.?#].[^\\s]*$",
+ "description": "Base URL of the OpenAI-compatible chat completions API endpoint.",
+ "examples": [
+ "http://localhost:8080/v1"
+ ]
+ }
+ },
+ "required": [
+ "provider",
+ "model",
+ "baseUrl"
+ ],
+ "additionalProperties": false
+ },
"OpenRouterLanguageModel": {
"type": "object",
"properties": {
@@ -448,6 +484,9 @@
{
"$ref": "#/definitions/OpenAILanguageModel"
},
+ {
+ "$ref": "#/definitions/OpenAICompatibleLanguageModel"
+ },
{
"$ref": "#/definitions/OpenRouterLanguageModel"
},
diff --git a/yarn.lock b/yarn.lock
index 6b572135..febfdf7c 100644
--- a/yarn.lock
+++ b/yarn.lock
@@ -110,7 +110,7 @@ __metadata:
languageName: node
linkType: hard
-"@ai-sdk/openai-compatible@npm:1.0.0":
+"@ai-sdk/openai-compatible@npm:1.0.0, @ai-sdk/openai-compatible@npm:^1.0.0":
version: 1.0.0
resolution: "@ai-sdk/openai-compatible@npm:1.0.0"
dependencies:
@@ -6506,6 +6506,7 @@ __metadata:
"@ai-sdk/google-vertex": "npm:3.0.0"
"@ai-sdk/mistral": "npm:2.0.0"
"@ai-sdk/openai": "npm:2.0.0"
+ "@ai-sdk/openai-compatible": "npm:^1.0.0"
"@ai-sdk/react": "npm:2.0.0"
"@ai-sdk/xai": "npm:2.0.0"
"@auth/prisma-adapter": "npm:^2.7.4"