feat(ask_sb): OpenAI compatible language models #424
Conversation
Walkthrough

Support for OpenAI-compatible language model providers has been added across schema definitions, TypeScript types, documentation, and UI. This enables configuring self-hosted models compatible with the OpenAI Chat Completions API. Updates include new schema entries, TypeScript interfaces, documentation, dependency additions, and UI logic for provider handling and display.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant WebUI
    participant API_Server
    participant OpenAICompatibleClient
    participant SelfHostedModel
    User->>WebUI: Selects "OpenAI Compatible" provider
    WebUI->>API_Server: Sends chat request with provider config
    API_Server->>OpenAICompatibleClient: Instantiates client with baseUrl, model, token
    OpenAICompatibleClient->>SelfHostedModel: Sends Chat Completions API request
    SelfHostedModel-->>OpenAICompatibleClient: Returns response
    OpenAICompatibleClient-->>API_Server: Returns model response
    API_Server-->>WebUI: Returns chat response
    WebUI-->>User: Displays response with provider icon
```
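The flow above starts from a provider entry in the Sourcebot configuration. A minimal sketch of such an entry follows; the `models` key and all field values are illustrative assumptions, while the field names (`provider`, `model`, `baseUrl`, `displayName`, `token`) come from the schema described in this review. The example assumes an Ollama server on its default port:

```json
{
  "models": [
    {
      "provider": "openai-compatible",
      "model": "llama3.1",
      "baseUrl": "http://localhost:11434/v1",
      "displayName": "Llama 3.1 (Ollama)"
    }
  ]
}
```

Per the schema, `provider`, `model`, and `baseUrl` are required; `displayName` and `token` are optional.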
Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~15–20 minutes
I was just digging into why my locally hosted OpenAI model wasn't working with Sourcebot and realized this project didn't have openai-compatible providers set up through the Vercel AI SDK. Props for putting this together! Excited for the add.
Actionable comments posted: 1
♻️ Duplicate comments (1)
docs/snippets/schemas/v3/index.schema.mdx (1)

2637-2642: Duplicate inconsistency for `baseUrl` in the `oneOf` branch
The same “Optional base URL.” wording is repeated here while `baseUrl` remains mandatory (lines 2644-2648). Please sync the description with the actual requirement to avoid schema drift.
🧹 Nitpick comments (1)
packages/schemas/src/v3/index.schema.ts (1)

2590-2649: Consider using schema references to reduce duplication.
The schema definition is correct and follows the established pattern in this file. However, the complete duplication of the schema definition (in both `definitions` and `oneOf`) could lead to maintenance challenges. Consider using a JSON Schema `$ref` to reference the definition:

```diff
-{
-  "type": "object",
-  "properties": {
-    "provider": {
-      "const": "openai-compatible",
-      "description": "OpenAI Compatible Configuration"
-    },
-    "model": {
-      "type": "string",
-      "description": "The name of the language model."
-    },
-    // ... rest of properties
-  },
-  "required": [
-    "provider",
-    "model",
-    "baseUrl"
-  ],
-  "additionalProperties": false
-}
+{
+  "$ref": "#/definitions/OpenAICompatibleLanguageModel"
+}
```

This would maintain consistency with the definitions section while reducing duplication.
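To illustrate what such a `$ref` means when the schema is consumed, here is a minimal local JSON Pointer dereferencer. This is purely illustrative code, not Sourcebot's or CodeRabbit's; real validators such as Ajv resolve `$ref` internally.

```typescript
// Minimal local $ref resolver: follows "#/definitions/..." pointers
// within a single schema document. Illustrative only.
type Schema = { [key: string]: unknown };

function resolveRef(root: Schema, node: Schema): Schema {
  const ref = node["$ref"];
  if (typeof ref !== "string" || !ref.startsWith("#/")) return node;
  // Split the JSON Pointer and walk the root document segment by segment.
  let current: unknown = root;
  for (const part of ref.slice(2).split("/")) {
    current = (current as Schema)[part];
  }
  return current as Schema;
}

// A toy schema shaped like the pattern suggested above.
const root: Schema = {
  definitions: {
    OpenAICompatibleLanguageModel: {
      type: "object",
      required: ["provider", "model", "baseUrl"],
    },
  },
  oneOf: [{ $ref: "#/definitions/OpenAICompatibleLanguageModel" }],
};

const entry = resolveRef(root, (root.oneOf as Schema[])[0]);
console.log((entry.required as string[]).join(",")); // "provider,model,baseUrl"
```

The `oneOf` branch then stays a one-line pointer, and any change to the definition is picked up automatically.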
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`yarn.lock` is excluded by `!**/yarn.lock`, `!**/*.lock`
📒 Files selected for processing (11)
- docs/docs/configuration/language-model-providers.mdx (13 hunks)
- docs/snippets/schemas/v3/index.schema.mdx (2 hunks)
- docs/snippets/schemas/v3/languageModel.schema.mdx (2 hunks)
- packages/schemas/src/v3/index.schema.ts (2 hunks)
- packages/schemas/src/v3/index.type.ts (2 hunks)
- packages/schemas/src/v3/languageModel.schema.ts (2 hunks)
- packages/schemas/src/v3/languageModel.type.ts (2 hunks)
- packages/web/package.json (1 hunk)
- packages/web/src/app/api/(server)/chat/route.ts (2 hunks)
- packages/web/src/features/chat/components/chatBox/modelProviderLogo.tsx (4 hunks)
- schemas/v3/languageModel.json (2 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
**/*
📄 CodeRabbit Inference Engine (.cursor/rules/style.mdc)
Filenames should always be camelCase. Exception: if there are filenames in the same directory with a format other than camelCase, use that format to keep things consistent.
Files:
packages/web/package.json
packages/schemas/src/v3/index.type.ts
docs/docs/configuration/language-model-providers.mdx
packages/schemas/src/v3/languageModel.schema.ts
packages/web/src/app/api/(server)/chat/route.ts
packages/web/src/features/chat/components/chatBox/modelProviderLogo.tsx
packages/schemas/src/v3/languageModel.type.ts
schemas/v3/languageModel.json
docs/snippets/schemas/v3/languageModel.schema.mdx
packages/schemas/src/v3/index.schema.ts
docs/snippets/schemas/v3/index.schema.mdx
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: build
🔇 Additional comments (23)
packages/web/package.json (1)

23-23: LGTM!
The addition of the `@ai-sdk/openai-compatible` dependency is correctly placed in alphabetical order and uses a version constraint consistent with the other AI SDK packages.

packages/web/src/app/api/(server)/chat/route.ts (2)
20-20: LGTM!
The import for `createOpenAICompatible` is correctly added and follows the existing import pattern for AI SDK providers.

448-460: Well-implemented OpenAI-compatible provider support.
The implementation correctly:
- Uses the required `baseUrl` from config (consistent with the schema)
- Falls back to `modelId` when `displayName` is not provided
- Handles optional token authentication properly
- Returns the chat model instance consistently with other providers

The simpler approach without provider options or headers is appropriate for a generic compatibility provider.
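The fallback and optional-auth behavior in that bullet list can be sketched as two pure helpers. This is not the actual route.ts code: the function names are invented for illustration, and the `token` field is simplified to a plain string here, whereas the real schema uses a secret/env union.

```typescript
// Sketch of the display-name fallback and optional token handling the
// review describes. Names and shapes are illustrative, not route.ts code.
interface OpenAICompatibleConfig {
  provider: "openai-compatible";
  model: string;
  baseUrl: string;        // required by the schema
  displayName?: string;   // optional; falls back to the model id
  token?: string;         // simplified; really a secret/env union
}

function resolveDisplayName(config: OpenAICompatibleConfig): string {
  // Fall back to the model id when no display name is configured.
  return config.displayName ?? config.model;
}

function resolveHeaders(config: OpenAICompatibleConfig): Record<string, string> {
  // When no token is configured, send no Authorization header at all.
  return config.token ? { Authorization: `Bearer ${config.token}` } : {};
}

const cfg: OpenAICompatibleConfig = {
  provider: "openai-compatible",
  model: "llama3.1",
  baseUrl: "http://localhost:11434/v1",
};

console.log(resolveDisplayName(cfg)); // "llama3.1"
console.log(Object.keys(resolveHeaders(cfg)).length); // 0
```

Keeping the header empty when no token is set matters for local servers like Ollama, which typically require no authentication.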
packages/schemas/src/v3/index.type.ts (2)

24-24: LGTM!
`OpenAICompatibleLanguageModel` is correctly added to the `LanguageModel` union type, maintaining alphabetical order with the other provider types.

795-828: Well-defined interface for the OpenAI-compatible provider.
The `OpenAICompatibleLanguageModel` interface correctly:
- Defines the provider as the literal type `"openai-compatible"`
- Requires the `baseUrl` field (appropriate for self-hosted models)
- Follows established patterns for `token` authentication with the secret/env union
- Includes clear documentation of the token's Authorization header behavior

The interface structure is consistent with the other language model providers in the codebase.
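The value of the literal `provider` field is that it lets TypeScript narrow the union. A simplified sketch follows; the field shapes are reduced for illustration (the real interfaces in index.type.ts carry more fields, and the hosted-OpenAI fallback URL below is an assumption, not taken from the code):

```typescript
// Simplified sketch of discriminated-union narrowing. Not the real
// index.type.ts definitions, which contain additional fields.
interface OpenAICompatibleLanguageModel {
  provider: "openai-compatible";
  model: string;
  baseUrl: string; // required: the self-hosted endpoint must be specified
}

interface OpenAILanguageModel {
  provider: "openai";
  model: string;
  baseUrl?: string; // optional for the hosted provider
}

type LanguageModel = OpenAICompatibleLanguageModel | OpenAILanguageModel;

function endpointFor(config: LanguageModel): string {
  // The literal `provider` field narrows the union, so in this branch
  // the compiler knows `baseUrl` is present.
  if (config.provider === "openai-compatible") {
    return config.baseUrl;
  }
  // Hypothetical fallback for the hosted provider.
  return config.baseUrl ?? "https://api.openai.com/v1";
}

console.log(
  endpointFor({ provider: "openai-compatible", model: "m", baseUrl: "http://localhost:8080/v1" })
); // "http://localhost:8080/v1"
```

Making `baseUrl` required only in the `"openai-compatible"` branch is what lets the rest of the code use it without null checks.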
schemas/v3/languageModel.json (2)

354-386: Well-structured schema definition for the OpenAI-compatible provider.
The `OpenAICompatibleLanguageModel` schema correctly:
- Defines the required properties (`provider`, `model`, `baseUrl`) appropriate for self-hosted models
- Uses proper URL format validation with a regex pattern consistent with other providers
- References the shared `Token` definition for authentication
- Includes a clear description of the token behavior for the Authorization header
- Enforces strict validation with `additionalProperties: false`

484-486: LGTM!
The new `OpenAICompatibleLanguageModel` reference is correctly added to the `oneOf` array, enabling proper schema validation for the new provider type.

docs/docs/configuration/language-model-providers.mdx (4)
6-10: Excellent documentation addition.
The import statement and introductory note effectively guide users to the new OpenAI-compatible provider option for self-hosting models.

257-283: Comprehensive OpenAI Compatible provider documentation.
The new section effectively:
- Clearly explains the purpose and compatibility scope (Ollama, llama.cpp)
- Provides a well-structured example configuration
- Links to the relevant Vercel AI SDK documentation
- Includes practical troubleshooting guidance for llama.cpp users

The documentation will help users successfully configure self-hosted OpenAI-compatible models.
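The schema's required-field rules (`provider`, `model`, `baseUrl`) can be checked in a few lines. The sketch below is illustrative only: Sourcebot validates configs through the JSON schema itself, not through hand-written code like this.

```typescript
// Minimal required-field check mirroring the schema's
// "required": ["provider", "model", "baseUrl"] for openai-compatible.
// Illustrative only; real validation is driven by the JSON schema.
function validateOpenAICompatible(config: Record<string, unknown>): string[] {
  const errors: string[] = [];
  if (config.provider !== "openai-compatible") {
    errors.push('provider must be "openai-compatible"');
  }
  for (const field of ["model", "baseUrl"] as const) {
    const value = config[field];
    if (typeof value !== "string" || value === "") {
      errors.push(`${field} is required`);
    }
  }
  return errors;
}

const ok = validateOpenAICompatible({
  provider: "openai-compatible",
  model: "llama3.1",
  baseUrl: "http://localhost:11434/v1",
});
console.log(ok.length); // 0

const bad = validateOpenAICompatible({ provider: "openai-compatible" });
console.log(bad.length); // 2 (missing model and baseUrl)
```

This mirrors why the docs' example configuration always shows all three required fields together.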
327-334: Valuable schema reference addition.
The new schema reference section with an accordion gives users easy access to the complete JSON schema definition, enhancing the documentation's completeness.

54-307: Consistent link updates.
All Vercel AI SDK documentation links have been properly updated from the old `v5.ai-sdk.dev` domain to the current `ai-sdk.dev` domain, ensuring users access up-to-date documentation.

packages/schemas/src/v3/languageModel.type.ts (2)
13-13: LGTM! Proper integration of the new provider type.
The addition of `OpenAICompatibleLanguageModel` to the union type is correctly placed alphabetically and follows the established pattern.

367-400: Well-designed interface for OpenAI-compatible providers.
The `OpenAICompatibleLanguageModel` interface is well-structured, with a key architectural decision: making `baseUrl` required rather than optional. This is appropriate since OpenAI-compatible providers (like Ollama and llama.cpp) require specifying the self-hosted endpoint. The token documentation clearly explains the Authorization header behavior.

docs/snippets/schemas/v3/languageModel.schema.mdx (2)
626-685: Comprehensive schema definition for OpenAI-compatible providers.
The JSON schema properly defines all required and optional properties with appropriate validation rules. The required `baseUrl` field (line 682) correctly enforces that self-hosted endpoints must be specified, which is essential for OpenAI-compatible providers.

1429-1488: Consistent schema validation in the oneOf array.
The schema definition in the `oneOf` array correctly mirrors the definitions section, ensuring proper validation. The required fields (lines 1482-1486) are consistent with the interface requirements.
packages/web/src/features/chat/components/chatBox/modelProviderLogo.tsx (5)

16-16: Appropriate icon import for the new provider.
The `Box` icon from lucide-react is a suitable choice for representing OpenAI-compatible providers, providing a generic yet recognizable symbol for self-hosted models.

27-27: Well-typed interface extension.
The updated return type properly includes the `Icon` property with the `LucideIcon` type, maintaining type safety while supporting both image-based and icon-based logos.
32-75: Good refactoring of logo class handling.
Removing the explicit width/height classes from the individual provider cases improves maintainability, since these dimensions are now applied consistently in the render logic (lines 89, 96).
76-80: Thoughtful provider representation.
The `openai-compatible` case uses the `Box` icon with muted styling, which appropriately represents the generic/self-hosted nature of OpenAI-compatible providers while maintaining visual consistency.
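The image-vs-icon split can be sketched as a tagged result type. This is not the component's actual code: the logo paths and provider list here are invented placeholders, and the real component returns a lucide-react `LucideIcon` rather than a string name.

```typescript
// Sketch of the logo-resolution pattern: hosted providers map to an
// image path, "openai-compatible" maps to a generic icon, and unknown
// providers map to null. All values here are illustrative.
type LogoResult =
  | { kind: "image"; src: string }
  | { kind: "icon"; icon: "Box" }
  | null;

function getProviderLogo(provider: string): LogoResult {
  switch (provider) {
    case "openai":
      return { kind: "image", src: "/logos/openai.svg" }; // hypothetical path
    case "anthropic":
      return { kind: "image", src: "/logos/anthropic.svg" }; // hypothetical path
    case "openai-compatible":
      // Generic Box icon for self-hosted, OpenAI-compatible endpoints.
      return { kind: "icon", icon: "Box" };
    default:
      return null; // unknown provider: render nothing
  }
}

console.log(getProviderLogo("openai-compatible")?.kind); // "icon"
console.log(getProviderLogo("unknown")); // null
```

The render side then branches once on `kind`, which keeps sizing and styling in a single place instead of per provider.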
84-100: Clean conditional rendering logic.
The render logic elegantly handles both image-based and icon-based logos with consistent sizing and styling. The fallback to `null` ensures graceful handling of undefined providers.

packages/schemas/src/v3/index.schema.ts (1)
1787-1846: LGTM! Well-structured schema definition.
The `OpenAICompatibleLanguageModel` schema is properly structured and consistent with the other language model definitions. The required `baseUrl` field appropriately distinguishes it from the standard OpenAI provider, and the token handling follows established patterns.

packages/schemas/src/v3/languageModel.schema.ts (2)
625-684: LGTM! Well-designed schema for OpenAI-compatible providers.
The schema definition correctly implements support for OpenAI-compatible providers with appropriate required fields. Making `baseUrl` required is the right choice for self-hosted models, and the token description clearly explains the Authorization header behavior.

1428-1487: LGTM! Consistent oneOf entry implementation.
The `oneOf` entry correctly mirrors the OpenAI-compatible provider definition and follows the established pattern used throughout the schema file.
This PR adds support for any OpenAI-compatible language model provider, such as Ollama and llama.cpp.