
fix(openai-compatible): inject JSON schema instruction when structuredOutputs disabled#12608

Open
giulio-leone wants to merge 2 commits into vercel:main from giulio-leone:fix/issue-12491-output-object-json-mode

Conversation

@giulio-leone

Fixes #12491

Problem

generateText + Output.object() had a ~15% failure rate with OpenAI-compatible providers, versus generateObject's 100% reliability.

Root Cause

When supportsStructuredOutputs is false (the default for openai-compatible providers), the provider sends response_format: json_object but silently drops the schema, leaving the model with no knowledge of the JSON structure it should produce.

Fix

In openai-compatible-chat-language-model.ts, when falling back to json_object mode, inject the JSON schema as a system instruction using injectJsonInstructionIntoMessages from @ai-sdk/provider-utils. This matches patterns used by other providers (e.g., Mistral).
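A minimal sketch of the fallback path described above. All names and types here are hypothetical stand-ins: the real implementation lives in openai-compatible-chat-language-model.ts and uses the actual injectJsonInstructionIntoMessages helper from @ai-sdk/provider-utils rather than the inline injection shown here.

```typescript
type Message = { role: 'system' | 'user' | 'assistant'; content: string };

interface PrepareArgs {
  supportsStructuredOutputs: boolean;
  schema: object | undefined;
  messages: Message[];
}

// Sketch: when the provider can enforce the schema, use json_schema mode;
// otherwise fall back to json_object mode, but inject the schema as a
// system instruction instead of silently dropping it.
function prepareJsonResponse({
  supportsStructuredOutputs,
  schema,
  messages,
}: PrepareArgs) {
  if (supportsStructuredOutputs && schema) {
    return {
      responseFormat: { type: 'json_schema' as const, json_schema: { schema } },
      messages,
    };
  }

  // Fallback: json_object mode cannot carry the schema, so describe it in text.
  const injected = schema
    ? [
        {
          role: 'system' as const,
          content:
            'You must answer with a JSON object that matches this JSON schema:\n' +
            JSON.stringify(schema),
        },
        ...messages,
      ]
    : messages;

  return { responseFormat: { type: 'json_object' as const }, messages: injected };
}
```

The key point is the second branch: before this fix, the fallback returned json_object with the original messages untouched, so the schema never reached the model in any form.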

giulio-leone and others added 2 commits February 14, 2026 22:05
…dOutputs disabled (vercel#12491)

When the OpenAI-compatible provider has structuredOutputs disabled (the
default), it falls back from json_schema to json_object response format,
which silently drops the schema. The model is told to produce JSON but
has no knowledge of what schema to produce, leading to validation
failures.

This fix injects the JSON schema as a system instruction into the prompt
messages (using injectJsonInstructionIntoMessages from provider-utils)
when the provider cannot enforce the schema via response_format. This
gives the model explicit guidance about the expected output structure,
matching the reliability of providers that support structured outputs
natively.
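The validation failures described above are typically replies that parse as JSON but do not match the expected schema, because the model was only told "produce JSON" without being shown the shape. A toy structural check (a hypothetical helper, not SDK code, standing in for full JSON-schema validation) makes the distinction concrete:

```typescript
// Hypothetical illustration: a reply can be syntactically valid JSON yet
// still fail validation if the model never saw the schema.
function matchesRequiredKeys(jsonText: string, requiredKeys: string[]): boolean {
  let parsed: unknown;
  try {
    parsed = JSON.parse(jsonText);
  } catch {
    return false; // not even valid JSON
  }
  if (typeof parsed !== 'object' || parsed === null || Array.isArray(parsed)) {
    return false; // the schema expects a JSON object at the top level
  }
  // Structural check standing in for real JSON-schema validation.
  return requiredKeys.every((key) => key in (parsed as Record<string, unknown>));
}
```

With json_object mode alone, a model might return {"answer": "..."} where {"recipe": {...}} was required: valid JSON, failed validation. Injecting the schema as an instruction closes exactly that gap.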

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
@giulio-leone force-pushed the fix/issue-12491-output-object-json-mode branch from a848c87 to 91897c3 on February 14, 2026 21:05


Development

Successfully merging this pull request may close these issues.

generateText + Output.object() doesn't use provider-level JSON mode like generateObject
