## Problem
When using the Vercel AI SDK's `Output.object({ schema })` with `generateText()` or `generateObject()` against Anthropic models (e.g., `anthropic/claude-sonnet-4-6`) through the Helicone gateway, the model returns prose/markdown instead of JSON, and the call fails with `NoObjectGeneratedError: No object generated: could not parse the response.`
## Root Cause

The provider's `buildRequestBody()` converts the AI SDK's `responseFormat` into OpenAI's `response_format` parameter:

```json
{
  "response_format": {
    "type": "json_schema",
    "json_schema": { "schema": { ... }, "strict": true, "name": "response" }
  }
}
```

The Helicone gateway passes this field through to Anthropic's API, but Anthropic does not recognize `response_format`; it's an OpenAI-specific parameter. Anthropic silently ignores it, and the model generates unstructured text.
## Reproduction

```ts
import { generateText, Output } from 'ai';
import { createHelicone } from '@helicone/ai-sdk-provider';
import { z } from 'zod';

const helicone = createHelicone({ apiKey: process.env.HELICONE_API_KEY });
const model = helicone('anthropic/claude-sonnet-4-6');

const { output } = await generateText({
  model,
  output: Output.object({
    schema: z.object({
      name: z.string(),
      age: z.number(),
      hobbies: z.array(z.string()),
    }),
  }),
  prompt: 'Generate a random person profile.',
});
// ❌ Throws: NoObjectGeneratedError: No object generated: could not parse the response.
// The model returned markdown like "# Person Profile\n**Name:** John..."
```
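As a stopgap until the provider maps the parameter correctly, one workaround is to skip `Output.object`, ask for JSON explicitly in the prompt, and parse/validate the reply yourself. A minimal sketch (`extractJson` is a hypothetical helper, not part of either SDK):

```typescript
// Workaround sketch: pull the first JSON object out of a model reply that may
// wrap it in prose or a ```json fence. Hypothetical helper, not part of any SDK.
function extractJson(text: string): unknown {
  const fenced = text.match(/`{3}(?:json)?\s*([\s\S]*?)`{3}/);
  const raw = fenced
    ? fenced[1]
    : text.slice(text.indexOf('{'), text.lastIndexOf('}') + 1);
  return JSON.parse(raw); // throws if no parsable JSON object was found
}

// Usage with the schema from the reproduction above (validation via zod):
//
//   const { text } = await generateText({
//     model,
//     prompt:
//       'Generate a random person profile. Respond with ONLY a JSON object ' +
//       'of the shape { "name": string, "age": number, "hobbies": string[] }.',
//   });
//   const person = personSchema.parse(extractJson(text));
```

This relies on prompt adherence rather than the API enforcing the schema, so the zod validation step is not optional.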
## What's sent to the API (from Helicone request logs)

```json
{
  "model": "anthropic/claude-sonnet-4-6",
  "response_format": {
    "type": "json_schema",
    "json_schema": {
      "schema": { "type": "object", "properties": { "name": { "type": "string" }, ... } },
      "strict": true,
      "name": "response"
    }
  },
  "messages": [...]
}
```

Anthropic ignores `response_format` entirely. The response is prose, not JSON.
## What should be sent instead

Anthropic supports structured output natively via the `output_format` parameter:

```json
{
  "model": "anthropic/claude-sonnet-4-6",
  "output_format": {
    "type": "json_schema",
    "json_schema": {
      "schema": { "type": "object", "properties": { "name": { "type": "string" }, ... } },
      "name": "response"
    }
  },
  "messages": [...]
}
```

Note: Anthropic's `output_format` does not use `strict: true` (that's OpenAI-specific).
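The translation the provider needs is small. A sketch of what it could look like (the function name and object shapes here are illustrative, not the provider's actual internals):

```typescript
type JsonSchemaFormat = {
  type: 'json_schema';
  json_schema: { schema: unknown; name: string; strict?: boolean };
};

// Illustrative sketch (not the provider's real code): for Anthropic models,
// move the OpenAI-style response_format into Anthropic's output_format and
// drop the OpenAI-only strict flag. Other models pass through unchanged.
function adaptRequestBody(body: Record<string, unknown>): Record<string, unknown> {
  const model = body.model as string | undefined;
  const rf = body.response_format as JsonSchemaFormat | undefined;
  if (!model?.startsWith('anthropic/') || rf?.type !== 'json_schema') {
    return body;
  }
  const { strict: _strict, ...json_schema } = rf.json_schema;
  const { response_format: _rf, ...rest } = body;
  return { ...rest, output_format: { type: 'json_schema', json_schema } };
}
```

Gating on the `anthropic/` model prefix keeps the existing OpenAI and Google behavior untouched.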
## Expected Behavior

`Output.object({ schema })` and `generateObject()` should produce valid, schema-conforming JSON output when used with Anthropic models through the Helicone gateway.
## Environment

- `@helicone/ai-sdk-provider`: 1.0.12
- `ai` (Vercel AI SDK): 5.x
- Models affected: all `anthropic/*` models
- Models NOT affected: `openai/*`, `google/*` (they use `response_format` correctly)