
Conversation

@jsonbailey (Contributor) commented Oct 29, 2025

feat!: AI Config defaults now require the "enabled" attribute
feat!: Renamed LDAIAgentConfig to LDAIAgentConfigRequest for improved clarity
feat!: Renamed LDAIAgent to LDAIAgentConfig (note: this name previously referred to the type above)
feat!: Removed LDAIAgentDefault in favor of LDAIAgentConfig
feat!: Removed LDAIDefaults in favor of LDAIConfig
feat: Added a judge method to the AI SDK to retrieve an AI judge config
feat: Added an initJudge method that creates a judge from the provided judge key
feat: Added a trackEvalScores method to the config tracker
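For illustration, a minimal sketch of what the breaking `enabled` requirement might look like in practice. The interface below is a local stand-in written for this example, not the SDK's actual type definitions, which may differ in detail:

```typescript
// Local stand-in for the SDK's config-default shape; the real
// type lives in the SDK's api/config/types and may differ.
interface AIConfigDefault {
  enabled: boolean; // now required: defaults must state whether the config is on
  model?: { name: string };
}

// Before this PR, `enabled` could be omitted from defaults.
// After, every default must set it explicitly:
const defaultValue: AIConfigDefault = {
  enabled: true,
  model: { name: 'my-fallback-model' },
};

console.log(defaultValue.enabled); // true
```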


Note

Introduces judge evaluations with structured outputs and sampling; adds judgeConfig/createJudge; refactors config into typed modes (completion/agent/judge) with new completionConfig/agentConfig/agentConfigs/createChat; and enhances tracking with eval score metrics.

  • API (breaking/renames)
    • Add judgeConfig and createJudge for judge evaluations; ChatResponse.evaluations returns async judge results.
    • Replace initChat with createChat; add completionConfig, agentConfig, and agentConfigs (deprecate config, agent, agents).
    • Replace legacy config/agent types with new typed variants in api/config/types (LDAIConversationConfig*, LDAIAgentConfig*, LDAIJudgeConfig*); remove old LDAIConfig and agents module.
  • Core/config
    • Introduce config modes (completion/agent/judge) with validation and interpolation; new internal LDAIConfigUtils for flag <-> config mapping.
    • TrackedChat now initializes and runs attached judges in parallel and exposes getJudges().
  • Tracking/metrics
    • LDAIConfigTracker adds getTrackData() and trackEvalScores(); used to emit per-metric judge scores.
  • Providers
    • AIProvider adds default invokeStructuredModel for judges (and non-abstract invokeModel); factory updated to accept new config union type.
  • Docs/examples/tests
    • README and examples switch to createChat and require enabled in defaults.
    • Comprehensive tests for Judge, client config APIs, and TrackedChat behavior.
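The judge-evaluation flow summarized above (TrackedChat runs attached judges in parallel and emits per-metric scores via the tracker's new trackEvalScores) can be sketched with self-contained stand-in types. None of the names below (Judge, EvalScores, FakeTracker, evaluateWithJudges) are the SDK's real implementation, and the real createChat/createJudge signatures may differ:

```typescript
// Stand-in types mirroring the described flow: judges score a response
// per metric, and the tracker records one set of scores per judge.
type EvalScores = Record<string, number>;

interface Judge {
  key: string;
  evaluate(response: string): Promise<EvalScores>;
}

class FakeTracker {
  scores: EvalScores[] = [];
  // Mirrors the role of the new trackEvalScores: emit per-metric scores.
  trackEvalScores(scores: EvalScores): void {
    this.scores.push(scores);
  }
}

async function evaluateWithJudges(
  response: string,
  judges: Judge[],
  tracker: FakeTracker,
): Promise<EvalScores[]> {
  // Judges run in parallel, as the summary describes for TrackedChat.
  const results = await Promise.all(judges.map((j) => j.evaluate(response)));
  results.forEach((scores) => tracker.trackEvalScores(scores));
  return results;
}

// Demo with a trivial judge that scores whether the response is non-empty.
const lengthJudge: Judge = {
  key: 'length-judge',
  evaluate: async (r) => ({ relevance: r.length > 0 ? 1 : 0 }),
};

const tracker = new FakeTracker();
evaluateWithJudges('hello', [lengthJudge], tracker).then((results) => {
  console.log(results); // [{ relevance: 1 }]
});
```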

Written by Cursor Bugbot for commit 6cc3c43.

@jsonbailey jsonbailey requested a review from a team as a code owner October 29, 2025 19:04
@github-actions

@launchdarkly/browser size report
This is the brotli compressed size of the ESM build.
Compressed size: 169118 bytes
Compressed size limit: 200000 bytes
Uncompressed size: 789399 bytes


@github-actions

@launchdarkly/js-sdk-common size report
This is the brotli compressed size of the ESM build.
Compressed size: 24988 bytes
Compressed size limit: 26000 bytes
Uncompressed size: 122411 bytes

@github-actions

@launchdarkly/js-client-sdk size report
This is the brotli compressed size of the ESM build.
Compressed size: 21721 bytes
Compressed size limit: 25000 bytes
Uncompressed size: 74698 bytes

@github-actions

@launchdarkly/js-client-sdk-common size report
This is the brotli compressed size of the ESM build.
Compressed size: 17636 bytes
Compressed size limit: 20000 bytes
Uncompressed size: 90259 bytes

@tanderson-ld (Contributor)

Will feat! cause it to release a non-alpha version?

@kinyoklion (Member)

> Will feat! cause it to release a non-alpha version?

If this is set: `"bump-minor-pre-major": true`, then it will only be a minor version bump.
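For context, `bump-minor-pre-major` is a release-please configuration option that keeps breaking changes on pre-1.0 packages from forcing a major bump. A minimal sketch of where it sits in a release-please-config.json; the surrounding structure here is illustrative, not this repository's actual config:

```json
{
  "bump-minor-pre-major": true,
  "packages": {
    "packages/sdk/ai": {}
  }
}
```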

Review thread on the following hunk:

```typescript
  }));
}
const config = await this._evaluate(key, context, defaultValue, 'completion', variables);
return this._addVercelAISDKSupport(config as LDAIConversationConfig);
```
Contributor:
It seems like the core AI SDK implementation shouldn't know about the specifics of an extending SDK.

Contributor Author (@jsonbailey):

This was previously supported and is left unchanged in this PR, other than being moved into a helper function. We plan to deprecate it and remove it in favor of the new Vercel provider package.

Contributor Author (@jsonbailey):

@abarker-launchdarkly do you have any reservations if we remove this now since it will be a major breaking change anyways?

@jsonbailey (Author)

> Will feat! cause it to release a non-alpha version?

> If this is set: `"bump-minor-pre-major": true`, then it will only be a minor version bump.

I only want it to be a minor bump, but I want the changelogs to properly show breaking changes even for a minor version.

@tanderson-ld tanderson-ld self-requested a review November 3, 2025 15:22

4 participants