fix(e2e): stabilize canary tests for langchain version drift and anthropic timeout #1707
Merged
Luca Forstner (lforst) merged 3 commits into main on Mar 31, 2026
Conversation
…ropic timeout

Normalize invocation-parameter fields (max_tokens, model, stream, temperature) that older @langchain/openai included alongside the standardized ls_* metadata keys but newer versions removed. Dropping these from the snapshot makes the wrap-langchain-js-traces canary test stable across both the locked and latest langchain versions.

Also increase the anthropic-instrumentation scenario timeout from 90s to 150s to accommodate the additional messages.batches API calls (create, retrieve, list, cancel) that will be exercised once the batches PR merges.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
- Add stream_options to LANGCHAIN_LS_VOLATILE_KEYS — newer @langchain/openai dropped it from the ls_* metadata block, causing the canary snapshot to mismatch
- Update ai-sdk-v6.otel-spans.json: ai 6.0.1 reordered spans so ai.generateText now precedes ai.generateText.doGenerate (previously reversed)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
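As a rough sketch of the normalization these commits describe: delete the volatile invocation-parameter duplicates from any metadata object that already carries standardized ls_* keys. The names LANGCHAIN_LS_VOLATILE_KEYS and normalizeLsMetadata mirror the PR's wording but are illustrative here, not necessarily the exact identifiers in assertions.ts.

```typescript
// Volatile duplicates of the standardized ls_* fields; older
// @langchain/openai emits them, newer versions do not, so the
// snapshot must not depend on their presence. (Illustrative list.)
const LANGCHAIN_LS_VOLATILE_KEYS = [
  "max_tokens",
  "model",
  "stream",
  "stream_options",
  "temperature",
];

function normalizeLsMetadata(
  metadata: Record<string, unknown>,
): Record<string, unknown> {
  // Only touch objects that carry ls_* keys — the volatile fields are
  // already captured there as ls_max_tokens, ls_model_name, etc.
  const hasLsKeys = Object.keys(metadata).some((k) => k.startsWith("ls_"));
  if (!hasLsKeys) return metadata;
  const normalized: Record<string, unknown> = { ...metadata };
  for (const key of LANGCHAIN_LS_VOLATILE_KEYS) {
    delete normalized[key];
  }
  return normalized;
}
```

With this in place the same snapshot matches both the locked and latest @langchain/openai, since only the stable ls_* representation survives.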
Force-pushed from c0d3f7d to 386d195
OTel spans arrive in non-deterministic order across requests, causing ai.generateText vs ai.generateText.doGenerate to flap position between runs. Sort by name (then hasParent) before snapshotting so the comparison is stable. Both v5 and v6 snapshots are now identical since they contain the same set of spans.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
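The sort described above can be sketched as a small comparator applied before snapshot comparison; the SnapshotSpan shape and sortSpansForSnapshot name are assumptions for illustration, not the test's actual types.

```typescript
// Minimal span shape for snapshot sorting (illustrative).
interface SnapshotSpan {
  name: string;
  hasParent: boolean;
}

function sortSpansForSnapshot(spans: SnapshotSpan[]): SnapshotSpan[] {
  // Order by name first, then by hasParent, so spans whose arrival
  // order is non-deterministic land in a stable position every run.
  return [...spans].sort((a, b) => {
    if (a.name !== b.name) return a.name < b.name ? -1 : 1;
    return Number(a.hasParent) - Number(b.hasParent);
  });
}
```

Because the comparator depends only on span content, the snapshot no longer flaps when ai.generateText and ai.generateText.doGenerate arrive in different orders.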
Luca Forstner (lforst) approved these changes on Mar 31, 2026
Summary
Newer @langchain/openai removed max_tokens, model, stream, and temperature from the ls_* metadata block (they're already captured in the standardized ls_max_tokens, ls_model_name, etc. fields). Added normalization in assertions.ts to delete these volatile keys from any metadata object that contains ls_* keys, making the wrap-langchain-js-traces snapshot stable across both the locked (1.3.0) and latest langchain versions.

Bumped the anthropic-instrumentation scenario timeout from 90s to 150s to accommodate the additional messages.batches API calls (create, retrieve, list, cancel) that will land when feat: Add Message Batches API instrumentation #1698 merges.

Test plan
The wrap-langchain-js-traces canary test passes with both the locked and latest @langchain/openai.

🤖 Generated with Claude Code