
fix(e2e): stabilize canary tests for langchain version drift and anthropic timeout#1707

Merged
Luca Forstner (lforst) merged 3 commits into main from fix/e2e-canary-fixes on Mar 31, 2026

Conversation

@Qard
Contributor

Summary

  • Langchain canary: Newer versions of @langchain/openai removed max_tokens, model, stream, and temperature from the ls_* metadata block (they're already captured in the standardized ls_max_tokens, ls_model_name, etc. fields). Added normalization in assertions.ts to delete these volatile keys from any metadata object that contains ls_ keys, making the wrap-langchain-js-traces snapshot stable across both the locked (1.3.0) and latest langchain versions.
  • Anthropic timeout: Increase the anthropic-instrumentation scenario timeout from 90s to 150s to accommodate the additional messages.batches API calls (create, retrieve, list, cancel) that will land when feat: Add Message Batches API instrumentation #1698 merges.
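The metadata normalization described above can be sketched as follows. This is a hypothetical illustration of the approach, not the actual assertions.ts code; the helper name and exact key list are assumptions:

```typescript
// Volatile invocation-parameter keys that newer @langchain/openai versions
// removed from the ls_* metadata block (assumed list for illustration).
const LANGCHAIN_LS_VOLATILE_KEYS = [
  "max_tokens",
  "model",
  "stream",
  "temperature",
];

// Delete volatile keys from any metadata object that carries standardized
// ls_* keys, so snapshots match across locked and latest langchain versions.
function normalizeLangchainMetadata(
  metadata: Record<string, unknown>
): Record<string, unknown> {
  const hasLsKeys = Object.keys(metadata).some((k) => k.startsWith("ls_"));
  if (!hasLsKeys) return metadata; // leave non-langchain metadata untouched
  const normalized = { ...metadata };
  for (const key of LANGCHAIN_LS_VOLATILE_KEYS) {
    delete normalized[key];
  }
  return normalized;
}
```

The `ls_`-key guard keeps the normalization scoped to langchain metadata blocks, so unrelated spans in the same snapshot are not modified.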

Test plan

  • wrap-langchain-js-traces canary test passes with both locked and latest @langchain/openai
  • Anthropic instrumentation tests unaffected (timeout increase is headroom only)

🤖 Generated with Claude Code

Stephen Belanger (Qard) and others added 2 commits March 31, 2026 10:35
fix(e2e): stabilize canary tests for langchain version drift and anthropic timeout

Normalize invocation-parameter fields (max_tokens, model, stream,
temperature) that older @langchain/openai included alongside the
standardized ls_* metadata keys but newer versions removed. Dropping
these from the snapshot makes the wrap-langchain-js-traces canary test
stable across both the locked and latest langchain versions.

Also increase the anthropic-instrumentation scenario timeout from 90s to
150s to accommodate the additional messages.batches API calls (create,
retrieve, list, cancel) that will be exercised once the batches PR merges.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
- Add stream_options to LANGCHAIN_LS_VOLATILE_KEYS — newer @langchain/openai
  dropped it from the ls_* metadata block, causing the canary snapshot to mismatch
- Update ai-sdk-v6.otel-spans.json: ai 6.0.1 reordered spans so ai.generateText
  now precedes ai.generateText.doGenerate (previously reversed)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
OTel spans arrive in non-deterministic order across requests, causing
ai.generateText vs ai.generateText.doGenerate to flap position between
runs. Sort by name (then hasParent) before snapshotting so the comparison
is stable. Both v5 and v6 snapshots are now identical since they contain
the same set of spans.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
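The sort-before-snapshot fix above can be sketched as below. This is a minimal illustration under assumed span and field names (`parentSpanId` as the "hasParent" signal), not the actual snapshot code:

```typescript
// Minimal span shape for illustration; real OTel span exports carry more fields.
interface SpanLike {
  name: string;
  parentSpanId?: string;
}

// Sort spans by name, then put parentless (root) spans before children with
// the same name, so snapshot order no longer depends on arrival order.
function sortSpansForSnapshot(spans: SpanLike[]): SpanLike[] {
  return [...spans].sort((a, b) => {
    if (a.name !== b.name) return a.name < b.name ? -1 : 1;
    const aHasParent = a.parentSpanId ? 1 : 0;
    const bHasParent = b.parentSpanId ? 1 : 0;
    return aHasParent - bHasParent;
  });
}
```

Sorting a copy (`[...spans]`) keeps the comparison deterministic without mutating the order the exporter delivered, which can still be useful for debugging.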
Luca Forstner (lforst) merged commit e5a81cb into main on Mar 31, 2026
45 checks passed
Luca Forstner (lforst) deleted the fix/e2e-canary-fixes branch on March 31, 2026 at 08:31

Labels

bug Something isn't working
