Description
Is there an existing issue for this?
- I have checked for existing issues https://github.com/getsentry/sentry-javascript/issues
- I have reviewed the documentation https://docs.sentry.io/
- I am using the latest SDK release https://github.com/getsentry/sentry-javascript/releases
How do you use Sentry?
Sentry Saas (sentry.io)
Which SDK are you using?
@sentry/browser
SDK Version
^10.40.0
Framework Version
No response
Link to Sentry event
Reproduction Example/SDK Setup
No response
Steps to Reproduce
const graph = new StateGraph(IdeaState)
.addNode("expand", expandNode) // calls llm.invoke()
.addNode("validate", validateNode) // calls llm.invoke()
.addNode("differentiate", diffNode) // calls llm.invoke()
.addNode("roadmap", roadmapNode) // calls llm.invoke()
.addNode("pitch", pitchNode) // calls llm.invoke()
.addEdge(START, "expand")
.addEdge("expand", "validate")
.addEdge("validate", "differentiate")
.addEdge("differentiate", "roadmap")
.addEdge("roadmap", "pitch")
.addEdge("pitch", END);
Sentry.instrumentLangGraph(graph, { recordInputs: true, recordOutputs: true });
const compiled = graph.compile({ name: "idea-forge" });
const sentryHandler = Sentry.createLangChainCallbackHandler({
recordInputs: true,
recordOutputs: true,
});
const result = await compiled.invoke({ idea: "test" }, { callbacks: [sentryHandler] });
Expected Result
All 5 LLM calls should produce chat spans in the trace.
Actual Result
| Setup | chat spans captured |
|---|---|
| instrumentLangGraph + createLangChainCallbackHandler | 1 out of 5 |
| createLangChainCallbackHandler only | 5 out of 5 |
| instrumentLangGraph only | 0 out of 5 (expected: no callback handler) |
4 out of 5 chat spans are silently dropped with no error or warning.
Additionally, the trace contains multiple spurious nested invoke_agent sub-spans with near-zero durations (0.20ms) that should not exist:
invoke_agent idea-forge 7.43s ← top-level (expected)
invoke_agent idea-forge 2.70ms ← spurious
invoke_agent idea-forge 1.60ms ← spurious
invoke_agent idea-forge 0.20ms ← spurious
invoke_agent idea-forge 4.41s
chat claude-haiku-4-5-20251001 4.41s ← only 1 of 5 chat spans
invoke_agent idea-forge 0.20ms ← spurious
invoke_agent idea-forge 0.20ms ← spurious
Additional Context
Possible Root Cause:
instrumentLangGraph wraps invoke() in Sentry.startSpan(). As async execution flows through the sequential nodes, the span context established by startSpan is not consistently propagated across node invocations, so some handleChatModelStart callbacks from the handler fire outside an active span context and their spans are silently dropped.
Priority
React with 👍 to help prioritize this issue. Please use comments to provide useful context, avoiding "+1" or "me too", to help us triage it.