
Conversation

@danexello

Addresses #15

  • Introduced BaggageSpanProcessor that integrates baggage entries into span attributes upon span creation.
  • Updated CHANGELOG for version 0.9.2 to reflect this new feature.
  • Modified CONTRIBUTING.md to update repository links.
  • Added unit tests for BaggageSpanProcessor to ensure correct functionality and handling of various baggage scenarios.
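
In rough terms, the processor copies each baggage entry onto a span as a string attribute when the span starts. The snippet below is only an illustrative, self-contained sketch of that idea; the stub types and member names are placeholders and do not reflect the actual dartastic_opentelemetry API.

```dart
// Illustrative sketch only: the types below are minimal stand-ins, not the
// real dartastic_opentelemetry classes, and the member names are assumptions.
class Baggage {
  final Map<String, String> entries;
  const Baggage(this.entries);
}

class Context {
  static Context current = const Context(null);
  final Baggage? baggage;
  const Context(this.baggage);
}

abstract class Span {
  void setAttribute(String key, String value);
}

class BaggageSpanProcessor {
  /// Called when a span starts: copy every baggage entry on the active
  /// context onto the span as a string attribute.
  void onStart(Span span, Context? parentContext) {
    final baggage = Context.current.baggage;
    if (baggage == null) return;
    baggage.entries.forEach(span.setAttribute);
  }

  /// Nothing to do when a span ends.
  void onEnd(Span span) {}
}
```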

@michaelbushe
Member

@danexello This is very good! I appreciate the coverage of many test cases, and the doc was very good. I have some updates coming. Minor things like removing the reference to inferior languages like Python. ;) (and not tying the doc to changes in Python's baggage implementation)
Also, we've already released 0.9.3, so this will be in 0.9.4. You'll need to rebase. If you rebase after I merge PR 18, you'll also get the coverage fixes and can run ./tool/coverage.sh to see coverage for your changes.

Thank you very much.

@michaelbushe
Member

Questions I have:

  1. Should we consider max cardinality limits to prevent attribute explosion?
  2. Consider adding a benchmark test in test/performance/baggage/ to measure overhead
  3. The only non-doc change I had was ensuring the parent context gets the baggage if it's passed in:
    final baggage = (parentContext ?? Context.current).baggage;
    Please add a test for this.
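
For concreteness, the shape of such a test might be roughly as follows; the helpers withBaggage, Baggage.fromMap, and FakeSpan are placeholders, not the package's actual API.

```dart
// Rough sketch of the requested test; helper and constructor names are
// placeholders to be adapted to the package's real API and test utilities.
test('copies baggage from an explicitly passed parent context', () {
  // A parent context that carries one baggage entry (hypothetical helpers).
  final parentContext = Context.current.withBaggage(
    Baggage.fromMap({'tenant.id': 'acme'}),
  );

  final processor = BaggageSpanProcessor();
  final span = FakeSpan();

  // The behavior under discussion: (parentContext ?? Context.current).baggage
  processor.onStart(span, parentContext);

  expect(span.attributes['tenant.id'], equals('acme'));
});
```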

@michaelbushe
Member

@danexello Please merge in my pr17-improvements branch
Thank you.

@danexello
Author

@michaelbushe Thanks for the feedback, and sounds good. I can get to this tomorrow.

@danexello
Author

danexello commented Oct 30, 2025

@michaelbushe I "merged" in the changes indicated in your pr17-improvements branch (scare quotes because I had some trouble getting the branch onto my local fork, so I just manually added them in).

One thing -- I'm not sure if we actually want this proposed change. I think we want to prefer the Baggage on the current context, as it will inherit any Baggage present on the Parent, which I think addresses your comment:

ensuring the parent context gets the baggage if it's passed in:
final baggage = (parentContext ?? Context.current).baggage

I think we still want Context.current to be used for all Spans that get passed into the SpanProcessor pipeline, even if they have Parent Contexts. Let me know if you think I'm mistaken; more than happy to make the quick change.
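
To spell out the two readings side by side (the first line is the change from pr17-improvements; the second, shown commented out, is how I understand the processor to behave now):

```dart
// From pr17-improvements: prefer the baggage on an explicitly passed parent context.
final baggage = (parentContext ?? Context.current).baggage;

// Alternative argued for above: always read the current context, which already
// inherits any baggage present on the parent.
// final baggage = Context.current.baggage;
```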

@danexello
Author

Questions I have:

  1. Should we consider max cardinality limits to prevent attribute explosion?
  2. Consider adding a benchmark test in test/performance/baggage/ to measure overhead

1. Good question, and I don't know if that is necessary. It seems like managing the cardinality of Baggage should be an end-user responsibility, and I'd expect an end user to understand the basic entailments of using the BaggageSpanProcessor re: how Baggage gets added as SpanAttributes. I also believe the cardinality of Baggage is expected to be significantly lower than that of SpanAttributes, since Baggage is designed for cross-service context propagation (i.e. in HTTP headers) and the W3C spec places a significant ceiling on Baggage cardinality.

2. Sounds good!

@danexello
Author

I started playing around with getting a perf test up, but ran into problems getting test/performance/baggage/baggage_benchmarks.dart to run (even without the tests handling the BaggageSpanProcessor), and the scope of the fixes was such that I'd much rather put them into a separate PR.

@michaelbushe
Member

@danexello Apologies for the delay. I just came back to it wondering where you were and saw that I'm the blocker now. Yes, I agree with your comments. Let's get this in. We can worry about performance later for this; it's an opt-in.

@michaelbushe
Member

The performance test is acting oddly. It's failing intermittently.
./tool/test.sh passes locally in the shell and in CI/CD.
./tool/coverage.sh passes locally in the shell and in CI/CD.

When I run it locally via ./test/coverage.sh, these are the three consistent failures.

I'm still working on this. I got some of the three to pass, perhaps by setting concurrency to 1 (from 10) in coverage.sh, or perhaps from fixing a bug where the test replaces the print for OTelLog to capture log output but OTel.initialize via OTelEnv.initializeLogging() was not respecting it. Let me look at this some more.

It doesn't happen on main, so although the feature is good, we want to ensure its quality before merging.

Here are the tests that fail locally.

00:12 +179 ~2: test/unit/context/context_propagator_test.dart: Context Propagation composite propagator combines multiple propagators
[2025-11-13T12:19:14.094441] [DEBUG] OTel: Shutting down tracer providers
[2025-11-13T12:19:14.094508] [DEBUG] TracerProvider: Force flushing 1 processors
[2025-11-13T12:19:14.094598] [DEBUG] TracerProvider: Flushing processor BatchSpanProcessor
[2025-11-13T12:19:14.094894] [DEBUG] TracerProvider: Successfully flushed processor BatchSpanProcessor
[2025-11-13T12:19:14.094951] [DEBUG] TracerProvider: Force flush complete
[2025-11-13T12:19:14.094975] [DEBUG] OTel: Tracer provider flush complete
[2025-11-13T12:19:14.094996] [DEBUG] TracerProvider: Shutting down with 1 processors
[2025-11-13T12:19:14.095014] [DEBUG] TracerProvider: Shutting down with 1 processors
[2025-11-13T12:19:14.095035] [DEBUG] TracerProvider: Shutting down processor BatchSpanProcessor
[2025-11-13T12:19:14.095055] [DEBUG] SDKTracerProvider: Shutting down processor BatchSpanProcessor
[2025-11-13T12:19:14.095118] [DEBUG] OtlpHttpSpanExporter: Shutdown requested
[2025-11-13T12:19:14.095138] [DEBUG] OtlpHttpSpanExporter: Shutting down - waiting for 0 pending exports
[2025-11-13T12:19:14.095178] [DEBUG] OtlpHttpSpanExporter: Shutdown complete
[2025-11-13T12:19:14.095234] [DEBUG] TracerProvider: Successfully shut down processor BatchSpanProcessor
[2025-11-13T12:19:14.095255] [DEBUG] TracerProvider: Cleared cached tracers
[2025-11-13T12:19:14.095301] [DEBUG] TracerProvider: Delegate shutdown complete
[2025-11-13T12:19:14.095320] [DEBUG] TracerProvider: Shutdown complete
[2025-11-13T12:19:14.095338] [DEBUG] OTel: Tracer provider shutdown complete
[2025-11-13T12:19:14.095360] [DEBUG] OTel: Shutting down meter provider
[2025-11-13T12:19:14.095419] [metric] PeriodicExportingMetricReader: Collected 0 metrics
[2025-11-13T12:19:14.095467] [export] OtlpGrpcMetricExporter: Channel shutdown completed
[2025-11-13T12:19:14.095507] [DEBUG] OTel: Meter provider shutdown complete
[2025-11-13T12:19:14.095528] [DEBUG] OTel: Resetting state
[2025-11-13T12:19:14.095548] [DEBUG] OTel: Shutting down tracer providers
[2025-11-13T12:19:14.095566] [DEBUG] TracerProvider: Force flushing 1 processors
[2025-11-13T12:19:14.095582] [DEBUG] TracerProvider: Cannot force flush - provider is shut down
[2025-11-13T12:19:14.095617] [DEBUG] OTel: Tracer provider flush complete
[2025-11-13T12:19:14.095636] [DEBUG] TracerProvider: Shutting down with 1 processors
[2025-11-13T12:19:14.095657] [DEBUG] TracerProvider: Shutting down with 1 processors
[2025-11-13T12:19:14.095674] [DEBUG] TracerProvider: Already shut down
[2025-11-13T12:19:14.095697] [DEBUG] OTel: Tracer provider shutdown complete
[2025-11-13T12:19:14.095721] [DEBUG] OTel: Shutting down meter provider
[2025-11-13T12:19:14.095747] [DEBUG] OTel: Meter provider shutdown complete
[2025-11-13T12:19:14.095766] [DEBUG] OTel: Reset static fields
[2025-11-13T12:19:14.095788] [DEBUG] OTel: Reset OTelAPI
[2025-11-13T12:19:14.095806] [DEBUG] OTel: Reset OTelFactory
[2025-11-13T12:19:14.095825] [DEBUG] OTel: Cleared test environment
00:12 +179 ~2 -1: test/unit/metrics/meter_coverage_test.dart: OTelLog Control Tests Meter creation logs when metrics logging is enabled [E]
Expected: true
Actual:

package:matcher expect
test/unit/metrics/meter_coverage_test.dart 387:7 main..

00:26 +397 ~12 -1: test/unit/util/otel_log_test.dart: OTelLog Tests OTelLog functions respect isXxx() convenience methods
[2025-11-13T12:22:27.947371] [DEBUG] OTel logging initialized
[2025-11-13T12:22:27.947538] [DEBUG] OTel initialized with endpoint: http://localhost:4317/, service: test-log-service
[2025-11-13T12:22:27.947745] [DEBUG] Resource merge result attributes:
[2025-11-13T12:22:27.947877] [DEBUG] Resource merge result attributes:
[2025-11-13T12:22:27.948115] [DEBUG] Resource merge result attributes:
[2025-11-13T12:22:27.948269] [DEBUG] Resource merge result attributes:
[2025-11-13T12:22:27.948326] [DEBUG] service.name: test-log-service
[2025-11-13T12:22:27.948380] [DEBUG] Resource after platform merge:
[2025-11-13T12:22:27.948434] [DEBUG] service.name: test-log-service
[2025-11-13T12:22:27.948616] [DEBUG] OtlpHttpSpanExporter: Created with endpoint: http://localhost:4318/
[2025-11-13T12:22:27.948675] [DEBUG] OtlpHttpSpanExporter: Configured headers count: 0
[2025-11-13T12:22:27.948797] [DEBUG] TracerProvider: Created with resource: null, sampler: null
[2025-11-13T12:22:27.948843] [DEBUG] OTel.tracerProvider: Setting resource from default
[2025-11-13T12:22:27.948989] [DEBUG] service.name: test-log-service
[2025-11-13T12:22:27.949193] [DEBUG] SDKTracerProvider: Adding span processor of type BatchSpanProcessor
[2025-11-13T12:22:27.949340] [export] OtlpGrpcMetricExporter: Creating client for localhost:4317
[2025-11-13T12:22:27.949438] [DEBUG] MeterProvider: Created with resource: null
00:26 +397 ~12 -2: test/unit/trace/span_lifecycle_test.dart: Span Lifecycle span should record events

Perhaps the third failure is this:
[2025-11-13T12:25:25.095615] [metric] PeriodicExportingMetricReader: Collected 0 metrics
[2025-11-13T12:25:25.101458] [export] OtlpGrpcMetricExporter: Channel shutdown completed
[2025-11-13T12:25:25.101518] [DEBUG] OTel: Meter provider shutdown complete
[2025-11-13T12:25:25.101529] [DEBUG] OTel: Reset static fields
[2025-11-13T12:25:25.101584] [DEBUG] OTel: Reset OTelAPI
[2025-11-13T12:25:25.101590] [DEBUG] OTel: Reset OTelFactory
[2025-11-13T12:25:25.101602] [DEBUG] OTel: Cleared test environment
00:43 +558 ~14 -3: test/unit/context/context_propagation_test.dart: Context Propagation withSpanContext prevents trace ID changes

00:35 +514 ~12 -2: test/unit/export/console_exporter_test.dart: ConsoleExporter Unit Tests Debug logging shows processor notifications
[2025-11-13T12:25:17.897395] [DEBUG] OTel logging initialized
[2025-11-13T12:25:17.897458] [DEBUG] OTel initialized with endpoint: http://localhost:4317/, service: @dart/dartastic_opentelemetry
[2025-11-13T12:25:17.897534] [DEBUG] Resource merge result attributes:
[2025-11-13T12:25:17.897592] [DEBUG] Resource merge result attributes:
[2025-11-13T12:25:17.897721] [DEBUG] Resource merge result attributes:
[2025-11-13T12:25:17.897774] [DEBUG] Resource merge result attributes:
[2025-11-13T12:25:17.897797] [DEBUG] service.name: @dart/dartastic_opentelemetry
[2025-11-13T12:25:17.897813] [DEBUG] Resource after platform merge:
[2025-11-13T12:25:17.897835] [DEBUG] service.name: @dart/dartastic_opentelemetry
[2025-11-13T12:25:17.897859] [DEBUG] TracerProvider: Created with resource: null, sampler: null
[2025-11-13T12:25:17.897874] [DEBUG] OTel.tracerProvider: Setting resource from default
[2025-11-13T12:25:17.897895] [DEBUG] service.name: @dart/dartastic_opentelemetry
[2025-11-13T12:25:17.897913] [DEBUG] SDKTracerProvider: Adding span processor of type TestSpanProcessor
[2025-11-13T12:25:17.898788] [export] OtlpGrpcMetricExporter: Creating client for localhost:4317
[2025-11-13T12:25:17.899146] [DEBUG] MeterProvider: Created with resource: null
[2025-11-13T12:25:17.899217] [DEBUG] TracerProvider: Getting tracer with name: dartastic, version: 1.0.0, schemaUrl: null
[2025-11-13T12:25:17.899313] [DEBUG] Tracer: Starting span with name: debug-test-span, kind: SpanKind.internal
[2025-11-13T12:25:17.900541] [DEBUG] Creating root span: traceId=df288ec4d956dd650d003bc2974f4925
[2025-11-13T12:25:17.900614] [DEBUG] Sampling decision for span debug-test-span: SamplingDecision.recordAndSample
[2025-11-13T12:25:17.900774] [DEBUG] SDKSpan: Created new span with name debug-test-span
[2025-11-13T12:25:17.900819] [DEBUG] SDKSpan: Starting to end span 6623fc5d6eaf71fe with name debug-test-span
[2025-11-13T12:25:17.900835] [DEBUG] SDKSpan: Calling delegate.end() for span debug-test-span
[2025-11-13T12:25:17.900853] [DEBUG] SDKSpan: Delegate.end() completed for span debug-test-span
[2025-11-13T12:25:17.900869] [DEBUG] SDKSpan: Notifying 1 span processors
[2025-11-13T12:25:17.900887] [DEBUG] SDKSpan: Calling onEnd for processor TestSpanProcessor
[2025-11-13T12:25:17.900904] [DEBUG] SDKSpan: Successfully called onEnd for processor TestSpanProcessor
00:35 +515 ~12 -3: test/unit/export/console_exporter_test.dart: ConsoleExporter Unit Tests Debug logging shows processor notifications [E]
Expected: contains 'onEnd'
Actual: ''
Which: does not contain 'onEnd'

package:matcher expect
test/unit/export/console_exporter_test.dart 466:7 main..

@michaelbushe
Member

This just needs a rebase to catch up with the latest main; then it will work and will (now) publish the coverage to GitHub Pages. baggage_propagator had 100% test coverage - nice! Thank you.
