llmobs: misc instrumentation doc improvements #30684


Merged: 4 commits, Aug 4, 2025
2 changes: 1 addition & 1 deletion config/_default/menus/main.en.yaml
@@ -4648,7 +4648,7 @@ menu:
parent: llm_obs_instrumentation
identifier: llm_obs_instrumentation_custom
weight: 202
- name: API
- name: HTTP API
url: llm_observability/instrumentation/api
parent: llm_obs_instrumentation
identifier: llm_obs_instrumentation_api
8 changes: 7 additions & 1 deletion content/en/llm_observability/instrumentation/_index.md
@@ -13,6 +13,12 @@ You can visualize the interactions and performance data of your LLM applications

{{< img src="llm_observability/traces.png" alt="An LLM Observability trace displaying each span of a request" style="width:100%;" >}}


## Further Reading

{{< partial name="whats-next/whats-next.html" >}}


[1]: /llm_observability/auto_instrumentation
[2]: /llm_observability/setup/api
[3]: https://app.datadoghq.com/llm/traces
[3]: https://app.datadoghq.com/llm/traces
4 changes: 2 additions & 2 deletions content/en/llm_observability/instrumentation/api.md
@@ -1,5 +1,5 @@
---
title: API Reference
title: HTTP API Reference
aliases:
- /tracing/llm_observability/api
- /llm_observability/api
@@ -8,7 +8,7 @@ aliases:

## Overview

The LLM Observability API provides an interface for developers to send LLM-related traces and spans to Datadog. If your application is written in Python or Node.js, you can use the [LLM Observability SDKs][1].
The LLM Observability HTTP API provides an interface for developers to send LLM-related traces and spans to Datadog. If your application is written in Python or Node.js, you can use the [LLM Observability SDKs][1].

The API accepts spans with timestamps no more than 24 hours old, allowing limited backfill of delayed data.

@@ -11,7 +11,6 @@ further_reading:
text: 'Learn about the LLM Observability SDK for Python'
---

<div class="alert alert-info">Datadog offers a variety of artificial intelligence (AI) and machine learning (ML) capabilities. The <a href="/integrations/#cat-aiml">AI/ML integrations on the Integrations page and the Datadog Marketplace</a> are platform-wide Datadog functionalities. <br><br> For example, APM offers a native integration with OpenAI for monitoring your OpenAI usage, while Infrastructure Monitoring offers an integration with NVIDIA DCGM Exporter for monitoring compute-intensive AI workloads. These integrations are different from the LLM Observability offering.</div>

{{< tabs >}}
{{% tab "Python" %}}
@@ -25,8 +25,6 @@ This page explains how to use the Datadog LLM Observability SDK's custom instrumentation

## Instrument an LLM application

<div class="alert alert-info">These instructions use the <a href="/llm_observability/setup/sdk">LLM Observability SDK for Python</a>. If your application is running in a serverless environment, follow the <a href="/llm_observability/setup/sdk/#aws-lambda-setup">serverless setup instructions</a>. </br></br> If your application is not written in Python, you can complete the steps below with API requests instead of SDK function calls.</div>

To instrument an LLM application:

1. [Install the LLM Observability SDK for Python][5].
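The decorator-based span pattern that the instrumentation steps above lead to can be illustrated with a plain-Python stand-in. This is not the ddtrace SDK — `span` and `RECORDED` here are hypothetical names — it only sketches how nested workflow/task spans compose:

```python
import functools

RECORDED = []  # collected (kind, function_name) pairs, in completion order

def span(kind):
    """Hypothetical stand-in for the SDK's workflow/task/llm decorators:
    records a span entry each time the wrapped function finishes."""
    def decorate(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            result = fn(*args, **kwargs)
            RECORDED.append((kind, fn.__name__))
            return result
        return wrapper
    return decorate

@span("task")
def retrieve_context(query):
    # An inner step of the application becomes a child span.
    return f"docs for {query}"

@span("workflow")
def answer(query):
    # The top-level entry point becomes the root span of the trace.
    context = retrieve_context(query)
    return f"answer using {context}"

print(answer("billing"))  # prints "answer using docs for billing"
```

Because the task finishes before the workflow returns, the task's span is recorded first — the same parent/child ordering the real SDK reconstructs into a trace tree.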
3 changes: 0 additions & 3 deletions content/en/llm_observability/instrumentation/sdk.md
@@ -2283,6 +2283,3 @@ tracer.use('http', false) // disable the http integration
[11]: /tracing/trace_collection/compatibility/python/#integrations
[12]: /tracing/trace_collection/compatibility/python/#library-compatibility
[13]: /llm_observability/instrumentation/auto_instrumentation/
[14]: /serverless/aws_lambda/installation/python/?tab=custom#installation
[15]: /llm_observability/quickstart?tab=python#trace-an-llm-application-in-aws-lambda
[16]: https://app.datadoghq.com/llm/settings/evaluations
5 changes: 1 addition & 4 deletions content/en/llm_observability/quickstart.md
@@ -3,15 +3,12 @@ title: Quickstart
aliases:
- /tracing/llm_observability/quickstart
further_reading:
- link: '/llm_observability'
tag: 'Documentation'
text: 'Learn about LLM Observability'
- link: '/llm_observability/evaluations'
tag: 'Evaluations'
text: 'Configure Evaluations on your application'
- link: '/llm_observability/instrumentation/custom_instrumentation'
tag: 'Custom Instrumentation'
text: 'Custom instrumentation'
text: 'Instrument your application with custom spans'
---

This page demonstrates using Datadog's LLM Observability SDK to instrument a Python or Node.js LLM application.