diff --git a/config/_default/menus/main.en.yaml b/config/_default/menus/main.en.yaml
index 7f72d64878e6a..1fb7c9743e6bf 100644
--- a/config/_default/menus/main.en.yaml
+++ b/config/_default/menus/main.en.yaml
@@ -4648,7 +4648,7 @@ menu:
       parent: llm_obs_instrumentation
       identifier: llm_obs_instrumentation_custom
       weight: 202
-    - name: API
+    - name: HTTP API
       url: llm_observability/instrumentation/api
       parent: llm_obs_instrumentation
       identifier: llm_obs_instrumentation_api
diff --git a/content/en/llm_observability/instrumentation/_index.md b/content/en/llm_observability/instrumentation/_index.md
index cc17d4e979b53..bc2f0709f383a 100644
--- a/content/en/llm_observability/instrumentation/_index.md
+++ b/content/en/llm_observability/instrumentation/_index.md
@@ -13,6 +13,12 @@ You can visualize the interactions and performance data of your LLM applications
 
 {{< img src="llm_observability/traces.png" alt="An LLM Observability trace displaying each span of a request" style="width:100%;" >}}
 
+
+## Further Reading
+
+{{< partial name="whats-next/whats-next.html" >}}
+
+
 [1]: /llm_observability/auto_instrumentation
 [2]: /llm_observability/setup/api
-[3]: https://app.datadoghq.com/llm/traces
\ No newline at end of file
+[3]: https://app.datadoghq.com/llm/traces
diff --git a/content/en/llm_observability/instrumentation/api.md b/content/en/llm_observability/instrumentation/api.md
index 0bae20e2feb69..111aa8dc99115 100644
--- a/content/en/llm_observability/instrumentation/api.md
+++ b/content/en/llm_observability/instrumentation/api.md
@@ -1,5 +1,5 @@
 ---
-title: API Reference
+title: HTTP API Reference
 aliases:
 - /tracing/llm_observability/api
 - /llm_observability/api
@@ -8,7 +8,7 @@ aliases:
 
 ## Overview
 
-The LLM Observability API provides an interface for developers to send LLM-related traces and spans to Datadog. If your application is written in Python or Node.js, you can use the [LLM Observability SDKs][1].
+The LLM Observability HTTP API provides an interface for developers to send LLM-related traces and spans to Datadog. If your application is written in Python or Node.js, you can use the [LLM Observability SDKs][1].
 
 The API accepts spans with timestamps no more than 24 hours old, allowing limited backfill of delayed data.
 
diff --git a/content/en/llm_observability/instrumentation/auto_instrumentation.md b/content/en/llm_observability/instrumentation/auto_instrumentation.md
index 0f9269eb87138..c392f0f068539 100644
--- a/content/en/llm_observability/instrumentation/auto_instrumentation.md
+++ b/content/en/llm_observability/instrumentation/auto_instrumentation.md
@@ -11,7 +11,6 @@ further_reading:
   text: 'Learn about the LLM Observability SDK for Python'
 ---
 
-Datadog offers a variety of artificial intelligence (AI) and machine learning (ML) capabilities. The AI/ML integrations on the Integrations page and the Datadog Marketplace are platform-wide Datadog functionalities. For example, APM offers a native integration with OpenAI for monitoring your OpenAI usage, while Infrastructure Monitoring offers an integration with NVIDIA DCGM Exporter for monitoring compute-intensive AI workloads. These integrations are different from the LLM Observability offering.
 
 {{< tabs >}}
 {{% tab "Python" %}}
diff --git a/content/en/llm_observability/instrumentation/custom_instrumentation.md b/content/en/llm_observability/instrumentation/custom_instrumentation.md
index 02141119c1bb0..1d2be62970844 100644
--- a/content/en/llm_observability/instrumentation/custom_instrumentation.md
+++ b/content/en/llm_observability/instrumentation/custom_instrumentation.md
@@ -25,8 +25,6 @@ This page explains how to use the Datadog LLM Observability SDK's custom instrum
 
 ## Instrument an LLM application
 
-These instructions use the LLM Observability SDK for Python. If your application is running in a serverless environment, follow the serverless setup instructions. If your application is not written in Python, you can complete the steps below with API requests instead of SDK function calls.
-
 To instrument an LLM application:
 
 1. [Install the LLM Observability SDK for Python][5].
diff --git a/content/en/llm_observability/instrumentation/sdk.md b/content/en/llm_observability/instrumentation/sdk.md
index b117b3af95209..f9dba12984061 100644
--- a/content/en/llm_observability/instrumentation/sdk.md
+++ b/content/en/llm_observability/instrumentation/sdk.md
@@ -2283,6 +2283,3 @@ tracer.use('http', false) // disable the http integration
 [11]: /tracing/trace_collection/compatibility/python/#integrations
 [12]: /tracing/trace_collection/compatibility/python/#library-compatibility
 [13]: /llm_observability/instrumentation/auto_instrumentation/
-[14]: /serverless/aws_lambda/installation/python/?tab=custom#installation
-[15]: /llm_observability/quickstart?tab=python#trace-an-llm-application-in-aws-lambda
-[16]: https://app.datadoghq.com/llm/settings/evaluations
diff --git a/content/en/llm_observability/quickstart.md b/content/en/llm_observability/quickstart.md
index 5e2e53a735ff0..da791471fda2e 100644
--- a/content/en/llm_observability/quickstart.md
+++ b/content/en/llm_observability/quickstart.md
@@ -3,15 +3,12 @@ title: Quickstart
 aliases:
 - /tracing/llm_observability/quickstart
 further_reading:
-- link: '/llm_observability'
-  tag: 'Documentation'
-  text: 'Learn about LLM Observability'
 - link: '/llm_observability/evaluations'
   tag: 'Evaluations'
   text: 'Configure Evaluations on your application'
 - link: '/llm_observability/instrumentation/custom_instrumentation'
   tag: 'Custom Instrumentation'
-  text: 'Custom instrumentation'
+  text: 'Instrument your application with custom spans'
 ---
 
 This page demonstrates using Datadog's LLM Observability SDK to instrument a Python or Node.js LLM application.
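
The quickstart and custom instrumentation pages touched above describe instrumenting a Python application with the LLM Observability SDK. As a companion to the diff, here is a minimal Python sketch of that flow, assuming the `ddtrace` library's `LLMObs.enable()`, `LLMObs.annotate()`, and the `workflow`/`task` decorators described in the SDK reference; the `ml_app` name and the traced functions are placeholders, not part of the documentation being edited.

```python
# Minimal sketch of SDK-based instrumentation. "my-llm-app" and the functions
# below are placeholders; Datadog credentials (for example DD_API_KEY and
# DD_SITE) are assumed to be available in the environment.
from ddtrace.llmobs import LLMObs
from ddtrace.llmobs.decorators import task, workflow

LLMObs.enable(ml_app="my-llm-app")


@task
def fetch_context(question: str) -> str:
    # Stand-in for a retrieval step; traced as a "task" span.
    return f"retrieved documents for: {question}"


@workflow
def answer_question(question: str) -> str:
    # Top-level "workflow" span wrapping the whole request.
    context = fetch_context(question)
    answer = f"answer built from {len(context)} characters of context"
    LLMObs.annotate(input_data=question, output_data=answer)
    return answer


if __name__ == "__main__":
    print(answer_question("How do I send spans to Datadog?"))
```

The decorators create the spans; `LLMObs.annotate()` only attaches input and output data to the currently active span, per the SDK reference.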
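
The rename from "API" to "HTTP API" in the menu and in api.md refers to the intake endpoint that applications not written in Python or Node.js can use instead of the SDKs ("API requests instead of SDK function calls" in the removed alert above). A rough sketch of such a request follows; the endpoint URL and the payload fields are assumptions for illustration only, so verify them against the HTTP API reference page.

```python
# Illustrative only: the intake URL and the payload shape are assumptions and
# should be checked against the LLM Observability HTTP API reference.
import os
import time
import uuid

import requests

now_ns = int(time.time() * 1e9)
span = {
    "name": "summarize_document",
    "span_id": uuid.uuid4().hex[:16],
    "trace_id": uuid.uuid4().hex[:16],
    "parent_id": "undefined",
    "start_ns": now_ns - int(2e9),  # started 2 seconds ago
    "duration": int(2e9),           # duration in nanoseconds
    "meta": {
        "kind": "llm",
        "input": {"value": "Summarize this document ..."},
        "output": {"value": "A short summary."},
    },
}

payload = {
    "data": {
        "type": "span",
        "attributes": {"ml_app": "my-llm-app", "spans": [span]},
    }
}

response = requests.post(
    "https://api.datadoghq.com/api/intake/llm-obs/v1/trace/spans",
    json=payload,
    headers={"DD-API-KEY": os.environ["DD_API_KEY"]},
)
response.raise_for_status()
```

As the api.md change above notes, spans with timestamps more than 24 hours old are not accepted, so `start_ns` has to stay within that window.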