This guide uses the LLM Observability SDKs for [Python][1] and [Node.js][2]. If your application is written in another language, you can create traces by calling the [API][8] instead.
## Setup
### Jupyter notebooks
To better understand LLM Observability terms and concepts, you can explore the examples in the [LLM Observability Jupyter Notebooks repository][12]. These notebooks provide a hands-on experience and let you apply these concepts in real time.
## Trace an LLM application
To generate an LLM Observability trace, you can run a Python or Node.js script.
### Prerequisites
- LLM Observability requires a Datadog API key if you don't have a Datadog Agent running. Find your API key [in Datadog](https://app.datadoghq.com/organization-settings/api-keys).
- The following example script uses OpenAI, but you can modify it to use a different provider. To run the script as written, you need:
  - An OpenAI API key stored in your environment as `OPENAI_API_KEY`. To create one, see [Account Setup][4] and [Set up your API key][6] in the official OpenAI documentation.
  - The OpenAI Python library installed. See [Setting up Python][5] in the official OpenAI documentation for instructions.
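For example, the OpenAI key can be stored in the environment of the shell session that runs the script (the value below is a placeholder):

```shell
export OPENAI_API_KEY="<YOUR_OPENAI_API_KEY>"
```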
### Setup
{{< tabs >}}
{{% tab "Python" %}}
1. Install the SDK and OpenAI packages:

   ```shell
   pip install ddtrace
   pip install openai
   ```
2. Create a script that makes a single OpenAI call.
3. Run the script with `ddtrace-run` and the LLM Observability environment variables set (see the sketch after this list).

   **Note**: `DD_LLMOBS_AGENTLESS_ENABLED` is only required if you do not have the Datadog Agent running. If the Agent is running in your production environment, make sure this environment variable is unset.
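As an illustration of steps 2 and 3, here is a minimal sketch. It assumes the `openai` Python client; the file name, model, prompt, and `ml_app` name are illustrative placeholders rather than values from this guide.

```python
# quickstart.py: make a single OpenAI chat completion call
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

completion = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "Write a one-line hello world message."}],
)
print(completion.choices[0].message.content)
```

The script could then be run with LLM Observability enabled through environment variables, for example:

```shell
DD_LLMOBS_ENABLED=1 DD_LLMOBS_ML_APP=quickstart-app DD_LLMOBS_AGENTLESS_ENABLED=1 \
DD_API_KEY=<YOUR_DATADOG_API_KEY> DD_SITE=<YOUR_DD_SITE> \
ddtrace-run python quickstart.py
```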
### View traces
Run the script to make requests that trigger LLM calls, then view the traces on the **Traces** tab [of the **LLM Observability** page][3] in Datadog. If you don't see any traces, make sure you are using a supported library; otherwise, you may need to instrument your application's LLM calls manually.
{{< img src="llm_observability/quickstart_trace_1.png" alt="An LLM Observability trace displaying a single LLM request" style="width:100%;" >}}
The trace you see is composed of a single LLM span. The `ddtrace-run` or `NODE_OPTIONS="--import dd-trace/initialize.mjs"` command automatically traces your LLM calls from [Datadog's list of supported integrations][10].
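If the library making your LLM calls is not on that list, the calls can be traced manually with the SDK. The following is a minimal sketch, assuming the Python SDK's `llm` decorator and `LLMObs.annotate`; the model and provider names, and the stand-in client function, are illustrative, and LLM Observability still needs to be enabled (for example, by running under `ddtrace-run` with the environment variables shown earlier):

```python
from ddtrace.llmobs import LLMObs
from ddtrace.llmobs.decorators import llm

def call_custom_model(prompt: str) -> str:
    # Stand-in for an unsupported or in-house model client (illustrative)
    return f"echo: {prompt}"

@llm(model_name="my-custom-model", model_provider="my-provider")
def invoke_model(prompt: str) -> str:
    response = call_custom_model(prompt)
    # Attach the call's input and output to the generated LLM span
    LLMObs.annotate(input_data=prompt, output_data=response)
    return response

if __name__ == "__main__":
    print(invoke_model("Hello"))
```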
## Example "Hello World" application
If your application consists of more elaborate prompting or complex chains or workflows involving LLMs, you can trace it using the [Setup documentation][11] and the [SDK documentation][1].
See below for a simple application that can be used to begin exploring the LLM Observability product.
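As an illustration of such an application, here is a minimal sketch. It assumes Flask and the `openai` client are installed (`pip install flask openai`); the route, model, and prompt are illustrative:

```python
# app.py: a minimal "hello world" LLM application (sketch)
from flask import Flask
from openai import OpenAI

app = Flask(__name__)
client = OpenAI()  # reads OPENAI_API_KEY from the environment

@app.route("/hello")
def hello():
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": "Say hello world."}],
    )
    return completion.choices[0].message.content
```

Started under the tracer (for example, `ddtrace-run flask run`), each request to `/hello` produces an LLM span like the trace shown above.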
## Trace an LLM application in AWS Lambda
The following steps create an Amazon Bedrock based chatbot running in AWS Lambda and generate LLM Observability traces for it.
1. Create a [Lambda function chatbot using Amazon Bedrock][13].
2. Instrument your Lambda function (a sketch of the instrument command follows this list):
   1. Open AWS CloudShell.
   2. Install the Datadog CLI client:
      ```shell
      npm install -g @datadog/datadog-ci
      ```
   3. Set the Datadog API key and site:
      ```shell
      export DD_SITE=<YOUR_DD_SITE>
      export DD_API_KEY=<YOUR_DATADOG_API_KEY>
      ```
      If you already have or prefer to use a secret in Secrets Manager, you can instead set the API key by using the secret ARN.
3. Verify the instrumentation:
   1. In the Datadog UI, navigate to `Infrastructure > Serverless`.
   2. Search for the name of your function.
   3. Click on it to open the details panel.
   4. Under the `Configuration` tab are the details of the Lambda function, attached layers, and a list of `DD_` Datadog-related environment variables under the `Datadog Environment Variables` section.
4. Invoke your Lambda function and verify that LLM Observability traces are visible in the Datadog UI.
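The instrumentation itself is typically applied with the Datadog CLI installed in step 2. A minimal sketch, assuming the `datadog-ci lambda instrument` command with placeholder values for the function name, region, and layer versions:

```shell
# Placeholders: substitute your function name, AWS region, and current Datadog layer versions
datadog-ci lambda instrument \
  -f <YOUR_LAMBDA_FUNCTION_NAME> \
  -r <AWS_REGION> \
  -v <PYTHON_LAYER_VERSION> \
  -e <EXTENSION_LAYER_VERSION>
```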
### Force flushing traces
If you are in a serverless environment other than AWS Lambda, or if you have trouble seeing traces from AWS Lambda, use the `flush` method to ensure traces are flushed before the process exits.
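A minimal sketch of a handler that flushes before returning, assuming the Python SDK's `LLMObs.flush()` (the handler body is illustrative):

```python
from ddtrace.llmobs import LLMObs

def handler(event, context):
    # Illustrative placeholder for your application's LLM calls
    result = {"message": "hello from the chatbot"}
    # Send any buffered LLM Observability spans before the execution environment is frozen or exits
    LLMObs.flush()
    return result
```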