Commit 94b7ed5

Kyle-Verhoog and cswatt authored
[llmobs] Make quickstart actually quick (#30472)
The quickstart guide was not quick at all. Co-authored-by: cecilia saixue watt <[email protected]>
1 parent 303dc02 commit 94b7ed5

File tree: 1 file changed, +69 -128 lines

content/en/llm_observability/quickstart.md

Lines changed: 69 additions & 128 deletions
@@ -8,193 +8,134 @@ further_reading:
   text: 'Learn about LLM Observability'
 ---
 
-## Overview
-
-This guide uses the LLM Observability SDKs for [Python][1] and [Node.js][2]. If your application is written in another language, you can create traces by calling the [API][8] instead.
-
-## Setup
-
-### Jupyter notebooks
-
-To better understand LLM Observability terms and concepts, you can explore the examples in the [LLM Observability Jupyter Notebooks repository][12]. These notebooks provide a hands-on experience, and allow you to apply these concepts in real time.
-
-## Trace an LLM application
-
-To generate an LLM Observability trace, you can run a Python or Node.js script.
 
 ### Prerequisites
 
-- LLM Observability requires a Datadog API key. For more information, see [the instructions for creating an API key][7].
-- The following example script uses OpenAI, but you can modify it to use a different provider. To run the script as written, you need:
-  - An OpenAI API key stored in your environment as `OPENAI_API_KEY`. To create one, see [Account Setup][4] and [Set up your API key][6] in the official OpenAI documentation.
-  - The OpenAI Python library installed. See [Setting up Python][5] in the official OpenAI documentation for instructions.
+LLM Observability requires a Datadog API key if you don't have a Datadog Agent running. Find your API key [in Datadog](https://app.datadoghq.com/organization-settings/api-keys).
+
+### Setup
 
 {{< tabs >}}
 {{% tab "Python" %}}
 
-1. Install the SDK and OpenAI packages:
+1. Install the SDK:
 
    ```shell
    pip install ddtrace
-   pip install openai
    ```
 
-2. Create a script, which makes a single OpenAI call.
-
-   ```python
-   import os
-   from openai import OpenAI
-
-   oai_client = OpenAI(api_key=os.environ.get("OPENAI_API_KEY"))
-
-   completion = oai_client.chat.completions.create(
-       model="gpt-3.5-turbo",
-       messages=[
-           {"role": "system", "content": "You are a helpful customer assistant for a furniture store."},
-           {"role": "user", "content": "I'd like to buy a chair for my living room."},
-       ],
-   )
-   ```
-
-3. Run the script with the following shell command. This sends a trace of the OpenAI call to Datadog.
+2. Prefix your Python start command with `ddtrace-run`:
 
    ```shell
-   DD_LLMOBS_ENABLED=1 DD_LLMOBS_ML_APP=onboarding-quickstart \
-   DD_API_KEY=<YOUR_DATADOG_API_KEY> DD_SITE=<YOUR_DD_SITE> \
-   DD_LLMOBS_AGENTLESS_ENABLED=1 ddtrace-run python quickstart.py
+   DD_LLMOBS_ENABLED=1 \
+   DD_LLMOBS_ML_APP=quickstart-app \
+   DD_API_KEY=<YOUR_DATADOG_API_KEY> \
+   ddtrace-run <your application command>
    ```
 
-   Replace <YOUR_DATADOG_API_KEY> with your Datadog API key, and replace <YOUR_DD_SITE> with your [Datadog site][2].
+   Replace <YOUR_DATADOG_API_KEY> with your Datadog API key.
 
-   For more information about required environment variables, see [the SDK documentation][1].
 
 [1]: /llm_observability/setup/sdk/python/#command-line-setup
 [2]: /getting_started/site/
 {{% /tab %}}
 
 {{% tab "Node.js" %}}
-1. Install the SDK and OpenAI packages:
+1. Install the SDK:
 
    ```shell
    npm install dd-trace
-   npm install openai
    ```
-2. Create a script, which makes a single OpenAI call.
-
-   ```javascript
-   const { OpenAI } = require('openai');
 
-   const oaiClient = new OpenAI(process.env.OPENAI_API_KEY);
-
-   function main () {
-       const completion = await oaiClient.chat.completions.create({
-           model: 'gpt-3.5-turbo',
-           messages: [
-               { role: 'system', content: 'You are a helpful customer assistant for a furniture store.' },
-               { role: 'user', content: 'I\'d like to buy a chair for my living room.' },
-           ]
-       });
-   }
-
-   main();
-   ```
-
-3. Run the script with the following shell command. This sends a trace of the OpenAI call to Datadog.
+2. Add `NODE_OPTIONS` to your Node.js start command:
    ```shell
-   DD_LLMOBS_ENABLED=1 DD_LLMOBS_ML_APP=onboarding-quickstart \
-   DD_API_KEY=<YOUR_DATADOG_API_KEY> DD_SITE=<YOUR_DD_SITE> \
-   DD_LLMOBS_AGENTLESS_ENABLED=1 NODE_OPTIONS="--import dd-trace/initialize.mjs" node quickstart.js
+   DD_LLMOBS_ENABLED=1 \
+   DD_LLMOBS_ML_APP=quickstart-app \
+   DD_API_KEY=<YOUR_DATADOG_API_KEY> \
+   NODE_OPTIONS="--import dd-trace/initialize.mjs" <your application command>
    ```
 
-   Replace <YOUR_DATADOG_API_KEY> with your Datadog API key, and replace <YOUR_DD_SITE> with your [Datadog site][2].
-
-   For more information about required environment variables, see [the SDK documentation][1].
+   Replace <YOUR_DATADOG_API_KEY> with your Datadog API key.
 
 [1]: /llm_observability/setup/sdk/nodejs/#command-line-setup
 [2]: /getting_started/site/
 
 {{% /tab %}}
 {{< /tabs >}}
 
-**Note**: `DD_LLMOBS_AGENTLESS_ENABLED` is only required if you do not have the Datadog Agent running. If the Agent is running in your production environment, make sure this environment variable is unset.
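
For reference, the Python SDK can also enable LLM Observability in code rather than through environment variables. A minimal sketch, assuming the `ddtrace.llmobs.LLMObs.enable` helper and these parameter names (check the Python SDK documentation for the exact signature):

```python
# Illustrative in-code alternative to the environment variables shown above.
from ddtrace.llmobs import LLMObs

LLMObs.enable(
    ml_app="quickstart-app",           # same value as DD_LLMOBS_ML_APP
    api_key="<YOUR_DATADOG_API_KEY>",  # same value as DD_API_KEY
    agentless_enabled=True,            # only needed when no Datadog Agent is running
)
```
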
+### View traces
 
-4. View the trace of your LLM call on the **Traces** tab [of the **LLM Observability** page][3] in Datadog.
+Make requests to your application that trigger LLM calls, then view the traces on the **Traces** tab [of the **LLM Observability** page][3] in Datadog. If you don't see any traces, make sure you are using a supported library. Otherwise, you may need to instrument your application's LLM calls manually.
 
-{{< img src="llm_observability/quickstart_trace_1.png" alt="An LLM Observability trace displaying a single LLM request" style="width:100%;" >}}
 
-The trace you see is composed of a single LLM span. The `ddtrace-run` or `NODE_OPTIONS="--import dd-trace/initialize.mjs"` command automatically traces your LLM calls from [Datadog's list of supported integrations][10].
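
If your application calls an LLM through a library that is not in the list of supported integrations, the manual instrumentation mentioned above might look roughly like this Python sketch, which assumes the SDK's `llm` decorator and `LLMObs.annotate` helper (names are illustrative; check the SDK documentation):

```python
from ddtrace.llmobs import LLMObs
from ddtrace.llmobs.decorators import llm

@llm(model_name="my-model", model_provider="my-provider")
def ask(question: str) -> str:
    # Call any LLM client here; this example fakes a completion.
    answer = "Sure, we have several chairs in stock."
    # Attach the prompt and completion to the span so they appear in the trace.
    LLMObs.annotate(input_data=question, output_data=answer)
    return answer

ask("I'd like to buy a chair for my living room.")
```
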
+## Example "Hello World" application
 
-If your application consists of more elaborate prompting or complex chains or workflows involving LLMs, you can trace it using the [Setup documentation][11] and the [SDK documentation][1].
+See below for a simple application that can be used to begin exploring the LLM Observability product.
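
For more elaborate prompting, chains, or workflows, as mentioned above, the SDK's span decorators can group several steps under a single trace. A minimal sketch, assuming the `workflow` and `task` decorators from `ddtrace.llmobs.decorators` (see the SDK documentation for the exact API):

```python
from ddtrace.llmobs.decorators import task, workflow

@task(name="retrieve_context")
def retrieve_context(question: str) -> str:
    # Placeholder retrieval step; it appears as a child span of the workflow.
    return "We sell armchairs, dining chairs, and office chairs."

@workflow(name="answer_question")
def answer_question(question: str) -> str:
    # Any auto-instrumented LLM calls made here (for example, through the
    # OpenAI integration) are nested under this workflow span as well.
    context = retrieve_context(question)
    return f"Based on our catalog: {context}"

answer_question("I'd like to buy a chair for my living room.")
```
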
 
-## Trace an LLM application in AWS Lambda
-The following steps generate an LLM Observability trace in an AWS Lambda environment and create an Amazon Bedrock based chatbot running with LLM Observability in AWS Lambda.
 
-1. Create a [Lambda function chatbot using Amazon Bedrock][13].
-2. Instrument your Lambda function:
-   1. Open a Cloudshell
-   2. Install the Datadog CLI client
-      ```shell
-      npm install -g @datadog/datadog-ci
-      ```
-   3. Set the Datadog API key and site
-      ```shell
-      export DD_SITE=<YOUR_DD_SITE>
-      export DD_API_KEY=<YOUR_DATADOG_API_KEY>
-      ```
-      If you already have or prefer to use a secret in Secrets Manager, you can set the API key by using the secret ARN:
-      ```shell
-      export DATADOG_API_KEY_SECRET_ARN=<DATADOG_API_KEY_SECRET_ARN>
-      ```
-   4. Instrument your Lambda function with LLM Observability (this requires at least version 77 of the Datadog Extension layer).
 {{< tabs >}}
 {{% tab "Python" %}}
-   ```shell
-   datadog-ci lambda instrument -f <YOUR_LAMBDA_FUNCTION_NAME> -r <AWS_REGION> -v {{< latest-lambda-layer-version layer="python" >}} -e {{< latest-lambda-layer-version layer="extension" >}} --llmobs <YOUR_LLMOBS_ML_APP>
-   ```
-{{% /tab %}}
-{{% tab "Node.js" %}}
-   ```shell
-   datadog-ci lambda instrument -f <YOUR_LAMBDA_FUNCTION_NAME> -r <AWS_REGION> -v {{< latest-lambda-layer-version layer="node" >}} -e {{< latest-lambda-layer-version layer="extension" >}} --llmobs <YOUR_LLMOBS_ML_APP>
-   ```
-{{% /tab %}}
-{{< /tabs >}}
-3. Verify that your function was instrumented.
-   1. In the Datadog UI, navigate to `Infrastructure > Serverless`
-   2. Search for the name of your function.
-   3. Click on it to open the details panel.
-   4. Under the `Configuration` tab are the details of the Lambda function, attached layers, and a list of `DD_` Datadog-related environment variables under the `Datadog Environment Variables` section.
-4. Invoke your Lambda function and verify that LLM Observability traces are visible in the Datadog UI.
 
-### Force flushing traces
+1. Install OpenAI with `pip install openai`.
 
-For either serverless environments other than AWS Lambda or issues seeing traces from AWS Lambdas, use the `flush` method to ensure traces are flushed before the process exits.
+2. Save the example script as `app.py`.
 
-{{< tabs >}}
-{{% tab "Python" %}}
+   ```python
+   import os
+   from openai import OpenAI
+
+   oai_client = OpenAI(api_key=os.environ.get("OPENAI_API_KEY"))
+   completion = oai_client.chat.completions.create(
+       model="gpt-4o-mini",
+       messages=[
+           {"role": "system", "content": "You are a helpful customer assistant for a furniture store."},
+           {"role": "user", "content": "I'd like to buy a chair for my living room."},
+       ],
+   )
+   ```
 
-   ```python
-   from ddtrace.llmobs import LLMObs
-   def handler():
-       # function body
-       LLMObs.flush()
-   ```
+3. Run the application:
 
+   ```shell
+   # Make sure you have the required environment variables listed above
+   DD_...= \
+   ddtrace-run python app.py
+   ```
 {{% /tab %}}
+
 {{% tab "Node.js" %}}
+1. Install OpenAI with `npm install openai`.
 
-   ```javascript
-   import tracer from 'dd-trace';
-   const llmobs = tracer.llmobs;
+2. Save the example script as `app.js`.
 
-   export const handler = async (event) => {
-     // your function body
-     llmobs.flush();
-   };
-   ```
+   ```js
+   const { OpenAI } = require('openai');
+   const oaiClient = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
 
+   async function main () {
+       const completion = await oaiClient.chat.completions.create({
+           model: 'gpt-4o-mini',
+           messages: [
+               { role: 'system', content: 'You are a helpful customer assistant for a furniture store.' },
+               { role: 'user', content: 'I\'d like to buy a chair for my living room.' },
+           ]
+       });
+       return completion;
+   }
+
+   main().then(console.log)
+   ```
+
+3. Run the application:
+
+   ```shell
+   # Make sure you have the required environment variables listed above
+   DD_...= \
+   NODE_OPTIONS="--import dd-trace/initialize.mjs" node app.js
+   ```
 {{% /tab %}}
 {{< /tabs >}}
 
+
 ## Further Reading
 
 {{< partial name="whats-next/whats-next.html" >}}
