add logs and metrics docs #30537

Merged · 4 commits · Jul 18, 2025
72 changes: 72 additions & 0 deletions content/en/ddsql_reference/_index.md
@@ -33,6 +33,7 @@
- [SQL functions](#functions)
- [Window functions](#window-functions)
- [JSON functions](#json-functions-and-operators)
- [Table functions](#table-functions)
- [Tags](#tags)


@@ -453,6 +454,77 @@
| json_extract_path_text(text json, text path…) | text | Extracts a JSON sub-object as text, defined by the path. Its behavior is equivalent to the [Postgres function with the same name][3]. For example, `json_extract_path_text(col, 'forest')` returns the value of the key `forest` for each JSON object in `col`. See the example below for JSON array syntax.|
| json_extract_path(text json, text path…) | JSON | Same functionality as `json_extract_path_text`, but returns a column of JSON type instead of text type.|

## Table functions

{{< callout url="https://www.datadoghq.com/product-preview/logs-metrics-support-in-ddsql-editor/" >}}
Querying Logs and Metrics through DDSQL is in Preview. Use this form to request access.
{{< /callout >}}

Table functions are used to query Logs and Metrics.

<table style="width: 100%; table-layout: fixed;">
<thead>
<tr>
<th style="width: 33%;">Function</th>
<th style="width: 33%;">Description</th>
<th style="width: 33%;">Example</th>
</tr>
</thead>
<tbody>
<tr>
<td>
<pre>
dd.logs(
filter => varchar,
columns => array < varchar >,
indexes ? => array < varchar >,
from_timestamp ? => timestamp,
to_timestamp ? => timestamp
) AS (column_name type [, ...])</pre>
</td>
<td>Returns log data as a table. The columns parameter specifies which log fields to extract, and the AS clause defines the schema of the returned table. Filtering by index or time range is optional; when no time range is specified, the query defaults to the past hour of data.</td>
<td>
{{< code-block lang="sql" >}}
SELECT timestamp, host, service, message
FROM dd.logs(
filter => 'source:java',
columns => ARRAY['timestamp','host', 'service','message']
) AS (
timestamp TIMESTAMP,
host VARCHAR,
service VARCHAR,
message VARCHAR
){{< /code-block >}}
</td>
</tr>
<tr>
<td>
<pre>
dd.metric_scalar(
query varchar,
reducer varchar [, from_timestamp timestamp, to_timestamp timestamp]
)</pre>
</td>
<td>Returns metric data as a scalar value. The function accepts a metrics query (with optional grouping), a reducer that determines how values are aggregated (such as avg or max), and optional timestamp parameters that define the time range (default: the past hour).</td>
<td>
{{< code-block lang="sql" >}}
SELECT *
FROM dd.metric_scalar(
'avg:system.cpu.user{*} by {service}',
'avg',
TIMESTAMP '2025-07-10 00:00:00.000-04:00',
TIMESTAMP '2025-07-17 00:00:00.000-04:00'
)
ORDER BY value DESC;{{< /code-block >}}
</td>
</tr>
</tbody>
</table>
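
Because table functions return regular tables, their output can be filtered, aggregated, and joined with standard SQL. The following is a minimal sketch rather than part of the reference above: it assumes the `dd.logs` signature documented in the table and uses an illustrative `status:error` filter to count matching logs per service over the default one-hour window.

{{< code-block lang="sql" >}}
-- Illustrative sketch: aggregate over the table returned by dd.logs.
-- The filter value and the per-service grouping are assumptions, not
-- taken from the reference above.
SELECT service, COUNT(*) AS error_count
FROM dd.logs(
    filter => 'status:error',
    columns => ARRAY['service']
) AS (
    service VARCHAR
)
GROUP BY service
ORDER BY error_count DESC;
{{< /code-block >}}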





## Tags

DDSQL exposes tags as an `hstore` type, which you can query using the PostgreSQL arrow operator. For example:
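
The following minimal sketch assumes a `host` table with a `tags` column and an `env` tag key; both names are illustrative rather than taken from this reference.

{{< code-block lang="sql" >}}
-- Illustrative sketch: read and filter on a single tag value with the
-- hstore arrow operator (the host table and the env key are assumptions).
SELECT hostname, tags->'env' AS env
FROM host
WHERE tags->'env' = 'production';
{{< /code-block >}}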