Commit 0dcdd04

Merge branch 'main' into tp/clickpipes-advanced-settings
2 parents 589ca01 + 8e377ee commit 0dcdd04

File tree

19 files changed: +145 −11 lines changed


docs/best-practices/json_type.md

Lines changed: 1 addition & 1 deletion
@@ -224,7 +224,7 @@ ORDER BY doc.update_date
 We provide a type hint for the `update_date` column in the JSON definition, as we use it in the ordering/primary key. This helps ClickHouse to know that this column won't be null and ensures it knows which `update_date` sub-column to use (there may be multiple for each type, so this is ambiguous otherwise).
 :::

-We can insert into this table and view the subsequently inferred schema using the [`JSONAllPathsWithTypes`](/sql-reference/functions/json-functions#jsonallpathswithtypes) function and [`PrettyJSONEachRow`](/interfaces/formats/PrettyJSONEachRow) output format:
+We can insert into this table and view the subsequently inferred schema using the [`JSONAllPathsWithTypes`](/sql-reference/functions/json-functions#JSONAllPathsWithTypes) function and [`PrettyJSONEachRow`](/interfaces/formats/PrettyJSONEachRow) output format:

 ```sql
 INSERT INTO arxiv FORMAT JSONAsObject
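As a side note to the hunk above, a minimal sketch of the schema-inspection query the changed sentence describes — the `arxiv` table and its `doc` JSON column come from the surrounding docs page, so treat the names as assumptions:

```sql
-- Inspect the JSON paths and their inferred types after the insert
SELECT JSONAllPathsWithTypes(doc)
FROM arxiv
LIMIT 1
FORMAT PrettyJSONEachRow;
```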

docs/cloud/onboard/02_migrate/01_migration_guides/04_snowflake/02_migration_guide.md

Lines changed: 1 addition & 1 deletion
@@ -101,7 +101,7 @@ input_format_parquet_case_insensitive_column_matching = 1 -- Column matching bet
 :::note Note on nested column structures
 The `VARIANT` and `OBJECT` columns in the original Snowflake table schema will be output as JSON strings by default, forcing us to cast these when inserting them into ClickHouse.

-Nested structures such as `some_file` are converted to JSON strings on copy by Snowflake. Importing this data requires us to transform these structures to Tuples at insert time in ClickHouse, using the [JSONExtract function](/sql-reference/functions/json-functions#jsonextract) as shown above.
+Nested structures such as `some_file` are converted to JSON strings on copy by Snowflake. Importing this data requires us to transform these structures to Tuples at insert time in ClickHouse, using the [JSONExtract function](/sql-reference/functions/json-functions#JSONExtract) as shown above.
 :::

 ## Test successful data export {#3-testing-successful-data-export}
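For orientation (not part of the diff), the `JSONExtract`-to-Tuple pattern that note refers to can be sketched self-containedly — the field names below are invented for illustration, not taken from the actual Snowflake schema:

```sql
-- Cast a JSON string to a typed Tuple at insert/select time
SELECT JSONExtract('{"name": "image.png", "size": 1024}',
                   'Tuple(name String, size UInt64)') AS some_file;
```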

docs/getting-started/index.md

Lines changed: 1 addition & 0 deletions
@@ -42,6 +42,7 @@ by https://github.com/ClickHouse/clickhouse-docs/blob/main/scripts/autogenerate-
 | [Foursquare places](/getting-started/example-datasets/foursquare-places) | Dataset with over 100 million records containing information about places on a map, such as shops, restaurants, parks, playgrounds, and monuments. |
 | [GitHub Events Dataset](/getting-started/example-datasets/github-events) | Dataset containing all events on GitHub from 2011 to Dec 6 2020, with a size of 3.1 billion records. |
 | [Hacker News dataset](/getting-started/example-datasets/hacker-news) | Dataset containing 28 million rows of hacker news data. |
+| [Hacker News Vector Search dataset](/getting-started/example-datasets/hackernews-vector-search-dataset) | Dataset containing 28+ million Hacker News postings & their vector embeddings |
 | [LAION 5B dataset](/getting-started/example-datasets/laion-5b-dataset) | Dataset containing 100 million vectors from the LAION 5B dataset |
 | [Laion-400M dataset](/getting-started/example-datasets/laion-400m-dataset) | Dataset containing 400 million images with English image captions |
 | [New York Public Library "What's on the Menu?" Dataset](/getting-started/example-datasets/menus) | Dataset containing 1.3 million records of historical data on the menus of hotels, restaurants and cafes with the dishes along with their prices. |

docs/integrations/data-ingestion/clickpipes/postgres/faq.md

Lines changed: 11 additions & 1 deletion
@@ -6,6 +6,9 @@ sidebar_position: 2
 title: 'ClickPipes for Postgres FAQ'
 ---

+import failover_slot from '@site/static/images/integrations/data-ingestion/clickpipes/postgres/failover_slot.png'
+import Image from '@theme/IdealImage';
+
 # ClickPipes for Postgres FAQ

 ### How does idling affect my Postgres CDC ClickPipe? {#how-does-idling-affect-my-postgres-cdc-clickpipe}
@@ -32,7 +35,7 @@ To set the replica identity to FULL, you can use the following SQL command:
 ```sql
 ALTER TABLE your_table_name REPLICA IDENTITY FULL;
 ```
-REPLICA IDENTITY FULL also enabled replication of unchanged TOAST columns. More on that [here](./toast).
+REPLICA IDENTITY FULL also enables replication of unchanged TOAST columns. More on that [here](./toast).

 Note that using `REPLICA IDENTITY FULL` can have performance implications and can also cause faster WAL growth, especially for tables without a primary key and with frequent updates or deletes, as it requires more data to be logged for each change. If you have any doubts or need assistance with setting up primary keys or replica identities for your tables, please reach out to our support team for guidance.
@@ -345,3 +348,10 @@ If your initial load has completed without error but your destination ClickHouse
 Also worth checking:
 - If the user has sufficient permissions to read the source tables.
 - If there are any row policies on ClickHouse side which might be filtering out rows.
+
+### Can I have the ClickPipe create a replication slot with failover enabled? {#failover-slot}
+Yes, for a Postgres ClickPipe with replication mode CDC or Snapshot + CDC, you can have ClickPipes create a replication slot with failover enabled by toggling the switch below in the `Advanced Settings` section while creating the ClickPipe. Note that your Postgres version must be 17 or above to use this feature.
+
+<Image img={failover_slot} border size="md"/>
+
+If the source is configured accordingly, the slot is preserved after failovers to a Postgres read replica, ensuring continuous data replication. Learn more [here](https://www.postgresql.org/docs/current/logical-replication-failover.html).
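As a hedged aside to that FAQ entry (not part of the commit): on Postgres 17+, the standard `pg_replication_slots` catalog exposes a `failover` flag, so you could sanity-check the slot created for the pipe with a query along these lines:

```sql
-- Postgres 17+: list logical slots and whether they are failover-enabled
SELECT slot_name, failover
FROM pg_replication_slots
WHERE slot_type = 'logical';
```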

docs/integrations/data-ingestion/clickpipes/postgres/schema-changes.md

Lines changed: 4 additions & 2 deletions
@@ -10,6 +10,8 @@ ClickPipes for Postgres can detect schema changes in the source tables and, in s

 | Schema Change Type | Behaviour |
 | ----------------------------------------------------------------------------------- | ------------------------------------- |
-| Adding a new column (`ALTER TABLE ADD COLUMN ...`) | Propagated automatically. The new column(s) will be populated for all rows replicated after the schema change |
-| Adding a new column with a default value (`ALTER TABLE ADD COLUMN ... DEFAULT ...`) | Propagated automatically. The new column(s) will be populated for all rows replicated after the schema change, but existing rows will not show the default value without a full table refresh |
+| Adding a new column (`ALTER TABLE ADD COLUMN ...`) | Propagated automatically once the table gets an insert/update/delete. The new column(s) will be populated for all rows replicated after the schema change |
+| Adding a new column with a default value (`ALTER TABLE ADD COLUMN ... DEFAULT ...`) | Propagated automatically once the table gets an insert/update/delete. The new column(s) will be populated for all rows replicated after the schema change, but existing rows will not show the default value without a full table refresh |
 | Dropping an existing column (`ALTER TABLE DROP COLUMN ...`) | Detected, but **not** propagated. The dropped column(s) will be populated with `NULL` for all rows replicated after the schema change |
+
+Note that column addition will be propagated at the end of a batch's sync, which could occur after the sync interval or pull batch size is reached. More information on controlling syncs is available [here](./controlling_sync.md).
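For context, a sketch of the source-side change the first two table rows describe — the table and column names below are purely illustrative:

```sql
-- On the source Postgres database. Per the table above, the new column
-- propagates to ClickHouse only after the next insert/update/delete is synced.
ALTER TABLE public.events ADD COLUMN region text DEFAULT 'unknown';
```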

docs/integrations/data-ingestion/data-formats/json/other.md

Lines changed: 1 addition & 1 deletion
@@ -70,7 +70,7 @@ SELECT JSONExtractString(tags, 'holidays') AS holidays FROM people
 1 row in set. Elapsed: 0.002 sec.
 ```

-Notice how the functions require both a reference to the `String` column `tags` and a path in the JSON to extract. Nested paths require functions to be nested e.g. `JSONExtractUInt(JSONExtractString(tags, 'car'), 'year')` which extracts the column `tags.car.year`. The extraction of nested paths can be simplified through the functions [`JSON_QUERY`](/sql-reference/functions/json-functions#json_query) and [`JSON_VALUE`](/sql-reference/functions/json-functions#json_value).
+Notice how the functions require both a reference to the `String` column `tags` and a path in the JSON to extract. Nested paths require functions to be nested e.g. `JSONExtractUInt(JSONExtractString(tags, 'car'), 'year')` which extracts the column `tags.car.year`. The extraction of nested paths can be simplified through the functions [`JSON_QUERY`](/sql-reference/functions/json-functions#JSON_QUERY) and [`JSON_VALUE`](/sql-reference/functions/json-functions#json_value).

 Consider the extreme case with the `arxiv` dataset where we consider the entire body to be a `String`.
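As a small aside to the hunk above, `JSON_VALUE` collapses the nested-function chain into one call; the literal below is invented for illustration:

```sql
-- Extract tags.car.year in a single call instead of nesting JSONExtract* functions
SELECT JSON_VALUE('{"car": {"year": 2021}}', '$.car.year') AS year;
```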

docs/integrations/data-visualization/tableau/tableau-and-clickhouse.md

Lines changed: 5 additions & 5 deletions
@@ -30,7 +30,7 @@ the [Tableau Exchange](https://exchange.tableau.com/products/1064).
 The connector is based on ClickHouse's advanced [JDBC driver](/integrations/language-clients/java/jdbc).

 With this connector, Tableau integrates ClickHouse databases and tables as data sources. To enable this functionality,
-follow the setup guide bellow.
+follow the setup guide below.

 <TOCInline toc={toc}/>

@@ -45,10 +45,10 @@ follow the setup guide bellow.
 of <a href="https://github.com/ClickHouse/clickhouse-java/releases/" target="_blank">ClickHouse JDBC driver</a>.

 :::note
-Make sure you download the **clickhouse-jdbc-x.x.x-shaded-all.jar** JAR file. Currently, we recommended using versions `0.8.X`.
+Make sure you download the [clickhouse-jdbc-X.X.X-all-dependencies.jar](https://github.com/ClickHouse/clickhouse-java/releases) JAR file. This artifact is available from version `0.9.2`.
 :::

-4. Store the JDBC driver in the following folder (based on your OS, if the folder doesn't exist you can create it):
+4. Store the JDBC driver in the following folder (based on your OS, if the folder doesn't exist, you can create it):
 - macOS: `~/Library/Tableau/Drivers`
 - Windows: `C:\Program Files\Tableau\Drivers`
 5. Configure a ClickHouse data source in Tableau and start building data visualizations!

@@ -76,7 +76,7 @@ To solve that, consider upgrading your Tableau Desktop application, or [install
 <br/>

 4. Click **Install and Restart Tableau**. Restart the application.
-5. After restarting, the connector will have its full name: `ClickHouse JDBC by ClickHouse, Inc.`. When clicking it the following dialog will pop up:
+5. After restarting, the connector will have its full name: `ClickHouse JDBC by ClickHouse, Inc.`. When clicking it, the following dialog will pop up:

 <Image size="md" img={tableau_connector_dialog} alt="ClickHouse connection dialog in Tableau showing fields for server, port, database, username and password" border />
 <br/>

@@ -116,7 +116,7 @@ You are now ready to build some visualizations in Tableau!

 ## Building Visualizations in Tableau {#building-visualizations-in-tableau}

-Now that have a ClickHouse data source configured in Tableau, let's visualize the data...
+Now that we have a ClickHouse data source configured in Tableau, let's visualize the data...

 1. Drag the **CUSTOMER** table onto the workbook. Notice the columns appear, but the data table is empty:

Lines changed: 110 additions & 0 deletions
@@ -0,0 +1,110 @@
---
slug: /use-cases/AI_ML/AIChat
sidebar_label: 'AI Chat'
title: 'Using AI Chat in ClickHouse Cloud'
pagination_prev: null
pagination_next: null
description: 'Guide to enabling and using the AI Chat feature in ClickHouse Cloud Console'
keywords: ['AI', 'ClickHouse Cloud', 'Chat', 'SQL Console', 'Agent', 'Docs AI']
show_related_blogs: true
sidebar_position: 2
---

import Link from '@docusaurus/Link';
import Image from '@theme/IdealImage';
import img_open from '@site/static/images/use-cases/AI_ML/AIChat/1_open_chat.png';
import img_consent from '@site/static/images/use-cases/AI_ML/AIChat/2_consent.png';
import img_modes from '@site/static/images/use-cases/AI_ML/AIChat/3_modes.png';
import img_thinking from '@site/static/images/use-cases/AI_ML/AIChat/4_thinking.png';
import img_history from '@site/static/images/use-cases/AI_ML/AIChat/5_history.png';
import img_result_actions from '@site/static/images/use-cases/AI_ML/AIChat/6_result_actions.png';
import img_new_tab from '@site/static/images/use-cases/AI_ML/AIChat/7_open_in_editor.png';

# Using AI Chat in ClickHouse Cloud

> This guide explains how to enable and use the AI Chat feature in the ClickHouse Cloud Console.

<VerticalStepper headerLevel="h2">

## Prerequisites {#prerequisites}

1. You must have access to a ClickHouse Cloud organization with AI features enabled (contact your org admin or support if unavailable).

## Open the AI Chat panel {#open-panel}

1. Navigate to a ClickHouse Cloud service.
2. In the left sidebar, click the sparkle icon labeled “Ask AI”.
3. (Shortcut) Press <kbd>⌘</kbd> + <kbd>'</kbd> (macOS) or <kbd>Ctrl</kbd> + <kbd>'</kbd> (Linux/Windows) to toggle it open.

<Image img={img_open} alt="Open AI Chat flyout" size="md"/>

## Accept the data usage consent (first run) {#consent}

1. On first use, you are prompted with a consent dialog describing data handling and third‑party LLM sub-processors.
2. Review and accept to proceed. If you decline, the panel will not open.

<Image img={img_consent} alt="Consent dialog" size="md"/>

## Choose a chat mode {#modes}

AI Chat currently supports:

- **Agent**: Multi‑step reasoning over schema + metadata (service must be awake).
- **Docs AI (Ask)**: Focused Q&A grounded in official ClickHouse documentation and best‑practice references.

Use the mode selector at the bottom-left of the flyout to switch.

<Image img={img_modes} alt="Mode selection" size="sm"/>

## Compose and send a message {#compose}

1. Type your question (e.g. “Create a materialized view to aggregate daily events by user”).
2. Press <kbd>Enter</kbd> to send (use <kbd>Shift</kbd> + <kbd>Enter</kbd> for a newline).
3. While the model is processing, you can click “Stop” to interrupt.

## Understanding “Agent” thinking steps {#thinking-steps}

In Agent mode you may see expandable intermediate “thinking” or planning steps. These provide transparency into how the assistant forms its answer. Collapse or expand them as needed.

<Image img={img_thinking} alt="Thinking steps" size="md"/>

## Starting new chats {#new-chats}

Click the “New Chat” button to clear context and begin a fresh session.

## Viewing chat history {#history}

1. The lower section lists your recent chats.
2. Select a previous chat to load its messages.
3. Delete a conversation using the trash icon.

<Image img={img_history} alt="Chat history list" size="md"/>

## Working with generated SQL {#sql-actions}

When the assistant returns SQL:

- Review it for correctness.
- Click “Open in editor” to load the query into a new SQL tab.
- Modify and execute it within the Console.

<Image img={img_result_actions} alt="Result actions" size="md"/>

<Image img={img_new_tab} alt="Open generated query in editor" size="md"/>

## Stopping or interrupting a response {#interrupt}

If a response is taking too long or diverging:

1. Click the “Stop” button (visible while processing).
2. The message is marked as interrupted; you can refine your prompt and resend.

## Keyboard shortcuts {#shortcuts}

| Action | Shortcut |
| ------ | -------- |
| Open AI Chat | `⌘ + '` / `Ctrl + '` |
| Send message | `Enter` |
| New line | `Shift + Enter` |

</VerticalStepper>

scripts/aspell-dict-file.txt

Lines changed: 1 addition & 0 deletions
@@ -271,6 +271,7 @@ autovacuum
 VACUUM
 resync
 Resync
+failovers
 --docs/integrations/data-ingestion/clickpipes/mysql/faq.md--
 PlanetScale
 Planetscale

scripts/aspell-ignore/en/aspell-dict.txt

Lines changed: 1 addition & 0 deletions
@@ -3034,6 +3034,7 @@ resultset
 resync
 resynchronization
 resyncing
+failovers
 retentions
 rethrow
 retransmit
