
Conversation

**@nataliefiann** (Contributor) commented Apr 7, 2025

What are you changing in this pull request and why?

I created this PR following this Slack thread (https://app.slack.com/client/T0Z0T0223/C02NCQ9483C), in which Benoit recommended adding a table to the top of the Snowflake configs page. I used that idea to update the config pages for several popular adapters, adding a table with a short description of each config.

I've added a TL;DR table to the top of the BigQuery, Databricks, Postgres, Redshift, and Snowflake config pages.

Checklist


🚀 Deployment available! Here are the direct links to the updated files:

@nataliefiann nataliefiann requested a review from a team as a code owner April 7, 2025 13:03

vercel bot commented Apr 7, 2025

The latest updates on your projects:

| Name | Status | Updated (UTC) |
|------|--------|---------------|
| docs-getdbt-com | ✅ Ready | Jul 9, 2025 1:35pm |

@github-actions github-actions bot added content Improvements or additions to content size: medium This change will take up to a week to address labels Apr 7, 2025
**@nataliefiann** (Contributor, Author):

May need to add setting row policies to Snowflake config page: https://github.com/dbt-labs/docs.getdbt.com/pull/7162/files

**@nataliefiann** (Contributor, Author) left a review comment:

May need to add setting row policies to Snowflake config page: https://github.com/dbt-labs/docs.getdbt.com/pull/7162/files

|[Transient tables](/reference/resource-configs/snowflake-configs#transient-tables)|Transient tables allow time travel for 1 day, with no fail-safe period. By default, dbt creates all Snowflake tables as transient.|
|[Query tags](/reference/resource-configs/snowflake-configs#query-tags)|A Snowflake parameter that can be quite useful when searching in the `QUERY_HISTORY` view.|
|[Merge behavior (incremental models)](/reference/resource-configs/snowflake-configs#merge-behavior-incremental-models)|The `incremental_strategy` config determines how dbt builds incremental models. By default, dbt uses a merge statement on Snowflake to refresh these tables. The Snowflake adapter supports the following incremental materialization strategies — `append`, `delete+insert`, `insert_overwrite`, `merge` and [`microbatch`](/docs/build/incremental-microbatch).|
|[`cluster_by`](/reference/resource-configs/snowflake-configs#configuring-table-clustering)|Use the `cluster_by` config to control clustering for a table or incremental model. It orders the table by the specified fields and adds the clustering keys to the target table.|
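The configs in the rows above can be combined in a single model's config block. A minimal sketch (the model name, column, and query tag value are hypothetical, not from the docs under review):

```sql
-- models/stg_events.sql -- hypothetical model illustrating the Snowflake configs above
{{
    config(
        materialized='incremental',
        incremental_strategy='merge',   -- default merge behavior on Snowflake
        transient=true,                 -- dbt creates Snowflake tables as transient by default
        query_tag='dbt_stg_events',     -- surfaces in Snowflake's QUERY_HISTORY view
        cluster_by=['event_date']       -- orders the table and sets the clustering key
    )
}}

select * from {{ source('app', 'events') }}
```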
**Reviewer** (Contributor):

this doesn't include `automatic_clustering` -- should it? this is tied to my previous feedback about consolidating some of the rows so they match the headers

|[Incremental models](/reference/resource-configs/databricks-configs#incremental-models-1)|The `incremental_strategy` config determines how dbt builds incremental models. The dbt-databricks plugin supports the following incremental materialization strategies — `append`, `insert_overwrite`, `merge`, [`microbatch`](/docs/build/incremental-microbatch) and `replace_where`.|
|[Selecting compute per model](/reference/resource-configs/databricks-configs#selecting-compute-per-model)|From v1.7.2, you can assign which compute resource to use on a per-model basis. |
|[`persist_docs`](/reference/resource-configs/databricks-configs#persisting-model-descriptions)|When `persist_docs` is configured correctly, model descriptions will appear in the `Comment` field of `describe [table] extended` or `show table extended in [database] like '*'`.|
|[Default file format configurations](/reference/resource-configs/databricks-configs#default-file-format-configurations)|Use the Delta or Hudi file format as the default file format to use advanced incremental strategies features.|
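For the Databricks rows, a hedged sketch of how these configs might appear together (the model name and merge key are hypothetical):

```sql
-- models/fct_orders.sql -- hypothetical dbt-databricks model
{{
    config(
        materialized='incremental',
        incremental_strategy='merge',   -- one of append / insert_overwrite / merge / microbatch / replace_where
        file_format='delta',            -- Delta (the default) enables the advanced incremental strategies
        unique_key='order_id',          -- merge key for the incremental strategy
        persist_docs={'relation': true, 'columns': true}  -- pushes descriptions into table and column comments
    )
}}

select * from {{ ref('stg_orders') }}
```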
**Reviewer** (Contributor):

do we want to explicitly share the config?

Suggested change
|[Default file format configurations](/reference/resource-configs/databricks-configs#default-file-format-configurations)|Use the Delta or Hudi file format as the default file format to use advanced incremental strategies features.|
|[Default file format configurations](/reference/resource-configs/databricks-configs#default-file-format-configurations)| Use the Delta or Hudi file format (`file_format`) as the default file format to use advanced incremental strategies features.|

| Configuration | Description |
|------------------|----------------|
|[Iceberg table format](/reference/resource-configs/snowflake-configs#iceberg-table-format)|The dbt-snowflake adapter supports the Iceberg table format, which is available for three Snowflake materializations: [table](/docs/build/materializations#table), [incremental](/docs/build/materializations#incremental), and [dynamic tables](/reference/resource-configs/snowflake-configs#dynamic-tables).|
|[Dynamic tables](/reference/resource-configs/snowflake-configs#dynamic-tables)|Specific to Snowflake but follows the implementation of [materialized views](/docs/build/materializations#Materialized-View).|
**Reviewer** (Contributor):

can we give some more info here?

e.g. use the Snowflake-specific dynamic table materialization to create dynamic tables. Supports settings like `target_lag`, `refresh_mode`, and `on_configuration_change`.
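The settings the reviewer mentions could be sketched like this (the model name, warehouse name, and values are illustrative assumptions, not prescriptive):

```sql
-- models/dyn_orders.sql -- hypothetical dynamic table model
{{
    config(
        materialized='dynamic_table',
        snowflake_warehouse='transforming',  -- warehouse that refreshes the dynamic table
        target_lag='1 hour',                 -- maximum staleness tolerated before a refresh
        refresh_mode='AUTO',                 -- AUTO, FULL, or INCREMENTAL
        on_configuration_change='apply'      -- how dbt reacts when the config drifts from the object
    )
}}

select * from {{ ref('stg_orders') }}
```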

|[Merge behavior (incremental models)](/reference/resource-configs/snowflake-configs#merge-behavior-incremental-models)|The `incremental_strategy` config determines how dbt builds incremental models. By default, dbt uses a merge statement on Snowflake to refresh these tables. The Snowflake adapter supports the following incremental materialization strategies — `append`, `delete+insert`, `insert_overwrite`, `merge` and [`microbatch`](/docs/build/incremental-microbatch).|
|[`cluster_by`](/reference/resource-configs/snowflake-configs#configuring-table-clustering)|Use the `cluster_by` config to control clustering for a table or incremental model. It orders the table by the specified fields and adds the clustering keys to the target table.|
|[Configuring virtual warehouses](/reference/resource-configs/snowflake-configs#configuring-virtual-warehouses)|Use the `snowflake_warehouse` model configuration to override the warehouse that is used for specific models.|
|[Copying grants](/reference/resource-configs/snowflake-configs#copying-grants)|`copy_grants` = `true', dbt adds the copy grants DDL qualifier when rebuilding tables and views. The default is false.|
**Reviewer** (Contributor):

is `copy_grants = true'` the correct config? or is it `copy_grants: true` per the example? also, is the sentence missing a 'When `copy_grants` is true...' etc.?

|[`cluster_by`](/reference/resource-configs/snowflake-configs#configuring-table-clustering)|Use the `cluster_by` config to control clustering for a table or incremental model. It orders the table by the specified fields and adds the clustering keys to the target table.|
|[Configuring virtual warehouses](/reference/resource-configs/snowflake-configs#configuring-virtual-warehouses)|Use the `snowflake_warehouse` model configuration to override the warehouse that is used for specific models.|
|[Copying grants](/reference/resource-configs/snowflake-configs#copying-grants)|`copy_grants` = `true', dbt adds the copy grants DDL qualifier when rebuilding tables and views. The default is false.|
|[Secure views](/reference/resource-configs/snowflake-configs#secure-views)|Use the `secure` config for view models which can be used to limit access to sensitive data.|
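The last three rows above might combine in a single view model config; a sketch assuming a hypothetical model and warehouse name:

```sql
-- models/sensitive_view.sql -- hypothetical secure view model
{{
    config(
        materialized='view',
        secure=true,                        -- creates a Snowflake secure view to limit data exposure
        copy_grants=true,                   -- adds the COPY GRANTS qualifier when the view is rebuilt
        snowflake_warehouse='reporting_wh'  -- overrides the warehouse used for this model
    )
}}

select * from {{ ref('fct_payments') }}
```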
**Reviewer** (Contributor):

🔥

**@mirnawong1** (Contributor) commented Apr 8, 2025

hey @nataliefiann, thanks for this PR -- it was a quick turnaround!

a few comments that apply to most of the adapters:

  • suggest having a nice intro sentence before the table so users know what the table is
  • also, for consistency purposes, should you consolidate some of the rows so they match the headers? right now it seems like you've added multiple rows for a single header (e.g. sortkey). some of the other rows just highlight the h2s
  • also, you can use anchor links rather than relative links when tagging the subsections in the table (e.g. you can use #iceberg-table-format instead of /reference/resource-configs/snowflake-configs#iceberg-table-format) but up to you!
  • also suggested some descriptions be more consistent with the actual section content
  • also suggest the Snowflake doc maybe be broken up -- it's kind of long

@github-actions github-actions bot added size: large This change will take more than a week to address and might require more than one person and removed size: medium This change will take up to a week to address labels Apr 11, 2025
@nataliefiann nataliefiann changed the title TLDR Adapter config table TLDR Adapter config table (we'll be speaking with Benoit about this PR) Jun 23, 2025
@github-actions github-actions bot added the size: medium This change will take up to a week to address label Jul 9, 2025
@matthewshaver matthewshaver added the icebox For issues we're closing but will revisit at a future date if possible! label Aug 27, 2025
**@matthewshaver** (Contributor):

Iceboxing this conversation for the time being. Will re-open when appropriate.
