Presentation: Building Data Pipelines with dbt Core and Snowflake
This hands-on workshop will introduce you to the fundamentals of dbt Core for building data pipelines. You will learn how to load data, create data models, and add data quality tests and documentation using dbt Core and Snowflake. By the end of this workshop, you'll understand how to transform raw data into analytics-ready models using industry best practices.
The workshop will cover:
- Introduction to dbt Core and its architecture
- Setting up a dbt project with Snowflake
- Creating data models using SQL
- Implementing testing and documentation
- Best practices for data pipeline development
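To give a flavor of the modeling step: in dbt, a model is a plain SQL SELECT statement saved as a `.sql` file, which dbt compiles and materializes as a view or table in Snowflake. A minimal sketch, assuming a hypothetical `raw.orders` source (the actual table and column names come from the workshop datasets):

```sql
-- models/staging/stg_orders.sql -- illustrative staging model
-- dbt wraps this SELECT in the DDL needed to build a view or table.
select
    order_id,
    customer_id,
    order_date,
    amount
from {{ source('raw', 'orders') }}  -- resolves to the raw table declared in a sources .yml file
where order_id is not null           -- light cleanup belongs in staging models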
The data sources for this workshop are stored in the workshop/datasources folder. During setup, you'll load these files into your Snowflake instance following the instructions in the workshop setup guide.
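As a rough sketch of what loading one of those files can look like (the stage, table, and file names below are hypothetical; the setup guide has the authoritative steps), Snowflake uses a stage plus `PUT` and `COPY INTO`:

```sql
-- Illustrative only: upload a workshop CSV and load it into a raw table.
create or replace stage workshop_stage;

-- Run PUT from a local client such as SnowSQL; it uploads and gzips the file.
put file://workshop/datasources/orders.csv @workshop_stage;

copy into raw.orders
  from @workshop_stage/orders.csv.gz
  file_format = (type = csv, skip_header = 1);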
Prerequisites:
- A Snowflake trial account
- Python 3.7 or higher
- Basic knowledge of SQL
Run the following commands to clone the workshop repo:

```shell
git clone https://github.com/pyladiesams/data-pipelines-with-dbtcore-snowflake-sep2025
cd data-pipelines-with-dbtcore-snowflake-sep2025
```

Before attending the workshop, please follow the setup instructions in the workshop_setup.md file, including:
- Creating a Snowflake trial account
- Setting up a database and loading sample data
- Installing dbt Core locally
- Configuring your dbt environment
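Configuring your dbt environment means giving dbt Core a Snowflake connection in a `profiles.yml` file. A sketch with placeholder values (the profile name, database, and warehouse here are assumptions; they must match your Snowflake trial and the profile name in `dbt_project.yml`):

```yaml
# ~/.dbt/profiles.yml -- illustrative sketch, replace every placeholder
workshop:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: <your_account_identifier>
      user: <your_username>
      password: <your_password>
      role: <your_role>
      database: <your_database>
      warehouse: <your_warehouse>
      schema: dbt_dev
      threads: 4
```

Running `dbt debug` afterwards checks that dbt can find this profile and reach Snowflake.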
Re-watch the YouTube stream of this workshop
This workshop was set up by @pyladiesams and @anyalitica
Happy Coding :)