This project is a hands-on learning journey: you build an end-to-end analytics pipeline with dbt, PostgreSQL, and Grafana, using simulated smart meter energy data.
- Understand how dbt works (models, seeds, tests, sources)
- Practice building transformations from raw data to insights
- Explore PostgreSQL as a warehouse for analytics
- Build Grafana dashboards from SQL models
- Learn about data quality checks and incremental models
```
.
├── docker-compose.yml          # Runs Postgres, dbt, Grafana locally
├── dbt_project/
│   ├── dbt_project.yml         # dbt config file
│   ├── models/
│   │   ├── staging/            # Clean raw data into stg_ models
│   │   ├── fact/               # Business logic (e.g., energy summaries)
│   │   └── schema.yml          # Define sources, tests, and descriptions
│   ├── data/                   # Seed data in CSV form
│   └── config/profiles.yml     # dbt connection settings
└── grafana/                    # Prebuilt dashboards
```
| File/Folder | Purpose |
|---|---|
| `dbt_project.yml` | Project config (name, paths, settings) |
| `models/` | Your SQL logic and business rules |
| `schema.yml` | Tests + docs for models and sources |
| `data/` | CSVs for raw/seed data |
| `dbt_packages/` | Installed packages (ignore in Git) |
| `target/` | Compiled files and logs (ignore in Git) |
| `profiles.yml` | DB connection info |
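The connection settings in `config/profiles.yml` follow dbt's standard Postgres profile shape. A minimal sketch might look like this (the profile name, host, and credentials below are illustrative placeholders, not the project's actual values):

```yaml
# Hypothetical profiles.yml sketch — adjust host/credentials
# to match your docker-compose setup.
energy_meter_metrics:
  target: dev
  outputs:
    dev:
      type: postgres
      host: postgres        # Postgres service name (assumed)
      port: 5432
      user: dbt_user        # placeholder credentials
      password: dbt_password
      dbname: energy
      schema: analytics
      threads: 4
```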
```
git clone https://github.com/ankit-khare-2015/energy-meter-metrics.git
cd energy-meter-metrics
docker-compose up -d
```

This brings up:
- PostgreSQL (data warehouse)
- dbt (transformation engine)
- Grafana (dashboard UI)
| Seed File | Description |
|---|---|
| `households.csv` | Household metadata including city, meter ID, and installation date |
| `smart_meter_readings.csv` | Hourly readings of energy consumption, voltage, and current from each meter |
| `tariff_rates.csv` | Tariff pricing over date ranges (for cost calculation) |
These tables are referenced as sources in dbt models and form the basis for all downstream transformations.
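As an example of a downstream transformation, a thin staging model over the readings source might look like the sketch below (the file path and column names are assumptions for illustration, not the project's actual code):

```sql
-- Hypothetical staging model, e.g. models/staging/stg_smart_meter_readings.sql;
-- column names are assumed placeholders.
SELECT
    meter_id,
    reading_timestamp::timestamp AS reading_timestamp,
    energy_kwh::numeric          AS energy_kwh,
    voltage::numeric             AS voltage
FROM {{ source('raw', 'smart_meter_readings') }}
WHERE meter_id IS NOT NULL   -- drop obviously broken rows early
```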
Run these commands to load the seeds, build and test the models, and generate docs:

```
docker exec -it dbt bash
cd /usr/app
dbt seed
dbt run
dbt test
dbt docs generate
dbt docs serve
```
Console output:

```
>:/usr/app# dbt docs serve
14:31:36  Running with dbt=1.7.7
Serving docs at 8080
To access from your browser, navigate to: http://localhost:8080
```

Explore your DAG, model descriptions, and test results.
Dashboards in Grafana (http://localhost:3000)
Login: admin / admin
For each panel:
1. Click the three dots (⋮) in the panel's top-right corner
2. Select Edit
3. Then immediately click Back

This forces Grafana to re-run the SQL query and pull in the data. The step is needed only once after setup, or again if the data sources are reloaded.
| Report | Description | Model/Table |
|---|---|---|
| Daily Energy & Cost Trends | kWh & cost over time | fct_daily_energy, fct_energy_cost |
| Top 5 Households by Usage | Highest consumers by kWh | fct_daily_energy |
| City-wise Avg Consumption | Avg kWh per city | fct_daily_energy |
| Voltage Anomaly Alerts | Detect when avg voltage is out of range | fct_daily_energy |
| Cost Efficiency Overview | Cost per kWh | fct_energy_cost |
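A panel query for the daily trend report could be sketched roughly like this (column names such as `reading_date` and `total_kwh` are assumptions about the `fct_daily_energy` schema, not confirmed values — adjust them to the actual model):

```sql
-- Hypothetical Grafana panel query against fct_daily_energy;
-- column names are assumed.
SELECT
    reading_date AS "time",        -- Grafana expects a time column
    SUM(total_kwh) AS total_kwh    -- daily consumption across households
FROM fct_daily_energy
GROUP BY reading_date
ORDER BY reading_date;
```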
- seeds: load CSVs into your warehouse
- models: SQL files that transform data
- sources: define upstream tables (like `raw.smart_meter_readings`)
- tests: catch bad data early (e.g., `not_null`, `unique`)
- `ref()` and `source()`: create the dependency graph
- incremental models: only load new data (advanced)
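An incremental version of a fact model could look like the sketch below (the `unique_key` and column names are illustrative assumptions, not the project's actual code):

```sql
-- Hypothetical incremental model sketch; table/column names are assumed.
{{ config(materialized='incremental', unique_key='reading_id') }}

SELECT
    reading_id,
    household_id,
    reading_timestamp,
    energy_kwh
FROM {{ source('raw', 'smart_meter_readings') }}
{% if is_incremental() %}
  -- on incremental runs, only pull rows newer than what's already loaded
  WHERE reading_timestamp > (SELECT MAX(reading_timestamp) FROM {{ this }})
{% endif %}
```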
- Add incremental logic to `fct_daily_energy`
- Simulate streaming using Kafka → PostgreSQL
- Add more anomaly detection logic
- Create alerts in Grafana
- Deploy your pipeline to the cloud (Azure/Snowflake/BigQuery)
- Explore tools like Metabase or Apache Superset
- Automate tests using CI/CD (GitHub Actions)
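For the CI/CD idea above, a minimal GitHub Actions workflow could be sketched like this (the workflow name, service wiring, and credentials are assumptions, not an existing file in this repo):

```yaml
# Hypothetical .github/workflows/dbt-ci.yml sketch; details are illustrative.
name: dbt-ci
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:15
        env:
          POSTGRES_PASSWORD: dbt_password  # placeholder
        ports: ["5432:5432"]
    steps:
      - uses: actions/checkout@v4
      - run: pip install dbt-postgres
      - run: dbt seed && dbt run && dbt test
        working-directory: dbt_project
```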
Built as a self-learning project by Ankit Khare
Data Architect • Azure • AI • Streaming • dbt • Kafka • Postgres

