Description
dbt-autofix fails when trying to fix deprecations in a macro that contains a model configuration block. Here is an example macro that I know fails from local testing:
```sql
{% macro create_ctv_raw_model(platform) %}
{{
    config(
        materialized = 'table',
        alias = 'google_ctv_' + platform + '_tv_raw'
    )
}}
...sql logic
{% endmacro %}
```
I've left the SQL logic out; the only dbt macro we call inside it is {{ source() }}. We don't have any custom overrides of the config or source macros in our project. You can see the error message below:
```
(dbt-project-venv) PS /path/on/my/machine/dbt-project> dbt-autofix deprecations --dry-run
Error processing /path/on/my/machine/dbt-project/macros/mmm_digital/create_ctv_raw_model.sql: KeyError: 'macros'
-- Dry run mode, not applying changes --
```

We leverage uv for managing our Python runtime environments:
Python venv Packages
Package Version
----------------------------- ---------------
agate 1.9.1
annotated-types 0.7.0
anyio 4.12.1
attrs 25.4.0
azure-core 1.38.2
azure-identity 1.25.2
azure-storage-blob 12.28.0
babel 2.18.0
boto3 1.42.59
botocore 1.42.59
certifi 2026.2.25
cffi 2.0.0
charset-normalizer 3.4.4
click 8.3.1
colorama 0.4.6
croniter 6.0.0
cryptography 46.0.5
daff 1.4.2
db-dtypes 1.5.0
dbt-adapters 1.22.6
dbt-bigquery 1.11.0
dbt-common 1.37.2
dbt-core 1.11.6
dbt-extractor 0.6.0
dbt-loom 0.9.4
dbt-protos 1.0.431
dbt-semantic-interfaces 0.9.0
deepdiff 8.6.1
distro 1.9.0
docstring-parser 0.17.0
fastjsonschema 2.21.2
gitdb 4.0.12
gitpython 3.1.46
google-api-core 2.30.0
google-auth 2.48.0
google-auth-oauthlib 1.3.0
google-cloud-aiplatform 1.139.0
google-cloud-bigquery 3.40.1
google-cloud-core 2.5.0
google-cloud-dataproc 5.25.0
google-cloud-resource-manager 1.16.0
google-cloud-storage 3.1.1
google-crc32c 1.8.0
google-genai 1.65.0
google-resumable-media 2.8.0
googleapis-common-protos 1.72.0
grpc-google-iam-v1 0.14.3
grpcio 1.78.0
grpcio-status 1.78.0
h11 0.16.0
httpcore 1.0.9
httpx 0.28.1
idna 3.11
importlib-metadata 8.7.1
isodate 0.7.2
jinja2 3.1.6
jmespath 1.1.0
jsonschema 4.26.0
jsonschema-specifications 2025.9.1
jupyter-core 5.9.1
leather 0.4.1
markdown-it-py 4.0.0
markupsafe 3.0.3
mashumaro 3.14
mdurl 0.1.2
more-itertools 10.8.0
msal 1.35.0
msal-extensions 1.3.1
msgpack 1.1.2
nbformat 5.10.4
networkx 3.6.1
numpy 2.4.2
oauthlib 3.3.1
orderly-set 5.5.0
packaging 26.0
pandas 2.3.3
pandas-gbq 0.33.0
paradime-io 4.19.0
parsedatetime 2.6
pathspec 0.12.1
platformdirs 4.9.2
proto-plus 1.27.1
protobuf 6.33.5
psutil 7.2.2
pyarrow 23.0.1
pyasn1 0.6.2
pyasn1-modules 0.4.2
pycparser 3.0
pydantic 2.12.5
pydantic-core 2.41.5
pydata-google-auth 1.9.1
pygments 2.19.2
pyjwt 2.11.0
python-dateutil 2.9.0.post0
python-dotenv 1.2.2
python-slugify 8.0.4
pytimeparse 1.1.8
pytz 2025.2
pyyaml 6.0.3
referencing 0.37.0
requests 2.32.5
requests-oauthlib 2.0.0
rich 14.3.3
rpds-py 0.30.0
rsa 4.9.1
s3transfer 0.16.0
setuptools 82.0.0
six 1.17.0
smmap 5.0.2
sniffio 1.3.1
snowplow-tracker 1.1.0
sqlparse 0.5.4
tenacity 9.1.4
text-unidecode 1.3
traitlets 5.14.3
types-networkx 3.6.1.20260210
types-pyyaml 6.0.12.20250915
typing-extensions 4.15.0
typing-inspection 0.4.2
tzdata 2025.3
urllib3 2.6.3
websockets 16.0
zipp 3.23.0

The project parses and compiles fine, but dbt-autofix falls over immediately when the config block is in a macro. If I comment out the config() call, the same command passes with no issue:
```
(dbt-project-venv) PS /path/on/my/machine/dbt-project> dbt-autofix deprecations --dry-run
-- Dry run mode, not applying changes --
```

I've looked through the dbt-autofix code, and even searched the whole project, but I cannot find where this error is introduced. Setting config from inside a macro appears to be a supported dbt-core feature, so I'm not sure why autofix fails on this type of macro call. We have about 30-40 macros with this exact behaviour, used across about 700 models, so this is quite a big problem for getting our project up to the latest standard, ready for the dbt Fusion engine.
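As an interim step while this is open, a short script can at least list which macro files contain a config() call, so they can be temporarily commented out before running dbt-autofix and restored afterwards. This is a sketch under my own assumptions — the macros directory layout and the regex are mine, not anything from dbt-autofix:

```python
import re
from pathlib import Path

# Matches a Jinja expression block invoking config(...), e.g.
# {{ config(materialized='table', ...) }}, including multi-line forms
# (\s* matches newlines by default).
CONFIG_RE = re.compile(r"\{\{\s*config\s*\(", re.IGNORECASE)

def macros_with_config(macros_dir: str) -> list[Path]:
    """Return macro .sql files under macros_dir that call config()."""
    hits = []
    for path in sorted(Path(macros_dir).rglob("*.sql")):
        if CONFIG_RE.search(path.read_text()):
            hits.append(path)
    return hits
```

Running `macros_with_config("macros")` from the project root would at least scope the blast radius (for us, the 30-40 affected macros) without hand-auditing every file.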
Any help on this would be much appreciated 🙏🙏