Integrating dbt Core with SYNQ
This guide shows you how to connect your dbt Core project to SYNQ to track model runs, test results, and metadata changes.
Prerequisites:
⏱️ Estimated time: 15 minutes
Using dbt Cloud? You can integrate directly through Settings → Integrations → Add Integration → dbt Cloud instead of following this guide.
Integration name: Enter a descriptive name (e.g., `Production dbt Core`).
Generate token: Click Create to generate your integration token. You'll use this token with the `synq-dbt` tool to send artifacts securely to SYNQ.
Git integration: Select your Git provider to link model changes to repository commits. This enables change tracking and lineage visualization.
Relative path to dbt: If your dbt project isn't in the repository root, specify the directory path (e.g., `analytics/dbt/`).
Manage access tokens through Settings → Integrations: select your dbt Core integration and click Manage tokens.
From the token management screen, you can:
`synq-dbt` is a command-line wrapper that runs your existing dbt Core commands and automatically uploads artifacts to SYNQ. It's compatible with any dbt Core version and works seamlessly with orchestration tools like Airflow, GitHub Actions, and Dagster.
Collected artifacts:

- `manifest.json` — Project structure and dependencies
- `run_results.json` — Execution status and performance metrics
- `catalog.json` — Complete data warehouse schema information
- `sources.json` — Source freshness test results

How it works: `synq-dbt` runs your dbt Core command as usual, then collects the artifacts listed above and uploads them to SYNQ, authenticating with the token in the `SYNQ_TOKEN` environment variable.

Choose the installation method that matches your dbt orchestration setup:
Set environment variable: In the Airflow UI, create a new environment variable `SYNQ_TOKEN` with your integration token.
Update Dockerfile:
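If your Airflow workers run from a custom image, one way to include the wrapper is to bake the binary into the image. The sketch below is illustrative only: the base image tag, download URL, and release asset name are assumptions — substitute your own image and the current artifact from the synq-dbt GitHub releases page.

```dockerfile
# Hypothetical sketch: install the synq-dbt binary into a custom Airflow image.
# Base image tag, URL, and asset name are assumptions — adjust to your setup.
FROM apache/airflow:2.9.2

USER root
# Download the synq-dbt binary and make it executable.
RUN curl -sSL -o /usr/local/bin/synq-dbt \
      https://github.com/getsynq/synq-dbt/releases/latest/download/synq-dbt-amd64-linux \
 && chmod +x /usr/local/bin/synq-dbt
USER airflow
```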
Set environment variable: Create `SYNQ_TOKEN` in the Airflow UI.
Install synq-dbt:
Configure environment: Add `SYNQ_TOKEN=<your-token>` to your `.env` file.
Update resources in `definitions.py` (or your asset definitions in `assets.py`):

Add to your Dockerfile:
Download and install:
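The commands below are a sketch of installing the prebuilt binary; the release URL, asset name, and install path are assumptions — check the synq-dbt GitHub releases page for the artifact matching your OS and architecture.

```shell
# Hypothetical install sketch — URL and asset name are assumptions; see the
# synq-dbt releases page for the correct artifact for your platform.
curl -sSL -o /usr/local/bin/synq-dbt \
  https://github.com/getsynq/synq-dbt/releases/latest/download/synq-dbt-amd64-linux
chmod +x /usr/local/bin/synq-dbt

# Verify the binary is on PATH
synq-dbt --version
```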
Replace your existing dbt Core commands with `synq-dbt`:
All dbt Core arguments and options work exactly the same way.
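For example, a typical invocation swaps the `dbt` binary for `synq-dbt` and leaves everything else unchanged (the selector below is illustrative):

```shell
# Before: plain dbt Core
#   dbt run --select my_model+
#   dbt test

# After: same commands wrapped by synq-dbt; all arguments pass through unchanged
synq-dbt run --select my_model+
synq-dbt test
```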
If you have already generated dbt artifacts and want to upload them to SYNQ:
Include dbt logs:
If your `SYNQ_TOKEN` doesn't start with `st-`, you're using a legacy token that will be deprecated. Migrate to v2 tokens for improved security and performance.
Migration steps:

1. Generate a new v2 token through Settings → Integrations → Manage tokens.
2. Replace your `SYNQ_TOKEN` environment variable with the new token (starts with `st-`).
3. Test the integration to ensure artifacts upload successfully.
v2 tokens require synq-dbt v1.8.0 or later. Legacy tokens will continue working during the transition period.
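A quick way to tell which token generation you have is to check the prefix in your shell (a minimal sketch; the token value shown is a hypothetical placeholder):

```shell
# Minimal sketch: detect whether SYNQ_TOKEN is a v2 token.
# v2 tokens start with "st-"; anything else is a legacy token.
SYNQ_TOKEN="st-example-token"   # hypothetical placeholder value

case "$SYNQ_TOKEN" in
  st-*) echo "v2 token" ;;
  *)    echo "legacy token - migrate to v2" ;;
esac
```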
Environment variables:

- `SYNQ_TOKEN` — Your integration token (required)
- `SYNQ_TARGET_DIR` — Artifact directory path (default: `target/`)

Network requirements:

- `dbtapi.synq.io:443` — when using a legacy token or the old uploader
- `developer.synq.io:443`
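To check that your runner can reach these endpoints, a quick sketch (assumes netcat's `nc` is available on the host):

```shell
# Probe the SYNQ endpoints on port 443 (assumes `nc` from netcat is installed).
for host in dbtapi.synq.io developer.synq.io; do
  if nc -z -w 5 "$host" 443; then
    echo "$host: reachable"
  else
    echo "$host: blocked"
  fi
done
```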
Data visibility:
For advanced configuration options and troubleshooting, see the synq-dbt GitHub repository.