# dbt Fusion + Hedra Local Development

Use the dbt Fusion CLI for local development and debugging on the Hedra analytics project. Fusion runs against Snowflake via `profiles.yml` (no dbt Cloud session), so you can iterate quickly and isolate failing models without “session occupied” errors or long Cloud invocations.
References: Install Fusion CLI, Fusion availability.
For the recommended dev loop (run in dev → validate → PR), see dbt Dev Database Loop.
## When to use which

| Tool | Use for |
|---|---|
| dbt Fusion (`dbt` or `dbtf` from `~/.local/bin`) | Local runs: `compile`, `run`, `test`, `build`. Fast feedback, no cloud session limit. Use to isolate and fix failing models (e.g. `dbt run --select int_user_subscriptions`). |
| dbt Cloud CLI (e.g. Homebrew `dbt` at `/opt/homebrew/bin/dbt`) | Runs that execute in dbt Cloud (same as Cloud jobs: deferral, platform credentials). Use when you need to match Cloud behavior or run full builds in Cloud. |
If both are installed, `PATH` order decides which `dbt` you get. For Hedra local dev, use Fusion: ensure `~/.local/bin` comes before `/opt/homebrew/bin` when working in the Hedra repo, or call `dbtf` explicitly.
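To confirm which binary wins, a quick check (a sketch; adjust the paths if your installs live elsewhere):

```sh
# List every dbt on PATH in resolution order; the first entry is what runs
which -a dbt

# If Homebrew's dbt comes first, put Fusion ahead for this shell session
export PATH="$HOME/.local/bin:$PATH"

# Or bypass PATH ordering entirely by calling the Fusion binary directly
dbtf --version
```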
## Prerequisites

- Fusion installed (e.g. `curl -fsSL https://public.cdn.getdbt.com/fs/install/install.sh | sh -s -- --update`). Verify with `dbtf --version`. A spelled-out install sequence follows this list.
- Snowflake access for Hedra (see Snowflake CLI and, in brainforge-platform, snowflake-verify for key-pair setup and verification scripts).
- `~/.dbt/profiles.yml` with a `hedra_analytics` profile (see below). Do not commit credentials.
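As commands, the install-and-verify step looks like this (same script as above; Fusion installs into `~/.local/bin`):

```sh
# Install or update the dbt Fusion CLI
curl -fsSL https://public.cdn.getdbt.com/fs/install/install.sh | sh -s -- --update

# Confirm the Fusion binary is available
dbtf --version
```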
## `profiles.yml` for Hedra (Fusion)

Fusion uses `profiles.yml` from `~/.dbt/` (or the project root, or `--profiles-dir`). It does not use `dbt_cloud.yml`.

- Copy the structure from `hedra/analytics/dbt_project/profiles.example.yml` into `~/.dbt/profiles.yml` under the key `hedra_analytics` (a sketch follows this list).
- Set credentials via env vars (or a `.env` in the project root; Fusion loads `.env` from the current working directory). Example env vars: `SNOWFLAKE_ACCOUNT`, `SNOWFLAKE_USER`, `SNOWFLAKE_ROLE`, `SNOWFLAKE_DATABASE`, `SNOWFLAKE_SCHEMA`.
- For key-pair auth: `SNOWFLAKE_PRIVATE_KEY_PATH` (path to your .p8 key). See `knowledge/clients/hedra/resources/snowflake-verify/` (KEYPAIR_SETUP.md) and the Snowflake CLI setup for key-pair auth.
- Use a dev target (e.g. `DEV_MARTS` + a personal or dev schema) so you don’t write to prod during debugging.
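A minimal sketch of what the `hedra_analytics` profile can look like, assuming the standard dbt Snowflake connection fields and the env vars above; the authoritative structure is `profiles.example.yml` in the repo, and the target name, warehouse, and `DEV_MARTS` database here are illustrative:

```yaml
# ~/.dbt/profiles.yml
hedra_analytics:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: "{{ env_var('SNOWFLAKE_ACCOUNT') }}"
      user: "{{ env_var('SNOWFLAKE_USER') }}"
      role: "{{ env_var('SNOWFLAKE_ROLE') }}"
      # Key-pair auth: path to your .p8 private key
      private_key_path: "{{ env_var('SNOWFLAKE_PRIVATE_KEY_PATH') }}"
      database: "{{ env_var('SNOWFLAKE_DATABASE') }}"  # e.g. DEV_MARTS
      schema: "{{ env_var('SNOWFLAKE_SCHEMA') }}"      # personal or dev schema
      warehouse: DEV_WH                                # assumption: your dev warehouse
      threads: 4
```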
## Running from the Hedra project

Always run from the dbt project directory (where `dbt_project.yml` lives): `cd hedra/analytics/dbt_project`. Then, for example (a sample session follows this list):

- Debug connection: `dbt debug` (or `dbtf debug`)
- Compile: `dbt compile`
- Run one model (isolate failures): `dbt run --select int_user_subscriptions`
- Run and test one model: `dbt build --select int_user_subscriptions`
- Full build (long): `dbt build`
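Put together, a typical local session might look like this (model name taken from the example above):

```sh
cd hedra/analytics/dbt_project

# Verify the profile resolves and Snowflake is reachable
dbt debug

# Compile everything, then run and test just the model under investigation
dbt compile
dbt build --select int_user_subscriptions
```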
Fusion will use `profile: hedra_analytics` from `dbt_project.yml` and the matching profile in `~/.dbt/profiles.yml`.
## Isolating failing models

When a dbt Cloud job fails on a specific model:

- From `hedra/analytics/dbt_project`, run that model locally with Fusion: `dbt run --select <model_name>` (see the sketch after this list).
- Fix any SQL or config errors (e.g. a missing `with`, wrong column refs).
- Re-run the same command until it succeeds, then run downstreams if needed: `dbt run --select <model_name>+`
- Push changes and re-run the Cloud job (or let the next schedule run).
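Concretely, using the failing model from the earlier example, the loop is:

```sh
cd hedra/analytics/dbt_project

# Reproduce the Cloud failure locally
dbt run --select int_user_subscriptions

# ...edit the model SQL, re-run until it succeeds...

# Then rebuild the model plus everything downstream of it
dbt run --select int_user_subscriptions+
```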
This avoids “session occupied” errors and long Cloud invocations while debugging.
## Optional: `.env` in project root

To keep secrets out of `profiles.yml`, put a `.env` in the directory you run dbt from (e.g. `hedra/analytics/dbt_project/`):

```
SNOWFLAKE_ACCOUNT=udvfvcl-dqb33810
SNOWFLAKE_USER=Uttam
SNOWFLAKE_ROLE=role_prod_write
SNOWFLAKE_PRIVATE_KEY_PATH=/path/to/your/.p8
```

Reference these in `~/.dbt/profiles.yml` with `{{ env_var('SNOWFLAKE_ACCOUNT') }}`, etc. (as in the profile sketch above). Add `.env` to `.gitignore` (do not commit); see the one-liner below.
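For instance, a guard that only appends the entry once (run from the repo root):

```sh
# Add .env to .gitignore if it's not already listed
grep -qxF '.env' .gitignore || echo '.env' >> .gitignore
```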
## Summary

- Fusion = local dev, fast iteration, isolate failures using `profiles.yml` + Snowflake.
- Use a dev database loop when modeling: run in dev → validate in dev → show impact → iterate. See dbt Dev Database Loop.
- dbt Cloud CLI = run in dbt Cloud, match production/Cloud job behavior.
- Hedra project: run from `hedra/analytics/dbt_project`; use the `hedra_analytics` profile and a dev Snowflake target.