# Brainforge Internal Data Platform dbt Project

This dbt project initializes the transformation layer for the internal data platform.

- Project path: `knowledge/engineering/data-platform/dbt/`
- Profile name: `brainforge_internal`
- Reference scaffold: `knowledge/engineering/data-platform/examples/dbt-test-slim/`
## Scope

- Staging:
  - `stg_clockify_time_entries` from `RAW.CLOCKIFY.TIME_ENTRIES`
  - `stg_operating_time_entries` from `RAW.OPERATING_IO.RAW_TIME_ENTRIES` (+ person/project enrichment)
- Mart:
  - `mart_delivery_time_effort` unifying Clockify + Operating time/effort

Phase 2 finance models:

- Staging:
  - `stg_quickbooks_journal_entries`
  - `stg_quickbooks_invoices`
  - `stg_quickbooks_bills`
  - `stg_quickbooks_payments`
- Marts:
  - `mart_finance_account_monthly` (monthly journal rollup at account grain with optional Finance+Ops mapping)
  - `mart_finance_summary` (month-level revenue/COGS/opex + AR/AP + cash metrics)

Source schema profiling context lives in `knowledge/engineering/data-platform/raw-schema-profiles/`.
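For orientation, here is a minimal sketch of what a staging model in this layer can look like. The source name (`clockify`), table alias, and column names are assumptions for illustration; the real definitions live in `models/staging/sources.yml` and `models/staging/clockify/stg_clockify_time_entries.sql`.

```sql
-- Hypothetical sketch of a staging model over RAW.CLOCKIFY.TIME_ENTRIES.
-- Source and column names are illustrative, not the project's actual schema.
with source as (

    select * from {{ source('clockify', 'time_entries') }}

),

renamed as (

    select
        id                  as time_entry_id,  -- assumed raw column
        user_id,                                -- assumed raw column
        project_id,                             -- assumed raw column
        time_interval_start as started_at,      -- assumed raw column
        time_interval_end   as ended_at         -- assumed raw column
    from source

)

select * from renamed
```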
## Directory layout

```text
dbt/
├── dbt_project.yml
├── profiles.yml
├── models/
│   ├── staging/
│   │   ├── sources.yml
│   │   ├── schema.yml
│   │   ├── clockify/stg_clockify_time_entries.sql
│   │   └── operating/stg_operating_time_entries.sql
│   └── marts/
│       ├── schema.yml
│       └── delivery/mart_delivery_time_effort.sql
└── .gitignore
```

## Local setup
- Install the dbt Snowflake adapter (`dbt-core` + `dbt-snowflake`) in your Python environment.
- Export Snowflake env vars (credentials from 1Password / Cursor secrets):

```bash
export SNOWFLAKE_ACCOUNT=...
export SNOWFLAKE_USER=...
export SNOWFLAKE_ROLE=ROLE_TRANSFORM
export SNOWFLAKE_WAREHOUSE=WAREHOUSE_TRANSFORM

# Option A: password auth
export SNOWFLAKE_PASSWORD=...

# Option B: key-pair auth
export SNOWFLAKE_PRIVATE_KEY='-----BEGIN PRIVATE KEY-----...'
```

If you are running in a Cursor Cloud Agent environment, the profile also accepts:
`SNOWFLAKE_CLOUD_AGENT_ACCOUNT`, `SNOWFLAKE_CLOUD_AGENT_USER`,
`SNOWFLAKE_CLOUD_AGENT_ROLE`, `SNOWFLAKE_CLOUD_AGENT_WAREHOUSE`,
`SNOWFLAKE_CLOUD_AGENT_DATABASE`, and `SNOWFLAKE_PRIVATE_KEY`.
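The committed `profiles.yml` resolves these variables with dbt's `env_var()` function. A minimal sketch of that pattern, simplified to a single target (the real profile has more targets plus the Cloud Agent fallbacks):

```yaml
# Illustrative sketch only -- see the committed profiles.yml for the real targets.
brainforge_internal:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: "{{ env_var('SNOWFLAKE_ACCOUNT') }}"
      user: "{{ env_var('SNOWFLAKE_USER') }}"
      password: "{{ env_var('SNOWFLAKE_PASSWORD', '') }}"
      # key-pair auth fields omitted here for brevity
      role: "{{ env_var('SNOWFLAKE_ROLE', 'ROLE_TRANSFORM') }}"
      warehouse: "{{ env_var('SNOWFLAKE_WAREHOUSE', 'WAREHOUSE_TRANSFORM') }}"
      database: "{{ env_var('SNOWFLAKE_INTERMEDIATE_DATABASE', 'DEV_INTERMEDIATE') }}"
      schema: "{{ env_var('SNOWFLAKE_SCHEMA', 'DELIVERY') }}"
      threads: "{{ env_var('DBT_THREADS', '8') | as_number }}"
```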
Optional environment overrides used by this project:

```bash
export SNOWFLAKE_RAW_DATABASE=RAW
export SNOWFLAKE_INTERMEDIATE_DATABASE=DEV_INTERMEDIATE
export SNOWFLAKE_STAGING_DATABASE=STG_INTERMEDIATE
export SNOWFLAKE_PROD_DATABASE=PROD_INTERMEDIATE
export SNOWFLAKE_MARTS_DATABASE=DEV_MARTS
export SNOWFLAKE_STAGING_MARTS_DATABASE=STG_MARTS
# Optional for non-dev targets:
# export SNOWFLAKE_${TARGET^^}_MARTS_DATABASE=<TARGET_MARTS_DB>
export SNOWFLAKE_SCHEMA=DELIVERY
export DBT_THREADS=8

# Optional Finance+Ops account mapping relation (used by mart_finance_account_monthly when present):
# export SNOWFLAKE_FINANCE_MAPPING_DATABASE=<database>
# export SNOWFLAKE_FINANCE_MAPPING_SCHEMA=FINANCE_OPS
# export SNOWFLAKE_FINANCE_MAPPING_IDENTIFIER=QUICKBOOKS_ACCOUNT_MAPPING
```

`SNOWFLAKE_STG_MARTS_DATABASE` is also accepted as a compatibility alias for `SNOWFLAKE_STAGING_MARTS_DATABASE`.
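One way a model can consume the optional mapping is to gate the join on whether the env var is set. A hypothetical Jinja sketch (the actual logic in `mart_finance_account_monthly` may differ, and the column names here are assumptions):

```sql
-- Hypothetical sketch: join the Finance+Ops mapping only when configured.
-- Column names (transaction_date, amount, account_id, ops_category) are assumed.
{% set mapping_db = env_var('SNOWFLAKE_FINANCE_MAPPING_DATABASE', '') %}

select
    j.account_id,
    date_trunc('month', j.transaction_date) as month_start,
    sum(j.amount) as amount
    {% if mapping_db %}
    , m.ops_category
    {% endif %}
from {{ ref('stg_quickbooks_journal_entries') }} as j
{% if mapping_db %}
left join {{ mapping_db }}.{{ env_var('SNOWFLAKE_FINANCE_MAPPING_SCHEMA', 'FINANCE_OPS') }}.{{ env_var('SNOWFLAKE_FINANCE_MAPPING_IDENTIFIER', 'QUICKBOOKS_ACCOUNT_MAPPING') }} as m
    on j.account_id = m.account_id
{% endif %}
group by all
```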
Target routing defaults:

- `dbt --target dev`:
  - staging models → profile database (default `DEV_INTERMEDIATE`)
  - marts models → `SNOWFLAKE_MARTS_DATABASE` (default `DEV_MARTS`)
- `dbt --target staging`:
  - staging models → profile database (default `STG_INTERMEDIATE`)
  - marts models → `SNOWFLAKE_STAGING_MARTS_DATABASE` (default `STG_MARTS`)
- `dbt --target prod`:
  - staging models → profile database (default `PROD_INTERMEDIATE`)
  - marts models → `SNOWFLAKE_${TARGET^^}_MARTS_DATABASE` (fallback derives from the target database by replacing `_INTERMEDIATE` with `_MARTS`)
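Routing like this is conventionally implemented by overriding dbt's `generate_database_name` macro. A simplified sketch of the idea (an assumption about the approach, not this project's exact macro):

```sql
-- Hypothetical macros/generate_database_name.sql:
-- route marts to a *_MARTS database, everything else to the profile database.
{% macro generate_database_name(custom_database_name, node) -%}
    {%- if custom_database_name is not none -%}
        {{ custom_database_name | trim }}
    {%- elif node.fqn[1] == 'marts' -%}
        {# fall back to the profile DB with _INTERMEDIATE swapped for _MARTS #}
        {{ env_var('SNOWFLAKE_MARTS_DATABASE', target.database | replace('_INTERMEDIATE', '_MARTS')) }}
    {%- else -%}
        {{ target.database }}
    {%- endif -%}
{%- endmacro %}
```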
## Run commands

From repository root:

```bash
dbt debug \
  --project-dir knowledge/engineering/data-platform/dbt \
  --profiles-dir knowledge/engineering/data-platform/dbt

dbt compile \
  --project-dir knowledge/engineering/data-platform/dbt \
  --profiles-dir knowledge/engineering/data-platform/dbt

dbt parse \
  --project-dir knowledge/engineering/data-platform/dbt \
  --profiles-dir knowledge/engineering/data-platform/dbt

dbt run \
  --project-dir knowledge/engineering/data-platform/dbt \
  --profiles-dir knowledge/engineering/data-platform/dbt \
  --select stg_clockify_time_entries stg_operating_time_entries mart_delivery_time_effort

dbt docs generate \
  --project-dir knowledge/engineering/data-platform/dbt \
  --profiles-dir knowledge/engineering/data-platform/dbt

dbt docs serve \
  --project-dir knowledge/engineering/data-platform/dbt \
  --profiles-dir knowledge/engineering/data-platform/dbt \
  --port 8080

dbt run \
  --project-dir knowledge/engineering/data-platform/dbt \
  --profiles-dir knowledge/engineering/data-platform/dbt \
  --select stg_quickbooks_journal_entries stg_quickbooks_invoices stg_quickbooks_bills stg_quickbooks_payments mart_finance_account_monthly mart_finance_summary
```

## Data freshness and lineage reference
See `knowledge/engineering/data-platform/data-freshness-and-lineage.md` for:

- raw source freshness checkpoints and ownership (`RAW.CLOCKIFY`, `RAW.OPERATING_IO`, `RAW.POLYTOMIC_QUICKBOOKS`, `RAW.POLYTOMIC_HUBSPOT`)
- source → staging → mart dependency mapping
- where `dbt docs` artifacts are generated (`target/manifest.json`, `target/catalog.json`)
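The freshness checkpoints correspond to dbt source freshness checks. A sketch of how a source block in `models/staging/sources.yml` could declare one (the thresholds and `loaded_at_field` column name here are assumptions):

```yaml
# Illustrative freshness block; thresholds and the loaded-at column are assumed.
version: 2

sources:
  - name: clockify
    database: "{{ env_var('SNOWFLAKE_RAW_DATABASE', 'RAW') }}"
    schema: CLOCKIFY
    loaded_at_field: _LOADED_AT   # hypothetical ingestion timestamp column
    freshness:
      warn_after: {count: 24, period: hour}
      error_after: {count: 48, period: hour}
    tables:
      - name: TIME_ENTRIES
```

Run the checks with `dbt source freshness` (same `--project-dir` / `--profiles-dir` flags as above).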
## Scheduled builds (GitHub Actions)

- Workflow: `.github/workflows/dbt-build-schedule.yml`
- Schedule: daily at 06:00 UTC
- Manual run: GitHub Actions → dbt build schedule → Run workflow
- Manual target options: `dev` (default), `staging`, `prod`

The workflow runs `dbt build` from this project directory and reads Snowflake credentials from GitHub secrets.
At minimum, set `SNOWFLAKE_ACCOUNT`, `SNOWFLAKE_USER`, and one auth secret (`SNOWFLAKE_PASSWORD`, `SNOWFLAKE_PRIVATE_KEY`, or `SNOWFLAKE_PRIVATE_KEY_PATH`).
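A trimmed sketch of the workflow's shape, under the assumptions above (the committed `.github/workflows/dbt-build-schedule.yml` is authoritative):

```yaml
# Illustrative sketch of the scheduled-build workflow, not the committed file.
name: dbt build schedule
on:
  schedule:
    - cron: "0 6 * * *"   # daily at 06:00 UTC
  workflow_dispatch:
    inputs:
      target:
        type: choice
        default: dev
        options: [dev, staging, prod]
jobs:
  build:
    runs-on: ubuntu-latest
    env:
      SNOWFLAKE_ACCOUNT: ${{ secrets.SNOWFLAKE_ACCOUNT }}
      SNOWFLAKE_USER: ${{ secrets.SNOWFLAKE_USER }}
      SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}
    steps:
      - uses: actions/checkout@v4
      - run: pip install dbt-core dbt-snowflake
      - run: |
          dbt build \
            --project-dir knowledge/engineering/data-platform/dbt \
            --profiles-dir knowledge/engineering/data-platform/dbt \
            --target ${{ github.event.inputs.target || 'dev' }}
```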
## Notes

- `profiles.yml` is committed with environment variable references only (no secrets).
- Default targets are configured for `dev`, `staging`, and `prod` in profile outputs.
- Use target switching (`--target staging` / `--target prod`) as deployments evolve.
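For example, to build everything against the staging databases (same flags as the run commands above, plus `--target`):

```bash
dbt build \
  --project-dir knowledge/engineering/data-platform/dbt \
  --profiles-dir knowledge/engineering/data-platform/dbt \
  --target staging
```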