# Engineering Setup Guides
This directory contains setup guides for CLI tools, APIs, and integrations used by Brainforge engineering teams and Cursor AI agents.
**Vendor / product index:** to browse the same tools by vendor name and find Platform-owned integration packs (e.g. HubSpot), see `knowledge/platform/integrations/README.md`.
## 🚀 Start Here: Local Development Environment
New team member? Start with the comprehensive setup guide:
📖 Local Development Environment Setup — Complete macOS setup for Platform, OpenWork, data pipelines, and all engineering workflows. Includes one-liner install script and verification checklist.
## Quick Reference

### Essential Setup
| Guide | Purpose | Required For |
|---|---|---|
| Local Development Environment | Complete macOS CLI toolchain setup | All engineers — Platform, OpenWork, data pipelines |
| Brainforge Setup Skill | Auto-diagnostic + install for MCPs, browser automation, CLIs | Run /brainforge-setup to check & install missing tools |

### Active Integrations (In Use)

| Guide | Purpose | Cursor MCP? |
|---|---|---|
| 1Password CLI | Secure credential management | No |
| Gitleaks pre-commit | Secret detection before commit | No |
| HubSpot access + API | HubSpot app first; MCP read fallback; API for agent writes/notes | Read-only MCP fallback; writes via API |
| Langfuse API | Prompt management & observability | No |
| Cursor skills (submodules) | External Cursor skills (e.g. visual-explainer) as submodules | No |
| AI Legal Assistant (Cursor + PDF) | Contract-review and drafting skills; optional ReportLab PDF | No |
| Loophole CLI | Adversarial ethics / rule stress-test tool (tools/loophole, Anthropic API) | No |
| Linear API | Ticket/issue management | Yes |
| Calendar Clockify Sync | Configure Calendar → Clockify sync tool env and APIs | No |
| Google Workspace CLI (gws) | Sheets/Drive/Docs/Gmail/Calendar from shell; GTM sheet and OKR workflows | No |
| Email agent workflows | Agent SOP for Gmail/Calendar client comms (CLI first, MCP fallback) + standards linkage | No |
| Supabase Query Access | Query Internal AI Core, Zoom, and Slack Supabase (JS client, direct Postgres, or Supabase MCP with project_id) | Yes |
| Optional PostgREST audit logging | Append-only audit rows from Node via PostgREST; env gating, RLS, truncation, non-blocking inserts | No |
| Turbopuffer API | Vector search & full-text search | No |
| Railway CLI | Deployment & hosting | Yes |
| Slack ticket approval | Approve/Reject/Reassign dry-run Linear tickets from Slack | No |
| GitHub CLI | Repository & PR management | Yes |
| GitHub + Railway without UI | Repo variables, secrets, and Railway services via CLI only (no dashboard) | No |
| Snowflake CLI | Data warehouse access | No |
| dbt Fusion + Hedra local dev | Local dbt runs for Hedra (Fusion CLI, isolate failing models) | No |
| dbt Brainforge internal data platform setup | Local dbt runs for internal data platform models (Clockify/Operating delivery marts) | No |
| OpenWork local build setup | Archived monorepo note; current Work local setup lives in brainforge-work | No |
| OpenWork Labs access policy | Internal-only access gate + usage logging for hosted Labs (labs.brainforge.ai) | No |
| OpenWork hosted ops runbook | Hosted Work operational baseline; active automation and app code now live in brainforge-work | No |
| OpenWork Labs Railway deploy | Hosted Work Railway deploy notes; source repo is brainforge-work | Yes |
| files.brainforge.ai Railway migration | Move files.brainforge.ai from Heroku to Railway (DNS + custom domain) | No |
| OpenWork hosted runtime contract | Storage/env contract for hosted Work runtime | No |
| Azure OpenAI | Base URL, deployments, Cursor config | No |
| Codex | Codex app/CLI/IDE, Azure East US 2, Local Environments | No |
| OpenAI admin usage & cleanup | Platform usage by user, projects/keys report; ChatGPT Team CSV analysis for seat cleanup | No |
| Rill | Delivery & financial dashboards (rill start) | No |
| Omni CLI (Cursor + Blobby) | OmniSync / Model Local Editor; Cursor + Blobby for Omni chart build and migration | No |
| Rill production credentials (service_user_report) | Configure/rotate Snowflake DSN in Rill Cloud projects | No |
| Cursor Cloud Agent (Snowflake + Rill) | Service account and env vars for Cursor Cloud Agents | No |
| Granola MCP | Meeting transcript retrieval and querying | Yes |
| MotherDuck CLI | DuckDB CLI install, MOTHERDUCK_TOKEN, connect to MotherDuck (prereq for all MD CLI work) | No |
| MotherDuck — Default export load | Land Default (Calendly-style) CSV exports into my_db + upsert raw_export | No |
| Hex MCP | Query Hex apps (e.g. GTM Daily Metrics Tracker) for ARR/metrics | Yes |
| Notion MCP | Read/write Notion pages and databases (OAuth per user) | Yes |
| Google Cloud CLI | GCP access, project for Google AI (Vertex/Gemini), startup credits | No |
| Figma | Design context from Figma files; asset map in apps/platform/docs/figma-assets-map.md | Yes |
| Webflow API | brainforge.ai site + CMS for migration | No |
| Linear MCP | Ticket/issue management (in .cursor/mcp.json) | Yes |
| Operating MCP + API | Operating.app integration — MCP for common ops, Direct API for project/client management | Yes |

### Legacy/Migration (Being Phased Out)

| Guide | Purpose | Status |
|---|---|---|
| n8n API | Workflow automation | Migrating to Mastra |
| Windmill CLI | Pipeline orchestration | Migrating to Railway |
| Dagster CLI | Data orchestration | Migrating to inline processing |
| Dagster inline migration tracker | Pipeline-by-pipeline migration status | Active tracker |
| Dagster transition deploy strategy | Hosting/cutover approach during migration | Active strategy |

## Cursor AI Agent Access

### MCP Integrations (Built-in)

These services have native Cursor MCP support; agents can call them directly:
- **Linear** - `user-linear-*` tools (in repo `.cursor/mcp.json`; OAuth on first use)
- **Railway** - `user-Railway-*` tools
- **GitHub** - `user-github-*` tools
- **Figma** - design context from Figma files (in repo `.cursor/mcp.json`; OAuth on first use)
- **Granola** - `mcp_brainforge-platform-granola_*` tools (meeting transcripts; OAuth required; see granola-mcp-setup.md for rate limit guidance)
- **Hex** - query Hex apps for ARR/metrics; OAuth required; see hex-mcp-setup.md
- **Notion** - read/write Notion; OAuth on first use, per user; see notion-mcp-setup.md
- **Operating** - Operating.app integration via MCP (in repo `.cursor/mcp.json`). For project/client management (create, archive, rename), use the Direct API.
- **Supabase** - `user-supabase` tools (`execute_sql`, `list_tables`) with `project_id` for Zoom (`viqeppmsqvwpslpvttkk`) and Slack (`oqtkgsndvitzyfzwcdoz`); see supabase-query-access.md
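
Project-scoped MCP servers are declared in the repo's `.cursor/mcp.json`. An illustrative sketch of the two common entry shapes (the server names, URL, and package below are placeholders, not real entries; the authoritative configuration lives in the repo file):

```json
{
  "mcpServers": {
    "example-remote": {
      "url": "https://example-mcp-host.invalid/sse"
    },
    "example-local": {
      "command": "npx",
      "args": ["-y", "@example/mcp-server"]
    }
  }
}
```

Remote (`url`) servers typically prompt for OAuth on first use, matching the Linear and Figma entries above; `command` servers are launched locally by Cursor.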

### API Integrations (Manual)

These services require API calls via shell commands:
- **HubSpot** - use the HubSpot app first for normal review/manual work, the HubSpot MCP as a read-only fallback in Cursor, and the REST API for agent-performed writes such as notes/activities. See hubspot-api-setup.md.
- **Langfuse** - REST API with Basic Auth
- **Turbopuffer** - REST API with Bearer token
- **1Password** - CLI tool (`op`)
- **Google Cloud (gcloud)** - CLI for GCP access; use when a user asks about creating a GCP project, Google AI/Vertex/Gemini, or startup credits. See google-cloud-cli-setup.md.
- **Google Workspace (MCP + gws)** - for Cursor/agent workflows, prefer the Brainforge Google Workspace MCP first (see setup-google-workspace-mcp.md), then fall back to the `gws` CLI for bulk/iterative work or when the MCP is unavailable. See google-workspace-cli-setup.md.
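
The Langfuse and Turbopuffer patterns above differ only in how the `Authorization` header is built. A minimal shell sketch (the key values are placeholders, and the commented request URLs are illustrative rather than verified endpoints):

```shell
# Langfuse uses Basic Auth: base64 of "public_key:secret_key" (placeholder keys)
LANGFUSE_PUBLIC_KEY="pk-lf-example"
LANGFUSE_SECRET_KEY="sk-lf-example"
BASIC_AUTH=$(printf '%s:%s' "$LANGFUSE_PUBLIC_KEY" "$LANGFUSE_SECRET_KEY" | base64)
echo "Authorization: Basic $BASIC_AUTH"

# Turbopuffer uses a Bearer token taken directly from the API key (placeholder key)
TURBOPUFFER_API_KEY="tpuf-example"
echo "Authorization: Bearer $TURBOPUFFER_API_KEY"

# With real keys, a request would look like:
#   curl -s -H "Authorization: Basic $BASIC_AUTH" "$LANGFUSE_HOST/api/public/..."
#   curl -s -H "Authorization: Bearer $TURBOPUFFER_API_KEY" "$TURBOPUFFER_HOST/..."
```

In practice, pull the key values from 1Password (for example via `op read`) rather than hardcoding them in scripts.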

## Credential Management

All credentials are stored in 1Password. Common vaults:

| Vault | Purpose |
|---|---|
| Brainforge AI Team | Shared team credentials |
| Employee | Personal API keys |
Access credentials:

```shell
# List items in a vault
op item list --vault "Brainforge AI Team"

# Get a specific item
op item get "Item Name" --vault "Vault Name"
```

## Adding New Guides
When documenting a new CLI/API integration:
- **Use the existing format** - follow the structure of existing guides
- **Include the 1Password location** - where to find credentials
- **Show the authentication pattern** - how to authenticate
- **Provide common operations** - practical examples
- **Note Cursor integration** - MCP support or manual API calls
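
Following that checklist, a new guide can start from a skeleton like this (the section names are a suggestion, not a mandate):

```markdown
# <Tool> Setup

## Credentials
1Password vault and item where the key lives.

## Authentication
How to authenticate (env vars, `op read`, OAuth, etc.).

## Common Operations
A few practical commands or API calls.

## Cursor Integration
Native MCP support, or manual API calls from the shell.
```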

## Questions?

For questions about these integrations, contact the AI Team.