Dagster CLI Setup for Cursor AI Agents

Version: 1.3
Date: March 7, 2026
Owner: AI Team


1. Purpose

This guide helps Cursor agents inspect and migrate Dagster jobs now that the Dagster codebase lives in this monorepo.

Dagster is still used for legacy orchestration, but the target state is platform-native inline processing where feasible.


2. Code Location (Monorepo)

Dagster Cloud:

  • Access via Dagster Cloud UI
  • Jobs defined in monorepo: apps/dagster-pipelines/ (see docs/dagster-migration-next-steps.md for post-merge next steps)

Dagster pipelines are now stored at:

/workspace/apps/dagster-pipelines/

Primary files:

  • apps/dagster-pipelines/pipelines/repository.py — job and schedule registration
  • apps/dagster-pipelines/pipelines/*/ — pipeline implementations/config
  • apps/dagster-pipelines/pipelines_tests/ — tests

3. Dagster Cloud Credentials (If Needed)

Most migration work should be code-first and local. For cloud inspection:

DAGSTER_CLOUD_API_TOKEN=your_token
DAGSTER_CLOUD_ORGANIZATION=brainforge

Dagster Cloud UI: https://dagster.cloud/
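
Before any cloud call, it can help to fail fast when these variables are unset. A minimal sketch; the `check_cloud_env` helper name is an invention of this guide, not a repo convention:

```shell
# Sketch: abort early if the Dagster Cloud credentials from this section
# are not exported. The helper name is invented for illustration.
check_cloud_env() {
  for var in DAGSTER_CLOUD_API_TOKEN DAGSTER_CLOUD_ORGANIZATION; do
    # Indirect expansion via eval keeps this POSIX-sh compatible.
    eval "val=\${$var:-}"
    if [ -z "$val" ]; then
      echo "missing: $var" >&2
      return 1
    fi
  done
  echo "cloud env ok"
}
```

Typical usage: `check_cloud_env && dagster-cloud --help`.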


4. CLI Installation

# Install cloud CLI when cloud inspection is needed
pip install dagster-cloud
 
# Verify installation
dagster-cloud --help

5. Local Code Inspection (Preferred)

For most inspection needs, read the job definitions directly from the monorepo.

Repository Location (brainforge-platform monorepo):

<monorepo-root>/apps/dagster-pipelines/

Example: .../brainforge-platform/apps/dagster-pipelines/

Key Files:

  • apps/dagster-pipelines/pipelines/repository.py — job registration
  • apps/dagster-pipelines/pipelines/*/ — individual pipeline definitions

Use code inspection first before cloud operations.

5.1 List Dagster jobs

rg "@job" apps/dagster-pipelines/pipelines

5.2 List schedules and cron usage

# From monorepo root
rg "@schedule|ScheduleDefinition|cron_schedule" apps/dagster-pipelines/pipelines

5.3 Find a specific job implementation

rg "def .*_job\(" apps/dagster-pipelines/pipelines

5.4 Inspect one pipeline folder quickly

rg "def |@op|@job|config_schema" apps/dagster-pipelines/pipelines/<pipeline_name>/
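
To see what the 5.1 and 5.3 patterns actually match, here is a self-contained demo that builds a throwaway tree shaped like apps/dagster-pipelines/pipelines/ and runs equivalent greps against it. The file layout is invented; the job name is borrowed from section 8's migration example:

```shell
# Demo: exercise the inspection patterns from 5.1/5.3 against a toy tree.
# The directory and file names here are invented for illustration.
tmp=$(mktemp -d)
mkdir -p "$tmp/pipelines/zoom_meeting"
cat > "$tmp/pipelines/zoom_meeting/jobs.py" <<'EOF'
from dagster import job, op

@op
def fetch_transcript():
    ...

@job
def zoom_meeting_to_turbopuffer_job():
    fetch_transcript()
EOF

# 5.1-style: every @job decorator marks a registered job
jobs=$(grep -rn "@job" "$tmp/pipelines")
# 5.3-style: locate an implementation via the *_job naming convention
impl=$(grep -rn "def .*_job(" "$tmp/pipelines")

echo "$jobs"
echo "$impl"
rm -rf "$tmp"
```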

6. Local Runtime (Optional)

From apps/dagster-pipelines/:

python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
dagster dev -m pipelines.repository -p 3001

6.1 Cursor Cloud VM defaults for this repo

The repo-level Cursor Cloud environment now preps Dagster dependencies out of the box via:

  • .cursor/environment.json
  • .cursor/scripts/cloud-install.sh
  • .cursor/scripts/cloud-startup.sh

Cloud setup includes:

  • python3-venv plus baseline Python build dependencies (python3-dev, build-essential, pkg-config, libffi-dev, libssl-dev, libpq-dev)
  • Node.js 22 activation (via nvm, when available)
  • npm ci --legacy-peer-deps preinstalled in apps/platform/
  • Creating apps/dagster-pipelines/.venv
  • Preinstalling apps/dagster-pipelines/requirements.txt into that venv
  • Ensuring ~/.local/bin is on PATH through ~/.cursor_cloud_env.sh and startup bootstrap

Quick validation commands in a cloud session:

  1. Find the job definition:

    grep -r "def <job_name>" apps/dagster-pipelines/

  2. Confirm JS tooling baseline is ready:

    node -v
    ls apps/platform/node_modules >/dev/null

Schedule safety:

  • By default, this monorepo copy does not expose schedules.
  • Set DAGSTER_ENABLE_SCHEDULES=true only when you explicitly want to inspect/validate schedule definitions.
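
The flag convention above can be mirrored in any local wrapper script. A sketch; the `schedules_enabled` helper name is invented here, and the real gating lives in pipelines/repository.py:

```shell
# Sketch of the DAGSTER_ENABLE_SCHEDULES convention from this section.
# Helper name is invented; the actual gate is in pipelines/repository.py.
schedules_enabled() {
  [ "${DAGSTER_ENABLE_SCHEDULES:-false}" = "true" ]
}

if schedules_enabled; then
  echo "schedules exposed"
else
  echo "schedules hidden (default)"
fi
```

To enable for one local session only: `DAGSTER_ENABLE_SCHEDULES=true dagster dev -m pipelines.repository -p 3001`.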

Tests:

pytest pipelines_tests

7. Migration Workflow (Dagster → Inline)

Sources:

  • Dagster Cloud: https://dagster.cloud/ (login required)
  • Code: monorepo apps/dagster-pipelines/ (the standalone dagster-pipelines repo is deprecated)

For each pipeline:

  1. Identify the job entrypoint and schedule in pipelines/repository.py.
  2. Map each op:
    • input contracts
    • external dependencies
    • output tables/files/events
  3. Implement platform-native equivalent in apps/platform.
  4. Validate parity on a bounded data window.
  5. Cut over trigger/schedule after parity is confirmed.

Use standards/03-knowledge/engineering/setup/dagster-inline-migration.md as the migration tracker.
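
Step 4's parity validation can start as simply as an order-insensitive diff of the two outputs for the bounded window. A sketch, assuming both the legacy job and the inline reimplementation can dump their window to a plain text/CSV file; the helper name and file format are assumptions, not repo conventions:

```shell
# Sketch for workflow step 4: order-insensitive parity between the legacy
# Dagster output and the inline reimplementation for one bounded window.
parity_check() {
  legacy_sorted=$(sort "$1")
  inline_sorted=$(sort "$2")
  if [ "$legacy_sorted" = "$inline_sorted" ]; then
    echo "parity ok"
  else
    echo "parity mismatch" >&2
    return 1
  fi
}
```

Real parity checks will usually need column-aware comparison (timestamps, floats), but a byte-level pass is a cheap first gate before cutover.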


8. Migration Status

Item                             Status        Notes
Dagster repo location            Complete      Imported as subtree under apps/dagster-pipelines/
zoom_meeting_to_turbopuffer_job  In migration  Track status in dagster-inline-migration.md

9. Troubleshooting

Job not found

  • Confirm registration in apps/dagster-pipelines/pipelines/repository.py.
  • Confirm branch includes latest subtree import commit.

Import or dependency failure

  • Activate venv before running Dagster commands.
  • Reinstall dependencies with pip install -r requirements.txt.

Cloud visibility mismatch

  • Verify DAGSTER_CLOUD_API_TOKEN and org values.
  • Validate deployment state in Dagster Cloud UI.
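
Before digging into any of the failures above, a quick triage of the tools this guide depends on can save a round trip; a minimal sketch:

```shell
# Sketch: report which tools used in this guide are on PATH.
report=""
for cmd in python3 rg dagster dagster-cloud; do
  if command -v "$cmd" >/dev/null 2>&1; then
    line="ok: $cmd"
  else
    line="missing: $cmd (see sections 4 and 6)"
  fi
  echo "$line"
  report="$report $line"
done
```

`missing: dagster` usually just means the venv from section 6 is not activated.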

10. Version History

  • v1.3 (March 7, 2026) — Expanded Cursor Cloud defaults to include Node.js 22 plus preinstalled Platform/OpenWork dependencies (npm ci --legacy-peer-deps and pnpm install) while preserving Dagster setup
  • v1.2 (March 3, 2026) — Added Cursor Cloud VM image/startup defaults for Dagster (python3-venv, build deps, preinstalled requirements.txt, and persistent ~/.local/bin path)
  • v1.1 (February 28, 2026) — Updated for monorepo subtree location and migration workflow
  • v1.0 (January 30, 2026) — Initial migration inspection guide