# OpenCode Desktop — Brainforge Azure + MCP

Audience: Engineers using OpenCode Desktop with this monorepo.
See also: `knowledge/engineering/opencode-cli-brainforge.md` (CLI, clean slate, Figma quirks), `azure-models-for-devs.md`.
## 1. Open the correct workspace
- File → Open Folder and choose the `brainforge-platform` git root (the directory that contains `opencode.jsonc`).
- If you open only a subfolder, project `mcp` entries that use `${workspaceFolder}` (e.g. HubSpot) will not resolve to the script path and those servers can fail to start.
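Before opening the folder in Desktop, a quick terminal sketch can confirm you are pointing at the right directory (the check simply looks for the repo's `opencode.jsonc`):

```shell
# Sketch: verify the current directory is the brainforge-platform root
# by checking for the repo's opencode.jsonc before opening it in Desktop.
if [ -f opencode.jsonc ]; then
  msg="looks like the monorepo root"
else
  msg="no opencode.jsonc here: open the brainforge-platform git root instead"
fi
echo "$msg"
```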
## 2. Azure — user config (East US only)
**brainforge-work:** That repo’s chat providers live in the root `opencode.jsonc`. Keep a single project config file there (do not add an extra root `opencode.json` next to `opencode.jsonc`). GPT models use `@ai-sdk/openai-compatible` providers with deployment `baseURL` values (ids like `azure-eastus-gpt-5-4-mini`) and `AZURE_KEY_BRAINFORGE_OPENAI` — they do not need `resourceName`. See `brainforge-work/AGENTS.md` → OpenCode Desktop.
For **brainforge-platform**, the repo `opencode.jsonc` does not define chat providers. Merge into `~/.config/opencode/opencode.json`:
| Goal | Example file | API key env var |
|---|---|---|
| East US (`brainforge-openai`), models gpt-5.4 + gpt-5.4-mini (extend as needed) | `opencode-user-config.azure-eastus-legacy.example.jsonc` | `AZURE_OPENAI_EASTUS_API_KEY` (legacy `@ai-sdk/openai` layout) |
| Same, minimal starter JSON (deployment-scoped openai-compatible providers) | `opencode-user-config.example.json` | `AZURE_KEY_BRAINFORGE_OPENAI` |
**Azure resource name (Desktop):** Only legacy `@ai-sdk/openai` `azure-eastus` blocks need `resourceName`. Prefer the `opencode-user-config.example.json` layout above so Desktop does not hit the `/deployments/.../responses` 404 on `brainforge-openai`.
**“Invalid subscription key or wrong API endpoint”:** Usually the wrong key for the resource, or a `baseURL` that already ends in `/openai/v1` while the client also appends `/v1`. For openai-compatible deployment URLs, use `https://brainforge-openai.openai.azure.com/openai/deployments/{deployment}` plus the `api-version` query params as in the examples.
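As a rough sketch of the deployment-scoped layout (the provider id, deployment name, and model id below are illustrative; copy the canonical values from `opencode-user-config.example.json`), the merged user config entry looks roughly like:

```jsonc
{
  "provider": {
    // Illustrative id; the examples use names like azure-eastus-gpt-5-4-mini.
    "azure-eastus-gpt-5-4-mini": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        // Deployment-scoped URL: do not end it in /openai/v1, or the
        // client's own /v1 suffix doubles up. The example files also
        // append an api-version query parameter; copy it verbatim.
        "baseURL": "https://brainforge-openai.openai.azure.com/openai/deployments/gpt-5.4-mini",
        "apiKey": "{env:AZURE_KEY_BRAINFORGE_OPENAI}"
      },
      "models": { "gpt-5.4-mini": {} }
    }
  }
}
```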
**Desktop and shell env:** The app started from the Dock does not load your interactive shell `export`s. Either:
- Paste the key via Settings → Providers (if you use the app’s Azure flow), or
- Set `AZURE_KEY_BRAINFORGE_OPENAI` (and any other `{env:…}` vars your merged config references) for GUI apps (e.g. `launchctl setenv` for the current session, or a small wrapper that launches OpenCode after exporting vars — see Apple docs for persistent GUI env if needed).
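To see which `{env:…}` variables a merged config references, and which of them are unset in the current shell, something like this sketch works (it scans an inline sample standing in for `~/.config/opencode/opencode.json`):

```shell
# Sketch: list {env:VAR} references in an OpenCode user config and report
# any that are unset. Uses an inline sample instead of the real file.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
{ "provider": { "azure": { "options": { "apiKey": "{env:AZURE_KEY_BRAINFORGE_OPENAI}" } } } }
EOF
missing=$(grep -o '{env:[A-Z_][A-Z0-9_]*}' "$cfg" | sed 's/{env:\(.*\)}/\1/' | sort -u |
  while read -r v; do
    eval "val=\${$v:-}"
    [ -z "$val" ] && echo "$v"
  done)
echo "unset vars referenced by config: ${missing:-none}"
rm -f "$cfg"
```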
After editing config: quit OpenCode fully (Cmd+Q) and reopen.
Sanity check (terminal, repo root):

```shell
opencode debug config
opencode models
```

To see the current set of available/enabled models for the Brainforge OpenCode workspace, check:
https://opencode.ai/workspace/wrk_01KPGSTF1J83WKBPS3DKE301AB
## 3. MCPs look “not connected” in Desktop
Project MCPs are merged from the repo `opencode.jsonc` plus the user `~/.config/opencode/opencode.json`.
- Workspace must be the monorepo root (see §1).
- OAuth / PAT MCPs (Linear, Notion, Supabase, Figma, Exa): from a terminal with cwd = repo root, run e.g. `opencode mcp auth linear`, `opencode mcp auth supabase`, `opencode mcp auth exa`, and complete the browser flow. Tokens land under `~/.local/share/opencode/` and are shared with Desktop once written.
- List from CLI: `opencode mcp list` (or `npm run opencode:mcp:list`). If CLI shows servers healthy but Desktop UI still looks empty, update Desktop, hard-quit, and reopen the same folder — some builds lag the UI until after first successful session start.
- HubSpot (local): requires `node` on `PATH` and a working `scripts/hubspot-mcp-stdio.mjs` path; fix workspace root first.
- control-chrome: uses `npx`; first run may download packages — allow network.
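For the local HubSpot server, a hypothetical preflight (run from what you believe is the repo root; the script path is the one named above) catches both failure modes before blaming the server:

```shell
# Sketch: check HubSpot MCP prerequisites before debugging further.
ok=1
command -v node >/dev/null 2>&1 || { echo "node is not on PATH"; ok=0; }
[ -f scripts/hubspot-mcp-stdio.mjs ] || { echo "script missing: wrong workspace root?"; ok=0; }
if [ "$ok" -eq 1 ]; then
  echo "HubSpot MCP prerequisites look OK"
fi
```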
If `opencode debug config` shows an empty or wrong `mcp` block, you are not loading the repo file (wrong folder), or the user config overwrote `mcp` — fix the user JSON or remove conflicting `mcp` keys there.
## 4. Model picker shows only gpt-5.4 and gpt-5.4-mini
First, check the Brainforge OpenCode workspace page to see which models are currently available/enabled:
https://opencode.ai/workspace/wrk_01KPGSTF1J83WKBPS3DKE301AB
Then use the East US example files above; the canonical copy defines `gpt-5.4` and `gpt-5.4-mini`. Remove other provider entries from user config if you do not want extra models in the picker.
## Related
- Deployments and policy: `apps/platform/docs/azure-openai-deployment-options.md`
- Full cheat sheet: `opencode-cli-brainforge.md`