Codex setup (OpenAI Codex app / CLI / IDE)
How to run OpenAI Codex against the brainforge-platform repo: auth, Azure East US 2, Local Environments, and optional config.
1. Install and sign in
- App: Codex app. Install and sign in (SWIC or API key).
- CLI: `codex`. Install per the Codex CLI docs; use the same auth as the app.
- IDE: Codex IDE extension. Use when you want Codex in your editor; auth is the same as the app.
For admin/managed API usage (e.g. team defaults), you need an OpenAI admin API key. See Config advanced and Authentication. Credentials live in 1Password or env; never commit secrets.
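After installing, it is worth confirming the CLI is actually on your PATH before wiring up Azure. A minimal sketch, assuming the installed binary is named `codex` and supports the standard `--version` flag:

```shell
# Confirm the codex binary is installed and report its version
# (a sketch -- degrades to a message instead of failing if missing).
if command -v codex >/dev/null 2>&1; then
  STATUS="installed ($(codex --version))"
else
  STATUS="missing"
fi
echo "codex CLI: $STATUS"
```

If this prints `missing`, go back to the install step before continuing.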
2. Azure East US 2 only
We use Azure OpenAI in East US 2 only. Use the same key and base URL as the rest of the platform:
- Resource: `brainforge-openai-eastus2` (East US 2).
- Env: set `AZURE_OPENAI_API_KEY` to the East US 2 key (same as `BRAINFORGE_OPENAI_EASTUS2_API_KEY`). Get it from 1Password (Brainforge AI Team vault) or your local `.env`.
- Base URL: `https://eastus2.api.cognitive.microsoft.com/openai` (the Codex `base_url` for a custom provider; this is the regional endpoint reported by `az cognitiveservices account show`).
See azure-openai-setup.md and azure-models-for-devs.md for the full deployment list and policy.
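Before pointing Codex at this endpoint, you can smoke-test the key/endpoint pair with one tiny request. A sketch, not a definitive check: the deployment name and `api-version` below are assumptions to adjust against your deployment list; the `deployments/.../chat/completions` path and `api-key` header are the standard Azure OpenAI data-plane shape.

```shell
# Smoke-test the East US 2 endpoint (deployment name and api-version
# are assumptions -- adjust to your actual deployment list).
BASE_URL="https://eastus2.api.cognitive.microsoft.com/openai"
DEPLOYMENT="gpt-5.2-codex"
URL="$BASE_URL/deployments/$DEPLOYMENT/chat/completions?api-version=2024-02-01"

if [ -z "${AZURE_OPENAI_API_KEY:-}" ]; then
  echo "AZURE_OPENAI_API_KEY is not set -- pull it from 1Password or .env" >&2
else
  curl -sf -X POST "$URL" \
    -H "api-key: $AZURE_OPENAI_API_KEY" \
    -H "Content-Type: application/json" \
    -d '{"messages":[{"role":"user","content":"ping"}],"max_tokens":1}' \
    && echo "East US 2 endpoint reachable" \
    || echo "request failed: check key, endpoint, and deployment name" >&2
fi
```

A 404 here usually means a wrong deployment name; a 401 means the key does not match the resource.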
3. Local Environments (worktree setup)
When Codex creates a new worktree, it can run a setup script so dependencies are installed. Configure in the Codex app: Settings → Local Environments.
Option B (recommended for this repo): install at the root, then install the platform app:

```shell
npm install
cd apps/platform && npm ci --legacy-peer-deps
```

You can paste that into Codex Settings → Local Environments → Setup script, or run `codex-setup.sh` if your setup supports it. See the repo README.md for details.
4. Project config (optional)
The repo can ship a project-scoped .codex/config.toml that:
- Defines a custom Azure East US 2 provider.
- Sets the default model to `gpt-5.2-codex` (or another deployment name you use).
- Sets `approval_policy` and `sandbox_mode` to your preference.
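Such a file might look like the following. This is a sketch, not the shipped config: the provider key name `azure-eastus2` and the `api-version` value are assumptions, and the `[model_providers.*]` table shape should be verified against the Config reference.

```toml
# .codex/config.toml -- a sketch, not the repo's actual file.
model = "gpt-5.2-codex"            # Azure deployment name
model_provider = "azure-eastus2"   # must match the table below

[model_providers.azure-eastus2]
name = "Azure OpenAI (East US 2)"
base_url = "https://eastus2.api.cognitive.microsoft.com/openai"
env_key = "AZURE_OPENAI_API_KEY"   # key is read from the environment
# api-version is an assumption -- use whichever your deployments support.
query_params = { api-version = "2025-04-01-preview" }
```

Keeping the key in `env_key` rather than the file is what lets this config be committed without leaking secrets.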
Full permissions (always-allow)
For unrestricted tool use without approval prompts, set in `~/.codex/config.toml`:

```toml
approval_policy = "always-allow"
```

Or override per tool:

```toml
[apps.your-app.tools.tool_name]
approval_mode = "approve"
```

See Advanced config and the Config reference.
Verify Codex is using Azure (Codex Cloud or app):
- Open the Codex app and make sure this project is trusted so it loads `.codex/config.toml`.
- Run a small task (e.g. “List files in the repo root”).
- In the run details or session info, confirm the model shows `gpt-5.2-codex` and the provider is Azure (or that the request went to the East US 2 endpoint).
- If you use the Codex CLI, run from the repo root with `AZURE_OPENAI_API_KEY` set and check that the first request uses the configured provider.
- You can also verify in Azure Monitor: run `az monitor metrics list` on the East US 2 resource with metric `AzureOpenAIRequests` and dimension `ModelDeploymentName` to see requests to `gpt-5.2-codex`.
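The Azure Monitor check can be run from the CLI. A sketch under assumptions: the resource group is a placeholder you must fill in, and the account name is the one this doc uses; `az monitor metrics list` and `az cognitiveservices account show` are real az subcommands, but check flag details against your az version.

```shell
# Count requests per deployment on the East US 2 resource
# (RESOURCE_GROUP is a placeholder -- fill in your own).
RESOURCE_GROUP="${RESOURCE_GROUP:-<your-resource-group>}"
ACCOUNT="brainforge-openai-eastus2"

if command -v az >/dev/null 2>&1; then
  RESOURCE_ID=$(az cognitiveservices account show \
    --name "$ACCOUNT" --resource-group "$RESOURCE_GROUP" \
    --query id -o tsv)
  az monitor metrics list \
    --resource "$RESOURCE_ID" \
    --metric AzureOpenAIRequests \
    --dimension ModelDeploymentName \
    --output table \
    || echo "metrics query failed -- check login and resource group" >&2
else
  echo "az CLI not installed; skipping metrics check" >&2
fi
```

A `ModelDeploymentName` row for `gpt-5.2-codex` with a nonzero count confirms traffic is landing on the East US 2 resource.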
5. Optional: Compliance and analytics
- Compliance / audit: Use an OpenAI admin API key and any org-level settings; see Admin setup and Security.
- Analytics: Codex sends anonymous usage/health data by default. To disable it, set `enabled = false` under `[analytics]` in your config. See Advanced config – Metrics.