# Notion → GitHub Vault Sync (Cron)

This document describes how to run a scheduled sync that:

- Runs the Notion export script from `standards/05-scripts/notion/` (in this repo)
- Writes exported Markdown into `knowledge/notion/`
- Note: people/recruitment Markdown may live in `knowledge/people/recruitment/` (not under this folder); keep automated export slugs separate from that tree unless the workflow is explicitly pointed there
- Commits and pushes changes to brainforge-platform on a cron schedule (GitHub Actions)
## High-level approach

- The scheduled workflow lives in brainforge-platform (this repo contains both `standards/` and `knowledge/`).
- The workflow checks out brainforge-platform once.
- The workflow installs the script dependencies from `standards/05-scripts/notion/`.
- The workflow runs the script with CWD at `knowledge/`, so output lands in `knowledge/notion/...`.
- The workflow commits changes under `knowledge/notion/` back to brainforge-platform.
## Repo structure (brainforge-platform)

```text
brainforge-platform/
├─ .github/workflows/notion-sync.yml
├─ standards/05-scripts/notion/
│  ├─ sync-notion.mjs
│  ├─ package.json
│  └─ package-lock.json
└─ knowledge/notion/
   └─ <generated markdown files>
```
## Step 1 — Notion prerequisites

### 1.1 Create a Notion integration token

- Create a Notion internal integration and copy the token.

### 1.2 Share the source Notion content with the integration

- For each Notion database (or page) you want to export:
  - Open it in Notion → Share → invite the integration
- Without this, the API token will not be able to read content.

### 1.3 Capture IDs

- For database exports: collect the Notion database ID.
- Define a slug per database to control output folder naming:
  - Example slugs: `meeting-notes`, `internal-wiki`, `client-notes`
## Step 2 — Implement the script in this repo

### 2.1 Script output rule (important)

The script must write output relative to `process.cwd()`. The workflow runs it with CWD at `knowledge/`, so output goes to:

`knowledge/notion/<slug>/...`
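To illustrate the rule (a sketch, not part of the script): because the output directory is derived from `process.cwd()`, the same script writes to different locations depending on where it is launched from.

```javascript
import path from "node:path";

// The script builds its output directory from the current working directory.
// Launched from <repo>/knowledge, this resolves to <repo>/knowledge/notion/<slug>.
function outputDir(slug) {
  return path.join(process.cwd(), "notion", slug);
}

console.log(outputDir("meeting-notes"));
```

This is why the workflow sets the run step's working directory rather than hard-coding an absolute path in the script.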
### 2.2 Script file

Create or update `standards/05-scripts/notion/sync-notion.mjs`:
```js
import fs from "node:fs";
import path from "node:path";
import { Client } from "@notionhq/client";
import { NotionToMarkdown } from "notion-to-md";

const notion = new Client({ auth: process.env.NOTION_TOKEN });
const n2m = new NotionToMarkdown({ notionClient: notion });

const databaseId = process.env.NOTION_DATABASE_ID;
if (!databaseId) throw new Error("Missing NOTION_DATABASE_ID");

// Optional slug so we can export multiple sources cleanly
const databaseSlug = process.env.NOTION_DATABASE_SLUG || "default";

// IMPORTANT: output relative to CWD (the workflow runs this from knowledge/)
const outDir = path.join(process.cwd(), "notion", databaseSlug);
fs.mkdirSync(outDir, { recursive: true });

// Paginate: databases.query returns at most 100 results per call
let cursor;
const pages = [];
do {
  const res = await notion.databases.query({
    database_id: databaseId,
    start_cursor: cursor,
  });
  pages.push(...res.results);
  cursor = res.has_more ? res.next_cursor : undefined;
} while (cursor);

for (const page of pages) {
  const pageId = page.id;

  // Best-effort title extraction
  const titleProp = Object.values(page.properties).find((p) => p.type === "title");
  const title = titleProp?.title?.map((t) => t.plain_text).join("") || pageId;

  const mdBlocks = await n2m.pageToMarkdown(pageId);
  const md = n2m.toMarkdownString(mdBlocks).parent || "";

  // Simple stable filename slug
  const slug = title
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-")
    .replace(/(^-|-$)/g, "");

  fs.writeFileSync(path.join(outDir, `${slug}.md`), md, "utf8");
}
```

### 2.3 Dependencies (playbook)
In `standards/05-scripts/notion/package.json`:

```json
{
  "name": "notion-sync",
  "private": true,
  "type": "module",
  "dependencies": {
    "@notionhq/client": "^2.2.16",
    "notion-to-md": "^3.1.7"
  }
}
```

From `standards/05-scripts/notion/` run:

```sh
npm install
```

Commit: `sync-notion.mjs`, `package.json`, `package-lock.json`
## Step 3 — Add the workflow in brainforge-platform

Create `.github/workflows/notion-sync.yml`:
```yaml
name: Sync Notion into Vault

on:
  schedule:
    - cron: "0 7 * * *" # daily at 07:00 UTC
  workflow_dispatch:

permissions:
  contents: write

jobs:
  sync:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repo
        uses: actions/checkout@v4

      - name: Setup Node
        uses: actions/setup-node@v4
        with:
          node-version: "20"

      - name: Install script dependencies
        working-directory: standards/05-scripts/notion
        run: npm ci

      # Run script with CWD = knowledge/ so output lands in knowledge/notion/
      - name: Run Notion sync
        working-directory: knowledge
        run: node ../standards/05-scripts/notion/sync-notion.mjs
        env:
          NOTION_TOKEN: ${{ secrets.NOTION_TOKEN }}
          NOTION_DATABASE_ID: ${{ secrets.NOTION_DATABASE_ID }}
          NOTION_DATABASE_SLUG: ${{ secrets.NOTION_DATABASE_SLUG }}

      - name: Commit and push changes
        run: |
          git add knowledge/notion
          # --cached so newly created (untracked) files are detected too
          if git diff --cached --quiet; then
            echo "No changes to commit."
            exit 0
          fi
          git config user.name "notion-sync-bot"
          git config user.email "notion-sync-bot@users.noreply.github.com"
          git commit -m "Sync Notion"
          git push
```

Notes:

- The workflow uses `permissions: contents: write` so `GITHUB_TOKEN` can push commits to brainforge-platform.
- GitHub scheduled cron times are UTC.
## Step 4 — Configure secrets in brainforge-platform

In brainforge-platform repo settings: Settings → Secrets and variables → Actions → New repository secret

Add:

- `NOTION_TOKEN`: Notion integration token
- `NOTION_DATABASE_ID`: Database to export
- `NOTION_DATABASE_SLUG`: Folder name under `knowledge/notion/` (example: `meeting-notes`)
## Step 5 — Validate (manual run)

1. GitHub → brainforge-platform → Actions
2. Select "Sync Notion into Vault"
3. Click "Run workflow"

Expected:

- Markdown files appear in `knowledge/notion/`
- A commit is created only if content changed
## Common issues / fixes

**403 / not authorized to read Notion content**

- Confirm the Notion page/database was explicitly shared with the integration.

**Playbook script not found**

- Ensure you have the latest brainforge-platform checkout; the script lives at `standards/05-scripts/notion/`.

**No files generated**

- The Notion database might be empty, or the integration lacks access.
- The script's title property extraction might not match your database schema.

**Runs successfully but no commit**

- This is expected when there are no diffs.
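For the title-extraction failure mode above, a quick way to debug is to list the property names and types on one fetched page object. This sketch uses a hypothetical page object shaped like a Notion API page response and mirrors the script's best-effort extraction:

```javascript
// Hypothetical page object shaped like a Notion API page response.
const page = {
  id: "abc123",
  properties: {
    Tags: { type: "multi_select", multi_select: [] },
    Name: { type: "title", title: [{ plain_text: "Weekly sync" }] },
  },
};

// Same logic as sync-notion.mjs: find whichever property has type "title".
const titleProp = Object.values(page.properties).find((p) => p.type === "title");
const title = titleProp?.title?.map((t) => t.plain_text).join("") || page.id;

// Listing property names/types helps spot schema mismatches.
const schema = Object.entries(page.properties).map(([k, v]) => `${k}:${v.type}`);

console.log(title, schema);
```

If no property of type `title` appears in the listing, the database schema differs from what the script expects and the filename falls back to the page id.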
## Optional: multiple databases (recommended pattern)

If you want to sync multiple Notion databases into different subfolders:

### Option A: Duplicate the "Run Notion sync" step (simplest)

Add multiple steps, each with its own secrets, e.g.:

- `NOTION_DATABASE_ID_INTERNAL`, `NOTION_DATABASE_SLUG_INTERNAL`
- `NOTION_DATABASE_ID_MEETINGS`, `NOTION_DATABASE_SLUG_MEETINGS`
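Option A might look like this in the workflow (a sketch; the secret names and slugs are illustrative, and each step runs from `knowledge/` so output lands under `knowledge/notion/`):

```yaml
      - name: Run Notion sync (internal wiki)
        working-directory: knowledge
        run: node ../standards/05-scripts/notion/sync-notion.mjs
        env:
          NOTION_TOKEN: ${{ secrets.NOTION_TOKEN }}
          NOTION_DATABASE_ID: ${{ secrets.NOTION_DATABASE_ID_INTERNAL }}
          NOTION_DATABASE_SLUG: internal-wiki

      - name: Run Notion sync (meeting notes)
        working-directory: knowledge
        run: node ../standards/05-scripts/notion/sync-notion.mjs
        env:
          NOTION_TOKEN: ${{ secrets.NOTION_TOKEN }}
          NOTION_DATABASE_ID: ${{ secrets.NOTION_DATABASE_ID_MEETINGS }}
          NOTION_DATABASE_SLUG: meeting-notes
```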
### Option B: Add a JSON config

- Provide a single secret like `NOTION_SYNC_CONFIG` containing an array of `{ databaseId, slug }`
- Loop through each in the script
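Option B might be sketched as follows (assumes a hypothetical `NOTION_SYNC_CONFIG` env var; the per-database export call is elided):

```javascript
// NOTION_SYNC_CONFIG is assumed to hold a JSON array of { databaseId, slug }.
// The fallback string here is only illustrative sample data.
const raw =
  process.env.NOTION_SYNC_CONFIG ??
  '[{"databaseId":"db-1","slug":"internal-wiki"},{"databaseId":"db-2","slug":"meeting-notes"}]';

const sources = JSON.parse(raw);

for (const { databaseId, slug } of sources) {
  if (!databaseId || !slug) throw new Error("Each entry needs databaseId and slug");
  // ...run the same export logic as sync-notion.mjs for this (databaseId, slug)...
  console.log(`would sync ${databaseId} -> notion/${slug}`);
}
```

This keeps secrets to a single value and avoids duplicating workflow steps, at the cost of a slightly more complex script.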
## Operational considerations

This setup exports text blocks to Markdown. If you need:

- recursive subpages
- attachments/images mirrored into the repo
- frontmatter metadata (e.g., Notion page id, last edited time)
- stable filenames keyed by page id

then extend `sync-notion.mjs` accordingly before relying on it for long-term archival.
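As one example, frontmatter and page-id-keyed filenames could be added roughly like this (a sketch; the `page` object shape follows the Notion API's `id` and `last_edited_time` fields, and the helper names are illustrative):

```javascript
// Build a Markdown document with YAML frontmatter from a Notion page.
// `page` is assumed to be a Notion API page object; `body` the converted Markdown.
function withFrontmatter(page, title, body) {
  const frontmatter = [
    "---",
    `notion_page_id: ${page.id}`,
    `title: "${title.replace(/"/g, '\\"')}"`,
    `last_edited_time: ${page.last_edited_time}`,
    "---",
    "",
  ].join("\n");
  return frontmatter + body;
}

// Stable filename keyed by page id: survives title renames between runs.
function stableFilename(page) {
  return `${page.id.replace(/-/g, "")}.md`;
}

const page = { id: "a1b2-c3d4", last_edited_time: "2024-01-01T00:00:00.000Z" };
console.log(stableFilename(page));
```

Keying filenames by page id also prevents two pages with identical titles from overwriting each other, which the current title-slug scheme does not.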