[Client] — [Domain] — Dashboard specification

About this document (Brainforge)

Internal conventions for how this file works in the repo. Strip this section (or export the document without it) when sharing a client-only artifact.

Titling and filename

Use [Client] — [Domain / program name] — Dashboard specification for the document title (the H1 above) and any PDF cover. Examples: Acme — Wholesale reporting — Dashboard specification · Acme — Retail & POS — Dashboard specification.

Filename: {client}-{domain}-dashboard-spec.md under knowledge/clients/{client}/resources/.
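The filename convention above can be enforced mechanically; a minimal sketch, assuming client and domain are lowercase hyphen-separated slugs (the pattern the example filenames follow — this assumption is not stated explicitly above):

```python
import re

# Convention: {client}-{domain}-dashboard-spec.md, where client and domain
# are lowercase hyphen-separated slugs (assumption inferred from examples
# such as lmnt-retail-omni-dashboard-spec.md).
SPEC_FILENAME = re.compile(r"^[a-z0-9]+(?:-[a-z0-9]+)*-dashboard-spec\.md$")

def is_valid_spec_filename(name: str) -> bool:
    """True when a filename matches the repo spec naming convention."""
    return bool(SPEC_FILENAME.match(name))
```

A pre-commit hook or CI check could run this over `knowledge/clients/*/resources/` to catch misnamed specs before merge.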

Single source of truth

Repo markdown under knowledge/clients/{client}/resources/ is the canonical dashboard spec once merged. Google Docs may support live stakeholder review; this file wins on scope and build detail.

  • This spec (repo markdown) is the version-of-record for dashboard structure (dashboards, sections, tiles, filters, formatting, and rollout). Update it when the build changes; do not maintain a parallel “v2” inside one doc — a material redesign gets a new spec revision (new file version or superseding doc), not informal “V1 vs V2” labels in the same specification.

  • Metric definitions live in Data Platform Documentation (Core Metrics sheet) and the optional locked metrics file in the repo — the same definitions in two places; keep them in sync. That pair is definition authority. This spec links and may copy definitions here for readability; if anything disagrees, reconcile to the Data Platform source of truth first, then this spec.

  • Legacy: Older Notion/Google Doc intake templates are non-authoritative once this file exists for the engagement.


Specification template

Everything from here down is the engagement spec to fill in.

Status: [Draft / In stakeholder review / Approved / Locked for build]
Visualization tool: [Omni / Tableau / Looker / Power BI / Other — specify]
Prepared for: [Client stakeholder names and roles]
Prepared by: Brainforge
Last updated: [YYYY-MM-DD]


| Artifact | Link / path | Notes |
| --- | --- | --- |
| Data Platform Documentation (Core Metrics) | [Google Sheet link] + optional repo mirror [./data_platform_documentation/METRIC_DEFINITIONS_LOCKED.md] | Single definition source — sheet and locked file must match; same metrics, formulas, and grain |
| Linear project | [Linear project URL] | Milestones and issues → acceptance criteria |
| Legacy reports and spreadsheets (folder) | [Drive folder] | Optional; only if the client keeps prior artifacts in one place |

1. Engagement context

1.1 Program intent

[Narrative first: What problem is solved? What becomes possible that was not possible before? Why now?]

1.2 Users, usage, and questions

Document who uses the dashboards, how often, and which questions they must be able to answer. A dashboard that does not serve a named user and question should not be built.

| User or persona | Role | How often they use this (e.g. daily standup, weekly review, monthly close) | Questions they must answer |
| --- | --- | --- | --- |
| [Name or persona] | [Team] | [Frequency] | [Q1]; [Q2] |

Primary questions (pilot): List the minimum set of questions this pilot must answer end-to-end.

  1. [Question — measurable]
  2. [Question]
  3. [Question]

1.3 Data scope and modeling context

Narrative: [Sources, subject area, and grain — plain language before any tables.]

Optional — Omni (when using Omni): List enough in-context detail that builders (and skills) can resolve fields and joins accurately:

  • Topics: [topic names]
  • Schema / views: [relevant schemas, key views or tables]
  • Relationships: [grain, primary keys, known join caveats]

Official modeling references: Curating datasets with topics, Topic best practices, Topic file parameters.

Optional — dbt / Snowflake (only when needed): If the dashboard depends on logic not obvious from the BI layer, add:

  • dbt models: [model names]
  • Snowflake objects: [only if required beyond Omni/Topic docs]

| Object / table | Role | Grain |
| --- | --- | --- |
| [name] | [KPIs / facts / dims] | [One row = …] |
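The declared grain ("One row = …") is worth spot-checking against the actual object before build. A minimal stdlib-only sketch, with toy rows and a hypothetical order-line grain (the field names are illustrative, not from this spec):

```python
from collections import Counter

def check_grain(rows: list[dict], keys: list[str]) -> bool:
    """True when the declared grain holds: exactly one row per
    unique combination of the key columns."""
    combos = Counter(tuple(row[k] for k in keys) for row in rows)
    return all(count == 1 for count in combos.values())

# Toy data at a hypothetical one-row-per-order-line grain.
order_lines = [
    {"order_id": 1, "line_number": 1, "amount": 10.0},
    {"order_id": 1, "line_number": 2, "amount": 5.0},
    {"order_id": 2, "line_number": 1, "amount": 7.5},
]
```

In practice the rows would come from a query against the warehouse object; the same check expressed in SQL (`GROUP BY keys HAVING COUNT(*) > 1`) is often more convenient at scale.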

1.4 Definition authority (metrics)

| Priority | Artifact | What it governs |
| --- | --- | --- |
| 1 | Data Platform Documentation — Core Metrics sheet and optional METRIC_DEFINITIONS_LOCKED.md in the repo (same content; keep in sync) | Formulas, grain, approved metric names |
| 2 | This spec | Layout, sections, tiles, filters, formatting — not competing definitions |

1.5 Legacy reports and spreadsheets this work replaces or parallels

| Report or spreadsheet | What it covered | Superseded by (dashboard in this spec) |
| --- | --- | --- |
| [name / link] | [metrics or use case] | [Dashboard — Section] |

1.6 Linear project alignment

| Milestone / issue | Owner | Target | Maps to |
| --- | --- | --- | --- |
| [from Linear] | | | [§9 Dashboard …] |

2. Dashboard design guidelines (overall clarity)

These are dashboard standards—readability, hierarchy, and honest data. Use this list as the default bar for every dashboard in §9; override only where noted per dashboard.

  • Glanceability: Top-level numbers and trends should be readable in under a minute for the intended use cadence.
  • Hierarchy: Put the most important comparisons and KPIs where the eye lands first; support detail in sections below.
  • Consistency: Same metric names, units, and date logic as the Data Platform sheet; same formatting family as §4 (global) and per-dashboard overrides in §9.
  • Honest scales and caveats: Label axes, periods, and comparisons; footnote known gaps (latency, partial sources).
  • Color: Use color for meaning (directionality, thresholds), not decoration; note accessibility where relevant.
  • Density: Match the audience—executive dashboards stay lighter; operational dashboards may include more tiles with clear sectioning.

3. Metrics and definitions (Data Platform sheet and locked file)

The Data Platform Documentation (Core Metrics) and the optional repo mirror (METRIC_DEFINITIONS_LOCKED.md) are one definition set — keep them aligned. This section does not replace them.

  • Links: [URL to Core Metrics tab] · [path to METRIC_DEFINITIONS_LOCKED.md if used]
  • How this spec uses it: Copy key definitions below for convenience in review only. If copied text drifts, reconcile to the sheet and locked file and update this file.

Metrics referenced by this spec (copy from sheet or cite rows)

| Metric name (as in sheet) | Sheet row / ID | Notes for builders (optional) |
| --- | --- | --- |
| [metric] | [row] | [e.g. dual label Ops vs Finance] |
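Because definitions copied into this section can drift from the Core Metrics sheet, a lightweight comparison during QA helps. A hedged sketch: the dicts below stand in for a sheet export and the copied spec text (metric names and formulas are toy examples, not real definitions):

```python
def find_definition_drift(sheet: dict[str, str],
                          spec_copy: dict[str, str]) -> dict[str, tuple]:
    """Return metrics whose copied formula no longer matches the sheet.
    Keys are metric names; values are (sheet_formula, spec_formula)."""
    drift = {}
    for name, formula in spec_copy.items():
        if sheet.get(name) != formula:
            drift[name] = (sheet.get(name), formula)
    return drift

# Toy definitions; in practice these come from a CSV export of the
# Core Metrics tab and a parse of this section's table.
sheet = {"net_revenue": "gross_sales - discounts - returns"}
spec_copy = {"net_revenue": "gross_sales - discounts"}
```

Anything returned by `find_definition_drift` should be reconciled to the sheet first, then updated here, per the authority order in §1.4.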

4. Global formatting

Defaults for all dashboards in this spec unless a dashboard specifies otherwise in §9.

| Element | Convention |
| --- | --- |
| Currency | [e.g. $ in titles; $#,##0] |
| Counts | [Integer; labels] |
| Percentages | [%; state denominator] |
| Dates | [e.g. MM/DD/YYYY in tables] |
| Abbreviations | [YoY, WoW, MTD, …] |
| Text casing | [Sentence vs title case] |

Date and grain (global):

  • Primary date field for the program: [field name] — document in the BI tool so every consumer uses the same field unless a tile explicitly overrides.
  • Default time window: [e.g. last 13 weeks + MTD]
  • Period controls: [week / month / quarter] — note fiscal or retail calendar if applicable.

Business rules that span dashboards: [e.g. Ops vs Finance labels; exclusions]


5. Data freshness, caching, and scheduled exports

5.1 Freshness and latency

  • Target freshness: [e.g. T+1 by 8 AM local]
  • Source pipeline: [what runs when]
  • Known lag: [e.g. partner feeds delay 24h]
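A freshness target like "T+1 by 8 AM local" can be monitored with a small check. A minimal sketch, assuming the warehouse exposes a latest-load timestamp for the program's primary source (the function name and cutoff default are illustrative):

```python
from datetime import datetime, timedelta

def meets_t_plus_1(latest_loaded: datetime, checked_at: datetime,
                   cutoff_hour: int = 8) -> bool:
    """True when the T+1 freshness target holds at check time.
    Before the cutoff hour, yesterday's load may still be pending,
    so the check passes; after it, the latest load must cover
    yesterday or later."""
    if checked_at.hour < cutoff_hour:
        return True
    yesterday = (checked_at - timedelta(days=1)).date()
    return latest_loaded.date() >= yesterday
```

Wiring this to the stakeholder support channel (§8) turns a silent pipeline delay into a visible alert instead of a mistrusted dashboard.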

5.2 Caching and performance

Document how the visualization tool caches or refreshes query results (dashboard load, tile-level cache, extract refresh if applicable).

  • Tool behavior: [e.g. link to vendor doc on caching]
  • Expected interaction latency: [e.g. filter changes within N seconds]

Omni: Improving query performance with caching

5.3 Scheduled exports

If stakeholders receive scheduled PDF, CSV, email, or Slack exports:

| Export | Audience | Cadence | Contents | Owner |
| --- | --- | --- | --- | --- |
| [name] | [who] | [daily / weekly] | [dashboard or slice] | |

If none, state: No scheduled exports for this pilot.


6. Visualization tool — documentation and rationale

Link to official documentation for the stack you are using. Add one line on why the dashboard is organized the way it is (e.g. Topics by grain, workbook per domain).

| Resource | Link | Why this structure |
| --- | --- | --- |
| [Primary doc for this tool] | [URL] | [One line] |

Omni (when applicable — add rows as needed):

| Resource | Link |
| --- | --- |
| Documentation home | docs.omni.co |
| Analyze & build (workbooks, queries) | Analyze & build in Omni |
| Editing & publishing dashboards & documents | Editing & publishing content |
| Topics & semantic model | Curating datasets with topics |

7. Tool-specific implementation notes

Keep lean; point to playbooks and vendor docs. Remove subsections that do not apply.

Omni

Tableau

  • Workbook / datasource: [structure]
  • Extract or live / refresh: [schedule]

Looker

  • Explore / LookML: [notes]
  • Access: [model permissions]

Power BI

  • [Dataset, gateway, refresh]

8. Access, training, and rollout

| Topic | Detail |
| --- | --- |
| Access / permissions | [Who can view vs edit; folders; row-level rules if any] |
| Training | [What users are shown (live session, Loom, office hours)] |
| Rollout plan | [Phased rollout: pilot group → broader org; dates] |
| Support channel | [Slack / email for questions] |

9. Dashboard specifications

Structure: Dashboard → Section → Tile (chart). Narrative first (why, who, questions); then filters; then tile detail. Repeat for each dashboard.

Tile documentation — required content (two layouts)

Every tile must make chart type, rows, columns (or KPI equivalent), sort, formats, and data lineage easy to scan. Pick one layout:

| Layout | When to use |
| --- | --- |
| Compact table | A single two-column Field / Detail table (as in Tile [1.1b]); good for simple tiles with one data source. |
| Stacked + data sources table | Recommended for bridges, pivots, or multiple Snowflake / Omni objects. Put Chart type, Rows, and Columns on separate lines (bold label + line break + body) — never one sentence cramming all three. Follow with a short table for Sort, Formats, Tile-level filters, Notes. Use a Data sources (Snowflake) or Data sources (Omni) mini-table: Object \| Notes (availability, post-pilot, gap reference). Do not replace that with one dense Source: paragraph. |

Example of a filled spec using this template: knowledge/clients/lmnt/resources/lmnt-retail-omni-dashboard-spec.md (pilot instance; §9 — e.g. Net revenue bridge).


Dashboard [1] — [Short name]

Why this dashboard exists

[2–4 sentences.]

Primary audience

  • [Who]

Primary questions (tie to §1.2)

  • [Q1]
  • [Q2]

Dashboard-level filters

Filters that apply across the whole dashboard (unless a section or tile overrides).

| Filter | Scope | Default | Notes |
| --- | --- | --- | --- |
| [Date range] | Dashboard | [default] | |
| [Dimension] | Dashboard | [All] | |

Dashboard-specific formatting

Overrides to §4 only where this dashboard differs.

| Element | Override |
| --- | --- |
| [e.g. currency] | [only if different] |

Section [1.1] — [Section title]

Purpose: [What this section answers — one or two sentences.]

Section-level filters (apply to all tiles in this section unless a tile overrides):

| Filter | Default | Notes |
| --- | --- | --- |
| [if any] | | |

Tile [1.1a] — [Tile label] — stacked layout (multi-source or dense tiles)

Chart type
[KPI, line, bar, table, waterfall, map, combo, etc.]

Rows
[Dimensions or series on rows — list order if fixed]

Columns
[Measures or dimensions as columns]

| Field | Detail |
| --- | --- |
| Sort | [Sort field, direction, top N if applicable] |
| Formats | [Per column or measure; link to §4 or dashboard overrides] |
| Tile-level filters | [Only if different from section or dashboard] |
| Notes | [Caveats, comparisons, conditional formatting] |

Data sources (Snowflake) — or Data sources (Omni) if topic-only

| Object / topic | Notes |
| --- | --- |
| [OBJECT_OR_TOPIC] | [In sync / post-pilot / pending dbt — see Gap X if applicable] |
| [OBJECT_OR_TOPIC] | […] |

Tile [1.1b] — [Tile label] — compact layout (simple tile)

| Field | Detail |
| --- | --- |
| Chart type | [KPI, line, bar, table, map, combo, etc.] |
| Columns | [Measures or dimensions placed as columns in the viz or pivot] |
| Rows | [Dimensions or breakdown on rows] |
| Sort | [Sort field, direction, top N if applicable] |
| Formats | [Per column or measure; link to §4 or dashboard overrides] |
| Data source | [Single topic, view, or table — enough context to build] |
| Tile-level filters | [Only if different from section or dashboard] |
| Notes | [Caveats, comparisons, conditional formatting] |

(Add more sections and tiles as needed. Copy Section / Tile blocks for Dashboard [2], …)


Known gaps and caveats (Dashboard [1])

  • [Source or definition gap]

Out of scope for this pilot (Dashboard [1])

  • [What is explicitly not required for sign-off]

10. Deferred scope (outside this pilot)

Use this for work that is not in the current pilot. A future build or major redesign is tracked as a new spec revision or new engagement artifact—not “V2” inside this document.

  • [Item]. Reason: […]. Follow-up: [ticket / later spec]

11. Acceptance criteria

This pilot is accepted when:

  1. Users in §1.2 can answer the listed primary questions using the published dashboards.
  2. All metrics trace to the Data Platform sheet; nothing in §3 contradicts the sheet.
  3. Freshness and caching (§5) match what stakeholders were told; scheduled exports (if any) run as specified.
  4. Access and rollout (§8) are executed for the agreed pilot group.
  5. §9 is build-complete: every section and tile documents chart type, Rows, Columns, Sort, filters at the right level (dashboard, section, or tile), and data lineage (compact Data source cell and/or Data sources table per §9 guidance — no ambiguous single-line Source: blobs for multi-object tiles).
  6. §12 sign-off checklist is complete.

12. Review, open questions, and sign-off

12.1 Open questions

| Topic | Question | Decision | Owner |
| --- | --- | --- | --- |
| | | | |

12.2 Sign-off

Check each box when that approval is recorded (name and date can live in your ticket system or email thread).

  • Client sponsor or stakeholder — approved: [name, date]
  • Analytics build owner — approved: [name, date]
  • Brainforge delivery lead — approved: [name, date]
  • Client data or IT owner (if required) — approved: [name, date]

Appendix A — Why these dashboard practices matter

Reference only. Principles are stated in business language and tied to Brainforge dashboard delivery—not slide decks or general presentation design.

Glanceability and focus — Stephen Few, Information Dashboard Design
Busy teams need to see whether KPIs are healthy without reading a narrative. In practice, that means a small number of headline metrics, clear section boundaries, and avoiding decoration that competes with the numbers. Specs should list sections and tiles so builders know what must be visible at first glance versus what sits in drill-down.

Clear message before chart choice — Cole Nussbaumer Knaflic, Storytelling with Data
A chart is only useful if the reader knows what question it answers. In practice, §1.2 questions and per-dashboard “why” text come before the tile table. That order prevents building a chart and then retrofitting a business question.

Patterns that match the domain — Wexler, Shaffer, Cotgreave, The Big Book of Dashboards
Retail, wholesale, and SaaS each have common dashboard patterns (e.g. trend + ranking + detail). In practice, reuse familiar layouts so stakeholders do not fight the UI; the Dashboard → Section → Tile structure makes that explicit for handoff.

Honest numbers and scales — Edward Tufte, The Visual Display of Quantitative Information; Alberto Cairo, The Functional Art
Misleading axes or unlabeled comparisons erode trust. In practice, Formats, Notes, and Data Platform alignment ensure finance and ops can defend the same figures in a meeting.

Color and attention — Colin Ware, Visual Thinking for Design
People notice color and position before they read labels. In practice, use color for meaning (directionality, thresholds) and document it in tile Notes so accessibility and brand rules stay consistent.


Omni documentation — visualization and dashboard building

Use these when the engagement is on Omni. They cover workbooks, pivot queries, AI-assisted charts, publishing, dashboards, filters, topics, and performance.

| Topic | Link |
| --- | --- |
| Documentation home | docs.omni.co |
| Full index (discover all pages) | llms.txt |
| Analyze & build (overview) | Analyze & build in Omni |
| Workbooks | Build analyses in workbooks |
| Point-and-click / pivot queries | Building point-and-click queries |
| Querying (AI, point-and-click, SQL) | Querying data in Omni |
| Workbook inspector | Workbook inspector |
| AI: charts, tables, KPIs | Generating visualizations with AI |
| Caching & performance | Improving query performance with caching |
| Aggregate awareness | Optimizing query performance with aggregate awareness |
| Drafts, publish, branch mode | Editing & publishing content |
| Chart palettes (org) | Managing chart palettes |
| Topics & semantic model | Curating datasets with topics |
| Topic best practices | Topic best practices |
| Topic YAML parameters | Topic file parameters |
| Dashboard Assistant (AI in dashboard) | Dashboard Assistant |
| Guides (tutorials index) | Guides |
| Links in visualizations | Add links to visualizations |
| Dashboard filter: calculated field | Filter dashboard charts with calculated fields |
| Embedding | Embedding Omni in external applications |
| Embed events from visualizations | Sending embed events from Omni visualizations |
| Permissions | Managing data access with connection permissions |

Appendix B — Pre-handoff QA checklist

  • Metrics match Data Platform sheet (spot-check).
  • §9 tiles have clear data lineage (compact Data source and/or Data sources table — multi-object tiles are not a single vague paragraph).
  • Freshness and caching behavior match §5.
  • Scheduled exports (if any) verified on a cadence.
  • Dashboard / section / tile filters behave as documented.
  • Access matches §8 for pilot users.
  • §12 sign-off checklist complete.