Learning Moment: Analysis Plan Patterns from Eden Experimentation Example

Analysis: Eden Experimentation Analytics Workstream (a good example of a plan that follows /plan_analysis)

Date: 2026-01-20


What We Learned

Process

  • “By driving improvements in” can be initiative-based rather than only a bridge into Approach. For multi-workstream plans (experimentation engine, top of funnel, CRO, email, SEO, retention), it becomes a list of initiatives with sub-tasks (e.g., “Build the experimentation engine” → brainstorm & log experiments, select an A/B tool, run tests, analyze within 5 days).
  • Two plan styles: (1) Analytical — Approach blocks with define/include/map/repurpose/exclude (e.g., Telehealth Margins). (2) Initiative-based — “By driving improvements in” carries the workstreams; Approach can be slim or omitted.
  • “What do I want to know?” can include prior or partial answers. When data or findings already exist, use GS/Given: (existing finding, Mixpanel link, data) and Suggestion: (next step, e.g., “Experiment with mobile intake UX,” “Add page ‘next’ events”).
  • Appendix / Notes. Use for test ideas, brainstorms, and rough notes (e.g., “Henry’s notes: Test ideas — conversion, marketing, email, retention”) that don’t fit Goal, Metrics, Approach, What/Why/So what, or Measured by.
  • “Measured by?” can be directional. Besides specific KPIs (e.g., dose pickup rate by plan), use “Decrease in X” / “Increase in Y” (e.g., decrease in CAC by product; increase in month-3+ retention; decrease in treatment abandon rate) when a directional target is the right level of precision.

Business

  • Multi-workstream plans need a clear bridge from Metrics to work. “By driving improvements in” is that bridge; it ties outcome metrics to concrete initiatives (build engine, improve funnel, optimize CRO, etc.).
  • Suggestions inside the plan (e.g., mobile UX, add “next” events) make the plan actionable and show where data or tooling is missing.

Technical

  • N/A — This learning is about plan structure and format, not SQL or tooling.

What to Do Differently Next Time

Immediate Changes

  1. Use initiative-based “By driving improvements in” when a plan spans many workstreams (Impact: High) — Avoid forcing everything into Approach blocks; use workstreams + sub-tasks instead.
  2. Add GS/Given and Suggestion under “What do I want to know?” when prior findings exist (Impact: High) — Surfaces existing data and next steps in one place.
  3. Add “Appendix / Notes” when there are test ideas or rough lists (Impact: Med) — Keeps the main structure clean while preserving useful context.

Long-Term Improvements

  1. Build a small “Good plan examples” library (Telehealth = analytical, Experimentation = initiative-based) and reference it in /plan_analysis (Timeline: done in this update).
  2. In /discover, prompt for “Do we already have partial answers or data?” so plans can pre-fill GS/Suggestion where relevant (Timeline: when revising /discover).

What to Remember

Reusable Patterns

  • Initiative-based “By driving improvements in”: Initiative name → sub-tasks (and optionally sub-focus bullets). Use for experimentation, CRO, retention, multi-channel workstreams.
  • GS/Given + Suggestion under a question: **GS / Given:** [finding + link]. **Suggestion:** [next step]. Use when Mixpanel, Tableau, or prior analysis already partially answers the question.
  • Directional “Measured by?”: “Decrease in X by segment,” “Increase in Y overall” — when the plan is about improving a KPI rather than defining a new one.

Pitfalls to Avoid

  • Forcing initiative-heavy plans into only Approach blocks — Use “By driving improvements in” as the main carrier for workstreams when that’s the natural structure.
  • Leaving prior findings out of the plan — If we already know “mobile converts at 19% vs desktop 23%,” put it under “What do I want to know?” with a Suggestion so the plan reflects current state and next steps.

Domain Knowledge

  • Eden experimentation: Intake CRO by form/device, winback/abandon-cart, CAC by channel, SEO by page, retention and treatment completion. Plans that span these need initiative-based structure.
  • Mixpanel limitations (e.g., no “page next” event in intake) → add a Suggestion to instrument the missing events. The plan can record both the gap and the proposed fix.

Update Knowledge Base

Slash Commands to Update:

  • /plan_analysis — (1) “By driving improvements in” can be initiative-based with sub-tasks; (2) “What do I want to know?” can include GS/Given + Suggestion; (3) “Measured by?” can be directional; (4) optional “Appendix / Notes”; (5) Approach is for analytical blocks—slim or omit when initiative-based; (6) Notes for AI and good-plan examples.

Teaching Moment

If explaining this to a junior analyst:

A good analysis plan has two main “middles”: (1) Analytical — we’re building lifecycle maps, unit economics, attribution; use Approach with define/include/map/exclude. (2) Initiative-based — we’re standing up an experimentation engine, improving funnel, CRO, email, SEO, retention; use “By driving improvements in” as a workstream list with sub-tasks. Don’t force an initiative-heavy plan into only Approach blocks. When we already have partial answers (e.g., from Mixpanel), put them under “What do I want to know?” as GS/Given and add a Suggestion so the plan is both current and actionable.