BrainForge 30-Day Learning Curriculum

For Go-to-Market, Sales, and Non-Technical Team Members

Last Updated: January 7, 2026
Created by: Luke’s 30-Day Challenge
Maintained by: GTM & Operations Teams


🎯 Purpose of This Document

This curriculum is designed to take someone from zero technical knowledge to confidently explaining BrainForge’s services in 30 days. Whether you’re new to the team, transitioning roles, or just curious about what we do — start here.

What You’ll Learn:

  • The basics of data and AI (starting from zero)
  • How BrainForge helps clients
  • Real client stories and case studies
  • How to talk about our services confidently

What You Won’t Learn:

  • How to code (that’s not the goal)
  • Deep technical implementation details
  • Every tool and technology (we focus on concepts)

Week 1: Data Fundamentals

Goal: Understand what data is, why it matters, and how it moves through systems.

Day 1-2: What is Data?

The Basics: Data is just information. When you make a purchase online, that’s data. When someone visits a website, that’s data. When a sensor measures temperature, that’s data.

Types of Data:

  • Structured Data — Organized in tables (like a spreadsheet). Example: Customer names, emails, purchase dates
  • Semi-structured Data — Has some organization but not rigid. Example: JSON files, XML
  • Unstructured Data — No fixed format. Example: Emails, images, videos, social media posts
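To make the contrast concrete, here's the same (made-up) customer record in a structured form — a fixed row in a table — and a semi-structured form, as JSON. This is just an illustrative sketch; the field names are invented:

```python
import json

# Structured: a fixed row in a table (think spreadsheet).
# Every row has exactly these columns, in this order.
columns = ["name", "email", "purchase_date"]
row = ["Ada Lovelace", "ada@example.com", "2026-01-07"]

# Semi-structured: the same customer as JSON. There is some organization
# (keys and values), but fields can nest or be missing entirely.
record = {
    "name": "Ada Lovelace",
    "email": "ada@example.com",
    "purchases": [{"date": "2026-01-07", "total": 42.50}],  # nested!
}

print(dict(zip(columns, row))["email"])  # -> ada@example.com
print(json.dumps(record))                # JSON is just text on the wire
```

Unstructured data (emails, images, videos) has neither the fixed columns nor the key/value skeleton — which is exactly why it's the hardest to analyze.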

Why Companies Care About Data:

  • Understand what customers want
  • Make better business decisions
  • Find problems before they get expensive
  • Automate repetitive work
  • Predict future trends

The Problem: Most companies have data scattered everywhere:

  • Sales data in Salesforce
  • Marketing data in HubSpot
  • Financial data in QuickBooks
  • Website data in Google Analytics
  • Support tickets in Zendesk

BrainForge’s Role: We help companies bring all this data together in one place so they can actually use it.


Day 3-4: Data Warehouses vs Data Lakes

What is a Data Warehouse? Think of it as a giant, organized filing cabinet in the cloud where all a company’s data gets stored in one place. The data is cleaned, structured, and ready to use for analysis and reporting.

Popular Data Warehouses:

  • Snowflake — The market leader and our default recommendation (works year-round, not just in winter ❄️)
  • Google BigQuery — Google’s warehouse
  • Amazon Redshift — Amazon’s warehouse
  • Azure Synapse — Microsoft’s warehouse

Key Features:

  • Structured and organized
  • Fast queries (you can ask questions and get answers quickly)
  • Built for analytics and reporting
  • Relatively expensive storage, but optimized for speed

What is a Data Lake? A data lake is like dumping everything into… well, a lake. You store raw, unprocessed data in its original format. It’s cheaper but messier.

Key Features:

  • Store everything (logs, files, raw data)
  • Cheaper storage
  • Data isn’t cleaned yet
  • Good for “we might need this someday”

What is a Data Lakehouse? A hybrid approach that combines the best of both:

  • Cheap storage like a lake
  • Organized structure like a warehouse
  • Not actually located at a lake (disappointingly)

BrainForge’s Approach: We typically recommend data warehouses (especially Snowflake) for most clients because they need clean, ready-to-use data — not a dumping ground.


Day 5-7: How Data Moves — Pipelines, ETL, and ELT

What is a Data Pipeline? A data pipeline is the system that moves data from where it’s created (like Shopify or Salesforce) to where it’s stored and analyzed (like Snowflake).

Think of it like plumbing — data flows through pipes, gets cleaned along the way, and ends up somewhere useful.

What is ETL? Extract, Transform, Load

  1. Extract — Pull data from source systems (Shopify, Google Ads, etc.)
  2. Transform — Clean it up, fix errors, reshape it
  3. Load — Store it in the data warehouse

Example:

  • Extract customer orders from Shopify
  • Transform by removing test orders and standardizing zip codes
  • Load into Snowflake for reporting
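The three steps above can be sketched in a few lines of Python. This is a toy illustration — the real tools do this at scale against live APIs — and all names and data are made up:

```python
# A toy ETL run, mirroring the Shopify example above.

def extract():
    # In reality: pull orders from the Shopify API.
    return [
        {"order_id": 1, "zip": "78701", "is_test": False},
        {"order_id": 2, "zip": "78701-1234", "is_test": False},
        {"order_id": 3, "zip": "00000", "is_test": True},  # a test order
    ]

def transform(orders):
    # Remove test orders and standardize zip codes to 5 digits.
    return [
        {**o, "zip": o["zip"][:5]}
        for o in orders
        if not o["is_test"]
    ]

def load(orders, warehouse):
    # In reality: write to Snowflake. Here, just append to a list.
    warehouse.extend(orders)

warehouse = []
load(transform(extract()), warehouse)
print(len(warehouse))  # 2 clean orders make it into the "warehouse"
```

The key point: in ETL, the cleanup happens *before* the data ever reaches the warehouse.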

What is ELT? Extract, Load, Transform

The order is different — you load the raw data first, then transform it inside the warehouse.

  1. Extract — Pull data from sources
  2. Load — Dump it into the warehouse (still messy)
  3. Transform — Clean it up using tools inside the warehouse

Why ELT? Modern cloud warehouses (like Snowflake) are powerful enough to handle transformations themselves. It’s often faster and more flexible.

BrainForge’s Approach: We use ELT for most modern cloud projects. We extract data using tools like Fivetran or Polytomic, load it into Snowflake, then transform it using dbt.

Common Tools:

  • Extraction/Loading: Fivetran, Polytomic, Airbyte, Stitch
  • Transformation: dbt (more on this in Week 2)
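To see how the ELT order differs, here's the same toy pipeline with SQLite standing in for the warehouse: raw, messy data is loaded first, and the cleanup happens in SQL *inside* the "warehouse" — which is conceptually what dbt does in Snowflake. (Illustrative only; not how Fivetran or Snowflake actually work internally.)

```python
import sqlite3

# ELT in miniature: SQLite stands in for Snowflake.
conn = sqlite3.connect(":memory:")

# 1. Extract + 2. Load: dump the raw, messy data straight into the warehouse.
conn.execute("create table raw_orders (order_id int, zip text, is_test int)")
conn.executemany(
    "insert into raw_orders values (?, ?, ?)",
    [(1, "78701", 0), (2, "78701-1234", 0), (3, "00000", 1)],
)

# 3. Transform: clean it up *inside* the warehouse, using SQL.
conn.execute("""
    create table clean_orders as
    select order_id, substr(zip, 1, 5) as zip
    from raw_orders
    where is_test = 0
""")

print(conn.execute("select count(*) from clean_orders").fetchone()[0])  # 2
```

Notice that the raw table sticks around — a big practical advantage of ELT, since you can always re-transform from the original data.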

Week 2: The Modern Data Stack

Goal: Understand the tools and technologies that power modern data operations.

Day 8-10: Snowflake & Cloud Data Warehouses

What is Snowflake? Snowflake is a cloud-based data warehouse — the most popular one we use at BrainForge. It’s where clients store all their cleaned, organized data.

Why Snowflake?

  • Separation of storage and compute — You can scale them independently (store tons of data cheaply, only pay for compute when running queries)
  • Fast — Uses Massively Parallel Processing (MPP) to run queries quickly
  • Easy to use — SQL-based, no need to manage servers
  • Secure — Built-in encryption, access controls, compliance features

Key Concepts:

  • Database — The top-level container for your data
  • Schema — A way to organize tables within a database
  • Tables — Where your actual data lives (rows and columns)
  • Warehouses — The compute resources that run queries (confusingly named, but it’s the “engine”)

When to Use Snowflake:

  • You have data from multiple sources
  • You need fast analytics
  • You’re growing and need to scale
  • You want to avoid managing servers

BrainForge’s Snowflake Work:

  • Set up Snowflake environments from scratch
  • Design database schemas
  • Optimize query performance
  • Build data models and pipelines

Day 11-13: dbt (Data Build Tool)

What is dbt? dbt is the tool we use to transform raw data into clean, usable data models inside the warehouse. Think of it as the construction crew that takes raw materials and builds something useful.

Why dbt?

  • Write transformations in SQL (the language of data)
  • Version control your data logic (like Git for code)
  • Test your data automatically
  • Document everything
  • Modular and reusable

How dbt Works:

  1. Models — SQL files that define transformations
  2. Tests — Check that data is correct (e.g., no nulls in required fields)
  3. Documentation — Auto-generate docs that explain what each table does
  4. Lineage — See how data flows from source to final output

Example:

-- models/staging/stg_customers.sql
select
    customer_id,
    lower(email) as email,  -- standardize emails
    created_at
from raw.shopify.customers
where deleted_at is null  -- remove deleted customers

dbt Layers:

  • Staging — Clean raw data, standardize column names
  • Intermediate — Join and combine data
  • Marts — Final business-ready tables (for dashboards and reports)

BrainForge’s dbt Work:

  • Build dbt projects from scratch
  • Organize models into logical layers
  • Write tests to ensure data quality
  • Create documentation
  • Train client teams to maintain dbt

Day 14-16: BI Tools & Dashboards

What is BI (Business Intelligence)? BI tools turn data into visualizations — charts, graphs, dashboards — so non-technical people can understand what’s happening in the business.

Popular BI Tools:

  • Omni — Modern, SQL-based (our preferred tool)
  • Looker — Google’s BI tool, powerful but complex
  • Tableau — Drag-and-drop, very visual
  • Power BI — Microsoft’s tool
  • Metabase — Open-source, simple

What Makes a Good Dashboard?

  • Answers a specific question — Don’t just show data, solve a problem
  • Easy to understand — Non-technical users should “get it” immediately
  • Up-to-date — Data refreshes automatically
  • Actionable — Users know what to do next

Common Dashboard Types:

  • Executive Dashboard — High-level metrics (revenue, profit, growth)
  • Sales Dashboard — Pipeline, deals, quotas
  • Marketing Dashboard — Campaign performance, leads, conversions
  • Product Dashboard — User engagement, feature adoption, retention

BrainForge’s BI Work:

  • Design mockups with stakeholders
  • Build dashboards in Omni, Looker, or Tableau
  • Connect to Snowflake or other warehouses
  • Train teams to use and maintain dashboards

Day 17-19: Product Analytics

What is Product Analytics? Product analytics is about understanding how users interact with your product — what features they use, where they drop off, what drives retention.

Popular Tools:

  • Mixpanel — Event-based analytics
  • Amplitude — Similar to Mixpanel
  • PostHog — Open-source alternative
  • GA4 (Google Analytics 4) — Google’s free tool

Key Concepts:

  • Events — User actions (clicked button, viewed page, completed purchase)
  • Funnels — Step-by-step journeys (Sign up → Activate → Subscribe)
  • Cohorts — Groups of users (users who signed up in January)
  • Retention — Do users come back?
  • Activation — Did users experience the “aha moment”?

Example Questions Product Analytics Answers:

  • What percentage of users complete onboarding?
  • Which features drive retention?
  • Where do users drop off in the checkout flow?
  • What’s the conversion rate from free to paid?
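Once events are tracked, a funnel is simple to compute: count how many users survive each step. A sketch with made-up events (real tools like Mixpanel also enforce event ordering and time windows, which this toy version skips):

```python
# Computing a funnel from raw events. All data here is made up.
events = [
    {"user": "a", "event": "sign_up"},
    {"user": "a", "event": "activate"},
    {"user": "a", "event": "subscribe"},
    {"user": "b", "event": "sign_up"},
    {"user": "b", "event": "activate"},
    {"user": "c", "event": "sign_up"},
]

funnel_steps = ["sign_up", "activate", "subscribe"]

counts = []
users = None  # users still "alive" in the funnel
for step in funnel_steps:
    step_users = {e["user"] for e in events if e["event"] == step}
    users = step_users if users is None else users & step_users
    counts.append(len(users))

print(counts)  # [3, 2, 1] -> 3 signed up, 2 activated, 1 subscribed
```

The drop from 3 to 2 to 1 is the "where do users drop off?" answer in its rawest form.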

BrainForge’s Product Analytics Work:

  • Set up event tracking (what to measure)
  • Implement tracking in apps
  • Build funnels and cohort analyses
  • Create activation and retention dashboards
  • Train product teams to self-serve

Day 20-21: Data Quality & Testing

Why Data Quality Matters: Bad data = bad decisions. If the data in your dashboard is wrong, people make the wrong calls.

Common Data Quality Issues:

  • Missing values — Nulls where there shouldn’t be
  • Duplicates — Same record appears twice
  • Inconsistent formats — “New York” vs “NY” vs “new york”
  • Stale data — Data hasn’t updated in days
  • Wrong types — Text in a number field

How We Ensure Data Quality:

  1. dbt Tests — Automated checks

    • Not null (field must have a value)
    • Unique (no duplicates)
    • Relationships (foreign keys exist)
    • Custom checks (revenue > 0)
  2. Data Monitoring — Alerts when things break

    • Pipelines fail
    • Data volume drops
    • Anomalies detected
  3. Documentation — So people know what data means
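Under the hood, a dbt test is just a query that looks for bad rows — the test passes when it finds zero. A sketch of the same checks in plain Python (field names are invented for illustration):

```python
# Toy versions of the automated checks listed above.
# Each returns the offending rows; an empty list means the test passes.

def not_null(rows, field):
    return [r for r in rows if r.get(field) is None]

def unique(rows, field):
    seen, dupes = set(), []
    for r in rows:
        if r[field] in seen:
            dupes.append(r)
        seen.add(r[field])
    return dupes

def custom(rows, check):
    return [r for r in rows if not check(r)]

orders = [
    {"order_id": 1, "revenue": 50.0},
    {"order_id": 2, "revenue": -5.0},  # fails the custom check
    {"order_id": 2, "revenue": 10.0},  # duplicate order_id
]

assert not_null(orders, "order_id") == []                        # passes
assert len(unique(orders, "order_id")) == 1                      # 1 duplicate
assert len(custom(orders, lambda r: r["revenue"] > 0)) == 1      # revenue > 0
```

In dbt these checks are declared once per model and run automatically on every pipeline run, which is what makes them practical at scale.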

BrainForge’s Quality Work:

  • Write dbt tests
  • Set up monitoring and alerts
  • Create data dictionaries
  • Audit existing data quality

Week 3: AI & Automation Explained

Goal: Understand what AI actually is, how it works, and how we use it.

Day 22-23: What is AI? (For Real)

The Hype vs Reality: AI is not magic. It’s not sentient. It’s pattern recognition at scale.

Types of AI:

  1. Machine Learning (ML) — Computers learn patterns from data

    • Example: Spam filters, recommendation engines
  2. Large Language Models (LLMs) — AI trained on text to understand and generate language

    • Examples: ChatGPT, Claude, GPT-4
  3. Generative AI — AI that creates new content (text, images, code)

    • Examples: ChatGPT writes text, DALL-E creates images

What LLMs Can Do:

  • Summarize text
  • Answer questions
  • Write content
  • Extract information
  • Classify and categorize
  • Translate languages

What LLMs Can’t Do:

  • Know things that aren’t in their training data (unless you tell them)
  • Do math reliably (without tools)
  • Access real-time information (without integrations)
  • Reason like humans (they’re pattern matchers)

BrainForge’s AI Philosophy: “Raise the Floor, Not the Ceiling.” We don’t chase bleeding-edge research. We use AI that works today to solve real problems. We build practical, reliable solutions — not science experiments.


Day 24-25: RAG, Copilots, and AI Agents

What is RAG (Retrieval Augmented Generation)? RAG = AI that searches your documents first, then answers based on what it finds.

How RAG Works:

  1. User asks a question
  2. System searches company docs, wikis, policies
  3. Relevant content is retrieved
  4. LLM uses that content to answer the question

Example:

  • Without RAG: “I don’t know your company’s vacation policy”
  • With RAG: Searches HR docs → “You get 15 days PTO per year, accrued monthly”
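Stripped to its core, RAG is “search first, then stuff what you found into the prompt.” A toy sketch of that flow — no real LLM here, retrieval is just keyword overlap, and the docs are made up (real systems use vector search over embeddings):

```python
# A toy RAG pipeline: retrieve a relevant doc, then build an augmented prompt.
docs = {
    "pto_policy": "You get 15 days PTO per year, accrued monthly.",
    "expense_policy": "Submit expenses within 30 days with receipts.",
}

def retrieve(question, docs):
    # Score each doc by how many question words it contains (toy "search").
    words = set(question.lower().split())
    return max(docs.values(), key=lambda d: len(words & set(d.lower().split())))

def build_prompt(question, docs):
    # The retrieved text becomes context the LLM must answer from.
    context = retrieve(question, docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

prompt = build_prompt("How many PTO days do I get per year?", docs)
print(prompt)
```

The prompt now contains the PTO policy, so the LLM can answer from company knowledge instead of guessing — that's the entire trick.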

BrainForge’s RAG Work:

  • Build internal knowledge hubs
  • Connect to Notion, Google Drive, Confluence
  • Make company knowledge searchable via AI chat

What is a Copilot? A copilot is AI that assists a human — the human is still in control, but AI speeds up the work.

Examples:

  • Analyst Copilot — Summarizes meetings, creates action items
  • Lead Researcher Copilot — Auto-researches accounts for sales teams
  • Customer Support Copilot — Suggests responses to support tickets

Key Principle: Human in the loop. AI suggests, human decides.


What is an AI Agent? An agent is AI that takes actions on its own (with guardrails).

Examples:

  • AI Intake Optimizer — Triages incoming requests and routes them
  • Lead Distribution Agent — Assigns leads to sales reps automatically
  • Ticket Automation Agent — Creates tickets from meetings

Key Principle: Guardrails and oversight. AI acts, but humans monitor.


Day 26-27: AI in Practice — Evals, Guardrails, Safety

The Problem with AI: LLMs can hallucinate (make things up), be biased, or give unsafe responses.

How We Make AI Reliable:

  1. Evals (Evaluations) — Testing AI output quality

    • Does it answer correctly?
    • Is the tone appropriate?
    • Does it follow instructions?
  2. Guardrails — Rules that prevent bad behavior

    • Block unsafe content
    • Enforce company policies
    • Prevent leaking sensitive data
  3. Human Review — Spot-checking AI outputs
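An eval can be as simple as a table of prompts plus a check each output must pass. A minimal sketch — `fake_model` is a stand-in for a real LLM API call, and the cases are invented:

```python
# A minimal eval harness. `fake_model` stands in for a real LLM call.
def fake_model(prompt):
    canned = {
        "What is our PTO policy?": "You get 15 days PTO per year.",
        "Summarize the meeting": "TODO",
    }
    return canned.get(prompt, "I don't know.")

# Each eval case: a prompt plus a check the output must pass.
eval_cases = [
    {"prompt": "What is our PTO policy?", "check": lambda out: "15 days" in out},
    {"prompt": "Summarize the meeting", "check": lambda out: len(out) > 20},
]

results = [case["check"](fake_model(case["prompt"])) for case in eval_cases]
print(f"{sum(results)}/{len(results)} evals passed")  # 1/2 evals passed
```

Real eval suites have hundreds of cases and richer checks (tone, safety, grounding), but the shape is the same: run the model, score the output, track the pass rate over time.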

BrainForge’s AI Safety Work:

  • Build evaluation frameworks
  • Set up guardrails and policy layers
  • Create audit trails
  • Train clients on safe AI usage

Week 4: BrainForge in Practice

Goal: Understand how we engage with clients and deliver value.

Day 28: Who We Are & What We Do

Who is BrainForge? We’re a forward-deployed AI & data services team for growing mid-market companies ($10M+ revenue). We embed with client teams to solve hard, unsexy problems with data and AI.

What Makes Us Different:

  • Outcome-focused — We care about results, not just deliverables
  • Raise the floor, not the ceiling — We build working solutions, not science experiments
  • Human + AI — We pair consultants with AI tools for leverage
  • Speed & pragmatism — We deliver value in weeks, not quarters

Our Core Services:

| Category | What We Do |
| --- | --- |
| Data Platform | Set up warehouses, pipelines, orchestration, and data foundations |
| Data Modeling | Build data models, marts, and semantic layers |
| Reporting & Insights | Build dashboards, reporting, and decision-support analytics |
| Product Analytics | Event tracking, funnels, retention analysis |
| AI | Copilots, agents, knowledge systems, and workflow automation |
| Training & Enablement | AI literacy, data literacy, tool training |

Day 29: Client Stories & Case Studies

1. ABC Home & Commercial — AI CSR Agent

  • Problem: Call center overwhelmed, 3-6 month CSR training time
  • Solution: Built “Andi” — AI agent that handles customer inquiries
  • Result: Faster onboarding, reduced call center load, actual ROI

2. StackBlitz — Modern Analytics Stack

  • Problem: No centralized data, teams working in silos
  • Solution: Built Snowflake + dbt + Omni stack
  • Result: GTM team has clean, AI-ready analytics

3. Vita Coco — Real-Time Stock Monitoring

  • Problem: Couldn’t track inventory across 900+ Target stores
  • Solution: Real-time monitoring system
  • Result: Detect stockouts immediately, optimize supply chain

4. Amazon Dashboard — Segmented Sales Reporting

  • Problem: Sales data scattered, no unified view
  • Solution: Built segmented dashboard
  • Result: Adopted across marketing, finance, and ops

5. Ticket Automation — Meeting → Tickets

  • Problem: Manual ticket creation from meetings
  • Solution: AI agent listens to meetings and creates tickets
  • Result: Hours saved weekly

6. Lead Enrichment — Account Based Marketing

  • Problem: Completely manual ABM workflow
  • Solution: Automated enrichment and workflow
  • Result: Scaled ABM without adding headcount

Key Themes:

  • We solve real problems
  • We deliver measurable results
  • We make AI practical, not theoretical

Day 30: How We Work & Engage

Typical Engagement Flow:

  1. Discovery (2-4 weeks)

    • Understand the problem
    • Audit current state
    • Map data sources and workflows
    • Identify quick wins
  2. Build (4-8 weeks)

    • Set up infrastructure
    • Build pipelines and models
    • Create dashboards or AI systems
    • Test and iterate
  3. Enablement (2-4 weeks)

    • Train client teams
    • Document everything
    • Hand off ownership
    • Set up maintenance plan

Our Delivery Principles:

  • Fast feedback loops — Show value in weeks, not months
  • Modular and iterative — Build in pieces, ship continuously
  • Knowledge transfer — Clients should be self-sufficient when we leave
  • Outcome-driven — We measure success by business impact, not hours billed

Commercial Models:

  • Diagnostic Sprint — 2-4 week assessment
  • Accelerator Package — Fixed-scope, fixed-timeline project
  • Retainer — Ongoing embedded support
  • Outcome-based — Tied to specific KPIs

Our Typical Client:

  • $10M+ revenue (mid-market)
  • Digital transformation or operations lead
  • Has data but can’t use it effectively
  • Needs speed and pragmatism, not a 6-month roadmap

Glossary: Common Terms & Buzzwords

| Term | What It Means |
| --- | --- |
| AI (Artificial Intelligence) | Computers that can learn and make decisions |
| LLM (Large Language Model) | AI trained on text (ChatGPT, Claude) |
| RAG (Retrieval Augmented Generation) | AI that searches docs before answering |
| Data Warehouse | Centralized storage for structured data (Snowflake, BigQuery) |
| Data Lake | Storage for raw, unprocessed data |
| Data Lakehouse | Hybrid of warehouse and lake |
| ETL | Extract, Transform, Load |
| ELT | Extract, Load, Transform |
| Data Pipeline | System that moves data from source to destination |
| dbt | Data build tool — transforms data in the warehouse |
| BI (Business Intelligence) | Tools for dashboards and reporting (Omni, Looker, Tableau) |
| Product Analytics | Understanding user behavior (Mixpanel, Amplitude) |
| GA4 | Google Analytics 4 — website/app analytics |
| Snowflake | Popular cloud data warehouse |
| MPP (Massively Parallel Processing) | Running queries across many servers at once |
| Schema | Structure of a database (tables, columns, types) |
| SQL | Language for querying databases |
| API | Way for systems to talk to each other |
| Copilot | AI assistant (human in control) |
| Agent | AI that takes actions (with guardrails) |
| Guardrails | Rules to keep AI safe and reliable |
| Evals | Testing AI output quality |
| Stand-up | Quick daily team meeting (10-15 mins) |
| Edge Server | Server close to users for speed (not super relevant to us) |
| Client-Server | Architecture where clients request data from servers |

Resources & Next Steps

Internal Resources

Notion Pages:

  • Services Overview
  • BrainForge Data Playbooks
  • Case Studies
  • Tool Comparisons

Files (brainforge-files/sales/):

  • brainforge_capabilities_deck.pdf — What we do and how we engage
  • deep_dive_on_ai_automation_services.pdf — Technical AI services
  • brainforge_general_services.pdf — Full service catalog
  • Case study PDFs in case_studies/ folder
  • Tool comparison PDFs in comparisons/ folder


Who to Talk To

  • Data questions: Engineering team
  • AI questions: AI team
  • Client engagement: Uttam, Robert
  • Sales questions: Sales team
  • Specific project questions: Project leads

Next Steps After 30 Days

Suggested Activities:

  1. Shadow a client call — See how we engage with prospects
  2. Read 3 case studies — Understand our proven outcomes
  3. Watch a dashboard demo — See our BI work in action
  4. Sit in on a standup — Experience how engineering operates
  5. Review an SOW — Understand how we scope projects

Deepen Your Knowledge:

  • Pick one service line and go deep
  • Learn the basics of SQL (not required, but helpful)
  • Get hands-on with Notion, Snowflake, or dbt (if curious)
  • Join a project kickoff or demo

Feedback & Maintenance

This curriculum is a living document. As BrainForge evolves, so should this guide.

How to Contribute:

  • Found something confusing? Flag it
  • Missing a topic? Add it
  • Better explanation? Update it
  • New case study? Include it



Welcome to BrainForge. Let’s make AI and data actually useful. 🚀