Forge Your Brain

The Intelligent Company Framework

Owner: Luke Scorziell
Status: In progress — Positioning drafted. Research compiled across all client calls. How We Get You There pending.
Last updated: 2026-02-26
Output of: forge-your-brain-context.md


How to use this document

Positioning is Luke’s framework — the belief system in his own words, with the insights that led him there.

Research is the raw evidence — verbatim quotes and documented observations from real client and prospect calls, organized by source.

How We Get You There is the manifesto — Luke’s voice, pending.

This document feeds: service posts (Ryan), webinar scripts (Hannah), campaign messaging (Rico), and the public-facing manifesto (Luke).


Strategy at a Glance

Why we exist: To give every business on-demand access to the knowledge it needs to win.

What we do: We forge your company brain, giving you custom data and AI infrastructure that turns your company’s information into something everyone can use.

What we value: Custom-built — we build for how your company actually works. Your data — we bring AI to your data, not the other way around. Forward thinking — we’re AI-native because we believe your people should spend their time on the work only humans can do.

How we sound: Contrarian and welcoming. We’re direct about what’s broken and warm about what’s possible. We’re especially interested in talking to non-technical people who are ready to move.

Who we’re fighting

Each archetype in short:

The Stack: Every problem became a new subscription. Now there are fifty tools, none of them talking to each other, and a team managed by their software instead of the other way around.

The Firm: The partner closed. The junior delivered. Six months and $300K later, there’s a deck — with a slide recommending three more tools to buy. The knowledge walks out when the engagement ends.

The Bot: ChatGPT with a different logo. It doesn’t know your clients, your data, or your processes. Sixty days in, everyone’s back to copy-pasting into the free version. The team “tried AI.” It didn’t stick.

Brand Strategy

Why We Exist

To give every business on-demand access to the knowledge it needs to win.


What We Do

Short: We forge your company brain.

Descriptive: We build custom data and AI infrastructure that turns your company’s information into something everyone can use.


What We Value

Custom, not generic: We build for how your company actually works. Off-the-shelf is someone else’s brain.

Your data stays yours: We bring AI to your data. Privacy isn’t a feature, it’s our foundation.

Forward thinking: We’re AI-native because we believe your people should spend their time on the work only humans can do.


How We Look, Feel, and Sound

Contrarian: The old way is dead. Bloated SaaS stacks, generic AI tools, consulting firms that charge $400/hr and build decks in PowerPoint.

Welcoming: We welcome anyone willing to move, especially non-technical people.


Who We’re Fighting

Archetype 1: The Stack

The Stack is what happens when every problem gets solved with a new subscription. Eight CRMs. Fifty systems. A Power BI dashboard nobody logs into. Salesforce that technically contains the revenue data but has almost nothing useful in it. Fathom notes that record every meeting and connect to nothing downstream.

The Stack costs $15–50K/month. The team learns a new interface, then works around it. The tools were supposed to talk to each other — they don’t. The data is “in the system” but nobody can find it when they need it.

The damage: your team is managed by their tools instead of the other way around. The company conforms to the software, not the software to the company. When the next wave of AI-native tools makes the stack obsolete, you’re migrating — again.

The Brainforge counter: We don’t add to your stack. We replace it. One custom system, built for how your business actually works.


Archetype 2: The Firm

The Firm has been around since before the internet. They’ve reinvented themselves every decade — first “digital transformation,” then “agile,” now “AI strategy.” The partner closes the deal. The junior executes. Six months and $300K later, you have a deck.

Not a system. A deck.

And somewhere in that deck is a slide recommending you purchase three tools from their preferred vendor list. The Salesforce rep shows up. They pitch you Data Cloud. The number on the proposal has too many zeros.

We’re all thinking it — but a past client put it plainly: “If somebody tries to pitch me Data Cloud one more time, I will punch them in the face.”

That’s what The Firm does. They translate your problem into a more expensive version of itself — one that happens to generate another engagement for them. Their “AI practice” was stood up 18 months ago, built on frameworks from a different era. They don’t use AI the way you’d need to use it to actually build with it.

The damage: the knowledge walks out the door when the engagement ends. Nothing compounds. Your team is back where they started, only lighter by a few hundred thousand dollars and a few months of distracted leadership.

The Brainforge counter: We build something that stays. We’re AI-native — not as a practice area, but as the only way we know how to work.

We leave your team more capable than when we arrived, not more dependent.


Archetype 3: The Bot

The Bot is what most companies call “AI adoption.” It’s ChatGPT with a different logo, or the AI feature your existing SaaS tool added in 2023 to keep up with the market. It doesn’t know your clients, your processes, or your data. You paste context in manually. It gives you something plausible. You fact-check it, adjust it, send it.

Next week, it’s forgotten everything. Sixty to ninety days in, the team is back to copy-pasting into the free version — which is exactly where they started.

“It doesn’t learn,” said a prospect recently. That’s not a bug. That’s the product. Generic AI is built to serve everyone, which means it’s built for no one.

The damage: you get the theater of AI adoption without the substance. The team “tried it” and it didn’t stick. The status quo wins — again.

The Brainforge counter: The AI we build knows your business. It’s grounded in your data, your context, your history. It learns. Because the infrastructure underneath it was built first.


The through-line

These aren’t three separate villains — they’re three chapters of the same story. A company bought the stack because each tool solved a real problem. They hired the firm because the stakes felt high. They rolled out the bot because AI sounded important. After all of it, the data still doesn’t move. The information still doesn’t reach the people who need it. The team is still waiting on the analyst.

That’s who we’re talking to. And that’s what we fix.


Positioning

The Intelligent Company

Three pillars:

  • Everyone is empowered to get insights from data
  • Team adopts one custom solution that iterates
  • Start as soon as possible

What it looks like:

  • Manual tasks are easier
  • Increased output with the same team
  • Information moves easily and fast
  • Automatic reports that come in via Slack or Teams
  • Everything lives in one place
  • Chats learn from you
  • Unified dashboard
  • Confidence in a fast-moving team
  • Willing to move, even when it’s uncomfortable
  • Everyone becomes someone who can talk with the data
  • Ask a question and get an immediate answer; it’s that easy
  • Confidence when negotiating, backed by data
  • Confidence that you’re using the right tools and tracking the right metrics
  • Purposeful analytics with clear definitions and a clear why
  • Tools are built to be used, not built to be sold
  • Unnecessary SaaS tools get cut
  • A team that builds quickly

Staying the Same

Three pillars:

  • Things only get worse the longer you wait
  • Teams operate in silos and important information gets missed
  • Keep adding more tools but never quite solve the root issue

What it looks like:

  • Relying on SaaS that doesn’t fulfill their needs anymore
  • Costs that scale with customers but can’t be passed along
  • Building everything themselves
  • Spreadsheets that don’t get updated, MCPs that break
  • Have to think about rebuilding features instead of focusing on the work that matters
  • If they’ve tried working with someone else, it’s a dev shop that isn’t meeting any of their needs — slow, doesn’t communicate, doesn’t see the vision, doesn’t advance with AI
  • People are the bottleneck: information only moves when someone passes it along
  • At ABC, services get updated and the CSR team doesn’t find out → Unhappy clients
  • Spreadsheet from Hell
    • Lists people who haven’t worked at the company in three years
    • Manually maintained services
    • Documentation used by key team members is inaccurate and outdated
  • Everyone has their own source of truth
  • Spending time manually updating things instead of higher-leverage activities
  • 100 emails a day to read, reply to, and then it might not even be a qualified lead
  • New leads only get into the CRM if a human does it
  • Off-the-shelf AI tools slow things down — they start from zero every time
  • “It doesn’t learn.” — said a prospect recently
  • Staff going to the data team for every question — why can’t we use an LLM?
  • Getting it right from the beginning is most important
    • If you don’t set up the infrastructure right, you’ll be spending years and tens of thousands of dollars trying to get it right
    • Destined to fail
  • Static forecasts that rely on people to update them
  • Filling data gaps with anyone around — destining them to fail
  • Turn a dumpster fire into something beautiful
  • Scared to make changes near peak seasons
    • This is a good point of urgency: make changes when it’s not too late
  • Eight different CRMs
  • 50 different systems
  • No one is using anything → dashboards not being used
  • One of my older clients turned off their Power BI dashboard and no one noticed. What’s stopping you from running the same experiment at your company? Pride? Fear? Or are people actually using the dashboards?

Research

Verbatim quotes and documented observations from client and prospect calls, organized by source. Each entry leads with the point being made, followed by the verbatim quote cited to source. Where the quote is from a cleaned meeting note rather than a raw transcript, that is noted.


Kelley Weaver — Melrose PR / Bitwire

Discovery call, Feb 24, 2026

  1. Her call notes go nowhere. They record everything and use nothing. Kelley logs every call in Fathom — and those notes disappear into a folder no one opens. Nothing connects downstream to her team, her pipeline, or her next meeting.

    “Fathom notes go nowhere.” — Kelley Weaver, discovery call (documented in meeting notes)

  2. Slack is the place where information goes to die. If something wasn’t said in the last 24 hours, it doesn’t exist. Nothing is retrievable. The more the team uses it, the worse the signal-to-noise becomes.

    “Slack is messy and disorganized.” — Kelley Weaver, discovery call (documented in meeting notes)

  3. She is manually copy-pasting Fathom links into Trello by hand after every single call. Not a system problem she’s ignoring — a system problem she has accepted. Every call, every time, by hand. (Documented observation from discovery call)

  4. She’s the only one using AI. Her team of 20 isn’t. She described her own AI usage as high and personal — generating content, editing, drafting. Her team of 20 is sitting at zero. The knowledge stays with her and doesn’t compound.

    “I have a million chats with Claude.” — Kelley Weaver, discovery call (documented in meeting notes)

  5. Four tools doing four things. None of them talk to each other. Claude, Fathom, Trello, Slack — each bought to solve a problem, none of them connected.

    “Nothing is integrated. Tools aren’t talking to each other.” — Kelley Weaver, discovery call (documented in meeting notes)

  6. She learns things every day and has no way to share that operational knowledge with her team. What she knows as the founder doesn’t flow down. What the team learns doesn’t flow across. Knowledge stays personal and expires when she’s unavailable. (Documented observation from discovery call)


Joshua Dent — David & Goliath

Discovery call, Feb 25, 2026 — mirrored back in demo prep, Mar 2026

  1. Adoption is the actual problem — not AI capability, not cost. Joshua’s concern isn’t whether AI works. It’s whether it sticks. He’s seen the cycle: budget approved, vendor onboarded, training done, adoption announced. Then, quietly, everyone goes back to what they know.

    “Tools get bought, teams get trained, and 60–90 days later most people are back to copy-pasting into ChatGPT.” — Luke’s mirror of Joshua’s exact language from the discovery call (documented in demo prep notes)

  2. He’s tried AI rollouts. They don’t stick. Not a hypothetical — a lived pattern. Training happened. People nodded. Two months later, nothing changed.

    “We’ve tried AI rollouts before. They never stick.” — Reported objection from Joshua Dent (documented in demo prep notes)

  3. Ten-plus SaaS tools in the stack. None of them changed how the team works. D&G runs Kanto, Frame.io, Teams, and more. Each was bought to solve a problem. The team navigates a different interface for every workflow, and still does most of the actual work outside of all of them. (Documented from discovery context)


Third Bridge SVP

Exploratory call, Feb 19, 2026

  1. A research goldmine buried in transcripts and expert interviews — and nobody can find anything. Third Bridge’s core product is expert knowledge — structured interviews, proprietary content, human insights from thousands of conversations. That content lives in files. Nobody can surface it on demand. (Documented observation from post-call notes)

  2. Their highest-value experts doing manual triage work that a system should be doing. The people who should be focused on research quality and client insight are spending time on routing, qualification, and sorting. (Documented observation from post-call notes)

  3. Scaling headcount to solve problems that should be solved with intelligence. The response to “we need to process more” has been “hire more people.” The data exists. The infrastructure to reason over it doesn’t. (Documented observation from post-call notes)


Lilo Social — Zac Fromson

Sales call / discovery, Nov 13, 2025

  1. SaaS costs are eating the agency alive — and the tools still don’t do what they need. Zac runs an ecom performance agency managing 60+ brands. He’s paying for Triple Whale, Klaviyo, MCP tools — and still building things himself to fill the gaps. Every new software pitch is another $7,000 he can’t pass on to clients.

    “Triple Whale has certain features, like Moby, for example. I built my own Moby, and that was kind of their first piece of what we built. Their cost is just ridiculous.” — Zac Fromson, LiloSocial, Nov 2025 transcript

  2. He’s pitching new tools to clients and getting pushback on the sticker shock. Clients don’t want to hear about another platform that requires another subscription. The math stops working at scale.

    “I can’t come to every brand that we work in. You need lifetime. You need this. You need this report. And they’re saying… you need another $7,000 in software and. They’re like, what the fuck, right?” — Zac Fromson, LiloSocial, Nov 2025 transcript

  3. MCPs are too brittle to work at agency scale. One brand at a time. Sixty accounts. Impossible. He tried building AI tooling on MCP integrations. Token limits break. Security becomes a nightmare when you have 60 Klaviyo accounts and you can’t give everyone access to everything.

    “It just wasn’t working. You can only connect one brand at a time to mcp. Like, a lot of times, the token limits were breaking. We have 60 [Klaviyo] accounts at the agency, so it’s just so tough for security, giving everyone access.” — Zac Fromson, LiloSocial, Nov 2025 transcript

  4. Forecasting is still running out of Google Sheets. Manually. He’s demoed a $600/brand forecasting SaaS — basically a monetized spreadsheet — and is thinking about rebuilding it himself instead. Because the alternative is keeping the manual process going.

    “Right now, we run it all manually out of Google sheets. Super fucking time [consuming].” — Zac Fromson, LiloSocial, Nov 2025 transcript

  5. His tools are a hodgepodge. He knows it. He can’t help it yet. He’s tried to build something coherent. What he has is a collection of half-working integrations duct-taped together — and the connectors keep breaking.

    “I don’t want to say a hodgepodge of things, but in some regards, a couple of different things that I think make sense for us as an agency that fits what we do.” — Zac Fromson, LiloSocial, Nov 2025 transcript

  6. His dev agency is slow, messy, and doesn’t communicate. The relationship has no rhythm. Timelines missed. Miscommunications piling up. No weekly standups. He’s doing the UI himself because their design wasn’t good enough. The output is slightly behind “complete” and has been for months.

    “We don’t have, like, weekly standups with them. It’s like, it’s. A kind of janky, messy. Relationship. It’s not how I run my agency.” — Zac Fromson, LiloSocial, Nov 2025 transcript

  7. He wants to buy outcomes, not hours. The industry still sells hours. He’s not interested in scope and timelines. He knows what a feature is worth to his business and he wants to pay that — not a blended hourly rate.

    “I’d rather pay you for the outcome. Rather be like, I don’t really care if you can do this in 24 hours, but it’s worth $7,000 to me.” — Zac Fromson, LiloSocial, Nov 2025 transcript


Lilo Social — Zac Fromson, Bobby Palmieri

Project sync, Feb 13, 2026 — ~60–75 days into active build

  1. In a matter of weeks, they knocked off a funded SaaS product. LiloSocial came in wanting a brand intelligence and forecasting platform. Brainforge built a feature set that competes directly with Raylion — a funded product in the same space — in less than a month.

    “In a matter of a month, we just, like, knocked off a product, and that’s just, like, one aspect of what we’re doing.” — Zac Fromson, LiloSocial, Feb 2026 transcript

  2. “Velocity is certainly up.” And they feel it. The build was fast and getting faster. Not progress on paper — the pace was visible, real, and translating to shipped features week over week.

    “Velocity is certainly up, and I think, like, Bobby being able to, I think, get things… like, the fact that we stood up, like, I mean, we basically built Raylion, which is, like, a competing product… you know, what we accomplished on that generator in that short period of time.” — Zac Fromson, LiloSocial, Feb 2026 transcript

  3. They went from a “janky, messy” dev agency to a shipped platform — in weeks. In November 2025, Zac described his previous dev agency as uncommunicative and months behind. By February, the core platform was live, features were shipping, and the team was “super happy.”

    “I think we’re super happy. And I think, like, where the product is, like… we’re super excited. I think that we… are definitely, like, if we weren’t already fully in sprint mode of, like, having a lot of things stood up.” — Bobby Palmieri, LiloSocial, Feb 2026 transcript

  4. 60 days. A lot of shit stood up. Zac’s own framing: step back, look at what’s been built, and acknowledge the reality. In 60 days they went from scattered tools and a failing agency to a functioning product roadmap.

    “In 60 days, we do have quite a bit of shit stood up. And I think we kind of need to just take a breath and be like, hey, like, a lot has really been done.” — Zac Fromson, LiloSocial, Feb 2026 transcript

  5. Scheduled insights to Slack — automated. It’s being built because they asked for it. Not a demo feature. The client described the workflow they wanted: schedule a prompt, push the result to the right Slack channel, don’t make the team go looking. It’s on the roadmap.

    “Admin settings, like, we want to be able to schedule prompts to send to Slack, right?” — Bobby Palmieri, LiloSocial, Feb 2026 transcript
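The workflow Bobby describes (run a saved prompt on a schedule, push the result into a Slack channel) is simple to sketch. What follows is a hypothetical illustration, not Brainforge's implementation: the webhook URL, the report format, and the `run_prompt` stand-in are all assumptions.

```python
import json
import urllib.request

def build_scheduled_report(prompt_name: str, answer: str) -> str:
    """Format a scheduled insight so the channel sees which prompt produced it."""
    return f"*Scheduled insight: {prompt_name}*\n{answer}"

def post_to_slack(webhook_url: str, text: str) -> int:
    """POST a message to a Slack incoming webhook; returns the HTTP status code."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps({"text": text}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# A scheduler (cron, Cloud Scheduler, etc.) would invoke something like:
#   post_to_slack(WEBHOOK_URL, build_scheduled_report("Weekly revenue", run_prompt(...)))
# where run_prompt is a stand-in for whatever model call generates the insight.
```

Slack's incoming-webhook payload (`{"text": ...}`) is the real part here; everything upstream of it depends on how the prompts are stored and scheduled.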


ABC Home and Commercial — Yvette Ruiz, Steven, Scott Harmon

Discovery and requirements session, Feb 14, 2025

  1. Their service availability data lives in a spreadsheet. When managers update it in their heads, Yvette doesn’t find out. Each division manager controls their own territory rules. If they decide to expand or contract a zip code, that information travels through informal communication — or doesn’t travel at all.

    “Until a division manager changes their mind and decides to do a Zip code and then doesn’t let [Yvette] know. And then we’re all over the place.” — Steven, ABC Home and Commercial, Feb 2025 transcript

  2. The spreadsheet is so complex it’s earned a nickname. Service coverage by zip code, by trade, by inspector — all manually maintained. The moment something changes in the field, the spreadsheet is wrong.

    “What I sort of tongue in cheek called the spreadsheet from hell.” — Scott Harmon, ABC Home and Commercial, Feb 2025 transcript

  3. Service agreements change all the time. None of those changes get passed to the CSR team. The business updates what they offer. The ops team may or may not know. The CSRs definitely don’t — until a customer catches them in the gap.

    “They’ve changed it so many times like today, they can go in there and say, Hey, we’re gonna adjust this agreement. We’re not gonna do this, which they just did. That stuff doesn’t get passed to us.” — Yvette Ruiz, ABC Home and Commercial, Feb 2025 transcript

  4. The staff directory in the spreadsheet lists people who no longer work there. Not a metaphor. Three names on the list are gone. Someone has to check every time — or schedule on the wrong person.

    “There are 3 sales guys up there that don’t work here anymore.” — Steven, ABC Home and Commercial, Feb 2025 transcript

  5. She asked for guidance. Got told different things. Checked the document. The document was wrong. CSRs are fielding real customer calls with outdated, inconsistent documentation as their only resource. The document is the authority — and the authority is unreliable.

    “I keep getting told different things. I’m like, well, let’s look for the document. And the document was incorrect.” — Yvette Ruiz, ABC Home and Commercial, Feb 2025 transcript


ABC Home and Commercial — “Andy” AI Agent

User feedback session, Jul 23, 2025 — CSR team using the AI agent built by Brainforge

  1. From “before I was having a difficult time” to scheduling faster, with less back-and-forth. A CSR who doesn’t love AI admitted it out loud. She uses Andy. It works. Her scheduling accuracy improved. That’s the conversion that matters — not the enthusiasts.

    “I will admit I’m not a huge fan of AI. But I really feel like this is gonna help a lot… I’m able to schedule things a little bit easier, whereas before I was having a difficult time with that.” — Raeann Oliver, ABC Home and Commercial, Jul 2025 transcript

  2. The answer came back fast, comprehensive, and easy to follow — while she was on a live customer call. This is the test. The system doesn’t exist for demos. It exists for the moment a CSR has a customer waiting on the other end of the line and needs the right answer now.

    “The answer was very comprehensive. Very easy to follow. Came back very quickly, which obviously, when you’re on the phone is a huge thing, right?” — Melissa Lykins, ABC Home and Commercial, Jul 2025 transcript

  3. Everything in one place. Searchable. And every time she’s looked, she’s found what she needed. Not “sometimes it works.” Every time. That’s what a well-built knowledge system does — it removes the doubt about whether to go look.

    “I think it’s helpful that it’s all in one place, and you can Control-F. Put in your words — because we know what kind of terms we’re looking for — and every time I’ve used it, I found what I needed to find.” — Tiffany Torres, ABC Home and Commercial, Jul 2025 transcript

  4. Her own personal AI that isn’t contaminated by everyone else’s conversations. Teams using shared tools — Slack channels, shared docs, generic GPTs — get noise from everyone. A system built per person, with her own history, changes the dynamic.

    “I like that you have your own personal chat with Andy, and it’s not flooded with other people’s chats. So that helps a lot.” — Raeann Oliver, ABC Home and Commercial, Jul 2025 transcript

  5. The history compounds. She caught herself before asking a duplicate question — and went back to find the answer herself. That moment — “wait, I already asked that” — is the system working. It’s institutional memory made personal and accessible.

    “I like that it keeps the history so you can go back — wait, I already asked that question — go back and find it. So that helps.” — Raeann Oliver, ABC Home and Commercial, Jul 2025 transcript

  6. They’re building it in public — refining it in real time during the feedback session itself. Brainforge’s PM updated the doc mid-call, re-ran the query, and showed the corrected output live. The system got better during the meeting. That’s not possible when knowledge lives in spreadsheets. (Documented observation from feedback session transcript)


Plan Medicare — Brian Krantz

Discovery/requirements call, Jan 22, 2026

  1. He’s spending his busiest seasons drowning in emails when he should be on the phone. Insurance agent. Solo operator. 100 emails a day during normal season. 250 during open enrollment. Every email he reads, every reply he drafts, is a phone call he didn’t make.

    “I’m getting like 100 a day, and sometimes they’re my busy season. It could be 250 a day, and I would just want to be on the phone more.” — Brian Krantz, Plan Medicare, Jan 2026 transcript

  2. Every new introduction requires manual copy-paste gymnastics just to see who the person is. An advisor introduces him to a prospect via email. To get context on them, he has to open the CRM separately, copy the name, search, read. Every time.

    “I’m always like copying the emails to pull up their stuff.” — Brian Krantz, Plan Medicare, Jan 2026 transcript

  3. Every new contact has to be manually created in the CRM. When someone is introduced via email, a human still has to enter them as a new record. No inference. No automation. Copy-paste the name, email, company — every time.

    “This contact that she introduced me to. Now someone needs to manually create that in my CRM.” — Brian Krantz, Plan Medicare, Jan 2026 transcript

  4. Generic AI tools don’t remember him, his clients, or how his business works. He tried using off-the-shelf AI tools to speed things up. The fundamental problem: they start from zero every session. They don’t know his business.

    “But it doesn’t learn. It doesn’t learn.” — Brian Krantz, Plan Medicare, Jan 2026 transcript
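The manual steps Brian describes (copy a name out of an intro email, search the CRM, create the record by hand) are exactly the glue a small amount of automation removes. A minimal sketch, with assumptions labeled: the `Name <email>` pattern and the commented-out `crm_create_contact` call are hypothetical, since the real step depends on whatever API his CRM exposes.

```python
import re

# Matches "First Last <address@domain>" pairs, an assumed intro-email format.
INTRO_PATTERN = re.compile(
    r"(?P<name>[A-Z][\w.'-]+(?: [A-Z][\w.'-]+)+)\s*<(?P<email>[^<>@\s]+@[^<>\s]+)>"
)

def extract_contacts(email_body: str) -> list[dict]:
    """Pull candidate contacts out of an intro email body."""
    return [
        {"name": m.group("name"), "email": m.group("email")}
        for m in INTRO_PATTERN.finditer(email_body)
    ]

# Downstream, each extracted contact would feed the CRM:
# for contact in extract_contacts(body):
#     crm_create_contact(contact)  # hypothetical stand-in, not a real API call
```

The point isn't the regex; it's that "someone needs to manually create that in my CRM" is a parsing-and-lookup problem, not a judgment call a human needs to make.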


Breezy — Bareket Sigal and James Lee

Early discovery call, Dec 2025

  1. They’ve both watched companies set up data wrong from scratch — and spend years cleaning it up. Founders building a new company. They’ve seen the pattern before: skip the data infrastructure early, optimize for speed, and then spend three years fixing it.

    “We all have been in two companies that fucked up the initial setting and then spent years fixing it.” — Bareket Sigal, Breezy, Dec 2025 transcript


StackBlitz — Mitchell

Data needs discussion, Dec 2024

  1. Events piped into Mixpanel with no rhyme or reason. Nobody knows what’s being tracked or why. A fast-growing startup with real product analytics needs. Their current setup: Segment piping random events into Mixpanel. The tracking was set up without a taxonomy, without a strategy, without ownership.

    “They’re piping some like random events through Segment and into Mixpanel… I don’t love it like the eventing. There’s not really like any rhyme or reason to it. Really?” — Mitchell, StackBlitz, Dec 2024 transcript


StackBlitz

Completed engagement — data infrastructure and analytics foundation

  1. One month in. Core infra, ETL, data marts, and dashboards — all established. StackBlitz went from “random events piped into Mixpanel with no rhyme or reason” to a structured, working data infrastructure in four weeks. The company didn’t slow down while it was built.

    “The amount of work we’ve been able to do for them in just a month has been pretty good from my standards, like we’ve established all the core infra, all the core etl… data marts and dashboards.” — Uttam Kumaran, internal renewal call, Mar 2025 transcript

  2. 100% adoption across GTM and leadership. Not 60%. Not the “power users.” Every person the system was built for uses it. Daily. (From StackBlitz engagement, referenced in campaign brief and context file)

  3. 80% reduction in manual reporting. Hours that used to go into assembling reports now happen automatically. The team’s time shifted from assembling data to acting on it. (From StackBlitz engagement, referenced in campaign brief and context file)

  4. 3x more decisions made per week. When access to insight is instant, the pace of decision-making changes. Teams stop waiting for data and start operating on it. (From StackBlitz engagement, referenced in campaign brief and context file)

  5. They were sitting on a trove of data no one had ever analyzed. The data wasn’t missing. The capability to ask it questions was. Once the infrastructure was in place, the value that had always been there became accessible.

    “They’re sitting on a trove of data that they’ve never analyzed.” — Uttam Kumaran, internal renewal call, Mar 2025 transcript


Inteleos — Daniel Carpenter

Intro call, Jan 30, 2026

  1. Staff still going to the data team for every question — even the ones LLMs could answer. Data scientists doing first-line triage because the business hasn’t built the layer that would let anyone else get to the data themselves.

    “How [do we use] LLMs to better support staff internally so they don’t have to go to us to get data.” — Daniel Carpenter, Inteleos, Jan 2026 transcript


Urban Stems — Emily Giant, Zack Gibbs

Proposal meeting, Jan 2025 / Renewal discussion, Nov 2025

  1. Their demand forecasting process was, in their own words, “insane.” UrbanStems makes most of its revenue in two 2-week windows: Valentine’s Day and Mother’s Day (which is 5x V-Day). Their plan for what to buy, when, and how much was built on static guesses — not data.

    “The way that we do it today is insane, and it’s very rudimentary.” — Zack Gibbs, UrbanStems, Nov 2025 renewal transcript

  2. Their “forecast” was a budget allocation spreadsheet. Manually reconciled. Once a month. Maybe. Not a forecast — an estimate that got updated when someone remembered to update it. In a business with perishable inventory and massive seasonal swings, this is the difference between profit and write-offs.

    “It’s really just, like, a budget allocation exercise, kind of like, if you use YNAB or something, it’s kind of… that’s the type of forecast that you have, which is very much just, like, off estimates, very static, it’s not updated, unless I’m assuming somebody maybe does some manual reconciliation once a month.” — Robert Tseng, Brainforge (reviewing UrbanStems’ forecasting models), Nov 2025 renewal transcript

  3. They’d been burned by infrastructure changes close to peak seasons. So nothing ever changed. A rational fear: touch the pipeline before Mother’s Day, break something, lose millions. But the cost of that fear was years of technical debt compounding with every delay.

    “We’ve done things like that in the past, and it has come back to bite us like every time.” — Zack Gibbs, UrbanStems, Jan 2025 transcript

  4. Missing the forecast by 25–30% at their scale is not a rounding error. For a company generating the bulk of its revenue in two 2-week windows, a 30% miss means spoilage, stockouts, and margin destruction — every year.

    “For a company your size, if you’re kind of missing your forecast by, you know, 25, 30%, it’s pretty significant.” — Robert Tseng, Brainforge, UrbanStems renewal call, Nov 2025 transcript

  5. Their BI hire was a customer care rep — volunteered into the role by the CEO. Everyone knew it wasn’t going to work. Not a hiring mistake — a staffing gap. The company needed data capability and filled it with whoever was available, then watched the gap persist.

    “Emily was… kind of volunteered for this role by our CEO, and she was coming out of our customer care group. And so I knew that she was going to struggle with this assignment from the very beginning, and I didn’t know, and I told my CEO this, my boss, like… I don’t know if this is the right choice.” — Zack Gibbs, UrbanStems, Nov 2025 renewal transcript

  6. Fulfilled revenue went from zero to an actual number. A data integrity win that unlocked everything downstream — revenue attribution, subscription tracking, and reporting that leadership could actually trust. Emily’s reaction says it all.

    “Fulfilled revenue is a number that isn’t zero. I know, woo! That’s a huge win.” — Emily Giant, UrbanStems, Jan 2026 transcript


CTA — Katherine Bayless, SVP Data & Analytics

First analytics discovery conversation, Sept 2025

  1. Eight CRMs. Salesforce has the revenue — but not the data. CTA runs on a patchwork of systems from different eras and teams. The one tool everyone gravitates to — Salesforce — turns out to be almost empty from a data perspective.

    “We have 50-some different systems total. I’ve identified 8 core CRMs. One of them is Salesforce, but ironically… while that system contains the data that goes with the most of our revenue, it doesn’t actually have a lot of data data in it.” — Katherine Bayless, CTA, Sept 2025 transcript

  2. She pulled the plug on all the Power BI dashboards. Nobody noticed. That’s not an endorsement of the change — that’s evidence of how little those dashboards were actually being used. Years of BI investment, silently deprecated.

    “I pulled the plug on all the data that connects to them and nobody’s noticed yet, which I’m thinking gives us a lot of wiggle room.” — Katherine Bayless, CTA, Sept 2025 transcript

  3. Her team’s mandate is to “turn the dumpster fire into something beautiful.” She didn’t say it was bad and getting fixed. She called it a dumpster fire and named her team as the ones tasked with cleaning it up.

    “I’m a proper data ops team tasked with, you know, turn the dumpster fire into something beautiful.” — Katherine Bayless, CTA, Sept 2025 transcript

  4. Salesforce keeps trying to upsell them on Data Cloud. She’s not having it. A pattern every data team knows: the vendor solution to every problem is the vendor’s most expensive product. The team wants it. Katherine knows it wouldn’t solve anything.

    “If somebody tries to [sell] me Data Cloud one more time, I will punch them in the face.” — Katherine Bayless, CTA, Sept 2025 transcript

  5. Power BI is the incumbent. It’s staying through 2025. And she hates it. Not a choice — a constraint. The licenses exist, people are familiar, and migrating would be more disruptive than dealing with a tool she’s already checked out on.

    “I have no desire to stay on Power BI after 2025.” — Katherine Bayless, CTA, Sept 2025 transcript

  6. She already knows exactly what she wants: people ask questions in plain English and get answers where they work. Not dashboards. Not BI tools. Just: go to where you’re already working, ask what you need to know, get an answer. Katherine articulated the intelligent company vision unprompted — before Brainforge had even pitched it.

    “Rather than have them go and have to, like, self-serve in a dashboard that’s built to answer 10,000 different questions, why can’t we just have them go right where they’re already working and ask a natural question against the data and get the response they need?” — Katherine Bayless, CTA, Sept 2025 transcript


PoolPartsToGo (PP2G)

Completed engagement — Data & BI foundation

  1. ~80–90% discounts on UPS and FedEx rates. Savings in the hundreds of thousands of dollars. Once PP2G had a model that mapped every order to its actual cost and could forecast shipping spend with confidence, they walked into negotiation as the data authority. (From PP2G case study)

  2. Shipping cost model delivered in weeks, not months. Before: no contract visibility, no forecasting capability, reactive decisions. After: a working SQL model, forecasting across growth scenarios, and a seat at the negotiating table. (From PP2G case study)

  3. Dashboards back in active daily use — leadership makes decisions on current data. Before: scattered data, stale reports, executives making calls on gut feel. After: a unified dashboard used by leadership and operations for daily KPIs, pricing decisions, and campaign planning. (From PP2G case study)

  4. Customer Q&A chatbot (pptg.ai) built and deployed in ~2 weeks. Full POC — scraping, UI, pipelines, QA, handoff — in 14 days. Support load reduced. Branded, fast, accurate answers on demand. (From PP2G case study)


Agency Intelligence — Measured & Demo-Validated Outcomes

From agency demos, post outlines, and engagement stats — all grounded in documented workflows

  1. Brief creation: 4–5 hours per brand → minutes. First draft is 75% complete before a human touches it. Strategists now handle the quarter that needs judgment — tone, refinement, strategic nuance. The three-quarters that used to be manual drafting runs itself.

    “Brief creation was taking 4–5 hours. This produces a first draft in under a minute — grounded in your actual brand context, not a blank template.” — D&G demo script, live demo outcome

  2. Brief volume: ~10/week → 30+/week with the same team. When output is no longer bottlenecked by manual production time, throughput triples. Same people, same hours — more output, better quality. (From campaign brief stats, D&G demo prep)

  3. Meeting prep: 30–40 minutes per client → 5–7 minutes. The pre-call scramble — opening 5 tools, searching Slack, pulling last quarter’s report — replaced by a single natural language query that pulls from every connected source and responds in seconds.

    “This is what your account team currently does manually in 30–40 minutes. It just happened in 10 seconds — and every answer links back to the source.” — D&G demo script, live demo outcome

  4. 120+ hours saved per month. At ~$60–80/hour, that’s the value of one and a half full-time employees — recovered without adding headcount. (From D&G demo prep stats and agency campaign brief)

  5. A campaign status question that used to require three Slack pings and a team meeting — now instant, with the bottleneck surfaced automatically. Not a status update. An analysis. Pulled from Asana + Slack: “What’s the state of this campaign, and what’s behind schedule?” Answered in seconds.

    “This is the question that currently requires three Slack pings and a team meeting to answer. It’s instant now — and the Copilot surfaces the bottleneck, not just the status.” — D&G demo script, live demo outcome

  6. Reports on autopilot — delivered to Slack on schedule, with narrative insight, not just numbers. “Revenue up 12% vs. last month, driven by strong new-customer segment.” Not a number dump — an analysis. Delivered to the right Slack channel every Friday. No human hours required. (From agency intelligence demo script and post outline)

  7. Onboarding time cut from days to questions. New team members used to spend days reading docs and digging through Slack to learn a client. Now they ask: “What were this brand’s top campaigns?” and get cited answers in seconds. (From agency intelligence post outline, grounded in client demos)
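
The hours-saved figures above reduce to simple arithmetic. A minimal sketch of the conversion in Python; the 120 hours/month and $60–80/hour figures come from the demo prep stats above, and everything else is plain multiplication:

```python
# Convert hours saved into a monthly dollar range.
# 120 hours/month and $60-80/hour are from the demo prep stats above.

hours_saved_per_month = 120
rate_low, rate_high = 60, 80  # USD per hour

value_low = hours_saved_per_month * rate_low    # recovered value, low end
value_high = hours_saved_per_month * rate_high  # recovered value, high end

print(f"${value_low:,}-${value_high:,} per month")  # $7,200-$9,600 per month
```

That $7,200–$9,600/month range is the number campaign messaging can anchor on without restating the hourly assumptions.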


Third Bridge — Projected Outcomes

From post-call analysis and comparable engagements

  1. Lead scoring: 5–10 minutes per profile → under 30 seconds. Manual triage — reading, evaluating, routing — replaced by an AI scoring system using the firm’s own criteria. High-value people stop spending time on qualification. (From Third Bridge call prep — comparable engagement stats)

  2. 69 profiles qualified in minutes using AI scoring built around the client’s ICP criteria. Not a batch process done overnight. Real-time qualification as profiles come in. (From comparable engagement stats, documented in Third Bridge call prep)

  3. 20–30 qualified conversations per month — generated by a system that routes and prioritizes automatically. The pipeline doesn’t wait on a human to sort it. The right people get the right attention at the right time. (From comparable engagement stats, documented in Third Bridge call prep)


Clarence — Private Cloud Platform & Internal Strategy

From strategy session, Feb 24, 2026

  1. The tools aren’t the problem. The infrastructure underneath them is. Companies buy the right tools and still fail. Clarence’s core observation after years of watching AI implementations collapse: the application layer was fine — the foundation was missing.

    “It’s not an application problem, it’s an infrastructure problem.” — Clarence, strategy session (documented in meeting notes)

  2. Fifteen SaaS tools that each force your business to conform to their structure is not a system. It’s chaos with subscriptions. The default behavior — buy another $15/month tool, fit your process into its interface, repeat — is the enemy.

    “Stop buying general SaaS tools that force your business to conform to their structure.” — Clarence, strategy session (documented in meeting notes)

  3. Brief creation: 5 hours → 30 minutes. ROI case: easily more than $1 million over the lifetime of the engagement. Clarence’s before/after methodology: pick a workflow, measure the time, multiply across the team. The math closes itself. (From Clarence strategy session notes)

  4. “Sessions don’t cross over.” Client A’s data cannot be seen or referenced by Client B. Full data isolation by design. For companies managing multiple client brands or sensitive proprietary data, this isn’t a nice-to-have — it’s a requirement that most shared SaaS environments can’t meet. (From Clarence strategy session — platform architecture)

  5. One switch to privacy. Moving from shared to private deployment is an infrastructure toggle — not an application rebuild. Companies that assumed private AI deployment meant a 12-month enterprise project and a seven-figure price tag are wrong. The barrier is infrastructure, not cost. (From Clarence strategy session — platform architecture)
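
Clarence’s before/after method above can be sketched as a short calculation. The 5-hour before and 30-minute after figures are from the session notes; the throughput, rate, and working weeks below are illustrative assumptions, not client numbers:

```python
# Clarence's ROI method: pick a workflow, measure the time,
# multiply across the team. Before/after hours are from the session
# notes; everything else is an illustrative assumption.

hours_before = 5.0       # brief creation, before (session notes)
hours_after = 0.5        # brief creation, after (session notes)
briefs_per_week = 10     # assumed team throughput
weeks_per_year = 48      # assumed working weeks
hourly_rate = 75         # assumed blended rate, USD

hours_saved = (hours_before - hours_after) * briefs_per_week * weeks_per_year
annual_value = hours_saved * hourly_rate

print(f"{hours_saved:.0f} hours/year, ~${annual_value:,.0f}/year")
```

Even with these conservative assumptions, the annual value compounds over a multi-year engagement toward the seven-figure lifetime number Clarence describes.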


Internal Operations — Brainforge Team

From founding ops team mission and internal planning documents

  1. “Deliver 20% faster and 20% better with each increment of growth, using a systems and AI-first mindset.” The founding ops team’s own operating principle. Intelligence compounds. The system gets better as it learns. The team gets faster as the system handles more. (From Luke’s OKR and metrics document)

How We Get You There

This section is Luke’s voice. It is the manifesto — the belief system, not the feature list.

[DRAFT PENDING — Luke’s voice only.]


Document created: 2026-02-27 | Last updated: 2026-02-26 | Owner: Luke Scorziell | Source: forge-your-brain-context.md Research compiled from vault transcripts (raw Fathom/Fireflies), meeting notes, case studies, and demos. Sources: Kelley Weaver discovery (Feb 2026), Joshua Dent discovery/demo prep (Feb–Mar 2026), Third Bridge call (Feb 2026), LiloSocial discovery transcript (Nov 2025) + project sync (Feb 2026), ABC Home and Commercial transcript (Feb 2025) + Andy feedback session (Jul 2025), Plan Medicare transcript (Jan 2026), Breezy transcript (Dec 2025), StackBlitz transcript (Dec 2024) + renewal regroup (Mar 2025), Inteleos intro (Jan 2026), UrbanStems proposal and renewal transcripts (Jan & Nov 2025) + data troubleshooting (Jan 2026), CTA analytics discovery transcript (Sept 2025), PP2G case study, D&G demo scripts, agency intelligence post outlines, Clarence strategy session (Feb 2026), Luke’s OKR and metrics document.