Meeting Title: PJ AI Strategy Planning
Date: 2025-09-10
Meeting participants: Clarence Stone, Uttam Kumaran


WEBVTT

1 00:00:38.930 00:00:39.890 Clarence Stone: Hello?

2 00:07:42.560 00:07:43.650 Uttam Kumaran: Yo, dude.

3 00:07:46.810 00:07:48.060 Uttam Kumaran: You’re on mute.

4 00:07:49.520 00:07:52.130 Clarence Stone: Good call.

5 00:07:52.350 00:07:53.030 Uttam Kumaran: Faye.

6 00:07:53.340 00:07:54.969 Uttam Kumaran: I’m good, just gonna get another…

7 00:07:54.970 00:07:56.250 Clarence Stone: library again?

8 00:07:56.400 00:07:58.070 Uttam Kumaran: No, coffee shop.

9 00:07:58.070 00:07:59.080 Clarence Stone: Oh, nice!

10 00:07:59.170 00:08:02.879 Uttam Kumaran: Just gonna get another coffee, but yeah, dude, the news is crazy right now.

11 00:08:02.880 00:08:04.300 Clarence Stone: Dude, I…

12 00:08:04.300 00:08:04.880 Uttam Kumaran: What the hell?

13 00:08:04.880 00:08:08.090 Clarence Stone: I believe he’s in the hospital and he’s alive.

14 00:08:08.540 00:08:10.440 Uttam Kumaran: No way, yeah, dude, I just saw it, man.

15 00:08:10.440 00:08:16.580 Clarence Stone: From Nick Sorter saying that he’s in some hospital in Utah, and he’s alive.

16 00:08:17.270 00:08:18.360 Uttam Kumaran: Shit.

17 00:08:18.860 00:08:20.659 Uttam Kumaran: I mean, it’s insane.

18 00:08:21.580 00:08:26.220 Uttam Kumaran: You kind of, like, always ex… you kind of always expect…

19 00:08:26.550 00:08:29.320 Uttam Kumaran: This sort of stuff to happen, but, like…

20 00:08:29.730 00:08:32.049 Uttam Kumaran: I mean, it’s insane that, like.

21 00:08:32.960 00:08:36.690 Uttam Kumaran: The derangement is, like, getting extremely violent. It’s crazy.

22 00:08:37.640 00:08:50.799 Clarence Stone: And, like, the TikTok account that talks about liberals on X is posting all of these kids on TikTok, like, celebrating over what just happened. It’s, like, devastating to see.

23 00:08:50.800 00:08:57.739 Uttam Kumaran: Yeah, I mean, those people will get arrested. It’s like, their morals. It’s insane.

24 00:08:59.950 00:09:04.259 Uttam Kumaran: Yeah, I just, I just, like, don’t get… I don’t know.

25 00:09:04.520 00:09:08.729 Uttam Kumaran: It’s like senseless violence. Just don’t go to the thing, I don’t agree with him.

26 00:09:21.040 00:09:22.710 Uttam Kumaran: How’s the day going otherwise?

27 00:09:24.140 00:09:30.200 Clarence Stone: Good! I hope Hannah doesn’t mind the fact that I have a bunch of typos.

28 00:09:30.200 00:09:36.569 Uttam Kumaran: No, dude, that’s like the… that’s the depth of feedback that was amazing. As you can tell, it’s our first stab into…

29 00:09:36.770 00:09:41.220 Uttam Kumaran: consulting… Like, vaporware.

30 00:09:41.720 00:09:47.699 Uttam Kumaran: Lead magnets, so… Yeah. I knew, I was like, if something seems, like, not perfect here, and I was like.

31 00:09:48.140 00:09:52.230 Uttam Kumaran: Clarence probably seen a million of these, so… hold on, give me one second, let’s order a copy.

32 00:10:26.550 00:10:32.180 Uttam Kumaran: Yeah, no, it was actually super, super helpful. Like, I think we’re…

33 00:10:32.330 00:10:51.319 Uttam Kumaran: We have someone on our team that’s doing these, like, SME interviews, so we’re interviewing, like, vendor partners, and, like, subject matter experts in different industries, so we can sort of build, like, a reusable asset and a blog post, and then start to build, like, a campaign around. Yeah.

34 00:10:51.570 00:11:11.490 Uttam Kumaran: So, this is one that we’re doing around, I believe he should’ve… I think he sent you the auto one, but we’re generating probably, like, another two to four of these, which are just, like, great assets, and we’re using AI, so we’re going and interviewing people, taking that output, generating these. The nice thing is they can co-brand it.

35 00:11:11.650 00:11:16.680 Uttam Kumaran: I promote it as well, so… Yeah, that’s sort of the…

36 00:11:17.150 00:11:18.810 Uttam Kumaran: It’s kind of the culmination of that.

37 00:11:18.810 00:11:30.269 Clarence Stone: I mean, I can say that it is very well written. It’s well above the bar of the average one I’ve seen, so I hope, at the very least, Hannah, you know, takes some confidence in the quality.

38 00:11:30.270 00:11:31.240 Uttam Kumaran: Frank, crank, crank.

39 00:11:34.540 00:11:38.709 Uttam Kumaran: Okay, cool, so how should we, how should we proceed?

40 00:11:38.710 00:11:39.420 Clarence Stone: No.

41 00:11:39.420 00:11:46.259 Uttam Kumaran: I kind of read… I read the thing a little bit, but you tell me what I can answer, or what we can brainstorm, and then, yeah, let’s… let’s go.

42 00:11:46.260 00:11:54.359 Clarence Stone: So, that one I sent you last night, I didn’t realize it sent to the wrong person, so that… that is brand new, net new work that…

43 00:11:54.360 00:11:54.890 Uttam Kumaran: Sure.

44 00:11:54.890 00:11:57.360 Clarence Stone: I’ll probably get on the US side.

45 00:11:57.740 00:12:02.880 Clarence Stone: Claire’s a partner that just pings me all the time with ideas like that.

46 00:12:03.330 00:12:04.700 Clarence Stone: So… Okay.

47 00:12:04.700 00:12:23.389 Clarence Stone: just an example of the kind of shit that, you know, comes across my desk, but I will give you the complete breakdown. So my goal here really is to see if we can come up with, like, a good narrative, or a good talk track, or maybe you have some ideas on how we can present, you know, what

48 00:12:23.470 00:12:43.359 Clarence Stone: PJ is looking for, and just, you know, I told you a little about it yesterday, that PJ is really looking for vision, and a strategy on how he can build out this AI arm, within his tax technology transformation organization in Canada. He’s, like, the lead partner there, so…

49 00:12:43.600 00:13:05.410 Clarence Stone: following that model that, like, he did for data and analytics, he wants to do the same thing with AI, and the first thing that he kind of presented to me was him playing around with Vicinity and coming up with a vision of what, you know, he thinks would be valuable to C-suite stakeholders, so I’m going to share that, and I’ll send you this deck after all.

50 00:13:05.410 00:13:06.010 Uttam Kumaran: Okay.

51 00:13:06.010 00:13:15.430 Clarence Stone: So you can look back at it, but it’s awesome to see somebody’s AI journey, because he used AI to create, you know, this,

52 00:13:15.710 00:13:21.379 Clarence Stone: you know, refined narrative of this C-suite AI-powered decision intelligence tool.

53 00:13:21.700 00:13:24.750 Clarence Stone: And the gist of it is.

54 00:13:24.860 00:13:37.450 Clarence Stone: you know, we… we work in tax, and most of the time, it seems like it’s just a service that we ship away and, you know, we file documents, but what EY actually does is help

55 00:13:37.670 00:13:53.619 Clarence Stone: these leaders make better decisions because we give them intelligence into their financials and the position that the company is in in the current state, right? So, though this looks a lot like it doesn’t touch tax, it very much is.

56 00:13:53.620 00:13:54.030 Uttam Kumaran: Yes.

57 00:13:54.030 00:14:06.729 Clarence Stone: sort of the background, right? But PJ’s vision is to be able to create a tool that takes all the data we already have sitting in, a, you know.

58 00:14:07.070 00:14:13.440 Clarence Stone: cloud environment. Most of the time, it’s probably something like a, like…

59 00:14:14.700 00:14:27.980 Clarence Stone: Azure databases, like, or Snowflake, and be able to leverage that for these executives to have meaningful conversations that will help them make more informed decisions.

60 00:14:27.980 00:14:28.860 Uttam Kumaran: Yes.

61 00:14:29.170 00:14:51.160 Clarence Stone: So, long story short, this is an agent with all the context of the BI intelligence that a Snowflake environment reporting environment would have, right? It’s gonna use that as well as, you know, some tool plugins to understand current events, and then provide some responses or insights so that these CEOs, they’re not

62 00:14:51.160 00:14:59.839 Clarence Stone: asking AI to make the decision, but rather their scenario playing through multiple outcomes, and what would happen if they did

63 00:14:59.840 00:15:00.710 Clarence Stone: something.

64 00:15:01.520 00:15:02.350 Clarence Stone: Right.

65 00:15:03.080 00:15:03.830 Uttam Kumaran: Yes.
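
The agent Clarence describes here, warehouse BI context plus tool plugins for current events, feeding scenario exploration rather than decisions, could be sketched roughly as follows. All names, metrics, and data are hypothetical stand-ins; a real build would wire these tools to Snowflake/Azure and a news API behind an LLM tool-calling loop.

```python
# Minimal sketch of the tool-plugin pattern: an "agent" that answers a scenario
# question by combining (stubbed) warehouse metrics with a (stubbed)
# current-events tool, then returns assembled evidence for a human to weigh.
from dataclasses import dataclass
from typing import Callable

# Hypothetical stand-in for metrics in the reporting environment.
WAREHOUSE = {
    "revenue_q2": 1_200_000.0,
    "revenue_q3": 1_350_000.0,
}

def bi_lookup(metric: str) -> float:
    """Tool 1: read a metric from the (stubbed) BI/warehouse layer."""
    return WAREHOUSE[metric]

def current_events(topic: str) -> str:
    """Tool 2: stand-in for a news/current-events API plugin."""
    return f"(stub) no major events found for '{topic}'"

@dataclass
class Tool:
    name: str
    fn: Callable

class ScenarioAgent:
    """Routes a scenario question to tools and assembles context for review.

    Deliberately does not 'decide' anything -- it returns the evidence so the
    executive can play through outcomes themselves.
    """
    def __init__(self, tools):
        self.tools = {t.name: t for t in tools}

    def run(self, question: str) -> dict:
        q2 = self.tools["bi"].fn("revenue_q2")
        q3 = self.tools["bi"].fn("revenue_q3")
        news = self.tools["news"].fn("widgets market")
        growth = (q3 - q2) / q2
        return {
            "question": question,
            "evidence": {"revenue_q2": q2, "revenue_q3": q3, "news": news},
            "observation": f"QoQ revenue growth {growth:.1%}",
        }

agent = ScenarioAgent([Tool("bi", bi_lookup), Tool("news", current_events)])
report = agent.run("What if we expand widget production next quarter?")
print(report["observation"])
```

The point of the shape is the separation: tools expand the context, and the output is evidence plus an observation, not a recommendation.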

66 00:15:06.150 00:15:12.829 Uttam Kumaran: So I have, I have something that’s, like… yeah, I have something that’s actually a very similar…

67 00:15:12.970 00:15:16.169 Uttam Kumaran: That we thought about as, like, a long tail for…

68 00:15:16.970 00:15:25.329 Uttam Kumaran: like, an AI, sticky AI offering. Let me just bring it up… let me just get you the notion, like, that I’ll… I’ll share it with you, because I think it’s honestly pretty…

69 00:15:25.510 00:15:29.619 Uttam Kumaran: It’s pretty similar, let me see if I can find it,

70 00:15:54.110 00:15:55.930 Uttam Kumaran: Yeah, that’s just not good.

71 00:16:02.760 00:16:04.900 Uttam Kumaran: Is it just clarence at vicinity?

72 00:16:05.630 00:16:06.650 Clarence Stone: Yeah.

73 00:16:06.950 00:16:08.540 Clarence Stone: Trustvicinity data.

74 00:16:10.030 00:16:11.589 Clarence Stone: Oh, I’ll send you.

75 00:16:12.970 00:16:14.090 Clarence Stone: I’ll ping it to you.

76 00:16:31.860 00:16:36.430 Uttam Kumaran: Okay, so I sent you two links, and I think this conversation can go, like.

77 00:16:36.730 00:16:40.210 Uttam Kumaran: two ways. So, one is,

78 00:16:41.810 00:16:49.159 Uttam Kumaran: One is, I think, totally, I think we should talk about this, like, application layer. I think there also is a lot of,

79 00:16:49.350 00:16:57.960 Uttam Kumaran: Sort of, how do you structure, like, even the teams around executing AI-related projects.

80 00:16:58.320 00:17:02.100 Uttam Kumaran: So, like, I think we can kind of… you can kind of go into both directions.

81 00:17:02.870 00:17:18.869 Uttam Kumaran: Yeah, yeah, so it’s just like… and even in my business, like, we are deploying AI, but there’s also innovation happening because of… there’s, like, business model and organizational innovation happening. So I kind of want to separate those two, because you can go do the same tax stuff.

82 00:17:19.079 00:17:20.460 Uttam Kumaran: Cheaper and leaner.

83 00:17:20.740 00:17:22.850 Uttam Kumaran: But then you can also do AI stuff.

84 00:17:23.220 00:17:36.860 Uttam Kumaran: the same way, or in a different way, so we kind of separate those two. The first document I sent is sort of, like, an exploration that I did, I don’t know, this may have been, like, 4 or 5 months ago, kind of about, like,

85 00:17:38.000 00:17:54.180 Uttam Kumaran: basically, I was trying to envision a way for us that once in my company, we end up finding all of these assets, which can be decisions around data, unstructured structured data, meeting notes, project plans.

86 00:17:54.350 00:18:10.009 Uttam Kumaran: You actually build a really great, rich context repository for, probably someone that’s closer to, like, a revenue, like a revenue, optimization or revenue acceleration, like, analyst.

87 00:18:10.030 00:18:16.890 Uttam Kumaran: Right? Someone that is, like, on the hunt to find money. And so one of the ways that this…

88 00:18:17.410 00:18:36.920 Uttam Kumaran: you know, what I kind of wrote part of, like, a lot of this is deep research, and it’s kind of just, like, me exploring, is, like, okay, one of the things that I wanted to look at is, like, could we actually leave our clients with, like, a revenue analyst agent? And what does that revenue analyst do?

89 00:18:36.920 00:18:37.310 Clarence Stone: Yeah.

90 00:18:37.310 00:18:48.699 Uttam Kumaran: it maybe monitors a bunch of dashboards, it looks through a bunch of contracts, and it finds opportunities. And there’s, like, levels to it, right? And this is what kind of, like.

91 00:18:48.760 00:19:06.949 Uttam Kumaran: also informed my talk, which is there’s one level that’s just, like, can I ask somebody questions? Another level that something’s, like, almost, like, finding opportunities, right? It’s almost ambiently looking, or it’s looking at stuff that’s, like, on a time series, and not only coming up with ideas, but then, like.

92 00:19:07.260 00:19:09.350 Uttam Kumaran: Doing scenario analysis, and…

93 00:19:09.350 00:19:09.730 Clarence Stone: and…

94 00:19:10.080 00:19:22.710 Uttam Kumaran: like, there are companies that do, like, pricing intelligence or revenue analytics. It’s sort of just like, hey, you want to come in and find money? Like, can we price things better to increase

95 00:19:22.750 00:19:41.259 Uttam Kumaran: you know, revenue, top line revenue. So it’s just, like, kind of… this is all questions that, in my world, you ask a revenue analyst. You ask a growth analyst or a growth ops person, like, hey, go find, like, how we should roll out our next service so that we do it cheaper, and there’s a bigger splash. Like, how do we…

96 00:19:41.280 00:19:54.579 Uttam Kumaran: how do we look at our existing client base and run a cohort analysis, or retention curves, or do LTV analysis? So, I think it’s kind of on… this… I feel like what’s kind of similar in that

97 00:19:54.730 00:20:14.379 Uttam Kumaran: One, I think there is interesting, like, business models that you can do, where if your agent assists in the finding of some amount of money, you could get a piece of that. That is also all, like, pretty easily logged, so you could understand, like, hey, if an agent contributed to a decision that made some money.

98 00:20:14.490 00:20:22.400 Uttam Kumaran: There’s also, like, an ability for you to pair… ideally, in my business, what this should start as is, like, we have these analysts already.

99 00:20:22.400 00:20:35.779 Uttam Kumaran: they would start using the agent to do their work, and then eventually, I think, we’ll be in a position to just actually sell the agent as a service. You know, and have that… have that be the lead behind where

100 00:20:35.930 00:20:49.920 Uttam Kumaran: we do all this data modeling, we do all this, like, upfront work to organize all your context, and then you have this agent that is now something you just, like, pay for. Honestly, it’s probably closer to paying for, like, a full-time person. It’s just probably cheaper, you know?

101 00:20:49.920 00:20:51.520 Clarence Stone: And, and he was…

102 00:20:51.520 00:20:52.140 Uttam Kumaran: So, like, I don’t know.

103 00:20:52.140 00:20:52.560 Clarence Stone: I think this is.

104 00:20:52.560 00:20:53.850 Uttam Kumaran: Probably, like, a… yeah.

105 00:20:54.100 00:21:13.960 Clarence Stone: Because he already has a lot of the financial data, right? So his lever now is like, hey, you’ve already given us your historical financials, we have direct links into your source systems, because we’re filing your taxes and advising you, so how about this tool? It’s supposed to be another bolt-on on top of it, right?

106 00:21:14.360 00:21:16.370 Clarence Stone: So, for him.

107 00:21:16.370 00:21:23.140 Uttam Kumaran: Yeah, I think the biggest thing is that you just want the scope to be very specific. So if you look at, like, for example.

108 00:21:23.310 00:21:28.269 Uttam Kumaran: one of the examples I have here is, like, Output me 3 revenue plays.

109 00:21:28.330 00:21:46.070 Uttam Kumaran: Based on, like, patterns you see in the data. One of them could be, hey, there are 5 clients in your current client base that look like they could be ramped… they could be ripe for expansion, based on public market signals, based on their usage patterns, based on sales calls we’ve had with them.

110 00:21:46.110 00:21:48.880 Uttam Kumaran: Alright, so what, basically, you have to build, like.

111 00:21:48.960 00:21:53.480 Uttam Kumaran: there’s a lot of, like, AI logic, but you want it to be sort of specific on, like.

112 00:21:54.030 00:22:12.829 Uttam Kumaran: Finding revenue is where we’re really, really focused on, because these are all, like, revenue opportunity finding agents. It could also be, like, hey, look through contracts and find people that are breached, or, like, opportunities to collect, or I don’t know, I feel like there’s… there’s a lot there.

113 00:22:13.060 00:22:17.840 Uttam Kumaran: But we kind of centered around, like, revenue optimization.

114 00:22:18.100 00:22:24.520 Uttam Kumaran: Upsell, renewal, retention, revenue per unit, because there are very key, fixed KPIs.

115 00:22:24.630 00:22:28.209 Uttam Kumaran: And there are many companies that their entire business is, like.

116 00:22:28.400 00:22:35.429 Uttam Kumaran: data science around, like, revenue analytics. Yeah. Like, they’re the best at taking, like, a user table.

117 00:22:35.560 00:22:43.300 Uttam Kumaran: With a bunch of dimensions and characteristics and building you, like, the best cohorting chart for you to test pricing and things like that, so…

118 00:22:43.590 00:22:51.250 Uttam Kumaran: Certainly, I think there’s a lot of room in this world, for sure.
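
The cohort analysis Uttam mentions, taking a user table with signup dates and activity and building retention curves per cohort, can be sketched in a few lines. The user table below is made-up illustration data.

```python
# Minimal cohort-retention sketch: group users by signup month, then compute
# the fraction of each cohort still active N months after signup.
from collections import defaultdict

# (user_id, signup_month, months_active) -- hypothetical user table
users = [
    ("u1", "2025-01", ["2025-01", "2025-02", "2025-03"]),
    ("u2", "2025-01", ["2025-01", "2025-02"]),
    ("u3", "2025-01", ["2025-01"]),
    ("u4", "2025-02", ["2025-02", "2025-03"]),
    ("u5", "2025-02", ["2025-02"]),
]

def retention_curves(users):
    """Fraction of each signup cohort active N months after signup."""
    months = sorted({m for _, _, active in users for m in active})
    index = {m: i for i, m in enumerate(months)}
    counts = defaultdict(lambda: defaultdict(int))
    sizes = defaultdict(int)
    for _, signup, active in users:
        sizes[signup] += 1
        for m in active:
            counts[signup][index[m] - index[signup]] += 1
    return {
        cohort: {off: n / sizes[cohort] for off, n in sorted(offs.items())}
        for cohort, offs in counts.items()
    }

curves = retention_curves(users)
# 2025-01 cohort: 3 users at month 0, 2 at month 1, 1 at month 2
print(curves["2025-01"])
```

The same per-cohort table is the input for pricing tests, LTV estimates, and the retention-curve charts Uttam refers to.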

119 00:22:51.250 00:23:10.510 Clarence Stone: Yeah, and I think you’re on the nose here. Like, the thing that I told PJ was going for the CEO role is probably the hardest thing to do, because you’re gonna have to cross-cut through a bunch of different perspectives, but if we focus in on someone like a CFO, their goals are more…

120 00:23:10.510 00:23:12.430 Uttam Kumaran: You know, yeah.

121 00:23:12.430 00:23:18.299 Clarence Stone: Right? Their priorities are more obvious, and you’re gonna be able to get some better outputs.

122 00:23:18.300 00:23:22.820 Uttam Kumaran: You can’t build this from top down. It has to be piecemeal from the bottom.

123 00:23:22.950 00:23:25.089 Uttam Kumaran: But it’s, like, what starts as, like.

124 00:23:25.260 00:23:29.189 Uttam Kumaran: One workflow that gets augmented, then turns into, like, a tool.

125 00:23:29.440 00:23:39.920 Uttam Kumaran: that tool gets run by an agent. That agent affects the multiple tools. That agent is one of many agents that gets orchestrated, and then it sort of, like, compounds from there.

126 00:23:40.160 00:23:44.349 Uttam Kumaran: But, yeah, but, like, you, but you’re gonna need things like.

127 00:23:44.670 00:23:52.829 Uttam Kumaran: search, you’re gonna need things like something that can… you’re gonna also need the ability to look at all this different data, and also, like.

128 00:23:53.420 00:23:59.739 Uttam Kumaran: you’re gonna have to provide it with some amount of, like, prioritization and keep a human in the loop. Like, it’s a pretty big thing to build.

129 00:23:59.740 00:24:00.070 Clarence Stone: Yep.

130 00:24:00.070 00:24:19.939 Uttam Kumaran: The other thing is there are some great products in market, I think, for you to consider, especially on the data analysis side. Like, there are… there’s a tool called Omni that is, like, really blowing up. They are, like, a really, like, AI-powered, data analysis tool. There’s a bunch of people that built Looker. They went and started this company.

131 00:24:20.150 00:24:23.230 Uttam Kumaran: They’re, like, gonna become really, really massive.

132 00:24:23.360 00:24:32.210 Uttam Kumaran: So I think it also, part of it matters, like, that their infrastructure and their data pipelines are, like, set up appropriately. Otherwise, like, you’re kind of cucked.

133 00:24:32.350 00:24:42.449 Uttam Kumaran: You know, because for all of these things, like, people are not going to blindly trust these decisions. You need to back into queries that can get run, or, like.

134 00:24:42.620 00:24:45.600 Uttam Kumaran: Show your work, or scenarios, like…

135 00:24:46.270 00:24:48.849 Uttam Kumaran: It’s just a… it’s not a,

136 00:24:49.770 00:24:55.889 Uttam Kumaran: oh, here’s the answer, like, go do this. Like, it almost has to be tailor-fit towards, like.

137 00:24:56.500 00:25:07.400 Uttam Kumaran: who’s gonna, in the execution chain, is it, like, because someone on marketing has to go do this, now the exec team, we have to host these three meetings. Like, I don’t know, there’s just a lot to diagnose here.

138 00:25:07.760 00:25:09.579 Clarence Stone: Yep. So, I mean…

139 00:25:09.980 00:25:27.510 Clarence Stone: I left PJ with the clear expectation that there would be a lot of work involved in context engineering to do this successfully, so I think he has a good, solid understanding of, like, the burden of setting up these pipelines.

140 00:25:27.510 00:25:39.870 Clarence Stone: And what that conversation eventually just led to, and I just made this slide to, like, reiterate it to him, that, like, the first iteration is just going to show one single concept, one single use case.

141 00:25:39.870 00:25:51.090 Clarence Stone: using, you know, an already predetermined set of demo data, and some, you know, web search APIs or maybe current events APIs to prove that we can add tooling

142 00:25:51.090 00:26:08.640 Clarence Stone: To expand the context of an AI, and that would just be a demo that he can take and actually start, you know, selling on his side, so that it, you know, when he gets somebody that bites, we can actually continue to build, you know, that mature product for them.

143 00:26:08.640 00:26:14.730 Uttam Kumaran: So one step I would even go this further is to, like, maybe even focus on a metric.

144 00:26:15.010 00:26:15.920 Uttam Kumaran: Like…

145 00:26:16.130 00:26:24.639 Uttam Kumaran: for example, in that Notion doc, I’ve listed several sort of metrics that are, like, interesting to optimize for.

146 00:26:24.790 00:26:32.819 Uttam Kumaran: that most companies have. Renewal rates, revenue retention, churn, upsell expansion, revenue per user, cash.

147 00:26:33.120 00:26:37.739 Uttam Kumaran: I would almost think that he made… Understand this, is that, like.

148 00:26:37.980 00:26:42.170 Uttam Kumaran: Your agent is literally just trained on optimizing a KPI.

149 00:26:42.340 00:27:01.170 Uttam Kumaran: Which I feel like is also… it’s just, in itself a struggle for many people, and naturally can fit right into, like, what these C-suites prioritize, which is already OKRs tied to KPIs. So starting with something that’s just focused on ARPU or CAC, it can be specific to whatever industries that they’re selling into.

150 00:27:01.190 00:27:12.339 Uttam Kumaran: I think is a good way, because the input data into identifying one metric and the drivers is a lot easier than just, like, revenue.

151 00:27:12.630 00:27:23.510 Uttam Kumaran: If you’re like, okay, cool, they’re gonna have a definition somewhere, and that means there’s a query, and that query runs with some data, that data comes from somewhere, so you can… you can at least know that

152 00:27:23.800 00:27:35.659 Uttam Kumaran: a human can identify… can do this analysis, which then gives us the ability to start to decompose that. If you’re, like, grow revenue broadly, can a human do that? Like, yes, but, like.

153 00:27:35.690 00:27:47.510 Uttam Kumaran: it’s in any meaningful timeframe, no, but if you’re like, hey, can you measure and provide, like, 3 opportunities to… to improve CAC? Okay, that is, like, something I think can get accomplished in a timeframe that is reasonable.
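
Uttam's "one metric, decomposed" idea, a definition, a query, data it runs on, can be made concrete for CAC: define blended CAC from its query-able inputs (spend and new customers per channel), then have the agent surface the channels dragging it up. The channel figures below are hypothetical.

```python
# Sketch of a KPI-focused "revenue opportunity" step: compute blended CAC from
# per-channel inputs, then rank channels by per-channel CAC to suggest where to
# look first. All numbers are made up for illustration.

channels = {
    # channel: (marketing_spend, new_customers)
    "paid_search": (50_000.0, 200),
    "events":      (30_000.0, 40),
    "referral":    (5_000.0, 60),
}

def blended_cac(channels):
    """CAC = total acquisition spend / total new customers."""
    spend = sum(s for s, _ in channels.values())
    customers = sum(c for _, c in channels.values())
    return spend / customers

def cac_opportunities(channels, top_n=3):
    """Rank channels by per-channel CAC; the worst are the opportunities."""
    per_channel = {name: s / c for name, (s, c) in channels.items()}
    return sorted(per_channel.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

print(f"blended CAC: ${blended_cac(channels):,.2f}")
for name, cac in cac_opportunities(channels):
    print(f"  {name}: ${cac:,.2f} per customer")
```

This is the sense in which one metric is tractable: every term traces back to a query a human analyst could already run, which is what lets the work be decomposed and checked.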

154 00:27:48.180 00:28:03.260 Uttam Kumaran: And that could be a great MVP. And again, it’s gonna take this whole chain. It’s gonna take you having a really clear understanding of, like, yeah, the fabric, like, the Microsoft fabric environment of, like, where this data comes from. You’re gonna have to build either, like, a…

155 00:28:03.760 00:28:07.139 Uttam Kumaran: Some type of something that can, like, query a database.

156 00:28:07.200 00:28:17.719 Uttam Kumaran: use some semantic understanding of the data, but then you’ll be able to test the various things of, like, cool, let’s say we have a dummy data set of, like, 5 meetings about CAC.

157 00:28:17.770 00:28:26.609 Uttam Kumaran: We have, like, pretty clear data models that show CAC the core marketing spend areas, the core acquisition sources.

158 00:28:26.660 00:28:39.379 Uttam Kumaran: and you have a time series that’s in a really clean database, you know, and maybe we have a couple of other, like, sort of, like, reasonable data sources, like, okay, what is a goal for CAC?

159 00:28:39.530 00:28:57.790 Uttam Kumaran: what are past analyses that have been… like, that’s all dummy data, synthetic stuff we can generate. Then you can pretty easily produce a demo that shows, like, hey, if an agent had all this, could it… could it realize relevant information? Another thing that just happened this week, dude, is the new Claude

160 00:28:59.430 00:29:07.450 Uttam Kumaran: The new Claude can do a lot of crazy data analysis shit out of the box that wasn’t possible before.

161 00:29:07.500 00:29:22.029 Uttam Kumaran: And that just happened this week. I think your sell may… may be even better showing some of that, and I can send you some of the tweets I’ve been reading today. But people are now using it to do entire DCFs and some more complicated analysis that I didn’t…

162 00:29:22.240 00:29:24.809 Uttam Kumaran: Think it was gonna be possible for a while.

163 00:29:25.120 00:29:30.089 Uttam Kumaran: So you can… some of these are GIFs or quick videos, like, you should bring that, because

164 00:29:30.290 00:29:35.820 Uttam Kumaran: If that’s already possible today, 6 months will be much, much, much farther.

165 00:29:36.040 00:29:52.029 Uttam Kumaran: So that’ll only go to show you what that, like, putting together these complex analyses is on the roadmap for even the base models, let alone, like, some of the use case-specific vendors, like Omni, or some of these tech AI data analytics

166 00:29:52.110 00:30:05.230 Uttam Kumaran: data analyst vendors. So, like, all roads are pointing towards, like, decision intelligence. But you’re… where I think it’s a good way of putting it is, like, these people… people still want to feel like they’re in control.

167 00:30:05.850 00:30:08.399 Uttam Kumaran: Nobody’s gonna hand the keys over just yet.

168 00:30:08.400 00:30:08.910 Clarence Stone: Yep.

169 00:30:08.910 00:30:18.930 Uttam Kumaran: So it has to be, like, co-pilot, it has to be, like, a decision, like, an understanding engine, but it also needs to really have a clear, like, personalization.

170 00:30:19.080 00:30:32.349 Uttam Kumaran: This is something in my company, too, dude. As part of onboarding, I want people to be able to set up their profile, and the way that agents interact with you is different than any other person’s company. Like, there is, like, almost a user profile as part of each context.

171 00:30:32.420 00:30:39.459 Uttam Kumaran: So there are these unique things that I think the vendors are not going to be able to do that you need a consultancy

172 00:30:39.580 00:30:43.649 Uttam Kumaran: someone to build this still over the next few years.

173 00:30:43.650 00:31:01.930 Clarence Stone: So, like, a great example, I think you’re right on the nose, like, let’s say we created a demo agent for him that optimized for CAC. The use case that these executives would be looking at is not just to generally look across all the data to optimize CAC, they would say, hey.

174 00:31:01.930 00:31:02.440 Uttam Kumaran: all that work.

175 00:31:02.440 00:31:05.079 Clarence Stone: We’re… we need to build a new.

176 00:31:05.080 00:31:05.450 Uttam Kumaran: facility.

177 00:31:05.450 00:31:10.230 Clarence Stone: for, I don’t know, production of widgets, right?

178 00:31:10.540 00:31:18.890 Clarence Stone: How would putting it in India, or Buenos Aires, or the US, or Europe impact that CAC?

179 00:31:19.470 00:31:19.990 Uttam Kumaran: Yeah.

180 00:31:20.190 00:31:23.350 Clarence Stone: Right? So, it’s… it’s awesome, because, like.

181 00:31:23.630 00:31:28.490 Clarence Stone: If you built enough of these, you would get, you know, the response with different perspectives.

182 00:31:28.490 00:31:29.000 Uttam Kumaran: Yes.

183 00:31:29.000 00:31:41.970 Clarence Stone: Right? And imagine being able to run through that scenario, right? You’re pretty much equipped for that meeting as a CEO, because you know, like, everybody’s perspective, where they’re coming from, and…

184 00:31:42.050 00:31:51.529 Clarence Stone: like, the AI’s not telling you what to do, right? You just have the data to evaluate across all of these parameters and make your own informed decision.

185 00:31:54.020 00:31:58.919 Uttam Kumaran: Okay. I think almost, like, this is where I would… I want to understand from you, like.

186 00:31:59.450 00:32:02.639 Uttam Kumaran: For me, the way, if I was to be in…

187 00:32:02.960 00:32:19.289 Uttam Kumaran: if I was getting pitched this, I would almost be like, show me the before and after flowchart. How much time and how many people have been involved to put something like this together, and what it could look like with AI, right? Like, for my team, as part of our…

188 00:32:19.720 00:32:29.859 Uttam Kumaran: like, this is something we’re gonna start rolling out as our, like, common delivery playbooks for AI stuff, is we have to do a before and after visual diagramming and process mapping.

189 00:32:29.960 00:32:34.580 Uttam Kumaran: Right? Of the before and after, right? And, like, I think, like.

190 00:32:34.690 00:32:47.359 Uttam Kumaran: even if we’re doing it just for this demo, I think it’s important to show how many people have to get involved, the accuracy of their downstream work, and the time it would take to do it.

191 00:32:47.670 00:32:57.099 Uttam Kumaran: Like, that type of analysis, I’m sure, takes at least a month, because you’re gonna have so many… and what are the common problems?

192 00:32:57.280 00:33:09.659 Uttam Kumaran: my data is not in the right place, or, like, so-and-so is out of office, or what do we define CAC as? Or, like, is this the right marketing spend?

193 00:33:09.760 00:33:19.179 Uttam Kumaran: And then you have all this, like, redundancy because so-and-so-and-so was… you have all these people, and this is the thing that I think is the sort of broader philosophy, is, like.

194 00:33:19.390 00:33:28.690 Uttam Kumaran: You have all these people that have… each are taking on part of this task, but there’s a loss between each step.

195 00:33:28.950 00:33:43.760 Uttam Kumaran: You know? And you can only do… if one of these takes a month, imagine if one of these took… imagine instead of, like, waiting a month to get something that’s 70% accurate, you can get something that’s 40% accurate in a day.

196 00:33:43.920 00:33:55.229 Uttam Kumaran: and then get a clear understanding of, like, what parts are low confidence, high confidence, and you can run 30 of these at the same time, right? Like, you can just trigger as many of these as you want.

197 00:33:55.510 00:34:00.820 Uttam Kumaran: Right? It’s sort of what people are talking about, which is, like, having, like, cursor or deep research in the background is just, like.

198 00:34:01.030 00:34:04.499 Uttam Kumaran: keep looking for stuff, you know?

199 00:34:04.820 00:34:14.189 Uttam Kumaran: Like, it’s both the parallelization and the ability to get something with an indication of confidence today versus, like, in a month

200 00:34:14.949 00:34:18.550 Uttam Kumaran: You know, and it’s only one, because you’re limited by people.
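
The parallelization point Uttam is making, 30 fast, lower-confidence analyses today instead of one 70%-accurate analysis in a month, each tagged so a human knows where to dig, can be sketched with a thread pool. The analyses and the confidence heuristic here are stubs; real runs would be agent or deep-research calls.

```python
# Sketch of "run 30 of these at the same time": fire off many quick analyses
# concurrently, each returning a finding plus a confidence score for triage.
from concurrent.futures import ThreadPoolExecutor

def quick_analysis(question: str) -> dict:
    """Stand-in for one fast agent run: a finding plus a confidence score."""
    # Hypothetical heuristic: shorter, more specific questions score higher.
    confidence = max(0.2, 1.0 - len(question) / 100)
    return {
        "question": question,
        "finding": f"(stub finding for: {question})",
        "confidence": round(confidence, 2),
    }

questions = [
    "Which 5 accounts look ripe for expansion?",
    "Where is CAC trending worst by channel?",
    "Which contracts have unbilled overages?",
] * 10  # 30 analyses in flight, as in the conversation

with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(quick_analysis, questions))

# Triage: low-confidence findings go to a human reviewer first.
low = [r for r in results if r["confidence"] < 0.6]
high = [r for r in results if r["confidence"] >= 0.6]
print(f"{len(results)} analyses, {len(high)} high-confidence, {len(low)} to review")
```

The value is in the pair: parallel fan-out plus an explicit confidence signal, so the month-long loop becomes "same day, then dig where confidence is low."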

201 00:34:18.550 00:34:19.420 Clarence Stone: But before that…

202 00:34:19.420 00:34:20.899 Uttam Kumaran: Things are commonly, yeah.

203 00:34:20.900 00:34:26.800 Clarence Stone: Because we are selling to the person who’s gonna sell it to someone else, right? So…

204 00:34:26.800 00:34:42.880 Clarence Stone: that the person who would pilot this experience is actually PJ’s friend, who’s the CEO of an oil company in Canada, so I can ping him back and say, hey, I’d love to sit down and just talk to your friend, right? And, you know, get a perspective of what a before and after looks like.

205 00:34:42.909 00:34:45.600 Clarence Stone: That could work.

206 00:34:46.590 00:34:54.850 Clarence Stone: Yeah, yeah, yeah, so something like that would probably be an approach, like, we can definitely get a before and after, but a little bit harder to do than, you know, your typical engagement.

207 00:34:54.889 00:34:57.229 Uttam Kumaran: Yeah, okay, okay, okay, great.

208 00:34:58.360 00:34:59.050 Uttam Kumaran: Okay.

209 00:34:59.480 00:35:18.390 Clarence Stone: Yeah, so that’s the one piece, and then, you know, eventually, I told you last night, the conversation grew to him just asking, like, what is next, right? Like, how should I be positioning myself, you know, as an organization to be prepared for this? Like, what is the real transformation that’s going to be happening with AI, and, you know, how do I sell that?

210 00:35:18.740 00:35:20.630 Clarence Stone: I, you know, I, I…

211 00:35:20.960 00:35:33.720 Clarence Stone: absolutely piggybacked on the importance of, you know, pretty much what you said, like, the ability to run through scenarios at an increased rate makes you a much higher performing CEO.

212 00:35:33.720 00:35:43.979 Uttam Kumaran: Dude, the pipe is bigger, like, you’re… like, if you think about it in physics terms, the pipe that the water is going through is bigger, and the speed of the water is going faster.

213 00:35:44.150 00:35:49.329 Uttam Kumaran: it’s like, that is the opportunity, right? And… and for me, in my… in…

214 00:35:49.560 00:35:59.130 Uttam Kumaran: I don’t know, this is something that, as I’ve been building, the only thing that matters for a CEO is that you make more decisions, and that those decisions are higher accuracy.

215 00:35:59.290 00:36:01.939 Uttam Kumaran: Yeah. Right? But either one counts.

216 00:36:02.190 00:36:04.849 Uttam Kumaran: Like, either one can be helpful.

217 00:36:05.140 00:36:16.130 Uttam Kumaran: And at the most part, people are making decisions with such long feedback loop windows, and with such low accuracy, that it’s like…

218 00:36:16.670 00:36:19.129 Uttam Kumaran: You just don’t get enough shots on goal.

219 00:36:19.350 00:36:28.139 Uttam Kumaran: And they’re constr… and this guy is gonna know that the data… what the common problems in data is, like, I don’t have everything.

220 00:36:28.210 00:36:43.129 Uttam Kumaran: People come in and out, definitions aren’t standard, like, these are all things that, given the additional context, meetings, contracts, other stuff, you can bridge those semantic layer gaps a lot easier.

221 00:36:43.240 00:36:50.380 Uttam Kumaran: Those aren’t getting solved by, like, getting better comments or getting a data dictionary, like, that’s not the solution here.

222 00:36:50.830 00:37:01.070 Uttam Kumaran: Those things have been pitched for the last 10 years, like, you need a data dictionary, you need a data governance, you need a data stewardship, like, that is bullshit. Like, that’s not gonna work.

223 00:37:02.050 00:37:02.720 Uttam Kumaran: End quote.

224 00:37:03.390 00:37:17.830 Clarence Stone: 100%. And so I think what really resonated was… when you said, like, the amount of times you get to, you know, take that shot. And I told PJ that I fully anticipate that

225 00:37:18.540 00:37:31.769 Clarence Stone: he’s gonna have an increased demand in creating products because these CEOs are gonna be making decisions more rapidly. So he needs to start taking a look at his organization and seeing how quickly can he respond with.

226 00:37:31.770 00:37:32.340 Uttam Kumaran: Yes.

227 00:37:32.340 00:37:42.909 Clarence Stone: right, you know, product or tool. And that’s when I said, hey, what if, you know… because he was hesitant. He’s like, I don’t think my team knows how to do that, right? I was just like.

228 00:37:42.910 00:37:43.930 Uttam Kumaran: There’s no way they will.

229 00:37:43.930 00:37:44.500 Clarence Stone: releases.

230 00:37:44.500 00:37:45.420 Uttam Kumaran: There’s no way they’re gonna…

231 00:37:45.420 00:37:48.980 Clarence Stone: solution, though, right? Like, something just fast enough

232 00:37:49.180 00:37:54.649 Clarence Stone: that allows you to understand, did we hit the mark? Are we in the right direction? Is this what the client wants?

233 00:37:54.960 00:38:03.810 Clarence Stone: Because what I want to do is then just build an enduring retainer, where once they hit that 70%, they have a prospective sale, we go.

234 00:38:03.810 00:38:04.340 Uttam Kumaran: Yes.

235 00:38:04.340 00:38:17.200 Clarence Stone: do the last 30%, put the final touches in, and make sure that it’s durable and production-ready. And, you know, when I said that to him, he’s like, exactly. That’s the partnership I want ongoing.

236 00:38:17.200 00:38:18.670 Uttam Kumaran: we beat it.

237 00:38:19.150 00:38:27.449 Uttam Kumaran: Yeah, I mean, that’s… so that goes to the second thing that I sent you, which is that tweet, which is… I don’t know, like, he should just copy, sort of, like, what Palantir did.

238 00:38:27.850 00:38:34.989 Uttam Kumaran: And what they continue to do. And… but it takes two different modes of thinking. And right now…

239 00:38:35.070 00:38:49.750 Uttam Kumaran: I’m sort of in this, like, bipolar, like, split-the-movie sort of motion, where I… I have both of them in my head, where you can’t… it’s really hard to get people that think long-term and short-term at the same time.

240 00:38:50.070 00:38:54.309 Uttam Kumaran: They’re… they’re… they tend to compete, and you need… you need, like, kind of, like, a…

241 00:38:54.870 00:39:08.530 Uttam Kumaran: governor to make some of those decisions, so some people are equipped, like, I do that all the time. Most people are not, so there has to be some people that are short-term, and there has to be some people that think long-term. In that tweet I sent in Zoom.

242 00:39:08.570 00:39:17.560 Uttam Kumaran: There’s a point about having some people that are on the ground building, but then immediately they pass it sort of up the chain to get

243 00:39:17.800 00:39:22.529 Uttam Kumaran: Built into some type of reusability, whether that’s reusable across clients.

244 00:39:22.530 00:39:24.550 Clarence Stone: So, I mean, why I think you got.

245 00:39:24.550 00:39:24.990 Uttam Kumaran: Yeah.

246 00:39:24.990 00:39:25.660 Clarence Stone: I know.

247 00:39:25.950 00:39:27.010 Clarence Stone: Because…

248 00:39:27.010 00:39:33.210 Uttam Kumaran: Only people who are working every day at EY understand the nuance of that problem.

249 00:39:33.500 00:39:42.459 Clarence Stone: to be able to take the first swing at, you know, creating a viable product. But they’re not gonna know how to make it a real product that’s durable.

250 00:39:42.460 00:39:42.970 Uttam Kumaran: Yes.

251 00:39:42.970 00:39:44.329 Clarence Stone: The right pipelines and systems.

252 00:39:44.330 00:39:50.300 Uttam Kumaran: So this is where Palantir is, like, cracked, because they do both. Like, they’re… it’s… they’re, like, it’s insane.

253 00:39:50.490 00:40:07.779 Uttam Kumaran: But this doesn’t mean that, like, people can’t start to do both, but… but they innovated… they didn’t innovate in engineering, they… they… maybe a little bit, but they innovated on their… on the org structure. Like, they innovated on, like, doing things that don’t scale first.

254 00:40:07.890 00:40:13.600 Uttam Kumaran: And then… Start to increase the… Customization.

255 00:40:13.750 00:40:33.739 Uttam Kumaran: and the outcome, and the contract size, and then start to generalize it towards the end, right? Like, if you look at this, what they generalize is the concept, like the ontology, the primitives, like, having multiple customers try something before trying to build, sort of, a larger platform or product.

256 00:40:33.830 00:40:40.709 Uttam Kumaran: Right? And so… I think, like, this sort of list…

257 00:40:41.300 00:40:45.399 Uttam Kumaran: Is… is, like, a really, really good way of thinking about it, where…

258 00:40:45.660 00:40:49.529 Uttam Kumaran: You want to measure towards outcome, and you…

259 00:40:49.870 00:40:52.999 Uttam Kumaran: Want to have both sides of the house thinking about this, like.

260 00:40:53.070 00:41:10.939 Uttam Kumaran: you will have to go in and get some clients at a lower margin to continue to develop in, like, sort of, like, piecemeal, like, next thing, next thing, next to next thing. Slowly, out of that, these products will emerge. It’s sort of what they recommend to most startups do anyways. It’s like, you get a couple of design partners and you do it. These guys are just doing it

261 00:41:11.090 00:41:20.049 Uttam Kumaran: At a different… in, like, a much more, like, where you always are doing this. You’re always getting more design partners, and then developing the next software.

262 00:41:20.830 00:41:21.370 Clarence Stone: Okay.

263 00:41:21.370 00:41:24.860 Uttam Kumaran: And their innovation is just at a rate that’s, like, incredible, right?

264 00:41:25.010 00:41:33.079 Clarence Stone: So I think that’s the second offering I want to put in front of him, right? To say, one, we’re… we’re gonna help you

265 00:41:33.220 00:41:46.859 Clarence Stone: be able to get to these demo-driven, you know, solutions on your side, but when it comes to that end-to-end workflow that, you know, exactly what the customer wants, with all the technology and the functions, you know, you can.

266 00:41:46.860 00:41:47.330 Uttam Kumaran: Yeah.

267 00:41:47.330 00:41:53.070 Clarence Stone: put the bow on it. That’s gonna accelerate your ability to be more active in the market.

268 00:41:53.330 00:41:54.120 Clarence Stone: Right.

269 00:41:54.120 00:41:54.710 Uttam Kumaran: Yeah.

270 00:41:55.170 00:41:58.689 Clarence Stone: And it’s gonna allow you to take more swings at, you know.

271 00:41:58.690 00:42:09.540 Uttam Kumaran: But it’s also way more expensive to develop the thing that works for everybody than it is to develop the thing that works for one. But there’s no enduring value

272 00:42:09.690 00:42:11.879 Uttam Kumaran: In the one thing that works for one.

273 00:42:12.030 00:42:16.230 Uttam Kumaran: So, but this is the thing, you can’t architect your teams both ways, like…

274 00:42:17.110 00:42:23.240 Uttam Kumaran: I… like, I have to do that right now, but, like, I don’t think those are the same people.

275 00:42:23.600 00:42:25.170 Uttam Kumaran: Yeah, so I think, you know…

276 00:42:25.170 00:42:40.240 Clarence Stone: conventional consulting, like the Big Four, are best living in that short-term world just because of the style of organizations that they run, right? No floating cash balances, you know, quick turnaround

277 00:42:40.240 00:42:53.769 Clarence Stone: job, still working on the billable hour. Like, they are really geared towards being able to respond immediately with, you know, a narrow-use tool, because you get your immediate, you know.

278 00:42:54.360 00:43:04.789 Clarence Stone: you know, cash benefit from it, and it is very rare that they end up, you know, rationalizing a larger build that works for everybody. So, let them stay in that world.

279 00:43:04.790 00:43:21.679 Clarence Stone: Right? I mean, we can sit from the seats and start seeing what the patterns are, and eventually start building a larger solution that’ll apply to more situations, but, you know, we’re gonna get a lot of, you know, intelligence just by watching them do all of those reps.

280 00:43:23.770 00:43:24.939 Uttam Kumaran: Yeah, I agree.

281 00:43:26.460 00:43:28.179 Clarence Stone: So…

282 00:43:29.760 00:43:35.050 Clarence Stone: What’s a good next step, I guess? Like, so I’m gonna put some more stuff together. I.

283 00:43:35.050 00:43:39.169 Uttam Kumaran: Well, dude, I’ll send you… well, I’ll send you this transcript, you can use it to brainstorm.

284 00:43:39.510 00:43:44.669 Uttam Kumaran: And yeah, I mean, I think, I think…

285 00:43:45.190 00:43:54.550 Uttam Kumaran: I think that overall, it comes to, like, there will have to be some innovation in the offering. There also has to be… I mean, that is actually a lot easier, because

286 00:43:54.900 00:43:58.940 Uttam Kumaran: you can sell anything. Delivering on it is the next challenge.

287 00:43:59.340 00:44:13.939 Uttam Kumaran: And… but he’s gonna… he’s gonna run into people, problems on both sides. Talent for people that can actually do this is really sparse. Yeah. But he probably has some part of that internally, and they probably just need to get upskilled.

288 00:44:14.180 00:44:25.290 Uttam Kumaran: You know, so there’s one side there. Second is the org changes are gonna be very difficult. Like, moving to some model that is, like.

289 00:44:25.580 00:44:32.129 Uttam Kumaran: Someone develops something for a client, then it gets pushed up to a platform team, like…

290 00:44:32.890 00:44:38.449 Uttam Kumaran: It’s just… you… it just takes a level of, like, engineering rigor, and…

291 00:44:38.560 00:44:50.350 Uttam Kumaran: in, like… I don’t know, it’s tough, like, it took something like Palantir, it took these guys a long time to be able to ever even, like, ship some of these products, but now their whole businesses are these products.

292 00:44:50.480 00:44:52.669 Uttam Kumaran: But, like, I think if he’s… if he’s…

293 00:44:52.970 00:45:15.769 Clarence Stone: outlining PJ’s method of selling. I used to make fun of him, and I used to say, like, he sells Snickers bars, because he would do, like, 30,000, like, reporting dashboard engagements, and the way he did it was, like, if there was a pattern to, you know, develop a report for a certain type of data set, you know, he would

294 00:45:15.770 00:45:21.759 Clarence Stone: Shell out that loss on the first one, but really sell everything at a really, really cheap rate.

295 00:45:22.010 00:45:38.170 Clarence Stone: So, his claim to fame as a partner really was, kind of taking this approach where, you know, you test the market, it works, okay, bring it, you know, into the center and scale it so that anyone in the reporting team can create that Power BI dashboard.

296 00:45:38.170 00:45:38.960 Uttam Kumaran: Yes.

297 00:45:39.680 00:45:44.109 Uttam Kumaran: I think this is gonna resonate with him, because all we’re saying… Because, dude, because also, like, the…

298 00:45:44.110 00:45:45.380 Clarence Stone: It doesn’t change, right?

299 00:45:45.380 00:45:59.499 Uttam Kumaran: Yeah, the model doesn’t change. Even for me, the reason why we invest in playbooks is because I want someone whose marginal rate is cheaper to execute the same quality of work. You know, and so I have my senior people build playbooks.

300 00:45:59.580 00:46:08.529 Uttam Kumaran: Because then I can get someone that’s half the cost who can execute that 80% of the way there, and net-net, like, we end up executing it for cheaper, but…

301 00:46:08.670 00:46:09.690 Uttam Kumaran: Yep.

302 00:46:09.980 00:46:27.810 Uttam Kumaran: I just think you’re up against two things right now. One, still, even the smartest people are not doing a lot of AI stuff. So, like… but, again, he can… if he has budget and cachet, then you can go hire people and… and do that. The business model innovation is hard, like…

303 00:46:28.000 00:46:28.910 Uttam Kumaran: Yeah.

304 00:46:28.910 00:46:43.530 Clarence Stone: So, I can tell you right off the bat, he’s not even gonna find that talent in Canada, and he’s already accepted it. But there’s probably, like, 2 or 3 people on his team that can be upskilled to do this. Okay, yeah. Would be totally open to, like.

305 00:46:43.530 00:46:44.360 Uttam Kumaran: the time.

306 00:46:44.360 00:46:53.240 Clarence Stone: to us, or just, like, if your team wants to take this engagement, like, teaching them how to get to the fundamentals, because I fully guarantee you, they are going to call you.

307 00:46:53.870 00:46:54.280 Uttam Kumaran: Yeah.

308 00:46:54.280 00:47:04.530 Clarence Stone: Like, this is not, you know, a… an end-all, be-all fully built product. So, like, on that flip side, he’s like, instead of getting that expert or just, like, shelling out for it.

309 00:47:04.530 00:47:10.910 Uttam Kumaran: No, dude, I would… I don’t mind, and training and doing this is gonna be a big part of our business, and for me.

310 00:47:11.020 00:47:23.569 Uttam Kumaran: developing a training product works in two ways, because the better we train external people, I then will turn that around, and I’ll be able to recruit cheaper and upskill people myself, so… Yeah. And there’s also now, like, I’m now…

311 00:47:23.570 00:47:27.629 Clarence Stone: Colin in it, too. He probably has, like, some really great ideas on training.

312 00:47:27.630 00:47:30.729 Uttam Kumaran: Yeah, and I’ve also now, like,

313 00:47:31.370 00:47:43.929 Uttam Kumaran: I’ve sort of, like, tapped a couple people. Fallen, for example, has this, like, ChatGPT level 1, level 2, level 3, so there’s, like, kind of tool upskilling. There’s probably some amount of just, like, how do you start to think from

314 00:47:44.330 00:47:51.769 Uttam Kumaran: prompt, to, like, workflow, to agent, to tools, to MCP.

315 00:47:51.790 00:48:05.670 Uttam Kumaran: Right? Like, second is, like, how do I just train everybody to either… to just become certified to build shit on n8n, or whatever the equivalent is, right? Like, so getting non-technical people to start to build automation themselves.

316 00:48:05.740 00:48:11.889 Uttam Kumaran: This is, like, kind of where our company is gonna go to, where now everybody in our company is pretty trained on

317 00:48:12.170 00:48:13.819 Uttam Kumaran: ChatGPT basics.

318 00:48:14.070 00:48:19.240 Uttam Kumaran: But, like, to give you an example, like, we’re, we’re talking to HelloFresh.

319 00:48:19.370 00:48:24.709 Uttam Kumaran: And they’re interested in bringing us in to help them build out, like, their n8n platform.

320 00:48:25.010 00:48:29.019 Uttam Kumaran: So that anyone in the company can build on it within that company.

321 00:48:29.320 00:48:42.430 Uttam Kumaran: Right? And so their thinking is that, like, we don’t need you to come help us build, like, one thing, maybe, but they’re like, we want you to come in and help us build, like, the fact that anybody in the company can tap into n8n and build stuff on top of it.

322 00:48:42.590 00:48:49.869 Clarence Stone: Yeah, and I guarantee you, every time we’ve done this at EY, they always call back, because they go, hey, this is as far as we’ve gone.

323 00:48:49.870 00:48:52.770 Uttam Kumaran: Oh, yeah, yeah, no, the pitch, no, ultimately…

324 00:48:53.410 00:49:04.999 Uttam Kumaran: Generally, the teachers are… we’re teaching because we’re the best, and there will be things that are so advanced that will have to happen outside of there, and… and yeah, and that’s… that’s sort of our… our hope as well.

325 00:49:05.930 00:49:10.629 Uttam Kumaran: But, like, because there’s such a knowledge gap and a training gap, I don’t know, it’s… it’s a…

326 00:49:11.030 00:49:18.449 Uttam Kumaran: it’s a cool time, and I want to do more engagements that are training and upskilling. It’s also very satisfying to, like, help people do that kind of work. That’s really cool.

327 00:49:18.450 00:49:23.010 Clarence Stone: Yeah, yeah, and personally, I think it validates your methodology, too.

328 00:49:23.010 00:49:24.120 Uttam Kumaran: Yes, yes.

329 00:49:24.120 00:49:34.219 Clarence Stone: It would be a lot more powerful, you know, by the time you want to go for an acquisition to say, hey, we didn’t just do this, we got other organizations to do it.

330 00:49:34.290 00:49:41.049 Uttam Kumaran: No, and dude, it’s like, if I can bring in any random full-stack backend, front-end person, and then turn them into an AI engineer, like, that’s what I…

331 00:49:41.530 00:49:49.780 Uttam Kumaran: Right? Like, that’s… whatever methodology we’re gonna use to hire, because I can’t find people either. So I’m having the same problem. We’ve…

332 00:49:49.850 00:50:02.100 Uttam Kumaran: I’ve gotten good people through ways that, like, nobody should go through to find people, you know? And so it’s not… what I realized is you’re… you have… we have… you’re gonna have to upskill and train. Like, I don’t think people are gonna be…

333 00:50:02.420 00:50:10.600 Uttam Kumaran: There’s not people in the market that are, like, really, really great at AI, and if they are, they’re not advertised that way, I suppose. You know, so…

334 00:50:10.600 00:50:27.919 Clarence Stone: Yeah. So here’s the thing, I’m pretty sure, and I already know that PJ started the process to do the engagement just for building that first demo, so, I mean, I would love it if you wanted to join me on that, like, I can, you know, loop you in as I hear more.

335 00:50:27.920 00:50:31.120 Uttam Kumaran: Sure. And then on that second part, like.

336 00:50:31.120 00:50:46.600 Clarence Stone: we can bring that topic up in conversation, because I think it’s, like, a natural progression to that, you know, conversation. Because, like, right now, PJ has no AI services that he’s providing on his own, and at the very least, if.

337 00:50:46.600 00:50:47.640 Uttam Kumaran: That’s crazy.

338 00:50:47.640 00:50:50.779 Clarence Stone: Yeah, you know, train these guys to, you know.

339 00:50:50.930 00:50:59.459 Clarence Stone: teach others how to use Copilot, or, you know, build an agent, right, or self-service themselves, it’d be just a huge value driver for him.

340 00:51:00.690 00:51:12.630 Uttam Kumaran: Let’s do it, and you tell me, like, if it’s mostly strategy-level stuff, like, I can do it. If we want to build out these demos and synthetic data, then I’ll just bring an engineer from my side, and yeah, let’s crush through it.

341 00:51:12.630 00:51:25.220 Clarence Stone: Yeah, so that’s what I’m gonna do. I’m gonna continue to probe PJ on, you know, a better KPI that we can, you know, design around, and see if we can, get a, you know.

342 00:51:25.220 00:51:35.060 Clarence Stone: a current state, future state assessment going so that, you know, there’s at least going to be some milestones and definitive savings that you can highlight from creating the AI process.

343 00:51:36.220 00:51:44.529 Uttam Kumaran: Yeah, I also just think a big part of this also is just, like, we’re… we’re in the market looking at all the tools that support this, so… of course, I’m not, like.

344 00:51:44.760 00:51:58.370 Uttam Kumaran: I’m not big on, like, build everything, like, if we can bring in different vendors for different parts of the pie, that’s something that he would go have to do a roadshow on, and certainly, I think by the time he makes it through all the active products, like.

345 00:51:58.410 00:52:06.290 Uttam Kumaran: you’ll be an old man, you know? So, I already know… we kind of already know a lot of what’s working, and then for the folks whose stuff is working, we have relationships.

346 00:52:06.390 00:52:16.289 Uttam Kumaran: So, like, I don’t know, I feel like just in just being in the market and buying these tools, we’ve learned a lot. So on the procurement side, like, I think we’ve helped a lot of clients, because

347 00:52:16.440 00:52:23.469 Uttam Kumaran: I don’t know how you would find some of these products if you just weren’t, like, binging Twitter. You know, so for them, it’s like.

348 00:52:23.800 00:52:31.799 Uttam Kumaran: what’s the opportunity cost of that? Like, they may just go to… be like, oh, I’m just gonna buy whatever Amazon has, and I’m like, dude, you should go to something that’s, like.

349 00:52:31.980 00:52:37.499 Uttam Kumaran: That is… but some of these companies are, like, Series B, Series C. You have never heard about them, you know? So…

350 00:52:40.230 00:52:41.510 Uttam Kumaran: Okay, perfect.

351 00:52:42.060 00:52:44.500 Clarence Stone: Oh yeah, I think that sounds like a plan, right?

352 00:52:44.700 00:52:45.850 Clarence Stone: And…

353 00:52:46.000 00:52:53.839 Clarence Stone: Yeah, I’ll let you know as soon as I know more, but I think this is helpful because, you know, at least we can start thinking on, you know.

354 00:52:54.100 00:52:58.930 Clarence Stone: What the next larger, you know, engagement opportunity is here.

355 00:52:58.930 00:52:59.640 Uttam Kumaran: Yeah.

356 00:53:00.560 00:53:03.649 Uttam Kumaran: So then what I’m gonna do is, like, I’ll… I’ll get you this,

357 00:53:04.210 00:53:13.910 Uttam Kumaran: as soon as this recording processes, I’ll get you this transcript, and then you have those… the doc that I sent you, and then, yeah, just tell me… just text me if you want to brainstorm further, or, like, what you need from me.

358 00:53:14.170 00:53:16.679 Clarence Stone: Cool. Sounds good, man. Thanks for the help.

359 00:53:16.800 00:53:19.300 Uttam Kumaran: Of course, dude, good luck, yeah, make it happen.

360 00:53:19.550 00:53:26.019 Clarence Stone: Absolutely, it… yeah, well… This is definitely,

361 00:53:26.110 00:53:45.939 Clarence Stone: gonna be the start to something interesting, because PJ has always also asked about what Neil has been doing. Oh, nice. If you take any follow-up steps with Neil, I think it would actually solidify, you know, additional involvement as well. Because…

362 00:53:46.060 00:53:49.739 Clarence Stone: Yeah, I mean, these partners are all friends, right?

363 00:53:49.740 00:53:50.720 Uttam Kumaran: Yes, yes, yeah.

364 00:53:50.720 00:53:58.850 Clarence Stone: We show up credentialed that way, and it’s pretty much a lock, we just have to make sure that we’re positioning ourselves correctly and delivering.

365 00:53:59.560 00:54:00.760 Uttam Kumaran: Right. Perfect.

366 00:54:02.800 00:54:03.610 Clarence Stone: Cool!

367 00:54:05.750 00:54:11.839 Uttam Kumaran: Okay, alright. Yeah, I was, like, up so late, doing more work as soon as I got back home, so I gotta finish this.

368 00:54:11.840 00:54:13.930 Clarence Stone: I was up before yesterday, it was…

369 00:54:13.930 00:54:23.080 Uttam Kumaran: Dude, we should have tried. I was up till 2, working on random shit, on my, like, second coffee, but I need to, like, grind through some, like, really boring shit right now.

370 00:54:23.080 00:54:26.620 Clarence Stone: Oh, guess what? USAA called me this morning.

371 00:54:27.050 00:54:29.689 Uttam Kumaran: Oh, yeah, so what? Oh, you, so what, they came back to you?

372 00:54:29.690 00:54:30.650 Clarence Stone: Yeah.

373 00:54:30.650 00:54:31.779 Uttam Kumaran: Oh my god.

374 00:54:32.280 00:54:33.270 Uttam Kumaran: And…

375 00:54:33.530 00:54:35.140 Clarence Stone: I already told him no.

376 00:54:35.140 00:54:40.329 Uttam Kumaran: Okay, good. Oh, I mean, like, I’m happy for you, but yeah. Fuck yeah.

377 00:54:40.700 00:54:54.939 Clarence Stone: It’s funny how, like, you instantly were like, obviously no, and then, like, I talked to some people at, EY and Riley, who owns a chocolate company, and everyone’s like, no, don’t do it.

378 00:54:54.940 00:54:57.729 Uttam Kumaran: Yeah, but also, dude, you’re not… you can’t be a second pick, bro.

379 00:54:58.040 00:54:58.730 Clarence Stone: Yeah.

380 00:54:58.730 00:55:00.560 Uttam Kumaran: It’s a shame on them.

381 00:55:01.950 00:55:02.970 Uttam Kumaran: Shame on them.

382 00:55:04.570 00:55:07.359 Clarence Stone: Yeah, so, it’s been a day.

383 00:55:09.620 00:55:12.310 Clarence Stone: And then this whole Charlie Kirk thing.

384 00:55:12.310 00:55:13.150 Uttam Kumaran: I know.

385 00:55:14.310 00:55:20.680 Uttam Kumaran: Yeah, that’s what I’m gonna… I wanna… I’m just going back on Twitter and seeing, like.

386 00:55:21.540 00:55:23.130 Uttam Kumaran: What is even going on?

387 00:55:25.660 00:55:27.599 Clarence Stone: But I’ll keep you up to date.

388 00:55:27.600 00:55:28.780 Uttam Kumaran: Okay, okay, perfect.

389 00:55:29.600 00:55:30.960 Clarence Stone: Alright, adios, man.

390 00:55:30.960 00:55:32.229 Uttam Kumaran: Bye, dude. Bye.

391 00:55:32.230 00:55:33.180 Clarence Stone: Yep, bye.