Meeting Title: Uttam <> Yuriy Date: 2026-03-10 Meeting participants: Alina Forin, Clarence Stone, Uttam Kumaran


WEBVTT

1 00:00:56.090 00:00:57.530 Clarence Stone: Yo, yo!

2 00:00:57.630 00:00:59.889 Clarence Stone: Tom said he’s running a few minutes behind.

3 00:01:01.040 00:01:02.970 Alina Forin: Okay, hold on, I gotta do something anyway.

4 00:01:03.190 00:01:03.920 Clarence Stone: Cool.

5 00:01:29.340 00:01:32.960 Alina Forin: I realize I’m using, Alina’s laptop, so it says Alina.

6 00:01:34.390 00:01:35.900 Alina Forin: To be confused with me.

7 00:01:36.250 00:01:39.269 Alina Forin: I’m… I may have to…

8 00:01:40.190 00:01:43.360 Alina Forin: Cut this, like, 5 minutes short-ish.

9 00:01:44.220 00:01:47.399 Alina Forin: It’s like, my babysitter’s coming, we gotta do some trade-offs.

10 00:01:47.620 00:01:49.529 Clarence Stone: No worries, you’re good.

11 00:01:49.530 00:01:51.779 Alina Forin: doing an intra, I’m hoping, you know…

12 00:01:51.780 00:01:54.429 Clarence Stone: Yeah, it’s just a hangout intro, just like the last one.

13 00:01:54.580 00:01:55.220 Alina Forin: Yeah.

14 00:01:58.020 00:02:01.360 Clarence Stone: Your boy’s got more for you if you don’t like any of these offers.

15 00:02:02.630 00:02:04.410 Alina Forin: I think you’re gonna have to…

16 00:02:04.880 00:02:10.250 Alina Forin: I feel like, I’m the new pretty gal being walked around the ball.

17 00:02:10.710 00:02:11.490 Clarence Stone: Yeah.

18 00:02:12.890 00:02:14.890 Clarence Stone: Dude, when I showed up, like…

19 00:02:15.850 00:02:21.539 Clarence Stone: People are like, what kind of monster are you? What is this?

20 00:02:21.540 00:02:26.739 Alina Forin: Yeah, I mean, you are, you are a beast in your own merit, rightfully so.

21 00:02:28.050 00:02:30.369 Alina Forin: You are, a nutty man.

22 00:02:30.620 00:02:35.990 Clarence Stone: Oh, so, I was gonna say, I bet you $5 that MJ’s just gonna…

23 00:02:36.140 00:02:38.490 Clarence Stone: Try to calm you down and give you a raise.

24 00:02:38.900 00:02:40.839 Clarence Stone: It might be a nice, handsome race.

25 00:02:41.190 00:02:46.219 Alina Forin: I mean, my… my request is still the request. Like, they’re… they’re talking about, like.

26 00:02:46.690 00:02:50.519 Alina Forin: Oh, the plan. I’m like, honestly, I don’t care for your plan.

27 00:02:51.390 00:02:53.540 Alina Forin: You missed plan, in my eyes.

28 00:02:53.730 00:02:55.590 Alina Forin: Like, this is a…

29 00:02:56.460 00:03:03.309 Alina Forin: Let’s… let… this is a make… let’s make it right. Like, you calling me and buttering me up and telling me, like, just…

30 00:03:03.720 00:03:10.239 Alina Forin: Words? Like, I’m done with words. If you want me to be serious about staying, show me, comp.

31 00:03:10.840 00:03:17.180 Alina Forin: It’s like, I know what comp is for an MD, meet me in the middle. Like, you can’t pay me full MD? Fine.

32 00:03:17.930 00:03:19.149 Alina Forin: You could do it next year?

33 00:03:19.150 00:03:25.239 Clarence Stone: Dude, like, nothing really changes, because you have the responsibility without the authority.

34 00:03:25.450 00:03:29.039 Clarence Stone: Right? Like, you’re still an SM, and they’re gonna expect more.

35 00:03:30.250 00:03:36.199 Alina Forin: Oh, yeah, I mean, the… nothing changes between an SM to an MD. Absolutely nothing, other than comp.

36 00:03:37.620 00:03:40.519 Clarence Stone: Well, you get to throw your weight around a little bit.

37 00:03:40.800 00:03:45.289 Alina Forin: No one respects an MD, come on, let’s be serious. I never did as a staff.

38 00:03:46.140 00:03:48.660 Alina Forin: I never really did as a manager.

39 00:03:49.810 00:03:55.629 Alina Forin: It’s like… I’m not in it for the respect.

40 00:03:56.350 00:03:59.729 Alina Forin: This is more so…

41 00:04:01.050 00:04:18.380 Alina Forin: like, I feel like this whole experience just kind of showed me, like, you give the man what the man deserves. And, like, once I make it, I’m just gonna coast. I’m never gonna really ever again go consistently above and beyond, because

42 00:04:19.110 00:04:19.899 Alina Forin: For what?

43 00:04:20.180 00:04:25.039 Alina Forin: You’re gonna give me an… instead of a 5%, you’re giving me an 8?

44 00:04:26.270 00:04:27.220 Alina Forin: And, like, you’re telling…

45 00:04:27.220 00:04:30.950 Clarence Stone: I got 12 my first year at EY, I thought that was gonna be regular.

46 00:04:31.210 00:04:33.449 Alina Forin: Yeah, no, there was, like, a few years where, like.

47 00:04:33.450 00:04:36.000 Clarence Stone: Disney really pumping out, like, 10?

48 00:04:36.000 00:04:44.840 Alina Forin: 12s, and I remember getting those two, and then it slowly went back to, like, the fives and eights, and whatever the case it was, for the average.

49 00:04:45.230 00:04:45.940 Clarence Stone: Yeah.

50 00:04:46.190 00:04:52.409 Alina Forin: And like, honestly, like, if… we can kind of go on more of these side quests, and if this…

51 00:04:52.680 00:04:55.040 Alina Forin: I feel like the more sidequests you start.

52 00:04:55.460 00:04:58.110 Alina Forin: There’s a higher success of some panning out.

53 00:04:58.790 00:05:02.480 Clarence Stone: I mean, dude, I’m the king of side fries.

54 00:05:02.480 00:05:08.789 Alina Forin: I… I think I am now of, appreciation for said sidequest.

55 00:05:10.730 00:05:14.550 Clarence Stone: I mean, do you catch how much he said he’s made so far?

56 00:05:15.030 00:05:16.110 Clarence Stone: Six figures.

57 00:05:17.020 00:05:20.400 Alina Forin: No, I caught that. Like, my ADD mind caught that.

58 00:05:21.670 00:05:32.010 Clarence Stone: crazy, isn’t it? Like, you just find the efficiency so that, like, the kid could sell, like, make it a fixed system so that even, you know, we could do it, like…

59 00:05:32.010 00:05:35.830 Alina Forin: I mean, honestly, this, to me, this is a… I’m… I’m a sponge.

60 00:05:36.480 00:05:42.620 Alina Forin: Like, pay me in your knowledge, and then what if I just do the same thing? I’m in a complete different time zone.

61 00:05:42.620 00:05:42.940 Clarence Stone: Yeah.

62 00:05:42.940 00:05:43.820 Alina Forin: state.

63 00:05:44.390 00:05:52.929 Alina Forin: I pinged him after, I was like, make him sell East Coast for you, man. He’s like, Yuri’s great in front of a customer, like, don’t…

64 00:05:52.930 00:05:54.200 Clarence Stone: That’s the best part!

65 00:05:55.140 00:05:56.570 Clarence Stone: Like, you’re listening.

66 00:05:57.100 00:06:06.150 Alina Forin: It’s… I feel like that one was, like, pretty good with, I see the upside, long term.

67 00:06:06.720 00:06:08.679 Clarence Stone: There’s no, there’s no short term.

68 00:06:09.740 00:06:26.910 Clarence Stone: No, I… there’s generally not in most of these deals, right? Like, even here, like, you’re gonna learn a shit ton about AI. Like, we are light speeds ahead of, like, the rest of the industry, but, like, we need to find the scale, and…

69 00:06:26.970 00:06:36.409 Clarence Stone: We need to start automating, dude, like, we’re doing hiring all the time, like, we can’t keep pace with hiring to take more jobs right now.

70 00:06:37.240 00:06:38.720 Clarence Stone: It’s crazy.

71 00:06:40.130 00:06:41.179 Alina Forin: I mean…

72 00:06:41.640 00:06:52.759 Alina Forin: Listen, before, we never had this conversation, because it was a… A black and white, but now we’re gray, and I’m like, listen, I’ll do side quests with you, man, let’s fucking roll.

73 00:06:53.080 00:07:01.199 Alina Forin: I don’t think we ever actually did a project. All we did was talk about projects the entire time, and then never got to do nothing.

74 00:07:01.470 00:07:13.119 Clarence Stone: I love BY because it’s like that. Like, I think I’m on the seventh, conversation with Nate and, Chris Gilbert on whether or not they’re gonna start a SaaS for K1 Extract.

75 00:07:14.300 00:07:15.980 Alina Forin: Oh, Rob’s stuff?

76 00:07:15.980 00:07:20.929 Clarence Stone: Yeah, and dude, that’s why being a contractor is the best. I literally just hang out.

77 00:07:20.930 00:07:22.489 Alina Forin: I’m… Time to cut.

78 00:07:22.730 00:07:26.030 Alina Forin: After when I’m done eating, can I be on my iPad?

79 00:07:26.280 00:07:33.510 Alina Forin: at 7 o’clock, when me and Mama have an appointment, you can be on your iPad, yes. But you have to go eat, be a good boy.

80 00:07:35.050 00:07:36.579 Alina Forin: Do you want to say hello to Clarence?

81 00:07:36.580 00:07:37.450 Clarence Stone: Bye!

82 00:07:37.450 00:07:38.610 Alina Forin: Can’t really see it here.

83 00:07:38.780 00:07:39.919 Alina Forin: That’s my buddy.

84 00:07:40.620 00:07:41.459 Clarence Stone: What’s up?

85 00:07:41.960 00:07:44.570 Alina Forin: He’s my panda bear. This is my Rem.

86 00:07:46.120 00:07:48.310 Alina Forin: Alright, alright, I gotta go talk to adults.

87 00:07:48.310 00:07:50.589 Clarence Stone: You can get your iPad soon. I’m good.

88 00:07:50.590 00:07:55.499 Alina Forin: Because I can’t use my computer for this, because it’s secret, it’s perf- secret.

89 00:07:55.660 00:07:57.219 Clarence Stone: We’re doing tickets.

90 00:07:57.550 00:08:02.520 Alina Forin: No, Remy, that’s not a thing, it’s just a screen. Remi, go inside and go eat.

91 00:08:02.870 00:08:04.819 Alina Forin: Alright, get out of here, man. Close the door, though.

92 00:08:06.470 00:08:10.580 Alina Forin: We talked about that. What do you… Whatever.

93 00:08:13.850 00:08:18.320 Alina Forin: Where’s your… where’s your buddy? We’re gonna be… very short.

94 00:08:18.810 00:08:19.270 Clarence Stone: Yeah.

95 00:08:19.270 00:08:20.110 Alina Forin: short.

96 00:08:20.380 00:08:33.299 Clarence Stone: I’m texting him right now, like, ship blew up today because the other founder’s out, so I had to chip in today, too, so… Like, this is what I mean, like, we’ve got so much business, it’s crazy. It’s boomtown.

97 00:08:33.309 00:08:36.739 Alina Forin: Yeah, I mean, we could… we could do this tomorrow, I mean, there’s not… I’m not…

98 00:08:36.740 00:08:44.220 Clarence Stone: I’m trying to shake EY right now. You heard me when I said, like, dude, I don’t think I can help you, like… it’s bad.

99 00:08:45.310 00:08:50.369 Alina Forin: It is pretty… it’s… I mean, it’s like that all the time, there’s so much bureaucracy. What’s up, dude?

100 00:08:51.470 00:08:59.139 Alina Forin: mac and cheese, because it’s Mac and Cheese Mondays, so can we have it today? Sure, go ahead, have mac and cheese. Close the door.

101 00:08:59.140 00:09:01.370 Clarence Stone: Yeah, we’re having mac and cheese!

102 00:09:01.370 00:09:09.050 Alina Forin: I made them house leadies yesterday, and they’re, like, complaining about that. My house letties, or my chicken cutlets, my house special, they’re amazing.

103 00:09:09.050 00:09:09.589 Clarence Stone: Oh…

104 00:09:10.280 00:09:13.569 Alina Forin: And I call them houseletties, because, it was, like.

105 00:09:13.570 00:09:34.240 Alina Forin: on New Year’s, we were hosting, and I had this idea, I’m like, you know what is the worst thing? Is, like, when you’re at a New Year’s, and you come, and you eat a little bit, and then, like, you gotta wait a bunch of hours for, like, this sit-down meal, and at that point, you don’t really want to have a meal anymore. Wouldn’t it be cool if you could just, like, snack on something? Short chips, fine, but, like, chips only takes you so far.

106 00:09:34.240 00:09:37.279 Alina Forin: Wouldn’t you want a snack on a… on a house letti?

107 00:09:38.110 00:09:43.610 Alina Forin: And it was, like, a smashing hit, like, literally. Everybody’s like, I’m stealing this concept.

108 00:09:44.250 00:09:45.550 Clarence Stone: For, like…

109 00:09:45.550 00:09:48.029 Alina Forin: Like, a horde divorce of just chicken cullets.

110 00:09:51.300 00:09:52.839 Clarence Stone: Daki, that sounds good.

111 00:09:52.940 00:09:55.219 Clarence Stone: I’m texting him, let me see where he’s at.

112 00:09:55.220 00:09:58.559 Alina Forin: Yeah. Yeah, let’s, let’s re… like, reschedule this, because, like.

113 00:09:58.560 00:10:02.660 Clarence Stone: We have a customer being a total bitch-ass right now, I think that’s what he’s dealing with.

114 00:10:02.660 00:10:10.069 Alina Forin: And, you know, listen, you do… I’m fine. Listen, you know my schedule is fairly open for fucking any… anything at this point. Like, I will…

115 00:10:10.410 00:10:17.489 Alina Forin: I have nothing… the only conversation I probably will not be able to move is when MJ comes to gas me up about some shit.

116 00:10:19.290 00:10:25.770 Clarence Stone: How are you gonna handle it? What if he’s like… Well, double your salary.

117 00:10:27.430 00:10:31.700 Alina Forin: They’re not gonna double my salary, they’re barely gonna give me the 17% I’m looking for.

118 00:10:32.630 00:10:36.059 Clarence Stone: Well, you’re definitely not getting 17% if they won’t double.

119 00:10:36.240 00:10:40.329 Alina Forin: It’s like… I was like, meet me in the middle, it’s gonna take me 17% to get.

120 00:10:40.330 00:10:42.879 Clarence Stone: Why were you not at the senior manager meeting?

121 00:10:43.820 00:10:44.640 Alina Forin: Which one?

122 00:10:45.130 00:10:46.240 Clarence Stone: in LA.

123 00:10:49.210 00:10:51.430 Alina Forin: What senior manager meeting in LA?

124 00:10:52.550 00:10:57.150 Clarence Stone: They’re talking about the future, like, op model of Triple.

125 00:10:57.150 00:11:02.160 Alina Forin: Oh, that’s probably the… because they’re splitting, basically, VCL from consulting.

126 00:11:02.820 00:11:09.030 Alina Forin: And Immaculate is probably heavily involved, so she’s probably chose LA for the meeting.

127 00:11:09.810 00:11:12.989 Alina Forin: And that’s the… I’m not in that side at all.

128 00:11:13.780 00:11:16.190 Clarence Stone: Well, MJ is supposed to be there.

129 00:11:17.890 00:11:26.159 Alina Forin: Wait, are we… we’re talking about, like, THE Mike Johnson, not MJ, not the fucking… the retard one, right?

130 00:11:27.920 00:11:32.249 Clarence Stone: Just so, just so I’m clear. Okay, so Retard One lives there.

131 00:11:32.680 00:11:39.260 Clarence Stone: Sure, I thought he was in Boston, but it doesn’t even matter. He moved, he moved. Charlotte MJ is supposed to be there.

132 00:11:40.380 00:11:54.849 Alina Forin: Well, I think that is all because they’re trying to credentialize the shit that Joskin’s fucking cooking up, or whatever, or they’re cooking up. Like, every single partner has said it without being descriptive, like, there’s a big change coming, and everyone’s like, oh, there’s gonna be a split, we don’t…

133 00:11:54.850 00:12:02.999 Alina Forin: blah blah blah. So, like, I know it’s, like, the consulting business is basically separating from the VCL, because they’re tired of fighting with each other.

134 00:12:03.500 00:12:07.599 Alina Forin: And Joe Skin has absolutely no control over anybody.

135 00:12:08.110 00:12:08.960 Alina Forin: So…

136 00:12:08.960 00:12:15.410 Clarence Stone: What’s up with that? Like, everyone keeps telling me that he’s not, like, in operational control of anything.

137 00:12:15.410 00:12:23.949 Alina Forin: Nothing. Nothing. He’s, I think the number one thing is no one actually believed he deserved it.

138 00:12:24.120 00:12:28.950 Alina Forin: like, I think if they gave it to Weinberg, which I think they should have.

139 00:12:29.680 00:12:33.540 Alina Forin: We would have actually had some meaningful…

140 00:12:33.990 00:12:39.760 Alina Forin: like, I think it would have been much better, because Weinberg… is…

141 00:12:40.300 00:12:43.959 Alina Forin: As goofy as he is on, like, the surface.

142 00:12:44.300 00:12:46.730 Alina Forin: He knows how to run a business, he’s the op man.

143 00:12:47.300 00:12:56.259 Alina Forin: You just gave the keys to a Ferrari for someone who’s driven a go-kart on the side.

144 00:12:56.980 00:12:57.360 Clarence Stone: Yeah.

145 00:12:57.360 00:13:05.530 Alina Forin: And, like, doesn’t know what it takes to, like, operate a Ferrari, or operate a Ferrari at speed, because the problem is, like, the speed of things moving.

146 00:13:05.690 00:13:10.400 Alina Forin: he was making some decisions that, like, people are like, what the fuck are you doing? And then at some.

147 00:13:10.400 00:13:16.940 Clarence Stone: So they just put Weinberg in the corner? Like, what the fuck happened with that? Like, I show up, and he’s not even in Triple T anymore.

148 00:13:17.090 00:13:19.610 Alina Forin: Well, he’s technically still there, sort of.

149 00:13:19.610 00:13:20.340 Clarence Stone: blocks, but…

150 00:13:20.340 00:13:24.849 Alina Forin: But, like, well, basically, like, him and I were pretty close when all this was going down.

151 00:13:25.100 00:13:31.769 Alina Forin: And… he was just, like… I was helping him, like, do some sales stuff, basically, like, kick the tires of things.

152 00:13:31.900 00:13:40.290 Alina Forin: And he goes, honestly, what the fuck are we doing here? What am I kicking the tires for, like, 50K? Like, I gotta meet revenue, fuck this. So…

153 00:13:42.900 00:13:48.379 Uttam Kumaran: Hey guys, so sorry. I just… I just, like, had meetings back up into meetings, and…

154 00:13:49.340 00:13:51.740 Uttam Kumaran: This is the meeting I was looking forward to, though.

155 00:13:52.740 00:13:57.990 Alina Forin: I know, this is the phone, but I do have a little bad news, I do gotta cut it, like, I have to…

156 00:13:57.990 00:13:58.669 Uttam Kumaran: Sorry, I know it.

157 00:13:58.670 00:13:59.160 Alina Forin: hard stop.

158 00:13:59.160 00:13:59.989 Uttam Kumaran: Larry’s just told me.

159 00:13:59.990 00:14:00.740 Alina Forin: love.

160 00:14:01.310 00:14:01.690 Uttam Kumaran: Okay.

161 00:14:01.690 00:14:05.790 Alina Forin: But it… I think it’s okay. I mean, your screen’s blurry, I don’t know if you got, like, a hater.

162 00:14:05.790 00:14:07.400 Clarence Stone: Is that my friend?

163 00:14:07.400 00:14:09.969 Uttam Kumaran: That’s so weird. No, hold on, hold on.

164 00:14:09.970 00:14:10.399 Alina Forin: Oh, there.

165 00:14:10.400 00:14:11.370 Uttam Kumaran: my back? Okay.

166 00:14:11.370 00:14:14.620 Alina Forin: Sorry, I’m on a MacBook Air, I’m getting a new laptop soon.

167 00:14:14.720 00:14:15.710 Uttam Kumaran: It’s coming.

168 00:14:18.290 00:14:20.220 Uttam Kumaran: How’s everything? It’s great to, it’s great to…

169 00:14:20.220 00:14:22.490 Alina Forin: For the first, like, a year.

170 00:14:22.490 00:14:25.120 Uttam Kumaran: Oh, great. It’s great. I feel like it’s caught up.

171 00:14:25.120 00:14:25.830 Alina Forin: nice laptop.

172 00:14:29.100 00:14:39.190 Alina Forin: But yeah, no, I think, Clarence speaks highly of you, and, you know, you guys got a good thing going, and I think what kind of prompted this was…

173 00:14:39.190 00:14:42.780 Uttam Kumaran: I had, like, a realization that I really don’t give a shit about EY anymore, and I was like…

174 00:14:42.780 00:14:48.859 Alina Forin: And in Clarence’s mind was like, yo, are you… are you talking side quests? I’m like, I think I’m ready for side quests.

175 00:14:50.160 00:14:50.560 Alina Forin: Damn.

176 00:14:50.560 00:14:58.279 Uttam Kumaran: Clarence knows I also don’t… I also don’t care… I also don’t care much about UI, I play for another team, so yeah, we’re all on the same page.

177 00:14:58.820 00:15:00.990 Alina Forin: Alright, wait, are you in another Big Four?

178 00:15:01.380 00:15:09.630 Uttam Kumaran: No, no, no, I’m on a big 4,000. I run Brainforge, so we’re a data and AI consultant.

179 00:15:09.630 00:15:11.679 Alina Forin: Oh, yeah, yeah, you’re in a complete.

180 00:15:11.680 00:15:12.859 Uttam Kumaran: I’m out of the game.

181 00:15:13.000 00:15:15.240 Uttam Kumaran: I’m out of the game. Yeah, yeah.

182 00:15:15.730 00:15:20.489 Uttam Kumaran: I’m in the G League, but you know, we’re trying to punch up, so, yeah.

183 00:15:20.490 00:15:35.040 Alina Forin: Well, look, I think this conversation is, how do we plunge up? And, you know, I’ll just cut to the chase, because we’ve only got, like, you know, X amount of time. We could always have pleasantries another time. So I think where… you know, I don’t know what Clarence told you about me, but, like, my whole thing, I’m process.

184 00:15:35.350 00:15:59.750 Alina Forin: that’s my bread and butter, it’s what I do, it’s what I sell. You know, how I met Clarence was a fortune accident one day, when we were actually just kind of shooting the shit. I was like, you know, we actually never worked on a single project. We had so many conversations about plans for a project, which is, like, one of the biggest challenges of doing anything here on UI side, is, like, you have these ideas, you get the momentum, you get into a meeting, and there’s, like, someone who has no idea what the fuck’s going on.

185 00:15:59.750 00:16:04.119 Alina Forin: John’s like, oh, what this, oh, what this, what this, and the momentum stops, and you’re done.

186 00:16:04.210 00:16:12.460 Alina Forin: So, you know, I don’t know if you wanna… I guess, do you… what do you know about me, or anything at all? And I could give you the elevator space, Pete.

187 00:16:12.460 00:16:30.369 Uttam Kumaran: Yeah, no, I mean, dude, that’s… I feel like I… I… we could skip a lot of that, too. I feel like Clarence gotcha’s for you, like, I don’t really… I’m also not a big credential person in general, like, I… so I’m more about what we… what we can do today and what we can do tomorrow. So, for me, like, one thing I told Clarence about, is, like.

188 00:16:30.760 00:16:48.119 Uttam Kumaran: Right now, so I run a small data and AI consultancy. We’re about 25 people. Started the business, like, two and a half years ago. My background is in data engineering and computer engineering, but we work with a bunch of clients, sort of, like, anywhere from, like, 20, 30 million in revenue to, like, several hundred.

189 00:16:48.260 00:16:55.820 Uttam Kumaran: And we’re growing really quickly. However, I’ve built a business really using AI from, like, basically day one.

190 00:16:55.950 00:17:15.620 Uttam Kumaran: And right now, we’re automating it kind of from the inside out, very, very programmatically, and in a process-driven way, where we are doing internal time studies, we are mapping out tasks that people are doing, and we are building skills, automations, workflows, SOPs, in order to do that. So something that probably rhymes with a lot of what you’ve done, except

191 00:17:15.619 00:17:24.160 Uttam Kumaran: now it doesn’t just live as a document, like, something ends up using it, right? Which is the toughest part, I think, is policy adherence and actually doing that, so…

192 00:17:24.740 00:17:41.130 Uttam Kumaran: I think what’s interesting is that, one, Clarence and I met, and he saw that I was actively doing this with basically no budget, no… it was a completely bootstrap business, and there’s nobody from, like, any big, recognizable brand, nobody in my business. Someone interned at EY,

193 00:17:41.460 00:17:46.390 Uttam Kumaran: Maybe. And… but nobody’s coming from a consulting background. I have no background in consulting.

194 00:17:46.480 00:18:01.210 Uttam Kumaran: I just like to do great data work and AI work for clients. So we did a lot of AI stuff for our business, and then now we’re actually able… we found that the AI work that we did for ourselves was about a year ahead of, like, what people were selling.

195 00:18:01.250 00:18:08.949 Uttam Kumaran: In the industry. Which means, for the last 2 years, nobody really, like, cared about what I was talking about, because it didn’t make any sense. Until, like, 8 months.

196 00:18:09.280 00:18:18.669 Uttam Kumaran: 6 to 8 months, something… and we listed, like, sort of, like, what are maybe the catalysts, whether it’s Super Bowl or something. The noise is very, very high.

197 00:18:18.720 00:18:25.800 Alina Forin: The problem is, is still, like, I think some people have missed the boat on understanding what the LLM is. I don’t know if they’ll ever get there.

198 00:18:25.810 00:18:45.149 Uttam Kumaran: But it’s now so loud for them that they, like, have to buy something, and so we want to be in the market for them to buy something. However, in us, you have people that actually can do this thing, and in fact, we need people that are very process-oriented to build some of these systems and guide us to building more of them. And so internally, we are basically

199 00:18:45.150 00:18:56.310 Uttam Kumaran: We are, internally are automating our own business, and it’s working. We’re doing really well on the margin side. We have people that have no experience doing some of the work that they’re doing at, like, a really high skill level, all the sort of talking points.

200 00:18:56.310 00:19:10.110 Uttam Kumaran: done it with zero budget, which means, like, if I even have, like, a fifth penny, I know how to use it. And, like, I think what Clarence is trying to work with me on is how do we get this beyond what I’m capable of with my current business, which is just selling to, like.

201 00:19:10.110 00:19:25.050 Uttam Kumaran: you know, we’re reaching towards, like, billion-dollar companies. It still may take a few years, versus, like, okay, how do we enter into Fortune 1000 faster? And so, for me, I will always tell clients, I think we were gonna get there. On a timeline that matters, I’m not sure.

202 00:19:25.230 00:19:36.960 Uttam Kumaran: And so when we started talking, we’ve been friends now, I think, for a year or so. We started talking the last 2-3 months, it’s like, okay, how can we do some work? Like, Clarence just made some money, is now also looking for side quests.

203 00:19:37.140 00:19:55.559 Uttam Kumaran: And then started seeing our business from the inside, and it was like, holy shit, dude, like, why aren’t you selling higher? I’m like, dude, I’m nobody. I’m like Joe Schmo. I built this whole thing, which is crazy, but then you’re talking to me, like, why are you selling to enterprise? I’m like, dude, what the fuck are you talking about? So that’s sort of, like, where we’re, like.

204 00:19:56.120 00:19:57.659 Uttam Kumaran: Where we’re at.

205 00:19:58.340 00:20:14.360 Clarence Stone: And from our standpoint, Utom, like, the kind of shit that they pitch at us, the kind of stuff that we’re even able to sell at EY, is, like, 2 years behind what we’re doing. That’s why, like, there’s this misalignment, and we need this expertise to figure it out.

206 00:20:14.390 00:20:14.870 Alina Forin: Right.

207 00:20:15.100 00:20:22.199 Uttam Kumaran: Yeah. Which is so crazy to me, which is where I’m like, dude, I don’t even get… I don’t even get it, but, like, fair, okay? I believe you.

208 00:20:22.200 00:20:38.549 Alina Forin: I think, like, the… where, like, Clarence and I started to, like, reconnect again after, like, he ditched me when I came to, ACL, I still remember that, like, I got a client that I was pitching for 2 years, of, like, hey, you need AI, you need that, and they’re a…

209 00:20:38.670 00:20:46.729 Alina Forin: one of the sovereign wealth funds on UAE, and they had a directive from the top down, like, you have to use AI. So to them, as, like.

210 00:20:47.140 00:21:02.129 Alina Forin: business people, they’re like, I need some widgets, and I need to put this widget so I can check a box and be done. And that was my whole thing with them, because, like, they’re, like, one of my, like… I… that was a client that started with Xero, and, you know, we’re doing, like, you know, 8 mil of, like, consulting work. Or, like, tax work, too.

211 00:21:02.280 00:21:15.989 Alina Forin: And I was like, Clarence, I’m like, how do I even, like, approach this? Like, I sold them this dream of, like, oh, they’re gonna want an assessment, they gave us 50K, like, here, like, kind of, like, show us what you got. And I’m like, what the fuck am I doing? And then Ham and I were just kind of brainstorming a bit, I’m like.

212 00:21:15.990 00:21:25.550 Alina Forin: This is basically a tax functional assessment, which I’ve done in the past, many, many times, and just with a very specific lens of AI. And once I broke it down, I’m like, oh shit, this is gonna be a cakewalk.

213 00:21:25.590 00:21:39.150 Alina Forin: So, you know, right now, I’m gonna be a little bit careful about, like, what projects I pick. Sure, sure, sure. I want to do the ones that are gonna have an AI impact, so I’m like, to me, this is, like, I’m just gathering data, data, data.

214 00:21:39.150 00:21:39.780 Uttam Kumaran: Yeah, yeah, yeah.

215 00:21:39.780 00:21:52.149 Alina Forin: Let me see all the different opportunities, how messed up are your systems, which most of them are extremely messed up, because they build it for finance, but then they build it for, like, what they need for, like, one slither, and then every

216 00:21:52.470 00:21:58.850 Alina Forin: This is an afterthought, and then they spent billions later getting it to where it should have been from the beginning.

217 00:21:58.870 00:22:17.460 Alina Forin: So it’s like, the short-sightedness is prevalent through the industry, like, widespread. So now you’re asking people who are short-sighted, don’t have the architecture, to now slap up AI, and they’re like, give me AI, sure, just, like, let me license it so I can check a box. Like, that’s pretty much where I think the industry is at a whole.

218 00:22:17.830 00:22:27.619 Uttam Kumaran: That’s where… so that’s where I was selling to a year ago, which is like, okay, we bought Glean, or like, we turned on ChatGPT for everyone. If you guys were two years ago, then yeah, that’s about right.

219 00:22:27.620 00:22:33.309 Alina Forin: Yeah, you’re basically… we’re just catching up, and, like, things in my head are starting to click. Yes.

220 00:22:33.310 00:22:40.349 Uttam Kumaran: What you’re probably seeing is the pace, but now you’re also seeing, if you’re talking to Clarence or seeing what’s happening.

221 00:22:40.350 00:22:47.800 Alina Forin: it’s gonna catch up to exactly where you think it’s gonna go. Yeah, he showed me, like, that agent that, like, basically took your notes and organized them, like.

222 00:22:48.070 00:22:52.890 Alina Forin: we have a multi-billion dollar PMO function, Eddie Wise.

223 00:22:52.890 00:23:10.689 Uttam Kumaran: Yes, I know. I decided not to do a PMO at my company because of what we saw in the last 6 months. Like, I’ve basically automated most of traditional product… I was a product manager for a while, and we’ve, like, replaced a lot of it. We had a plan to roll out a PMO and do things the normal way.

224 00:23:10.770 00:23:22.979 Uttam Kumaran: And then I… I was always hesitant, I don’t really like project management much, and then I was like, we’re close, we’re close, and then one day we nailed it, we did most of it through AI, and then I’m like, cool, we have a line of sight to the rest, and we’re, like.

225 00:23:23.300 00:23:29.249 Uttam Kumaran: now people can just do their work and update a skill that updates all the tickets, produces a Gantt chart.

226 00:23:29.250 00:23:42.769 Alina Forin: This is, like, it got me thinking, and I was like, I actually went internally, I’m like, I have this potential idea, will anyone let me do it? And they’re like, absolutely not. I was like, cool, we’re done.

227 00:23:44.300 00:23:52.649 Uttam Kumaran: Yeah, so I think, I think that’s sort of, like, where we’re at, but you tell me, like, what is your motivation, what’s next for you, and, like, what, like, yeah.

228 00:23:52.650 00:24:08.170 Alina Forin: So I guess where… where this… how we got here today was, like, you know, I’m willing to go on SideQuest, and I was like, you know, I… I want to learn AI, and I want to feel… I want to, like, actually put this… basically, it came from, like, I would want to put on my resume is, like, I know what I’m doing.

229 00:24:08.310 00:24:25.520 Alina Forin: And then Clarence is like, well, why don’t you do more than that? Why don’t you actually, like, actually do stuff, go into SideQuest, you know, you learn from us, which is, like, to me, at one point, I was like, I’m not even, you know, to me, I’m like, I wanna do an investment. Like, you’re gonna invest in me, I’m gonna invest in you, like…

230 00:24:25.770 00:24:40.390 Alina Forin: you’re gonna learn a ton from me, I’m gonna learn a ton from you, like, to me, that’s… we’re kind of square. Where that goes after that, like, we’ll see. I just need to kind of just frame out, like, you know, like, what are… what are your real business needs? Like, where do you need someone like me? And…

231 00:24:40.390 00:24:47.919 Alina Forin: let’s just jump in and do shit, and if it’s working, it works. We continue. If it doesn’t, we walk away, and we shake hands, and I’ll see you when I see you. You know, that’s how I look at it. Okay.

232 00:24:47.920 00:25:02.199 Uttam Kumaran: I mean, I texted Clarence, yeah, I mean, look, the fun thing about doing this is my business started by asking all my friends for favors, and now we’re a little bit more formal, but, like, I still do things like that. Like, we have a lot of people, I’m like, yo, just come see what you can help us with.

233 00:25:02.430 00:25:10.769 Uttam Kumaran: And then, like, let’s just see if you can help them with something, and then I’ll find a path forward. And so, what you have in our business, you have a very fun playing ground of a

234 00:25:11.080 00:25:25.989 Uttam Kumaran: pretty traditional IT consultancy that is using AI extremely… like, it’s trying to use AI in a very methodical way across every single role in the business. You have a group of people who are very open to it, because I haven’t let… I don’t let

235 00:25:26.050 00:25:34.479 Uttam Kumaran: a lot of the people at EY they’re used to working with into my company, they just don’t make it in the front door. And you have a senior leadership

236 00:25:34.480 00:25:48.090 Uttam Kumaran: Who’s, like, very, very pushing. So you have, like, a perfect storm, except no budget. Which we’re changing. But, like, otherwise, it is a perfect ground to try and learn, because failure is just doing things a normal way, actually, for me.

237 00:25:48.440 00:25:50.710 Alina Forin: Yeah, and I think that’s where I’m like.

238 00:25:50.880 00:25:57.469 Alina Forin: in the beginning, let’s just see how this works. Like, I’m gonna learn from you, whatever I take here, I’ll be able to sell it better at my spot.

239 00:25:57.470 00:25:57.920 Uttam Kumaran: S.

240 00:25:57.920 00:25:59.790 Alina Forin: So, like, to me, it’s like…

241 00:25:59.930 00:26:02.619 Alina Forin: Our capital is knowledge right now.

242 00:26:03.050 00:26:03.460 Uttam Kumaran: Yeah, yeah, yeah.

243 00:26:03.460 00:26:05.050 Alina Forin: Let’s do a knowledge share.

244 00:26:05.050 00:26:14.189 Uttam Kumaran: Of course, you know, our goal is gonna be, like, we want to work with you on something. So, like, I’m gonna say that out loud, is, like, I want you to come in and see how fast we’re doing this, and be like.

245 00:26:14.230 00:26:15.790 Alina Forin: Oh, damn, like…

246 00:26:15.950 00:26:19.360 Uttam Kumaran: They’re gonna go make a bunch of money doing this, we should go make some money doing this.

247 00:26:19.360 00:26:20.740 Alina Forin: All on board.

248 00:26:21.180 00:26:23.999 Alina Forin: up and on, and, like, maybe this, like, you know.

249 00:26:24.720 00:26:33.689 Alina Forin: We gotta go. Alright, I do gotta run. Okay, okay, let’s keep texting, like, players, maybe throw us in a group or something. Yeah, throw us in a group chat, we’ll get some time, and then we’ll dig a little deeper and figure it out.

250 00:26:34.010 00:26:34.660 Uttam Kumaran: Perfect, perfect.

251 00:26:35.480 00:26:36.770 Clarence Stone: Everyone stick around? Okay.

252 00:26:36.770 00:26:38.520 Uttam Kumaran: Yeah, I’ll say right. Yeah, I’ll say right.

253 00:26:38.520 00:26:39.090 Clarence Stone: Cool.

254 00:26:39.090 00:26:39.780 Uttam Kumaran: Thank you.

255 00:26:40.210 00:26:42.710 Clarence Stone: Yo, I had a great chat with Amber. She’s like…

256 00:26:42.710 00:26:47.169 Uttam Kumaran: She just met, she just met with me. Yo, she’s very smart, but is also, like.

257 00:26:47.500 00:26:53.410 Uttam Kumaran: sometimes nervous, until… I mean, but she goes on rabble. She literally just messaged me about intent engineering.

258 00:26:54.350 00:27:03.029 Uttam Kumaran: A bunch of stuff. She said, I chatted with Clarence, I’m very excited, I would love to be part of this next stage. I created skills, and I’m talking to Bea tomorrow about skills.

259 00:27:03.680 00:27:06.489 Uttam Kumaran: Clarence said starting with feedback is skills, blah blah blah.

260 00:27:06.730 00:27:09.400 Uttam Kumaran: I think maybe my thought is, like.

261 00:27:09.540 00:27:12.690 Uttam Kumaran: I didn’t jump into that call thinking we would have a new role for them.

262 00:27:12.870 00:27:16.010 Uttam Kumaran: I think it makes sense to, but I want to be careful in that

263 00:27:16.370 00:27:24.520 Uttam Kumaran: And she was… she was… she’s… she’s aware, because… so I worked with Amber for a long time. She was actually… we were originally going to make her our Chief of staff.

264 00:27:25.010 00:27:34.940 Uttam Kumaran: At the companies. And she was going that direction, and then she actually, like, I don’t know, we just found that, like, we needed her to keep doing work, and then she ended up in strategy.

265 00:27:35.240 00:27:38.130 Uttam Kumaran: But she does have this, like, higher level,

266 00:27:38.690 00:27:40.849 Uttam Kumaran: She has an ability to reflect.

267 00:27:41.200 00:27:48.249 Uttam Kumaran: And get unemotional about things, that really helps with this type of work. And to think about things very, very objectively.

268 00:27:48.340 00:28:03.730 Uttam Kumaran: Versus, like, meaning, like, you could be like, this thing sucks, and she’s like, okay, like, meaning she’s very good with things like that, which I feel like has helped for this type of work. I wonder, A, I think Mustafa’s gonna say the same thing. Like, I just didn’t want to offer anything, because I don’t know what this is.

269 00:28:03.730 00:28:06.939 Clarence Stone: I didn’t offer anything. I said, listen, like, I don’t know what this is called.

270 00:28:06.940 00:28:07.819 Uttam Kumaran: No, I said the same thing.

271 00:28:07.820 00:28:18.599 Clarence Stone: Or, like, the Wax On Wax Office, but, like, we need to figure it out, right? Like, like, we are just on the frontier, like, look at this video, there’s, like, 5,000 views.

272 00:28:20.110 00:28:21.100 Uttam Kumaran: Yes, yes.

273 00:28:21.100 00:28:31.210 Clarence Stone: Like, this is… this is new, new, new, new, new, right? Like, so, when she asked me, she’s like, what’s the role, what do I do? I’m like, like, you tell me.

274 00:28:31.630 00:28:44.269 Clarence Stone: Right? Let’s figure that out. But, like, when you, you know, put it down to priorities, the same priorities that we aligned on earlier, like, last week, where I was like, hey, one, let’s crowd around, making sure that the EP replacement is good.

275 00:28:44.330 00:28:52.910 Clarence Stone: Right? Which is, like, review those outputs, continue to refine and improve those skills with me. And in doing that, you’re gonna learn how skills are made.

276 00:28:53.590 00:28:54.530 Clarence Stone: Right?

277 00:28:54.530 00:28:55.670 Uttam Kumaran: Yeah, I agree.

278 00:28:55.890 00:29:06.689 Clarence Stone: And then, like, from there, you have, like, so many paths. Like, you could be, like, I want to build on the platform, and make these automations more real, because I have better requirements.

279 00:29:06.690 00:29:17.279 Clarence Stone: Or, I want to make an agent management layer, right? And manage these agents and workflows and check on the quality, right? Like, but there’s so many different paths from there.

280 00:29:18.190 00:29:21.089 Clarence Stone: Or you could just say, I love this concept, and go sell it.

281 00:29:21.280 00:29:22.050 Clarence Stone: Whatever you want.

282 00:29:22.050 00:29:22.590 Uttam Kumaran: Yeah.

283 00:29:23.780 00:29:25.330 Clarence Stone: Right? So, yeah.

284 00:29:25.880 00:29:26.600 Clarence Stone: I hope she’s.

285 00:29:26.600 00:29:27.080 Uttam Kumaran: and willingly.

286 00:29:27.080 00:29:29.270 Clarence Stone: impression that this is some sort of promotion or anything.

287 00:29:29.270 00:29:37.489 Uttam Kumaran: No, no, no, I think what we’re gonna find, though, is that this is the coolest team on the company, but, like, not everybody…

288 00:29:37.890 00:29:39.710 Uttam Kumaran: can do this. That’s what I think…

289 00:29:40.080 00:29:47.660 Uttam Kumaran: that’s probably what I want to get to… I wanna… and I don’t… you may already know this, and you may just be like, it does… like, I think…

290 00:29:47.810 00:29:54.949 Uttam Kumaran: I don’t know. For me, I’m like, I think in asking… in coming to, like, the CSO meeting and asking everybody to do it, I don’t think it’s gonna work. I don’t think they’re gonna do it.

291 00:29:55.180 00:30:04.700 Uttam Kumaran: And I don’t… it’s like, I would rather take the energy we’re gonna do in trying to convince them and just do it for them, because they’re gonna use it. And so, that was sort of my, probably.

292 00:30:05.120 00:30:09.509 Uttam Kumaran: that’s my feedback for you, and you can tell me I’m wrong, and we’ll see either way, but, like.

293 00:30:09.510 00:30:15.320 Clarence Stone: Fully agree. I don’t think the CSOs are going to percolate. I think the SLs might. They might understand that.

294 00:30:15.320 00:30:21.310 Uttam Kumaran: But it’s like, who cares? As long as we get B and a couple of these guys to build it, and then we’re like, dude, it’s so obvious, use this thing. Start cheating.

295 00:30:21.310 00:30:23.120 Clarence Stone: I think that’s what, like, I…

296 00:30:23.120 00:30:33.939 Uttam Kumaran: they don’t… I ultimately come back to, like, nobody needs to know how it works. Nobody needs to know how a skill works or how it’s created. Just use them. Don’t worry too much about it.

297 00:30:34.090 00:30:37.659 Uttam Kumaran: I think you’re… you are very, like, progressive in, like.

298 00:30:39.020 00:30:55.190 Uttam Kumaran: And again, maybe it’s… maybe it’s… maybe it’s because, like, I don’t know, I think you’re… you’re very positive in the fact that, like, this is gonna power you. I don’t think people want to fish. Like, they want to sit down and get the fish. Baked… they want to get the baked fish handed to them. They’re not trying to fish, they’re not trying to know how the rod works.

299 00:30:55.190 00:31:02.039 Clarence Stone: Look at the people I’m around. Yuri calls me when he’s free, and he goes, I want to fish. Teach me how. Let me.

300 00:31:02.040 00:31:03.990 Uttam Kumaran: No, no, oh my gosh, I know, but like…

301 00:31:03.990 00:31:07.640 Clarence Stone: I’m surrounded by. So, like, my perception is slightly different.

302 00:31:07.640 00:31:17.560 Uttam Kumaran: I’m not… I’m not… I’m not in any way… I’m not in any way… no, I agree, but I’m not in any way commenting on the fact that, like, in order to survive longer term, you have to fish. I agree with you, but…

303 00:31:18.600 00:31:23.110 Uttam Kumaran: Like, I was like, okay, I think Amber is realizing that.

304 00:31:23.220 00:31:25.599 Uttam Kumaran: Some people will realize that, or they won’t.

305 00:31:26.130 00:31:31.249 Uttam Kumaran: And if they do, then we have a place that they can start building for themselves. If they don’t.

306 00:31:31.690 00:31:37.749 Uttam Kumaran: then they don’t, and they still got a place for as long as we have a place, which I think is still gonna be for a while, but, like.

307 00:31:37.750 00:31:38.250 Clarence Stone: Long time.

308 00:31:38.250 00:31:41.780 Uttam Kumaran: Again, it’s like, you could be on one side or the other. Amber woke up and was like, holy shit.

309 00:31:42.370 00:31:43.160 Uttam Kumaran: Fide.

310 00:31:43.700 00:31:48.939 Uttam Kumaran: you’re gonna automate this. And I said, yeah, but she’s like, what’s gonna happen? I’m like, dude, it’s not… that’s not like… don’t think about it, like.

311 00:31:49.120 00:31:58.889 Uttam Kumaran: Don’t think about it like that. I was like, we need people to help do a lot of that, and there’s… also, we’re in client service. I talked to someone today, I said, the meeting is the… what matters.

312 00:31:59.300 00:32:07.189 Uttam Kumaran: Like, and unfortunately, we can’t automate that, so more meetings with clients, where you come to the meeting, you deliver shit, and you keep doing that.

313 00:32:07.620 00:32:09.639 Uttam Kumaran: We’re gonna be golden, you know?

314 00:32:09.640 00:32:13.080 Clarence Stone: Yeah. Versus more time in Jira, more time in IDE.

315 00:32:13.080 00:32:14.439 Uttam Kumaran: Things like that, you know?

316 00:32:14.920 00:32:25.170 Clarence Stone: Agreed. Yeah, I mean, Amber poked and prodded at me for a while to, like, tell the full thing. And when I finally gave her that video, she’s like, oh.

317 00:32:25.910 00:32:34.589 Clarence Stone: I see. Because, like, I kept telling her, like, what you do is actually going to be more valuable in the third turning, you just have no idea.

318 00:32:35.370 00:32:56.800 Clarence Stone: Like, we need people who understand, process, and construct companies, and create context layers, and understand, like, workflows and things like that. It has nothing to do with technology. Like, you probably should never even bother touching cursor. Like, what comes next is that human connection and understanding how, like, we’re gonna be able to plug a platform.

319 00:32:56.800 00:32:58.420 Clarence Stone: As a, you know, service.

320 00:32:58.420 00:33:02.990 Clarence Stone: And then do that 20% last mile, that makes it contextual.

321 00:33:03.230 00:33:03.910 Uttam Kumaran: Yeah.

322 00:33:03.930 00:33:06.870 Clarence Stone: Right, like, I was like, honestly, like.

323 00:33:07.220 00:33:23.620 Clarence Stone: you should probably know, like, how it’s made, but, like, it’s not your expertise. Like, it doesn’t matter. Like, that’s what I told Yuri today, too. I was like, I think you’re missing something, bro. Like, you want to learn about AI, sure, but you only need to know about it so as far as it touches process.

324 00:33:24.230 00:33:30.629 Clarence Stone: And then how we, like, implement it, and how it changes the process? Cool. You don’t have to code, dude.

325 00:33:31.060 00:33:32.179 Clarence Stone: Yeah. Gone.

326 00:33:33.060 00:33:39.570 Uttam Kumaran: Well, I think for people like… for people like me, I, I can’t, like, live a life without knowing how the thing works.

327 00:33:39.570 00:33:41.199 Clarence Stone: Yeah, me too.

328 00:33:41.200 00:33:42.070 Uttam Kumaran: So…

329 00:33:42.280 00:33:52.160 Uttam Kumaran: I fundamentally just don’t get it, but for what I do know, I don’t understand it, but I do know that these people exist. And for them, yeah, it’s perfect. Just fucking use the thing.

330 00:33:52.320 00:33:59.390 Uttam Kumaran: Like, don’t worry too much about it. If you want, we can go deeper, but frankly, like, just do your job faster.

331 00:33:59.390 00:34:14.420 Clarence Stone: designer in me, Utam, goes all the time, then, like, what do we build in terms of experiences so that they will fucking use it? And how do we measure it, and how do we make sure that, like, it’s working, and the trains are on time, and the output.

332 00:34:14.420 00:34:18.720 Uttam Kumaran: But you don’t think the output is just copy whatever the biggest guys are doing?

333 00:34:19.210 00:34:21.190 Uttam Kumaran: Because they probably figured it out, okay?

334 00:34:21.190 00:34:33.950 Clarence Stone: So, I don’t think I texted you this, but I think the Frontier Labs are, like, really far off the gold these days. Like, as of probably a week ago, I’m seeing patterns that are complete pattern disrupts from the behavior.

335 00:34:34.070 00:34:42.970 Clarence Stone: OpenAI bought another open source company. OpenAI then launched a vibe-coded app that was clearly vibe-coded.

336 00:34:43.130 00:34:58.049 Clarence Stone: Claude is now going $200K for baseline context limit for Opus, and doing medium thinking. In SWEBench, that’s less than KimmyK2.5. So people are now coding with less than KimmyK 2.5, and they don’t even know it.

337 00:34:58.590 00:34:59.580 Uttam Kumaran: Mmm…

338 00:34:59.580 00:35:02.440 Clarence Stone: What the fuck are they doing? $25 code reviews?

339 00:35:02.650 00:35:08.259 Clarence Stone: You sell the problem, you sell the solution, like, it’s so obvious to the, like, the average person.

340 00:35:08.410 00:35:10.259 Clarence Stone: And I, I think, like.

341 00:35:11.000 00:35:24.160 Clarence Stone: they don’t know what to do next. They’re either out of money, or they’re in disarray because organizationally they have too many directions, right? Like, I don’t know what the problem is, but this pattern disrupt is weird to me, bro.

342 00:35:24.160 00:35:29.300 Uttam Kumaran: Well, I think they’re just… well, I mean, if I was to give you another perspective, I think they are…

343 00:35:29.820 00:35:33.969 Uttam Kumaran: Purely trying to get… do what we’re doing, is get into the enterprise.

344 00:35:34.690 00:35:36.639 Uttam Kumaran: I don’t think anything else matters.

345 00:35:37.250 00:35:37.870 Clarence Stone: Well.

346 00:35:37.870 00:35:42.630 Uttam Kumaran: So they’re just gonna keep releasing a suite of products, and then they hired a bunch of FDEs.

347 00:35:42.770 00:35:47.179 Uttam Kumaran: To be like, we have finance shit, oh, we have legal shit, like, you should just use our platform.

348 00:35:47.630 00:35:52.250 Uttam Kumaran: And it doesn’t matter if it’s half-baked to get the MSA. You saw, even for the government.

349 00:35:52.670 00:36:01.180 Uttam Kumaran: Right? That Emil thing showed me how even he was like, yo, I don’t want to fire you guys. It’s fucking hard to get, like.

350 00:36:01.380 00:36:03.070 Uttam Kumaran: Just make this work.

351 00:36:03.400 00:36:11.850 Uttam Kumaran: And I was like, oh, fuck, like, just getting the MSA sign and becoming the… like, exclusive?

352 00:36:11.850 00:36:12.770 Clarence Stone: Yeah.

353 00:36:13.380 00:36:20.439 Uttam Kumaran: I was like, oh, damn. Like, this is just business inertia keeping this thing in, so I think they’re just gunning for…

354 00:36:20.820 00:36:26.739 Uttam Kumaran: as many of those, and then I… I think they’re… I also don’t know, because we’ve never seen

355 00:36:26.970 00:36:30.160 Uttam Kumaran: Companies grow in valuation this big.

356 00:36:30.770 00:36:39.439 Uttam Kumaran: And so I don’t know, like, what… even if there is a strategy, if those are all, like, if they just have teams of people that are, like, you’re a one-man product team.

357 00:36:39.660 00:36:43.719 Uttam Kumaran: And, like, you’ve… we’re just gonna ship 10 products, kill 9 of them.

358 00:36:43.970 00:36:48.309 Uttam Kumaran: I don’t know, like, there’s some, like, laws of physics that have changed.

359 00:36:48.700 00:36:52.670 Uttam Kumaran: in product development that I don’t know, like, how that impacts strategy right now.

360 00:36:52.840 00:36:59.229 Clarence Stone: Yeah, and I don’t think they understand the structuring, and we’re more nimble to be able to figure that out, but I.

361 00:36:59.230 00:37:05.319 Uttam Kumaran: No, in our… in our… I think we’re able to go really deep in one area and just cut it out, versus, like, their…

362 00:37:05.810 00:37:07.969 Uttam Kumaran: Yeah, they’re playing this broad…

363 00:37:07.970 00:37:11.519 Clarence Stone: But if you call what Adam has said, like, the enterprises don’t trust them.

364 00:37:11.940 00:37:14.499 Clarence Stone: Like, they can try all day to try to sell them.

365 00:37:14.500 00:37:25.189 Uttam Kumaran: No, and that’s also where, like, again, I don’t know… yeah, I think you should… if you’re… I saw people tweet about, like, yeah, there’s some customers that are… that are gonna drop Anthropic just because of the risk of, like.

366 00:37:25.350 00:37:28.170 Clarence Stone: Yeah. The government labeling them a supply chain risk.

367 00:37:28.800 00:37:30.730 Uttam Kumaran: That’s why they sued them immediately.

368 00:37:31.680 00:37:33.860 Uttam Kumaran: But again, this is where I’m almost like…

369 00:37:34.360 00:37:46.060 Uttam Kumaran: you enterprise guys, just use whatever story fucking works today, I don’t care. I don’t give a fuck about any of these stories. Just use whatever is on their mind, and you know what’s on their mind, and…

370 00:37:46.400 00:37:47.160 Uttam Kumaran: Bangin’ out.

371 00:37:47.160 00:37:51.820 Clarence Stone: By the way, Yuri’s a sales monster. He’s the only one that had a bigger book than me.

372 00:37:52.290 00:37:55.830 Uttam Kumaran: Mmm, okay. Where’s he from? His accent was interesting.

373 00:37:55.830 00:37:57.300 Clarence Stone: Jersey.

374 00:37:57.960 00:37:58.800 Uttam Kumaran: Okay, okay.

375 00:37:58.800 00:38:03.739 Clarence Stone: That’s why I was like, oh, Robert’s over there, like, when he comes back from Panama, you should go see him in New York.

376 00:38:04.250 00:38:06.799 Uttam Kumaran: Yeah, I had to go to New York,

377 00:38:07.390 00:38:09.160 Uttam Kumaran: So we should go for something.

378 00:38:09.160 00:38:09.880 Clarence Stone: Yeah.

379 00:38:10.380 00:38:16.969 Clarence Stone: Yeah, say hi to him. He’s… he’s on the Jersey side, but he’ll… he’ll sweep over.

380 00:38:17.770 00:38:21.920 Clarence Stone: What other thing was I gonna show you? Oh, I did a thing.

381 00:38:22.400 00:38:24.890 Clarence Stone: Let me just give you the…

382 00:38:24.890 00:38:26.200 Uttam Kumaran: Give me some design stuff.

383 00:38:27.330 00:38:28.370 Clarence Stone: Yo.

384 00:38:28.680 00:38:33.840 Clarence Stone: you’re… I… I just… So, background?

385 00:38:33.980 00:38:41.340 Clarence Stone: I got obsessed, I hit the cursor limit, then I started coding in Windsurf last night.

386 00:38:41.670 00:38:46.079 Clarence Stone: And let me just show you what I built. This is an Astro Design Library.

387 00:38:49.810 00:38:50.780 Clarence Stone: Right.

388 00:38:51.090 00:38:53.120 Clarence Stone: Here’s all your standard components.

389 00:38:54.290 00:38:55.689 Uttam Kumaran: Oh, nice!

390 00:38:55.690 00:39:03.150 Clarence Stone: Right, and by the way, like, the way I design things, everything’s reactive, and you’ll see there’s subtle touches and animations and sweeps. Like.

391 00:39:03.460 00:39:04.580 Clarence Stone: See that? Yeah.

392 00:39:04.850 00:39:05.850 Clarence Stone: Like…

393 00:39:05.850 00:39:06.690 Uttam Kumaran: Bye.

394 00:39:07.140 00:39:21.009 Clarence Stone: And then, like, accents, tones, gradients, and all that shit, right? So, this is normal. Then we did custom charts, because, like, this is absolutely necessary. Oh, it’s not loading. Let me… let me go to local… localhost. Yeah.

395 00:39:22.480 00:39:27.009 Clarence Stone: And… Animated bars with glows.

396 00:39:27.950 00:39:33.220 Clarence Stone: Right? That’s normal shit. We’ve got nodes, all that stuff, mermaid diagrams, flowcharts.

397 00:39:33.920 00:39:35.460 Clarence Stone: All of this is code.

398 00:39:36.240 00:39:37.090 Uttam Kumaran: Wow.

399 00:39:38.430 00:39:39.160 Clarence Stone: Slide deck?

400 00:39:39.160 00:39:40.089 Uttam Kumaran: Oh, that’s scam.

401 00:39:40.090 00:39:42.319 Clarence Stone: I got you. Agents.

402 00:39:42.430 00:39:45.569 Clarence Stone: Here’s your agent instruction. Download your slide scale.

403 00:39:49.180 00:39:49.710 Clarence Stone: And this.

404 00:39:49.710 00:39:51.150 Uttam Kumaran: This is the interesting.

405 00:39:54.710 00:39:56.179 Clarence Stone: All that, right?

406 00:39:56.400 00:40:00.119 Clarence Stone: Here’s what’s even sicker. I made the entire social stack.

407 00:40:02.010 00:40:03.559 Clarence Stone: These are real ads.

408 00:40:09.420 00:40:10.390 Uttam Kumaran: Wow.

409 00:40:16.030 00:40:16.570 Clarence Stone: Right.

410 00:40:16.570 00:40:18.920 Uttam Kumaran: Well, dude, you know how I’m gonna go even further.

411 00:40:19.050 00:40:22.210 Uttam Kumaran: I have some Google Drives filled with

412 00:40:22.430 00:40:27.069 Uttam Kumaran: all of the best McKinsey Deloitte decks, like, ever made.

413 00:40:27.070 00:40:28.820 Clarence Stone: Somewhere on my laptop.

414 00:40:30.060 00:40:33.580 Uttam Kumaran: I, you know, one day I got very obsessed with, like,

415 00:40:33.940 00:40:38.360 Uttam Kumaran: I’ve done, like, deep dives with all these companies, but I went and found, like, on Reddit.

416 00:40:38.550 00:40:42.179 Uttam Kumaran: And, like, some sites where people had accumulated all the gaps.

417 00:40:42.890 00:40:49.340 Uttam Kumaran: And, I haven’t tried full of them. Decks and one-pagers and assets that are, like, actually decent.

418 00:40:49.980 00:41:04.570 Clarence Stone: So, there’s MD files for implementing the design system, creating slides, component implementations, these are all skills that were generated. Responsive and mobile web design. So this, like, you just have to download this, add it to your system, and start coding.

419 00:41:05.400 00:41:08.150 Clarence Stone: And it’ll just inherit an entire design system.

420 00:41:08.820 00:41:12.469 Clarence Stone: Slide creation, Integration to AI workflows.

421 00:41:15.200 00:41:17.440 Clarence Stone: Tenets of our design, why we do it.

422 00:41:19.690 00:41:23.360 Clarence Stone: Right? This is all baked into the MD file, I just surfaced it.

423 00:41:24.860 00:41:27.380 Clarence Stone: So, I want to make this for Brainforge.

424 00:41:28.220 00:41:30.389 Clarence Stone: All we have to do is a color swap.

425 00:41:30.710 00:41:35.549 Uttam Kumaran: I mean, dude, tell me what you need. We have all this, there’s a Figma file with our entire design library.

426 00:41:35.550 00:41:42.039 Clarence Stone: Yep. I built the full design kit, you saw that, and then I turned it.

427 00:41:42.040 00:41:42.820 Uttam Kumaran: Yeah.

428 00:41:43.440 00:41:44.290 Clarence Stone: And look, look at.

429 00:41:44.290 00:41:45.180 Uttam Kumaran: Wow.

430 00:41:45.180 00:41:48.160 Clarence Stone: Do you see this Gaussian blur effect up top?

431 00:41:48.420 00:41:49.050 Uttam Kumaran: Yes, yes.

432 00:41:49.050 00:41:50.419 Clarence Stone: Yeah, liquid glass.

433 00:41:50.960 00:41:51.750 Uttam Kumaran: Yes.

434 00:41:54.510 00:41:59.119 Clarence Stone: Yeah, so this is what you get from a Jedi Master front-end developer. That’s… that’s…

435 00:41:59.460 00:42:04.729 Clarence Stone: This is my only wheelhouse. This is my only house, but I should make one for you guys.

436 00:42:05.900 00:42:15.650 Uttam Kumaran: Well, yeah, I mean, I’m kind of like… well, I think Sam and Mustapos have to decide on, like, the desktop, but I basically want to have AI do a complete reskin of our existing platform.

437 00:42:18.130 00:42:25.169 Uttam Kumaran: And, like, I would like to just try every major design AI… I’d like to just learn how to design an AI in the next week.

438 00:42:25.490 00:42:27.300 Uttam Kumaran: Whatever’s, like, cutting edge.

439 00:42:28.000 00:42:29.830 Uttam Kumaran: So I just checked the box there, I’m like.

440 00:42:31.420 00:42:35.659 Uttam Kumaran: And, yes, I get handed off, but I want to see, like, should I try a paper?

441 00:42:35.890 00:42:37.470 Uttam Kumaran: Should I try Figma?

442 00:42:37.830 00:42:41.180 Uttam Kumaran: I want to take a look at different things that it’s building.

443 00:42:41.180 00:42:45.740 Clarence Stone: So I skipped all that, I created the fundamental design principles on paper.

444 00:42:45.880 00:42:55.780 Clarence Stone: So that was the 12, like, PDFs that you saw. It was just the look and feel, the essence of it, right? And then I took Tailwind, and I said, make Tailwind this.

445 00:42:57.500 00:43:08.480 Clarence Stone: Right, and it just started… it takes a lot, because AI sucks that front end, unfortunately. But, like, a couple rounds, and it gave me this.

446 00:43:08.670 00:43:16.610 Clarence Stone: And by the way, the next thing I’m gonna do with these socials is I want somebody to be able to prompt it with the right language, and just generate it.

447 00:43:17.520 00:43:18.219 Clarence Stone: So, let’s say…

448 00:43:18.220 00:43:18.830 Uttam Kumaran: Yeah.

449 00:43:18.830 00:43:27.649 Clarence Stone: this, you can say, give me a format one by one, with a dot grid and edge fade, with, like, this grid pattern.

450 00:43:27.790 00:43:31.399 Clarence Stone: And these variables, and then this is what the text needs to be.

451 00:43:32.260 00:43:34.020 Clarence Stone: And then, boom, you just made it.

452 00:43:36.370 00:43:37.090 Uttam Kumaran: Damn.

453 00:43:38.480 00:43:44.280 Clarence Stone: Right, same thing here, like, this is why there’s so many examples, like, you just swap the image, swap the text, done.

454 00:43:47.470 00:43:48.170 Uttam Kumaran: Wow.

455 00:43:48.930 00:43:49.790 Clarence Stone: Right?

456 00:43:54.900 00:44:01.719 Clarence Stone: Yeah, so that’s… that’s what I used a shit ton of credits for, but I think it’s saying.

457 00:44:01.720 00:44:03.180 Uttam Kumaran: Oh, that’s sick.

458 00:44:03.180 00:44:07.110 Clarence Stone: now I know, like, the process, so it’ll be quicker.

459 00:44:07.580 00:44:08.800 Clarence Stone: For Ring Forge.

460 00:44:09.310 00:44:14.050 Uttam Kumaran: I mean, dude, we… I haven’t even finished out the HTML slides then, so even just being able to, like.

461 00:44:14.550 00:44:16.870 Uttam Kumaran: Do that for people would be helpful.

462 00:44:17.180 00:44:22.770 Clarence Stone: Yeah, so I’m gonna start with slides. I think, if you can give me your slide repo of McKinsey examples, that’d be sick.

463 00:44:23.250 00:44:23.570 Uttam Kumaran: Okay.

464 00:44:23.570 00:44:28.830 Clarence Stone: I’ll… I’ll make it smarter. Dude, it’s still so bad at layout sometimes, so I have to, like.

465 00:44:28.830 00:44:29.570 Uttam Kumaran: Yeah.

466 00:44:29.570 00:44:35.099 Clarence Stone: 12 standardized templates, and then you have to say, use template 1, Here’s the content.

467 00:44:35.800 00:44:36.610 Uttam Kumaran: Yeah.

468 00:44:38.430 00:44:39.180 Uttam Kumaran: Okay.

469 00:44:41.170 00:44:42.010 Clarence Stone: Yeah.

470 00:44:44.050 00:44:46.070 Uttam Kumaran: Yeah, we’re gonna extend,

471 00:44:47.420 00:44:53.350 Uttam Kumaran: this guy interviewed really well, like, doesn’t want any money at all, like, like, 80K.

472 00:44:53.350 00:44:54.280 Clarence Stone: What?

473 00:44:54.690 00:44:55.790 Uttam Kumaran: Data analyst.

474 00:44:56.030 00:45:01.030 Uttam Kumaran: Data analyst, no, no, no. And then… Jasmine…

475 00:45:01.190 00:45:04.419 Uttam Kumaran: We’re gonna extend an offer, too, but she wants to start later.

476 00:45:04.550 00:45:09.340 Uttam Kumaran: next… next April, or this April, at the end of the month, and then…

477 00:45:10.280 00:45:13.840 Uttam Kumaran: We have a few people on for AI that I wonder, almost, if you want to talk to.

478 00:45:14.220 00:45:14.570 Clarence Stone: Okay.

479 00:45:14.570 00:45:16.540 Uttam Kumaran: Or you want to join their final panel?

480 00:45:16.800 00:45:24.130 Clarence Stone: Yeah. If you can make it late, like, past 5, I can definitely make it for sure. If not, like, I’ll coordinate with Kayla.

481 00:45:26.120 00:45:28.520 Uttam Kumaran: Okay, yeah, I think AST… I think it’s…

482 00:45:31.450 00:45:32.750 Clarence Stone: I’ll reach out to her, don’t worry.

483 00:45:36.030 00:45:46.650 Uttam Kumaran: swapping AirPods and, like, restarting it. Yeah, I’ll tell Kayla, we have, I think, 3 people that are gonna end up in final rounds for AI.

484 00:45:47.050 00:45:48.500 Uttam Kumaran: Some of them seem good.

485 00:45:48.720 00:45:51.920 Uttam Kumaran: Have no clue what the price point is they’re thinking about, but…

486 00:45:52.160 00:45:52.930 Clarence Stone: Yeah.

487 00:45:52.930 00:45:54.799 Uttam Kumaran: Yeah, dude, Kayla’s really good, bro.

488 00:45:55.270 00:45:56.419 Uttam Kumaran: Yeah, let’s sad.

489 00:45:57.180 00:45:59.650 Clarence Stone: Yeah, she really got thrown into it, too.

490 00:45:59.780 00:46:01.049 Clarence Stone: It was great.

491 00:46:01.050 00:46:01.960 Uttam Kumaran: Kayla?

492 00:46:02.660 00:46:06.669 Uttam Kumaran: like… Out of everyone we hired in the last 6 months.

493 00:46:10.660 00:46:15.479 Uttam Kumaran: Probably the most equipped for the job that we hired her for, and absolutely dominated.

494 00:46:15.850 00:46:24.329 Uttam Kumaran: Like… We’ve hired some people who I threw in, and they swam. I don’t want to count that. Like, we hired her for, like, a hit job on this one thing, and she absolutely is dominating.

495 00:46:25.010 00:46:27.859 Uttam Kumaran: And there’s no AI. No AI yet.

496 00:46:28.130 00:46:28.580 Clarence Stone: Yeah.

497 00:46:28.580 00:46:29.280 Uttam Kumaran: Stay.

498 00:46:30.840 00:46:35.229 Clarence Stone: That’s great. It’s good that we’re plugging those gaps, because I.

499 00:46:35.230 00:46:35.750 Uttam Kumaran: Yeah.

500 00:46:35.750 00:46:37.050 Clarence Stone: For Greg today.

501 00:46:37.730 00:46:38.410 Uttam Kumaran: Yeah.

502 00:46:40.600 00:46:49.160 Uttam Kumaran: And then I wanna see, dude, I wanna see how we get, I’m gonna have some of the team… I think I’m gonna have Kayla, Luke come to Austin next month, because I’m gonna… I’m gonna present.

503 00:46:49.670 00:46:56.309 Uttam Kumaran: So I’m gonna present about, like… and that’s why I think, like, dude, if we can… if we start to get the story together, I could present

504 00:46:56.460 00:46:59.669 Uttam Kumaran: Like, a formal announcement of something at this presentation.

505 00:47:00.550 00:47:05.510 Uttam Kumaran: just because I want to… I’m going to present on sort of what running an AI-native services firm is like.

506 00:47:05.860 00:47:11.679 Uttam Kumaran: I want to use that slide deck and that talk track to go present a bunch of places.

507 00:47:11.680 00:47:13.010 Clarence Stone: Yeah. And basically.

508 00:47:13.080 00:47:16.739 Uttam Kumaran: Through our story, sell what we’re gonna sell, which is a perfect story.

509 00:47:16.810 00:47:18.170 Clarence Stone: It’s a perfect story.

510 00:47:18.530 00:47:19.400 Uttam Kumaran: You know?

511 00:47:20.150 00:47:23.989 Clarence Stone: Yeah, I think the top line is, like.

512 00:47:24.280 00:47:32.680 Clarence Stone: AI dances so fast, but implementation takes time, and intention, and planning, and change management, right?

513 00:47:32.840 00:47:37.050 Clarence Stone: And, like, you’re either sacrificing one or the other in most companies.

514 00:47:37.680 00:47:38.240 Uttam Kumaran: Yeah.

515 00:47:38.240 00:47:48.179 Clarence Stone: you don’t do that here. Like, I’m just gonna keep pushing you, like, design systems and random apps and stuff, and there’s gonna be more things on the shelf to sell as things progress.

516 00:47:48.760 00:47:49.320 Uttam Kumaran: Yeah.

517 00:47:50.390 00:47:51.260 Uttam Kumaran: I agree.

518 00:47:51.610 00:47:52.160 Clarence Stone: Yeah.

519 00:47:52.840 00:47:57.360 Clarence Stone: Alright, I don’t want to keep you, it’s your coffee shop time, so…

520 00:47:57.360 00:48:00.030 Uttam Kumaran: Dude, I need to… I’ve been here since 10am, I need to leave.

521 00:48:00.030 00:48:02.169 Clarence Stone: Oh, shit, really? You were working from there?

522 00:48:02.680 00:48:04.010 Clarence Stone: How do you do it, man?

523 00:48:04.340 00:48:07.069 Uttam Kumaran: Working from here? Well, my Wi-Fi went out earlier.

524 00:48:07.250 00:48:09.080 Uttam Kumaran: And…

525 00:48:09.820 00:48:14.899 Uttam Kumaran: I don’t know, I was in Cursor for, like, 2 hours. Dude, wait, let me show you something. Hold on.

526 00:48:15.470 00:48:21.280 Uttam Kumaran: I just keep swapping AirPods. They’re close, I’m on this fucking stupid MacBook, but .

527 00:48:22.230 00:48:24.160 Clarence Stone: Dude, I just ordered one through John.

528 00:48:25.440 00:48:34.040 Uttam Kumaran: I know, we just said we’re about to order ours through John, so… I basically…

529 00:48:34.200 00:48:35.519 Uttam Kumaran: Here it is.

530 00:48:37.630 00:48:50.349 Uttam Kumaran: This problem with me, dude, is I… every time I try to do one thing, I’m like, we should make this super reusable every other… if I fucking waste my fucking life not doing the main thing, but whatever.

531 00:48:50.510 00:48:55.099 Uttam Kumaran: I basically worked on the entire reusable playbook plan.

532 00:48:55.800 00:48:56.415 Clarence Stone: Oh.

533 00:48:57.210 00:49:03.899 Uttam Kumaran: Because, like, a few weeks ago, I had worked on the initial plan, which was just, like, we want to work on playbooks.

534 00:49:04.310 00:49:21.470 Uttam Kumaran: I finally had a use case for it, because we’re doing an end-to-end omni, like, BI tool deployment, and I’m like, this is a perfect use case for not only templated SOWs, templated project plans, templated linear milestones and projects, and templated tickets. And then I was working with it to basically say, like.

535 00:49:21.720 00:49:28.220 Uttam Kumaran: let’s go ahead and rip this for all of our services. And I basically wanted to create a read-only linear team.

536 00:49:28.580 00:49:33.260 Uttam Kumaran: That is, like, a sample team, where you can go pick a project, clone it to your clients.

537 00:49:33.400 00:49:35.470 Uttam Kumaran: And it comes with all the tickets.

538 00:49:36.090 00:49:40.870 Uttam Kumaran: And then also, you can also come in here and be like, I want to do this service, and it’ll go pick up

539 00:49:41.160 00:49:43.790 Uttam Kumaran: the SOW and the implementation plan.

540 00:49:43.980 00:49:46.290 Uttam Kumaran: And so I kind of built that out, so…

541 00:49:46.470 00:49:58.030 Uttam Kumaran: CSO can now pull a single folder from Vault, GTM, services, line, subservice, offering, get all four composable artifacts, the offer, the SOP, the implementation plan, the linear template.

542 00:49:59.090 00:50:02.770 Uttam Kumaran: there’s a service hierarchy now, which is… I basically said, like.

543 00:50:02.950 00:50:12.860 Uttam Kumaran: Think about a hierarchy, but, like, this may change. So we have… these are high-level services. The subservice, the offering, the phase, the deliverable, the ticket, so all the way down to the atomic unit, which is the ticket.

544 00:50:13.840 00:50:18.260 Uttam Kumaran: And then I also gave it some, like, helpful, like, literature on, like.

545 00:50:19.150 00:50:23.340 Uttam Kumaran: Like, what a really amazing runbook could look like from, like, a company that’s mapped out.

546 00:50:23.340 00:50:24.020 Clarence Stone: cabin.

547 00:50:24.910 00:50:28.310 Uttam Kumaran: Flex7 is, the guys that started VIXL.

548 00:50:28.680 00:50:29.710 Uttam Kumaran: So, Vixel…

549 00:50:29.710 00:50:33.939 Clarence Stone: I know that was the native MCP that was in OpenWork?

550 00:50:34.940 00:50:37.260 Uttam Kumaran: Oh, really? I don’t know, maybe there’s just some…

551 00:50:37.510 00:50:56.040 Uttam Kumaran: Yeah, so NTT Data bought them, and they are advisors of ours, and they have a small piece of Brainforge. Gotcha. And they basically… I give them a lot of credit for, like, the reason we became, like, really, really formal, versus just, like, a mix of, like, freelancers.

552 00:50:56.170 00:51:07.250 Uttam Kumaran: Oh, they redid their website. It’s mostly AI, I can tell. And yeah, they’re… so they ran Flux7, and then they started, basically, YC for IT service companies.

553 00:51:07.250 00:51:08.410 Clarence Stone: Gotcha.

554 00:51:08.600 00:51:12.719 Uttam Kumaran: And we are one of the standout People in their businesses now.

555 00:51:12.930 00:51:14.200 Uttam Kumaran: And there we are.

556 00:51:15.100 00:51:16.890 Uttam Kumaran: And we’re on the front page.

557 00:51:17.110 00:51:21.959 Uttam Kumaran: They just had, like, two… they started this a few years ago, and they just had two out… two outcomes.

558 00:51:22.400 00:51:27.799 Uttam Kumaran: From portfolio companies, but, like, I loved them because all they cared about was IT service businesses.

559 00:51:28.090 00:51:39.310 Uttam Kumaran: And they gave us all their resources on, like, how they thought about everything, and they scaled to, like, pretty big. And so they gave me all their runbooks and things like that, and now’s the perfect time where I was like, here’s a company who literally

560 00:51:39.510 00:51:42.689 Uttam Kumaran: Dude, they basically have a runbook, and you must have seen this…

561 00:51:42.830 00:51:51.420 Uttam Kumaran: at UI, so maybe nothing new. But bro, like, the most detailed shit of all time, like, I would… I was like, oh my god, we’re never gonna get here.

562 00:51:51.630 00:51:55.169 Uttam Kumaran: You know, like, here’s a good example. Let me share my old screen.

563 00:51:56.290 00:52:00.739 Uttam Kumaran: like… This is the Flux7 runbook.

564 00:52:01.410 00:52:03.450 Uttam Kumaran: Light.

565 00:52:03.640 00:52:11.610 Uttam Kumaran: how do you do capacity, like, every single thing. And it’s pages and pages of, like, the most minute fucking thing, like.

566 00:52:11.810 00:52:18.780 Uttam Kumaran: You know, user stories, how should they be written? How do we… how do we do… Yeah, this, like.

567 00:52:18.890 00:52:24.650 Uttam Kumaran: such good canonical information on burn… like, thinking about burndowns, like, really, really…

568 00:52:24.950 00:52:30.859 Uttam Kumaran: Like, how do you run a great retro, event scheduling, planning, blah blah blah blah blah, like…

569 00:52:30.980 00:52:37.590 Uttam Kumaran: So I was like, yo, put… I want to put this in our library, or somewhere, and then I also said, think about what you can learn from these.

570 00:52:37.850 00:52:39.769 Uttam Kumaran: I think what you can learn from this.

571 00:52:40.150 00:52:40.940 Uttam Kumaran: Right?

572 00:52:41.340 00:52:52.359 Uttam Kumaran: So, it did a good job at doing that, and yeah, I have to merge this PR, but basically, we now have a process where you can say to the CSO, I want to run this service.

573 00:52:52.520 00:52:58.710 Uttam Kumaran: There’s a bunch of other bullshit in here. You can now say, I want to run this service, and it helps you

574 00:52:58.970 00:53:06.720 Uttam Kumaran: basically pull out all the core plans that you need, and then eventually you start with the SOW, then you work on the… then you work on the…

575 00:53:06.830 00:53:26.219 Uttam Kumaran: the offers, the offer, the SOWs, the implementation plans, and those cascade into the tickets, and then eventually you should be able to be like, I want to do this for this client, this for this client. We must have talked about this, but, like, I want to walk in, client X wants this, this, and this, pick it off the shelf, linear tickets, AI agents, human in the review steps.

576 00:53:26.820 00:53:31.229 Uttam Kumaran: And then it’s up to you to press the brakes as hard as you want.

577 00:53:31.370 00:53:33.009 Uttam Kumaran: And makes up…

578 00:53:33.010 00:53:34.220 Clarence Stone: I have a… I have a.

579 00:53:34.220 00:53:34.620 Uttam Kumaran: Yeah.

580 00:53:34.620 00:53:39.099 Clarence Stone: for, design thinking and UX design and product.

581 00:53:40.100 00:53:43.359 Uttam Kumaran: Oh, great, yeah, I mean, I would love that, because this is all going to be data…

582 00:53:44.130 00:53:52.340 Uttam Kumaran: this is gonna be data and AI related, but, like, again, it has guidelines on how to name epics, how to name milestones, because we already thought about a lot of that.

583 00:53:52.530 00:53:59.370 Uttam Kumaran: So then it just… it just builds the higher level abstractions, because, dude, we’re close to people not being in linear either, bro.

584 00:53:59.830 00:54:00.430 Clarence Stone: Yeah.

585 00:54:00.430 00:54:03.450 Uttam Kumaran: I think we’re, like, a year away, but we’re close.

586 00:54:03.630 00:54:04.700 Uttam Kumaran: We’re close.

587 00:54:04.920 00:54:14.170 Uttam Kumaran: So then I’m like, all that matters is that the next AI that takes this thing has enough context, so they have to be verbose, they have to be clear, they have to have clear breakdown of tasks.

588 00:54:14.170 00:54:22.309 Clarence Stone: Do you think we should just improve the current platform and just update it with new skill capacities while we figure out what the harness looks like?

589 00:54:24.160 00:54:26.320 Uttam Kumaran: I mean, dude, I prefer that…

590 00:54:26.940 00:54:33.730 Uttam Kumaran: like, I mean, this is where, like, I’m still stuck in, like, your broke-for mentality, bro, because I just want people to use this thing.

591 00:54:33.840 00:54:39.079 Uttam Kumaran: And I want it to keep bolstering the skills, and I want more people to start building skills. After that, it runs. It’s self-driving.

592 00:54:39.390 00:54:40.340 Uttam Kumaran: Right? Yeah.

593 00:54:40.730 00:54:47.570 Uttam Kumaran: Because I ultimately don’t care about me or you doing anything except selling and doing the frontiers. Right now, I’m building the, like.

594 00:54:47.880 00:54:58.190 Uttam Kumaran: I’m like, we’re the first people to, like, light fire in this company. Then I’m like, guys, I show you how to do fire, due to fire, I’m not gonna do fire ever again. I wrote down how to do it, you can do it.

595 00:54:58.390 00:55:06.990 Uttam Kumaran: Right? So I’m just doing the first… I’m trying not to get enamored with the fact that I can do this, and instead do what you told me to do, which is, like.

596 00:55:07.520 00:55:14.920 Uttam Kumaran: think about if you have all these pieces, like, I’m… that’s why I want to do the design piece, because I haven’t done AI in design in, like, a year.

597 00:55:15.090 00:55:18.540 Uttam Kumaran: I just need to know what’s possible for me to then drag it out.

598 00:55:18.860 00:55:20.569 Clarence Stone: And so I’m going one by…

599 00:55:20.850 00:55:23.159 Uttam Kumaran: I’m going one by one by one.

600 00:55:23.530 00:55:32.109 Uttam Kumaran: So basically, like, project management, great, here’s what you can do in sales, here’s what you can do this, here’s what skills are. I’m doing all the layers, so then I can spend my time being, like.

601 00:55:32.430 00:55:45.819 Uttam Kumaran: okay, how close are we to self-driving company, and, like, how far are we, and, like, what’s the limiting factor? And then we knew we need… we still, like… we’re the… people are the limiting factor. We have to go higher, we have to go sell, me and you have to sit on meetings, so, like.

602 00:55:46.450 00:55:49.460 Uttam Kumaran: I need to get to that ASAP, you know?

603 00:55:50.450 00:55:56.430 Uttam Kumaran: We’re still not there yet. Like, we don’t have enough cash in the bank.

604 00:55:56.580 00:55:59.549 Uttam Kumaran: We don’t have enough people to do what we’re tasked to do today.

605 00:56:00.000 00:56:04.360 Uttam Kumaran: like, we’re not, like, safe yet, so I have to keep working, but, like…

606 00:56:04.360 00:56:06.819 Clarence Stone: They keep giving you free labor from your.

607 00:56:06.820 00:56:07.280 Uttam Kumaran: Yeah.

608 00:56:07.280 00:56:09.190 Clarence Stone: Friends, you’ll get there.

609 00:56:09.500 00:56:13.479 Uttam Kumaran: No, dude, we should think about the new structure, dude. We should drive towards a decision there.

610 00:56:13.810 00:56:19.409 Uttam Kumaran: And then I think you should go… I think we should raise, or something. I’m getting more convinced the more I sleep on it.

611 00:56:25.430 00:56:29.020 Uttam Kumaran: But, like, maybe we should get enough just so, like, we can…

612 00:56:30.470 00:56:33.940 Uttam Kumaran: Like, Brainforce can just turn on, and worked.

613 00:56:35.000 00:56:41.319 Uttam Kumaran: Robert runs that, and then we just, like, think about this. I mean, I still think my criteria stand, but, like.

614 00:56:41.590 00:56:46.419 Uttam Kumaran: I’m not nervous that it works for us to go get those clients.

615 00:56:46.810 00:56:50.980 Uttam Kumaran: I’m just like, let’s get everything in place to drive towards that future, because I do think

616 00:56:52.330 00:56:57.699 Uttam Kumaran: I’m thinking maybe we… we do that, because I see how, even though we’re moving fast at Brainforge.

617 00:56:58.600 00:57:02.029 Uttam Kumaran: It’s not fast enough for us to capture the opportunity that’s, like, right now.

618 00:57:02.340 00:57:10.359 Clarence Stone: Yeah, I… so, my observation right now in this moment is that there’s awareness, but there’s still this, like.

619 00:57:10.920 00:57:15.089 Clarence Stone: Executive mentality at the top end, where they think they can take shortcuts.

620 00:57:16.130 00:57:19.719 Uttam Kumaran: Yes, for sure, but dude, think about it, it was a year ago, it was worse than this, bro.

621 00:57:20.030 00:57:29.010 Clarence Stone: It was way worse than this, so… We do massive marketing pushes, we make ourselves known, when they try it and they fail, they will call.

622 00:57:31.480 00:57:41.349 Clarence Stone: That’s… that’s usually how the cycle works, like, there’s a lack of awareness, and then they realize, hey, you were right, tell me more. Oh, that looks simple, I’ll do it. Or…

623 00:57:41.350 00:57:43.960 Uttam Kumaran: Look, can I tell you today, dude, one of our vendors…

624 00:57:44.280 00:57:52.800 Uttam Kumaran: called me, and I was asking him about some roadmap I needed, and I was telling him about our stuff. He was like, can you come present to my company about

625 00:57:52.910 00:57:58.170 Uttam Kumaran: He was like, what do you mean? And I explained it to him, and he was like… and they’re a data vendor.

626 00:57:58.700 00:58:07.310 Uttam Kumaran: And he’s like, can you come present at our brown bag? We have external speakers come present? Like, sure. And then I was like, holy fuck, like…

627 00:58:07.830 00:58:14.199 Uttam Kumaran: But they’re… they’re, like, a group of all engineers, super senior, they’re using Codex, cursor, every… cloud code for everything.

628 00:58:15.070 00:58:20.180 Uttam Kumaran: But I was like, oh damn, even these guys on the engineering side don’t get, like.

629 00:58:20.280 00:58:23.350 Uttam Kumaran: This, like, context engineering piece.

630 00:58:26.020 00:58:33.939 Uttam Kumaran: But yeah, I mean, dude, I think we should just, like, try to sign something and go, and I also think that, like, you should take advantage of the fact that some people get it, some people don’t.

631 00:58:34.160 00:58:39.310 Uttam Kumaran: And maybe the people that get it, get it as much as we do and want to put money somewhere, because they can’t affect it

632 00:58:39.420 00:58:40.820 Uttam Kumaran: Any other way?

633 00:58:43.680 00:58:50.359 Uttam Kumaran: And that’s maybe where, like, you should just take some of these conversations, continue to say no to people strategically to build that up.

634 00:58:50.820 00:58:51.480 Clarence Stone: Yeah.

635 00:58:51.700 00:58:52.680 Clarence Stone: Yeah.

636 00:58:53.610 00:58:54.030 Uttam Kumaran: Yeah.

637 00:58:54.030 00:58:58.069 Clarence Stone: Tuesday, man, kind of flew by. It was just, like, catching up on work.

638 00:58:58.610 00:58:59.120 Clarence Stone: I eat.

639 00:58:59.120 00:59:03.639 Uttam Kumaran: Oh, I know, dude, what a… what a… such a poison, like, my actual job is.

640 00:59:04.300 00:59:11.150 Uttam Kumaran: But… So be it. Like, so be it.

641 00:59:11.770 00:59:16.899 Uttam Kumaran: It’s causing me to automate more and more of it actively, so it’s… at least it’s a good thing for that.

642 00:59:17.160 00:59:26.529 Clarence Stone: By the way, like, Yuri also came from AIG Insurance, and I introduced him to Ian earlier today. Nice.

643 00:59:26.920 00:59:28.190 Clarence Stone: Ike…

644 00:59:28.680 00:59:39.220 Clarence Stone: Alternate strat could be that, like, instead of doing 5 hours of ourselves each, Yuri just tells us what needs to be built, and we just ship it for Ian.

645 00:59:42.870 00:59:43.920 Uttam Kumaran: Yeah, but I mean, I…

646 00:59:43.920 00:59:49.279 Clarence Stone: mapping, go do your analysis, I don’t care, you’re just telling me, like, what needs to be built. Give me the product requirements.

647 00:59:50.310 00:59:51.539 Uttam Kumaran: But again, I’m like…

648 00:59:52.230 01:00:07.189 Uttam Kumaran: If we’re building enterprise, dude, we’ve got to make some money around here. I’m down for that, I don’t care. I think I’m gonna ask Yuri to do the same thing. I think he’s gonna have more fun time at my company, because we already have most of it ready for him to go. He’s gonna use Kirscher and be able to do all his deliverables.

649 01:00:07.520 01:00:12.319 Uttam Kumaran: In an afternoon, because… I just don’t know the jargon of process mapping.

650 01:00:12.570 01:00:13.480 Clarence Stone: Yep.

651 01:00:13.840 01:00:20.310 Uttam Kumaran: I’m missing that, but… He’s probably 4 prompts away from doing his whole fucking job in Kirscher today.

652 01:00:20.550 01:00:22.410 Clarence Stone: I taught him how to use cursor today.

653 01:00:22.860 01:00:38.109 Uttam Kumaran: Yeah, so he’s gonna… I think, yes, on the insurance side, the problem with that is there’s just not enough muscle at Ian’s company to execute the things, right? So the challenge is, like, you truly need AI harnesses to work without fail.

654 01:00:39.290 01:00:43.049 Clarence Stone: Yeah, and that, like, that’s where I’m stuck at.

655 01:00:44.750 01:00:52.599 Clarence Stone: we need to find a really easy way to spin up harnesses. And that’s… on the technical level, that’s what I’ve been thinking on.

656 01:00:56.070 01:00:56.720 Clarence Stone: Because…

657 01:00:56.720 01:00:57.960 Uttam Kumaran: Well, yeah.

658 01:00:57.960 01:00:59.199 Clarence Stone: Go for it. Go for it.

659 01:01:00.200 01:01:05.859 Uttam Kumaran: Well, what we… what… that’s what Sam’s working on right now, is I basically… if you look in the platform chat, I told him.

660 01:01:06.220 01:01:14.140 Uttam Kumaran: I need any agentic interface on our platform to work with linear, to work with,

661 01:01:14.620 01:01:19.190 Uttam Kumaran: work with Linear, work with Supabase, work with,

662 01:01:19.280 01:01:34.729 Uttam Kumaran: fuck, I’m, like, I’m so braindid. Like, GitHub, there’s, like, 5 or 6 core softwares. I was like, no matter what, whether you’re using Cursor, or Codex or whatever, I said… he was like, yeah, I know we’re using Cursor, but, like, I said, dude, next 6 months, something else will come. It doesn’t matter what it is.

663 01:01:34.730 01:01:43.589 Uttam Kumaran: it needs to be able to… we need to slap it on this thing, it needs to work. So the end keys need to all be in a good place, where it’s, like, composable. So we did that yesterday. We’re using 1Passwords.

664 01:01:43.650 01:01:58.150 Uttam Kumaran: One pastor actually came out with a great product for once, which is, like, you can basically load MPs, and you can keep it up to date across multiple developers, so we’re trying to use that. And then I basically said… I said, we need to completely get out of MCPs.

665 01:01:58.350 01:02:02.240 Uttam Kumaran: you need to figure out everything through a CLI,

666 01:02:03.120 01:02:08.220 Uttam Kumaran: And then I said, like, I want this to be able to be used regardless of what the agentic framework is.

667 01:02:08.450 01:02:09.429 Uttam Kumaran: You know?

668 01:02:09.430 01:02:11.230 Clarence Stone: Because, like, if I just had…

669 01:02:11.230 01:02:12.809 Uttam Kumaran: And testing, and then testing.

670 01:02:13.260 01:02:18.130 Clarence Stone: like, cloud API endpoints, I would be really dangerous on any front end.

671 01:02:19.190 01:02:19.820 Clarence Stone: Are you.

672 01:02:19.820 01:02:20.610 Uttam Kumaran: Yeah, yeah.

673 01:02:20.610 01:02:22.279 Clarence Stone: I will do the rest.

674 01:02:23.730 01:02:24.540 Uttam Kumaran: Hmm.

675 01:02:24.540 01:02:25.250 Clarence Stone: like…

676 01:02:25.370 01:02:39.419 Clarence Stone: that’s usually my contract with my backend developers. It’s like, dude, I don’t care how you give me this API. Like, don’t… don’t even try to describe it to me, I don’t know anything about backend. Like, you give me an API, you will see a sick front end.

677 01:02:41.630 01:02:45.679 Uttam Kumaran: But that’s where the problem is, some of our use cases need data work.

678 01:02:45.850 01:02:48.979 Uttam Kumaran: Like, we’re deciding on using Turbo Puffer, I told you about.

679 01:02:48.980 01:02:49.380 Clarence Stone: Yeah.

680 01:02:49.380 01:02:51.159 Uttam Kumaran: For some quick embeddings.

681 01:02:51.740 01:03:04.119 Uttam Kumaran: But that’s the advantage, is, like, it’s use, it’s context. I mean, what Adam said is actually the thing that’s out of… everything is sticking in my mind, is the… what do you say, right context, right accurate context at the right time, or whatever.

682 01:03:04.540 01:03:06.899 Uttam Kumaran: I said that a bunch of that 10 times today.

683 01:03:07.480 01:03:14.189 Uttam Kumaran: So I’m like, that, I think, was helpful, because that implied that, like, you need certain access methods for certain contacts.

684 01:03:14.190 01:03:14.770 Clarence Stone: Yeah.

685 01:03:14.770 01:03:18.640 Uttam Kumaran: database, CLI, you may have it in your file system.

686 01:03:18.900 01:03:20.359 Uttam Kumaran: Like, I don’t know.

687 01:03:20.550 01:03:28.849 Clarence Stone: So, I’ll propose something to think about. I think Obsidian is a great way to do it. You can use a, like, GitHub extension.

688 01:03:29.110 01:03:32.159 Clarence Stone: and automatically have it syncing to the GitHub.

689 01:03:32.370 01:03:39.410 Clarence Stone: Everybody’s individual, obsidian should trigger a branch creation.

690 01:03:39.710 01:03:40.730 Uttam Kumaran: Mmm…

691 01:03:40.730 01:03:42.439 Clarence Stone: They will pull from that branch.

692 01:03:42.940 01:03:45.359 Clarence Stone: Right? They can push that branch.

693 01:03:45.790 01:03:50.129 Clarence Stone: you can decide if you want a PR into the main repo from that branch.

694 01:03:50.870 01:03:57.550 Clarence Stone: Yeah. Right, so you’d have a knowledge manager, and I’ll be like, hey, I wrote a thought piece. Please, like, submit this.

695 01:03:58.090 01:03:58.610 Uttam Kumaran: Yeah.

696 01:03:58.610 01:04:01.530 Clarence Stone: Right? And then he can pull that into the main repo.

697 01:04:02.850 01:04:06.850 Uttam Kumaran: Yeah. Well, dude, one thing that I was talking to Sam about today.

698 01:04:07.310 01:04:10.580 Uttam Kumaran: Well, I was talking to two things, he said, dude, the merge conflicts are getting worse.

699 01:04:10.860 01:04:14.249 Clarence Stone: Do you need to think about something for that.

700 01:04:14.250 01:04:17.120 Uttam Kumaran: I said, you need to think about something, because I don’t know.

701 01:04:17.570 01:04:21.239 Uttam Kumaran: I also basically told him that,

702 01:04:21.410 01:04:23.880 Uttam Kumaran: I was slapping directly with him.

703 01:04:24.290 01:04:31.650 Uttam Kumaran: I told him that, I’ve fucked at all times, and…

704 01:04:41.850 01:04:43.180 Uttam Kumaran: And…

705 01:04:48.970 01:04:50.119 Uttam Kumaran: That’s gonna happen.

706 01:04:50.420 01:04:51.569 Uttam Kumaran: Oh, here we go.

707 01:04:54.630 01:04:56.140 Uttam Kumaran: Yeah, so…

708 01:05:00.590 01:05:06.860 Uttam Kumaran: I basically said, oh yeah, he was… I shipped a PR about global search.

709 01:05:07.450 01:05:13.229 Uttam Kumaran: I just vibe-coded something, I was like, I want to search over everything that AI has access to. I want a simple.

710 01:05:13.510 01:05:18.359 Uttam Kumaran: Like, spotlight-style search bar in the platform, where you can search through every fucking thing.

711 01:05:19.070 01:05:21.209 Uttam Kumaran: when I shipped, it was half-baked, and I was like.

712 01:05:21.400 01:05:32.459 Uttam Kumaran: I just want to put this idea somewhere. At some point we’ll come back to it, and we did, because he was like, what’s the status? I want to test turbo buffer, and I’m like, yeah, please go for it. I was like, I basically want to index the repo on more things.

713 01:05:32.680 01:05:40.809 Uttam Kumaran: And he was like, I said, I want Brain Forge assistant to be able to answer questions about it, and he’s like… he sent me this thing, which is, like, just Dash from Vercel.

714 01:05:41.400 01:05:44.170 Uttam Kumaran: Which I don’t… I’m like, maybe, what the fuck is this?

715 01:05:44.620 01:05:47.210 Clarence Stone: Did you go through a chain of grat?

716 01:05:47.400 01:05:50.069 Uttam Kumaran: It’s Bash for Agents, yeah, so, like.

717 01:05:50.630 01:05:54.330 Uttam Kumaran: Simulated bash with an in-memory virtual file system.

718 01:05:54.950 01:05:59.019 Uttam Kumaran: Designed for AI agents that need a secure sandbox bash environment.

719 01:05:59.190 01:06:02.319 Uttam Kumaran: So basically, I think you can pull down all the files.

720 01:06:03.470 01:06:07.370 Uttam Kumaran: I don’t know. I haven’t really had a sec to think about what… how this.

721 01:06:07.370 01:06:08.050 Clarence Stone: Yeah.

722 01:06:08.050 01:06:10.719 Uttam Kumaran: what this is, but, like, file system related…

723 01:06:10.720 01:06:15.269 Clarence Stone: Here is that it works really well, but, like, it takes a while, because it’s grappling a lot.

724 01:06:15.270 01:06:18.729 Uttam Kumaran: Yeah, so that’s the other thing I told him, is like, why don’t you just index

725 01:06:18.970 01:06:27.039 Uttam Kumaran: I said you can also just try indexing the core files via TurboPuffer. I mean, just run it side by side and see what you think.

726 01:06:28.530 01:06:32.239 Uttam Kumaran: Because Turbo Puffer, it’s a, it’s a, it’s like a vector DB.

727 01:06:33.320 01:06:39.020 Uttam Kumaran: So you would just query it, and it would basically give you the right… but it’s just very, very fast. It’s very lightweight.

728 01:06:39.190 01:06:41.940 Clarence Stone: agent search on the vector DB.

729 01:06:42.740 01:06:49.409 Uttam Kumaran: Because it’s… It’s, you have to load the whole fucking repo into your memory.

730 01:06:51.150 01:06:52.840 Uttam Kumaran: Like, that’s the problem.

731 01:06:52.840 01:06:55.180 Clarence Stone: Is this not… Cloud.

732 01:06:56.700 01:07:07.789 Uttam Kumaran: Brainforge Assistant is running on a service that doesn’t necessarily need to have the entire repo alongside of it. It just has the code related to run the Brainforge Assistant-related shit.

733 01:07:07.960 01:07:15.710 Uttam Kumaran: So I’m like… I’m like, yes, you could either run an external search service, or you could have the files right next to the backend, where it is.

734 01:07:16.330 01:07:20.389 Uttam Kumaran: But, ultimately, multiple different services want to access this data.

735 01:07:20.590 01:07:22.389 Uttam Kumaran: So put it in one place, like…

736 01:07:22.740 01:07:28.080 Uttam Kumaran: if you want, like, either… like, I want Global Search to have it, I want a Slack Assistant to have it, I want the EMT,

737 01:07:28.850 01:07:35.910 Uttam Kumaran: thing for us that’s needs a query to have access to this data, not everything is going to have a file system access.

738 01:07:40.200 01:07:48.760 Uttam Kumaran: not everything is gonna be an… not everything has an agent behind it that will have access to the file system, so it may need to hit an API to search.

739 01:07:49.060 01:07:49.810 Uttam Kumaran: Right?

740 01:07:50.540 01:07:51.320 Clarence Stone: Yeah.

741 01:07:52.130 01:08:06.389 Uttam Kumaran: So you can’t, like, you have to hit a file system to then run search, like, where does that file system live? It should just probably live in a database with all the… and it’s not like, I don’t need to search through all the code in the repo, mainly want to just have all our Markdown files.

742 01:08:06.520 01:08:12.259 Uttam Kumaran: To start with, And then ultimately, like, maybe we can move that out of the platform anyways, like…

743 01:08:12.390 01:08:15.239 Uttam Kumaran: Maybe the platform is more like, these things are here.

744 01:08:15.490 01:08:19.710 Uttam Kumaran: And then the con… that’s what… right now, it’s a security nightmare in this platform.

745 01:08:19.859 01:08:22.060 Uttam Kumaran: So eventually, like, we’ll move this…

746 01:08:23.120 01:08:26.940 Uttam Kumaran: We’ll move this stuff somewhere else, where it can all be, like, through our back.

747 01:08:28.830 01:08:36.010 Clarence Stone: I… maybe I should just throw super memory in the repo, because it uses a recursive model. It doesn’t hold everything in the model.

748 01:08:36.319 01:08:39.349 Uttam Kumaran: Yeah, I was thinking about doing super… I was looking at it yesterday.

749 01:08:40.140 01:08:43.100 Clarence Stone: And my problem was that, like, I just didn’t.

750 01:08:43.100 01:08:48.409 Uttam Kumaran: Why don’t you just send it in the platform to Sam, because I think he’s gonna look through this tomorrow.

751 01:08:48.850 01:08:49.830 Clarence Stone: I’ll send it.

752 01:08:50.899 01:08:56.739 Uttam Kumaran: Yeah, I also thought, I was… I was, oh, I think I might have ran something with it, actually. Let’s see…

753 01:08:58.979 01:09:00.729 Uttam Kumaran: I think I had cursor…

754 01:09:00.839 01:09:04.229 Uttam Kumaran: I think I sent it to remember, I said, tell me what we can do with this.

755 01:09:06.550 01:09:12.960 Clarence Stone: Oh, so I didn’t know that Super Memory was a real company, I was just calling the thing I made Super Memory. So it’s like.

756 01:09:12.960 01:09:15.730 Uttam Kumaran: Oh, no, Supermember is a company. Oh, okay, okay, okay.

757 01:09:15.850 01:09:17.420 Uttam Kumaran: Okay, okay, okay, never mind.

758 01:09:17.420 01:09:21.739 Clarence Stone: So mine uses a recursive model on top of Apple’s Clara.

759 01:09:22.520 01:09:26.029 Clarence Stone: So it does a reasoning, and then it searches in loops.

760 01:09:27.100 01:09:28.370 Uttam Kumaran: Okay, but it’s local.

761 01:09:28.550 01:09:32.100 Clarence Stone: Yeah, and it’s local, but it was too slow for my computer.

762 01:09:32.109 01:09:32.649 Uttam Kumaran: Oh, okay.

763 01:09:32.649 01:09:41.239 Clarence Stone: For a 7 billion model, it… like, if it takes a while to do a search, that’s, like, it… it kind of, like.

764 01:09:41.389 01:09:47.359 Clarence Stone: Crumbles, but apples are Creating a rag for it, and it did improve the results.

765 01:09:47.939 01:09:50.179 Clarence Stone: I just don’t like managing rags.

766 01:09:50.999 01:09:59.149 Clarence Stone: Like, I’m… I’m, like, one thing that you’re gonna realize for me is that I’m gonna fight against not having to use a rag solution. Every time.

767 01:09:59.999 01:10:00.699 Clarence Stone: I absolutely.

768 01:10:00.700 01:10:01.420 Uttam Kumaran: Hmm.

769 01:10:01.420 01:10:02.070 Clarence Stone: Eat it.

770 01:10:02.410 01:10:06.599 Clarence Stone: But… but now that you’re putting things in a graph…

771 01:10:07.350 01:10:12.159 Clarence Stone: DB, though, I think the most reasonable thing is to try Cortex.

772 01:10:12.800 01:10:19.380 Clarence Stone: Like, by the way, like, I switched EY to Cortex for a bunch of clients. Their research, the search function’s so good.

773 01:10:21.530 01:10:24.320 Uttam Kumaran: No, what is… what is Cortex Search? What is that?

774 01:10:24.550 01:10:30.969 Clarence Stone: So, it’s… it’s… Cortex is… Snowflake’s own agent… Oh, from Snowflake. Yeah.

775 01:10:30.970 01:10:41.490 Uttam Kumaran: No, no, see, but now you’re in my world, dude, there’s only some amount of data that can be in relational DB. Like, I’m not gonna stuff everything in Snowflake. Let’s say I want to search every Slack message I’ve ever sent.

776 01:10:42.990 01:10:52.849 Uttam Kumaran: like, yes, you could do it in Snowflake, but it’s adding an unnecessary vendor in that step. Like, I can have a graph… I can have a graph to be right next

777 01:10:53.070 01:10:55.209 Uttam Kumaran: to the backend for Slack.

778 01:10:56.260 01:10:57.539 Uttam Kumaran: So, like, you don’t…

779 01:10:57.700 01:11:06.179 Uttam Kumaran: Yeah. Ultimately, yeah, Snowflake is gonna be, like, do everything, but, like, there’s no need in my situation to, like, vendor lock.

780 01:11:06.180 01:11:08.400 Clarence Stone: Obsidian Search so good?

781 01:11:10.720 01:11:12.460 Uttam Kumaran: Oh, because it’s running it locally.

782 01:11:13.180 01:11:14.400 Clarence Stone: And it’s graph.

783 01:11:15.590 01:11:22.990 Uttam Kumaran: Yeah, so they… so, graph search is highly performant. It’s much more performant… I should send you a book on,

784 01:11:23.530 01:11:25.860 Uttam Kumaran: Database performance constraints, but…

785 01:11:26.000 01:11:38.979 Uttam Kumaran: Graph is way more performant, you just… it’s just… databases were built for use cases. Like, just like you shouldn’t use relational database for everything, you shouldn’t use Graph for everything. Graph is all about relationships, not about

786 01:11:38.980 01:11:50.249 Uttam Kumaran: like, large, like, sums of analytics. So… so that is… it’s… that is mainly about searching. So, one, they’re gonna… they’re gonna layer on both graph and vector indexing.

787 01:11:50.270 01:11:56.430 Uttam Kumaran: And then while you’re doing this, it’s gonna just guess and show you the most relative… there’s a bunch of algorithms to do, like, search

788 01:11:56.770 01:11:57.730 Uttam Kumaran: basically, like.

789 01:11:59.310 01:12:17.649 Uttam Kumaran: how far are you on in terms of search? So local search is generally solved, but the problem is that they have to re-index it very often. And, like, when you’re in Obsidian, you’re only dealing with a limited amount of files. Like, this is where, like, I want to search for, like, a Slack message. I want to… I want a Slack message I sent 3 years ago to pop up.

790 01:12:17.790 01:12:19.019 Uttam Kumaran: What if it’s relevant.

791 01:12:19.590 01:12:20.140 Clarence Stone: Okay.

792 01:12:20.140 01:12:25.229 Uttam Kumaran: It’s like, it’s a little bit… like, I want a Slack message, I want a HubSpot thing, I want an email, like…

793 01:12:26.040 01:12:28.850 Uttam Kumaran: And I want to build it. Like, I don’t want to use a glean.

794 01:12:29.440 01:12:30.050 Uttam Kumaran: You know?

795 01:12:30.050 01:12:36.480 Clarence Stone: Check this out, then. You might find this article interesting. I’m just gonna put it in our… I’ll put it in this chat, I guess.

796 01:12:37.440 01:12:39.719 Clarence Stone: I texted you by mistake. Here we go.

797 01:12:39.890 01:12:44.829 Clarence Stone: he figured out a way to structure this.

798 01:12:47.810 01:13:00.560 Clarence Stone: And this is beyond my database understanding, so, like, maybe you will understand it better. But this guy has, like, no followers on X, practically for, like, the kind of content that he is creating.

799 01:13:00.560 01:13:03.000 Uttam Kumaran: Oh, you sent me this, yeah, I have to go read all this guy’s shit.

800 01:13:03.000 01:13:05.449 Clarence Stone: This guy is next fucking level.

801 01:13:05.560 01:13:07.339 Clarence Stone: On, like, context engineering.

802 01:13:14.220 01:13:21.199 Clarence Stone: Like, he’s creating systems where his agents are note-taking for themselves, and, like, going through loops.

803 01:13:21.200 01:13:26.959 Uttam Kumaran: Yeah, yeah, so our thing is, like, so there’s a… there’s a lot of blog posts that came out about continuous learning.

804 01:13:27.690 01:13:30.640 Uttam Kumaran: Did you happen to read those? It’s basically the same thing, where, like.

805 01:13:30.970 01:13:42.690 Uttam Kumaran: you want the agents, so kind of something I did today is, I told you I did this, this process of, like, mapping out our services and runbooks. I then had it… I had it run for this one service we’re developing.

806 01:13:43.090 01:13:51.689 Uttam Kumaran: I had another agent use it to create the runbook, the SOW, all the pieces for a new service. I then said, what was difficult about that?

807 01:13:52.470 01:14:11.599 Uttam Kumaran: And then it said, yeah, this thing was out of place, this thing was out of place. I said, cool, propose all those fucking changes to make it. And the document council didn’t catch it. But once it ran through… once it… another agent followed the steps, it then found, like, yeah, this was ambiguous, so I had to, like, make a decision here, we should probably time that up. It basically gave feedback.

808 01:14:11.600 01:14:25.550 Uttam Kumaran: the only piece I’m missing is there’s not continuously an agent that is running through all our SOWs and trying things, measuring it against, like, a eval, and then being like, okay, we’re this far off. I read a tweet today kind of talking about that, but…

809 01:14:25.940 01:14:32.309 Uttam Kumaran: That’s what we’re not doing, like, we’re not writing… there’s not enough people writing back to the system as there are people using the system, which happens.

810 01:14:32.420 01:14:35.410 Uttam Kumaran: It happens in a, in a, in a network like this.

811 01:14:35.680 01:14:46.229 Uttam Kumaran: But ideally, the way to bootstrap it is to have… the more… so I think one way of putting this is the more things we can completely hand off to AI,

812 01:14:46.870 01:14:51.810 Uttam Kumaran: and be comfortable with it, the better the system can get, because that AI can immediately write the feedback.

813 01:14:52.010 01:14:52.650 Clarence Stone: Yep.

814 01:14:52.830 01:14:59.549 Clarence Stone: And when you have those feedback loose, did you see Carpathy sing? I actually tried it. I have a really great tax mini model now.

815 01:15:00.060 01:15:01.150 Uttam Kumaran: Oh, really?

816 01:15:01.310 01:15:03.290 Clarence Stone: Yeah, the auto research is really good.

817 01:15:03.630 01:15:04.950 Uttam Kumaran: I haven’t tried it.

818 01:15:04.950 01:15:24.310 Clarence Stone: there is an argument to be made that you can create a bunch of small agents with singular functions, use a series of tool calls as assistant agents for, like, larger, broader functions. Like, a 500 mil, like, tax policy, you know, focused agent knew everything about Partnership K1s.

819 01:15:26.160 01:15:27.500 Uttam Kumaran: Hmm…

820 01:15:29.480 01:15:36.399 Clarence Stone: Right, so, like, I mean, a great starting place would be, now that you’ve refined the playbook, it found the gaps.

821 01:15:36.820 01:15:38.610 Uttam Kumaran: Throw it into a loop.

822 01:15:38.720 01:15:49.450 Clarence Stone: with auto research. And now you have an agent that functions as the conversational tool piece for people to say, like, how does this process work? What happens next? Like.

823 01:15:49.600 01:15:51.400 Clarence Stone: That’s the only thing it’s trained on.

824 01:15:54.530 01:15:55.090 Uttam Kumaran: interesting.

825 01:15:55.090 01:15:58.119 Clarence Stone: The 500 mil model, that’s all it knows.

826 01:15:59.260 01:16:00.040 Uttam Kumaran: Wow.

827 01:16:02.800 01:16:07.950 Clarence Stone: So then, like, it becomes your check value, right? Because it knows the playbook

828 01:16:08.200 01:16:10.630 Clarence Stone: to the T. And that’s all it knows.

829 01:16:12.990 01:16:13.610 Uttam Kumaran: Hmm…

830 01:16:18.940 01:16:21.130 Uttam Kumaran: Yeah, I mean, that’s why I think if you have, like, a very…

831 01:16:21.130 01:16:24.869 Clarence Stone: itself, like, I thought it crashed, but I woke up the next morning, it worked.

832 01:16:25.580 01:16:30.019 Uttam Kumaran: Yeah, my MacBook Air is, like, never works now if I have cursor up. I don’t know.

833 01:16:30.340 01:16:38.619 Clarence Stone: Oh, and the other thing you should consider as another mechanism is, and I don’t know how to scale this, this is where I’m stuck, have you seen Hermes Agent?

834 01:16:39.550 01:16:40.190 Uttam Kumaran: No.

835 01:16:40.430 01:16:47.090 Clarence Stone: Dude, Hermes Agent is, like, another harness that somebody made that goes through its own RL loops.

836 01:16:50.130 01:16:51.080 Uttam Kumaran: Okay.

837 01:16:51.460 01:17:03.619 Clarence Stone: And, I mean, it’s in the CLI, right? Like, and it operates, you know, just like conversations on, you know, your daily work and topics, and it’s going through loops at the end based on your feedback and proving that model.

838 01:17:04.730 01:17:05.369 Clarence Stone: People are…

839 01:17:05.370 01:17:06.040 Uttam Kumaran: He’s like…

840 01:17:06.040 01:17:10.150 Clarence Stone: Coin 27B, because it’s, like, 12 gigs in Q4,

841 01:17:10.340 01:17:13.370 Clarence Stone: And it’s just enhancing their plan all the time.

842 01:17:14.180 01:17:14.970 Uttam Kumaran: Hmm.

843 01:17:14.970 01:17:19.849 Clarence Stone: I’m on a memory constraint. I need to get, like, something with more RAM to run this.

844 01:17:23.270 01:17:25.689 Uttam Kumaran: Damn. No, send it to me, I don’t even know what that is.

845 01:17:25.690 01:17:35.270 Clarence Stone: Yeah, let me send you Hermes Agent. It’s sick, but, like, I don’t know how to scale it, right? Like, one person uses it, it trains that model, it’s really good, but it’s good for that one person.

846 01:17:37.570 01:17:39.190 Uttam Kumaran: Oh, okay, okay.

847 01:17:39.330 01:17:50.830 Clarence Stone: Right? Because it becomes so personalized. And, and, and, you know, weird, random late-night thought is, like, I, like, we keep talking about context in layers. What if it’s agents in layers?

848 01:17:51.010 01:17:53.930 Clarence Stone: Like, I have a small model that always learns from what I do.

849 01:17:54.170 01:17:57.419 Clarence Stone: Right? There’s a project runbook playbook agent.

850 01:17:58.420 01:18:00.270 Clarence Stone: And then there’s a company agent.

851 01:18:01.710 01:18:03.880 Clarence Stone: And they just all have conversation chains.

852 01:18:04.240 01:18:09.810 Clarence Stone: and check values with each other. And I was like, that’s just another layer of, like, advanced thought that I…

853 01:18:09.810 01:18:14.709 Uttam Kumaran: But again, I think the complication here is, like, It’s easy when it’s, like.

854 01:18:15.860 01:18:30.500 Uttam Kumaran: document work, but it… the complexity happens when the… it’s… you’re right in that it’s mostly harnessing, but it’s actually just, like, data engineering is not automated enough to do this type of shit. Like, the tools itself you use don’t all have CLIs, they need certain checks.

855 01:18:30.610 01:18:33.570 Uttam Kumaran: like… That’s why I think research…

856 01:18:33.680 01:18:36.959 Uttam Kumaran: And anything that’s heavy document processing is gonna go first.

857 01:18:39.710 01:18:42.349 Clarence Stone: Yeah, and I think that’s the main value prop.

858 01:18:42.350 01:18:45.870 Uttam Kumaran: And yeah, that’s, like, mostly… that’s, like, most jobs, I get it, but, like.

859 01:18:46.410 01:18:52.340 Uttam Kumaran: ultimately, I think… I think we’re past that, or like, for us, that’s an adoption problem right now.

860 01:18:52.340 01:19:02.920 Clarence Stone: So, so here’s an idea. Hermes Agent could technically run as an EP, because he’s already built in with the cron harness, he can do parallel workstreams.

861 01:19:03.060 01:19:07.659 Clarence Stone: he’s plugged into Slack, It has all the things you need.

862 01:19:10.160 01:19:13.850 Uttam Kumaran: I mean, like, how hard would it be to, like, what do you need to run this?

863 01:19:14.970 01:19:15.450 Clarence Stone: You need.

864 01:19:15.450 01:19:18.709 Uttam Kumaran: I mean, I can… I can give it a free… I have a… I have a… I can give it a…

865 01:19:19.100 01:19:21.930 Uttam Kumaran: an Azure AI. I just can’t have it do,

866 01:19:22.280 01:19:25.810 Uttam Kumaran: I just have to have it be read-only, I’m not gonna let it… I can’t just… can’t let it do anything.

867 01:19:25.810 01:19:28.200 Clarence Stone: Yeah, yeah, it’s not gonna write to anything,

868 01:19:28.410 01:19:33.609 Clarence Stone: It probably just needs 32GB of RAM for a decent model, like Quin27B.

869 01:19:38.770 01:19:42.869 Clarence Stone: But it’s… or you can just run it through a $5 VPS.

870 01:19:43.740 01:19:46.269 Uttam Kumaran: Well, I wonder if it’s better for me to build…

871 01:19:46.600 01:19:52.300 Uttam Kumaran: a GitHub action that looks through all of the cursor logs for the day, And then…

872 01:19:52.850 01:19:54.649 Uttam Kumaran: Makes updates once a day.

873 01:19:56.390 01:19:58.050 Clarence Stone: We could probably do that.

874 01:19:58.900 01:20:00.000 Uttam Kumaran: Because what else?

875 01:20:01.170 01:20:04.120 Uttam Kumaran: Can I… as cursor… can I even get the cursor logs?

876 01:20:04.730 01:20:07.520 Clarence Stone: No, you can’t. That’s the fucking thing.

877 01:20:07.520 01:20:08.200 Uttam Kumaran: It’s Enterprise?

878 01:20:08.200 01:20:13.010 Clarence Stone: That, like, there’s parts of someone else’s harness that are always fucking blocked off.

879 01:20:15.930 01:20:17.809 Uttam Kumaran: Oh, but I can’t get it, or what?

880 01:20:20.600 01:20:21.150 Uttam Kumaran: Really?

881 01:20:21.150 01:20:27.200 Clarence Stone: I don’t… I don’t think logs you can get. You can get… Like, the stream?

882 01:20:28.250 01:20:29.389 Clarence Stone: Which is good enough.

883 01:20:34.790 01:20:36.150 Uttam Kumaran: Interesting.

884 01:20:39.680 01:20:45.900 Clarence Stone: And it doesn’t have, like, the loud heartbeat properties that make OpenClaw really, like, security issue.

885 01:20:46.110 01:20:52.140 Clarence Stone: So people are, like, kind of saying this might be the great replacement for You know, just…

886 01:20:52.690 01:20:57.380 Clarence Stone: An agent wrapper that’s always just doing some sort of cron job and improving itself.

887 01:20:58.730 01:21:02.109 Uttam Kumaran: We do not log agent responses or generated code content.

888 01:21:03.760 01:21:05.860 Uttam Kumaran: So instead, we recommend using hooks.

889 01:21:07.830 01:21:11.530 Clarence Stone: Then you have to build so many hooks, and then it kills your fucking cursor.

890 01:21:19.730 01:21:22.390 Uttam Kumaran: I wonder, so is it on everybody’s machine?

891 01:21:22.590 01:21:24.860 Uttam Kumaran: Like, are you saying all the logs are on my machine?

892 01:21:24.990 01:21:26.139 Clarence Stone: Yeah, it is.

893 01:21:26.540 01:21:32.190 Uttam Kumaran: So why don’t I just build a script that I just go one… I just ask everybody to run it and give me the whatever the fuck the zip

894 01:21:32.650 01:21:34.030 Uttam Kumaran: Drops for them.

895 01:21:36.110 01:21:39.639 Clarence Stone: You can, and then just grab it and do an RL loop, or have.

896 01:21:39.640 01:21:44.660 Uttam Kumaran: Dude, there’s only a couple of us that are using this. I’ll ask me… I’ll ask me, D…

897 01:21:45.370 01:21:49.260 Uttam Kumaran: All the couple people that are using the system, to just drop…

898 01:21:50.130 01:22:01.429 Uttam Kumaran: Just give me it. Well, dude, I wanna move… that’s why I wanna… we just have to move to open worker and open code. I told that to… I told that to Sam. I’m gonna… I told him eventually we’ll move to open code, just, like, make the harness work.

899 01:22:01.820 01:22:06.759 Uttam Kumaran: with BugBot, it’s gonna be the most forgiving, or with, Cursor Cloud, or with…

900 01:22:06.870 01:22:08.839 Uttam Kumaran: Codex, it’s gonna be the most forgiving.

901 01:22:08.980 01:22:11.749 Uttam Kumaran: And then we will move to the next challenge. We’ll move up.

902 01:22:11.910 01:22:12.310 Clarence Stone: Yeah.

903 01:22:12.310 01:22:12.990 Uttam Kumaran: You know?

904 01:22:13.230 01:22:16.930 Uttam Kumaran: Then we’ll go to open code, but until we get this right, I don’t want to fucking…

905 01:22:17.210 01:22:19.609 Uttam Kumaran: deal with VMs and shit like that, you know?

906 01:22:20.240 01:22:23.290 Clarence Stone: I, I think there’s a better way, though, like, and…

907 01:22:24.660 01:22:28.570 Clarence Stone: If, if, like, let’s say the organizational…

908 01:22:29.220 01:22:36.529 Clarence Stone: harness lived in the cloud, and we all had a Hermes or something like that. Like, there has to be another layout.

909 01:22:36.940 01:22:39.399 Clarence Stone: And I… I’ll think through it.

910 01:22:45.380 01:22:49.480 Clarence Stone: But yeah, it uses its own RL loop, so this is post-training.

911 01:22:51.860 01:22:52.600 Uttam Kumaran: Wow.

912 01:22:58.620 01:23:01.779 Clarence Stone: Wait, it says it uses cloud models? How the fuck does that work?

913 01:24:36.620 01:24:38.330 Clarence Stone: Oh…

914 01:24:43.230 01:24:45.759 Clarence Stone: So, Hermes works on 5 bottles.

915 01:24:46.840 01:24:47.610 Uttam Kumaran: Oh, okay.

916 01:24:49.240 01:24:52.360 Clarence Stone: It’s not as elegant, but I’ll take it.

917 01:24:53.240 01:24:55.170 Clarence Stone: It’s fine-tuning the reward.

918 01:24:57.700 01:24:58.540 Uttam Kumaran: Oh, okay.

919 01:24:59.020 01:25:05.550 Clarence Stone: So you’re injecting the reward path in between. I… That’s not really training.

920 01:25:08.410 01:25:12.589 Clarence Stone: But this, this might work as a replacement for EP.

921 01:25:14.380 01:25:17.449 Clarence Stone: Right, because then you can do a monitoring dashboard on top of it.

922 01:25:17.960 01:25:20.489 Clarence Stone: Take a look at the cron drops that it’s been assigned.

923 01:25:20.860 01:25:22.540 Clarence Stone: It’s self-reflective.

924 01:25:24.100 01:25:25.880 Clarence Stone: Has its own closed learning loop.

925 01:25:35.120 01:25:41.929 Clarence Stone: Oh, shit, dude, I’m so sorry, I got yoga. The robots is… I’ll be back later, though.

926 01:25:41.930 01:25:42.439 Uttam Kumaran: Okay, okay.

927 01:25:42.440 01:25:44.210 Clarence Stone: Let me try this tonight. It’s been a.

928 01:25:44.210 01:25:44.810 Uttam Kumaran: Okay, okay.

929 01:25:44.810 01:25:45.769 Clarence Stone: to do, anyway.

930 01:25:46.770 01:25:48.740 Uttam Kumaran: I’ll put me, you, and Yuri in a group chat.

931 01:25:48.740 01:25:50.859 Clarence Stone: Oh, yeah, yeah, I’ll do that right now. Alright, dude.

932 01:25:50.860 01:25:51.999 Uttam Kumaran: Okay, alright, thank you.

933 01:25:52.000 01:25:52.720 Clarence Stone: Yes.