Meeting Title: Brainforge x CTA AI Strategy Sync
Date: 2026-05-04
Meeting participants: Uttam Kumaran, Amber Lin


WEBVTT

1 00:00:41.660 00:00:42.870 Uttam Kumaran: Hello.

2 00:00:43.040 00:00:44.030 Amber Lin: Hi there!

3 00:00:44.740 00:00:45.409 Amber Lin: How was your day?

4 00:00:45.410 00:00:46.560 Uttam Kumaran: Around everything.

5 00:00:47.260 00:00:53.190 Amber Lin: Pretty good. I’m a little… I think I got a bit of a cold from Nico, but, like…

6 00:00:53.360 00:00:56.780 Amber Lin: Today’s been a… it’s not too hard, so…

7 00:00:57.170 00:00:58.530 Uttam Kumaran: Okay, okay.

8 00:00:58.530 00:00:59.140 Amber Lin: Yeah.

9 00:00:59.960 00:01:09.640 Uttam Kumaran: Yeah, I wanted to write some stuff down. Maybe we can write some stuff down? Yeah. Sorry I didn’t get a chance to talk, but I have some fun updates for you, so…

10 00:01:09.640 00:01:10.460 Amber Lin: Okay, okay.

11 00:01:10.460 00:01:18.769 Uttam Kumaran: I hope the delay in not talking to you on Friday, you’ll be like, okay, that’s fine.

12 00:01:19.030 00:01:22.370 Amber Lin: Well, I haven’t seen this template in a while.

13 00:01:22.670 00:01:26.760 Uttam Kumaran: I’m not doing that many one-on-ones, I promise you.

14 00:01:28.400 00:01:29.460 Amber Lin: Okay.

15 00:01:29.460 00:01:32.640 Uttam Kumaran: You’re the only remaining recurring one-on-one slot.

16 00:01:32.760 00:01:33.610 Amber Lin: Michael.

17 00:01:33.610 00:01:34.420 Uttam Kumaran: Calendar.

18 00:01:34.680 00:01:38.400 Amber Lin: Well, it lasted a full year, so…

19 00:01:38.400 00:01:46.600 Uttam Kumaran: I know I’ve been, like, skipping half of them, too, but it’s… that’s on me, but I do have some fun updates for you, so…

20 00:01:46.600 00:01:47.570 Amber Lin: Awesome.

21 00:03:03.110 00:03:07.100 Uttam Kumaran: And under "what didn't go well," you put "feeling good about CTA work."

22 00:03:07.100 00:03:11.560 Amber Lin: Oh, I said… oh, I read what went well.

23 00:03:12.320 00:03:17.490 Amber Lin: So funny. So I… okay, let me… let me move that.

24 00:03:18.090 00:03:21.309 Amber Lin: What went well?

25 00:03:21.450 00:03:22.710 Amber Lin: Didn’t go well.

26 00:05:53.740 00:05:54.480 Uttam Kumaran: Okay.

27 00:05:55.300 00:05:56.119 Uttam Kumaran: I feel good.

28 00:06:12.270 00:06:13.000 Amber Lin: Cool, okay.

29 00:06:14.740 00:06:19.599 Amber Lin: I feel good about this, too. I think this has been a… this has been a fun month for me.

30 00:06:20.020 00:06:25.139 Uttam Kumaran: Okay, cool. Yeah. Let’s start from the top, so…

31 00:06:25.260 00:06:37.820 Uttam Kumaran: Certs, Omni, dbt, AI-related work, great. So I think, yeah, like, when did we talk? Maybe it’s been 2 months since you called me and said I’m gonna automate myself, right? How long ago was that?

32 00:06:38.220 00:06:45.270 Amber Lin: That was before the L&D work, so probably, like, early March?

33 00:06:45.380 00:06:47.980 Amber Lin: I would say, you’re right, like, 2 months ago.

34 00:06:47.980 00:06:49.470 Uttam Kumaran: Okay, nice.

35 00:06:49.670 00:06:52.550 Uttam Kumaran: Well, yeah, I, I think,

36 00:06:52.930 00:07:10.000 Uttam Kumaran: I think, roughly, like, what do you think in terms of what I mentioned then? Like, do you feel like I… do you think, like, that what I told you would happen was kind of on board? You kind of… I’m trying to recall even, like… but what I think, basically, I said is, like, go further on, like, learning the AI systems.

37 00:07:10.530 00:07:12.750 Uttam Kumaran: You’ll see that there’s plenty to do.

38 00:07:13.080 00:07:22.490 Amber Lin: Yeah, I think since then, I don’t know when, like, the harness stuff and whatever came about, but I feel like in the past.

39 00:07:22.660 00:07:26.820 Amber Lin: few months, it’s become a lot more structured in

40 00:07:27.790 00:07:43.420 Amber Lin: like, or at least my understanding of what AI engineering means. Like, to, I’ve been doing a little bit of discovery on my own, and also the work I’m doing right now is also more AI-related, so I’m getting to see, like, oh, okay, so…

41 00:07:43.540 00:07:57.340 Amber Lin: all this stuff needs support to automate. Like, that in itself is the job. Instead of doing directly the coding work.

42 00:07:57.650 00:08:10.880 Amber Lin: it’s giving it the context, or as we call it, the harness, so that it can do the work. Yes. So, I guess my pivot is, okay, so…

43 00:08:11.380 00:08:18.669 Amber Lin: I’m gonna do that instead. Of course, I need to learn what the work is, but the end goal is…

44 00:08:19.440 00:08:22.980 Amber Lin: Well, the end goal is the same, but the method is different.

45 00:08:22.980 00:08:37.980 Uttam Kumaran: Yeah, you actually are, like, are… you’re… you should realize that it’s actually a lot less about, like, writing the code. It’s actually just having a crisp understanding of the outcome, and, like, testable, like, what needs to be true about the outcome, and then letting the AI sort of, like.

46 00:08:38.289 00:08:41.570 Uttam Kumaran: Get there, and then you look at it, you look at that plan, and you’re like.

47 00:08:41.820 00:08:44.809 Uttam Kumaran: That’s not… don’t worry about that, don’t worry about that, don’t worry about that.

48 00:08:44.950 00:08:52.360 Uttam Kumaran: drive, you know? Like, most of my, now, time with AI is actually, like, I spend about an hour, like, planning.

49 00:08:52.550 00:08:55.629 Uttam Kumaran: like, a huge product, and then I’m like.

50 00:08:55.970 00:09:07.670 Uttam Kumaran: I have AI sort of review that, and I review it, and then I say, cool, take phase one, take phase two, take phase three, you know? And I… I don’t write any code. I haven’t written code for, like, 4 months.
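[Editor's note: the plan-first workflow described above — spend time planning, review and trim the plan, then have the AI execute phase by phase — can be sketched roughly as below. This is a hypothetical illustration only; `ask_ai`, `build_plan`, and the phase structure are assumptions, not anything from the meeting.]

```python
# Hypothetical sketch of a plan-then-execute AI workflow:
# plan once, review the plan, then run it one phase at a time.

def ask_ai(prompt: str) -> str:
    """Stand-in for a call to an AI coding agent (assumed, not a real API)."""
    return f"[AI output for: {prompt[:40]}...]"

def build_plan(goal: str, n_phases: int = 3) -> list[str]:
    """Have the AI break a large goal into ordered phases (stubbed here)."""
    return [f"Phase {i + 1} of plan for: {goal}" for i in range(n_phases)]

def run_project(goal: str) -> list[str]:
    plan = build_plan(goal)            # the up-front "hour of planning"
    # Human review step: drop anything out of scope ("don't worry about that").
    plan = [phase for phase in plan if "out of scope" not in phase]
    results = []
    for phase in plan:                 # "take phase one, take phase two..."
        results.append(ask_ai(phase))  # AI does the work; human reviews output
    return results

outputs = run_project("total rewards calculator")
```

The point of the sketch is the shape of the loop, not the stubs: the human effort sits in `build_plan` and the review filter, while the per-phase execution is delegated.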

51 00:09:07.670 00:09:16.260 Amber Lin: Yeah, I think this is a timely call, because this weekend I was reading more on, AI harness and what it means, and I think my understanding.

52 00:09:16.260 00:09:16.810 Uttam Kumaran: Yes.

53 00:09:16.810 00:09:28.160 Amber Lin: AI engineering has changed a bit, because before, like, when I hear AI engineering, it’s still engineering, like, you’re still doing the work, but now I think you’re creating

54 00:09:28.230 00:09:37.309 Amber Lin: the system to… so that it can do the work. So, my understanding has changed a little bit, so it makes me more interested in

55 00:09:38.030 00:09:44.160 Amber Lin: learning AI engineering? I feel like the term is… It’s very pleasant.

56 00:09:44.160 00:09:50.309 Uttam Kumaran: That’s all made up, it’s all kind of made-up terms, but kind of, like, this kind of goes into… I mean, I think this… I think…

57 00:09:50.530 00:09:53.389 Uttam Kumaran: to not belabor this point, it’s great.

58 00:09:53.550 00:10:03.170 Uttam Kumaran: I think you’re actually gonna find that my kind of proposal to you is gonna be actually taking advantage of this feeling where you’re like.

59 00:10:03.340 00:10:06.579 Uttam Kumaran: I now am, like, convinced of the benefits.

60 00:10:06.950 00:10:18.170 Uttam Kumaran: I understand that there are these concepts of, like, context, harnesses, MCPs, like, I understand the, like, plan and then work. You understand how to do it?

61 00:10:18.210 00:10:30.330 Uttam Kumaran: And then, secondarily, you actually demonstrated that you built something that you could have probably done manually, which is the core tech semantic views, but AI actually showed you how you could build this feedback loop.

62 00:10:30.550 00:10:41.379 Uttam Kumaran: Right? From, like, the questions, the evals, so then, like, come back, and you showed this, what’s called, like, basically this self-learning loop, right? To, like, build. So, I think you just did that on this client, basically.

63 00:10:41.380 00:10:53.339 Amber Lin: Yeah, it was very helpful to understand what evals are and why we need observability, because then that’s the only way to get it… to loop on itself.
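[Editor's note: the evals-plus-observability feedback loop discussed here can be sketched as a simple self-improvement cycle. Everything below — the `answer` stub, the scoring, and the context update — is a hypothetical stand-in, not the actual CTA or semantic-view implementation.]

```python
# Hypothetical sketch of a self-learning loop: run eval questions,
# log pass/fail (the "observability" piece), and fold failures back
# into the context/harness so the next pass does better.

def answer(question: str, context: set[str]) -> str:
    """Stand-in for the AI answering from a semantic layer (assumed)."""
    return question.lower() if question in context else "unknown"

def run_evals(questions: list[str], context: set[str]) -> list[tuple[str, bool]]:
    """Score each question against the current context; returns an eval log."""
    return [(q, answer(q, context) != "unknown") for q in questions]

def improve(context: set[str], log: list[tuple[str, bool]]) -> set[str]:
    """Feed failed questions back into the context — the loop 'on itself'."""
    return context | {q for q, ok in log if not ok}

questions = ["Revenue by month?", "Top clients?"]
context: set[str] = set()
for _ in range(2):                      # two passes of the feedback loop
    log = run_evals(questions, context)
    context = improve(context, log)

final = run_evals(questions, context)   # second pass should now succeed
```

The design choice the sketch illustrates is why evals need observability: without the logged eval results there is nothing for `improve` to consume, and the loop cannot close.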

64 00:10:53.340 00:10:54.110 Uttam Kumaran: Yes.

65 00:10:54.360 00:11:11.180 Uttam Kumaran: And so, that’s great. Like, I think you’ve done it once. I, yeah, kind of… that’s also kind of what I said. I said, and then also, I think you did a great job, like, getting certs, pushing that. I think, like, B, like, that’s, like, B’s number one goal is, like, get that going. And, like, I’m happy to see that this team is, like.

66 00:11:11.230 00:11:22.569 Uttam Kumaran: doing well. I think Mustafa kind of struggled a bit to, like, help, and so we’re kind of moving him off of L&D stuff, but I think you guys are a good team. I think both of you guys are, like, pretty direct communicators, and

67 00:11:22.700 00:11:27.480 Uttam Kumaran: that’s directly hitting our revenue, so that’s really great. And then, yeah, I think, like.

68 00:11:27.630 00:11:34.369 Uttam Kumaran: yeah, what did go well makes sense, as usual, around here, is, like, projects starting, ending. I think, like.

69 00:11:34.750 00:11:51.549 Uttam Kumaran: I put you on CTA, hopefully, to give you, like, a little bit of an isolated place where you can run. Like, daily, you can kind of just run, and they really value it, and it’s, like, not, like, the hardest, but it is, like, kind of new technology, and in that sense, it is kind of the hardest.

70 00:11:51.550 00:11:59.340 Amber Lin: But that’s what… I think that it’s a good positioning, and I think you and Robert kind of know what I… what I fit into better.

71 00:11:59.340 00:11:59.840 Uttam Kumaran: For sure.

72 00:11:59.840 00:12:03.019 Amber Lin: I would do better on this client than on Elovid.

73 00:12:03.020 00:12:03.849 Uttam Kumaran: Yes, yes, yes.

74 00:12:03.850 00:12:05.040 Amber Lin: a good choice.

75 00:12:05.040 00:12:10.089 Uttam Kumaran: Yeah, and so… this is why I think you’re gonna see that, like, when we go to CTA, like.

76 00:12:10.220 00:12:24.130 Uttam Kumaran: I think it’s gonna be amazing. They’re also a great client. I also think, like, without you on this, I don’t think we would have made this much progress. I don’t think I could have gone and started thinking about the broader contract with Catherine and things like that, so that was a good partnership.

77 00:12:24.270 00:12:35.190 Uttam Kumaran: I think, yeah, for me, I just want to see how, like, Amber does more for the business. Like, how can other people see what you’re doing and kind of, like, follow your path?

78 00:12:35.320 00:12:42.149 Uttam Kumaran: I think you’re still doing a lot, but you’re affect… right now, you’re affecting your client, and you’re affecting you. I’m like…

79 00:12:42.290 00:12:46.970 Uttam Kumaran: But, like, certs is one way, but I kind of… a lot of my theme today is gonna be, like.

80 00:12:47.260 00:12:51.309 Uttam Kumaran: How can you break off, like, 10 or 20% of your time to be, like.

81 00:12:51.570 00:12:57.110 Uttam Kumaran: who’s… who the hell here does not know skills? And everybody here needs to know skills, or like.

82 00:12:57.360 00:13:12.379 Uttam Kumaran: shipping something, like, back to the platform. Like, for example, like, Awash literally just built, like, a client portal, like, last week. And that was great, because he… he, like, went through the journey of, like, trying to build something on our platform, and I thought that was really fun.

83 00:13:12.670 00:13:30.950 Uttam Kumaran: And so I kind of want you to go even deeper on, like, not only building skills for data work, but, like, how can you build your own, like, for example, how can you… how can you build the, like, total rewards calculator that, like, you would want, right? Or how can you build, like, something for client management that, like, you think everybody could need?

84 00:13:31.090 00:13:46.279 Uttam Kumaran: Ultimately, what you’re gonna find, though, is, like, once you build it, you’re gonna learn what it takes to build a full-stack app. And also, we’re gonna go to a bunch of clients where… and this is part of a lot… I’m gonna continue to, like, tease this bigger pitch, is, like, prototyping is gonna matter as part of this, like, new thing.

85 00:13:46.380 00:13:53.959 Uttam Kumaran: And so, like, that’s kind of, like, what I wanted to… that’s probably my feedback, you know? It’s like, we’re a really good guinea pig, because, like.

86 00:13:53.960 00:13:54.480 Amber Lin: Huh.

87 00:13:54.480 00:13:57.599 Uttam Kumaran: I’m telling you, you have total freedom, we have all the data.

88 00:13:57.700 00:14:04.990 Uttam Kumaran: we have, you know, when we have a lot of problems, and so I kind of am interested in the people that are most AI-forward to be like.

89 00:14:05.280 00:14:10.260 Uttam Kumaran: build some stuff, like, see what it’s like to build some stuff and try to get adoption in our own community, you know?

90 00:14:10.820 00:14:26.060 Amber Lin: It’s funny that you say that was the… because as I was working on my own AI learning this weekend, and I think my first project that I had was inspired by the open code link that you guys sent me.

91 00:14:26.060 00:14:26.540 Uttam Kumaran: Nice.

92 00:14:26.540 00:14:45.929 Amber Lin: And my goal was to build a course for myself to learn the front end, back-end, and then how to apply AI engineering throughout, so that… I only had a plan, but as you were talking, I think this is something that we can have a course for the company as well.

93 00:14:46.350 00:14:55.809 Amber Lin: Right now, we focus more on using the tools, and I think in terms of building, of course, you can just work with AI, but you’ve got…

94 00:14:55.810 00:14:56.230 Uttam Kumaran: Totally right.

95 00:14:56.230 00:15:02.250 Amber Lin: I’ve built so much stuff that there’s reusable things, and there’s things that people don’t have.

96 00:15:02.250 00:15:19.270 Uttam Kumaran: But it should be as easy as, like, I want to make an app that does this. Okay, let me walk you through how to get the… do the design, do the back end, do the front end, now build it, now here’s how you get it reviewed. That’s the next unlock, and yeah, like, I want people to see that they can build their own software. Like…

97 00:15:19.270 00:15:19.830 Amber Lin: Yeah.

98 00:15:19.830 00:15:25.319 Uttam Kumaran: we just, yeah, you could totally do that. You’re probably, like, a week behind on doing that, yeah.

99 00:15:25.440 00:15:29.239 Uttam Kumaran: So, like, that’s probably my feedback. And yeah, I think, like.

100 00:15:29.500 00:15:44.910 Uttam Kumaran: you getting certified is great. You’re a great… I mean, it’s clearly a great… you’re a great student, and you just… you just crushed through that. I also think, like, the CTA feedback has been really great, and I will say, like, what you’re learning about AI is… is, like, really the thing to internalize, is, like.

101 00:15:45.470 00:15:53.359 Uttam Kumaran: It doesn’t really matter if you know the subject matter, it actually matters that you know how to use AI to learn whatever it is you don’t know.

102 00:15:53.470 00:16:00.970 Uttam Kumaran: See what I mean? Like, I know nobody knows Snowflake AI. Nobody knows how to build those agents, by the way. Not even the Snowflake people.

103 00:16:01.980 00:16:15.719 Uttam Kumaran: Like, so… and, like, what you saw is you built all these new semantic views, all these things are new concepts, right? But what you know is, like, you just know how to build anything, or you know how to learn anything, and that’s actually the skill set to learn, right?

104 00:16:16.100 00:16:35.019 Uttam Kumaran: Like, the thing for me would be, like, oh, you now know, like, AI Cortex, so we’re just gonna have you do Cortex work for now. No, no, no. Now I’m like, you need to lean into the fact that you just came into something that’s, like, brand new, and that you were able to learn it, and so now, you should feel free, like, you’re probably two weeks from shipping software.

105 00:16:35.410 00:16:40.379 Uttam Kumaran: even though you may not know front-end, back-end, don’t worry about that. Like, worry more about, like.

106 00:16:40.580 00:16:42.319 Uttam Kumaran: How fast can you learn anything?

107 00:16:42.570 00:16:46.250 Uttam Kumaran: Right? Or how can you effectively plan and research

108 00:16:46.390 00:16:49.860 Uttam Kumaran: and sort of, like, AI into the world.

109 00:16:50.050 00:16:51.209 Uttam Kumaran: what you want.

110 00:16:51.890 00:16:53.420 Uttam Kumaran: That’s… that’s the bigger skill set.

111 00:16:54.550 00:16:55.230 Amber Lin: I see.

112 00:16:55.230 00:16:58.039 Uttam Kumaran: So, like, Cortex is just, like, one thing.

113 00:16:58.340 00:16:59.190 Uttam Kumaran: Right.

114 00:16:59.190 00:16:59.950 Amber Lin: Yeah.

115 00:17:01.100 00:17:09.080 Amber Lin: I agree. They did have pretty good documentation, and I also had stuff that you already built.

116 00:17:09.329 00:17:15.599 Uttam Kumaran: But see, even I didn’t know until, like, I probably worked, like, 2 days on everything there, and, like, I didn’t know much.

117 00:17:15.839 00:17:17.279 Uttam Kumaran: But again, it’s like…

118 00:17:17.429 00:17:20.929 Uttam Kumaran: Some people are gonna take 6 to 8 months to learn to do what we just did.

119 00:17:21.289 00:17:26.349 Uttam Kumaran: In, like, 3-4 weeks. So, like, it’s, like, that dramatic. I’m telling you, it’s, like, that dramatic.

120 00:17:27.150 00:17:32.640 Uttam Kumaran: And you know, if you… now if you do it a second time, how long would it have taken you? Like, a couple days to get everything through? Like…

121 00:17:33.440 00:17:37.619 Uttam Kumaran: So… But the takeaway here is, like.

122 00:17:38.660 00:17:44.299 Uttam Kumaran: Like, knowing how to do something is not… is not really, an issue anymore.

123 00:17:45.080 00:17:46.120 Uttam Kumaran: See what I mean?

124 00:17:46.860 00:17:48.470 Uttam Kumaran: Yeah. Which is kind of crazy.

125 00:17:49.790 00:17:54.900 Amber Lin: It is, I think this is the new age of consulting firms, because

126 00:17:55.020 00:18:04.400 Amber Lin: like, when I first started in consulting, like, people also don’t know anything when they go into that. Yeah. And so it’s… I think it’s that, but amplified, so…

127 00:18:04.400 00:18:04.920 Uttam Kumaran: Yeah.

128 00:18:04.920 00:18:06.520 Amber Lin: But we’re able to do it.

129 00:18:07.100 00:18:07.430 Uttam Kumaran: Cool.

130 00:18:07.430 00:18:12.919 Amber Lin: I’m looking at the notes you have here, and key takeaways and action. Do you want to go into that a bit more?

131 00:18:13.160 00:18:26.190 Uttam Kumaran: Yeah, I just, like, I mean, other than your other stuff, yeah, I think Jasmine is great. I’m glad that she’s, like, spending time with everybody. I also think that, like, yeah, we’re, like, a much bigger company. I also think that…

132 00:18:26.330 00:18:31.080 Uttam Kumaran: We are… this is probably feedback from Kayla, is like.

133 00:18:31.180 00:18:49.550 Uttam Kumaran: for Kayla, it’s like, she’s gonna work on some more cultural things to get people to know each other. I’m really happy to see everybody in LA getting together so often, like, that’s amazing. I also think, yes, on the JD side, like, that’s kind of, like, what we’ll talk about today, is, like, I’m trying to think about how to place someone like you

134 00:18:49.720 00:18:51.750 Uttam Kumaran: and place someone like a Greg.

135 00:18:52.050 00:18:54.930 Uttam Kumaran: And like, yeah, and I think I have something.

136 00:18:55.340 00:18:57.970 Uttam Kumaran: And like, I think it’ll be better than…

137 00:18:59.030 00:19:01.630 Uttam Kumaran: analyst, like, I don’t think that’s fair.

138 00:19:01.760 00:19:07.550 Uttam Kumaran: But I also don’t think, like, I’ve… give me some credit, there’s not, like, a… there’s not, like, a North Star for me to, like…

139 00:19:08.580 00:19:09.940 Uttam Kumaran: I feel like a JD for some of the stuff I…

140 00:19:09.940 00:19:22.720 Amber Lin: Yeah, the stuff that we’re doing is very new. Like, of course, there’s a lot of people that’s trying to do the same thing, but it’s new, and JDs are the older market, so I…

141 00:19:22.720 00:19:23.050 Uttam Kumaran: Yes.

142 00:19:23.390 00:19:27.110 Amber Lin: I couldn’t find anything, and I was looking at the older stuff, like, this is…

143 00:19:27.220 00:19:30.370 Amber Lin: a part of what I do, but it’s not all that I want to do.

144 00:19:30.370 00:19:30.720 Uttam Kumaran: Yeah.

145 00:19:30.720 00:19:37.110 Amber Lin: I told Jazz, I was like, hey, I don’t… I’m not that interested in just one of these…

146 00:19:37.110 00:19:37.590 Uttam Kumaran: Yes.

147 00:19:37.590 00:19:44.340 Amber Lin: these, so… But then I feel, oh, I’m not… I haven’t dug deep enough into either.

148 00:19:44.340 00:19:44.670 Uttam Kumaran: Yeah.

149 00:19:44.670 00:19:49.660 Amber Lin: of these JDs, it’s just kind of merged together nicely, but then it’s a new thing.

150 00:19:51.280 00:19:54.019 Uttam Kumaran: Yeah, then my… probably my feedback is, like.

151 00:19:55.050 00:19:57.779 Uttam Kumaran: I wanna… even if we don’t talk, I still…

152 00:19:58.000 00:20:05.220 Uttam Kumaran: love hearing your reflections. And I actually was, you know, as you can notice, this isn’t the weekly ret- this isn’t your weekly retro, DB.

153 00:20:05.220 00:20:05.940 Amber Lin: Mmm.

154 00:20:06.090 00:20:09.559 Uttam Kumaran: So I was clicking around, because I… because, you know, like, a year ago.

155 00:20:09.640 00:20:10.150 Amber Lin: Whoa.

156 00:20:10.150 00:20:11.970 Uttam Kumaran: I didn’t look at all of them.

157 00:20:11.970 00:20:23.359 Amber Lin: I know, I haven’t been writing much, which is why I’m laughing. I was like… I was much more focused on writing things when it’s new, and then I.

158 00:20:23.360 00:20:37.149 Uttam Kumaran: No, but I think also it’s clearly, like, that’s how you think, and I also don’t want just the fact that we don’t need to not have, like, reflections, because I will read everything you send me in Slack. I had just been having a hard time getting time these days.

159 00:20:37.480 00:20:54.950 Uttam Kumaran: So, especially, like, Friday, like, end of the day. So, part of it is, like, I still want people that have a high bar for reflection and introspection to share that. Even it can just be a DM to me about, like, I’m feeling a certain way. And then I also want you, and these are… this is where, like, I’ll tell you, like.

160 00:20:54.950 00:21:08.940 Uttam Kumaran: it’s worth having, like, a strong opinion, but just, like, loosely held. Meaning, like, hey, I think I like this, here’s why, like, let’s talk about it. So I still want to… I don’t want you to be, like, a… I think for you and your career, don’t be, like, a leaf in the wind.

161 00:21:09.210 00:21:24.979 Uttam Kumaran: like, I think when you called me a few months ago, that was actually a great moment to be like, I feel like I’m a leaf in the wind, I think the wind’s dying, like, what do I do? But, like, I… I… you could even… you could do that once a week, it doesn’t matter, but I want you to feel like you have some control.

162 00:21:25.510 00:21:28.399 Uttam Kumaran: And then my job is to, like.

163 00:21:28.950 00:21:47.720 Uttam Kumaran: my job is not to say, okay, great, you have, like, a really interesting set of skills, but, like, I’m just gonna bottle you into, like, data analyst. Like, that’s, like, not at all. So my challenge is, like, okay, what is this, like, job? And I think that’s what today, like, I kind of wanted to talk a little bit about, like.

164 00:21:47.850 00:21:54.440 Uttam Kumaran: our kind of positioning. Like, I don’t know if you had a chance to look at the thing that,

165 00:21:54.660 00:22:00.560 Uttam Kumaran: Robert sent at all, or, like, on a kind of new company positioning, but I can pull it up.

166 00:22:00.560 00:22:03.830 Amber Lin: I kind of read it in the Brainforge team chat.

167 00:22:03.830 00:22:14.590 Uttam Kumaran: Okay, I’ll… I’ll pull it up, and then maybe we can just, like… gloss over it. Let’s see…

168 00:22:26.620 00:22:29.600 Amber Lin: I can send the link, I think it’s this one.

169 00:22:30.970 00:22:32.550 Uttam Kumaran: Yeah, I have it here.

170 00:22:37.900 00:22:39.210 Uttam Kumaran: Okay.

171 00:22:40.100 00:22:49.540 Uttam Kumaran: So yeah, I mean, they’re basically moving to, like, a company that is all about, like, context engineering.

172 00:22:51.390 00:22:57.239 Uttam Kumaran: And… Probably, like, part of the reason for doing this is, like.

173 00:22:57.530 00:23:04.470 Uttam Kumaran: as you know, everything that works in AI is really gonna be dependent on, like, great context.

174 00:23:05.120 00:23:14.580 Uttam Kumaran: So… Like, part of the thing we want to work on is, like.

175 00:23:14.730 00:23:24.820 Uttam Kumaran: for companies, when they hire us, we want to be positioned as, like, the people that get your data estate into the right position in order to enable great AI.

176 00:23:25.010 00:23:33.430 Uttam Kumaran: And not the other way around, meaning, like, we’re not gonna start… we’re not gonna just do straight data work anymore without an AI focus.

177 00:23:34.150 00:23:35.360 Uttam Kumaran: Whether that’s, like, enabling…

178 00:23:35.360 00:23:39.519 Amber Lin: We finally decided to, like, or found a way to…

179 00:23:39.520 00:23:40.770 Uttam Kumaran: No way.

180 00:23:41.170 00:23:48.770 Uttam Kumaran: I wanted to do it, but we, like… yeah, you know, we’ve, like, tried to figure it out. But also, like, this is something that the market didn’t know until…

181 00:23:49.350 00:23:58.270 Uttam Kumaran: Because I’ve been pitching this for 2 years, you know, context engineering, and nobody got it until, like, probably, like, 4 months ago, 3 months, and now they’re getting it.

182 00:23:58.380 00:24:02.740 Uttam Kumaran: And then guess what? We’re now, like, the only people that have ever done stuff like this.

183 00:24:03.140 00:24:15.650 Uttam Kumaran: not only internally, but actually have some client wins where we’re doing this. And so, like, I don’t have to pitch you on, like, why you need great context for AI, but, like, I don’t know, this is, like, a great…

184 00:24:15.770 00:24:18.349 Uttam Kumaran: This is kind of a good way to think about it, you know?

185 00:24:18.350 00:24:21.209 Amber Lin: Let me try and find that, I want to read that.

186 00:24:23.070 00:24:23.990 Amber Lin: Cool.

187 00:24:24.680 00:24:34.140 Amber Lin: And I guess my question, and you probably have the… where we’re gonna discuss this, is where do we plan for me to fit in?

188 00:24:36.230 00:24:43.739 Uttam Kumaran: Kind of, yeah, basic… well, it’s… I’m kind of, like, pitching this as, like, this is where our company is headed, and…

189 00:24:44.170 00:24:49.539 Uttam Kumaran: We kind of need to… We kind of need to just, like.

190 00:24:49.830 00:24:54.089 Uttam Kumaran: I mean, figure out both how, like, our existing team fits into this to support this.

191 00:24:54.370 00:24:56.980 Uttam Kumaran: And… kind of how…

192 00:24:57.850 00:25:05.789 Uttam Kumaran: this changes the type of work that we’re doing for people. But ultimately, I think this is really in line with, like, what we’re doing for, like, a CTA.

193 00:25:06.000 00:25:09.900 Uttam Kumaran: And so, ultimately, I think…

194 00:25:11.420 00:25:16.220 Uttam Kumaran: we are headed this direction more formally. You know?

195 00:25:16.380 00:25:17.410 Uttam Kumaran: But…

196 00:25:17.660 00:25:22.830 Uttam Kumaran: I think this is a good deck to kind of go through. It’s really long, and, like, we’re working on it, but.

197 00:25:22.830 00:25:24.709 Amber Lin: You guys are so much prettier.

198 00:25:25.150 00:25:28.119 Amber Lin: It has been pretty, but this is even better.

199 00:25:28.120 00:25:29.470 Uttam Kumaran: This is getting better.

200 00:25:29.820 00:25:41.020 Uttam Kumaran: So, like, either way, we’re gonna codify this, but to largely tell you, like, we’re starting to shift this direction more sincerely, and I think more of, like.

201 00:25:41.710 00:25:55.739 Uttam Kumaran: more of, like, what I… what I kind of want to get across today is I think you’re… the way you fit in is really well aligned with this future of, like, what we’re trying to sell, but ultimately, I think we have, like, two…

202 00:25:56.110 00:25:59.140 Uttam Kumaran: We’re gonna have, like, kind of two roles, probably, in the company.

203 00:26:00.570 00:26:03.899 Uttam Kumaran: We’re probably gonna have people that sit as, like, AI strategists.

204 00:26:04.220 00:26:06.729 Uttam Kumaran: People that are able, that do understand

205 00:26:07.690 00:26:20.180 Uttam Kumaran: how this all works. They understand what context engineering is, they understand skills and skill building, and they’re able to prototype. But most of all, they’re able to, like, sit and teach people.

206 00:26:20.430 00:26:25.740 Uttam Kumaran: And, like, actually be like, cool, company wants to, like, get on this path. How do we do that, right? So…

207 00:26:25.980 00:26:27.810 Uttam Kumaran: It’s, like, a mix of, sort of, like.

208 00:26:28.030 00:26:40.750 Uttam Kumaran: serious, like, MBA kind of, like, consulting, plus, like, some prototyping and some, like, deeper technical understanding, but it doesn’t mean, like, okay, you need to go build every single thing, or they’re, like.

209 00:26:40.810 00:26:56.149 Uttam Kumaran: for example, if I was to be like, cool, build a great, like, role-based access control system, you’d be like, dude, like, I could do that, but I’m sure there’d be something that’s, like, messed up. More of, like, what that is gonna go to is, like, what we’re gonna call, like, basically, like, AI engineering.

210 00:26:56.640 00:27:02.840 Uttam Kumaran: which is actually just gonna be, like, these, like, 100X, like, engineers. It’s kind of like me shipping, like.

211 00:27:02.970 00:27:05.470 Uttam Kumaran: Everything on the platform, like, alone.

212 00:27:05.580 00:27:07.310 Uttam Kumaran: Like, kind of on the side.

213 00:27:07.430 00:27:13.000 Uttam Kumaran: That level of, like, execution power, which is, like, we’re just gonna…

214 00:27:13.920 00:27:16.949 Uttam Kumaran: We’re gonna bias towards having engineers who

215 00:27:17.150 00:27:30.060 Uttam Kumaran: are able to really be, like, 100x, like, what a classic engineer can be in terms of pace, in terms of, like, the sophistication of the solution, and we’re gonna have, like, these two kinds of roles, I’m thinking.

216 00:27:31.460 00:27:46.229 Uttam Kumaran: And so, part of this is, like, there’s also another way to think about it, is, like, there’s sort of two axes. There’s people that come to the table with, like, a deep subject matter expertise, like, for example, like Awash, deep data expertise.

217 00:27:46.370 00:27:50.000 Uttam Kumaran: And then they, like, learned the AI piece, and now they’re like.

218 00:27:50.160 00:27:57.720 Uttam Kumaran: blasting really, really hard, right? But their main job is still, like, a lot on the data side. You also have people that are…

219 00:27:57.960 00:28:04.210 Uttam Kumaran: have, like, no background, no, like, subject matter expertise, but they’re accelerating really hard on AI.

220 00:28:04.590 00:28:13.300 Uttam Kumaran: And, like, that’s another background that we’re finding. You’re kind of in the middle, but frankly, actually, what… where Brainforge people like you and, like, Greg.

221 00:28:13.410 00:28:16.819 Uttam Kumaran: Are unique, in that you guys just learned it.

222 00:28:17.130 00:28:20.560 Uttam Kumaran: Which… and you guys are also very, like, patient and can teach.

223 00:28:20.780 00:28:27.079 Uttam Kumaran: And so that is actually really, really fit for, like, kind of like this, like, AI strategist role that we’re planning.

224 00:28:27.790 00:28:32.809 Uttam Kumaran: It’s kind of like, for example, if I was to get a call from X company, and they were like.

225 00:28:33.130 00:28:38.010 Uttam Kumaran: We’re, like, 4 years behind you guys, we just need, like, some help to start thinking about, like.

226 00:28:38.550 00:28:40.249 Uttam Kumaran: How we could even do this?

227 00:28:41.370 00:28:43.320 Uttam Kumaran: You, I call you, and I call Greg.

228 00:28:45.280 00:28:50.489 Uttam Kumaran: And you’re gonna find that the deals that we’re getting are kinda like that. Like, they’re very… they’re like…

229 00:28:51.140 00:28:56.760 Uttam Kumaran: Like, we’ve had a platform for, like, 2 years. There… people are, like, not even on ChatGPT who are calling us.

230 00:28:57.670 00:28:59.800 Uttam Kumaran: So, they’re years behind.

231 00:29:00.480 00:29:06.989 Uttam Kumaran: And so you gotta think about what do they want? They don’t want, like, a chat app. They want, like, a 2- or 3-year roadmap.

232 00:29:07.600 00:29:11.620 Uttam Kumaran: Of, like, how do we even do this? Like, what is it we’re doing?

233 00:29:11.760 00:29:18.750 Uttam Kumaran: Right? Like, they’ve never heard of a skill before. And the benefit is, like, you actually don’t need to have, like, deep…

234 00:29:19.700 00:29:24.950 Uttam Kumaran: technical depth. What you actually need is recent depth.

235 00:29:25.120 00:29:28.170 Uttam Kumaran: Like, you need to have gone on this journey yourself.

236 00:29:28.410 00:29:28.910 Amber Lin: Right?

237 00:29:29.010 00:29:39.079 Uttam Kumaran: To be able to talk about skills, talk about using Cursor, talk about this flywheel, this feedback loop, talk about evals, right? Nobody has been doing that for, like, that long.

238 00:29:39.280 00:29:43.989 Uttam Kumaran: And so, part of what we’re thinking about is forming a new team

239 00:29:44.100 00:29:46.570 Uttam Kumaran: kind of like a new AI team.

240 00:29:46.860 00:29:49.730 Uttam Kumaran: That is sort of top-heavy with strategists.

241 00:29:50.370 00:29:53.809 Uttam Kumaran: And then has a couple of, like, really, really hardcore engineers.

242 00:29:54.160 00:29:57.899 Uttam Kumaran: That are, like, 1 to 5, meaning, like.

243 00:29:58.560 00:30:02.370 Uttam Kumaran: they’re kind of like how I’ve been able to ship all this stuff for us, like.

244 00:30:02.700 00:30:06.319 Uttam Kumaran: They’re really pushing the limits of, like, what can be built.

245 00:30:06.690 00:30:10.430 Uttam Kumaran: And that’s, like, sort of this, like, new team that I’m… that I’m sort of planning.

246 00:30:10.670 00:30:12.449 Amber Lin: Like, yeah, consulting.

247 00:30:12.570 00:30:13.340 Amber Lin: Team.

248 00:30:14.330 00:30:15.590 Uttam Kumaran: Yeah, I honestly.

249 00:30:15.590 00:30:16.200 Amber Lin: I just think it’s.

250 00:30:16.200 00:30:18.110 Uttam Kumaran: it’s like a new AI team, like…

251 00:30:18.110 00:30:18.920 Amber Lin: Yeah.

252 00:30:18.920 00:30:21.919 Uttam Kumaran: I think the previous AI team was people that

253 00:30:22.570 00:30:28.690 Uttam Kumaran: know how to use AI to build stuff, I don’t think that’s that… was enough.

254 00:30:30.190 00:30:31.679 Uttam Kumaran: You need to be able to teach.

255 00:30:32.630 00:30:47.940 Uttam Kumaran: and you need to be able to teach all the primitives, like, there’s still things that I feel like we haven’t learned here, because we haven’t done at our company, like long-running agents, like memory, like, stuff like OpenClaw, like these, like, larger personal assistant style frameworks.

256 00:30:47.940 00:30:56.869 Uttam Kumaran: But, like, I’m comfortable that if you guys spent a week and messed around with it, you’d be able to learn it, and you’d be able to teach it. And that’s the strategy kind of group.

257 00:30:57.200 00:31:04.320 Uttam Kumaran: And then there’s a deeper piece of people that are, like, all they’re doing is, like, once the strategy group sells the implementation.

258 00:31:04.680 00:31:07.769 Uttam Kumaran: Those guys are building whatever it is that we need.

259 00:31:08.140 00:31:14.290 Uttam Kumaran: And the strategy group… the strategy… strategists are more like product managers, style-wise.

260 00:31:14.470 00:31:20.170 Uttam Kumaran: Where you’re, like, able to get the requirements, maybe build a little prototype.

261 00:31:20.280 00:31:25.389 Uttam Kumaran: Get the bigger one built by the team, and then kind of come back and do the demo and loop.

262 00:31:25.510 00:31:28.389 Uttam Kumaran: And, like, that’s kind of this, like, new team that we’re…

263 00:31:28.610 00:31:30.679 Uttam Kumaran: That I’m sort of thinking about right now.

264 00:31:30.950 00:31:31.610 Amber Lin: Okay.

265 00:31:32.610 00:31:45.660 Amber Lin: Well, that sounds very exciting to me, because it sounds like we will… that’s… the strategy team is learn new stuff, do new stuff, and then document learnings, and I think that’s what I like to do.

266 00:31:46.640 00:31:49.670 Uttam Kumaran: Yeah, so I… Exactly.

267 00:31:49.820 00:32:01.489 Uttam Kumaran: So that’s sort of, like, what I’m thinking. I’m not sure what the timeline is, but I also think it lines up well with your work for CTA. I think you’re using AI to do most of it.

268 00:32:01.630 00:32:03.279 Uttam Kumaran: I also think…

269 00:32:03.540 00:32:11.869 Uttam Kumaran: it is… there’s an aspect of learning, but it’s also this, like, strategy aspect. I actually think some of our engagements are going to be even, like, less hands-on keyboard.

270 00:32:12.040 00:32:16.039 Uttam Kumaran: It’s gonna be just more of, like, What is a skill?

271 00:32:16.300 00:32:18.039 Uttam Kumaran: Or, like, what is context?

272 00:32:18.140 00:32:20.670 Uttam Kumaran: Like, it’s sort of, like, even further back.

273 00:32:21.110 00:32:23.650 Uttam Kumaran: And it’s gonna involve, like.

274 00:32:24.100 00:32:30.529 Uttam Kumaran: going to see people, or going to see clients, and, like, basically being, like, their first AI team.

275 00:32:31.390 00:32:38.939 Uttam Kumaran: And I’m kind of thinking of forming it around, like, you, around Greg, around Pranav.

276 00:32:39.090 00:32:40.330 Uttam Kumaran: And then…

277 00:32:40.440 00:32:50.980 Uttam Kumaran: I’m… we’re bringing on Davis on some of the engineering stuff. So, but really, my focus in the beginning is actually just this, like, strategy group, of people that are, like.

278 00:32:51.260 00:32:54.410 Uttam Kumaran: on-the-fly thinkers, that are, like.

279 00:32:54.970 00:32:59.319 Uttam Kumaran: whatever you just said, give me, like, 20 minutes, I’ll get back to you, and I’ll kind of, like, have figured it out.

280 00:32:59.940 00:33:02.950 Uttam Kumaran: Are really forward with, like, understanding how to teach.

281 00:33:03.080 00:33:11.900 Uttam Kumaran: and also recently went through the journey themselves. Meaning, I can’t put someone in this who themselves aren’t hammering AI, like, every day.

282 00:33:12.100 00:33:21.969 Uttam Kumaran: Right? I don’t think it makes sense. I don’t think you can explain it. And I know you’re doing a lot, I know Greg’s doing it a lot. And so, yeah, kind of my questions are, like.

283 00:33:22.740 00:33:24.310 Uttam Kumaran: What do you think about that?

284 00:33:25.040 00:33:29.980 Uttam Kumaran: I also think about, like, How do you take some of your time

285 00:33:30.090 00:33:48.669 Uttam Kumaran: to actually dedicate to, like, running a little bit of the strategy loop within Brainforge, so you can start to get better. Like, how are you going and being like, let’s say I was a strategist for myself, well, what could I prototype and then hand to the platform team to maybe, like, productionalize, you know? And…

286 00:33:49.200 00:33:53.360 Uttam Kumaran: Yeah, and so those are kind of, like, the first two bullets.

287 00:33:53.960 00:34:11.919 Amber Lin: Hmm, I see. Well, I’m very interested in the role. I don’t exactly know what it’ll look like, but based on what we said, very interesting. It’s pretty much exactly I’m trying to do myself, so I’m glad I can do it in the company. In terms of influence.

288 00:34:11.929 00:34:17.599 Amber Lin: I… the moment you said it, I was like, hmm, I… I don’t exactly know the state of our…

289 00:34:17.699 00:34:27.689 Amber Lin: company, and I think that’s… if I were to treat Brainforge as a client, I think all… always the first step would be intake, and to know what

290 00:34:27.810 00:34:36.560 Amber Lin: the current state is, so if… if I were to have that, then I can start prototyping things.

291 00:34:37.860 00:34:43.099 Amber Lin: I think I have my… I have time to do that.

292 00:34:43.100 00:34:52.850 Uttam Kumaran: But it’s… but it’s not, like, I guess what I would say, it’s not gonna be, like, 50%, but it’s like, let’s say you had, like, 2 or 3 hours a week, or a few hours a week to just, like, build something for the delivery team.

293 00:34:53.870 00:34:56.370 Uttam Kumaran: But think of it like we were the client.

294 00:34:56.580 00:35:00.929 Uttam Kumaran: I think it’s gonna help you just, like, learn in an environment that’s really safe.

295 00:35:01.740 00:35:04.730 Uttam Kumaran: I’m like, okay, what is it like if I, like, looked at my day-to-day and I’m like.

296 00:35:05.720 00:35:10.060 Uttam Kumaran: I want to build something that helps you, like, track hours better, or I want to build something that, like.

297 00:35:10.270 00:35:19.870 Uttam Kumaran: is able to deliver me a daily update every day of, like, what’s in my schedule? That’s something that I think everybody could use. How about I go ahead and build it, I get some buy-in.

298 00:35:20.250 00:35:21.480 Uttam Kumaran: And then I’ll ship it.

299 00:35:21.650 00:35:31.650 Uttam Kumaran: and then platform team can take it, automate it, put it wherever, but then you’re, like, the product owner, you know? That’s a great example. I think you could knock that out if you spent, like, 2-3 hours, you know?

300 00:35:31.870 00:35:32.500 Amber Lin: Yeah.

301 00:35:33.140 00:35:36.880 Amber Lin: Well, like, putting Clockify in the platform.

302 00:35:37.180 00:35:37.660 Uttam Kumaran: Yeah.

303 00:35:37.660 00:35:39.739 Amber Lin: We don’t have to deal with Clockify.

304 00:35:39.740 00:35:40.640 Uttam Kumaran: Exactly.

305 00:35:42.130 00:35:48.479 Uttam Kumaran: So, like, that’s sort of where I want, like… but this is sort of, like, what I’m thinking. I also think, like…

306 00:35:50.880 00:35:56.689 Uttam Kumaran: I don’t know on what timeline, but I wrote, like, sort of this messy middle, is there’s gonna be a lot of people in the middle.

307 00:35:57.240 00:36:04.289 Uttam Kumaran: But I don’t know… like, what they’re gonna do longer term. Because if you have deep subject matter expertise.

308 00:36:05.070 00:36:08.729 Uttam Kumaran: the only expectation now is that you use AI to do that faster.

309 00:36:08.900 00:36:11.110 Amber Lin: Yeah, I think Zora’s a good example.

310 00:36:11.110 00:36:14.620 Uttam Kumaran: That’s a good example. But, you know, that took a lot of pushing.

311 00:36:14.620 00:36:16.220 Amber Lin: Yeah, I remember.

312 00:36:16.220 00:36:20.590 Uttam Kumaran: And so, I know, but Awash is a great example. Awash is already a monster.

313 00:36:20.590 00:36:21.380 Amber Lin: Yeah.

314 00:36:21.380 00:36:24.169 Uttam Kumaran: And so it only becomes bigger, right?

315 00:36:24.840 00:36:30.250 Uttam Kumaran: But then this is where I told… I mentioned to Jasmine feedback. I said, dude, you’re not, like, you’re not doing enough AI.

316 00:36:30.700 00:36:34.119 Uttam Kumaran: And I don’t know, until you do it, how are you gonna believe it? But, like…

317 00:36:34.870 00:36:37.519 Uttam Kumaran: Doing things the analog way is not an option.

318 00:36:38.380 00:36:40.679 Uttam Kumaran: Like, it’s not impre- that’s not impressive to me.

319 00:36:41.360 00:36:42.320 Uttam Kumaran: At all.

320 00:36:42.560 00:36:52.310 Uttam Kumaran: It’s like, why not we hire people? It’s, like, not our business model, right? And so, ultimately, that’s on us… if you’re not using it, then we’re bad at… we’re bad at training people.

321 00:36:52.810 00:36:58.140 Uttam Kumaran: Like, right? But ultimately, like, how can we have a company, and we’re pitching AI transformation.

322 00:36:58.400 00:37:03.420 Uttam Kumaran: And people aren’t using AI in the company? There’s, like, not a… there shouldn’t be a single person.

323 00:37:04.290 00:37:08.010 Uttam Kumaran: And so that’s also what I think about, is, like, if you’re in the middle, right, if you, like.

324 00:37:08.300 00:37:12.130 Uttam Kumaran: If you’re, like, a ju… if you’re coming with, like, no experience in anything and don’t know AI,

325 00:37:12.310 00:37:17.199 Uttam Kumaran: you’re kind of in the middle. If you come in with deep expertise somewhere, but then, like, you’ve never…

326 00:37:17.420 00:37:18.820 Uttam Kumaran: Created a skill.

327 00:37:19.510 00:37:27.099 Uttam Kumaran: you’re kind of in the middle, too. I’m like… so both of those characters, I don’t really know how they fit into a longer-term Brainforge strategy.

328 00:37:29.390 00:37:37.500 Uttam Kumaran: But it’s okay, like, I think we have some time to sort of nudge people in the right direction, and as you know, it’s sort of, like, undeniable.

329 00:37:37.680 00:37:41.730 Uttam Kumaran: Once you start using it. And the faster everybody in this company

330 00:37:41.980 00:37:44.490 Uttam Kumaran: But Kayla’s shipping her first skills today.

331 00:37:46.410 00:37:51.859 Uttam Kumaran: You know, it’s like, once people start, you see it, you can’t unsee it, and so, ultimately.

332 00:37:52.090 00:37:57.790 Uttam Kumaran: we’re not only doing that for ourselves, we are selling that transformation. You see what I mean?

333 00:37:58.260 00:37:58.880 Amber Lin: Yeah.

334 00:37:59.010 00:38:07.850 Uttam Kumaran: I want to get every one of your team members to the point where they can’t unsee it, and they’re just on this flywheel of optimizing their workflows and their efficiency, you know?

335 00:38:07.850 00:38:14.279 Amber Lin: So are we… It sounds like we’re giving the course to them as well.

336 00:38:14.430 00:38:16.620 Amber Lin: It’s much easier for them.

337 00:38:16.620 00:38:22.589 Uttam Kumaran: So the… so yes, another version is, like, we will most likely monetize L&D

338 00:38:22.990 00:38:30.549 Uttam Kumaran: Externally, through this way, because there isn’t a… there isn’t, like, an… industry-standard certification process.

339 00:38:31.480 00:38:37.659 Uttam Kumaran: And it’s not like we’re gonna start a Coursera, but when we come to the table for a client, we’re gonna be like, we have strategists.

340 00:38:37.840 00:38:41.919 Uttam Kumaran: we have our learning and development team, we also have our engineers, right? So, like.

341 00:38:42.300 00:38:46.059 Uttam Kumaran: We have strategists that come sit with your C-suite and, like, think through this.

342 00:38:46.480 00:38:52.090 Uttam Kumaran: We also have, like, engineers who can go build whatever, and then we have your learning and development team to train your staff to, like.

343 00:38:53.220 00:39:04.959 Uttam Kumaran: open their first IDE, like, ship their first skill, like, things like that. And so, ultimately, B, when he formed the L&D team, told me that his goal was, like, how do we commercialize this?

344 00:39:05.120 00:39:07.930 Uttam Kumaran: By the… by the start of next quarter.

345 00:39:08.140 00:39:12.340 Uttam Kumaran: I called him last week, and I said, I think I’ll have something for you this month.

346 00:39:12.830 00:39:26.070 Uttam Kumaran: So, hurry up, like, keep… like, make it… polish it up, get everybody going, because, we… we probably have… the next, like, probably 10, 20 deals are all AI for us.

347 00:39:26.270 00:39:28.310 Uttam Kumaran: Completely in this direction.

348 00:39:28.690 00:39:33.510 Uttam Kumaran: Like, we’re not doing any pure play data anymore. We’re just pitching it, pitching it at least.

349 00:39:34.070 00:39:44.719 Amber Lin: Do we have enough, like, at least a shared understanding among… amongst strategists, as you… as you call them, about, like.

350 00:39:45.580 00:39:48.739 Uttam Kumaran: No, I’ll… I only talked to you, I talked to Greg earlier, and I talked to you.

351 00:39:48.740 00:39:49.150 Amber Lin: Okay.

352 00:39:49.150 00:39:52.630 Uttam Kumaran: I mean, I mean, I, like, this is where, like, some people, like.

353 00:39:52.840 00:39:55.139 Uttam Kumaran: But everybody has pieces of this, like…

354 00:39:55.710 00:40:00.739 Uttam Kumaran: Robert, from the sales side, is like, dude, how are you gonna deliver on all this AI work I’m about to sell?

355 00:40:01.040 00:40:03.269 Uttam Kumaran: Right? I’m also like.

356 00:40:03.430 00:40:09.650 Uttam Kumaran: well, the stuff we’re doing for ABC and Lilo is not really our final form for AI work.

357 00:40:09.760 00:40:13.950 Uttam Kumaran: That’s almost like engineering, and there’s a little piece of AI in it.

358 00:40:13.950 00:40:14.520 Amber Lin: Yeah.

359 00:40:14.990 00:40:16.280 Uttam Kumaran: I don’t like that either.

360 00:40:16.410 00:40:18.490 Uttam Kumaran: And so, ultimately, I’m like.

361 00:40:18.740 00:40:35.930 Uttam Kumaran: we… what are we even selling? We’re selling this, like, context engineering, but all this starts with, like, broader strategy, and so the data people fit under there, because there’s a lot of data work to do this. There’s also a lot of other work. There’s, like, security, hard… the hard… basically the… it’s, like, harness engineering, right?

362 00:40:36.250 00:40:42.889 Uttam Kumaran: The problem with… the problem is nobody knows har… in 6 months, or 12 months, they’ll know the word harness, and then we’ll say we harness engineer.

363 00:40:43.010 00:40:48.390 Uttam Kumaran: Right now, somehow people know context engineering. They didn’t know when I said it two years ago, or even a year ago.

364 00:40:48.390 00:40:48.980 Amber Lin: Matt?

365 00:40:48.980 00:40:57.679 Uttam Kumaran: everybody knows it, and so the broader thing, and I’d also, like, I’m happy to send you our presentation from

366 00:40:57.840 00:41:03.979 Uttam Kumaran: Vixel. Vixel, but the reception was really, really positive.

367 00:41:05.590 00:41:11.999 Uttam Kumaran: And I think everybody gets it, and they see us, and how deep we’ve gone on data, and they’re like.

368 00:41:12.490 00:41:18.780 Uttam Kumaran: that matters to solve an AI feature, not really, like, just do normal dashboards, you know?

369 00:41:21.460 00:41:28.909 Amber Lin: So, let’s see… I mean, I think this deck has a lot of, high-level

370 00:41:29.040 00:41:38.379 Amber Lin: architecture of, like, how… what the components are, but I think if we have robust enough

371 00:41:38.700 00:41:50.970 Amber Lin: documentation, or at least a knowledge base, we should be able to generate some basic strategies if we have the client information. Yeah. Because the structure is the same.

372 00:41:51.100 00:41:58.640 Amber Lin: Yeah. So, I think maybe that’s how you can deliver on all of the work that’s coming in.

373 00:41:58.820 00:42:09.709 Amber Lin: To have… because designing it well, it just means adapting this good structure to their needs, and then, like, mostly that’s the heavy lifting of

374 00:42:10.100 00:42:16.439 Amber Lin: Talking to them, getting context, and then just customizing this template.

375 00:42:16.930 00:42:17.550 Uttam Kumaran: Yeah.

376 00:42:17.550 00:42:19.130 Amber Lin: And educating them, so…

377 00:42:19.130 00:42:19.820 Uttam Kumaran: Yes.

378 00:42:22.310 00:42:25.740 Amber Lin: Not too much, like, technical…

379 00:42:26.010 00:42:28.909 Uttam Kumaran: No, but, like, I think it’s, it’s like…

380 00:42:29.170 00:42:31.210 Uttam Kumaran: what does the word technical mean? It’s just.

381 00:42:31.210 00:42:41.190 Amber Lin: I know, I just… I just asked Davis. He asked me today, how do you define your technical experience? I was like, I don’t know what you mean.

382 00:42:42.050 00:42:47.199 Uttam Kumaran: You know, and so… I think my… yeah, the Slack message you sent me…

383 00:42:47.990 00:42:54.820 Uttam Kumaran: sort of, like, hits… it hits more about, like, how we’re hiring and things like that, but…

384 00:42:55.540 00:42:57.219 Uttam Kumaran: I don’t know, I think you’ve…

385 00:42:57.570 00:43:01.199 Uttam Kumaran: Because you drove… you dove into, like, the deep end.

386 00:43:01.480 00:43:03.719 Uttam Kumaran: Like, more of my point today is, like.

387 00:43:03.970 00:43:05.610 Uttam Kumaran: I think I have a path.

388 00:43:06.350 00:43:16.760 Uttam Kumaran: for you, that is actually defining this, like, undefined set of, like, you’re a generalist, but you learn fast, but then you also learn so much about AI that, like.

389 00:43:16.950 00:43:18.760 Uttam Kumaran: you should actually teach AI.

390 00:43:18.880 00:43:23.230 Uttam Kumaran: And that’s where you, Greg, I want to position some people to just…

391 00:43:23.490 00:43:29.699 Uttam Kumaran: basic… because otherwise, I’m the only one who can take a call and explain what a skill is today.

392 00:43:29.830 00:43:35.229 Amber Lin: I like that. I would really love to, like, standardize how we talk about.

393 00:43:35.230 00:43:35.580 Uttam Kumaran: Yeah.

394 00:43:36.650 00:43:40.889 Uttam Kumaran: And then it allows you to go deeper. I was… my feedback was go deeper, right?

395 00:43:40.890 00:43:41.600 Amber Lin: Yeah.

396 00:43:41.600 00:43:46.470 Uttam Kumaran: in the thing… the number one thing worth going deeper on, on planet Earth right now.

397 00:43:46.610 00:43:48.890 Uttam Kumaran: Right?

398 00:43:48.890 00:43:49.740 Amber Lin: life.

399 00:43:50.100 00:43:53.399 Uttam Kumaran: Yes, you could go deeper on dashboarding and blah blah blah, but, like.

400 00:43:53.910 00:43:58.849 Uttam Kumaran: What an opportunity to go deep on the thing that nobody has any depth on.

401 00:43:59.160 00:44:01.489 Uttam Kumaran: This is the new technology.

402 00:44:01.900 00:44:02.750 Uttam Kumaran: like…

403 00:44:03.010 00:44:11.219 Uttam Kumaran: And just by, like, even in your short depth of, like, a year of using all this stuff, you’re already deeper than most of the market.

404 00:44:12.600 00:44:22.209 Uttam Kumaran: You know, and so what I want to try to do is convert your time so that you’re spending most of your time on teaching that, learning the latest loop.

405 00:44:22.350 00:44:23.270 Uttam Kumaran: Right? And Luke.

406 00:44:23.270 00:44:31.029 Amber Lin: Yeah, I’m more than happy to do that if, like, my allocation allows, which sounds like the case.

407 00:44:31.030 00:44:38.220 Uttam Kumaran: No, and I want… and I want more… I want more money coming to that group, because we’re gonna bill higher for that. It’s gonna be, like…

408 00:44:38.810 00:44:44.150 Uttam Kumaran: Sort of undefined strategy work, and it’s gonna… this is kind of gonna be the premier team.

409 00:44:44.400 00:44:49.930 Uttam Kumaran: Because ultimately, like, I don’t know, I don’t… I don’t think depth in a certain subject area

410 00:44:50.260 00:44:58.769 Uttam Kumaran: I think it’s becoming cheaper. So, like, yes, if you have, like, 10 years of experience, okay, I can probably see a couple things that the AI can’t maybe, like, stitch

411 00:44:59.420 00:45:00.760 Uttam Kumaran: together, the thing.

412 00:45:01.320 00:45:08.410 Uttam Kumaran: And, like, that’s how I feel, like, in data or in, like, product building. I’ve just done it for so long. I think I can see certain patterns.

413 00:45:08.990 00:45:16.440 Uttam Kumaran: But, like, less than that, or, like, if you never… even if you worked for a long time, you never went deep on something, sort of, like…

414 00:45:17.740 00:45:21.659 Uttam Kumaran: I don’t know. I… I don’t think there’s a… there’s a path, so…

415 00:45:21.660 00:45:22.160 Amber Lin: Yeah.

416 00:45:22.160 00:45:29.400 Uttam Kumaran: part of it is, like, you have to know some areas so deeply so that you can automate it, you know? Or build skills that could help people do it.

417 00:45:29.700 00:45:34.759 Uttam Kumaran: But frankly, I think everybody’s gonna become an engineer, everybody’s gonna become a salesperson, every… like…

418 00:45:34.910 00:45:45.750 Uttam Kumaran: you’re gonna become this sort of, like, weapon of a person that’s, like, kind of like who me and Robert are, right? We’re kind of, like, have learned how to do all these things, but at the same amount of time that I’ve always had.

419 00:45:45.960 00:45:55.160 Uttam Kumaran: But our propensity to learn has gone faster, we’ve used all the skills, and now, ultimately, we’re monetizing the fact that our company was one of the first to, like.

420 00:45:55.540 00:45:57.109 Uttam Kumaran: transform itself.

421 00:45:57.410 00:46:01.639 Uttam Kumaran: using AI, and we’re basically selling that story, you know?

422 00:46:03.780 00:46:13.179 Amber Lin: Yeah, I’m… I’m excited. I think what I would like to do next, because I’m… I’m very sold on Wuji sets now.

423 00:46:13.180 00:46:13.900 Uttam Kumaran: Okay.

424 00:46:15.150 00:46:30.099 Amber Lin: And so I’m just diving into the how. I think I’ll try to structure some, or at least list some concrete stuff I want to do. I’m gonna go ahead and experiment with the…

425 00:46:30.900 00:46:43.169 Amber Lin: the engineering and making… making changes to the platform, so I’ll try to do one of those projects. And then I… I have some thoughts, but I’ll read the deck closely, and…

426 00:46:43.770 00:46:57.339 Amber Lin: probably list out or try to rec- write a doc for my own understanding of how do these things fit together, how would I explain things to certain people,

427 00:46:57.580 00:47:08.309 Amber Lin: And then if we were to form a team, I would like to talk internally about, hey, these are some things I don’t understand, these are some things you don’t understand, and just make sure we have a…

428 00:47:08.690 00:47:10.380 Amber Lin: combined understanding.

429 00:47:10.660 00:47:13.740 Uttam Kumaran: Yeah, and I think broadly, I also want you to think about, like.

430 00:47:14.330 00:47:18.109 Uttam Kumaran: where you want to go, like, I think part of my question is, like.

431 00:47:18.860 00:47:25.150 Uttam Kumaran: Do you want to, like… I mean, the third option is, like, I think you should…

432 00:47:25.270 00:47:27.620 Uttam Kumaran: Take advantage of our team to start

433 00:47:28.090 00:47:35.630 Uttam Kumaran: building, like, your brand, and start to, like, talk, share some of this stuff more openly. But also, it’s gonna depend on, like.

434 00:47:36.030 00:47:42.430 Uttam Kumaran: Do you want to lead client engagements? Do you want to, like, get on the sales side? Like, I think, think about…

435 00:47:42.570 00:47:48.700 Uttam Kumaran: you’ve unlocked, like, you’re technical, you learned AI, you can PM,

436 00:47:48.880 00:48:00.879 Uttam Kumaran: think about, like, where you want to go, like, for example, Greg really clearly, like, loves the sort of sales side of it, the, like, figuring out the structure and, like, getting that over the line.

437 00:48:01.110 00:48:08.189 Uttam Kumaran: And so you think about it, it’s like… but some of the… but again, if the engineering piece, I’m telling you, if that’s the focus.

438 00:48:08.330 00:48:17.299 Uttam Kumaran: the flip side of that is, like, I’m expecting those engineers to be, like, 100x efficient, like, running, like, 10 projects at a time. It’s kind of insane.

439 00:48:17.880 00:48:21.639 Uttam Kumaran: So the other thing is to think about, like, okay, do I want to go deeper on, like.

440 00:48:22.390 00:48:27.050 Uttam Kumaran: one part of, like, the AI transformation. I don’t know, I just think thinking about, like.

441 00:48:27.880 00:48:39.550 Uttam Kumaran: how you feel like your role could grow, or, like, what leadership could look like. They could also be like, look, we have no… we don’t have any decks or materials on this, like, we have… we need a lot, we need to write on that.

442 00:48:39.680 00:48:45.030 Uttam Kumaran: so I think just thinking about what your SKU is within this, like, team.

443 00:48:45.440 00:48:47.909 Uttam Kumaran: Because I think Greg’s SKU is gonna be, like.

444 00:48:48.310 00:48:56.729 Uttam Kumaran: coming with me and schmoozing, and he’s great at that. I don’t think he’s a professor, so he’s great at, like, the teaching people who have no clue, like, nothing.

445 00:48:56.910 00:49:00.740 Uttam Kumaran: I would say, I don’t know if you have the… that level of patience I don’t even possess.

446 00:49:00.740 00:49:02.210 Amber Lin: I don’t have that level of patience.

447 00:49:02.210 00:49:05.720 Uttam Kumaran: So, this is where I’m like, okay, but then maybe Amber comes in.

448 00:49:05.770 00:49:23.689 Uttam Kumaran: at, like, a certain skill set, or maybe you’re just… everything you do is about prototyping. Like, maybe me and Greg kind of get, like, certain… something they want to build, and we’re like, cool, we can prototype something quick. So I just think… think about, like, spend some time thinking about this AI strategist, like, role, and, like.

449 00:49:23.900 00:49:29.699 Uttam Kumaran: where you want to lean. All of us are going to talk to clients, think about broader planning.

450 00:49:29.860 00:49:34.759 Uttam Kumaran: But… and, like, there’s gonna be this L&D aspect, but just think about where you want to fit, you know?

451 00:49:35.990 00:49:53.489 Amber Lin: Yeah, I’ve… I don’t know how I would perform on the sales side, because I’ve not done it before. I would probably, at one point, I’ll experiment to see how I do. But overall, like, right now, I… this is not a final answer. I feel like I…

452 00:49:53.670 00:50:03.129 Amber Lin: do… do well in writing, like, docs or writing. I enjoy the aspects of, oh, hey, here’s a syllabus, here are things that.

453 00:50:03.610 00:50:14.770 Amber Lin: I’ve learned, I’m explaining to you in my words, but my teaching is less of one-on-one, I’m gonna walk you through, like, I feel like I’m too young to have that patience right now.

454 00:50:14.800 00:50:29.489 Amber Lin: But, like, overall, I like… if I learn something, I feel like it’s a waste if I haven’t written it down or sent it out to somewhere, so I enjoy doing that aspect.

455 00:50:29.870 00:50:42.740 Amber Lin: And I think, same thing, when you mentioned building personal brands and, like, LinkedIn and stuff, I wanted to do that, but I just never figured out, like, what am I really doing? Because I did PM, and I did data, and all of a sudden, I’m now doing a different thing.

456 00:50:42.740 00:50:43.290 Uttam Kumaran: Yeah.

457 00:50:43.290 00:50:45.510 Amber Lin: We figured out the narrative that I have.

458 00:50:45.510 00:50:46.719 Uttam Kumaran: Yeah, and that’s why…

459 00:50:46.720 00:50:47.430 Amber Lin: We have.

460 00:50:47.660 00:50:49.450 Uttam Kumaran: But yeah, but that’s why I’m gonna try to…

461 00:50:49.560 00:50:54.520 Uttam Kumaran: Give you guys also resources to start basically building your own, like.

462 00:50:54.720 00:51:06.059 Uttam Kumaran: you… all of us on this team should feel like we’re building, like, a business. Like, I’m gonna post… we should just post about what we’re learning, and you’ll see, deals will just start coming our way, and that is ultimately gonna be, like.

463 00:51:06.210 00:51:22.500 Uttam Kumaran: that’s, like, a huge path to, like, quite a bit more money, and so what I want to show with Greg and to you is, like, start doing that, because if someone literally hits you up and is like, I would love to work with you guys, what you just posted about is great, you’ll literally get money for bringing that deal in.

464 00:51:22.850 00:51:29.929 Uttam Kumaran: And I want… I want Brainforge to support you guys in doing that. I was thinking, you’re early in your career, like, you’re gonna look back.

465 00:51:30.140 00:51:30.870 Uttam Kumaran: At it.

466 00:51:31.020 00:51:32.030 Uttam Kumaran: you know.

467 00:51:32.250 00:51:40.990 Uttam Kumaran: every job from here on out, you’re gonna see the stuff you did publicly as being really, really helpful. You know, nothing you post is ever gonna, like.

468 00:51:41.280 00:51:56.080 Uttam Kumaran: it’s just gonna be super helpful to have more stuff out in the wild, and so this is something that I don’t have… like, I want to figure out a way for the marketing team to, like, also take advantage of y’all that you guys are doing this, not just me or Robert, right? Like, a lot of us are doing interesting things.

469 00:51:56.080 00:52:08.199 Uttam Kumaran: And how can they help you guys get your voice out there? And so that’s also, like, a focus. But again, I’m not… I can’t do… I don’t have time or the bandwidth to do that with any group outside of this, like, small group of people.

470 00:52:09.870 00:52:15.950 Uttam Kumaran: But that’s, like, my, kind of, my commitment is, like, I’m not… I’m just, like, kind of not spending time with anybody that isn’t gonna be, like.

471 00:52:16.100 00:52:21.449 Uttam Kumaran: this, like, on the forefront of, like, what we’re trying to do, who we’re trying to become, because everything else is…

472 00:52:21.590 00:52:24.259 Uttam Kumaran: We’re generally figuring it out, you know?

473 00:52:25.450 00:52:26.050 Amber Lin: Yeah.

474 00:52:26.300 00:52:28.090 Amber Lin: Very cool.

475 00:52:28.850 00:52:45.789 Amber Lin: I like all the topics we’ve talked about so far, I have some things to think about, I have something I want to try. I’ll try to send you a reflection next week, but like, right now, it’s mostly just a CTA thing, and I’ll start thinking about

476 00:52:45.980 00:52:47.390 Amber Lin: How to…

477 00:52:48.150 00:52:54.239 Amber Lin: how to lean more into the strategy side, but I think first I’ll… I’ll do a project to…

478 00:52:54.370 00:53:00.969 Amber Lin: like, give more… better understanding of where we’re at right now, and just… Yeah.

479 00:53:02.170 00:53:04.700 Uttam Kumaran: Yeah, you’re gonna say it’s gonna be, like, a mix of, like.

480 00:53:04.940 00:53:18.860 Uttam Kumaran: project management, mix of product management, mix of engineering, mix of teaching, like… but that’s who all of us are. That’s why, like, one JD doesn’t really capture that. Until now, I have to write the JD that captures that, right? But, like, it didn’t make sense to me because

481 00:53:19.420 00:53:29.670 Uttam Kumaran: I would just say, like, you’re, like, a crush, you’re just doing your one job faster. Now that we have an opportunity to sell that, it makes a lot of sense. So, yeah, kind of let me know what you think.

482 00:53:30.100 00:53:35.850 Amber Lin: Yeah. Well, thank you for having this call. I do want to do what you said, so let me know what the title is.

483 00:53:35.850 00:53:36.540 Uttam Kumaran: Okay.

484 00:53:36.820 00:53:38.720 Amber Lin: I’ll work on it on my side as well.

485 00:53:38.720 00:53:40.279 Uttam Kumaran: Okay, okay, perfect.

486 00:53:40.280 00:53:41.170 Amber Lin: I appreciate that.

487 00:53:41.170 00:53:43.170 Uttam Kumaran: Okay, thank you. Bye.

488 00:53:43.330 00:53:43.990 Amber Lin: Go on.