Meeting Title: Amber Q2 KPI-bonus structure discussion
Date: 2026-03-27
Meeting participants: Amber Lin, Brylle Girang, Uttam Kumaran


WEBVTT

1 00:03:04.640 00:03:06.050 Brylle Girang: Hello!

2 00:03:07.790 00:03:08.820 Amber Lin: Hi there!

3 00:03:09.150 00:03:10.309 Amber Lin: Just waiting for…

4 00:03:10.310 00:03:10.680 Brylle Girang: doing.

5 00:03:10.680 00:03:12.910 Amber Lin: He was coming. I’m doing good!

6 00:03:14.430 00:03:20.219 Brylle Girang: Yeah, he is still in the meeting, so we were in the weekly retros, and he’s still speaking with…

7 00:03:20.640 00:03:23.059 Brylle Girang: with Robert, so… might take a while.

8 00:03:23.060 00:03:31.990 Amber Lin: Okay. Yeah, I have some tasks on hand, so I’ll just do it, until he comes on, or do you want to get started?

9 00:03:32.600 00:03:38.930 Brylle Girang: Yeah, I think I can give you, like, a brief introduction, and then I’ll let Uttam expound on it later. Would that work?

10 00:03:39.150 00:03:40.240 Amber Lin: Yeah, that works.

11 00:03:40.540 00:03:41.230 Brylle Girang: Okay.

12 00:03:42.180 00:03:44.109 Brylle Girang: Let me just share this one.

13 00:03:58.260 00:04:00.039 Brylle Girang: Okay,

14 00:04:00.210 00:04:06.400 Brylle Girang: Can you check this Notion page out? And I don’t know if you have seen it already, but…

15 00:04:06.570 00:04:18.320 Amber Lin: No, I haven’t. I… I actually heard Jarrell and Kayla talking about it yesterday, so I know what… I think I know what you’re trying to do, but never seen this before. If you want to…

16 00:04:18.320 00:04:18.760 Brylle Girang: Yeah. Seriously.

17 00:04:18.769 00:04:23.689 Amber Lin: …while you’re talking, or something, so I can follow along, that would be great.

18 00:04:24.500 00:04:29.789 Brylle Girang: Yeah, so we are building a learning and development team in Brainforge.

19 00:04:30.340 00:04:46.550 Brylle Girang: And this plan lays down everything that we want to do in Quarter 2 and in the succeeding quarters. So, I just added here, like, the reasoning why learning and development would be a good department for us to actually invest in.

20 00:04:46.710 00:04:51.799 Brylle Girang: And the main problems here are the things that you can see here.

21 00:04:51.910 00:05:02.010 Brylle Girang: So, we have leads that we can close, we have partners that we can chase, because we don’t have enough certifications for our people. I know that we have

22 00:05:02.220 00:05:07.540 Brylle Girang: started, like, the Omni certifications, and then the dbt certifications, but

23 00:05:07.820 00:05:13.929 Brylle Girang: no one is, like, primarily responsible or accountable for that. Like, no one’s pushing that.

24 00:05:13.930 00:05:17.169 Amber Lin: I also haven’t finished this, so I know what you’re talking about.

25 00:05:17.170 00:05:35.850 Brylle Girang: Yeah, yeah, but, you know, everyone’s busy, there’s just not one team actually pushing that forward. Second problem is that, you know, we have been building things, the platform team have been shipping things, and then there’s no proper way for us to actually

26 00:05:35.850 00:05:38.110 Brylle Girang: convert those into digestible,

27 00:05:38.280 00:05:49.690 Brylle Girang: bite-sized pieces for the teams, for the people, and we’re building things, and then no one’s using them. We don’t understand why, and that’s… that’s the gap that we want to fill, right?

28 00:05:49.810 00:05:54.430 Brylle Girang: And lastly, one thing that we’re actually thinking is that

29 00:05:54.910 00:06:02.869 Brylle Girang: this can be a revenue-generating objective, because clients have been asking us to, like, train teams on AI,

30 00:06:03.140 00:06:17.179 Brylle Girang: train them on how to use, maybe, Blobby, how to use other AI tools, how can they implement ChatGPT in their workplace, right? And we have been doing it for free, like, as a courtesy on the advisory side.

31 00:06:17.430 00:06:25.829 Brylle Girang: And then we don’t… we don’t earn money from it. So I… I do believe, we do believe that that’s going to be, like, a really good opportunity for us.

32 00:06:25.830 00:06:39.020 Amber Lin: Yeah, I see. Very exciting. I think I’ve done both, been on both sides, because I’ve been teaching Eden clients how to use the AI stuff. What do you think will be the portion of it, like.

33 00:06:39.180 00:06:44.519 Amber Lin: For this role, this program that will be… what’s the split between internal and clients?

34 00:06:45.320 00:07:04.490 Brylle Girang: The first run, the first phase will be focused on internal, and the main reason for that is because we want to build a really solid case study of how learning and development, AI learning and development, can be a wonderful thing for work, for a business, right?

35 00:07:04.490 00:07:17.350 Brylle Girang: So the first phase will be us trying this out for ourselves first, and then once Brainforge is a totally AI-native company, then we can start selling it to clients.

36 00:07:17.350 00:07:28.000 Brylle Girang: Similarly to how we’re doing other stuff, right? We’re building stuff internally, and then once we see that it’s effective, once we see that there’s a commercial

37 00:07:28.010 00:07:33.170 Brylle Girang: capability of it, then we sell it to clients. So that’s also how we’re approaching, like, learning.

38 00:07:33.170 00:07:33.700 Amber Lin: That’s funny.

39 00:07:33.700 00:07:34.340 Brylle Girang: spent here.

40 00:07:34.340 00:07:48.310 Amber Lin: Cool, very cool. From hearing this, I kind of hear two work streams, though. There’s, like, certifications, or regular L&D, and then there’s AI. Is there… are those two work streams split up?

41 00:07:48.450 00:07:55.569 Amber Lin: Like, are we mingling them into one? How do we want to approach that?

42 00:07:55.750 00:08:01.709 Brylle Girang: Yeah, good question. So, I’m not looking at it from a workstream perspective yet.

43 00:08:02.920 00:08:13.319 Brylle Girang: So that’s more of the, like, the commercial side, right? My initial thought is that learning and development will focus more into, like, the actual implementations

44 00:08:13.320 00:08:24.179 Brylle Girang: of AI tools or AI services into a workplace. When it comes to certifications, I think it will be more of a partnership between us

45 00:08:24.240 00:08:28.229 Brylle Girang: and our partners, for example, Snowflake, dbt, and I don’t…

46 00:08:28.330 00:08:33.059 Brylle Girang: Think that that’s going to be a good commercial push for us.

47 00:08:33.409 00:08:33.919 Brylle Girang: I see.

48 00:08:33.929 00:08:37.129 Amber Lin: So that’s just part of our, like, internal L&D.

49 00:08:37.130 00:08:39.020 Brylle Girang: Internal, yeah, exactly.

50 00:08:39.020 00:08:39.559 Amber Lin: Cool.

51 00:08:40.299 00:08:53.959 Brylle Girang: Okay, and then these are, like, the top initiatives and objectives for learning and development. So the first objective, and why we think that this is going to be amazing internally and externally, is because, one.

52 00:08:54.119 00:09:00.129 Brylle Girang: If we get people certified, clients will trust us more, we get more partners.

53 00:09:00.249 00:09:05.169 Brylle Girang: And guess what? That translates to more revenue, more money for us, right?

54 00:09:05.299 00:09:18.439 Brylle Girang: And then the second piece is that if L&D is going to be a successful project internally, then we can ship it as a service for our clients. Like, hey, this worked for us, we were able to, like.

55 00:09:19.319 00:09:24.829 Brylle Girang: get everyone AI native in 3 months, we can do that for you, too.

56 00:09:24.969 00:09:27.199 Brylle Girang: So it’s that sort of a framing.

57 00:09:27.200 00:09:38.830 Amber Lin: Yeah, I see your vision, I’ve kind of read through the projects and initiatives. Where do I fit in, or say, why are we talking about this right now?

58 00:09:39.180 00:09:43.469 Brylle Girang: Yeah, exactly. So, this is where I think you will…

59 00:09:44.170 00:09:51.809 Brylle Girang: will be really helpful for me. I mentioned, you know, us getting people certified.

60 00:09:52.080 00:10:09.950 Brylle Girang: And I won’t be able to, like, lead the drive on some tools that I don’t have direct experience with. For example, if I’m going to tell people, hey, get certified at Omni, get certified at dbt, and I’m not dbt certified, I’m not Omni certified.

61 00:10:10.270 00:10:15.279 Brylle Girang: That’s not going to be a good push, right? And I’m not even using the tools myself.

62 00:10:15.400 00:10:25.100 Brylle Girang: So, what we have discussed here is that for us to be able to push people, we need someone to push them who’s also accustomed to what they’re actually doing.

63 00:10:25.220 00:10:32.030 Brylle Girang: So you are accustomed with, let’s say, Omni, you’re accustomed with SQL, you’re accustomed with

64 00:10:33.230 00:10:38.340 Brylle Girang: with BigQuery, so if you get… if you want people to be certified for those tools.

65 00:10:38.580 00:10:53.639 Brylle Girang: it would be best if you were… if you’re the one leading that charge. It might be best if you’re the one who’s going to be certified first, and then that’s when we can ask them, hey, this has been really effective for me, this certification helped me, get certified too.

66 00:10:54.030 00:11:00.350 Brylle Girang: etc. So that’s for IC, and that’s what I need the most help with. You can see here that

67 00:11:00.580 00:11:06.900 Brylle Girang: The… each service will have their, like, own certification routes?

68 00:11:07.230 00:11:25.370 Brylle Girang: And that really depends on the tools that they’re using. For data service, it might revolve around, you know, Snowflake, the data management side of things. For AI, that would involve, you know, the AI tools, Anaten, MashSha. For strategy, that might involve… this is not final yet, so this is just based on my hunches.

69 00:11:25.440 00:11:38.120 Brylle Girang: But it might involve, you know, Mixpanel, SQL, BigQuery, Omni certifications, etc. And I need leaders who can help me, like, push the certifications to the teams, to their own service lines.

70 00:11:38.280 00:11:47.909 Brylle Girang: So it makes, you know, it makes… it makes sense that people who are using the tools, people who are within the service lines, are also the ones pushing.

71 00:11:48.280 00:11:51.080 Brylle Girang: The certifications upwards.

72 00:11:51.250 00:11:54.040 Amber Lin: Yeah, that makes sense to me.

73 00:11:54.490 00:12:05.339 Amber Lin: like, we can dive into specifics. If I were to lead that work stream, what would it look like? What about the AI workflow? Is it mostly you, or do you need help there?

74 00:12:05.850 00:12:15.930 Brylle Girang: Yeah, exactly. So that is included in Initiative 4, Service Line Enablement. You can see here Project 4.1 and Project

75 00:12:17.000 00:12:20.339 Brylle Girang: Actually, the whole initiative here. So…

76 00:12:20.440 00:12:30.760 Brylle Girang: when we develop AI workflows, we also need someone who knows whatever the hell the service line is actually doing, right? And I think you are a great example for that.

77 00:12:30.760 00:12:40.929 Brylle Girang: The cursor skills that you want to build, the workflows within Cursor that you want to build, those are specific to your service line, and I won’t be able to build those myself.

78 00:12:40.930 00:12:46.650 Brylle Girang: Because I don’t know what… whatever the hell you’re doing. So I need people who can help me.

79 00:12:46.990 00:12:57.060 Brylle Girang: push those workflows to their own service line teams, and I need people who will be able to train the people within that specific service line for those workflows.

80 00:12:57.300 00:12:58.789 Brylle Girang: Does that make sense?

81 00:12:58.790 00:12:59.660 Amber Lin: Yeah.

82 00:13:01.270 00:13:12.970 Brylle Girang: Yeah, and one exciting thing that we’re also planning is build-athons, where we dedicate one month, or one quarter, we figure out one problem, a big, big problem.

83 00:13:13.010 00:13:22.520 Brylle Girang: And we tell the service lines, hey, if you solve this using an AI workflow, using a tool that you can build using AI,

84 00:13:22.580 00:13:32.899 Brylle Girang: You get a prize, you get to own that tool, and within 30 days, we try to check if that tool will stay with the company forever.

85 00:13:33.080 00:13:38.109 Brylle Girang: And we need specific people who can lead specific service lines for that.

86 00:13:42.940 00:13:44.310 Amber Lin: Cool. Okay.

87 00:13:46.760 00:14:03.079 Amber Lin: like, this… this all makes sense to me. I’m… like, I’m more curious right now of how things would… how things are gonna land. Is this something that we’ve already started, or something that we plan to start in the near term?

88 00:14:03.770 00:14:05.769 Brylle Girang: The project plan will be…

89 00:14:05.880 00:14:18.200 Brylle Girang: will be set in stone by Monday, and I am expecting that I will start on the curriculum by Monday, too. So, I would say that the first official start of the whole project would be on April 6th.

90 00:14:18.550 00:14:22.939 Brylle Girang: if… If the timeline… if we’re… if we’re following the timelines.

91 00:14:22.940 00:14:23.690 Amber Lin: Hmm.

92 00:14:25.300 00:14:26.669 Amber Lin: I see.

93 00:14:29.860 00:14:36.299 Amber Lin: Cool. If, say, if I were to start helping on… on this…

94 00:14:36.550 00:14:43.219 Amber Lin: like, this main project, this workflow, how many hours do you estimate it would take for me per week?

95 00:14:44.470 00:14:51.580 Brylle Girang: Currently, I’m estimating that it would take 10 hours per week, per person, that’s going to help me out, so that’s 2 hours per day.

96 00:14:51.870 00:14:52.550 Amber Lin: Okay.

97 00:14:52.850 00:14:53.730 Amber Lin: Okay.

98 00:14:54.350 00:15:00.120 Amber Lin: valid. I’m very open to that, it’s very exciting. It’s kind of what I’ve been…

99 00:15:00.450 00:15:17.090 Amber Lin: doing outside of working hours right now, of trying to explore the different skills and stuff you showed me before. So, no questions asked, happy to get started, happy to sync with you to, like, take… understand what the…

100 00:15:17.160 00:15:22.019 Amber Lin: what my ownerships are and what I need to hit. So…

101 00:15:22.130 00:15:30.579 Amber Lin: just… I guess we’ll talk next Monday, but mostly, I guess, Uttam, I have questions about what my other assignments are going to be.

102 00:15:30.870 00:15:31.670 Uttam Kumaran: Yeah, yeah.

103 00:15:31.670 00:15:32.740 Amber Lin: It’s less of a.

104 00:15:32.740 00:15:37.010 Uttam Kumaran: Well, so let me, yeah, let me share, like, so one is…

105 00:15:37.670 00:15:51.980 Uttam Kumaran: this is the team, like, it’s a… this is a brand new team, and what I didn’t want to do was create another layer of hierarchy that rolls into me without, like, super, super clear direction, especially if it’s going to be a part of your time. And so…

106 00:15:52.120 00:16:03.820 Uttam Kumaran: I wanted, like, in hearing sort of the direction from B on, like, the L&D team, I feel like there’s clarity on, like, what the mission is, there’s clarity on, like, what the objectives are, and then he also has clarity on why

107 00:16:03.940 00:16:14.929 Uttam Kumaran: like, he needs additional hours beyond himself, right? And so that was the first thing I wanted to challenge, is like, what is the goal of the team? Like, how… what are the objectives, and how does,

108 00:16:15.080 00:16:15.770 Uttam Kumaran: like…

109 00:16:16.460 00:16:24.400 Uttam Kumaran: why… like, why would you need other folks, right? And so I don’t want to… I didn’t want to fit you guys to a plan. I wanted to, like, make the plan stand for itself.

110 00:16:24.550 00:16:25.879 Uttam Kumaran: But I feel like…

111 00:16:25.930 00:16:37.369 Uttam Kumaran: this is gonna be one of the teams that operates, like, really closely to the platform team, and getting some of that work out, which I know is something that you’ve been interested in. It also, I think, satisfies

112 00:16:37.400 00:16:47.580 Uttam Kumaran: like, the way B framed it, which is, like, you guys become the subject matter experts in the service line, is also really helpful. I don’t see the service leaders having time

113 00:16:47.630 00:16:49.150 Uttam Kumaran: To, like, do this.

114 00:16:49.220 00:16:59.929 Uttam Kumaran: this quarter, but it is extremely important. Also, B is not going to be able to translate some of the technical, like, the super data- or AI-specific

115 00:16:59.940 00:17:10.250 Uttam Kumaran: things that need to get built to those teams without… because he doesn’t have the credibility, you know? But this is where, like, what he’s trying to build is a system to develop

116 00:17:10.400 00:17:24.150 Uttam Kumaran: the learning and development across the company, which will be in areas that are beyond the stuff he’s familiar with. So, I think the KPI’s there. I think one thing I told him, and this is where, like, I want to give… I want to make it more on B, is that

117 00:17:24.190 00:17:36.810 Uttam Kumaran: he has OKRs to hit, and so I want… the reason why I wanted him here is I want him to kind of basically propose what OKRs are going to be owned by you and Mustafa, because ultimately.

118 00:17:36.960 00:17:52.739 Uttam Kumaran: I’m gonna ask him about the objectives that he listed there. And so, in that sense, like, ultimately, he’s the OKR… he’s the OKR owner that you’ll ladder into. Yeah. And so, I want him to sort of set where he wants you to fit in, and what… how he wants to…

119 00:17:53.060 00:18:02.209 Uttam Kumaran: you know, put in… put KPIs for you guys to own to hit bonuses. This is a leadership role at the company, like, the way we’re architecting is we’re gonna have

120 00:18:02.460 00:18:11.890 Uttam Kumaran: CSOs, we’re gonna have service leads, we’re also gonna have this, like, people on the learning development team. I don’t know at this point… at this point, we don’t have the budget to make

121 00:18:12.750 00:18:23.880 Uttam Kumaran: like, more than one person full-time on this team, let alone, I think, the bandwidth to even, like, think about that. But… did… did you explain, sort of, like, commercially, like, how this is…

122 00:18:24.090 00:18:29.329 Uttam Kumaran: Like, okay, so, as you can see, there’s a huge component to, like, I think we can go sell this.

123 00:18:29.460 00:18:34.200 Uttam Kumaran: Right? I think if we get good at teaching, if we teach ourselves successfully across these…

124 00:18:34.630 00:18:39.089 Uttam Kumaran: All the time, we get people that ask us to help them teach, you know, to do stuff, and so…

125 00:18:39.090 00:18:48.479 Amber Lin: Because I’ve been teaching the Eden folks that I’ve been working with how to use Blobby, use AI, so, like, I know what it’s like and what the demand is like.

126 00:18:48.480 00:18:50.840 Uttam Kumaran: Yeah, so there’s a clear path towards…

127 00:18:50.870 00:18:56.620 Uttam Kumaran: this team actually driving commercial revenue, and therefore, like, commanding more budget. I think, like.

128 00:18:56.630 00:19:11.179 Uttam Kumaran: without talking too soon, I think there is a path towards there being people, like, full-time on this team. But in the short term, I hope it’s… I think it’ll satisfy your curiosity, what we talked about a few weeks ago, which is, like, I want to stay ahead of the curve on AI,

129 00:19:11.460 00:19:19.560 Uttam Kumaran: that’s something that I don’t have the bandwidth to… to come share, although I have all the tools in the right place for it to happen.

130 00:19:19.730 00:19:27.460 Uttam Kumaran: And so I think this is the mechanism for that. And then I think, on what the OKRs are for you, I want that to get specified

131 00:19:27.600 00:19:31.270 Uttam Kumaran: by B, and I’m gonna be making this exact same pitch.

132 00:19:31.510 00:19:34.480 Uttam Kumaran: To Mustafa. Yeah. You know…

133 00:19:34.480 00:19:39.960 Amber Lin: Yeah, cool. I mean, I’ve been… I’ve been waiting for this, I know we’ve been developing this, and I think that

134 00:19:40.180 00:19:46.199 Amber Lin: like, I know B is making his OKR, so I… I think I would like to talk to him next Monday?

135 00:19:46.200 00:19:46.810 Uttam Kumaran: Okay.

136 00:19:47.760 00:19:49.640 Uttam Kumaran: I don’t want to drag this on, so…

137 00:19:49.640 00:19:58.350 Amber Lin: No, I know, I think what I wanted in the contract, as we talked about in the first time, was not necessarily, like, this specific, okay, I would.

138 00:19:58.350 00:19:58.790 Uttam Kumaran: Yeah.

139 00:19:58.790 00:20:16.820 Amber Lin: number, which is, hey, we have an OKR on this area, you have this range of, like, bonuses you can hit or may hit. So, like, so B, thank you for explaining that, and I… I look forward to talking to you Monday once your, like, plans and OKRs are…

140 00:20:17.190 00:20:18.110 Amber Lin: Solid.

141 00:20:18.110 00:20:21.529 Uttam Kumaran: Yeah, and I also want you to think about, like, you know, I think…

142 00:20:21.790 00:20:30.330 Uttam Kumaran: you know, one thing that you mentioned in our last call, and I talked to Robert about, is, like, you’ve hopped around a lot of pieces of parts of the business, but I also think that’s, like, what you’re really good at.

143 00:20:30.470 00:20:34.710 Uttam Kumaran: Like, you are a really amazing, like, 0 to 1 on, like, a new…

144 00:20:35.030 00:20:50.779 Uttam Kumaran: place, like, part of the company, and I think you’ve always done that, and I think part of, like, how I would… I would love to see our company grow is that internal people… this is how, like, real consultancies work. You start up a team, and then you’re like, cool, like.

145 00:20:50.890 00:21:04.359 Uttam Kumaran: I want to get this person to come do parts of it, I want this person to come. That’s how it works. We are, internally, we have a group of, like, mercenaries. I have a job to get done. Hey, I got approval for this job. Here are our goals.

146 00:21:04.360 00:21:16.529 Uttam Kumaran: like, I need these people. First, Uttam, I need these people to achieve this. I see the ROI. Cool. You can go get another additional 20, 30 hours. Okay, but then I want… I actually want it to come from Amber.

147 00:21:16.600 00:21:20.239 Uttam Kumaran: Right? Or I want it to come from this person. And that’s how it, like, I think that…

148 00:21:20.350 00:21:23.239 Uttam Kumaran: Company’s gonna look longer term is, like.

149 00:21:23.390 00:21:25.139 Uttam Kumaran: We have a group of people that have a…

150 00:21:25.420 00:21:27.680 Uttam Kumaran: That have a bunch of mixed skill sets.

151 00:21:28.120 00:21:33.800 Uttam Kumaran: the better the pitch is for the project, the more people you can kind of pitch and be like, I want that person to help.

152 00:21:34.120 00:21:45.179 Uttam Kumaran: And, like, I think that’s sort of, like, how I want to see this project run. It’s the first time that I think someone, a leader on the team, has come with a plan like this.

153 00:21:45.260 00:21:57.560 Uttam Kumaran: where there’s an ability for them to also say, I need additional resources, and I go to them, I say, okay, there are two people who clearly have leadership sensibilities that I think you should tap into in order to achieve your plan.

154 00:21:58.430 00:22:00.710 Uttam Kumaran: How much time do you need, and what’s…

155 00:22:00.860 00:22:05.899 Uttam Kumaran: how are you gonna make sure that they’re able to achieve what they want to achieve as well? So…

156 00:22:06.160 00:22:09.750 Uttam Kumaran: that’s sort of, like, how I’m thinking about it, you know? So, in terms of, like, contract.

157 00:22:10.150 00:22:16.600 Uttam Kumaran: I… I… I… what I can’t… I can’t sign something that’s, like… I can sign something that’s, like, there are OKRs.

158 00:22:16.830 00:22:18.869 Uttam Kumaran: There will be bonuses tied to them.

159 00:22:18.870 00:22:34.500 Amber Lin: Actually, that’s essentially what I want. I think today, I wanted to talk to you about the other proposals I have, and why not, or why yes, or why in… when in the future. So, like, on the bonus for…

160 00:22:34.500 00:22:41.229 Amber Lin: next year, too. I’m pretty clear on, like, we have agreement on what that is, but.

161 00:22:41.230 00:22:41.590 Uttam Kumaran: Yes.

162 00:22:41.590 00:22:44.640 Amber Lin: want to talk about the other stuff, so, like, they’re…

163 00:22:45.090 00:22:48.700 Amber Lin: If… if we put them aside, there’s a reason we put them aside.

164 00:22:48.700 00:22:49.250 Uttam Kumaran: Okay, okay.

165 00:22:49.250 00:22:50.440 Amber Lin: Yeah, I think.

166 00:22:50.440 00:22:54.560 Uttam Kumaran: Maybe, maybe B, I feel like Amber and I can stay on and discuss that. I think…

167 00:22:54.560 00:22:55.080 Brylle Girang: Yeah.

168 00:22:55.080 00:22:58.019 Uttam Kumaran: If you can plan on please trying to close…

169 00:22:58.320 00:23:03.460 Uttam Kumaran: this whole thing out, like, Monday. I will try, like, or I don’t, like…

170 00:23:03.700 00:23:07.099 Uttam Kumaran: Or do you wanna jump off this and go call Mustafa, too?

171 00:23:07.280 00:23:10.939 Uttam Kumaran: Like, ultimately, I think he’s the only person with… who…

172 00:23:11.240 00:23:19.980 Uttam Kumaran: I think would be the person into the AI world that could do this. It’s up to you to kind of convince him and make sure he wants to do it. I don’t see him saying no, but, like.

173 00:23:20.130 00:23:21.860 Uttam Kumaran: That’s the other person, so…

174 00:23:22.000 00:23:28.100 Uttam Kumaran: If you want to call him, and then, yeah, I would love if you want to grab time with Amber on Monday to sort of close this out, and then Amber, I can stay on.

175 00:23:28.880 00:23:30.679 Brylle Girang: capture. Talk to you on Monday, Amber.

176 00:23:30.680 00:23:31.330 Uttam Kumaran: Okay.

177 00:23:31.330 00:23:32.489 Amber Lin: Yeah, talk to you soon.

178 00:23:32.490 00:23:33.290 Uttam Kumaran: Thank you, guys.

179 00:23:33.290 00:23:33.970 Amber Lin: Bye.

180 00:23:34.510 00:23:52.330 Amber Lin: Yeah, I think first thing is, what are my assignments gonna look like, even in the next week? Because I know I’m ramping down on Element, especially if we’re ramping Jasmine and Advait up. It doesn’t justify having more hours there.

181 00:23:52.350 00:23:58.820 Amber Lin: I’m doing this Amble discovery, which… it’s helpful for Robert’s…

182 00:23:59.030 00:24:11.469 Amber Lin: renewal discussion, but it’s not gonna last forever. Probably he’s gonna talk early next week, and it’s gonna end. Eden is mostly Eden Oz for the next 2 weeks, and then I’ll come back on, but…

183 00:24:11.640 00:24:14.930 Amber Lin: Other than that, the next two weeks, I’m…

184 00:24:15.290 00:24:20.350 Amber Lin: pretty open in time, and it’s our new project, so I’m at the point.

185 00:24:20.370 00:24:22.569 Uttam Kumaran: So let me show you something.

186 00:24:47.190 00:24:48.009 Uttam Kumaran: Hold on.

187 00:25:02.100 00:25:04.660 Uttam Kumaran: Wait, I literally just set this up.

188 00:25:28.070 00:25:30.630 Uttam Kumaran: Okay, great. Here.

189 00:25:33.930 00:25:40.910 Uttam Kumaran: Cool, so… To kind of give you a sense of, like, where…

190 00:25:41.930 00:25:46.089 Uttam Kumaran: we are at. So these are all, like, clients that,

191 00:25:48.440 00:25:52.419 Uttam Kumaran: These are all… let me just share this… these are all…

192 00:25:52.550 00:25:57.280 Uttam Kumaran: So these are all clients who had… who have contracts that started in…

193 00:25:57.490 00:26:08.130 Uttam Kumaran: January and are actually, like, extending beyond. In addition to these, we have Element that’s signing a renewal, CTA that’s signing a renewal.

194 00:26:08.370 00:26:17.010 Uttam Kumaran: We have Global Vet Link that we’re pitching a renewal to. We have this company called Sunstone that’s coming up.

195 00:26:17.190 00:26:21.259 Uttam Kumaran: That we’re about to sign. We have…

196 00:26:21.390 00:26:30.520 Uttam Kumaran: Amble, depending on how that goes. We also have your work on this, on the L&D team.

197 00:26:30.630 00:26:35.050 Uttam Kumaran: So, immediately, like, where I see… I need…

198 00:26:35.290 00:26:41.929 Uttam Kumaran: I’m making the pitch to Jasmine and Robert to get your help with me on CTA.

199 00:26:42.090 00:26:47.939 Uttam Kumaran: I think the reason it’s an interesting project, it’s actually, like.

200 00:26:48.340 00:26:58.899 Uttam Kumaran: the project, one of the big parts of the next two months is actually completely focused on getting their Snowflake AI agent, like, in a great place. They actually want to replace Power BI with just

201 00:26:59.340 00:27:06.699 Uttam Kumaran: an AI agent. Like, they don’t even want to… they want to build a couple of, like, dashboards in Streamlit, but,

202 00:27:06.920 00:27:15.950 Uttam Kumaran: a lot of it is around building extremely good semantic context for an AI agent in Snowflake. I think this… the client is,

203 00:27:16.080 00:27:19.659 Uttam Kumaran: Amazing, like, there’s… it’s like… They’re so, so chill.

204 00:27:20.010 00:27:27.860 Uttam Kumaran: They already really like us. Me, Awash, and Ashwini are working on that, and I need someone to sort of take the reins on

205 00:27:28.330 00:27:31.610 Uttam Kumaran: owning the… success of, like,

206 00:27:31.770 00:27:38.780 Uttam Kumaran: the development of the AI agent within Snowflake. It is really kind of probably similar to, like, the Blobby work you’re doing.

207 00:27:38.940 00:27:43.780 Uttam Kumaran: And it’s basically everything around, like, measuring feedback.

208 00:27:44.540 00:27:51.139 Uttam Kumaran: Implementing better semantic context, steering that, and then there’s gonna be some portion of probably, like.

209 00:27:51.300 00:27:55.659 Uttam Kumaran: helping them train people internally. I can send you the whole project plan.

210 00:27:56.550 00:28:04.840 Uttam Kumaran: for Q2 there, but I certainly need assistance there. I’m gonna start to… run out of time.

211 00:28:04.980 00:28:09.250 Uttam Kumaran: So that’s… so that’s one. Eden is also picking back up.

212 00:28:09.900 00:28:20.099 Uttam Kumaran: So Robert should kind of probably tell you sometime soon, like, what the plan is there. But I think there’s gonna be some time needed there as well.

213 00:28:20.100 00:28:20.760 Amber Lin: Okay.

214 00:28:20.760 00:28:24.329 Uttam Kumaran: So, CTA is kind of, like, immediate. I think, like, I…

215 00:28:24.520 00:28:28.099 Uttam Kumaran: I just sort of finished the project plan today,

216 00:28:28.570 00:28:38.410 Uttam Kumaran: And then on Element, it’s, again, the way… the way this is gonna work longer term is, like, Jasmine is most likely gonna determine all the allocations on strategy, so this would hopefully be the…

217 00:28:38.420 00:28:52.649 Uttam Kumaran: the last time where it’s, like, sort of down to the wire. But CTA is probably the most immediate thing, and then this L&D team. And on the L&D stuff, there’s work starting Monday, so I think B is, like, trying to… B is trying to hedge, being like.

218 00:28:52.790 00:28:58.430 Uttam Kumaran: let’s start on the 6th, I’m like, why? Like, just start with the internal team, like, just…

219 00:28:58.640 00:29:04.100 Uttam Kumaran: just start doing stuff. So, he’s just trying to nail it, and he… and it’s… it’s really important, so…

220 00:29:04.550 00:29:13.320 Uttam Kumaran: I feel like with those two, with Jasmine starting, like, this is also where there’s a lot of, like, standardization-type stuff still to do.

221 00:29:13.800 00:29:28.620 Uttam Kumaran: One thing we could do is, like, see how next week goes. There’s still work to do on… like, I’m looping in Advait onto default as well, and there’s… there’s still a couple of other clients that we’re about to sign that will need strategy work.

222 00:29:29.160 00:29:34.320 Uttam Kumaran: But I think, like, this L&D team is, like, a really unique opportunity.

223 00:29:34.550 00:29:38.760 Uttam Kumaran: So, I don’t want… I want you to also, like, make sure to nail that, and that…

224 00:29:38.900 00:29:42.130 Uttam Kumaran: Like, you have… you have, like, time to do that, you know?

225 00:29:42.130 00:29:50.499 Amber Lin: Yeah, I want to make sure… I think I’ll have time to do that, but I also want to make sure I have, like, revenue-contributing work.

226 00:29:50.500 00:29:51.170 Uttam Kumaran: Yeah, yeah.

227 00:29:51.170 00:29:51.570 Amber Lin: when…

228 00:29:51.570 00:29:57.020 Uttam Kumaran: But that team… yeah. But this is where I’ll challenge you in that, like, That…

229 00:29:57.280 00:30:02.569 Uttam Kumaran: I would… I would change, like, “revenue-contributing” to, like, “you’re delivering on immediate revenue,” but…

230 00:30:02.740 00:30:07.640 Uttam Kumaran: The reason why I approved that team to form is because there’s a clear revenue directive.

231 00:30:07.980 00:30:18.299 Uttam Kumaran: Like, one, there’s a path towards us shipping L&D as a service. Second, a lot of the stuff is around helping people use AI more, which allows us to

232 00:30:18.410 00:30:22.489 Uttam Kumaran: get more done. And then also, it’s all around certification, so…

233 00:30:22.920 00:30:27.389 Uttam Kumaran: there is a big revenue component. I hear you, though, it’s not, like, immediately working on clients.

234 00:30:27.390 00:30:27.820 Amber Lin: Yeah.

235 00:30:27.820 00:30:33.780 Uttam Kumaran: in that sense, like, we are, like, I’m the client for that team. And so, I think it… that’s why I was like, that’s…

236 00:30:33.910 00:30:38.360 Uttam Kumaran: if… in terms of other leadership places, like, that is a spot where I need help.

237 00:30:38.940 00:30:48.369 Amber Lin: Cool. Yeah, I hear that. I’m happy to start on the, like, the CTA work, that sounds really interesting. Okay. That’s kind of similar to

238 00:30:48.920 00:30:50.850 Amber Lin: probably worse. It’s probably, like.

239 00:30:50.850 00:30:54.280 Uttam Kumaran: Like, yeah, it’s that, except, like, probably minus all of the, like.

240 00:30:55.000 00:31:03.079 Uttam Kumaran: drama. Minus, like, all the, like, negativity and, like, weirdness. So it’s, it’s, like, a really cool client.

241 00:31:03.280 00:31:06.439 Uttam Kumaran: So, CTA, do you know, like, the CES conference?

242 00:31:06.940 00:31:08.849 Amber Lin: Yeah, yeah.

243 00:31:08.850 00:31:12.109 Uttam Kumaran: Yeah, that’s… that’s them. They’re, like, it’s, like, the biggest tech conference in the world.

244 00:31:12.110 00:31:13.009 Amber Lin: That’s so cool!

245 00:31:13.010 00:31:15.650 Uttam Kumaran: Yeah, so they’re an incredibly nice team.

246 00:31:16.850 00:31:28.169 Uttam Kumaran: really cool work. It’s all Snowflake AI work. So, I think it’s, like, a huge step in the right direction for, like, kind of, like, what you want to do, and I’ve been…

247 00:31:28.260 00:31:38.480 Uttam Kumaran: It’s a really new technology, so this quarter I’ve been learning a lot about it, and I think it’s in a… the first version of it is in a good place, but it’s gonna require someone to, like, deeply think about

248 00:31:38.970 00:31:47.599 Uttam Kumaran: like, it’s almost like… it’s sort of similar to the Andy project, but it’s actually more in the data world. Like, we’re building, like, a data agent.

249 00:31:47.790 00:31:50.469 Uttam Kumaran: Right? So you’ll have to think really…

250 00:31:51.130 00:31:52.060 Amber Lin: Yeah, yeah.

251 00:31:52.060 00:31:53.560 Uttam Kumaran: You have to think really deeply about it.

252 00:31:53.560 00:32:07.839 Amber Lin: That was helpful, because… because today, I was like, what level do we have to model to for an AI agent to work? And for sell-through, we can’t just give it daily sell-through, we have to give it, like, receipts.

253 00:32:07.850 00:32:16.789 Amber Lin: Yeah. And then they can use that field to answer questions, because some things don’t roll up, and an AI agent wouldn’t know what to do. So that was… that was really cool.
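
A minimal sketch of Amber’s “some things don’t roll up” point, assuming illustrative data (editor’s example, not from the meeting): a distinct-customer question can be answered from receipt-level rows, but not from a pre-aggregated daily sell-through table, which is why the agent needs the finer grain.

```python
# Receipt-level rows (illustrative data, hypothetical field names).
receipts = [
    {"day": "2026-03-23", "customer": "A", "units": 2},
    {"day": "2026-03-23", "customer": "B", "units": 1},
    {"day": "2026-03-24", "customer": "A", "units": 3},
]

# Daily sell-through rollup: customer identity is lost in the aggregate.
daily_units = {}
for r in receipts:
    daily_units[r["day"]] = daily_units.get(r["day"], 0) + r["units"]
print(daily_units)  # {'2026-03-23': 3, '2026-03-24': 3}

# "How many unique customers bought this week?" needs the receipt grain;
# it cannot be derived from daily_units.
print(len({r["customer"] for r in receipts}))  # 2
```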

254 00:32:16.790 00:32:21.489 Uttam Kumaran: Yeah, and I’ll send you this, I… this isn’t… I’ll send you a couple things on this subject, but…

255 00:32:21.490 00:32:21.890 Amber Lin: Good.

256 00:32:21.890 00:32:25.549 Uttam Kumaran: this, this, like, I was reading this today, and

257 00:32:26.310 00:32:33.399 Uttam Kumaran: this is where I think, like, right now on the data team, I don’t know, if anyone is really thinking super hard about

258 00:32:33.510 00:32:36.430 Uttam Kumaran: Data agents, and…

259 00:32:37.250 00:32:42.680 Uttam Kumaran: Just on the strategy team in particular, thinking about how we… how we allow people to…

260 00:32:42.720 00:32:57.630 Uttam Kumaran: do basic querying, basic analysis through AI, and I really need someone, ideally you, to have, like, a deep understanding, because I don’t think Robert is gonna get there, he’s just short on time. I think Jasmine is gonna be just busy, like, making sure the ship runs.

261 00:32:57.700 00:33:06.159 Uttam Kumaran: And then Greg is sort of just on CSO, so in every team, there has to be someone who’s, like, a subject matter expert on, like, how AI impacts

262 00:33:06.330 00:33:20.860 Uttam Kumaran: that group, and I think this goes to, like, what your… what your assumption was, which I challenged, which is, like, I’m gonna be out of a job in, like, a few months. Like, I think you’re gonna see that this problem is much more complicated,

263 00:33:21.060 00:33:27.330 Uttam Kumaran: than… and I think this is where I want you to have a look at… at this, and one of the things in this

264 00:33:27.540 00:33:29.759 Uttam Kumaran: Really, it’s just this piece, which is, like.

265 00:33:30.220 00:33:39.710 Uttam Kumaran: why… why do A… why does AI struggle with… with data analysis? And it sort of talks about, like, how does the agent know how revenue or quarters are defined?

266 00:33:39.890 00:33:45.359 Uttam Kumaran: You know? Like, where are the right data sources? Which is the source of truth?

267 00:33:45.460 00:33:51.129 Uttam Kumaran: And then it sort of talks about building this context layer. Once you understand this piece.

268 00:33:51.330 00:33:58.399 Uttam Kumaran: it’s gonna really click for you why our platform and why Cursor is designed in the way it is.

269 00:33:58.530 00:34:06.590 Uttam Kumaran: Because you’re gonna realize that Cursor is, like, is just, is just a layer on top of the data.

270 00:34:06.770 00:34:09.179 Uttam Kumaran: Like, whether it’s Cursor or ChatGPT,

271 00:34:09.219 00:34:28.730 Uttam Kumaran: what is the art in what we’ve done at Brainforge is we’ve set up the repo in a way where you could do so much regardless of whether it’s Claude Code, Cursor, and that’s, I think, the conclusion I want you to see, but not for just any business work, in particular for the type of work that we do on the strategy analytics team.
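
A minimal sketch of the “context layer” idea being described here: write down once how revenue and quarters are defined and which table is the source of truth, and inject that into every agent prompt. The structure and definitions below are illustrative assumptions, not Brainforge’s actual setup.

```python
# Illustrative context layer; all names and definitions are placeholder assumptions.
CONTEXT_LAYER = {
    "metrics": {
        "revenue": "SUM(net_amount) from fct_orders, excluding refunds",
        "quarter": "Fiscal quarters; Q2 here means April-June 2026",
    },
    "sources_of_truth": {
        "orders": "analytics.fct_orders (not the raw export tables)",
    },
}

def render_context(layer: dict) -> str:
    """Flatten the context layer into text an agent can be prompted with."""
    lines = ["Business context:"]
    for section, entries in layer.items():
        lines.append(f"  {section}:")
        for name, definition in entries.items():
            lines.append(f"    - {name}: {definition}")
    return "\n".join(lines)

# The same layer is prepended regardless of which tool (Cursor, ChatGPT, etc.) runs the query.
print(render_context(CONTEXT_LAYER) + "\n\nQuestion: what was Q2 revenue?")
```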

272 00:34:28.820 00:34:44.680 Uttam Kumaran: how can AI… how do we… how do we set AI in the way that powers that? Another good question that CTA asked me is they said, how do we get the AI to not respond to certain questions? Because some people were like, I don’t want people using this for, like, gossip. I don’t want people, like, asking for, like.

273 00:34:45.110 00:34:49.379 Uttam Kumaran: what did… what did so-and-so make this… this year? Yeah, so that’s the…

274 00:34:49.389 00:34:50.359 Amber Lin: and stuff.

275 00:34:50.360 00:34:57.920 Uttam Kumaran: Yeah, it’s all data governance, it’s called steering, harnessing, like, it’s, it’s like… and you’ll see that

276 00:34:58.080 00:35:03.949 Uttam Kumaran: quickly, you’ll be like, wow, okay, Blobby’s actually not, like, that complicated. Like, I think it’s impressive.

277 00:35:04.060 00:35:04.820 Uttam Kumaran: But…

278 00:35:05.520 00:35:13.880 Uttam Kumaran: I think you’ll start to be like, oh, that’s, like, so 2025, like, those types of agents. Because where I want you to push it, and one thing that I’ll show you…

279 00:35:14.020 00:35:24.570 Uttam Kumaran: In… and I, and I kind of showed this to, to Demi, is, yeah, this…

280 00:35:26.600 00:35:31.830 Uttam Kumaran: So, I started doing some research earlier this quarter on this concept of, like,

281 00:35:32.540 00:35:38.089 Uttam Kumaran: basically data agents, and sort of what I would call, like, an agent-powered data environment.

282 00:35:38.330 00:35:40.090 Uttam Kumaran: I think this’ll be good

283 00:35:40.480 00:35:43.990 Uttam Kumaran: Sunday reading for you is, like, what’s in this sort of folder.

284 00:35:44.070 00:35:46.809 Amber Lin: Which is just… Did you send, did you send the link to me?

285 00:35:46.810 00:35:49.190 Uttam Kumaran: Yes, I’ll send this to you.

286 00:35:49.510 00:35:51.379 Amber Lin: And the article, if you can.

287 00:35:51.380 00:35:53.249 Uttam Kumaran: And the article, I’ll send it to you.

288 00:35:58.290 00:35:59.619 Uttam Kumaran: And then this…

289 00:36:04.020 00:36:07.740 Uttam Kumaran: I think if you read this, you’ll be like, Holy shit.

290 00:36:09.000 00:36:13.390 Uttam Kumaran: But, basically, where was I?

291 00:36:14.410 00:36:15.190 Uttam Kumaran: Yeah.

292 00:36:15.520 00:36:16.510 Uttam Kumaran: So…

293 00:36:17.130 00:36:22.140 Uttam Kumaran: I spent some time, and this was just, like, one research session that I did, where I was just, like.

294 00:36:22.930 00:36:30.739 Uttam Kumaran: I want to think about how AI can get leveraged through the entire supply chain of our typical data,

295 00:36:31.220 00:36:40.549 Uttam Kumaran: a typical data project of ours, from engineering to modeling, to BI, to product analytics to strategy and analysis.

296 00:36:40.710 00:36:42.630 Uttam Kumaran: And…

297 00:36:42.930 00:36:58.370 Uttam Kumaran: one thing that I was just, like, trying to think about is, like, okay, what are, like, the different, like, areas of the project? And so, this, this, this, this folder is really good to sort of give you a lay of the land of, like, what I was thinking about. But really, I think if I go to,

298 00:36:59.570 00:37:04.770 Uttam Kumaran: This doc has a lot about, like, what I’m calling, like.

299 00:37:05.350 00:37:11.330 Uttam Kumaran: self-learning data agents. And basically a lot of this is like, okay, we have, like.

300 00:37:11.520 00:37:14.289 Uttam Kumaran: We show that cursor can ask SQL questions, right?

301 00:37:14.510 00:37:19.599 Uttam Kumaran: So, the next jump is, like, okay, can Cursor take a task from Linear that says,

302 00:37:20.290 00:37:30.490 Uttam Kumaran: use data to solve this problem? Can it break that down into SQL questions, run those SQL, take the… take a bunch of the answer, and then, like, actually give you the answer?

303 00:37:30.750 00:37:38.790 Uttam Kumaran: Okay, and then at some point, like, while you’re… while you’re doing that, right, then it… then it learns, like, oh, actually, we always want to use revenue instead of sales.

304 00:37:38.980 00:37:45.380 Uttam Kumaran: we always are gonna use, the daily average versus the 4-week average, right? So these are, like, learnings.

305 00:37:45.550 00:37:49.590 Uttam Kumaran: And so part of this is a little bit of a document on, like, how do…

306 00:37:49.870 00:37:53.530 Uttam Kumaran: How do agents, like, learn? And how do we think about

307 00:37:53.730 00:37:57.440 Uttam Kumaran: memory for, like, AI agents, so it’s…

308 00:37:57.780 00:37:58.230 Amber Lin: Yeah.

309 00:37:58.230 00:38:00.440 Uttam Kumaran: concept around memory. Yeah, go ahead.

310 00:38:01.050 00:38:07.719 Amber Lin: Cool, I was just… I think in my own exploration, I was just getting into this, of…

311 00:38:07.850 00:38:21.890 Amber Lin: like, iterations of skills and rules, and also where they store that… that context, which is the memory that you’re talking about. So I kind of know where you’re… where you’re getting to, and that’s very exciting.
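
A minimal sketch of the “learning” step Uttam describes for a self-learning data agent: corrections picked up during one run (use revenue instead of sales, daily average instead of 4-week average) get written to a small memory store and applied to the next request. File name and function names are illustrative, not the team’s actual implementation.

```python
import json
from pathlib import Path

MEMORY_FILE = Path("agent_memory.json")  # hypothetical location for learned preferences

def remember(preference: str) -> None:
    """Persist a learned preference so later runs see it."""
    prefs = json.loads(MEMORY_FILE.read_text()) if MEMORY_FILE.exists() else []
    if preference not in prefs:
        prefs.append(preference)
        MEMORY_FILE.write_text(json.dumps(prefs, indent=2))

def build_prompt(task: str) -> str:
    """Prepend everything the agent has learned so far to a new task."""
    prefs = json.loads(MEMORY_FILE.read_text()) if MEMORY_FILE.exists() else []
    learned = "\n".join(f"- {p}" for p in prefs) or "- (none yet)"
    return f"Task: {task}\nLearned preferences to always apply:\n{learned}"

# Corrections from one review become standing context for the next run.
remember("Use revenue, not sales, unless the ticket says otherwise")
remember("Report daily averages, not 4-week averages")
print(build_prompt("Summarize last week's performance"))
```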

312 00:38:21.890 00:38:25.250 Uttam Kumaran: Cool, and, like, one… maybe one other area that I’ll,

313 00:38:25.810 00:38:34.930 Uttam Kumaran: I’ll share with you is… maybe even going one step further, this is all part of, sort of, like, a broader platform plan, which is, like.

314 00:38:35.240 00:38:47.119 Uttam Kumaran: what is the plan for the platform team this quarter? And so the way we’re thinking about it is there are some primitives when it comes to AI. So now that you’ve got it, if you get context and you kind of get memory, then this should be, like, the step above, which is, like.

315 00:38:47.420 00:38:52.849 Uttam Kumaran: what we’re just talking about today is just for the strategy and AI team, right? But if you broaden it, it’s like.

316 00:38:53.480 00:39:00.589 Uttam Kumaran: any work that our company does, right, has… what are the… what are the corollaries across them, right? So there’s some context.

317 00:39:00.760 00:39:04.630 Uttam Kumaran: Right? So AI has access to relevant history, company knowledge.

318 00:39:04.810 00:39:11.009 Uttam Kumaran: There’s a spec, which is a definition of what it’s done. It’s like the acceptance criteria, it’s the linear ticket, right?

319 00:39:11.240 00:39:17.469 Uttam Kumaran: There’s a verification, right? This is, like, the PR review. This is someone being, like, cool, this is good to go.

320 00:39:17.950 00:39:19.629 Uttam Kumaran: There’s the execution.

321 00:39:20.080 00:39:27.659 Uttam Kumaran: Okay, this is the fact that I can say Codex, go run this, Cursor, go run this, and it runs. Then: where is it running? Is it the cloud? Is it your machine?

322 00:39:27.840 00:39:30.289 Uttam Kumaran: There’s observation, which is, like.

323 00:39:30.630 00:39:37.739 Uttam Kumaran: How do you know it’s running? How do you know what it ran? How do you trace back to find out issues? And then there’s everything around safety.

324 00:39:38.080 00:39:43.669 Uttam Kumaran: it can’t, like, it can’t affect certain systems. It shouldn’t be able to, like, email the client, right? That’s, like, a guardrail.

325 00:39:44.000 00:39:53.280 Uttam Kumaran: So, one of the things Clarence and I talk a lot about is, like, I think these are, like, the sort of six primitives of, like, what we’re gonna call, like, the AI harness at Brainforge.
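
A minimal sketch of the six harness primitives just named (context, spec, verification, execution, observation, safety) expressed as a checklist on a unit of work. The class and field names are the editor’s illustration; the meeting only names the primitives.

```python
from dataclasses import dataclass, field

@dataclass
class HarnessedTask:
    context: str        # relevant history / company knowledge the AI can see
    spec: str           # what "done" means: acceptance criteria, the Linear ticket
    verification: str   # who or what signs off, e.g. the PR review
    execution: str      # where it runs: cloud or local machine
    observation: str    # how we know what it ran and can trace issues back
    safety: list[str] = field(default_factory=list)  # guardrails, e.g. cannot email the client

    def missing(self) -> list[str]:
        """Return which primitives are still empty before handing this to an agent."""
        gaps = [name for name in ("context", "spec", "verification", "execution", "observation")
                if not getattr(self, name).strip()]
        if not self.safety:
            gaps.append("safety")
        return gaps

task = HarnessedTask(
    context="dbt repo docs for the orders model",
    spec="Filter Janice out of the active-user count; acceptance: counts match finance",
    verification="PR review by the data lead",
    execution="cloud run",
    observation="run log linked back to the ticket",
    safety=["read-only warehouse role", "cannot email the client"],
)
print(task.missing())  # [] -- nothing missing, safe to assign
```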

326 00:39:53.540 00:39:55.899 Uttam Kumaran: And so, like, think about the harness as, like.

327 00:39:56.870 00:40:09.690 Uttam Kumaran: if you think about, like, AI as, like, a thing that… a set of tools or ways of working, these are all the ways that we have to shape it for it to do, actually, productive work at Brainforge. And so, if I was to go back to, like,

328 00:40:10.240 00:40:16.619 Uttam Kumaran: I wrote this doc about, sort of, like, what is the, like, sort of learnings from this report about

329 00:40:17.360 00:40:25.050 Uttam Kumaran: cloud… about, like, data agents, and really, it’s like, cool, what’s the goal? Map common asks by role to a repeatable worker, SOP, or skill.

330 00:40:25.320 00:40:30.800 Uttam Kumaran: Give reviewers automatic signals for data PRs, test out whether changes are working.

331 00:40:31.110 00:40:40.619 Uttam Kumaran: you have, like, sort of guardrails in Snowflake, and then you allow, like, something… people to be able to, like, run things in the cloud, not necessarily on your… on your machine. And so…

332 00:40:40.830 00:40:48.160 Uttam Kumaran: this will be actually helpful, because you’ll actually… you’re starting to learn a lot about dbt, actually, so if I was to, like… but if I was to focus on the analysts, right?

333 00:40:48.350 00:41:00.409 Uttam Kumaran: What are the common asks? Investigate a question. Validate or reject a hypothesis. Produce insights. Build draft decks and narrative artifacts. Probably missing some here, but you can think about, then, okay.

334 00:41:00.610 00:41:02.610 Uttam Kumaran: Can we actually determine

335 00:41:02.970 00:41:07.999 Uttam Kumaran: just like I’m asking Jasmine, the reason I asked Jasmine for what makes a great deck.

336 00:41:08.150 00:41:12.979 Uttam Kumaran: is because I need her to give… give us this, like, the… the spec.

337 00:41:13.190 00:41:17.439 Uttam Kumaran: Right? She needs to specify what are the components, or what is the…

338 00:41:17.560 00:41:20.290 Uttam Kumaran: What is the, like, exit criteria for a great deck?

339 00:41:20.460 00:41:34.030 Uttam Kumaran: That way, when we build decks, we judge it against exit criteria. That way, when an agent builds a deck, it judges it against an exit criteria, right? And so you could think about… and these are all just, like, I was just riffing here, but you could think about…

340 00:41:34.140 00:41:47.679 Uttam Kumaran: different workers, or agents, or skills that contribute to something that does analysis execution, insight synthesis. If you’ve been playing around with skills, you know, over the past few weeks, you know that, like.

341 00:41:47.960 00:41:50.689 Uttam Kumaran: if you… you can give AI skills to run.

342 00:41:51.210 00:41:57.300 Uttam Kumaran: And ultimately, if flipped another way, AI has access to skills to run. Just like it has access to MCP.

343 00:41:57.640 00:42:01.389 Uttam Kumaran: Just like it has access to context, has access to web, and so…

344 00:42:02.070 00:42:04.320 Uttam Kumaran: this is sort of how I’m thinking about

345 00:42:04.790 00:42:11.149 Uttam Kumaran: mapping the tasks of our data team to, like, agentic workflows.

346 00:42:11.320 00:42:16.969 Uttam Kumaran: And then there’s some stuff here about, like, how are we gonna do this? Like, for example, the analyst harness.

347 00:42:17.120 00:42:23.809 Uttam Kumaran: Okay, you need to get a question, success criteria, you need to make the hypothesis, run the query, build the insight.

348 00:42:23.930 00:42:26.440 Uttam Kumaran: Right? So, for example, you could think about, like.

349 00:42:26.580 00:42:29.509 Uttam Kumaran: Amber, actually, you’re in the… you’re… you have to do this.

350 00:42:30.180 00:42:32.649 Uttam Kumaran: Then you have to approve every one of these steps.

351 00:42:32.900 00:42:40.779 Uttam Kumaran: So you turn from doing these things, and then being like, AI, tell me if I’m right, to AI doing these things, and AI’s asking you, am I right?

352 00:42:42.110 00:42:43.680 Uttam Kumaran: See what I mean?

353 00:42:43.970 00:42:44.990 Amber Lin: Very cool.
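
A minimal sketch of the analyst-harness flow just described: the agent drafts each step (hypothesis, query, insight) and a human approves before it moves on, the inversion where the AI asks “am I right?” at each gate. The `draft_step` function is a stand-in for a real model call; all names are illustrative.

```python
def draft_step(step: str, question: str) -> str:
    """Placeholder for a model call that drafts one step of the analysis."""
    return f"[draft {step} for: {question}]"

def analyst_harness(question: str, approve) -> dict:
    """Run question -> hypothesis -> query -> insight, pausing for human approval each time."""
    result = {"question": question}
    for step in ("hypothesis", "query", "insight"):
        draft = draft_step(step, question)
        while not approve(step, draft):          # the AI asks "am I right?" before proceeding
            draft = draft_step(step, question + " (revised)")
        result[step] = draft
    return result

# Interactive use would pass an input()-based approval callback; here we auto-approve.
print(analyst_harness("Why did sell-through dip in week 12?", lambda step, draft: True))
```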

354 00:42:45.460 00:42:49.049 Uttam Kumaran: I’m just, just like… blabbing on and on, but, like.

355 00:42:49.050 00:42:50.030 Amber Lin: There’s

356 00:42:51.070 00:43:06.070 Amber Lin: Something for the analysis, so I… I had a skill for… I think I’m starting to build up to, like, an agent. I have skills for connecting to DBs, running, like, revenue queries, and then, like, taking those

357 00:43:06.090 00:43:16.990 Amber Lin: graphs or images and putting them in a slide deck. Great. So that’s kind of what I’m starting, but I think the piece I’m missing is the, like, what you made Jasmine do, like, requirements

358 00:43:16.990 00:43:17.430 Uttam Kumaran: Yes.

359 00:43:17.430 00:43:18.410 Amber Lin: for a dad.

360 00:43:18.410 00:43:20.880 Uttam Kumaran: It’s only about the… you’ll actually find…

361 00:43:21.050 00:43:29.560 Uttam Kumaran: That skill development is really easy, but getting the notion of what is right is hard.

362 00:43:29.720 00:43:30.430 Amber Lin: Like…

363 00:43:30.430 00:43:40.890 Uttam Kumaran: And that’s why I’m pushing on these… this is why, broadly, on the delivery team, I’m starting to push these standards, is ultimately we need standards, and we need SOPs, and, like.

364 00:43:41.080 00:43:47.600 Uttam Kumaran: requirements, and then ultimately, if there’s not a requirement, the AI needs to say, let me always look back at the standards.

365 00:43:47.880 00:43:55.200 Uttam Kumaran: Right? So I can… I can infer what the best way of doing it. Otherwise, it’s gonna, like, look through its corpus and think about some random…

366 00:43:55.430 00:43:58.459 Uttam Kumaran: memory it has, and then just decide, right? Instead, it’s like.

367 00:43:58.630 00:44:01.419 Uttam Kumaran: We have to almost try to develop its intuition.

368 00:44:01.970 00:44:07.809 Uttam Kumaran: like, in the event it has all these pieces of information, it needs to make a judgment call, right? And…

369 00:44:07.850 00:44:25.359 Uttam Kumaran: you may think about a human’s judgment, you may say that’s undefinable. Well, similar to an AI, you can’t really, like… you have to steer it, and we have to treat it like us, where it’s like, okay, you have, like, long-term memory, short-term memory, you have things in front of you, you have access to tools, you have thresholds of, like, confidence.

370 00:44:25.480 00:44:38.520 Uttam Kumaran: And yes, what you’ll find… and so my job on the platform team is to enable you to make these skills faster, but what you’re gonna find is the skill creation is not the limitation, it’s invoking it.

371 00:44:39.000 00:44:41.700 Uttam Kumaran: you’re gonna find what I found out, is that

372 00:44:42.190 00:44:57.149 Uttam Kumaran: there’s… there’s not an… there’s not enough bandwidth to start working on all these things. Like, you’re gonna have 4 or 5 chat windows up, you’re gonna be working on a bunch of things, and you’re gonna find that you’re the limitation. The next step is for you to assign work out to the AI.

373 00:44:57.340 00:44:57.809 Amber Lin: Right?

374 00:44:57.850 00:45:09.430 Uttam Kumaran: But again, what happens is if, just like if you have an intern, you don’t give it a clear spec, and you’re like, run, it’s just gonna Roomba itself into the wall, right? It’s, like, not gonna, like, really understand what’s going on. So…

375 00:45:09.930 00:45:14.889 Uttam Kumaran: as you can see, the reason why I’ve been working on a lot of these plans is the plan

376 00:45:15.010 00:45:18.249 Uttam Kumaran: And the spec is actually the limiting factor.

377 00:45:18.530 00:45:22.909 Uttam Kumaran: on the platform team, I’m gonna solve, like, you can go… you can go into…

378 00:45:23.190 00:45:26.070 Uttam Kumaran: Like, you can go into,

379 00:45:26.900 00:45:40.779 Uttam Kumaran: Linear today and start assigning things to AI. You should try it. You could go do that today. You can assign things to Cursor or Codex. Like, a good example, if I was to show you, is in the Andy channel today, there was this,

380 00:45:41.650 00:45:44.549 Uttam Kumaran: There was this thing about, hey, is this right?

381 00:45:44.880 00:45:51.770 Uttam Kumaran: Right? I don’t think this data is right. Is there any way… I think we’re… we have to filter out Janice. Yeah, I’ll do that manually.

382 00:45:52.490 00:45:59.169 Uttam Kumaran: You can literally go assign this to AI, and I can show you, and we can see, I don’t know if it’s… I don’t know if the repo is hooked up, but let’s…

383 00:45:59.310 00:46:00.830 Uttam Kumaran: Let’s see if this works.

384 00:46:01.020 00:46:05.840 Uttam Kumaran: So you can go in here, and…

385 00:46:06.680 00:46:11.099 Uttam Kumaran: you can literally go, cool, I want to click… I wanna literally assign it to cursor.

386 00:46:12.190 00:46:14.859 Uttam Kumaran: And… It’s gonna start working on it.

387 00:46:15.410 00:46:16.860 Amber Lin: Oh, that’s very cool.

388 00:46:16.860 00:46:23.909 Uttam Kumaran: and you’ll move it into In Progress, and then I’ll show you. So, you could actually click on this, and I want to actually go to the cloud run.

389 00:46:24.010 00:46:27.989 Uttam Kumaran: So if I go… so it’ll also start to comment on what it’s doing.

390 00:46:28.120 00:46:32.500 Uttam Kumaran: Right now, I can see it’s in the… it should… let’s see, it’ll… it should… it may work.

391 00:46:34.320 00:46:42.350 Uttam Kumaran: So right now, it’s starting to basically run. So if you look, this is the prompt. You’ve been assigned to resolve a Linear issue, carefully analyze. Below, you’ll find the issue.

392 00:46:42.780 00:46:45.360 Uttam Kumaran: And so, guess what my next point is?

393 00:46:45.550 00:46:49.060 Uttam Kumaran: What is… what is the limiting factor in this prompt, if I had to ask you?

394 00:46:50.540 00:46:57.949 Amber Lin: Well, it doesn’t have context to how we set up the data.

395 00:46:57.950 00:46:58.380 Uttam Kumaran: Yes.

396 00:46:58.700 00:47:00.090 Uttam Kumaran: Shitty context.

397 00:47:00.290 00:47:08.090 Uttam Kumaran: shitty acceptance criteria, right? So, the limitation now is the fact that this is kind of sucky.

398 00:47:08.440 00:47:23.420 Uttam Kumaran: But what you’re gonna find is, yes, like, Opus and these things, they’re really smart. They will take, like, every single thing and, like, infer, but ultimately, we could use GPT, like, 5.1 if we just had a much richer context.

399 00:47:24.080 00:47:35.119 Uttam Kumaran: So, another thing I’m gonna build is, like, there should be an AI that looks at this ticket and says, this cannot move into to-do until the context is filled out, right?

400 00:47:35.220 00:47:35.900 Amber Lin: Hmm.

401 00:47:35.900 00:47:41.999 Uttam Kumaran: And it should flag this and say, this is shitty context, you need to go fill it out with better context. Because ultimately.

402 00:47:42.190 00:47:49.080 Uttam Kumaran: I’m gonna assign this to Cursor, but if Cursor fails, it’s because of the context. It’s not because of Cursor.

403 00:47:49.290 00:47:50.100 Uttam Kumaran: Right.

404 00:47:50.780 00:47:53.459 Uttam Kumaran: And so that’s what I want you to think a little bit about, like.

405 00:47:53.890 00:47:57.380 Uttam Kumaran: This whole world of, like, context engineering, harnessing.

406 00:47:57.570 00:48:10.160 Uttam Kumaran: what we’re ultimately trying to solve is that we’re… typically in knowledge work, we’re really bad at writing things down and giving clear expectations, and using AI should just expose that to you, that that is the limiting factor.
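
A minimal sketch of the context gate described above: an LLM call that checks whether a ticket has enough context and acceptance criteria before it is allowed to move into To-Do. The model choice, prompt wording, and verdict shape are assumptions for illustration, not an existing system.

```typescript
// Sketch: review a ticket's context quality before it can move to To-Do.
// Model name, prompt, and verdict shape are placeholder choices.

type Verdict = { ready: boolean; missing: string[] };

async function reviewTicketContext(title: string, description: string): Promise<Verdict> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini", // placeholder model choice
      messages: [
        {
          role: "system",
          content:
            "You review engineering tickets before they move to To-Do. " +
            'Reply with JSON: {"ready": boolean, "missing": string[]}. ' +
            "A ticket is ready only if it has context, acceptance criteria, and links to relevant data or code.",
        },
        { role: "user", content: `Title: ${title}\n\nDescription:\n${description}` },
      ],
      response_format: { type: "json_object" },
    }),
  });

  const data = await res.json();
  return JSON.parse(data.choices[0].message.content) as Verdict;
}

// Usage: block the status change and comment back on the ticket if not ready.
reviewTicketContext("Filter out Janice", "I don't think this data is right.")
  .then((v) => console.log(v.ready ? "Move to To-Do" : `Needs: ${v.missing.join(", ")}`));
```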

407 00:48:10.330 00:48:16.880 Uttam Kumaran: And what we’re gonna find is, like, one of the… one of the goals for the platform team

408 00:48:17.390 00:48:24.110 Uttam Kumaran: One of the goals for the platform team, I’ll have you read, is… this.

409 00:48:29.720 00:48:30.280 Amber Lin: Hmm.

410 00:48:31.260 00:48:44.409 Uttam Kumaran: So, for all the work that the platform team does, which is, like, making sure Cursor is set up, I actually want to force my team so that 50% of our tickets can only be done by AI. No human intervention.

411 00:48:45.180 00:48:50.149 Amber Lin: I mean, right now, most of what the team does is instruct AI to code, and then…

412 00:48:50.150 00:48:53.060 Uttam Kumaran: Yes, so… but we need to make the leap, right?

413 00:48:53.060 00:48:54.030 Amber Lin: Okay.

414 00:48:54.030 00:49:01.770 Uttam Kumaran: But it’s… it’s tough, you have to have your whole system set up, the right keys, right? All the little paper-cut things that you’re dealing with, which is, like.

415 00:49:01.950 00:49:19.449 Uttam Kumaran: my env key isn’t there, or, like, the thing isn’t installed. That’s what we’re figuring out, like, for the broader organization, so that any piece of work you do assign, you can have trust that it will execute. It may be slow, it may ask you follow-up questions, but, like, it can get there.

416 00:49:19.630 00:49:26.279 Uttam Kumaran: So that’s where, like, because ultimately, right now, my limitation is I cannot work on all the things that I have.

417 00:49:27.050 00:49:31.699 Uttam Kumaran: And I’m consistently generating more projects to work on.

418 00:49:32.170 00:49:40.240 Uttam Kumaran: But I’m trying… where I’m trying to push to is not only can I assign a piece of work out, can I just assign one AI that governs the entire project?

419 00:49:40.530 00:49:47.959 Uttam Kumaran: and then assigns out to other AI, right? And manages it. And so you can see we’re building these, like, layers of abstraction.

420 00:49:48.740 00:49:51.039 Uttam Kumaran: But the unit of work is the ticket.
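
A hypothetical sketch of the governor idea just described: one planning agent breaks a project brief into tickets, and each ticket is then handed to a worker agent, keeping the ticket as the unit of work. Nothing here is an existing system; all three helpers are stand-ins for a planning model and whatever tracker integration is used (for example, the Linear call sketched earlier).

```typescript
type Ticket = { title: string; context: string };

// Stand-in: ask a planning model to break a project brief into tickets.
async function planProject(brief: string): Promise<Ticket[]> {
  return [{ title: "Example ticket", context: `Derived from: ${brief}` }];
}

// Stand-in: create the ticket in the tracker and return its id.
async function createTicket(ticket: Ticket): Promise<string> {
  console.log(`created: ${ticket.title}`);
  return "TICKET_ID";
}

// Stand-in: hand the ticket to a worker agent (Cursor, Codex, etc.).
async function assignTicketToAgent(ticketId: string): Promise<void> {
  console.log(`assigned ${ticketId} to an agent`);
}

// Governor loop: plan, create tickets, delegate. A fuller version would also
// review the agents' output and re-plan.
async function governProject(brief: string): Promise<void> {
  for (const ticket of await planProject(brief)) {
    const id = await createTicket(ticket);
    await assignTicketToAgent(id);
  }
}

governProject("Example project brief");
```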

421 00:49:53.560 00:50:04.130 Amber Lin: Very cool. Wow, this is… this has really come a long way. Well, I’m excited to read this. I don’t have enough context of what… I know what you’re talking about, but it will be helpful…

422 00:50:04.130 00:50:08.949 Uttam Kumaran: You’ll see it, you’ll see it. I just wanted to tell you these, because, like, in, like, a week or two, it’ll come back and click.

423 00:50:09.280 00:50:09.710 Amber Lin: Okay.

424 00:50:09.710 00:50:12.089 Uttam Kumaran: You know, it’ll come back and click, and you’ll be like.

425 00:50:12.480 00:50:14.470 Amber Lin: Okay.

426 00:50:14.470 00:50:17.839 Uttam Kumaran: This is where I can’t… because I can’t go deep on strategy.

427 00:50:18.430 00:50:26.199 Uttam Kumaran: Like, I can’t… I can’t spend cycles being like, how do I… just enable the strategy team.

428 00:50:26.310 00:50:28.409 Uttam Kumaran: what I’m gonna do is make it so, like.

429 00:50:28.660 00:50:38.120 Uttam Kumaran: anyone who wants to build AI for their team, you have the… you have the things. Like, we chose Linear, because I know you can assign AI out. You have Cursor.

430 00:50:38.260 00:50:42.399 Uttam Kumaran: You have… everybody is gonna learn how to do skills, so you’re gonna have, like.

431 00:50:42.750 00:50:45.900 Uttam Kumaran: I’m gonna give you, like, rock, earth,

432 00:50:46.040 00:50:52.530 Uttam Kumaran: fire, stone, right? I’m gonna give you, like, the core primitives. And, like.

433 00:50:52.540 00:50:53.180 Amber Lin: Edge.

434 00:50:53.180 00:50:56.189 Uttam Kumaran: And then everybody can start running for their own teams, right?

435 00:50:56.560 00:50:57.280 Amber Lin: Cool.

436 00:50:57.860 00:51:02.600 Amber Lin: Okay, exciting! Well, even more excited to talk to him next.

437 00:51:02.600 00:51:08.279 Uttam Kumaran: So yeah, I think… I think if you… for me, I think if you can work with him, if you can say, if… what is…

438 00:51:08.420 00:51:11.689 Uttam Kumaran: couple ways that I would… I would ask him is, like, what is the…

439 00:51:11.800 00:51:14.739 Uttam Kumaran: What is the key result that he’s most nervous about hitting?

440 00:51:14.850 00:51:16.780 Uttam Kumaran: Or what is the key result that, if you hit it,

441 00:51:17.170 00:51:23.849 Uttam Kumaran: has the biggest impact on the business, and I would try to throw your hat in the ring for that. There’s a lot of stuff he wants to do.

442 00:51:25.280 00:51:30.810 Uttam Kumaran: But… and I… whatever you guys agree on, I’ll… I will work with you and tie the… tie the bonus to that.

443 00:51:31.270 00:51:33.720 Amber Lin: Cool. Yeah.

444 00:51:33.910 00:51:41.060 Amber Lin: I’m excited about the year 2 stuff. I do want to come back, talk a little bit about the contracts that we just…

445 00:51:41.060 00:51:41.590 Uttam Kumaran: Yes.

446 00:51:41.590 00:51:45.660 Amber Lin: we can finalize it today, and then we’ll get… I’ll work with Rico to get it.

447 00:51:45.660 00:51:46.190 Uttam Kumaran: Cool.

448 00:51:46.440 00:51:49.740 Amber Lin: written… written out and signed.

449 00:51:50.220 00:51:52.839 Amber Lin: Let’s see, let me pull up my doc.

450 00:51:55.440 00:51:56.270 Amber Lin: Yep.

451 00:52:00.150 00:52:07.510 Amber Lin: I think I can take what we talked about today, and I’ll update the doc to give to Rico, so he can update the…

452 00:52:07.710 00:52:18.910 Amber Lin: updated contract or whatever. So I think year two… let’s see… I think year two bonus structure, we’ve talked about that.

453 00:52:19.440 00:52:19.760 Uttam Kumaran: Yes.

454 00:52:19.760 00:52:24.770 Amber Lin: So… We talked about that. I… so I have…

455 00:52:25.550 00:52:29.920 Amber Lin: I think two things to ask. So this, I think we talked about.

456 00:52:30.250 00:52:48.769 Amber Lin: a tenure base raise at, I think we did 7%, so I wanted to ask you, is there a possibility for a year one bonus, and is there a range for, like, year two base adjustment beyond a tenure base raise? Because I know I’m doing very different work than

457 00:52:48.770 00:52:50.639 Amber Lin: The PM work. So.

458 00:52:50.640 00:52:51.150 Uttam Kumaran: Yeah.

459 00:52:51.150 00:52:52.629 Amber Lin: What are your thoughts, Daryl?

460 00:52:52.840 00:52:58.769 Uttam Kumaran: Yeah, so… You may like one answer, you may not like one answer, so…

461 00:52:58.770 00:52:59.230 Amber Lin: Huh.

462 00:52:59.230 00:53:00.410 Uttam Kumaran: on one hand.

463 00:53:01.270 00:53:12.570 Uttam Kumaran: what I can control is that I’m gonna put forth this OKR in front of you for the next… there’s gonna be OKRs for the next three quarters that if you hit, there’s gonna be a bonus. I feel like you’re going to…

464 00:53:12.770 00:53:15.950 Uttam Kumaran: You’re gonna be able to hit the 100K.

465 00:53:16.480 00:53:21.529 Uttam Kumaran: through achieving those. I can see that you’re, like, Yes, but I want, like.

466 00:53:21.830 00:53:38.279 Uttam Kumaran: the floor to rise as well. For that, I have to wait until Jasmine gets here, because the service leads are going to own the leveling within their org. We don’t, like, I do think that you’re operating well beyond, like.

467 00:53:38.600 00:53:47.820 Uttam Kumaran: what you would classify as, like, a junior. But I… but also, like, I’m not… it’s… it’s not gonna be my organization. So, what I want her to do is set the levels.

468 00:53:48.110 00:54:07.639 Uttam Kumaran: And then she’s gonna own with Kayla, like, what are the… what are the base opportunities? To give you the broader perspective, like, what I don’t want to do here is be, like, there’s no raise, there’s no additional, and then also there’s no bonus. Like, I think you’re gonna hit the 100K if you… if you rip those quarterly goals.

469 00:54:09.130 00:54:12.300 Uttam Kumaran: like, if you hit the quarterly OKRs, I think you will get there.

470 00:54:12.880 00:54:13.320 Amber Lin: Huh.

471 00:54:13.320 00:54:19.959 Uttam Kumaran: right now, for everyone that hits one year, we’re offering 7%. To give you a frame of reference, like.

472 00:54:21.250 00:54:25.019 Uttam Kumaran: The industry standard right now in consulting is doing, like.

473 00:54:25.460 00:54:27.839 Uttam Kumaran: Anywhere from, like, 1% to 4%.

474 00:54:28.190 00:54:35.729 Uttam Kumaran: So I feel like 7% is healthy. I know it’s not like… I don’t want to be like, everybody else is doing this, so we’re… that’s not how I think about it.

475 00:54:35.900 00:54:37.360 Uttam Kumaran: But I also…

476 00:54:38.020 00:54:48.519 Uttam Kumaran: for this to start to stabilize, and for you to have a clear sense of, like, year one, year two, I have to delegate to the service leads to start to basically create their own org structure.

477 00:54:48.670 00:54:57.230 Uttam Kumaran: And give them the opportunity. What I can control right now is that there’s this bonus in place for hitting these OKRs on this new team.

478 00:54:58.580 00:55:08.870 Uttam Kumaran: And then I think it’s helpful for Jasmine to come in and consider re-leveling. And that’s the conversation that I would like you and her to have, or you, me, and her to have, or me and her have, is like.

479 00:55:09.070 00:55:12.889 Uttam Kumaran: Does there need to be a re-leveling on the strategy team, right?

480 00:55:12.890 00:55:27.149 Amber Lin: Okay, I hear you on… I hear you on this point. My question here, then, is what is the estimated range of bonuses tied to OKRs? Is it more in the 1% to 5% range, or is it.

481 00:55:27.150 00:55:30.409 Uttam Kumaran: No, it’s, like, in the sort of 10-20% range.

482 00:55:30.410 00:55:31.100 Amber Lin: Cool.

483 00:55:31.100 00:55:34.759 Uttam Kumaran: So, to give you another frame of reference there,

484 00:55:35.340 00:55:43.599 Uttam Kumaran: most people are having bonuses within 5-10%. We’re trying to do bonuses that are closer to the 10-20%.

485 00:55:44.210 00:55:47.229 Uttam Kumaran: And… and the reason is, one, I feel like…

486 00:55:47.630 00:55:54.190 Uttam Kumaran: We are much more organized than most companies and can tie team performance and individual performance to

487 00:55:54.410 00:55:59.540 Uttam Kumaran: company revenue. That’s not something that most companies do. Most companies will say.

488 00:55:59.860 00:56:09.509 Uttam Kumaran: we did okay, here’s… everybody gets 4%, and, like, you may have crushed it, Sally in the other cubicle may have, like, never came to work, you both get 4%. Like.

489 00:56:09.660 00:56:15.689 Uttam Kumaran: I don’t like that, I don’t… I never like that. So what I want to do, though, is give you… give everybody a higher…

490 00:56:15.980 00:56:19.189 Uttam Kumaran: A higher, like, amount to achieve.

491 00:56:19.300 00:56:20.560 Uttam Kumaran: But on net.

492 00:56:20.710 00:56:28.239 Uttam Kumaran: we also pay really well. And so, one thing that we are thinking about longer term is, like, I don’t want to actually

493 00:56:28.440 00:56:35.530 Uttam Kumaran: like, change base? Like, not in terms of your base, but in terms of, like, our leveling, like, in terms of our ranges.

494 00:56:35.800 00:56:54.430 Uttam Kumaran: I think we’re in a healthy place. In fact, like, I think we… we really, like, we pay a lot. Instead, what I want to do is ratchet up the bonus. And I want it to be really more meritocratic, where people have a shot at getting 10-20% bonus. That is not something that, like, is typically offered in the industry.

495 00:56:54.650 00:57:00.279 Uttam Kumaran: And that’s something that, like, it’s a bet I’m taking for… if I can set the company objectives right and set the incentives right, then…

496 00:57:00.760 00:57:18.280 Uttam Kumaran: I’m totally fine. If you… if you guys rip L&D and it becomes a service, and we make money, why wouldn’t I give them money? Like, it’s so clear to me, the reason why it’s oftentimes not, there’s two reasons why people don’t often do it. One is that operationally, it’s… it’s tough. Like, it’s way easier to be like.

497 00:57:18.660 00:57:24.520 Uttam Kumaran: At the end of the year, we’re just gonna count the piggy bank and then give a piece. Second is greed. Most people…

498 00:57:24.770 00:57:27.759 Uttam Kumaran: don’t want to offer any of that. They were like, no, we’re not gonna…

499 00:57:27.880 00:57:38.470 Uttam Kumaran: They’re like, we’re not gonna pay you at all, and there’s no bonus. That’s, like, most engineering companies. So, I… I don’t… I’m like… I don’t like both of them. I particularly don’t like the second one.

500 00:57:38.600 00:57:44.940 Uttam Kumaran: But the first one, I’m taking on the operational burden of trying to allow people to go for

501 00:57:45.070 00:57:46.870 Uttam Kumaran: 10-20% bonus.

502 00:57:47.030 00:57:48.530 Uttam Kumaran: And that should put you…

503 00:57:48.990 00:57:54.030 Uttam Kumaran: I think that, and, like, potentially re-leveling should get you into that range, for sure.

504 00:57:54.030 00:58:13.450 Amber Lin: I hear you on that. I think in my… in my experience with past, like, consulting companies where my friend Susan Ivy, it’s… some of it is project-based. If this project goes well, then you get a bonus, but I think we’re operating more on the OKR level, which I’m also happy with. I guess my question is then, so, so far.

505 00:58:13.450 00:58:22.220 Amber Lin: how many bonuses have been paid out… paid out? So I just want to understand, is this something that’s already happening?

506 00:58:22.250 00:58:27.069 Amber Lin: Do we actually pay out the bonuses we promised? I want to understand that part.

507 00:58:27.070 00:58:35.990 Uttam Kumaran: Yeah, so this is the first quarter that we’ve done bonuses. We’ve done spot bonuses, so, like, if someone comes in and, like, something crazy’s happened and they have to crush it, like…

508 00:58:36.260 00:58:52.480 Uttam Kumaran: like, Zoran came in on a project recently that, like, they needed some, like, rush work, and he crushed it, and so we gave a spot bonus. So, you’ll see kind of two things happen. One is, like, service leads are gonna be more in charge. Service leads and CSOs are going to have the ability to sort of, like.

509 00:58:52.780 00:58:58.970 Uttam Kumaran: be like, hey, this person really crushed it on a project, I wanna, like, allocate some amount of money for a spot bonus.

510 00:58:59.130 00:59:01.810 Uttam Kumaran: And then this is the first quarter we’ve even done bonuses, so…

511 00:59:02.350 00:59:05.660 Uttam Kumaran: For the folks that are in the leadership positions,

512 00:59:05.760 00:59:21.760 Uttam Kumaran: will be reflecting sort of what we put forth, in the beginning of the year, which is sort of that, like, I think we basically said 5%. And then each of those folks are gonna get… get… we’re going through for each CSO and SL and doing updated OKRs. So this is the first quarter we actually did

513 00:59:22.390 00:59:26.959 Uttam Kumaran: We didn’t even have bonuses, really, before, like, this quarter, you know?

514 00:59:27.070 00:59:29.869 Uttam Kumaran: And then what we’re gonna do is, like, bonuses are not…

515 00:59:30.680 00:59:40.590 Uttam Kumaran: as you also probably know, bonuses will be paid at the end of the year. But we’re gonna try to… we’re gonna try to aim for… for this quarter, I’m gonna do this quarter.

516 00:59:40.700 00:59:44.899 Uttam Kumaran: But we’re probably gonna move to, like, half-year cycle, because it’s just, like.

517 00:59:44.900 00:59:53.210 Amber Lin: That makes sense to me, I think that’s what most people do. So, it sounds like to me, like, nobody got a year one bonus, so I’m happy to scratch that if that’s the level.

518 00:59:53.210 00:59:53.830 Uttam Kumaran: That’s correct.

519 00:59:53.830 00:59:55.940 Amber Lin: Everybody, so, like…

520 00:59:55.940 01:00:02.810 Uttam Kumaran: Yeah, to give you a sense, nobody else did. I don’t like being like, no one else did, so you don’t, like… but nobody else did.

521 01:00:02.810 01:00:09.289 Amber Lin: Like, that’s… if that’s what everybody… that’s what the company is at, like, that’s… that can be my expectation.

522 01:00:09.290 01:00:16.630 Uttam Kumaran: Yeah, and we’re only… we’ve only given… yeah, we’ve only given tenure-based raises. 7% is the top level of the range.

523 01:00:16.910 01:00:23.320 Uttam Kumaran: And I’m still, like… what I didn’t want to do is move you off of EP and then not give you something else, so…

524 01:00:23.680 01:00:36.219 Uttam Kumaran: I… I’m… I was like, B, I think this is a good… I think this is a good spot. I believe in his project, and I think that you can play a big component of it, and then I want to assign a target to that. So, I… I… I hear that.

525 01:00:36.220 01:00:40.569 Amber Lin: And I believe in his ability to execute his plans, so…

526 01:00:40.790 01:00:45.549 Uttam Kumaran: And you have someone who’s in, like, leadership vouching for this project.

527 01:00:45.770 01:00:53.530 Uttam Kumaran: And, like, again, his job is to help you achieve that. My job is to help both of you guys achieve this, so…

528 01:00:53.680 01:01:00.660 Uttam Kumaran: this is where it’s like, I’m like, I want to set the bonus that we’re all happy if we hit it, like, it’s, you know, so… I think this quarter.

529 01:01:00.830 01:01:13.100 Uttam Kumaran: we, like, are just figuring out the CSO and the whole delivery structure. I think this should be really clear, and you have someone that’s actually managing and measuring the KPI weekly, so you’ll have clarity on, like, if you’re on track to hit it.

530 01:01:14.090 01:01:16.040 Amber Lin: Cool, I see.

531 01:01:16.400 01:01:27.510 Amber Lin: For the Q1 bonus evaluation, I know we paused or removed the EP rule, but will that still be in the… will that still be considered for.

532 01:01:27.510 01:01:33.480 Uttam Kumaran: Yeah, I think… I think I’m gonna basically… for… for everyone in that… I mean, probably the…

533 01:01:34.060 01:01:41.690 Uttam Kumaran: Some of the challenges I’m having is that not everyone did, like… whether we removed it off their plate, or they didn’t just…

534 01:01:42.070 01:01:44.769 Uttam Kumaran: People just ended up, like, not doing the stuff I asked.

535 01:01:45.750 01:01:48.759 Uttam Kumaran: So what’s fair? What do you think is fair?

536 01:01:48.760 01:01:58.359 Amber Lin: Well, to be very honest, I feel like only me and Mustafa did the EP work. I agree. And showed up to the meetings.

537 01:01:58.360 01:01:58.960 Uttam Kumaran: I agree.

538 01:01:58.960 01:02:04.159 Amber Lin: That’s the only people who were there, if you ask me and whoever else was at that meeting.

539 01:02:04.880 01:02:06.130 Uttam Kumaran: Yeah, so…

540 01:02:06.250 01:02:13.969 Uttam Kumaran: what I’m thinking about is, like, maybe just tracking up to the point where, you know, we kind of, like, moved you guys off, and then just…

541 01:02:14.080 01:02:23.090 Uttam Kumaran: adhering to that bonus for that time period, basically doing, like, 5% of whatever was earned up till that point. Okay.

542 01:02:23.590 01:02:24.420 Amber Lin: I think that’s fair.

543 01:02:24.420 01:02:26.969 Uttam Kumaran: Yeah, I think that’s fair, too. I,

544 01:02:27.380 01:02:35.779 Uttam Kumaran: It’s tough, because on one hand, I’m like, maybe I just didn’t… wasn’t clear enough. On some hand, some people just, yeah, didn’t do shit, so I was like, fuck, like, and… Yeah. And I was like, okay.

545 01:02:35.780 01:02:40.830 Amber Lin: It turns out we can remove it, so the less… the less stuff, the better, so…

546 01:02:40.830 01:02:42.230 Uttam Kumaran: Yeah, so.

547 01:02:42.230 01:02:43.859 Amber Lin: Ended up being a good thing.

548 01:02:43.860 01:02:51.699 Uttam Kumaran: Yeah, so that’ll get… that’ll all, like, we’re… we’re doing this, like, quarterly and month-end close right now, and so that’ll… this’ll all get, like.

549 01:02:51.800 01:02:55.059 Uttam Kumaran: Ironed out over the next two weeks, like, what bonuses end up being, so…

550 01:02:55.370 01:03:01.200 Amber Lin: Cool. Awesome. So, is the number we’re looking at for the base this one?

551 01:03:02.160 01:03:05.709 Uttam Kumaran: I think that’s a good question for…

552 01:03:06.040 01:03:13.059 Uttam Kumaran: Kayla, because, oh, I guess, like, yeah, it’s whatever, you’re at… you’re at 88, 88, so whatever that is, plus, yeah.

553 01:03:13.280 01:03:14.620 Uttam Kumaran: 1.07.

554 01:03:15.130 01:03:16.839 Uttam Kumaran: Cool. If that’s this, then yeah.
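
For reference, spelled out with the numbers mentioned just above, assuming the 88 refers to an $88K base: $\$88{,}000 \times 1.07 = \$94{,}160$, so roughly $94K.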

555 01:03:17.340 01:03:18.000 Amber Lin: Okay.

556 01:03:18.570 01:03:27.089 Amber Lin: Sounds good. I’ll work with Kayla and Rico to get the contract ironed out. I’ll sign, and you can review.

557 01:03:27.090 01:03:39.629 Uttam Kumaran: Okay. And then also it’s worth, it’s worth talking… I mean, I know you’re already talking to Kayla, you may have heard about, sort of, what her OKRs are, but probably the… on the people side is where you’ll see a lot of her things directly are gonna affect

558 01:03:40.040 01:03:43.830 Uttam Kumaran: Your peace of mind and well-being and longevity of the company, so it’s worth.

559 01:03:43.830 01:03:45.099 Amber Lin: I just talked to her yesterday.

560 01:03:45.100 01:03:48.020 Uttam Kumaran: Okay, so you probably know, like, some of the stuff she’s planning, yeah.

561 01:03:48.370 01:03:48.940 Amber Lin: Yeah.

562 01:03:49.330 01:03:49.960 Uttam Kumaran: I’m also, like.

563 01:03:49.960 01:03:50.880 Amber Lin: Like, the offsite.

564 01:03:51.170 01:03:54.750 Uttam Kumaran: Yeah, I told her too. I think we’ll try to do it this year.

565 01:03:54.890 01:03:56.059 Uttam Kumaran: I basically said.

566 01:03:56.190 01:04:02.040 Uttam Kumaran: if she can tell me how much it’s gonna cost, then I can determine. I mean, I’ll come to LA, I’m gonna come to LA sometime next…

567 01:04:02.040 01:04:03.050 Amber Lin: Yeah.

568 01:04:03.050 01:04:03.790 Uttam Kumaran: or something?

569 01:04:03.790 01:04:06.550 Amber Lin: Book something far further out, maybe, like, in.

570 01:04:06.550 01:04:09.389 Uttam Kumaran: I would like to do it in Austin, everybody’s, like, in the middle, and, like.

571 01:04:09.390 01:04:12.699 Amber Lin: I think we should do in Austin. It’s cheaper to do it in Austin.

572 01:04:12.700 01:04:15.120 Uttam Kumaran: No, it is cheaper to do in Austin, it’s like…

573 01:04:15.330 01:04:19.320 Uttam Kumaran: There’s just a lot of… there’s… there’s a lot of people in this… there’s, like, a lot of people now.

574 01:04:19.500 01:04:25.900 Uttam Kumaran: So it’s, like, flights, hotels, food… I mean, it’s not.

575 01:04:25.900 01:04:26.590 Amber Lin: That’s like…

576 01:04:26.590 01:04:28.820 Uttam Kumaran: It’s… I don’t think it’s, like… I don’t think it’s like that.

577 01:04:28.820 01:04:31.559 Amber Lin: He was his 30K, I don’t think it’s gonna be 30K.

578 01:04:32.350 01:04:36.280 Uttam Kumaran: I… yeah, I wonder… so how many people do we have in the US right now?

579 01:04:36.730 01:04:45.740 Amber Lin: We have you, Robert, me, Hannah, Advit, Jarrell.

580 01:04:45.740 01:04:46.430 Uttam Kumaran: Greg.

581 01:04:46.860 01:04:47.750 Amber Lin: Greg?

582 01:04:47.980 01:04:49.140 Amber Lin: Jasmine?

583 01:04:49.380 01:04:53.009 Amber Lin: Oh, Sam, oh, I guess the… we have 10 people-ish?

584 01:04:53.010 01:04:55.040 Uttam Kumaran: We may have another person next week.

585 01:04:56.690 01:04:58.149 Uttam Kumaran: Does that say 11 people?

586 01:04:58.150 01:04:59.100 Amber Lin: Okay.

587 01:05:00.160 01:05:00.840 Amber Lin: Cool.

588 01:05:00.840 01:05:08.360 Uttam Kumaran: Let’s say… okay, let’s say… let’s say flight is 500, because let’s just, like, let’s just be liberal with it. $500.

589 01:05:08.360 01:05:11.489 Amber Lin: Temporary people, 500.

590 01:05:11.490 01:05:12.970 Uttam Kumaran: It’s like, are we gonna do…

591 01:05:12.970 01:05:13.619 Amber Lin: Oh, let’s do it.

592 01:05:13.880 01:05:16.170 Amber Lin: Let’s do it per person, and then…

593 01:05:16.170 01:05:18.330 Uttam Kumaran: Off-site is, like, during the week, right? So…

594 01:05:18.490 01:05:19.750 Amber Lin: Yeah, so, like.

595 01:05:19.750 01:05:21.569 Uttam Kumaran: Is it 2 days, or how long is it?

596 01:05:21.710 01:05:25.159 Amber Lin: I don’t know, like, 2? It’s like 3 days? Three days seems like…

597 01:05:25.160 01:05:25.899 Uttam Kumaran: go live.

598 01:05:26.690 01:05:27.080 Amber Lin: Yeah, maybe.

599 01:05:27.080 01:05:29.850 Uttam Kumaran: Maybe it’s come in, and then it’s… it’s…

600 01:05:30.740 01:05:35.129 Amber Lin: The last two days of the week, and whatever, if people want to spend the weekend, they can.

601 01:05:35.810 01:05:36.320 Amber Lin: Like.

602 01:05:36.320 01:05:38.229 Uttam Kumaran: Oh, no, that’s a great idea, that’s a great idea.

603 01:05:38.230 01:05:40.259 Amber Lin: Because people like to travel, like, visit…

604 01:05:40.260 01:05:45.539 Uttam Kumaran: So it’s like… Come in Wednesday night, Thursday, Friday.

605 01:05:47.940 01:05:50.109 Amber Lin: And then they can book their own flights out.

606 01:05:50.810 01:05:52.280 Amber Lin: On… on the company.

607 01:05:52.280 01:05:52.730 Uttam Kumaran: Really?

608 01:05:53.060 01:05:54.320 Amber Lin: No, no, no!

609 01:05:54.320 01:05:54.789 Uttam Kumaran: You can book it.

610 01:05:54.790 01:05:55.580 Amber Lin: But pick the.

611 01:05:55.580 01:05:56.290 Uttam Kumaran: It’s funny.

612 01:05:56.290 01:05:56.820 Amber Lin: If they want.

613 01:05:56.820 01:05:57.539 Uttam Kumaran: Yeah, yeah, okay.

614 01:05:57.540 01:05:57.920 Amber Lin: Totally.

615 01:05:57.920 01:05:58.670 Uttam Kumaran: Okay, sure.

616 01:05:58.670 01:06:01.050 Amber Lin: the weekend, that’s all… that’s their free time.

617 01:06:01.050 01:06:05.820 Uttam Kumaran: 500 for flights, two days… in a hotel.

618 01:06:06.290 01:06:07.110 Uttam Kumaran: It’s probably, like.

619 01:06:07.110 01:06:11.910 Amber Lin: Or, like, a big, big Airbnb. It might save more if it’s just a big Airbnb.

620 01:06:11.910 01:06:16.230 Uttam Kumaran: But what would you think that is? Is that, like, that’s, like, still probably, like, 200 bucks a person?

621 01:06:16.230 01:06:18.660 Amber Lin: I… yeah… probably.

622 01:06:18.660 01:06:24.040 Uttam Kumaran: Hotel is probably 200 bucks a person, so… And then food per person?

623 01:06:24.040 01:06:24.780 Amber Lin: day.

624 01:06:25.730 01:06:34.190 Amber Lin: Yeah, plus… like… two… how many meals is that?

625 01:06:35.610 01:06:38.690 Uttam Kumaran: 1, 2, 3, 4, 5, 6, 7.

626 01:06:39.810 01:06:42.010 Amber Lin: 7 times 20 worth reading.

627 01:06:42.980 01:06:45.840 Uttam Kumaran: 7… 7 times, let’s say it’s,

628 01:06:46.730 01:06:50.319 Uttam Kumaran: Let’s say it’s 30 bucks a meal. Wow. You know?

629 01:06:50.320 01:06:53.680 Amber Lin: Okay, so that’s, like, 200… 200ish.

630 01:06:54.330 01:06:55.010 Amber Lin: Plus…

631 01:06:55.010 01:06:58.009 Uttam Kumaran: No, it’s 200-ish times 11.

632 01:06:59.160 01:07:09.399 Amber Lin: Yeah, I’m doing per person, so plus, like, 400 for lodging. So, for flights, food, and lodging, we’re about a thousand…

633 01:07:09.400 01:07:09.820 Uttam Kumaran: Each.

634 01:07:09.820 01:07:14.189 Amber Lin: person. Each. And then… Okay. So, like, 10…

635 01:07:14.190 01:07:14.770 Uttam Kumaran: 10K.

636 01:07:14.770 01:07:21.930 Amber Lin: 10 to 15K. Yeah. So, like… Up to you guys.
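
Worked out with the rough figures thrown around above (a $500 flight, two hotel nights at about $200, seven meals at about $30, eleven people): $500 + 2 \times 200 + 7 \times 30 = 1{,}110$ per person, and $11 \times 1{,}110 \approx 12{,}200$, which sits inside the 10 to 15K estimate.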

637 01:07:21.930 01:07:25.340 Uttam Kumaran: No, I’m… I wanna do it, like, I’m the one… it was my idea!

638 01:07:25.680 01:07:27.520 Amber Lin: And you don’t have to fly, so you can’t say.

639 01:07:27.520 01:07:36.389 Uttam Kumaran: No, no, no, no, no. I would fly, I would fly. I just… we could do it in LA, it’s just like, I’d rather do it here, but I feel like this is also a good middle ground.

640 01:07:36.510 01:07:43.469 Uttam Kumaran: And yes, it is cheaper. I think it’s all… yeah, 10 to 15 grand, I think we could do. It’s also not every… some of it’s, like, credit card…

641 01:07:43.590 01:07:45.889 Uttam Kumaran: Like, I just think we… I just wanna, like.

642 01:07:46.880 01:07:51.779 Uttam Kumaran: it’s also a lot of… lot of logistics, so we can’t, like, do it two weeks before. I have to do it, like.

643 01:07:53.880 01:07:56.550 Uttam Kumaran: We have to do it, like, at least 3 months before, right?

644 01:07:57.770 01:07:59.509 Amber Lin: Why is… why’s that?

645 01:07:59.510 01:08:06.920 Uttam Kumaran: Because you gotta think about, some people, like, are, like, really grown with families, like, they can’t just, like, fly randomly, like…

646 01:08:06.920 01:08:08.360 Amber Lin: I’m like.

647 01:08:08.360 01:08:16.130 Uttam Kumaran: I feel like I’m… I’m getting closer to there. I’m still like, yo, let’s go to LA for a weekend, but yes, we have to… we have to give people, like.

648 01:08:17.189 01:08:20.399 Uttam Kumaran: Probably at least, like, 90 days, heads up.

649 01:08:20.560 01:08:21.710 Amber Lin: Yeah.

650 01:08:21.710 01:08:26.669 Uttam Kumaran: Because some people have to book babysitters, or kids, or, like, move around, travel.

651 01:08:26.850 01:08:27.929 Amber Lin: That is true.

652 01:08:27.930 01:08:30.740 Uttam Kumaran: So, either way, I mean… Look.

653 01:08:31.350 01:08:42.349 Uttam Kumaran: we have some monster stuff coming down the pipeline. We could be… basically be like, yo, let’s do the offsite in 30 days, whoever can make it, make it, but I would like it to be, like, something where we come in, we’re not, like, everything’s sort of, like.

654 01:08:42.800 01:08:47.090 Uttam Kumaran: Figured out, you know, like, we come in, everybody meets, we, like.

655 01:08:47.390 01:08:50.869 Uttam Kumaran: Do some good work stuff, but then it’s mostly hanging out.

656 01:08:51.000 01:08:53.649 Uttam Kumaran: And we don’t, like, worry about, like.

657 01:08:54.729 01:09:02.629 Uttam Kumaran: we don’t worry about, like, yeah, just, like, the money, or… I don’t want to do, like, a half-assed, cheap-ass thing, you know, because I’ll just come see you guys for, like.

658 01:09:03.200 01:09:04.549 Uttam Kumaran: Couple hundred bucks.

659 01:09:05.340 01:09:09.460 Uttam Kumaran: do dinner. Like, we could do an LA offsite, like, tomorrow, but…

660 01:09:09.560 01:09:12.489 Uttam Kumaran: If we’re gonna do a big US thing, I’d rather it be, like.

661 01:09:12.970 01:09:13.700 Amber Lin: Yeah.

662 01:09:13.700 01:09:15.919 Uttam Kumaran: You know, something… something worthwhile. Okay.

663 01:09:16.080 01:09:21.560 Amber Lin: Yeah, I mean, if Kayla knows it will take, like, it’s mainly logistics, then…

664 01:09:22.290 01:09:25.610 Amber Lin: Maybe… she says she wants to do it next quarter.

665 01:09:25.880 01:09:26.370 Amber Lin: So…

666 01:09:26.370 01:09:28.659 Uttam Kumaran: I also think she’s probably right, but…

667 01:09:29.210 01:09:35.930 Uttam Kumaran: I’m telling you, like, we’re cooking on some stuff. I feel like if the… like, we’re… if some deals, some of these big, big-ass deals close, like.

668 01:09:35.939 01:09:37.919 Amber Lin: They are big deals.

669 01:09:37.920 01:09:40.300 Uttam Kumaran: Yeah, they could go. Like, it could really change around.

670 01:09:40.300 01:09:44.630 Amber Lin: It’s hard to imagine that we went from 5K to this.

671 01:09:44.630 01:09:52.270 Uttam Kumaran: I know. Well, I was telling people, you know, I don’t think I even made… in my first year and a half, I don’t think I made…

672 01:09:52.920 01:09:54.650 Uttam Kumaran: $90,000.

673 01:09:54.830 01:10:03.219 Uttam Kumaran: The business. The business, the whole business, actually, like, the whole business itself. That’s crazy. Not me, I didn’t… I haven’t made anything in, like, years. The whole business itself.

674 01:10:03.640 01:10:04.529 Amber Lin: So, to have…

675 01:10:04.530 01:10:07.369 Uttam Kumaran: One element deal that is $90,000 a month.

676 01:10:07.520 01:10:10.050 Uttam Kumaran: Is, like, out of this world.

677 01:10:10.050 01:10:11.010 Amber Lin: Huh.

678 01:10:11.010 01:10:14.290 Uttam Kumaran: I think we deserve it.

679 01:10:14.290 01:10:16.220 Amber Lin: Like, I’m excited for us!

680 01:10:16.220 01:10:17.309 Uttam Kumaran: You deserve it, you know?

681 01:10:17.900 01:10:18.420 Amber Lin: Huh.

682 01:10:18.420 01:10:25.559 Uttam Kumaran: And that’s why I also, like, I want the people that have been here to start to, like, go up. Like, I don’t want you guys to, like… there’s some people who just want to be ICs.

683 01:10:25.780 01:10:32.100 Uttam Kumaran: Fine. But, like, for the folks like you, and I just want to put people under you, I want to, like, put responsibility

684 01:10:32.220 01:10:34.000 Uttam Kumaran: On your plate, and so, like.

685 01:10:34.370 01:10:38.870 Uttam Kumaran: I don’t know, I feel like that’s what… that’s the right thing to do, and I think, like, that’s…

686 01:10:38.970 01:10:43.940 Uttam Kumaran: You give everyone here more leverage as we grow, and more opportunity to try new things.

687 01:10:45.050 01:10:45.720 Uttam Kumaran: So…

688 01:10:45.720 01:10:58.369 Amber Lin: Exciting, yeah, I’m excited about the new stuff. It aligns with what I want to learn more of and what I’ve already been doing, so… and I liked walking Advit through, like, Cursor and stuff.

689 01:10:58.370 01:10:58.920 Uttam Kumaran: Yeah.

690 01:10:58.920 01:11:02.299 Amber Lin: he’s used that before, so that was fun for me.

691 01:11:03.770 01:11:04.390 Uttam Kumaran: Great.

692 01:11:04.530 01:11:05.250 Uttam Kumaran: Hell yeah.

693 01:11:05.500 01:11:09.049 Amber Lin: Yeah, thank you so much for your time. I know we took a bit longer than we planned.

694 01:11:09.050 01:11:10.060 Uttam Kumaran: Very good, I’m…

695 01:11:10.060 01:11:10.890 Amber Lin: Yeah.

696 01:11:10.890 01:11:12.470 Uttam Kumaran: I’m done for the day, basically.

697 01:11:12.730 01:11:19.450 Amber Lin: Me too. I’m just cleaning up this element spreadsheet, because we just updated a field, and I don’t freak out.

698 01:11:21.010 01:11:24.090 Amber Lin: Yeah, okay, thanks. I’ll talk to you next week.

699 01:11:24.090 01:11:24.920 Uttam Kumaran: Thank you, bye.

700 01:11:24.920 01:11:25.660 Amber Lin: I…