Meeting Title: Andi Roadmapping Date: 2026-03-17 Meeting participants: Pranav Narahari, Samuel Roberts, Brylle Girang


WEBVTT

1 00:01:06.480 00:01:07.470 Samuel Roberts: Hey.

2 00:01:08.400 00:01:09.140 Pranav Narahari: Hey.

3 00:01:10.920 00:01:14.580 Samuel Roberts: Thanks for creating the new meeting, I was just figuring out what was going on there.

4 00:01:14.750 00:01:18.539 Pranav Narahari: Yeah, I think I… as I messaged that, I saw you, you were typing, probably found it.

5 00:01:18.540 00:01:20.510 Samuel Roberts: Yeah. Yeah. Cool, cool.

6 00:01:20.990 00:01:26.510 Pranav Narahari: So I think, Utam, like, he said in, like, the delivery leads chat that he had a 9.

7 00:01:26.770 00:01:30.240 Pranav Narahari: AM, which is 10 a.m. our time, so…

8 00:01:30.280 00:01:31.300 Samuel Roberts: Oh.

9 00:01:31.300 00:01:43.040 Pranav Narahari: He’s probably gonna be a little bit late, so I’m thinking… and I’ve already kind of talked to Bea a lot about this, but I haven’t really talked to you, so we can probably just, like, kind of get started, and then when they join, we can just… they’ll have context already.

10 00:01:43.220 00:01:45.900 Pranav Narahari: Let me…

11 00:01:50.720 00:01:55.329 Pranav Narahari: Yeah, so I’ll send you these docs, too, and then right now, I’ll just kind of…

12 00:01:55.500 00:01:58.100 Pranav Narahari: Tell you, like, high level about them, and then you can, like…

13 00:01:58.460 00:02:04.710 Pranav Narahari: you know, maybe even in this call, we’ll go into, like, the… the details. Let me just get these real quick…

14 00:02:21.060 00:02:24.050 Pranav Narahari: Oh, I, I think I sent them…

15 00:02:24.540 00:02:31.100 Pranav Narahari: earlier than… yeah, yesterday morning, it was, like, the zip file of product definition. It’s in the… let me just…

16 00:02:32.120 00:02:33.279 Pranav Narahari: I’ll reply in chat.

17 00:02:33.280 00:02:34.130 Samuel Roberts: In the channel?

18 00:02:34.320 00:02:35.820 Pranav Narahari: Yeah, yeah, okay.

19 00:02:35.820 00:02:37.710 Samuel Roberts: Yeah. It’ll pop up for you. Oh, cool.

20 00:02:41.400 00:02:43.320 Samuel Roberts: Oh, there it is, yeah, okay, let me pull that down.

21 00:02:43.320 00:02:44.160 Pranav Narahari: Okay.

22 00:02:44.160 00:02:47.499 Samuel Roberts: Yep. Oh, I did download this, but I think I didn't get a chance to go through that.

23 00:02:47.890 00:02:48.970 Pranav Narahari: Yeah, all good.

24 00:02:48.970 00:02:49.810 Samuel Roberts: Okay, cool.

25 00:02:50.270 00:02:56.719 Pranav Narahari: There’s, like, 3, diff… there’s 4 files in there. One of them I probably didn’t need to include. But,

26 00:02:57.070 00:03:03.980 Pranav Narahari: The three things that I think we’ll… we’ll be working on starting in April are…

27 00:03:04.480 00:03:19.980 Pranav Narahari: we talked a little bit about the triage process. I want to… and you kind of saw that whole thread with me, you, and Utam about that. We kind of were going back and forth, just, like, discussing what it should look like. It’s basically going to be exactly what we kind of mapped out in that thread.

28 00:03:20.970 00:03:26.560 Pranav Narahari: What it’s gonna look like is, once it gets to the point of, becoming a triage ticket.

29 00:03:26.840 00:03:33.449 Pranav Narahari: what we’ll first do is we’re going to… okay, we’re gonna assign it to, most likely Janiece.

30 00:03:33.850 00:03:38.850 Pranav Narahari: Because she’s the one that’s just, like, picking up the unassigned tickets anyways. I just want…

31 00:03:39.030 00:03:45.540 Pranav Narahari: them to always be assigned to somebody, so we’re just gonna assign them to Janiece. She will then assess whether this is a…

32 00:03:46.020 00:03:54.990 Pranav Narahari: something that isn’t included within the central doc, or it’s not… or it is included in the central doc, and the issue had something else to do with Andy.

33 00:03:56.330 00:04:10.359 Pranav Narahari: Now, what will change in the process, like, the desired outcome, is going to be that Janiece will never have to then assign it to another trainer internally to actually make the updates into the central doc themselves.

34 00:04:10.360 00:04:22.620 Pranav Narahari: They’ll probably… they’ll probably assign it to a trainer to just, answer the question, because she’ll know who’s probably best to answer a certain, like, question, or add content into the central doc, I should say.

35 00:04:22.730 00:04:27.789 Pranav Narahari: But what I’ve mapped out in that, PRD document.

36 00:04:28.210 00:04:36.550 Pranav Narahari: is how they should be giving us that information about what should be added into the central doc. So, I think it’s more or less like…

37 00:04:38.160 00:04:40.399 Samuel Roberts: Which, which PRD are we looking at specifically?

38 00:04:40.400 00:04:47.719 Pranav Narahari: It should be, like… I forget the name. It’s not the transcripts one, and it’s not the one.

39 00:04:48.560 00:04:50.210 Samuel Roberts: Document update? Is that…

40 00:04:50.700 00:04:51.499 Pranav Narahari: Yeah, probably that one.

41 00:04:51.710 00:04:57.939 Samuel Roberts: Okay, cool. I was just looking for triage, and I didn’t see it, so I was, like, wasn’t sure which one. Okay, that makes sense, now that we’re getting there. Okay.

42 00:04:58.160 00:04:59.800 Pranav Narahari: Yeah, and so…

43 00:05:00.730 00:05:05.259 Pranav Narahari: Yeah, I map out, like, the specific details in there. Let me actually just pull that up so I can…

44 00:05:05.460 00:05:06.890 Pranav Narahari: Just talk more.

45 00:05:07.330 00:05:07.910 Samuel Roberts: Sure.

46 00:05:08.140 00:05:13.099 Pranav Narahari: like, exactly… I think I wrote it out pretty well in there.

47 00:05:20.590 00:05:23.030 Samuel Roberts: So you said, after…

48 00:05:23.690 00:05:28.049 Samuel Roberts: ticket gets created. Is that… are we changing the process that a ticket gets created at?

49 00:05:29.440 00:05:33.990 Pranav Narahari: No, no, that part is still gonna be all the same, so it’s, like, based on thumbs up, thumbs down.

50 00:05:33.990 00:05:37.039 Samuel Roberts: Okay, okay. Just wasn’t sure if that needed to… got it.

51 00:05:41.290 00:05:48.790 Pranav Narahari: Yeah, so if you look at the target workflow, with co-pilot section, that’s where I kind of map out the new process.

52 00:05:48.800 00:05:52.839 Samuel Roberts: Same trigger. Oh, there it is, yep, same trigger, cool. Triage classifies issue as…

53 00:05:56.710 00:05:57.540 Samuel Roberts: Okay.

54 00:05:59.720 00:06:00.760 Brylle Girang: Hello, guys.

55 00:06:01.080 00:06:01.980 Samuel Roberts: Hey!

56 00:06:01.980 00:06:02.750 Pranav Narahari: Hey, B.

57 00:06:05.590 00:06:10.149 Samuel Roberts: Okay, so for this triage classifies issue as, is that a,

58 00:06:11.010 00:06:12.840 Samuel Roberts: That’s what Janiece would be doing?

59 00:06:13.960 00:06:15.350 Samuel Roberts: Yes. Number 2? Okay.

60 00:06:15.350 00:06:15.910 Pranav Narahari: Yeah.

61 00:06:16.320 00:06:21.410 Samuel Roberts: So that’s a human step, and then if there’s… they can submit a proposed content through the triage intake, okay.

62 00:06:25.500 00:06:26.870 Samuel Roberts: Yep, okay.

63 00:06:28.220 00:06:28.990 Brylle Girang: Do you miss a video.

64 00:06:31.690 00:06:32.320 Samuel Roberts: Cool.

65 00:06:34.070 00:06:34.640 Brylle Girang: Oh, hmm.

66 00:06:35.500 00:06:43.000 Pranav Narahari: Yeah, and so y’all are… I’m gonna make some updates to this doc as we speak, so yeah, I’ll specify that Janiece should do step two.

67 00:06:43.000 00:06:48.590 Samuel Roberts: Oh, it doesn’t even have to be that, it just has to be that… I just want to make sure I understood that that was a human step and not a, like, auto step.

68 00:06:48.830 00:06:53.930 Pranav Narahari: Yeah, no, I think it was kind of ambiguous, so I’m glad that we’re adding that context.

69 00:06:54.180 00:06:54.860 Samuel Roberts: Okay.

70 00:06:55.520 00:07:01.130 Samuel Roberts: And then… content gap… Good, good.

71 00:07:01.130 00:07:06.630 Brylle Girang: Are we talking about the… The four goals.

72 00:07:07.070 00:07:11.699 Brylle Girang: the product… product… what do you call that again? Product definition markdowns?

73 00:07:11.820 00:07:13.539 Samuel Roberts: Yeah, the… yes.

74 00:07:13.650 00:07:18.310 Samuel Roberts: I’ve been looking at the ABC2041 document update copilot PRD.

75 00:07:19.990 00:07:20.840 Brylle Girang: Gotcha.

76 00:07:25.290 00:07:25.830 Pranav Narahari: Yeah.

77 00:07:25.830 00:07:33.559 Brylle Girang: And, Pranav, is this supposed to be… is this supposed to be the document that we’re also going to share with them directly? Or is this just supposed to be internal?

78 00:07:34.040 00:07:35.129 Pranav Narahari: This is just internal.

79 00:07:36.090 00:07:37.510 Brylle Girang: Okay, gotcha.

80 00:07:38.000 00:07:57.240 Pranav Narahari: I think what they would need is just, a slide, probably just, like, describing what we’re gonna do, and then, they’ll have a bunch of questions on Thursday when I present it to them, and just for me to give them, like, clear definition of, like, hey, this is what our target… I think they’re mostly going to care about, which they should care about, is just a target workflow.

81 00:07:57.300 00:07:58.539 Pranav Narahari: I just want to.

82 00:07:58.540 00:07:58.940 Brylle Girang: Yeah.

83 00:07:58.940 00:08:03.230 Pranav Narahari: We’re fully aligned with that. How we build things, I don’t really…

84 00:08:03.760 00:08:06.220 Pranav Narahari: You know, it’s not really important to…

85 00:08:06.390 00:08:10.699 Pranav Narahari: To them, of course, like, they don’t care how we build it.

86 00:08:11.110 00:08:20.080 Pranav Narahari: they just want it probably to be the fastest, most reliable, and so it’s kind of on us to just, like… it doesn’t really make sense, I think, to share this PRD. They would just probably

87 00:08:20.190 00:08:22.039 Pranav Narahari: Not be able to read most of it.

88 00:08:24.920 00:08:35.910 Brylle Girang: Yeah, exactly, exactly, I agree. Maybe this slide could just, you know, include the why, why we’re building this, and then the… how it will affect their workflows, and they should be happy with that.

89 00:08:36.470 00:08:37.090 Pranav Narahari: Yeah.

90 00:08:38.150 00:08:38.990 Brylle Girang: Okay.

91 00:08:39.250 00:08:49.240 Brylle Girang: Yeah, I went through this, that was my only feedback, like, if this is going to be external, the why should be first, but since this is internal, this is…

92 00:08:49.750 00:08:54.199 Brylle Girang: fairly good, and I like the idea, especially the…

93 00:08:55.640 00:09:03.350 Brylle Girang: the streamlined workflow when it comes to updating the central docs, that’s going to be the most exciting for them, for sure.

94 00:09:03.490 00:09:04.050 Samuel Roberts: Yeah.

95 00:09:04.050 00:09:13.050 Pranav Narahari: Yeah, they’ve already mentioned how that was a little bit tedious. Yeah, Sam, we can probably, like, talk about… I mean, I don’t know. B, did you have a certain, like…

96 00:09:13.400 00:09:18.630 Pranav Narahari: I know you kind of gave a little bit of an agenda in the meeting notes, but I think we’re kind of already…

97 00:09:18.920 00:09:23.119 Pranav Narahari: passed a lot of that, stuff, so I’m wondering…

98 00:09:23.370 00:09:29.959 Pranav Narahari: Did you kind of want us to go through the specifics, like, of, like, the linear tickets and stuff, or,

99 00:09:30.970 00:09:45.269 Pranav Narahari: what I was thinking what we could do is just, like, Sam and I could discuss a little bit and, like, get your input on, like, how are we going to actually execute on this? Like, so with all the linear tickets, I’ve set up, an additional field about, like, how…

100 00:09:46.180 00:09:57.830 Pranav Narahari: how much we can… I feel like we can deploy confidently to, like, a cloud agent to complete, to even further, like, speed up our process to, like, shipping this. So…

101 00:09:57.920 00:10:16.429 Pranav Narahari: We could talk about that. Also, what I was hoping to talk about was just, like, okay, we have these 3 plans, how are we then going to, how should we, like, prioritize them, and how fast should we go? I talked a little bit to Utam yesterday, and he was just like, dude, like, based on how this

102 00:10:16.500 00:10:18.340 Pranav Narahari: Client is…

103 00:10:18.600 00:10:24.109 Pranav Narahari: how our deal with this client is structured is we should try to get to that highest tier, like, ASAP.

104 00:10:24.230 00:10:25.919 Pranav Narahari: So, if you feel like you could.

105 00:10:25.920 00:10:26.460 Brylle Girang: Yeah.

106 00:10:26.770 00:10:28.150 Pranav Narahari: I don’t want you to, like

107 00:10:28.590 00:10:39.550 Pranav Narahari: slow down on these three things, just because, you know, we want to minimize, like, time spent on ABC, so they can… so Mustafa and Casey can have time working on platform.

108 00:10:39.550 00:10:40.129 Samuel Roberts: Yeah, no.

109 00:10:40.130 00:10:42.170 Pranav Narahari: I think… I think the idea here is, like.

110 00:10:42.440 00:10:53.929 Pranav Narahari: let’s… if we have clear vision on, like, okay, these things are gonna result in higher usage, then we should just try to bang out all three of these things, like, ASAP, and as soon as possible.

111 00:10:57.260 00:11:15.399 Brylle Girang: Yeah, I think I’m good with that. I would prefer if we, like, dedicate this month or April, if we dedicate all our resources just for one month, and then get the higher tier, and then we can dedicate the later months for platform and the other stuff that Utam wants to prioritize.

112 00:11:15.510 00:11:28.340 Brylle Girang: But, yeah, I think our hardest priority for now is to make sure that we increase usage, increase adoption, increase our tier, so that we can have earlier impact when it comes to the tiers.

113 00:11:28.820 00:11:33.850 Brylle Girang: Instead of, like, spreading our efforts into 3 months, just to…

114 00:11:33.850 00:11:34.400 Samuel Roberts: No.

115 00:11:34.990 00:11:35.700 Brylle Girang: Just to meet.

116 00:11:35.910 00:11:38.180 Brylle Girang: The other priorities, right?

117 00:11:38.790 00:11:41.730 Pranav Narahari: Yeah, so… Yep.

118 00:11:42.580 00:11:57.229 Brylle Girang: Yeah, I think… I don’t need to be really involved in how you do things. What I really want to see is the timeline, like, the allocations, how much do we need, what do we need to adjust.

119 00:11:57.310 00:12:04.210 Brylle Girang: Do we have enough hours to finish this study within a month, etc. Those are the things that I really want to see.

120 00:12:05.310 00:12:10.199 Pranav Narahari: Okay, cool. So yeah, maybe Sam and I could just, like, work on that, how…

121 00:12:10.390 00:12:24.100 Pranav Narahari: To just… because, yeah, I put in the total development effort for each one of these things, Sam, so we can go over, like, do these seem like accurate estimates? And then after that, we can…

122 00:12:24.260 00:12:25.919 Pranav Narahari: Talk about just, like.

123 00:12:26.380 00:12:42.889 Pranav Narahari: Yeah, I think system design is one thing I want to talk about, just, like, let’s try to really, like… we don’t want this to be, like, breaking, and we’re constantly, like, creating patches for it, so, like, let’s really try to just, like, stress test this system design, like, let’s think through, like, what edge cases we might be missing.

124 00:12:43.600 00:12:44.800 Pranav Narahari: So…

125 00:12:45.330 00:12:59.309 Pranav Narahari: Yeah, let’s do that, and then what we can do is, like, okay, see how that affects estimates, and then, based on that, we can just start roadmapping. Okay, now we have, like, the full estimates, we know how much each section for each project is gonna take.

126 00:12:59.360 00:13:05.749 Pranav Narahari: then we can talk about, okay, how do we deploy this to, like, me, you, Mustafa, Casey, and then specifically, like.

127 00:13:05.750 00:13:21.609 Pranav Narahari: Also, I’m really interested in, like, how we can segment certain things like this into just cloud agents. Like, it does it fully, or it’s just, like, we create the linear tickets, and as soon as we create the linear tickets, it just creates, like, PRs per ticket, and then it does as much as it can

128 00:13:21.650 00:13:28.190 Pranav Narahari: And then so Mustafa, Casey, you and I are always picking up tickets that have already been worked on by the cloud agent.

129 00:13:28.730 00:13:31.590 Samuel Roberts: Yeah, ideally, that would be, you know, as many of those as possible.

130 00:13:31.900 00:13:44.789 Pranav Narahari: Yeah, and I think this would be, like, a cool, just, like, pilot for that type of process. So then, like, for all future projects that we work on, like, we’ll learn a lot from this, but then we’ll have, for the next project, like, we’ll be even more efficient.

131 00:13:44.960 00:13:56.480 Pranav Narahari: And so we can even talk a little bit about that, like, okay, how do we envision adding cloud agents to, like, help us ship this… ship these, ship these tickets?

132 00:13:58.080 00:13:59.759 Samuel Roberts: Cool. Okay, yeah, let’s dig in.

133 00:14:00.050 00:14:01.020 Samuel Roberts: Cool. Yeah.

134 00:14:01.020 00:14:04.929 Brylle Girang: Cool. I’ll let you two handle that, and then just let me know if you need any help.

135 00:14:05.180 00:14:19.300 Brylle Girang: When it comes to, like, the product management, Pranav, I think our current options are really amazing. Just let me know if you need help when it comes to ideating what the customer service team needs, okay?

136 00:14:19.780 00:14:20.770 Pranav Narahari: Yeah, totally.

137 00:14:21.940 00:14:22.909 Brylle Girang: Thank you both, bye-bye.

138 00:14:27.030 00:14:27.919 Pranav Narahari: Cool, cool.

139 00:14:27.920 00:14:28.520 Samuel Roberts: True.

140 00:14:30.810 00:14:39.679 Pranav Narahari: let’s maybe… yeah, step back a little bit, just so I can… I think you understand this, like, tran… the document update one, right?

141 00:14:40.270 00:14:43.420 Pranav Narahari: Let’s go over maybe just,

142 00:14:44.400 00:14:52.439 Pranav Narahari: the… the real dashboards. Real dashboards are pretty simple, like, I just kind of have, like, a bunch of different dashboard ideas.

143 00:14:52.880 00:14:53.490 Samuel Roberts: Okay.

144 00:14:53.640 00:15:06.400 Pranav Narahari: And so we can just talk about, like, how I feel like the implementation for each of these dashboard ideas are gonna differ. I think they’re all gonna be, like, pretty lightweight, but it’s, like, 7 different dashboards that I feel like we can do pretty quickly.

145 00:15:07.320 00:15:16.520 Pranav Narahari: And then… but the more interesting one is the transcript. So, it’s a transcript-driven adoption analysis. Okay.

146 00:15:17.050 00:15:31.239 Pranav Narahari: So how this one is gonna work is that right now, Yvette has told me that all the CSR’s conversations are being… the transcripts are being tracked and, like, saved. So, given that, what we can do is…

147 00:15:32.070 00:15:37.189 Pranav Narahari: let’s… let’s honestly… let’s just go through what I have down here.

148 00:15:37.540 00:15:55.999 Pranav Narahari: Yeah. So, yeah, current workflow, transcript usage signals exist in separate places. Teams can adopt… can see adoption patterns at a high level, but missed opportunities are not consistently quantified. Department leaders do not receive a weekly prioritized list of where Andy should have been used. There’s no automated

149 00:15:56.230 00:15:59.850 Pranav Narahari: There’s no standardized outcome bucket to separate

150 00:16:00.050 00:16:14.049 Pranav Narahari: adoption issues from product readiness gaps. Yeah. So, basically, like, we have these transcripts, there’s not much information that they’re taking from them, which… and I feel like they’re kind of a goldmine to, like, understand why

151 00:16:14.830 00:16:18.279 Pranav Narahari: Like, why the usage hasn’t been super high.

152 00:16:18.380 00:16:22.649 Pranav Narahari: Also, what I want to add to this, and it’s not really…

153 00:16:23.140 00:16:31.599 Pranav Narahari: Something for them, but it’s gonna give us, like, a… a ceiling for right now of, like, where can we see usage actually getting to?

154 00:16:31.770 00:16:32.640 Pranav Narahari: Right?

155 00:16:32.640 00:16:33.770 Samuel Roberts: Mmm, yeah.

156 00:16:34.060 00:16:37.839 Pranav Narahari: Because we’ll get all the transcripts from every single CSR, we’ll understand, like, okay.

157 00:16:37.950 00:16:43.250 Pranav Narahari: the maximum amount of usage that we can get is X amount. I feel like right now we don’t have that metric.

158 00:16:45.320 00:16:51.090 Samuel Roberts: Like, how many… calls to Andy we could expect in a day, or a week, or whatever.

159 00:16:51.090 00:16:52.180 Pranav Narahari: Yeah, exactly.

160 00:16:52.180 00:16:56.579 Samuel Roberts: on the current calls, if they were to be used maximally. Yeah, okay, I think that makes sense.

161 00:16:56.770 00:17:05.500 Pranav Narahari: I mean, right now, we for sure know that there’s room for improvement. Right, right. But at some point, we don’t wanna… we need to know, like, what is the max, like…

162 00:17:05.670 00:17:13.239 Pranav Narahari: if everyone is using Andy for everything, we can assume that they’re gonna use it more, unless they hire more people, or… that’s a totally different problem.

163 00:17:14.160 00:17:22.490 Samuel Roberts: Yeah, I think the insight there could be the types of calls, and, like, maybe figuring out where Andy is lacking right now.

164 00:17:22.970 00:17:29.660 Samuel Roberts: Like, if there’s things beyond central docs and zips that we’re not really covering, Yep.

165 00:17:29.660 00:17:30.839 Pranav Narahari: But that’s kind of… a little bit.

166 00:17:30.840 00:17:36.340 Samuel Roberts: There’s a ton of exchanges. When I was pulling transcripts, it was… it was a lot, so…

167 00:17:36.340 00:17:36.790 Pranav Narahari: That’s sick.

168 00:17:36.790 00:17:37.480 Samuel Roberts: money there.

169 00:17:37.480 00:17:42.029 Pranav Narahari: Yeah, so what you can probably help me with here is just, like, I don’t have context for, like.

170 00:17:42.360 00:17:47.659 Pranav Narahari: where we’re even at with transcripts. I didn’t even know that you’ve had the ability to look at transcripts yet, so…

171 00:17:47.660 00:17:56.560 Samuel Roberts: Yeah, so I… as part of, like, the… I think the reason it was on hold is because there was going to be a separate SOW for this kind of work, and we were just kind of preliminarily, like.

172 00:17:57.050 00:18:03.520 Samuel Roberts: Doing a quick bit of analysis on it, but we were trying to get, another Scope, basically.

173 00:18:03.860 00:18:06.959 Pranav Narahari: Yo, Sam, give me just one second, I’ll be right back.

174 00:18:07.290 00:18:08.529 Samuel Roberts: Sure, yeah, no problem.

175 00:18:54.450 00:18:55.720 Pranav Narahari: Hey, I’m back.

176 00:18:55.990 00:18:56.520 Samuel Roberts: Okay.

177 00:18:56.880 00:18:58.800 Pranav Narahari: Yeah, you were saying about, like, transcripts?

178 00:18:59.130 00:19:05.700 Samuel Roberts: Yeah, so the current state is basically… I… have a…

179 00:19:06.960 00:19:15.389 Samuel Roberts: script that, it’s a little weird, the way… so, this is 8x8. I don’t know how much context you have into, like, what the tools they’re using are yet, but…

180 00:19:15.540 00:19:18.410 Samuel Roberts: 8x8 is, like, their…

181 00:19:19.150 00:19:30.239 Samuel Roberts: I don’t know exactly what it is, the call system or whatever, but that’s where the transcripts go. There’s some stuff there beyond that, there’s some metadata, but basically I have to do this, like, batch download kind of thing.

182 00:19:30.390 00:19:31.250 Samuel Roberts: Yup.

183 00:19:31.510 00:19:32.400 Samuel Roberts: Which…

184 00:19:32.650 00:19:42.709 Samuel Roberts: is not crazy slow, but there are a ton of interactions. So, that’s why, like, I pulled, I think, like, a day, and it was, like, several thousand.

185 00:19:44.420 00:19:54.390 Samuel Roberts: they need to be filtered a little bit, because not all of those are necessarily, like, the types of CSR calls that we’re looking for. Some are just, like, reception, some are, like, transfer, you know, like, it’s everything.

186 00:19:54.550 00:20:03.630 Samuel Roberts: But I was able to… initially, I wasn’t sure if I could get all that information out of there. I was able to figure out where the metadata lived, find that metadata, source the right ones, download them.

187 00:20:03.950 00:20:09.639 Samuel Roberts: load them, they come in some weird forms, where it’s all just, like, JSON, and it’s, like.

188 00:20:09.840 00:20:12.650 Samuel Roberts: Almost words by word,

189 00:20:14.090 00:20:25.080 Samuel Roberts: and I have, like, a little script that reflows it into, like, a transcript you can read, but I really… but everything that’s in BigQuery right now is just dumped JSON from the,

190 00:20:25.750 00:20:27.720 Samuel Roberts: the output from 8x8.

191 00:20:27.980 00:20:29.960 Samuel Roberts: So… .
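The reflow step Sam describes here (word-by-word JSON from 8x8 stitched back into a readable transcript) could look something like the sketch below. The field names and input shape are assumptions for illustration; the real 8x8 export schema isn't shown in the call.

```python
# Hypothetical sketch of reflowing word-level transcript JSON into
# one line per speaker turn. Field names ("speaker", "text") are
# assumptions, not the actual 8x8 schema.
import json

def reflow(words):
    """Collapse consecutive same-speaker word entries into turns."""
    turns = []
    for w in words:
        if turns and turns[-1][0] == w["speaker"]:
            turns[-1][1].append(w["text"])
        else:
            turns.append((w["speaker"], [w["text"]]))
    return "\n".join(f"{spk}: {' '.join(toks)}" for spk, toks in turns)

raw = json.dumps([
    {"speaker": "CSR", "text": "How"}, {"speaker": "CSR", "text": "can"},
    {"speaker": "CSR", "text": "I"}, {"speaker": "CSR", "text": "help?"},
    {"speaker": "Caller", "text": "Billing"},
    {"speaker": "Caller", "text": "question."},
])
print(reflow(json.loads(raw)))
```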

192 00:20:30.250 00:20:39.689 Pranav Narahari: So, right now, BigQuery, it’s not filtered based on, like, relevant conversations yet. Some of them are still, like, HR conversations.

193 00:20:39.690 00:20:41.449 Samuel Roberts: Some of them are, yeah, whatever…

194 00:20:42.030 00:20:54.450 Samuel Roberts: And it kind of shows when there’s in… I might have filtered by inbound versus outbound, I don’t remember at this point now. I can double check that, but… yeah, and it’s also… it’s, like, historical stuff right now. I don’t know how quick things end up there, I can…

195 00:20:54.820 00:21:04.079 Samuel Roberts: probably take a look at that, because I was just doing, like, I was gonna try to do, like, January, because this is, like, mid-February, I was working on this, so…

196 00:21:04.480 00:21:07.270 Samuel Roberts: was like, let me pull all of January, and that was gonna be…

197 00:21:08.360 00:21:11.900 Samuel Roberts: A lot. And so I was like, let me pull, like, the last 3 days.

198 00:21:12.030 00:21:31.810 Samuel Roberts: days, or two days a day. I hadn’t actually looked at what the most, like, up-to-date stuff in the system is, so that might be something we need to find out, just to make sure, like, how quickly these things come in. Are we able to do… I’m sure, like, the past week is probably there, I just don’t know how, like, real-time it is, so I don’t think it’s… I don’t think it’ll be a problem, but I would love to confirm that.

199 00:21:31.940 00:21:39.180 Samuel Roberts: And then, yeah, like, I think we can probably do some things… Like, the transcripts will have… the…

200 00:21:39.390 00:21:43.319 Samuel Roberts: ID of the agent, or, like, a way to identify the agent.

201 00:21:43.550 00:21:47.559 Samuel Roberts: It might even have some of the info of the caller, I don’t know.

202 00:21:47.710 00:21:53.590 Samuel Roberts: But the timing of it is where I was kind of thinking we could correlate to Andy usage.

203 00:21:53.930 00:22:01.509 Samuel Roberts: Because we know who’s using Andy, we know when they’re using Andy, and we know what time they’re on certain calls. So I’m hoping that that means we can correlate some of that stuff.

204 00:22:03.710 00:22:13.220 Samuel Roberts: But for what you were saying about just, like, understanding where Andy could be used, we could probably do that even without correlating, but that would definitely help as well, so…
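The timing correlation Sam is proposing (match each Andy query to the call the agent was on at that moment) could be sketched roughly as below; the data shapes are illustrative assumptions, not the real tables.

```python
# Sketch of correlating Andy usage to calls by timestamp: a query
# counts as "during a call" if its timestamp falls inside one of
# that agent's call windows. Data shapes here are assumptions.
from datetime import datetime

def queries_during_calls(calls, queries):
    """calls: {agent: [(start, end), ...]}; queries: [(agent, ts), ...]."""
    hits = []
    for agent, ts in queries:
        for start, end in calls.get(agent, []):
            if start <= ts <= end:
                hits.append((agent, ts))
                break
    return hits

t = lambda s: datetime.fromisoformat(s)
calls = {"csr1": [(t("2026-03-17T09:00"), t("2026-03-17T09:20"))]}
queries = [("csr1", t("2026-03-17T09:05")),   # during the call
           ("csr1", t("2026-03-17T10:00"))]   # outside any call
print(len(queries_during_calls(calls, queries)))  # prints 1
```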

205 00:22:14.600 00:22:21.339 Pranav Narahari: Yeah, and I think with transcripts, like, it really opens a floodgate for how much optimization we could do.

206 00:22:21.340 00:22:21.830 Samuel Roberts: Yeah.

207 00:22:21.830 00:22:24.919 Pranav Narahari: So, what I did here was just, like, really narrowed the scope.

208 00:22:25.100 00:22:34.709 Pranav Narahari: However, I think I am not giving enough credit to, like, how complex this process is, so we’ll… I think this is one thing that we’ll have to expand, it’s…

209 00:22:34.710 00:22:35.230 Samuel Roberts: Definitely.

210 00:22:35.230 00:22:43.069 Pranav Narahari: A little bit, because, like, once we have the… the relevant data,

211 00:22:43.390 00:22:51.730 Pranav Narahari: that will be a lot easier going forward, but, like, I think pulling in that data is gonna be a little bit of, like, a… it sounds like a little bit of a…

212 00:22:52.740 00:23:00.980 Pranav Narahari: data ingestion issue a little bit, just because, like, things are happening, like, super… super quickly, and…

213 00:23:00.980 00:23:08.610 Samuel Roberts: No, and I think there are definitely ways to automate that a bit more. I was doing things very manual on my computer just to pull transcripts, but…

214 00:23:08.930 00:23:09.510 Pranav Narahari: Fuck.

215 00:23:10.170 00:23:13.070 Samuel Roberts: Automating that in a way that will be,

216 00:23:14.040 00:23:20.530 Samuel Roberts: You know, run every night, pull in yesterday’s transcripts, potentially, like… Put in BigQuery…

217 00:23:20.530 00:23:21.160 Pranav Narahari: Yeah.

218 00:23:21.270 00:23:28.560 Pranav Narahari: process. So how I feel about this is that… yeah, so I can kind of talk through a little bit of, like, the target workflow. Sure.
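The nightly automation Sam floats just above (run every night, pull yesterday's transcripts, load to BigQuery) could be sketched like this; the fetch/load calls are hypothetical placeholders, not real 8x8 or BigQuery APIs, but the date-window logic is concrete.

```python
# Sketch of a nightly batch job: compute yesterday's window, then
# fetch and load that batch. fetch_transcripts / load_to_bigquery
# are hypothetical stand-ins for the real 8x8 export and BQ load.
from datetime import date, datetime, time, timedelta

def yesterday_window(today=None):
    """Half-open [midnight yesterday, midnight today) window."""
    today = today or date.today()
    start = datetime.combine(today - timedelta(days=1), time.min)
    end = datetime.combine(today, time.min)
    return start, end

def nightly_job(fetch_transcripts, load_to_bigquery, today=None):
    start, end = yesterday_window(today)
    load_to_bigquery(fetch_transcripts(start, end))

start, end = yesterday_window(date(2026, 3, 17))
print(start.isoformat(), end.isoformat())
# prints 2026-03-16T00:00:00 2026-03-17T00:00:00
```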

219 00:23:28.920 00:23:32.770 Pranav Narahari: basically, I want this, like, on a weekly basis, like, so weekly transcript batch.

220 00:23:32.770 00:23:33.430 Samuel Roberts: Okay.

221 00:23:33.540 00:23:34.860 Pranav Narahari: Grouped by department.

222 00:23:35.120 00:23:43.629 Pranav Narahari: So we need to figure out how we can group by department. That might be easy if we know, like, who is being called, right? Because we have all the CSRs, like…

223 00:23:43.910 00:23:55.029 Pranav Narahari: grouped by department at this point. And if we don’t, like, well, we can get that pretty easily. And then what we do with each transcript is there’s gonna be in a…

224 00:23:55.430 00:23:56.800 Pranav Narahari: a prior step.

225 00:23:57.400 00:24:10.700 Pranav Narahari: So it’s… I said for each transcript, CSR questions are extracted and normalized into, canonical intents. Each extracted question is passed to Andy for answer evaluation. So I would say CSR questions, or…

226 00:24:10.820 00:24:24.480 Pranav Narahari: For each transcript, first, we assess… if this… is a relevant… question for… Andy?

227 00:24:24.680 00:24:26.280 Pranav Narahari: And by relevant…

228 00:24:26.280 00:24:26.880 Samuel Roberts: Oh.

229 00:24:27.030 00:24:30.320 Pranav Narahari: I’m gonna be like, is it not HR-related?

230 00:24:31.000 00:24:39.610 Pranav Narahari: Not HR-related… Or… Some other type of phone call.
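The per-transcript steps being dictated here (first decide whether a call is even relevant for Andy, then extract the CSR's questions) could be stubbed out as below. The keyword rules and line format are toy placeholder assumptions; in practice this classification would presumably be an LLM or trained model, not string matching.

```python
# Toy sketch of the per-transcript triage: relevance filter first
# (screen out HR and other non-CSR call types), then extract the
# questions asked. Marker list and format are illustrative only.
IRRELEVANT_MARKERS = ("payroll", "benefits", "time off", "transfer you")

def is_relevant(transcript: str) -> bool:
    """Crude stand-in for 'is this a relevant question for Andy?'"""
    text = transcript.lower()
    return not any(m in text for m in IRRELEVANT_MARKERS)

def extract_questions(transcript: str):
    """Crude stand-in for 'CSR questions are extracted and normalized'."""
    return [line.split(": ", 1)[1]
            for line in transcript.splitlines()
            if line.startswith("Caller:") and line.rstrip().endswith("?")]

call = "Caller: What is the warranty period?\nCSR: Let me check."
print(is_relevant(call), extract_questions(call))
```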

231 00:24:45.240 00:24:48.740 Pranav Narahari: Yeah, let me maybe just share my screen as I make these changes, too.

232 00:24:48.740 00:24:51.780 Samuel Roberts: Okay, yeah, yeah, I have it up, but that would be ideal.

233 00:24:51.780 00:24:52.560 Pranav Narahari: Yeah.

234 00:25:02.240 00:25:06.120 Samuel Roberts: And then, when you say CSR questions are extracted,

235 00:25:06.790 00:25:12.260 Samuel Roberts: Those are, like, what could have been potential questions to ask Andy, like, what they had to look up.

236 00:25:12.760 00:25:13.450 Pranav Narahari: Yeah.

237 00:25:14.890 00:25:16.129 Pranav Narahari: And I’ll put the…

238 00:25:27.190 00:25:34.319 Pranav Narahari: Yeah, because based on, like, the conversation, we’ll be able to assess, like, okay, what did the CSR actually have to look up?

239 00:25:36.260 00:25:44.370 Pranav Narahari: Yeah, so it’s kind of like what is the… The customer’s question, but… I think this is…

240 00:25:44.370 00:25:48.849 Samuel Roberts: Yeah, I think that, yeah, translating that into, like, what could Andy have answered here for them?

241 00:25:50.510 00:25:59.099 Pranav Narahari: Yeah, so part of it’s, like, translating the customer’s question into, like, what they’re actually asking, and, like, how it’s saved within their internal files.

242 00:26:00.800 00:26:01.460 Pranav Narahari: Okay.

243 00:26:01.750 00:26:05.679 Pranav Narahari: So, each extracted question is passed to Andy for answer evaluation.

244 00:26:06.590 00:26:13.750 Pranav Narahari: Andy’s answer is cross-checked against historically answered similar CSR questions.

245 00:26:14.010 00:26:17.060 Pranav Narahari: Yeah, so… now this is where we kind of…

246 00:26:17.200 00:26:22.760 Pranav Narahari: do the assessment of, like, okay, each question that a CSR is asking, let’s pass it through Andy to see

247 00:26:23.110 00:26:30.469 Pranav Narahari: Basically, yeah, we see… Is this a question that we’re currently handling?

248 00:26:31.890 00:26:38.669 Pranav Narahari: And now we assess, like, the answer of it. So we put it through Andy, it’s gonna give us some response, right?

249 00:26:38.840 00:26:43.330 Pranav Narahari: Sometimes it’s gonna… it’s gonna tell us very quickly, like, hey, like.

250 00:26:43.690 00:26:46.989 Pranav Narahari: this… we don’t have a good answer for you? Okay, so that means…

251 00:26:47.540 00:26:55.880 Pranav Narahari: That’s where we kind of get into these, evalu… like, these outcome buckets. So…

252 00:26:56.000 00:27:06.620 Pranav Narahari: basically, if a question is asked, and we’re able to confirm based on, like, historically asked questions, like, let’s say a different CSR asked the same question to Andy, and they said it was a

253 00:27:06.820 00:27:15.299 Pranav Narahari: it was a correct response, then we can confidently say that that’s a case where this CSR should always ask

254 00:27:15.490 00:27:18.879 Pranav Narahari: Andy, for this type of question, now.

255 00:27:19.220 00:27:24.940 Pranav Narahari: And then, there’s gonna be certain questions, probably, like.

256 00:27:27.170 00:27:29.900 Pranav Narahari: Where we’re just, like, within, like, certain…

257 00:27:30.990 00:27:37.100 Pranav Narahari: Maybe there’s certain things that they need to look up, That we’re like… These are so case-by-case.

258 00:27:37.550 00:27:42.229 Pranav Narahari: And it’s not something that we feel ready that Andy should support right now.

259 00:27:42.230 00:27:47.980 Samuel Roberts: I see what you’re saying? Yeah, yeah. So, like, something… there are things that Andy’s good for now, things that Andy’s not…

260 00:27:48.410 00:27:51.830 Samuel Roberts: good for, but things that Andy could be good for.

261 00:27:51.830 00:27:55.849 Pranav Narahari: Exactly. So, Andy not ready. Yeah. Could be good for.

262 00:27:56.540 00:27:57.750 Samuel Roberts: Didn’t make sense. Okay.
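The three outcome buckets being sketched here could be captured as one small classification step. A minimal sketch; the bucket names and the two input flags are illustrative assumptions, not an agreed schema:

```python
from enum import Enum

class Bucket(Enum):
    ANDY_READY = "andy_ready"        # Andy answers this correctly today
    COULD_BE_GOOD = "could_be_good"  # in scope, but the answer needs work
    NOT_READY = "not_ready"          # too case-by-case to support right now

def classify(answered_correctly: bool, in_scope: bool) -> Bucket:
    """Toy bucketing rule: a correct answer means Andy is ready; in-scope
    but wrong means improvable; out of scope means not ready."""
    if answered_correctly:
        return Bucket.ANDY_READY
    return Bucket.COULD_BE_GOOD if in_scope else Bucket.NOT_READY
```

The "Andy not ready" bucket is what later feeds the doc-improvement analysis.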

263 00:27:58.210 00:28:06.959 Samuel Roberts: Yeah, so for… for the, historically answered, is that something we want to, like, create a…

264 00:28:07.510 00:28:09.230 Samuel Roberts: Like, a dataset of?

265 00:28:09.700 00:28:10.750 Samuel Roberts: We’re here…

266 00:28:11.240 00:28:16.609 Pranav Narahari: I think we already have that. We have a list of all of the different questions Andy’s been asked.

267 00:28:16.760 00:28:19.140 Pranav Narahari: And then also… for…

268 00:28:19.140 00:28:26.600 Samuel Roberts: Right, but do we want to, like, pre-categorize them and everything? Is that what I’m… or… just, like, do a lookup of them based on the current question?

269 00:28:26.840 00:28:30.329 Pranav Narahari: Yeah, I think we just do a lookup, a little lookup, basically.

270 00:28:30.330 00:28:30.970 Samuel Roberts: Okay.

271 00:28:30.970 00:28:31.999 Pranav Narahari: Just like a…

272 00:28:33.260 00:28:39.619 Pranav Narahari: Yeah, just a lookup. We’d probably have to create, like, some internal RAG that is super simple.

273 00:28:39.620 00:28:42.620 Samuel Roberts: Yeah, that’s what I’m thinking, maybe, so we embed the actual questions as well.

274 00:28:42.620 00:28:43.160 Pranav Narahari: Exactly.

275 00:28:43.160 00:28:47.439 Samuel Roberts: And do a vector lookup of similar questions, that could make sense. Okay, cool.

276 00:28:47.440 00:28:47.980 Pranav Narahari: Yep.
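The embed-and-look-up idea could be sketched like this: a toy cosine-similarity search over pre-computed vectors. The 0.85 threshold and the two-dimensional “embeddings” in the test are placeholder assumptions; real vectors would come from whatever embedding model gets chosen:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def lookup(query_vec: list[float], history: dict[str, list[float]],
           threshold: float = 0.85) -> list[tuple[str, float]]:
    """Return historically answered questions whose embedding is close to
    the query, best match first. `history` maps question text -> vector."""
    hits = [(q, cosine(query_vec, v)) for q, v in history.items()]
    return sorted((h for h in hits if h[1] >= threshold),
                  key=lambda h: -h[1])
```

In practice the history side would be the existing log of questions Andy has already been asked, embedded once and stored.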

277 00:28:49.690 00:28:55.810 Pranav Narahari: And then what do I actually want to deliver to the client?

278 00:28:57.600 00:28:59.200 Pranav Narahari: Top 10 missed intents.

279 00:28:59.560 00:29:00.570 Pranav Narahari: per department.

280 00:29:01.710 00:29:05.080 Pranav Narahari: So… What are the top 10 things?

281 00:29:05.910 00:29:08.950 Pranav Narahari: Maybe top 10 categories per department.

282 00:29:09.290 00:29:14.789 Pranav Narahari: that Janice can then go to the CSRs and be like, hey, we’ve looked…

283 00:29:15.040 00:29:22.849 Pranav Narahari: We’ve pulled all of y’all’s transcripts, and there are these top 10 categories, and, you know, 10 is, like, just a random number.

284 00:29:23.040 00:29:27.199 Pranav Narahari: But these are the categories that we feel that,

285 00:29:27.810 00:29:32.489 Pranav Narahari: if y’all were using Andy, you would have gotten this… the right answer, and…

286 00:29:33.180 00:29:35.470 Pranav Narahari: Like, right now, in its current state.

287 00:29:35.660 00:29:36.280 Pranav Narahari: Right.

288 00:29:36.280 00:29:37.030 Samuel Roberts: Right.
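The top-10-per-department deliverable is essentially a grouped count. A minimal sketch, assuming each missed question has already been tagged with a department and a normalized intent; “10” is the placeholder number from the discussion:

```python
from collections import Counter, defaultdict

def top_missed_intents(rows: list[tuple[str, str]], n: int = 10):
    """rows: (department, intent) pairs for questions Andy could have
    answered but wasn't asked. Returns {dept: [(intent, count), ...]}
    with the n most frequent intents per department."""
    by_dept: dict[str, Counter] = defaultdict(Counter)
    for dept, intent in rows:
        by_dept[dept][intent] += 1
    return {d: c.most_common(n) for d, c in by_dept.items()}
```

That per-department ranking is what Janice could take back to the CSR teams.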

289 00:29:37.310 00:29:42.849 Pranav Narahari: And then there’s much more analysis that we get here, too. It’s like, how do we improve Andy?

290 00:29:43.660 00:29:45.120 Pranav Narahari: And so…

291 00:29:45.460 00:29:52.940 Pranav Narahari: from the Andy Not Ready section, like, we’ll basically have a bunch of information there to be like, okay, this is how we can improve the central doc.

292 00:29:54.850 00:29:56.280 Samuel Roberts: Yeah, I like that.

293 00:29:56.630 00:30:00.199 Pranav Narahari: Cool. And then, yeah, on top of this will be a real dashboard.

294 00:30:00.320 00:30:11.459 Pranav Narahari: showing weekly, weekly insights. So, first, what I want to do is a historical analysis, so we can probably do it for, like, the last few months.

295 00:30:12.520 00:30:15.989 Pranav Narahari: And then… A lot of the scripts…

296 00:30:16.190 00:30:24.020 Pranav Narahari: That we build there are going to be… could be… re… like…

297 00:30:24.160 00:30:38.219 Pranav Narahari: revamped, just so, like, we can build, like, an automated process for weekly insights. I really, like, what I want to do here is, like, I don’t want to focus on, like, full end-to-end automation, just off the bat.

298 00:30:38.620 00:30:44.119 Pranav Narahari: I think… The best way to, like, run these type of projects is to have, like.

299 00:30:44.390 00:30:58.779 Pranav Narahari: basically tools that you can just, like, click buttons to, like, run, like, huge amounts of just, steps, but they’re kind of just, like, function as, like, internal agents for us. Kind of like in Cursor, like, you know, we have these skills to, like, do a bunch of shit.

300 00:30:58.780 00:30:59.250 Samuel Roberts: Right, right.

301 00:30:59.250 00:31:04.340 Pranav Narahari: You can have the same thing with, like, these scripts. And then, once we feel like we’ve really, like.

302 00:31:05.290 00:31:17.570 Pranav Narahari: like, defined what this process looks like. Then we can talk about wiring up all these things, so it’s literally just, like, an automated schedule for these weekly insights to arrive in the dashboard.

303 00:31:19.780 00:31:22.170 Samuel Roberts: Yeah, yeah, I think that makes sense. I mean, even…

304 00:31:23.040 00:31:28.000 Samuel Roberts: Even those chunks are probably gonna have to be built in, like, modular pieces anyway, so I think that makes sense.

305 00:31:29.190 00:31:39.819 Samuel Roberts: like, you know, you hit a button, several things kick off in an order, paralleled and serialized, and like, who knows? Yeah, okay. But yeah, like, not tying everything together is probably smart.

306 00:31:40.540 00:31:41.170 Pranav Narahari: Yeah.

307 00:31:41.530 00:31:48.120 Pranav Narahari: And yeah, I just want to maybe call out this part, too, like, real-time transcript injection, not important.

308 00:31:49.130 00:31:53.629 Pranav Narahari: So we don’t really even need to worry about, like, Wiring up, like.

309 00:31:54.110 00:32:01.170 Pranav Narahari: Webhooks, or whatever, to, like, get them into, like, BQ,

310 00:32:01.380 00:32:19.189 Pranav Narahari: even if it’s, like, you know, we run a script in the beginning just to, like, hey, all of these, 8x8 files pull into BQ, like, that’s totally fine for me, for, like, honestly, most of this project. Even we can… even when we deliver it finally, like, it can still be that process, and then…

311 00:32:19.190 00:32:19.810 Samuel Roberts: Yeah.

312 00:32:20.680 00:32:24.859 Pranav Narahari: What we’ll do after we finish this project is, like.

313 00:32:25.040 00:32:43.880 Pranav Narahari: if it just becomes annoying and tedious for us to do, like, every Friday, you know, running these scripts, then we’ll just be like, hey, Janice, Yvette, like, this is kind of taking up a lot of our time, like, we can’t work on, like, future projects, so we’re just gonna spend a few more weeks just, like, fully automating this process for y’all. So it’s just completely out of our hands.

314 00:32:45.800 00:32:47.349 Samuel Roberts: Yeah. I think that’s fine.

315 00:32:47.750 00:32:48.510 Pranav Narahari: Cool.

316 00:32:48.740 00:32:59.239 Pranav Narahari: So… Yeah, this ingestion maybe is gonna be a little bit more diff… and I think question extraction…

317 00:32:59.370 00:33:05.700 Pranav Narahari: And, maybe question normalization might be difficult.

318 00:33:06.270 00:33:08.299 Pranav Narahari: There’ll be a lot of, like,

319 00:33:09.380 00:33:12.709 Pranav Narahari: First of all, there’s a lot of data, and so…

320 00:33:14.050 00:33:16.939 Pranav Narahari: That’s kind of in the ingestion…

321 00:33:17.290 00:33:25.659 Pranav Narahari: But even on top of ingestion, it’s like, okay, how do we… you know, if we get thousands of calls and transcripts, like, we need to build a system that can…

322 00:33:26.960 00:33:32.159 Pranav Narahari: That can actually, that’s not gonna break under that much load, so…

323 00:33:32.160 00:33:32.740 Samuel Roberts: Yeah.

324 00:33:32.740 00:33:44.590 Pranav Narahari: That’s where, you know, I’ll let you, like, fully read this, you know? I’ll be able to digest it better if, like, you kind of can just, like, read it on your own. And then probably just, like.

325 00:33:44.590 00:33:49.310 Samuel Roberts: Yeah, I think you’re right, though. Ingestion… is probably a…

326 00:33:49.790 00:34:05.309 Samuel Roberts: like, it’s not… I have the pieces together in terms of, like, exactly what you’re saying, scripts I can run. I wouldn’t call them necessarily ready to, like, even trigger on a weekly basis yet, but that… that might not be crazy. The… the amount of time that takes is…

327 00:34:06.120 00:34:07.580 Samuel Roberts: not trivial.

328 00:34:09.639 00:34:15.280 Samuel Roberts: So we want to think about that as, like, a when-we-run-it kind of thing, probably over the weekend, maybe.

329 00:34:16.800 00:34:19.490 Samuel Roberts: You know, pull all the last business week kind of thing.

330 00:34:20.409 00:34:20.960 Pranav Narahari: Yeah.
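The “run it over the weekend, pull the last business week” scheduling Samuel describes could be pinned down with a small date-window helper. A sketch, assuming the window is Monday through Friday of the most recently completed business week:

```python
from datetime import date, timedelta

def last_business_week(run_day: date) -> tuple[date, date]:
    """Return (monday, friday) of the most recently completed business
    week, for a job that runs on or after that Friday (e.g. a weekend)."""
    offset = (run_day.weekday() - 4) % 7  # days back to the latest Friday
    friday = run_day - timedelta(days=offset)
    monday = friday - timedelta(days=4)
    return monday, friday
```

For example, a Saturday run on 2026-03-21 yields the 2026-03-16 through 2026-03-20 window, which the ingestion script could then use to filter which 8x8 transcripts to pull.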

331 00:34:20.969 00:34:26.849 Samuel Roberts: extraction and the normalization, I think the other question, in addition to just, like, volume of…

332 00:34:28.329 00:34:39.119 Samuel Roberts: of data is also, like, how good are these transcripts? In terms of, like, transcription accuracy?

333 00:34:39.519 00:34:45.329 Samuel Roberts: I don’t know, really, and so I might be, like, over or underestimating how good they are.

334 00:34:45.589 00:34:50.169 Samuel Roberts: But, you know, I mean… it’s still…

335 00:34:50.309 00:34:56.009 Samuel Roberts: potential for it to get things wrong, you know, like, just… I don’t know, I mean, I’m sure we’ll have to dig into that a little bit more.

336 00:34:56.129 00:34:56.929 Samuel Roberts: Nope.

337 00:34:57.920 00:34:59.870 Pranav Narahari: You know what? I think that’s a good point.

338 00:35:00.230 00:35:01.080 Samuel Roberts: Yeah.

339 00:35:01.080 00:35:06.100 Pranav Narahari: We should be able to assess the accuracy of these transcripts.

340 00:35:06.700 00:35:09.550 Pranav Narahari: And so that should be a little bit of a discovery for this project.

341 00:35:09.900 00:35:25.400 Samuel Roberts: Yeah, yeah, just because I saw… I mean, I don’t remember anything offhand, I was just trying to get the data into BigQuery, but I do remember seeing some things where I’m like, that doesn’t make sense. Like, I know they’re saying ABC Home and Commercial, and it’s getting transcribed wrong, and so I just don’t know what else could be like that, you know?

342 00:35:25.600 00:35:30.780 Samuel Roberts: I don’t know what they’re using for transcription, I don’t know. You know what I mean? I just don’t know where that falls, so I don’t…

343 00:35:31.180 00:35:34.509 Samuel Roberts: Might not be a problem, but definitely part of the discovery process, yeah.

344 00:35:36.760 00:35:37.870 Pranav Narahari: Yeah, this is… this feels like a…

345 00:35:37.870 00:35:43.780 Samuel Roberts: If we can pull out, like, Oh, this user…

346 00:35:44.000 00:35:50.130 Samuel Roberts: or this customer had this kind of question, it might not have to be as granular as I’m even picturing.

347 00:35:50.650 00:35:56.450 Samuel Roberts: It might just be like, oh, this call was addressing rescheduling a, whatever, appointment.

348 00:35:56.700 00:36:06.879 Samuel Roberts: fine. That, you know, the nitty-gritty accuracy might not be relevant there, but if it’s more like, oh, at this point in the call, they should have used this Andy flow, or this Andy question.

349 00:36:08.240 00:36:11.990 Samuel Roberts: that might be a little… could be a little harder. So I’m just thinking… I just want to keep us…

350 00:36:12.430 00:36:17.740 Samuel Roberts: Cognizant of the… Different ways we might have to go about it, depending on that, you know?

351 00:36:23.130 00:36:25.220 Samuel Roberts: I’m just discovering how I can… yeah.

352 00:36:26.290 00:36:27.050 Samuel Roberts: Perfect.

353 00:36:28.170 00:36:33.460 Samuel Roberts: But yeah, even if there’s errors like that, you know, we can probably make certain assumptions that, like.

354 00:36:33.710 00:36:37.549 Samuel Roberts: you know, they’re talking about ABC Home Commercial instead of whatever random thing it said.

355 00:36:37.970 00:36:38.320 Pranav Narahari: Yeah.

356 00:36:38.320 00:36:41.789 Samuel Roberts: ways we can… we can tweak that, but yeah, even this, like, the…

357 00:36:42.200 00:36:51.050 Samuel Roberts: Pass the full transcript in, and have it say, like, this is what the customer wanted, this is how the agent did, might be enough, instead of getting real, like.

358 00:36:51.790 00:36:55.289 Samuel Roberts: real granular with it, but we’ll have to find that out. I think that’s definitely…

359 00:36:56.020 00:36:57.630 Samuel Roberts: Part of the discovery process.

360 00:36:59.310 00:36:59.890 Samuel Roberts: Cool.

361 00:37:02.190 00:37:06.310 Samuel Roberts: And then intent normalization, that’s kind of, yeah, that’s… that’s… I think…

362 00:37:06.460 00:37:13.810 Samuel Roberts: Assuming that question extraction goes fine with this sort of stuff, that’s not… Crazy…

363 00:37:19.320 00:37:21.350 Samuel Roberts: And the evaluation…

364 00:37:22.890 00:37:28.650 Samuel Roberts: Yeah, we might need to think… and this is where I was just saying about the, like, how granular we want to go.

365 00:37:29.650 00:37:36.170 Samuel Roberts: What types of… questions get evaluated by Andy in this flow?

366 00:37:37.520 00:37:40.630 Samuel Roberts: You know, is it…

367 00:37:41.850 00:37:48.780 Samuel Roberts: are we able to get so granular that we know exactly what the CSR would have been able to type into Andy, or is it going to be a little higher level, like…

368 00:37:50.730 00:37:54.450 Samuel Roberts: the customer needed this, couldn’t Andy have provided that?

369 00:37:54.680 00:37:55.380 Samuel Roberts: you know.

370 00:37:57.160 00:38:02.960 Samuel Roberts: like, what’s generating the actual questions that Andy gets fed in this step, is sort of my…

371 00:38:03.100 00:38:04.730 Samuel Roberts: Unknown at this point, too.

372 00:38:06.050 00:38:07.359 Samuel Roberts: Does that make sense?

373 00:38:08.280 00:38:15.140 Pranav Narahari: Yeah, like, so right now, how… how Andy is being used.

374 00:38:16.360 00:38:20.080 Pranav Narahari: It’s that they’re asking a direct question, not just asking about a category.

375 00:38:20.970 00:38:24.360 Samuel Roberts: No, I understand, but I’m saying, like, I don’t know what data we’ll be able to extract.

376 00:38:24.980 00:38:28.469 Pranav Narahari: Oh, yeah, no, I feel like I get what you’re saying, like, I think…

377 00:38:28.610 00:38:39.169 Pranav Narahari: We have to make sure that we’re able to extract the exact question, and for some things, like, we’ll probably have to put, like, a confidence score,

378 00:38:39.170 00:38:39.720 Samuel Roberts: Yeah.

379 00:38:40.310 00:38:44.780 Pranav Narahari: And… the good thing is, though, that we have so much data that…

380 00:38:45.050 00:38:52.110 Pranav Narahari: My… my feeling is that we’ll have at least some things that are gonna meet the bar for, like, the confidence score that could have been…

381 00:38:52.110 00:38:56.619 Samuel Roberts: True, that’s true, you’re right, you’re right. I’m thinking about every individual one, but it doesn’t need to be, it can be…

382 00:38:56.830 00:38:57.710 Samuel Roberts: You know.

383 00:38:58.030 00:38:58.720 Pranav Narahari: Yeah.

384 00:38:59.250 00:39:01.029 Samuel Roberts: A certain subsection of the ones that are.

385 00:39:01.030 00:39:02.440 Pranav Narahari: Think of the goal as being, like.

386 00:39:02.440 00:39:03.300 Samuel Roberts: Yeah, that’s true.

387 00:39:03.610 00:39:06.900 Pranav Narahari: We just need to give some insights every single week that make sense.

388 00:39:08.310 00:39:09.160 Pranav Narahari: We don’t need to get…

389 00:39:09.160 00:39:09.720 Samuel Roberts: Yeah.

390 00:39:11.280 00:39:14.170 Pranav Narahari: an insight to every single CSR’s conversation.

391 00:39:14.860 00:39:17.779 Samuel Roberts: Exactly, exactly. That’s a good way to think about it.
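The confidence-score bar Pranav mentions could be as simple as a filter over the extraction output: only high-confidence extractions make it into the weekly insights, and with thousands of calls even a strict bar should leave enough signal. The 0.8 threshold and the record shape here are illustrative assumptions:

```python
def weekly_insights(extractions: list[dict], min_confidence: float = 0.8) -> list[dict]:
    """Keep only extracted questions whose extraction confidence clears
    the bar. Low-confidence extractions are dropped rather than guessed
    at, since the goal is sensible weekly insights, not full coverage."""
    return [e for e in extractions if e["confidence"] >= min_confidence]
```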

392 00:39:18.870 00:39:22.759 Samuel Roberts: Okay, yeah, that works then. And then historical similarity cross-search…

393 00:39:36.950 00:39:41.640 Pranav Narahari: Yeah, so… maybe, like, for the last, like, just 15 minutes-ish, like, I can kind of…

394 00:39:41.910 00:39:46.250 Pranav Narahari: just talk about, the real PRD as well?

395 00:39:48.120 00:39:54.610 Pranav Narahari: And then… I’m sure you probably just need, like, a little bit more time to just, like, look at these, like, product definitions,

396 00:39:55.130 00:39:57.260 Pranav Narahari: And then, yeah, let’s just, you know…

397 00:39:57.660 00:40:03.920 Pranav Narahari: chat on Slack and, like, let me know, like, what you think needs to be refined further. I’m really interested in, like, what you… like…

398 00:40:05.090 00:40:08.150 Pranav Narahari: How you think the system design would work, and, like, if you think…

399 00:40:08.510 00:40:15.849 Pranav Narahari: with the estimates I made, like, is it doable? I think it is, to be honest, like, a lot of this stuff, like.

400 00:40:16.020 00:40:21.700 Pranav Narahari: I feel like with cloud agents, like, we could… we could build pretty quick. Okay.

401 00:40:22.190 00:40:26.100 Pranav Narahari: Yeah, but… You know, I’ll get your take on that as well.

402 00:40:26.100 00:40:26.880 Samuel Roberts: Yeah.

403 00:40:28.090 00:40:28.950 Samuel Roberts: Yeah, I’ll take it.

404 00:40:28.950 00:40:34.030 Pranav Narahari: And what I think would be also… Cool for…

405 00:40:34.400 00:40:39.429 Pranav Narahari: Like, you to help me, like, figure out, or if you kind of want to take point on this, is like…

406 00:40:39.690 00:40:40.990 Pranav Narahari: How…

407 00:40:41.380 00:40:50.439 Pranav Narahari: because how I see things, and I was kind of talking to Utam yesterday, is, like, I kind of think more about product, and then you kind of think more about, like, engineering management.

408 00:40:50.980 00:40:52.360 Pranav Narahari: And so…

409 00:40:52.720 00:41:01.970 Pranav Narahari: if you kind of think of this as, like, for ABC, like, how are you thinking about, like, okay, where do you need to, like, step in for certain things?

410 00:41:02.460 00:41:05.180 Pranav Narahari: And where can you, like, like…

411 00:41:05.350 00:41:16.519 Pranav Narahari: hopefully, more often than not, like, not need to step in, and just, like, have cloud agents do things. Or then, also, for a level above that, how do Mustafa and Casey do things?

412 00:41:16.520 00:41:17.589 Samuel Roberts: Of course.

413 00:41:17.880 00:41:24.270 Pranav Narahari: And so… this is kind of, like, the process that I think is, like, if we can…

414 00:41:24.580 00:41:43.630 Pranav Narahari: exemplify this process, like, super well. Like, that would be super dope for Brainforge, and then, like, for future clients, it’s just gonna be, like… I feel like this is kind of our team, you know, me, you, Mustafa, Casey, whoever joins, it’s like, this is gonna be the team that just kind of, like, gets deployed to different, clients.

415 00:41:43.630 00:41:44.530 Samuel Roberts: Right.

416 00:41:44.670 00:41:57.879 Pranav Narahari: So, it’s like, for us, it’s like, we can… we really have the benefit of, like, okay, like, this is a team, like, let’s try to build something that works for all of us, you know? What do we all, like, want to do?

417 00:41:58.290 00:42:08.729 Pranav Narahari: And we don’t need to, like, think about, like, other people, like, plugging… like, we don’t need to build a process for other people in the company. It’s really just, like, kind of, like, the four of us with cloud agents.

418 00:42:08.730 00:42:09.270 Samuel Roberts: Sure.

419 00:42:09.490 00:42:11.829 Pranav Narahari: Yeah, so I think that’s kind of exciting to me.

420 00:42:11.830 00:42:13.119 Samuel Roberts: No, it is, it is.

421 00:42:13.320 00:42:17.340 Samuel Roberts: I kind of like the way this, yeah, that maps out pretty well.

422 00:42:17.570 00:42:18.660 Samuel Roberts: Yeah.

423 00:42:19.910 00:42:21.870 Pranav Narahari: So the cool thing about why,

424 00:42:24.850 00:42:28.910 Pranav Narahari: what Utam was saying was, it’s like, yeah, like.

425 00:42:29.400 00:42:32.719 Pranav Narahari: as me and you, like, CSO and SO, like…

426 00:42:33.400 00:42:37.090 Pranav Narahari: This is kind of, like, our client. They kind of don’t want to be, like.

427 00:42:37.680 00:42:39.270 Pranav Narahari: Tapping in all the time?

428 00:42:39.450 00:42:40.800 Samuel Roberts: No, of course not.

429 00:42:40.800 00:42:45.169 Pranav Narahari: And so I think, like, with ABC right now, like, it’s just kind of like, I’m onboarding as CSO, and, like.

430 00:42:45.310 00:42:53.280 Pranav Narahari: Right now, with, like, migration and stuff, like, they’re just, like, kind of, like, unsure about, like, what’s gonna happen next, which is why they’re being a little bit more high-touch with us.

431 00:42:53.300 00:42:54.230 Samuel Roberts: But…

432 00:42:54.640 00:42:56.459 Pranav Narahari: I think once we kind of…

433 00:42:56.580 00:43:05.150 Pranav Narahari: put this in the roadmap. I think right now, Bea seems pretty confident with it. I talked to Utam, he seems like he likes the ideas, too.

434 00:43:05.530 00:43:13.870 Pranav Narahari: at a certain point, I feel like it’s really just gonna be the four of us, and then it’s just like, every once in a while, we’ll talk to, like, Utam about, like, just checking in on certain things, so…

435 00:43:14.630 00:43:17.999 Pranav Narahari: Yeah, we really have a lot of freedom here to, like, run things how you want to run it.

436 00:43:20.250 00:43:21.369 Samuel Roberts: Good, I like that.

437 00:43:21.480 00:43:23.100 Samuel Roberts: Yeah. I think, yeah, once…

438 00:43:24.060 00:43:28.159 Samuel Roberts: Once we kind of get this plan in motion, I think it will be able to execute pretty well on it.

439 00:43:28.630 00:43:30.330 Pranav Narahari: Yep, I agree.

440 00:43:33.070 00:43:43.800 Pranav Narahari: So yeah, for these dashboards, I think you’re gonna have a little bit more context than me, and then maybe we loop in Casey and Mustafa, too, to, like, see the current status on some of this stuff, because…

441 00:43:44.440 00:43:48.749 Pranav Narahari: Thumbs up and thumbs down trends dashboard, I think that might be something we already have.

442 00:43:49.430 00:43:50.730 Samuel Roberts: I think so.

443 00:43:50.780 00:43:55.549 Pranav Narahari: And then also, weekly usage adoption, we already have that.

444 00:43:56.380 00:44:02.450 Pranav Narahari: Maybe we can add some additional, like, detail to that, but… I just kind of…

445 00:44:02.640 00:44:05.459 Pranav Narahari: Wrote up some stuff here that might be…

446 00:44:06.100 00:44:06.670 Samuel Roberts: Okay.

447 00:44:06.670 00:44:10.329 Pranav Narahari: Useful, and, like, maybe different views, like a user leaderboard, and…

448 00:44:11.670 00:44:15.440 Pranav Narahari: Yeah, we already have the weekly exchange trends,

449 00:44:15.780 00:44:23.640 Pranav Narahari: And then the low-usage-area summary table, like, yeah. These are, like, small add-ons that I feel like one shot with Cursor can probably do.

450 00:44:24.840 00:44:28.369 Samuel Roberts: Yeah, I mean, if the data’s there, the data’s there, you know, I’m not too worried about that.

451 00:44:28.920 00:44:34.499 Pranav Narahari: Yeah, this will be useful, the speed and latency optimization dashboard, just like…

452 00:44:34.760 00:44:47.229 Pranav Narahari: somewhere we can always… and I think there is some view, and I think it just needs to be, like, fixed, because I think Utam was, like, showing it to me, and he’s like, yeah, this looks kind of broken, but, like, the P50 and P90,

453 00:44:47.510 00:44:52.330 Pranav Narahari: those, like… Those graphs just look a little bit off.

454 00:44:52.640 00:45:01.560 Pranav Narahari: So, it should be pretty quick. There’s probably just, like, some wiring issue with, like, the real dashboard code, so… should be pretty quick to fix.

455 00:45:05.640 00:45:06.340 Samuel Roberts: tab.

456 00:45:07.230 00:45:16.559 Pranav Narahari: Yeah, I’m not super familiar with how, like, we’re building these dashboards. I did, like, some cursor stuff to, like, kind of assess the complexity of building one, and it said, like, pretty low.

457 00:45:16.560 00:45:17.590 Samuel Roberts: Okay.

458 00:45:18.210 00:45:23.440 Pranav Narahari: It said low for these type of dashboards where we’re not pulling in new data.

459 00:45:23.440 00:45:27.170 Samuel Roberts: Right, yeah, I was gonna say, if the data’s already in Snowflake, it’s…

460 00:45:27.720 00:45:31.620 Samuel Roberts: I actually haven’t done a ton of the dashboarding myself, so I don’t know the, like…

461 00:45:33.010 00:45:38.660 Samuel Roberts: the details of the code, but I don’t think they’re crazy once the… yeah, it’s just basically, like.

462 00:45:39.200 00:45:41.890 Samuel Roberts: fetching the data and displaying it. I don’t think it’s…

463 00:45:42.120 00:45:46.560 Samuel Roberts: If we need new data, then it becomes a… Different task.

464 00:45:47.120 00:45:48.340 Pranav Narahari: Yeah, totally.

465 00:45:50.680 00:45:55.229 Pranav Narahari: Yeah, this one’s much more straightforward, we probably don’t even need to talk about it. You can probably just, like, read it, let me know.

466 00:45:55.230 00:45:55.790 Samuel Roberts: Okay.

467 00:45:58.560 00:46:06.300 Pranav Narahari: But… yeah, so… we can maybe with, like, what do we have left, like, 10 more minutes? We can talk a little bit about…

468 00:46:06.850 00:46:15.410 Pranav Narahari: kind of the system, you know, how we want to execute on these, since we really want to, take advantage of, like, cloud agents.

469 00:46:15.550 00:46:21.780 Pranav Narahari: I’m thinking, like… What we do is…

470 00:46:23.600 00:46:32.569 Pranav Narahari: I kind of, like, set out, like, some timelines on this stuff, we’ll align just to make sure there’s… there’s no huge gaps there, but…

471 00:46:33.700 00:46:39.320 Pranav Narahari: What we can talk about is, like, hey… and honestly, like, we’ll probably have a lot of overlap on just, like.

472 00:46:39.680 00:46:45.939 Pranav Narahari: CSO, SL, like, because we’re gonna just want to support each other, make sure, like, there’s no gaps.

473 00:46:46.530 00:46:48.210 Pranav Narahari: But, I think…

474 00:46:48.450 00:46:52.569 Pranav Narahari: What sounds good to me is, like, if we can just build, like, write out all these tickets.

475 00:46:52.740 00:47:00.400 Pranav Narahari: you can figure out the orchestration of them. Like, where does it go to cloud agents? When does it go to Mustafa and Casey?

476 00:47:00.520 00:47:02.879 Pranav Narahari: What makes the most sense to me is that…

477 00:47:03.350 00:47:06.010 Pranav Narahari: To just, like, really minimize on the…

478 00:47:06.220 00:47:12.049 Pranav Narahari: amount of, like, upfront work that we need to do is, like, yeah, I can just assign them to you, you can then…

479 00:47:13.330 00:47:18.680 Pranav Narahari: Like, just to deploy them all to, like, cloud agents, and then…

480 00:47:19.360 00:47:28.919 Pranav Narahari: to make it even more, like, deterministic, the outcome, like, you can even give it a few prompts as additional context than what the ticket already provides.

481 00:47:29.750 00:47:37.769 Pranav Narahari: But yeah, probably refining the tickets is probably another thing that you’ll probably help with, with, like, from a… I’ll come from more product and…

482 00:47:37.880 00:47:43.220 Pranav Narahari: obviously more, like, I’m very technical as well, but then you can fill in the gaps where you feel like.

483 00:47:43.400 00:47:52.450 Pranav Narahari: okay, we need to take this into consideration, too, to give additional context to the cloud agent. And then that’s also useful for Mustafa and Casey to get direction on.

484 00:47:53.450 00:48:04.959 Pranav Narahari: So, another important thing to me, too, with, like, all of these future projects is that we have tickets that can be closed in a day, not, like, that are going over, like, multiple days.

485 00:48:04.960 00:48:06.530 Samuel Roberts: It’d be small tickets, totally, totally.

486 00:48:06.530 00:48:22.620 Pranav Narahari: they should all be small tickets, so, like, it would be really weird if, like, one ticket that was opened one day didn’t get closed that day, too. Like, there had to be some… something that went wrong, basically. Because, like, all the tickets that I’ve mapped out should be, like, two to four hours.

487 00:48:22.620 00:48:23.800 Samuel Roberts: I’m seeing that here.

488 00:48:23.800 00:48:25.140 Pranav Narahari: Yeah, so…

489 00:48:25.460 00:48:33.029 Pranav Narahari: really, like Mustafa and Casey should be able to do multiple per day, especially since, like, that’s 2-4 hours before cloud agents are able to hack at it.

490 00:48:33.370 00:48:45.009 Pranav Narahari: So… and I feel like the cloud agents, too, like, the percentage that I said, like, the cloud agent could do are just kind of more or less arbitrary, like, I think that’s probably… a lot of these can be one-shotted, but…

491 00:48:45.340 00:48:48.969 Pranav Narahari: Cursor’s just never gonna say, like, it can do it 100%.

492 00:48:49.760 00:48:50.360 Pranav Narahari: Yeah.

493 00:48:51.030 00:48:53.979 Samuel Roberts: Yeah. Okay, yeah, the other thing, I think…

494 00:48:56.380 00:49:01.999 Samuel Roberts: As long as the cloud agents are all set up right on the repo and everything, too, because I know we have everything set up for, like, our…

495 00:49:03.140 00:49:05.259 Samuel Roberts: repo. I don’t know if things are set up

496 00:49:05.530 00:49:08.559 Samuel Roberts: Cursor-wise on ABC, so we just want to make sure we have that ready to go.

497 00:49:09.030 00:49:09.450 Pranav Narahari: Yeah.

498 00:49:09.450 00:49:13.300 Samuel Roberts: Okay, I’m just thinking about what could be blockers here. So that… Yes.

499 00:49:13.300 00:49:19.260 Pranav Narahari: Do you want to think a little bit about that? Like, maybe do some demos on, like, your end to just build out some tickets?

500 00:49:19.630 00:49:25.230 Pranav Narahari: create some PRs on ABC, like, just to, like, some random branch, like, it’s not gonna break anything, of course.

501 00:49:27.410 00:49:36.479 Pranav Narahari: And then, what’s going to be issued… like, something we need to think about, too, is, like, okay, then a lot… if a lot of our work is, like, being done by the cloud agents.

502 00:49:37.350 00:49:45.759 Pranav Narahari: We’re gonna move fast, but we’re gonna have a backlog of PRs. So then what does that review process look like? Do we then have another cloud agent that, like.

503 00:49:46.430 00:49:49.670 Pranav Narahari: pulls in a bunch of these PRs together,

504 00:49:49.980 00:49:53.930 Pranav Narahari: And then just, like, has one PR that we can… like…

505 00:49:54.480 00:49:57.230 Pranav Narahari: That we can just, like, review instead of going into, like.

506 00:49:57.380 00:50:01.939 Pranav Narahari: Because we’re probably gonna close out, like, 20 to 30 tickets per week with this process, at least.

507 00:50:02.220 00:50:07.770 Pranav Narahari: So we don’t want to have to go through 30 PRs, right? Like, probably we can, like…

508 00:50:08.270 00:50:16.380 Pranav Narahari: Have another cursor, like, cloud agent, just like, okay, combine certain of these, like, like…

509 00:50:16.830 00:50:20.029 Pranav Narahari: Branches, and so then we’re just, like, looking at…

510 00:50:21.070 00:50:25.539 Pranav Narahari: like, larger, but fewer PRs.

511 00:50:25.540 00:50:33.059 Samuel Roberts: Mmm, I see what you’re saying. Okay, yeah, so then we probably want to make sure we’re… and I think you already kind of have them grouped by…

512 00:50:33.810 00:50:36.049 Samuel Roberts: Like, what they affect.

513 00:50:37.450 00:50:41.880 Samuel Roberts: Oh, maybe not. Hold on, that’s what I’m seeing here. Oh, this is the cloud agent. Okay, it’s organized by…

514 00:50:42.250 00:50:45.310 Samuel Roberts: how well the cloud agent can do it, not necessarily, like.

515 00:50:46.940 00:50:50.309 Samuel Roberts: which ones we would want to see together in a PR, potentially, too.

516 00:50:50.630 00:50:59.390 Pranav Narahari: Yeah, so if you want to maybe just, like, take this, and then just, like, use Cursor to figure out, like, okay, this is kind of the system that we’re trying to build,

517 00:51:00.580 00:51:04.529 Pranav Narahari: how should we best group these tickets so then we can, like, have fewer PRs?

518 00:51:04.910 00:51:09.989 Pranav Narahari: And then… Yeah, like, I think that would be…

519 00:51:10.120 00:51:12.770 Pranav Narahari: That’s kind of, like, one of the last steps of the process, and…

520 00:51:13.780 00:51:15.829 Pranav Narahari: Then you and… you and me, like…

521 00:51:16.030 00:51:19.150 Pranav Narahari: I would prefer if we just have to go through, like…

522 00:51:19.500 00:51:26.269 Pranav Narahari: maybe, like, 4 PRs a week, you know, like… but I’ll let you set that number, whatever you think makes the most sense, because we also don’t want to.

523 00:51:26.270 00:51:26.900 Samuel Roberts: Yeah.

524 00:51:26.900 00:51:28.409 Pranav Narahari: Oversimplify things, right?

525 00:51:28.410 00:51:36.010 Samuel Roberts: Yeah, no, I think it’ll probably just be determined by, like, if these are all affecting the same thing, they can be kind of grouped together, tested together, kind of thing.

526 00:51:36.310 00:51:40.350 Samuel Roberts: I’m assuming things work and aren’t breaking, and have to determine, like, what…

527 00:51:40.500 00:51:44.610 Samuel Roberts: You know, so there’s something to be said for keeping them separate as well, because… Yeah.

528 00:51:44.610 00:51:46.000 Pranav Narahari: So, I’ll let you set that up.

529 00:51:46.000 00:51:47.640 Samuel Roberts: Okay. Yeah, definitely.

530 00:51:47.860 00:51:51.290 Pranav Narahari: You can set that tolerance for, like, what you think is similar enough.

531 00:51:53.560 00:52:02.179 Pranav Narahari: But, yeah, I mean, this would be, like, a really cool process, like, if we’re able to, like, pull it off, because I feel like we’d be moving so fast.

532 00:52:05.020 00:52:07.209 Samuel Roberts: Yeah, I think so. I mean, yeah.

533 00:52:07.860 00:52:14.359 Pranav Narahari: Because I think right now, like, how things are working is, like, people are maybe doing this, like, Utam is doing this, but, like, kind of individually. Right.

534 00:52:14.360 00:52:14.980 Samuel Roberts: Right, right.

535 00:52:14.980 00:52:17.009 Pranav Narahari: What we’re trying to build here is, like.

536 00:52:17.180 00:52:22.370 Pranav Narahari: kind of building, like, a process where, like, there’s four people best utilizing cloud agents. Yeah.

537 00:52:22.370 00:52:23.130 Samuel Roberts: I agree.

538 00:52:23.450 00:52:24.870 Pranav Narahari: Yeah, so…

539 00:52:25.460 00:52:31.490 Samuel Roberts: Okay. Yeah, I’ll go through these. So I… there were 4 of them, but I don’t need to look at one of them, you were saying?

540 00:52:31.730 00:52:35.809 Pranav Narahari: Yeah, just the ones that… with PRD. The other one is, like.

541 00:52:35.920 00:52:45.109 Pranav Narahari: I scoped out, like, 4 other projects, too, but I don’t think that they’re as important as… like, I think these are the top 3 that we should move forward with.

542 00:52:45.970 00:52:48.320 Samuel Roberts: Oh, I see, okay, yeah, I’m seeing it now. Yeah.

543 00:52:48.320 00:52:51.549 Pranav Narahari: Like, that one just, like, summarizes a bunch of other ones. Okay.

544 00:52:51.550 00:52:52.060 Samuel Roberts: Alright, I won’t…

545 00:52:52.180 00:52:54.040 Pranav Narahari: Find those either. So, yeah, no need to look at.

546 00:52:54.040 00:52:54.600 Samuel Roberts: Okay.

547 00:52:54.910 00:52:59.969 Samuel Roberts: Alright, cool, yeah, I’ll go through the other ones. So, are these… where are you keeping these right now?

548 00:53:00.590 00:53:06.049 Pranav Narahari: Um… I actually just have it on my local, so let me push these to a branch.

549 00:53:06.490 00:53:09.540 Samuel Roberts: Okay, yeah, either… I mean, I don’t know if you want them in…

550 00:53:09.850 00:53:13.790 Samuel Roberts: We probably just want them in the, like, the…

551 00:53:14.010 00:53:17.580 Samuel Roberts: monorepo, I guess, but we also have an ABC repo, so I wasn’t sure.

552 00:53:18.800 00:53:19.650 Pranav Narahari: Mmm.

553 00:53:20.120 00:53:22.630 Pranav Narahari: So should I put this in a brand platform, you think?

554 00:53:24.360 00:53:30.889 Samuel Roberts: Probably… let me… I mean, I think at some point we’re gonna need them over there, because we’re gonna wanna start making tickets and stuff, and being able to point…

555 00:53:31.710 00:53:35.189 Samuel Roberts: Linear to certain things. Cursor to certain things.

556 00:53:38.180 00:53:38.820 Pranav Narahari: Yeah, but…

557 00:53:38.820 00:53:40.640 Samuel Roberts: I mean… Oh, so…

558 00:53:40.640 00:53:45.830 Pranav Narahari: Sorry. Like, skip the cloud agents, you’re saying, like… Yeah.

559 00:53:45.830 00:53:49.059 Samuel Roberts: I’ll have context of the ABC repo, probably, more than anything.

560 00:53:51.920 00:53:54.159 Pranav Narahari: Oh, so it should be in the ABC repo, you’re saying?

561 00:53:54.370 00:54:02.180 Samuel Roberts: Well, I think for now it’s fine, and maybe once we have them… Like, finalized, I guess?

562 00:54:02.610 00:54:03.360 Pranav Narahari: Yeah.

563 00:54:03.580 00:54:08.270 Samuel Roberts: So yeah, I mean, there was a Vault, ABC Home Commercial, there’s…

564 00:54:08.490 00:54:11.179 Samuel Roberts: 3 or 4 folders in there…

565 00:54:12.250 00:54:16.529 Samuel Roberts: I’m just wondering, resources… trying to think where the best place to put these might be.

566 00:54:19.430 00:54:25.319 Samuel Roberts: Because, to be honest, I don’t use the vault as much as some other people, so I’m, like, I’m just not sure how people are organizing the clients here.

567 00:54:26.720 00:54:28.220 Pranav Narahari: Yeah, and also…

568 00:54:28.910 00:54:34.419 Pranav Narahari: I’ve never pushed to the vault. I’m using it every single day, but, like, that was one thing I need to bring up to Bea, I’m like.

569 00:54:35.010 00:54:40.179 Pranav Narahari: how should I be thinking about pushing to the vault? Like, does it not go through, like, a PR process, or I feel like that’s.

570 00:54:40.180 00:54:47.190 Samuel Roberts: No, there’s… no, you’d make a PR, basically, and just submit it, and I think, basically, the… anything that is not in the…

571 00:54:47.690 00:54:49.210 Samuel Roberts: app folder?

572 00:54:49.590 00:54:50.860 Samuel Roberts: Effectively?

573 00:54:51.740 00:54:57.779 Samuel Roberts: which is, like, code. Anything else, I think it’s… the code owner is Utam, and he just approves the PR.

574 00:55:01.260 00:55:06.449 Samuel Roberts: But, I mean, if you’re on a branch working on it, I wouldn’t stress it. We could just work on, like, a separate branch for now.

575 00:55:08.160 00:55:08.810 Pranav Narahari: Okay.

576 00:55:08.810 00:55:16.590 Samuel Roberts: and then PR it in, or do a draft PR so that we can chat about it on there, you know what I mean? Like, making use of GitHub’s probably not a bad idea, too.

577 00:55:18.410 00:55:20.569 Pranav Narahari: I’ll just add this to the ABC one for now.

578 00:55:20.570 00:55:27.300 Samuel Roberts: Okay, yeah, cool. Yeah, do that, and then I… yeah, throw it up on a branch, maybe, and we can just work on there for now. I’ll make whatever…

579 00:55:27.480 00:55:30.020 Samuel Roberts: Edits, or changes, or ask questions, or whatever.

580 00:55:30.910 00:55:31.880 Pranav Narahari: Sounds good.

581 00:55:31.880 00:55:32.820 Samuel Roberts: Cool, okay.

582 00:55:33.720 00:55:34.640 Samuel Roberts: Sounds good.

583 00:55:34.900 00:55:36.289 Pranav Narahari: Cool. Thanks, man. Talk soon.

584 00:55:36.290 00:55:38.040 Samuel Roberts: Yeah, you too. Bye.