Meeting Title: ABC Grooming Date: 2025-05-29 Meeting participants: Uttam Kumaran, Amber Lin, Casie Aviles, Annie Yu


WEBVTT

1 00:03:28.150 00:03:29.290 Uttam Kumaran: Hey, Casie?

2 00:03:30.960 00:03:31.630 Casie Aviles: Hey? You don’t.

3 00:03:33.300 00:03:35.900 Uttam Kumaran: Okay, let’s just get started.

4 00:03:37.120 00:03:45.240 Uttam Kumaran: Yeah, I just wanted to sort of go through the board and, like, clean a couple of things up here, and just make sure we’re set for

5 00:03:46.290 00:03:49.110 Uttam Kumaran: like, next sprint.

6 00:03:49.340 00:03:52.330 Uttam Kumaran: And then we can just like, confirm a couple of things.

7 00:03:54.860 00:03:58.040 Uttam Kumaran: But yeah, maybe I’ll just like Jump right in if you’re okay with that.

8 00:03:58.890 00:03:59.710 Casie Aviles: Yeah, sure.

9 00:04:00.540 00:04:09.830 Uttam Kumaran: Cool. Okay, so anything here that we can move

10 00:04:10.810 00:04:15.320 Uttam Kumaran: to done. I feel like all this is still blocked.

11 00:04:16.490 00:04:20.230 Casie Aviles: Yeah, I think the one with the API we haven’t had.

12 00:04:21.140 00:04:27.939 Casie Aviles: I think, Tim, if I remember correctly, he moved that to 8x8. He escalated

13 00:04:28.330 00:04:29.499 Casie Aviles: it to their support.

14 00:04:29.720 00:04:31.669 Uttam Kumaran: Okay, so that’s still blocked.

15 00:04:36.106 00:04:44.410 Uttam Kumaran: This one, yeah. Still, I mean this one we’re doing every week or so. Right? Annie.

16 00:04:46.580 00:04:53.559 Annie Yu: Yeah, yeah, we are asking for the data each week and then loading it manually.

17 00:04:54.680 00:04:56.389 Uttam Kumaran: Okay, so this is blocked.

18 00:04:56.940 00:05:01.049 Uttam Kumaran: This is blocked by the 8x8 one.

19 00:05:05.310 00:05:07.690 Uttam Kumaran: I don’t know. What is the other ticket’s name?

20 00:05:12.320 00:05:14.370 Uttam Kumaran: Oh, there’s not a ticket. Okay? So

21 00:05:27.700 00:05:31.934 Uttam Kumaran: okay, yeah, this is blocked by Tim. Okay, I’ll follow up with that.

22 00:05:39.590 00:05:41.400 Uttam Kumaran: okay, this one is still blocked.

23 00:05:49.280 00:05:52.319 Uttam Kumaran: Okay, I will follow up with Tim on these, because I don’t

24 00:05:52.510 00:05:56.880 Uttam Kumaran: let’s go on. Let’s talk about okay. Oh, this is good.

25 00:06:03.160 00:06:10.150 Uttam Kumaran: Okay, maybe. Let’s talk about this ticket

26 00:06:10.600 00:06:15.549 Uttam Kumaran: which is integrating the thumbs down feedback into the quality score.

27 00:06:16.650 00:06:22.389 Uttam Kumaran: Is this something that we can do in the data like we don’t need? We can do this like in real.

28 00:06:27.810 00:06:33.000 Uttam Kumaran: For context, this is like we in real, we have the quality score. We want that to integrate

29 00:06:33.330 00:06:37.270 Uttam Kumaran: the feedback as well from the thumbs down.

30 00:06:44.450 00:06:46.740 Uttam Kumaran: Annie, I think this question is for you.

31 00:06:48.927 00:06:55.202 Annie Yu: Isn’t quality isn’t quality score something computed by the

32 00:06:57.760 00:06:58.300 Casie Aviles: Yeah.

33 00:06:58.300 00:06:58.830 Annie Yu: Chat.

34 00:06:58.830 00:06:59.600 Casie Aviles: It is.

35 00:07:01.220 00:07:02.190 Uttam Kumaran: Yeah, I guess what? My.

36 00:07:02.190 00:07:03.070 Casie Aviles: The back end.

37 00:07:03.070 00:07:09.099 Uttam Kumaran: Yeah, my question here is like, we, we want one score which includes the human feedback and the

38 00:07:09.250 00:07:10.980 Uttam Kumaran: evaluation feedback.

39 00:07:14.300 00:07:20.409 Uttam Kumaran: So my question is like, can we? Can we integrate the thumbs down into the quality score.

40 00:07:23.930 00:07:27.109 Annie Yu: But I I don’t think I have the right answer to that.

41 00:07:30.480 00:07:33.180 Casie Aviles: I think I could do this on the back end instead. Then.

42 00:07:33.180 00:07:36.810 Uttam Kumaran: I don’t. But I guess my question is like we’re getting the thumbs up data right.

43 00:07:38.170 00:07:39.170 Annie Yu: We are.

44 00:07:39.680 00:07:42.809 Uttam Kumaran: And then we have. We’re getting the evaluation score data.

45 00:07:48.220 00:07:51.790 Uttam Kumaran: So why can’t we create another metric? That just is both of those.

46 00:07:57.320 00:08:05.210 Annie Yu: Okay. So we will want to, like, define another metric, I guess, based on the

47 00:08:05.580 00:08:09.759 Annie Yu: quality score and thumbs up score.

48 00:08:09.760 00:08:10.470 Uttam Kumaran: Correct.

49 00:08:11.750 00:08:15.850 Annie Yu: And is that something we can discuss? On the definition.

50 00:08:16.710 00:08:19.209 Uttam Kumaran: Yeah, so I would. We can discuss right now.

51 00:08:22.880 00:08:27.540 Uttam Kumaran: I guess I would ask you like, how much of a factor do you want the thumbs up to be

52 00:08:28.340 00:08:32.100 Uttam Kumaran: or, like, how would you determine... how would we, like, define that metric?

53 00:08:38.280 00:08:44.770 Annie Yu: I don’t have enough understanding of the evaluation quality score.

54 00:08:45.380 00:08:54.429 Annie Yu: So, Casie, if you can help me understand what that is and how that is populated.

55 00:08:56.180 00:09:02.170 Casie Aviles: Yeah, I think what we wanted to do here was that basically for each

56 00:09:02.830 00:09:09.040 Casie Aviles: a thumbs down, we just want it to be marked as a lower score, or like an error.

57 00:09:10.050 00:09:15.550 Casie Aviles: Because, yeah, that’s the idea behind it. So

58 00:09:15.860 00:09:20.870 Casie Aviles: if there’s like, if they send a thumbs down, it should automatically be a

59 00:09:21.949 00:09:27.399 Casie Aviles: lower score, or like an error. That’s the idea behind this ticket, I think.

60 00:09:27.810 00:09:33.260 Uttam Kumaran: Yeah. Like, for example, if like, let’s take an example today.

61 00:09:36.600 00:09:40.350 Uttam Kumaran: So if I go to ABC logs

62 00:09:41.270 00:09:46.299 Uttam Kumaran: today, I saw this one about the term of a mesh system warranty: “I’m sorry, but I don’t have access to that information.”

63 00:09:47.250 00:09:55.269 Uttam Kumaran: It got a thumbs down, so it should be 50/50, or maybe this should be higher.

64 00:09:57.180 00:09:58.650 Uttam Kumaran: I’m okay with that.

65 00:10:00.180 00:10:07.490 Uttam Kumaran: like, if this is thumbs down and the evaluation score is high, that still is not good enough, you know. So it should be automatically

66 00:10:07.920 00:10:09.080 Uttam Kumaran: discounted.

67 00:10:16.240 00:10:16.905 Annie Yu: And

68 00:10:17.800 00:10:26.890 Annie Yu: one follow-up question is, what’s the difference between error score and quality score? Because there’s an error

69 00:10:27.310 00:10:28.320 Annie Yu: rate, right?

70 00:10:28.320 00:10:36.369 Casie Aviles: Yes, yeah. How we define the error is basically when the quality score is lower than 5.

71 00:10:38.130 00:10:38.960 Annie Yu: Okay.

72 00:10:43.420 00:10:48.230 Uttam Kumaran: Okay, so there’s so yeah, so I guess that makes sense.

73 00:10:54.460 00:11:01.280 Casie Aviles: And yeah, the problem is that when the thumbs down are provided, it’s not flagged as an error, and it gets assigned

74 00:11:01.470 00:11:04.879 Casie Aviles: A quality score that is higher than 5, or maybe 5.

75 00:11:05.390 00:11:08.920 Casie Aviles: So I guess we want that to be a factor to the.

76 00:11:09.810 00:11:10.780 Uttam Kumaran: Yes, yeah.

77 00:11:17.150 00:11:25.859 Uttam Kumaran: So basically, it’s like the quality score is composed of both, but like.

78 00:11:26.250 00:11:34.690 Uttam Kumaran: But feedback may not always exist; if it does, it should

79 00:11:35.700 00:11:40.420 Uttam Kumaran: factor into X percent of the final score.

80 00:11:45.930 00:11:46.700 Annie Yu: Yeah.

81 00:11:48.940 00:11:50.980 Uttam Kumaran: Like, do you think 50% is fine?

82 00:12:01.100 00:12:04.572 Annie Yu: So if so, what’s the

83 00:12:06.680 00:12:10.630 Annie Yu: What’s the baseline of a quality score like a typical? If

84 00:12:10.940 00:12:18.740 Annie Yu: if we say, like, 50%, does that mean the score is slashed in half?

85 00:12:19.200 00:12:25.990 Uttam Kumaran: Well, it means that, like, 50% of the final score’s weight should come from it.

86 00:12:26.200 00:12:30.360 Uttam Kumaran: it’s basically a weighted average, right?

87 00:12:30.910 00:12:38.859 Uttam Kumaran: So you can say, like 50% of the weight should come from the quality score. 50% of the weight should come from the feedback.

88 00:12:40.890 00:12:44.210 Uttam Kumaran: and you can assume no feedback is positive feedback.

89 00:12:47.140 00:12:56.409 Annie Yu: Yeah. So in that case, keep me honest, Casie, but to my knowledge, right now, we just

90 00:12:56.860 00:13:02.229 Annie Yu: get a quality score from the back end, and everything was calculated there.

91 00:13:03.640 00:13:09.719 Casie Aviles: Yes, but it’s not factoring in the thumbs up or thumbs down at the moment.

92 00:13:14.520 00:13:22.959 Annie Yu: Yeah. So if we want to bring in the thumbs up and down, shouldn’t that also be calculated into the process, instead of

93 00:13:24.240 00:13:29.010 Annie Yu: doing it from the dashboard side.

94 00:13:31.940 00:13:35.710 Casie Aviles: Hmm, yeah, that’s what I was thinking earlier.

95 00:13:36.240 00:13:40.750 Uttam Kumaran: This is just a math. This is just a math problem, though, like you can do this in the dashboard.

96 00:13:41.330 00:13:48.139 Uttam Kumaran: I would rather not, like, ’cause we already have this data. If we have to change it, then it’s gonna take a long time.

97 00:13:49.020 00:13:52.219 Uttam Kumaran: I would rather just, like, compute this in the SQL.

98 00:14:03.870 00:14:06.080 Uttam Kumaran: like, you can just do a weighted average

99 00:14:13.930 00:14:18.390 Uttam Kumaran: like exactly like this sort of factor.

100 00:14:29.590 00:14:32.120 Uttam Kumaran: Do you want to give this a shot, or like.

101 00:14:32.120 00:14:36.189 Annie Yu: I think my question is that quality score was

102 00:14:37.330 00:14:43.610 Annie Yu: already calculated on some level with already like weighted average.

103 00:14:44.300 00:14:48.239 Annie Yu: using some other dimensions. Right? So

104 00:14:48.470 00:14:58.069 Annie Yu: this quality score we get from SQL is something that’s already calculated halfway. So how do I know?

105 00:14:58.300 00:15:03.120 Annie Yu: Like, I guess then I will definitely need help to figure out

106 00:15:03.460 00:15:06.200 Annie Yu: how to get that quality score.

107 00:15:07.530 00:15:10.990 Uttam Kumaran: But the quality score’s already computed for you. So I don’t think you need to

108 00:15:12.500 00:15:17.940 Uttam Kumaran: like, you don’t need to understand it. There’s no need to go beneath the hood there, because

109 00:15:18.180 00:15:20.169 Uttam Kumaran: this is given to you by the system.

110 00:15:21.830 00:15:27.700 Uttam Kumaran: So you’re just doing a weighted average of the quality score and the feedback score, the manual feedback score.

111 00:15:31.310 00:15:38.999 Annie Yu: Okay, yeah, I’ll I’ll look into that.

112 00:15:39.000 00:15:49.379 Uttam Kumaran: I mean, if it’s not all making sense, we can work on it together. I’m happy to. I think you’re overthinking this quite a bit. I think it’s just a pretty basic weighted average

113 00:15:49.530 00:15:55.250 Uttam Kumaran: using the quality score metric, and then the feedback score. If it’s thumbs up, you can do it as a 1,

114 00:15:55.450 00:15:59.870 Uttam Kumaran: and if it’s thumbs down you can do it as a 0, and then you calculate

115 00:15:59.980 00:16:02.810 Uttam Kumaran: A weighted average of both those scores.

116 00:16:04.070 00:16:09.040 Uttam Kumaran: And then you can assign some sort of factor like 50% comes from

117 00:16:09.330 00:16:12.110 Uttam Kumaran: the feedback score. 50% comes from the quality score.

118 00:16:14.003 00:16:14.559 Annie Yu: okay.

119 00:16:15.310 00:16:16.640 Uttam Kumaran: We can do this one together.

120 00:16:18.440 00:16:20.710 Annie Yu: Yeah, yeah, and yeah.

121 00:16:21.930 00:16:27.550 Uttam Kumaran: Just don’t overthink it. It’s not as complicated as you think.

122 00:16:28.530 00:16:29.340 Annie Yu: Okay.

123 00:16:33.255 00:16:33.850 Uttam Kumaran: Okay?

124 00:16:35.600 00:16:39.879 Uttam Kumaran: Or I can take a crack at it and send it to you, or you tell me.

125 00:16:49.520 00:16:54.090 Annie Yu: okay, yeah, I I need to wrap my head around this.

126 00:16:55.120 00:16:57.964 Annie Yu: I I think I do know what you’re saying.

127 00:16:59.250 00:17:06.650 Uttam Kumaran: Yeah, it’s literally like... so let’s say we have a question: what is

128 00:17:07.240 00:17:10.569 Uttam Kumaran: what is X? X is Y.

129 00:17:11.180 00:17:21.180 Uttam Kumaran: The quality score is 7, and the feedback score is a 0 right.

130 00:17:21.180 00:17:22.709 Annie Yu: 0 means thumbs down.

131 00:17:22.710 00:17:25.020 Uttam Kumaran: Thumbs down. Yes, thumbs down.

132 00:17:25.560 00:17:28.819 Uttam Kumaran: So then you just have to do a weighted average; you just have a weight column.

133 00:17:29.750 00:17:32.450 Uttam Kumaran: This is 50%. This is 50%.

134 00:17:32.760 00:17:37.820 Uttam Kumaran: And you’re gonna end up with, like, a 3 and a half final score.

135 00:17:40.270 00:17:46.550 Uttam Kumaran: Right. Previously, we would have scored it a 7. Now, because the feedback came in as a 0, it’d be 3 and a half.
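
[Editor's note: the weighted-average metric described above can be sketched in a few lines. This is only an illustration of the arithmetic (the meeting suggests computing it in SQL on the dashboard side); the function name, the 0/10 scaling of the thumbs feedback, and the default 50/50 weight are assumptions, not actual dashboard code.]

```python
def combined_score(quality, thumbs=None, w_feedback=0.5):
    """Weighted average of the 1-10 quality score and manual feedback.

    thumbs: True = thumbs up, False = thumbs down, None = no feedback
    (treated as positive, per the discussion). Mapping up/down feedback
    to 10/0 on the quality scale is an interpretive choice.
    """
    feedback = 0 if thumbs is False else 10  # down = 0, up or absent = 10
    return (1 - w_feedback) * quality + w_feedback * feedback
```

[With the example from the meeting, a quality score of 7 with a thumbs down comes out to 3.5; with no feedback it would come out to 8.5.]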

136 00:17:49.060 00:17:54.530 Annie Yu: And I guess we will have to give, like, a threshold: what’s good, what’s bad.

137 00:17:56.680 00:17:58.540 Uttam Kumaran: No, you just have to report on the score.

138 00:17:59.590 00:18:03.169 Annie Yu: But how do people like interpret that score.

139 00:18:04.300 00:18:10.390 Uttam Kumaran: If it’s a low score, it’s probably bad. If it’s a 10, it’s gonna be good. It’s on a 1 to 10 scale.

140 00:18:11.080 00:18:11.460 Casie Aviles: Yeah.

141 00:18:11.460 00:18:15.470 Annie Yu: The quality score also ranges from one to 10. Is that correct?

142 00:18:16.540 00:18:22.190 Uttam Kumaran: I mean, wait. I’m not the one looking at the dashboard every day. I feel like you guys should know this.

143 00:18:22.190 00:18:24.079 Annie Yu: No, I’m asking Casie.

144 00:18:24.080 00:18:27.349 Uttam Kumaran: Oh, okay. I mean, we can pull up the dashboard, like, it’s.

145 00:18:27.900 00:18:29.010 Casie Aviles: Yeah, it’s 1 to 10.

146 00:18:29.010 00:18:30.669 Uttam Kumaran: Yeah, it’s always been one to 10.

147 00:18:34.270 00:18:36.339 Uttam Kumaran: Yeah, it’s always been one to 10.

148 00:18:36.670 00:18:37.200 Annie Yu: Okay.

149 00:18:40.030 00:18:45.769 Uttam Kumaran: So this is just a pretty simple weighted average of just these two, and then you can set the weights at 50/50, or you can

150 00:18:46.330 00:18:49.569 Uttam Kumaran: try to see like what’s logical.

151 00:18:50.060 00:18:54.419 Annie Yu: Okay, okay? I think. Then in that case, I think 50 makes sense.

152 00:18:56.290 00:19:00.670 Uttam Kumaran: Yeah, cause. Otherwise, it’s like, if we have something that’s thumbs down, and then we rate it a 10.

153 00:19:01.550 00:19:06.290 Uttam Kumaran: That doesn’t make any sense. Yeah. So that’s basically what we’re trying to compensate for.

154 00:19:06.630 00:19:07.690 Annie Yu: Yeah, yeah.

155 00:19:07.690 00:19:14.360 Uttam Kumaran: That way also, like, we can probably reduce some of these; we can just have, like, one score for people to look at. We can have the breakdowns, but.

156 00:19:15.860 00:19:18.320 Annie Yu: So maybe not the thumbs up.

157 00:19:18.914 00:19:21.520 Annie Yu: So I. So in that case.

158 00:19:22.350 00:19:26.730 Annie Yu: can we also hide quality score? Or that’s something we still want to show.

159 00:19:30.750 00:19:35.980 Uttam Kumaran: I would say this is a different ticket. It’s, like, what we’re going to show versus not. I kind of have to see the data.

160 00:19:36.100 00:19:36.690 Annie Yu: Yeah.

161 00:19:39.050 00:19:39.990 Uttam Kumaran: Okay, cool.

162 00:19:40.840 00:19:43.109 Uttam Kumaran: Alright. So give this one a go.

163 00:19:45.320 00:19:47.240 Uttam Kumaran: Alright, let’s keep going.

164 00:19:53.300 00:20:05.977 Amber Lin: Hi, I’m currently on the training assistant bot project. So we have a few tickets there. Half of them are engineering-based; half of them I’m working on with the clients directly.

165 00:20:06.430 00:20:15.480 Amber Lin: and if we want to go there, that could be a good starting point, or we could look at the ones more related to Andy. So there’s two sides.

166 00:20:16.265 00:20:20.179 Uttam Kumaran: Yeah, I guess I just wanna see like any of these.

167 00:20:20.480 00:20:25.779 Uttam Kumaran: maybe let’s just start with: are these all still in To Do?

168 00:20:31.140 00:20:32.680 Annie Yu: Username.

169 00:20:33.590 00:20:35.220 Amber Lin: Should be done in. This is done.

170 00:20:36.850 00:20:37.190 Uttam Kumaran: Cool!

171 00:20:38.610 00:20:41.030 Uttam Kumaran: This is done, and then it’s.

172 00:20:41.030 00:20:41.590 Amber Lin: Test user.

173 00:20:41.590 00:20:42.550 Uttam Kumaran: Filter.

174 00:20:46.190 00:20:49.050 Annie Yu: Yeah, this is also the next.

175 00:21:09.930 00:21:14.489 Uttam Kumaran: This one is this, like, we need to do this.

176 00:21:17.250 00:21:19.949 Amber Lin: No no.

177 00:21:22.080 00:21:24.311 Uttam Kumaran: Like, is this done, or is this

178 00:21:24.630 00:21:29.299 Amber Lin: This is a nice-to-have. I’m gonna put it in Ready for Development.

179 00:21:29.650 00:21:34.210 Amber Lin: This is helping them. See? Okay, we have all these documents ready.

180 00:21:34.330 00:21:37.379 Amber Lin: I think there’s something else, you should add.

181 00:21:38.550 00:21:40.630 Uttam Kumaran: I see, and then for this one.

182 00:21:45.550 00:21:47.800 Uttam Kumaran: this is just like literally the

183 00:21:53.350 00:21:55.980 Uttam Kumaran: whatever the central doc thing is, right?

184 00:21:58.000 00:21:59.069 Amber Lin: Let me check.

185 00:22:08.470 00:22:10.120 Uttam Kumaran: It’s like cleaning this up right.

186 00:22:11.600 00:22:18.159 Amber Lin: It’s dependent on the client, because we don’t have the authority to edit their documents without.

187 00:22:18.160 00:22:19.740 Uttam Kumaran: Is this the document right now?

188 00:22:19.970 00:22:21.150 Amber Lin: This is a document right now.

189 00:22:21.150 00:22:22.329 Uttam Kumaran: This is horrible!

190 00:22:22.540 00:22:23.590 Amber Lin: That’s what I say.

191 00:22:23.940 00:22:27.680 Uttam Kumaran: But why aren’t we cleaning this up?

192 00:22:28.713 00:22:46.140 Amber Lin: We’re a bit stuck on the clients. We can’t really do any edits without the danger of changing the contents, because we are probably gonna do AI edits and then pass it through. So we need the client to work with us to

193 00:22:46.250 00:22:55.059 Amber Lin: make sure these are correctly formatted. And right now Denise and Shannon are spending most of their time going through the CSR test drive to make sure.

194 00:22:55.550 00:23:00.660 Amber Lin: They’re making sure that the documents... that everything is in here.

195 00:23:00.660 00:23:07.250 Uttam Kumaran: We can have AI do it. We can go section by section and clean this up and not lose anything.

196 00:23:07.720 00:23:08.390 Amber Lin: Okay.

197 00:23:08.890 00:23:12.560 Uttam Kumaran: Right like.

198 00:23:15.300 00:23:16.509 Amber Lin: I agree. I mean, it’s a good point.

199 00:23:17.390 00:23:22.680 Uttam Kumaran: Yeah, we don’t. I mean that we don’t have to change any content. It’s just this is like.

200 00:23:23.050 00:23:26.219 Uttam Kumaran: for example, if I copied and pasted this into ChatGPT and said,

201 00:23:27.000 00:23:30.169 Uttam Kumaran: “here’s the format, put it in this format,” we’re gonna be good, right? So.

202 00:23:30.910 00:23:34.970 Uttam Kumaran: Okay. So I think this is still a priority.

203 00:23:35.250 00:23:38.289 Uttam Kumaran: I think for this, though, we need to think about like.

204 00:23:38.590 00:23:41.460 Uttam Kumaran: we need a structure for each section.

205 00:23:43.100 00:23:52.130 Uttam Kumaran: Right? Like, we basically need to know, like,

206 00:23:52.250 00:23:57.880 Uttam Kumaran: for a given section, what’s the format? Right? So that’s one thing.

207 00:23:58.340 00:23:58.900 Amber Lin: Hmm.

208 00:24:17.170 00:24:27.090 Uttam Kumaran: So for one of these, we’re gonna need like, basically section format guidelines.

209 00:24:28.090 00:24:31.390 Uttam Kumaran: What is in each section?

210 00:24:33.370 00:24:39.130 Uttam Kumaran: At what point should a new section be created.

211 00:24:39.480 00:24:44.719 Uttam Kumaran: You can just throw all this into Gemini or something, and you can ask it these questions. But basically.

212 00:24:45.080 00:24:49.620 Uttam Kumaran: it looks like we have some table of contents: billing, call list.

213 00:24:50.430 00:24:54.970 Uttam Kumaran: But like, what are these? Are these business divisions? Are these

214 00:24:57.070 00:25:03.590 Uttam Kumaran: right? So we need some way to organize this. They’re not going to give this to us. There’s no way they think about things this way. So.

215 00:25:04.940 00:25:05.490 Uttam Kumaran: We should.

216 00:25:05.828 00:25:10.229 Amber Lin: Okay, that’s really good, ’cause I think I was waiting. I was constantly.

217 00:25:10.230 00:25:13.870 Uttam Kumaran: They’re not gonna be able to do any of this. Yeah, they’re just gonna keep pasting stuff in here.

218 00:25:14.060 00:25:15.690 Amber Lin: Okay, okay, that’s good.

219 00:25:15.690 00:25:22.770 Amber Lin: We should rewrite it. I mean, the only thing we have to really make sure of is, like, if you’re using AI to rewrite it, there can’t be a loss of.

220 00:25:23.420 00:25:24.450 Uttam Kumaran: Context.

221 00:25:24.450 00:25:26.070 Amber Lin: Yes, yes.

222 00:25:27.300 00:25:27.730 Uttam Kumaran: But.

223 00:25:27.730 00:25:28.230 Amber Lin: Oh!

224 00:25:28.230 00:25:29.918 Uttam Kumaran: A lot of this is like

225 00:25:30.800 00:25:34.099 Uttam Kumaran: a lot of this is probably duplicated in many places, but.

226 00:25:34.100 00:25:40.910 Amber Lin: Yes, it is. We discovered that when we were looking over the past drives, and

227 00:25:41.690 00:25:53.260 Amber Lin: I don’t exactly know where it is in the Central Doc that things are duplicated, but their documents, when they handed them to us, were duplicated, and therefore I assume, when we put that in, some of this is duplicated.

228 00:25:53.430 00:25:58.450 Uttam Kumaran: Yeah. So what we should do for this is, create a copy.

229 00:26:00.830 00:26:12.430 Uttam Kumaran: Use AI to clean up, use another AI to verify that the copy isn’t missing any info.

230 00:26:12.960 00:26:13.500 Amber Lin: Hmm.

231 00:26:13.500 00:26:22.849 Uttam Kumaran: From the OG one, and then the next thing is, like, copy the copy over to the original.

232 00:26:23.250 00:26:26.160 Amber Lin: No, I think we should also run it by them to confirm.

233 00:26:26.900 00:26:27.570 Uttam Kumaran: Yeah.

234 00:26:31.750 00:26:35.260 Uttam Kumaran: Yeah, we need to. We should do this

235 00:26:35.570 00:26:46.679 Uttam Kumaran: doing, like, step one: you should use AI to basically figure out, like, what’s in this doc. What are the categories of knowledge that are in this document? And have AI give you the format that we should use.
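
[Editor's note: the workflow described here (copy the doc, clean it up per section with AI, verify nothing is lost) presupposes splitting the exported text into sections. A minimal sketch, assuming headings in the plain-text export are all-caps lines; the real Central Doc's formatting may differ.]

```python
import re

def split_into_sections(doc_text):
    """Split a plain-text doc export on all-caps heading lines so each
    section can be cleaned up by AI and verified independently.

    The all-caps heading convention is an assumption about the export.
    """
    heading = re.compile(r"^[A-Z][A-Z &/\-]{2,}$")  # e.g. "BILLING"
    sections, title = {}, "PREAMBLE"
    for line in doc_text.splitlines():
        if heading.match(line.strip()):
            title = line.strip()
            sections.setdefault(title, [])
        else:
            sections.setdefault(title, []).append(line)
    return {t: "\n".join(body).strip() for t, body in sections.items()}
```

[Each returned section could then go through one AI cleanup pass and a second verification pass before the cleaned copy is merged back.]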

236 00:26:47.990 00:26:53.240 Uttam Kumaran: Okay. Great. Perfect.

237 00:26:57.350 00:26:59.439 Uttam Kumaran: This is the same one as that one right?

238 00:27:00.096 00:27:00.749 Amber Lin: Which one.

239 00:27:01.290 00:27:08.260 Amber Lin: This is based on what we identify as the structure. Maybe they’re missing a guideline on this specific one.

240 00:27:09.160 00:27:13.689 Amber Lin: Yeah, okay, this is not as urgent. This is something we can do in the future.

241 00:27:14.650 00:27:15.535 Uttam Kumaran: Okay,

242 00:27:18.480 00:27:21.670 Amber Lin: I’m just putting it in the backlog; it doesn’t have to be.

243 00:27:22.660 00:27:30.049 Uttam Kumaran: Okay, that’s fine. Great. So, okay, so this is this one.

244 00:27:32.240 00:27:35.230 Uttam Kumaran: well, okay, this is actually gonna go. Okay. So

245 00:27:41.100 00:27:42.480 Uttam Kumaran: this is done right?

246 00:27:45.600 00:28:02.389 Amber Lin: This is specifically for thumbs up. This is not a high priority, because we want to address thumbs down feedback, though we have noticed that sometimes, when they wanted to put positive feedback on what went well, they didn’t have a place to put that.

247 00:28:04.440 00:28:07.290 Uttam Kumaran: Yeah, Casie, like, how long would this take you to do?

248 00:28:09.760 00:28:15.610 Casie Aviles: I think maybe within a day, just to add another feedback... sorry, feedback box.

249 00:28:16.280 00:28:17.380 Casie Aviles: So yeah.

250 00:28:17.800 00:28:18.520 Uttam Kumaran: Okay.

251 00:28:25.030 00:28:29.008 Uttam Kumaran: okay, yeah, this is low priority. So we’ll leave that for now.

252 00:28:35.010 00:28:37.219 Uttam Kumaran: how’s this? What about this one, Casie?

253 00:28:40.530 00:28:44.969 Casie Aviles: This one I don’t know how to do yet. No, this hasn’t been done.

254 00:28:45.720 00:28:51.260 Uttam Kumaran: Oh, we can’t, we can’t do. Oh, okay, I see what you mean.

255 00:28:53.620 00:29:02.740 Uttam Kumaran: Alright, I can try to figure this out.

256 00:29:13.040 00:29:15.420 Uttam Kumaran: Okay. So this one is in.

257 00:29:16.410 00:29:18.289 Uttam Kumaran: They’ll need requirements.

258 00:29:29.530 00:29:31.680 Casie Aviles: This is just a formatting thing.

259 00:29:33.860 00:29:35.110 Uttam Kumaran: This should be chill, right?

260 00:29:35.900 00:29:36.530 Casie Aviles: Yeah.

261 00:29:45.340 00:29:49.120 Uttam Kumaran: Like, are we just gonna put it on a new line?

262 00:29:50.610 00:29:54.219 Casie Aviles: Yeah, just make sure that it’s on a new line, so it’s visible.

263 00:29:54.990 00:29:55.940 Uttam Kumaran: Okay, cool.

264 00:30:00.760 00:30:05.919 Uttam Kumaran: great. Okay. So improvements on abbreviations and acronyms.

265 00:30:12.190 00:30:14.549 Uttam Kumaran: Do we get this from them yet or no?

266 00:30:16.720 00:30:20.280 Amber Lin: No, we don’t have a list from them yet, and it’s blocked.

267 00:30:25.430 00:30:27.166 Uttam Kumaran: Is there any way?

268 00:30:28.900 00:30:29.990 Uttam Kumaran: We can.

269 00:30:30.660 00:30:41.549 Amber Lin: There is a way: we look through all the Central Docs and get the confusing abbreviations, and let them write those out in a spreadsheet.

270 00:30:41.910 00:30:45.099 Uttam Kumaran: Yeah, so like, if I take this whole thing right now.

271 00:30:45.690 00:30:47.640 Uttam Kumaran: and I toss it into here

272 00:30:50.710 00:30:55.399 Uttam Kumaran: and just toss it to AI and say... oh, I’m getting that the paste size is too long.

273 00:30:55.590 00:31:00.250 Amber Lin: But basically. Yeah, if you just toss some section of this.

274 00:31:14.960 00:31:16.850 Uttam Kumaran: Yep, perfect.

275 00:31:21.070 00:31:23.150 Uttam Kumaran: great! We should just do this.

276 00:31:25.520 00:31:28.010 Amber Lin: Fantastic, so.

277 00:31:28.010 00:31:28.909 Uttam Kumaran: Can you do this.

278 00:31:30.200 00:31:30.860 Casie Aviles: Okay.

279 00:31:31.090 00:31:33.310 Uttam Kumaran: Okay. I’m just gonna I’ll copy this.

280 00:31:33.660 00:31:35.370 Amber Lin: Yeah. What? Ticket?

281 00:31:35.800 00:31:37.190 Amber Lin: 9 to stop?

282 00:31:37.580 00:31:39.050 Uttam Kumaran: You have an example.

283 00:31:43.330 00:31:50.510 Uttam Kumaran: So we’ll do: use AI to go over

284 00:31:51.060 00:32:02.080 Uttam Kumaran: the entire Central Doc to pull all acronyms into a

285 00:32:05.230 00:32:07.580 Uttam Kumaran: We’re not bringing any spreadsheets in, are we?

286 00:32:07.780 00:32:11.710 Uttam Kumaran: Oh, we are right. So I guess we can put acronyms into

287 00:32:11.900 00:32:21.330 Uttam Kumaran: a table at the appendix of the Central Doc.

288 00:32:22.660 00:32:29.489 Uttam Kumaran: I guess, Casie, I’ll ask you how we should sort of manage this

289 00:32:31.920 00:32:34.500 Uttam Kumaran: like, are you gonna put this into the system, prompt.

290 00:32:36.920 00:32:41.330 Casie Aviles: Yeah, we could pull it into the system prompt. So the bot is aware,

291 00:32:42.920 00:32:45.729 Casie Aviles: because that’s 1 of the errors that they

292 00:32:46.426 00:32:51.889 Casie Aviles: reported on. So we fixed some of that. But I guess they have more.

293 00:32:53.250 00:32:56.660 Casie Aviles: Yeah, acronyms and abbreviations.

294 00:32:59.090 00:33:01.149 Uttam Kumaran: Do you think this is like 2 points.

295 00:33:02.480 00:33:04.589 Casie Aviles: Yeah, okay, that’s fair.

296 00:33:05.210 00:33:10.870 Uttam Kumaran: Okay, are you gonna just throw this into Gemini, or are you gonna loop through it?

297 00:33:13.530 00:33:16.960 Casie Aviles: Yeah, I’m just thinking of getting everything.

298 00:33:18.100 00:33:21.050 Casie Aviles: But I think Gemini has the largest context.

299 00:33:21.340 00:33:26.269 Uttam Kumaran: Yeah, try the whole thing, and then maybe see, if you loop through it, if there’s any difference. But okay.
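
[Editor's note: before (or alongside) the Gemini pass discussed here, candidate acronyms can be pulled mechanically as a cheap first cut. A sketch, assuming acronyms appear as short all-caps tokens in the doc text; expanding each acronym still needs the AI pass or the client's input.]

```python
import re
from collections import Counter

def candidate_acronyms(text):
    """Collect 2-6 letter all-caps tokens with their frequencies.

    This only surfaces candidates for the appendix table; it does not
    know their expansions.
    """
    return Counter(re.findall(r"\b[A-Z]{2,6}\b", text))
```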

300 00:33:26.650 00:33:29.510 Uttam Kumaran: perfect, that’s great.

301 00:33:30.230 00:33:37.150 Uttam Kumaran: How strict should Andy be on the input? Yeah, so have they given any. So I think we should have something like this.

302 00:33:37.500 00:33:40.270 Uttam Kumaran: Casie, like, there is some restriction on the input.

303 00:33:41.490 00:33:44.699 Uttam Kumaran: Are we doing that for any agents right now?

304 00:33:46.238 00:33:48.210 Casie Aviles: No, we don’t restrict the input.

305 00:33:49.310 00:33:53.209 Uttam Kumaran: If you were to give a suggestion, how would you think about it?

306 00:33:56.350 00:33:59.630 Casie Aviles: I think when they give some something like

307 00:34:00.929 00:34:08.659 Casie Aviles: incomplete message, then I guess we’re gonna have to prompt the bot to ask clarification questions.

308 00:34:09.159 00:34:09.829 Uttam Kumaran: Okay.

309 00:34:12.459 00:34:13.579 Uttam Kumaran: So

310 00:34:26.789 00:34:29.059 Uttam Kumaran: so this is gonna be an improvement to the system prompt.

311 00:34:31.010 00:34:31.430 Casie Aviles: Yeah.

312 00:34:34.010 00:34:38.030 Uttam Kumaran: Is that... are we pulling that from the GitHub prompts at all?

313 00:34:38.550 00:34:39.420 Uttam Kumaran: Not yet right.

314 00:34:39.812 00:34:45.139 Casie Aviles: Not yet, because the concern is it might add to the time if we.

315 00:34:45.290 00:34:46.440 Uttam Kumaran: Hmm.

316 00:34:47.750 00:34:49.290 Casie Aviles: Yeah, to the response, time.

317 00:34:49.290 00:34:51.620 Uttam Kumaran: Maybe I should ask Mustafa what he thinks about that.

318 00:34:53.610 00:34:54.800 Uttam Kumaran: See what he says.

319 00:34:55.810 00:34:58.219 Uttam Kumaran: You’re right in that we don’t want to pull it every time. But

320 00:35:09.720 00:35:13.569 Uttam Kumaran: okay, alright, this is fine.

321 00:35:16.630 00:35:17.360 Uttam Kumaran: Okay.

322 00:35:18.990 00:35:21.140 Uttam Kumaran: This one, I think, is probably, like, 1 point.

323 00:35:25.230 00:35:26.830 Uttam Kumaran: Nice. Okay.

324 00:35:30.742 00:35:32.729 Uttam Kumaran: okay, let’s go.

325 00:35:42.950 00:35:44.979 Uttam Kumaran: Let’s just see if there’s anything.

326 00:35:46.270 00:35:48.240 Uttam Kumaran: Bot execution time.

327 00:35:57.320 00:35:59.310 Uttam Kumaran: Okay, I like this one.

328 00:36:00.120 00:36:03.869 Uttam Kumaran: This one is like, we need to do something where we’re taking the.

329 00:36:04.690 00:36:13.850 Uttam Kumaran: we’re taking the responses that have a low score, and the LLM is automatically suggesting what to improve in the central doc.

330 00:36:15.080 00:36:22.006 Uttam Kumaran: Ideally, I would prefer this to happen in Slack, like in our

331 00:36:30.500 00:36:31.200 Casie Aviles: Okay.

332 00:36:33.190 00:36:34.639 Uttam Kumaran: Right or something.

333 00:36:35.440 00:36:38.969 Uttam Kumaran: We’re basically like, I, I think I think what should happen is,

334 00:36:48.450 00:36:50.050 Uttam Kumaran: yeah, this.

335 00:36:51.560 00:36:55.679 Uttam Kumaran: But I don’t think we can. Oh, so.

336 00:36:55.680 00:37:05.050 Amber Lin: I think we already have this with the trainer bot, to be very honest, because right now we give them the update based on

337 00:37:06.581 00:37:16.209 Amber Lin: a prompt that we give the trainer bot, and this is essentially automating that: when we receive feedback, then we suggest it.

338 00:37:23.210 00:37:26.629 Uttam Kumaran: So what does the trainer bot do? Does it go to a spreadsheet right now?

339 00:37:26.800 00:37:27.460 Amber Lin: No.

340 00:37:29.150 00:37:32.430 Uttam Kumaran: So what happens with the… wait. So.

341 00:37:32.720 00:37:35.439 Amber Lin: So what the trainer bot does right now is,

342 00:37:35.550 00:37:43.570 Amber Lin: it stops at providing the updates that need to be made. It will interview the people to get the

343 00:37:43.720 00:37:50.800 Amber Lin: correct information. But right now we don’t have the capability to update this directly into the spreadsheet or into the Central Doc.

344 00:37:51.090 00:37:56.939 Amber Lin: We tried embeddings, I remember before. I think we tried to put it in

345 00:37:57.649 00:38:04.860 Amber Lin: a database of the different documents, but the updates were very prone to error. So we paused

346 00:38:06.540 00:38:13.309 Amber Lin: The ticket of having it directly update into the central doc. So right now the updates are

347 00:38:13.620 00:38:15.989 Amber Lin: copy and paste manually.

348 00:38:15.990 00:38:19.670 Uttam Kumaran: And then, is the trainer bot in the Google Workspace?

349 00:38:20.300 00:38:29.590 Amber Lin: No, we had issues integrating 2 chat bots in Google.

350 00:38:29.880 00:38:32.150 Amber Lin: Casey would know a bit more about that.

351 00:38:32.150 00:38:35.370 Uttam Kumaran: Yeah, but there’s no open ticket for that one?

352 00:38:36.180 00:38:38.770 Amber Lin: We had a… we had…

353 00:38:38.980 00:38:42.410 Amber Lin: We tried to investigate quite a few times. It did not work out.

354 00:38:44.860 00:38:49.270 Uttam Kumaran: Okay, but we already have one bot there. So what’s the problem with getting another one?

355 00:38:50.420 00:38:51.160 Amber Lin: You see.

356 00:38:51.870 00:39:02.200 Casie Aviles: Yeah, the problem is with the setup with Google. But maybe I’ll have to get back to you.

357 00:39:02.200 00:39:03.049 Uttam Kumaran: Did you call Tim?

358 00:39:05.352 00:39:08.619 Casie Aviles: No, I didn’t talk with him about that.

359 00:39:08.620 00:39:15.760 Uttam Kumaran: Okay. So this, like, has to happen. So let’s just say, trainer bot in Google Workspace.

360 00:39:16.640 00:39:21.300 Uttam Kumaran: Okay? So then we’ll figure this out with Tim fine.

361 00:39:21.300 00:39:23.790 Casie Aviles: The bot is in, I think, a

362 00:39:24.610 00:39:28.419 Casie Aviles: that we deployed through the custom UI that we had before.

363 00:39:28.530 00:39:32.560 Casie Aviles: Yeah, we’ll move that to Google, then.

364 00:39:34.190 00:39:38.170 Uttam Kumaran: And then, so I do think that at least having us…

365 00:39:44.310 00:39:47.480 Uttam Kumaran: I think having a spreadsheet is fine,

366 00:39:53.066 00:39:57.219 Uttam Kumaran: Which… well, I don’t know. What was I looking at before?

367 00:40:05.370 00:40:08.099 Uttam Kumaran: okay, this is enable direct updates.

368 00:40:16.420 00:40:20.459 Uttam Kumaran: So which was the one I was just looking at, which was like finding the

369 00:40:22.200 00:40:27.329 Uttam Kumaran: oh, feedback to suggestions. Okay, yeah. So this is feedback to suggestion.

370 00:40:28.220 00:40:31.599 Uttam Kumaran: Yeah, so it’s feedback to suggestion workflow.

371 00:40:34.830 00:40:39.650 Uttam Kumaran: So for each new row, for each

372 00:40:41.751 00:40:50.079 Uttam Kumaran: response that has a low score. So this is basically like LLM-as-a-judge.

373 00:40:52.380 00:41:05.250 Uttam Kumaran: the prompt is… so for each response pair, the pair is sent to the LLM with central doc context

374 00:41:08.680 00:41:15.219 Uttam Kumaran: training doc, or what info is missing.
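The feedback-to-suggestion workflow Uttam is dictating could be sketched as below: filter the low-scoring question/response pairs, then ask an LLM (injected as a plain callable, since the actual model client isn’t specified here) what to improve in the central doc. The threshold, row fields, and prompt wording are assumptions.

```python
from typing import Callable, Dict, List

# Illustrative sketch: threshold, field names, and prompt text are assumptions.
LOW_SCORE_THRESHOLD = 3  # e.g. on an assumed 1-5 quality scale

JUDGE_PROMPT_TEMPLATE = (
    "Central doc context:\n{doc}\n\n"
    "Question: {question}\nBot response: {response}\n\n"
    "This response scored low. Suggest what to improve in the central doc, "
    "or what information is missing."
)

def low_score_pairs(rows: List[Dict]) -> List[Dict]:
    """Keep only question/response pairs whose quality score is low."""
    return [r for r in rows if r["score"] <= LOW_SCORE_THRESHOLD]

def suggest_doc_updates(rows: List[Dict], central_doc: str,
                        ask_llm: Callable[[str], str]) -> List[Dict]:
    """For each low-scoring pair, ask the (injected) LLM for a doc suggestion."""
    suggestions = []
    for row in low_score_pairs(rows):
        prompt = JUDGE_PROMPT_TEMPLATE.format(
            doc=central_doc, question=row["question"], response=row["response"])
        suggestions.append({"question": row["question"],
                            "suggestion": ask_llm(prompt)})
    return suggestions
```

Injecting `ask_llm` keeps the selection and prompting logic testable without a live model, and the output list maps naturally onto new rows in the spreadsheet mentioned later.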

375 00:41:18.540 00:41:29.629 Amber Lin: When we say low score here, we’re talking about the score that will be integrated with the feedback, right? Because right now our score is not an accurate measure of

376 00:41:29.800 00:41:38.110 Amber Lin: if the response warrants an update. We’re relying on the CSRs’ feedback right now to identify what’s working and what’s not.

377 00:41:38.110 00:41:43.010 Uttam Kumaran: But I guess another question I have is, why are our scores not accurate?

378 00:41:44.520 00:41:52.839 Uttam Kumaran: Right? So that’s also something: we’re giving high scores for things that shouldn’t be high scores. That means our evaluation data set is not right.

379 00:41:56.495 00:41:59.419 Uttam Kumaran: So that’s something else we can look at. So.

380 00:42:05.980 00:42:11.309 Amber Lin: That would be the “integrate thumbs down feedback into quality score” ticket.

381 00:42:12.180 00:42:17.050 Uttam Kumaran: Well, this would be another one, which is like, we have a bunch of false scores. Then we have scores…

382 00:42:17.050 00:42:17.440 Amber Lin: And.

383 00:42:17.440 00:42:19.429 Uttam Kumaran: We have scores that are rated high.

384 00:42:20.020 00:42:23.709 Uttam Kumaran: Shouldn’t be. Which means our eval data set is wrong.

385 00:42:26.990 00:42:32.780 Uttam Kumaran: like, ideally, the bot should not be getting any thumbs down. We should already know that it’s gonna

386 00:42:32.780 00:42:33.180 Amber Lin: That’d be a thought.

387 00:42:33.180 00:42:35.640 Uttam Kumaran: get a thumbs down, right? Like, that’s the goal.

388 00:42:41.880 00:42:42.560 Uttam Kumaran: Okay?

389 00:42:44.100 00:42:47.840 Uttam Kumaran: So this one we can just do in a new spreadsheet.

390 00:42:55.490 00:42:58.929 Uttam Kumaran: okay, I’m just gonna say, create a new spreadsheet. So is there a?

391 00:43:01.780 00:43:06.629 Uttam Kumaran: Oh, great. Okay, yeah. So this is exactly the one. So we’re gonna have another

392 00:43:07.060 00:43:10.009 Uttam Kumaran: basically, an LLM is gonna suggest what to do here.

393 00:43:10.430 00:43:11.460 Amber Lin: Yes.

394 00:43:17.880 00:43:20.730 Uttam Kumaran: Casie, this is probably a big one, so I’ll leave it as 5.

395 00:43:23.185 00:43:23.490 Casie Aviles: Five.

396 00:43:25.000 00:43:29.150 Uttam Kumaran: And then I’m gonna create one. So this is feedback. So this is ready.

397 00:43:32.430 00:43:34.240 Uttam Kumaran: Oh, yeah, what is it? So?

398 00:43:34.510 00:43:38.139 Uttam Kumaran: Where are these “oh, by the way”s coming from?

399 00:43:41.160 00:43:41.820 Amber Lin: Hmm.

400 00:43:42.330 00:43:44.170 Uttam Kumaran: Where are the “oh, by the way”s?

401 00:43:44.980 00:43:46.280 Amber Lin: Right now.

402 00:43:46.520 00:43:57.049 Amber Lin: They expect us to generate the “oh, by the way”s based on their entire offerings. So they’re depending on us to do the logical suggestions.

403 00:43:58.380 00:44:00.390 Uttam Kumaran: Their offers, right? Like, these are…

404 00:44:00.803 00:44:24.390 Amber Lin: There’s 2 things. So there’s offers, and other services. Other services just means: you have this, therefore you might want to consider something else that we offer. The second part is their offers this month, right? This is maybe a holiday offer, maybe a discount promotion they’re promoting. We have that… we don’t have a

405 00:44:24.910 00:44:29.690 Amber Lin: like, for the other services, we have a list from their website.

406 00:44:30.520 00:44:31.530 Uttam Kumaran: Hmm, okay.

407 00:44:32.170 00:44:32.740 Amber Lin: Yeah.

408 00:44:34.850 00:44:38.869 Uttam Kumaran: So for the offers, where do we have a list of those?

409 00:44:39.180 00:44:55.929 Amber Lin: They sent a document. I included it. Oh, gosh! It was in one of those tickets. I’ll go find it. It should also be in the document somewhere. I think the problem is not that we don’t have a list of all these services. I believe Miguel integrated them a while back,

410 00:44:56.531 00:45:01.880 Amber Lin: or integrated the offers at least. But right now it only suggests mosquito,

411 00:45:04.280 00:45:10.679 Amber Lin: which has been fine because that has boosted their mosquito upsells. But I want us to address this.

412 00:45:11.200 00:45:16.430 Uttam Kumaran: But can they just put, like, a sheet here with the offer, and then what the discount is, and a description?

413 00:45:16.430 00:45:17.440 Amber Lin: Yeah. Totally.

414 00:45:20.530 00:45:21.290 Uttam Kumaran: All right.

415 00:45:21.480 00:45:27.959 Uttam Kumaran: This other-services thing is fine. The offers thing, we just need to pull from somewhere; like, that’s not on us at all.
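If the client maintains the sheet Uttam suggests, consuming it could be as simple as the sketch below. The column names (offer, discount, description) and the upsell phrasing are assumptions taken from the conversation, not an agreed format.

```python
from typing import Dict, List

def load_offers(sheet_rows: List[List[str]]) -> List[Dict[str, str]]:
    """Parse raw sheet rows (header row first) into offer dicts.

    Assumed columns: offer, discount, description.
    """
    header = sheet_rows[0]
    return [dict(zip(header, row)) for row in sheet_rows[1:]]

def oh_by_the_way(offer: Dict[str, str]) -> str:
    """Render one offer as an upsell line to append to a bot response."""
    return (f"Oh, by the way: {offer['offer']} is {offer['discount']} off "
            f"this month. {offer['description']}")
```

Reading the rows at runtime (e.g. via a spreadsheet API) would mean new or changed offers need no code change on the bot side, which matches the “that’s not on us at all” goal.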

416 00:45:34.030 00:45:34.680 Amber Lin: Okay, so

417 00:45:34.680 00:45:39.990 Amber Lin: can you make the ticket for improving the quality score? I’m just writing one, don’t wanna do…

418 00:45:39.990 00:45:41.909 Uttam Kumaran: No, it’s already done. It’s already there. Yup!

419 00:45:41.910 00:45:42.819 Amber Lin: Oh, okay.

420 00:45:44.030 00:45:44.580 Uttam Kumaran: Yeah.

421 00:45:45.190 00:45:47.809 Uttam Kumaran: Integrate thumbs down feedback into quality score.

422 00:45:51.430 00:46:02.789 Amber Lin: Oh, I mean, the other part is, I don’t think… do you think just doing thumbs down would improve the quality score? Or is there something just wrong or not complete enough with our, say, evals, or how we…

423 00:46:02.790 00:46:05.620 Uttam Kumaran: Oh, yeah, I mean, I…

424 00:46:06.430 00:46:09.300 Uttam Kumaran: yeah, I have to create something here. That’s just like.

425 00:46:09.545 00:46:11.510 Amber Lin: I’ll create one. I was just working…

426 00:46:11.510 00:46:13.880 Uttam Kumaran: But I don’t know if we’re gonna be able to work on it. I don’t have.

427 00:46:13.880 00:46:22.470 Amber Lin: Yeah, I’ll put it in the backlog. I think thumbs-down integration will help change a lot of things. And then this is something for the future.

428 00:46:29.880 00:46:31.580 Uttam Kumaran: Okay, this one.

429 00:46:38.710 00:46:44.210 Uttam Kumaran: I’m gonna mark this as canceled, that’s basically related to yours.

430 00:46:53.980 00:46:55.269 Uttam Kumaran: Is this done.

431 00:47:04.390 00:47:04.770 Amber Lin: Doubt.

432 00:47:04.770 00:47:05.650 Casie Aviles: So we did not.

433 00:47:11.120 00:47:17.459 Uttam Kumaran: So I just want to know… I guess my first question here would be: can we identify all the hallucination pairs?

434 00:47:32.260 00:47:36.880 Uttam Kumaran: Okay. Alright. I know we’re at 1:59, so we’ll probably have to continue this,

435 00:47:41.350 00:47:48.009 Uttam Kumaran: but I feel better. We didn’t get to updates on the dashboards. And I think, kind of 2 things. One, I think.

436 00:47:48.830 00:47:54.330 Uttam Kumaran: Annie, I’m gonna add some tickets related to like insights

437 00:47:54.460 00:47:58.610 Uttam Kumaran: like I want. We want to start to help show them like what

438 00:47:59.440 00:48:04.579 Uttam Kumaran: like, basically start to give them insights, especially for their executive team,

439 00:48:04.710 00:48:11.129 Uttam Kumaran: on, like, how many “oh, by the way”s they’re getting, what are the types of questions that are being asked,

440 00:48:11.280 00:48:16.210 Uttam Kumaran: Like, what are the challenges we’re facing? So I’ll create a ticket on that

441 00:48:18.130 00:48:24.990 Uttam Kumaran: And then the quality score, I think, is the other biggest one. And then I’m gonna push Tim on the 8x8 stuff.

442 00:48:27.490 00:48:28.940 Annie Yu: Okay. Sounds good.

443 00:48:31.750 00:48:33.470 Uttam Kumaran: Okay, perfect.

444 00:48:33.670 00:48:37.650 Amber Lin: Thank you guys, yeah, thank you. I’ll see you in the other meeting.

445 00:48:37.650 00:48:38.160 Casie Aviles: Thank you.

446 00:48:38.160 00:48:38.870 Uttam Kumaran: Bye.