Meeting Title: Brainforge x ABC Home and Commercial: Weekly Project Check Date: 2026-02-05 Meeting participants: read.ai meeting notes, Matt’s Notetaker (Otter.ai), JanieceGarcia, Samuel Roberts, Uttam Kumaran, Steven, Amber Lin


WEBVTT

1 00:01:30.040 00:01:31.320 Uttam Kumaran: Hello!

2 00:01:31.930 00:01:33.800 Uttam Kumaran: Hi! Hi!

3 00:01:34.750 00:01:36.000 Uttam Kumaran: How is everything?

4 00:01:36.250 00:01:38.659 JanieceGarcia: Good! How are you?

5 00:01:38.660 00:01:39.440 Uttam Kumaran: Good!

6 00:01:39.610 00:01:40.610 JanieceGarcia: Yay!

7 00:01:42.050 00:01:45.090 JanieceGarcia: Steven, haven’t seen you on one of these in a while!

8 00:01:45.960 00:01:48.599 Steven: Yeah, last week. No, the week before.

9 00:01:49.560 00:01:51.209 JanieceGarcia: Yeah, it was like, not last week.

10 00:01:51.580 00:01:52.130 Steven: No.

11 00:01:52.130 00:01:54.490 JanieceGarcia: You were here in Austin last week, sir.

12 00:01:59.540 00:02:02.040 JanieceGarcia: I think it’s just gonna be me and you today, Steven.

13 00:02:04.410 00:02:04.970 Steven: Cool.

14 00:02:05.260 00:02:06.840 Uttam Kumaran: We have some fun things to share.

15 00:02:07.560 00:02:08.479 JanieceGarcia: Yay!

16 00:02:09.639 00:02:13.469 JanieceGarcia: I’m so excited. If this whole Andy thing…

17 00:02:13.950 00:02:16.490 JanieceGarcia: The automation for the central doc.

18 00:02:16.490 00:02:18.720 Uttam Kumaran: Yes, that’s what we’re sharing.

19 00:02:18.720 00:02:24.330 JanieceGarcia: Guys, if that’s really what I’m thinking it is, I’m like… Thank you!

20 00:02:25.000 00:02:27.620 Uttam Kumaran: We have to build… we have to do it first, but yes.

21 00:02:27.620 00:02:28.400 Samuel Roberts: Exactly.

22 00:02:28.400 00:02:33.659 Uttam Kumaran: Yeah, we met this week, and we’re continuing to sort of build out, like.

23 00:02:34.270 00:02:38.970 Uttam Kumaran: Now, as we go implement this on different teams, like, how do we maintain the central doc, and…

24 00:02:39.430 00:02:41.480 Uttam Kumaran: Yeah, I feel like we’re…

25 00:02:41.930 00:02:44.870 Uttam Kumaran: We’re… we’re in a good spot, we have a plan for this month, so…

26 00:02:45.100 00:02:54.710 JanieceGarcia: Okay, well, I will say that the database, the zip code deal, I think it’s working well. I think the updates are happening, I can see the changes.

27 00:02:54.710 00:03:06.339 JanieceGarcia: Cool. I’m actually… I have one that I have to do… well, two of them that I have to do today, because, somebody retired, and then a tech actually got let go, so I need to make those updates.

28 00:03:06.630 00:03:09.919 JanieceGarcia: So I’ll do that, if not this afternoon.

29 00:03:10.150 00:03:10.930 JanieceGarcia: Tomorrow morning.

30 00:03:10.930 00:03:25.519 Uttam Kumaran: Is there anywhere, Janiece, while you’re doing those, you could just, like, record your screen? Yeah. And send us, like, a little recording? Yeah. You don’t even need to, like, talk through it, although you can basically be like, I’m gonna do this, so just so I can see how you’re, like, navigating.

31 00:03:26.640 00:03:28.730 JanieceGarcia: I’ll do it. Then I’m gonna do it tomorrow morning.

32 00:03:29.480 00:03:33.520 Uttam Kumaran: Okay. Well, hopefully it doesn’t take longer.

33 00:03:33.520 00:03:37.299 JanieceGarcia: No, because I leave… I try to leave Austin around 3, 3.30.

34 00:03:37.300 00:03:38.610 Uttam Kumaran: Okay, okay, alright.

35 00:03:39.050 00:03:42.799 Uttam Kumaran: Yeah, that way, I just want to see, like, how you’re navigating and see if we can make it any easier.

36 00:03:42.800 00:03:44.429 Samuel Roberts: Definitely. Yeah, absolutely.

37 00:03:44.440 00:03:45.170 JanieceGarcia: Yeah.

38 00:03:45.580 00:03:50.140 Uttam Kumaran: If you want, I can send you… I can send an email to just, like, so… as a reminder, up to you.

39 00:03:50.140 00:03:52.319 JanieceGarcia: I’ll send a reminder for myself. Okay. You’re good.

40 00:03:52.320 00:03:52.990 Uttam Kumaran: Okay.

41 00:03:56.010 00:03:58.040 Uttam Kumaran: Cool. Amber, it’s all yours.

42 00:03:58.040 00:04:04.839 Amber Lin: Yeah, let’s get started. So, let me share screen… Let’s go right here…

43 00:04:08.590 00:04:16.490 Amber Lin: So, I know Yvette’s not joining today, so, we can… I’ll send her the updates when we send the email tomorrow.

44 00:04:16.640 00:04:32.200 Amber Lin: I know Pest and Mechanical is still up there. I think the usage is a little lower than what we had last week, so maybe that’s something that we will… we can check in with the trainers on. I think the last

45 00:04:32.300 00:04:37.840 Amber Lin: three, four weeks or so, we were doing really great, so I want to see if we can keep that up.

46 00:04:40.210 00:04:58.679 Amber Lin: Cool. I want to talk first about a brief update on the transcripts. I know this was something we were all very excited about, and something that Yvette has been wanting to do for a long time. So, Sam has some updates there, and then when we have more stuff, we’ll let Yvette know.

47 00:05:00.460 00:05:02.610 Uttam Kumaran: Yeah, Sam, do you want… yeah, go ahead, go, sorry.

48 00:05:02.610 00:05:04.400 Samuel Roberts: Yeah, so I was able to,

49 00:05:04.610 00:05:06.689 Samuel Roberts: mostly pick up where I left off.

50 00:05:07.630 00:05:11.610 Samuel Roberts: the API, the keys, everything was good. I think the…

51 00:05:12.160 00:05:18.780 Samuel Roberts: There were two concerns previously. One was, like, credit card data that might possibly be in there, and so, I…

52 00:05:19.010 00:05:26.520 Samuel Roberts: double-checked that with Tim, and he had… he’s gonna get back to me on a couple things there. And then the rate limits, because apparently it’s just one…

53 00:05:26.690 00:05:43.490 Samuel Roberts: general rate limit. It’s not like we have a separate way to access it, and so I don’t want to try to do this big backfill of data and have it mess something up for someone else. So he’s gonna get back to me on that. So, it’s getting to a point where we can probably start ingesting some of the data, but I’m not gonna run a big

54 00:05:43.740 00:05:47.240 Samuel Roberts: You know, pull and be able to do, like, kind of a first pass at it.

55 00:05:47.550 00:05:51.640 Samuel Roberts: Until we find that out. But I am able to access transcripts, process them, get

56 00:05:52.690 00:05:57.229 Samuel Roberts: There’s some things in there, like sentiment, and confidence and stuff.

57 00:05:58.170 00:06:07.760 Samuel Roberts: there’s a little bit of a hiccup I just encounter… literally just encountered in the past, like, hour, where I’m not sure if the user IDs that are in the transcripts match

58 00:06:08.320 00:06:11.419 Samuel Roberts: Anything to know, like, who’s on the phone?

59 00:06:11.750 00:06:14.999 Samuel Roberts: So I’m trying to work through that a little bit. There’s… the API is…

60 00:06:15.820 00:06:22.489 Samuel Roberts: Interesting. This is something we encountered before, and we had to get on the phone with the 8x8 dev and everything, and he was…

61 00:06:22.990 00:06:29.190 Samuel Roberts: I don’t want to say apologetic, but he understood that there are issues with the API, and so it’s… I’m trying to work through some of that now. I might have to loop them in at some point.

62 00:06:29.190 00:06:30.590 Amber Lin: Gotcha. Okay.

63 00:06:30.590 00:06:37.570 Samuel Roberts: Besides that, yeah, like, the transcripts are coming out, so… and we’ll get the IDs, the question is just correlating those with who’s who.

64 00:06:37.990 00:06:40.950 Uttam Kumaran: Do you have, like, an example transcript, Sam?

65 00:06:40.960 00:06:43.650 Samuel Roberts: Yeah, let me… let me move some things around here.

66 00:06:43.650 00:06:48.330 Uttam Kumaran: Show the team… Like, what we’re actually getting, and then…

67 00:06:48.720 00:06:52.500 Uttam Kumaran: I may even ask if we can just, like, put one of those, or, like.

68 00:06:52.950 00:06:55.629 Uttam Kumaran: 5 of them into the database, so.

69 00:06:55.630 00:06:58.619 Samuel Roberts: That’s what I was thinking too. Yeah, I think we can probably start…

70 00:06:58.860 00:07:04.899 Samuel Roberts: doing that soon. Let me… Get the files here, sorry.

71 00:07:09.460 00:07:10.930 Samuel Roberts: Okay, so…

72 00:07:11.130 00:07:17.240 Uttam Kumaran: Yeah, maybe while you’re doing that, I mean, it came up even in our meeting yesterday, is that we, one, we want to…

73 00:07:17.600 00:07:27.220 Uttam Kumaran: one, understand, like, how the CSRs are reinforcing some of the policies, but in addition, I think even for the trainers, it’s their ability to say, like.

74 00:07:27.340 00:07:31.100 Uttam Kumaran: are the CSRs adhering to, like, the things that we’re training?

75 00:07:31.270 00:07:32.770 Uttam Kumaran: them to do.

76 00:07:32.870 00:07:41.459 Uttam Kumaran: In addition, like, we were talking yesterday about attribution for clients, like, where are the customers coming from? And that’s something that we also…

77 00:07:41.590 00:07:56.010 Uttam Kumaran: right now, it’s a man… I know it’s manually getting inputted into Evolve, but maybe there’s a way for us to reinforce that, or somehow automatically grab that so that data is accurate. But I’m sort of predicting that there’s going to be a lot of ways that we can

78 00:07:56.100 00:08:04.739 Uttam Kumaran: make this transcript database open to you, Janiece, and Yvette, for just kind of asking questions over all of them, like.

79 00:08:05.080 00:08:22.669 Uttam Kumaran: I feel like I need to brainstorm a bit more on, like, what you could do with that, but I… I have a feeling that there’s a lot of opportunity. In our company, we do this, actually, internally, where we have all of our transcripts, and so we use it to ask questions about commonalities between different types of clients.

80 00:08:22.780 00:08:28.080 Uttam Kumaran: getting feedback on a meeting, and that was, I think, one of the initial things is, like, trying to do the feedback

81 00:08:28.260 00:08:31.229 Uttam Kumaran: Scores and things like that, so… yeah.

82 00:08:33.470 00:08:38.959 Samuel Roberts: Yeah, sorry, I just realized the output is just one big line of text, so I just want to make it a… there we go.

83 00:08:39.880 00:08:48.620 Samuel Roberts: Let me share… Here… So, this is basically… How a transcript comes in.

84 00:08:48.830 00:08:59.080 Samuel Roberts: I don’t actually know when this one’s from, because I’m just kind of asking the API for random ones, but we can do filtering by date and stuff, but you can see, like, each word is a separate, kind of.

85 00:08:59.730 00:09:00.560 Uttam Kumaran: Oh, okay.

86 00:09:00.670 00:09:03.880 Samuel Roberts: confidence level, so what I have to do is process all that.

87 00:09:04.080 00:09:07.279 Samuel Roberts: And give us this transcript text here.

88 00:09:07.480 00:09:12.549 Samuel Roberts: Where it combines, because if you actually look, it does each word, and then each of these…

89 00:09:12.850 00:09:15.410 Samuel Roberts: Events is a different…

90 00:09:15.960 00:09:22.190 Samuel Roberts: person talking. Okay. And so, what I’ve done is a little function that can… I’ll put it into this form.

91 00:09:22.640 00:09:29.349 Samuel Roberts: Nice. So, it’s actually back and forth, at least. It’s a little weird with timing, because of the way people can talk over each other and things, but…
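
[A minimal sketch of the word-merging step Sam describes: the payload arrives as one entry per word, each tagged with a speaker ID and confidence, and a small function collapses those into back-and-forth speaker turns. The field names (`speaker_id`, `word`, `confidence`) are assumptions for illustration, not the actual 8x8 schema.]

```python
def merge_words_into_turns(words):
    """Collapse word-level ASR entries into consecutive speaker turns.

    Each entry in `words` is assumed to look like:
    {"speaker_id": "u1", "word": "hello", "confidence": 0.9}
    """
    turns = []
    for w in words:
        if turns and turns[-1]["speaker"] == w["speaker_id"]:
            # Same speaker as the previous word: extend the current turn.
            turns[-1]["text"] += " " + w["word"]
            turns[-1]["confidences"].append(w["confidence"])
        else:
            # Speaker changed: start a new turn.
            turns.append({
                "speaker": w["speaker_id"],
                "text": w["word"],
                "confidences": [w["confidence"]],
            })
    # Summarize per-word confidence as a per-turn average.
    for t in turns:
        t["confidence"] = sum(t["confidences"]) / len(t["confidences"])
        del t["confidences"]
    return turns
```

[Overlapping speech, which Sam notes makes the timing "a little weird," would need start/end timestamps per word to handle properly; this sketch only uses speaker order.]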

92 00:09:29.350 00:09:36.759 Uttam Kumaran: Let’s look at just this. Can you, can you wrap this whole one? Just because I think line 9 is cut, but… okay, so this just looks like…

93 00:09:38.280 00:09:42.249 Samuel Roberts: Yeah, I don’t know what specifically was getting pulled, I wasn’t focusing too much on the content here, but…

94 00:09:42.250 00:09:44.759 Uttam Kumaran: Yeah, I just want to read through and just… so…

95 00:09:45.590 00:09:46.559 JanieceGarcia: This is…

96 00:09:48.290 00:09:48.940 Uttam Kumaran: What?

97 00:09:49.200 00:09:51.800 JanieceGarcia: It’s a mechanical one, it’s definitely not a CSR.

98 00:09:52.210 00:09:52.880 Uttam Kumaran: Okay.

99 00:09:53.050 00:09:54.669 JanieceGarcia: That’s mechanical projects.

100 00:09:55.510 00:09:56.390 Uttam Kumaran: Okay.

101 00:09:56.700 00:10:03.300 Samuel Roberts: Yeah, that’s the other… I just have access to, like, the transcript, like, I can filter by date and stuff, I’m not sure what’s getting…

102 00:10:04.090 00:10:06.520 Samuel Roberts: What’s… what’s… how it’s organized in 8x8 is.

103 00:10:06.520 00:10:13.570 Uttam Kumaran: On the batch, or on the API call, are you able to see, like, the person it’s associated with, or, like, do you have any IDs?

104 00:10:13.570 00:10:20.640 Samuel Roberts: that’s where I’m trying to sort out a little bit. There is a… an ID in here…

105 00:10:21.310 00:10:29.589 JanieceGarcia: I wonder, though, Sam, because you said that you’re trying to figure out, like, on how it’s sorted, how 8x8 is sorted, so we are in queues.

106 00:10:29.730 00:10:43.659 JanieceGarcia: So, I wonder if we could give you specific queues that the agents are in. So, like, our lawn queue, our HVAC queues, and not our outside queues that do more outbound stuff, because that would be exactly what.

107 00:10:43.660 00:10:51.579 Samuel Roberts: That might be good, yeah, because there’s… so what I found, and this was something that was a hiccup earlier, is that there’s, like, two different ways to access this data.

108 00:10:51.580 00:10:52.520 JanieceGarcia: I think…

109 00:10:52.520 00:11:00.880 Samuel Roberts: the first way I was accessing it sometimes has the transcripts, but I think that’s where the IDs might be where I can then properly fetch

110 00:11:01.410 00:11:05.360 Samuel Roberts: the right calls? Because right now, I’m just hitting an endpoint that is…

111 00:11:05.460 00:11:10.660 Samuel Roberts: kind of a bulk download endpoint, and so it’s… it’s basically, like, give me…

112 00:11:11.000 00:11:16.210 Samuel Roberts: 10 that fit this filter kind of thing. So if I can dial that in a little bit, that will give me

113 00:11:17.070 00:11:23.260 Samuel Roberts: better transcripts, probably. So yes, that would be helpful, some of that information, and then I can get in there and figure out

114 00:11:23.460 00:11:30.940 Samuel Roberts: you know, which IDs to actually search for. There’s a couple other things here that I think I can filter by besides date.

115 00:11:32.910 00:11:34.840 JanieceGarcia: I’m gonna take a note. Sorry.

116 00:11:34.840 00:11:36.110 Samuel Roberts: If it’s… yeah, no problem.

117 00:11:36.330 00:11:44.630 Samuel Roberts: Yeah, the issue I was saying before is that this user ID, I have not been able to figure out what exactly it’s tied to, and that might be because it’s in the other part of,

118 00:11:46.290 00:11:51.520 Samuel Roberts: of the database, but even there, there’s an agent ID, which is not the same, and I’m trying to…

119 00:11:51.960 00:11:58.769 Samuel Roberts: Map those together to see if there’s something in the middle, but… I think this is also…

120 00:11:58.960 00:12:03.290 Samuel Roberts: Yeah, this is also 2020, so I don’t even know. You know, I’m just hitting the endpoints right now, so…

121 00:12:06.340 00:12:19.239 Samuel Roberts: Mostly to prove that it was getting data, because the mechanics of… you have to, like, reach out to it, ask it for the data, and then it does a processing job, so you have to keep checking it, it’s not just a kind of immediate stream.

122 00:12:20.020 00:12:23.690 Samuel Roberts: That structure is in place now, and we can dial in the filters, so…
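
[The fetch mechanics Sam walks through — submit a request, let the server run a processing job, keep checking until it finishes — are a standard async-export pattern. A generic sketch, with the three API calls passed in as callables since the real 8x8 endpoints aren't shown here:]

```python
import time


def poll_until_done(submit_job, check_job, fetch_result,
                    poll_every=5.0, timeout=300.0):
    """Submit an async export job and poll until it completes.

    submit_job() -> job_id; check_job(job_id) -> "pending"/"done"/"failed";
    fetch_result(job_id) -> the finished payload. These callables stand in
    for the real API requests, which kick off a server-side processing job
    rather than streaming data back immediately.
    """
    job_id = submit_job()
    waited = 0.0
    while waited < timeout:
        status = check_job(job_id)
        if status == "done":
            return fetch_result(job_id)
        if status == "failed":
            raise RuntimeError(f"job {job_id} failed")
        time.sleep(poll_every)
        waited += poll_every
    raise TimeoutError(f"job {job_id} not done after {timeout}s")
```

[A shared rate limit, like the one Sam mentions, would argue for a generous `poll_every` so the checks themselves don't eat into the quota.]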

123 00:12:23.690 00:12:24.230 Uttam Kumaran: Okay.

124 00:12:24.710 00:12:26.359 Samuel Roberts: the queues would be helpful, I think, for that.

125 00:12:27.110 00:12:27.860 Uttam Kumaran: Okay.

126 00:12:29.910 00:12:32.969 Uttam Kumaran: Okay, cool, we can go back to slides, yeah, that’s great.

127 00:12:40.000 00:12:44.860 Amber Lin: Awesome, so let’s look at… The next one, so…

128 00:12:44.860 00:13:02.450 Amber Lin: This is the main update that we have today. One of the main updates is the zip code. We were able to add the tech levels, and I wanted to show you what we’re now able to ask about, because that gives us a lot more dimensions, a lot more flexibility.

129 00:13:02.450 00:13:13.620 Amber Lin: So, for mechanical, we got a sheet that was about the tech levels that had additional details on what levels they are, what license they have, what calls they can take.

130 00:13:14.040 00:13:17.560 Amber Lin: As well as, if a certain tech

131 00:13:18.170 00:13:24.309 Amber Lin: can take apprentices, etc. So, now we are able to…

132 00:13:24.600 00:13:30.910 Amber Lin: For example, ask about: what are all the technicians at level B?

133 00:13:31.200 00:13:39.359 Amber Lin: Or we could ask about, who can take apprentices. I assume CSRs would…

134 00:13:39.800 00:13:53.409 Amber Lin: they won’t ask this type of question that much, but it could be helpful, maybe, for Janiece, if… or someone who wants to assign or see if there’s someone who can help their apprentice.

135 00:13:53.410 00:14:01.350 Amber Lin: And then we can ask by license type, because I know, depending on their license, they’re able to cover different things.

136 00:14:01.460 00:14:07.899 Amber Lin: And then, lastly, we can also ask about call type. I think this is the most common…

137 00:14:08.100 00:14:23.039 Amber Lin: specification that we get. We get, who can handle service calls, who can handle… give me a tech that can handle maintenance calls. So I think this would be very, very helpful for, based on the triage questions we’ve been getting. And lastly.

138 00:14:23.380 00:14:36.149 Amber Lin: We’ve only added this for mechanical and dispatch so far, and we will be adding this column for the other assignments as well. But currently, we only have it for mechanical.
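
[The lookups Amber demos — filter techs by level, license, call type, or whether they can take apprentices — amount to filtering rows on a few fields. A toy sketch; the field names and sample rows are invented for illustration, not the actual sheet schema:]

```python
# Hypothetical rows shaped like the tech-levels sheet Amber describes.
TECHS = [
    {"name": "Tech A", "level": "B", "license": "TACLA",
     "call_types": {"service", "maintenance"}, "takes_apprentices": True},
    {"name": "Tech B", "level": "A", "license": "TACLB",
     "call_types": {"service"}, "takes_apprentices": False},
]


def techs_where(**criteria):
    """Filter techs on exact-match fields; 'call_type' checks set membership."""
    out = []
    for tech in criteria and TECHS:
        matches = True
        for key, value in criteria.items():
            if key == "call_type":
                matches = value in tech["call_types"]
            else:
                matches = tech.get(key) == value
            if not matches:
                break
        if matches:
            out.append(tech)
    return out
```

[In practice this sits behind the natural-language layer — "give me a tech that can handle maintenance calls" reduces to `techs_where(call_type="maintenance")`.]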

139 00:14:41.620 00:14:46.470 Amber Lin: Cool. Janice, who do you think would be the best

140 00:14:46.610 00:14:51.970 Amber Lin: person to test these things? Should we let Mechanical know and start testing and see?

141 00:14:52.380 00:14:53.510 Amber Lin: What their feedback is.

142 00:14:54.230 00:14:57.620 JanieceGarcia: I was just fixing to let Tara know.

143 00:14:57.620 00:14:58.090 Amber Lin: Perfect.

144 00:14:58.090 00:15:02.720 JanieceGarcia: Once it’s open, I would definitely let the entire team have at it for testing.

145 00:15:03.040 00:15:15.499 Amber Lin: Okay, I do think this is open for everyone. I was able to test it earlier as well, so this should be on the live version already, because we did some internal testing. So I’ll let them know and see what they say.

146 00:15:15.920 00:15:21.919 JanieceGarcia: Just send it in the mechanical chat with them, and let them know that there’ve been some awesome updates.

147 00:15:21.920 00:15:22.610 Amber Lin: Okay.

148 00:15:22.800 00:15:23.360 JanieceGarcia: Cool.

149 00:15:24.600 00:15:25.220 Amber Lin: Cool.

150 00:15:25.690 00:15:34.960 Amber Lin: So, now I want us to… want us to talk about the central doc overhaul, and the different steps we want to take.

151 00:15:35.170 00:15:39.709 Amber Lin: Uttam, do you want to take over here and talk about what we will plan to do?

152 00:15:39.710 00:15:50.339 Uttam Kumaran: Yeah, so we met internally, last week, or maybe earlier this week, to talk a little bit about, like, ways to improve

153 00:15:50.470 00:16:00.770 Uttam Kumaran: sort of the central doc updating process, and it was, like, it was a really, really good discussion. I think a couple things that we identified, is there’s…

154 00:16:00.940 00:16:01.870 Uttam Kumaran: there’s…

155 00:16:01.960 00:16:17.760 Uttam Kumaran: things around the creation of the Central Doc, and the updates of the central doc. So on the document creation side, right, so when Amber is going out into, you know, new teams, she’s gathering all their sources in Google Drive, listing all of them, putting them all into a Google Doc.

156 00:16:17.810 00:16:23.670 Uttam Kumaran: Then, you have to kind of create these sections, reordering, doing the headers, and then finally.

157 00:16:23.740 00:16:39.310 Uttam Kumaran: after, you know, updating, we then kind of get that into AI, and then over time, we have to continue to consolidate to find things that are written in two places, maybe they’re two things about the same thing written differently, right? And improving the wording, cutting out fluff.

158 00:16:39.350 00:16:54.700 Uttam Kumaran: And so right now, we’re using AI for doing some of the reformatting. We’re using AI to sort of update some of the words. We’re using AI to help us kind of think through the structure, but not really beyond that. And so what is still manual?

159 00:16:54.700 00:17:07.210 Uttam Kumaran: really manual is the gathering of all those documents, copying and pasting it in, but actually more than that, it’s actually finding these duplicates and errors, and proposing, like, what the changes should be. So…

160 00:17:07.359 00:17:10.990 Uttam Kumaran: A couple of ideas that, you know, we…

161 00:17:11.220 00:17:14.030 Uttam Kumaran: we sort of broke down, and I’ll actually share…

162 00:17:14.579 00:17:31.939 Uttam Kumaran: like, kind of this… this list, and… because I’ll… it’ll be helpful to kind of see it this way. We actually went through and are trying to identify, like, different ways in which we can, affect, like, various pieces of this sort of central doc issue. So…

163 00:17:32.260 00:17:46.249 Uttam Kumaran: we broke it down into sections, so we have, like, document analysis and quality, we have feedback on, like, queries that are coming in, stuff around how we’re getting information from the document, as well as, like, content, maintenance, and so…

164 00:17:46.250 00:17:53.680 Uttam Kumaran: A couple things that we really pulled out, is one, we need to have some type of

165 00:17:55.780 00:18:10.199 Uttam Kumaran: we need to somehow… some type of, like, timeline on which we update and check the AI… the central doc. And so, I think every day is a bit much, but every two weeks or every month, we’re able to, like, look at any new information added, and then sort of, like.

166 00:18:10.340 00:18:28.160 Uttam Kumaran: then be like, okay, here’s the point at which we make changes. So, this is probably just, like, a meeting, but a couple things that we want to look for. We want to look for duplicate content, we want to look for contradictory information, and we want to look for things that are ambiguous, right? So, language where it’s like, maybe do this, or sometimes do this, like.

167 00:18:28.160 00:18:40.999 Uttam Kumaran: the AI is just gonna say that, and it’s gonna be as ambiguous as what’s in there. So, these are 3 things that I think are very… gonna be very easy for us to build some AI prompts around identifying. And so, during that…

168 00:18:41.030 00:18:51.210 Uttam Kumaran: you know, AI doc audit, we can run these, and then it’ll flag to us, hey, there’s duplicate information here, let’s just consolidate, right? And so,

169 00:18:51.570 00:19:08.130 Uttam Kumaran: that’s, like, sort of what… what this… this quality piece is. Another thing that we wanted to look at, on the content side was, auto-formatting. And so, what we’re gonna… what we know in the document is there’s just, like, a variety of different types of data, so…

170 00:19:08.130 00:19:08.760 Uttam Kumaran: You have…

171 00:19:08.760 00:19:31.289 Uttam Kumaran: typical SOPs, you have, like, literal quotes of, like, line by line what should be said, right? And so, we just want to make sure that they’re always formatted in the right way. So that’s one piece. Second is, like, there’s some probably, like, word optimization. So, we… we… there’s probably both fluff and just, like, using the same grade

172 00:19:31.290 00:19:35.290 Uttam Kumaran: level of wording across the whole board that we’re gonna…

173 00:19:35.290 00:19:38.670 Uttam Kumaran: you know, try to… try to push forward.

174 00:19:38.780 00:19:48.900 Uttam Kumaran: And so some of these we’re gonna be able to run manually, but then over time, we’re going to want this to be able to just kind of run a little bit automatically. And really, it’s this policy adherence

175 00:19:49.220 00:20:02.080 Uttam Kumaran: thing that I… what I want to talk about is, like, our team needs to think about what are the guidelines for new information and existing information, so that when you run AI over the doc, it can find things that are out of policy.

176 00:20:02.150 00:20:17.409 Uttam Kumaran: And so that’s how we’re gonna actually maintain that not only does this doc get lower in length, but that we’re… we are limiting the amount of errors that get added so that they’re not immediately surfacing to the CSRs.
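
[The three audit checks Uttam lists — duplicate content, contradictory information, and ambiguous language — are described as AI-prompt jobs. The ambiguity check in particular could be prototyped as a trivial heuristic pre-pass before involving a model at all; a sketch, with the word list being an assumption about what counts as "hedgy" wording:]

```python
import re

# Words that tend to signal the "maybe do this, or sometimes do this"
# language Uttam flags as ambiguous. Illustrative list, not exhaustive.
AMBIGUOUS = re.compile(
    r"\b(maybe|sometimes|usually|might|typically|probably)\b", re.IGNORECASE
)


def flag_ambiguous_lines(doc_text):
    """Return (line_number, line) pairs that contain hedgy wording,
    so a reviewer can decide whether each one needs a firmer rule."""
    return [
        (i, line)
        for i, line in enumerate(doc_text.splitlines(), start=1)
        if AMBIGUOUS.search(line)
    ]
```

[Duplicate and contradiction detection genuinely need semantic comparison, so those two stay as prompt-based checks in the bi-weekly audit; this heuristic only narrows down what the reviewer (or the prompt) has to look at.]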

177 00:20:17.410 00:20:28.109 Uttam Kumaran: We have a couple more things about, you know, these are a little bit specific about how we’re doing sectioning and things like that, but that’s a lot around the content itself.

178 00:20:28.110 00:20:40.969 Uttam Kumaran: And then, really, on the… there’s kind of two other pieces I want to talk about. So, one is, like, understanding the queries that are coming in, right? So, one is, like, we need to create a report around unanswered questions.

179 00:20:40.970 00:20:50.310 Uttam Kumaran: So what are questions that are being… that CSRs are asking, and the AI is not able to answer, right? And so just seeing that report, and continuously during

180 00:20:50.700 00:20:55.409 Uttam Kumaran: Sort of our bi-weekly audit, looking at that report and making sure that information gets in.

181 00:20:55.520 00:21:10.679 Uttam Kumaran: Other thing is looking at the top, the longest queries, so the things that really take the longest, and identifying, like, is that something that we can solve through, you know, central doc changes? And then the last two pieces are looking at thumbs down, and

182 00:21:11.090 00:21:21.529 Uttam Kumaran: categorizing the queries, right? So being able to look at what is actually getting the highest number of questions, what types of content, you know?

183 00:21:21.530 00:21:34.600 Uttam Kumaran: And that goes to my next piece, is like, based on how people are accessing, we can actually optimize the way in which Andy answers. So, for example, right now, we’re treating every part of that doc equally.

184 00:21:34.640 00:21:35.340 Uttam Kumaran: But…

185 00:21:35.410 00:21:53.869 Uttam Kumaran: I’m guessing, and what we’re probably right on, is most of the questions are, like, a certain… accessing just a certain piece, right? And so, we want to surface that faster, in a couple ways. One, maybe, again, there’s ways in the database to make this just, like, access, first, so it’s, like, some sort of ranking.

186 00:21:53.960 00:22:03.939 Uttam Kumaran: And there’s also caching. So, like, if people are always asking for the same thing, Andy can actually just cache that response and give it back really, really fast, versus…

187 00:22:04.010 00:22:14.259 Uttam Kumaran: Having to… for the same question, having to go all the way back. So Google does a very similar thing, right? So if you Google something, and you just Google it again, it’s not… it doesn’t take the same amount of time.
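
[The caching idea Uttam describes — repeated questions get the stored answer instead of going "all the way back" through the doc — can be sketched in a few lines. The normalization here is deliberately naive (lowercase, strip punctuation); a real version would need something fuzzier so that rephrasings of the same question hit the same cache entry:]

```python
import re


class AnswerCache:
    """Serve repeated questions from a cache instead of re-querying."""

    def __init__(self):
        self._store = {}

    @staticmethod
    def _normalize(question):
        # Naive key: lowercase, drop punctuation, trim whitespace.
        return re.sub(r"[^a-z0-9 ]", "", question.lower()).strip()

    def get_or_compute(self, question, answer_fn):
        key = self._normalize(question)
        if key not in self._store:
            # Slow path: actually run the retrieval/answering step.
            self._store[key] = answer_fn(question)
        return self._store[key]
```

[This is the same trade Google makes with repeated searches: the second ask is cheap because the work is already done. Cache entries would need invalidating whenever the central doc changes, or stale answers would outlive a policy update.]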

188 00:22:14.430 00:22:17.960 Uttam Kumaran: And then we have a couple of other things on right, like…

189 00:22:18.360 00:22:23.030 Uttam Kumaran: You know, if you tend to ask one question, The next question you ask.

190 00:22:23.040 00:22:31.789 Uttam Kumaran: Can we, like, predict that a little bit? So there’s some more advanced things that we’re thinking about, but these are sort of the list of… of…

191 00:22:31.800 00:22:44.049 Uttam Kumaran: things that we’re gonna start working on. We have some movement on this today, on building out some of this reporting, but really, I think we’re gonna do a first pass of, like, these.

192 00:22:44.090 00:22:46.270 Uttam Kumaran: Manually, and then…

193 00:22:46.610 00:22:52.360 Uttam Kumaran: See, like, okay, how much of that worked, and then start to say, okay, let’s see if we can run this on a…

194 00:22:52.560 00:23:05.270 Uttam Kumaran: bi-weekly basis. My goal is really, one, to see the length of the doc go down, to see the answers, of course, like, be more standardized, like, and predictable, and then for us to

195 00:23:05.300 00:23:18.800 Uttam Kumaran: for the next department, for the speed at which we’re creating the central doc to go up, right? So the impetus for this was Amber was like, hey, we have to create this, but it’s just so much work, like, how are we going to, like, have the capacity to do so?

196 00:23:18.900 00:23:28.429 Uttam Kumaran: And I’m like, okay, well, we need to pause and actually see, now that we’ve done it a few times, let’s look to see how we can make it better. But now that we have to maintain this doc.

197 00:23:28.530 00:23:39.680 Uttam Kumaran: Right? And so we… we can’t… if we… if we spend all the time maintaining the doc, then there’s no time for net new development. And so, these are the things that we’re gonna, you know, be working on this month to…

198 00:23:40.130 00:23:41.810 Uttam Kumaran: To make those changes, so…

199 00:23:44.880 00:23:52.140 JanieceGarcia: I’m excited about it. I mean, my… my biggest thing is, and I know, I see Amber’s do… I mean, we spent hours and hours.

200 00:23:52.140 00:23:53.050 Uttam Kumaran: Yes.

201 00:23:53.050 00:23:58.369 JanieceGarcia: Creating the doc, and then even going back in and trying to maintain it, it’s…

202 00:23:59.360 00:24:04.860 JanieceGarcia: if I go and I input something, I… okay, am I putting it in the right spot? Am I… Yes.

203 00:24:05.110 00:24:11.290 JanieceGarcia: formatting it correctly, because it’s not coming up in Andy, you know, and it’s like, how…

204 00:24:11.480 00:24:13.689 Uttam Kumaran: So, first thing to nail is that, like.

205 00:24:14.260 00:24:19.819 Uttam Kumaran: We want you to get to put it in there, and then we’ll make sure that the system puts it in the right place.

206 00:24:19.950 00:24:25.130 Uttam Kumaran: Then, eventually, I think we’ll think about, okay, is there a way for you to actually input it.

207 00:24:25.290 00:24:34.089 Uttam Kumaran: And we catch it on the input side, like, maybe you go through a chatbot to add things, and you’re actually not writing directly in a doc. So those are, like, some of the ideas.

208 00:24:34.210 00:24:40.770 Uttam Kumaran: But if you… even if you just get it in there, and then on some basis we clean it all up, put it in the right place.

209 00:24:41.110 00:24:45.459 Uttam Kumaran: And you can trust that that’s happening, it’s a lot better than it is now.

210 00:24:45.800 00:24:50.359 JanieceGarcia: Right, right, yeah. Because it is, I mean, and I… and I see it, and I know…

211 00:24:50.700 00:24:55.650 JanieceGarcia: what Amber’s talking about, and I know her and I probably feel the same.

212 00:24:55.650 00:24:56.400 Uttam Kumaran: Yes.

213 00:24:56.400 00:25:02.149 JanieceGarcia: Struggles, but it is very time-consuming right now with how we’re maintaining it.

214 00:25:02.310 00:25:03.800 Uttam Kumaran: Okay, okay, cool.

215 00:25:05.580 00:25:10.390 JanieceGarcia: So I’m excited. Yes, bring it on. Let me know what you need from me.

216 00:25:10.390 00:25:10.730 Uttam Kumaran: Definitely.

217 00:25:12.460 00:25:13.830 JanieceGarcia: I will, I will.

218 00:25:17.290 00:25:36.470 Amber Lin: Well, that’s all the updates I have. And then on the discovery side, I know you guys are still discussing there. Any follow-ups, any action items we need from, the ABC team, or is it all the action items on our side?

219 00:25:36.780 00:25:52.470 Uttam Kumaran: I think it’s all on our side, I think, yeah, Janiece, we can just get a recording of you going through the admin portal, and then we’re working on some of these additional reports, so I think when we meet with Yvette next, we can share… I know, Amber, we’re talking about some of those in Slack today.

220 00:25:52.790 00:25:56.890 Uttam Kumaran: And then hopefully by next week, we’ll have, some more

221 00:25:57.260 00:25:59.460 Uttam Kumaran: You know, info on the transcripts.

222 00:25:59.650 00:26:03.079 Uttam Kumaran: You know, we’re actively talking to Tim about resolving some of those.

223 00:26:03.210 00:26:12.609 Uttam Kumaran: And then we will have done probably one or two passes of cleaning up the existing central doc with some of these ideas that we have. So we’ll show some of the results of that, yeah.

224 00:26:12.610 00:26:24.220 JanieceGarcia: Okay, I’ll go ahead… I actually will do… I have it already set up. I’ll do, the recording now for those two that I’m going to remove from the system. And then I did want to ask…

225 00:26:24.570 00:26:35.809 JanieceGarcia: on the central doc, because I’ve told Amber, I’ve waited to put this information in there. It’s not crazy, oh my gosh, policies, you know, that are pressing, it’s just…

226 00:26:36.160 00:26:41.369 JanieceGarcia: one-off questions that come around, but do you want me to add those to the central doc right now with y’all?

227 00:26:41.370 00:26:47.519 Uttam Kumaran: Yeah, yeah, yeah, yeah, don’t worry about our… yeah, don’t worry about our process. We’re gonna make a copy and, like, be testing a bunch of stuff, so…

228 00:26:47.520 00:26:53.709 JanieceGarcia: Okay. You consider it, like, business as usual for now. Yeah, we’ll work around anything, so…

229 00:26:53.710 00:27:02.250 Uttam Kumaran: We’re just gonna be testing what it’s like to make these edits and how that affects things, and then we’ll… we’ll inform anyone before we’re changing anything in there, officially.

230 00:27:02.250 00:27:06.309 JanieceGarcia: Okay, perfect. Alright, well, yeah, let me know what I can do to help.

231 00:27:06.620 00:27:07.380 Uttam Kumaran: Okay.

232 00:27:07.380 00:27:10.800 JanieceGarcia: Please. Okay. Cool beans. Well, that’s all I had, Steven.

233 00:27:10.800 00:27:14.129 Samuel Roberts: Any information about, like, the queues, potentially?

234 00:27:14.130 00:27:22.559 JanieceGarcia: I’m gonna ask Yvette, so I already have my email set up, so her reply will come back to you guys, because I have y’all copied on the email, so…

235 00:27:23.000 00:27:23.720 Samuel Roberts: Nice.

236 00:27:29.400 00:27:29.780 Steven: Cool.

237 00:27:29.780 00:27:30.380 Amber Lin: Hmm.

238 00:27:30.380 00:27:33.210 Steven: Yep, whatever y’all do, just make Janiece happy, and we’re all happy.

239 00:27:35.420 00:27:38.819 Uttam Kumaran: Janiece is usually happy, I feel like, so it’s hard to understand.

240 00:27:39.150 00:27:43.009 Uttam Kumaran: If we’re just… if we’re actually doing a great job, or if you’re just a nice person, so…

241 00:27:43.310 00:27:47.640 JanieceGarcia: I am always happy. We have to be happy. If we’re not, then…

242 00:27:47.640 00:27:52.580 Uttam Kumaran: I’m the same way. I just, like, smile through it, figure it out, get to the next thing.

243 00:27:52.580 00:27:53.539 JanieceGarcia: Figure it out.

244 00:27:53.540 00:27:54.320 Uttam Kumaran: Yeah.

245 00:27:55.470 00:27:57.680 Uttam Kumaran: Okay, amazing. Thank you, everybody.

246 00:27:58.090 00:27:58.800 Uttam Kumaran: Thank you.

247 00:27:58.800 00:28:00.930 Steven: Appreciate it. Thank you all. Bye.