Meeting Title: Brainforge x ABC Home and Commercial: Weekly Project Check
Date: 2025-05-08
Meeting participants: Uttam Kumaran, Amber Lin, Scott Harmon, Janiece Garcia, Matt Burns


WEBVTT

1 00:00:12.800 00:00:13.680 JanieceGarcia: Hey! Amber.

2 00:00:13.680 00:00:14.610 Amber Lin: Hi.

3 00:00:14.610 00:00:15.860 JanieceGarcia: Long time, no see.

4 00:00:15.860 00:00:18.730 Amber Lin: I know I missed you. It’s so.

5 00:00:18.730 00:00:25.880 JanieceGarcia: Real. I know I’m like I haven’t talked to you all week. I’ve been in basic week with new hire, with the new hire. So.

6 00:00:25.880 00:00:29.750 Amber Lin: Oh, wow! Wait! Did we get even more new hires?

7 00:00:29.750 00:00:33.860 JanieceGarcia: Just one, just one who I’ve been working with her this week.

8 00:00:34.950 00:00:35.740 Amber Lin: Wow!

9 00:00:35.740 00:00:42.640 JanieceGarcia: So literally from like 7 to 9. I’m like trying to cram as much stuff as I can, because I’m with her from 9 to 4.

10 00:00:43.470 00:00:44.310 Amber Lin: Wow!

11 00:00:44.650 00:00:47.409 JanieceGarcia: And then I have my regular meetings so

12 00:00:48.440 00:00:51.850 JanieceGarcia: like I haven’t even been able to be with her the entire time.

13 00:00:53.170 00:00:57.800 Amber Lin: And I was off yesterday. So that’s good.

14 00:00:59.840 00:01:00.860 Amber Lin: Hi, Uttam!

15 00:01:01.170 00:01:02.290 Uttam Kumaran: Hello!

16 00:01:02.830 00:01:03.860 Uttam Kumaran: I’m back!

17 00:01:04.019 00:01:11.019 Amber Lin: We’re waiting for. I think Steven won’t be able to make it today. I think we’re waiting for Yvette and Pat.

18 00:01:11.530 00:01:12.140 Uttam Kumaran: Okay.

19 00:01:12.430 00:01:13.170 Amber Lin: Yeah.

20 00:01:13.320 00:01:16.139 JanieceGarcia: We’ll see what Yvette says. Yvette’s chatting me right now.

21 00:01:16.450 00:01:17.030 Amber Lin: Hmm.

22 00:01:40.422 00:01:42.540 JanieceGarcia: She hasn’t said anything yet. But

23 00:01:47.700 00:01:52.770 JanieceGarcia: she’s not able to make the meeting at the same time.

24 00:01:53.150 00:01:59.910 JanieceGarcia: She said that Holt called a meeting and scheduled it at the same time, right before you rescheduled this one.

25 00:02:01.400 00:02:03.350 JanieceGarcia: So, but myself and Matt are in here.

26 00:02:04.400 00:02:04.870 MattBurns: There!

27 00:02:06.400 00:02:08.530 Amber Lin: That’s all good. I’ll.

28 00:02:08.539 00:02:09.199 JanieceGarcia: I’m not.

29 00:02:09.740 00:02:10.500 MattBurns: Bye, guys.

30 00:02:10.910 00:02:11.700 Uttam Kumaran: Hello!

31 00:02:14.810 00:02:17.560 MattBurns: Think Yvette’s gonna join us, is she not? Or Janice, or.

32 00:02:17.560 00:02:22.370 JanieceGarcia: She, she said. Holt called a meeting Matt for the routing stuff.

33 00:02:22.370 00:02:22.690 MattBurns: Okay.

34 00:02:24.000 00:02:27.970 MattBurns: Okay. Well, we'll proceed. How are you, Scott?

35 00:02:28.460 00:02:29.999 Scott Harmon: Good! How are you doing.

36 00:02:32.330 00:02:33.140 JanieceGarcia: Good.

37 00:02:33.380 00:02:35.088 JanieceGarcia: You’re kind of quiet, though.

38 00:02:36.470 00:02:37.709 Scott Harmon: Can you hear me? Okay.

39 00:02:38.400 00:02:39.290 MattBurns: Little bit.

40 00:02:39.290 00:02:40.610 Amber Lin: Hello!

41 00:02:41.050 00:02:44.660 Scott Harmon: Sorry, I'll just kind of listen in and not talk most of the time.

42 00:02:47.110 00:02:50.419 Amber Lin: Scott, I think it's your AirPods. I cannot hear you.

43 00:02:56.110 00:02:58.330 Uttam Kumaran: I think I caught it; he said he's just gonna listen in.

44 00:02:58.890 00:02:59.450 Amber Lin: Alrighty.

45 00:02:59.640 00:03:00.370 Amber Lin: Yeah.

46 00:03:00.370 00:03:07.810 Amber Lin: Let me get started. Exciting stuff today. We'll keep it brief, but I want to share this with everybody.

47 00:03:07.970 00:03:12.600 Amber Lin: Yeah. So let me start this presentation.

48 00:03:13.800 00:03:18.140 JanieceGarcia: I’m taking notes for Yvette. So if I’m not looking at you guys, that’s why.

49 00:03:18.475 00:03:28.540 Amber Lin: Okay, and I’ll I can update her as well. So everybody has access to this presentation. And hopefully, I’ll get to talk to her on Monday, as well.

50 00:03:28.990 00:03:33.259 JanieceGarcia: She actually just said, Please let Amber know she’s gonna reach out to you tomorrow.

51 00:03:34.390 00:03:39.719 Amber Lin: Awesome, very excited. I haven’t also haven’t talked to her in a bit. Miss her as well.

52 00:03:39.860 00:03:43.640 Amber Lin: Oh, wrong place, slideshow.

53 00:03:44.770 00:03:47.110 Amber Lin: Okay. So

54 00:03:47.300 00:03:55.409 Amber Lin: Unusual meeting time on Thursday. Hopefully moving forward we'll keep this meeting time, because Friday does get very, very busy for everybody.

55 00:03:55.770 00:04:04.909 Amber Lin: And so today I kind of wanted to focus on the trainer bot. So the trainer bot relates to how we can structure

56 00:04:05.120 00:04:14.410 Amber Lin: the foundation of the knowledge system, because ultimately all of the problems, for us and for the CSRs as well, kind of came from that:

57 00:04:14.570 00:04:37.410 Amber Lin: accessing knowledge was hard, and for the trainers, updating knowledge was hard, and that affects the business performance. So we spent a lot of time this week, with the team and with Scott, and went through a lot of research as well, to see, okay, what is the

58 00:04:37.490 00:04:46.589 Amber Lin: fundamental solution? And how can we sort of enable that as we’re creating the trainer bot. Because this, I think, is a great opportunity

59 00:04:47.090 00:04:49.379 Amber Lin: to solve some of the problems.

60 00:04:49.910 00:04:52.619 Amber Lin: And so the goal that

61 00:04:52.830 00:05:05.190 Amber Lin: the ultimate goal that we keep in mind is okay. We want to set up a system that will make support, training and documentation reliable and scalable.

62 00:05:06.040 00:05:11.060 Amber Lin: And so, taking a look on the right. Here is

63 00:05:11.170 00:05:15.140 Amber Lin: the current central Doc table of contents.

64 00:05:15.500 00:05:18.249 Amber Lin: So just by looking at this.

65 00:05:18.530 00:05:23.589 Amber Lin: I don’t think we can say, Okay, what are we missing? So

66 00:05:23.760 00:05:27.589 Amber Lin: Using this, we can't really predict

67 00:05:27.940 00:05:46.859 Amber Lin: what areas of knowledge we're missing for the CSRs, so we don't know when a CSR will say, Hey, I need this. So we're very reactive right now, and that means there's a lot of firefighting, and we all know that doesn't feel very good.

68 00:05:47.820 00:05:59.160 Amber Lin: And so that's the 1st problem. And the second problem that we've identified is that, looking at singular guides, right? Every single document,

69 00:05:59.830 00:06:14.750 Amber Lin: just by looking at it, we wouldn't know if we've covered everything. So each time we go in and update, we'll think, okay, we're good for at least a few months, until something new comes up. But then

70 00:06:15.100 00:06:21.370 Amber Lin: maybe the next day, the CSRs will say, Hey, I actually needed something else.

71 00:06:21.920 00:06:22.900 Amber Lin: So

72 00:06:23.270 00:06:34.030 Amber Lin: we don't want to keep doing updates that need patching in the future, because that will take up so much of our trainers' very, very valuable time.

73 00:06:34.450 00:06:38.969 Amber Lin: So these are the 2 problems that we’ve identified.

74 00:06:41.200 00:06:45.870 Amber Lin: And a reason why we want to do this is

75 00:06:46.860 00:06:54.869 Amber Lin: the quality of the knowledge that we put in is going to be the quality of the bot for the CSRs.

76 00:06:55.100 00:07:05.880 Amber Lin: And as we've been hearing all the past few weeks: why isn't the usage going up? Why aren't they using it more? We know there's benefits to it. It's because

77 00:07:06.090 00:07:08.419 Amber Lin: there's errors,

78 00:07:08.600 00:07:26.139 Amber Lin: and they're not getting the quality response that they need. And therefore, why would I use something that doesn't solve my issue in the moment? Right? So we've got to make sure that we have good quality knowledge.

79 00:07:26.420 00:07:32.889 Amber Lin: And that’s 1 thing about how we fill out each guide. And then on the bigger level of

80 00:07:33.080 00:07:53.100 Amber Lin: how do we make sure that things are complete? How do we save our trainers time? We need structure in the guides and also in the knowledge base, so we not only make sure each document is complete, we also make sure that we cover everything in our knowledge bases. So this will save us a lot of time in the future.

81 00:07:56.090 00:08:01.590 Amber Lin: And so a solution that we've been talking about, and sort of what we

82 00:08:01.770 00:08:15.879 Amber Lin: intended from the start, is to consolidate it and incorporate it as we develop the trainer bot: we want to have clear guidelines on how to structure the knowledge base so that, one,

83 00:08:16.050 00:08:18.690 Amber Lin: we can rework our central doc,

84 00:08:19.436 00:08:29.440 Amber Lin: and, two, set guardrails for the future. When we add knowledge in the future, we want it to be complete and reliable.

85 00:08:29.780 00:08:48.999 Amber Lin: So this will help our trainer bot come up with better answers. It will help the daily training, help with auditing knowledge, onboarding, and ensuring quality, and most importantly, save us from firefighting, because that is a huge component and a huge stress factor.

86 00:08:51.560 00:09:02.650 Amber Lin: And so we looked specifically at our current system, and we looked at, okay, what are the problems we have, what are the solutions, and where are we at in solving each problem.

87 00:09:02.830 00:09:06.340 Amber Lin: So for the ones I highlighted in green right here,

88 00:09:06.650 00:09:18.829 Amber Lin: we've already created a solution in the past few weeks. Right? We want to have bot-friendly formatting: checked. We want to make sure that we have complete

89 00:09:19.060 00:09:32.410 Amber Lin: knowledge: each time we do updates, we want it to be complete. I'll show you guys this in a second; this is very exciting for me. And then a few more things that we want to bring up in this meeting are that

90 00:09:32.930 00:09:47.440 Amber Lin: the central doc right now, it's not that great. It's working, but we'd love to have more structure in it, so that it's even more usable and faster to navigate.

91 00:09:47.870 00:10:08.429 Amber Lin: And ultimately, as a long-term solution, we want to say, hey, how do we structure the knowledge base? How do we make sure that we cover everything? Our goal from the start is completion, so that we aim high and we aim bright.

92 00:10:08.860 00:10:16.310 Amber Lin: So quickly, this is something that I want to share, because it's very, very exciting for us.

93 00:10:16.440 00:10:21.850 Amber Lin: So we’ve been talking the past few weeks about okay, we want this trainer bot

94 00:10:22.340 00:10:26.570 Amber Lin: to ensure high quality input data.

95 00:10:26.710 00:10:34.570 Amber Lin: So we want to make sure that no missing pieces are left when we do update a document.

96 00:10:35.070 00:10:41.779 Amber Lin: And so here's an example. Say I looked at one of the errors, and we say, okay,

97 00:10:41.920 00:10:48.770 Amber Lin: we say we don't treat for sales. I don't think that's in the central doc yet. So if I type that into the trainer bot,

98 00:10:49.400 00:10:51.210 Amber Lin: it said,

99 00:10:51.740 00:11:02.439 Amber Lin: Okay, I got this. But I need more of this because you’re probably gonna come back to me about this in the future. So

100 00:11:02.760 00:11:13.589 Amber Lin: let's make sure: okay, what are the specific treatment characteristics? What is the context? Are there any exceptions? Is there a "why" behind what we're doing?

101 00:11:13.660 00:11:36.089 Amber Lin: How should the CSRs even talk about it? Like, is there a tone of voice we want them to adopt? And then, lastly, the internal procedures: how is that going to get done? So this is a preliminary framework for now. I want to confirm this with Janiece and Yvette in the future to improve on it. But I really do think this is going to help reduce the number of times that

102 00:11:36.170 00:11:37.779 Amber Lin: our trainers have to go back.
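[Editor's sketch] The completeness framework Amber describes above could be captured roughly like this. The section names and the interview logic are illustrative assumptions for this transcript, not the actual trainer bot implementation:

```python
# Sketch of the completeness check: the trainer bot keeps a fixed list of
# sections every knowledge entry should cover, and interviews the trainer
# until none are missing. Section names are illustrative, not ABC's real ones.

REQUIRED_SECTIONS = [
    "specifics",            # e.g. which treatments/services this covers
    "context",              # when the policy applies
    "exceptions",           # known edge cases
    "why",                  # the reasoning behind the policy
    "tone_of_voice",        # how CSRs should talk about it
    "internal_procedures",  # how the work actually gets done
]

def missing_sections(entry: dict) -> list:
    """Return the required sections the draft entry hasn't filled in yet."""
    return [s for s in REQUIRED_SECTIONS if not entry.get(s, "").strip()]

def next_interview_question(entry: dict):
    """The bot asks about the first gap, or returns None when complete."""
    gaps = missing_sections(entry)
    if not gaps:
        return None
    return f"Can you tell me more about the {gaps[0].replace('_', ' ')}?"
```

Each answer the trainer gives fills one section, so the loop of question and answer converges on a complete, structured entry instead of a partial one.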

103 00:11:39.280 00:11:40.580 JanieceGarcia: I’m excited for this.

104 00:11:41.420 00:11:42.710 Amber Lin: I’ve yeah. The.

105 00:11:44.090 00:11:51.380 Scott Harmon: Yeah, just to emphasize what Amber's saying, and the importance of this in a knowledge management

106 00:11:51.510 00:11:56.400 Scott Harmon: ecosystem, which is what we're trying to build here:

107 00:11:57.480 00:12:06.860 Scott Harmon: the problem is that the experts, the subject matter experts like you, Janiece, or the other people who've been around the company for a long time,

108 00:12:07.810 00:12:13.699 Scott Harmon: you know that new knowledge needs to be added, and you do your best to type it in in a comprehensive way.

109 00:12:13.880 00:12:15.989 Scott Harmon: But at the end of the day you’re not

110 00:12:16.300 00:12:20.489 Scott Harmon: a documentation expert, you know. You’re a pest services expert.

111 00:12:20.630 00:12:29.030 Scott Harmon: And so, you know, you may forget, when you're creating the knowledge, some additional information that's

112 00:12:29.839 00:12:38.029 Scott Harmon: called context, like when would this apply, you know, just the additional data that makes that information useful. So

113 00:12:38.480 00:12:43.570 Scott Harmon: this approach means that the trainer bot ends up interviewing you. Right? So

114 00:12:43.910 00:12:48.120 Scott Harmon: what Amber's showing you is, you may say, Hey, I need to create some new knowledge. It's about this service

115 00:12:48.440 00:12:50.440 Scott Harmon: change, or whatever the topic is

116 00:12:50.550 00:12:56.530 Scott Harmon: and the experience you'll have, and we need to test this with you, is it'll say, Great! That's terrific.

117 00:12:57.090 00:13:01.649 Scott Harmon: Let's get started. Okay, it'll start drafting it. And now,

118 00:13:01.770 00:13:10.880 Scott Harmon: tell me when this applies, it’ll ask you a series of questions to help make that knowledge document complete

119 00:13:11.130 00:13:15.029 Scott Harmon: and at the limit, you might find it a little irritating,

120 00:13:15.240 00:13:22.439 Scott Harmon: like, gosh! Why is this thing asking me so many questions? But the reason it's doing it is to have complete, structured

121 00:13:22.930 00:13:29.839 Scott Harmon: knowledge, so that when a CSR asks a question, it can really be expert at knowing

122 00:13:30.460 00:13:36.199 Scott Harmon: which bits of knowledge apply, and then how to serve those back to a CSR. So it's kind of the missing link

123 00:13:37.100 00:13:38.460 Scott Harmon: in the system.

124 00:13:38.570 00:13:42.180 Scott Harmon: And you know, we're all excited about this, as Amber mentioned.

125 00:13:42.440 00:13:46.869 Scott Harmon: We really want to get feedback on this kind of more interview, based approach

126 00:13:47.130 00:13:51.060 Scott Harmon: to make sure when it publishes knowledge, it's publishing thorough knowledge,

127 00:13:51.360 00:13:54.350 Scott Harmon: you know, every word, every phrase,

128 00:13:54.520 00:13:56.820 Scott Harmon: using words consistently.

129 00:13:57.260 00:14:02.140 Scott Harmon: You know, oftentimes you'll use one word in one document and a different word in another, and those things all kind of

130 00:14:02.850 00:14:05.299 Scott Harmon: tend to erode the knowledge quality.

131 00:14:05.690 00:14:11.409 Scott Harmon: And so the result is, you know, you have kind of one set of terminology and one set of criteria

132 00:14:11.620 00:14:14.409 Scott Harmon: and one set of service definitions, and there aren't

133 00:14:14.650 00:14:18.670 Scott Harmon: just these different views of reality. So,

134 00:14:18.920 00:14:22.650 Scott Harmon: anyway, I'm super excited about what we've got here. To me, this is kind of the heart of

135 00:14:23.010 00:14:25.549 Scott Harmon: building a real self-learning system,

136 00:14:25.850 00:14:31.090 Scott Harmon: and I feel like when we get this bit in place, we're going to be really thrilled with what we see. So.

137 00:14:31.520 00:14:59.749 JanieceGarcia: And I think, you know, going off of that too, Scott, it's definitely going to help. I mean, I have so much information already, just from the responses and answers we've gotten this week, because we spoke to Manuel on Monday. And getting some of this information, I'm already trying to put it into the central doc, but I'm like, is it going in there correctly? Is it gonna come out right on Andy? You know? So I'm very, very excited for this.

138 00:15:00.290 00:15:07.760 Scott Harmon: Right. The way I hope this works is that there still will be a central document. You still can read that document,

139 00:15:07.960 00:15:13.620 Scott Harmon: but your experience won’t be. Oh, I’ve updated a central document. Your experience will be

140 00:15:14.080 00:15:17.290 Scott Harmon: the trainer bot has interviewed me,

141 00:15:17.580 00:15:24.280 Scott Harmon: it has now really written a very nice summary of what I want in there, and it wrote it to the doc, like,

142 00:15:24.440 00:15:27.490 Scott Harmon: it writes to the document, and

143 00:15:27.620 00:15:32.159 Scott Harmon: you don't have to worry so much about it. I hope,

144 00:15:32.550 00:15:37.600 Scott Harmon: if my gut's right, you'll end up not looking at the document at all anymore after 6 months.

145 00:15:38.571 00:15:41.659 Scott Harmon: You know, you'll just use the

146 00:15:41.760 00:15:45.210 Scott Harmon: chat interfaces. So that's what we're trying to steer for.

147 00:15:45.490 00:15:46.669 JanieceGarcia: Awesome. Okay.

148 00:15:52.140 00:15:57.119 Uttam Kumaran: So, Amber, I think you'll talk about when they're gonna have access to this later, right?

149 00:15:57.540 00:16:23.359 Amber Lin: Yeah, totally. So this one, we have it in a demo space. We want to get it out to you and iterate as soon as possible. So, Janiece, actually, you can already use this; I'll send you the link later. But before we roll out to more trainers, I definitely want to do more detailed testing with you, to make sure, hey, before more people get access to this:

150 00:16:23.370 00:16:34.819 Amber Lin: Is this really what people want? Is there anything else that's missing? So I'll meet with you, and then we'll go through this. We'll take a few turns, and we'll see what we need to improve on.

151 00:16:35.020 00:16:41.329 Scott Harmon: So, Amber, do you plan to have? I'm sorry. So, Janiece will obviously be kind of our

152 00:16:41.680 00:16:48.740 Scott Harmon: key early tester. Do you want anybody else included in this phase one, or is Janiece gonna be our first?

153 00:16:49.340 00:16:53.020 JanieceGarcia: I would think Yvette too, for sure.

154 00:16:53.020 00:16:54.340 Amber Lin: Oh, definitely.

155 00:16:54.580 00:16:55.810 JanieceGarcia: Okay.

156 00:16:55.810 00:17:07.419 JanieceGarcia: And truthfully right now, until we get the trainers all lined out because they’re going through career progression within the next couple of months, and we’ll find out for sure who will be the trainers. So

157 00:17:07.619 00:17:15.699 JanieceGarcia: I think myself and Yvette will probably answer on this as well. But it would be myself and Yvette, probably, for the first while.

158 00:17:16.920 00:17:17.480 Scott Harmon: Great.

159 00:17:17.839 00:17:18.769 Amber Lin: Sounds good.

160 00:17:22.279 00:17:34.389 Amber Lin: And this is something that's also included in the trainer bot as we improved on it this week. So the trainer bot will also know what's best

161 00:17:34.549 00:17:45.569 Amber Lin: for an AI to digest. So we've incorporated the best ways to format a guideline, and it will automatically just format it in the way that Andy will understand.

162 00:17:45.669 00:17:48.675 Amber Lin: So this saves you the process of looking, hey,

163 00:17:49.459 00:18:00.419 Amber Lin: does this really work, or does it not really work? So in the process of creation, it will have made sure that this is an AI-appropriate document.

164 00:18:02.789 00:18:07.569 Amber Lin: and I wanted to also touch on

165 00:18:08.011 00:18:26.799 Amber Lin: how we will approach all of those floating documents we have right now. So right now, we have the central doc, but I know that there is still a Google Drive that has all these different documents floating around. And our goal for this is that we want people to have one source of truth,

166 00:18:26.889 00:18:52.169 Amber Lin: because if they reference different systems, and maybe they're not matched, or if you update one but not the other, then people are gonna be confused, and that's gonna create even more problems. So, Janiece, I definitely want to work with you to identify all of the floating documents and to make a very clear path for how and when we're gonna archive them and

167 00:18:52.169 00:19:01.709 Amber Lin: make people go to the central doc instead. So we'll go over this in a separate meeting, but I just want to put it out there.

168 00:19:06.459 00:19:16.689 Amber Lin: Back to the central doc, right? We talked about a few problems that we identified in the central doc that we can keep improving. So right now,

169 00:19:16.929 00:19:25.239 Amber Lin: there's not really organization in the central doc. It's kind of like a running train of thought. And

170 00:19:26.389 00:19:34.459 Amber Lin: so, one, we talked about overall organization, and, two, we talked about specifics for each

171 00:19:34.589 00:19:55.119 Amber Lin: document. There's also not really a structure to it. It kind of just says whatever it wants, because that's the way we've been thinking: we think, oh, we need this, oh, got to add that. And so it kind of just trails on, whether in the specific document or in the overall document. So that creates a problem sometimes, in that

172 00:19:55.909 00:20:22.209 Amber Lin: Andy doesn't always give the right content, because the bot thinks very structurally, whereas our human brain sometimes just goes on. So a way that we want to improve the problem of Andy not really surfacing information, even if it's in the central doc, is what we're trying to do here: we want to create structure in each document and in the

173 00:20:22.709 00:20:43.019 Amber Lin: whole system. So we want to have clean hierarchies, we want to have headings, we want to have specific categories. We want to fill in all the different information, as we said with the trainer bot; we want to have all the categories. And that's really going to be helpful for Andy to understand, because it speaks the same bot language, if I can put it that way.
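[Editor's sketch] To illustrate why the clean hierarchy and headings Amber describes help a bot like Andy: if every guide follows a consistent heading skeleton, a retrieval system can split it into labeled sections instead of one running train of thought. This is a hypothetical illustration, not how Andy actually ingests documents:

```python
import re

def split_by_headings(markdown: str) -> dict:
    """Map each '## Heading' in a guide to the body text beneath it.

    A doc with no headings yields nothing to index by topic, which is
    exactly the 'running train of thought' problem described above.
    """
    sections = {}
    current = None
    for line in markdown.splitlines():
        match = re.match(r"##\s+(.+)", line)
        if match:
            current = match.group(1).strip()
            sections[current] = []
        elif current is not None:
            sections[current].append(line)
    # Join each section's lines back into a clean block of text.
    return {heading: "\n".join(body).strip() for heading, body in sections.items()}
```

With that structure in place, the bot can retrieve just the "Exceptions" section of a guide rather than scanning one undifferentiated document.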

174 00:20:45.776 00:20:56.053 Amber Lin: We audited the current central doc, and this is kind of the knowledge structure that emerged. So there's

175 00:20:57.439 00:21:03.449 Amber Lin: mostly management parts. There’s parts about billing. There’s

176 00:21:03.559 00:21:07.189 Amber Lin: sections about the different programs. And there is like

177 00:21:07.279 00:21:27.769 Amber Lin: a customer part. So it's kind of all over the place; there's no overarching structure. But we identified that there are kind of 2 implicit knowledge bases present, and this is also a very common organization for customer service knowledge bases. So there's usually a

178 00:21:27.839 00:21:37.959 Amber Lin: customer-facing section, and there's an internal-facing section, and I will go over them in a bit more detail over here. But overall,

179 00:21:38.874 00:21:49.059 Amber Lin: for internal parts we have like operational procedures we have call scripts. We have billing workflows, technician guides, and audit policies.

180 00:21:49.189 00:22:01.719 Amber Lin: and the customer-facing side is where we found that sometimes it's a little bit fragmented and all over the place. I also took the time to go through the ABC website, specifically with the mindset of the customer:

181 00:22:01.899 00:22:25.029 Amber Lin: how am I going to get answers? And I found that you have to go through a lot of navigation to get to an answer, and you can't really search it up. So you end up having to call a CSR even with some of the most basic questions. But that's for another time. So for the customer-facing knowledge, we have FAQs, we have certain explanations and processes.

182 00:22:25.189 00:22:30.259 Amber Lin: But since we didn’t start with a baseline structure.

183 00:22:30.519 00:22:37.019 Amber Lin: there's a lot of gaps here and there, and honestly, right now we don't even know where the gaps are.

184 00:22:37.689 00:22:48.279 Amber Lin: So, pausing there, that was a lot of words that spilled out of my mouth. We want to identify the gaps and fill those gaps; that's ultimately what we're trying to do.

185 00:22:48.280 00:22:56.709 Uttam Kumaran: Yeah, when I look at this, I basically see the company structure. It's basically the operating system for the department, right? With each of these different pieces.

186 00:22:58.050 00:23:01.839 Uttam Kumaran: So I love like seeing it broken down this way.

187 00:23:03.350 00:23:05.880 Uttam Kumaran: And then we have the underlying information.

188 00:23:07.990 00:23:28.119 Amber Lin: Yeah. And I know this is somewhat of a distant goal, a stretch goal, for us to complete right now. But I'm very, very invested, because I've worked so long with you guys; I'm very invested in the well-being of our trainers and our CSRs, and ultimately

189 00:23:28.700 00:23:41.000 Amber Lin: in making even the bot retrieval process easier. So this is going to benefit the humans and the bots, because if we organize the knowledge, then you spend less time

190 00:23:41.260 00:23:52.810 Amber Lin: finding the information, which is why we even started this project in the 1st place. So even if this is a stretch, I want to put it out there. So there's a customer-facing part, right? There's a

191 00:23:53.583 00:24:06.579 Amber Lin: knowledge base for that, and sometimes it goes on the website as well. Sometimes CSRs use that to answer calls from the customer. And then there's the internal-facing part,

192 00:24:06.680 00:24:28.937 Amber Lin: for a lot of issues that you might want to troubleshoot; you might want to teach your internal employees how to do things. So there are 2 separate knowledge bases that kind of overlap sometimes. And ultimately, each of them might have different tags, where it might be about a certain program, or about a

193 00:24:29.850 00:24:55.590 Amber Lin: certain technician, or some sort of content. It's like a hashtag, right? And sometimes they overlap, as you see in this kind of knowledge graph layer. So, underneath all those articles is how they overlap with each other, how they relate to each other. And having that structure will help us

194 00:24:56.254 00:25:03.050 Amber Lin: help the bot suggest follow-up questions, because now it will understand: oh, you said this;

195 00:25:03.240 00:25:05.279 Amber Lin: there’s 3 things related to that.

196 00:25:05.640 00:25:06.560 Amber Lin: And yeah.

197 00:25:06.560 00:25:07.969 Scott Harmon: So, if

198 00:25:08.250 00:25:14.410 Scott Harmon: I could just add in a real quick comment. And Matt, this is for you; we talked about this a long time ago, when we got started.

199 00:25:14.980 00:25:17.259 Scott Harmon: What Amber’s showing in the upper right

200 00:25:17.450 00:25:20.040 Scott Harmon: is a thing called a knowledge graph.

201 00:25:20.430 00:25:26.359 Scott Harmon: And without boring you to death, it's kind of the state of the art of

202 00:25:26.580 00:25:32.340 Scott Harmon: how LLMs understand the world you're talking to them about.

203 00:25:32.810 00:25:34.740 Scott Harmon: And so

204 00:25:35.260 00:25:42.969 Scott Harmon: if you can build one. And it kind of is what it looks like: it's a visual representation of how concepts and terms are linked together

205 00:25:43.855 00:25:46.290 Scott Harmon: in a conceptual way.

206 00:25:47.160 00:25:51.830 Scott Harmon: Then, if you have that for the LLM, the LLM can now

207 00:25:52.380 00:25:58.849 Scott Harmon: speak highly intelligently about a field. So, for example, if you're going to build a medical diagnostic

208 00:25:59.160 00:26:03.090 Scott Harmon: LLM, it has to build a knowledge graph of all kinds of different

209 00:26:03.680 00:26:12.020 Scott Harmon: medical procedures and terminology. So this piece, this knowledge graph, is sort of the key

210 00:26:12.250 00:26:18.030 Scott Harmon: that allows the LLM to become very intelligent and intuitive

211 00:26:18.340 00:26:21.610 Scott Harmon: in how it's able to answer questions.

212 00:26:21.810 00:26:26.410 Scott Harmon: So we're moving from just that unstructured doc to this.

213 00:26:27.000 00:26:33.090 Scott Harmon: And I think, if I'm not mistaken, is this a knowledge graph that we've actually

214 00:26:33.700 00:26:35.710 Scott Harmon: built from their doc, Amber?

215 00:26:36.181 00:26:42.778 Amber Lin: No, this is a simulated, generated image. It is not accurate.

216 00:26:43.250 00:26:46.809 Scott Harmon: But we'll build one for your stuff, and then the LLM will

217 00:26:47.000 00:26:54.620 Scott Harmon: truly be smart at sort of understanding how things work, and how the whole terminology landscape works at ABC. So,

218 00:26:54.740 00:26:55.260 Scott Harmon: anyway.

219 00:26:55.260 00:26:56.390 MattBurns: And I imagine that.

220 00:26:56.390 00:26:57.199 Scott Harmon: Piece of the puzzle.

221 00:26:57.380 00:27:01.389 MattBurns: I would imagine that would certainly speed up response time. Obviously.

222 00:27:02.500 00:27:03.579 Scott Harmon: So I think so. I think that’s.

223 00:27:03.870 00:27:05.060 MattBurns: From the bot. Yeah.

224 00:27:05.300 00:27:11.920 Uttam Kumaran: Yeah, it just basically helps as a guide, like, where exactly to go. Yeah.
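[Editor's sketch] The tag-overlap idea behind the knowledge graph discussed above can be pictured like this. The article names and tags are invented for illustration, and a real knowledge graph would encode richer relationships than shared tags:

```python
# Each article carries tags (program, billing, audience, etc.). Two articles
# are "related" when their tag sets overlap; a bot can use this to suggest
# follow-up material. Article names and tags here are invented examples.

ARTICLES = {
    "billing-refunds": {"billing", "customer-facing"},
    "technician-dispatch": {"technicians", "internal"},
    "program-renewal": {"billing", "programs", "customer-facing"},
}

def related_articles(article: str) -> list:
    """Articles sharing at least one tag with `article`, most overlap first."""
    tags = ARTICLES[article]
    scored = [
        (len(tags & other), name)        # overlap size drives the ranking
        for name, other in ARTICLES.items()
        if name != article and tags & other
    ]
    return [name for _, name in sorted(scored, reverse=True)]
```

So when a CSR asks about refunds, the bot can surface the renewal article too, because the two share billing and customer-facing tags.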

225 00:27:12.900 00:27:19.700 MattBurns: Well, and Janiece, again, going back to Amber's original

226 00:27:20.570 00:27:24.209 MattBurns: 1st page there, I don't know how much of that was really vetted

227 00:27:24.760 00:27:29.639 MattBurns: fully, or if it was just well, here’s a document. Here’s a document. Here’s another one. Here’s another one.

228 00:27:29.850 00:27:36.310 MattBurns: and I think maybe just some continual internal vetting may may help with that, because again.

229 00:27:37.290 00:27:41.920 MattBurns: that’s just a lot of stuff. I think everybody said, well, here’s all our stuff. But.

230 00:27:42.470 00:27:45.740 JanieceGarcia: No, you’re absolutely right. Yes, it was so.

231 00:27:47.690 00:27:50.259 MattBurns: Yeah, I think we could probably help by.

232 00:27:50.260 00:27:52.350 JanieceGarcia: Shortening and.

233 00:27:52.350 00:27:58.660 MattBurns: And vetting, and kind of going through and saying, Well, we could certainly improve on some of these. So yeah.

234 00:27:58.660 00:28:00.320 JanieceGarcia: Absolutely. Yes, sir.

235 00:28:03.340 00:28:04.110 Amber Lin: Awesome.

236 00:28:04.270 00:28:21.430 Amber Lin: very exciting. And all of this will sort of consolidate into a user-facing interface, right? How do we search? How do we query? That will just be a search plus the trainer bot, or Andy, that we're creating. So that will be the most outward-facing layer there.

237 00:28:23.680 00:28:24.360 Amber Lin: Okay.

238 00:28:24.590 00:28:54.410 Amber Lin: so ultimately, I just want to wrap up; we're 1 minute away. So ultimately, how this helps is: it reduces time and the amount of escalations and firefighting. It helps the AI scale, because now it has a better understanding of knowledge and a better foundation. It gives the CSRs one single source of truth that's very organized, and it helps

239 00:28:54.630 00:28:57.619 Amber Lin: make this knowledge base future proof.

240 00:28:57.770 00:29:07.029 Amber Lin: because we know where everything is, and we know what specifically needs to change if we ever need to change something. So this is sort of a

241 00:29:08.540 00:29:11.959 Amber Lin: long term strategy. If I want to put it that way.

242 00:29:14.340 00:29:21.230 Amber Lin: So, next steps: I will follow up with the engineers on this. I'll just name it here.

243 00:29:22.980 00:29:27.180 Amber Lin: Okay, that's all. Perfectly 30 minutes. Any questions?

244 00:29:29.410 00:29:31.570 MattBurns: No, I’m good.

245 00:29:34.090 00:29:59.310 JanieceGarcia: I'm good. I know this week, because of me being with Ashley, our new agent, who will be using Andy here in the next several weeks. I mean, she just started, so she does not have access to Andy just yet. But with that, and then with really being able to get back to setting that time aside, it's really gonna be good,

246 00:29:59.450 00:30:04.850 JanieceGarcia: because I know the vetting and adding that new information, it's gotta happen. So, yeah.

247 00:30:04.850 00:30:19.260 Uttam Kumaran: I'm glad. I think, you know, just the last point is: a lot of what I gathered from being on site was, hey, it is hard to add things, and when we add things, maybe it's not full. And so, really enabling,

248 00:30:19.420 00:30:25.640 Uttam Kumaran: now that we've basically enabled getting the data out, we want to make sure that what's in there is accurate. And so I think,

249 00:30:25.640 00:30:25.970 JanieceGarcia: Yes.

250 00:30:25.970 00:30:29.800 Uttam Kumaran: Hopefully, you guys see that, like, okay, we took some action on some of those things.

251 00:30:29.800 00:30:30.220 MattBurns: For sure.

252 00:30:30.220 00:30:41.220 Uttam Kumaran: So I think you'll have that. And then, as we basically, hopefully, move this to more divisions, they'll have both of those available, given all the pain we sort of crawled through to make it happen. So,

253 00:30:41.460 00:30:47.669 Uttam Kumaran: we’ll start with those teams like off the bat. Having this, you know, and then getting access the same way, so.

254 00:30:48.970 00:30:49.990 MattBurns: Good stuff.

255 00:30:50.480 00:30:51.820 JanieceGarcia: Very good. Yes.

256 00:30:52.850 00:30:55.390 JanieceGarcia: Well, thank you. Guys appreciate it. Thank you.

257 00:30:55.390 00:30:56.560 Amber Lin: So much.

258 00:30:56.910 00:30:57.320 JanieceGarcia: Bye.

259 00:30:57.750 00:30:58.300 Amber Lin: Bye.