Meeting Title: Brainforge x ABC Home and Commercial: Weekly Project Check Date: 2025-03-21 Meeting participants: Uttam Kumaran, Amber Lin, Steven, Janiece Garcia, Yvette Ruiz, Matt Burns, Scott Harmon
WEBVTT
1 00:00:19.770 ⇒ 00:00:20.880 Uttam Kumaran: Hey! Scott!
2 00:00:21.450 ⇒ 00:00:22.340 Scott_Harmon: Here we come!
3 00:00:23.860 ⇒ 00:00:24.810 JanieceGarcia: Good morning!
4 00:00:24.810 ⇒ 00:00:25.539 Scott_Harmon: How are you?
5 00:00:26.080 ⇒ 00:00:28.190 JanieceGarcia: I’m good. How are y’all? Happy Friday?
6 00:00:28.190 ⇒ 00:00:29.220 Scott_Harmon: Happy Friday! You, too.
7 00:00:29.220 ⇒ 00:00:29.870 Uttam Kumaran: Friday.
8 00:00:30.130 ⇒ 00:00:31.660 Scott_Harmon: You have good weekend plans teed up?
9 00:00:32.439 ⇒ 00:00:35.599 JanieceGarcia: I do not. My middle son
10 00:00:36.499 ⇒ 00:00:42.289 JanieceGarcia: had his stuff done yesterday, so we are going to be relaxing and resting this weekend, for sure
11 00:00:42.590 ⇒ 00:00:43.539 Scott_Harmon: Oh, good!
12 00:00:44.190 ⇒ 00:00:45.200 Scott_Harmon: Hi, Matt!
13 00:00:47.720 ⇒ 00:00:48.895 JanieceGarcia: How are you guys
14 00:00:49.190 ⇒ 00:00:49.660 Uttam Kumaran: Good.
15 00:00:50.520 ⇒ 00:00:51.210 YvetteRuiz: Good.
16 00:00:54.490 ⇒ 00:00:56.059 Scott_Harmon: Hey, Amber, are you on? So you’re
17 00:00:56.060 ⇒ 00:01:01.329 Amber Lin: Hello! I am on. I am very hesitant to show myself
18 00:01:01.330 ⇒ 00:01:02.519 Scott_Harmon: No problem, no problem.
19 00:01:02.520 ⇒ 00:01:10.069 Amber Lin: Because I got my wisdom teeth, 4 of them, extracted on Wednesday, and I look like a chipmunk
20 00:01:10.330 ⇒ 00:01:12.360 JanieceGarcia: Oh no!
21 00:01:13.085 ⇒ 00:01:22.149 Amber Lin: Okay, here my left side is significantly bulged. And this is like, Okay, but I am so swollen right now, I was like, do I?
22 00:01:24.265 ⇒ 00:01:28.954 Amber Lin: Here, this is. This is me right now I am. I look so funny
23 00:01:31.070 ⇒ 00:01:32.850 YvetteRuiz: Oh!
24 00:01:32.850 ⇒ 00:01:41.279 Amber Lin: I’m mad. This is your 1st impression of me, and I look like a chipmunk because I got my tooth extracted.
25 00:01:41.280 ⇒ 00:01:42.115 Amber Lin: Aw.
26 00:01:44.530 ⇒ 00:01:46.940 Scott_Harmon: Well, hey, we got Mr. Burns on
27 00:01:46.940 ⇒ 00:01:47.829 MattBurns: Good morning!
28 00:01:48.030 ⇒ 00:01:49.219 Uttam Kumaran: Hey! Good morning!
29 00:01:49.220 ⇒ 00:01:50.360 JanieceGarcia: Morning.
30 00:01:50.530 ⇒ 00:01:54.889 Scott_Harmon: People never use the Mr. Burns from the Simpsons on you, do they?
31 00:01:54.890 ⇒ 00:01:56.720 MattBurns: No, actually, not so, I guess
32 00:01:57.070 ⇒ 00:02:01.300 YvetteRuiz: Good thing. How are you guys today?
33 00:02:01.300 ⇒ 00:02:02.350 Scott_Harmon: Pretty good, pretty good.
34 00:02:03.160 ⇒ 00:02:03.890 MattBurns: Good.
35 00:02:05.710 ⇒ 00:02:11.989 Uttam Kumaran: Cool. Amber, maybe I’ll let you kick it off, and then I know we have a couple of topics to get to, so it should be a good hour.
36 00:02:12.150 ⇒ 00:02:13.080 Steven: Yeah, real quick.
37 00:02:13.080 ⇒ 00:02:21.910 Steven: I don’t know if you had another meeting to jump on to or not, Matt. I know one reason I wanted to jump on was to talk about some of the pricing.
38 00:02:21.910 ⇒ 00:02:22.370 Uttam Kumaran: Yes.
39 00:02:22.370 ⇒ 00:02:27.269 Steven: I talked a little yesterday on that. So, Matt, I don’t know if you’re going to stay on the whole thing, or do you want to start off with that?
40 00:02:27.519 ⇒ 00:02:37.229 MattBurns: It’s up to you guys, I’m comfortable. I don’t have anything pressing, so I can stay on the whole thing. So whether we wanna handle that at the end or now is up to you guys.
41 00:02:37.230 ⇒ 00:02:38.569 Uttam Kumaran: Yeah, I’ll let Amber.
42 00:02:38.570 ⇒ 00:02:42.009 Uttam Kumaran: I just didn’t know if Matt needed to jump off or not. Cool.
43 00:02:42.010 ⇒ 00:02:50.609 Uttam Kumaran: I let Amber know that we probably wanna carve out a bulk of the meeting to talk about that. So we’ll just run through updates for the week, probably for the next 10 min or so, and then we can jump right to that
44 00:02:50.610 ⇒ 00:02:51.150 Steven: Perfect.
45 00:02:51.150 ⇒ 00:02:52.319 Amber Lin: Cool, sounds good
46 00:02:52.320 ⇒ 00:02:52.850 MattBurns: Okay.
47 00:02:53.020 ⇒ 00:02:57.310 Amber Lin: So I’m gonna share my screen and let me present first.
48 00:03:01.980 ⇒ 00:03:05.169 Amber Lin: Alright, can everyone see my screen?
49 00:03:06.270 ⇒ 00:03:12.119 Amber Lin: Great. So we’re currently at Friday, March 21st, almost at the end of March.
50 00:03:12.350 ⇒ 00:03:17.219 Amber Lin: And today we’re just going to quickly go over the progress. And then we’re going to talk about the phase 2.
51 00:03:17.650 ⇒ 00:03:27.230 Amber Lin: So this week we are integrating the AI assistant into Google Chat. And we estimate it’s gonna take around
52 00:03:27.380 ⇒ 00:03:44.909 Amber Lin: one or 2 more weeks to get all the features set up, and to make sure that it’s consistent with all the context for the CSR assistant. These are the few things that we made progress on, quite a few, so I’ll go through them one by one.
53 00:03:45.210 ⇒ 00:03:46.970 Amber Lin: So 1st of all
54 00:03:47.380 ⇒ 00:03:59.279 Amber Lin: is the dashboard. So this I sent you guys here in the email. But I just wanted to go over them in the meeting as well, just to go over each metric.
55 00:03:59.490 ⇒ 00:04:02.630 Amber Lin: So let’s look at this first one.
56 00:04:05.740 ⇒ 00:04:10.310 Amber Lin: So this is the dashboard for the conversation logs.
57 00:04:10.450 ⇒ 00:04:15.360 Amber Lin: So we have here the total conversations
58 00:04:15.540 ⇒ 00:04:22.690 Amber Lin: and total records. We removed the total sessions in the new update. And so this essentially tells you
59 00:04:22.950 ⇒ 00:04:24.560 Amber Lin: how many
60 00:04:25.040 ⇒ 00:04:44.759 Amber Lin: conversations you have. So about the same topic, they asked a question, and you have back and forth, right. So this is how many conversations, and the records are just how many back-and-forths you have. And this will have more once we have the data from the various testings. And lastly, here is the average execution time.
61 00:04:45.260 ⇒ 00:05:01.259 Amber Lin: So what’s really nice about this dashboard is that it’s really fast to navigate. So right here on the right, you can click on whatever you’re interested in. So let’s say, I want to look at the total average execution time.
62 00:05:01.570 ⇒ 00:05:06.120 Amber Lin: And here we can go sort by
63 00:05:06.660 ⇒ 00:05:17.270 Amber Lin: descending or ascending, and we can expand a table. And then we’ll see. Okay, here’s more of the data over here, let’s go back
64 00:05:17.480 ⇒ 00:05:29.160 Amber Lin: to the dimensions, and let’s see total conversations.
65 00:05:30.980 ⇒ 00:05:41.680 Amber Lin: Yeah, I wanted to see when the bot was not able to give a right answer.
66 00:05:44.990 ⇒ 00:05:57.400 Amber Lin: I had that in the video that I recorded for you guys. And here’s also a view that you guys can go in, and we can make it more granular, based on what you like.
67 00:05:58.640 ⇒ 00:06:01.629 Amber Lin: So that’s for that dashboard. And there’s also
68 00:06:01.880 ⇒ 00:06:11.900 Amber Lin: more views to show, depending on what you like, and the other dashboard is this one.
69 00:06:15.460 ⇒ 00:06:20.340 Amber Lin: and this is our evaluation of the qualities. So
70 00:06:20.910 ⇒ 00:06:37.210 Amber Lin: here are the total number of evaluations and the similarities. I know these names are a little bit confusing. So the embedding similarity, it means: are you talking about
71 00:06:37.470 ⇒ 00:06:56.299 Amber Lin: the right thing. So the example I gave is "the cat is sleeping" versus "the feline is sleeping": are they talking about the same thing, even though they’re using different wordings? And for this similarity, the higher the score, the better.
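The embedding-similarity idea Amber describes, matching meaning rather than exact wording, is typically computed as cosine similarity between sentence vectors. A minimal sketch follows; the vectors here are made-up stand-ins purely to show the arithmetic, since the real pipeline would get them from an embedding model:

```python
import math

# Cosine similarity: 1.0 means the vectors point the same way (same meaning),
# values near 0 mean the sentences are about unrelated things.
def cosine_similarity(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

cat = [0.9, 0.8, 0.1]        # pretend embedding of "the cat is sleeping"
feline = [0.85, 0.82, 0.12]  # pretend embedding of "the feline is sleeping"
invoice = [0.1, 0.05, 0.95]  # pretend embedding of an unrelated sentence

# The cat/feline pair scores higher than the cat/invoice pair,
# even though "cat" and "feline" share no letters.
print(cosine_similarity(cat, feline) > cosine_similarity(cat, invoice))  # → True
```

This is why a paraphrased but correct answer can still score well on this metric.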
72 00:06:56.530 ⇒ 00:07:01.819 Amber Lin: and for the second one, the Levenshtein score measures how
73 00:07:02.454 ⇒ 00:07:12.010 Amber Lin: similar the strings are. So let me just pull up an explanation.
74 00:07:16.690 ⇒ 00:07:17.630 Amber Lin: So
75 00:07:17.730 ⇒ 00:07:28.549 Amber Lin: that one essentially is how different. The wording is. So maybe there’s 1 letter that’s missing, but it’s still pretty similar. So this one
76 00:07:28.550 ⇒ 00:07:29.020 Scott_Harmon: Amber
77 00:07:29.020 ⇒ 00:07:29.890 Amber Lin: Get better
78 00:07:29.890 ⇒ 00:07:34.400 Scott_Harmon: Amber, when you say how similar, I guess I’m a little lost.
79 00:07:34.940 ⇒ 00:07:36.879 Scott_Harmon: Similar between what?
80 00:07:36.880 ⇒ 00:07:48.840 Amber Lin: So maybe it says Hello, world! but then they missed an O, and you can still kind of tell it’s saying Hello, world. Does that make sense? That’s for the Levenshtein scores
81 00:07:48.840 ⇒ 00:07:53.330 Scott_Harmon: So the Levenshtein score is saying whether, yeah, I’m being thick-headed.
82 00:07:53.700 ⇒ 00:07:57.279 Scott_Harmon: Whether it gave a grammatically correct sentence?
83 00:07:58.985 ⇒ 00:07:59.960 Amber Lin: Here
84 00:07:59.960 ⇒ 00:08:09.389 Uttam Kumaran: So a way to think about it is: it’s almost like how far away you are from the right thing. So if you were to say Hello, world! and you add an extra L,
85 00:08:10.130 ⇒ 00:08:13.119 Uttam Kumaran: you’re actually not that far from Hello world.
86 00:08:13.840 ⇒ 00:08:41.770 Uttam Kumaran: And so that’s just one way of us understanding whether the answer we gave back is close enough to the answer that’s expected. There is also another score here, which is: is it the right answer overall? So we’re trying to evaluate on a couple of different metrics. And yeah, to what Amber posted in the chat here: the Levenshtein score is more about the wording. So if you were to flip letters, how far are you from the right expected answer?
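The "how far away you are from the right thing" idea Uttam describes is exactly the Levenshtein (edit) distance: the minimum number of single-character insertions, deletions, and substitutions needed to turn one string into the other. A standard dynamic-programming sketch, for illustration only, not the team's actual code:

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum number of single-character edits turning a into b."""
    # prev[j] holds the distance between a[:i-1] and b[:j]; we build row by row.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # delete a character from a
                            curr[j - 1] + 1,      # insert a character into a
                            prev[j - 1] + cost))  # substitute (free if equal)
        prev = curr
    return prev[-1]

# "Helllo, world!" (an extra L) is only one edit away from "Hello, world!"
print(levenshtein("Helllo, world!", "Hello, world!"))  # → 1
```

A distance of 1 against a 14-character target is a very close match, which is the intuition behind using it as one of several accuracy signals.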
87 00:08:42.230 ⇒ 00:08:50.039 Steven: That’s the question: who are you talking about flipping letters? Like, if a CSR typed it wrong, and then the AI assistant types it right? Who, what are you
88 00:08:50.040 ⇒ 00:09:01.430 Uttam Kumaran: So, right, what you’re looking at now is how we’re evaluating. Ultimately, we want to answer the question: is our bot accurate, and on average, how accurate is it?
89 00:09:01.733 ⇒ 00:09:30.539 Uttam Kumaran: And so we have several different scores we do here, and these are all based on sort of benchmarks from the industry on how to measure the accuracy of these AI agents. So this is purely on the AI agent response side. And the way we actually do this is we created, again with Janiece and Yvette and the team, an evaluation data set: when a question is asked, what is the expected answer? And so, Amber, if you could just show
90 00:09:30.540 ⇒ 00:09:42.059 Uttam Kumaran: what we have. And if you could just zoom in just a little bit, maybe I’ll just highlight one piece here. So what you’ll see here is, this is the question that’s answered.
91 00:09:42.170 ⇒ 00:09:50.000 Uttam Kumaran: and, Amber, if you could just go to one specific one where we can see the pairs: the AI answer, and
92 00:09:50.000 ⇒ 00:09:50.849 Amber Lin: Let me.
93 00:09:51.350 ⇒ 00:09:52.900 Amber Lin: Yeah, let me.
94 00:09:53.780 ⇒ 00:09:59.110 Uttam Kumaran: I think you can just expand this column, and we could just take a look at one of them where there is an accurate pair
95 00:10:00.040 ⇒ 00:10:08.620 MattBurns: And they’re doing this verbally, Uttam, not just typing? It’s the verbal response from the CSR compared with what the verbal response would be from the AI bot?
96 00:10:08.960 ⇒ 00:10:29.949 Uttam Kumaran: Yeah. So let me just take one step back and explain how we’re doing what we’re calling evaluation, right. Again, to answer the question for y’all: not only can we play with it and know, hey, it’s roughly working; on every incremental change we want a data set that we can run the agent through to make sure that it is accurate across many fronts.
97 00:10:29.970 ⇒ 00:10:45.069 Uttam Kumaran: Different types of questions. Amber, if you just want to pause here, we’ll just take a look at one question. Different types of questions, different difficulty of questions. And so one thing that you’re seeing here is: we asked the question, can you give the maintenance techs in this zip code?
98 00:10:45.426 ⇒ 00:11:04.709 Uttam Kumaran: Right now our answer is, I don’t know. And so what we’ve done is we’ve worked with Janiece and Yvette to get what the accurate answer is. What that provides us is, one, an understanding of, okay, what should we expect? And for our engineering team, they need to make sure that the answer the AI agent gives
99 00:11:04.750 ⇒ 00:11:21.179 Uttam Kumaran: is accurate relative to this answer. And so there’s another question here: can you give me the information on the pest bundling service? This is where we do have it. It seems like we are basically matched to what Yvette and Janiece would expect
100 00:11:21.180 ⇒ 00:11:45.400 Uttam Kumaran: this answer should be. So we’re actually running these evaluations on the agent every day and collecting data on how accurate we are. This is what we call a golden data set, which is basically, from your team, what are the actual answers to these questions? And so when a CSR asks the bot, we can evaluate whether
101 00:11:46.070 ⇒ 00:11:47.390 Uttam Kumaran: our answer was right or not.
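The golden-data-set evaluation being described, pair each question with the answer the team expects, score the agent's actual answer against it, and flag the misses, can be sketched as below. The questions, answers, scoring function, and threshold here are all illustrative assumptions, not the real data or the real similarity metrics:

```python
# Golden data set: question -> answer the team says is correct (invented examples).
golden = {
    "Can you give me info on the pest bundling service?":
        "Bundling combines pest and mosquito plans at a discount.",
    "Which maintenance techs cover this zip code?":
        "Techs are assigned by route; check the routing sheet.",
}

# What the agent actually answered on the latest run (also invented).
agent_answers = {
    "Can you give me info on the pest bundling service?":
        "Bundling combines pest and mosquito plans at a discount.",
    "Which maintenance techs cover this zip code?":
        "I don't know.",
}

def score(expected: str, actual: str) -> float:
    # Crude word-overlap (Jaccard) score, standing in for the real
    # embedding-similarity and Levenshtein metrics.
    e, a = set(expected.lower().split()), set(actual.lower().split())
    return len(e & a) / len(e | a)

# Anything below threshold gets surfaced for the team to triage.
flagged = [q for q in golden if score(golden[q], agent_answers[q]) < 0.5]
print(flagged)  # only the zip-code question is flagged; the bundling one matches
```

Running a loop like this on every incremental change is what turns "it roughly works when we play with it" into a repeatable accuracy number.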
102 00:11:47.390 ⇒ 00:11:50.650 Scott_Harmon: So that’s super helpful for me. So
103 00:11:51.740 ⇒ 00:12:02.319 Scott_Harmon: I know you’ve broken it on the dashboard into a bunch of different scores, but it seems to me, and maybe I’m oversimplifying it, it would be great to see a number
104 00:12:03.200 ⇒ 00:12:03.710 Uttam Kumaran: Yes.
105 00:12:03.710 ⇒ 00:12:07.770 Scott_Harmon: Of accuracy that just summarizes what you just said, like
106 00:12:07.770 ⇒ 00:12:08.540 Uttam Kumaran: Correct.
107 00:12:08.540 ⇒ 00:12:10.840 Scott_Harmon: Like the 1st one would be really bad.
108 00:12:11.020 ⇒ 00:12:13.389 Scott_Harmon: in other words, saying, I don’t know is.
109 00:12:13.560 ⇒ 00:12:15.339 Scott_Harmon: I would call that a 0,
110 00:12:15.440 ⇒ 00:12:26.700 Scott_Harmon: and the second one is probably really high. Just the 2 examples you gave the 2 rows you picked. Is there a number on the dashboard? That just is that kind of simple, boil it all down to
111 00:12:27.430 ⇒ 00:12:28.960 Scott_Harmon: how accurate it is.
112 00:12:29.140 ⇒ 00:12:33.559 Uttam Kumaran: Amber, can you share what our plan is to create a basic summary score?
113 00:12:33.560 ⇒ 00:12:38.340 Scott_Harmon: There’s the Levenshtein score, and I get that that’s AI industry stuff. But honestly,
114 00:12:40.160 ⇒ 00:12:40.590 Uttam Kumaran: Yeah.
115 00:12:40.590 ⇒ 00:12:45.069 Scott_Harmon: Probably more interesting to us internally than it is to anybody at ABC, would be my take
116 00:12:45.070 ⇒ 00:12:48.880 Uttam Kumaran: I hear you. So we do have a path towards that amber. If you want to just talk through that
117 00:12:49.170 ⇒ 00:13:14.129 Amber Lin: Totally. So what we’re working on right now is combining these 2 dashboards, and ultimately this will be the summary score of what you need to look at. This is the breakdown for us as engineers to know, okay, what exactly is going wrong, and what do we need to tweak to make the overall score better. But for people who are really busy, at
118 00:13:14.130 ⇒ 00:13:26.609 Amber Lin: the exec level, you just need to look at this to know, okay, things are improving. So we understand what your need is, and this is what we arrived at. So once we combine these 2 dashboards, we’ll have a quality score
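The single summary score Amber and Scott are converging on could be as simple as a weighted average of the per-metric dashboard numbers. A minimal sketch; the metric names, values, and weights are made up for illustration and are not the team's actual formula:

```python
def summary_score(metrics: dict[str, float], weights: dict[str, float]) -> float:
    """Roll several 0..1 quality metrics into one weighted average."""
    total_weight = sum(weights.values())
    return sum(metrics[name] * w for name, w in weights.items()) / total_weight

# Illustrative numbers only: what a week's dashboard might show.
metrics = {"embedding_similarity": 0.92, "levenshtein": 0.88, "answered_rate": 0.85}
weights = {"embedding_similarity": 0.5, "levenshtein": 0.2, "answered_rate": 0.3}

print(round(summary_score(metrics, weights), 3))  # → 0.891
```

The engineers keep the per-metric breakdown to know what to tweak; execs watch the one rolled-up number trend upward.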
119 00:13:26.610 ⇒ 00:13:32.039 Scott_Harmon: Super helpful. To the extent it’s not a big deal, I’d move that to the top and highlight it. Again, I
120 00:13:32.260 ⇒ 00:13:34.780 Scott_Harmon: I don’t want to speak for Janiece and Yvette, but I
121 00:13:35.430 ⇒ 00:13:43.400 Scott_Harmon: or Steven, but I would think that would be the number I’d look at if I were them. Some of these other things, like the Levenshtein score, I think we’re probably just gonna
122 00:13:44.010 ⇒ 00:13:49.779 YvetteRuiz: Yeah, no, you’re exactly right. Scott and I had some conversation, and so did Amber and I, regarding that. So I was.
123 00:13:49.930 ⇒ 00:13:53.374 YvetteRuiz: That was one of the key metrics that we were gonna be looking for. But yeah.
124 00:13:55.080 ⇒ 00:13:59.210 Uttam Kumaran: So one of our goals with this is, every meeting we have,
125 00:13:59.230 ⇒ 00:14:10.680 Uttam Kumaran: I’ve instructed the team to lead with this. So we want to show that the bot is getting used, that the responses are accurate and that they’re coming within a timely manner.
126 00:14:10.690 ⇒ 00:14:32.489 Uttam Kumaran: Right? So when I wake up and I think about this project, those are the things that I look at as the most important. We will have several ways of calculating that, of course. But to Scott’s point, we’re gonna create just one area where you can see how much is this getting used, and who’s using it, right? And so we now have the list of the CSRs, so we’ll be able to see who is using it more
127 00:14:32.570 ⇒ 00:14:39.809 Uttam Kumaran: I want to look at: are the answers accurate, like, are they the right answer? And are they coming in a timely manner?
128 00:14:39.810 ⇒ 00:14:40.200 Uttam Kumaran: The fi-
129 00:14:40.200 ⇒ 00:14:44.929 Uttam Kumaran: The final thing, of course, we want to look at is whether the calls are getting resolved.
130 00:14:46.050 ⇒ 00:14:48.190 Uttam Kumaran: You know, is there a first-call resolution?
131 00:14:48.580 ⇒ 00:14:55.750 Scott_Harmon: So I guess it sounds like this average quality score isn’t done yet, because it shows 0.
132 00:14:56.300 ⇒ 00:15:02.560 Scott_Harmon: Uttam or Amber, do you have a sense of where we’re at right now with the golden data set? Like, how good are we now?
133 00:15:02.990 ⇒ 00:15:25.080 Amber Lin: For the golden data set, actually, Janiece did a fantastic job. For almost all the questions on the dashboard, we have an ideal answer. So right now, for our next phase, we do want to test this with maybe more answers, or test this with the CSRs, and see how things go. So
134 00:15:25.080 ⇒ 00:15:28.850 Scott_Harmon: How is our bot doing against those answers, is what I’m asking
135 00:15:30.080 ⇒ 00:15:30.610 Amber Lin: Let me!
136 00:15:30.610 ⇒ 00:15:33.460 Scott_Harmon: I don’t know. I can’t tell from this dashboard
137 00:15:33.630 ⇒ 00:16:00.289 Uttam Kumaran: Yeah. So I think, Amber, we should just take this back, because, as I noticed as well, some of these are not up to date. We are matching on most of these questions right now. I think you should go back to the team and ask them to update this answer column, column E. And basically, if we know what the accurate answer is, I want to give an update to this team on how far we are from matching on everything, at least in the golden data set
138 00:16:00.760 ⇒ 00:16:20.000 JanieceGarcia: And can I ask, too, going back to what you guys were talking about with the similarities and the answer the bot is giving: one of the questions that I asked when I was playing with it, trying to break the bot like you guys said, was about the mosquito suppression, but it put it in as mosquito supervision.
139 00:16:22.000 ⇒ 00:16:25.399 JanieceGarcia: Did you guys see that response?
140 00:16:25.900 ⇒ 00:16:28.530 JanieceGarcia: I don’t think it was one of the ones I emailed. Y’all. But
141 00:16:29.890 ⇒ 00:16:34.090 JanieceGarcia: is that kind of what you guys are talking about when y’all are talking about the Levenshtein?
142 00:16:35.090 ⇒ 00:16:46.979 Uttam Kumaran: Yeah, I think we’ll probably have to take it back and look at that specific question. But any sort of feedback like that would be great, Amber. We don’t have to go through it on this call; we’ll just take it back.
143 00:16:47.670 ⇒ 00:16:49.180 Uttam Kumaran: And then we can basically come.
144 00:16:49.620 ⇒ 00:16:50.180 Uttam Kumaran: Okay.
145 00:16:50.340 ⇒ 00:16:56.419 YvetteRuiz: Yeah. It’s also like the one we talked about yesterday, the oh-by-the-ways, how it kind of generated. I don’t know
146 00:16:56.800 ⇒ 00:16:59.608 YvetteRuiz: where it got the information
147 00:17:00.170 ⇒ 00:17:13.480 Uttam Kumaran: So we have a few things. I think, as we get the feedback loop going with the usage, one of the pieces that we’re gonna talk about today is how we’re actually going to flag to this crew, including Janiece,
148 00:17:13.740 ⇒ 00:17:31.945 Uttam Kumaran: what questions are not able to be answered, and identifying: is that a problem with the bot, or is that a problem with the knowledge? And what we found for a lot of things is that it’s just not in there anywhere, and so the team is actually working with us to go back and make sure it’s updated.
149 00:17:32.230 ⇒ 00:17:32.830 JanieceGarcia: Okay.
150 00:17:32.830 ⇒ 00:17:33.350 Uttam Kumaran: Yeah.
151 00:17:34.030 ⇒ 00:17:37.640 Uttam Kumaran: So if you could just flag any of those, you can send it directly to me and Amber
152 00:17:38.160 ⇒ 00:17:46.900 JanieceGarcia: I’ll do that. I actually have a sheet that I started when I met with Amber, and I shared that with her. But I’ll share it with you as well, so that way you have it, and you can see
153 00:17:46.900 ⇒ 00:17:47.469 Scott_Harmon: Is there?
154 00:17:47.930 ⇒ 00:17:56.480 Scott_Harmon: Is there a feature, Uttam, on the back end somewhere where the bot, if it gives
155 00:17:56.810 ⇒ 00:18:01.800 Scott_Harmon: an answer that’s not right or doesn’t match very well with the golden data set,
156 00:18:02.650 ⇒ 00:18:07.319 Scott_Harmon: does it flag that immediately so that we can fix it
157 00:18:07.320 ⇒ 00:18:36.569 Amber Lin: Actually, let me talk about that here. So we’ve been working on the feedback loop, and last time we showed that you can give a thumbs up and thumbs down. But we didn’t stop there. We said, okay, we need people to give specific feedback so we know how to improve it. So when you see an answer that’s not correct, you give feedback, and you can type in specifically what you think is wrong about it, and then it’s logged in our database
158 00:18:36.630 ⇒ 00:18:38.430 Amber Lin: so we can either
159 00:18:38.780 ⇒ 00:18:53.750 Amber Lin: flag it, send it to someone, or pull this up at the weekly reviews and say, okay, these are the things that we need to work on. So we already have a feedback loop in place, which I think will be really helpful in identifying what needs to be improved
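The thumbs-up/down plus free-text feedback loop Amber describes amounts to appending a structured record per rating and filtering the negatives for the weekly review. A minimal sketch; the field names and example content are illustrative assumptions, not the team's actual schema:

```python
import datetime

# In-memory stand-in for the database table the feedback is logged to.
feedback_log: list[dict] = []

def record_feedback(question: str, answer: str, thumbs_up: bool, comment: str = "") -> None:
    """Append one CSR rating, with the specific what-is-wrong detail, to the log."""
    feedback_log.append({
        "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "question": question,
        "answer": answer,
        "thumbs_up": thumbs_up,
        "comment": comment,
    })

# Invented example echoing the suppression/supervision mix-up from the call.
record_feedback(
    "What does mosquito suppression include?",
    "Mosquito supervision includes...",
    thumbs_up=False,
    comment="Bot said 'supervision' instead of 'suppression'.",
)

# Weekly review pulls everything rated thumbs-down.
needs_review = [f for f in feedback_log if not f["thumbs_up"]]
print(len(needs_review))  # → 1
```

Because each record carries the free-text comment, triage can distinguish bot problems from knowledge-base gaps without guessing.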
160 00:18:54.430 ⇒ 00:18:57.789 Scott_Harmon: So you’re just gonna have to agree on how to operationalize that, I guess
161 00:18:57.970 ⇒ 00:19:15.580 Uttam Kumaran: Yeah. And so, Scott, to your point, one of the things I talked to Yvette about, and I think this is again something we’ll cover in just a sec, is: what is the feedback loop between identifying that an answer was wrong and basically triaging it and understanding whose problem it is? One piece is, of course,
162 00:19:15.660 ⇒ 00:19:30.539 Uttam Kumaran: isolating what is wrong, and part of that is getting it from this. We will also start to flag, through our evaluation process that’s running on every single question, which ones we’re not responding well to, or which ones we just don’t have any response to at all.
163 00:19:31.010 ⇒ 00:19:37.850 Uttam Kumaran: And so you’re totally right. We’re gonna begin to actually alert on those in our own Slack channels, to begin to triage those
164 00:19:39.170 ⇒ 00:19:41.739 Uttam Kumaran: Of course, for the CSR on the call,
165 00:19:41.930 ⇒ 00:19:57.469 Uttam Kumaran: we want to be accurate on everything. There may be things where, to Yvette’s point, we don’t want the bot to just make up an answer, right? So, in the short term, I expect us to still have some things where it says, I’m not sure,
166 00:19:57.864 ⇒ 00:20:03.950 Uttam Kumaran: but we will quickly identify those, and ideally, on like a weekly cadence, start to scrub those as we find them.
167 00:20:05.860 ⇒ 00:20:06.550 Uttam Kumaran: Okay, thank you.
168 00:20:06.550 ⇒ 00:20:15.340 Uttam Kumaran: Some of this will be growing pains as we scale it, because we’ve only been testing between, you know, Yvette, Janiece, Shannon, and Grace.
169 00:20:15.520 ⇒ 00:20:20.219 Uttam Kumaran: I totally expect there to be other questions that aren’t covered
170 00:20:20.220 ⇒ 00:20:25.409 Scott_Harmon: Safe to state the obvious, or maybe just repeat what you just said,
171 00:20:25.690 ⇒ 00:20:29.369 Scott_Harmon: you know, to make sure I understand it. Every week.
172 00:20:29.850 ⇒ 00:20:33.980 Scott_Harmon: Janiece, Yvette, or someone from your team will look at a list
173 00:20:34.150 ⇒ 00:20:39.260 Scott_Harmon: of the cases where the bot couldn’t answer the questions.
174 00:20:39.470 ⇒ 00:20:46.190 Scott_Harmon: And either there was a problem with the bot, or there was a gap in the knowledge base.
175 00:20:46.710 ⇒ 00:20:52.199 Scott_Harmon: Janiece or somebody will fix that, so that the bot gets smarter every week.
176 00:20:53.700 ⇒ 00:20:54.310 Uttam Kumaran: Correct.
177 00:20:54.310 ⇒ 00:20:58.939 Scott_Harmon: Because I think it’s fine. Again, I can’t speak for you guys, but
178 00:20:59.560 ⇒ 00:21:04.330 Scott_Harmon: you know, if it’s not a hundred percent, if you can stump it on some questions, okay, fine.
179 00:21:04.870 ⇒ 00:21:10.989 Scott_Harmon: people can always just use the old process and ask the, you know. Ask the Guru.
180 00:21:11.120 ⇒ 00:21:17.339 Scott_Harmon: but if it gets incrementally better every week, I think we’ll all be happy
181 00:21:17.340 ⇒ 00:21:22.309 Uttam Kumaran: 100%. And we actually want to report on that in the dashboard.
182 00:21:22.690 ⇒ 00:21:29.379 Uttam Kumaran: We want to report on the questions that it’s not getting right, and we want to see that number going down over time
183 00:21:29.380 ⇒ 00:21:32.389 Scott_Harmon: Right? Because obviously, what you want is a self-learning
184 00:21:32.680 ⇒ 00:21:36.619 Scott_Harmon: system that sort of, just by definition, makes itself smarter
185 00:21:36.620 ⇒ 00:21:37.040 Uttam Kumaran: Yes.
186 00:21:37.040 ⇒ 00:21:45.869 Scott_Harmon: Every week, right? So the execs can go, oh my gosh, it’s so much smarter than it was 3 weeks ago, because there’s these, you know, 15 questions that
187 00:21:46.300 ⇒ 00:21:49.080 Scott_Harmon: stumped it 3 weeks ago, and now it can answer
188 00:21:50.490 ⇒ 00:21:50.970 Uttam Kumaran: Correct.
189 00:21:50.970 ⇒ 00:21:51.850 YvetteRuiz: Got you
190 00:21:54.820 ⇒ 00:22:13.029 Uttam Kumaran: So I think we have some takeaways on that. We definitely have several updates to the dashboard that we need to make. Amber, I also want to have a metric there that shows very clearly what things we’re getting wrong and why. And then on this call weekly, we should see that number going down
191 00:22:13.610 ⇒ 00:22:16.440 Uttam Kumaran: as a percentage of total messages
192 00:22:16.440 ⇒ 00:22:17.360 Scott_Harmon: Right? That’s good.
193 00:22:17.360 ⇒ 00:22:18.010 Uttam Kumaran: To be that
194 00:22:18.690 ⇒ 00:22:21.849 Scott_Harmon: Right, you took the words out of my mouth. And that kind of goes back to that metric
195 00:22:22.460 ⇒ 00:22:25.749 Scott_Harmon: from the 1st conversation, which is, okay, let’s say we got,
196 00:22:26.630 ⇒ 00:22:29.839 Scott_Harmon: you know, 85% we answered right,
197 00:22:29.960 ⇒ 00:22:35.520 Scott_Harmon: and 15% we didn’t have an answer to. You want to see that number
198 00:22:36.310 ⇒ 00:22:37.500 Scott_Harmon: you know, get better
199 00:22:38.390 ⇒ 00:22:39.010 Uttam Kumaran: Correct.
200 00:22:39.280 ⇒ 00:22:39.860 Scott_Harmon: Okay.
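The trend Scott wants, the share of questions the bot couldn't answer shrinking week over week, is a one-liner once the weekly counts exist. The numbers below are invented for illustration:

```python
# Illustrative weekly tallies: total questions asked vs. ones the bot
# couldn't answer (the "I don't know" bucket).
weekly = [
    {"week": "2025-03-07", "total": 200, "unanswered": 40},
    {"week": "2025-03-14", "total": 240, "unanswered": 36},
    {"week": "2025-03-21", "total": 260, "unanswered": 26},
]

# Unanswered rate per week, as a fraction of total messages.
rates = [w["unanswered"] / w["total"] for w in weekly]

# The dashboard check: is each week strictly better than the last?
improving = all(a > b for a, b in zip(rates, rates[1:]))
print([round(r, 2) for r in rates], improving)  # → [0.2, 0.15, 0.1] True
```

Expressing it as a percentage of total messages, rather than a raw count, keeps the metric honest as usage grows.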
201 00:22:41.860 ⇒ 00:23:02.169 Amber Lin: Great. And just a few quick updates on the other end: we’ve incorporated what we scraped from the website. So now we can answer, okay, what is the service? Because most of the time it’s about the availability. So right now we can also answer, okay, what are the procedures, what exactly is the service, as you can see here in the example.
202 00:23:02.550 ⇒ 00:23:14.700 Amber Lin: And next, Shannon and Grace did a great job. I think we’re pretty much done with updating the Google Docs. Is that the case, Yvette and Janiece?
203 00:23:15.730 ⇒ 00:23:33.349 YvetteRuiz: So, yeah, I went in and updated everything that Grace and Shannon gave feedback on. I checked with them; they were scrubbing a little bit more, so I will be checking it today, because Shannon just came back from vacation. So I’ll finish off, but I’ve already updated everything that they’ve had
204 00:23:33.350 ⇒ 00:23:39.699 Amber Lin: Great, that will make our responses a lot more up to date, and I really appreciate that
205 00:23:39.860 ⇒ 00:23:40.450 YvetteRuiz: Yep.
206 00:23:40.640 ⇒ 00:24:06.449 Amber Lin: So the last few updates: our engineers have worked on improving the speed. And right now, to have a safe estimate, we’ve improved it by 1.5 seconds, but, as you can see here, it might be more than that. I just wanted to be safe and not give an overestimate. So we’re working on improving that. And if it’s
207 00:24:06.500 ⇒ 00:24:11.450 Amber Lin: if it’s a shorter time, it means when we expand, it will be a lot easier
208 00:24:11.450 ⇒ 00:24:27.359 Uttam Kumaran: Yeah, just to pause here. You know, when we started the project, our average was around 20 to 30 seconds. I think our goal, when we talked about it 2 weeks ago, was about 10 seconds. And for me, what I told the team is, I said 3 seconds. So
209 00:24:27.440 ⇒ 00:24:49.740 Uttam Kumaran: that’s where we want to go: we want the responses for as many questions as possible to be near real time. There will be questions that maybe take a lot of recall, and those will require quite a bit more time. But we know that CSRs are on the phone, multitasking to interact with the bot. And so our job is to have
210 00:24:49.870 ⇒ 00:25:03.160 Uttam Kumaran: their ability to ask a question, then ask a follow-up question and sort of get to the answer, be really fast. They may not have, you know, 15, 30, 45 seconds, a minute, to do this. And again, all of this is sort of
211 00:25:03.310 ⇒ 00:25:07.372 Uttam Kumaran: trying to attack the 1st call resolution problem.
212 00:25:08.090 ⇒ 00:25:29.709 Uttam Kumaran: right? And so one of the things that we will report on in this meeting, too, is how we’re doing around average execution time. For the questions we do have that are outliers, right, the questions that are taking 10, 15, 20 seconds, how can we begin to optimize those? But I’m kind of proud of how far we’ve come, just, you know, in the short time
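The execution-time outlier report Uttam describes, surfacing questions that blow past the ~3-second target so they can be optimized, can be sketched as a simple filter. The questions and timings below are invented for illustration:

```python
# Target from the discussion: responses should land in about 3 seconds.
target_seconds = 3.0

# Illustrative per-question average response times, in seconds.
timings = {
    "Who covers this zip?": 1.2,
    "What is in the pest bundle?": 2.1,
    "Termite warranty details?": 14.8,  # heavy-recall question, an outlier
}

# Everything over target goes into the weekly optimization list.
outliers = {q: t for q, t in timings.items() if t > target_seconds}
print(outliers)  # → {'Termite warranty details?': 14.8}
```

Reporting outliers per question, rather than only the overall average, points the engineers at exactly which lookups need work.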
213 00:25:29.830 ⇒ 00:25:43.890 YvetteRuiz: Yeah, Janiece actually jumped on phones this week because we got hit with the phone call volume, so it gave her a perfect opportunity to do some live testing. Do you want to share some of that feedback on the response time?
214 00:25:43.890 ⇒ 00:25:55.439 JanieceGarcia: It seemed like at 1st it was taking a little bit longer, but of course that was just me. But I do think the response time, after playing with it and after being on the phones, it is
215 00:25:55.800 ⇒ 00:25:56.840 JanieceGarcia: really quick.
216 00:25:57.130 ⇒ 00:26:05.400 JanieceGarcia: So it’s nice. You’re not literally waiting there. It’s probably just about even with our evolve system as well. So
217 00:26:06.050 ⇒ 00:26:15.509 Uttam Kumaran: Do you have a sense, I guess, 2 follow-up questions. Do you have a sense, were there any questions that you asked, or question types, that took a bit longer, or were they all pretty even?
218 00:26:16.294 ⇒ 00:26:28.115 JanieceGarcia: The termite one and mosquito, because we’re in those seasons, and termites are really starting to come up and kick in. So with those, the response times were,
219 00:26:28.510 ⇒ 00:26:28.865 Uttam Kumaran: Okay.
220 00:26:29.220 ⇒ 00:26:31.789 JanieceGarcia: But other than that? No.
221 00:26:31.890 ⇒ 00:26:47.119 JanieceGarcia: And when it came to asking, okay, who does this service, to be able to send out for a job completion or a reservice on the customers, the service that they already have, those were really fast. It was more a matter of
222 00:26:47.730 ⇒ 00:26:53.940 JanieceGarcia: what type of service, what is being done in these services, not necessarily who do we send out
223 00:26:55.360 ⇒ 00:27:02.999 Uttam Kumaran: And then one more question there. Did you find the responses were pretty readable while you were live on the call?
224 00:27:03.330 ⇒ 00:27:07.631 JanieceGarcia: They are. Yes, they are. It’s nice to see that they’ve shrunk
225 00:27:07.990 ⇒ 00:27:08.550 Uttam Kumaran: Yeah, and it
226 00:27:08.550 ⇒ 00:27:25.149 JanieceGarcia: They’re not these long paragraphs anymore, so yes, for sure. Just the mosquito stuff, and like I said, I’ll send that to you, Tom, but the mosquito stuff did kind of scare me, because it was spitting out more than what I’m asking, and then
227 00:27:25.570 ⇒ 00:27:32.120 JanieceGarcia: was kind of off. But I think with that, you know, that’s something that we can work on together for sure.
228 00:27:32.120 ⇒ 00:27:47.700 Uttam Kumaran: Okay, great. I think there’s probably one more layer on the presentation of the responses, where we can use better formatting or emojis, or better spacing. I think there’s still one more push I want to do there. And then, certainly, again, I want to start to be able to measure
229 00:27:47.900 ⇒ 00:28:04.239 Uttam Kumaran: what responses are we sending that are longer than X amount of characters, and why? Right? And again, now that we’ve kind of gone through the bulk of shaping all the responses, our job is to sort of do the 80/20 and find
230 00:28:04.240 ⇒ 00:28:04.600 JanieceGarcia: Right.
231 00:28:04.600 ⇒ 00:28:17.979 Uttam Kumaran: the real sticklers and go after them. So that’ll be what I’m pushing the team on: to both measure, and then every week show that we’ve isolated some of those and we have a path towards mitigating that
232 00:28:18.620 ⇒ 00:28:34.670 JanieceGarcia: Because it seems like, and I know Amber and I have talked about, you know, trying to offer the “oh, by the ways,” or give a little bit more information. But there was just some stuff where I’m like, wait a minute, no. It’s almost too much information when we ask just a specific question.
233 00:28:35.100 ⇒ 00:28:39.843 JanieceGarcia: But I was really playing with it and digging in, even after the phone call
234 00:28:41.090 ⇒ 00:28:50.549 MattBurns: So, Janice, on this recent example where you were on the phone, I mean, you were typing that in and getting a written response back from the bot, essentially?
235 00:28:50.870 ⇒ 00:28:52.559 JanieceGarcia: Yes, sir, definitely.
236 00:28:52.560 ⇒ 00:28:54.380 MattBurns: Okay. Cool. Okay.
237 00:28:54.530 ⇒ 00:29:06.669 YvetteRuiz: Just to add, we talked to Grace and Shannon. So, Matt, we have a couple of new hires, so we’re gonna test that with them this coming week. I think that’s gonna be some good, you know,
238 00:29:07.460 ⇒ 00:29:09.860 YvetteRuiz: feedback from them, for sure, because what are they asking?
239 00:29:10.500 ⇒ 00:29:12.666 MattBurns: Yup, no doubt. That’s
240 00:29:13.350 ⇒ 00:29:19.440 MattBurns: that’s definitely gonna, you would think, certainly help the learning curve on a new person, for sure. Yeah.
241 00:29:20.500 ⇒ 00:29:44.989 JanieceGarcia: Absolutely. And that’s what I’m thinking when it comes to the responses. Because if I’m asking, okay, what is mosquito suppression? And it’s responding back, saying, oh, mosquito suppression! And then it kind of lists all of our different mosquito systems, the misting, the suppression. I mean, it gives you everything. And I want it to just focus on what we’re asking. And
242 00:29:44.990 ⇒ 00:29:45.916 YvetteRuiz: One user,
243 00:29:46.640 ⇒ 00:29:47.200 MattBurns: Yeah.
244 00:29:50.380 ⇒ 00:29:51.340 Amber Lin: Sounds good
245 00:29:51.918 ⇒ 00:30:17.789 Amber Lin: let me quickly run through these, so we can have some time to talk about phase 2. So this is a nice little thing that we have. This is not in the Google Chat; this is sort of a separate number that people can call to schedule. Right now we’re working to incorporate it into the scheduling system, but when I called it, it was pretty cool. It sounded like a real person. It answered, asking for my address.
246 00:30:18.120 ⇒ 00:30:47.030 Amber Lin: I asked it, what if I’m not home? It answered just like what we have in the data sheet. I asked, oh, what if the bugs come back? It was like, oh, don’t worry, we have a warranty to cover this. And then it told me, oh, by the way, do you want this other service? So that was really good. When I asked how much it costs, it did hallucinate, because we don’t have data on that yet, but the bot sounded pretty confident. It was nice to work with, and if you guys have time, you could just try it out and call that number
247 00:30:48.430 ⇒ 00:30:49.840 Steven: Very cool. Yeah.
248 00:30:49.840 ⇒ 00:30:50.380 Amber Lin: This is
249 00:30:50.380 ⇒ 00:30:54.630 YvetteRuiz: I do want to test that. I know I was talking, you know, about that yesterday, so I’m excited
250 00:30:54.630 ⇒ 00:30:56.345 Amber Lin: Yeah, check that out.
251 00:30:56.950 ⇒ 00:30:58.860 Amber Lin: Note that number down. Okay.
252 00:30:58.860 ⇒ 00:30:59.250 YvetteRuiz: Honestly.
253 00:30:59.250 ⇒ 00:31:08.839 Amber Lin: See, this is pretty important, and we are trying, we’re wanting to deploy it to all CSRs by April 11th, right? And so right now.
254 00:31:08.840 ⇒ 00:31:11.340 Uttam Kumaran: Yeah, maybe I’ll just pause there. So
255 00:31:11.340 ⇒ 00:31:11.750 Amber Lin: Okay.
256 00:31:11.750 ⇒ 00:31:34.490 Uttam Kumaran: One of the things that Yvette and I spent time talking about yesterday, and one of the things I said, is, I think we can continue to make optimizations here. We wanted to set a pretty hard deadline and a goal for how we can roll this out to the 25-plus CSRs. I said April 11th is pretty fair. Of course
257 00:31:34.630 ⇒ 00:31:36.969 Uttam Kumaran: we’re going to look to do that
258 00:31:37.200 ⇒ 00:31:46.410 Uttam Kumaran: before then. We’re working on a rollout plan right now that is sort of a phased approach where, similar to Shannon and Grace, we sort of bring on
259 00:31:46.833 ⇒ 00:32:14.150 Uttam Kumaran: CSRs to use the system, we get feedback from them, and we go through these iterations of fixing things. And ideally, our goal is to roll this out, to make it available. I think part of what, Amber, we may have skipped is Janice being able to join us in our daily stand-ups and our sprint work, directly with our engineering team. We
260 00:32:14.320 ⇒ 00:32:38.870 Uttam Kumaran: we meet as a team every day about this project, and on Mondays and Fridays we do what’s called planning and, like, a retrospective on how the week went. One of the things I asked Yvette is, it would be really, really helpful to sort of close the loop and bring Janice directly into that meeting to talk directly with our engineers and help Amber plan what the priority is there, especially as we look to involve
261 00:32:39.020 ⇒ 00:32:52.640 Uttam Kumaran: 5, 10, and up to 25 folks here. We definitely need some feedback on how to best onboard them, how to, you know, create some internal champions of the tool on the team. But
262 00:32:52.700 ⇒ 00:33:09.889 Uttam Kumaran: I’m still very committed to trying to get this out, not in a few months, but in like a few weeks. And so we want to sort of balance both. Yvette mentioned that, and I think that’s good to go on that. But maybe, Janice, if you just have any thoughts or ideas there
263 00:33:10.300 ⇒ 00:33:15.289 JanieceGarcia: No, Yvette did talk to me about it this morning. I think that’s a great idea. I mean, especially since
264 00:33:15.710 ⇒ 00:33:32.909 JanieceGarcia: I’m carving out specific time every single day. And I think doing that is gonna keep me, especially like with my question that I had today, I will know. You know, instead of emailing you guys, we can talk about that, because I know y’all have your meetings. Amber had told me about that as well. So I think that’s, yes,
265 00:33:33.090 ⇒ 00:33:33.530 JanieceGarcia: please.
266 00:33:33.970 ⇒ 00:33:34.610 Uttam Kumaran: Perfect.
267 00:33:34.840 ⇒ 00:33:36.220 Amber Lin: Great to hear.
268 00:33:36.800 ⇒ 00:33:55.810 Amber Lin: So for next week, we want to test it with around 5 CSRs. I know you guys have some new hires, so testing with them will allow us to see, okay, what are we not doing as well? What do we need to ramp up so that we can roll it out to all of the CSRs?
269 00:33:55.810 ⇒ 00:34:16.329 Amber Lin: And we have a more detailed rollout plan, but this is sort of what we are aiming for over here. So right now we’re at the end of the 3rd week of March, and we want to have group testing and adjustments, and by the end of the 1st week of April we have it fully deployed
270 00:34:16.330 ⇒ 00:34:21.779 Scott_Harmon: Can I ask a question about that second line item, the document line item?
271 00:34:21.780 ⇒ 00:34:22.500 Amber Lin: Hmm.
272 00:34:22.719 ⇒ 00:34:23.589 Scott_Harmon: And
273 00:34:23.959 ⇒ 00:34:37.979 Scott_Harmon: so part of the project, you know, for phase one was to take these documents that were sort of put together in various formats and just created, I use the term ad hoc, and then to
274 00:34:38.079 ⇒ 00:34:42.269 Scott_Harmon: clean them up a bit and make them more readable,
275 00:34:42.529 ⇒ 00:34:48.599 Scott_Harmon: and also make it so that the training bot can actually update them going forward. So we have sort of a cleaner,
276 00:34:48.749 ⇒ 00:34:54.989 Scott_Harmon: a cleaner set of docs. Along the way we’ve learned a lot. We’ve encountered an entire, you know,
277 00:34:55.159 ⇒ 00:34:57.179 Scott_Harmon: We’ve gone through the spreadsheets.
278 00:34:57.509 ⇒ 00:35:06.819 Scott_Harmon: you know, and better understood them. We’ve gone through the “oh, by the way” documents, which I didn’t know about, and that’s been exciting. I heard
279 00:35:07.079 ⇒ 00:35:09.589 Scott_Harmon: this week we learned about another one
280 00:35:09.999 ⇒ 00:35:14.979 Scott_Harmon: which were the churn, I forgot what they were called. Tom, they’re kind
281 00:35:14.980 ⇒ 00:35:17.930 Uttam Kumaran: Save tactics.
282 00:35:18.250 ⇒ 00:35:21.592 YvetteRuiz: Save tactics. So that’s another kind of document.
283 00:35:22.290 ⇒ 00:35:26.450 Scott_Harmon: We decided we needed to scrape the website because
284 00:35:26.730 ⇒ 00:35:33.769 Scott_Harmon: there are some descriptions of what’s included in a service that don’t exist in a document; it’s on the website. So we scraped that in. So
285 00:35:34.310 ⇒ 00:35:41.259 Scott_Harmon: so this says updating documents. I guess I’m asking for a status of, do we have a clean doc set, or
286 00:35:41.890 ⇒ 00:35:45.980 Scott_Harmon: you know that I can now go, or Matt or anybody could go now, read
287 00:35:46.220 ⇒ 00:35:57.619 Scott_Harmon: and see all this stuff kind of in one place? Or is that just an always-ongoing effort? Or can you tell me a little bit more about the status of that second bar, getting those documents kind of
288 00:35:58.530 ⇒ 00:35:59.999 Scott_Harmon: ready for AI
289 00:36:00.210 ⇒ 00:36:16.941 Amber Lin: Totally, they’re pretty much ready for AI. What we’re working on right now is just to update individual parts and add in all the other information that wasn’t provided earlier. So we’re incorporating, say, pricing data with
290 00:36:17.470 ⇒ 00:36:19.358 Amber Lin: the service part,
291 00:36:19.980 ⇒ 00:36:31.190 Amber Lin: the part that was mentioned yesterday, when he was talking with Yvette. But right now we have a central document of all the information in one place.
292 00:36:31.510 ⇒ 00:36:35.009 Amber Lin: This is the document. It has all the information
293 00:36:35.180 ⇒ 00:36:43.299 Amber Lin: in one place that makes it really easy for the AI Bot to access. And we also have these spreadsheets
294 00:36:43.410 ⇒ 00:36:44.430 Amber Lin: of
295 00:36:44.580 ⇒ 00:36:47.100 Scott_Harmon: The availabilities and whom
296 00:36:47.100 ⇒ 00:36:50.800 Scott_Harmon: We were gonna beautify these spreadsheets. I think we had a to-do to do that
297 00:36:50.800 ⇒ 00:36:57.270 Uttam Kumaran: Yes, it’s still on the to-do list. And then maybe, Amber, just to go back to the central doc, and I know
298 00:36:57.270 ⇒ 00:36:57.605 Amber Lin: Because
299 00:36:57.940 ⇒ 00:37:08.596 Uttam Kumaran: Because Matt is just sort of looking at this for the 1st time: one of the things that we did is we basically consolidated all text-based information into one document.
300 00:37:09.310 ⇒ 00:37:16.870 Uttam Kumaran: There are a couple of benefits to this. One, there were a hundred documents, so having one place speaks for itself.
301 00:37:16.870 ⇒ 00:37:43.349 Uttam Kumaran: The second piece is that our ability to update this and maintain this with new information is really, really crucial. So one of the things that we’re working on right now is the ability for CSRs to come in here and propose changes if they notice that there is a new policy. We even found that Shannon and Grace were both, hey, we need to update these things, let’s just go into this doc and make the update. So there’s 1 single place
302 00:37:43.631 ⇒ 00:38:03.608 Uttam Kumaran: to go make the updates. Second, we’re actually building another chat bot that will help you actually go make those updates. This is a pretty big document; it has a lot of information. And so we’re building a chat bot where you can say, I want to go update this specific part of the document, and it will guide you and actually go make the change for you.
303 00:38:03.890 ⇒ 00:38:04.879 Scott_Harmon: I just wanna
304 00:38:05.040 ⇒ 00:38:15.690 Scott_Harmon: really just emphasize what you said, Matt. If you go all the way back to when we started this, I think it started out as a knowledge management kind of discussion, and then it came up in another conversation. And I,
305 00:38:16.000 ⇒ 00:38:25.249 Scott_Harmon: I said, look, if we could have AI clean up the knowledge and manage it and keep it current, that is by itself a huge win. So
306 00:38:25.760 ⇒ 00:38:33.249 Scott_Harmon: I think we’re very far down the road to doing that where, instead of having knowledge all over the place
307 00:38:33.430 ⇒ 00:38:41.359 Scott_Harmon: and confusion about who owns it and whether it’s current. And so we’re coming up with pretty close to a centralized source of truth here,
308 00:38:41.720 ⇒ 00:38:44.060 Scott_Harmon: with good controls on
309 00:38:44.580 ⇒ 00:38:50.139 Scott_Harmon: keeping, you know, and being able to update it, finding out where there are gaps. And then continually.
310 00:38:50.440 ⇒ 00:38:53.480 Scott_Harmon: you know, adding to this corpus of knowledge, which was.
311 00:38:54.150 ⇒ 00:38:57.900 Scott_Harmon: and I’m glad we did it with AI and not a knowledge-base tool, because I think it’s
312 00:38:58.620 ⇒ 00:39:02.209 Scott_Harmon: gonna be cheaper and faster. So I just really wanted to.
313 00:39:02.400 ⇒ 00:39:04.639 Scott_Harmon: You know, that was a huge goal coming in
314 00:39:04.640 ⇒ 00:39:09.100 MattBurns: No, agreed, and that’s great. Just like I said to Yvette, we would just have rules in place as to who can
315 00:39:09.100 ⇒ 00:39:09.780 YvetteRuiz: Yes.
316 00:39:09.780 ⇒ 00:39:16.430 MattBurns: alter the document. And you don’t want to create kind of a Wikipedia situation where you can just go in and change things. So. But
317 00:39:17.320 ⇒ 00:39:18.729 MattBurns: yeah, that’s why
318 00:39:18.730 ⇒ 00:39:20.759 Scott_Harmon: That would be great. Yeah, absolutely.
319 00:39:20.760 ⇒ 00:39:21.569 YvetteRuiz: Well, then.
320 00:39:21.570 ⇒ 00:39:24.350 Scott_Harmon: You have in the trainer bot, what we call the trainer bot, Matt,
321 00:39:24.350 ⇒ 00:39:24.750 YvetteRuiz: That’s
322 00:39:24.750 ⇒ 00:39:28.080 Scott_Harmon: a bot that can go add to this as we get into it.
323 00:39:28.580 ⇒ 00:39:33.070 Scott_Harmon: We’re going to want to have more rules about who can tell the trainer bot to add what
324 00:39:34.100 ⇒ 00:39:38.520 Scott_Harmon: So I just want to flag for you, there are also still some,
325 00:39:39.130 ⇒ 00:40:02.999 Scott_Harmon: I don’t call them knowledge gaps, but just places where ABC is still a little conflicted on where the total source of truth is, a lot of it around service offerings and prices and stuff like that. That is still a little hazy, and I think we can work together to tighten that down. But I think we now have a good, solid, single source of truth here
326 00:40:03.120 ⇒ 00:40:04.460 Scott_Harmon: to start building on
327 00:40:05.430 ⇒ 00:40:22.370 Amber Lin: Yeah. And Matt, just to show you what the training bot actually does. So here’s an example of the update feature that we showed last week. We asked it, okay, I need to update the thermosol docs. It pulls the docs up, and then we can look
328 00:40:22.680 ⇒ 00:40:27.349 Amber Lin: and say, okay, we want to add a new service manager,
329 00:40:27.630 ⇒ 00:40:41.959 Amber Lin: and the bot updates it over here. It checks with you, can you confirm that this update is good? And we say, okay, let’s confirm. And so then it writes it into that central document that you just saw.
330 00:40:42.430 ⇒ 00:40:50.620 Amber Lin: And now we have that here. So that’s the process that we’ll use to update
331 00:40:51.020 ⇒ 00:40:59.810 Amber Lin: this document. So it will be a lot easier; you won’t need to scroll through the 91 pages to find where it needs to be
332 00:41:00.660 ⇒ 00:41:02.100 MattBurns: Excellent, good.
333 00:41:04.590 ⇒ 00:41:04.875 Amber Lin: Okay.
334 00:41:05.630 ⇒ 00:41:06.839 Scott_Harmon: Great job, team.
335 00:41:08.590 ⇒ 00:41:23.159 Amber Lin: Our engineers are fantastic, and we have so much help from Yvette, Janice, Shannon, and Grace. And Scott made a fantastic contribution with all the different ideas and keeping us on track. So we really have a great team here.
336 00:41:23.700 ⇒ 00:41:28.319 Amber Lin: Now I’ll just leave it to Uttam to
337 00:41:28.530 ⇒ 00:41:32.219 Amber Lin: go over and discuss the phase 2 documents.
338 00:41:32.630 ⇒ 00:41:34.446 Uttam Kumaran: Yeah. So maybe I’ll just
339 00:41:35.020 ⇒ 00:41:37.650 Uttam Kumaran: you know, I’ll just sort of set
340 00:41:37.810 ⇒ 00:41:59.050 Uttam Kumaran: the lay of the land. I sent the email; I think we talked about it yesterday. Basically, you know, commonly one of the ways you buy software, these solutions, is either on a license basis or on a per-user basis. For us, I really wanted to take the risk and take a bet that
341 00:41:59.090 ⇒ 00:42:19.259 Uttam Kumaran: we are going to drive, you know, benefits to your income statement. And for us, we decided that out of all the metrics you shared with us, being able to allow your CSRs to resolve calls on the 1st call is probably the most specific thing we can affect.
342 00:42:19.652 ⇒ 00:42:39.860 Uttam Kumaran: Most likely this will reduce, you know, people not getting a second call and potentially churning from those, and in the short term we find that our benefit is probably most attributable here. You know, there are a couple of metrics around, what is our 1st call resolution now? How many of our calls are problems?
343 00:42:39.960 ⇒ 00:42:52.730 Uttam Kumaran: How much of an impact does not getting a second call have on that customer churning? How much are those customers worth? So I sort of teed up some of those questions. But that is sort of our goal with this. Like, I don’t think this is,
344 00:42:52.950 ⇒ 00:43:03.607 Uttam Kumaran: it would probably be easier for us to just throw a price on this, but I definitely wanted to give this a shot and see whether we can line up on something that is actually,
345 00:43:04.070 ⇒ 00:43:09.329 Uttam Kumaran: really, a win-win, if we’re able to actually demonstrate that we’re changing these metrics for y’all
346 00:43:11.660 ⇒ 00:43:16.305 MattBurns: Yeah, Steven and I chatted a little bit yesterday. Just it’s
347 00:43:17.650 ⇒ 00:43:21.819 MattBurns: It’s a great goal. And I think it makes a lot of sense, because
348 00:43:24.720 ⇒ 00:43:31.220 MattBurns: you know, at the end of the day, if you help the bottom line, everybody wins, and we can share that.
349 00:43:31.510 ⇒ 00:43:34.939 MattBurns: it’s trying to
350 00:43:36.050 ⇒ 00:43:43.539 MattBurns: really put a dollar value on. It is is difficult, certainly on the front end. It’s difficult.
351 00:43:43.680 ⇒ 00:43:49.210 MattBurns: I mean, in one respect, savings will occur when
352 00:43:49.650 ⇒ 00:43:55.450 MattBurns: we’re more efficient and we can do X amount of business with, truthfully,
353 00:43:55.660 ⇒ 00:44:01.590 MattBurns: fewer Csrs, because they’re just more efficient. You know, they’re handling calls quicker.
354 00:44:01.820 ⇒ 00:44:06.160 MattBurns: They’re getting better resolutions. They’re keeping the customers happier and
355 00:44:06.360 ⇒ 00:44:11.516 MattBurns: and reducing the churn of customers. So
356 00:44:13.320 ⇒ 00:44:18.000 MattBurns: It’s a little hard, Uttam, to say,
357 00:44:18.300 ⇒ 00:44:26.150 MattBurns: how does that relate to pricing for you guys? I know, Yvette, when we
358 00:44:26.330 ⇒ 00:44:28.749 MattBurns: talked a little bit, I think on
359 00:44:28.900 ⇒ 00:44:32.180 MattBurns: the same day group, you know, that does our after hours.
360 00:44:32.530 ⇒ 00:44:36.249 MattBurns: Essentially, their pricing is based on
361 00:44:36.630 ⇒ 00:44:40.230 MattBurns: just, I guess, really, you’ve got the minutes that
362 00:44:40.230 ⇒ 00:44:41.080 YvetteRuiz: Minutes.
363 00:44:41.080 ⇒ 00:44:48.759 MattBurns: it’s used. So that’s pretty tangible. We can certainly say, okay, well, if we use that
364 00:44:49.080 ⇒ 00:44:56.329 MattBurns: after-hours bot to handle, you know, calls, we would otherwise pay people just to be sitting there,
365 00:44:56.783 ⇒ 00:45:10.120 MattBurns: ’cause we have to pay them by the hour, and whether they’re taking a call or not, we’re paying them. With this same day, it’s pretty tangible: okay, I used X amount of minutes from the same day folks, therefore I can
366 00:45:10.520 ⇒ 00:45:12.459 MattBurns: relate that to a price.
367 00:45:14.950 ⇒ 00:45:22.319 MattBurns: The other thought I had, potentially, Uttam, is to say, at least initially,
368 00:45:22.550 ⇒ 00:45:27.129 MattBurns: maybe we do kind of a base-plus type thing, where we go, okay, ’cause
369 00:45:27.370 ⇒ 00:45:32.259 MattBurns: what I’m a little worried about is, all of a sudden we create something,
370 00:45:32.710 ⇒ 00:45:41.819 MattBurns: and these improvements are taking place, but it’s difficult to know
371 00:45:43.270 ⇒ 00:45:43.930 Uttam Kumaran: Totally.
372 00:45:43.930 ⇒ 00:45:51.069 MattBurns: what the dollar value will be. So at the end of the month, all of a sudden we get a bill for $16,000, and we’re going, well, wait a minute.
373 00:45:51.300 ⇒ 00:45:51.779 Uttam Kumaran: I hear you
374 00:45:52.250 ⇒ 00:46:00.300 MattBurns: So that’s hard. But maybe something initially, with a cost-plus, we could say, here’s a base fee we pay you.
375 00:46:00.890 ⇒ 00:46:03.660 MattBurns: Then maybe we can work on these things
376 00:46:04.510 ⇒ 00:46:09.006 MattBurns: so that ultimately we can make it more quote unquote
377 00:46:09.950 ⇒ 00:46:14.797 MattBurns: based on exactly these factors. But
378 00:46:15.940 ⇒ 00:46:20.990 MattBurns: I don’t know this, so it’s hard to place a dollar value on it for me right now.
379 00:46:20.990 ⇒ 00:46:25.279 Scott_Harmon: Well, let me just throw my, yeah, my sense in here. So
380 00:46:25.910 ⇒ 00:46:35.270 Scott_Harmon: you know, again, I’ve got way, way too much experience having done this exact thing, you know, negotiating call center technology pricing for
381 00:46:35.270 ⇒ 00:46:35.870 MattBurns: Oh!
382 00:46:36.200 ⇒ 00:46:43.580 Scott_Harmon: for companies like Dell and AT&T and British Telecom and stuff. So I guess
383 00:46:43.750 ⇒ 00:46:51.128 Scott_Harmon: what I think might be the quickest way to get to something that works for ABC and and for Brainforge is
384 00:46:52.390 ⇒ 00:47:02.609 Scott_Harmon: a 45-minute-to-an-hour pricing workshop on a whiteboard, you know, in your office, Matt, if we could. My experience is you’ve got to sit down with the business owner
385 00:47:03.280 ⇒ 00:47:06.680 Scott_Harmon: and just throw 3 or 4 concepts up on the board.
386 00:47:06.840 ⇒ 00:47:13.249 Scott_Harmon: and until you get one that aligns with both. I think there are several ideas floating around here that are all good.
387 00:47:15.220 ⇒ 00:47:20.560 Scott_Harmon: I’ll toss out another one just to think about, which is,
388 00:47:21.160 ⇒ 00:47:27.740 Scott_Harmon: one way to think about. Andy is as an expert that you might hire in a company.
389 00:47:27.870 ⇒ 00:47:45.259 Scott_Harmon: And right now that expertise is provided fractionally by Janice and Yvette and a couple of other people. But let’s say, Matt, you go, boy, I’m gonna hire, typically in the call center world these would be called level 2 or, you know, level 3 specialists, we’re gonna hire one of them. What would they cost me a year?
390 00:47:45.610 ⇒ 00:47:49.750 Scott_Harmon: You know, burdened, maybe they cost me a hundred grand. Great. Okay?
391 00:47:50.170 ⇒ 00:47:54.429 Scott_Harmon: How busy would they be, right? So you could kind of base it off of,
392 00:47:55.080 ⇒ 00:48:00.370 Scott_Harmon: gosh, I don’t have to hire a specialist now for pest, right? And I’m getting all these. So
393 00:48:00.780 ⇒ 00:48:07.019 Scott_Harmon: But in my experience, it’s better to go to a whiteboard and just kind of throw numbers down, right? The
394 00:48:07.620 ⇒ 00:48:09.759 Scott_Harmon: the things they have to meet
395 00:48:09.910 ⇒ 00:48:15.040 Scott_Harmon: for ABC, they have to be, you know, no surprises,
396 00:48:15.040 ⇒ 00:48:15.570 MattBurns: Yeah.
397 00:48:15.570 ⇒ 00:48:19.979 Scott_Harmon: and things that I can really measure and believe are a business value.
398 00:48:20.540 ⇒ 00:48:24.700 Scott_Harmon: And for Brainforge, what they need to be is
399 00:48:25.670 ⇒ 00:48:30.500 Scott_Harmon: we’ve got some amount of upside based on how much we can impact the business
400 00:48:30.500 ⇒ 00:48:30.970 MattBurns: Yup!
401 00:48:30.970 ⇒ 00:48:33.130 Scott_Harmon: And I just want to
402 00:48:33.520 ⇒ 00:48:41.180 Scott_Harmon: underline what I think is cool about Uttam’s approach, which is, we’re not just coming in here with a 4-cents-a-question kind of a,
403 00:48:41.780 ⇒ 00:48:42.150 MattBurns: Yeah.
404 00:48:42.150 ⇒ 00:48:48.050 Scott_Harmon: oh, SaaS-like model, because we don’t like those. I don’t think you like them, Matt. So
405 00:48:48.470 ⇒ 00:48:50.870 MattBurns: No, I appreciate that. And again, it’s
406 00:48:51.388 ⇒ 00:48:54.690 MattBurns: like I said, if we can ultimately
407 00:48:54.940 ⇒ 00:49:00.640 MattBurns: quantify a lot of this, again, it’s a win-win, and we say that a lot, where
408 00:49:00.640 ⇒ 00:49:01.210 Scott_Harmon: Yeah.
409 00:49:01.210 ⇒ 00:49:03.049 MattBurns: if we know that,
410 00:49:03.260 ⇒ 00:49:10.089 MattBurns: and again, we’re gonna get some of this accomplished, there’s certainly a savings to the bottom line. So
411 00:49:10.270 ⇒ 00:49:22.990 Scott_Harmon: Yeah. So my recommendation, tactically, is just, if we could carve off time, I’ve just had tremendous success getting in a room and going to a whiteboard, and the answer tends to emerge real quickly.
412 00:49:23.472 ⇒ 00:49:29.610 Scott_Harmon: I’m sure, I don’t know what everyone’s timing is like, but I know, you know,
413 00:49:29.850 ⇒ 00:49:36.559 Scott_Harmon: in the past I’ve made enormous progress. So that’s what I would recommend next: if we could find a way to squeeze in a
414 00:49:37.640 ⇒ 00:49:40.899 Scott_Harmon: 45-minute or an hour work session, I bet we’d solve it.
415 00:49:42.700 ⇒ 00:49:46.900 MattBurns: Yeah, I’m up for that. Steven, any thoughts from you on it?
416 00:49:47.230 ⇒ 00:49:50.839 Steven: Yeah, no, I think that covers it. I mean, what we don’t want, because, yeah, we
417 00:49:51.740 ⇒ 00:50:11.220 Steven: obviously, it’s still new to us, we don’t have a result yet. Y’all could say, you know, we trust you, y’all are working. Like I said, I like that we’re both aligned, in that if y’all provide better results, you make more money, and if y’all provide better results, we save more money. So that’s the line we want to get to. But it’s hard to see that right now, because we don’t know what we’re gonna grade. And so, yeah, if we get there,
418 00:50:11.220 ⇒ 00:50:25.399 Steven: it’s almost like Matt said: we start out with some kind of cost-plus and tinker with it and play with it until we kind of figure out something that is tangible enough for us to say, yeah, you know. Oh, by the way, I think another one that we’ve talked about is 1st call resolution. How do you put a price on that?
419 00:50:25.604 ⇒ 00:50:31.730 Steven: We know it’s worth something. Is it as valuable now as it will be in the future? We don’t know. But yeah, I think just sitting down and
420 00:50:32.180 ⇒ 00:50:35.459 Steven: saying we trust you all, we love that y’all are working with us and not just
421 00:50:35.700 ⇒ 00:50:41.030 Steven: throwing money at it. But yeah, we have to be able to save money to justify it, and y’all have to be able to make money to justify it. So
422 00:50:41.313 ⇒ 00:50:46.139 Scott_Harmon: If you guys want to shoot us a couple of times that we could squeeze something in
423 00:50:46.140 ⇒ 00:50:46.560 Uttam Kumaran: Perfect.
424 00:50:46.560 ⇒ 00:50:47.860 Scott_Harmon: I bet
425 00:50:48.586 ⇒ 00:51:01.400 Scott_Harmon: I bet. And we could probably do an initial deal. Again, maybe we just, I think we’re gonna limit phase 2 to pest. So I’m sure we could come up with kind of a, hey, let’s use this for the next few months, see how it works, and then,
426 00:51:01.870 ⇒ 00:51:10.750 Scott_Harmon: you know, evolve it, and if you want to expand it to another service line like lawn, you know, we can get a little fancier. But I think we want to walk before we run
427 00:51:10.750 ⇒ 00:51:11.450 Uttam Kumaran: Yes.
428 00:51:11.640 ⇒ 00:51:12.159 Steven: Makes sense. Yeah.
429 00:51:12.160 ⇒ 00:51:20.420 Steven: And I like the results-based pricing, because my other fear with this: if you price per question, all of a sudden people think this is a cool tool, so they’re using it for everything, whether they really need it or not.
430 00:51:21.500 ⇒ 00:51:22.569 Scott_Harmon: That’s it!
431 00:51:22.570 ⇒ 00:51:24.310 Scott_Harmon: One I hate, I hate.
432 00:51:24.310 ⇒ 00:51:32.579 Uttam Kumaran: And a lot of companies are doing that. I mean again, this is what I talked to Scott about is like, I have no interest in pricing that way. A lot of companies in this space are doing that.
433 00:51:32.996 ⇒ 00:51:59.950 Uttam Kumaran: I don’t know. Not interested. So for me, I think it’s perfect, like if we can come to an agreement on what that looks like. This was our stab. But of course, even talking to Yvette yesterday, there’s gonna be a lot of assumptions made. And so for us, it’s gonna be balancing that. But also again, if, for example, we’re saving the lifetime value of the customer, but you don’t see that for 5 years, then that’s definitely not aligned, right? And so we wanna come, yeah, go ahead, Scott.
434 00:51:59.950 ⇒ 00:52:14.099 Scott_Harmon: There are 2 call types that are. I’ll just again, we can do this next week on a whiteboard. But there are 2 call types and document types that are directly tied to revenue that I’ve seen so far. You just mentioned one Steven, which is, oh, by the way.
435 00:52:14.310 ⇒ 00:52:25.630 Scott_Harmon: those are upsells if they work. You have a very specific new order for a new service that you didn’t have before that call, so I think it’s pretty easy for us to claim. Hey? We helped
436 00:52:26.250 ⇒ 00:52:32.280 Scott_Harmon: get some percentage of whatever that “oh, by the way” upsell was. The other one I just learned about were these
437 00:52:32.530 ⇒ 00:52:47.680 Scott_Harmon: these “oh, shit” calls! They’re not “oh, shits,” but they’re about to churn, and I’ve got to learn a little more. But you know the value of a churn, right? And if we’re able to jump in when you’re about to lose a customer, you know what the annualized contract value of that customer is.
438 00:52:48.180 ⇒ 00:52:52.070 Scott_Harmon: I would think, at some level if we’re able to help avert.
439 00:52:52.820 ⇒ 00:53:02.200 Scott_Harmon: You know, a customer that was going to churn, we’d be comfortable going, yeah, you know, that’s real value. You know that customer’s worth, you know, 800 bucks a year, whatever it is.
440 00:53:02.490 ⇒ 00:53:12.549 Scott_Harmon: So you know, we could throw that in the mix, but but I think the principles were aligned on. We want to stay as closely tied to real measurable business value as we can.
441 00:53:12.810 ⇒ 00:53:17.320 Scott_Harmon: and then we want to walk before we run, so I like your idea some kind of a base plus
442 00:53:17.600 ⇒ 00:53:18.160 MattBurns: Yeah.
443 00:53:18.410 ⇒ 00:53:19.229 Scott_Harmon: Oh, man!
444 00:53:19.680 ⇒ 00:53:26.966 MattBurns: Well, good! Well, we’ll do that. I’ll work with Steven and Yvette on some times we could get up here and do that
445 00:53:27.970 ⇒ 00:53:28.550 Scott_Harmon: Good.
446 00:53:28.550 ⇒ 00:53:32.410 MattBurns: Yup, go that route. Perfect. Well, yeah, again, this is.
447 00:53:32.590 ⇒ 00:53:42.829 MattBurns: I’ve been keeping up peripherally with what Yvette and Steven tell me. But you know, to see what you guys are doing in a short period of time, it’s very impressive. So kudos to you guys.
448 00:53:43.270 ⇒ 00:53:51.849 Scott_Harmon: And I’ll do the same back. I just think the whole team, but especially Janiece and Yvette, have just been amazing.
449 00:53:52.040 ⇒ 00:53:53.889 Scott_Harmon: And if you don’t have
450 00:53:54.000 ⇒ 00:54:00.659 Scott_Harmon: a team of experts like that that are deeply engaged, you can’t really make this AI thing work.
451 00:54:01.060 ⇒ 00:54:11.810 Scott_Harmon: and they’re clearly invested, you know, in making it work, I think, because it’ll make their life better and easier. And so it’s really great alignment
452 00:54:12.010 ⇒ 00:54:20.569 Scott_Harmon: between the 2 teams. And you know, without that, as smart as Uttam’s team is, I don’t think he’d be able to
453 00:54:21.000 ⇒ 00:54:24.119 Scott_Harmon: be making the gains that he is
454 00:54:24.900 ⇒ 00:54:27.180 MattBurns: No understood, that makes sense right.
455 00:54:27.402 ⇒ 00:54:28.290 Uttam Kumaran: Totally! And you know
456 00:54:28.290 ⇒ 00:54:29.302 YvetteRuiz: Guys. Thank you.
457 00:54:29.640 ⇒ 00:54:47.160 Uttam Kumaran: Yeah, when I met with Yvette yesterday, I took a sec to go see the bullpen with everybody and say hi! And that was really impactful, you know. I think it was great to see how excited Grace and Shannon were, and to see that they were using it, and they were like, well, can I ask it this?
458 00:54:47.430 ⇒ 00:55:11.369 Uttam Kumaran: all those sorts of things? And so it was really, really nice to see. I mean, of course, you know, for me, I always push our team to move faster, and so I want to find ways for us to deliver more, cheaper, faster. But I’m also really, really proud we took some new technologies and really have made them work in our existing system. So, happy
459 00:55:12.940 ⇒ 00:55:25.882 YvetteRuiz: Thank you, guys, I really appreciate y’all’s support. You guys have taken the time to help break down things to better understand, so I’ve learned a lot in the mix of it too. So I’m excited, and thank you for y’all’s support and help
460 00:55:27.270 ⇒ 00:55:28.070 MattBurns: Good deal.
461 00:55:29.650 ⇒ 00:55:47.739 Uttam Kumaran: Perfect, so maybe I’ll send a note after this, just to confirm times for next week, if we can grab, you know, an hour or so. I think I have enough stuff to come prepared to that meeting and kick-start it. So, Scott, I think between us we’ll grab some time to do that
462 00:55:47.740 ⇒ 00:55:51.909 Scott_Harmon: Yeah, I think a whiteboard will nail it pretty quick. Yeah.
463 00:55:52.170 ⇒ 00:55:57.239 Uttam Kumaran: And then I think, Amber, if you want to go to the next slide. So just a couple of things coming up. So
464 00:55:57.552 ⇒ 00:56:27.239 Uttam Kumaran: we’re working with Tim on how we can speed up our ability to deploy the bots. Right now, we’re sort of doing this maybe once or twice a week; we probably will end up having some updates almost every day or every 2 days. So we’re working with him on how to improve the deployment process. Janiece will be joining us, which is amazing. I told the team yesterday as soon as I got back home, so that’ll really close the loop on a lot of things. And again, we’re working on, you know, 4 or 5 things I want us to focus on, just the critical paths
465 00:56:27.591 ⇒ 00:56:41.899 Uttam Kumaran: to getting this rolled out. And then, yes, I’m talking with Brian a little bit about getting some more data around the customers that are canceling, which will hopefully give us some ammo for the meeting that we’ll have next week.
466 00:56:44.480 ⇒ 00:56:45.220 MattBurns: Very good.
467 00:56:46.910 ⇒ 00:56:47.650 YvetteRuiz: Yes.
468 00:56:49.890 ⇒ 00:56:52.256 YvetteRuiz: Well, thank you, Amber. I hope you feel better
469 00:56:53.700 ⇒ 00:56:54.660 Scott_Harmon: Well done! Amber.
470 00:56:54.660 ⇒ 00:56:55.320 Amber Lin: That
471 00:56:56.140 ⇒ 00:56:57.257 Uttam Kumaran: Yeah. Thanks. Amber.
472 00:56:57.630 ⇒ 00:56:59.310 MattBurns: Thank you. Guys. Bye-bye.
473 00:56:59.310 ⇒ 00:57:00.090 Uttam Kumaran: So much appreciate it
474 00:57:00.090 ⇒ 00:57:01.720 YvetteRuiz: Yeah, bye, bye.