Meeting Title: Brainforge x ABC Home and Commercial Date: 2025-02-21 Meeting participants: Uttam Kumaran, Steven, Janiecegarcia, Miguel De Veyra, Yvetteruiz, Scott_Harmon
WEBVTT
1 00:00:16.460 ⇒ 00:00:17.830 Uttam Kumaran: Hey! Good morning!
2 00:00:18.120 ⇒ 00:00:19.279 Scott_Harmon: Hey? How you doing.
3 00:00:19.930 ⇒ 00:00:21.050 JanieceGarcia: Good? How are y’all.
4 00:00:21.810 ⇒ 00:00:23.180 Scott_Harmon: Good to see ya.
5 00:00:23.750 ⇒ 00:00:24.970 JanieceGarcia: You as well.
6 00:00:26.380 ⇒ 00:00:27.810 Scott_Harmon: Morning, pat.
7 00:00:28.730 ⇒ 00:00:29.790 Miguel de Veyra: Hey? Everyone! Good morning!
8 00:00:29.790 ⇒ 00:00:30.350 YvetteRuiz: Guys.
9 00:00:30.840 ⇒ 00:00:40.279 Uttam Kumaran: Hey? Good morning, cool, I think what we can probably just move right along. So I’ll just pull up
10 00:00:40.760 ⇒ 00:00:46.349 Uttam Kumaran: the sort of agenda on our side. And yeah, we have a bunch of stuff to sort of
11 00:00:47.160 ⇒ 00:01:01.540 Uttam Kumaran: demo today a few things in particular that I want to get through is one just talking through where we are on like overall timeline and then we want to demo a few things around Google chat.
12 00:01:01.630 ⇒ 00:01:17.589 Uttam Kumaran: We wanna demo some of the questions that are now being answered by the agent. We do have like a dashboard set up now to show like the data analytics around what we’re collecting and then we can kind of talk about how we wanna
13 00:01:17.600 ⇒ 00:01:39.949 Uttam Kumaran: over the next 2 weeks. I think the main phase will be, how do we actually get this into your environment and get some people testing? So I have a few notes there. So overall on timeline, we? We’re basically on track in terms of getting all of the agent data and being able to answer all the questions right now. The next phase for the second agent is really working on updating the docs.
14 00:01:40.249 ⇒ 00:01:58.200 Uttam Kumaran: At the moment, we’re sort of going through the 1st Manual review, which is just making sure everything. The docs is fine now, and then that way. We know that the initial agent can answer all the questions the second piece is going through and allowing for changes. And so that’ll be the second agent we worked on.
15 00:01:58.512 ⇒ 00:02:15.559 Uttam Kumaran: The next thing I want to, demo, and I don’t know, Miguel, if you want to. Demo the Google Chat agent. This was a big piece that was sort of a little bit of unknown in the in the beginning of the project. But it looks like we’re we’re gonna have success there in terms of having this available as a Google chat
16 00:02:15.700 ⇒ 00:02:20.510 Uttam Kumaran: available to the ABC. Team. So, Miguel, if you’d like to.
17 00:02:20.510 ⇒ 00:02:21.230 Miguel de Veyra: Okay.
18 00:02:21.500 ⇒ 00:02:22.240 Uttam Kumaran: Share that.
19 00:02:25.160 ⇒ 00:02:26.160 Miguel de Veyra: Okay, sure.
20 00:02:30.168 ⇒ 00:02:31.860 Miguel de Veyra: Can everyone see my screen?
21 00:02:31.860 ⇒ 00:02:32.650 Uttam Kumaran: Yes.
22 00:02:33.220 ⇒ 00:02:46.119 Miguel de Veyra: So there’s right now, at least, at least, for now there’s 2 ways we can communicate with the agent. One is by messaging the agent directly. So we’re gonna add this into your workspace, something similar to this. And then
23 00:02:46.240 ⇒ 00:02:50.739 Miguel de Veyra: it’s basically the same as we tested before last
24 00:02:50.850 ⇒ 00:02:58.680 Miguel de Veyra: by this Tuesday where you can just ask questions, and then, you know, it will reply, and then the other way is.
25 00:02:59.090 ⇒ 00:03:01.590 Miguel de Veyra: I think you have to go to test.
26 00:03:03.900 ⇒ 00:03:08.359 Miguel de Veyra: I forgot where the test is. Oh, here and then the other thing is, you have to tag
27 00:03:08.700 ⇒ 00:03:10.850 Miguel de Veyra: ABC bought. And then basically.
28 00:03:11.610 ⇒ 00:03:12.930 Uttam Kumaran: So this is again a group.
29 00:03:13.180 ⇒ 00:03:18.440 Miguel de Veyra: Yeah, like, like in the group where everyone can see it. What is, let’s say.
30 00:03:20.033 ⇒ 00:03:22.099 Miguel de Veyra: chemistry, or something like that.
31 00:03:34.130 ⇒ 00:03:50.749 Uttam Kumaran: So another thing that we’ll sort of talk about today as well is expectations for timing. And what sort of data is most relevant? 2 other things we’re working on for the bot is also being able to share where it’s sourcing its data in the document.
32 00:03:51.386 ⇒ 00:03:58.290 Uttam Kumaran: So like you may, people may be able to say, Go, read more here, if you’d like to go. And it could take a document.
33 00:03:58.929 ⇒ 00:04:09.459 Uttam Kumaran: And then also, if there’s an issue how to escalate that basically. And so it gets flagged to Janice or or others on this, this document is out of date.
34 00:04:09.460 ⇒ 00:04:15.979 Scott_Harmon: Can I double click on that last sentence you had? Who, Tom? So what I heard you say was that
35 00:04:16.529 ⇒ 00:04:26.170 Scott_Harmon: in that when when you start chatting with the bot, it’ll include maybe a link into the document to read more like here. Here’s where I read that or or whatnot.
36 00:04:26.340 ⇒ 00:04:28.810 Scott_Harmon: The second thing, I think you you said
37 00:04:29.600 ⇒ 00:04:31.989 Scott_Harmon: we’ll maybe repeat the second thing you said.
38 00:04:32.788 ⇒ 00:04:38.209 Uttam Kumaran: The second thing I said is, if it’s incorrect, being able to like escalate that
39 00:04:38.943 ⇒ 00:04:44.730 Uttam Kumaran: or in situations where there’s no answer. That’s also an escalation of like a.
40 00:04:45.350 ⇒ 00:04:48.090 Scott_Harmon: So how would that? So you mean, like, the user
41 00:04:48.400 ⇒ 00:04:51.160 Scott_Harmon: would basically say, this doesn’t work or.
42 00:04:51.630 ⇒ 00:04:57.440 Uttam Kumaran: Yeah, so ideally. But yeah, basically, we collect some feedback as in this is incorrect. Or
43 00:04:57.940 ⇒ 00:05:09.549 Uttam Kumaran: if we know that we we don’t have that information that also gets collected. And then we basically have an escalation process for how either that’s like a active escalation, or that goes basically to a
44 00:05:09.880 ⇒ 00:05:16.650 Uttam Kumaran: a field to to Janice or Yvette on like these? Are the questions active questions that aren’t answered.
45 00:05:16.650 ⇒ 00:05:17.540 YvetteRuiz: Terrified.
46 00:05:17.920 ⇒ 00:05:19.589 Scott_Harmon: Right? So they’re in that case.
47 00:05:19.590 ⇒ 00:05:30.489 YvetteRuiz: They come up. Oh, I’m so sorry the ones that we were searching Miguel right. They came up. I’m sorry, but I don’t have access to that information. In those instances those will be flagged up to to us.
48 00:05:30.880 ⇒ 00:05:32.850 Miguel de Veyra: Yep, yeah. This one.
49 00:05:33.130 ⇒ 00:05:37.289 Scott_Harmon: Or or if if the information just isn’t in the document.
50 00:05:37.310 ⇒ 00:05:37.940 YvetteRuiz: Yes.
51 00:05:37.940 ⇒ 00:05:44.529 Scott_Harmon: And the so the bot should be able to say, I’m sorry, you know I the document, or whatever my knowledge base doesn’t
52 00:05:44.710 ⇒ 00:05:48.449 Scott_Harmon: have that, and then you would be flagged, you or Janice, or whomever.
53 00:05:48.969 ⇒ 00:05:52.479 YvetteRuiz: And be able to use the agent to put that knowledge in there.
54 00:05:52.820 ⇒ 00:05:54.630 Scott_Harmon: Okay, got it? Got it? Okay.
55 00:05:54.630 ⇒ 00:06:01.960 YvetteRuiz: Yep, yep, okay. So the red ones that we tagged as that. And then maybe some of the yellow ones that weren’t clear or something.
56 00:06:03.090 ⇒ 00:06:06.350 Miguel de Veyra: Yep, the discounts. We don’t have access to that. Still.
57 00:06:06.350 ⇒ 00:06:06.890 YvetteRuiz: Right.
58 00:06:07.280 ⇒ 00:06:07.850 Miguel de Veyra: Yeah.
59 00:06:08.691 ⇒ 00:06:16.259 YvetteRuiz: Quick question. I’m so sorry the Google Chat. You guys said that it was going to be a group with everyone. Is that what I’m understanding.
60 00:06:16.660 ⇒ 00:06:17.070 Uttam Kumaran: No.
61 00:06:17.070 ⇒ 00:06:18.130 Miguel de Veyra: There’s 2. Okay.
62 00:06:18.130 ⇒ 00:06:22.190 Uttam Kumaran: Yeah, so you’ll be able to. The Csrs will be able to message it directly.
63 00:06:22.190 ⇒ 00:06:23.390 YvetteRuiz: Okay. Okay.
64 00:06:24.068 ⇒ 00:06:28.470 Uttam Kumaran: You can. Also, we can enable this ability to
65 00:06:28.600 ⇒ 00:06:31.140 Uttam Kumaran: have it in a chat with multiple people.
66 00:06:31.540 ⇒ 00:06:46.110 Uttam Kumaran: For example, if you’re in a group. And you want to reference it. You can also do that. But again, we were just showing the functionality. We can. We can turn that off basically but people can go directly into their workspace and send a message.
67 00:06:46.310 ⇒ 00:06:47.320 YvetteRuiz: Awesome. Okay.
68 00:06:47.320 ⇒ 00:06:53.469 Uttam Kumaran: This isn’t live currently in your environment. I will have to be working with Tim to to get this all enabled. But
69 00:06:53.840 ⇒ 00:07:02.809 Uttam Kumaran: it’s it’s like working and hooked up our main concern. There was less about getting a live in your environment, more about making sure that the Google chat can actually work.
70 00:07:03.670 ⇒ 00:07:04.200 Uttam Kumaran: Yes.
71 00:07:08.290 ⇒ 00:07:14.480 Uttam Kumaran: cool. And then, Miguel, do we want to talk about the new Central Doc? You know. And and.
72 00:07:14.730 ⇒ 00:07:14.980 Miguel de Veyra: Yeah.
73 00:07:14.980 ⇒ 00:07:15.600 Uttam Kumaran: On! There!
74 00:07:15.600 ⇒ 00:07:38.050 Miguel de Veyra: Yeah, so basically, Yvette, I shared you this drive, where basically, it contains everything, at least in these folders. It contains every document that we have. And then what I did was basically put, you know, residential sops all into this one, Doc, just to organize this all ppts. I converted them into text format some.
75 00:07:38.560 ⇒ 00:07:57.109 Miguel de Veyra: So everything’s here. And then what I did eventually was add them into an entire, into a Central Doc or the Bible over here. So everything now is, here’s the central. But of course we didn’t really put the updated stuff, I just, you know, fixed it so now, as you can see. I probably have to adjust some of these still.
76 00:07:57.310 ⇒ 00:08:01.430 Miguel de Veyra: but you know it’s most of it are, I would say.
77 00:08:01.730 ⇒ 00:08:05.539 Miguel de Veyra: structured in a way that it’s easy to understand for the bot
78 00:08:05.660 ⇒ 00:08:07.669 Miguel de Veyra: and for other people. Of course.
79 00:08:07.860 ⇒ 00:08:16.449 Uttam Kumaran: So our our 1st goal, of course, was just to get everything in one place. What we can do now is, there’s likely redundant information in several
80 00:08:17.540 ⇒ 00:08:24.619 Uttam Kumaran: that we will use AI to basically start to even consolidate this further. Again, we want this.
81 00:08:24.720 ⇒ 00:08:32.549 Uttam Kumaran: We want 2 things, one. This, of of course, like this has all the information, but most of the people shouldn’t be accessing this doc where you’re like
82 00:08:32.780 ⇒ 00:09:00.020 Uttam Kumaran: just coming in it like fresh face. Ideally, most of the access to the stock is through the bot where you’re answering. You’re asking a question and you need some follow up information. Maybe you can go to the section, or if you’re making an update, and you don’t want to use the bot for an update, or you want to go, make a few lines. You can go directly in here and do that. So those residual documents that are in the other folders we will most likely either leave or sort of keep archived, and then this will sort of become the central source of truth.
83 00:09:00.220 ⇒ 00:09:02.379 Uttam Kumaran: For for knowledge there.
84 00:09:03.570 ⇒ 00:09:10.969 YvetteRuiz: Okay. So when I’m sorry to make sure. So if we wanted to make any adjustments, we could come here already, Utam, and just go ahead and update.
85 00:09:10.970 ⇒ 00:09:12.940 Miguel de Veyra: Correct. Yes, that’s right.
86 00:09:13.490 ⇒ 00:09:32.510 Uttam Kumaran: So one thing that I will be talking to, I think, with Janice about on this topic is, what is the ideal structure for this document right now, it’s a lot of. So we want to work through. What are the key sections, and how we can start to consolidate this without losing fidelity of like the information.
87 00:09:32.929 ⇒ 00:10:01.369 Uttam Kumaran: Right. The nice thing is, everything is in here, and so we can work from this to sort of consolidate it further. The second piece we’ll be working on is sort of the feedback loop that I talked about, which is when there is an escalation. For example. There is a question that is unknown, or there’s an answer that’s incorrect. That someone gives feedback on. How do we actually go through and make that update. That is the second agent that we’ll be working on, which is
88 00:10:01.550 ⇒ 00:10:04.110 Uttam Kumaran: we want to go make this change, basically.
89 00:10:05.370 ⇒ 00:10:11.249 Uttam Kumaran: And so that’s something that we’ll we will. We will be talking about, Miguel. Can I share screen.
90 00:10:11.410 ⇒ 00:10:12.310 Miguel de Veyra: Oh, yeah, of course.
91 00:10:12.310 ⇒ 00:10:15.040 Uttam Kumaran: Yeah, so one thing that
92 00:10:15.280 ⇒ 00:10:21.579 Uttam Kumaran: you know, we started working on, which is this this concept of, you know, this like golden data set.
93 00:10:21.998 ⇒ 00:10:50.940 Uttam Kumaran: We made some good progress here. And I think we’re gonna spend another hour early next week working on this, which is basically starting to build out our core. You know, basically our our evaluation data set for the agent. So the information you’re seeing here and just make this a little bit cleaner to see. What you’re seeing here is you’re seeing a question. You’re seeing a kind of a question type which this is like our almost like our technical question type.
94 00:10:51.430 ⇒ 00:10:55.630 Uttam Kumaran: I may also put more of like a descriptive question type. Here
95 00:10:55.920 ⇒ 00:10:59.259 Uttam Kumaran: we will be working through what the expected answer should be.
96 00:10:59.520 ⇒ 00:11:04.410 Uttam Kumaran: This is actually what the what the the current AI agent gives.
97 00:11:04.780 ⇒ 00:11:07.560 Uttam Kumaran: which is helpful to see, but
98 00:11:07.670 ⇒ 00:11:25.150 Uttam Kumaran: is sort of not good to be like. This is the North Star, right? So we want to create for each of these questions, what is the North Star? And how should it be communicated right now, like my, my takeaway is like, these are pretty. These are probably good structure, but it’s probably a lot of information. Maybe there’s ways for us to.
99 00:11:25.170 ⇒ 00:11:53.480 Uttam Kumaran: because again, they’re on the phone, they probably want to just see just the answer. So those are the things that we will be working on early next week is what is the actual, what is actual feel of the answers? Because currently we’re sort of saying, Give everything and we haven’t given any guidance on the structure of the responses. We’ve made. We mainly just confirm that we can get all the info and like, basically, for the most part, has everything it needs to answer the question. So that’s 1 thing that
100 00:11:53.500 ⇒ 00:12:19.500 Uttam Kumaran: we want to look at. The way we will actually do that is not is in one sense, you know, we’ll go through a bunch of these and sort of like type out the answers. But what I will glean from that is sort of the tone like how robust it needs to be. The the language level like and sort of things like that which we will provide to the agent on like how to actually structure the response. That way for future questions. We don’t need to sort of like.
101 00:12:20.000 ⇒ 00:12:25.081 Uttam Kumaran: basically, we’re we’re trying to give it a sense of like how to actually write the responses?
102 00:12:25.650 ⇒ 00:12:28.023 Uttam Kumaran: and so we will be going through this
103 00:12:28.690 ⇒ 00:12:55.659 Uttam Kumaran: next week. And then in our data, we will actually start to categorize our questions by these types. So you’ll get a sense of what are the questions by type. You’ll also get a sense of the quality score, meaning for questions by that are spreadsheet related retrievals. And again, these are just sort of our decisions for types. Now, these may not stick like what is, what are? What’s the score of the questions that we’re getting in terms of our ability to answer those
104 00:12:56.940 ⇒ 00:13:12.369 Uttam Kumaran: and sort of one thing that we’ve been working on there is is a is a dashboard to actually visualize a lot of that. So this is just we we’ve only had, like, you know, roughly, like 30 conversations. Now that we’ve been logging but basically, now we’re able to see all the conversations.
105 00:13:12.470 ⇒ 00:13:17.438 Uttam Kumaran: conversation, and session and record. There’s like sort of different hierarchy.
106 00:13:18.020 ⇒ 00:13:32.670 Uttam Kumaran: But basically, I wanted to see granularity of every single back and forth associated with a conversation, and then we will start to layer on what’s called like dimensions. Which is who is asking the question, what is the question? Type?
107 00:13:33.187 ⇒ 00:13:41.870 Uttam Kumaran: What is the score of the response the agent gave and then also the time it took to actually return that answer.
108 00:13:42.293 ⇒ 00:14:04.159 Uttam Kumaran: So we’ll start to have all of those and what we will most likely see on the timing piece is the harder questions take longer to answer right? And but that gives us team that gives our team, a little bit of a north star on like, okay, these are the ones we should target. One question I do have for the team on that is, and it was in the agenda which is like, what is the expectations for
109 00:14:04.350 ⇒ 00:14:05.390 Uttam Kumaran: priming?
110 00:14:06.740 ⇒ 00:14:23.440 Uttam Kumaran: you know. I I know. Of course it should instant, is probably the right answer here, but I’m trying to get a sense of what we can set as a group as a goal for execution. Times in terms of returning the right information to the agent when a question is is answered.
111 00:14:24.089 ⇒ 00:14:28.109 Uttam Kumaran: do you guys have a sense of like a upper level for that.
112 00:14:32.310 ⇒ 00:14:34.000 JanieceGarcia: Thinking, go ahead, Yvette.
113 00:14:35.420 ⇒ 00:14:45.300 YvetteRuiz: So make to make sure that I understand. So when we ask the bot question, you’re wanting to kind of what’s what’s the response? Time that we’re looking at. Yeah.
114 00:14:45.300 ⇒ 00:14:48.500 Uttam Kumaran: What would be like the the highest
115 00:14:49.440 ⇒ 00:14:55.069 Uttam Kumaran: that could be cause. We will start to measure our ability to make that happen. Basically.
116 00:14:56.390 ⇒ 00:15:05.579 YvetteRuiz: I’m thinking about. I mean, I don’t want like 30 like 30 seconds that, you know, and then at the highest, maybe the 45 to 45.
117 00:15:05.580 ⇒ 00:15:07.489 YvetteRuiz: Really, that seems very high.
118 00:15:07.800 ⇒ 00:15:11.400 Steven: Yeah, I was, gonna say, like, I was gonna say, like, 10 seconds, 5 to 10.
119 00:15:11.400 ⇒ 00:15:13.460 Uttam Kumaran: I was, gonna say, 3 seconds.
120 00:15:13.460 ⇒ 00:15:14.930 YvetteRuiz: Yeah, okay.
121 00:15:15.670 ⇒ 00:15:43.349 JanieceGarcia: I mean, I’m thinking of, like, if we’re on the phone with the customer, too, and we’re having to look for something right now. Our whole time would be, you know, at the highest would be the 2 min. So I was going to 30 seconds to give that time for the bot to actually look at, and then also be able. Okay, if they’re having to look into the document even further. That gives them a couple more seconds after that. So I was. That’s where my thought process was too.
122 00:15:43.590 ⇒ 00:15:48.480 Steven: Shorter than 30 10 at the most, I mean, because they basically.
123 00:15:48.480 ⇒ 00:15:52.440 Uttam Kumaran: Whatever. Whatever you guys say, I I say, half of that. So.
124 00:15:52.440 ⇒ 00:15:53.580 YvetteRuiz: Well, okay, so.
125 00:15:53.580 ⇒ 00:15:54.290 Uttam Kumaran: That’s what I thought.
126 00:15:54.574 ⇒ 00:15:56.280 YvetteRuiz: I’m gonna go and make 15.
127 00:15:56.280 ⇒ 00:16:12.719 Uttam Kumaran: That’s that actually helps to understand that, like the average whole time is around like 30 seconds to 2 min. That gives us like an upper ceiling of 30 seconds, one which it helps us flag. If anything’s above 30 seconds, we need to sort of isolate that
128 00:16:12.720 ⇒ 00:16:29.740 Uttam Kumaran: second, yeah, we we will be for the easy questions we’re seeing faster execution times. And we’ll start to target like 15 seconds or less, basically and sort of set that as a milestone. And we’ll begin to see across different categories what those sort of
129 00:16:29.870 ⇒ 00:16:31.519 Uttam Kumaran: execution times are.
130 00:16:32.680 ⇒ 00:16:36.680 Scott_Harmon: So in normal, every call center is different. But
131 00:16:37.330 ⇒ 00:16:41.450 Scott_Harmon: I’m kind of towards much closer towards Steven’s view, like, okay.
132 00:16:41.650 ⇒ 00:16:58.450 Scott_Harmon: things that take longer than 5 or 5 or 10 seconds like we have to keep the larger goal in mind here that our goal is to solve calls on problems on the 1st call and not require a call back. So every additional second that this thing takes to respond.
133 00:16:58.660 ⇒ 00:17:05.750 Scott_Harmon: that number is gonna go down in the aggregate. Right. So so for this thing to be really successful.
134 00:17:06.060 ⇒ 00:17:15.319 Scott_Harmon: a a vet Janice need to be able to see you know, a reduction in the requirement for a second call like we solved it on the 1st call. So
135 00:17:15.589 ⇒ 00:17:16.530 Scott_Harmon: you know.
136 00:17:17.210 ⇒ 00:17:24.379 Scott_Harmon: basic human factors in call centers, things that take longer than 5 or 10 seconds. Csrs just won’t do, you know, they’ll just say, Yeah, you know.
137 00:17:24.829 ⇒ 00:17:27.799 Scott_Harmon: you know, they’re just they’re they’re.
138 00:17:27.960 ⇒ 00:17:30.969 Uttam Kumaran: They’ll click off. Click off to the next thing. Yeah.
139 00:17:30.970 ⇒ 00:17:40.430 Scott_Harmon: Yeah, they’ll go to another window, or they’ll pop their head up. So yeah, let’s just let’s just learn from the world here and not overthink this, you know. 5 seconds or less would seem to be
140 00:17:41.270 ⇒ 00:17:42.690 Scott_Harmon: what you know, and.
141 00:17:42.690 ⇒ 00:17:52.380 Steven: And it also, because obviously, it seems like the AI had. That’s 1 advantage of the AI. It has the capability of doing that. If you said, Yeah, there’s no way we get there, obviously. But it should have the capability to find.
142 00:17:52.380 ⇒ 00:17:57.790 Uttam Kumaran: Yeah, for for me, it’s more of an exercise of like prioritization. Like we have.
143 00:17:58.040 ⇒ 00:18:07.429 Uttam Kumaran: we have another like 1015 core things to push forward. I’m sort of like, I want to set a goal and make sure we hit that so that we can continue working on other stuff.
144 00:18:08.380 ⇒ 00:18:14.120 Uttam Kumaran: We can optimize sort of forever there. So I wanna sort of aim for something.
145 00:18:16.870 ⇒ 00:18:18.709 YvetteRuiz: Hey? You, Tom? Real quick.
146 00:18:19.420 ⇒ 00:18:20.840 Steven: How how does it?
147 00:18:21.370 ⇒ 00:18:33.619 Steven: How does it do with? You know, I’m looking at the input there on the top, right with misspellings, you know, like a bundle supposed to be bundle. Would it know that we meant bundle by bundle like, is that? Why.
148 00:18:33.620 ⇒ 00:18:53.400 Uttam Kumaran: It’s it. So yeah, it’s so. It it’ll be totally fine with most of those, unless the misspelling, of course, like, you know, if you’re using Chat Gpt. The tendency for. And again, I I do. This, too, is to just expect it to know exactly what you’re talking about. So you just type, shorter and shorter stuff. And then.
149 00:18:53.530 ⇒ 00:18:59.459 Uttam Kumaran: of course, the quality of the response is not good, the way we will sort of solve, for that is
150 00:18:59.836 ⇒ 00:19:04.369 Uttam Kumaran: and Miguel, I think this is something for our input. Validation is if we don’t have enough
151 00:19:04.830 ⇒ 00:19:07.330 Uttam Kumaran: information, we will ask again.
152 00:19:07.460 ⇒ 00:19:15.390 Uttam Kumaran: right? Like, if it’s just like, what is a service? Then we we can’t. We can respond and take 20 seconds to respond to that. Or we can
153 00:19:15.500 ⇒ 00:19:29.549 Uttam Kumaran: basically immediately identify that. That question is, gonna return a like a dot answer. We should have asked the follow up. So we will solve that a little bit. However, for things like this, it’s it will totally understand the context.
154 00:19:29.550 ⇒ 00:19:36.039 Steven: Because, like, yeah, they wanted, probably, wouldn’t that a Feral ant? It’s actually a Pharaoh ant. My guess is, it wouldn’t figure out.
155 00:19:36.750 ⇒ 00:19:38.769 Miguel de Veyra: You know that was me.
156 00:19:38.890 ⇒ 00:19:39.530 JanieceGarcia: It actually.
157 00:19:39.530 ⇒ 00:19:39.980 Steven: Oh, okay.
158 00:19:39.980 ⇒ 00:19:41.739 JanieceGarcia: Figure. It actually did figure it out.
159 00:19:41.740 ⇒ 00:19:42.620 Steven: Oh, yeah. Okay.
160 00:19:42.880 ⇒ 00:19:44.540 JanieceGarcia: Yeah, yeah. It’s a good.
161 00:19:45.220 ⇒ 00:19:48.650 Uttam Kumaran: I think it’ll be very nice once. This is. Once
162 00:19:49.610 ⇒ 00:19:55.950 Uttam Kumaran: we sort of have more data coming in, you’ll be able to poke at every message response, pair
163 00:19:56.403 ⇒ 00:20:24.056 Uttam Kumaran: and ideally, you can poke at that once we have a lot ideally, we will be looking at. We will basically have another language model that scores this that looks at the message and the response and gives a score. And so we will start to tackle the scores that are the worst, basically understanding whether that’s our. That’s the agent problem, whether we didn’t have the knowledge. You know, or something else. Basically.
164 00:20:30.270 ⇒ 00:20:32.820 Uttam Kumaran: we’ll pause there any questions.
165 00:20:34.596 ⇒ 00:20:35.110 Scott_Harmon: I don’t.
166 00:20:35.110 ⇒ 00:20:37.990 Scott_Harmon: I don’t know if you’ve got any further in the agenda utam I
167 00:20:38.120 ⇒ 00:20:42.440 Scott_Harmon: where are we at on the spreadsheet? Database? Lookup stuff.
168 00:20:42.920 ⇒ 00:20:52.999 Uttam Kumaran: Yeah. So we actually just found, you know, at the moment that the performance was was pretty good with just having the Csvs available to the agent.
169 00:20:53.438 ⇒ 00:21:10.699 Uttam Kumaran: Just writing context. So we haven’t actually had the need to based on the questions that we got from Yvette and Janice. We haven’t had the need to engineer that just yet. And we sort of move forward with just getting the Google Chat working. We will start to be working on some tougher
170 00:21:11.180 ⇒ 00:21:13.840 Uttam Kumaran: questions, and we’ll sort of see whether this
171 00:21:14.310 ⇒ 00:21:19.879 Uttam Kumaran: is breaking down. But at the moment it’s been able to make the associations. Yeah.
172 00:21:19.880 ⇒ 00:21:21.280 Scott_Harmon: Kind of surprised by that.
173 00:21:21.900 ⇒ 00:21:23.079 Uttam Kumaran: I am, too.
174 00:21:23.487 ⇒ 00:21:29.810 Uttam Kumaran: And I don’t know, Miguel, if you have an example of. When we did a look up for
175 00:21:30.020 ⇒ 00:21:32.079 Uttam Kumaran: a service and a Zip code.
176 00:21:32.945 ⇒ 00:21:33.460 Miguel de Veyra: Yes!
177 00:21:33.460 ⇒ 00:21:33.970 Uttam Kumaran: You have one hand.
178 00:21:33.970 ⇒ 00:21:34.460 Miguel de Veyra: Oh!
179 00:21:34.460 ⇒ 00:21:35.330 Uttam Kumaran: Share.
180 00:21:36.341 ⇒ 00:21:40.110 Miguel de Veyra: I do. You have access to the chat? I think we can try it now.
181 00:21:41.870 ⇒ 00:21:43.250 Miguel de Veyra: I’ll send you something.
182 00:21:43.250 ⇒ 00:21:43.980 Uttam Kumaran: Yeah.
183 00:21:47.580 ⇒ 00:21:48.319 Uttam Kumaran: grab it.
184 00:21:49.250 ⇒ 00:21:54.039 Miguel de Veyra: Or maybe Steven, do you wanna try asking it like maybe a Zip code or anything else?
185 00:21:54.040 ⇒ 00:21:57.169 Scott_Harmon: Yeah, I think it’s important. And maybe, do we have Janice
186 00:21:57.340 ⇒ 00:21:59.219 Scott_Harmon: and Yvette? Do we have a?
187 00:21:59.880 ⇒ 00:22:06.579 Scott_Harmon: We need a good list of questions to push this right? So you, you know we’re talking about here right like these are the questions that
188 00:22:06.930 ⇒ 00:22:19.120 Scott_Harmon: you know last week that you broke down to these 2 steps? Do we offer a service? Step one, and then step 2. You know I’m scheduling it after after the clients that they want it.
189 00:22:19.430 ⇒ 00:22:25.299 Scott_Harmon: We need a a good set of questions that push that that are very realistic.
190 00:22:27.310 ⇒ 00:22:40.649 Uttam Kumaran: So yeah, that’s what we’re planning on working early next week. On getting basically that entire Eval data set filled out. I can show, for example. One of them that
191 00:22:42.110 ⇒ 00:22:49.680 Uttam Kumaran: that worked. So we asked, can you give the maintenance tax in the Zip code? And at the moment
192 00:22:50.206 ⇒ 00:22:55.509 Uttam Kumaran: we? We didn’t have right, Miguel. We didn’t have that those sheets, but then.
193 00:22:55.510 ⇒ 00:23:00.929 Miguel de Veyra: Yeah, I I didn’t add the college station, but then we added it, and it was able to locate it.
194 00:23:00.930 ⇒ 00:23:02.869 Uttam Kumaran: It was able to then answer,
195 00:23:05.040 ⇒ 00:23:05.430 Scott_Harmon: So.
196 00:23:05.840 ⇒ 00:23:06.360 Uttam Kumaran: Yeah.
197 00:23:06.360 ⇒ 00:23:12.290 Scott_Harmon: So I I mean, I don’t know how many questions we can get in there in the next week, but I’d like I’d like to see
198 00:23:12.850 ⇒ 00:23:14.449 Scott_Harmon: Yvette and Janice
199 00:23:14.790 ⇒ 00:23:20.359 Scott_Harmon: a bunch of kind of the harder you know the the harder questions that Tech would ask again, so that look up like
200 00:23:21.150 ⇒ 00:23:24.160 Scott_Harmon: that to to see if this thing works, because
201 00:23:26.850 ⇒ 00:23:35.412 JanieceGarcia: And starting with, like, I know, we were looking at text. And when Yvette and I were going through and asking it with Miguel and Shannon.
202 00:23:36.670 ⇒ 00:23:44.459 JanieceGarcia: But what about like the inspector sheets? Or if we actually do service in those areas? Have? Are those questions that you’re looking for.
203 00:23:44.460 ⇒ 00:23:46.810 Uttam Kumaran: Yes. All of the questions. Yeah.
204 00:23:47.330 ⇒ 00:23:51.429 Scott_Harmon: If you were, gonna pick the hardest questions to answer from the sheets
205 00:23:52.269 ⇒ 00:23:57.210 Scott_Harmon: the ones that are trickiest, or whatever you know most common.
206 00:23:57.670 ⇒ 00:24:00.110 Scott_Harmon: you know, I’d like to see 10 or 20 of those.
207 00:24:00.110 ⇒ 00:24:06.300 YvetteRuiz: We just couldn’t ask that time, Miguel, correct me if I’m wrong, because we didn’t have all that loaded just yet. So.
208 00:24:06.300 ⇒ 00:24:06.750 Miguel de Veyra: Yes, yes.
209 00:24:06.750 ⇒ 00:24:18.729 Scott_Harmon: He, but he’s got it loaded now. So I mean the reason. The reason I’m obsessing is number one. You said it was very important number 2. We’re. It’s a question as to whether or not we’re gonna need to.
210 00:24:19.060 ⇒ 00:24:23.980 Scott_Harmon: We can keep the current kind of spreadsheet like structure, or we have to do a little more work
211 00:24:24.130 ⇒ 00:24:27.420 Scott_Harmon: to create a database. That’s a little bit more.
212 00:24:29.540 ⇒ 00:24:32.099 Scott_Harmon: What’s the right word? I’m searching for Tom.
213 00:24:32.850 ⇒ 00:24:33.500 Uttam Kumaran: Yeah.
214 00:24:33.500 ⇒ 00:24:42.119 Uttam Kumaran: whether we need more relationships at the moment it’s figuring it out. But I’m I’m with Scott, and that I’m not like a hundred percent convinced that this will handle the hardest problems.
215 00:24:42.637 ⇒ 00:24:54.419 Uttam Kumaran: So. But also, it’s I actually don’t care much about whether we have the information. At the moment I’m more interested on the questions, cause that’s those are things that those are things that I don’t.
216 00:24:54.540 ⇒ 00:25:01.650 Uttam Kumaran: I have. I can’t. I don’t have. And even if we tried to make those up we would get nowhere. So those are the yeah.
217 00:25:01.650 ⇒ 00:25:07.829 Scott_Harmon: Highlighted these as some of the most important things we could do. And so I just really want to keep a particular
218 00:25:08.290 ⇒ 00:25:11.890 Scott_Harmon: focus on those kinds of questions and answers.
219 00:25:12.440 ⇒ 00:25:17.109 Scott_Harmon: And you know whether we have the right data model, you know, underneath, Tom.
220 00:25:17.320 ⇒ 00:25:17.920 Uttam Kumaran: Yeah.
221 00:25:18.500 ⇒ 00:25:19.090 Scott_Harmon: Okay.
222 00:25:19.820 ⇒ 00:25:27.814 Uttam Kumaran: So we will also. Yeah, I think we’re gonna continue to iterate on this. I I think this is gonna be the highest priority thing for Monday.
223 00:25:28.320 ⇒ 00:25:50.410 Uttam Kumaran: and then, ideally, I think, Janice, even if you have free time, because I we’re we’re sort of working on all the AI agents. Even if you have any free time to go through this and add questions yourself. That’s that would be super super helpful. Basically going through and just literally writing down the question and then writing down what you would expect the answer to be.
224 00:25:50.610 ⇒ 00:25:56.193 Uttam Kumaran: okay, that’s it. Like, that’s the whole exercise. Everything else you can leave.
225 00:25:56.780 ⇒ 00:25:58.869 Uttam Kumaran: every other field you can leave to us. Yeah.
226 00:26:00.280 ⇒ 00:26:00.790 YvetteRuiz: Yeah.
227 00:26:00.790 ⇒ 00:26:24.313 YvetteRuiz: And then I also just like I was talking to yesterday. I are. I am working with our pest leadership team. Because Shannon stepped in with Miguel when I was meeting with Miguel, and I had her ask a couple of questions, but now that she also understands the concept. She’s like, okay, is it? Let’s start working on this. So we’ll be able to provide you more information.
228 00:26:24.640 ⇒ 00:26:53.353 Uttam Kumaran: Yeah. So I I think I’ll do. We’ll do, maybe like an hour session, where, if a couple of us are in here asking questions. I think this will get more robust, and then we will start to continue to evolve like nice thing is this will get more robust as people use the bot. But for this initial set before we can say, Okay, it’s working. We really need this to validate against. So I’ll kind of keep. I’ll just keep poking next week until we can get a get a good amount here.
229 00:26:54.220 ⇒ 00:26:57.570 Uttam Kumaran: and so yeah, this is this one, I would say, is the
230 00:26:57.690 ⇒ 00:27:16.360 Uttam Kumaran: is the most crucial for us to be able to like, prove, show our work like basically show that it’s it’s effective. And additionally, when, as we, as the energy team is making changes, we want to measure that it’s the scores are going up and not meaning our answers to the questions are more in line with what’s expected.
231 00:27:16.490 ⇒ 00:27:18.540 Uttam Kumaran: and then that’s going up over time.
232 00:27:22.610 ⇒ 00:27:23.490 YvetteRuiz: Understood.
233 00:27:24.040 ⇒ 00:27:28.600 Uttam Kumaran: So I’ll make. I’ll just clean the spreadsheet up and make it super super clear. But that would be amazing.
234 00:27:29.053 ⇒ 00:27:38.719 Uttam Kumaran: I think the other items we have to sort of decide is one I’m gonna be working with Tim to get this into your environment. I’ll probably just
235 00:27:39.295 ⇒ 00:27:44.920 Uttam Kumaran: sort of give access to just this group. We do need a name for this.
236 00:27:45.803 ⇒ 00:27:47.290 Uttam Kumaran: Right now.
237 00:27:47.600 ⇒ 00:27:49.429 Uttam Kumaran: It’s just called ABC. Bot.
238 00:27:50.170 ⇒ 00:28:01.250 Uttam Kumaran: I’ll give you a couple of ideas for what people do right now in AI one, a lot of people give it a name like an actual name, because it makes it seem more human.
239 00:28:01.957 ⇒ 00:28:05.570 Uttam Kumaran: Sometimes people give it something. AI,
240 00:28:06.500 ⇒ 00:28:14.979 Uttam Kumaran: I don’t know. I interact a lot with these sorts of bots. I feel like when they’re more human named. You tend to interact it more like a human. If it’s a bot.
241 00:28:15.270 ⇒ 00:28:17.579 Uttam Kumaran: These sort of like are short with stuff.
242 00:28:17.810 ⇒ 00:28:22.669 Scott_Harmon: Okay, if you don’t, guys don’t put. And either in this deal I’m gonna I’m done with you.
243 00:28:24.155 ⇒ 00:28:29.539 YvetteRuiz: Yeah, I was gonna go because we have our our after our bot. We named it Andy.
244 00:28:29.540 ⇒ 00:28:30.140 JanieceGarcia: Sandy.
245 00:28:30.140 ⇒ 00:28:30.580 Uttam Kumaran: Okay.
246 00:28:30.580 ⇒ 00:28:34.049 YvetteRuiz: Andy, cause. That’s Andy, the anteater. Scott. That’s the name.
247 00:28:36.010 ⇒ 00:28:39.829 Scott_Harmon: Okay, I don’t know. I okay, I mean, it’s it’s yours.
248 00:28:40.260 ⇒ 00:28:41.650 Scott_Harmon: I just feel like the art bar.
249 00:28:41.650 ⇒ 00:28:42.070 YvetteRuiz: Because.
250 00:28:42.070 ⇒ 00:28:43.370 Scott_Harmon: Pretty damn cool.
251 00:28:43.370 ⇒ 00:28:44.319 YvetteRuiz: Very seldom.
252 00:28:44.320 ⇒ 00:28:48.070 Scott_Harmon: You have a cool thing to work with. So let’s do this.
253 00:28:48.070 ⇒ 00:28:48.855 Uttam Kumaran: Oh!
254 00:28:50.575 ⇒ 00:28:51.510 YvetteRuiz: That’s.
255 00:28:51.510 ⇒ 00:28:53.629 JanieceGarcia: Anteater that actually talks back.
256 00:28:54.300 ⇒ 00:28:55.014 Uttam Kumaran: Yeah.
257 00:28:55.730 ⇒ 00:28:59.449 Scott_Harmon: Come on, just think of the Logos in the I’m gonna start using the
258 00:28:59.720 ⇒ 00:29:04.440 Scott_Harmon: like stable diffusion and make Logos of a top 5 min.
259 00:29:04.910 ⇒ 00:29:07.690 Uttam Kumaran: He had a marketing for the bot. Yeah. So.
260 00:29:07.690 ⇒ 00:29:09.589 Scott_Harmon: The next hour. Yeah, yeah.
261 00:29:10.770 ⇒ 00:29:19.283 YvetteRuiz: I just envision our coloring book or anteater the front page of our our coloring book, which is Andy, the anteater.
262 00:29:19.670 ⇒ 00:29:23.442 YvetteRuiz: Okay, Andy, be okay, close up. It would be Andy.
263 00:29:25.370 ⇒ 00:29:26.179 Uttam Kumaran: Okay, so we’ll
264 00:29:26.550 ⇒ 00:29:32.450 Uttam Kumaran: we’ll just move forward with one of those. We can keep changing it. But I do agree that naming it a human name.
265 00:29:32.690 ⇒ 00:29:39.269 Uttam Kumaran: There’s some psychology around it. I don’t have back science, but it. This is a human.
266 00:29:39.660 ⇒ 00:29:57.670 Uttam Kumaran: you know, in school I took this class called human computer interaction. This is a big thing about the way humans interact with technology. That I do think that the more you can humanize this, the more people will rely on it like we rely on other people. And that’s, of course, what we’re going for. So.
267 00:29:58.080 ⇒ 00:30:06.029 YvetteRuiz: I think I’ll put a fun spin to it, and I’ll put like a a contest for my team and say, Hey, whoever comes up with the best name.
268 00:30:07.260 ⇒ 00:30:08.610 YvetteRuiz: AI, assistant.
269 00:30:09.580 ⇒ 00:30:10.020 Uttam Kumaran: Easily.
270 00:30:11.370 ⇒ 00:30:15.860 Scott_Harmon: And O over time, I know you guys know this. But we
271 00:30:16.120 ⇒ 00:30:21.570 Scott_Harmon: and this, this is a really is a good fun cultural topic. But you can.
272 00:30:21.810 ⇒ 00:30:29.109 Scott_Harmon: Who? Who? Tom’s team can train this to answer in kind of an ABC. Cultural voice.
273 00:30:29.110 ⇒ 00:30:29.550 Uttam Kumaran: Sure.
274 00:30:29.550 ⇒ 00:30:32.580 Scott_Harmon: So different companies have different values.
275 00:30:33.020 ⇒ 00:30:36.240 Scott_Harmon: They value directness, they value conversation, you know they value.
276 00:30:36.440 ⇒ 00:30:40.090 Scott_Harmon: You know. There’s all kinds of ways, you know. Just think of the people, you know.
277 00:30:40.200 ⇒ 00:30:47.370 Scott_Harmon: and if there’s cultural values that ABC has, which it certainly does, you can tell the bot to answer in a voice
278 00:30:47.510 ⇒ 00:30:50.979 Scott_Harmon: and and give it a personality like it could be.
279 00:30:51.130 ⇒ 00:30:59.320 Scott_Harmon: you know, very abrupt, or to the point, or you know, very fact, based or very friendly or very helpful like you could give it different
280 00:30:59.710 ⇒ 00:31:13.649 Scott_Harmon: tones of voice. We’ll do that later. But that kind of stuff really does matter in terms of whether people just like it. And at the end of the day you want the Csrs to like this thing personally like that’s really. I really like talking to it, and
281 00:31:13.990 ⇒ 00:31:19.700 Scott_Harmon: so we will. There’s a lot of ability to inform its it’s voice.
282 00:31:19.700 ⇒ 00:31:20.390 Uttam Kumaran: Yes.
283 00:31:20.390 ⇒ 00:31:28.919 Scott_Harmon: You know, using ABC’s you know, just cultural values. And that’s actually pretty easy. You could just feed if you’ve got cultural values, docs or
284 00:31:29.450 ⇒ 00:31:32.410 Scott_Harmon: a bunch of conversations that represent.
285 00:31:33.100 ⇒ 00:31:37.800 Scott_Harmon: You know how ABC kind of behaves. You could just feed it to it, and it’ll start talking
286 00:31:38.320 ⇒ 00:31:39.570 Scott_Harmon: talking like that. So.
287 00:31:39.570 ⇒ 00:31:50.790 Steven: I think we’ve we’ve talked about this before, but we’ll be able to get data from as well like we’ll be able to say, hey? What’s the most frequently asked questions? What are the most frequently asked? Like, what topics? It’s already answering the most, and that kind of stuff.
288 00:31:50.790 ⇒ 00:31:54.359 Uttam Kumaran: Yeah, do you mean for the Csrs or for like this crew?
289 00:31:57.160 ⇒ 00:32:04.360 Uttam Kumaran: Yeah, definitely. So that data that I showed, I mean, we could totally have that as like something behind an agent where you can ask those.
290 00:32:04.360 ⇒ 00:32:09.009 Scott_Harmon: Yeah, there’s gonna be a console that you’ll you’ll be able to see. That will give you all kinds of information about.
291 00:32:09.010 ⇒ 00:32:09.520 Uttam Kumaran: Yeah.
292 00:32:09.520 ⇒ 00:32:11.670 Scott_Harmon: What it’s answering, and who, and
293 00:32:12.040 ⇒ 00:32:15.910 Scott_Harmon: you know, in addition to the charts you showed, you’d be able to ask it questions about
294 00:32:16.120 ⇒ 00:32:18.839 Scott_Harmon: most commonly asked questions or stuff like that.
295 00:32:18.840 ⇒ 00:32:19.980 Uttam Kumaran: Yeah, that’s right.
296 00:32:23.610 ⇒ 00:32:44.220 Uttam Kumaran: Okay, I know. I grab more time. But I feel like we got through a lot of stuff today. I think the only thing that we didn’t get to show. And you know, I think I want to see how far we can get on this next week is updating documents. I think we still, I think probably last week we got a lot more information from you that on the core things, I think we have all the data
297 00:32:44.370 ⇒ 00:32:59.020 Uttam Kumaran: also the last piece. Sorry. One more thing, Scott. The last piece is, once we make sure all the data is in there, we clean that up. Then we’ll go talk. We’ll go. Think about how we can pull stuff from evolve and dream right at the moment. All of this is static.
298 00:32:59.140 ⇒ 00:33:03.480 Uttam Kumaran: I sort we’re sort of kicking the can on that, because I know that’s gonna be a little bit of a process.
299 00:33:03.660 ⇒ 00:33:06.869 Uttam Kumaran: But I want to start to make sure that data in there.
300 00:33:07.000 ⇒ 00:33:18.240 Uttam Kumaran: that pieces that we know are from evolving dream get flagged, and then we can make sure that those are pulled from those systems instead of being maintained here, which is the this, the one of the core issues.
301 00:33:18.560 ⇒ 00:33:19.060 Scott_Harmon: So.
302 00:33:19.060 ⇒ 00:33:19.520 YvetteRuiz: Oh, yeah.
303 00:33:19.520 ⇒ 00:33:25.030 Scott_Harmon: Last week last week we discussed only very briefly the
304 00:33:25.240 ⇒ 00:33:38.670 Scott_Harmon: Powerpoint presentations, the new hire training decks that are in there that include important information, and I don’t think we’ve been able to kind of ingest those. What’s the current state of add Tom or Miguel.
305 00:33:40.862 ⇒ 00:33:45.699 Uttam Kumaran: The so those are all source. I think, Miguel, are we bringing in
306 00:33:45.870 ⇒ 00:33:49.329 Uttam Kumaran: those Powerpoint as images? Or what’s the process now.
307 00:33:51.230 ⇒ 00:33:55.949 Miguel de Veyra: Basically, what I did is I converted all because most of the Powerpoints are actually just text.
308 00:33:56.880 ⇒ 00:34:08.250 Miguel de Veyra: Alright. So I converted them into some sort of documentation, and then the bot should know them, though I think Janice mentioned this Tuesday, that some of it are outdated. So we do need to update some of it.
309 00:34:08.580 ⇒ 00:34:13.590 JanieceGarcia: That. Or they’re also the Powerpoint is basically taking.
310 00:34:13.710 ⇒ 00:34:42.099 JanieceGarcia: Like, I’ll use the programs and codes that’s taking our menu of services and putting it into a Powerpoint. That’s easier to train and then having the menu of services there as a backup for the Csrs. So that’s where the redundancy also comes into play is because the Powerpoint, we’re gonna have the sops, or we’re gonna have, you know, the step by step, any of those documents actually written out. So that’s where cleaning up the documents is gonna.
311 00:34:42.100 ⇒ 00:34:42.500 Scott_Harmon: Oh!
312 00:34:42.500 ⇒ 00:34:44.310 JanieceGarcia: Huge for us, as well.
313 00:34:44.540 ⇒ 00:34:54.520 Scott_Harmon: So let me just play that back and see if I got it. Number one. We are ingesting the Powerpoints. I think I heard into that Bible document. Number 2,
314 00:34:54.679 ⇒ 00:34:58.190 Scott_Harmon: we need to do a step. And you refer to this to the Tom of just kind of
315 00:34:58.430 ⇒ 00:35:15.810 Scott_Harmon: getting rid of or or identifying contradictions, or things that are dated so that Janice can go. Wait a minute. Here’s the right, you know. Kind of most current. Okay? Got it. That’s the perfect answer. Got it. I just wanted to make sure there wasn’t some important information for Csrs in those
316 00:35:16.040 ⇒ 00:35:20.239 Scott_Harmon: that they’d been trained on that only existed in the
317 00:35:20.430 ⇒ 00:35:22.619 Scott_Harmon: Powerpoints that we were gonna Miss.
318 00:35:22.620 ⇒ 00:35:23.260 YvetteRuiz: Makes sense.
319 00:35:23.511 ⇒ 00:35:28.280 Scott_Harmon: But it sounds like that’s that we won’t that that’s not the case that we’re gonna pick it up.
320 00:35:28.280 ⇒ 00:35:28.680 Uttam Kumaran: Yeah.
321 00:35:28.680 ⇒ 00:35:33.592 JanieceGarcia: Right. And for for the most part, it is, it is definitely gonna be doubled
322 00:35:35.010 ⇒ 00:35:35.470 Scott_Harmon: Doubled.
323 00:35:36.150 ⇒ 00:35:38.359 YvetteRuiz: Yeah, like, they’re the same thing. The Powerpoints.
324 00:35:39.410 ⇒ 00:35:44.349 Uttam Kumaran: Like Powerpoint, like, I think, Powerpoint, yeah, Powerpoint is effective as like the medium,
325 00:35:44.930 ⇒ 00:36:11.499 Uttam Kumaran: so that’s fine, even if you guys were to create new Powerpoints which ideally, hopefully, the bot can even help with sourcing materials. For that, you could just drop those into that folder. And then anything that’s not like any difference. It will continue to update our docs. Again, I think that’ll be part of this update process where, if there’s 1 work stream, let’s, hey, we want to create new training decks. How do we make sure that if that’s net new information, how does that make it back to the.
326 00:36:11.690 ⇒ 00:36:14.530 Scott_Harmon: Yeah, this I I don’t mean to go down a rat hole this
327 00:36:14.930 ⇒ 00:36:22.079 Scott_Harmon: Uton. This may be better offline topic with you and Miguel. The kind of current state of the art of kind of parsing.
328 00:36:22.270 ⇒ 00:36:28.540 Scott_Harmon: whether it’s a Powerpoint or a Pdf. Or whatever is. It’s pretty easy to get the textual information
329 00:36:28.880 ⇒ 00:36:37.120 Scott_Harmon: out of it. There is important information in the figures and the graphics that some tools are better at extracting than others.
330 00:36:37.410 ⇒ 00:36:45.450 Scott_Harmon: I need to look at the at the Powerpoints to kind of, you know, like sometimes people have, you know, like a circle with 3, 60 and a bunch of like.
331 00:36:45.860 ⇒ 00:36:49.330 Scott_Harmon: And and the best AI tools actually understand
332 00:36:49.680 ⇒ 00:36:53.040 Scott_Harmon: that graphic concept and translate it into words.
333 00:36:53.320 ⇒ 00:36:54.350 Scott_Harmon: And
334 00:36:54.660 ⇒ 00:37:00.860 Scott_Harmon: and that that’s kind of what the state of the art Pdf readers are doing. I know we’re using Gemini, and I’ve.
335 00:37:00.860 ⇒ 00:37:01.450 Uttam Kumaran: That.
336 00:37:01.450 ⇒ 00:37:06.629 Scott_Harmon: I read recently, Geminis actually does the best at that, like even reading a graphical.
337 00:37:07.200 ⇒ 00:37:09.910 Scott_Harmon: you know diagram that that you’ve created.
338 00:37:10.180 ⇒ 00:37:14.959 Scott_Harmon: And then coming up with a you know an actual text description of
339 00:37:15.450 ⇒ 00:37:22.780 Scott_Harmon: of of the of the meaning in that figure. So that’s a little bit down a rat hole, but I think at the fullest of time we’ll be able to
340 00:37:23.290 ⇒ 00:37:32.579 Scott_Harmon: take even figures you’ve created. You know, in a training deck and get those into this document as well in work, so that the bot can answer a question around them.
341 00:37:38.910 ⇒ 00:37:39.890 Scott_Harmon: Retam.
342 00:37:40.238 ⇒ 00:37:44.070 Uttam Kumaran: Don’t know my my- my wi-fi just froze for a sec.
343 00:37:44.370 ⇒ 00:37:45.030 Scott_Harmon: There you go!
344 00:37:45.390 ⇒ 00:37:47.749 YvetteRuiz: Utam off topic. What’s your dog’s name?
345 00:37:47.780 ⇒ 00:37:49.259 Uttam Kumaran: His name is Finn.
346 00:37:49.770 ⇒ 00:37:56.013 YvetteRuiz: 10. Alright, he’s he always looks so. He’s so cozy back there. Relaxed there.
347 00:37:56.460 ⇒ 00:38:00.282 YvetteRuiz: Yeah, except earlier. He like almost broke the fence down like.
348 00:38:01.190 ⇒ 00:38:05.130 Uttam Kumaran: I got another dog. So yeah, he tired himself out, and then
349 00:38:05.410 ⇒ 00:38:08.838 Uttam Kumaran: then he just becomes the back permanent background for the day.
350 00:38:13.920 ⇒ 00:38:17.460 JanieceGarcia: Okay, let’s kind of looks like mighty vets, dogs. They’re cute.
351 00:38:17.460 ⇒ 00:38:19.673 YvetteRuiz: He does, yeah,
352 00:38:21.150 ⇒ 00:38:26.669 Scott_Harmon: Well, it sounds like sounds like pretty good progress, at least, for where I sit like you know, and
353 00:38:28.370 ⇒ 00:38:37.420 Scott_Harmon: it. It sounds like, just from where I sit, because I’m not in the details like Miguel. Or if I’m in that, you guys are doing a great job of working together and Janice, and that you’re being super helpful.
354 00:38:37.420 ⇒ 00:38:38.270 Uttam Kumaran: Thank you. Yeah.
355 00:38:38.270 ⇒ 00:38:40.740 Scott_Harmon: And responsive. We can kind of.
356 00:38:41.500 ⇒ 00:38:44.380 Scott_Harmon: you know. Go as fast as you want to go, like
357 00:38:44.630 ⇒ 00:38:48.850 Scott_Harmon: your ability to help keep pushing and putting stuff in those spreadsheets and stuff’s critical.
358 00:38:50.490 ⇒ 00:38:54.680 Scott_Harmon: So I think it seems like we have a pretty good, pretty good rhythm going on here.
359 00:38:55.440 ⇒ 00:39:03.787 YvetteRuiz: Yeah, I I agree. Miguel’s been super helpful. And going through these things, you’ve been awesome, just answering questions and being patient.
360 00:39:04.150 ⇒ 00:39:13.240 Scott_Harmon: Target day. Do, Tom, on the schedule. I don’t know how detailed your schedule is where we could like, hey? We want the 1st version that could be tested by see it like a milestone.
361 00:39:13.880 ⇒ 00:39:19.140 Scott_Harmon: that we could just kind of put in front of ourselves like, Hey, by the this date. We want
362 00:39:19.510 ⇒ 00:39:22.010 Scott_Harmon: Csrs to be able to start testing it.
363 00:39:22.010 ⇒ 00:39:39.010 Uttam Kumaran: Yeah, I mean, I was. I’m sort of hoping. I mean, our our target date is I think in 2 weeks or 2 weeks from now. And so basically, that week, we were hoping to at least have it available next week. My goal is to have this available in the ABC environment to access.
364 00:39:39.110 ⇒ 00:40:01.309 Uttam Kumaran: And then I think one of the conversations that we wanted to have is maybe there’s 1 or 2 folks on the team that we would like to trial this with, I think, when when this crew feels comfortable, just to sort of see how it works in the wild, maybe we can even shadow that person, or I can come and sit with them and and do some training. But that was our our initial goal was was roughly like 2 weeks
365 00:40:01.730 ⇒ 00:40:02.350 Uttam Kumaran: from now.
366 00:40:02.350 ⇒ 00:40:06.029 YvetteRuiz: Utah. You’re still waiting on Tim. Right cause. That’s he has to connect the chat.
367 00:40:06.030 ⇒ 00:40:11.193 Uttam Kumaran: Yeah, I haven’t. I haven’t hit him up until we sort of figured out that was all possible.
368 00:40:12.010 ⇒ 00:40:12.610 YvetteRuiz: Okay.
369 00:40:12.610 ⇒ 00:40:18.072 Uttam Kumaran: So ideally next week. I think I want to have this in your environment and the ability to actually
370 00:40:18.860 ⇒ 00:40:20.300 Uttam Kumaran: you know, test that.
371 00:40:24.240 ⇒ 00:40:24.699 Scott_Harmon: It was a great
372 00:40:24.700 ⇒ 00:40:29.540 Scott_Harmon: deal. But are you gonna deploy on Gemini, do you think, or do you have a I don’t think it really matters at all.
373 00:40:29.540 ⇒ 00:40:32.120 Uttam Kumaran: I think we probably will.
374 00:40:32.510 ⇒ 00:40:39.839 Uttam Kumaran: because it’s well, it’s working out really well. And I think for Tim’s sake it’ll be in their environment, too, because the Google product
375 00:40:40.480 ⇒ 00:40:43.410 Uttam Kumaran: and it’s working like Super well compared to everything else. So.
376 00:40:43.410 ⇒ 00:40:49.600 Scott_Harmon: Yeah. So I agree, you guys care because there’s 4 of the 5 of these different competing platforms that one of the exiton does is
377 00:40:50.180 ⇒ 00:40:58.750 Scott_Harmon: which one’s best for this, and it sounds like maybe Gemini, Google’s Gemini might be the one at least for now, and we could switch down the road. It’s not like you’re tied to it forever, but.
378 00:40:58.750 ⇒ 00:40:59.130 Uttam Kumaran: Yeah.
379 00:41:02.440 ⇒ 00:41:02.990 YvetteRuiz: Denise, wrote.
380 00:41:02.990 ⇒ 00:41:03.389 Scott_Harmon: So you can.
381 00:41:03.710 ⇒ 00:41:10.449 YvetteRuiz: I think that because we do have a a group of new hires that are going to get ready to start.
382 00:41:10.810 ⇒ 00:41:19.899 YvetteRuiz: You know, kind of the timing you probably I mean, once you go through that 1st round, it would, I mean, item can join in, and I mean it would be.
383 00:41:19.900 ⇒ 00:41:21.189 Uttam Kumaran: Did you do a lot of testing.
384 00:41:22.140 ⇒ 00:41:22.990 YvetteRuiz: Do you.
385 00:41:22.990 ⇒ 00:41:23.380 JanieceGarcia: I know.
386 00:41:23.380 ⇒ 00:41:27.129 Scott_Harmon: Do you record? Do you video your new hard training? Janice.
387 00:41:27.700 ⇒ 00:41:49.360 JanieceGarcia: I do not. It’s usually in person and and sometimes virtually but actually, with this coming out and with Yvette saying that my thought was me going back and doing the actual pest training, at least for the that 1st main week, because that’s so much information, and seeing if Udem can come and sit, and I’ll.
388 00:41:49.360 ⇒ 00:41:49.780 Uttam Kumaran: Yeah, but.
389 00:41:49.780 ⇒ 00:41:52.115 JanieceGarcia: Austin that entire week,
390 00:41:53.240 ⇒ 00:42:07.770 JanieceGarcia: and and doing it that way and going through the actual training that they have, and following the step by step, and just focusing on on the services so he can understand that. And then it may actually be helpful for this whole portion.
391 00:42:08.590 ⇒ 00:42:11.870 YvetteRuiz: Yeah, cause. Then they’ll be able to go in there and ask some of the and add, follow questions.
392 00:42:11.870 ⇒ 00:42:12.270 JanieceGarcia: Yes.
393 00:42:12.930 ⇒ 00:42:15.299 Scott_Harmon: Yeah, when is that? What is that scheduled for.
394 00:42:15.890 ⇒ 00:42:16.400 JanieceGarcia: Oh, we.
395 00:42:16.400 ⇒ 00:42:32.389 YvetteRuiz: We haven’t. We haven’t officially pinned that down. We just we had a job fair. Monday. We finalized the second interviews. And we’re hoping to finalize everything by Monday. So then, that would determine when that class is gonna start. But it’ll be a class about of at least 5 people.
396 00:42:32.780 ⇒ 00:42:38.759 JanieceGarcia: And it would probably be within within the next 3 weeks, which would be right on Adam’s timeline.
397 00:42:39.110 ⇒ 00:42:45.799 Scott_Harmon: Yeah, it’s kind of a neat idea to follow a co cohort of new employees kind of through their early journey at ABC, like.
398 00:42:45.930 ⇒ 00:42:52.360 Scott_Harmon: like, you know, almost as like a control group, right like, Hey, here’s 5.
399 00:42:52.360 ⇒ 00:42:53.190 JanieceGarcia: Right, exactly.
400 00:42:53.190 ⇒ 00:42:53.860 Scott_Harmon: Maybe that you know.
401 00:42:54.536 ⇒ 00:42:55.213 YvetteRuiz: Yup!
402 00:42:55.890 ⇒ 00:43:04.770 Scott_Harmon: They meet the anteater on day one right like eventually, if this works also like when it works, you’ll want them to meet the anteater
403 00:43:05.240 ⇒ 00:43:06.190 Scott_Harmon: right away.
404 00:43:06.190 ⇒ 00:43:06.980 YvetteRuiz: Right away. Yeah.
405 00:43:06.980 ⇒ 00:43:08.960 Scott_Harmon: And it’ll just kind of be there.
406 00:43:09.500 ⇒ 00:43:11.950 Scott_Harmon: Their friend, their ABC. Friend, the whole time. Okay.
407 00:43:11.950 ⇒ 00:43:13.989 YvetteRuiz: To meet Andy, the anteater.
408 00:43:13.990 ⇒ 00:43:14.630 Scott_Harmon: Yes.
409 00:43:19.330 ⇒ 00:43:20.469 Scott_Harmon: Oh, okay.
410 00:43:20.470 ⇒ 00:43:24.069 JanieceGarcia: You can change Andy with an I do, Andy with an I instead of a y.
411 00:43:24.870 ⇒ 00:43:25.849 YvetteRuiz: The Little.
412 00:43:25.850 ⇒ 00:43:27.929 Uttam Kumaran: It’s a little bit more. AI, yeah.
413 00:43:29.440 ⇒ 00:43:32.530 Scott_Harmon: Yeah, that’s true. That’s true. Okay, all right.
414 00:43:33.060 ⇒ 00:43:34.979 Scott_Harmon: Well, it’s your. It’s definitely your bot. I.
415 00:43:35.379 ⇒ 00:43:44.169 Uttam Kumaran: We’ll let you have a vote when we put the contest out there, Scott, we’ll we’ll send you the link to vote.
416 00:43:44.170 ⇒ 00:43:48.479 Scott_Harmon: I’m imagining, like the anteater being like animated like.
417 00:43:48.480 ⇒ 00:43:48.930 JanieceGarcia: Doggy.
418 00:43:48.930 ⇒ 00:43:53.720 Scott_Harmon: Depending on the kind of question the animators face changes and morphs based on like.
419 00:43:54.390 ⇒ 00:43:57.790 Scott_Harmon: oh, that’s a hard question, or Oh, that’s a great like, you know ultimately.
420 00:43:57.790 ⇒ 00:43:58.310 YvetteRuiz: Thinking.
421 00:43:58.310 ⇒ 00:44:00.910 Scott_Harmon: Can’t eater right? Yeah. Thinking or
422 00:44:01.700 ⇒ 00:44:09.530 Scott_Harmon: boy, I gotta escalate that. And you know, like, you know, which I know, I’m kind of moving around. But, as you, as you know, that’s gonna be possible. It almost is.
423 00:44:10.050 ⇒ 00:44:11.050 Scott_Harmon: You know, where you.
424 00:44:11.050 ⇒ 00:44:16.810 Steven: I would love that if we had Andy the anteater. Yeah, him talking and his expression, I mean.
425 00:44:16.810 ⇒ 00:44:20.629 JanieceGarcia: The little cloud, the thinking cloud that comes up with the answer.
426 00:44:20.890 ⇒ 00:44:24.969 YvetteRuiz: That is part of our core value funness. So that’s a little fun in there.
427 00:44:24.970 ⇒ 00:44:31.569 Scott_Harmon: Well, now, that’s important. If fun is the core value, that’s what I meant. That’s exactly the kind of thing that helps with adoption is like
428 00:44:31.890 ⇒ 00:44:35.740 Scott_Harmon: is, we want to have a little fun while we’re doing this work. And so.
429 00:44:36.180 ⇒ 00:44:43.470 Scott_Harmon: you know, if it’s not too much work, you can put in 8 or 10 different. Andy’s based on the kind of question. And
430 00:44:44.116 ⇒ 00:44:49.010 Scott_Harmon: you know, and maybe maybe work that in there at some point. That’s that’s a.
431 00:44:49.010 ⇒ 00:44:49.330 Steven: It’s.
432 00:44:49.330 ⇒ 00:44:50.010 Scott_Harmon: Question.
433 00:44:50.180 ⇒ 00:44:52.480 Steven: Make Andy have a little sarcasm. Maybe.
434 00:44:52.480 ⇒ 00:44:52.885 Uttam Kumaran: Yes.
435 00:44:53.290 ⇒ 00:44:57.825 YvetteRuiz: That’s that’ll be Steven Steven the bot. No.
436 00:44:59.930 ⇒ 00:45:02.220 JanieceGarcia: Steven Steven’s expressions. There.
437 00:45:02.220 ⇒ 00:45:02.739 YvetteRuiz: That’d be good.
438 00:45:03.300 ⇒ 00:45:05.107 YvetteRuiz: There we go.
439 00:45:05.710 ⇒ 00:45:08.019 Scott_Harmon: Alright! Very exciting stuff!
440 00:45:08.220 ⇒ 00:45:08.770 Uttam Kumaran: Yeah, thanks.
441 00:45:08.770 ⇒ 00:45:13.439 Uttam Kumaran: Progress everyone. Yeah, definitely. I’ll send some emails over and schedule some stuff for next week.
442 00:45:13.820 ⇒ 00:45:20.870 Scott_Harmon: I just Utah. Can I get you and Miguel? Can we just stay on it for a couple more minutes and just a couple of follow ups just while we’ve got you.
443 00:45:20.870 ⇒ 00:45:21.760 Uttam Kumaran: Yeah, definitely.
444 00:45:22.010 ⇒ 00:45:23.200 YvetteRuiz: Alrighty bye, guys!
445 00:45:23.200 ⇒ 00:45:24.070 YvetteRuiz: Bye, guys.
446 00:45:24.070 ⇒ 00:45:24.540 JanieceGarcia: No.
447 00:45:26.180 ⇒ 00:45:26.620 Miguel de Veyra: Thanks.
448 00:45:26.620 ⇒ 00:45:27.210 Uttam Kumaran: You.
449 00:45:31.422 ⇒ 00:45:36.400 Scott_Harmon: Yeah, I I didn’t have anything specific. I just thought, you know. Seems like pretty good progress. Are you?
450 00:45:37.000 ⇒ 00:45:39.680 Scott_Harmon: Are you pleased? And what you know. What do you.
451 00:45:39.680 ⇒ 00:45:46.360 Uttam Kumaran: Yeah, I think I was also surprised that the Csv’s worked in Gemini. We just shoved it in there and it worked.
452 00:45:46.590 ⇒ 00:45:48.390 Uttam Kumaran: I don’t know. We structured it.
453 00:45:49.130 ⇒ 00:45:51.869 Scott_Harmon: How’d you restructure? Did you just do it manually, Miguel?
454 00:45:52.573 ⇒ 00:45:57.300 Miguel de Veyra: Basically, I made them all I wrote a script made them all Json.
455 00:45:59.029 ⇒ 00:45:59.979 Scott_Harmon: Oh!
456 00:45:59.980 ⇒ 00:46:02.110 Uttam Kumaran: Those are relationships. Okay.
457 00:46:02.110 ⇒ 00:46:03.199 Scott_Harmon: Yeah, there. So you do have.
458 00:46:03.200 ⇒ 00:46:05.490 Miguel de Veyra: Set some sort of risk relationship. Yeah.
459 00:46:05.490 ⇒ 00:46:10.430 Scott_Harmon: So you made them all, Json, and then you crammed them into a Google sheet. Or what did you.
460 00:46:10.730 ⇒ 00:46:12.509 Miguel de Veyra: No, no. I put them as
461 00:46:12.740 ⇒ 00:46:16.559 Miguel de Veyra: raw tech raw. Json, basically on the context.
462 00:46:16.800 ⇒ 00:46:17.860 Scott_Harmon: Oh, that’s it!
463 00:46:17.860 ⇒ 00:46:19.619 Miguel de Veyra: There’s some technique found it. But yeah.
464 00:46:19.620 ⇒ 00:46:22.229 Scott_Harmon: Okay. So you do have a little bit of relational.
465 00:46:24.880 ⇒ 00:46:26.929 Scott_Harmon: Because you’re right. You could use the Json
466 00:46:26.930 ⇒ 00:46:28.690 Scott_Harmon: just some of the promotional stuff right?
467 00:46:28.690 ⇒ 00:46:29.310 Miguel de Veyra: Yeah, yeah.
468 00:46:29.310 ⇒ 00:46:32.740 Scott_Harmon: Oh, oh, that’s interesting. Okay, well, look, this is.
469 00:46:32.740 ⇒ 00:46:33.690 Uttam Kumaran: Similar, yeah.
470 00:46:35.560 ⇒ 00:46:40.889 Scott_Harmon: Yeah, I mean again, this is just such your wheel. Who? Absolutely, Tom, I’ll totally defer like you do database in your sleep.
471 00:46:40.890 ⇒ 00:46:46.080 Uttam Kumaran: No, I I was. I was surprised. I didn’t think it was gonna work, I said, just try it.
472 00:46:46.240 ⇒ 00:46:52.550 Uttam Kumaran: And but that’s why I wanna, I’m not convinced until we get the questions and the Evals running.
473 00:46:52.550 ⇒ 00:46:55.840 Scott_Harmon: So the only problem I have. I say, problem
474 00:46:56.340 ⇒ 00:47:01.019 Scott_Harmon: I I don’t think it’s a problem for the 1st phase. But let’s say we.
475 00:47:01.460 ⇒ 00:47:11.949 Scott_Harmon: we? You process the the sheets, and then you created Json’s, and that’s the source material for the agent. How do you present that visually in the master document, so that
476 00:47:12.130 ⇒ 00:47:17.849 Scott_Harmon: if people just want to look at it, because right now those spreadsheets look real ugly to me like they have to go in. And
477 00:47:18.060 ⇒ 00:47:25.070 Scott_Harmon: but how? How do? Or I guess they just continue to look at those sheets. It’s like when they wanted to go. Look at the source.
478 00:47:25.730 ⇒ 00:47:27.030 Scott_Harmon: What do they look at?
479 00:47:27.030 ⇒ 00:47:32.970 Uttam Kumaran: Yeah, I will. I think we’re gonna have them keep the sheets, and we will just process that
480 00:47:33.600 ⇒ 00:47:37.110 Uttam Kumaran: on a batch. Yeah. Cause I do think that she.
481 00:47:37.730 ⇒ 00:47:57.849 Uttam Kumaran: unless they have like a erp or something for this I don’t know, so they think they may continue to maintain the sheet. And it. It’s fine. My main concern was we could have brought that into the database and left it as a sheet. I think it’s fine, if similar to the Powerpoints like. If that’s the medium, then we will post, process it and put it in.
482 00:47:57.850 ⇒ 00:48:01.149 Scott_Harmon: Okay, I mean, the only reason I’m picking on is
483 00:48:01.330 ⇒ 00:48:07.349 Scott_Harmon: it? If it also offends my aesthetic sense, I think it’s probably, and I hate ugly spreadsheets. But that’s not a problem right?
484 00:48:07.350 ⇒ 00:48:12.849 Uttam Kumaran: Well, I actually think we do have opportunity to help them clean like I can help them make it a little bit like
485 00:48:13.130 ⇒ 00:48:13.780 Uttam Kumaran: better.
486 00:48:13.780 ⇒ 00:48:26.330 Scott_Harmon: Well, well, whenever you do, the other agent, which is the knowledge creation agent, you’re gonna have to solve the ability to update the sheet, and whenever you do that you could probably just tackle the format, clean it up, you know. Make it.
487 00:48:26.330 ⇒ 00:48:26.990 Uttam Kumaran: Yeah.
488 00:48:26.990 ⇒ 00:48:37.290 Scott_Harmon: Better, you know it’s just a spreadsheet. But I really like I mean, if it works what you’ve done, Miguel, if it, you know, if we can answer, ask it any question.
489 00:48:37.630 ⇒ 00:48:41.880 Scott_Harmon: you know. Look up kind of question, and they can answer it. Then that’s fantastic. I mean, that’s.
490 00:48:41.880 ⇒ 00:48:49.479 Uttam Kumaran: Yeah, and we we didn’t. So we haven’t done images for the Powerpoints. But yeah, I was basically like we should use Gemini multimodal and just
491 00:48:49.660 ⇒ 00:48:51.619 Uttam Kumaran: show images in.
492 00:48:51.620 ⇒ 00:48:55.800 Scott_Harmon: I haven’t looked at their Powerpoints. I mean it probably should. If there’s some.
493 00:48:55.800 ⇒ 00:49:02.539 Miguel de Veyra: Yeah, I reviewed their powerpoints. Actually, it’s it’s all text
494 00:49:02.680 ⇒ 00:49:08.569 Miguel de Veyra: like, it’s some pictures, the the only pictures there are like pictures of mosquitoes and the rodents.
495 00:49:08.570 ⇒ 00:49:13.700 Scott_Harmon: Yeah, that’s kind of what I guessed. I mean, I didn’t know if there was a lot of figures and graphics it didn’t.
496 00:49:13.700 ⇒ 00:49:14.230 Miguel de Veyra: So like.
497 00:49:14.230 ⇒ 00:49:16.249 Scott_Harmon: That kind of company. Yeah. Yeah.
498 00:49:17.310 ⇒ 00:49:24.060 Miguel de Veyra: Because most of their Powerpoints are. I think Janice mentioned it earlier. It’s there’s a doc for it.
499 00:49:24.980 ⇒ 00:49:31.000 Miguel de Veyra: So honestly, if if we figure everything out there might not even be need to put all the
500 00:49:31.200 ⇒ 00:49:32.490 Miguel de Veyra: Powerpoints and.
501 00:49:32.970 ⇒ 00:49:40.300 Scott_Harmon: Right. So she’s probably just flipping or cutting and pasting it from a doc onto a Powerpoint slide and then just narrating it when she does the training. So.
502 00:49:40.300 ⇒ 00:49:40.780 Miguel de Veyra: Yeah.
503 00:49:40.970 ⇒ 00:49:47.319 Scott_Harmon: Yeah, yeah, okay, okay. And again, later on, after this project, I think you’re gonna blow them away.
504 00:49:47.630 ⇒ 00:49:51.620 Scott_Harmon: You mentioned it very quickly. The ability to generate a new artifact.
505 00:49:51.620 ⇒ 00:49:52.039 Uttam Kumaran: Like. You’re not.
506 00:49:52.040 ⇒ 00:49:57.720 Scott_Harmon: Training like A, as you guys both know. That’ll almost come for free.
507 00:49:57.720 ⇒ 00:49:58.960 Uttam Kumaran: For free. Yeah, I sort of
508 00:49:59.380 ⇒ 00:50:02.409 Uttam Kumaran: like, Oh, yeah, you’re they’re gonna make new docs.
509 00:50:02.410 ⇒ 00:50:06.019 Scott_Harmon: Somewhere. There’s a mic. Drop moment when you go. Oh, you want a new
510 00:50:06.220 ⇒ 00:50:13.800 Scott_Harmon: 8 page, Powerpoint, that summarizes the following topics like it’ll just generate it for you and create a graphic for this and blah blah
511 00:50:14.010 ⇒ 00:50:18.850 Scott_Harmon: like whenever they realize that they can generate training artifacts
512 00:50:19.750 ⇒ 00:50:21.809 Scott_Harmon: or or other kinds of artifacts
513 00:50:22.440 ⇒ 00:50:25.731 Scott_Harmon: just again for free. They’re going to be blown away.
514 00:50:29.110 ⇒ 00:50:32.070 Uttam Kumaran: It’s yeah. It’s it’s working pretty well. We’re moving pretty fast.
515 00:50:33.214 ⇒ 00:50:37.480 Uttam Kumaran: I sort of. I’m glad we kicked the evolve stuff. I just am like.
516 00:50:37.830 ⇒ 00:50:38.490 Miguel de Veyra: That’s yeah.
517 00:50:38.490 ⇒ 00:50:39.530 Miguel de Veyra: That’s gonna be hard.
518 00:50:39.710 ⇒ 00:50:42.049 Uttam Kumaran: Well, I’m just like it’s always
519 00:50:42.210 ⇒ 00:50:45.239 Uttam Kumaran: that’s that’s always where there’s a delay in these projects.
520 00:50:45.780 ⇒ 00:50:50.860 Scott_Harmon: But it’s from a milestone perspective. That’s fine, right? Because you can get the the app.
521 00:50:50.860 ⇒ 00:50:51.300 Uttam Kumaran: Accuracy.
522 00:50:51.300 ⇒ 00:50:54.730 Scott_Harmon: See the efficacy of the agent figured out.
523 00:50:55.400 ⇒ 00:51:03.899 Scott_Harmon: And then, as a second phase, you can get, okay, let’s wire it up to the to the right source of information like, that’s a slide. That is a little bit more of a programming challenge. So
524 00:51:04.480 ⇒ 00:51:09.960 Scott_Harmon: but once you’ve got the act, you know the the agent answering accurately.
525 00:51:10.970 ⇒ 00:51:17.200 Scott_Harmon: Then that’s the that’s the problem that you could just sort of improve it over time by wiring it up directly to Api. So.
526 00:51:17.450 ⇒ 00:51:23.660 Uttam Kumaran: I think. Well, I think the only thing that’s also lingering in my mind is we’ll have to think about what the proposal is for the
527 00:51:24.030 ⇒ 00:51:25.660 Uttam Kumaran: next phase.
528 00:51:26.621 ⇒ 00:51:32.719 Uttam Kumaran: That’s why I want to try to get someone on their team using it. So we can use that
529 00:51:32.880 ⇒ 00:51:38.660 Uttam Kumaran: to sort of share that stuff is working, and then we can sort of think about what the
530 00:51:40.210 ⇒ 00:51:44.350 Uttam Kumaran: per resolution or per conversation sort of pricing is.
531 00:51:44.580 ⇒ 00:51:52.530 Scott_Harmon: Yeah. And I, we’ve been talking about that all along. Right? The the master metric is, yeah, I I think it’ll be some per resolution kind of a thing.
532 00:51:53.010 ⇒ 00:51:54.290 Scott_Harmon: Get it going?
533 00:51:54.820 ⇒ 00:51:58.920 Scott_Harmon: Ultimately, I think we want to be tied to a metric like
534 00:52:00.342 ⇒ 00:52:06.460 Scott_Harmon: reducing the number of callbacks. Yeah, the number of callbacks right? But those I think that
535 00:52:06.970 ⇒ 00:52:13.200 Scott_Harmon: I think it’ll be pretty straightforward to come up with an initial payment model
536 00:52:13.600 ⇒ 00:52:16.239 Scott_Harmon: based on usage for at least a while.
537 00:52:16.240 ⇒ 00:52:16.640 Uttam Kumaran: Yeah.
538 00:52:18.430 ⇒ 00:52:20.390 Uttam Kumaran: Yeah, I wanna just do something where they’re like.
539 00:52:20.610 ⇒ 00:52:23.079 Uttam Kumaran: okay, let’s go for it and it. And it’s
540 00:52:23.800 ⇒ 00:52:30.400 Uttam Kumaran: they’re okay with it. And then we can sort of see what our we’ll end up understanding a little bit of what our costs are to maintain it.
541 00:52:30.770 ⇒ 00:52:42.660 Uttam Kumaran: And like, Okay, do they want to meet every few weeks on it, like how much spending to keep things updated. And then, of course, like, I feel like they’re immediately gonna want this for every other division, so.
542 00:52:42.940 ⇒ 00:52:49.839 Scott_Harmon: For sure, absolutely for sure. They’re gonna want every. So there’s some obvious just right directly. Follow on work right like.
543 00:52:49.840 ⇒ 00:52:50.740 Uttam Kumaran: Yes.
544 00:52:50.740 ⇒ 00:52:58.989 Scott_Harmon: They’re they’re also there’s just other applications I know they’re gonna come back to you with. So the next one you’re gonna get hit with is
545 00:52:59.650 ⇒ 00:53:03.620 Scott_Harmon: in a good way. Is this training thing I keep talking about?
546 00:53:03.780 ⇒ 00:53:09.550 Scott_Harmon: They have a big investment in management training. So training
547 00:53:09.740 ⇒ 00:53:13.530 Scott_Harmon: their managers how to be better managers. They spend a bunch of money on it.
548 00:53:14.020 ⇒ 00:53:16.019 Scott_Harmon: It’s not very good the way they do it.
549 00:53:16.290 ⇒ 00:53:20.240 Scott_Harmon: And so that’s probably the next thing that they’re gonna want to talk about
550 00:53:22.530 ⇒ 00:53:24.329 Scott_Harmon: And there’s just a bunch of other
551 00:53:24.810 ⇒ 00:53:27.859 Scott_Harmon: applications. I know we’re gonna come up the
552 00:53:28.070 ⇒ 00:53:33.279 Scott_Harmon: the Hr. People have already asked me about a couple, the Luca pilots not going very well.
553 00:53:33.720 ⇒ 00:53:34.080 Uttam Kumaran: Okay.
554 00:53:36.500 ⇒ 00:53:43.139 Scott_Harmon: and what they really want. Luca, is this counselor kind of therap therapy kind of thing, and it turns out
555 00:53:43.320 ⇒ 00:53:46.850 Scott_Harmon: I just don’t think people want to do a whole lot of therapy coaching kind of stuff.
556 00:53:46.850 ⇒ 00:53:47.400 Uttam Kumaran: Okay.
557 00:53:47.670 ⇒ 00:53:54.220 Scott_Harmon: What they want to do, which Luca doesn’t really do is what’s called professional coaching.
558 00:53:55.390 ⇒ 00:53:58.469 Scott_Harmon: right where you’re you’re coaching a manager how to.
559 00:53:58.470 ⇒ 00:53:59.580 Uttam Kumaran: Yeah, yeah.
560 00:53:59.580 ⇒ 00:54:01.440 Scott_Harmon: Yeah. So, for example.
561 00:54:01.700 ⇒ 00:54:08.090 Scott_Harmon: you know how I’ve got this problem, how do I handle this problem employee? Or I’ve got this
562 00:54:08.620 ⇒ 00:54:13.420 Scott_Harmon: problematic customer? What do I do like those are. They’re really into that kind of stuff.
563 00:54:13.420 ⇒ 00:54:13.840 Uttam Kumaran: Okay.
564 00:54:14.318 ⇒ 00:54:17.670 Scott_Harmon: The other things they’ve asked about are
565 00:54:18.353 ⇒ 00:54:21.989 Scott_Harmon: performance review. So there’s all kinds of stuff I know they’re gonna
566 00:54:22.310 ⇒ 00:54:26.120 Scott_Harmon: want you to start doing. There’s just so much add on opportunity. Yeah. Okay.
567 00:54:26.810 ⇒ 00:54:28.849 Scott_Harmon: after we get to see our thing stood up.
568 00:54:28.850 ⇒ 00:54:29.440 Uttam Kumaran: Okay.
569 00:54:30.880 ⇒ 00:54:34.990 Scott_Harmon: Okay. Well, hey, Miguel? Great job. Sounds like great traction. This is a lot of fun. Good for you.
570 00:54:34.990 ⇒ 00:54:36.090 Uttam Kumaran: Yeah, this has been good.
571 00:54:37.360 ⇒ 00:54:45.060 Miguel de Veyra: Thank you. Guys autumn. Sorry. Final question. The golden data sheet. That’s the proper answer. Expected answer. Thing right?
572 00:54:45.060 ⇒ 00:54:49.250 Uttam Kumaran: Yes, yeah, like I, I just think we have to prep. We have to keep
573 00:54:49.480 ⇒ 00:55:08.469 Uttam Kumaran: reminding them. I just want them to sort of work for us on this, on this thing, because we can’t move forward on that unless they spend time. If we spend like 30 min and just have them go through a couple, I think, and we just keep poking them through the week. They’ll go add them. I just really want to have that that done, because that’s blocking a bunch of stuff. So.
574 00:55:08.470 ⇒ 00:55:09.750 Miguel de Veyra: Yeah, okay. But I’ll.
575 00:55:09.750 ⇒ 00:55:14.350 Scott_Harmon: Yeah. And if you’re not, if you’re if you need me to help chide because I couldn’t agree more like
576 00:55:15.120 ⇒ 00:55:19.899 Scott_Harmon: like unless they can, I I think they should be putting in like 50 questions.
577 00:55:19.900 ⇒ 00:55:20.970 Uttam Kumaran: Yes. Yeah.
578 00:55:20.970 ⇒ 00:55:25.090 Scott_Harmon: And it shouldn’t take them long to do that, because they live in this world.
579 00:55:25.340 ⇒ 00:55:31.069 Scott_Harmon: and the harder the better. You know, they should be trying to think about questions that would trip up
580 00:55:32.020 ⇒ 00:55:38.579 Scott_Harmon: rip this thing up. Yeah. And I think that I think that would help you more than anything else, you know. Get this thing dialed in.
581 00:55:38.920 ⇒ 00:55:40.760 Miguel de Veyra: I mean right now, there’s like
582 00:55:40.910 ⇒ 00:55:44.110 Miguel de Veyra: more around 50 more or less questions. There.
583 00:55:45.250 ⇒ 00:55:52.340 Uttam Kumaran: But I I wanna start to rank them by difficulty, and then I want us to be way more weighted on the more difficult.
584 00:55:52.340 ⇒ 00:55:53.479 Miguel de Veyra: Difficult towards.
585 00:55:53.480 ⇒ 00:55:57.559 Uttam Kumaran: It’s gonna take time. They’re not gonna be able to like, sit for 10 seconds and think about these.
586 00:55:57.560 ⇒ 00:55:58.260 Miguel de Veyra: Yeah.
587 00:55:58.260 ⇒ 00:56:03.350 Uttam Kumaran: The other thing I sort of want them. I would rather them do is just hand over all of their.
588 00:56:03.470 ⇒ 00:56:04.910 Miguel de Veyra: Questions, scripts?
589 00:56:05.310 ⇒ 00:56:06.050 Miguel de Veyra: Yeah.
590 00:56:06.050 ⇒ 00:56:06.560 Uttam Kumaran: Like all.
591 00:56:06.560 ⇒ 00:56:07.490 Scott_Harmon: Oh, they’re what oh.
592 00:56:07.490 ⇒ 00:56:12.569 Uttam Kumaran: You should just hand us like a month’s worth of transcripts from the calls, and we can build.
593 00:56:13.530 ⇒ 00:56:14.910 Scott_Harmon: Have you asked him for that yet.
594 00:56:14.910 ⇒ 00:56:20.619 Uttam Kumaran: I I did ask them. I don’t know. It’s taking some time. I’m gonna try to ask again. I asked you yesterday.
595 00:56:21.380 ⇒ 00:56:23.489 Miguel de Veyra: Yeah, okay, I just didn’t know.
596 00:56:24.280 ⇒ 00:56:24.910 Uttam Kumaran: Yeah.
597 00:56:25.480 ⇒ 00:56:27.490 Miguel de Veyra: Reschedule a call with Yvette on Monday.
598 00:56:27.520 ⇒ 00:56:29.250 Uttam Kumaran: I haven’t scheduled it yet. No.
599 00:56:29.960 ⇒ 00:56:32.379 Miguel de Veyra: Oh, but you’re gonna meet with Janice right.
600 00:56:32.550 ⇒ 00:56:35.449 Uttam Kumaran: I’m gonna yeah, whoever I get on the phone I’m gonna meet with. But I
601 00:56:35.970 ⇒ 00:56:37.999 Uttam Kumaran: seems like she’s less busy on like.
602 00:56:38.000 ⇒ 00:56:38.530 Miguel de Veyra: Yeah, yeah.
603 00:56:38.530 ⇒ 00:56:41.140 Uttam Kumaran: Think the event is handling like higher management stuff. So.
604 00:56:41.500 ⇒ 00:56:42.460 Miguel de Veyra: Okay. Okay.
605 00:56:43.570 ⇒ 00:56:48.920 Uttam Kumaran: Yeah, I’ll try to get in touch, maybe even spend like 30 min a day next week.
606 00:56:49.180 ⇒ 00:56:54.359 Uttam Kumaran: Yeah, my time is the worst. So if it’s on me, it’s gonna get.
607 00:56:54.650 ⇒ 00:56:55.449 Miguel de Veyra: Yeah, yeah, yeah.
608 00:56:55.450 ⇒ 00:56:55.840 Uttam Kumaran: So.
609 00:56:55.840 ⇒ 00:56:59.200 Miguel de Veyra: I’ll try to spend 30 min to an hour with yes.
610 00:56:59.848 ⇒ 00:57:02.260 Uttam Kumaran: If they’re open to meeting every day, and they.
611 00:57:02.260 ⇒ 00:57:02.930 Miguel de Veyra: Yeah, yeah.
612 00:57:02.930 ⇒ 00:57:05.110 Uttam Kumaran: Do it. Yeah, go for it. No stopping.
613 00:57:05.930 ⇒ 00:57:10.919 Scott_Harmon: I think Janice would probably be. I think my sense is, she is a little more available, and she’s just super
614 00:57:11.550 ⇒ 00:57:16.420 Scott_Harmon: super supportive of everything that’s going on that is more busy, a little bit harder to PIN down so
615 00:57:16.826 ⇒ 00:57:17.360 Scott_Harmon: I wouldn’t.
616 00:57:17.360 ⇒ 00:57:22.499 Miguel de Veyra: Should I? Should I? Should I email, Jeanette, direct Janice directly, or should I ask.
617 00:57:22.680 ⇒ 00:57:28.719 Uttam Kumaran: Just include everybody include me, Yvette Scott. And then the ABC email.
618 00:57:29.430 ⇒ 00:57:31.789 Uttam Kumaran: yeah. And then just just see if you can.
619 00:57:31.930 ⇒ 00:57:35.239 Uttam Kumaran: Yeah, just try to just try to, don’t, I would say
620 00:57:35.700 ⇒ 00:57:41.989 Uttam Kumaran: at Max, just email everybody. I think they’re totally fine with getting bombarded. I don’t think they really had a concern. We’re not.
621 00:57:41.990 ⇒ 00:57:43.229 Miguel de Veyra: Okay. Okay. Sure. Sure.
622 00:57:44.540 ⇒ 00:57:45.410 Miguel de Veyra: Okay.
623 00:57:46.990 ⇒ 00:57:48.160 Uttam Kumaran: Okay. Awesome.
624 00:57:48.160 ⇒ 00:57:48.850 Miguel de Veyra: There we go!
625 00:57:50.130 ⇒ 00:57:50.780 Scott_Harmon: Yeah, it’s great.
626 00:57:50.780 ⇒ 00:57:51.330 Scott_Harmon: Well, great.
627 00:57:51.330 ⇒ 00:57:52.120 Scott_Harmon: Good job. Good.
628 00:57:52.300 ⇒ 00:57:54.650 Uttam Kumaran: I’ll try to make sure in in our brain force.
629 00:57:55.550 ⇒ 00:58:01.100 Uttam Kumaran: Google. You can see the chat it’s in ours. But I’ll make sure that you can see it there as well, and you can play around.
630 00:58:01.670 ⇒ 00:58:05.660 Scott_Harmon: Okay, okay, guys, thanks. A lot. Good stuff.
631 00:58:05.660 ⇒ 00:58:06.319 Miguel de Veyra: Thanks, Evan.
632 00:58:06.710 ⇒ 00:58:08.520 Uttam Kumaran: Talk soon bye.