Meeting Title: Weekly Project Check Date: 2025-02-07 Meeting participants: Janna Wong, Uttam Kumaran, Steven, Miguel De Veyra, T F, Yvetteruiz, Scott_Harmon
WEBVTT
1 00:01:11.810 ⇒ 00:01:14.239 Scott_Harmon: I can’t hear you. I can’t hear you, but
2 00:01:19.460 ⇒ 00:01:20.580 Scott_Harmon: no.
3 00:01:28.180 ⇒ 00:01:29.649 Scott_Harmon: maybe it’s on my side.
4 00:01:29.650 ⇒ 00:01:30.704 YvetteRuiz: I’m on mute. There you go!
5 00:01:30.880 ⇒ 00:01:32.019 Scott_Harmon: There you are! There you are!
6 00:01:33.113 ⇒ 00:01:34.659 YvetteRuiz: Good morning!
7 00:01:34.660 ⇒ 00:01:35.730 Scott_Harmon: How are you?
8 00:01:35.730 ⇒ 00:01:37.819 YvetteRuiz: I’m well. How are you?
9 00:01:38.413 ⇒ 00:01:40.130 Scott_Harmon: I’m pretty well, thank you.
10 00:01:41.240 ⇒ 00:01:45.649 Scott_Harmon: I sent you over some notes. I don’t know if you had a chance to look at them, but
11 00:01:45.890 ⇒ 00:01:48.330 Scott_Harmon: I was to these, I didn’t.
12 00:01:50.510 ⇒ 00:01:53.040 Scott_Harmon: I didn’t have Denise’s email, so I didn’t send them to you. But.
13 00:01:53.040 ⇒ 00:01:53.760 YvetteRuiz: Oh.
14 00:01:54.466 ⇒ 00:01:58.800 YvetteRuiz: no worries. Yeah. I’ve been trying to read them. It’s just been a little bit stacked this morning with meetings.
15 00:01:58.800 ⇒ 00:01:59.440 YvetteRuiz: No problem.
16 00:01:59.840 ⇒ 00:02:04.470 Scott_Harmon: No problem. I just want just make sure. I got everything from the whiteboard.
17 00:02:05.440 ⇒ 00:02:06.769 YvetteRuiz: Down correct.
18 00:02:08.120 ⇒ 00:02:10.123 YvetteRuiz: So far, so good. Everything I’ve read.
19 00:02:11.280 ⇒ 00:02:16.600 Scott_Harmon: Yeah. The only thing I had in there was a question for you. What we’re waiting for, Tom is
20 00:02:16.970 ⇒ 00:02:21.151 Scott_Harmon: if you could figure out, you know, we’re gonna kind of focus on that
21 00:02:22.330 ⇒ 00:02:29.359 Scott_Harmon: the rate that the Csrs have to kind of call someone back because they couldn’t find the answer. I forgot what we call that, you know that.
22 00:02:30.570 ⇒ 00:02:36.610 Scott_Harmon: you know. Call call back rate, if you want to call it that. If you could figure out what that number is today.
23 00:02:36.900 ⇒ 00:02:37.570 YvetteRuiz: Okay.
24 00:02:37.570 ⇒ 00:02:40.899 Scott_Harmon: What percentage of the calls require that.
25 00:02:41.430 ⇒ 00:02:42.800 YvetteRuiz: Yeah, you know, meeting.
26 00:02:42.800 ⇒ 00:02:47.179 Scott_Harmon: The answer. I couldn’t find you know, any you people I couldn’t find it on chat. I couldn’t find like
27 00:02:47.390 ⇒ 00:02:53.419 Scott_Harmon: that. That’s what we we really want? If you could figure out what it is, is it 5% or 10 or 20 like what percent of calls.
28 00:02:54.120 ⇒ 00:03:01.639 YvetteRuiz: What what I do. I have a meeting with our data team this afternoon, so we’ll get. We’ll work on getting all that information and get that to you.
29 00:03:01.640 ⇒ 00:03:04.859 Scott_Harmon: Yeah, yeah, we really want to see if we can make that go down. So.
30 00:03:04.860 ⇒ 00:03:06.685 YvetteRuiz: Yeah, no, it’s a good measure.
31 00:03:07.845 ⇒ 00:03:10.902 YvetteRuiz: something. You need to be able to measure that so agree
32 00:03:11.180 ⇒ 00:03:14.540 Scott_Harmon: Okay, we come up with a fun name for the metric.
33 00:03:15.680 ⇒ 00:03:19.200 YvetteRuiz: You know. Resolution, one call resolution. Right? Yes.
34 00:03:19.910 ⇒ 00:03:20.470 Uttam Kumaran: Hey, everyone.
35 00:03:21.270 ⇒ 00:03:21.840 YvetteRuiz: Everything.
36 00:03:21.840 ⇒ 00:03:26.249 Scott_Harmon: Football football thing for me. So you call it like the punt rate. You know the rate you have to punt because.
37 00:03:26.837 ⇒ 00:03:28.600 YvetteRuiz: There you go!
38 00:03:28.600 ⇒ 00:03:30.540 Scott_Harmon: I don’t know. I’m just gonna punt call you back later.
39 00:03:30.540 ⇒ 00:03:31.435 YvetteRuiz: Okay.
40 00:03:32.610 ⇒ 00:03:33.789 YvetteRuiz: Hi Utah.
41 00:03:33.810 ⇒ 00:03:36.510 Uttam Kumaran: Hi! How are you? Great to see you again.
42 00:03:37.670 ⇒ 00:03:39.567 YvetteRuiz: Good Hi, Steven!
43 00:03:40.390 ⇒ 00:03:41.250 Steven: So that.
44 00:03:43.560 ⇒ 00:04:07.369 Uttam Kumaran: Awesome. So maybe we can just jump into things. I guess I wanted to spend a few minutes. Just introduce the team on our side. Miguel in particular. Who’s leading sort of our AI development here, and also sort of managing. Our team has been with us. Sort of was the reason why we even went into the AI business. As I mentioned yesterday.
45 00:04:07.990 ⇒ 00:04:11.110 Uttam Kumaran: I started to use AI to.
46 00:04:11.260 ⇒ 00:04:15.720 Uttam Kumaran: you know, automate a lot of stuff in our company. And I, you know.
47 00:04:16.019 ⇒ 00:04:28.810 Uttam Kumaran: met with Miguel and was like, Hey, do you want to come on and help us? And then quickly, we realized that this is a little bit harder to do, and we found, you know, some opportunity to sort of help other companies do it. And so Miguel has been really paramount. So, Miguel, maybe I’ll
48 00:04:28.910 ⇒ 00:04:34.900 Uttam Kumaran: let you introduce yourself. And then, yeah, ideally, today, just to set the stage. We made
49 00:04:35.150 ⇒ 00:05:03.010 Uttam Kumaran: a good amount of progress since yesterday. I know we just talked yesterday afternoon. We do have something to share. But one of the things that I think we’ll be building out, and I can get across the bow like early next week, is like a full project plan. The team and I met this morning. Reviewed notes from yesterday, and we do have a pretty good understanding of what the next few weeks are gonna look like in terms of development. But I guess before jumping that Miguel, maybe if you want to say Hi, and also introduce anyone else on the team.
50 00:05:03.850 ⇒ 00:05:11.669 Miguel de Veyra: Hey, guys, name is Miguel? Technically, I’m the one leading the project. And anything AI related in Brainforge.
51 00:05:12.188 ⇒ 00:05:17.599 Miguel de Veyra: Yeah. So that’s about me. And then one of our team members is Janna.
52 00:05:18.220 ⇒ 00:05:20.810 Miguel de Veyra: So yeah, Jana, could you please introduce yourself.
53 00:05:21.450 ⇒ 00:05:22.490 YvetteRuiz: Hi. Miguel.
54 00:05:22.490 ⇒ 00:05:30.149 Janna Wong: Everyone nice meeting you. I’m Jana and I have some experience with automations as well. So I’m here to like
55 00:05:31.370 ⇒ 00:05:32.714 Janna Wong: Help a sales point.
56 00:05:33.390 ⇒ 00:05:34.670 Janna Wong: Nice meeting everyone.
57 00:05:36.290 ⇒ 00:05:36.869 Steven: Good to meet you.
58 00:05:36.870 ⇒ 00:05:37.380 YvetteRuiz: Nope.
59 00:05:37.790 ⇒ 00:05:39.976 Uttam Kumaran: Yeah. So I don’t know. I think,
60 00:05:40.380 ⇒ 00:06:06.359 Uttam Kumaran: Miguel is that. And the team is definitely gonna undersell themselves. Miguel has done a lot of AI automation work in call center automation, particularly. A lot of the work, Yvette, that I mentioned around voice agents. Basically like automating call centers. Part of the reason why, thought this project would be really, really great is, you know, we’ve done some of this before. Ideally, though, that the challenge that we’re having is gonna be to build the best thing
61 00:06:06.500 ⇒ 00:06:35.622 Uttam Kumaran: within your environment that answers exactly your Csr’s questions. And so one of the things that I went through with the team this morning is really just outlining sort of everything we talked about. And I walked through. Really, I actually just walked through that the road and annual sort of example with everybody. And it’s it’s great, because everybody knows what pests are and like, what pest control is. But I think for for our sake I think it was helpful to walk through that clear example and sort of break down exactly.
62 00:06:35.920 ⇒ 00:06:36.580 Steven: Yeah.
63 00:06:36.580 ⇒ 00:06:45.368 Uttam Kumaran: Like, what are the key components on the on the AI Building side that we need to do? I think one thing I’ll just maybe kick off with is just talking a bit about
64 00:06:46.538 ⇒ 00:06:58.230 Uttam Kumaran: like where we’re where we’re sort of gonna focus our time over this next week or so, and talk about how we’re gonna split up the project, and it would love to just get your sort of take on that. And then I want to spend actually
65 00:06:58.350 ⇒ 00:07:06.280 Uttam Kumaran: as much time today showing what we’ve been able to do since bringing in your documents yesterday, and then hopefully leave you with this artifact
66 00:07:06.410 ⇒ 00:07:18.330 Uttam Kumaran: to sort of play around with. This weekend, or sort of as we’re building, and that’s really the artifact that you’ll see gets better over time. So let me just share let me just share my screen.
67 00:07:18.650 ⇒ 00:07:27.879 Uttam Kumaran: and we tried to hustle to get something together that’s in a deck, but we have a lot of notes written down and so I think hopefully, I’ll be able to just talk through
68 00:07:29.120 ⇒ 00:07:30.510 YvetteRuiz: A lot of this, but.
69 00:07:32.330 ⇒ 00:07:33.270 Uttam Kumaran: So,
70 00:07:34.600 ⇒ 00:07:48.920 Uttam Kumaran: yeah, really, where? What we’re starting to work towards. And I think hopefully, while we’re all talking to this, I can sort of flesh this out a little bit more. Really. The team and I spent the morning talking about what are the clear overall
71 00:07:49.410 ⇒ 00:07:57.829 Uttam Kumaran: parts of the project from the infrastructure position. Right? And so one of the things I went through with everybody is, I went through exactly who
72 00:07:57.840 ⇒ 00:08:17.680 Uttam Kumaran: on the team is our stakeholder right who is benefiting from this? And so we talked about the Csrs themselves. But we also talked about Kenny. We talked about everybody up up the chain everybody on this call and talked about how this agent really helps solve that problem. I think the second thing is also thinking about the form factor. Where does this live?
73 00:08:17.977 ⇒ 00:08:31.660 Uttam Kumaran: And how do we make this agent the place to go instead of knocking on their neighbor next door or so, or Google chatting someone else. Right? The 3rd piece, of course, is giving you guys the insight into the fact that the agent is working
74 00:08:31.961 ⇒ 00:08:42.809 Uttam Kumaran: and so it’s all the logs and the dashboards and the data associated with how the agent is getting used. And so there’s really 2 components to the AI agents. There’s 1 is going to be on retrieval.
75 00:08:43.200 ⇒ 00:08:55.300 Uttam Kumaran: and I’m gonna use some language that’s more common in the AI world. But of course you can think of retrieval as going and getting something and updating right? And so these are the 2
76 00:08:55.740 ⇒ 00:09:19.980 Uttam Kumaran: really big mechanisms that we want to solve across your data set right, and we talked a little bit yesterday about how we had you know, roughly, 50 documents in here, and I’ve been able to hand this all to the team and you know, we walk through some examples today of hey, let’s say we’re a Csr. We get a question about road and annual. It’s not we we want to. We want to get help. With that. You go to this page.
77 00:09:20.130 ⇒ 00:09:36.270 Uttam Kumaran: And I literally just said, What do we do now? It’s it’s it’s tough, right? And so really, clearly articulating like what the pain is. We walked through a couple of these docs, and even then it was a little bit difficult, and then said, Okay, cool. Now we can see how this would get kicked up the chain and escalated. And there’s 2 pieces there one is
78 00:09:36.340 ⇒ 00:09:41.739 Uttam Kumaran: being able to direct that question to an AI agent that gives you an answer. If it exists
79 00:09:41.780 ⇒ 00:09:45.930 Uttam Kumaran: right? Second thing is, let’s say it doesn’t exist. What’s the next step?
80 00:09:45.970 ⇒ 00:10:14.080 Uttam Kumaran: So one of the things we’re we’re also wanting to work on is is updates. And so this is where we get into what is the actual form factor of the data. That allows anyone in in the, you know, sort of value chain above Csrs for now to to propose and make updates without having to really know what the right document is, what the right update format is and then making sure that’s also disseminated and told to everybody. Right? Because just because you make an update to the right
81 00:10:14.683 ⇒ 00:10:17.587 Uttam Kumaran: page here doesn’t mean everybody knows that exists.
82 00:10:18.140 ⇒ 00:10:33.230 Uttam Kumaran: And so those are really the 2 pieces here on the AI agent side. There’s a couple of steps, of course, in between this that that will outline and we’re working on sort of creating the tickets for and executing on. But I’ll sort of stop there. I want to spend
83 00:10:33.410 ⇒ 00:10:35.929 Uttam Kumaran: this is that’s really gonna be
84 00:10:36.780 ⇒ 00:10:43.030 Uttam Kumaran: the meat of the work here. But I guess I’ll stop there. Does that make sense? Of course I know there’s not much text here. But
85 00:10:43.250 ⇒ 00:10:47.500 Uttam Kumaran: can I flesh that out any further, or talk a little bit more specifically.
86 00:10:48.640 ⇒ 00:10:54.249 YvetteRuiz: No, that totally makes sense. I mean, you pretty much covered a lot of the I mean. You were clear on the steps, for sure.
87 00:10:54.250 ⇒ 00:10:54.910 Uttam Kumaran: Okay.
88 00:10:55.230 ⇒ 00:10:57.650 YvetteRuiz: Yeah, I agree, 100%.
89 00:10:58.750 ⇒ 00:11:02.920 Uttam Kumaran: Steven. Any thoughts or or questions.
90 00:11:02.920 ⇒ 00:11:17.939 Steven: Yeah, no. Why, say, obviously, we’re just now starting it. We’ll probably have little better insights into how to go next week. But yeah, I mean for the groundwork, it seems, seems to make sense, and we’re excited to see the progress and get going.
91 00:11:17.940 ⇒ 00:11:20.800 Scott_Harmon: I’m just gonna I’m just gonna weigh in. I
92 00:11:21.100 ⇒ 00:11:35.660 Scott_Harmon: hopefully, my voice will be consistent on all this. I always try and focus on the business value and the results that you want to see at the end, and to try and keep those consistent. Because if you’re, gonna you know, pay a contractor like Brainforge to come, do work. It’s good to
93 00:11:35.850 ⇒ 00:11:43.740 Scott_Harmon: have an idea of what does success look like? And so I have 3. And ultimately we want metrics like firm metrics for that.
94 00:11:44.020 ⇒ 00:11:56.159 Scott_Harmon: So the most important metric we identified yesterday, Steven, I don’t know if you’re on the call, and we did this. Is, we want to reduce that the rate at which Csrs cannot get a question answered in real time.
95 00:11:56.860 ⇒ 00:12:01.470 Scott_Harmon: and therefore have to call a client back, and you know a vet
96 00:12:02.305 ⇒ 00:12:10.249 Scott_Harmon: detailed. Why, that’s so painful. Oftentimes you can’t reach the client again, or you know, it’s just it’s just a really bad client, you know, customer
97 00:12:10.500 ⇒ 00:12:11.450 Scott_Harmon: outcome.
98 00:12:12.710 ⇒ 00:12:23.849 Scott_Harmon: gonna get us what that base rate is like, what it is today. There’s some percentage of calls for the Csrs that they just they can’t find the information they need. So they I call it the punt rate.
99 00:12:23.950 ⇒ 00:12:24.580 Scott_Harmon: you know.
100 00:12:25.900 ⇒ 00:12:37.190 Scott_Harmon: And so we really want to reduce that. And then the second metric we’ll focus on really is just asking the Csrs, just literally, this is the simplest one. How often did they
101 00:12:37.330 ⇒ 00:12:39.649 Scott_Harmon: did they ask the anteater
102 00:12:39.920 ⇒ 00:12:45.869 Scott_Harmon: and the anteater in the logs. I call it the anteater, but we’ll call it whatever you want to call it
103 00:12:46.120 ⇒ 00:12:47.100 Scott_Harmon: will.
104 00:12:47.450 ⇒ 00:12:53.629 Scott_Harmon: We’ll have an idea of how often it was able to successfully answer a question. So there’ll be a confidence interval
105 00:12:54.360 ⇒ 00:13:01.069 Scott_Harmon: that Utam’s team delivers. So let’s say it answered a hundred questions. In the 1st month or 2
106 00:13:01.250 ⇒ 00:13:05.289 Scott_Harmon: it’ll for each question it’ll see the confidence that it was a hundred percent right?
107 00:13:06.102 ⇒ 00:13:15.610 Scott_Harmon: And so that’ll be important to look at. And the 3rd thing I think that’s going to be critical to look at will be really, you know.
108 00:13:16.460 ⇒ 00:13:21.800 Scott_Harmon: event, really, for you and and the other managers in that knowledge loop.
109 00:13:22.340 ⇒ 00:13:25.699 Scott_Harmon: how many new pieces of knowledge were you able to add?
110 00:13:27.620 ⇒ 00:13:36.640 Scott_Harmon: so instead of having to create a document like you, did, you know today, or update that spreadsheet that I that I’m so obsessed with, you know.
111 00:13:36.640 ⇒ 00:13:37.320 Scott_Harmon: would they?
112 00:13:37.320 ⇒ 00:13:45.930 Scott_Harmon: To use the anteater? And did? I was able to do it 2 times or 10 times or 20 times. So we want that to that to be
113 00:13:46.240 ⇒ 00:13:48.819 Scott_Harmon: something that from a knowledge
114 00:13:50.510 ⇒ 00:13:53.720 Scott_Harmon: you know what he’s calling knowledge updating perspective.
115 00:13:53.720 ⇒ 00:13:54.000 YvetteRuiz: Yeah.
116 00:13:54.280 ⇒ 00:13:57.609 Scott_Harmon: That you were, you were able to use it to create
117 00:13:57.830 ⇒ 00:14:01.329 Scott_Harmon: new knowledge. Right? So those are the 3 numbers. I’ll just keep those
118 00:14:01.660 ⇒ 00:14:09.970 Scott_Harmon: those you know in front of our minds. That’s what we’re going to be focused on, because, you know, we want to make sure we have the right business value, for y’all.
119 00:14:10.300 ⇒ 00:14:12.890 Scott_Harmon: Does that sound sound right to y’all?
120 00:14:13.310 ⇒ 00:14:18.589 YvetteRuiz: Yeah, definitely. I mean, those. Those are those are key metrics like you mentioned. Because those are.
121 00:14:18.760 ⇒ 00:14:24.902 YvetteRuiz: Those are very measurable. You’ll be able to get good insight on that, for sure. But yeah, totally makes sense, Scott, to us.
122 00:14:25.580 ⇒ 00:14:26.030 Scott_Harmon: Got it.
123 00:14:26.030 ⇒ 00:14:28.510 Steven: Yep, all good. Yeah, even. I were talking this morning.
124 00:14:30.110 ⇒ 00:14:33.380 Uttam Kumaran: Steven, I think your audio is coming in a little bit muted.
125 00:14:35.860 ⇒ 00:14:45.470 Steven: Yeah kind of service for the customer, but also obviously makes more efficiency for the Csr. That’s not having to call back and spend more time looking for the answer, and they’re able to focus on
126 00:14:45.620 ⇒ 00:14:51.140 Steven: the customer and being sympathetic or listening, or whatever. So yeah, I think so.
127 00:14:52.960 ⇒ 00:14:53.520 Uttam Kumaran: Right.
128 00:14:53.930 ⇒ 00:14:59.119 Uttam Kumaran: So I think ideally what we will start to do, you know, even as we begin to test
129 00:14:59.330 ⇒ 00:15:11.660 Uttam Kumaran: locally, and within this group, the agent, we will start to establish those metrics. And then throughout this month we can begin to see, even within our testing, what those metrics look like. So that’s what we’ll be driving through
130 00:15:11.730 ⇒ 00:15:30.499 Uttam Kumaran: 2 other items. One is the integration with Google Chat. I think that’s definitely if we can accomplish that, I think would be extremely important here. I need to look. Get back to you hopefully next week with some of a bit of a roadmap and timeline. I know Tim just emailed us this morning with some details.
131 00:15:30.863 ⇒ 00:15:37.460 Uttam Kumaran: We have done this in slack before. I don’t expect this to be much of a challenge. But just in case there’s authentication issues
132 00:15:37.935 ⇒ 00:15:48.980 Uttam Kumaran: but ideally, this would be the the way that that folks interact directly in slack the last piece is on evals and data. And maybe I’ll just say, evaluations and data
133 00:15:49.110 ⇒ 00:16:18.060 Uttam Kumaran: evaluations are really the fundamental like, you can call like a golden data set, which is basically like question and and what the expected answer is going to be. And this is something that I think I’ll I will work actually with the the pest team on producing, which is not, not only does like you know. What I mentioned yesterday was, you can chat with it, and it seems right. Every improvement we want to make we actually want to prove. Of course, we want to know as an engineering team that it’s getting better. But we also want to show that
134 00:16:18.160 ⇒ 00:16:19.440 Uttam Kumaran: we’re we’re not
135 00:16:19.580 ⇒ 00:16:48.810 Uttam Kumaran: winning somewhere and and losing somewhere else. And so we’ll build, you know, a golden set of, you know, maybe ideally, as many questions we can across all those common categories which you mentioned a lot around pricing scheduling services. And so we’ll build those so that every time we improve the agent we can make sure that it’s it’s given an example question. It would have answered it in the way we expected. So that’s another thing, that I’ll actually be taking on to build. And I think that would be a good, probably session. We can do
136 00:16:49.208 ⇒ 00:16:54.281 Uttam Kumaran: you know, probably virtually early next week. While the team continues on the agents.
137 00:16:55.110 ⇒ 00:16:58.460 YvetteRuiz: I think I want to spend the remaining time. Maybe, Miguel, if you want to.
138 00:16:58.840 ⇒ 00:17:02.920 Uttam Kumaran: Maybe 1st do a demo, and then you can also shoot the link to everybody.
139 00:17:04.210 ⇒ 00:17:07.400 Uttam Kumaran: So that folks can play around today
140 00:17:07.984 ⇒ 00:17:18.810 Uttam Kumaran: and sort of like ongoing of the of the sort of knowledge base agent that we did. And so this will be primarily just the 1st retrieval part. So, Miguel, if you want to share that, that’d be great.
141 00:17:21.099 ⇒ 00:17:24.599 Miguel de Veyra: Okay? Sure, I’ll send the link over in chat.
142 00:17:29.529 ⇒ 00:17:43.509 Miguel de Veyra: Yeah, yeah, that. That’s that. What is currently trained on like the data we have. Yeah. So that should be able to answer some of the questions. Wait. Let me. Let’s try like an example question.
143 00:17:46.940 ⇒ 00:17:52.575 YvetteRuiz: With the Oh, yeah, you could ask it.
144 00:17:53.840 ⇒ 00:17:55.199 Uttam Kumaran: Do you want to zoom in a little bit.
145 00:17:55.940 ⇒ 00:17:57.069 Miguel de Veyra: Okay. Yeah. Sure.
146 00:17:57.570 ⇒ 00:18:01.379 Scott_Harmon: Yeah, I didn’t get the link in the chat yet, either, Miguel. If you were going to send that.
147 00:18:02.560 ⇒ 00:18:06.589 Uttam Kumaran: I think we’re just gonna go through. And I think we can go through example first, st and then we can send it.
148 00:18:08.240 ⇒ 00:18:21.668 YvetteRuiz: you could ask, is carpenter ants covered? Because that’s another thing is Carpenter, and it’s covered under general pest control.
149 00:18:23.300 ⇒ 00:18:25.079 Miguel de Veyra: Sorry carpenter, ants.
150 00:18:25.581 ⇒ 00:18:30.500 YvetteRuiz: Yeah, is carpenter ants covered under general pest control.
151 00:18:31.870 ⇒ 00:18:33.890 Uttam Kumaran: And Miguel, can you also walk through?
152 00:18:34.290 ⇒ 00:18:40.260 Uttam Kumaran: You know how they can use this this overall and sort of how to get access to this link and
153 00:18:40.790 ⇒ 00:18:42.609 Uttam Kumaran: the security and everything.
154 00:18:42.610 ⇒ 00:18:44.470 Uttam Kumaran: Oh, oh, yeah, yeah, yeah.
155 00:18:44.780 ⇒ 00:18:46.250 YvetteRuiz: Mysticker movie.
156 00:18:48.400 ⇒ 00:18:49.050 YvetteRuiz: Exactly.
157 00:18:49.180 ⇒ 00:18:55.330 YvetteRuiz: That’s gonna be interesting, because I wonder what’s gonna share that’ll kind of help us kind of clean up some guides
158 00:18:55.970 ⇒ 00:18:56.630 YvetteRuiz: alright
159 00:19:00.710 ⇒ 00:19:07.757 YvetteRuiz: car for insurance are not specifically covered in the general pest control services. Yeah. Picked it up. That’s cool.
160 00:19:08.750 ⇒ 00:19:21.770 YvetteRuiz: Yeah. It says it’s exactly the verbiage of the agreement. Yeah. Well, this is the exact information that we have currently for the copied service description, Steven. So if the updates have been made to the actual agreement.
161 00:19:24.330 ⇒ 00:19:26.280 YvetteRuiz: That’s where we gotta clean some of that stuff.
162 00:19:26.630 ⇒ 00:19:28.360 YvetteRuiz: Make sure that they’re the same.
163 00:19:28.360 ⇒ 00:19:34.780 Scott_Harmon: That that’s exactly right. This is working off your your current. Just a couple of documents as they are
164 00:19:35.330 ⇒ 00:19:47.469 Scott_Harmon: doing is creating newer versions of all those documents that are more consistent and cleaner, and the agent will know to look at all the right places. And we’re gonna have to deal with
165 00:19:47.820 ⇒ 00:19:53.590 Scott_Harmon: some of the issues that the data in there that we learn from you is just kind of hard to pull together.
166 00:19:53.590 ⇒ 00:19:54.260 YvetteRuiz: Yeah.
167 00:19:54.570 ⇒ 00:20:05.760 Scott_Harmon: The the center of that for me has to do with the service descriptions. And there’s just a you know, to you gotta pull data from lots of places. So the agent’s gonna have to be
168 00:20:05.960 ⇒ 00:20:09.639 Scott_Harmon: set up properly to really have the right information handy.
169 00:20:10.121 ⇒ 00:20:19.190 Scott_Harmon: And that that’s called Narc, that’s called knowledge engineering. And so we have. That’s going to be something we we do right up front is to try and figure out, how do we get the right.
170 00:20:19.720 ⇒ 00:20:25.519 Scott_Harmon: the right service descriptions into the agent’s training set. Yeah.
171 00:20:25.520 ⇒ 00:20:27.020 Scott_Harmon: Yup, yup, okay.
172 00:20:28.260 ⇒ 00:20:31.761 Uttam Kumaran: Great. So, Miguel, do you want to walk through? How to access this and sort of
173 00:20:31.980 ⇒ 00:20:32.730 YvetteRuiz: Yeah, we.
174 00:20:33.004 ⇒ 00:20:36.839 Uttam Kumaran: Just a simple password setup. But I want to get this over to everybody.
175 00:20:37.670 ⇒ 00:20:38.779 Miguel de Veyra: Let me reshare my.
176 00:20:38.780 ⇒ 00:20:43.519 YvetteRuiz: So what you’re saying is your. This will be shared with myself and Janice, so we can start playing with it.
177 00:20:43.520 ⇒ 00:20:44.589 Uttam Kumaran: 100%. Yeah.
178 00:20:44.590 ⇒ 00:20:44.920 YvetteRuiz: Okay.
179 00:20:44.920 ⇒ 00:20:45.850 Uttam Kumaran: Exactly.
180 00:20:46.470 ⇒ 00:20:47.200 YvetteRuiz: Do they have.
181 00:20:47.823 ⇒ 00:20:49.069 Miguel de Veyra: Can, everyone.
182 00:20:49.400 ⇒ 00:20:51.900 Steven: See the link on the chat, the one I sent.
183 00:20:53.120 ⇒ 00:20:55.040 YvetteRuiz: Oh, wait! Well, I didn’t even let’s see.
184 00:20:55.040 ⇒ 00:20:55.470 Scott_Harmon: You said.
185 00:20:55.841 ⇒ 00:20:57.700 YvetteRuiz: On this meet, right where.
186 00:20:57.700 ⇒ 00:20:59.759 Uttam Kumaran: Yeah, there’s nothing I don’t see anything in zoom chat.
187 00:20:59.760 ⇒ 00:21:00.340 YvetteRuiz: I see it.
188 00:21:01.650 ⇒ 00:21:04.100 Steven: It says, this content can’t be viewed here.
189 00:21:04.100 ⇒ 00:21:11.790 YvetteRuiz: Oh, no, Nope, that’s not it right? Yes, no, that’s the right which you can.
190 00:21:12.550 ⇒ 00:21:16.649 Uttam Kumaran: I can. I can share it, Miguel, like, do you? Do you want to just send in the zoom chat.
191 00:21:17.250 ⇒ 00:21:20.599 Miguel de Veyra: Hey, wait! Let me send it again, because I sent it twice already.
192 00:21:21.240 ⇒ 00:21:22.070 Miguel de Veyra: Is this working.
193 00:21:22.580 ⇒ 00:21:24.049 Scott_Harmon: I got it from Utam. Yeah.
194 00:21:24.350 ⇒ 00:21:25.190 Steven: Yeah, I see if.
195 00:21:25.572 ⇒ 00:21:27.100 Miguel de Veyra: Not allowed to send.
196 00:21:27.310 ⇒ 00:21:29.069 YvetteRuiz: Oh, yeah, I stay from Utah.
197 00:21:29.490 ⇒ 00:21:32.950 Miguel de Veyra: Okay, yeah. So let me share screen.
198 00:21:32.950 ⇒ 00:21:33.940 YvetteRuiz: Passwords.
199 00:21:33.940 ⇒ 00:21:34.620 Scott_Harmon: What’s the path?
200 00:21:34.620 ⇒ 00:21:38.090 Miguel de Veyra: It’s ABC. Home all smart.
201 00:21:38.510 ⇒ 00:21:39.420 YvetteRuiz: Lower caps.
202 00:21:39.420 ⇒ 00:21:39.810 Uttam Kumaran: Yes.
203 00:21:39.810 ⇒ 00:21:42.350 Miguel de Veyra: Yes, all small letters.
204 00:21:42.350 ⇒ 00:21:43.450 YvetteRuiz: ABC. Home.
205 00:21:43.630 ⇒ 00:21:44.290 Miguel de Veyra: Yep.
206 00:21:44.480 ⇒ 00:21:46.440 YvetteRuiz: Cool. Here it is.
207 00:21:46.720 ⇒ 00:21:47.420 Scott_Harmon: Yep.
208 00:21:48.830 ⇒ 00:21:49.480 YvetteRuiz: We’re in.
209 00:21:50.410 ⇒ 00:21:51.670 YvetteRuiz: What do we want to ask?
210 00:21:52.220 ⇒ 00:21:55.307 YvetteRuiz: Ask it? Is
211 00:21:56.310 ⇒ 00:22:04.699 YvetteRuiz: Hold on! I know there’s the menu of services I’m trying to think of something that’s gonna be harder. Do I need to schedule a follow up check trap.
212 00:22:06.860 ⇒ 00:22:11.059 Uttam Kumaran: Any event you might. Do you mind sharing while you guys are interacting with it? Just so we can
213 00:22:11.560 ⇒ 00:22:12.580 Uttam Kumaran: observe.
214 00:22:14.070 ⇒ 00:22:14.800 YvetteRuiz: Yeah.
215 00:22:15.150 ⇒ 00:22:18.020 Scott_Harmon: What did? What documents did you feed it so far? Utam.
216 00:22:18.020 ⇒ 00:22:21.089 YvetteRuiz: Oh, I know, so did you do all the pest ones.
217 00:22:22.220 ⇒ 00:22:23.430 YvetteRuiz: Shared with you.
218 00:22:23.430 ⇒ 00:22:28.230 Uttam Kumaran: Yes, I guess, Miguel, you can confirm. I don’t know if we were able to get to the Powerpoints, but.
219 00:22:29.020 ⇒ 00:22:32.339 Miguel de Veyra: Everything we sent, including the Powerpoints.
220 00:22:32.340 ⇒ 00:22:32.880 YvetteRuiz: But it’s 2.
221 00:22:32.880 ⇒ 00:22:33.380 Miguel de Veyra: Things I.
222 00:22:34.100 ⇒ 00:22:36.260 YvetteRuiz: Switch with you.
223 00:22:36.260 ⇒ 00:22:42.350 Scott_Harmon: Problem is gonna be the problem with it’s gonna be. I’m almost sure I’m.
224 00:22:42.350 ⇒ 00:22:42.690 YvetteRuiz: How much.
225 00:22:42.770 ⇒ 00:22:43.639 Scott_Harmon: Didn’t get this.
226 00:22:44.747 ⇒ 00:22:48.070 YvetteRuiz: I’m so sorry.
227 00:22:49.040 ⇒ 00:22:55.889 Scott_Harmon: Just. I’m just trying to manage expectations that the the documents that have been fed into it so far, I think, are just the
228 00:22:56.300 ⇒ 00:22:56.940 Scott_Harmon: the
229 00:22:58.040 ⇒ 00:23:05.620 Scott_Harmon: the notes, you know, the sops, the mat that spreadsheet, which is where so many so much of the information is, it wouldn’t have been able to.
230 00:23:06.360 ⇒ 00:23:06.710 YvetteRuiz: Right?
231 00:23:07.170 ⇒ 00:23:10.060 YvetteRuiz: Okay, okay, yes. Can you guys see the screen?
232 00:23:10.060 ⇒ 00:23:10.670 Miguel de Veyra: Yes.
233 00:23:10.980 ⇒ 00:23:11.953 YvetteRuiz: Okay, bye?
234 00:23:13.690 ⇒ 00:23:15.350 YvetteRuiz: Which in the following.
235 00:23:23.012 ⇒ 00:23:26.877 YvetteRuiz: yeah, no. Be clear
236 00:23:39.970 ⇒ 00:23:41.730 YvetteRuiz: tech, trump or cage.
237 00:23:42.400 ⇒ 00:23:49.050 YvetteRuiz: Yes, for audit resources, you should automatically schedule one, follow up for cage needed to follow up rubbing question.
238 00:23:49.750 ⇒ 00:24:04.430 YvetteRuiz: Ask the probing questions. I like this because even the scratching of the wall, because I have it in one of my hot topic sheets showing you need to make sure. Is it rats, or is it.
239 00:24:04.430 ⇒ 00:24:10.859 Scott_Harmon: So the the hot topic sheets, and the sops, or what it would have been trained on so far.
240 00:24:10.860 ⇒ 00:24:11.929 YvetteRuiz: Okay. Okay, well, that makes sense.
241 00:24:11.930 ⇒ 00:24:15.610 YvetteRuiz: He already knows more than I do. It’s already alrighty.
242 00:24:15.610 ⇒ 00:24:21.049 YvetteRuiz: These are questions already that you know that will. They will have to know when they’re scheduling.
243 00:24:21.680 ⇒ 00:24:24.657 YvetteRuiz: or that come up often that they don’t.
244 00:24:37.960 ⇒ 00:24:41.319 Uttam Kumaran: Miguel, can you talk about the process for the spreadsheets? Currently.
245 00:24:42.630 ⇒ 00:24:43.140 Miguel de Veyra: Can we just.
246 00:24:43.140 ⇒ 00:24:45.920 Uttam Kumaran: Do we just ingest that Csv. Or what happened to those.
247 00:24:47.800 ⇒ 00:24:51.560 Miguel de Veyra: No, I should be able to add it to be honest like right now.
248 00:24:52.290 ⇒ 00:24:54.720 Scott_Harmon: It’s gonna going to be sure it’ll be okay.
249 00:24:55.020 ⇒ 00:24:56.270 Scott_Harmon: It’s gonna be hard.
250 00:24:56.270 ⇒ 00:24:57.490 YvetteRuiz: That would be another.
251 00:24:58.410 ⇒ 00:25:03.429 Uttam Kumaran: No, I don’t think I think it’ll be okay. We do a lot of Csv stuff for our for Brainforge already.
252 00:25:03.430 ⇒ 00:25:04.079 Uttam Kumaran: It should be okay.
253 00:25:04.080 ⇒ 00:25:04.660 Scott_Harmon: Right.
254 00:25:05.110 ⇒ 00:25:07.790 Miguel de Veyra: Okay, issue I had was, we’ll focus more.
255 00:25:07.790 ⇒ 00:25:18.620 Scott_Harmon: I think you’re gonna find some problems that you’re gonna have to do a little bit more. But but it’s not huge. But I’ve already ingested 4 versions of the Csv with 4 different tools. And you’re gonna have to do a little
256 00:25:18.870 ⇒ 00:25:20.360 Scott_Harmon: transformation on it.
257 00:25:26.130 ⇒ 00:25:27.170 YvetteRuiz: Yeah, yeah.
258 00:25:27.170 ⇒ 00:25:36.030 Uttam Kumaran: So a couple of things I want to point out. Maybe we have a couple more minutes left. I think 2 2 clear things. I think we’re gonna do in terms of like the Usability here. And one
259 00:25:36.350 ⇒ 00:25:56.819 Uttam Kumaran: I like. The best thing I think for us to do is to keep using this as often as possible. We are. Gonna be seeing the logs come in and sort of we’ll already be able to see what’s working and not the second thing is, Miguel, a couple of things on the on the Ui. Here we wanna have the suggested questions. So we can add that functionality which we also have. So it’ll say, like.
260 00:25:56.950 ⇒ 00:26:00.259 Uttam Kumaran: based on what you just answered. It’ll suggest you a couple of follow up questions.
261 00:26:01.960 ⇒ 00:26:04.910 Uttam Kumaran: And the second thing is, we want to show the
262 00:26:05.230 ⇒ 00:26:11.139 Uttam Kumaran: maybe the the source of the document in the short term of where it retrieved it. From prior to us.
263 00:26:11.707 ⇒ 00:26:13.992 Uttam Kumaran: Establishing the new document structure.
264 00:26:14.470 ⇒ 00:26:21.480 Scott_Harmon: That’s a great point. There needs to be a read more kind of button on every answer. So if someone wants to go look at the underlying Doc. They can.
265 00:26:21.880 ⇒ 00:26:24.239 YvetteRuiz: They can. They can see it. That makes sense. Yeah.
266 00:26:24.240 ⇒ 00:26:30.649 Scott_Harmon: And then you’re gonna need to decide as we go along. This is a separate app, as you can see, it runs in a browser.
267 00:26:31.750 ⇒ 00:26:36.400 Scott_Harmon: It may be that this needs to be not a separate app, but running Google Chat.
268 00:26:36.400 ⇒ 00:26:37.139 YvetteRuiz: Yeah, yeah.
269 00:26:37.140 ⇒ 00:26:41.660 Uttam Kumaran: I think we already decided on. Yeah, I think we already decided on. That’s gonna be what what the goal is.
270 00:26:41.976 ⇒ 00:26:43.243 YvetteRuiz: Chat for now, yeah,
271 00:26:44.000 ⇒ 00:26:50.260 Scott_Harmon: Okay? So that so the follow up questions. And so the stuff on the right won’t be there.
272 00:26:50.897 ⇒ 00:26:58.390 Scott_Harmon: Because you can’t surface that in Google chat that I know of. So it’ll really just be the agent itself. And then the follow up links and buttons.
273 00:26:59.170 ⇒ 00:27:01.180 Uttam Kumaran: Awesome.
274 00:27:01.180 ⇒ 00:27:07.669 Uttam Kumaran: We’ll interact a little bit with how the Google chat like Api works. I think we’ll be able to at least give whenever you interact with, give it some overall
275 00:27:07.670 ⇒ 00:27:13.529 Uttam Kumaran: information about like how to use it or suggested questions. And of course, like, I think we’ll
276 00:27:13.820 ⇒ 00:27:19.379 Uttam Kumaran: eventually, if we deploy, we’ll we’ll need to do a little bit of training for folks on, just like how they get access.
277 00:27:19.540 ⇒ 00:27:21.659 Uttam Kumaran: and it’ll take some time right? I mean, I think.
278 00:27:21.930 ⇒ 00:27:22.430 YvetteRuiz: Well.
279 00:27:22.430 ⇒ 00:27:25.430 Uttam Kumaran: Everybody here has tried chat. Gbt, it’s like a new way of Googling.
280 00:27:25.820 ⇒ 00:27:35.460 Uttam Kumaran: so it takes a little bit of time to to get a sense for it. But yeah, so this is really great. I guess anything else we want to tackle before I think we have.
281 00:27:35.640 ⇒ 00:28:04.999 Uttam Kumaran: We’re going to be sprinting on this all next week. I think the only meeting I want to try to set up early next week is to go over basically trying to create that database of questions. Ideally, we we kind of go through an exercise like this. I think we take an hour and sort of list out what are the highest priority things. And then that helps our team make sure that those are prioritized. Additionally, that helps our team understand how we need to organize the document structure. So we can. We can. Also the other item we’ll be looking for is basically ranking on like, easy to hard questions.
282 00:28:05.060 ⇒ 00:28:10.109 Uttam Kumaran: right? And so we do want to have some of those very complicated multi-step problems.
283 00:28:10.250 ⇒ 00:28:34.830 Uttam Kumaran: And so if if maybe helpful, I can throw together a spreadsheet and sort of share that, you guys can add. But really, we, you know, we we want to drive towards not only just answering the okay, cool went to a doc and grabbed it, but where? It may need to answer several questions, create a concise answer. And so that’s what we’ll work towards for Tuesday. And then I think, Miguel, for next demo? We can make sure that all documents
284 00:28:35.434 ⇒ 00:28:42.990 Uttam Kumaran: make it in through rag, and then the ui on this side. You know we have a little bit more on the suggestions.
285 00:28:42.990 ⇒ 00:28:47.719 Scott_Harmon: Well, can I steal 60 seconds here, Tom, just since we’ve got Miguel and everybody, I just wanted
286 00:28:47.950 ⇒ 00:28:51.039 Scott_Harmon: double click on what you just said to go one layer deeper.
287 00:28:52.620 ⇒ 00:28:55.569 Scott_Harmon: So do we. We have the time.
288 00:28:56.040 ⇒ 00:28:57.400 Uttam Kumaran: Yes, for 3 min.
289 00:28:57.400 ⇒ 00:29:04.260 Scott_Harmon: Let me so, Miguel, this is the spreadsheet
290 00:29:04.840 ⇒ 00:29:07.799 Scott_Harmon: that’s in there. Can you also? Can you see my screen.
291 00:29:08.670 ⇒ 00:29:12.530 YvetteRuiz: Should I stop sharing? I think I need to stop sharing.
292 00:29:12.920 ⇒ 00:29:13.430 Scott_Harmon: Oh!
293 00:29:14.910 ⇒ 00:29:18.459 YvetteRuiz: Okay, there we go. Scott’s favorite.
294 00:29:18.460 ⇒ 00:29:27.510 Scott_Harmon: So I like. So just to be clear with. So we don’t have to have another meeting. There’s a whole bunch of tabs. The main function of this spreadsheet is. It’s a it’s a
295 00:29:27.710 ⇒ 00:29:40.640 Scott_Harmon: tool that allows the Csrs to figure out which technicians can deliver which services in which locations, at what times. So it’s a scheduling. It’s an availability matrix.
296 00:29:41.040 ⇒ 00:29:53.010 Scott_Harmon: The data is pulled from a system, from a system they have called evolve. So ultimately, we’d like to build, and someone has to do a bunch of work to scrape data out of, evolve to build this spreadsheet.
297 00:29:53.280 ⇒ 00:30:07.619 Scott_Harmon: So if you look at this, if you look, there’s a i don’t know what these numbers are, but Santiago here in Row 10 can deliver the following service Monday to Friday in these zip codes.
298 00:30:08.760 ⇒ 00:30:14.459 Scott_Harmon: so that data would be used for a Csr to say, it’s not what
299 00:30:14.460 ⇒ 00:30:17.050 Scott_Harmon: technicians is that is that wrong? Did I say that wrong.
300 00:30:17.050 ⇒ 00:30:40.349 YvetteRuiz: Yeah, yes, because if we’re starting with Pest truthfully, and I need to share these 2 with you, and I’ll have to. I have event, and I are owners of one of them. But we’re not the owners of the other one. So, Tom, I’ll get you access to that one and tell them that they need to open it up because there is a separate spreadsheet like this for the pest side, and that’s what needs to be in there, because this is lawn.
301 00:30:40.350 ⇒ 00:30:42.330 Scott_Harmon: But am I describing it? Right so.
302 00:30:42.330 ⇒ 00:30:46.499 YvetteRuiz: Oh, yeah, yeah, you’re describing it. Perfect. Yeah, it’s just this is, yeah, this is just.
303 00:30:46.500 ⇒ 00:30:56.399 Scott_Harmon: So when I say, when I say we need to ingest it, Miguel, like I know we can do an Esv and suck it in. But ultimately the agents gonna need to be able to figure out.
304 00:30:57.020 ⇒ 00:30:57.630 YvetteRuiz: Right.
305 00:30:57.630 ⇒ 00:31:02.420 Scott_Harmon: The row column, logic to to scheduling question and.
306 00:31:03.300 ⇒ 00:31:10.240 Scott_Harmon: I think the question would be, help help me out here a little bit. Who can deliver what service in this Zip code on.
307 00:31:10.240 ⇒ 00:31:11.380 YvetteRuiz: Absolutely.
308 00:31:11.930 ⇒ 00:31:15.590 Scott_Harmon: On this day. That’s the question that you need to be able to answer from all of this
309 00:31:15.970 ⇒ 00:31:17.210 Scott_Harmon: data. So.
310 00:31:17.950 ⇒ 00:31:20.840 Scott_Harmon: That’s the test if you’ve ingested. So there’s a bunch
311 00:31:20.960 ⇒ 00:31:24.179 Scott_Harmon: said another way. There’s a bunch of semantic information in here
312 00:31:24.610 ⇒ 00:31:28.699 Scott_Harmon: that you’re going to need to make accessible, so the agent can answer
313 00:31:28.880 ⇒ 00:31:31.750 Scott_Harmon: availability and scheduling questions. There you go.
314 00:31:31.750 ⇒ 00:31:34.319 YvetteRuiz: Scheduling questions. There you go scheduling.
315 00:31:34.936 ⇒ 00:31:39.250 Steven: So will you need account. You’ll need.
316 00:31:40.080 ⇒ 00:31:51.109 Steven: Yeah, we talked about who controls evolve. Julie technically does but I also can get you in contact with one of the head people over there. Would that be helpful? I assume you need.
317 00:31:51.110 ⇒ 00:31:59.009 Uttam Kumaran: Yes, I mean ideally, if we, if you have a vent, if you have someone on the vendor side that I could just have in a back pocket in case we need questions for the Api
318 00:31:59.448 ⇒ 00:32:11.561 Uttam Kumaran: there is a high chance that that we will looking. I just. I just had a chance to look at Tim’s email. It looks like we, we were able to get the email and a service account
319 00:32:12.590 ⇒ 00:32:16.210 Uttam Kumaran: and a few other things. I’m not sure yet if we got
320 00:32:16.340 ⇒ 00:32:26.020 Uttam Kumaran: the evolve access yet. But yeah, anything you can do to help me get connected with someone there. Just so I have it. That would be amazing.
321 00:32:26.020 ⇒ 00:32:37.639 Scott_Harmon: And then one question on, and I know this is for law, and we want the one for pest. But one question. I didn’t get on the spreadsheet. I know you’re scraping the data out of evolve to build a spreadsheet.
322 00:32:38.010 ⇒ 00:32:40.320 Scott_Harmon: who puts it into evolve.
323 00:32:41.740 ⇒ 00:32:47.120 YvetteRuiz: That’s Julie, our operations. Manager.
324 00:32:47.370 ⇒ 00:32:50.390 Scott_Harmon: That’s what I thought. So there’s someone that works for
325 00:32:50.860 ⇒ 00:32:54.329 Scott_Harmon: the service line and operations that goes and
326 00:32:54.630 ⇒ 00:33:02.270 Scott_Harmon: puts into evolve, which technicians can do which services in which location? So that’s the whole information lifecycle, Tom. So
327 00:33:02.870 ⇒ 00:33:09.520 Scott_Harmon: if we want, we can keep that so that Julie still puts it in evolve. It’s probably best to leave that alone
328 00:33:09.790 ⇒ 00:33:14.330 Scott_Harmon: ideally, the AI can scrape it out of evolve, and and we don’t
329 00:33:14.520 ⇒ 00:33:16.899 Scott_Harmon: need folks to have to build this
330 00:33:17.500 ⇒ 00:33:21.290 Scott_Harmon: this spreadsheet at all would be would be a real great outcome, I think. Yeah.
331 00:33:21.840 ⇒ 00:33:22.320 YvetteRuiz: That’s exactly.
332 00:33:22.320 ⇒ 00:33:30.767 Uttam Kumaran: What we’re hoping for is that we will anything that we can source directly from the source system. We will basically ditch from sourcing from the documents.
333 00:33:31.920 ⇒ 00:33:32.330 Miguel de Veyra: Yeah.
334 00:33:32.330 ⇒ 00:33:33.080 Uttam Kumaran: Ideally.
335 00:33:33.540 ⇒ 00:33:34.240 YvetteRuiz: Yep.
336 00:33:34.920 ⇒ 00:33:35.830 Uttam Kumaran: Okay, great.
337 00:33:35.960 ⇒ 00:33:53.209 Uttam Kumaran: I thought this was a great meeting. I hopefully, we left you with something that you can play around with. I think we’ll schedule some time, for early next week. I’ll coordinate over email, I hope we have double the amount of stuff to share next Friday. But I’m really excited. I think we, you know. I’m glad we turned something around in like in a few hours.
338 00:33:53.210 ⇒ 00:33:54.300 YvetteRuiz: I know that was very.
339 00:33:54.300 ⇒ 00:33:54.890 Uttam Kumaran: Hi I mean.
340 00:33:54.890 ⇒ 00:33:58.959 YvetteRuiz: You should see our we’re like, I mean, we’re excited to get in there. I mean, this is.
341 00:33:59.340 ⇒ 00:34:08.549 YvetteRuiz: From that point to that point you guys have, I mean, you, you guys know. But again, you have no idea how what game changer this is gonna be for for agents.
342 00:34:08.550 ⇒ 00:34:15.219 Uttam Kumaran: It makes me really, really happy. So you know, we we’re gonna set sort of like a 4 week sort of roadmap of of technical
343 00:34:16.328 ⇒ 00:34:17.400 Uttam Kumaran: and sort of drive.
344 00:34:17.600 ⇒ 00:34:27.707 Uttam Kumaran: Drive forward as fast as possible, I think for next Friday we’ll have a we’ll have a lot more clarity on those technical tasks. And ideally, if there’s any blockers that we can get help with, we’ll let you know.
345 00:34:27.969 ⇒ 00:34:38.649 Steven: Do you need? So do you have everything you need from Tim. Now, then, based on that email, I’ll I’ll shoot an email out to introduce you to evolve. I’ll need to include Julie
346 00:34:38.809 ⇒ 00:34:48.209 Steven: trying to prevent her from being a roadblock, but she needs to know what’s going on. I don’t want to just go around her. But yeah, I want to be able to work directly with evolve. So introduce you all. But is there anything else you need from Tim right now.
347 00:34:48.210 ⇒ 00:34:51.940 Uttam Kumaran: I still believe I need evolve and dream. Api access.
348 00:34:52.222 ⇒ 00:34:54.479 YvetteRuiz: Think Tim’s on the call. He just messaged.
349 00:34:54.919 ⇒ 00:35:05.869 T F: Yeah, there, there is no standardized evolve Api, so there are no endpoints for y’all to call any. Any integration that y’all will do will need to be custom, and that must involve Julie.
350 00:35:06.250 ⇒ 00:35:12.334 Uttam Kumaran: Okay. So then maybe if you can loop me in with Julie, and then we can keep Tim on that email. And I can talk about the
351 00:35:12.670 ⇒ 00:35:15.160 Uttam Kumaran: the fields we would need, and we can set up those integrations.
352 00:35:17.469 ⇒ 00:35:21.380 Steven: I assume. I assume you guys already have some integration set up with evolve.
353 00:35:22.940 ⇒ 00:35:24.580 T F: Yes, there are a few.
354 00:35:24.580 ⇒ 00:35:32.849 Uttam Kumaran: Okay, cool. Okay? So I ideally, hopefully, we could maybe make mimic a couple of those if there’s a read. Only so yeah, we’ll get this up, and and does dream have have an Api as well.
355 00:35:33.310 ⇒ 00:35:38.119 T F: They do not have an Api that’s just ready for consumption. It’s the same situation. There.
356 00:35:38.120 ⇒ 00:35:41.710 Uttam Kumaran: Okay, okay, cool. So for both of those, I think we’ll work to mimic what you guys have.
357 00:35:44.080 ⇒ 00:35:47.069 Steven: Cool. So yeah, like, I said, I’ll shoot an email include
358 00:35:47.540 ⇒ 00:35:50.460 Steven: Julie and Bailey is his name over to evolve.
359 00:35:50.760 ⇒ 00:35:52.719 Steven: So you’ll have their contact.
360 00:35:53.280 ⇒ 00:35:53.610 Uttam Kumaran: Okay.
361 00:35:53.610 ⇒ 00:36:03.669 Scott_Harmon: And just while we’re on it, would you all remind me again? I was a little bit unclear, but you strike me out on what information lives, in, what system I get, what lives in, evolve.
362 00:36:03.800 ⇒ 00:36:06.019 Scott_Harmon: what information lives in dream again.
363 00:36:08.200 ⇒ 00:36:10.170 YvetteRuiz: All servants, all servants.
364 00:36:10.170 ⇒ 00:36:13.290 Steven: Service agreements. That’s our sales software.
365 00:36:14.280 ⇒ 00:36:16.210 Steven: Pricing service agreements.
366 00:36:17.490 ⇒ 00:36:20.080 Scott_Harmon: And so the right?
367 00:36:20.520 ⇒ 00:36:23.120 Scott_Harmon: So basically, it’s the contracts.
368 00:36:24.120 ⇒ 00:36:26.800 Scott_Harmon: Sign. So you know what the customer’s paying for.
369 00:36:27.070 ⇒ 00:36:28.363 Scott_Harmon: Yeah. And
370 00:36:29.010 ⇒ 00:36:31.380 YvetteRuiz: And the service they signed up for
371 00:36:31.910 ⇒ 00:36:35.761 YvetteRuiz: the description. Pretty much is is on the contract.
372 00:36:36.500 ⇒ 00:36:38.200 Scott_Harmon: The description, meaning.
373 00:36:38.200 ⇒ 00:36:48.090 YvetteRuiz: The service description. So if I signed up for a road in annual service, or came up with that, that whole description is there, as far as what’s covered in that.
374 00:36:50.170 ⇒ 00:36:51.860 Scott_Harmon: Got it? Okay?
375 00:36:52.130 ⇒ 00:36:52.810 Scott_Harmon: Oh.
376 00:36:53.270 ⇒ 00:36:54.870 YvetteRuiz: Which are important documents.
377 00:36:55.140 ⇒ 00:36:59.940 Scott_Harmon: Yeah, not not to go to. Okay, I I should probably stop, but I can never help myself.
378 00:37:01.712 ⇒ 00:37:04.199 Uttam Kumaran: I gotta jump, actually.
379 00:37:04.200 ⇒ 00:37:05.379 Scott_Harmon: I’ll I’ll ask later. But.
380 00:37:05.380 ⇒ 00:37:08.649 Uttam Kumaran: Okay, yeah, maybe we can just ask over email, maybe we can coordinate on that.
381 00:37:08.650 ⇒ 00:37:09.390 Uttam Kumaran: Sure. Okay.
382 00:37:10.330 ⇒ 00:37:11.389 Scott_Harmon: Thanks. Everyone.
383 00:37:11.390 ⇒ 00:37:12.290 YvetteRuiz: Thanks guys.
384 00:37:12.290 ⇒ 00:37:13.109 Uttam Kumaran: Bye, guys appreciate it.
385 00:37:13.110 ⇒ 00:37:14.140 Miguel de Veyra: Bye, guys, have a good day.
386 00:37:14.140 ⇒ 00:37:14.430 Uttam Kumaran: Bye.