Meeting Title: Brainforge x Avoca Transcript Sync Date: 2026-04-13 Meeting participants: YvetteRuiz, read.ai meeting notes, Pranav Narahari
WEBVTT
1 00:00:41.310 ⇒ 00:00:44.140 YvetteRuiz: Over 6. Great.
2 00:00:50.980 ⇒ 00:00:51.800 YvetteRuiz: Nope.
3 00:00:57.220 ⇒ 00:00:58.340 YvetteRuiz: Okay…
4 00:01:14.520 ⇒ 00:01:19.160 YvetteRuiz: Or 6, or 11… Nope.
5 00:01:21.550 ⇒ 00:01:26.950 YvetteRuiz: Okay, so 3:30… Okay, very dirty.
6 00:01:28.850 ⇒ 00:01:31.270 YvetteRuiz: 4, 6, 4, 11.
7 00:01:39.060 ⇒ 00:01:41.599 YvetteRuiz: 4, 6, 4, 11.
8 00:01:43.670 ⇒ 00:01:52.079 YvetteRuiz: Monday through Saturday. Okay, so then the other one should be 3:30 to 4, 4.
9 00:01:55.020 ⇒ 00:01:56.769 YvetteRuiz: Maybe this one…
10 00:01:57.020 ⇒ 00:01:57.590 Pranav Narahari: Hey, Yvette.
11 00:01:58.330 ⇒ 00:02:00.110 YvetteRuiz: Hey there!
12 00:02:00.940 ⇒ 00:02:01.769 Pranav Narahari: Good mornin’.
13 00:02:01.770 ⇒ 00:02:02.750 YvetteRuiz: Morning!
14 00:02:03.380 ⇒ 00:02:06.260 Pranav Narahari: Welcome back! How’s your… how was your long weekend?
15 00:02:06.540 ⇒ 00:02:08.849 YvetteRuiz: It was good, it was good, thank you!
16 00:02:08.850 ⇒ 00:02:11.540 Pranav Narahari: Yeah, yeah, congratulations again.
17 00:02:13.040 ⇒ 00:02:14.819 YvetteRuiz: Thank you so much.
18 00:02:14.820 ⇒ 00:02:17.740 Pranav Narahari: Yeah, welcome back, welcome back.
19 00:02:17.740 ⇒ 00:02:18.690 YvetteRuiz: How was your weekend?
20 00:02:19.450 ⇒ 00:02:28.020 Pranav Narahari: It was pretty good. It was pretty good. I’m actually in Austin right now, but it was a very quick trip, and then I’m leaving tonight.
21 00:02:28.130 ⇒ 00:02:29.210 YvetteRuiz: Okay.
22 00:02:29.220 ⇒ 00:02:48.609 Pranav Narahari: However, I’m coming next week as well, so Saturday to midweek Tuesday or Wednesday, I’m gonna be back, and so I was talking to Janiece on Friday, because we’re trying to coordinate, if it works with you as well, like, maybe some type of meeting, if that works. I don’t know, I know your schedule usually for coming into Austin is Thursdays.
23 00:02:48.690 ⇒ 00:02:53.779 Pranav Narahari: But we can figure something out if it’s, like, San Antonio, like…
24 00:02:54.040 ⇒ 00:02:57.989 Pranav Narahari: Just wanted to float that, because, you know, I’ll be in Austin in the future, too, but…
25 00:02:58.370 ⇒ 00:02:58.750 YvetteRuiz: Yeah.
26 00:02:58.750 ⇒ 00:02:59.270 Pranav Narahari: too.
27 00:02:59.590 ⇒ 00:03:13.300 YvetteRuiz: Yeah, no, for sure. So, the week, the Monday, Tuesday, the 20th, yeah, because I’m on… I’ll be off the following week, Wednesday, Thursday, Friday, so I was gonna plan to go up there, but I could either go Monday or Tuesday.
28 00:03:13.700 ⇒ 00:03:17.949 Pranav Narahari: Cool. Okay, so… Tuesday would be best for me, because.
29 00:03:17.950 ⇒ 00:03:18.350 YvetteRuiz: Okay.
30 00:03:18.350 ⇒ 00:03:30.979 Pranav Narahari: we have some… we’re going to, like, a conference in Austin, which should be probably the whole day, but yeah, Tuesday morning, we could, you know, we could do something. I think that would be great.
31 00:03:31.270 ⇒ 00:03:42.620 YvetteRuiz: Yeah, so the only thing I have Tuesday morning, is our executive weekly meeting on Tuesday mornings. It’s, like, from 8:30 to 9:30ish, maybe about 10, so, like, after that, I’ll be available.
32 00:03:42.930 ⇒ 00:03:48.899 Pranav Narahari: Perfect. That sounds great. Yeah, my Tuesday’s wide open, my flight is at 2:30, so…
33 00:03:48.900 ⇒ 00:03:49.570 YvetteRuiz: Okay.
34 00:03:49.740 ⇒ 00:03:54.459 Pranav Narahari: Probably, like, you know, after lunch, I’ll just probably head out, but yeah.
35 00:03:54.460 ⇒ 00:03:54.870 YvetteRuiz: Yeah.
36 00:03:54.870 ⇒ 00:03:56.100 Pranav Narahari: More than enough time.
37 00:03:56.400 ⇒ 00:04:04.569 YvetteRuiz: Sure, absolutely, yeah, just so if you want to just put it on the calendar, like, from 10 on… you know, from 10 to 12, that’d be good.
38 00:04:04.570 ⇒ 00:04:08.409 Pranav Narahari: Cool, cool. I think Utam’s also excited. He’ll show up as well.
39 00:04:08.410 ⇒ 00:04:09.440 YvetteRuiz: Cool! Awesome.
40 00:04:09.440 ⇒ 00:04:12.400 Pranav Narahari: Cool. Awesome, I’m excited for that.
41 00:04:12.720 ⇒ 00:04:16.399 Pranav Narahari: So, also, Janiece, I believe, is out today. Yeah.
42 00:04:16.649 ⇒ 00:04:25.369 Pranav Narahari: Right, okay, so we have our KPI meeting after that. We can maybe just kind of talk a little bit about that this week. I wanted to start off, though, with,
43 00:04:25.580 ⇒ 00:04:35.689 Pranav Narahari: the transcript cancellation stuff, because I know that’s something you were thinking about, like, hey, this is something I’m going to have to do right now, whether it’s manually or maybe we can help with, like, some automations.
44 00:04:35.840 ⇒ 00:04:37.890 Pranav Narahari: Were you able to, like…
45 00:04:37.890 ⇒ 00:04:38.300 YvetteRuiz: blood.
46 00:04:38.300 ⇒ 00:04:42.840 Pranav Narahari: find a good way of getting those transcript IDs for cancellations?
47 00:04:42.840 ⇒ 00:05:03.230 YvetteRuiz: Yeah, so I paused on it, and sorry about that, I didn’t… I didn’t get back to you, so… No problem. What I ended up finding when I was going through the actual list that I was getting ready to start getting all the transcripts ready for you guys, is that I started noticing that there was a handful of them that were bad debt, and those typically don’t go through phone calls, they’re just canceled because they didn’t pay us.
48 00:05:03.250 ⇒ 00:05:28.180 YvetteRuiz: And so then I was like, okay, let me, let me change the approach here. So, what I did is I tasked the leaders of that department to go through these cancellations and mark which ones were preventable and avoidable, meaning which ones were handled by the CSR. And so, and whatever they listed there. Then once they completed it, which I tasked them to get that to me by the end
49 00:05:28.180 ⇒ 00:05:31.389 YvetteRuiz: of today. Once I get that, then I could get you those
50 00:05:31.480 ⇒ 00:05:37.759 YvetteRuiz: trans IDs, so you guys can start pulling those and reviewing them, if that works.
51 00:05:38.030 ⇒ 00:05:44.729 Pranav Narahari: Yeah, no, that works great. Is there, like, even one or two that you have available right now?
52 00:05:45.400 ⇒ 00:05:56.580 Pranav Narahari: If not, that’s totally fine, because I’m just asking, so, in parallel, so that we can do some work today of just sourcing those transcripts from 8x8,
53 00:05:58.310 ⇒ 00:05:59.099 YvetteRuiz: Hang on, let me…
54 00:05:59.100 ⇒ 00:06:04.309 Pranav Narahari: Or even if you have just any transcript ID, it doesn’t even need to be for a cancellation.
55 00:06:04.750 ⇒ 00:06:08.500 Pranav Narahari: That would be great as well, because then that way we can just build the…
56 00:06:09.070 ⇒ 00:06:12.849 Pranav Narahari: the automation on our part for pulling in the transcript by ID.
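The automation Pranav mentions here, pulling in a call transcript by its ID, could be sketched roughly as below. This is a hypothetical sketch only: the base URL, path, and bearer-token header are placeholders, since 8x8's actual API endpoints and auth scheme are not given in this conversation and would need to come from their documentation.

```python
# Hypothetical sketch of fetching one call transcript by ID.
# NOTE: BASE_URL, the /transcripts path, and the auth header are
# placeholders, NOT 8x8's real API; consult their docs for the real values.
import json
import urllib.request

BASE_URL = "https://api.example.com/speech/v1"  # placeholder endpoint


def transcript_url(transcript_id: str) -> str:
    """Build the request URL for a single transcript ID."""
    return f"{BASE_URL}/transcripts/{transcript_id}"


def fetch_transcript(transcript_id: str, api_key: str) -> dict:
    """Fetch and JSON-decode one transcript; raises on HTTP errors."""
    req = urllib.request.Request(
        transcript_url(transcript_id),
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

Given a batch of cancellation transcript IDs, the same function would just be called in a loop, which is why even one sample ID is enough to validate the pull.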
57 00:06:13.320 ⇒ 00:06:17.500 YvetteRuiz: Yeah, I’ll get it to you here in just a minute. Let me, shoot, I don’t even know if…
58 00:06:19.520 ⇒ 00:06:25.840 YvetteRuiz: Shades are already in… Well, actually, let me do this.
59 00:06:28.920 ⇒ 00:06:30.950 YvetteRuiz: Hang on, Pranav, it may take me a minute.
60 00:06:31.250 ⇒ 00:06:32.539 Pranav Narahari: No problem, take your time.
61 00:06:33.520 ⇒ 00:06:35.260 YvetteRuiz: Shoot. Good.
62 00:06:37.280 ⇒ 00:06:40.529 YvetteRuiz: Oh, you know what? I have my data guy here.
63 00:07:08.890 ⇒ 00:07:17.080 YvetteRuiz: I’m having my data guy, David, pull a couple of them for me.
64 00:07:17.930 ⇒ 00:07:18.520 Pranav Narahari: Okay.
65 00:07:18.840 ⇒ 00:07:20.450 YvetteRuiz: Oh, shoot.
66 00:07:20.800 ⇒ 00:07:23.690 YvetteRuiz: Where’s my list? Too many tabs open.
67 00:07:25.250 ⇒ 00:07:29.379 Pranav Narahari: So is David the one that kind of… Manages 8x8.
68 00:07:30.600 ⇒ 00:07:32.360 YvetteRuiz: So, yeah, so David…
69 00:07:33.160 ⇒ 00:07:40.800 YvetteRuiz: manages the 8x8, a lot of… a lot of the KP… does all the KPIs, call volume,
70 00:07:41.050 ⇒ 00:07:45.559 YvetteRuiz: And then… My real-time analyst, who’s
71 00:07:45.710 ⇒ 00:08:01.789 YvetteRuiz: now out on FMLA, was the one that does all, like, all the real time, and who is really gonna start helping me with the speech analytics, which is building, like, the… the key phrases and all that through 8x8, but…
72 00:08:02.250 ⇒ 00:08:08.129 YvetteRuiz: Gotcha. David was originally going to be the one working with me to get all that done, but…
73 00:08:08.820 ⇒ 00:08:11.609 YvetteRuiz: With everything that we’ve kind of shifted with.
74 00:08:12.090 ⇒ 00:08:16.459 YvetteRuiz: I kinda… It’s just… there’s a lot of things going on.
75 00:08:16.460 ⇒ 00:08:18.099 Pranav Narahari: There’s a lot of things happening in parallel, yeah.
76 00:08:18.630 ⇒ 00:08:20.060 Pranav Narahari: Yeah. Yeah.
77 00:08:24.510 ⇒ 00:08:28.970 YvetteRuiz: And then Cynthia is over our QA,
78 00:08:30.480 ⇒ 00:08:33.849 YvetteRuiz: who works with CallSource, who does our QA grading.
79 00:08:34.880 ⇒ 00:08:38.569 YvetteRuiz: So there’s, again, a lot of…
80 00:08:39.840 ⇒ 00:08:56.880 YvetteRuiz: different roles that we’re playing, so that’s kind of what I was mentioning earlier, that I’m trying to really consolidate, and not consolidate, but really get a better structure with everybody that’s kind of working on some type of data, or whether it be, like I said, with Avoca.
81 00:08:56.880 ⇒ 00:09:06.300 YvetteRuiz: With you guys, with our internal data team, what is everybody working on today, and what’s the end goal? So…
82 00:09:06.970 ⇒ 00:09:08.180 Pranav Narahari: Yeah, so…
83 00:09:08.980 ⇒ 00:09:23.809 Pranav Narahari: David sounds like someone I should definitely have a conversation with, right? For the transcripts piece of things, like, he’s gonna be super useful for, you know, maybe any permissions we need, or just any how-tos that could accelerate things on our end.
84 00:09:24.770 ⇒ 00:09:25.679 Pranav Narahari: I had ordered 10.
85 00:09:25.680 ⇒ 00:09:41.650 YvetteRuiz: I mean, he’s limited. I mean, typically he’s limited. He just goes in and he pulls, like, a lot of the analytics. I mean, he can do research and give you some insight, but Tim, our IT director, is more the person who can give you more true insights on that.
86 00:09:41.650 ⇒ 00:09:43.549 Pranav Narahari: Perfect, so Tim has, like, a…
87 00:09:43.980 ⇒ 00:09:49.930 Pranav Narahari: He’s managing all of these different software tools and, you know, partnerships.
88 00:09:50.410 ⇒ 00:09:54.730 YvetteRuiz: That’s good to know. And then I don’t even mind if you join,
89 00:09:55.250 ⇒ 00:10:10.959 YvetteRuiz: our… our meetings with 8x8, because David and I have regular meetings, most of the time monthly meetings with them, just kind of talking through what projects we got going on, and just, any other things that come up.
90 00:10:10.960 ⇒ 00:10:15.119 Pranav Narahari: I would love to be a part of those, yeah. I think specifically for the transcript.
91 00:10:15.450 ⇒ 00:10:17.349 Pranav Narahari: that we’re working on now is…
92 00:10:17.470 ⇒ 00:10:27.280 Pranav Narahari: it’s gonna be just good to just absorb all that information. Our experience so far with 8x8 is that the API isn’t the most well-documented, and it’s not the
93 00:10:27.700 ⇒ 00:10:41.629 Pranav Narahari: most developer-friendly for us so far, but I just want to make sure there’s not a knowledge gap there. Maybe they have certain knowledge that just hasn’t, you know, been updated online, or just hasn’t been given to us so far. So, yeah, if I could get.
94 00:10:41.630 ⇒ 00:10:55.730 YvetteRuiz: Yeah, what I could do, Pranav, is I could schedule us… I need to look at when the next meeting is. I’ll go ahead and, even if I gotta schedule another one, say, hey, can we just kind of jump into meeting? We want to kind of just revisit transcripts again, because they know that what we’ve been trying to build with,
95 00:10:55.730 ⇒ 00:11:04.179 YvetteRuiz: With the speech analytics piece of it, but then they’ve also talked to Sam. We’ve also had other conversations regarding all that, so…
96 00:11:04.180 ⇒ 00:11:04.510 Pranav Narahari: Yeah.
97 00:11:04.510 ⇒ 00:11:20.339 YvetteRuiz: And they’re working right now with the… with us, with Evolve, with the integration, as far as… well, you know what I had mentioned to you, like, the pop screen, and then, of course, us being, being able to put the… the trans… the summary, I’m sorry, in 8x8.
98 00:11:20.680 ⇒ 00:11:25.419 Pranav Narahari: Right, right. Yeah. If we can get one of those calls on the schedule this week, that would be great.
99 00:11:25.660 ⇒ 00:11:27.299 YvetteRuiz: Yep, I can do that.
100 00:11:27.300 ⇒ 00:11:28.330 Pranav Narahari: Perfect, thank you.
101 00:11:30.450 ⇒ 00:11:44.169 YvetteRuiz: Sorry, I’m just waiting on David right now. But while I wait for David, I know this isn’t the KPI meeting, but I did… I was going through, and I wanted…
102 00:11:44.740 ⇒ 00:11:49.879 YvetteRuiz: just to kind of start talking about building a scorecard for Andy?
103 00:11:50.080 ⇒ 00:11:58.829 YvetteRuiz: You know, I know we already have some things, you know, like with speed, with accuracy, with adoption.
104 00:11:59.690 ⇒ 00:12:01.070 YvetteRuiz: But, you know.
105 00:12:03.100 ⇒ 00:12:17.669 YvetteRuiz: what is that looking like right now? Like, I just kind of want to see that week over week, and I know we’re slowly starting to talk about it getting on the dashboard and so forth, but I know now, once we start really looking at transcripts and seeing, okay.
106 00:12:17.750 ⇒ 00:12:33.170 YvetteRuiz: how… tagging them to see, okay, which, you know… did the agent use Andy on the call? What’s the difference in AHT between someone utilizing it versus someone who’s not utilizing it? So, I just wanted to talk: what does that look like overall?
107 00:12:34.090 ⇒ 00:12:38.660 Pranav Narahari: Yeah, so we have all of this data being captured.
108 00:12:39.020 ⇒ 00:12:44.770 Pranav Narahari: per transcript, we’re gonna tag it per CSR, and per trainer, so…
109 00:12:45.390 ⇒ 00:13:05.209 Pranav Narahari: it’s really the dashboards that are going to be the most helpful for you guys, and I think we’ve talked a little bit about, you know, some of the reports specifically that are going to be useful, which is, hey, these are the top 10 category intents that Andy can answer currently, that are super underutilized.
110 00:13:05.460 ⇒ 00:13:12.550 Pranav Narahari: But, yeah, it’s another great thing that you just mentioned, too, is, like, how can we… also show
111 00:13:12.940 ⇒ 00:13:20.980 Pranav Narahari: the… maybe, performance of certain CSRs based on their… and then also, in a separate column, how does that…
112 00:13:21.090 ⇒ 00:13:23.520 Pranav Narahari: How’s that correlate with their usage of Andy?
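The comparison being discussed, average handle time (AHT) for calls where the CSR used Andy versus calls where they didn't, could be sketched as below. The field names ("used_andy", "aht_sec") and the sample numbers are illustrative assumptions; the real per-call tags would come from Avoca's per-transcript, per-CSR data mentioned above.

```python
# Sketch: compare mean AHT (seconds) for Andy-assisted vs. unassisted calls.
# All field names and values below are illustrative, not real Avoca data.
calls = [
    {"csr": "A", "used_andy": True,  "aht_sec": 310},
    {"csr": "A", "used_andy": False, "aht_sec": 420},
    {"csr": "B", "used_andy": True,  "aht_sec": 290},
    {"csr": "B", "used_andy": False, "aht_sec": 380},
]


def avg_aht(rows, used):
    """Mean AHT over calls filtered by whether Andy was used."""
    vals = [r["aht_sec"] for r in rows if r["used_andy"] == used]
    return sum(vals) / len(vals) if vals else None


with_andy = avg_aht(calls, True)
without_andy = avg_aht(calls, False)
```

The same filter could be grouped per CSR (or per trainer) to produce the side-by-side scorecard column being described.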
113 00:13:25.120 ⇒ 00:13:40.320 Pranav Narahari: But there’s two main things that I would want to probably look at based on our previous conversations, which is hold time, and then also, besides hold time, what’s, like, another, maybe, key indicator of CSR success?
114 00:13:41.470 ⇒ 00:13:53.970 YvetteRuiz: Escalations. Okay. You know what I mean? I think escalations is another one, you know, first call… were they able to handle the call the first time, or did they have to have it escalated? I think that would be another one.
115 00:13:54.580 ⇒ 00:13:55.250 Pranav Narahari: Okay.
116 00:13:55.390 ⇒ 00:14:03.480 Pranav Narahari: That’s great. Now, that information is currently in Evolve, or is it in a different software?
117 00:14:03.810 ⇒ 00:14:05.799 YvetteRuiz: It’s the… for the…
118 00:14:06.230 ⇒ 00:14:18.899 YvetteRuiz: The escalation piece, that one’s a little bit tricky, and that’s the one that I’ve been really trying to work through, because everybody has different escalation processes, how they work, and this has kind of been a hot topic that we’ve been talking about. So…
119 00:14:19.580 ⇒ 00:14:34.969 YvetteRuiz: when a call has to get escalated, I’m working on putting a tag in Evolve, which we already have, note tags and so forth, so then that way we can follow through, but this is where I feel…
120 00:14:35.670 ⇒ 00:14:51.510 YvetteRuiz: the speech analytics piece of it in 8x8, the key phrases would really, really be helpful if we can start building that. And maybe that’s something we can partner up with, because again, that was supposed to be a real-time analyst that was going to help us do that, but I think if we can
121 00:14:51.960 ⇒ 00:15:04.699 YvetteRuiz: tag that through 8x8, and have those key phrases, I think that’ll be easy for you guys to be able to pull it, now that you guys have access to, the transcript portion of it.
122 00:15:05.540 ⇒ 00:15:11.730 Pranav Narahari: Okay, just so I understand this correctly, like, the key phrase would define whether this is, like, an escalation or not?
123 00:15:11.730 ⇒ 00:15:16.309 YvetteRuiz: Yeah, so that’s kind of what we were toying around with, right? So, like, either we use…
124 00:15:16.460 ⇒ 00:15:30.090 YvetteRuiz: disposition codes, because we could go that way, and… because we started, then we stopped, because the teams were inconsistent, right? So we can disposition the call that way, and we can pull that way, or we could tag in key phrases
125 00:15:30.530 ⇒ 00:15:40.449 YvetteRuiz: You know, so then that way, when Avoca is listening to it, they’ll be able to categorize, okay, you can run reports based off of that, or you can, you know, tag the transcripts that way.
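The key-phrase approach Yvette describes, flagging a call as an escalation when certain phrases appear in the transcript, could be sketched as a simple matcher like the one below. The phrase list is purely illustrative; the real list would be built in 8x8's speech analytics (or agreed with the teams), not invented here.

```python
# Sketch: tag a transcript as an escalation when any key phrase appears.
# The phrase list is an illustrative assumption, not a real 8x8 config.
ESCALATION_PHRASES = [
    "speak to a manager",
    "transfer you to",
    "escalate this",
]


def tag_escalation(transcript_text: str) -> bool:
    """True if any escalation key phrase appears (case-insensitive)."""
    text = transcript_text.lower()
    return any(phrase in text for phrase in ESCALATION_PHRASES)
```

Run over pulled transcripts, this yields the same kind of tag as a disposition code, without depending on CSRs dispositioning calls consistently.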
126 00:15:41.500 ⇒ 00:15:43.380 Pranav Narahari: Okay, yeah, I mean…
127 00:15:44.040 ⇒ 00:15:52.729 Pranav Narahari: Yeah, that’s one missing piece, I think, that we have right… that we don’t have right now, so if we can pull that information from Evolve, then we can definitely then
128 00:15:53.050 ⇒ 00:15:57.769 Pranav Narahari: pull that information into the real dashboard, where you guys are already seeing the usage analytics.
129 00:15:58.540 ⇒ 00:16:01.199 Pranav Narahari: Yeah, we can map everything together, so…
130 00:16:01.200 ⇒ 00:16:02.000 YvetteRuiz: Okay.
131 00:16:02.120 ⇒ 00:16:03.410 Pranav Narahari: That sounds great, yeah.
132 00:16:05.310 ⇒ 00:16:08.849 YvetteRuiz: Okay, so, sorry, David just got back to me right now. Okay. Perfect.
133 00:16:09.220 ⇒ 00:16:11.210 YvetteRuiz: Did he give me the transcript?
134 00:16:13.330 ⇒ 00:16:14.999 YvetteRuiz: Nope, he gave me the call.
135 00:16:23.010 ⇒ 00:16:23.930 YvetteRuiz: Alright.
136 00:16:26.080 ⇒ 00:16:29.760 Pranav Narahari: Let me also just ask Sam right now if we already have a transcript ID handy.
137 00:16:54.190 ⇒ 00:16:56.020 YvetteRuiz: Okay, so…
138 00:16:59.610 ⇒ 00:17:04.890 YvetteRuiz: Let’s see… Darn it! I knew I closed it.
139 00:17:05.130 ⇒ 00:17:11.040 YvetteRuiz: So what would be the best way to… the trans ID, how can I get that? Let me chat it through here.
140 00:17:11.500 ⇒ 00:17:12.640 Pranav Narahari: Yeah, that’s perfect.
141 00:17:17.470 ⇒ 00:17:18.220 Pranav Narahari: Okay.
142 00:17:21.020 ⇒ 00:17:23.239 YvetteRuiz: I’m trying to get you another one, sorry.
143 00:17:25.630 ⇒ 00:17:28.169 Pranav Narahari: That’s fine. I… honestly, even just one is fine.
144 00:17:28.170 ⇒ 00:17:28.780 YvetteRuiz: Okay.
145 00:17:29.150 ⇒ 00:17:29.730 Pranav Narahari: Yeah.
146 00:17:34.650 ⇒ 00:17:37.569 Pranav Narahari: I just want to make sure, given, you know, the…
147 00:17:37.770 ⇒ 00:17:40.640 Pranav Narahari: API that 8x8 has with this.
148 00:17:40.640 ⇒ 00:17:41.180 YvetteRuiz: Indeed.
149 00:17:41.180 ⇒ 00:17:43.139 Pranav Narahari: Then we can pull in all the…
150 00:17:43.610 ⇒ 00:17:46.139 Pranav Narahari: All the transcript information, any other…
151 00:17:46.380 ⇒ 00:17:51.110 Pranav Narahari: you know, tagging information, that’s also a part of what 8x8 provides.
152 00:17:51.490 ⇒ 00:17:51.920 YvetteRuiz: Yeah.
153 00:17:51.920 ⇒ 00:17:53.489 Pranav Narahari: I just want to make sure we can get that.
154 00:17:53.870 ⇒ 00:18:02.029 Pranav Narahari: So, I will report back to you on if we need anything more than just transcript ID for each of the cancellation calls.
155 00:18:02.540 ⇒ 00:18:05.670 Pranav Narahari: Yeah, you can expect that from me later today.
156 00:18:05.990 ⇒ 00:18:15.530 YvetteRuiz: Okay, so that’s what I was going to actually be working on, so that’ll be helpful, if that’s exactly what you’re needing to get the information that we’re trying to gather.
157 00:18:15.880 ⇒ 00:18:16.410 Pranav Narahari: Yup.
158 00:18:18.180 ⇒ 00:18:18.850 YvetteRuiz: Okay.
159 00:18:22.960 ⇒ 00:18:23.569 Pranav Narahari: Thank you.
160 00:18:26.350 ⇒ 00:18:26.950 YvetteRuiz: Okay.
161 00:18:30.140 ⇒ 00:18:31.290 YvetteRuiz: So…
162 00:18:37.850 ⇒ 00:18:41.049 YvetteRuiz: I’m sorry, I’m still… I just lost my train of thought here.
163 00:18:41.050 ⇒ 00:18:41.880 Pranav Narahari: No problem.
164 00:18:41.880 ⇒ 00:18:45.449 YvetteRuiz: Alright, so on the transcript side of it, I’m sorry,
165 00:18:46.010 ⇒ 00:18:49.249 YvetteRuiz: You’ll get that back to me, but where,
166 00:18:49.710 ⇒ 00:18:52.250 YvetteRuiz: Where… overall, where are we at? Just…
167 00:18:52.250 ⇒ 00:18:52.600 Pranav Narahari: Yeah.
168 00:18:52.600 ⇒ 00:18:54.389 YvetteRuiz: In general, with the transcripts.
169 00:18:54.750 ⇒ 00:19:10.340 Pranav Narahari: Yeah, yeah. So, in general, which is, like, the project, we’re gonna be wrapping up the Central Doc copilot this week. So, we have, like, an internal, like, deadline for us to do, kind of, our stress testing on Wednesday.
170 00:19:10.430 ⇒ 00:19:25.239 Pranav Narahari: And so, yeah, what you kind of missed on Friday, I can kind of give you a recap, was, we have the placement automation set up in the Linear tickets. And so maybe it’s actually… let me just give you a quick demo of that.
171 00:19:34.660 ⇒ 00:19:40.320 Pranav Narahari: Let me pull it up real quick, one second.
172 00:19:40.820 ⇒ 00:19:41.330 YvetteRuiz: Yep.
173 00:20:18.670 ⇒ 00:20:19.570 YvetteRuiz: Hmm.
174 00:20:19.860 ⇒ 00:20:24.709 Pranav Narahari: Yeah, so… this is kind of a test triage ticket that we put in place.
175 00:20:26.310 ⇒ 00:20:42.290 Pranav Narahari: the idea here is that whenever it gets assigned to a trainer, so basically, up until this point, Janiece will assign it to a trainer to figure out, what should be automated into Central Doc. That’s the current workflow.
176 00:20:42.360 ⇒ 00:20:56.349 Pranav Narahari: The problem with that, of course, is just sometimes we’re adding duplicate information, conflicting information, and sometimes there’s outdated information that, as part of this triage ticket, should be removed, but doesn’t get removed.
177 00:20:57.110 ⇒ 00:21:05.259 Pranav Narahari: there’s a few things there, right? We don’t want the ticket to be… or we don’t want the central doc to be getting too large as well, which is kind of…
178 00:21:05.800 ⇒ 00:21:12.950 Pranav Narahari: Causes us to then have a whole effort of, like, refactoring and re-organizing the central doc.
179 00:21:13.120 ⇒ 00:21:15.750 Pranav Narahari: Okay. Yeah, but basically here.
180 00:21:16.150 ⇒ 00:21:25.189 Pranav Narahari: We came up with just, like, a super simple form. Even this is gonna be, like, updated a little bit. We’re not gonna have the trainers provide a change type, but…
181 00:21:25.190 ⇒ 00:21:25.570 YvetteRuiz: that…
182 00:21:25.570 ⇒ 00:21:43.590 Pranav Narahari: the three pieces of information that they’re gonna give is the target, which is just, you know, which department are you targeting, why it should be updated, and then the exact word-for-word answer that they were expecting to see. And so, just providing that information.
183 00:21:43.700 ⇒ 00:21:46.690 Pranav Narahari: So… If I just go down here…
184 00:21:50.780 ⇒ 00:21:52.460 Pranav Narahari: And then I run that.
185 00:21:52.940 ⇒ 00:21:57.489 Pranav Narahari: Basically what happens is our automation then kicks off.
186 00:21:58.100 ⇒ 00:22:17.720 Pranav Narahari: to start thinking, and then after a few minutes, or sometimes a few seconds, you’ll get a full recommendation about where it should be added, what content’s gonna be added as well, and then, yeah, a description of why this, why this placement makes sense to us.
187 00:22:18.470 ⇒ 00:22:24.480 YvetteRuiz: So that is automatic, so once they submit that, they fill that out, that’s gonna come right back to them.
188 00:22:24.600 ⇒ 00:22:25.379 YvetteRuiz: Is that what I’m doing?
189 00:22:25.710 ⇒ 00:22:26.400 YvetteRuiz: I’m sorry.
190 00:22:26.400 ⇒ 00:22:33.040 Pranav Narahari: Yep, exactly. So I just pasted this, right? And so the automation just automatically gave this information here.
191 00:22:34.360 ⇒ 00:22:39.530 YvetteRuiz: Perfect. Okay, so then they would have the answer, I mean, They would have their…
192 00:22:39.680 ⇒ 00:22:43.449 YvetteRuiz: Their answer to whatever needed to be done, or it would be done
193 00:22:44.590 ⇒ 00:22:48.689 YvetteRuiz: it would be updated. I’m sorry, I’m not… I want to make sure that I’m following.
194 00:22:48.690 ⇒ 00:22:54.189 Pranav Narahari: Yeah, yeah, so the idea here is that all the trainers need to do is
195 00:22:54.310 ⇒ 00:23:01.590 Pranav Narahari: submit these, like, answers to these specific fields, and then we… we handle the rest with automations.
196 00:23:01.980 ⇒ 00:23:02.410 YvetteRuiz: Got it.
197 00:23:02.410 ⇒ 00:23:17.220 Pranav Narahari: So then, what they can expect is by… if, you know, they completed their triage, like this form right here, this comment, essentially, in the linear ticket, then by the next day, those… that change should be made in the central doc.
198 00:23:17.220 ⇒ 00:23:18.670 YvetteRuiz: Perfect, okay, alrighty.
199 00:23:18.670 ⇒ 00:23:19.220 Pranav Narahari: Yeah.
200 00:23:19.510 ⇒ 00:23:20.120 YvetteRuiz: Okay.
201 00:23:20.530 ⇒ 00:23:21.460 YvetteRuiz: That’s excellent.
202 00:23:21.460 ⇒ 00:23:23.480 Pranav Narahari: That sounds good, right? I think this is a
203 00:23:23.890 ⇒ 00:23:32.620 Pranav Narahari: better solution. Also, what this means for you and Janiece is that you guys will be able to see all of
204 00:23:32.780 ⇒ 00:23:38.200 Pranav Narahari: And we’ll have another automation here for checking if it’s conflicting or duplicate information.
205 00:23:38.640 ⇒ 00:23:51.129 Pranav Narahari: the final say will be for you and Janiece every single day to assess: is this a good addition to the central doc, or is it an incorrect, you know, wrong decision to add this to the central doc?
206 00:23:51.460 ⇒ 00:23:55.309 YvetteRuiz: That’d be the approved… the approval level?
207 00:23:55.790 ⇒ 00:23:56.730 Pranav Narahari: Yes, exactly.
208 00:23:56.730 ⇒ 00:23:57.280 YvetteRuiz: with them.
209 00:23:57.840 ⇒ 00:23:58.400 Pranav Narahari: Yep.
210 00:23:59.020 ⇒ 00:24:04.210 Pranav Narahari: So, yeah, this approve will only, like, function for you and Janiece. Gotcha.
211 00:24:04.210 ⇒ 00:24:04.750 YvetteRuiz: Okay.
212 00:24:04.970 ⇒ 00:24:05.580 Pranav Narahari: Yeah.
213 00:24:06.060 ⇒ 00:24:06.660 Pranav Narahari: And so…
214 00:24:06.660 ⇒ 00:24:07.269 YvetteRuiz: I think with the.
215 00:24:07.270 ⇒ 00:24:09.549 Pranav Narahari: automations. It’s pretty smooth, right?
216 00:24:09.770 ⇒ 00:24:13.219 YvetteRuiz: Yeah, very, very, very cool. So when is this going into effect?
217 00:24:14.040 ⇒ 00:24:25.830 Pranav Narahari: This will be, in effect, the plan is for this Friday, we’ll take in the first batch, or… well, this Friday we’ll demo it, and then on Monday, we’ll take in the first batch. All the automations will be in place.
218 00:24:25.940 ⇒ 00:24:28.909 Pranav Narahari: And… Yeah, we should be good to go there.
219 00:24:29.260 ⇒ 00:24:30.340 YvetteRuiz: Okay, okay.
220 00:24:30.340 ⇒ 00:24:30.930 Pranav Narahari: Yeah.
221 00:24:30.930 ⇒ 00:24:36.019 YvetteRuiz: So, how is everything else… how is everything
222 00:24:36.530 ⇒ 00:24:53.579 YvetteRuiz: progressing? I know Janiece was mentioning some stuff that you were in a meeting with the trainers, and how is all that going? Are we… are we making progress with the updates and kind of just some of the, the things that we’re looking to correct in the central doc with accuracy?
223 00:24:54.460 ⇒ 00:25:05.149 Pranav Narahari: Yes, I feel like what we’re doing, kind of, for the Central Doc project is all about how do we have a good system in place to make these updates.
224 00:25:05.280 ⇒ 00:25:06.360 Pranav Narahari: I think…
225 00:25:06.620 ⇒ 00:25:14.270 Pranav Narahari: we had a good way of getting feedback, which was with Andy, it’s very simple, you know, thumbs down with an explanation. However.
226 00:25:14.470 ⇒ 00:25:29.779 Pranav Narahari: things would get lost in Linear. You know, we were seeing tickets that are all the way back from, like, December. And so now with this new process, things really should not be delayed more than 24 hours. And if they are delayed more than 24 hours, then that’s gonna be…
227 00:25:29.940 ⇒ 00:25:33.819 Pranav Narahari: That’s gonna be brought up to you and Janiece, a report.
228 00:25:33.860 ⇒ 00:25:34.940 YvetteRuiz: Okay.
229 00:25:34.940 ⇒ 00:25:52.330 Pranav Narahari: And so, every single day, we’ll be giving you guys a memo or a report of saying, hey, these are the things that have been updated in the Central Doc, these are the things that were rejected in terms of adding to the central doc, and these are things that are still pending yours and Janiece’s review. So…
230 00:25:52.440 ⇒ 00:26:00.990 Pranav Narahari: the idea there is that every single triage ticket is not being lost; it’s gonna be visible.
231 00:26:01.170 ⇒ 00:26:06.120 Pranav Narahari: And this is also a good memo for you guys to give to the individual trainers as well, because
232 00:26:06.390 ⇒ 00:26:20.090 Pranav Narahari: they can now confidently be like, okay, me… this change is now in the central doc. Because I think before, part of the ambiguity was, hey, I’m… added this information… I added this feedback into the central doc… or to the triage system.
233 00:26:20.630 ⇒ 00:26:24.319 Pranav Narahari: Has it been added yet? Has it been taken care of yet?
234 00:26:24.800 ⇒ 00:26:26.040 Pranav Narahari: It’s,
235 00:26:27.090 ⇒ 00:26:32.460 Pranav Narahari: I think the whole thing we’re trying to do right now is to bring back trust, right? So…
236 00:26:32.770 ⇒ 00:26:36.850 Pranav Narahari: And I think we have already done a lot of that, because I’m seeing with the usage, like.
237 00:26:36.850 ⇒ 00:26:37.190 YvetteRuiz: Yeah.
238 00:26:37.190 ⇒ 00:26:44.520 Pranav Narahari: Since I first joined, the usage is up 150… conversation requests per week.
239 00:26:44.960 ⇒ 00:26:45.680 Pranav Narahari: So…
240 00:26:45.680 ⇒ 00:26:46.520 YvetteRuiz: Wow. Yeah.
241 00:26:46.520 ⇒ 00:26:47.799 Pranav Narahari: Yeah, things are already…
242 00:26:47.800 ⇒ 00:26:48.370 YvetteRuiz: So many…
243 00:26:48.370 ⇒ 00:26:48.810 Pranav Narahari: Yep.
244 00:26:48.810 ⇒ 00:26:58.549 YvetteRuiz: Of that, I’m so sorry, so of that, because that kind of goes into… I did a… I have, like, 30 pages worth of notes, because I was kind of… I took some time just to kind of really
245 00:26:58.810 ⇒ 00:27:06.300 YvetteRuiz: Really think on everything that, like, all the progress that has been made since, you know, you started helping us, but then, okay.
246 00:27:06.420 ⇒ 00:27:16.199 YvetteRuiz: more to developing the scorecard, right? So, on the thumbs up and the thumbs down, what is that… that percentage? You know what I mean? Like, do you have that data?
247 00:27:16.650 ⇒ 00:27:24.579 Pranav Narahari: Yes, let me, share my screen again, and I want to make sure you guys can honestly access this information whenever, because…
248 00:27:25.840 ⇒ 00:27:30.150 Pranav Narahari: This… it’s very… it’s very eye-opening, and also it gives,
249 00:27:30.890 ⇒ 00:27:34.569 Pranav Narahari: gives more context for just, like, how we’re doing, right? So…
250 00:27:35.180 ⇒ 00:27:44.969 Pranav Narahari: Here, I can add an additional column here for, let’s see… Thumbs up count… thumbs…
251 00:27:46.320 ⇒ 00:27:49.870 Pranav Narahari: I think thumbs down rate makes the most sense, actually.
252 00:27:56.890 ⇒ 00:28:03.220 Pranav Narahari: Yeah, so what we’re seeing here is that…
253 00:28:03.500 ⇒ 00:28:05.539 Pranav Narahari: Thumbs up rate is kind of variable.
254 00:28:05.780 ⇒ 00:28:11.030 Pranav Narahari: thumbs down rate, I should say. It’s kind of variable, week over week.
255 00:28:12.140 ⇒ 00:28:16.230 Pranav Narahari: So this is something that we want to continuously see go down.
256 00:28:16.620 ⇒ 00:28:17.080 YvetteRuiz: Yeah.
257 00:28:17.080 ⇒ 00:28:28.179 Pranav Narahari: However, it’s… definitely, we want to see this go down, but also, it’s going to be something that we want to caveat with seeing usage increase, right?
258 00:28:28.480 ⇒ 00:28:38.539 Pranav Narahari: because as usage increases, we may notice that they’re asking more questions that are out of the scope of what the central doc has, and so…
259 00:28:38.640 ⇒ 00:28:45.300 Pranav Narahari: What’s gonna be more important is getting unique feedback on questions that were already asked before and aren’t getting
260 00:28:45.840 ⇒ 00:28:51.270 Pranav Narahari: updated, and are still incorrect, like, week over week, or day over day. Yeah.
261 00:28:51.730 ⇒ 00:28:54.800 Pranav Narahari: Yeah, so… What we’ll also.
262 00:28:54.800 ⇒ 00:28:55.260 YvetteRuiz: Notice, too.
263 00:28:55.260 ⇒ 00:29:02.110 Pranav Narahari: We should probably, like… let’s also check the… the number of thumbs down… thumbs down count.
264 00:29:02.570 ⇒ 00:29:10.670 Pranav Narahari: Because for some of these, like, smaller usage departments.
265 00:29:11.260 ⇒ 00:29:14.250 Pranav Narahari: If it’s really just, like, this is gonna vary a lot, right?
266 00:29:14.670 ⇒ 00:29:22.090 Pranav Narahari: 2, 6, 6, 8, 9 even, like, it’s not a ton. But yeah, this.
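The thumbs-down rate metric being added to the dashboard here can be sketched in a few lines. This is a minimal illustration only: the per-department feedback events, field names, and sample numbers are hypothetical, not the real dashboard’s schema.

```python
from collections import defaultdict

def thumbs_down_rate(events):
    """events: iterable of (department, feedback) pairs, where feedback
    is 'up' or 'down'. Returns {department: thumbs-down share of all
    rated requests}, mirroring the per-department rate on the dashboard."""
    counts = defaultdict(lambda: [0, 0])  # dept -> [down count, total rated]
    for dept, feedback in events:
        counts[dept][1] += 1
        if feedback == "down":
            counts[dept][0] += 1
    return {dept: down / total for dept, (down, total) in counts.items()}

# Illustrative sample: small departments show why the raw rate is noisy.
sample = [("billing", "up"), ("billing", "down"), ("billing", "up"),
          ("dispatch", "down"), ("dispatch", "down")]
rates = thumbs_down_rate(sample)
```

With counts this small (the "2, 6, 6, 8, 9" Pranav mentions), a single extra thumbs-down swings the rate a lot, which is why he pairs the rate with the raw count.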
267 00:29:22.090 ⇒ 00:29:23.070 YvetteRuiz: But I think you hit a…
268 00:29:23.070 ⇒ 00:29:24.140 Pranav Narahari: Something… yeah.
269 00:29:24.140 ⇒ 00:29:37.559 YvetteRuiz: Yeah, because I think you had… you had exactly where I was going with that. So, like, out of the thumbs down, is it content that is missing, or what are those… you know what I mean? Like, where… what are those thumbs down? Or is it, like you said, is it…
270 00:29:38.040 ⇒ 00:29:43.409 YvetteRuiz: misinformation, I got one response here, and then it gave me a different… what is that?
271 00:29:43.750 ⇒ 00:30:01.019 Pranav Narahari: Right, yeah. So, that’s information that we get from, like, the triage system that we’re not currently showing in the real dashboard. So, this is actually… I was talking with Casey about this last week, and we’re working on a system… well, as part of this new Central.copilot.
272 00:30:01.020 ⇒ 00:30:11.569 Pranav Narahari: We’re going to be updating the real dashboard also with, hey, with some of the stuff that’s going into triage, is it on the central doc? Is it on, you know.
273 00:30:11.900 ⇒ 00:30:22.919 Pranav Narahari: Is it on the ZipDB? Is it on, you know, maybe the wording of the question wasn’t good? Because we were seeing, I think, sometimes with our, like, weekly categorization, it was like.
274 00:30:23.540 ⇒ 00:30:26.249 Pranav Narahari: No human could even answer that question, right?
275 00:30:26.250 ⇒ 00:30:26.820 YvetteRuiz: It was very.
276 00:30:26.820 ⇒ 00:30:27.170 Pranav Narahari: Yeah.
277 00:30:27.170 ⇒ 00:30:42.000 YvetteRuiz: And I know that’s come up before, because it’s like, okay, we gotta… we gotta do a better job training, but this is where this data would be very valuable to us. It’s like, okay, where that thumbs down, okay, what does that mean to me? So I’m getting all these thumbs down, but where is it? Is it because.
278 00:30:42.000 ⇒ 00:30:42.370 Pranav Narahari: Yeah.
279 00:30:42.370 ⇒ 00:30:46.129 YvetteRuiz: it’s not updated on there? Is it because they’re giving us just very…
280 00:30:46.300 ⇒ 00:30:53.679 YvetteRuiz: I would love to see that, because then that’s going to really start being very telling for us, the areas that are… we need to start executing on.
281 00:30:53.680 ⇒ 00:31:06.389 Pranav Narahari: Definitely. Yeah, you bring up a good point, like, in that system, that information’s already being captured by us from the triage tickets, right? Janiece is already routing all these different tickets to different,
282 00:31:06.520 ⇒ 00:31:21.109 Pranav Narahari: trainers, or she’s sometimes even deleting some of them based on being just poor feedback. Maybe it didn’t deserve a thumbs down. And so that information is gonna be super valuable in the real dashboard. I can see… I can see why.
283 00:31:21.250 ⇒ 00:31:22.650 Pranav Narahari: Yeah, so…
284 00:31:23.070 ⇒ 00:31:28.189 Pranav Narahari: We’ll, as part of this new triage system we’re putting in place, we’ll make sure that we implement that as well.
285 00:31:29.270 ⇒ 00:31:29.930 YvetteRuiz: Okay.
286 00:31:30.280 ⇒ 00:31:31.000 Pranav Narahari: Yeah.
287 00:31:31.560 ⇒ 00:31:36.529 YvetteRuiz: Yeah, because I think that’d be critical for accuracy, you know, just every… all the way around.
288 00:31:36.900 ⇒ 00:31:42.379 Pranav Narahari: Definitely, yeah, I think kind of the overarching theme of just what we’re trying to do for these next,
289 00:31:43.170 ⇒ 00:31:51.530 Pranav Narahari: just next two… two months, up until, like, end of May, is how can we just give you guys more tailored reports, right? Instead of just saying.
290 00:31:51.890 ⇒ 00:32:05.530 Pranav Narahari: hey, this department, you guys haven’t been using Andy, we can be giving you more information about why they aren’t using it, so then… it’s just more detailed feedback, right? It’s even difficult for the trainers to just hear, hey, you guys aren’t
291 00:32:05.640 ⇒ 00:32:10.659 Pranav Narahari: your department isn’t using Andy that often. They’re like, oh.
292 00:32:11.150 ⇒ 00:32:21.869 Pranav Narahari: Can you give me a reason, maybe, like, what you’re noticing in terms of, like, the CSR usage? Like, I think even a lot of this information, even though they’re closer to the CSRs, they’re probably not seeing all this, right?
293 00:32:22.560 ⇒ 00:32:29.230 Pranav Narahari: individual, like, user requests to Andy, they probably aren’t looking through every single conversation.
294 00:32:29.490 ⇒ 00:32:42.790 YvetteRuiz: No, no, no, they’re not. I mean, I know that they’re not, so that’s why I want to step in to be able to provide them that, because, you know, like I’ve mentioned before, I come in at a higher level, and this is what I told Janiece, is like, I want to know the answers to
295 00:32:42.890 ⇒ 00:32:59.700 YvetteRuiz: why, you know what I mean? Like, one, I want… I don’t want it to be just a tool, right? I need it to be valuable. It needs to bring value to us, right? That’s the reason we’re investing in it, right? So, if I get a lot of thumbs down, I want to know what those thumbs downs are, so then that way, okay.
296 00:33:00.090 ⇒ 00:33:10.879 YvetteRuiz: is it information that’s… everything that we talked about, but I need to be able to provide that to them, so that that data’s, like, very important.
297 00:33:11.060 ⇒ 00:33:13.450 Pranav Narahari: Okay, great. So currently…
298 00:33:13.630 ⇒ 00:33:26.880 Pranav Narahari: Currently, what is the… the data that you bring forward to them? When you try to, like, have these conversations with Janiece or with, like, the trainers, what is that conversation usually like? Is it just… just talking about usage, or is it anything else?
299 00:33:27.260 ⇒ 00:33:30.329 YvetteRuiz: So, what I do is, I always kind of… I always…
300 00:33:30.740 ⇒ 00:33:46.470 YvetteRuiz: I always get with them and just kind of say, okay, what was the goal, right? What was the end? What’s the end goal for us right here with Andy, right? And obviously, it’s supposed to be a smart assistant tool that’s supposed to get us quick answers instead of scrolling through all these documents.
301 00:33:47.140 ⇒ 00:34:05.980 YvetteRuiz: our onboarding, when we bring in new hires, which we’re bringing in a couple new hires, like, I shouldn’t have to go through a 7-week period, I should be able to condense that a lot, you know, shorter, and then my average handle time, my experiences with the customers are going to be better, right? So, looking at all that, and knowing that that’s what Andy’s supposed to bring to the table.
302 00:34:06.110 ⇒ 00:34:10.050 YvetteRuiz: Are we using it, right? Okay, great, are we using it, right?
303 00:34:10.489 ⇒ 00:34:28.319 YvetteRuiz: But what value is that? Are we getting the answers that we’re needing? And if we’re not getting the answers that we’re needing, what answers aren’t we getting? Is it because it’s lacking in the system, or is it because Brainforge hasn’t updated, or is it totally broken? Like, that’s what I’m really trying to do, because at the end of the day, if
304 00:34:28.320 ⇒ 00:34:45.379 YvetteRuiz: I’m expected to be on a phone call with the customer to provide them great experience. I want to be able just to flip through that very quickly, get a quick answer, and be able to handle that phone call without… and try to make it a one-call resolution, right? So, that’s the… that’s the conversations that I have with them.
305 00:34:45.540 ⇒ 00:34:48.189 YvetteRuiz: when I’m talking to them, they’re just like, okay, well.
306 00:34:48.460 ⇒ 00:35:07.559 YvetteRuiz: is there ways to get the questions that they’re asking, you know what I mean? So then that way we can see what they’re asking. Is there a way for us to know… is there a way to train them, hey, you can’t ask just one question, you need to ask more of a question? So those are the conversations that I really do have with them to really, kind of, have them tell me the story, but also thinking, like.
307 00:35:07.670 ⇒ 00:35:11.789 YvetteRuiz: Why are we using Andy? What’s the end goal for Andy, right?
308 00:35:11.790 ⇒ 00:35:14.070 Pranav Narahari: Yes, yes, yes. And…
309 00:35:14.890 ⇒ 00:35:19.350 Pranav Narahari: I guess I’m curious, and I’m having more of these conversations with trainers too, so I’ll ask the same question, but…
310 00:35:19.930 ⇒ 00:35:33.120 Pranav Narahari: what are the… what is the kind of responses that you’re getting from them? Are they just saying… are they giving you information about, like, hey, these are the… the conversations that we’re having with Andy? Or are they giving you information about, like.
311 00:35:34.870 ⇒ 00:35:40.470 Pranav Narahari: hey, these are the questions that Andy isn’t answering specifically.
312 00:35:41.240 ⇒ 00:35:59.070 YvetteRuiz: So, a lot, like, right now here lately, the ones… the feedback that I have been getting, like, specifically the last conversation, and I shared this with you, was with Patricia. You know, Patricia says, like, everything that I’m getting, it’s being provided correctly, you know what I mean? I’m getting those answers, and they’re getting correct.
313 00:35:59.070 ⇒ 00:36:23.150 YvetteRuiz: they’re being answered correctly. She’s having trouble with her trainer, you know, really, because in her response, she’s saying that it’s not delivering the answers that they’re supposed to, and that’s what the tech… that’s what the employees are saying. And so, my question to Patricia is, I need more information. My question to Janiece is, I need more questions. So, if you and Patricia are telling me that Andy is providing us.
314 00:36:23.150 ⇒ 00:36:32.270 YvetteRuiz: the information that we need, but yet the response is that their trainer, who’s supposed to be pushing that, and their employees are saying that it’s not, I need that, but…
315 00:36:32.720 ⇒ 00:36:49.520 YvetteRuiz: that’s just asking them to give me that information, where I can say, okay, Pranav, can you give me what thumbs down, what usage all their agents are using? Because if they come back and say, oh, well, China has all these complaints, I’m like, well, China’s not used it at all. So what complaints?
316 00:36:49.520 ⇒ 00:36:50.070 Pranav Narahari: Essentially.
317 00:36:50.070 ⇒ 00:37:05.769 YvetteRuiz: Right? Okay, so that’s why I need the usage per employee, right? And that’s the way they’re going to have to start asking questions, right? Like, if I have all my employees right now, and I see no one’s using it, my question is, why?
318 00:37:05.770 ⇒ 00:37:12.139 YvetteRuiz: You know what I mean? They’re on the phone every day. I mean, here’s their call volume, here’s how many times they’re using it, here’s their hold time.
319 00:37:12.680 ⇒ 00:37:33.939 YvetteRuiz: Okay, where’s those… what’s the gaps there? How do we make them more efficient, right? And then, of course, if they come back saying, if I go to China or whoever it is, and I’m asking, okay, why aren’t you using it? Well, it just gives me the wrong answers, okay? Specifically, what? But I have more data, more tangible data, it’s just like any scorecard, right? Like, I can go to them and say, okay, you have… your compliance is not there.
320 00:37:34.070 ⇒ 00:37:34.990 YvetteRuiz: Well…
321 00:37:35.320 ⇒ 00:37:43.490 YvetteRuiz: Yes, it is. Well, they can’t tell me no, because I have their compliance there, right? I know exactly where they fell off compliance. Does that make sense? Yes, yeah.
322 00:37:43.490 ⇒ 00:37:53.500 Pranav Narahari: a ton of sense. I think what a quick win is gonna be is, first off, I’m gonna take all this information, and then I’m going to…
323 00:37:53.740 ⇒ 00:38:12.279 Pranav Narahari: kind of develop a scorecard-type dashboard for you, because right now, with our usage dashboard, all you can see is the people that are using it. But I think what’ll also be useful is just to see every single CSR, right? All the CSRs that you guys have, and then just see, okay, what’s the number of calls they’ve taken?
324 00:38:12.440 ⇒ 00:38:19.119 Pranav Narahari: And then, what… how many requests are they giving to Andy? And just on a per-week basis, right?
325 00:38:19.540 ⇒ 00:38:26.939 Pranav Narahari: that is then gonna give you a lot more information of, like, hey, like, you guys had this many calls, but then used Andy this…
326 00:38:27.250 ⇒ 00:38:29.720 Pranav Narahari: Didn’t use Andy at all.
327 00:38:30.130 ⇒ 00:38:40.700 Pranav Narahari: why is that? If you can ask that specific question, then I think you’re gonna be able to get more information, or at least have the trainers dive a little bit deeper with their CSRs, right?
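The scorecard described here, listing every CSR with their weekly call count next to their Andy request count, could be sketched as follows. This is a proof-of-concept under stated assumptions: the input dictionaries stand in for 8x8 call exports and Andy usage logs, and all names and numbers are illustrative, not the real data model.

```python
def build_scorecard(calls, requests):
    """calls, requests: {csr_name: weekly count}. Every CSR in `calls`
    appears in the output, with 0 Andy requests if they never used it,
    so non-usage is visible instead of silently dropped."""
    rows = []
    for csr, n_calls in sorted(calls.items()):
        n_req = requests.get(csr, 0)  # CSRs absent from usage logs get 0
        rows.append({
            "csr": csr,
            "calls": n_calls,
            "andy_requests": n_req,
            "requests_per_call": n_req / n_calls if n_calls else 0.0,
        })
    return rows

# Hypothetical week: Bob took calls but never queried Andy.
calls = {"Alice": 120, "Bob": 95}
requests = {"Alice": 30}
scorecard = build_scorecard(calls, requests)
```

The key design choice is iterating over the call roster rather than the usage logs, which is exactly the gap Pranav points out in the current usage dashboard: it only shows people who are already using Andy.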
328 00:38:40.700 ⇒ 00:38:41.260 YvetteRuiz: Yeah.
329 00:38:41.480 ⇒ 00:38:41.830 Pranav Narahari: Yeah.
330 00:38:41.830 ⇒ 00:38:56.840 YvetteRuiz: Then I… I love that, Pranav, because I think that we have… well, I don’t think we do have the data to all that, you know, even if we wanted to build that, meaning a scorecard, right? Amount of phone calls taken, here was your hold time, here was your average handle time.
331 00:38:56.840 ⇒ 00:38:57.160 Pranav Narahari: Right.
332 00:38:57.160 ⇒ 00:39:01.680 YvetteRuiz: Right? And kind of really… and then look at your Andy usage in there and say, okay.
333 00:39:01.990 ⇒ 00:39:03.819 YvetteRuiz: Tell me the story here.
334 00:39:04.120 ⇒ 00:39:05.700 Pranav Narahari: Right. Yeah, yeah.
335 00:39:05.890 ⇒ 00:39:11.469 Pranav Narahari: Yeah, because some of these are flags, right? These are definitely flags where you should…
336 00:39:12.620 ⇒ 00:39:31.159 Pranav Narahari: And if you’re… if you’re noticing Andy’s not being used, then it has… there has to be a reason why. And then, with that reason why, we can improve, potentially, right? If there’s a reason of, like, hey, I keep on getting asked these questions, Andy’s just always, you know, taking way too long to respond, or not giving the right response, okay, that’s good data for us.
337 00:39:32.340 ⇒ 00:39:32.990 Pranav Narahari: Yeah.
338 00:39:32.990 ⇒ 00:39:33.550 YvetteRuiz: Yep.
339 00:39:34.050 ⇒ 00:39:40.789 Pranav Narahari: Okay, awesome. So, what we’ll need to do for that type of scorecard is we’re gonna need to get started on that transcript work.
340 00:39:41.040 ⇒ 00:39:53.309 Pranav Narahari: Yeah, so getting some of those transcript IDs is gonna be helpful. We’re gonna be wrapping up the Central.copilot this week. Next week is kind of our first, like, heads down, like.
341 00:39:53.470 ⇒ 00:40:00.439 Pranav Narahari: you know, having Casey, Sam, all working on the transcripts effort is gonna start next week.
342 00:40:01.000 ⇒ 00:40:06.750 Pranav Narahari: That, I’ll make sure, is baked in. Like, what we just… what we just talked about right now. I’ll make sure, like, a dashboard
343 00:40:06.870 ⇒ 00:40:07.770 Pranav Narahari: gets…
344 00:40:08.460 ⇒ 00:40:26.319 Pranav Narahari: gets put up for you guys early on in the process, because it’s actually pretty simple. Like, this doesn’t need to necessarily have an analysis of the transcript, which will be happening later on in May. Earlier on, what we can do is just have a dashboard: number of calls they’ve had, and Andy usage.
345 00:40:26.320 ⇒ 00:40:30.660 Pranav Narahari: And then, from that, 8x8… I think you said 8x8 was also…
346 00:40:30.870 ⇒ 00:40:35.239 Pranav Narahari: Did you say 8x8, or did you say Evolve has the average hold time, and…
347 00:40:35.240 ⇒ 00:40:38.749 YvetteRuiz: No, 8x8 has the average… 8x8 has all that.
348 00:40:38.750 ⇒ 00:40:42.260 Pranav Narahari: Perfect, yeah, so then we can pull in all that information from 8x8 and then show it in real.
349 00:40:42.640 ⇒ 00:40:58.509 YvetteRuiz: Yeah, because… and I think that’s the direction to go, because I think once we establish, okay, your… the agents, what are we comparing it to? And exactly like we said, then your transcripts are going to be the next layer to it, right? Now I’m going to go in there and I’m going to really deep dive and kind of compare, you know.
350 00:40:59.220 ⇒ 00:41:18.989 YvetteRuiz: what are you saying on those phone calls, right? Like, did you even use Andy for those phone calls? And what I’m gonna do when I get off the phone right now, I’m gonna… I’m gonna go pick an agent, just the one agent that I have in mind, just because I got an escalation from them the other day, a couple of them, where their hold time was… they kept putting the customer on hold.
351 00:41:19.100 ⇒ 00:41:30.169 YvetteRuiz: back and forth, and part of that’s kind of like, I’m listening to it, so I know where they should have gone to go get the questions and the answers, but what was that agent doing in the middle of that 10-minute phone call?
352 00:41:30.330 ⇒ 00:41:31.870 Pranav Narahari: Yeah, yeah.
353 00:41:32.080 ⇒ 00:41:40.269 Pranav Narahari: And I guess you can ask that question more confidently if you know, like, hey, Andy could have answered this question, Andy could have helped you a lot with this, right?
354 00:41:40.740 ⇒ 00:41:55.729 YvetteRuiz: And so could my trainers, and that’s why I want to get it… I want to be able to provide this data. You know, my leaders should be able to have the answers, holding their trainers accountable, like, they should have the same mindset that I have, like, okay, if we’re building Andy, just like I talked to you about over here.
355 00:41:55.730 ⇒ 00:42:15.649 YvetteRuiz: I’m going to ask my trainers to drive the results, but that means that we all have to know what we’re working towards, but the data is the other layer that really helps us, because then when I’m having my one-on-ones or my check-ins and saying, okay, you had, you know, here, here, here, let’s talk through this, right? Because it’s not a beat-up session, this is about how do we make you better?
356 00:42:15.650 ⇒ 00:42:20.320 Pranav Narahari: Yes, yes, and it makes their lives easier too, right? That’s the end goal. Yeah.
357 00:42:20.320 ⇒ 00:42:20.960 YvetteRuiz: Yeah.
358 00:42:20.960 ⇒ 00:42:24.839 Pranav Narahari: Yeah. Okay, that sounds great.
359 00:42:26.200 ⇒ 00:42:43.930 Pranav Narahari: Yeah, I think this is a really good direction, because next week is when we’re going to really dive into transcripts, right? So I’ll be able to kind of do some of this upfront work, and I’ll, I’ll give you guys a scorecard template, and just kind of, like, a proof of concept of what it looks like on Friday during our call.
360 00:42:43.930 ⇒ 00:42:44.600 YvetteRuiz: Okay.
361 00:42:44.740 ⇒ 00:42:51.369 Pranav Narahari: And then we can make any refinements based on that, and then go, like, heads down, start building that out starting Monday next week.
362 00:42:52.000 ⇒ 00:42:53.899 YvetteRuiz: Okay, perfect. Sounds good.
363 00:42:54.440 ⇒ 00:42:55.010 Pranav Narahari: Cool.
364 00:42:55.320 ⇒ 00:43:01.260 Pranav Narahari: I’m trying to think if there’s anything else from Friday’s meeting that we should bring up.
365 00:43:04.930 ⇒ 00:43:11.950 Pranav Narahari: I don’t think so. I think we covered, kind of, the main thing, which is, like, some of this automation stuff that we’ve been working on with Central.copilot, so…
366 00:43:11.950 ⇒ 00:43:12.280 YvetteRuiz: Okay.
367 00:43:12.280 ⇒ 00:43:18.590 Pranav Narahari: Yeah, and then just next week, hopefully we can meet up in Austin or San Antonio, so… I’ll.
368 00:43:18.590 ⇒ 00:43:27.079 YvetteRuiz: Yeah, no, if you want to go ahead and send the invite for Tuesday, I’ll plan to be there on Tuesday. Like I said, after 10, I’ll be available.
369 00:43:27.300 ⇒ 00:43:29.599 Pranav Narahari: Cool, and is that for Austin or San Antonio?
370 00:43:30.320 ⇒ 00:43:32.999 YvetteRuiz: No, for Austin, I can go… I’ll drive out to Austin.
371 00:43:33.000 ⇒ 00:43:39.180 Pranav Narahari: Okay, perfect. Okay, let me also… so that works for me, and then let me just see if Janiece can make that as well.
372 00:43:39.710 ⇒ 00:43:40.520 YvetteRuiz: Okay.
373 00:43:40.930 ⇒ 00:43:42.700 YvetteRuiz: Perfect. Sounds good!
374 00:43:43.460 ⇒ 00:43:44.609 Pranav Narahari: Sounds awesome. Okay, cool.
375 00:43:44.610 ⇒ 00:44:01.780 YvetteRuiz: All right. Well, I appreciate your time. Thank you so much. That whole… the Linear ticket, that’s gonna be excellent. I think that’s a great move right there. And then if we can start building this… these scorecards, and just having more tangible data, you know, I think it’s gonna… that’s gonna be another layer to really help,
376 00:44:02.130 ⇒ 00:44:03.700 YvetteRuiz: drive results.
377 00:44:03.700 ⇒ 00:44:05.710 Pranav Narahari: Definitely, definitely, I totally agree.
378 00:44:06.480 ⇒ 00:44:08.920 YvetteRuiz: Alrighty. Well, have a good one. Okay.
379 00:44:08.920 ⇒ 00:44:11.010 Pranav Narahari: You too. Bye. Have a good one. Bye.