Meeting Title: ABC Home Dashboard Cleanup
Date: 2025-03-28
Meeting participants: Uttam Kumaran, Amber Lin, Miguel De Veyra, Casie Aviles
WEBVTT
1 00:00:25.880 ⇒ 00:00:26.900 Miguel de Veyra: Hey! Amber.
2 00:00:27.650 ⇒ 00:00:35.010 Amber Lin: Hello! Good morning. I was looking at our dashboard, and I realized the execs are gonna be
3 00:00:35.280 ⇒ 00:00:44.917 Amber Lin: really pissed to see that we have a 40-50% error rate, as of
4 00:00:45.790 ⇒ 00:00:49.300 Miguel de Veyra: Yeah. But that’s not really on us, because it’s their data
5 00:00:52.750 ⇒ 00:00:53.320 Miguel de Veyra: Yes.
6 00:00:53.320 ⇒ 00:01:06.170 Amber Lin: I think what they’re gonna question is, how do you measure accuracy? Because I was going through the exact one. So I first excluded these tests, which I think will make it a little better. So we have
7 00:01:06.170 ⇒ 00:01:10.030 Casie Aviles: There’s a lot of testing data, which adds noise to it.
8 00:01:10.750 ⇒ 00:01:16.860 Miguel de Veyra: Yeah, we should also probably remove me and Casie; we shouldn’t be there
9 00:01:17.550 ⇒ 00:01:21.519 Amber Lin: I see. You mean the meeting, or this dashboard?
10 00:01:21.520 ⇒ 00:01:24.320 Miguel de Veyra: Like in this dashboard, because we have, like a lot of tests
11 00:01:24.690 ⇒ 00:01:25.010 Amber Lin: Okay.
12 00:01:25.010 ⇒ 00:01:27.019 Miguel de Veyra: Probably best to remove me and Casie
13 00:01:27.363 ⇒ 00:01:31.490 Amber Lin: Casie, can you look at that, just to filter it out?
14 00:01:31.780 ⇒ 00:01:35.210 Casie Aviles: I think we can click on the filter on the left
15 00:01:39.630 ⇒ 00:01:43.919 Casie Aviles: Wait. I guess you can try to clear it first, or... oh, wait. Yeah.
16 00:01:44.560 ⇒ 00:01:45.820 Miguel de Veyra: Username. Yeah
17 00:01:45.820 ⇒ 00:01:50.169 Casie Aviles: Yes, username, and then just click everyone else from ABC. So
18 00:01:51.700 ⇒ 00:01:52.180 Miguel de Veyra: Janine
19 00:01:52.180 ⇒ 00:01:52.590 Amber Lin: Bye.
20 00:01:52.590 ⇒ 00:01:54.419 Miguel de Veyra: Remove Casie, remove Casie.
21 00:01:54.420 ⇒ 00:01:55.660 Amber Lin: Negligs.
22 00:01:55.660 ⇒ 00:01:58.209 Miguel de Veyra: Uncheck... check only the ABC people
23 00:01:58.210 ⇒ 00:01:59.250 Amber Lin: Excludes.
24 00:01:59.360 ⇒ 00:02:02.990 Amber Lin: Yeah, I was gonna exclude that. It’s service account.
25 00:02:05.160 ⇒ 00:02:05.770 Amber Lin: Okay?
26 00:02:05.770 ⇒ 00:02:06.719 Casie Aviles: Think that’s it.
27 00:02:07.890 ⇒ 00:02:09.970 Miguel de Veyra: I think this is the other way around
28 00:02:10.449 ⇒ 00:02:12.779 Amber Lin: No, I clicked exclude, so
29 00:02:12.780 ⇒ 00:02:13.680 Miguel de Veyra: Okay. Okay.
30 00:02:13.680 ⇒ 00:02:16.180 Amber Lin: These are not there. Okay.
31 00:02:16.180 ⇒ 00:02:21.289 Casie Aviles: Same story, but it’s a much higher average quality score, and
32 00:02:21.680 ⇒ 00:02:26.640 Amber Lin: Much lower error rate. So it’s 30%. The thing is that
33 00:02:26.750 ⇒ 00:02:47.009 Amber Lin: recently their tests have been at a 60% error rate. So I’m a little scared, because they’re gonna look at this, yes, but then they’re gonna go on the dashboard and see that the quality score is declining, and that this has gone from 20% to 60%. They’re not gonna be very happy about that.
34 00:02:48.420 ⇒ 00:02:49.000 Casie Aviles: Yeah, that.
35 00:02:49.000 ⇒ 00:02:49.410 Miguel de Veyra: So.
36 00:02:49.410 ⇒ 00:02:50.330 Casie Aviles: That’s true.
37 00:02:50.970 ⇒ 00:02:57.959 Miguel de Veyra: Yeah. But the thing is, the way we generate the quality score and the error rate isn’t really the best way, either.
38 00:02:59.470 ⇒ 00:03:02.489 Miguel de Veyra: Like, we just use AI, basically
39 00:03:03.270 ⇒ 00:03:04.200 Amber Lin: I see.
40 00:03:04.200 ⇒ 00:03:06.270 Miguel de Veyra: And compare it to our golden data.
41 00:03:09.200 ⇒ 00:03:12.720 Miguel de Veyra: Right, and then not everything is on the golden data. It’s basically just the FAQ
42 00:03:12.720 ⇒ 00:03:21.210 Amber Lin: Yeah, let me go look at the output. Okay? So they don’t... they don’t know.
43 00:03:24.090 ⇒ 00:03:25.460 Amber Lin: That’s your mom.
44 00:03:25.630 ⇒ 00:03:31.459 Amber Lin: Yeah. Like, most... I think we should show them that this,
45 00:03:32.310 ⇒ 00:03:36.970 Amber Lin: for most of them, is because we don’t have the data.
46 00:03:37.260 ⇒ 00:03:41.669 Amber Lin: Like, if we show them that... How do I make this column bigger?
47 00:03:42.580 ⇒ 00:03:43.390 Amber Lin: This column
48 00:03:43.390 ⇒ 00:03:44.290 Casie Aviles: I’m not sure
49 00:03:44.900 ⇒ 00:03:48.530 Amber Lin: That’s okay. Let’s say.
50 00:03:48.530 ⇒ 00:03:53.849 Miguel de Veyra: I think you can go to ABC logs, because the majority of the stuff is there anyway, like the one in Slack
51 00:03:54.810 ⇒ 00:03:55.540 Amber Lin: Hmm.
52 00:03:55.860 ⇒ 00:04:06.450 Amber Lin: yeah, yeah, I just wanna see, for those that gave us an error rate, so the 100% here, what type of answers they gave. So this is...
53 00:04:07.030 ⇒ 00:04:16.190 Amber Lin: there’s no inspectors listed, there’s no access to pricing, no estimates, also pricing.
54 00:04:18.500 ⇒ 00:04:26.720 Amber Lin: I don’t know that probably we needed adjustment, okay, like in-house inspectors.
55 00:04:27.050 ⇒ 00:04:32.330 Amber Lin: Okay, assist customers don’t have information.
56 00:04:34.750 ⇒ 00:04:39.009 Amber Lin: Yeah, how do we explain this? So they know it’s not our fault.
57 00:04:39.510 ⇒ 00:04:44.760 Miguel de Veyra: It’s not in the central doc, or the central doc is not clear enough
58 00:04:44.760 ⇒ 00:04:51.960 Amber Lin: Okay. So we, I think we need to explain how we measure accuracy and then point out it’s their fault
59 00:04:54.070 ⇒ 00:04:55.280 Casie Aviles: Okay. Then
60 00:04:55.280 ⇒ 00:04:59.010 Miguel de Veyra: So I think the thing we can do is that
61 00:05:00.445 ⇒ 00:05:12.430 Miguel de Veyra: we can tell them that. You know, we’re still testing out the best way to get the accuracy score right? Because right now, the golden data sheet is still, basically, we’re still the ones who fill it up
62 00:05:13.170 ⇒ 00:05:13.970 Miguel de Veyra: right
63 00:05:14.790 ⇒ 00:05:18.920 Miguel de Veyra: And then, yeah, I don’t know how to do it. To be honest.
64 00:05:19.460 ⇒ 00:05:26.450 Amber Lin: Okay, so let’s think about this. Our error rates, we bench...
65 00:05:27.600 ⇒ 00:05:31.250 Amber Lin: Okay, we have... bot’s
66 00:05:31.570 ⇒ 00:05:35.590 Amber Lin: questions, bot answers.
67 00:05:37.090 ⇒ 00:05:38.819 Amber Lin: We come here.
68 00:05:40.030 ⇒ 00:05:43.930 Amber Lin: Let me just screenshot.
69 00:06:11.340 ⇒ 00:06:24.419 Amber Lin: And I don’t want them to have to filter out our data. Would it be possible to just delete it from the data source? Casie, can we just delete that
70 00:06:25.686 ⇒ 00:06:29.750 Casie Aviles: Our tests, I mean, yeah, I could filter it out
71 00:06:30.310 ⇒ 00:06:30.850 Amber Lin: Yes.
72 00:06:30.850 ⇒ 00:06:32.000 Amber Lin: So we don’t have to click.
73 00:06:32.710 ⇒ 00:06:35.519 Amber Lin: Yeah, yeah, just so that they don’t have to click, because
74 00:06:35.770 ⇒ 00:06:45.720 Amber Lin: I don’t think they know how to do this. ...Bot answers, we compare answers to...
75 00:06:48.340 ⇒ 00:06:52.919 Amber Lin: If... bot answers to...
76 00:07:02.660 ⇒ 00:07:07.420 Amber Lin: what... I think I’ll tell them what counts as an error
77 00:07:10.140 ⇒ 00:07:13.060 Amber Lin: we don’t have. Let’s do that
78 00:07:16.200 ⇒ 00:07:19.310 Amber Lin: at you. Don’t!
79 00:07:21.790 ⇒ 00:07:23.820 Amber Lin: Does match
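
[Editor’s note: the source-side exclusion Casie agrees to earlier in this exchange would amount to a WHERE clause in the dashboard’s Snowflake query, so the execs never have to click the username filter themselves. A minimal sketch only; the table and column names (bot_logs, username) and the specific account values are assumptions, not confirmed in the meeting.]

```sql
-- Hypothetical sketch: drop internal test traffic at the data source.
-- Table, column, and account names are assumed for illustration.
SELECT *
FROM bot_logs
WHERE username NOT IN ('miguel', 'casie', 'service_account')  -- internal testers and the service account
  AND username NOT ILIKE '%test%';                            -- catch any stray test users
```
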
80 00:07:28.710 ⇒ 00:07:36.840 Miguel de Veyra: Aren’t we gonna get blamed here, Amber? Like, well, we’ve had a couple of weeks to work with the data.
81 00:07:36.990 ⇒ 00:07:38.850 Miguel de Veyra: Why are you still not done?
82 00:07:39.500 ⇒ 00:07:40.690 Amber Lin: Oh.
83 00:07:40.870 ⇒ 00:07:54.380 Amber Lin: I mean, yeah, I think I’ll emphasize that we don’t have the data. I’ll say that, like, 60... 65% of errors...
84 00:07:58.637 ⇒ 00:08:02.170 Amber Lin: Especially with pricing
85 00:08:02.740 ⇒ 00:08:04.500 Miguel de Veyra: Pricing and discounts. Yeah.
86 00:08:18.730 ⇒ 00:08:33.160 Amber Lin: Or even, let’s say, if the question that’s asked is not on the golden data sheet, how is that going to be... like, how do we measure the accuracy of that? If we don’t have a model answer, is it just automatically wrong?
87 00:08:33.651 ⇒ 00:08:43.860 Miguel de Veyra: No, no. If it’s not on the golden sheet, like, there’s no expected answer for it, what we do now is, okay, look at the structure. Is it more than 3 to 4 sentences?
88 00:08:44.059 ⇒ 00:08:44.860 Miguel de Veyra: Right? Isn’t
89 00:08:45.406 ⇒ 00:08:47.590 Casie Aviles: Yeah. Is it readable?
90 00:08:47.870 ⇒ 00:08:48.620 Miguel de Veyra: Is it readable.
91 00:08:48.620 ⇒ 00:08:50.170 Casie Aviles: Address, northwest, Shayra
92 00:08:50.750 ⇒ 00:08:52.419 Miguel de Veyra: Is there an oh, by the way.
93 00:08:55.960 ⇒ 00:09:08.709 Miguel de Veyra: So it’s more now... I guess, Casie, that’s what we should have done in the prompt. If there’s no expected answer, the AI should look at the structure instead, instead of, you know,
94 00:09:09.080 ⇒ 00:09:19.600 Miguel de Veyra: marking it on its own. Because basically, if it’s ‘I’m sorry,’ it shouldn’t be a bad quality score. I would say it should just be 0,
95 00:09:19.970 ⇒ 00:09:22.670 Miguel de Veyra: or, you know, it shouldn’t be graded at all.
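
[Editor’s note: the grading rule Miguel describes here — don’t auto-fail a response that has no golden expected answer; fall back to a structural check or leave it ungraded — could be expressed in the scoring query roughly as follows. A sketch only; the column names and the sentence-count threshold are assumptions.]

```sql
-- Hypothetical sketch of the grading rule discussed here:
-- responses with no golden answer are left ungraded, not marked wrong.
SELECT
  question,
  output,
  CASE
    WHEN golden_answer IS NULL THEN NULL  -- no expected answer: leave ungraded
    WHEN output = golden_answer THEN 0    -- matches the golden sheet: not an error
    ELSE 1                                -- mismatch against a known answer
  END AS is_error,
  -- fallback structural check for the ungraded ones: 3-4 sentences or fewer, per Miguel
  REGEXP_COUNT(output, '[.!?]') <= 4 AS structure_ok
FROM bot_responses;
```
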
96 00:09:23.480 ⇒ 00:09:29.229 Amber Lin: Yeah, I want. How do I look at the answers like, how do I filter
97 00:09:30.880 ⇒ 00:09:32.090 Casie Aviles: Sorry. I don’t know
98 00:09:32.720 ⇒ 00:09:33.850 Amber Lin: How do I?
99 00:09:34.100 ⇒ 00:09:34.800 Miguel de Veyra: Get back.
100 00:09:35.110 ⇒ 00:09:35.730 Amber Lin: Okay.
101 00:09:38.410 ⇒ 00:09:40.059 Casie Aviles: What did you want to do? Sorry, Amber.
102 00:09:40.446 ⇒ 00:09:48.230 Amber Lin: I think I wanted to like. Look at this table and see what’s most decent per se.
103 00:09:48.860 ⇒ 00:09:50.820 Amber Lin: Oh, what happened here?
104 00:09:51.550 ⇒ 00:09:54.000 Amber Lin: Oh, I did something interesting.
105 00:09:54.440 ⇒ 00:09:55.849 Amber Lin: Oops! Yes.
106 00:09:59.700 ⇒ 00:10:06.260 Amber Lin: Which filter did I... last 24 hours?
107 00:10:06.990 ⇒ 00:10:10.480 Amber Lin: Or last 7 days?
108 00:10:11.090 ⇒ 00:10:12.610 Amber Lin: Oh, gosh!
109 00:10:13.240 ⇒ 00:10:17.270 Amber Lin: 62%. Let’s see what.
110 00:10:18.160 ⇒ 00:10:27.060 Amber Lin: Let’s see what we did to get 62% go ahead
111 00:10:27.770 ⇒ 00:10:29.550 Miguel de Veyra: Why is it so high? What did we do?
112 00:10:29.839 ⇒ 00:10:33.319 Amber Lin: This is. This is this week. This is this week. So I’m
113 00:10:33.675 ⇒ 00:10:34.030 Miguel de Veyra: Okay.
114 00:10:34.030 ⇒ 00:10:36.359 Amber Lin: What made it rise so high?
115 00:10:38.130 ⇒ 00:10:46.569 Miguel de Veyra: Yeah. Like, for example, that one, Casie, the ‘I can help with that. Could you please...’ Sorry, can you go back? This shouldn’t be tagged as an error, because it’s very exploratory.
116 00:10:47.560 ⇒ 00:10:52.700 Miguel de Veyra: Right? It’s basically asking for more information. So this is automatically marked wrong.
117 00:10:53.390 ⇒ 00:10:55.750 Amber Lin: Okay. So let me
118 00:11:00.240 ⇒ 00:11:04.109 Amber Lin: like, yeah, that should not be an error.
119 00:11:05.220 ⇒ 00:11:06.089 Amber Lin: I’ve got ache
120 00:11:06.090 ⇒ 00:11:09.219 Miguel de Veyra: Cause, yeah, cause, Amber, I think we also need
121 00:11:09.350 ⇒ 00:11:18.810 Miguel de Veyra: to mention to them, or not, I’m not sure how we should approach this. But we basically took me out of dev, right? And we’re limiting the hours drastically.
122 00:11:19.920 ⇒ 00:11:24.499 Miguel de Veyra: So of course, there’s gonna be like, you know, we’re not gonna be as fast as we were before.
123 00:11:24.500 ⇒ 00:11:26.690 Amber Lin: Yeah, I don’t think I don’t think
124 00:11:27.480 ⇒ 00:11:30.170 Amber Lin: I think Uttam should talk about that, because
125 00:11:30.170 ⇒ 00:11:30.950 Miguel de Veyra: Yeah, yeah.
126 00:11:30.950 ⇒ 00:11:34.009 Amber Lin: Brochure, and it’s more business way
127 00:11:34.710 ⇒ 00:11:35.740 Miguel de Veyra: Yeah, yeah.
128 00:11:35.740 ⇒ 00:11:37.100 Casie Aviles: Yeah, because
129 00:11:37.100 ⇒ 00:11:43.150 Miguel de Veyra: I’d hate it if they say, like, ‘you guys had a week to figure this out.’ Well, not really, we had like a day
130 00:11:45.600 ⇒ 00:11:53.159 Casie Aviles: And yeah, I mean to be honest, a lot of the data that we had was just missing. And I had to think of a way to
131 00:11:53.160 ⇒ 00:11:53.980 Amber Lin: Yeah.
132 00:11:53.980 ⇒ 00:11:59.380 Casie Aviles: You know, backfill the data, since we didn’t really start measuring this until just this.
133 00:11:59.380 ⇒ 00:11:59.960 Casie Aviles: So we
134 00:12:00.800 ⇒ 00:12:01.540 Miguel de Veyra: Right.
135 00:12:02.100 ⇒ 00:12:11.840 Amber Lin: Yeah. So, like, this shouldn’t be... they should know that this is not a final product. But they will want it to be. That’s the problem, I think
136 00:12:13.850 ⇒ 00:12:15.859 Casie Aviles: Yeah, honestly, I’m
137 00:12:16.210 ⇒ 00:12:22.019 Casie Aviles: I mean, I’d have to go and clean the data again, and maybe redo the scores, if we want this to be better
138 00:12:22.680 ⇒ 00:12:34.360 Amber Lin: You see, I think next week Annie will be here. So as long as we pass this meeting, you won’t have to do the dashboard anymore. Annie knows dashboards; she can do the data stuff.
139 00:12:34.920 ⇒ 00:12:38.599 Amber Lin: So I think we just need to survive this meeting without them being
140 00:12:39.030 ⇒ 00:12:41.409 Amber Lin: annoyed. And then we’ll be fine
141 00:12:42.060 ⇒ 00:12:48.170 Miguel de Veyra: I think we just explain, like, Amber, for this meeting, we just explain to them that, hey, you know, we’re still working on this
142 00:12:48.270 ⇒ 00:12:49.210 Miguel de Veyra: on the
143 00:12:49.670 ⇒ 00:12:54.859 Miguel de Veyra: I think Uttam will understand, cause he’s the one who basically took the majority of the development time out.
144 00:12:55.050 ⇒ 00:13:00.330 Miguel de Veyra: I think he’ll understand that, hey? You know as long as we show them like, what type of questions there are
145 00:13:00.956 ⇒ 00:13:08.429 Miguel de Veyra: that are throwing an error, and we just explain: hey, our main goal was to get it to work. Now we’re just gonna optimize it
146 00:13:09.220 ⇒ 00:13:10.030 Amber Lin: Oh no!
147 00:13:10.430 ⇒ 00:13:11.200 Miguel de Veyra: Right.
148 00:13:11.370 ⇒ 00:13:18.850 Amber Lin: Okay. That’s a really good point. Let me say here: dashboard structure built
149 00:13:20.160 ⇒ 00:13:20.810 Miguel de Veyra: Yeah.
150 00:13:24.600 ⇒ 00:13:27.119 Miguel de Veyra: Because we need to clean the data too, like no.
151 00:13:27.120 ⇒ 00:13:28.400 Amber Lin: Next step.
152 00:13:29.560 ⇒ 00:13:30.350 Casie Aviles: Yeah.
153 00:13:30.610 ⇒ 00:13:33.359 Amber Lin: Data, so much better.
154 00:13:33.590 ⇒ 00:13:36.079 Miguel de Veyra: Yes, now, Scott, don’t say anything like
155 00:13:36.570 ⇒ 00:13:38.979 Casie Aviles: Because the data is really noisy. And yeah.
156 00:13:39.140 ⇒ 00:13:41.459 Amber Lin: Yeah, I’m sorry.
157 00:13:42.170 ⇒ 00:13:43.050 Amber Lin: Yeah.
158 00:13:43.570 ⇒ 00:13:53.759 Amber Lin: like, why, okay, why do some of the ‘I’m sorry’ have a 100% error rate, and some of the ‘I’m sorry’ a 0% error rate?
159 00:13:55.470 ⇒ 00:13:59.680 Casie Aviles: Yeah, that’s the thing like the data is really noisy right now. And
160 00:13:59.680 ⇒ 00:14:00.310 Amber Lin: Yeah, I think.
161 00:14:00.310 ⇒ 00:14:00.909 Miguel de Veyra: I mean, it’s not
162 00:14:02.270 ⇒ 00:14:11.099 Miguel de Veyra: because, ideally... yeah, ideally, we don’t wanna grade it, like, you know... I think I tested some of it,
163 00:14:11.400 ⇒ 00:14:20.640 Miguel de Veyra: or like the prompt, like, if it’s not, you know, if you don’t have access to that data. Cause, remember, Amber, an ‘I’m sorry’ shouldn’t even
164 00:14:20.920 ⇒ 00:14:21.690 Amber Lin: Yeah.
165 00:14:21.690 ⇒ 00:14:28.529 Miguel de Veyra: be wrong; it should be escalated. So technically, the quality score should be good. It’s not accuracy that we’re measuring
166 00:14:28.530 ⇒ 00:14:29.599 Amber Lin: Yeah.
167 00:14:30.670 ⇒ 00:14:33.769 Casie Aviles: Do you want to filter those out? The ‘I’m sorry’ ones
168 00:14:34.040 ⇒ 00:14:39.579 Amber Lin: Yeah, look at that a lot better
169 00:14:40.430 ⇒ 00:14:42.723 Casie Aviles: Yeah. I guess the ‘I’m sorry’ ones, sure.
170 00:14:43.010 ⇒ 00:14:47.099 Amber Lin: Yeah, okay, I’m gonna do some filtering.
171 00:14:48.350 ⇒ 00:14:52.090 Amber Lin: Yes, excluded. The ‘I’m sorry’ is excluded.
172 00:14:52.210 ⇒ 00:14:59.850 Amber Lin: So let’s see what we still got wrong on. So that’s the
173 00:15:00.220 ⇒ 00:15:07.670 Amber Lin: like, essentially that cheat sheet where she she is. Okay, that’s like that makes sense like senior
174 00:15:07.670 ⇒ 00:15:10.829 Miguel de Veyra: Even this, ‘Could you please specify,’ this shouldn’t be an error
175 00:15:11.440 ⇒ 00:15:15.909 Amber Lin: Hmm, yeah, let’s let’s note that down of things that need
176 00:15:16.080 ⇒ 00:15:28.239 Amber Lin: okay, things that need... I’m not gonna put this on a slide. I’m just taking notes. ‘Please specify’ should not be an error.
177 00:15:28.640 ⇒ 00:15:36.389 Miguel de Veyra: Yeah, like the bot asking exploratory questions, or asking to clarify, should not be considered an error
178 00:15:36.780 ⇒ 00:15:41.980 Amber Lin: Which right now it is, of course, because, you know, if it doesn’t answer, it’s very one-to-one right now.
179 00:15:41.980 ⇒ 00:15:47.800 Amber Lin: Yeah. Yeah. If things that they should not okay. And also
180 00:15:47.800 ⇒ 00:15:51.760 Miguel de Veyra: Maybe... do I need to be on the call, Amber? I don’t think so, right? Because I have to work on recruitment stuff
181 00:15:51.950 ⇒ 00:15:52.919 Amber Lin: What did you say?
182 00:15:53.090 ⇒ 00:15:55.499 Miguel de Veyra: Do I have to be on the ABC call? I don’t believe so.
183 00:15:55.500 ⇒ 00:16:04.090 Amber Lin: No, you don’t have to. I think I just wanted to check this with you guys. What’s the other thing that should not be an error?
184 00:16:06.630 ⇒ 00:16:09.230 Amber Lin: Oh! And error is not accuracy right?
185 00:16:09.640 ⇒ 00:16:11.030 Miguel de Veyra: Yes, yes, yes.
186 00:16:12.270 ⇒ 00:16:17.820 Miguel de Veyra: And then the escalations, like ‘I’m sorry, I don’t know,’ should not, of course, be an error.
187 00:16:24.720 ⇒ 00:16:30.010 Miguel de Veyra: and then I don’t know. Let’s explore more. I think this is gonna be our meeting anyways, for the Daily
188 00:16:30.010 ⇒ 00:16:34.588 Amber Lin: Oh, yeah, okay, great. So we can remove that meeting. Great, fantastic.
189 00:16:35.650 ⇒ 00:16:36.230 Amber Lin: Huh!
190 00:16:36.360 ⇒ 00:16:37.000 Amber Lin: Let’s go
191 00:16:37.000 ⇒ 00:16:38.399 Miguel de Veyra: Well, yeah, let’s checkmark.
192 00:16:38.400 ⇒ 00:16:40.829 Amber Lin: Include some more of that. Okay.
193 00:16:40.830 ⇒ 00:16:41.570 Miguel de Veyra: Let’s checkmark.
194 00:16:41.570 ⇒ 00:16:42.690 Amber Lin: Fewer rates.
195 00:16:43.750 ⇒ 00:16:46.100 Miguel de Veyra: I think this is input, not the output
196 00:16:46.100 ⇒ 00:16:48.040 Amber Lin: Yeah, I just want what kind of
197 00:16:48.040 ⇒ 00:16:49.730 Casie Aviles: Hey? Guys, sorry
198 00:16:49.890 ⇒ 00:16:54.880 Amber Lin: Like, what is the cheat sheet? Like, what do you mean, what is a cheat sheet? Like, I don’t know
199 00:16:54.880 ⇒ 00:17:03.779 Miguel de Veyra: No, I think it’s fine. We probably shouldn’t filter this. It’s more like the output. Sorry, Casie, you were saying something
200 00:17:04.843 ⇒ 00:17:07.269 Casie Aviles: I have a 1-on-1 with Uttam
201 00:17:07.480 ⇒ 00:17:09.820 Miguel de Veyra: Go ahead! Go ahead!
202 00:17:10.229 ⇒ 00:17:11.239 Amber Lin: I’m just trying
203 00:17:11.240 ⇒ 00:17:11.670 Casie Aviles: Bye.
204 00:17:11.670 ⇒ 00:17:13.100 Amber Lin: to make this number prettier.
205 00:17:13.410 ⇒ 00:17:15.210 Miguel de Veyra: Feel free to come back if he doesn’t come.
206 00:17:15.210 ⇒ 00:17:17.789 Miguel de Veyra: Okay, he’s not gonna go, I know.
207 00:17:20.740 ⇒ 00:17:21.260 Miguel de Veyra: Can you?
208 00:17:21.260 ⇒ 00:17:24.710 Miguel de Veyra: Can you open that question? Sorry I want to check our mosquito suppression
209 00:17:25.884 ⇒ 00:17:26.679 Amber Lin: Marcia
210 00:17:26.680 ⇒ 00:17:29.450 Miguel de Veyra: Yeah, that one cause I think this is correct. This has to be correct.
211 00:17:29.450 ⇒ 00:17:33.910 Amber Lin: Yeah, like, mosquito, regular suppression.
212 00:17:33.910 ⇒ 00:17:34.979 Miguel de Veyra: This is correct.
213 00:17:35.190 ⇒ 00:17:47.850 Amber Lin: Like, maybe that’s because... I think Janice mentioned we asked about mosquitoes for something else, and it answered about suppression. It didn’t answer the right thing.
214 00:17:48.100 ⇒ 00:17:48.910 Amber Lin: That’s what
215 00:17:48.910 ⇒ 00:17:49.590 Miguel de Veyra: Oh, God!
216 00:17:49.590 ⇒ 00:17:52.540 Amber Lin: I think that’s why it was wrong
217 00:17:52.540 ⇒ 00:17:56.130 Miguel de Veyra: And then this one we could exclude: ‘I can help with that’
218 00:17:56.510 ⇒ 00:17:58.620 Amber Lin: ‘Can you please specify,’ exclude.
219 00:17:58.870 ⇒ 00:17:59.960 Miguel de Veyra: Oh, okay.
220 00:17:59.960 ⇒ 00:18:01.160 Amber Lin: We shot that.
221 00:18:03.570 ⇒ 00:18:09.560 Amber Lin: Gosh! That was too slow. I want to. Where was it?
222 00:18:11.580 ⇒ 00:18:12.749 Miguel de Veyra: There you go!
223 00:18:15.770 ⇒ 00:18:17.459 Miguel de Veyra: Come on over! Did you get it?
224 00:18:17.460 ⇒ 00:18:18.660 Amber Lin: Oh, I didn’t!
225 00:18:20.260 ⇒ 00:18:21.700 Amber Lin: I’m too slow.
226 00:18:23.650 ⇒ 00:18:24.240 Amber Lin: Oh.
227 00:18:24.240 ⇒ 00:18:25.889 Miguel de Veyra: I can take a screenshot of it
228 00:18:27.690 ⇒ 00:18:28.839 Amber Lin: I got it. Yes.
229 00:18:28.840 ⇒ 00:18:29.639 Miguel de Veyra: Okay. Okay. Nice.
230 00:18:30.180 ⇒ 00:18:35.119 Amber Lin: Okay, well, so like. Well, we don’t have to show them. I just want us to
231 00:18:35.280 ⇒ 00:18:37.310 Amber Lin: have some evidence
232 00:18:37.310 ⇒ 00:18:38.830 Miguel de Veyra: Yeah, yeah, some notes, basically
233 00:18:38.830 ⇒ 00:18:41.370 Amber Lin: So take that out. Take that out
234 00:18:41.860 ⇒ 00:18:45.709 Amber Lin: a mosquito. Okay? I understand the rainy.
235 00:18:49.030 ⇒ 00:18:53.650 Amber Lin: Okay, yeah, like, it doesn’t
236 00:18:53.920 ⇒ 00:18:58.069 Miguel de Veyra: Yeah, that one shall not be included anyway. Sorry, can you open that again?
237 00:18:58.070 ⇒ 00:18:58.440 Amber Lin: Yeah.
238 00:18:58.440 ⇒ 00:19:02.810 Miguel de Veyra: I wanna see? Like the last part, yeah.
239 00:19:03.390 ⇒ 00:19:05.319 Miguel de Veyra: Oh, yeah, okay.
240 00:19:07.680 ⇒ 00:19:13.629 Miguel de Veyra: Actually, can you send me this link, so I can also check? I think it’s better if there’s 2 of us
241 00:19:13.988 ⇒ 00:19:17.930 Amber Lin: How do I send you this link with all these
242 00:19:17.930 ⇒ 00:19:21.139 Miguel de Veyra: Oh, sorry, you’re in Rill. I can just go in there
243 00:19:21.140 ⇒ 00:19:21.740 Amber Lin: Yeah.
244 00:19:26.240 ⇒ 00:19:34.450 Miguel de Veyra: Right because I I’m backlogged on recruitment a lot. So
245 00:19:34.890 ⇒ 00:19:36.130 Miguel de Veyra: What’s the name of this?
246 00:19:38.416 ⇒ 00:19:39.720 Miguel de Veyra: Said Log. Metric dashboard.
247 00:19:39.720 ⇒ 00:19:40.530 Miguel de Veyra: Okay, okay.
248 00:19:40.840 ⇒ 00:19:41.680 Amber Lin: Yeah.
249 00:19:41.810 ⇒ 00:19:44.870 Amber Lin: I don’t know why these things are wrong per se.
250 00:19:47.630 ⇒ 00:19:48.900 Miguel de Veyra: Yeah.
251 00:19:59.080 ⇒ 00:20:02.740 Miguel de Veyra: yeah. Even the pharaoh ants... it’s supposed to be covered, right?
252 00:20:03.150 ⇒ 00:20:03.730 Amber Lin: Hello!
253 00:20:04.210 ⇒ 00:20:07.529 Miguel de Veyra: Pharaoh, like, under Standard.
254 00:20:07.950 ⇒ 00:20:16.119 Miguel de Veyra: Wait, wait, let me check, because I don’t think there’s anything about pharaoh ants right now under documentation.
255 00:20:17.640 ⇒ 00:20:21.209 Amber Lin: I mean, that’s part of the reason why it’s wrong. So I think,
256 00:20:23.730 ⇒ 00:20:25.419 Miguel de Veyra: Yeah, there’s no Pharaoh ants.
257 00:20:25.960 ⇒ 00:20:27.320 Amber Lin: That’s so funny.
258 00:20:31.180 ⇒ 00:20:35.090 Miguel de Veyra: We can’t make it accurate if it’s not there
259 00:20:35.850 ⇒ 00:20:40.720 Amber Lin: Yeah, team, like.
260 00:20:41.120 ⇒ 00:20:42.370 Miguel de Veyra: Hey? Enough!
261 00:20:42.540 ⇒ 00:20:43.420 Miguel de Veyra: Sorry!
262 00:20:44.900 ⇒ 00:20:46.240 Miguel de Veyra: Trying to eat each other again
263 00:20:46.240 ⇒ 00:20:53.079 Amber Lin: Can you take a screenshot of the central doc to show that we don’t have anything about pharaoh ants?
264 00:20:53.390 ⇒ 00:20:58.369 Miguel de Veyra: Okay, okay, wait. How do you spell pharaoh again? P-H-A-R-A...
265 00:20:58.370 ⇒ 00:21:00.019 Amber Lin: We actually are okay.
266 00:21:00.900 ⇒ 00:21:02.069 Miguel de Veyra: EHA.
267 00:21:02.770 ⇒ 00:21:06.279 Miguel de Veyra: Santo, EMJR. A. Yeah.
268 00:21:09.020 ⇒ 00:21:10.679 Miguel de Veyra: Because it’s not here
269 00:21:11.160 ⇒ 00:21:12.643 Amber Lin: Oh, it is not there!
270 00:21:12.940 ⇒ 00:21:17.289 Miguel de Veyra: What’s going on out there? Yeah, that happened to me yesterday.
271 00:21:18.513 ⇒ 00:21:22.140 Miguel de Veyra: Here, here, I’ll send it on Slack
272 00:21:22.840 ⇒ 00:21:23.290 Amber Lin: Sure.
273 00:21:23.290 ⇒ 00:21:23.920 Casie Aviles: Oh, I saw
274 00:21:23.920 ⇒ 00:21:27.150 Amber Lin: Paste it in the slides. It’s okay.
275 00:21:28.420 ⇒ 00:21:34.289 Casie Aviles: ...do I need to deploy anything? Or is the filtering fine for now
276 00:21:34.290 ⇒ 00:21:35.719 Miguel de Veyra: Filtering should be good for now
277 00:21:35.960 ⇒ 00:21:45.839 Amber Lin: Yeah, you know what, it’s okay. I think next week we’ll let Annie do that, cause that’s her expertise, I think. Right now we’re at a good enough number here
278 00:21:46.650 ⇒ 00:21:47.080 Casie Aviles: Bye.
279 00:21:47.080 ⇒ 00:21:47.600 Amber Lin: So the
280 00:21:47.600 ⇒ 00:21:52.120 Casie Aviles: How do we explain the rise, though? Like, I mean, there’s a rise of
281 00:21:52.553 ⇒ 00:21:55.589 Amber Lin: That’s a good question. Let’s let’s see.
282 00:21:56.010 ⇒ 00:22:00.969 Amber Lin: So oh, this like this is a clarifying question that should be gone.
283 00:22:01.190 ⇒ 00:22:03.409 Amber Lin: She she yes, we do
284 00:22:03.410 ⇒ 00:22:08.779 Miguel de Veyra: I’m sorry. Can you send me that cheat sheet? One because I I don’t think it’s at or 8 15 even
285 00:22:09.980 ⇒ 00:22:10.650 Amber Lin: Well.
286 00:22:10.940 ⇒ 00:22:16.859 Miguel de Veyra: Like, can you? End of month? Okay, let me check the central doc too
287 00:22:16.860 ⇒ 00:22:17.470 Amber Lin: Hmm.
288 00:22:24.570 ⇒ 00:22:26.849 Miguel de Veyra: Like, the cheat sheet isn’t even under the doc
289 00:22:27.710 ⇒ 00:22:34.970 Amber Lin: I know, like, it is not there. So I think I will just
290 00:22:36.810 ⇒ 00:22:38.796 Casie Aviles: But maybe they included it in the
291 00:22:39.520 ⇒ 00:22:42.219 Casie Aviles: What do you call this in the golden data sheet?
292 00:22:43.300 ⇒ 00:22:44.209 Miguel de Veyra: Let me check
293 00:22:44.210 ⇒ 00:22:45.260 Amber Lin: Oh!
294 00:22:45.260 ⇒ 00:22:47.629 Miguel de Veyra: I don’t think they did. I’m gonna be honest.
295 00:22:47.630 ⇒ 00:22:50.480 Miguel de Veyra: They did not do much for the Golden.
296 00:22:50.480 ⇒ 00:22:52.130 Miguel de Veyra: because they haven’t touched the PC.
297 00:22:52.900 ⇒ 00:22:53.820 Casie Aviles: Okay. Okay.
298 00:22:53.820 ⇒ 00:22:55.770 Amber Lin: I keep thinking that they did it, but no
299 00:22:55.770 ⇒ 00:22:57.169 Miguel de Veyra: No, they didn’t. They didn’t.
300 00:22:58.190 ⇒ 00:23:06.170 Casie Aviles: Yeah, I just don’t want them to say that, ‘okay, but we added it, the ideal answer, in the golden data sheet’
301 00:23:07.980 ⇒ 00:23:11.360 Miguel de Veyra: Then what else? I’m checking? What else is supposed to be here?
302 00:23:13.570 ⇒ 00:23:16.410 Amber Lin: Okay, so let me
303 00:23:16.410 ⇒ 00:23:20.220 Miguel de Veyra: There’s one here: ‘Sorry, I’m here to assist’
304 00:23:21.140 ⇒ 00:23:21.905 Amber Lin: Oh,
305 00:23:23.290 ⇒ 00:23:25.480 Miguel de Veyra: On the like way below
306 00:23:26.620 ⇒ 00:23:28.359 Amber Lin: I’m here.
307 00:23:29.270 ⇒ 00:23:30.380 Miguel de Veyra: Is that the bottom
308 00:23:31.270 ⇒ 00:23:31.980 Amber Lin: Oh!
309 00:23:32.960 ⇒ 00:23:36.609 Miguel de Veyra: Can you maybe Ctrl-F? Oh, there’s a lot we can actually clean
310 00:23:37.610 ⇒ 00:23:39.310 Amber Lin: I’m here.
311 00:23:40.000 ⇒ 00:23:46.050 Amber Lin: Oh, oh, okay, just huh!
312 00:23:48.050 ⇒ 00:23:53.342 Amber Lin: Nope, hello. Okay. Well, that’s 0% error rate. I’m gonna keep it there
313 00:23:54.150 ⇒ 00:23:54.620 Miguel de Veyra: No.
314 00:23:54.620 ⇒ 00:23:55.050 Amber Lin: First...
315 00:23:55.050 ⇒ 00:23:55.660 Miguel de Veyra: Off. But yeah.
316 00:23:55.988 ⇒ 00:23:59.600 Amber Lin: Yeah, I’m gonna leave it. They’re not gonna look into it
317 00:23:59.600 ⇒ 00:24:05.550 Miguel de Veyra: Is there a way we can filter, like, from one to 10, the average quality score? So we can check
318 00:24:06.910 ⇒ 00:24:12.169 Miguel de Veyra: from... yeah, let’s do that. And then let’s just work, basically, on the ones that suck.
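
[Editor’s note: the triage Miguel proposes, worst quality scores first and later the slowest responses, is just a sort over the same logs. A sketch under the same assumed table and column names.]

```sql
-- Hypothetical sketch: review queue ordered worst-first.
SELECT question, output, quality_score, execution_time_s
FROM bot_responses
ORDER BY quality_score ASC,      -- "the ones that suck" first
         execution_time_s DESC   -- ties broken by slowest response
LIMIT 50;
```
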
319 00:24:14.450 ⇒ 00:24:17.890 Miguel de Veyra: This is correct. The rodent attribute. Can you
320 00:24:18.100 ⇒ 00:24:22.059 Miguel de Veyra: perform 3 60 search document, approval. Wait, let me check
321 00:24:22.680 ⇒ 00:24:23.310 Amber Lin: Hmm.
322 00:24:24.200 ⇒ 00:24:26.700 Miguel de Veyra: Rodent... what was the title again? Rodent what?
323 00:24:26.910 ⇒ 00:24:30.110 Amber Lin: ‘To remove a rodent attribute’
324 00:24:30.410 ⇒ 00:24:32.000 Miguel de Veyra: ‘Rodent attribute,’ I think
325 00:24:32.000 ⇒ 00:24:43.849 Amber Lin: Yeah, I think maybe it didn’t match the question. Like, I want to see here what the question was, but now I can only see the output. So
326 00:24:43.850 ⇒ 00:24:45.740 Miguel de Veyra: Is there a way to do that? I’m not sure
327 00:24:45.740 ⇒ 00:24:48.320 Amber Lin: Oh, no, I don’t know
328 00:24:48.620 ⇒ 00:24:53.250 Casie Aviles: I’m not sure either, but we should be able to see it, though
329 00:24:53.860 ⇒ 00:24:57.950 Miguel de Veyra: I mean, if we go to Rill... or to Snowflake, definitely
330 00:24:58.710 ⇒ 00:25:00.290 Casie Aviles: It’s in the data. Yeah.
331 00:25:00.290 ⇒ 00:25:03.480 Miguel de Veyra: Yeah, yes, the customer, the cheat...
332 00:25:03.770 ⇒ 00:25:06.979 Miguel de Veyra: Honestly, we should probably remove anything cheat-sheet related
333 00:25:08.980 ⇒ 00:25:09.740 Amber Lin: Hmm.
334 00:25:10.790 ⇒ 00:25:14.539 Miguel de Veyra: Right, cause... it’s just not there. So why are we
335 00:25:14.540 ⇒ 00:25:15.340 Amber Lin: Oh!
336 00:25:15.340 ⇒ 00:25:16.310 Miguel de Veyra: Oh no!
337 00:25:16.310 ⇒ 00:25:17.020 Amber Lin: No.
338 00:25:17.020 ⇒ 00:25:17.310 Casie Aviles: Yeah.
339 00:25:19.230 ⇒ 00:25:22.380 Casie Aviles: Oh, man, I wish we could save the filter
340 00:25:22.380 ⇒ 00:25:25.879 Amber Lin: I know, I wish you could save the filter. Fuck.
341 00:25:26.700 ⇒ 00:25:30.360 Amber Lin: Can I go back? I can’t Ctrl-C.
342 00:25:30.760 ⇒ 00:25:40.299 Amber Lin: No. Oh, that’s my other stuff. I am so disappointed. Oh, fuck! Okay, anyways, let’s
343 00:25:40.300 ⇒ 00:25:45.070 Miguel de Veyra: No, it’s okay. I guess you just type ‘I’m sorry,’ so we can filter
344 00:25:48.240 ⇒ 00:25:50.339 Casie Aviles: And honestly, I don’t know Rill
345 00:25:52.330 ⇒ 00:25:57.179 Miguel de Veyra: Why is our error rate at 100? Holy shit! Did we do anything?
346 00:25:57.640 ⇒ 00:26:02.569 Amber Lin: Yeah, because I think it’s just match or no match. So
347 00:26:02.834 ⇒ 00:26:07.330 Miguel de Veyra: Because you’re clicking exclude. Oh, you haven’t clicked exclude yet. Okay, okay, that makes sense.
348 00:26:09.800 ⇒ 00:26:10.860 Amber Lin: Yeah.
349 00:26:12.080 ⇒ 00:26:14.160 Casie Aviles: Oh, you could bookmark the view
350 00:26:14.660 ⇒ 00:26:18.030 Amber Lin: Oh, how do I bookmark the view?
351 00:26:18.030 ⇒ 00:26:22.820 Casie Aviles: Up top right, near the bell icon
352 00:26:23.420 ⇒ 00:26:24.740 Miguel de Veyra: Oh, okay. Okay.
353 00:26:24.740 ⇒ 00:26:25.300 Amber Lin: A oh!
354 00:26:25.300 ⇒ 00:26:26.109 Miguel de Veyra: Yeah, yeah, let’s bookmark.
355 00:26:26.110 ⇒ 00:26:30.479 Miguel de Veyra: I will. I will make that home after I after I do this.
356 00:26:30.480 ⇒ 00:26:34.840 Miguel de Veyra: No chance... Amber, let’s do it from top to bottom, like the worst quality scores first
357 00:26:34.840 ⇒ 00:26:35.250 Amber Lin: Okay.
358 00:26:35.250 ⇒ 00:26:38.450 Miguel de Veyra: Yeah, just select everything first, I believe. So.
359 00:26:39.010 ⇒ 00:26:43.790 Miguel de Veyra: Yeah. And then from there, let’s do. Don’t click again.
360 00:26:43.940 ⇒ 00:26:44.610 Miguel de Veyra: Oops.
361 00:26:45.890 ⇒ 00:26:49.410 Miguel de Veyra: Could you specify documents? Which could you? Yeah.
362 00:26:56.810 ⇒ 00:27:02.310 Miguel de Veyra: And then, sorry, can we prioritize the ones that have the highest execution times?
363 00:27:02.950 ⇒ 00:27:06.000 Miguel de Veyra: Let’s try to clean that up first, because I think a lot of them are.
364 00:27:07.740 ⇒ 00:27:09.040 Miguel de Veyra: Yeah, let’s remove the
365 00:27:09.040 ⇒ 00:27:10.410 Amber Lin: My cheat sheet
366 00:27:13.180 ⇒ 00:27:18.170 Miguel de Veyra: Okay, yeah, let’s not touch the ones that have an error rate of 0
367 00:27:21.940 ⇒ 00:27:27.149 Miguel de Veyra: Pharaoh ants... yeah, anything about pharaoh ants, I think we just eat that right now.
368 00:27:27.820 ⇒ 00:27:28.810 Amber Lin: Okay.
369 00:27:29.470 ⇒ 00:27:30.470 Amber Lin: Goodbye.
370 00:27:30.980 ⇒ 00:27:32.180 Amber Lin: Documentation
371 00:27:32.180 ⇒ 00:27:34.749 Miguel de Veyra: That should not be an error.
372 00:27:34.750 ⇒ 00:27:42.269 Amber Lin: Okay, okay, that’s correct adjustment. Okay, whatever 6 shipment. Blah, blah.
373 00:27:42.270 ⇒ 00:27:47.380 Miguel de Veyra: ‘In-house inspectors can provide...’ That is correct, because in-house inspectors should provide the pricing
374 00:27:49.700 ⇒ 00:27:50.450 Amber Lin: Okay.
375 00:27:50.970 ⇒ 00:27:54.779 Miguel de Veyra: Yeah. So we should probably just select that like, you should probably select that yet.
376 00:27:54.780 ⇒ 00:27:56.229 Amber Lin: I think I want it.
377 00:27:56.650 ⇒ 00:27:57.660 Amber Lin: Hmm!
378 00:27:58.340 ⇒ 00:28:01.369 Amber Lin: So we’re clearing out the error rates that are wrong.
379 00:28:01.830 ⇒ 00:28:03.539 Miguel de Veyra: Yes, yes, basically.
380 00:28:03.860 ⇒ 00:28:08.170 Miguel de Veyra: Because in-house inspectors can provide... Yeah, let’s save it first.
381 00:28:08.170 ⇒ 00:28:08.860 Amber Lin: There’s
382 00:28:11.060 ⇒ 00:28:13.330 Miguel de Veyra: Bookmark this current view
383 00:28:13.500 ⇒ 00:28:14.350 Amber Lin: Okay.
384 00:28:15.620 ⇒ 00:28:16.359 Amber Lin: Perfect.
385 00:28:16.840 ⇒ 00:28:20.410 Casie Aviles: Oh, can you try reloading it?
386 00:28:22.310 ⇒ 00:28:23.799 Amber Lin: Just to be sure.
387 00:28:24.596 ⇒ 00:28:25.689 Amber Lin: This for
388 00:28:25.690 ⇒ 00:28:28.370 Miguel de Veyra: Well, you don’t have bookmarks yet. It says there
389 00:28:29.030 ⇒ 00:28:29.880 Amber Lin: Oh!
390 00:28:30.210 ⇒ 00:28:33.800 Miguel de Veyra: You don’t have bookmarks yet. Yeah, yeah, your bookmarks... you have none
391 00:28:34.380 ⇒ 00:28:36.870 Casie Aviles: I think you bookmark the current view first
392 00:28:37.950 ⇒ 00:28:40.200 Miguel de Veyra: Yeah, yeah, bookmark current view first
393 00:28:40.460 ⇒ 00:28:41.940 Casie Aviles: Can you try that first?
394 00:28:41.940 ⇒ 00:28:43.830 Amber Lin: Wait. Okay, where do I? Click.
395 00:28:43.830 ⇒ 00:28:44.829 Miguel de Veyra: Same, same button.
396 00:28:44.830 ⇒ 00:28:45.970 Casie Aviles: Same button
397 00:28:46.210 ⇒ 00:28:46.960 Amber Lin: Okay.
398 00:28:47.380 ⇒ 00:28:52.030 Casie Aviles: And just up above: ‘Bookmark current view,’ I assume? Yeah.
399 00:28:52.030 ⇒ 00:28:52.620 Miguel de Veyra: Yeah, there, you go.
400 00:28:53.017 ⇒ 00:28:58.179 Amber Lin: Refresh. I was so scared. ‘Clean...’ What should I call it?
401 00:28:58.180 ⇒ 00:29:01.240 Miguel de Veyra: Clean... there you go!
402 00:29:01.240 ⇒ 00:29:02.080 Casie Aviles: Yeah.
403 00:29:02.230 ⇒ 00:29:03.580 Miguel de Veyra: Clean dashboard. Yeah.
404 00:29:05.470 ⇒ 00:29:06.489 Casie Aviles: Yeah, that works
405 00:29:07.000 ⇒ 00:29:08.790 Miguel de Veyra: I think that should be fine. Same filter.
406 00:29:08.790 ⇒ 00:29:09.430 Miguel de Veyra: Yes.
407 00:29:10.030 ⇒ 00:29:13.170 Casie Aviles: Wait, can you hover over the information?
408 00:29:13.430 ⇒ 00:29:15.669 Casie Aviles: What does that say toggling this one
409 00:29:16.834 ⇒ 00:29:18.769 Amber Lin: I want to save everything, right? I want to save
410 00:29:18.770 ⇒ 00:29:22.230 Miguel de Veyra: Yeah. Yeah. Yeah. Okay, okay, that should be okay. Fine.
411 00:29:22.230 ⇒ 00:29:24.120 Amber Lin: Remove tests.
412 00:29:24.500 ⇒ 00:29:30.200 Amber Lin: Do you know questions?
413 00:29:32.990 ⇒ 00:29:34.410 Amber Lin: Live.
414 00:29:35.200 ⇒ 00:29:40.550 Amber Lin: Okay. I’ll just say we were testing questions. They can figure the rest out. Yes. Is it there?
415 00:29:40.550 ⇒ 00:29:41.570 Casie Aviles: Okay. Nice.
416 00:29:41.570 ⇒ 00:29:42.450 Amber Lin: Oh!
417 00:29:42.450 ⇒ 00:29:45.400 Miguel de Veyra: Okay, now, we clean everything up again
418 00:29:49.680 ⇒ 00:29:52.289 Casie Aviles: I guess we could edit. Oh, wait! No, no, no.
419 00:29:53.050 ⇒ 00:29:56.870 Amber Lin: Hmm perfect. So I say, save that
420 00:29:58.120 ⇒ 00:30:00.239 Casie Aviles: Yeah. Refresh.
421 00:30:02.235 ⇒ 00:30:02.800 Casie Aviles: Yes.
422 00:30:03.207 ⇒ 00:30:04.430 Amber Lin: Yes, it’s there.
423 00:30:06.331 ⇒ 00:30:09.949 Casie Aviles: Okay? Good. I’m put. I’m putting this screenshot in there.
424 00:30:10.990 ⇒ 00:30:12.109 Miguel de Veyra: Clean it a bit more.
425 00:30:13.030 ⇒ 00:30:20.850 Miguel de Veyra: Let’s clean up a bit more, because average execution time is still a bit shit, cause we wanted it to go down to around 7
426 00:30:22.472 ⇒ 00:30:24.240 Amber Lin: I see, I see
427 00:30:24.240 ⇒ 00:30:28.619 Miguel de Veyra: Because it’s pushing it. Yeah, let’s remove the ones that have errors.
428 00:30:28.840 ⇒ 00:30:33.729 Miguel de Veyra: Wait, like, the ones that have the highest response time, and then they have the errors.
429 00:30:34.450 ⇒ 00:30:36.120 Miguel de Veyra: But of course, let’s see if it’s correct.
430 00:30:36.290 ⇒ 00:30:41.619 Amber Lin: Yeah. Let’s see, highest execution time
431 00:30:44.330 ⇒ 00:30:48.309 Miguel de Veyra: Is that correct? I think the one that... can you click that? What does it say?
432 00:30:53.490 ⇒ 00:30:55.840 Miguel de Veyra: I think that’s correct. Wait, let me check
433 00:30:55.840 ⇒ 00:30:59.009 Amber Lin: It is correct, it is, it is error rate 0. So
434 00:30:59.010 ⇒ 00:31:01.129 Miguel de Veyra: Okay, okay, yeah, so yeah.
435 00:31:01.480 ⇒ 00:31:03.960 Amber Lin: I think I can only filter by one at a time, so
436 00:31:03.960 ⇒ 00:31:07.559 Miguel de Veyra: Yeah. So let’s do. Yes, a customer. What’s this way?
437 00:31:11.250 ⇒ 00:31:11.670 Amber Lin: Cancel.
438 00:31:11.670 ⇒ 00:31:14.139 Miguel de Veyra: I mean that that’s correct error rate 0
439 00:31:14.390 ⇒ 00:31:15.680 Amber Lin: Yeah, let’s just look at
440 00:31:15.900 ⇒ 00:31:16.730 Miguel de Veyra: Yeah, yeah, this one.
441 00:31:16.730 ⇒ 00:31:17.550 Amber Lin: Service
442 00:31:17.550 ⇒ 00:31:22.170 Miguel de Veyra: Yes, this is correct. Yeah, because it skip.
443 00:31:22.960 ⇒ 00:31:26.900 Miguel de Veyra: If, yeah, this can. This is correct.
444 00:31:27.970 ⇒ 00:31:28.760 Amber Lin: Yeah, and also
445 00:31:28.760 ⇒ 00:31:34.019 Miguel de Veyra: As long as they’re paying, they can skip as many services as they want, because they’re paying
446 00:31:38.420 ⇒ 00:31:39.940 Amber Lin: So we should
447 00:31:40.720 ⇒ 00:31:49.959 Amber Lin: like, I think these are kind of important to point out. As to... well, if we point that out, they’re gonna ask, okay, why did you mark it wrong?
448 00:31:50.510 ⇒ 00:31:51.350 Amber Lin: Right?
449 00:31:52.460 ⇒ 00:31:59.009 Amber Lin: So should I show this, like, that this is actually right? Or should we just exclude it
450 00:31:59.340 ⇒ 00:32:03.249 Miguel de Veyra: You can just exclude it like I think this is more of an internal filter
451 00:32:03.250 ⇒ 00:32:09.424 Amber Lin: Okay, okay, okay, let me just take a note for ourselves.
452 00:32:11.530 ⇒ 00:32:22.790 Amber Lin: some of the correct answers are still flagged as errors... perfect. Okay.
453 00:32:25.810 ⇒ 00:32:27.660 Amber Lin: we’re getting there. We’re getting there
454 00:32:33.070 ⇒ 00:32:35.910 Miguel de Veyra: Yeah, cause as long as they’re paying. Yeah, it’s correct. As long as they’re paying with
455 00:32:35.910 ⇒ 00:32:39.740 Miguel de Veyra: yeah 100% 100% 100%
456 00:32:40.610 ⇒ 00:32:42.790 Miguel de Veyra: Have to handle an adjustment. What’s that?
457 00:32:43.070 ⇒ 00:32:44.350 Miguel de Veyra: It’s 14 seconds
458 00:32:51.270 ⇒ 00:32:52.470 Amber Lin: That is wrong
459 00:32:54.000 ⇒ 00:32:55.040 Miguel de Veyra: Alright sorry.
460 00:32:55.540 ⇒ 00:32:58.860 Miguel de Veyra: Yeah, it’s correct. FLW billing. What the hell
461 00:32:58.860 ⇒ 00:33:02.709 Amber Lin: Oh, okay, I’m gonna copy that to this as well
462 00:33:03.300 ⇒ 00:33:06.330 Miguel de Veyra: Yeah, that’s correct, because FLW billing is here.
463 00:33:06.330 ⇒ 00:33:14.320 Amber Lin: Yeah. So we really have some error measurement issues. But I know that wasn’t our goal. So it’s fine
464 00:33:14.820 ⇒ 00:33:15.980 Amber Lin: in-house
465 00:33:17.930 ⇒ 00:33:22.359 Miguel de Veyra: In-house can provide quotes, I think. Oh, yeah, wait. Sorry, in-house...
466 00:33:24.560 ⇒ 00:33:28.419 Miguel de Veyra: the field versus what was that in house inspect
467 00:33:35.230 ⇒ 00:33:38.559 Miguel de Veyra: pricing over the phone. Yeah, in House inspector provides
468 00:33:38.900 ⇒ 00:33:47.969 Miguel de Veyra: field inspect... Yeah, that’s correct. In-house is for basic services, but if you need to, you know, identify entry points, that’s a field inspector. That’s correct.
469 00:33:47.970 ⇒ 00:33:53.540 Amber Lin: Okay, I’m gonna put that I noted that down 100%.
470 00:33:54.190 ⇒ 00:33:55.390 Amber Lin: Yeah, what’s that
471 00:33:55.990 ⇒ 00:33:56.780 Miguel de Veyra: Casual
472 00:33:56.780 ⇒ 00:33:58.259 Amber Lin: Do a smoke test?
473 00:33:58.830 ⇒ 00:34:02.159 Amber Lin: Why did that single line take 11 seconds
474 00:34:03.330 ⇒ 00:34:06.270 Miguel de Veyra: Yeah, there’s documentation ship like, here.
475 00:34:07.660 ⇒ 00:34:10.600 Miguel de Veyra: What was the answer here to schedule a small
476 00:34:10.600 ⇒ 00:34:11.520 Amber Lin: Test
477 00:34:12.480 ⇒ 00:34:23.649 Miguel de Veyra: Arrange appointment. Okay, scheduling around. I don’t. I don’t think there’s a specific way to
478 00:34:23.770 ⇒ 00:34:30.149 Miguel de Veyra: schedule it mentioned in the doc, but I think we can mark that as... that’s fine. No, that’s correct, that’s correct. Like, we can leave it there
479 00:34:30.709 ⇒ 00:34:37.079 Amber Lin: Okay? So, ‘yes, we do offer a senior discount,’ blah blah blah
480 00:34:37.080 ⇒ 00:34:39.609 Miguel de Veyra: That is correct. They offer a senior discount
481 00:34:39.610 ⇒ 00:34:42.030 Amber Lin: Okay, that that
482 00:34:42.030 ⇒ 00:34:44.929 Miguel de Veyra: What, the cicada? I’m interested what the cicada is
483 00:34:46.630 ⇒ 00:34:48.529 Amber Lin: Cicada killers.
484 00:34:48.739 ⇒ 00:34:51.739 Miguel de Veyra: Are covered under our general pest control services
485 00:34:54.210 ⇒ 00:34:54.900 Amber Lin: I don’t think so.
486 00:34:54.909 ⇒ 00:34:58.529 Casie Aviles: Uttam will join, so I’ll send him the link
487 00:34:58.530 ⇒ 00:34:59.700 Amber Lin: Okay, sounds good.
488 00:35:02.140 ⇒ 00:35:04.660 Amber Lin: What are cicada killers?
489 00:35:04.933 ⇒ 00:35:07.470 Miguel de Veyra: It’s basically part of the service. Wait, let me check
490 00:35:07.470 ⇒ 00:35:09.209 Amber Lin: Okay, so this is right.
491 00:35:10.080 ⇒ 00:35:11.710 Miguel de Veyra: I think so. Yeah.
492 00:35:12.720 ⇒ 00:35:15.210 Amber Lin: It’s not on their website, though. When I scraped it... I don’t remember
493 00:35:15.210 ⇒ 00:35:17.250 Miguel de Veyra: It’s under central. It’s under central, though.
494 00:35:17.250 ⇒ 00:35:20.280 Amber Lin: Oh, interesting. Okay.
495 00:35:20.570 ⇒ 00:35:25.240 Miguel de Veyra: Yeah. Wait. Let me check title program
496 00:35:27.110 ⇒ 00:35:28.040 Casie Aviles: Hey, Uttam.
497 00:35:28.480 ⇒ 00:35:29.360 Miguel de Veyra: How are you doing
498 00:35:29.360 ⇒ 00:35:45.459 Amber Lin: We’re going over Rill, cleaning up some answers. As you can see, after you clean it up, it’s like this, because essentially what we have is the structure. What we don’t have is clean data,
499 00:35:45.870 ⇒ 00:35:51.110 Amber Lin: good evaluation answers from them,
500 00:35:51.330 ⇒ 00:35:53.359 Amber Lin: which I know took so long, and
501 00:35:53.360 ⇒ 00:35:56.880 Uttam Kumaran: I see what you mean. So anything that’s ‘I’m sorry’ we should filter out
502 00:35:57.120 ⇒ 00:35:57.620 Amber Lin: Yeah.
503 00:35:57.620 ⇒ 00:35:59.889 Uttam Kumaran: Okay, I can help. I can help you do that.
504 00:36:00.160 ⇒ 00:36:03.580 Amber Lin: We did that. Actually, we did that, and we saved the
505 00:36:03.940 ⇒ 00:36:06.589 Uttam Kumaran: No, no, no, but you shouldn’t do it here. We should just
506 00:36:08.080 ⇒ 00:36:10.129 Uttam Kumaran: we should just filter it. We should.
507 00:36:11.290 ⇒ 00:36:15.570 Uttam Kumaran: Okay, maybe I can share. There’s a there’s a little bit of a better way to do this one
508 00:36:15.570 ⇒ 00:36:16.919 Amber Lin: Yeah, I think.
509 00:36:17.120 ⇒ 00:36:18.209 Miguel de Veyra: Uttam, the other thing
510 00:36:18.210 ⇒ 00:36:24.740 Amber Lin: How long do we have until the meeting? I don’t know about that, but as long as we pass the meeting, Annie can do that next week.
511 00:36:25.010 ⇒ 00:36:40.190 Miguel de Veyra: Yeah, basically, Uttam, there’s like a couple of issues. The structure we got down, but the grading of the quality score is still something that we, of course, need to improve, because escalations are scored one, and then they’re, like, error rate 100%, which they shouldn’t be.
512 00:36:41.740 ⇒ 00:36:50.970 Miguel de Veyra: And then there’s a lot of stuff here that we double-checked, and they’re correct answers. But because they’re not in the golden data sheet, they’re marked as wrong
513 00:36:52.210 ⇒ 00:36:58.310 Uttam Kumaran: Okay. So one of the things here... I can show you how to make this change in like 10 min
514 00:36:58.310 ⇒ 00:37:04.699 Amber Lin: Hmm, yes, please. Okay, let me stop sharing my screen. But I’ve saved the filters.
515 00:37:05.865 ⇒ 00:37:09.669 Amber Lin: Please let us know how to. That will be really helpful.
516 00:37:09.670 ⇒ 00:37:11.096 Uttam Kumaran: Okay. So
517 00:37:19.710 ⇒ 00:37:22.449 Uttam Kumaran: yo, you guys see the earthquake
518 00:37:22.950 ⇒ 00:37:24.460 Amber Lin: No where.
519 00:37:24.460 ⇒ 00:37:26.029 Miguel de Veyra: It’s in Bangkok, right?
520 00:37:26.030 ⇒ 00:37:27.809 Uttam Kumaran: Yeah, 7.7
521 00:37:28.760 ⇒ 00:37:30.490 Miguel de Veyra: Yeah, it was terrible.
522 00:37:33.750 ⇒ 00:37:36.570 Uttam Kumaran: Yeah, yesterday or today.
523 00:37:36.900 ⇒ 00:37:39.600 Amber Lin: Yeah, like, 2 hours ago.
524 00:37:47.260 ⇒ 00:37:49.200 Miguel de Veyra: I wonder what stocks went down
525 00:37:54.530 ⇒ 00:37:57.259 Amber Lin: Is this... oh, this is in Cursor,
526 00:37:57.490 ⇒ 00:38:03.820 Amber Lin: and this Cursor is connected to the git, or connected to...
527 00:38:05.951 ⇒ 00:38:08.110 Uttam Kumaran: Yes, it’s connected to git
528 00:38:08.110 ⇒ 00:38:08.640 Amber Lin: And then
529 00:38:08.640 ⇒ 00:38:08.970 Uttam Kumaran: It is git.
530 00:38:08.970 ⇒ 00:38:14.530 Amber Lin: Connected to Snowflake, and Rill? Sorry, I just don’t know how this works.
531 00:38:14.530 ⇒ 00:38:16.319 Uttam Kumaran: Yeah, once, hold on one second
532 00:38:16.320 ⇒ 00:38:17.010 Amber Lin: Oh!
533 00:38:17.474 ⇒ 00:38:19.330 Uttam Kumaran: Casie can probably answer
534 00:38:22.710 ⇒ 00:38:24.220 Casie Aviles: Sorry? What was the question?
535 00:38:24.901 ⇒ 00:38:35.030 Amber Lin: Where is Cursor connected to? So is Cursor connected to GitHub, which is connected to Snowflake, which is connected to Rill? Or is it directly connected to Rill?
536 00:38:37.965 ⇒ 00:38:42.248 Casie Aviles: No, I mean... how do I explain?
537 00:38:42.760 ⇒ 00:38:47.429 Casie Aviles: So the changes that Uttam is making right now, that gets tracked
538 00:38:47.650 ⇒ 00:38:53.370 Casie Aviles: via git, and it’s for version control. And when Uttam makes changes to the code,
539 00:38:53.900 ⇒ 00:38:55.799 Casie Aviles: we could deploy it to their
540 00:38:56.390 ⇒ 00:39:00.540 Casie Aviles: repository, which is on GitHub. And then that’s where,
541 00:39:01.170 ⇒ 00:39:04.630 Casie Aviles: with that code... that’s the Rill code. And that’s where
542 00:39:05.370 ⇒ 00:39:06.829 Amber Lin: The view right now.
543 00:39:06.830 ⇒ 00:39:07.380 Amber Lin: Oh.
544 00:39:07.380 ⇒ 00:39:08.909 Casie Aviles: So it’ll get updated
545 00:39:09.730 ⇒ 00:39:11.099 Amber Lin: The live one.
546 00:39:12.690 ⇒ 00:39:15.579 Casie Aviles: And then the Snowflake data is basically
547 00:39:15.860 ⇒ 00:39:20.780 Casie Aviles: going to be untouched. It’s just, you know, Uttam is filtering it out, I believe, on his side.
548 00:39:20.780 ⇒ 00:39:23.250 Amber Lin: Oh, so! So we are!
549 00:39:26.040 ⇒ 00:39:28.050 Casie Aviles: The code, the SQL
550 00:39:35.590 ⇒ 00:39:38.179 Amber Lin: Okay, so we’re making a localhost,
551 00:39:38.940 ⇒ 00:39:43.189 Amber Lin: editing it there and then pushing it to the public view.
552 00:39:44.020 ⇒ 00:39:45.020 Casie Aviles: Yes, yes.
553 00:39:45.020 ⇒ 00:39:46.050 Amber Lin: Let’s see.
554 00:39:46.050 ⇒ 00:39:54.112 Uttam Kumaran: But one thing I’m gonna do, Casie, is, I’m going to add another dimension here that is called...
555 00:39:58.730 ⇒ 00:40:03.200 Uttam Kumaran: Like, either we can call it is_test, or we can say...
556 00:40:06.720 ⇒ 00:40:09.679 Uttam Kumaran: I mean to be quite honest, like
557 00:40:09.820 ⇒ 00:40:13.460 Uttam Kumaran: ‘I’m sorry’ should be a... should be a 0, right
558 00:40:14.680 ⇒ 00:40:18.719 Uttam Kumaran: So I kind of argue that we need to leave that in.
559 00:40:20.690 ⇒ 00:40:23.280 Uttam Kumaran: Seems... that’s, like, on us.
560 00:40:25.010 ⇒ 00:40:26.130 Miguel de Veyra: From
561 00:40:26.400 ⇒ 00:40:40.919 Miguel de Veyra: because we spoke to Janice, I think, like 2 days ago. She said that for those types of questions, it should be, like... for example, if it’s not in the doc, it should be an escalation. So as long as we say, ‘Hey, I don’t have access to this, can you escalate to your manager?’
562 00:40:41.980 ⇒ 00:40:44.319 Miguel de Veyra: So how should we create that? Then
563 00:40:44.710 ⇒ 00:40:48.150 Uttam Kumaran: Okay, I see what you mean. Okay, so let’s create a column.
564 00:40:49.080 ⇒ 00:40:52.670 Amber Lin: The testing escalation problems
565 00:40:52.670 ⇒ 00:40:58.139 Uttam Kumaran: Can you send me that link for the for the one that you guys did? And I’ll just make sure that it ends up in here
566 00:40:58.767 ⇒ 00:41:00.650 Amber Lin: How do I
567 00:41:01.070 ⇒ 00:41:03.450 Uttam Kumaran: Go to the top right and click, click, share.
568 00:41:03.940 ⇒ 00:41:10.660 Amber Lin: Yeah, I just want... oh, why is this link so long? How do I send this link?
569 00:41:11.350 ⇒ 00:41:15.039 Amber Lin: I’ll send it in our Slack, because Zoom chat has a
570 00:41:16.580 ⇒ 00:41:19.109 Amber Lin: character limit, it seems
571 00:41:22.040 ⇒ 00:41:23.559 Uttam Kumaran: I’m not doing this in here
572 00:41:31.680 ⇒ 00:41:33.840 Amber Lin: Okay, I sent it in our slack.
573 00:41:45.410 ⇒ 00:41:48.684 Amber Lin: So, from what we said, there is...
574 00:41:52.650 ⇒ 00:41:59.380 Amber Lin: So there’s clarification, or exploratory questions. So when the bot asks
575 00:41:59.560 ⇒ 00:42:02.902 Amber Lin: clarifying questions that shouldn’t be an error.
576 00:42:03.949 ⇒ 00:42:09.379 Amber Lin: things that need escalation, when the bot says ‘sorry, I don’t know,’ shouldn’t be an error. And
577 00:42:10.360 ⇒ 00:42:15.160 Amber Lin: those 2 things. And some of them are correct, but still marked as an error.
578 00:42:15.340 ⇒ 00:42:17.370 Amber Lin: So that’s the 3 categories.
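
[Editor’s note: the three buckets Amber lists could be made explicit as a derived column rather than a set of ad-hoc filters. A sketch; the phrase patterns are the ones read aloud in this meeting, and the table and column names are assumed.]

```sql
-- Hypothetical sketch: tag the three not-really-an-error buckets.
SELECT
  question,
  output,
  CASE
    WHEN output ILIKE 'Can you please specify%'
      OR output ILIKE 'I can help with that%'   THEN 'clarifying'  -- bot asking for more info
    WHEN output ILIKE 'I''m sorry%'             THEN 'escalation'  -- should hand off, not fail
    WHEN is_error = 1 AND golden_answer IS NULL THEN 'unverified'  -- may be correct, just missing from the golden sheet
    ELSE 'gradable'
  END AS error_bucket
FROM bot_responses;
```
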
579 00:43:02.580 ⇒ 00:43:03.280 Amber Lin: Hmm.
580 00:43:06.730 ⇒ 00:43:09.460 Uttam Kumaran: But what’s the other one? What are the other cases
581 00:43:13.510 ⇒ 00:43:15.490 Miguel de Veyra: Some are a bit more specific
582 00:43:15.680 ⇒ 00:43:19.099 Miguel de Veyra: cause like we checked that, and it’s correct.
583 00:43:19.250 ⇒ 00:43:22.479 Miguel de Veyra: But I think something more general would be
584 00:43:23.670 ⇒ 00:43:25.170 Amber Lin: Yeah, let me check the part where we
585 00:43:25.170 ⇒ 00:43:31.640 Uttam Kumaran: This is what I’m saying. Those are on us. We should not exclude those. Like, we have to explain why
586 00:43:32.880 ⇒ 00:43:35.039 Uttam Kumaran: we, we need to fix our evals
587 00:43:35.420 ⇒ 00:43:36.290 Miguel de Veyra: Yeah, yeah.
588 00:43:42.180 ⇒ 00:43:42.970 Uttam Kumaran: Right.
589 00:43:43.640 ⇒ 00:43:45.939 Amber Lin: Totally so, I think.
590 00:43:46.830 ⇒ 00:43:47.800 Amber Lin: Hmm.
591 00:43:48.130 ⇒ 00:44:00.019 Amber Lin: So for those things, we need to tell the client we need to fix how we measure inaccurate answers. But for now, I think the ‘I’m sorry’ can be marked as escalations.
592 00:44:00.450 ⇒ 00:44:01.370 Amber Lin: I’m gonna check
593 00:44:01.370 ⇒ 00:44:04.330 Uttam Kumaran: So can you tell me like what what else here is like?
594 00:44:04.660 ⇒ 00:44:06.860 Uttam Kumaran: So so these are all like.
595 00:44:07.340 ⇒ 00:44:09.879 Uttam Kumaran: I don’t. I don’t feel comfortable filtering these out
596 00:44:10.470 ⇒ 00:44:10.790 Amber Lin: Yeah.
597 00:44:10.790 ⇒ 00:44:18.270 Uttam Kumaran: Reasonable responses, where, if it’s 0, then it’s on us. But tell me what else here is, like, okay, it’s a test case or stuff like that.
598 00:44:18.621 ⇒ 00:44:24.950 Amber Lin: I think... ‘Thank you for your input,’ or... just when it starts with ‘Thank you.’
599 00:44:26.520 ⇒ 00:44:37.289 Amber Lin: I will remove the other parts so that we can know that and edit our responses.
600 00:45:02.710 ⇒ 00:45:05.230 Amber Lin: And also ‘I can help with that.’
601 00:45:05.550 ⇒ 00:45:06.429 Amber Lin: I think
602 00:45:10.940 ⇒ 00:45:18.610 Amber Lin: anything that asks the user, ‘Can you please specify...’ Like, these types of questions are more exploratory
603 00:45:29.030 ⇒ 00:45:32.150 Uttam Kumaran: Okay, so let’s just, I’m just just keep it here for now.
604 00:45:40.360 ⇒ 00:45:43.690 Uttam Kumaran: so what other? What other like types like this? Do we have
605 00:45:46.040 ⇒ 00:45:48.100 Amber Lin: Did you just add the ‘I’m sorry’?
606 00:45:48.780 ⇒ 00:45:49.500 Uttam Kumaran: Yeah.
607 00:45:49.500 ⇒ 00:45:52.420 Amber Lin: Yeah, there’s a there’s another one that’s
608 00:45:52.670 ⇒ 00:45:59.230 Amber Lin: they will say ‘I can help with,’ or ‘Thank you for your answer.’ So when it starts with ‘Thank you.’
609 00:46:08.840 ⇒ 00:46:11.269 Amber Lin: So you can. I see?
610 00:46:34.560 ⇒ 00:46:35.870 Amber Lin: Oh, great.
611 00:46:39.760 ⇒ 00:46:44.710 Amber Lin: yeah, that also.
612 00:46:55.000 ⇒ 00:46:56.510 Amber Lin: Mvc, user.
613 00:46:57.040 ⇒ 00:46:58.010 Amber Lin: Let’s see.
614 00:47:02.010 ⇒ 00:47:10.110 Amber Lin: Also to exclude the ones that start with ‘Thank you,’ cause that’s sort of accepting feedback.
615 00:47:12.590 ⇒ 00:47:14.379 Amber Lin: Oh, you already have that. Great.
616 00:47:32.560 ⇒ 00:47:34.490 Amber Lin: check. What kind of input stuff
617 00:47:44.780 ⇒ 00:47:48.350 Uttam Kumaran: So wait, but you're like, why filter these out?
618 00:47:48.600 ⇒ 00:47:50.960 Uttam Kumaran: Oh, because the response pairs
619 00:47:51.190 ⇒ 00:47:57.770 Uttam Kumaran: are gonna be like "Hi, how are you?", which is not a good response. Okay, I mean, alright, whatever, for now I'm okay with that. But
620 00:47:58.166 ⇒ 00:48:06.089 Amber Lin: There's also, like, different "Hi"s and "hello"s, but I think that's fine. Like, there's input which is just "yes," "no,"
621 00:48:06.370 ⇒ 00:48:13.870 Amber Lin: which is not really what the question is asking. But I think that's fine for now.
622 00:48:14.140 ⇒ 00:48:16.789 Amber Lin: Okay, how does it look after we adjust that?
623 00:48:18.850 ⇒ 00:48:24.069 Uttam Kumaran: Well, okay, sorry. What other ones are there, other than "I'm sorry," for escalation?
624 00:48:28.470 ⇒ 00:48:29.579 Amber Lin: Let me check.
625 00:48:36.760 ⇒ 00:48:41.909 Amber Lin: Oh, good search super file, sir.
626 00:48:48.790 ⇒ 00:48:57.800 Amber Lin: Yeah, there's one that says, "for more information, please consult with your supervisor," or "refer to the billing policies." I think anything that says
627 00:48:58.380 ⇒ 00:49:00.210 Amber Lin: something that involves your supervisor,
628 00:49:01.500 ⇒ 00:49:03.849 Amber Lin: or "please consult with your supervisor."
629 00:49:05.670 ⇒ 00:49:12.140 Amber Lin: Yeah, it also just says, "please consult with your trainer or supervisor."
630 00:49:12.390 ⇒ 00:49:14.400 Uttam Kumaran: But it always says "Please consult"?
631 00:49:14.980 ⇒ 00:49:18.520 Amber Lin: I believe so. Let me check the different answers.
632 00:49:24.080 ⇒ 00:49:32.059 Amber Lin: Yeah, for the ones that have errors, it's now just saying "please consult," so let me check.
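The filter Amber is describing, catching responses that defer to a human rather than answer, could be sketched as a simple pattern match; again assuming a DuckDB-style backend with placeholder names:

```sql
-- Find responses that punt to a supervisor instead of answering.
SELECT output, COUNT(*) AS n
FROM exchanges
WHERE output ILIKE '%please consult%'
   OR output ILIKE '%your supervisor%'
   OR output ILIKE '%refer to the billing policies%'
GROUP BY output
ORDER BY n DESC;
```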
633 00:49:32.970 ⇒ 00:49:33.850 Amber Lin: Oh.
634 00:49:53.510 ⇒ 00:49:55.210 Uttam Kumaran: Okay, how does this look
635 00:49:58.850 ⇒ 00:49:59.600 Uttam Kumaran: better?
636 00:50:00.444 ⇒ 00:50:02.400 Miguel de Veyra: I think you should remove me and Casey
637 00:50:03.700 ⇒ 00:50:05.049 Uttam Kumaran: Oh, wait! Hold on! Hold on!
638 00:51:12.740 ⇒ 00:51:15.899 Uttam Kumaran: Hey, you have needs escalation: true, false.
639 00:51:16.310 ⇒ 00:51:16.960 Amber Lin: Hmm.
640 00:51:17.500 ⇒ 00:51:18.830 Uttam Kumaran: So now you have false.
641 00:51:19.040 ⇒ 00:51:22.600 Uttam Kumaran: All of these ones need escalation: "I'm sorry. I'm sorry. I'm sorry. I'm sorry."
642 00:51:23.430 ⇒ 00:51:24.310 Uttam Kumaran: See that
643 00:51:24.530 ⇒ 00:51:25.840 Amber Lin: Great.
644 00:51:28.490 ⇒ 00:51:31.330 Uttam Kumaran: I don't know how the fuck to make this wider, but
645 00:51:31.330 ⇒ 00:51:32.229 Miguel de Veyra: Is there
646 00:51:32.230 ⇒ 00:51:34.899 Amber Lin: Yeah, I want to. Oh, thank you.
647 00:51:34.900 ⇒ 00:51:41.091 Uttam Kumaran: "I'm sorry. I'm sorry. I'm sorry. I'm sorry. I'm sorry." Blah, blah. Okay, so these are mostly the ones that need to get escalated.
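The check being done here, confirming that the "I'm sorry..." responses line up with the needs-escalation flag, amounts to something like this sketch (table and column names are assumptions):

```sql
-- Do the "I'm sorry" responses carry needs_escalation = true?
SELECT
  needs_escalation,
  COUNT(*) AS n,
  COUNT(*) FILTER (WHERE output ILIKE 'i''m sorry%') AS sorry_responses
FROM exchanges
GROUP BY needs_escalation;
```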
648 00:51:41.720 ⇒ 00:51:46.529 Miguel de Veyra: I have a question, by the way: is there a way we can see both the input and the output on the same view, you know?
649 00:51:47.000 ⇒ 00:51:47.760 Amber Lin: Oh!
650 00:51:48.550 ⇒ 00:51:52.370 Uttam Kumaran: Yeah. So if you go to pivot, you go to input.
651 00:51:53.000 ⇒ 00:51:53.880 Uttam Kumaran: Okay.
652 00:51:54.899 ⇒ 00:51:55.599 Miguel de Veyra: Okay.
653 00:51:55.600 ⇒ 00:51:59.360 Amber Lin: Oh, yay, and and great
654 00:51:59.670 ⇒ 00:52:00.690 Miguel de Veyra: Okay. Yeah.
655 00:52:01.330 ⇒ 00:52:04.219 Uttam Kumaran: Well, you would just go here, and then you go total exchanges.
656 00:52:05.070 ⇒ 00:52:07.619 Uttam Kumaran: So, okay, actually, this is the other way.
657 00:52:15.630 ⇒ 00:52:18.630 Uttam Kumaran: So I think something like this, you can do.
658 00:52:19.000 ⇒ 00:52:21.740 Uttam Kumaran: Yeah? And basically, that’s great. Yeah.
659 00:52:21.740 ⇒ 00:52:23.950 Uttam Kumaran: I don’t know. You guys will have to play around a little bit here
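The pivot being set up here, input and output side by side with total exchanges, is roughly this aggregation (names assumed):

```sql
-- Input and output on the same view, with exchange counts.
SELECT input, output, COUNT(*) AS total_exchanges
FROM exchanges
GROUP BY input, output
ORDER BY total_exchanges DESC;
```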
660 00:52:23.950 ⇒ 00:52:24.460 Amber Lin: All right.
661 00:52:24.460 ⇒ 00:52:25.500 Uttam Kumaran: Okay. Okay. Yeah. Sure.
662 00:52:25.500 ⇒ 00:52:28.190 Amber Lin: What does what does the dashboard look like right now?
663 00:52:28.500 ⇒ 00:52:33.230 Amber Lin: Oh, so we should exclude the
664 00:52:33.230 ⇒ 00:52:34.410 Miguel de Veyra: It’s a bit there.
665 00:52:34.410 ⇒ 00:52:37.000 Amber Lin: Needs escalations from the error rate
666 00:52:38.554 ⇒ 00:52:42.030 Miguel de Veyra: I think... shouldn't we exclude me and Casey?
667 00:52:42.710 ⇒ 00:52:46.489 Uttam Kumaran: Yeah, so what you can do here is you can say, real user or test user. So I’m gonna put
668 00:52:46.805 ⇒ 00:52:47.120 Miguel de Veyra: Okay.
669 00:52:47.120 ⇒ 00:52:48.986 Uttam Kumaran: I’m gonna change this to.
670 00:52:50.960 ⇒ 00:52:52.760 Uttam Kumaran: I’m gonna put this as
671 00:52:54.020 ⇒ 00:52:54.490 Amber Lin: And for.
672 00:52:54.490 ⇒ 00:53:00.682 Uttam Kumaran: test user, and then, well, what I'll do is, I'll say,
673 00:53:21.400 ⇒ 00:53:22.070 Amber Lin: Hmm.
674 00:53:39.310 ⇒ 00:53:41.669 Uttam Kumaran: So now you can go here you can say false
675 00:53:43.080 ⇒ 00:53:45.929 Amber Lin: And then also select for
676 00:53:46.400 ⇒ 00:53:49.709 Uttam Kumaran: So if you selected true, you'd see: Miguel, Casey.
677 00:53:59.400 ⇒ 00:54:02.129 Uttam Kumaran: yeah. So these are all non test users. Right?
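The "real user or test user" field is a derived dimension over the username. A sketch, with placeholder account names standing in for Miguel's and Casie's test accounts and the service account:

```sql
-- Derived dimension: tag internal/test accounts so they can be filtered out.
-- The usernames below are placeholders, not the team's real account names.
SELECT
  CASE
    WHEN username IN ('miguel', 'casie', 'service_account') THEN 'Test user'
    ELSE 'Real user'
  END AS user_type,
  COUNT(*) AS total_exchanges
FROM exchanges
GROUP BY 1;
```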
678 00:54:02.130 ⇒ 00:54:02.560 Amber Lin: Okay.
679 00:54:02.560 ⇒ 00:54:03.330 Miguel de Veyra: Yes.
680 00:54:03.580 ⇒ 00:54:11.169 Amber Lin: Can you also select needs escalation equals false? I just want to see what the error rate comes out to right now.
681 00:54:12.070 ⇒ 00:54:12.940 Amber Lin: Okay, that’s good.
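The number Amber asks for, the error rate over real users with escalation rows excluded, would come from a measure along these lines; `is_error` is a stand-in for however the dashboard marks an incorrect answer, and the usernames are placeholders:

```sql
-- Error rate over real traffic only: no test users, no escalation rows.
SELECT
  AVG(CASE WHEN is_error THEN 1.0 ELSE 0.0 END) AS error_rate
FROM exchanges
WHERE NOT needs_escalation
  AND username NOT IN ('miguel', 'casie', 'service_account');
```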
682 00:54:12.940 ⇒ 00:54:13.740 Uttam Kumaran: Good.
683 00:54:14.090 ⇒ 00:54:15.350 Amber Lin: That’s fantastic. Okay.
684 00:54:15.350 ⇒ 00:54:16.240 Uttam Kumaran: Yay!
685 00:54:16.240 ⇒ 00:54:16.770 Amber Lin: I’m sorry.
686 00:54:17.310 ⇒ 00:54:17.850 Miguel de Veyra: Yeah.
687 00:54:17.850 ⇒ 00:54:21.780 Amber Lin: This is... this is a lifesaver.
688 00:54:22.180 ⇒ 00:54:24.271 Miguel de Veyra: We just need a data person.
689 00:54:25.130 ⇒ 00:54:27.989 Uttam Kumaran: Oh, dude! I want you guys to learn. You guys are getting close
690 00:54:27.990 ⇒ 00:54:28.660 Amber Lin: I want to
691 00:54:28.660 ⇒ 00:54:36.600 Uttam Kumaran: It's not easy, you know. A lot of people I've tried to get to learn Rill on the data side don't want to learn it, because it's not like drag and drop.
692 00:54:37.000 ⇒ 00:54:40.560 Uttam Kumaran: Yeah, the other thing I’m gonna do is I’m gonna clean up these 2
693 00:54:40.560 ⇒ 00:54:41.080 Amber Lin: Okay.
694 00:54:41.080 ⇒ 00:54:44.025 Uttam Kumaran: What you can do here is if we go into
695 00:54:44.320 ⇒ 00:54:53.600 Amber Lin: Well, the coding is so much easier, because then you can adjust it with one line, versus drag and drop, where you have to go back and edit. So annoying.
696 00:55:09.620 ⇒ 00:55:16.490 Amber Lin: I will add it to my next sprint goal to learn Rill.
697 00:55:30.850 ⇒ 00:55:41.469 Amber Lin: I think the big question, now that they see the quality scoring is decently good, is why did the error rate spike up to 60%
698 00:55:42.070 ⇒ 00:55:44.960 Amber Lin: in this past week?
699 00:55:46.060 ⇒ 00:55:47.630 Amber Lin: Oh, 40%
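To answer why the error rate spiked this past week, a week-over-week breakdown is the natural first cut; `event_time` is an assumed timestamp column, and `is_error` a stand-in for the dashboard's error flag:

```sql
-- Week-over-week error rate, to localize the spike.
SELECT
  DATE_TRUNC('week', event_time) AS week,
  COUNT(*) AS total_exchanges,
  AVG(CASE WHEN is_error THEN 1.0 ELSE 0.0 END) AS error_rate
FROM exchanges
WHERE NOT needs_escalation
GROUP BY 1
ORDER BY 1;
```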
700 00:56:05.960 ⇒ 00:56:07.230 Uttam Kumaran: Expression. Okay.
701 00:56:26.110 ⇒ 00:56:37.499 Uttam Kumaran: Okay, so the one thing I'm doing is cleaning this up, right? Thumbs up, thumbs down, incorrect answer as well. What I'm gonna do here is I'm gonna go to
702 00:56:37.870 ⇒ 00:56:44.530 Uttam Kumaran: incorrect answer, I'm gonna do expression, and I'm gonna use this function called INITCAP.
703 00:56:44.730 ⇒ 00:56:46.789 Uttam Kumaran: It's gonna capitalize the first letter.
704 00:56:47.563 ⇒ 00:56:48.489 Uttam Kumaran: In it.
705 00:56:48.490 ⇒ 00:56:49.020 Amber Lin: Oh!
706 00:56:49.020 ⇒ 00:56:49.830 Uttam Kumaran: There.
707 00:56:50.650 ⇒ 00:56:51.949 Uttam Kumaran: What was the error here?
708 00:56:55.380 ⇒ 00:56:58.720 Uttam Kumaran: Oh, shit! Never mind what I’m gonna do is do.
709 00:56:59.210 ⇒ 00:57:01.539 Uttam Kumaran: Oh, God! This is a different dialect!
710 00:57:03.010 ⇒ 00:57:05.070 Amber Lin: So we can't do the INITCAP?
711 00:57:05.210 ⇒ 00:57:09.779 Uttam Kumaran: No, it's just not running in Snowflake. It's running in DuckDB,
712 00:57:10.390 ⇒ 00:57:14.939 Uttam Kumaran: so the function is different. But I’m just gonna ask you here.
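The failure here is a SQL dialect difference: INITCAP is a Snowflake function (it capitalizes the first letter of each word), and the dashboard's expressions run in DuckDB, where, at least in the version in use here, it isn't available. A portable alternative for capitalizing just the first letter; `feedback` is an assumed column name:

```sql
-- Snowflake: INITCAP(feedback). A DuckDB-friendly equivalent for the first letter only:
SELECT upper(substr(feedback, 1, 1)) || lower(substr(feedback, 2)) AS feedback_clean
FROM exchanges;
```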
713 00:57:17.430 ⇒ 00:57:20.940 Uttam Kumaran: Well, let’s see, I don’t think it’s gonna do the right thing. Let’s see.
714 00:57:26.830 ⇒ 00:57:30.389 Uttam Kumaran: What I can do is go here. Where is this error?
715 00:57:34.100 ⇒ 00:57:41.030 Uttam Kumaran: Oh, is there... oh, it's already okay, normal. Okay, that's the needs escalation, true/false.
716 00:57:42.070 ⇒ 00:57:42.910 Uttam Kumaran: Wow!
717 00:57:44.060 ⇒ 00:57:47.570 Uttam Kumaran: Metrics, expression.
718 00:58:15.710 ⇒ 00:58:18.899 Uttam Kumaran: Why is this true? False? Why is this true? False?
719 00:58:19.140 ⇒ 00:58:22.430 Uttam Kumaran: Oh, detail. Okay, let’s do one more thing. Details, feedback
720 00:58:31.460 ⇒ 00:58:34.220 Miguel de Veyra: Uttam, does Annie know how to work with Rill?
721 00:58:34.500 ⇒ 00:58:37.709 Uttam Kumaran: No, she’ll learn, though, so it’s not
722 00:58:38.400 ⇒ 00:58:41.450 Amber Lin: It doesn't seem like it.
723 00:58:41.730 ⇒ 00:58:48.790 Uttam Kumaran: I mean, it shouldn't take long. I've been doing this now for a while, so it's quick for me. But
724 00:58:49.190 ⇒ 00:58:49.880 Amber Lin: Yeah.
725 00:58:51.820 ⇒ 00:58:54.835 Uttam Kumaran: The other things I’m gonna do here is I’m gonna put
726 00:59:11.180 ⇒ 00:59:15.540 Amber Lin: Hmm, yeah. The total exchanges should just be down there
727 00:59:15.710 ⇒ 00:59:17.230 Uttam Kumaran: See? Yeah.
728 00:59:17.550 ⇒ 00:59:21.939 Amber Lin: So the quality score, error rate, and time should be on top.
729 00:59:32.880 ⇒ 00:59:34.700 Uttam Kumaran: Why did it change stuff here?
730 00:59:37.600 ⇒ 00:59:40.069 Uttam Kumaran: Oh, am I looking at the same thing? Oh, damn
731 00:59:44.420 ⇒ 00:59:48.880 Uttam Kumaran: okay, the last thing that we can do here is
732 00:59:49.840 ⇒ 00:59:53.260 Uttam Kumaran: preview. This. Another thing that you can do is
733 01:00:06.150 ⇒ 01:00:07.710 Amber Lin: 15 min left
734 01:00:12.320 ⇒ 01:00:15.569 Miguel de Veyra: Are we gonna start doing data work for ABC, too?
735 01:00:17.120 ⇒ 01:00:18.670 Uttam Kumaran: Yes.
736 01:00:18.670 ⇒ 01:00:19.660 Miguel de Veyra: Nice
737 01:00:19.930 ⇒ 01:00:20.790 Uttam Kumaran: Yeah.
738 01:00:21.960 ⇒ 01:00:23.680 Amber Lin: Are they gonna pay us extra
739 01:00:23.950 ⇒ 01:00:26.078 Miguel de Veyra: Of course, of course.
740 01:00:28.260 ⇒ 01:00:29.380 Uttam Kumaran: I think so.
741 01:00:29.980 ⇒ 01:00:31.106 Uttam Kumaran: Hope so.
742 01:00:32.540 ⇒ 01:00:34.029 Uttam Kumaran: That would seem fair
743 01:00:36.340 ⇒ 01:00:38.852 Miguel de Veyra: That’s the ideal scenario
744 01:00:39.860 ⇒ 01:00:43.339 Uttam Kumaran: So one thing you can do here is you can start adding canvas. So this is maybe
745 01:00:43.340 ⇒ 01:00:43.960 Amber Lin: Hmm.
746 01:00:43.960 ⇒ 01:00:46.229 Uttam Kumaran: It should take a second to play around with this.
747 01:00:46.610 ⇒ 01:00:48.549 Uttam Kumaran: You can start to do a
748 01:00:52.018 ⇒ 01:00:54.180 Uttam Kumaran: how do you get this in here?
749 01:00:55.060 ⇒ 01:00:56.820 Uttam Kumaran: There's a plus button somewhere.
750 01:01:12.980 ⇒ 01:01:15.990 Amber Lin: Is canvas drag and drop, or do we have to code that, too?
751 01:01:17.161 ⇒ 01:01:23.500 Uttam Kumaran: Canvas is drag and drop, although I’m not sure how this is entirely working. Right now, I’m just gonna commit this
752 01:01:23.500 ⇒ 01:01:24.100 Amber Lin: Yeah.
753 01:02:18.760 ⇒ 01:02:22.310 Uttam Kumaran: Okay, cool. Looks like we’re good.
754 01:02:22.890 ⇒ 01:02:23.330 Miguel de Veyra: Yeah.
755 01:02:34.550 ⇒ 01:02:35.579 Uttam Kumaran: What is going on
756 01:02:35.580 ⇒ 01:02:43.070 Amber Lin: Issue in Linear to recheck the evaluations.
757 01:02:43.750 ⇒ 01:02:50.630 Uttam Kumaran: But I don't know how to go do a canvas.
758 01:02:51.760 ⇒ 01:02:53.960 Amber Lin: We'll ask the Rill people.
759 01:02:54.150 ⇒ 01:02:55.940 Amber Lin: We’ll do that. It’s okay.
760 01:02:58.260 ⇒ 01:02:59.680 Amber Lin: It’s also kind of new.
761 01:03:00.490 ⇒ 01:03:02.640 Amber Lin: So maybe it’s not updated yet.
762 01:03:03.830 ⇒ 01:03:07.832 Uttam Kumaran: Just did the updates. I don’t know where it is
763 01:03:15.360 ⇒ 01:03:20.200 Uttam Kumaran: So I want to show them this. This looks great; that's what I want.
764 01:03:37.330 ⇒ 01:03:41.580 Amber Lin: Okay. I guess my question is, is this dashboard pushed
765 01:03:43.060 ⇒ 01:03:43.650 Uttam Kumaran: Yeah.
766 01:03:45.010 ⇒ 01:03:47.340 Amber Lin: Oh, so if I refresh, I can see it
767 01:03:47.340 ⇒ 01:03:48.150 Uttam Kumaran: Yeah.
768 01:03:48.150 ⇒ 01:03:49.020 Amber Lin: Oh, okay.
769 01:03:50.220 ⇒ 01:03:50.880 Amber Lin: Bye.
770 01:04:04.860 ⇒ 01:04:05.690 Amber Lin: hmm
771 01:04:10.910 ⇒ 01:04:11.480 Uttam Kumaran: What?
772 01:04:13.280 ⇒ 01:04:15.710 Amber Lin: Doesn't seem to be there yet for me.
773 01:04:16.000 ⇒ 01:04:18.680 Amber Lin: Or are there any filters I should add on?
774 01:04:18.910 ⇒ 01:04:21.830 Uttam Kumaran: No, I I see I see it here. I think
775 01:04:24.290 ⇒ 01:04:24.960 Miguel de Veyra: Alright!
776 01:04:26.100 ⇒ 01:04:29.750 Amber Lin: Is it pushed from your localhost, or the...
777 01:04:30.490 ⇒ 01:04:33.380 Amber Lin: oh, can you see, Miguel, can you guys see
778 01:04:33.780 ⇒ 01:04:34.880 Miguel de Veyra: Check, in
779 01:04:35.110 ⇒ 01:04:37.080 Casie Aviles: Yeah, I could see the changes
780 01:04:37.080 ⇒ 01:04:41.380 Amber Lin: Oh, okay, so it’s my problem here. Maybe I’m on on a different thing.
781 01:04:42.030 ⇒ 01:04:43.809 Miguel de Veyra: Oh, yeah, I can see it.
782 01:04:45.010 ⇒ 01:04:49.040 Amber Lin: Can you send the link in the Slack, please?
783 01:04:52.390 ⇒ 01:04:53.080 Casie Aviles: Sure.
784 01:06:31.950 ⇒ 01:06:35.689 Uttam Kumaran: Okay, I gotta drop to work on this next step.
785 01:06:35.690 ⇒ 01:06:37.520 Amber Lin: Go ahead. Thank you for it. Thank you.
786 01:06:37.520 ⇒ 01:06:39.840 Uttam Kumaran: Ask the Rill team about this, because I don't know; it should be working.
787 01:06:39.840 ⇒ 01:06:44.890 Amber Lin: Okay, yeah. Let me ask them about that. Don’t worry. I’ll technology
788 01:06:44.890 ⇒ 01:06:46.289 Uttam Kumaran: Okay, cool. Makes sense.
789 01:06:46.290 ⇒ 01:06:47.590 Miguel de Veyra: Thanks. Everyone.
790 01:06:47.590 ⇒ 01:06:48.080 Amber Lin: Thank you.
791 01:06:48.080 ⇒ 01:06:49.680 Amber Lin: Bye, bye.