Meeting Title: AI Team | Daily Standup
Date: 2025-03-26
Meeting participants: Janna Wong, Amber Lin, Janiece Garcia, Miguel de Veyra, Casie Aviles


WEBVTT

1 00:02:21.530 00:02:22.560 Amber Lin: Hi, team

2 00:02:26.210 00:02:26.919 Casie Aviles: Hey! Good morning!

3 00:02:27.760 00:02:33.850 Amber Lin: Good morning, everybody! Oh, Janna, welcome back

4 00:02:35.030 00:02:35.850 Janna Wong: Hello!

5 00:02:36.650 00:02:40.640 Amber Lin: Oh, okay.

6 00:02:40.940 00:02:47.090 Amber Lin: I think Janiece is coming. But let’s just get started, and then we’ll check with her on anything that we need.

7 00:02:47.770 00:02:50.649 Amber Lin: Let me pull up Linear

8 00:02:57.060 00:02:57.940 Amber Lin: right?

9 00:03:06.160 00:03:15.000 Amber Lin: So, quickly, on Linear we have

10 00:03:17.330 00:03:20.259 Amber Lin: We’ve grown it quite a bit. So essentially, we just have

11 00:03:21.380 00:03:26.440 Amber Lin: the last few. Oh, great, I see that the other,

12 00:03:26.990 00:03:30.539 Amber Lin: the other ones are in In Review. So

13 00:03:31.190 00:03:35.010 Amber Lin: let’s start with Casie. How was yesterday?

14 00:03:37.120 00:03:40.810 Casie Aviles: I’m sorry, which ticket are we starting with?

15 00:03:41.130 00:03:46.029 Amber Lin: Oh, here, let me filter by your tickets.

16 00:03:50.760 00:03:51.490 Amber Lin: Hmm!

17 00:03:52.910 00:03:57.010 Casie Aviles: Oh, okay, okay. I’ll start with the ones under review.

18 00:03:58.630 00:04:03.063 Casie Aviles: Okay, so, we did the Slack alerts.

19 00:04:04.650 00:04:13.019 Casie Aviles: Yeah. So for the Slack alerts, we actually have the logs coming over now. I think Miguel added you to the channel

20 00:04:13.797 00:04:19.149 Amber Lin: Yeah, let me go check. I think I know, Miguel showed me yesterday.

21 00:04:19.300 00:04:22.459 Amber Lin: Let me check if I have it. Yeah, I have ABC logs.

22 00:04:23.270 00:04:25.080 Miguel de Veyra: Yeah, book, trip.

23 00:04:26.360 00:04:32.060 Casie Aviles: Yeah. And then there’s the spreadsheet functionality. But I think

24 00:04:32.622 00:04:35.039 Casie Aviles: we have bugs over there. So that’s

25 00:04:35.360 00:04:42.970 Casie Aviles: something we still have to fix. But at least the errors are being logged in the database, and also

26 00:04:43.110 00:04:44.540 Casie Aviles: in Slack. So

27 00:04:45.150 00:04:51.620 Amber Lin: Yeah, that’s fantastic. I mean, that’s kind of all they wanted this week. So I think

28 00:04:51.740 00:04:55.700 Amber Lin: that will satisfy them for this Friday meeting. So that’s awesome.

29 00:04:56.991 00:05:14.629 Casie Aviles: Yeah. So moving on to the other 2 tickets that I have, which are the, wait, sorry, incorrect answers and the thumbs data. These are in review; they’re not live yet, but I can show how it looks right now.

30 00:05:14.630 00:05:17.759 Amber Lin: Let me let everyone share a screen. There you go.

31 00:05:19.590 00:05:20.590 Amber Lin: Okay, cool.

32 00:05:23.590 00:05:26.360 Miguel de Veyra: Hey? There’s like 2 big blockers in your screen

33 00:05:26.800 00:05:27.608 Casie Aviles: Yeah, I don’t know.

34 00:05:27.810 00:05:28.230 Miguel de Veyra: Okay.

35 00:05:28.230 00:05:29.249 Casie Aviles: Rid of these.

36 00:05:29.480 00:05:33.059 Casie Aviles: I always have these. I’m not sure how to get rid of these. Wait, let me.

37 00:05:33.670 00:05:34.870 Casie Aviles: Okay. Is it better now?

38 00:05:34.870 00:05:37.090 Amber Lin: That’s the that’s the zoom call thing

39 00:05:37.090 00:05:38.979 Miguel de Veyra: Yeah, not not sympathy. Yeah.

40 00:05:38.980 00:05:40.150 Amber Lin: But now it’s good

41 00:05:40.480 00:05:41.195 Casie Aviles: Okay,

42 00:05:42.610 00:05:49.810 Casie Aviles: Right? So this is the dashboard right now. And yeah, this is just running locally for now, okay?

43 00:05:50.050 00:05:56.090 Casie Aviles: And this is how it would look. So as you can see, this is the error rate. It’s 80%. But

44 00:05:56.645 00:05:58.919 Casie Aviles: yeah, I wouldn’t say this is representative

45 00:06:00.120 00:06:12.730 Casie Aviles: Yes, I wouldn’t say this is representative of the overall performance of the bot, since these are the tests that Janiece was doing this morning. So yeah, you

46 00:06:12.990 00:06:15.660 Casie Aviles: could see some of the tests earlier yourself.

47 00:06:16.320 00:06:22.289 Casie Aviles: Yeah, these were her tests. She was testing, I think, the questions where the bot had an

48 00:06:22.590 00:06:24.680 Casie Aviles: inaccurate answer

49 00:06:24.870 00:06:26.020 Amber Lin: Thanks, Steve.

50 00:06:27.510 00:06:30.480 Casie Aviles: Yeah, sorry.

51 00:06:32.060 00:06:36.600 Casie Aviles: But yeah, so this is how it would look. And then,

52 00:06:37.330 00:06:40.609 Casie Aviles: we also have the quality score. So this is supposed,

53 00:06:42.540 00:06:45.849 Casie Aviles: Yes, this is supposed to be the summary score. So yeah,

54 00:06:46.730 00:06:50.130 Casie Aviles: It’s pretty low right now, for the same reason as the error rate.

55 00:06:50.130 00:06:50.510 Amber Lin: Great

56 00:06:50.510 00:06:53.370 Casie Aviles: So it’s going to be one to 10.

57 00:06:55.040 00:06:59.209 Casie Aviles: What about the percentage of answers we got wrong?

58 00:06:59.500 00:07:00.889 Amber Lin: Say on the top

59 00:07:01.822 00:07:04.599 Casie Aviles: That would be the error rate, yeah.

60 00:07:05.180 00:07:09.560 Amber Lin: Oh, okay, okay, I see. I see that makes a lot more sense.

61 00:07:09.760 00:07:13.970 Amber Lin: Let’s scroll down to the bottom. We have the thumbs-up and thumbs-down data

62 00:07:15.150 00:07:19.640 Casie Aviles: Yes. But this is just also my test data and

63 00:07:19.640 00:07:20.240 Amber Lin: Hmm.

64 00:07:20.660 00:07:29.029 Casie Aviles: I have another database with other test thumbs-up, thumbs-down data, but I didn’t include that. I just added this one to show how it would look

65 00:07:29.660 00:07:36.719 Amber Lin: Sounds good. And then later on, I know it’ll be connected to all the Google Chat, like everything will sync together. But this is awesome.

66 00:07:38.040 00:07:38.630 Casie Aviles: Yeah.

67 00:07:39.030 00:07:39.850 Amber Lin: Hmm.

68 00:07:40.370 00:07:47.979 Casie Aviles: So actually, I have a question for Janiece. I’m just wondering if the

69 00:07:48.190 00:07:52.720 Casie Aviles: thumbs-up, thumbs-down feature is actually available already on your end.

70 00:07:53.590 00:07:56.340 Casie Aviles: We did. Oh, yeah, sorry. Go ahead

71 00:07:56.340 00:08:00.149 JanieceGarcia: No, it’s okay. Sorry. I’m not

72 00:08:00.520 00:08:03.570 JanieceGarcia: seeing it, I mean, unless I missed it, like an emoji

73 00:08:04.800 00:08:11.139 Casie Aviles: Oh, okay, okay, yeah. So yeah, we did send in the code. So I guess

74 00:08:13.050 00:08:13.570 Amber Lin: Yeah, look.

75 00:08:13.570 00:08:14.350 Casie Aviles: Follow up!

76 00:08:14.540 00:08:16.869 Amber Lin: Do something down. Check.

77 00:08:18.810 00:08:21.410 Amber Lin: And okay.

78 00:08:32.909 00:08:33.850 Amber Lin: good issue.

79 00:08:33.850 00:08:40.329 JanieceGarcia: And with the thumbs up and thumbs down, does that mean they’ll also be able to put in

80 00:08:40.620 00:08:42.490 JanieceGarcia: like a response or feedback?

81 00:08:44.150 00:08:48.370 Casie Aviles: Yes, but the feature is for the thumbs down.

82 00:08:48.560 00:08:57.590 Casie Aviles: Yeah. So if you tap the thumbs down, yeah, you’ll get the chance to input, you know, a more detailed response

83 00:08:58.100 00:08:59.290 JanieceGarcia: Okay. Feedback.

84 00:09:00.280 00:09:01.280 JanieceGarcia: Perfect. Okay.

85 00:09:03.570 00:09:06.849 Amber Lin: That’s awesome. Let me go back to my screen.

86 00:09:07.050 00:09:10.640 Amber Lin: So we have that.

87 00:09:11.240 00:09:14.380 Amber Lin: So let’s I’m gonna pull that here.

88 00:09:16.120 00:09:18.610 Amber Lin: Let me pull that to

89 00:09:23.280 00:09:27.160 Amber Lin: There, you, actually, yeah, it should be in PR review, too.

90 00:09:27.540 00:09:34.489 Amber Lin: My bad. Anyway, so I think that’s all for the dashboard, I think

91 00:09:35.280 00:09:37.499 Miguel de Veyra: Yeah, that’s pretty much it

92 00:09:38.468 00:09:43.310 Amber Lin: Okay? Alright. Let’s look at.

93 00:09:46.690 00:09:58.149 Amber Lin: Yeah. I think we, after we complete adding that to real, actually, I’m gonna move the one of real feats thumbs up with thumbs down back.

94 00:09:58.530 00:10:03.269 Amber Lin: just so that we can add the add the current data.

95 00:10:04.040 00:10:09.510 Amber Lin: So I’m gonna move it back here so so that we know what we’re looking at.

96 00:10:10.500 00:10:12.400 Amber Lin: Let’s look at

97 00:10:18.520 00:10:19.270 Amber Lin: here.

98 00:10:22.880 00:10:23.720 Amber Lin: Great

99 00:10:25.050 00:10:26.149 Miguel de Veyra: And no more tasks.

100 00:10:26.150 00:10:27.110 Amber Lin: So I’m gonna go.

101 00:10:28.040 00:10:30.260 Amber Lin: That was yesterday.

102 00:10:30.260 00:10:31.050 Miguel de Veyra: Yeah.

103 00:10:32.928 00:10:44.089 Miguel de Veyra: yeah, that’s basically just the Slack and the spreadsheet, which is pretty much done. And then, of course, now, if you go to the ABC logs, you can actually see the answers that Janiece

104 00:10:44.800 00:10:48.850 Miguel de Veyra: asked the bot, but where the bot didn’t give like a proper response. So

105 00:10:49.170 00:10:52.229 Miguel de Veyra: it’s working pretty well, I would say

106 00:10:52.700 00:10:55.490 Amber Lin: Do you want to show Janiece the spreadsheet that we logged?

107 00:10:55.490 00:10:58.409 Miguel de Veyra: Oh, yes, yes, wait. Sorry.

108 00:10:58.880 00:11:02.700 Miguel de Veyra: Let me let me just find it.

109 00:11:15.220 00:11:16.110 Miguel de Veyra: Insurance.

110 00:11:17.470 00:11:20.060 Miguel de Veyra: Oh, actually, I just I’ll just send the.

111 00:11:21.620 00:11:23.409 Miguel de Veyra: I’ll just send the link here.

112 00:11:26.460 00:11:28.930 Miguel de Veyra: Yeah, but this one we still have to.

113 00:11:30.310 00:11:31.030 Miguel de Veyra: Oh, wait!

114 00:11:32.090 00:11:33.310 Miguel de Veyra: Restricted

115 00:11:34.720 00:11:39.580 Amber Lin: Oh, I think you might be sending it in the Read AI meeting notes chat. I did that last time

116 00:11:39.890 00:11:46.209 Miguel de Veyra: Yeah. Yep, yep. Janiece, I don’t think I can make it public

117 00:11:46.210 00:11:49.549 Amber Lin: The meeting group chat. It’s the top one

118 00:11:49.720 00:11:53.850 Miguel de Veyra: Yep. I sent it to her email, and then I’ll just send it to you, too.

119 00:11:54.660 00:11:55.170 Amber Lin: Great.

120 00:11:56.800 00:12:00.959 Amber Lin: I mean, we can send it in the Slack, if you can, and they can open it up

121 00:12:04.970 00:12:07.050 Miguel de Veyra: But yeah, let me just share my screen

122 00:12:07.050 00:12:08.569 Amber Lin: Okay. Sounds good.

123 00:12:08.570 00:12:13.820 Miguel de Veyra: So ideally, it’s gonna look something like this, where, you know,

124 00:12:14.110 00:12:17.580 Miguel de Veyra: this was one of our tests where? Okay, this is here

125 00:12:17.950 00:12:18.890 JanieceGarcia: So.

126 00:12:19.190 00:12:22.110 Miguel de Veyra: what’s the input, the output, which is the expected answer,

127 00:12:22.360 00:12:27.190 Miguel de Veyra: right? And then, basically, this is just color coding on how bad the answer is.

128 00:12:27.830 00:12:29.039 Miguel de Veyra: And then who the user is.

129 00:12:29.380 00:12:34.890 Miguel de Veyra: Right now, we’re still figuring some stuff here. But if you go to ABC logs

130 00:12:34.890 00:12:36.140 Casie Aviles: Yeah, there’s a bug

131 00:12:36.530 00:12:38.730 Miguel de Veyra: Yeah, there’s a bug right now that we need to fix.

132 00:12:39.570 00:12:43.299 Miguel de Veyra: But yeah, it should be pretty fast to fix.

133 00:12:45.200 00:12:51.010 Miguel de Veyra: Okay, is this something you find helpful, ladies?

134 00:12:52.170 00:13:11.349 JanieceGarcia: It actually is, truthfully, I think, because that’s kind of what I’m focusing on: the errors, and trying to figure out, okay, is it something that we need to do? Or, like, if you look on the golden data sheet, with what I was doing this morning, the pharaoh ants: before, it was saying

135 00:13:11.530 00:13:26.270 JanieceGarcia: that yes, it would go ahead and be covered in the general pest control service as long as it’s inside. But now it’s telling us that we would need to actually add that to the service. And so

136 00:13:27.650 00:13:39.110 JanieceGarcia: that’s what I wanted to ask you guys: you know, when we had a correct answer, but maybe it was just like a word off, or we just needed to add a word, and now it’s giving

137 00:13:39.240 00:13:41.152 JanieceGarcia: a wrong answer.

138 00:13:43.050 00:13:51.950 JanieceGarcia: And I noted a couple of those. There was that one, and then there’s also the one like on the discount. That one as well.

139 00:13:52.980 00:13:55.179 JanieceGarcia: and I don’t know if you guys get that

140 00:13:55.180 00:13:56.060 Miguel de Veyra: Yes, yes.

141 00:13:56.636 00:14:01.269 Miguel de Veyra: Did we, did you change anything on the central doc for the pharaoh ants?

142 00:14:01.270 00:14:07.580 JanieceGarcia: So I have not changed anything on the central doc for the pharaoh ants

143 00:14:07.580 00:14:09.190 Miguel de Veyra: Okay, we’ll look into that

144 00:14:12.490 00:14:20.920 JanieceGarcia: But that’s, I know, Yvette had gone through and she updated, but all she was doing was changing some wording, not actual protocols.

145 00:14:21.700 00:14:22.640 Miguel de Veyra: Okay. Okay.

146 00:14:29.410 00:14:45.549 Amber Lin: Sounds good. Janiece, when you saw that on the golden data sheet, were you able to take a note somewhere, so we can see which one is wrong? Okay, that’s perfect. Then we’ll go in with your notes, and we’ll make the changes, and we’ll see how we can edit

147 00:14:46.100 00:15:05.170 JanieceGarcia: Okay. I don’t know, I think Casie was, Casie’s in there as well, and so he was probably noticing. But I was updating, you know, either where it needs the bot answer updated, or something that I need to do. And then in the extra note, the J column, I’m putting notes in there as well for you guys

148 00:15:05.960 00:15:15.130 Amber Lin: Sounds good, so we’ll put down a task to check the incorrect answers

149 00:15:20.990 00:15:21.950 Amber Lin: shoot

150 00:15:22.320 00:15:26.980 Miguel de Veyra: Yeah, it seems like the pharaoh ants wording is no longer on the central doc.

151 00:15:29.410 00:15:30.200 JanieceGarcia: Okay.

152 00:15:30.450 00:15:31.769 Miguel de Veyra: Yeah, maybe that’s right.

153 00:15:33.560 00:15:34.270 JanieceGarcia: Look at that!

154 00:15:34.440 00:15:38.739 Miguel de Veyra: Yeah, I’ll recheck the files and see what happened there

155 00:15:47.040 00:15:47.920 Amber Lin: Great.

156 00:15:50.580 00:16:00.580 Amber Lin: Let’s see what more to do this cycle. Oh, Janiece, were you able to get a list of CSRs?

157 00:16:01.380 00:16:06.470 JanieceGarcia: I sent that over to her. I did let her know; she is working on the tactics.

158 00:16:06.991 00:16:17.930 JanieceGarcia: And I think she’s gonna end up doing all 5 of the new CSRs that we have coming in. So for sure, we’ll definitely have that by Friday

159 00:16:18.420 00:16:24.609 Amber Lin: Okay, that’s good to know. I’m just gonna note that down. And then.

160 00:16:30.550 00:16:47.829 Amber Lin: oh, I think one other thing we wanted to ask today, ’cause for the CSRs: we just wanted to know, when the bot doesn’t have an answer, what should the CSRs do? How are they going to escalate, essentially?

161 00:16:48.660 00:16:58.209 JanieceGarcia: If they, if the bot does not have the answer, I would point them in the direction of their trainer

162 00:16:59.630 00:17:00.870 JanieceGarcia: and supervisor.

163 00:17:03.120 00:17:04.890 JanieceGarcia: So then we can actually look at that

164 00:17:06.050 00:17:06.970 Amber Lin: Sounds good.

165 00:17:07.535 00:17:31.649 Amber Lin: Is there any particular thing you want us to note down in that process? Should we note that they went to the trainer? Do we want to tell them which trainer it is? Because that will require some extra processes, and we’ll push some other tasks back. If you just want the basic "okay, we don’t know this," then it’ll be pretty fast

166 00:17:32.130 00:17:40.350 JanieceGarcia: Yeah, no, just: if the bot doesn’t know, please see your supervisor slash trainer

167 00:17:40.930 00:17:42.080 Amber Lin: Hmm, okay.

168 00:17:42.470 00:17:50.650 JanieceGarcia: Very, very vague. That way it’s pointed out, you know, and we can actually use that across the board once we’re able to push it out to the other departments, too.

169 00:17:51.110 00:17:59.020 Amber Lin: Okay, sounds good. That is great. So essentially, that will just be an extra answer when the bot says, "Oh, I don’t have this information"

170 00:18:00.540 00:18:05.950 Amber Lin: Great. Let me let’s remove that.

171 00:18:11.220 00:18:14.130 Amber Lin: Oh, let’s see

172 00:18:18.220 00:18:20.750 Amber Lin: Casie and Miguel, what do you think we have

173 00:18:21.110 00:18:24.969 Amber Lin: to work on? So I think we pretty much

174 00:18:26.340 00:18:28.240 Miguel de Veyra: Yeah, I think.

175 00:18:28.240 00:18:29.910 Amber Lin: It’s a lot of tweaks

176 00:18:29.910 00:18:44.410 Miguel de Veyra: Yeah, basically, I think we have the majority of it. Just the real dashboard, finish it up, and then the sheets. And then I guess the ongoing work would be just, we’re gonna monitor the ABC logs and the sheets

177 00:18:44.560 00:18:58.349 Miguel de Veyra: and then see, you know, where we can help. I think, Janiece, what could be helpful is, for example, if the bot doesn’t know something, we just copy the keywords, and then we paste it on the central doc and Ctrl+F. If it’s not there, the bot wouldn’t know

178 00:18:59.540 00:19:19.270 JanieceGarcia: Okay, so that’s perfect. And then I did see the email, Amber, that you had sent, asking if I could meet with you and Miguel, which I’d be more than happy to do. So if you guys don’t want to do an early one, which I completely understand, I could do Monday, anytime in the afternoon

179 00:19:20.356 00:19:30.160 Amber Lin: Early would be better, because Miguel is in a different time zone. Would you be free this week, or would it have to be next week?

180 00:19:31.867 00:19:33.810 JanieceGarcia: I know I’m

181 00:19:33.810 00:19:37.269 Amber Lin: Tomorrow after the standup? That works too

182 00:19:39.850 00:19:44.409 JanieceGarcia: So this week I’m not able to. I have a training class that starts at 9

183 00:19:44.410 00:19:45.820 Amber Lin: Oh, let me see!

184 00:19:46.121 00:19:50.349 JanieceGarcia: ’Cause I’m training the new hires. So from 9 to 4 this week, I’m

185 00:19:50.350 00:19:50.900 Amber Lin: Okay.

186 00:19:51.380 00:19:52.540 JanieceGarcia: I’m stacked

187 00:19:53.160 00:19:55.310 Amber Lin: Okay, don’t worry. Don’t worry.

188 00:19:55.975 00:20:00.760 Amber Lin: Essentially, it’s for us to prepare for the next cycle, so that we’re okay

189 00:20:00.760 00:20:08.319 Amber Lin: on doing what we have to do, and what’s acceptable and what’s done. Let’s see, next week

190 00:20:09.530 00:20:10.280 JanieceGarcia: I can.

191 00:20:10.400 00:20:14.870 JanieceGarcia: I would be able to do next week after the meeting if you wanted to.

192 00:20:15.800 00:20:21.969 JanieceGarcia: and that would give me more testing time, too, because I am planning on testing this afternoon, and then some Friday afternoon

193 00:20:22.440 00:20:32.599 Amber Lin: Sure, okay. I’ll send you the meeting link, or we can, next week, just stay in the meeting. But I’ll send you an invite, maybe

194 00:20:33.140 00:20:33.750 JanieceGarcia: Okay.

195 00:20:33.930 00:20:34.700 Amber Lin: Okay, that’s fine.

196 00:20:34.700 00:20:35.830 Amber Lin: Thank you so much.

197 00:20:36.410 00:20:37.160 JanieceGarcia: You’re welcome.

198 00:20:37.810 00:20:38.380 Amber Lin: Yeah.

199 00:20:38.798 00:20:48.059 Amber Lin: Miguel and Casie, is there anything Janna can work on? I know we have AI team planning, and we can talk about that a little bit later.

200 00:20:49.710 00:20:51.050 Miguel de Veyra: Yeah, I think later.

201 00:20:51.240 00:20:56.099 Amber Lin: Okay, sounds good. I mean, that’s all. That’s all for this meeting. Then thank you guys for coming

202 00:20:56.100 00:20:56.510 Miguel de Veyra: Amazing

203 00:20:56.510 00:20:57.160 JanieceGarcia: Thank you.

204 00:20:57.160 00:20:59.460 JanieceGarcia: Have a good day. Thank you. Have a great day bye.