Meeting Title: ABC | Ticket Grooming Date: 2025-04-29 Meeting participants: Miguel De Veyra, Casie Aviles, Amber Lin


WEBVTT

1 00:02:01.100 00:02:02.340 Amber Lin: With Lucy.

2 00:02:04.860 00:02:06.211 Casie Aviles: Hey? Amber? Good morning.

3 00:02:06.570 00:02:11.830 Amber Lin: Mindy, let me send one last message. I’ll be good.

4 00:03:06.630 00:03:09.238 Amber Lin: Let me share my screen.

5 00:03:10.220 00:03:13.299 Amber Lin: We’ll just go through the tickets.

6 00:03:15.820 00:03:16.820 Casie Aviles: Okay. Sure.

7 00:03:17.000 00:03:17.730 Amber Lin: Yeah.

8 00:03:18.600 00:03:26.314 Amber Lin: Oh, I saw the Casey. You did such a good job yesterday. I saw the agents and up Bam.

9 00:03:27.960 00:03:29.499 Casie Aviles: Do you think? Yeah, I mean.

10 00:03:29.500 00:03:41.330 Amber Lin: I wish I had it for other other teams, because I had to write a monthly summary for pool parts, and then I saw what you have for, madam. More damn! I wish I had it.

11 00:03:42.320 00:03:46.679 Casie Aviles: I mean, yeah, yeah, that’s the goal to have it for across all the other clients. So.

12 00:03:46.680 00:03:47.720 Amber Lin: Yeah.

13 00:03:48.320 00:03:49.330 Amber Lin: Oh.

14 00:03:50.250 00:04:01.570 Amber Lin: okay, back to my tickets. Yeah. Okay. And then we’ll have to do the AI team tickets again. I don’t know if Miguel’s joining, but honestly, he doesn’t have to. This is just

15 00:04:01.700 00:04:04.459 Amber Lin: one by one. This is probably just us.

16 00:04:05.330 00:04:05.910 Casie Aviles: Okay.

17 00:04:07.050 00:04:09.600 Amber Lin: Feedback to suggestion.

18 00:04:11.000 00:04:14.300 Amber Lin: Think we think we did it? I

19 00:04:14.680 00:04:19.279 Amber Lin: I need to know from Denise, like, how, yeah.

20 00:04:19.390 00:04:24.790 Amber Lin: okay? I did say that. But I didn’t. I forgot to check with them.

21 00:04:28.530 00:04:31.960 Amber Lin: Streamline AI code deployment.

22 00:04:32.780 00:04:33.779 Amber Lin: I don’t know.

23 00:04:35.920 00:04:38.810 Casie Aviles: Oh, okay, I think this one’s for Tim. Right?

24 00:04:39.260 00:04:40.170 Amber Lin: Yeah.

25 00:04:48.260 00:04:51.160 Amber Lin: Tim still hasn’t given us the API key.

26 00:04:55.060 00:04:59.939 Amber Lin: And so how? How do you think this should be done?

27 00:05:02.497 00:05:07.429 Casie Aviles: Honestly, I’m not familiar with the whole CI/CD thing.

28 00:05:08.910 00:05:09.590 Amber Lin: Okay.

29 00:05:14.665 00:05:19.864 Casie Aviles: Yeah. But I think this was a suggestion made by Utam to make sure that you know

30 00:05:20.480 00:05:24.610 Casie Aviles: our deployment is much, I guess better than

31 00:05:25.050 00:05:28.070 Casie Aviles: the initial one where we just send it to.

32 00:05:28.070 00:05:28.720 Amber Lin: Hmm.

33 00:05:28.720 00:05:30.190 Casie Aviles: Via email.

34 00:05:30.740 00:05:36.930 Amber Lin: I see. Right now we are just telling him the change, and he updates it, right?

35 00:05:38.200 00:05:40.330 Casie Aviles: Yeah, I believe this is on his side.

36 00:05:40.690 00:05:45.319 Amber Lin: I see so we should

37 00:05:46.300 00:05:51.949 Amber Lin: let me check when was the last time he sent a message in the channel.

38 00:05:58.520 00:06:02.830 Amber Lin: Oh, okay. So he did say he was going to do that.

39 00:06:04.400 00:06:04.960 Casie Aviles: Okay.

40 00:06:05.350 00:06:09.340 Amber Lin: So that would just be on him. But, Tim.

41 00:06:09.990 00:06:15.030 Amber Lin: let me just screenshot Tim’s message and put it in here.

42 00:06:25.140 00:06:33.370 Amber Lin: So I would say, this is blocked.

43 00:06:35.570 00:06:36.200 Casie Aviles: Okay.

44 00:06:38.285 00:06:46.230 Amber Lin: Now we have “looking at the outage.”

45 00:06:48.950 00:06:56.360 Amber Lin: What do you think it is in terms of priority, and

46 00:06:56.930 00:07:00.569 Amber Lin: like, how exactly are we gonna investigate it?

47 00:07:01.680 00:07:07.880 Casie Aviles: Okay, so in terms of priority, I think it could be medium.

48 00:07:08.400 00:07:09.000 Amber Lin: Hmm.

49 00:07:10.282 00:07:15.570 Casie Aviles: Yeah. And I guess what I would do here is, I will just look

50 00:07:15.690 00:07:22.940 Casie Aviles: more into like, what does this particular error mean. So I’m just gonna do some research and

51 00:07:24.109 00:07:26.420 Casie Aviles: I guess I’ll just take a look at

52 00:07:27.430 00:07:33.529 Casie Aviles: the executions, whether this happened, like, how often did this happen?

53 00:07:38.430 00:07:44.510 Casie Aviles: And I will also take a look at like what triggered this

54 00:07:45.690 00:07:48.769 Casie Aviles: so, like, what question did the CSR

55 00:07:48.900 00:07:51.070 Casie Aviles: ask the bot that triggered this.

56 00:07:51.970 00:07:55.350 Amber Lin: Hmm, yeah. I

57 00:07:56.064 00:08:05.040 Amber Lin: cause I know this happened around 2 times-ish. Janice had one, she had a case of it not responding, and then Shannon had a case

58 00:08:05.160 00:08:09.830 Amber Lin: as well. Was it all because the

59 00:08:10.310 00:08:13.960 Amber Lin: OpenAI, like, the ChatGPT model, went down.

60 00:08:16.109 00:08:25.660 Casie Aviles: Yeah, that was the initial. Yeah, initially, that’s what I think happened because, it doesn’t really like. When I tried to recreate the same questions that they did.

61 00:08:26.260 00:08:30.569 Casie Aviles: It didn’t really like. Yeah, this error didn’t really happen.

62 00:08:35.659 00:08:41.759 Casie Aviles: And yeah, sometimes it’s just that the bot would be, I mean, the LLM would be down. So

63 00:08:42.059 00:08:44.839 Casie Aviles: yeah, it happens unannounced that time. So.

64 00:08:45.260 00:08:51.410 Amber Lin: I see. So how can we proactively prevent that?

65 00:08:54.760 00:08:55.740 Casie Aviles: Yeah, yeah,

66 00:08:56.330 00:09:03.809 Casie Aviles: I’m not entirely sure yet how. I guess I just have, like, a rough idea. Like, maybe we could have

67 00:09:04.420 00:09:12.739 Casie Aviles: a way to switch models. So once, I guess, an error hits with, let’s say, GPT-4o,

68 00:09:12.910 00:09:16.879 Casie Aviles: then I guess we could switch to another model for the meantime.

69 00:09:17.760 00:09:21.210 Casie Aviles: But yeah, I’m not sure if I could do that in any time yet.
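The fallback idea Casie describes, retrying a backup model when the primary one errors, could be sketched roughly like this. The model names and the injected `call` function are illustrative placeholders, not the team’s actual workflow code:

```python
PRIMARY_MODEL = "gpt-4o"        # assumed primary model name
FALLBACK_MODEL = "gpt-4o-mini"  # assumed backup model name

def answer_with_fallback(prompt, call):
    """call(model, prompt) -> str is the real LLM invocation, injected here
    so the fallback logic stays independent of any particular SDK."""
    try:
        return call(PRIMARY_MODEL, prompt)
    except Exception as exc:
        # Surface a clearer message than "error in workflow": say which
        # model failed and why before switching.
        print(f"{PRIMARY_MODEL} failed ({exc!r}); trying {FALLBACK_MODEL}")
        return call(FALLBACK_MODEL, prompt)
```

The same `except` branch is a natural place to send the clearer alert Casie mentions later, since it already knows which model failed and with what error.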

70 00:09:21.700 00:09:23.870 Amber Lin: Hmm, I see.

71 00:09:29.280 00:09:30.500 Amber Lin: Okay. So there’s a.

72 00:09:30.500 00:09:31.780 Miguel de Veyra: Fallback model.

73 00:09:32.350 00:09:40.129 Amber Lin: Yeah. Hi, Miguel. What about say like.

74 00:09:40.660 00:09:45.019 Amber Lin: can it alert us when it can’t respond?

75 00:09:45.490 00:09:46.040 Amber Lin: Like.

76 00:09:46.040 00:09:49.679 Casie Aviles: Yeah, there is. Yeah, it’s already sending like an error. But.

77 00:09:49.680 00:09:51.030 Amber Lin: Oh, okay. Okay.

78 00:09:51.030 00:09:52.589 Casie Aviles: It’s not clear, like.

79 00:09:52.810 00:09:59.890 Casie Aviles: like, yeah, that’s one more thing I can work on, to clarify, to make it clear to us, like, what exactly happens.

80 00:10:02.660 00:10:07.900 Casie Aviles: Because it just says error in workflow. So we don’t know what actually happens.

81 00:10:11.010 00:10:25.250 Amber Lin: Let’s see, I think that’s a pretty good start, those are the 2 things that

82 00:10:25.650 00:10:32.359 Amber Lin: we can do to help with these, like, LLM outages.

83 00:10:35.510 00:10:36.450 Amber Lin: Cool.

84 00:10:37.280 00:10:42.720 Amber Lin: What would you say is the estimate if we wanna have that done?

85 00:10:43.480 00:10:46.280 Amber Lin: And now there’s like 2 tickets in there actually.

86 00:10:47.340 00:10:54.229 Casie Aviles: So for the error, that’s going to be quick, and then the fallback model, I think 2 points is reasonable

87 00:10:55.740 00:10:58.600 Casie Aviles: Fallback for all, for all, I mean.

88 00:10:59.070 00:10:59.500 Amber Lin: Oh!

89 00:10:59.500 00:11:00.610 Casie Aviles: For the entire ticket.

90 00:11:01.060 00:11:07.960 Amber Lin: Oh, okay, great I will.

91 00:11:09.010 00:11:10.240 Amber Lin: Fantastic.

92 00:11:17.970 00:11:19.030 Amber Lin: change.

93 00:11:22.350 00:11:29.230 Amber Lin: Sure. Why not? Here, this ZIP code thing.

94 00:11:30.160 00:11:33.329 Amber Lin: Oh, they said they were

95 00:11:34.580 00:11:39.599 Amber Lin: working on it, but that is a priority. I need to add more in that.

96 00:11:43.930 00:11:45.120 Amber Lin: Here.

97 00:11:45.980 00:11:48.750 Miguel de Veyra: Oh, I think there’s 1 thing we need to add here.

98 00:11:50.600 00:11:58.560 Miguel de Veyra: Basically because the spreadsheet we have is still, how do you say it?

99 00:11:58.860 00:12:04.230 Miguel de Veyra: It’s still the static one that they gave us like a month or 2 ago. It’s not the live one.

100 00:12:05.080 00:12:05.870 Amber Lin: This one.

101 00:12:08.280 00:12:11.910 Miguel de Veyra: Yeah, we need the access to their Google sheet, the live one.

102 00:12:14.230 00:12:17.569 Miguel de Veyra: or at least like a copy of it, like a a live copy.

103 00:12:17.880 00:12:19.150 Miguel de Veyra: I’m not sure how that works.

104 00:12:21.630 00:12:22.240 Amber Lin: Sounds.

105 00:12:22.240 00:12:27.930 Miguel de Veyra: So this ticket will probably take I don’t know. Probably half a day or a day.

106 00:12:28.630 00:12:34.690 Amber Lin: Okay, it stays here until we clarify more.

107 00:12:34.890 00:12:35.440 Miguel de Veyra: Yeah.

108 00:12:36.200 00:12:42.610 Amber Lin: And do we know where the script is related to that Google Sheet?

109 00:12:43.660 00:12:49.339 Miguel de Veyra: The thing is, right now, the way we did that is, we just copy-pasted it into GPT, and then, just, hey,

110 00:12:49.450 00:12:50.540 Miguel de Veyra: fix this.

111 00:12:52.200 00:12:57.080 Miguel de Veyra: It’s like very raw. We didn’t really do anything too special about it.

112 00:13:00.540 00:13:06.619 Amber Lin: I see. So we just retrieve the information by copy and paste. Essentially.

113 00:13:06.850 00:13:07.610 Miguel de Veyra: Yeah.
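Replacing the copy-paste retrieval with a pull from the live Google Sheet could look roughly like this. gspread is shown as one option; the sheet key, credential file, and column names are assumptions, and the fetch itself is an untested placeholder:

```python
def rows_to_context(rows):
    """Flatten a list of dicts (one per sheet row) into prompt-ready lines,
    replacing the manual copy-paste step."""
    return "\n".join(
        " | ".join(f"{key}: {value}" for key, value in row.items())
        for row in rows
    )

# Fetching the live data might look like this (requires a service account
# with access to the client's sheet -- placeholder, not verified here):
#
#   import gspread
#   gc = gspread.service_account(filename="creds.json")
#   rows = gc.open_by_key("SHEET_KEY").sheet1.get_all_records()
#   context = rows_to_context(rows)
```

This keeps the formatting step separate from the fetch, so the same flattening works whether the rows come from the live sheet or the old static copy.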

114 00:13:08.210 00:13:14.990 Amber Lin: Do we have a prompt that just says, don’t answer questions outside of pest?

115 00:13:19.520 00:13:22.359 Miguel de Veyra: I don’t think we do. Could we add that because

116 00:13:23.690 00:13:27.790 Miguel de Veyra: it’s gonna mess up the lawn services, the oh-by-the-ways.

117 00:13:28.060 00:13:38.000 Amber Lin: Oh, I see. Cause I think the question they’re getting is sometimes when they ask, like, pest,

118 00:13:38.280 00:13:41.490 Amber Lin: non-pest availability.

119 00:13:42.538 00:13:48.250 Amber Lin: It kind of gets confused. But I’ll I’ll get more details on that one.

120 00:13:49.480 00:13:50.140 Miguel de Veyra: Okay.

121 00:13:50.490 00:14:02.320 Amber Lin: Yeah, I should have asked. I totally forgot about that. And this morning, okay, website and advise.

122 00:14:04.220 00:14:05.780 Amber Lin: what is this?

123 00:14:10.960 00:14:11.930 Amber Lin: Oh.

124 00:14:13.620 00:14:15.680 Miguel de Veyra: Do we have a 1-on-1 call, Amber?

125 00:14:15.860 00:14:17.060 Amber Lin: Who don’t mind.

126 00:14:17.160 00:14:20.019 Miguel de Veyra: Should we hop on the other call, Utam might be waiting.

127 00:14:20.550 00:14:21.930 Amber Lin: The other. Call.

128 00:14:22.620 00:14:23.790 Miguel de Veyra: Yeah. The standup.

129 00:14:25.140 00:14:28.439 Amber Lin: Oh, I thought this was this, wait! Let me let me check.

130 00:14:29.120 00:14:29.850 Miguel de Veyra: Okay.

131 00:14:30.361 00:14:34.959 Amber Lin: He is not. He is on his sales block.

132 00:14:37.440 00:14:39.759 Amber Lin: Folk lock in right now.

133 00:14:40.740 00:14:42.939 Miguel de Veyra: Oh, okay, okay. So he probably won’t be joining.

134 00:14:42.940 00:14:46.010 Amber Lin: Yeah, he won’t be joining. We’ll go through this.

135 00:14:52.820 00:14:56.229 Amber Lin: I forgot what what this one was for when I created the ticket.

136 00:14:57.070 00:15:02.679 Amber Lin: so I guess it’s, like, raising issues proactively.

137 00:15:03.640 00:15:06.339 Casie Aviles: Yeah, I think this is about.

138 00:15:06.830 00:15:07.890 Miguel de Veyra: The spreadsheet.

139 00:15:09.587 00:15:13.160 Casie Aviles: When like, for example, sometimes there are some errors that

140 00:15:13.440 00:15:16.959 Casie Aviles: I guess we should have caught already with our evals.

141 00:15:17.170 00:15:17.790 Amber Lin: Hmm.

142 00:15:18.600 00:15:20.530 Casie Aviles: I think that’s what this is about. But.

143 00:15:20.530 00:15:21.450 Amber Lin: I see.

144 00:15:21.780 00:15:26.260 Casie Aviles: The the entire ticket, like. I’m not entirely sure yet how we want to.

145 00:15:26.260 00:15:30.337 Amber Lin: This is ChatGPT. You can ignore them.

146 00:15:31.480 00:15:32.400 Casie Aviles: Yeah.

147 00:15:53.510 00:15:57.079 Amber Lin: Okay, what is a way that we can do that?

148 00:16:06.600 00:16:09.820 Casie Aviles: I think, for now I could. Just

149 00:16:10.400 00:16:12.809 Casie Aviles: at the top of my head I could just take a look at

150 00:16:15.190 00:16:21.229 Casie Aviles: the questions that were considered as errors by the CSRs

151 00:16:21.450 00:16:25.319 Casie Aviles: and check, like, how did we score that

152 00:16:25.530 00:16:28.270 Casie Aviles: and see if there’s like a mismatch?

153 00:16:29.330 00:16:34.619 Casie Aviles: So I guess, in a way, it’s kind of tied to incorporating the thumbs down.

154 00:16:35.200 00:16:36.210 Amber Lin: Hmm.

155 00:16:37.790 00:16:39.639 Casie Aviles: To the quality score.
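The check Casie outlines, comparing CSR-flagged errors against eval scores to find mismatches, could be sketched like this. The field names and the pass threshold are assumptions for illustration, not the team’s actual log schema:

```python
def find_eval_mismatches(interactions, pass_threshold=3):
    """Return interactions the CSRs thumbed down but the evals scored as
    passing -- the mismatches worth reviewing for lenient or missing
    scoring logic.

    Each interaction is assumed to be a dict with 'question',
    'csr_flagged_error' (bool), and 'eval_score' (numeric, where scores at
    or above pass_threshold count as passing).
    """
    return [
        item for item in interactions
        if item["csr_flagged_error"] and item["eval_score"] >= pass_threshold
    ]
```

Anything this returns is a case where the eval disagreed with a human, which is exactly the set to inspect for criteria that are too lenient or missing.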

156 00:16:40.370 00:16:42.700 Amber Lin: Let me see, that’s great.

157 00:16:44.380 00:16:45.580 Amber Lin: So

158 00:16:48.870 00:16:51.939 Amber Lin: yeah. So I guess that’s a

159 00:16:52.710 00:16:56.749 Amber Lin: that’s the thing. That’s something that we need to

160 00:16:57.750 00:17:02.530 Amber Lin: do. And then there’s that! What else? From that?

161 00:17:04.230 00:17:10.489 Casie Aviles: Yeah. And, yeah, just make sure that there’s, like, a, what do you call this?

162 00:17:11.650 00:17:14.529 Miguel de Veyra: Like, you know, the evals would also.

163 00:17:14.810 00:17:15.780 Casie Aviles: Reflect

164 00:17:16.680 00:17:23.609 Casie Aviles: the, like, it’s correctly tagging it as an error, because maybe we might be missing something with the logic of how we

165 00:17:24.329 00:17:26.510 Casie Aviles: score it, or something.

166 00:17:27.228 00:17:34.280 Casie Aviles: Maybe it’s too lenient, or maybe it’s missing, like, you know, the criteria.

167 00:17:34.810 00:17:41.570 Casie Aviles: And then, additionally, for it to be proactive, I guess it should be alerting us. I mean, there is already, like,

168 00:17:41.870 00:17:52.659 Casie Aviles: an alert in place. But right now, since we don’t really have any error scores or bad quality scores, we’re not getting any alerts, right?

169 00:17:54.550 00:17:55.110 Amber Lin: Yeah.

170 00:17:55.110 00:17:55.710 Casie Aviles: So I.

171 00:17:57.850 00:17:58.889 Amber Lin: It’s small.

172 00:18:04.700 00:18:10.060 Amber Lin: you know, because our evals are not that accurate, it didn’t make sense to alert us.

173 00:18:10.972 00:18:16.850 Amber Lin: But I imagine if we improved all this alerting, then it will be.

174 00:18:17.060 00:18:18.659 Amber Lin: It’ll be a lot nicer.

175 00:18:20.378 00:18:26.960 Amber Lin: What do you think are the estimates, priorities?

176 00:18:28.860 00:18:36.740 Casie Aviles: So I guess there’s a little bit of ambiguity, so I would put it as 3.

177 00:18:38.020 00:18:38.590 Amber Lin: Oh.

178 00:18:39.860 00:18:45.149 Amber Lin: I mean, part of it, isn’t it in another ticket? I can look at if I can condense it later.

179 00:18:48.500 00:18:50.540 Amber Lin: You say anything here.

180 00:18:52.420 00:18:54.929 Casie Aviles: Yeah. I’m leaning towards medium.

181 00:18:54.930 00:18:58.699 Amber Lin: Okay, sounds good.

182 00:19:05.200 00:19:11.860 Amber Lin: So we have, like, that also requires some good alerts. Also, this needs some good alerts.

183 00:19:12.230 00:19:13.139 Amber Lin: I see.

184 00:19:13.750 00:19:14.470 Amber Lin: Okay.

185 00:19:14.800 00:19:22.790 Amber Lin: So, oh goodness, okay. Next one: trainer bot.

186 00:19:26.300 00:19:35.990 Amber Lin: this is something we talked about in the and something that they wanted.

187 00:19:36.150 00:19:39.310 Amber Lin: So what this ticket is essentially is

188 00:19:42.260 00:19:50.370 Amber Lin: kind of like an interview bot. So ultimately the bot knows what standards the information needs to meet.

189 00:19:50.480 00:20:01.550 Amber Lin: So it needs to have certain information. It needs to be in a certain format, and it just tries to get that information out of the trainers.

190 00:20:02.230 00:20:10.859 Amber Lin: and that could be in the form of the bot, will keep asking questions of what about this?

191 00:20:10.960 00:20:12.699 Amber Lin: Or have you.

192 00:20:13.416 00:20:16.730 Miguel de Veyra: I think we have a similar ticket yesterday that we talked about.

193 00:20:17.537 00:20:22.789 Amber Lin: Yesterday. Let me see where the ticket is. Yesterday it was

194 00:20:23.610 00:20:28.300 Amber Lin: this one, that was when they gave suggestions

195 00:20:28.770 00:20:31.579 Amber Lin: we talked about it. We didn’t look at the tickets.

196 00:20:32.284 00:20:33.460 Amber Lin: Okay. Brought it up.

197 00:20:33.900 00:20:34.790 Amber Lin: Yeah.

198 00:20:47.990 00:20:55.870 Casie Aviles: So this will be, like, another chatbot? Is it, like, one that will

199 00:20:56.270 00:21:01.900 Casie Aviles: basically guide the CSRs, or, I mean, the editors?

200 00:21:02.440 00:21:02.950 Amber Lin: Hmm.

201 00:21:05.300 00:21:06.890 Miguel de Veyra: I think this is like a

202 00:21:07.330 00:21:09.380 Miguel de Veyra: a use case of the bot.

203 00:21:11.380 00:21:14.260 Casie Aviles: Or is it going to be same or still? Andy?

204 00:21:15.400 00:21:18.299 Amber Lin: I think it’s gonna be different, because who it faces is completely different.

205 00:21:18.300 00:21:20.740 Miguel de Veyra: Yeah, yeah, I think it’s different.

206 00:21:21.780 00:21:22.100 Casie Aviles: Nice.

207 00:21:23.490 00:21:28.649 Miguel de Veyra: But basically, I think this is the same as the trainer bot, though.

208 00:21:29.063 00:21:30.479 Amber Lin: Yeah, it is. It is.

209 00:21:30.600 00:21:35.339 Miguel de Veyra: I think this is like a feature or something that they want us to.

210 00:21:37.290 00:21:39.870 Miguel de Veyra: How do you say it? Like.

211 00:21:39.970 00:21:44.489 Miguel de Veyra: basically, they want the trainer bot to be responsible for it.

212 00:21:44.830 00:21:48.540 Amber Lin: Yeah, essentially, they want. This is how they want the trainer bot to act.

213 00:21:48.540 00:21:51.330 Miguel de Veyra: So, okay.

214 00:21:51.330 00:21:56.820 Amber Lin: The trainer bot, too.

215 00:21:57.120 00:21:58.900 Amber Lin: Let’s just plan.

216 00:22:17.370 00:22:20.660 Amber Lin: So how do you think we can do this?

217 00:22:24.930 00:22:29.740 Miguel de Veyra: This one? Cause this is basically just a prompt,

218 00:22:30.777 00:22:36.649 Miguel de Veyra: we just need to basically create, like, the guidelines first, and then train the bot there.

219 00:22:37.880 00:22:47.519 Miguel de Veyra: I think the complicated part here is the smart text editor, because in the ticket that we discussed yesterday, this is what we called the stretch goal.

220 00:22:49.340 00:23:07.140 Amber Lin: Oh, I see. I think for the smart text editor there’s a few parts, right? There’s a part where we help them edit the text, and then there’s a part where we insert the text. I don’t think we need to do the insert-the-text part yet.

221 00:23:07.140 00:23:08.170 Miguel de Veyra: Yeah, yeah.

222 00:23:08.170 00:23:18.050 Amber Lin: Yeah, so we can have edit formatting, which shouldn’t be that hard and then insert updates.

223 00:23:18.460 00:23:19.350 Amber Lin: Alright.

224 00:23:31.410 00:23:36.739 Amber Lin: yeah, I think we can do this. We can save this for later.

225 00:23:36.970 00:23:42.050 Amber Lin: and I’ll make probably make that like we can make this another ticket. So

226 00:23:42.200 00:23:44.570 Amber Lin: I guess this ticket is just for

227 00:23:51.450 00:23:53.229 Amber Lin: for like for this.

228 00:23:54.360 00:23:56.950 Amber Lin: So that would just be prompting, you said.

229 00:24:00.380 00:24:01.430 Miguel de Veyra: Yes, yes.

230 00:24:02.140 00:24:09.880 Amber Lin: See, what do we need for the prompt? Like, I can go ask them, but what type of information do we want for the prompt?

231 00:24:12.610 00:24:13.430 Miguel de Veyra: Oh.

232 00:24:17.140 00:24:18.000 Miguel de Veyra: wait! Sorry!

233 00:24:18.616 00:24:19.849 Casie Aviles: Formatting Guidelines.

234 00:24:20.200 00:24:22.870 Miguel de Veyra: No, no, the formatting guidelines has to come from us.

235 00:24:23.900 00:24:29.190 Miguel de Veyra: because, you know, basically it has to be bot-friendly. They won’t know that.

236 00:24:29.190 00:24:29.800 Amber Lin: Hmm.

237 00:24:30.390 00:24:48.819 Miguel de Veyra: I think Casie is also correct. But I think formatting guidelines, in a way of how do they want it to be structured. Like, for example, you know, what’s the baseline? Should it have, for example, like us, per ticket we have a goal, acceptance criteria, details.

238 00:24:49.160 00:24:50.820 Miguel de Veyra: What’s like the baseline.

239 00:24:50.820 00:24:51.340 Amber Lin: Hmm.

240 00:24:51.340 00:24:54.579 Miguel de Veyra: Should the bot, you know, use as a blueprint.

241 00:24:55.200 00:24:58.220 Amber Lin: I see. So we should gather.

242 00:24:58.650 00:25:02.450 Amber Lin: should we gather good examples, or does.

243 00:25:02.450 00:25:03.890 Miguel de Veyra: Yeah, yeah, that would be good.

244 00:25:04.140 00:25:04.890 Amber Lin: Okay.

245 00:25:05.880 00:25:11.279 Miguel de Veyra: And then you know once what’s some of the follow up questions it should ask.

246 00:25:14.140 00:25:15.100 Amber Lin: Wow!

247 00:25:15.850 00:25:17.710 Amber Lin: Follow up!

248 00:25:21.100 00:25:24.600 Miguel de Veyra: And what do we wanna make sure it’s there

249 00:25:24.780 00:25:33.510 Miguel de Veyra: like, you know, for example, for a certain product or service, what information should the bot always check is there?

250 00:25:33.750 00:25:36.900 Miguel de Veyra: Is it the price? Is it the amount of time?
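Miguel’s point, that the bot should always check certain information is present for each product or service, could be sketched as a simple checklist. The categories and required fields here are made-up examples; the real ones would come from the guidelines the team still has to gather:

```python
# Assumed example checklist; the real required fields per category would
# come from the formatting guidelines the team puts together.
REQUIRED_FIELDS = {
    "product": ["price", "description"],
    "service": ["price", "duration", "availability"],
}

def missing_fields(entry_type, entry):
    """Return the fields the trainer bot should ask follow-up questions
    about, i.e. required fields that are absent or empty in the entry."""
    required = REQUIRED_FIELDS.get(entry_type, [])
    return [field for field in required if not entry.get(field)]
```

The returned list maps directly onto the follow-up questions Miguel mentions: one question per missing field until the entry is complete.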

251 00:25:39.400 00:25:41.000 Amber Lin: Hmm, that’s great.

252 00:25:41.120 00:25:46.360 Amber Lin: So I can also maybe do a walk through.

253 00:25:52.770 00:25:54.740 Amber Lin: So I will.

254 00:25:55.580 00:26:04.080 Amber Lin: That’s, I think that itself should be its own ticket.

255 00:26:06.885 00:26:07.590 Amber Lin: okay.

256 00:26:10.090 00:26:17.929 Amber Lin: And then the smart editor should be another thing. So I think this is what’s gonna take the most time, and then the prompting will be easier.

257 00:26:18.680 00:26:21.859 Miguel de Veyra: As long as we have, basically, the docs, we’re good pretty much.

258 00:26:21.860 00:26:27.149 Amber Lin: So how much would this part take in points? I can estimate the top part.

259 00:26:27.150 00:26:29.629 Miguel de Veyra: Probably 3 at most.

260 00:26:30.170 00:26:31.310 Amber Lin: Oh, okay.

261 00:26:33.270 00:26:37.600 Miguel de Veyra: For just the prompting there, not the deployment and everything, of course.

262 00:26:38.945 00:26:45.089 Amber Lin: and on the deployment, hopefully Tim helps us too.

263 00:26:45.090 00:26:48.300 Miguel de Veyra: Yeah, yeah, I think this is, it’s not really part of this ticket.

264 00:26:48.700 00:26:54.090 Amber Lin: Yeah, okay. So I’ll say, it’s around here

265 00:26:54.290 00:26:57.160 Amber Lin: until I sort of break that down.

266 00:26:57.160 00:26:57.720 Miguel de Veyra: Yeah.

267 00:26:58.450 00:26:59.070 Amber Lin: Okay.

268 00:27:01.930 00:27:04.590 Miguel de Veyra: I think we also have a new client for this team.

269 00:27:05.660 00:27:09.229 Amber Lin: Oh, yeah, we’re gonna get one soon. It’s very exciting.

270 00:27:11.130 00:27:14.150 Miguel de Veyra: I think it’s Craig, right, Craig, something.

271 00:27:14.420 00:27:20.800 Miguel de Veyra: But yeah, the thing is, Casie has been working with him for, like, I think, everything.

272 00:27:20.800 00:27:22.020 Casie Aviles: Sorry I didn’t hear it. I’m.

273 00:27:22.020 00:27:26.209 Miguel de Veyra: Yeah, I think the things he wants are pretty simple, honestly.

274 00:27:26.816 00:27:30.623 Amber Lin: Oh, okay. Who is Craig, and what does he do?

275 00:27:31.130 00:27:36.009 Miguel de Veyra: He basically is the founder series. Guy, like the one who founded it.

276 00:27:36.710 00:27:37.210 Amber Lin: He knows it.

277 00:27:37.210 00:27:37.640 Casie Aviles: Lot of.

278 00:27:37.640 00:27:38.040 Amber Lin: Oh!

279 00:27:38.040 00:27:40.109 Casie Aviles: CEOs, yeah.

280 00:27:40.110 00:27:43.359 Miguel de Veyra: He’s the head of AI for Crocs, I believe.

281 00:27:43.360 00:27:45.310 Amber Lin: I see cool.

282 00:27:48.060 00:27:52.370 Amber Lin: That’s exciting. I don’t know if they need me there, if it’s such a simple task.

283 00:27:52.600 00:27:54.230 Amber Lin: But no, they just got it.

284 00:27:54.230 00:27:56.629 Miguel de Veyra: I think it’s 5 h a week, the allocation.

285 00:27:56.630 00:27:58.409 Amber Lin: Yeah, yeah, yeah.

286 00:27:59.250 00:28:00.050 Amber Lin: Awesome.

287 00:28:00.220 00:28:02.739 Amber Lin: Okay. So you’re gonna get so busy.

288 00:28:04.690 00:28:08.230 Miguel de Veyra: And then I think there’s also a couple of other Demos that they wanna work on.

289 00:28:09.830 00:28:10.215 Miguel de Veyra: Yeah.

290 00:28:13.020 00:28:17.750 Amber Lin: Format text editor, format. Oh, goodness.

291 00:28:18.987 00:28:28.410 Amber Lin: promote heavy users, template channels, different narratives. Blah, blah, input.

292 00:28:30.258 00:28:33.599 Miguel de Veyra: We have a question, by the way. Because we have so many things on

293 00:28:33.810 00:28:40.290 Miguel de Veyra: requirements started and the other statuses, should we decide on which one to work on?

294 00:28:42.067 00:28:42.989 Amber Lin: For this week.

295 00:28:43.390 00:28:44.050 Miguel de Veyra: Yeah.

296 00:28:45.200 00:28:48.330 Amber Lin: I mean, what are we gonna work on this week?

297 00:28:52.970 00:28:53.530 Amber Lin: There.

298 00:28:53.530 00:28:54.143 Miguel de Veyra: Yeah, because.

299 00:28:55.362 00:28:58.050 Amber Lin: We can get started

300 00:28:58.540 00:29:05.540 Amber Lin: on the, because ultimately I know Utam wants us to stretch out my stuff a little bit more.

301 00:29:06.560 00:29:07.520 Miguel de Veyra: Okay. So.

302 00:29:07.520 00:29:15.459 Amber Lin: So, maybe, like, he was like, can we do 10 h a week? I was like, wow, 10 h is very, very low.

303 00:29:15.790 00:29:16.990 Miguel de Veyra: Including your time.

304 00:29:16.990 00:29:20.389 Amber Lin: Yeah, I was like, what do you mean, 10 h? The budget.

305 00:29:20.390 00:29:21.949 Miguel de Veyra: Meetings alone are 5.

306 00:29:24.930 00:29:28.000 Amber Lin: Yeah. So I guess there’s only.

307 00:29:28.998 00:29:33.779 Miguel de Veyra: If we follow his 10 h, we can only do one ticket a week, right?

308 00:29:33.780 00:29:36.490 Amber Lin: Yeah, so.

309 00:29:36.490 00:29:42.079 Miguel de Veyra: I think we stick with that. We just pick one ticket and then just work on it, a 5-point ticket.

310 00:29:42.510 00:29:46.307 Amber Lin: Yeah, okay, so we can also,

311 00:29:49.300 00:29:50.930 Miguel de Veyra: As far as hyper I/O.

312 00:29:53.960 00:30:00.029 Amber Lin: And I need to get errors on that. I feel like

313 00:30:00.490 00:30:10.969 Amber Lin: it could also just be on my end, because I’m looking at the overflow page, how to deploy to those, looking at scheduling office hours for the CSRs,

314 00:30:11.770 00:30:19.150 Amber Lin: and we can help them with formatting.

315 00:30:21.380 00:30:24.020 Miguel de Veyra: I think the trainer bot will help them with the formatting.

316 00:30:24.220 00:30:30.810 Amber Lin: Yeah, yeah. So maybe that could be something that we work on, because that’s apparently quite hard for them.

317 00:30:35.200 00:30:36.190 Amber Lin: Oh.

318 00:30:44.680 00:30:49.270 Miguel de Veyra: And I think this is just part of the trainer bot.

319 00:30:49.570 00:30:54.380 Amber Lin: Yeah, I broke it up a little bit because I don’t want the other ticket to get too complicated.

320 00:30:58.760 00:31:04.539 Miguel de Veyra: Oh, let me check. I remember we did some sort of a trainer bot

321 00:31:07.150 00:31:07.900 Miguel de Veyra: Through email

322 00:31:18.910 00:31:20.470 Miguel de Veyra: patient assistant.

323 00:31:26.170 00:31:29.530 Miguel de Veyra: Oh, yeah, just need the details for this.

324 00:31:45.340 00:31:48.359 Miguel de Veyra: So I think they’re gonna be asking for the trainer bot soon.

325 00:31:50.310 00:31:51.589 Amber Lin: I agree.

326 00:31:52.700 00:31:54.710 Amber Lin: So what did?

327 00:31:54.880 00:31:56.180 Amber Lin: Yeah, that one

328 00:32:21.050 00:32:22.220 Amber Lin: sounds good.

329 00:32:22.630 00:32:30.070 Amber Lin: So what do we need for it to help, like?

330 00:32:30.850 00:32:37.820 Amber Lin: So they will input something, and we spit it out in the nice bot-friendly format.

331 00:32:42.770 00:32:43.380 Casie Aviles: Yeah.

332 00:32:44.910 00:32:48.540 Amber Lin: Okay, so essentially it’s just activity.

333 00:32:50.600 00:32:52.969 Miguel de Veyra: Yeah, we have a bot that

334 00:32:53.280 00:32:56.860 Miguel de Veyra: acts as a trainer. I’m not sure, have we connected this. But

335 00:32:57.220 00:33:01.449 Miguel de Veyra: to Google Docs? I don’t think so. I mean, to Google Docs, to Google Chat.

336 00:33:01.870 00:33:02.600 Casie Aviles: No.

337 00:33:03.380 00:33:04.240 Casie Aviles: Yeah, no.

338 00:33:06.570 00:33:09.030 Amber Lin: Is it possible to have 2?

339 00:33:09.920 00:33:14.739 Miguel de Veyra: I’m not sure. I don’t think we ever crossed that line. Is it possible, Casie, to have 2 bots?

340 00:33:15.340 00:33:16.450 Miguel de Veyra: It is right.

341 00:33:16.690 00:33:18.991 Casie Aviles: I think we yeah, we did like

342 00:33:19.980 00:33:22.180 Casie Aviles: a spike kind of there for.

343 00:33:22.450 00:33:28.760 Casie Aviles: And I I just know that there’s 1 thing we can try, but I’m not sure if it will work yet.

344 00:33:30.890 00:33:36.800 Amber Lin: Okay, so, link subscribe.

345 00:33:39.170 00:33:42.699 Casie Aviles: So this is for the deployment on their side, I believe.

346 00:33:44.213 00:33:48.820 Amber Lin: So they would know, do you think like, who would know.

347 00:33:51.440 00:33:53.370 Casie Aviles: I’m not sure if Tim would know, but

348 00:33:53.570 00:33:58.040 Casie Aviles: I can just try 1st within our environment and then

349 00:33:58.720 00:34:01.879 Casie Aviles: and then we let them know that

350 00:34:02.120 00:34:05.849 Casie Aviles: if they could do the same thing for it too. So

351 00:34:05.990 00:34:08.330 Casie Aviles: you know, for the second bot to work.

352 00:34:12.770 00:34:19.370 Amber Lin: Let’s see. What other solutions would we reuse if we can’t make it work.

353 00:34:22.449 00:34:24.179 Miguel de Veyra: Provide them a UI, for now.

354 00:34:24.489 00:34:25.229 Amber Lin: Oh!

355 00:34:25.230 00:34:27.170 Casie Aviles: Yeah, that’s 1 other.

356 00:34:28.130 00:34:29.940 Amber Lin: Still, like the Demos we made.

357 00:34:30.550 00:34:32.070 Miguel de Veyra: Yeah, something like that.

358 00:34:33.500 00:34:37.719 Miguel de Veyra: I mean, we can. I mean, I can probably prop something up in the

359 00:34:37.980 00:34:48.939 Miguel de Veyra: just so they have something to, you know, even before we figure out the, I mean, the second bot problem. I can probably prop something up, ’cause I don’t think it’s that

360 00:34:49.449 00:34:57.739 Miguel de Veyra: ’cause I think there’s not really a lot of people who really do the training, who really do the documentation, right? They’re gonna be a bit more understanding.

361 00:34:58.000 00:35:08.269 Amber Lin: Yeah, that’s a great idea. So we should just make a UI for now and then investigate the Google Chat later, because I know we kept getting stuck on this.

362 00:35:08.270 00:35:08.950 Miguel de Veyra: Yeah, yeah.

363 00:35:10.240 00:35:13.889 Miguel de Veyra: Especially if we have, you know, than ours.

364 00:35:14.280 00:35:20.570 Amber Lin: Yeah. Okay, let me just make the.

365 00:35:22.186 00:35:25.090 Amber Lin: So we’ll check later.

366 00:35:25.640 00:35:27.230 Amber Lin: Right? So what would.

367 00:35:27.420 00:35:32.220 Amber Lin: What would you estimate, sir, for? That’s not here.

368 00:35:37.040 00:35:39.499 Miguel de Veyra: We know what our format answers.

369 00:35:40.270 00:35:43.090 Miguel de Veyra: I mean probably 3 at most.

370 00:35:43.290 00:35:47.460 Amber Lin: Oh, I would say that it’s either urgent or not.

371 00:35:48.270 00:35:53.160 Miguel de Veyra: No, not really, probably a medium, so I think the ones before are higher.

372 00:35:55.223 00:35:57.640 Amber Lin: Which one before was higher.

373 00:35:58.180 00:36:01.479 Miguel de Veyra: Because basically, now, there’s 3 trainer tickets, right?

374 00:36:01.480 00:36:02.160 Amber Lin: Oh!

375 00:36:03.010 00:36:05.170 Miguel de Veyra: Yeah, I think this is probably just medium.

376 00:36:05.470 00:36:11.490 Amber Lin: Yeah, let me go check the trainer bot 6.

377 00:36:15.510 00:36:17.160 Amber Lin: I’m gonna copy this to the.

378 00:36:17.160 00:36:18.989 Miguel de Veyra: Yeah, it’s too much, trainer bot.

379 00:36:18.990 00:36:24.400 Amber Lin: Yeah. Feedback. I don’t think this is that high.

380 00:36:25.810 00:36:26.420 Miguel de Veyra: Yeah, yeah.

381 00:36:26.420 00:36:32.990 Amber Lin: Just updates, like, this is just a medium. Formatting bug.

382 00:36:33.830 00:36:35.840 Amber Lin: This will be pretty high.

383 00:36:37.040 00:36:38.730 Amber Lin: And this would like.

384 00:36:40.040 00:36:46.310 Amber Lin: This is a foundation thing before we can also do that, right? They need to know how to format it correctly.

385 00:36:46.500 00:36:50.139 Amber Lin: and we also kind of have the guidelines to feed into that.

386 00:37:20.110 00:37:24.049 Amber Lin: Oh, so Utham put some stuff here.

387 00:37:56.170 00:38:12.100 Amber Lin: Okay, awesome. This is, this is what we need, or what is this? Yeah, this one, I’m gonna

388 00:38:12.680 00:38:16.190 Amber Lin: copy and paste this

389 00:38:59.830 00:39:00.870 Amber Lin: awesome.

390 00:39:03.320 00:39:07.950 Amber Lin: I’m just trying to okay, some mormons.

391 00:39:16.210 00:39:22.560 Amber Lin: This is adding the overflow group. This is in progress

392 00:39:23.660 00:39:31.240 Amber Lin: that I’m working with Janice on first, last 3rd, or

393 00:40:01.960 00:40:08.440 Amber Lin: Oh, one thing that was brought up in the meeting, super small thing of just formatting issues.

394 00:40:10.070 00:40:13.770 Amber Lin: making it, giving it a line break.

395 00:40:19.038 00:40:24.510 Miguel de Veyra: Yeah, yeah, I think this, yeah, this should be a simple formatting issue. Can be 1 point, I think.

396 00:40:43.380 00:40:44.929 Amber Lin: We can do that.

397 00:40:47.860 00:40:54.419 Amber Lin: I think that we are doing that. We could like proactively ask for acronyms. And

398 00:40:54.780 00:40:58.419 Amber Lin: how do we solve the last acronym issue?

399 00:40:59.630 00:41:02.419 Miguel de Veyra: I think we did this before already.

400 00:41:02.640 00:41:03.230 Amber Lin: Yeah, yeah.

401 00:41:03.600 00:41:10.940 Amber Lin: there was like a note or whatever. But I’m not sure it covers all the acronyms. That’s all.

402 00:41:11.360 00:41:15.559 Miguel de Veyra: I mean, we need their acronyms first, to be honest, because we’re just guessing.

403 00:41:22.130 00:41:25.409 Amber Lin: And then if we get the list, do we just plug that in?

404 00:41:26.450 00:41:28.980 Amber Lin: Or is it just a Central Doc update, like, does it.

405 00:41:28.980 00:41:32.930 Miguel de Veyra: No, no, not Central Doc, because abbreviations are behavioral.

406 00:41:34.310 00:41:38.229 Miguel de Veyra: So we need to add that to their, basically, their prompt

407 00:41:40.760 00:41:42.789 Miguel de Veyra: like words to watch out for.

408 00:41:43.400 00:41:44.470 Amber Lin: Hmm.

409 00:41:52.520 00:41:57.969 Amber Lin: so I guess this is like 2 separate things and then testing.

410 00:42:00.550 00:42:04.040 Amber Lin: So how much time do you think that would take.

411 00:42:06.710 00:42:08.400 Miguel de Veyra: 2 points, probably yeah.

412 00:42:08.550 00:42:09.320 Amber Lin: Okay.

413 00:42:09.320 00:42:14.229 Miguel de Veyra: As long as we have the docs, of course. I mean, an afternoon.

414 00:42:14.230 00:42:17.780 Amber Lin: Yeah, totally. So that I will. I’ll look into that.

415 00:42:18.910 00:42:19.850 Amber Lin: And

416 00:42:23.890 00:42:29.189 Amber Lin: With them. No, back to this.

417 00:42:32.840 00:42:39.399 Amber Lin: So we said 3 points, right? I’m gonna move this to just reviewed.

418 00:42:47.570 00:42:49.010 Amber Lin: Yeah, this one.

419 00:42:51.550 00:42:56.420 Amber Lin: An idea brought up also on Friday, but of how

420 00:42:57.620 00:43:09.779 Amber Lin: strict should Andy be on input. Because sometimes the CSRs, as Utham said, it was a trash-in, trash-out problem, because CSRs put in 2 words.

421 00:43:09.970 00:43:12.110 Amber Lin: Andy is confused.

422 00:43:12.410 00:43:14.530 Amber Lin: So how should we.

423 00:43:16.570 00:43:23.670 Miguel de Veyra: I think we can change, like, the behavior, like, something put in the instructions, you know.

424 00:43:23.930 00:43:26.292 Miguel de Veyra: before you answer, make sure it’s like

425 00:43:27.550 00:43:32.029 Miguel de Veyra: it makes sense to even answer it. If not, then ask a follow up question.

426 00:43:47.265 00:43:53.789 Amber Lin: I remember something related to this is about hallucinations, of we’d rather it not answer

427 00:43:54.250 00:43:57.790 Amber Lin: than give a hallucinating answer.

428 00:43:58.810 00:44:02.390 Miguel de Veyra: Yeah. Yeah. But that’s before that’s after the processes. Before

429 00:44:04.070 00:44:10.140 Miguel de Veyra: I’m not sure for this, ’cause it’s gonna add to the time. But I’m thinking.

430 00:44:10.140 00:44:10.659 Amber Lin: Like a different.

431 00:44:10.660 00:44:22.149 Miguel de Veyra: I’m thinking there should be like another node there that basically just filters it out, you know, if it’s like a good or bad response. So we don’t actually call the main workflow. But I’m not sure

432 00:44:22.420 00:44:27.100 Miguel de Veyra: if it’s a good idea, because that’s gonna add, probably like a second or 2 of time.

433 00:44:27.730 00:44:29.949 Amber Lin: Oh, really! Oh, see!

434 00:44:30.560 00:44:31.990 Amber Lin: Oh, well.

435 00:44:31.990 00:44:33.620 Miguel de Veyra: I think we can just add it to the prompt.

436 00:44:38.760 00:44:40.469 Amber Lin: So that would be.

437 00:44:49.700 00:44:52.659 Miguel de Veyra: Oh, shit, sorry, Casie. Quick thing.

438 00:44:55.710 00:44:56.270 Casie Aviles: Yeah.

439 00:44:56.840 00:44:58.889 Miguel de Veyra: Can you fill out the notion thing?

440 00:45:00.420 00:45:02.999 Casie Aviles: Yeah, I filled out some parts, but it’s not done yet.

441 00:45:03.230 00:45:08.440 Miguel de Veyra: Yeah, yeah. ’Cause Utham is like following up a lot.

442 00:45:12.120 00:45:14.259 Miguel de Veyra: The green ones, right, is your answer?

443 00:45:15.330 00:45:18.250 Casie Aviles: Yeah, I added those yesterday, but for the rest.

444 00:45:18.250 00:45:18.800 Miguel de Veyra: Sure.

445 00:45:18.800 00:45:19.440 Casie Aviles: Nope.

446 00:45:20.040 00:45:21.109 Miguel de Veyra: Okay. Sure. Sure.

447 00:45:21.110 00:45:21.780 Amber Lin: Cool.

448 00:45:23.149 00:45:24.969 Miguel de Veyra: Sorry so

449 00:45:26.120 00:45:27.736 Amber Lin: Oh, good, go ahead!

450 00:45:28.508 00:45:32.929 Miguel de Veyra: Yeah. Yeah. Good thing Craig came in, because I can probably

451 00:45:33.360 00:45:37.119 Miguel de Veyra: use that as like the leading client, right?

452 00:45:38.400 00:45:40.849 Miguel de Veyra: Because that’s what you’ve been doing with him anyway. So.

453 00:45:47.510 00:45:50.049 Miguel de Veyra: by the way, Amber, I’m actually done for the day.

454 00:45:50.410 00:45:52.100 Amber Lin: Yeah, that’s okay.

455 00:45:53.970 00:45:58.699 Miguel de Veyra: Like the sales board. I know it’s probably good to discuss that later. Sorry.

456 00:45:58.920 00:46:02.089 Amber Lin: Yeah, will you guys be able to make it today?

457 00:46:02.090 00:46:03.460 Miguel de Veyra: Yeah, yeah, definitely, I’ll be there.

458 00:46:03.460 00:46:05.479 Amber Lin: Okay, yeah, I don’t think there’s

459 00:46:05.870 00:46:11.360 Amber Lin: I don’t think there’s work I’m gonna assign you on ABC today. So.

460 00:46:11.570 00:46:12.389 Miguel de Veyra: Yeah, hopefully. Not.

461 00:46:12.390 00:46:14.919 Amber Lin: We’re just going through the tickets, that’s all.

462 00:46:17.670 00:46:23.869 Amber Lin: Oh, here’s another one: format the current Central Doc.

463 00:46:24.650 00:46:27.800 Amber Lin: How are we gonna approach that.

464 00:46:29.100 00:46:31.059 Miguel de Veyra: Is that our job first?

465 00:46:33.190 00:46:33.570 Casie Aviles: This.

466 00:46:33.570 00:46:35.029 Amber Lin: Make it bot friendly.

467 00:46:35.170 00:46:35.860 Amber Lin: Yeah.

468 00:46:36.573 00:46:38.980 Miguel de Veyra: I think we, I think I added

469 00:46:39.350 00:46:48.699 Miguel de Veyra: like, ’cause I think the only problem there was was the tables, right, Casie? And then I think I kind of fixed it last week, although I have to.

470 00:46:48.700 00:46:52.420 Casie Aviles: Yeah, there were some tables that they added, but

471 00:46:53.050 00:46:55.320 Casie Aviles: I haven’t really checked. If there are any new.

472 00:46:56.340 00:46:56.780 Miguel de Veyra: Yeah.

473 00:46:56.780 00:46:58.280 Casie Aviles: New tables right now.

474 00:46:59.300 00:47:03.459 Amber Lin: Oh, I can assign this to the client.

475 00:47:04.170 00:47:05.256 Amber Lin: Don’t worry.

476 00:47:06.240 00:47:12.820 Miguel de Veyra: Because I think with this one we can. I think it has to be like a collaboration with them. Because, yeah, tables are very like effective.

477 00:47:13.160 00:47:20.919 Miguel de Veyra: Of course I like tables, too, but I think they have to let us know, so we can hop in there and then. Or maybe the trainer bot will actually fix this problem.

478 00:47:20.920 00:47:23.070 Casie Aviles: Yeah, yeah. That’s the trainer bots.

479 00:47:23.696 00:47:30.299 Amber Lin: And one of their frustrations, I think, is like they add something to the Central Doc, but.

480 00:47:30.300 00:47:31.989 Miguel de Veyra: Yeah. It doesn’t reflect.

481 00:47:32.340 00:47:34.910 Casie Aviles: Yeah, the bot doesn’t read it, or you know, and.

482 00:47:34.910 00:47:36.320 Amber Lin: Oh!

483 00:47:36.320 00:47:42.960 Casie Aviles: There could be many factors also, like, maybe it’s their formatting, or maybe it’s not, and maybe there, it’s something on the.

484 00:47:42.960 00:47:44.300 Miguel de Veyra: Maybe it’s an image.

485 00:47:45.460 00:47:48.640 Casie Aviles: So, yeah, those are some factors to think about.

486 00:47:49.210 00:47:49.630 Amber Lin: I see.

487 00:47:49.630 00:47:53.739 Miguel de Veyra: Yeah, yeah, that’s why we need actually, the one thing we also need to do

488 00:47:54.330 00:48:05.770 Miguel de Veyra: is we need to provide them a set of guidelines. I mean, it’s also for the bot, that, hey, this is the only format that’s gonna be effective for the bot to read the doc.

489 00:48:06.050 00:48:10.480 Amber Lin: Yeah, I gave them some notes on formatting.

490 00:48:12.540 00:48:14.639 Amber Lin: Where is it? There’s like a

491 00:48:16.720 00:48:23.580 Amber Lin: there’s a guideline that I sent. I don’t think they read it, so I think this is a good chance to look at it

492 00:48:23.710 00:48:29.300 Amber Lin: together with them, and then point out any tables or images.

493 00:48:29.750 00:48:39.380 Amber Lin: and let them use the trainer bot to edit it fine or a tragic game.

494 00:48:39.860 00:48:47.739 Amber Lin: Okay, I’ll say, this is yeah, I’ll I’ll edit this more a little bit more.

495 00:48:51.220 00:48:52.220 Amber Lin: Backlog.

496 00:48:54.380 00:48:58.634 Amber Lin: Oh, we talked about this already, by the way. Button

497 00:49:01.290 00:49:08.790 Amber Lin: We’ll have to explore this a little bit more, I think of how to.

498 00:49:09.210 00:49:11.980 Casie Aviles: I think the button’s accessible, at least, to them now.

499 00:49:12.480 00:49:13.860 Amber Lin: Hmm pardon me.

500 00:49:14.570 00:49:17.730 Casie Aviles: I think they have access to the button, at least. Now.

501 00:49:18.390 00:49:20.799 Amber Lin: Yeah, yeah, they do have it, that’s great.

502 00:49:21.310 00:49:26.960 Amber Lin: Oh, let me move that ticket then, if it’s deployed, and

503 00:49:39.250 00:49:42.630 Amber Lin: The API was still stuck on Tim, right?

504 00:49:44.720 00:49:45.350 Casie Aviles: Yeah.

505 00:49:45.350 00:49:46.497 Amber Lin: Okay, no worries.

506 00:49:50.220 00:49:56.090 Amber Lin: Alright! That’s good. I will just add, you guys, if I

507 00:49:56.850 00:49:59.869 Amber Lin: if we need to flesh out.

508 00:50:00.070 00:50:06.279 Amber Lin: Oh, any more tickets? Is this, oh, I guess this is the last one.

509 00:50:13.328 00:50:18.099 Miguel de Veyra: Estimate is probably gonna be 3. I’m not gonna lie, this could take some time.

510 00:50:18.490 00:50:20.790 Amber Lin: Oh, 2, 5.

511 00:50:21.570 00:50:24.000 Miguel de Veyra: No, I don’t think it should take 5, probably.

512 00:50:24.670 00:50:25.770 Amber Lin: Sounds good.

513 00:50:26.710 00:50:28.279 Amber Lin: Open there!

514 00:50:30.350 00:50:31.280 Amber Lin: Pressed.

515 00:50:33.240 00:50:34.910 Amber Lin: Press the feedback.

516 00:50:35.350 00:50:42.220 Amber Lin: Oh, this is a big ticket. I think this is.

517 00:50:42.600 00:50:43.760 Miguel de Veyra: Yeah, I think.

518 00:50:43.760 00:50:46.632 Amber Lin: Process engineering, whatever.

519 00:50:48.960 00:50:51.950 Miguel de Veyra: Yeah, I think we can discuss. I don’t think we have the time, anyways, to do it.

520 00:50:52.672 00:50:55.560 Amber Lin: No, no, we’ll discuss

521 00:50:55.720 00:51:02.440 Amber Lin: in the future. I’ll give it a big one, so we’ll break it down. Need to bring

522 00:51:03.520 00:51:06.759 Amber Lin: alright. We’ll talk about it later.

523 00:51:09.290 00:51:12.479 Amber Lin: Okay, thank you guys, I’m gonna hop to another call.

524 00:51:12.730 00:51:14.050 Amber Lin: Okay, thanks. Everyone.

525 00:51:14.050 00:51:14.710 Miguel de Veyra: Yeah, they.

526 00:51:14.710 00:51:15.349 Casie Aviles: Thanks, Amber.

527 00:51:15.350 00:51:17.160 Amber Lin: Okay? Bye.