Meeting Title: Daily AI Team Sync Date: 2025-03-11 Meeting participants: Amber Lin, Patrik, Miguel De Veyra, Casie Aviles


WEBVTT

1 00:00:06.790 00:00:15.530 Amber Lin: Hi, Patrik, I’m just waiting for a few people to trickle in. Let me text the link in the group chat, because I think it was a little bit confusing.

2 00:00:15.680 00:00:21.670 Patrik: Yeah, okay, sounds good, is it? It’s usually at this time, right or no.

3 00:00:21.670 00:00:23.390 Amber Lin: Yeah, here, this time.

4 00:00:23.720 00:00:24.560 Patrik: Yeah, okay.

5 00:00:29.718 00:00:31.990 Amber Lin: Maybe it’s too late for them.

6 00:00:32.450 00:00:35.600 Amber Lin: Let me check Miguel.

7 00:00:36.870 00:00:38.220 Patrik: What time is it for them.

8 00:00:39.175 00:00:41.169 Amber Lin: It’s probably not.

9 00:00:41.450 00:00:42.620 Amber Lin: Let me see.

10 00:00:42.620 00:00:46.050 Patrik: Let’s see, 9:30 local. 9:30 PM.

11 00:00:47.590 00:00:49.579 Patrik: Yeah, they’re a full 12 hours.

12 00:00:49.790 00:00:51.480 Amber Lin: Yeah, they’re in the Philippines.

13 00:00:51.810 00:00:52.430 Patrik: Wow!

14 00:00:57.050 00:00:59.460 Patrik: And you’re on the West Coast.

15 00:00:59.940 00:01:02.250 Amber Lin: I am in LA right now. What about you?

16 00:01:02.250 00:01:03.770 Patrik: Okay. Nice. Nice.

17 00:01:03.770 00:01:04.120 Amber Lin: Yeah.

18 00:01:04.120 00:01:07.830 Patrik: I’m on the east coast, right outside of New York City.

19 00:01:08.200 00:01:09.790 Amber Lin: Oh, cool!

20 00:01:16.160 00:01:18.120 Miguel de Veyra: Oh, hey, guys.

21 00:01:18.690 00:01:20.909 Amber Lin: Hi, I think we’re.

22 00:01:20.910 00:01:21.510 Patrik: So.

23 00:01:21.510 00:01:27.459 Amber Lin: Waiting for Jonah. And then, yeah, Jenna won’t be joining today. She doesn’t go to meetings on Wednesdays.

24 00:01:27.630 00:01:30.990 Amber Lin: Oh, perfect! That’s good to note. I will.

25 00:01:32.670 00:01:38.640 Amber Lin: Great. I just met with

26 00:01:38.970 00:01:45.140 Amber Lin: Janice in the morning. I’ve got a lot of helpful instructions.

27 00:01:46.036 00:01:55.809 Amber Lin: So let me update you guys on what I talked about.

28 00:01:56.120 00:01:58.370 Amber Lin: So a few things.

29 00:02:00.350 00:02:08.880 Amber Lin: The first one, a quick one: Google Drive, she says, mostly for the pest stuff, is up to date. So

30 00:02:09.000 00:02:11.380 Amber Lin: we can be safe to use that.

31 00:02:12.010 00:02:14.260 Amber Lin: The “oh, by the way” hints.

32 00:02:15.580 00:02:20.189 Amber Lin: there’s 2 things. So one is

33 00:02:20.480 00:02:34.349 Amber Lin: general. Any general question, say about the same-day reschedule. For any question, essentially, we can give them a power offer, so whatever’s on promotion for that month,

34 00:02:34.430 00:02:54.930 Amber Lin: and for specific questions, say the customer is calling in for a rodent, then we have a sort of word association: oh, why is there a rodent? The root cause could be grass, or could be trees, and then we give a service recommendation based on that. So those are the 2 “oh, by the way” cases that I thought were really helpful.

35 00:02:55.460 00:02:57.140 Amber Lin: and then

36 00:02:57.580 00:03:06.509 Amber Lin: I worked with her. So the last part is, I worked with her on the golden data sheet of getting specific answers. And

37 00:03:07.460 00:03:13.429 Amber Lin: here’s what her comment is: right now, the answers we give

38 00:03:13.690 00:03:20.889 Amber Lin: have so much information that the CSRs get overwhelmed. It’s really hard to read.

39 00:03:21.640 00:03:22.860 Amber Lin: So

40 00:03:23.060 00:03:48.649 Amber Lin: A lot of the steps are also very common sense, like, oh yeah, put this into the system, put the customer’s information into the system. These things, we think we probably don’t have to present immediately. If the CSR really doesn’t know, and I don’t know why they wouldn’t know, they could ask that in a further follow-up question. But for the initial answer, say,

41 00:03:48.940 00:03:52.406 Amber Lin: let’s say the example of

42 00:03:54.770 00:04:16.229 Amber Lin: maybe, what’s the process for a same-day reschedule? Just give them something simple and short, like six one-line bullet points, instead of a whole chunk, because with a whole chunk, while you’re on the call you’re not really gonna read any of that. So it has to be simple, easy to digest.

43 00:04:16.450 00:04:25.970 Amber Lin: And we can always allow follow up questions. So that’s the main takeaway from the call with Janice, and we also worked on putting in some answers.

44 00:04:25.970 00:04:29.720 Miguel de Veyra: Okay. So the idea is, like, around 2-to-3-sentence answers now.

45 00:04:30.360 00:04:43.389 Amber Lin: Yeah. So something that the CSR can scan, pick up, and just speak, essentially. It would be even better if they could just say it word for word, if we have the tone and the tonality.

46 00:04:43.660 00:04:48.120 Amber Lin: I don’t know if we could do that, but it should be really.

47 00:04:48.120 00:04:50.600 Miguel de Veyra: We could provide the samples; we could probably do that.

48 00:04:50.600 00:04:51.290 Amber Lin: Yeah.

49 00:04:51.420 00:04:59.120 Amber Lin: so we can see if it can become more readable chunks. I think if we do that improvement it will be a great win on Friday.

50 00:04:59.340 00:05:08.260 Amber Lin: because then we can show: yeah, you can just read this. You don’t need to digest it. You can spit it out to the customer.
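
The trimming Amber describes, collapsing a verbose golden-data-sheet procedure into one-line bullets a CSR can scan mid-call, could be sketched as a small post-processing step. This is only an illustrative sketch; the step texts and the `to_scannable_bullets` helper below are invented placeholders, not the real sheet content:

```python
def to_scannable_bullets(steps, max_words=8):
    """Collapse verbose procedure steps into short, scannable one-line bullets.

    Keeps only the first sentence of each step and truncates it to a few
    words, on the theory that a CSR on a live call reads headlines, not prose.
    """
    bullets = []
    for step in steps:
        first_sentence = step.split(". ")[0].rstrip(".")
        words = first_sentence.split()
        bullets.append("- " + " ".join(words[:max_words]))
    return "\n".join(bullets)

# Hypothetical long-form steps, standing in for a golden-data-sheet answer.
steps = [
    "Confirm technician availability. Call the branch and verify the tech "
    "can return today before promising anything.",
    "Schedule the appointment in the system. Enter the customer's details, "
    "select the technician, and confirm the date.",
]
print(to_scannable_bullets(steps))
```

In practice the condensing would more likely be done once by hand (or by the bot’s prompt) rather than mechanically, but the target shape is the same: one short line per step, follow-up questions for the detail.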

51 00:05:08.560 00:05:13.849 Miguel de Veyra: Did. Did she put, for example, on the golden data sheet what you know

52 00:05:14.180 00:05:18.590 Miguel de Veyra: it would look like? What would be the ideal answer that would be easy to digest for them.

53 00:05:19.120 00:05:21.989 Miguel de Veyra: Because from that we then we can base. You know how I can modify.

54 00:05:21.990 00:05:49.389 Amber Lin: Yeah, yeah, totally. I was trying to ask that, and I realized I can’t edit the golden data sheet, so I was kind of taking notes in my own Notion. Here is what I took a screenshot of. This is what we have now on the golden data sheet. This is so long that as you scan it, I don’t know where to even read, and that’s just me, not a CSR. I don’t know where to read, and

55 00:05:49.920 00:05:53.889 Amber Lin: this is the 1st part, like the same-day reschedule.

56 00:05:54.090 00:05:57.370 Amber Lin: This is what she gave:

57 00:05:57.560 00:06:04.879 Amber Lin: yeah, just call the tech. Like, that’s easier to follow, because if you look at

58 00:06:05.160 00:06:07.569 Amber Lin: here, let me open this.

59 00:06:08.320 00:06:13.749 Amber Lin: If you say, confirm technician availability like

60 00:06:14.030 00:06:26.200 Amber Lin: this, these 2 lines are kind of repetitive; it could be done in a sentence. It doesn’t need to be this long, and having it this long reduces the effectiveness.

61 00:06:26.310 00:06:33.840 Amber Lin: and say the next part of rodent attributes, right? Or like

62 00:06:34.080 00:06:40.970 Amber Lin: simple, easy to read. Say, here on the number 3,

63 00:06:41.200 00:06:47.390 Amber Lin: yeah, they know to. Maybe we could just say “schedule the appointment in Evolve,” because they know it’s in.

64 00:06:47.860 00:07:04.980 Amber Lin: They know they have to enter the customer details, because that’s what’s required to schedule things, and they know to select the termite technician, because the system will require them to do that. And yes, I think the system will also make them confirm the appointment date. So 3 can essentially just be

65 00:07:05.370 00:07:07.529 Amber Lin: schedule the appointment in Evolve.

66 00:07:07.750 00:07:10.270 Amber Lin: And that’s it, right?

67 00:07:11.790 00:07:16.000 Amber Lin: So that’s the biggest takeaway that I had from her.

68 00:07:17.130 00:07:20.820 Amber Lin: And see.

69 00:07:21.840 00:07:28.880 Amber Lin: Oh, last thing, last thing: websites to scrape. I asked her which locations,

70 00:07:29.834 00:07:32.140 Amber Lin: she gave these for the ABC.

71 00:07:32.670 00:07:39.010 Amber Lin: And scrape all the services page and.

72 00:07:39.010 00:07:41.729 Miguel de Veyra: We’re only doing the pest part of the business, right?

73 00:07:41.790 00:07:47.759 Amber Lin: I think she wants everything for this, for the upsells.

74 00:07:47.760 00:07:53.790 Miguel de Veyra: I think we need to clarify before we do them, because I know they only paid for the pest side of the business.

75 00:07:53.790 00:07:58.409 Amber Lin: Then we just do pest. They can pay us more to expand.

76 00:07:59.670 00:08:03.490 Amber Lin: I think we can just show them that we’re capable of scraping it.

77 00:08:03.640 00:08:06.299 Amber Lin: and then if they want more,

78 00:08:06.510 00:08:08.749 Amber Lin: they’ll pay for the phase 2 proposal.

79 00:08:09.300 00:08:09.990 Miguel de Veyra: Okay.

80 00:08:09.990 00:08:17.279 Amber Lin: Okay, just pest. I think that’s what she means as well, I believe.

81 00:08:19.350 00:08:26.070 Amber Lin: And lastly, for the 2 other sites probably don’t need to scrape them. This is essentially what the service is.

82 00:08:26.450 00:08:34.159 Amber Lin: So that could just be our answer for these 2 sites. There’s nothing to scrape, really. Chem-free

83 00:08:34.289 00:08:37.927 Amber Lin: is just that, and Termimesh,

84 00:08:38.549 00:08:51.479 Amber Lin: it’s a service and also a product. So the service is only available in these areas; the product you can order online. So we could also give a link to the website in our answer.

85 00:08:51.740 00:08:57.520 Amber Lin: That’s it. Simple. That’s it from my side.

86 00:08:58.400 00:09:06.080 Miguel de Veyra: Okay, yeah. I think this one, we should just add this to the central doc, or maybe, to train them, we should have even told them

87 00:09:06.480 00:09:09.070 Miguel de Veyra: to add it to the central doc. I’m sorry.

88 00:09:09.070 00:09:22.509 Amber Lin: Yeah, totally, but we can add it, and I can check with her to confirm. I told her maybe I’ll meet with her on Thursday, so, depending on our progress and what we need, I can meet with her again.

89 00:09:23.470 00:09:28.529 Miguel de Veyra: Okay, Janice is basically the representative of

90 00:09:28.530 00:09:29.250 Amber Lin: That.

91 00:09:29.520 00:09:39.600 Miguel de Veyra: them, right? Cause I remember, like in our call 2 or 3 weeks ago, they told us that they wanted as much data in the reply as possible. But I was like,

92 00:09:39.920 00:09:43.299 Miguel de Veyra: they’re gonna trim it down eventually. So I think, yeah.

93 00:09:43.300 00:09:49.869 Amber Lin: Yeah, yeah, I was a little bit confused, because I think different people had different ideas. But.

94 00:09:51.780 00:09:52.680 Miguel de Veyra: Yeah, because.

95 00:09:52.680 00:09:55.709 Amber Lin: What we could do is have

96 00:09:55.870 00:10:13.660 Amber Lin: the follow-up questions we should show. So the initial answer should be short enough for the CSRs to read, and allow them to ask clarifying questions, and then we’ll get the details. I think in our Friday presentation we’ll follow that format.

97 00:10:15.010 00:10:16.540 Miguel de Veyra: Okay, okay, if you could.

98 00:10:16.730 00:10:22.249 Miguel de Veyra: Sorry, did you show something earlier, like the sample questions? Can we go back to that, like the sample answer?

99 00:10:22.410 00:10:25.670 Amber Lin: Yes, let me do that. Yeah, right here.

100 00:10:38.500 00:10:45.079 Miguel de Veyra: So this is the same-day reschedule: 1, 2, 3, 4, 5, 6 is the ideal answer, right?

101 00:10:45.620 00:10:49.879 Amber Lin: Essentially to keep it short.

102 00:10:52.560 00:10:59.129 Miguel de Veyra: Okay, okay, sure, let me try. Because technically the replies are something like that, but this is like a

103 00:10:59.130 00:10:59.580 Amber Lin: Cool.

104 00:10:59.580 00:11:03.469 Miguel de Veyra: very trimmed-down version. But yeah, I think we can do that.

105 00:11:03.970 00:11:07.910 Amber Lin: Yeah, and so let’s switch.

106 00:11:10.360 00:11:19.270 Patrik: Hey, guys, I gotta step out for 5 minutes. I’ll be back. I’ve got a few questions for you guys. Yeah, I’ll just be a few minutes.

107 00:11:19.270 00:11:20.660 Amber Lin: Okay. Talk to you soon.

108 00:11:20.660 00:11:22.010 Patrik: Alright, thanks, bye.

109 00:11:24.650 00:11:31.290 Amber Lin: Oh, and Janice does not have access to the bot, which is a problem.

110 00:11:32.330 00:11:34.880 Amber Lin: So I’m gonna get in touch with Tim.

111 00:11:35.216 00:11:39.249 Miguel de Veyra: They should have access to it either way, because we gave them.

112 00:11:39.250 00:11:40.429 Amber Lin: I see, I see.

113 00:11:40.430 00:11:40.800 Miguel de Veyra: Right.

114 00:11:40.800 00:11:47.604 Amber Lin: She probably doesn’t know how to access it. So I think today it’s important for us to

115 00:11:48.540 00:11:50.300 Amber Lin: do the loom video

116 00:11:50.450 00:11:58.859 Amber Lin: because I’m meeting with Shannon. I asked Shannon to meet on Wednesday afternoon, so we have more time. But essentially,

117 00:11:58.860 00:11:59.460 Miguel de Veyra: So.

118 00:11:59.970 00:12:04.259 Amber Lin: around today, would we be able to record the Loom video?

119 00:12:05.102 00:12:10.049 Miguel de Veyra: Yeah. Casey, were we able to send the code

120 00:12:10.210 00:12:10.989 Amber Lin: Yeah, I saw that.

121 00:12:10.990 00:12:11.590 Miguel de Veyra: to Tim?

122 00:12:11.780 00:12:12.240 Casie Aviles: Yes.

123 00:12:12.240 00:12:13.099 Amber Lin: I saw that.

124 00:12:15.107 00:12:16.990 Miguel de Veyra: Did Tim reply?

125 00:12:17.750 00:12:19.050 Casie Aviles: Yeah, he did.

126 00:12:19.050 00:12:24.662 Amber Lin: Oh, he did. Great.

127 00:12:26.100 00:12:26.920 Amber Lin: Okay.

128 00:12:29.580 00:12:35.309 Amber Lin: can you guys update me: what did you do yesterday, and what do you plan to do today?

129 00:12:38.240 00:12:44.419 Casie Aviles: Yeah, I could start. So basically, what I did yesterday was, I set up the Braintrust part. And

130 00:12:44.770 00:12:52.479 Casie Aviles: yeah, that should be working. So all of the messages that are sent to the bot will be evaluated.
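
The hookup Casie describes, every message sent to the bot getting evaluated, boils down to shaping each exchange into the input/output/expected record that an eval platform like Braintrust logs, with one or more scores attached. A minimal sketch; the `exact_match` scorer is a deliberately naive stand-in, and `eval_record` is a hypothetical helper name, not the Braintrust SDK:

```python
def exact_match(output: str, expected: str) -> float:
    """Naive scorer: 1.0 on a case-insensitive exact match, else 0.0."""
    return 1.0 if output.strip().lower() == expected.strip().lower() else 0.0

def eval_record(question: str, bot_answer: str, golden_answer: str) -> dict:
    """Shape one bot exchange into an input/output/expected eval record,
    scoring the bot's answer against the golden data sheet's answer."""
    return {
        "input": question,
        "output": bot_answer,
        "expected": golden_answer,
        "scores": {"exact_match": exact_match(bot_answer, golden_answer)},
    }
```

In practice the scorer would be something fuzzier (an LLM judge or semantic similarity), since CSR answers rarely match the golden sheet verbatim.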

131 00:12:53.430 00:12:54.050 Amber Lin: Oh!

132 00:12:57.524 00:13:03.909 Miguel de Veyra: Sorry, just a quick thing, Amber. If they don’t have access to it, they could just go here.

133 00:13:05.285 00:13:07.139 Amber Lin: Did you send it to me and.

134 00:13:07.140 00:13:08.580 Miguel de Veyra: Via slack. Yeah.

135 00:13:08.580 00:13:10.290 Amber Lin: Okay, let me go. Check.

136 00:13:12.390 00:13:17.080 Miguel de Veyra: It’s it’s like the web version of our bot that they can access anytime.

137 00:13:17.460 00:13:18.900 Amber Lin: Okay, that’s.

138 00:13:18.900 00:13:23.020 Miguel de Veyra: Just in case, yeah, that’s just in case you know, the.

139 00:13:23.600 00:13:27.489 Amber Lin: the Google stuff doesn’t work, at least they can still try the bot, right?

140 00:13:27.830 00:13:32.890 Amber Lin: Yeah. There. It requires a password.

141 00:13:34.190 00:13:38.200 Miguel de Veyra: I think it’s brain. Forge 2025, or 2024.

142 00:13:39.550 00:13:40.890 Amber Lin: Capitalized, or no caps?

143 00:13:42.280 00:13:44.680 Miguel de Veyra: Wait! Let me double. Let me just double check actually.

144 00:13:44.680 00:13:47.460 Amber Lin: Yeah, yeah, just send it to me when you can.

145 00:13:49.820 00:13:56.699 Amber Lin: Okay, Casie, also, did you update Jenna on the internal leads reporting thing? Is she

146 00:13:57.123 00:14:00.509 Casie Aviles: Yes, yes. I sent her a loom video.

147 00:14:00.910 00:14:02.010 Amber Lin: Oh, okay.

148 00:14:04.700 00:14:09.459 Amber Lin: And since she’s not here, do you know her progress on that?

149 00:14:10.524 00:14:16.909 Casie Aviles: No, I haven’t talked to her about it, but she did start yesterday like she had questions so.

150 00:14:17.580 00:14:25.090 Amber Lin: Okay started on, and all but sales. But.

151 00:14:26.580 00:14:41.359 Miguel de Veyra: I think, cause, Amber, most of the time Janet doesn’t attend the Tuesday and Thursday meetings, I think. You can talk to her about, you know, asking her to send you a Loom video for the daily progress, since she’s done.

152 00:14:41.360 00:14:44.799 Amber Lin: Yeah, I could just text her. It will be fine. I don’t know.

153 00:14:45.255 00:14:45.710 Miguel de Veyra: Okay.

154 00:14:46.244 00:15:00.610 Amber Lin: Like, these video meetings are just for us to talk in real time, which is helpful. But you guys kinda know what she’s doing, and I’ll just text her, and I can call her if we need it. Sorry, doesn’t she

155 00:15:01.660 00:15:04.110 Amber Lin: Wednesday and Thursday?

156 00:15:06.210 00:15:13.930 Amber Lin: Okay, yeah, Miguel, did you end up finding a password for the demo.

157 00:15:14.450 00:15:16.460 Miguel de Veyra: Oh, wait! Sorry. I’m asking my AI system.

158 00:15:16.790 00:15:17.285 Amber Lin: Oh!

159 00:15:17.780 00:15:21.119 Miguel de Veyra: Because I don’t know cause the file is like so huge.

160 00:15:21.490 00:15:21.830 Amber Lin: Okay.

161 00:15:21.830 00:15:23.609 Miguel de Veyra: Can you try typing?

162 00:15:23.900 00:15:24.600 Amber Lin: Yeah, I’ll try.

163 00:15:24.600 00:15:25.160 Miguel de Veyra: ABC.

164 00:15:25.160 00:15:27.740 Amber Lin: But ABC.

165 00:15:27.740 00:15:30.959 Miguel de Veyra: No, all all small, all small letters, no caps.

166 00:15:30.960 00:15:37.139 Amber Lin: ABC, phone oh, yeah, we’re in.

167 00:15:37.460 00:15:38.130 Miguel de Veyra: Okay, yeah. There you go.

168 00:15:39.550 00:15:40.970 Amber Lin: Okay, perfect.

169 00:15:40.970 00:15:43.610 Miguel de Veyra: Let’s keep the fail-safe just in case, so they can set it up.

170 00:15:43.610 00:15:49.150 Amber Lin: Yeah, just text it to you to remember? Well, yeah.

171 00:15:49.150 00:15:54.190 Miguel de Veyra: Yeah. And then, as for my end, I was working on the update bot,

172 00:15:54.300 00:16:05.009 Miguel de Veyra: the update agent, yesterday. Made some pretty good progress. I think I should be able to show something tomorrow, or

173 00:16:05.160 00:16:07.800 Miguel de Veyra: I think tomorrow, or Thursday or Friday.

174 00:16:08.380 00:16:13.179 Miguel de Veyra: But for this one, sorry,

175 00:16:13.480 00:16:18.499 Miguel de Veyra: I think there should also be a feature to, like, add, right? Not only update.

176 00:16:18.790 00:16:25.370 Amber Lin: Yeah, totally. I think eventually what this would be is to help them create.

177 00:16:25.840 00:16:27.600 Miguel de Veyra: Maintain the docs. Yeah.

178 00:16:27.600 00:16:32.759 Amber Lin: Yeah. So we’ll learn their structures and help them create these. Because

179 00:16:33.490 00:16:48.160 Amber Lin: cause today I was talking to Denise, and she spends a lot of time manually creating these documents, and once she makes one update in this, she has to make an update in the other one. The key thing here is that they also have duplicates. They have this

180 00:16:48.500 00:16:52.090 Amber Lin: and the training docs, which means they need to update both of them.

181 00:16:52.680 00:17:02.079 Miguel de Veyra: Could we, for this weekend, focus solely on the update? Because we also have to change, right? The adding, let’s add that later.

182 00:17:05.160 00:17:06.380 Miguel de Veyra: I think that’ll be pretty easy.

183 00:17:06.380 00:17:15.390 Amber Lin: Yeah, that’s a good focus. Thank you for suggesting that. Sure, great, great.

184 00:17:18.069 00:17:20.870 Amber Lin: so I’ll ask you about that tomorrow.

185 00:17:20.990 00:17:25.359 Amber Lin: So we have the initial specs or whatever that is.

186 00:17:25.660 00:17:28.369 Amber Lin: Yeah, perfect.

187 00:17:29.940 00:17:32.420 Miguel de Veyra: Oh, yeah, this, this, the research.

188 00:17:32.760 00:17:34.399 Miguel de Veyra: How did we do here, Casey?

189 00:17:35.890 00:17:43.610 Casie Aviles: Yeah, I didn’t implement anything yet for this, but I did read the documents, I mean the documentation. So.

190 00:17:45.780 00:17:50.179 Casie Aviles: Yeah, for the thumbs up and the thumbs down I don’t think that’s

191 00:17:50.670 00:18:05.100 Casie Aviles: I mean, we could do it, but it’s not like a trigger, so I don’t think we could read it. Like, when someone sends a thumbs up, I don’t think we can instantly see it; it doesn’t trigger anything. So I guess

192 00:18:05.410 00:18:11.469 Casie Aviles: what we could do is, we could send a card, like radio buttons.

193 00:18:11.690 00:18:17.180 Casie Aviles: So I’m playing with that right now, like I’m checking it out. I could show you my screen.

194 00:18:17.590 00:18:18.080 Amber Lin: That’s great!

195 00:18:18.080 00:18:19.239 Casie Aviles: Give you guys an idea.

196 00:18:19.490 00:18:22.810 Amber Lin: Yeah, that’s a lot of progress.

197 00:18:25.770 00:18:27.800 Casie Aviles: Okay, I think you have to.

198 00:18:28.520 00:18:32.450 Amber Lin: How do I give you permission to share? There we go!

199 00:18:33.100 00:18:34.710 Amber Lin: Does this work now.

200 00:18:38.103 00:18:39.349 Casie Aviles: Okay, yeah.

201 00:18:39.350 00:18:40.310 Casie Aviles: Can you see it? Now?

202 00:18:41.233 00:18:45.896 Amber Lin: Wait! I’m trying to figure out. Oh, there I can see.

203 00:18:47.010 00:18:53.760 Casie Aviles: Yeah. So I I guess we could do something like this. So this is still, I’m just testing this out. But.

204 00:18:53.760 00:18:54.100 Amber Lin: Yeah.

205 00:18:54.100 00:19:00.790 Casie Aviles: Maybe we could have something like this, like a radio button, and you could have text input.
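
The card Casie is prototyping, a radio-button rating plus a free-text box, could be sketched as a Google Chat Cards v2 payload. The widget names below (`selectionInput`, `textInput`, `buttonList`) follow the Cards v2 schema as best I can tell and should be checked against the Chat API docs; `feedback_card` and the `submit_feedback` handler are hypothetical names for this team's code:

```python
def feedback_card(message_id: str) -> dict:
    """Build a Google Chat Cards v2 payload asking the CSR to rate a bot
    answer via radio buttons, with an optional free-text comment."""
    return {
        "cardsV2": [{
            "cardId": f"feedback-{message_id}",
            "card": {
                "sections": [{
                    "widgets": [
                        # Radio-button rating for the preceding answer.
                        {"selectionInput": {
                            "name": "rating",
                            "label": "Was this answer helpful?",
                            "type": "RADIO_BUTTON",
                            "items": [
                                {"text": "Helpful", "value": "good"},
                                {"text": "Not helpful", "value": "bad"},
                            ],
                        }},
                        # Optional free-text detail.
                        {"textInput": {
                            "name": "comment",
                            "label": "What was missing? (optional)",
                        }},
                        # Submit button wired to a hypothetical handler.
                        {"buttonList": {"buttons": [{
                            "text": "Submit",
                            "onClick": {"action": {
                                "function": "submit_feedback",
                                "parameters": [
                                    {"key": "messageId", "value": message_id},
                                ],
                            }},
                        }]}},
                    ],
                }],
            },
        }],
    }
```

The bot would attach this payload to (or send it right after) each answer; the `submit_feedback` action callback is where the verdict gets logged.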

206 00:19:01.000 00:19:02.370 Casie Aviles: so yeah, this is.

207 00:19:04.120 00:19:08.910 Amber Lin: Like at this point. It doesn’t even need to look pretty.

208 00:19:09.070 00:19:15.420 Amber Lin: We’ll just tell them we have something, and then they can, and then we can figure out, like, yeah.

209 00:19:15.420 00:19:17.199 Miguel de Veyra: How will that look, Casey?

210 00:19:17.200 00:19:17.860 Amber Lin: Later.

211 00:19:19.970 00:19:28.629 Casie Aviles: I think it will be sent to the chat, so somewhere here. Yeah, I haven’t tried it here yet on the actual

212 00:19:29.070 00:19:30.930 Casie Aviles: chat interface.

213 00:19:31.240 00:19:31.990 Amber Lin: Yeah.

214 00:19:32.220 00:19:33.010 Miguel de Veyra: Maybe, but.

215 00:19:33.010 00:19:33.410 Casie Aviles: Yeah.

216 00:19:33.410 00:19:34.460 Miguel de Veyra: Know something.

217 00:19:35.900 00:19:36.840 Miguel de Veyra: Is he back?

218 00:19:37.500 00:19:39.750 Patrik: Yeah, yeah, I’m back. No.

219 00:19:39.750 00:19:45.599 Patrik: Oh, shit, I’m I’m as in the dark as as you guys. So.

220 00:19:45.600 00:19:46.166 Miguel de Veyra: Yeah, okay.

221 00:19:46.450 00:19:51.450 Patrik: Yeah, yeah, I won’t be much, much help on on that flow.

222 00:19:51.610 00:19:52.000 Amber Lin: Okay.

223 00:19:52.000 00:19:53.479 Miguel de Veyra: Okay. Okay. No. Worries.

224 00:19:55.100 00:19:56.219 Patrik: That looks nice, though.

225 00:19:56.570 00:19:57.600 Patrik: Yeah, so.

226 00:19:57.600 00:20:04.100 Amber Lin: Casie, is this for individual answers, or is it just at the end of the conversation?

227 00:20:04.210 00:20:07.860 Amber Lin: Oh, you brought that up last time. But there is no end of the conversation.

228 00:20:07.860 00:20:08.990 Miguel de Veyra: Yeah, there is no end.

229 00:20:08.990 00:20:12.170 Amber Lin: Okay, okay, yeah. And I don’t.

230 00:20:12.280 00:20:23.760 Amber Lin: I don’t think this would trigger anything. I don’t think when I tell ChatGPT “you did good” or “you did bad” that it changes its answers. They just take it into their system. So it’s just for us.

231 00:20:24.070 00:20:24.790 Casie Aviles: Yes.

232 00:20:25.150 00:20:25.520 Amber Lin: Yeah.

233 00:20:25.860 00:20:30.290 Miguel de Veyra: There’s probably a way, Casey. We can do it like, add a tool,

234 00:20:30.460 00:20:36.410 Miguel de Veyra: you know, a feedback tool. So if feedback is provided, like, “Hey, how did I do?”

235 00:20:36.610 00:20:39.800 Miguel de Veyra: “It did terrible,” it’s gonna use the feedback tool to log that.

236 00:20:40.240 00:20:42.659 Miguel de Veyra: That could be one of the ways we do it.

237 00:20:43.424 00:20:46.560 Amber Lin: Would that be hard to do?

238 00:20:47.010 00:20:47.640 Amber Lin: I mean.

239 00:20:48.330 00:20:50.830 Miguel de Veyra: No, not really. But it’s

240 00:20:51.280 00:20:57.279 Miguel de Veyra: it’s kind of confusing for the user, because, I don’t know, what if the user says, like, you know, “help”?

241 00:20:57.510 00:21:02.789 Miguel de Veyra: That’s considered feedback, I suppose. But yeah, I think we could use something like that now, Casey.

242 00:21:03.550 00:21:04.160 Casie Aviles: Yeah.

243 00:21:04.160 00:21:04.770 Miguel de Veyra: There are some

244 00:21:08.470 00:21:21.980 Miguel de Veyra: You know, like when they’re talking to the bot: the bot gave an answer, and then “this answer is useless.” That’s automatically considered the feedback, and then we connect the tool to the agent, which uses it to log it, right? We can do that.
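
The feedback tool Miguel sketches, where a message like “this answer is useless” gets logged against the answer it refers to, might look like this. The keyword heuristic and the in-memory log are placeholders, and `log_feedback` / `looks_like_feedback` are hypothetical names; in the real bot the agent’s LLM would decide when to invoke the tool:

```python
from datetime import datetime, timezone

FEEDBACK_LOG = []  # stand-in for a real sink (Snowflake table, sheet, etc.)

NEGATIVE = ("useless", "wrong", "not the answer", "terrible", "bad")
POSITIVE = ("helpful", "great", "perfect", "good answer", "thanks")

def looks_like_feedback(message: str):
    """Return 'bad' or 'good' if the message reads as a verdict on the
    previous answer, else None. A real bot would let the LLM decide this."""
    text = message.lower()
    if any(k in text for k in NEGATIVE):
        return "bad"
    if any(k in text for k in POSITIVE):
        return "good"
    return None

def log_feedback(message: str, answer_id: str) -> bool:
    """Tool body: record the verdict against the answer it refers to.
    Returns True if the message was logged as feedback."""
    verdict = looks_like_feedback(message)
    if verdict is None:
        return False
    FEEDBACK_LOG.append({
        "answer_id": answer_id,
        "verdict": verdict,
        "raw_message": message,
        "logged_at": datetime.now(timezone.utc).isoformat(),
    })
    return True
```

As Miguel notes, the ambiguity (“what if the user just says ‘help’?”) lives entirely in the detection step, which is why routing it through the agent rather than a keyword list is the safer design.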

245 00:21:22.080 00:21:28.350 Amber Lin: Or they text “this is useless,” and then we log it.

246 00:21:28.590 00:21:31.060 Amber Lin: Yeah, I was gonna ask you guys.

247 00:21:31.200 00:21:44.190 Amber Lin: does our bot have something similar to ChatGPT? Because sometimes I tell ChatGPT, “No, this is not the answer I want. I want this instead.” Does our bot have the ability to identify that? Or is it just

248 00:21:44.190 00:21:44.540 Miguel de Veyra: There!

249 00:21:44.700 00:21:45.810 Amber Lin: Same bot.

250 00:21:46.180 00:21:49.610 Miguel de Veyra: Yeah, that that should be. That should be.

251 00:21:50.160 00:21:52.609 Miguel de Veyra: I mean the bot should be able to identify that.

252 00:21:52.740 00:21:58.859 Amber Lin: Yeah. So that’s a feedback loop in its own right. So.

253 00:21:59.440 00:22:00.820 Miguel de Veyra: That should be fine.

254 00:22:01.230 00:22:01.920 Miguel de Veyra: Okay.

255 00:22:02.090 00:22:03.420 Amber Lin: I mean, yeah.

256 00:22:03.580 00:22:05.700 Miguel de Veyra: The simplest thing first that you can deliver.

257 00:22:06.060 00:22:12.170 Amber Lin: Yeah. And honestly, I think the good and bad is pretty good,

258 00:22:12.280 00:22:25.524 Amber Lin: pretty good for now. We’ll just see how to incorporate that, maybe in a little button after the text response. But I’m not the one doing it, so I’m just spitting out ideas. If it’s too hard, just don’t.

259 00:22:25.840 00:22:30.469 Miguel de Veyra: Yeah, I think, yeah, cause Google chat. You know, it’s very limited.

260 00:22:31.450 00:22:32.240 Amber Lin: Okay.

261 00:22:32.440 00:22:34.389 Patrik: That’s good to know.

262 00:22:34.683 00:22:39.940 Miguel de Veyra: Casey, do you need one more day to expand the spike to explore that part?

263 00:22:41.190 00:22:44.689 Casie Aviles: I mean, yeah, I mean, yeah, I would do it.

264 00:22:44.995 00:22:45.299 Miguel de Veyra: Yeah.

265 00:22:45.300 00:22:46.640 Casie Aviles: Within a day.

266 00:22:48.760 00:22:51.100 Amber Lin: Implement something simple.

267 00:22:53.236 00:22:54.290 Amber Lin: Last, but.

268 00:22:54.290 00:22:55.030 Miguel de Veyra: Yeah.

269 00:22:55.030 00:22:55.870 Amber Lin: Right?

270 00:22:56.170 00:23:02.460 Amber Lin: We’ll see. And yeah, I’ll put this

271 00:23:02.860 00:23:06.879 Amber Lin: and Miguel working on the update. But

272 00:23:14.540 00:23:18.659 Amber Lin: and then, yeah, and on the

273 00:23:18.890 00:23:32.529 Amber Lin: Braintrust part. Now, I think we can talk about what we’re gonna do today. So I know Miguel’s gonna work on the update bot, Casie’s gonna work on the feedback system. And what about the

274 00:23:32.840 00:23:33.530 Amber Lin: Braintrust?

275 00:23:33.530 00:23:38.869 Patrik: I got a small update and a few questions as well. So let me know when you want me to.

276 00:23:38.870 00:23:42.590 Amber Lin: Okay, okay. Do you wanna just say that now?

277 00:23:43.330 00:23:49.837 Patrik: Yeah, yeah, yeah. Yeah. So the 1st thing I was looking at,

278 00:23:50.420 00:23:53.400 Patrik: basically, I met with Utam and Amber. And

279 00:23:54.130 00:23:59.254 Patrik: yeah, Casie, we’re meeting soon to get another overview.

280 00:24:00.110 00:24:07.570 Patrik: And Miguel, thanks for the download on your end as well. That really helped jump-start my understanding of the system.

281 00:24:08.129 00:24:13.299 Patrik: It sounds like the focus Utam wanted me to look at is

282 00:24:14.096 00:24:24.539 Patrik: just around performance and any sort of optimizations. I think you guys did a really solid job at getting from, you know, the 30 seconds down to,

283 00:24:24.790 00:24:26.800 Patrik: I think it was like an average of 9.

284 00:24:27.730 00:24:35.669 Patrik: Okay, yeah. And ideally, like, we can continue to improve on that down to like.

285 00:24:36.040 00:24:52.371 Patrik: you know, sub-second responses, with the long-term goal being, hey, can we start to explore a voice sort of option with this agent, right?

286 00:24:53.310 00:25:01.720 Patrik: So I think, you know, the 1st step in my mind around the performance problem is observability. Like,

287 00:25:01.940 00:25:09.709 Patrik: what the hell is taking long? And how do we know that? And then, can we understand

288 00:25:10.340 00:25:15.430 Patrik: the changes that we’re making, what impact do they have on our performance?

289 00:25:16.005 00:25:22.229 Patrik: So yesterday I spent some time reading through the n8n docs and the Braintrust docs.

290 00:25:22.610 00:25:29.790 Patrik: I think the ideal solution would be, we write the traces out

291 00:25:30.080 00:25:33.889 Patrik: to the Braintrust OTel collector.

292 00:25:34.260 00:25:40.236 Patrik: However, n8n doesn’t have any sort of native support for

293 00:25:40.780 00:25:41.300 Miguel de Veyra: Hi Clark!

294 00:25:41.300 00:25:44.919 Patrik: any sort of OpenTelemetry, which is pretty annoying.

295 00:25:47.550 00:25:48.070 Patrik: I don’t know.

296 00:25:48.260 00:25:48.450 Miguel de Veyra: Bills.

297 00:25:48.450 00:25:48.980 Patrik: Something.

298 00:25:48.980 00:25:49.420 Miguel de Veyra: Incoming.

299 00:25:49.420 00:25:50.030 Patrik: Yeah.

300 00:25:50.470 00:25:56.540 Patrik: yeah, I was thinking that, too. It seems like there’s a kind of big community ask for it.

301 00:25:57.790 00:26:04.332 Patrik: so I think that’s ultimately, that’s probably the ideal scenario:

302 00:26:04.980 00:26:13.950 Patrik: we can have the traces in Braintrust, we can have the evals in Braintrust, and it’ll capture the entire workflow and pipeline.

303 00:26:14.730 00:26:17.080 Miguel de Veyra: Because right now, it’s just the input/output, right?

304 00:26:18.150 00:26:21.600 Patrik: Yeah, for Braintrust, or for.

305 00:26:21.600 00:26:29.699 Miguel de Veyra: I mean, the one that Braintrust captures, I think it’s just the execution time, input and output, and expected.

306 00:26:29.700 00:26:35.585 Patrik: Yeah, just the input/output. I was gonna ask about how that was hooked up.

307 00:26:36.180 00:26:37.050 Patrik: But

308 00:26:37.880 00:26:46.609 Patrik: yeah. So the other thing, too, is like the Enterprise plan had some like log streaming that we’d be able to connect into, and then just pipe that wherever we want.

309 00:26:47.460 00:26:50.249 Patrik: We’re not on that plan. So that kind of

310 00:26:51.250 00:26:57.719 Patrik: kinda cut that option out. So it basically just turned into: replace our current like Snowflake

311 00:26:58.740 00:27:05.409 Patrik: stuff with a cron job that’s going to just take that

312 00:27:05.740 00:27:10.629 Patrik: action out of the request happy path.

313 00:27:11.404 00:27:18.840 Patrik: So what it’ll do is just run on some sort of like, you know, hourly cadence or or daily cadence, and

314 00:27:19.020 00:27:23.160 Patrik: pull all of the executions that

315 00:27:23.460 00:27:26.939 Patrik: execution data through the n8n API

316 00:27:27.580 00:27:35.590 Patrik: and then just parse through that, clean it up, and write it out to Snowflake.

317 00:27:37.620 00:27:46.949 Patrik: And that’ll have all the like execution times. It breaks everything out by, you know, all the nodes that are getting processed and things like that.

318 00:27:48.910 00:27:56.860 Patrik: so I think that’s like a fair solution for now, and we’ll just keep our eye out on the OTel collector.

319 00:27:57.350 00:28:11.880 Miguel de Veyra: Yeah, cause I don’t think we can go under 5 seconds. I mean, we could, but our context is pretty huge. In total, both of them are around 50 to 80,000.

320 00:28:12.340 00:28:16.110 Miguel de Veyra: So I don’t think we can really go

321 00:28:16.550 00:28:19.250 Miguel de Veyra: below that, because, you know,

322 00:28:19.500 00:28:22.049 Miguel de Veyra: it’s up to the LLM.

323 00:28:22.280 00:28:28.950 Miguel de Veyra: What we can do about that is we could train a model. But I’m not really good at training models; I don’t have experience with it.

324 00:28:28.950 00:28:31.760 Patrik: Yeah, or we reduce the context size as well.

325 00:28:33.898 00:28:42.890 Miguel de Veyra: The context is their documentation. I mean, context and RAG performed almost the same, so it doesn’t make sense to, you know, use RAG.

326 00:28:45.838 00:28:52.259 Miguel de Veyra: I don’t know, but it’s below 10, so it should be okay for now until we find something.

327 00:28:52.260 00:28:55.049 Patrik: Yeah, I mean, yeah, I mean, I like.

328 00:28:56.180 00:29:00.860 Patrik: look, when I’m chatting with ChatGPT, I’m not getting 5-second responses.

329 00:29:01.540 00:29:06.814 Patrik: so I think 5 seconds is still pretty slow. I think we still need to aim lower.

330 00:29:08.740 00:29:16.463 Patrik: and yeah, I mean, any sort of like suggestions you have on that

331 00:29:17.190 00:29:21.709 Patrik: you know, I can take a look at and run with, and you guys can kinda

332 00:29:21.850 00:29:26.584 Patrik: I don’t wanna derail the current, you know, derail the plan that you guys have.

333 00:29:27.200 00:29:31.240 Patrik: But I definitely think it’s worth taking like a stab at what we can do here.

334 00:29:31.990 00:29:32.810 Miguel de Veyra: Yeah,

335 00:29:34.090 00:29:39.120 Miguel de Veyra: I think the best we could do is, cause we can’t really lower the context, because that’s the documentation

336 00:29:41.380 00:29:42.040 Miguel de Veyra: like they.

337 00:29:42.040 00:29:43.260 Patrik: For the context.

338 00:29:47.200 00:29:47.770 Miguel de Veyra: Sorry go ahead.

339 00:29:47.770 00:29:50.679 Patrik: Yeah, okay, yeah, yeah, okay, keep going. What else?

340 00:29:50.997 00:29:56.389 Miguel de Veyra: Cause I think the next step for us is to actually train the model on the data

341 00:29:56.930 00:30:08.849 Miguel de Veyra: right, and then deploy that model to Azure. I don’t know how to do that, so I’m not sure if you have experience with it, but it could probably take some time, maybe a week or 2, and I’m not sure it’s the best use of time.

342 00:30:14.923 00:30:18.790 Patrik: Okay? So yeah, I’ll just make a note of that.

343 00:30:20.610 00:30:28.250 Miguel de Veyra: Yeah, cause I tried training before using OpenAI. But the thing is, Casey, we can’t use it on assistant models, right? Only on.

344 00:30:29.620 00:30:31.219 Miguel de Veyra: I think, completions.

345 00:30:32.280 00:30:32.860 Casie Aviles: Yeah.

346 00:30:33.780 00:30:37.990 Patrik: Do you have notes on the RAG solution?

347 00:30:39.350 00:30:42.280 Miguel de Veyra: RAG and context are performing almost the same.

348 00:30:43.770 00:30:46.499 Patrik: Is there a way I can test that? How are you testing that?

349 00:30:49.350 00:30:53.879 Miguel de Veyra: Oh, we’ll give it to you. Casey, do you have the RAG thing?

350 00:30:54.010 00:30:55.660 Miguel de Veyra: We have a RAG version, right?

351 00:30:55.980 00:30:59.069 Casie Aviles: Yeah, we have another n8n workflow for that.

352 00:30:59.490 00:31:00.510 Miguel de Veyra: What’s the name of that? Again.

353 00:31:00.510 00:31:01.940 Patrik: Bonus. Yeah.

354 00:31:02.710 00:31:04.219 Miguel de Veyra: This is the client Hub, ABC.

355 00:31:07.405 00:31:10.760 Casie Aviles: No, no, not that one company.

356 00:31:11.070 00:31:14.420 Casie Aviles: I think it’s ABC Home Advanced RAG Test.

357 00:31:19.260 00:31:21.459 Miguel de Veyra: Yeah, let me send this out.

358 00:31:22.700 00:31:23.590 Patrik: Thank you.

359 00:31:35.840 00:31:36.490 Miguel de Veyra: Yeah.

360 00:31:50.110 00:31:56.580 Patrik: Yeah, so this will be more or less like a spike from me, Amber, to start digging into

361 00:31:56.950 00:31:57.690 Miguel de Veyra: Yeah.

362 00:31:57.960 00:32:00.110 Patrik: Some of the performance stuff.

363 00:32:08.320 00:32:11.090 Amber Lin: Great and

364 00:32:12.620 00:32:19.209 Amber Lin: cool. That was very helpful to listen in on, though I missed a lot of key words. I’m like, what is that, my God.

365 00:32:19.210 00:32:22.100 Patrik: No, thank you, thanks for taking notes.

366 00:32:22.100 00:32:24.230 Amber Lin: I was trying to figure it out. I think.

367 00:32:24.230 00:32:24.900 Patrik: Yeah.

368 00:32:24.900 00:32:25.940 Amber Lin: Oh, what you mean!

369 00:32:26.420 00:32:27.837 Patrik: I appreciate it.

370 00:32:28.710 00:32:37.209 Patrik: yeah, I guess the last quick thing on my end. So where are we at with the Braintrust stuff, and how is that getting, like,

371 00:32:37.790 00:32:42.600 Patrik: how is that plugged into the workflow?

372 00:32:43.270 00:32:52.090 Casie Aviles: Yeah, I can show you what it looks like, so wait. Yeah.

373 00:32:52.300 00:32:53.120 Amber Lin: I can see your screen.

374 00:32:53.120 00:32:56.100 Casie Aviles: Basically, just a second.

375 00:33:00.140 00:33:05.239 Casie Aviles: So going here to the workflow, I’ve added this Http request.

376 00:33:05.590 00:33:08.570 Patrik: Oh, that’s what that was for. Windmill. Yeah, yeah. So that’s yeah.

377 00:33:08.570 00:33:10.780 Casie Aviles: Yes, gotcha.

378 00:33:10.780 00:33:11.390 Patrik: Okay.

379 00:33:12.465 00:33:15.719 Casie Aviles: And then it sends these things

380 00:33:16.060 00:33:18.909 Casie Aviles: as input as our parameters to the

381 00:33:19.160 00:33:25.920 Casie Aviles: Python script that I have hosted on Windmill. And it has the Braintrust stuff, and it should

382 00:33:26.170 00:33:34.909 Casie Aviles: show up here in, yeah, the experiments over here. And then, yeah, it gets the scores.

383 00:33:36.830 00:33:39.212 Patrik: Gotcha gotcha, can I see the

384 00:33:40.250 00:33:42.459 Patrik: Could you go back to the script real quick?

385 00:33:46.260 00:33:53.050 Patrik: So you’re sending user input, AI output, username, and session ID, sounds like.

386 00:33:54.260 00:33:54.940 Casie Aviles: Yes.

387 00:33:55.610 00:33:56.390 Casie Aviles: Okay.

388 00:33:58.910 00:33:59.810 Patrik: Nice.

389 00:34:00.030 00:34:00.810 Patrik: Okay.

390 00:34:01.950 00:34:04.539 Casie Aviles: I can also send you this code.

391 00:34:06.390 00:34:13.144 Patrik: Yeah, yeah, no. Actually, I can. This is on, oh, this is on Windmill.

392 00:34:14.730 00:34:19.589 Patrik: do you guys have a lot of stuff running on Windmill? I might end up needing access.

393 00:34:21.699 00:34:28.209 Casie Aviles: For the internal stuff that I have for Braintrust, like the internal AI stuff, I have a lot there.

394 00:34:30.110 00:34:37.019 Patrik: Gotcha. Alright! I’ll let you know if I if I need access. But yeah, this is this is helpful. Thank you.
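[Editor's sketch] The hookup Casie walks through could be sketched like this on the Windmill side. It assumes Windmill's convention of an exported `main` function receiving the HTTP request's parameters; the actual Braintrust SDK call and the project name are not stated in the meeting, so they appear only as hedged comments:

```python
def build_eval_record(user_input, ai_output, username, session_id, expected=None):
    """Shape the fields the n8n HTTP Request node sends into one record."""
    return {
        "input": user_input,
        "output": ai_output,
        "expected": expected,
        "metadata": {"username": username, "session_id": session_id},
    }


def main(user_input: str, ai_output: str, username: str, session_id: str,
         expected: str = ""):
    # Windmill calls the exported `main` with the request's parameters.
    record = build_eval_record(user_input, ai_output, username, session_id, expected)
    # The Braintrust logging would go roughly here; the project name below
    # is a placeholder, not something stated in the meeting:
    #   import braintrust
    #   logger = braintrust.init_logger(project="client-hub-bot")
    #   logger.log(**record)
    return record
```

Keeping the record-building step as a plain function makes it easy to test separately from the SDK call.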

395 00:34:38.030 00:34:43.750 Amber Lin: Can I see the, sorry, Casey, I wanted to see the Braintrust evals. What do they look like now?

396 00:34:44.750 00:34:45.683 Casie Aviles: Oh, sure,

397 00:34:50.969 00:34:52.370 Casie Aviles: yeah. So

398 00:34:53.340 00:34:59.850 Casie Aviles: let’s take a look. Like, here we have the input. So this is the question: how long does it take?

399 00:35:00.450 00:35:06.260 Casie Aviles: And then this is the output that the AI produced.

400 00:35:06.390 00:35:08.499 Casie Aviles: and then here’s the expected one.

401 00:35:09.160 00:35:14.340 Casie Aviles: and we have scores here, like for Levenshtein, we have

402 00:35:14.880 00:35:18.794 Casie Aviles: this 0.18, which is pretty low.

403 00:35:19.230 00:35:20.010 Amber Lin: Yeah.

404 00:35:20.530 00:35:22.930 Casie Aviles: But yeah, there’s also the, the

405 00:35:23.505 00:35:29.870 Casie Aviles: we could consider if this is like the best metric to use, because it’s a heuristic. Yeah.

406 00:35:30.183 00:35:34.880 Amber Lin: Yeah, I looked up what it means. I think 0.18 is pretty good.
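[Editor's note] For context on that 0.18: a Levenshtein score is typically a normalized similarity, 1 minus the edit distance divided by the longer string's length, so 0.18 means the output and the expected answer differ in most of their characters. A minimal standalone version (the exact normalization Braintrust applies should be checked against its scorer docs):

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]


def similarity(output: str, expected: str) -> float:
    """Normalized score in [0, 1]; 1.0 means an exact match."""
    if not output and not expected:
        return 1.0
    return 1 - levenshtein(output, expected) / max(len(output), len(expected))
```

For example, `similarity("kitten", "sitting")` is about 0.57 (3 edits over 7 characters), which helps calibrate whether 0.18 is "pretty good" or pretty low.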

407 00:35:36.350 00:35:38.410 Patrik: What ha, ha!

408 00:35:38.620 00:35:42.230 Patrik: Where are you pulling the expected response from.

409 00:35:43.260 00:35:48.100 Casie Aviles: Oh, yeah, it’s from another workflow that we set up.

410 00:35:50.160 00:35:52.369 Patrik: And how do you match that? Oh, okay.

411 00:35:52.370 00:35:57.640 Patrik: So you’re matching some sort of input.

412 00:35:57.640 00:36:00.419 Casie Aviles: Yeah, yeah, exactly. We’re matching.

413 00:36:02.020 00:36:02.650 Casie Aviles: So.

414 00:36:02.650 00:36:06.500 Patrik: Is there like there’s potentially like something like.

415 00:36:06.500 00:36:07.100 Casie Aviles: Yes.

416 00:36:09.300 00:36:11.390 Miguel de Veyra: This is where the golden data is.

417 00:36:11.720 00:36:17.959 Patrik: Yeah, so quick question. So this could potentially be wrong as well, right?

418 00:36:18.130 00:36:20.929 Miguel de Veyra: No, this is definitely wrong. Because this was the old data.

419 00:36:22.770 00:36:30.446 Miguel de Veyra: Basically, our entire test for this was to just see if it connects everything, but.

420 00:36:30.830 00:36:38.069 Patrik: Yeah, yeah, no. I’m saying like, so your eval part is getting an expected output. But that expected output

421 00:36:38.890 00:36:43.640 Patrik: is that expected output coming from another AI agent

422 00:36:43.750 00:36:46.780 Patrik: or another AI workflow, right?

423 00:36:46.780 00:36:47.820 Miguel de Veyra: Yes, yes.

424 00:36:47.820 00:36:48.340 Casie Aviles: Yes.

425 00:36:48.340 00:36:51.113 Patrik: So I guess there’s potentially some

426 00:36:51.760 00:36:54.670 Patrik: hallucination that could go on there.

427 00:36:56.420 00:36:58.239 Miguel de Veyra: Yeah, there’s always a potential for hallucination.

428 00:36:58.240 00:36:58.930 Patrik: Yeah.

429 00:36:58.930 00:37:01.069 Casie Aviles: Yeah, this might not be the best.

430 00:37:01.950 00:37:04.510 Patrik: That’s interesting. I don’t know how else you would.

431 00:37:05.500 00:37:09.989 Patrik: You would find the expected output of an answer.

432 00:37:13.011 00:37:15.258 Patrik: It’s like a chicken-and-egg problem.

433 00:37:17.240 00:37:25.479 Miguel de Veyra: Yeah, I mean, the way we did it is just get the golden data, give it to the agent: hey, based on this user question, what’s the expected answer?

434 00:37:26.150 00:37:26.920 Patrik: I think that’s the problem.

435 00:37:26.920 00:37:27.960 Miguel de Veyra: It’s all a guess.

436 00:37:28.520 00:37:34.200 Amber Lin: Yeah, and probably with more training and more. I I.

437 00:37:34.502 00:37:37.220 Patrik: Do you need to eval that part as well?

438 00:37:37.220 00:37:43.229 Amber Lin: That’s so funny, but it never is. It’s never finished.

439 00:37:44.000 00:37:45.520 Patrik: No, I guess not.

440 00:37:45.900 00:37:46.330 Amber Lin: Oh!

441 00:37:46.330 00:37:49.579 Patrik: It’s cool, though this is. This is, this is really interesting.

442 00:37:50.000 00:37:59.510 Amber Lin: Casey, do we have the, you know, the cool dashboard that we saw last time on Brain Trust.

443 00:38:06.290 00:38:06.789 Miguel de Veyra: It sounds good.

444 00:38:06.790 00:38:07.130 Miguel de Veyra: Sorry.

445 00:38:07.130 00:38:09.409 Patrik: Right now. The bot’s taking 20 seconds.

446 00:38:09.990 00:38:11.410 Amber Lin: Oh, no!

447 00:38:15.120 00:38:15.790 Amber Lin: I.

448 00:38:15.790 00:38:18.952 Patrik: Alright guys. I gotta jump

449 00:38:20.290 00:38:28.980 Patrik: Thanks for staying up, Casey. I’ll talk to you a little bit later, and I’ll chat with you guys on Thursday. But yeah, anything you guys need, you know, I’ll be on Slack as well.

450 00:38:28.980 00:38:30.250 Amber Lin: Yeah. Totally.

451 00:38:32.070 00:38:33.890 Patrik: Thanks, guys. Thanks, everyone.

452 00:38:34.240 00:38:42.409 Amber Lin: Miguel and Casey, just really quick. I know you can see my screen. I just have 2 more items, and we’ll be done.

453 00:38:43.320 00:38:49.350 Amber Lin: We’re gonna work on this today as well, right? The Braintrust and internal Google Cloud, to get it ready

454 00:38:49.660 00:38:51.740 Amber Lin: to implement on their system.

455 00:38:52.821 00:38:54.930 Miguel de Veyra: That’s the code we sent to them.

456 00:38:55.160 00:38:59.850 Amber Lin: Oh, so they can already use it?

457 00:38:59.850 00:39:00.530 Miguel de Veyra: Yeah, yeah.

458 00:39:00.530 00:39:03.190 Amber Lin: Oh, so that’s done. Okay. Oh, my goodness.

459 00:39:03.490 00:39:15.671 Amber Lin: That is done? I didn’t know that was what it meant. I just thought we sent an updated code, that’s all. That’s what I thought. Okay, that’s done. That is so awesome.

460 00:39:16.450 00:39:21.090 Amber Lin: Next, I think, then today we just need to record the Loom video.

461 00:39:22.360 00:39:26.969 Amber Lin: That’s all. And Amber will look at the list of documents.

462 00:39:28.040 00:39:34.099 Amber Lin: Or, if you guys already have that, like, if you guys already have a list, that would be great.

463 00:39:35.570 00:39:37.429 Miguel de Veyra: List of which documents sorry.

464 00:39:37.875 00:39:44.719 Amber Lin: Because last time when we talked, you suggested that we also give them a list of the documents the bot is referring to so.

465 00:39:44.720 00:39:46.180 Miguel de Veyra: Just give them the central doc.

466 00:39:46.740 00:39:47.490 Amber Lin: Yeah, okay.

467 00:39:47.980 00:39:56.839 Amber Lin: Great, so today, I think, if you guys have some time, just to record the

468 00:39:57.896 00:39:59.490 Amber Lin: onboarding walkthrough

469 00:40:00.020 00:40:06.030 Amber Lin: just really quick, how to access the bot, because, also, I don’t know how to access the bot either.

470 00:40:06.420 00:40:07.330 Miguel de Veyra: Okay. Okay.

471 00:40:09.710 00:40:10.779 Amber Lin: Yeah, that’s all.

472 00:40:11.200 00:40:15.419 Amber Lin: Document update, bot model spike, the video. Who do you think?

473 00:40:15.420 00:40:18.340 Miguel de Veyra: I’m gonna check why it’s taking so long again.

474 00:40:18.910 00:40:19.580 Amber Lin: Yeah.

475 00:40:19.850 00:40:20.290 Miguel de Veyra: Okay.

476 00:40:20.290 00:40:22.209 Amber Lin: Thank you guys so much for coming here.

477 00:40:22.450 00:40:24.370 Amber Lin: Thanks everyone.

478 00:40:24.790 00:40:28.089 Amber Lin: Bye, bye. Good job. Yesterday we got so much done.

479 00:40:30.040 00:40:31.570 Miguel de Veyra: Thanks, guys, thanks. Guys.

480 00:40:31.570 00:40:32.480 Miguel de Veyra: Have a good one. Bye, bye.