Meeting Title: AI Team | Daily Standup
Date: 2024-03-27
Meeting participants: Janna Wong, Amber Lin, Patrik, Miguel de Veyra, Casie Aviles


WEBVTT

1 00:01:26.360 00:01:27.620 Amber Lin: Hi team.

2 00:01:30.010 00:01:30.550 Casie Aviles: Hey! Amber.

3 00:01:30.550 00:01:31.579 Amber Lin: I am going to

4 00:01:31.580 00:01:31.980 Janna Wong: Number

5 00:01:31.980 00:01:33.000 Amber Lin: Share my screen.

6 00:01:33.180 00:01:34.730 Miguel de Veyra: Hey, everyone! Good morning!

7 00:01:35.310 00:01:36.540 Amber Lin: Good morning!

8 00:01:39.640 00:01:40.800 Miguel de Veyra: Sorry, can I be heard?

9 00:01:41.810 00:01:42.460 Casie Aviles: Yes.

10 00:01:42.460 00:01:43.570 Miguel de Veyra: Okay. Goodness.

11 00:01:47.180 00:01:48.490 Miguel de Veyra: Is Janice coming?

12 00:01:49.778 00:01:51.870 Amber Lin: I don’t know. It’s okay.

13 00:01:51.870 00:01:54.230 Miguel de Veyra: 'Cause I think she said she has meetings today, right?

14 00:01:55.040 00:01:56.600 Amber Lin: Oh! Did she?

15 00:01:56.600 00:02:01.240 Miguel de Veyra: Yeah, she has trainings. Although I think she said she's gonna have training after.

16 00:02:01.240 00:02:01.750 Amber Lin: Yeah, sure.

17 00:02:01.750 00:02:02.550 Miguel de Veyra: Oh, sure!

18 00:02:02.680 00:02:08.009 Amber Lin: It's okay. I think we can get started, and when she comes, well, we'll talk.

19 00:02:09.289 00:02:10.020 Amber Lin: So.

20 00:02:10.580 00:02:16.570 Amber Lin: First of all, thank you guys so much for working so late yesterday. I am so sorry.

21 00:02:17.280 00:02:22.449 Miguel de Veyra: We all got scolded by Utam, so we had to work really late

22 00:02:22.930 00:02:27.369 Amber Lin: So I really thank you guys for doing all of that. That was so good.

23 00:02:28.280 00:02:39.920 Amber Lin: And let's just start with my updates. So yesterday I was doing the

24 00:02:40.380 00:02:44.140 Amber Lin: rollout plans and all the action plans. So

25 00:02:44.830 00:02:48.299 Amber Lin: we might have to think about, okay, what kind of...

26 00:02:49.082 00:02:52.620 Amber Lin: maybe having a Quickstart document and all of that.

27 00:02:53.409 00:02:59.899 Amber Lin: I will share that in our channel, and if you guys have time, you can take a look at it.

28 00:03:00.910 00:03:10.739 Amber Lin: And yes, Miguel and I yesterday were mostly doing all the tickets to

29 00:03:13.340 00:03:19.610 Amber Lin: add requirements, due dates, and acceptance criteria to everything.

30 00:03:20.237 00:03:23.499 Amber Lin: Miguel, anything else that you were doing yesterday?

31 00:03:24.580 00:03:33.190 Miguel de Veyra: It was probably just fixing the tickets. And then Casie introduced me to someone, 'cause I lost contact with that guy. But yeah, I'm working on the recruitment stuff, too.

32 00:03:34.370 00:03:35.519 Amber Lin: Hmm, okay.

33 00:03:36.700 00:03:43.059 Miguel de Veyra: And then I believe I started a bit, but only a bit, on the internal team stuff.

34 00:03:43.260 00:03:44.260 Amber Lin: Oh, okay.

35 00:03:44.260 00:03:47.459 Miguel de Veyra: So, Amber, speaking of that,

36 00:03:47.590 00:03:56.339 Miguel de Veyra: I think the first thing we need to do, task-wise, is just me and you setting meetings with the internal teams.

37 00:03:56.980 00:04:02.300 Amber Lin: Okay, yeah. Set meetings with the internal teams.

38 00:04:02.960 00:04:06.079 Miguel de Veyra: Or even identify the different internal teams first.

39 00:04:08.200 00:04:13.759 Amber Lin: So there's Ops, Marketing, Data teams.

40 00:04:20.640 00:04:21.890 Amber Lin: Interesting.

41 00:04:22.890 00:04:24.140 Amber Lin: That’s it. Right?

42 00:04:24.410 00:04:24.970 Miguel de Veyra: Yep.

43 00:04:25.600 00:04:44.010 Amber Lin: Okay. Need to do, assign it to you, and a due date. Let's say... Okay, sounds good.

44 00:04:44.970 00:04:46.979 Amber Lin: So that’s something we can do.

45 00:04:47.712 00:04:54.600 Amber Lin: Casie, how about you? I'm so sorry, there was so much stuff that I threw at you. How did

46 00:04:54.600 00:05:02.310 Casie Aviles: Yeah, sure. So I primarily worked on the dashboard. This was automatically flagged as done because of

47 00:05:02.860 00:05:06.070 Casie Aviles: the automation. But I put it back in progress, since

48 00:05:06.250 00:05:14.110 Casie Aviles: I think I still need to have this reviewed by the team first, before we actually have this, you know, for the clients.

49 00:05:14.770 00:05:15.169 Amber Lin: Hmm.

50 00:05:16.725 00:05:22.290 Casie Aviles: But I did push it live already. Oh,

51 00:05:22.290 00:05:25.820 Casie Aviles: do you guys want to see the dashboard right now?

52 00:05:25.820 00:05:26.390 Amber Lin: Sure.

53 00:05:27.727 00:05:29.279 Casie Aviles: I’ll have to share.

54 00:05:36.600 00:05:39.409 Casie Aviles: Okay, let me just remove all of these.

55 00:05:41.010 00:05:46.480 Casie Aviles: Okay, so yeah, I also made a quick Loom video. But

56 00:05:46.480 00:05:48.260 Amber Lin: Yeah, I saw, hmm.

57 00:05:48.870 00:05:52.930 Casie Aviles: Yeah, I just removed one of the dashboards since it was just my test.

58 00:05:53.150 00:05:54.929 Casie Aviles: So it’s not as relevant.

59 00:05:57.733 00:06:03.330 Casie Aviles: Yeah, okay. So this is the latest dashboard. So I tried to get...

60 00:06:04.500 00:06:08.739 Casie Aviles: Yeah, I tried to remove all of the line breaks here, the blanks.

61 00:06:09.030 00:06:13.990 Casie Aviles: All right, the nulls. I tried to remove as much as I can.

62 00:06:14.340 00:06:15.170 Amber Lin: Oh!

63 00:06:15.420 00:06:20.866 Casie Aviles: And then here, if there's no feedback, I just label it as "no feedback."

64 00:06:22.800 00:06:25.550 Casie Aviles: Yeah, they should be live since the end.

65 00:06:26.170 00:06:29.120 Casie Aviles: And then here’s like the data that we have

66 00:06:29.330 00:06:30.000 Amber Lin: Hmm.

67 00:06:31.587 00:06:44.910 Casie Aviles: Yeah, I mean, that's pretty much it. Most of the work that I did was the data part, the data cleaning, because our data was really messy and had a lot of missing values. So

68 00:06:45.020 00:06:47.199 Casie Aviles: that’s where I spent most of the time

69 00:06:48.850 00:06:49.570 Amber Lin: Oh,

70 00:06:51.240 00:06:54.879 Patrik: Oh, where is this data coming from? Just out of curiosity.

71 00:06:55.806 00:07:01.500 Casie Aviles: Yeah, it’s from this new table that I created. It’s on Snowflake. Oh, wait.

72 00:07:04.000 00:07:07.030 Casie Aviles: Yeah, it's from Snowflake. And these are the data that

73 00:07:07.560 00:07:11.830 Casie Aviles: we’re logging from the workflows that we have for the AI agent

74 00:07:16.980 00:07:18.159 Amber Lin: A quick question about that

75 00:07:18.160 00:07:19.340 Casie Aviles: Is it

76 00:07:19.500 00:07:29.780 Amber Lin: Does it include the thumbs up/thumbs down data? Like, is it the updated version of that, or is this still just our testing data?

77 00:07:30.570 00:07:40.200 Casie Aviles: Oh, I removed the test data. That's why I deleted the other dashboard. And this contains the actual thumbs up/thumbs down data from the CSRs.

78 00:07:40.980 00:07:47.739 Amber Lin: Oh, okay, so I can move it. You know, we have a task of, like, adding that to...

79 00:07:49.330 00:07:53.170 Amber Lin: I’ll show you the show you the task in a moment.

80 00:07:55.160 00:07:55.780 Amber Lin: Cool.

81 00:07:57.560 00:08:00.870 Amber Lin: So it's this one, "ABC logs backfill."

82 00:08:01.660 00:08:03.019 Casie Aviles: Yes, it’s this one

83 00:08:07.550 00:08:11.370 Patrik: How'd you generate the table?

84 00:08:13.660 00:08:21.130 Casie Aviles: Oh, how did I generate it? Yeah, initially, this is the table that we were using,

85 00:08:21.680 00:08:27.690 Casie Aviles: "ABC bot feedback," and there are some missing columns. So I just made a copy of this,

86 00:08:28.963 00:08:35.599 Casie Aviles: and then I also ran this through an AI step that generated the quality score.

87 00:08:38.250 00:08:46.189 Casie Aviles: So yeah, we have all these quality scores. It's just checking the input against the output, and also the expected.

88 00:08:46.945 00:08:47.700 Casie Aviles: yeah.

89 00:08:48.400 00:08:56.749 Patrik: So if we wanted to track something other than that ABC workflow, we'd have to duplicate the Snowflake stuff?

90 00:09:00.810 00:09:06.099 Casie Aviles: Yeah, I mean, I just wanted to clean it up. So that's why I duplicated it.

91 00:09:06.200 00:09:07.440 Casie Aviles: Yeah.

92 00:09:08.120 00:09:10.289 Patrik: Gotcha. All right, yeah. Just wondering.

93 00:09:11.090 00:09:14.401 Casie Aviles: Okay. But yeah, I guess that’s pretty much it.

94 00:09:15.660 00:09:31.690 Amber Lin: That's great. Casie, if you look at my screen, we have this little task, the ticket of adding feedback and thumbs data to... we have it in the real dashboard. Is it up to date? Can I check this box and then move it

95 00:09:32.080 00:09:32.890 Amber Lin: out?

96 00:09:36.250 00:09:38.540 Amber Lin: This is up to date, yeah?

97 00:09:38.540 00:09:44.240 Casie Aviles: I'll have to check if they added more, so I can update it again.

98 00:09:45.070 00:09:47.330 Amber Lin: Oh, so right now it’s manual, right?

99 00:09:47.910 00:09:54.219 Casie Aviles: Yeah, because I created a new database... yeah, I mean, a new table, to clean everything.

100 00:09:54.680 00:09:56.269 Amber Lin: Okay, sounds good.

101 00:09:57.467 00:10:00.039 Amber Lin: That is in.

102 00:10:00.820 00:10:02.870 Amber Lin: I'll move that. That's for me.

103 00:10:03.260 00:10:11.620 Amber Lin: So we’re just checking on the different dashboards.

104 00:10:13.210 00:10:14.490 Amber Lin: Who is?

105 00:10:17.850 00:10:18.770 Amber Lin: We did

106 00:10:18.770 00:10:24.469 Casie Aviles: One thing: I did not include an indication of the quality score, how it's calculated.

107 00:10:24.920 00:10:29.469 Amber Lin: I mean, we could either have it on the dashboard or have it on the spreadsheet.

108 00:10:31.300 00:10:36.629 Amber Lin: So is this still in progress? Should I just leave it there, too?

109 00:10:44.974 00:10:52.899 Casie Aviles: So there... there are those, I mean, this Rainforge one. I don't know who that is right now, so that's why I left it unchecked.

110 00:10:54.149 00:11:06.019 Casie Aviles: For filtering the internal AI team, I didn't really implement anything for that, but there is a filter feature for the dashboard I could show as well.

111 00:11:06.370 00:11:07.472 Amber Lin: Okay, sounds good.

112 00:11:10.747 00:11:17.909 Amber Lin: That's probably just adding a tag to the different people, and then filtering by tag. I don't know.

113 00:11:17.910 00:11:18.540 Miguel de Veyra: Yeah.

114 00:11:18.540 00:11:27.779 Casie Aviles: There's this filter feature here, and you could filter it according to the people. Let's say the people from ABC. So, something like this.

115 00:11:29.180 00:11:31.330 Miguel de Veyra: I have a question. Sorry, Casie.

116 00:11:32.290 00:11:35.089 Miguel de Veyra: 'Cause we have to be very careful about the hours.

117 00:11:35.430 00:11:41.040 Miguel de Veyra: How long do you think this will still take to finish, just as a rough estimate?

118 00:11:42.410 00:11:46.572 Casie Aviles: I mean, it’s just, you know, it’s just cleaning up the rest of the

119 00:11:47.790 00:11:50.879 Casie Aviles: the ones that I haven't done. Like, for example, I mean,

120 00:11:51.730 00:11:55.660 Casie Aviles: how important is finding out who the Rainforge

121 00:11:55.880 00:12:00.340 Casie Aviles: AI service is? Like, I'm not sure how to go about this.

122 00:12:00.510 00:12:03.370 Miguel de Veyra: Okay, okay. It's okay.

123 00:12:03.610 00:12:06.760 Miguel de Veyra: Yesterday you spent, I assume, the whole 8 hours, right?

124 00:12:07.810 00:12:08.860 Casie Aviles: Yes.

125 00:12:08.860 00:12:10.190 Amber Lin: Okay, okay, okay.

126 00:12:11.033 00:12:18.880 Amber Lin: Do you think we should just move forward with this? Like, we can check if it's good enough. If it's good enough, we can save

127 00:12:18.880 00:12:23.079 Casie Aviles: Yeah, yeah, I want to check with him also. And

128 00:12:23.080 00:12:23.500 Miguel de Veyra: Yeah.

129 00:12:23.500 00:12:26.766 Casie Aviles: be confident with this, because, you know.

130 00:12:27.140 00:12:29.620 Miguel de Veyra: We can probably just send them a message, like, "Hey, here's..."

131 00:12:29.620 00:12:37.839 Amber Lin: That's good. So let's wait for him before we do any more work on this, because it looks pretty good to me. I know there are, like, nitpicky

132 00:12:37.840 00:12:41.470 Miguel de Veyra: Yeah, details. But the execs probably won't know this. So.

133 00:12:41.470 00:12:54.380 Miguel de Veyra: Yes. Because, just for reference, in case Amber or the other members for ABC don't know it yet: we want to get down to around, was it, 25 or so total spend for all members.

134 00:12:56.160 00:13:02.730 Miguel de Veyra: Right. So ideally, Casie, you're gonna spend like 4 hours a day on this, probably less.

135 00:13:05.310 00:13:05.820 Amber Lin: Yeah.

136 00:13:05.820 00:13:11.719 Amber Lin: And we already spent quite a bit yesterday because we had an emergency

137 00:13:11.720 00:13:12.686 Miguel de Veyra: Yeah, yeah.

138 00:13:14.870 00:13:15.870 Amber Lin: Okay,

139 00:13:17.360 00:13:29.000 Amber Lin: Yeah. The test list I will get from Denise on Friday. I can work on these, and I'll work on a presentation today. Okay, Patrik, any updates on your side?

140 00:13:31.723 00:13:34.690 Patrik: Yes, yeah. So I am

141 00:13:35.850 00:13:38.039 Patrik: trying to look at ways to make the

142 00:13:39.390 00:13:41.939 Patrik: like, chatbot faster, really.

143 00:13:42.220 00:13:42.920 Amber Lin: That’s

144 00:13:43.850 00:13:48.379 Patrik: That's cool. Let me share my screen.

145 00:13:50.205 00:13:57.900 Patrik: So what I did was, I kind of rewrote our workflow yesterday in a couple of different architectures:

146 00:13:59.260 00:14:07.579 Patrik: one following this kind of document-stuffing pattern, you can see here; one using

147 00:14:09.050 00:14:17.949 Patrik: like a similarity search to slim down the context that we're putting into...

148 00:14:18.525 00:14:22.620 Patrik: what's it called... slim down the context that we're shoving into the LLM.

149 00:14:24.730 00:14:29.612 Patrik: Another one which actually like persists

150 00:14:30.460 00:14:39.589 Patrik: the embeddings locally, and then instead of rebuilding the embeddings, we actually just, like, pull them from file.

151 00:14:41.820 00:14:43.950 Patrik: One of these is testing

152 00:14:44.845 00:14:48.940 Patrik: essentially having open AI generate the embeddings.

153 00:14:49.510 00:14:59.009 Patrik: Actually, sorry. This one tests using, like, a local sentence transformer to generate the embeddings. This one is having OpenAI generate the embeddings.

154 00:14:59.330 00:15:06.020 Patrik: And then this one uses cached embeddings, so we actually, like... basically, it just skips that step.
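
A minimal sketch of the locally persisted embeddings Patrik describes: rebuild the vectors only when the document chunks change, otherwise pull them straight from file. The cache path and the stand-in `fake_embed` function are illustrative assumptions, not the team's actual code.

```python
import hashlib
import json
import os

import numpy as np

CACHE_PATH = "embeddings_cache.npz"  # hypothetical local cache file

def fake_embed(texts):
    # Stand-in for a real embedding call (OpenAI or a local sentence
    # transformer); returns one deterministic vector per text.
    vecs = []
    for t in texts:
        h = hashlib.sha256(t.encode()).digest()
        vecs.append(np.frombuffer(h, dtype=np.uint8).astype(np.float32))
    return np.stack(vecs)

def load_or_build_embeddings(chunks):
    """Re-embed only when the chunks change; otherwise load from disk."""
    key = hashlib.sha256(json.dumps(chunks).encode()).hexdigest()
    if os.path.exists(CACHE_PATH):
        cached = np.load(CACHE_PATH, allow_pickle=True)
        if str(cached["key"]) == key:
            return cached["vectors"]  # cache hit: skip the embedding step
    vectors = fake_embed(chunks)      # cache miss: re-embed and persist
    np.savez(CACHE_PATH, key=key, vectors=vectors)
    return vectors
```

The second call with unchanged chunks never touches the embedding step, which is the speed-up being benchmarked.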

155 00:15:07.870 00:15:14.490 Patrik: And I've yet to implement, like, a full RAG solution with, you know, some sort of vector database,

156 00:15:17.050 00:15:23.400 Patrik: but I ran this through some benchmarking stuff, so you can see the output here.

157 00:15:25.280 00:15:35.929 Patrik: Basically, what I've found is, on the LLM side, I'm pretty sure OpenAI is caching

158 00:15:36.320 00:15:39.120 Patrik: all the tokens that it generates.

159 00:15:39.290 00:15:42.409 Patrik: So as long as the context doesn’t change

160 00:15:43.288 00:15:49.912 Patrik: it's just gonna hit the cache, and we actually, like, don't incur that much of a cost.

161 00:15:51.110 00:15:56.330 Patrik: But as soon as the what’s it called?

162 00:15:58.380 00:16:03.099 Patrik: As soon as this ABC central doc,

163 00:16:05.320 00:16:10.324 Patrik: this file that has all the information

164 00:16:11.450 00:16:17.240 Patrik: on pest control, as soon as that changes, then basically we will have a cache miss, and then

165 00:16:20.720 00:16:22.900 Patrik: what’s it called? And then we’ll

166 00:16:23.300 00:16:23.930 Miguel de Veyra: I’ll have to read

167 00:16:23.930 00:16:25.410 Patrik: be, like, super slow again.

168 00:16:25.670 00:16:32.370 Miguel de Veyra: Yeah. But yeah, I mean, 'cause they change the document basically every day. Like, everything changes.
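
For reference, the caching Patrik suspects is prefix-based: if the big, rarely-changing document sits at the front of the prompt and only the question varies at the end, repeated requests share an identical, cacheable prefix. A toy illustration of that prompt layout (the format and function name are assumptions, not the team's workflow):

```python
def build_prompt(static_doc, question):
    # Put the large document first and the per-message question last, so
    # provider-side prefix caching can reuse the unchanged front of the
    # prompt across requests. Illustrative only.
    return f"{static_doc}\n\n---\nCustomer question: {question}"
```

If the document itself changes every day, as Miguel notes, the prefix changes too and every request after an edit is a cache miss.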

169 00:16:33.320 00:16:39.490 Patrik: Yeah, yeah. So I think there are a couple other like

170 00:16:39.680 00:16:42.329 Patrik: reasons why we would want to go with

171 00:16:42.550 00:16:46.750 Patrik: RAG, or like a vector solution.

172 00:16:47.569 00:16:48.920 Patrik: One is like.

173 00:16:49.200 00:16:55.999 Patrik: the document is going to continue to grow, so we're inevitably going to hit some sort of token limit on the LLM side.

174 00:16:56.760 00:16:57.750 Patrik: Yeah.

175 00:16:57.750 00:17:04.550 Miguel de Veyra: And then I think it will also reach a certain point where there’s nothing we can do in terms of speed, because the context is just too big

176 00:17:05.140 00:17:06.859 Patrik: Yes. Correct. Yeah.

177 00:17:07.700 00:17:11.329 Miguel de Veyra: Unless we slice it up into very small chunks.

178 00:17:12.050 00:17:14.610 Patrik: Yeah, yeah, yeah, exactly.

179 00:17:15.730 00:17:23.629 Patrik: So yeah, basically, I was using this text splitter here to

180 00:17:24.069 00:17:30.707 Patrik: pull out, you know, relevant chunks, and we're shoving those into the context instead,
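
The splitting-plus-similarity step Patrik describes could look roughly like this. The fixed-size splitter and cosine top-k below are a generic sketch, not the specific text splitter or embeddings he's using:

```python
import numpy as np

def split_text(doc, chunk_size=200, overlap=50):
    # Naive fixed-size splitter with overlap, standing in for a real
    # text-splitter library.
    chunks, start = [], 0
    while start < len(doc):
        chunks.append(doc[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks

def top_k_chunks(query_vec, chunk_vecs, chunks, k=3):
    # Cosine similarity between the query embedding and every chunk
    # embedding, keeping only the k most relevant chunks for the context.
    sims = chunk_vecs @ query_vec / (
        np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(query_vec) + 1e-9
    )
    best = np.argsort(sims)[::-1][:k]
    return [chunks[i] for i in best]
```

Only the top-k chunks go into the LLM context, which is what keeps the prompt small as the document grows.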

181 00:17:31.410 00:17:35.690 Patrik: so I think I’m like, I’m getting closer to a solution.

182 00:17:36.580 00:17:43.119 Patrik: like a better understanding of how we can work with, you know, this huge document.

183 00:17:43.430 00:17:50.180 Patrik: And I think the next step before we like move over to

184 00:17:50.620 00:17:57.190 Patrik: some sort of vector solution, is making sure that we can understand the quality

185 00:17:57.380 00:18:01.009 Patrik: coming out of the response.

186 00:18:01.480 00:18:05.039 Patrik: So that when we push it live, we know that we’re not like.

187 00:18:05.750 00:18:09.329 Patrik: you know, we're not losing like 20 or 30% on quality.

188 00:18:10.440 00:18:14.869 Miguel de Veyra: And then, I'm not sure if it's possible via vector search,

189 00:18:15.040 00:18:22.409 Miguel de Veyra: because one of the client requests that I think eventually we're gonna do is, for example, when it gives out a certain answer, for example about Thermacell,

190 00:18:22.640 00:18:25.469 Miguel de Veyra: They want to basically include the page.

191 00:18:25.590 00:18:28.570 Miguel de Veyra: Like, you know, "if you want to read more, here's the data."

192 00:18:29.250 00:18:31.210 Miguel de Veyra: Here’s the actual documentation

193 00:18:32.360 00:18:33.800 Patrik: Hmm, yes, yes.

194 00:18:33.800 00:18:36.430 Miguel de Veyra: Because I don't think it's possible via RAG.

195 00:18:36.810 00:18:38.570 Miguel de Veyra: But with context, it should be.

196 00:18:39.300 00:18:41.150 Miguel de Veyra: But yeah, not sure.

197 00:18:42.450 00:18:44.099 Patrik: Also, you can always split.

198 00:18:44.310 00:18:54.690 Patrik: You can always see if they ask for a document, or they want to include a document: you can grab it, split it, and then do a similarity search on that, all in-process.

199 00:19:00.450 00:19:01.589 Patrik: Does that make sense?

200 00:19:02.790 00:19:05.050 Miguel de Veyra: No, I, yeah, a bit. But I think

201 00:19:05.320 00:19:18.639 Miguel de Veyra: what they specifically wanted was, it's on the reply of the bot. So if they click that, it will automatically go to Google Docs or something like that, like the documentation, so they can read the entire thing, because the bot only gives like a 2-to-3-sentence summary, right?

202 00:19:19.470 00:19:26.069 Patrik: Oh, you're saying the response should link out to the actual Google Doc.

203 00:19:26.070 00:19:30.390 Miguel de Veyra: Yeah. Or you know, whatever we use to display the actual documentation

204 00:19:31.270 00:19:33.009 Patrik: Gotcha gotcha gotcha

205 00:19:34.350 00:19:36.810 Miguel de Veyra: Which complicates a lot of things.

206 00:19:37.590 00:19:39.139 Patrik: Yeah, yeah, yeah.

207 00:19:39.848 00:19:42.400 Patrik: We need some sort of, like, structured output.
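
One way the structured output Patrik mentions could be shaped, as a sketch: the bot emits JSON carrying a short summary plus a source link the chat UI can render as a "read more" button. The field names and URL here are hypothetical, not an agreed format.

```python
import json

def parse_bot_reply(raw):
    # Hypothetical structured reply: a short summary for the chat window
    # plus a link back to the full documentation (e.g. the Google Doc).
    reply = json.loads(raw)
    return reply["summary"], reply["source_url"]
```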

208 00:19:42.400 00:19:47.140 Miguel de Veyra: Yeah, but I don’t think we’re gonna be working on that anytime soon, though, because we’re

209 00:19:47.580 00:19:49.159 Miguel de Veyra: bringing down the hours a lot.

210 00:19:53.580 00:19:58.400 Amber Lin: Yeah, Patrik. So what's the response time right now?

211 00:20:00.220 00:20:02.270 Patrik: I haven’t touched anything on the

212 00:20:02.470 00:20:04.729 Amber Lin: Actual workflow. I see, I see.

213 00:20:04.730 00:20:06.210 Patrik: This is kind of just, like,

214 00:20:06.510 00:20:09.729 Patrik: exploration, or more of, like, a spike, I guess.

215 00:20:09.930 00:20:12.080 Amber Lin: Cool. So do you think

216 00:20:12.490 00:20:22.550 Amber Lin: we can change the response time, like, anytime soon? Or how long do you estimate this will take? Because we do want to reduce the hours we spend on this.

217 00:20:31.970 00:20:35.410 Patrik: So you're asking how long it's gonna take?

218 00:20:35.630 00:20:37.450 Amber Lin: Yeah, what do you estimate?

219 00:20:39.755 00:20:40.360 Patrik: Question.

220 00:20:40.700 00:20:41.180 Amber Lin: Yeah.

221 00:20:45.050 00:20:48.881 Patrik: Yeah, that's a good question. Let me look at my Clockify.

222 00:20:50.470 00:20:58.980 Patrik: Let's see, I spent 2 hours on this yesterday. So let's say

223 00:21:00.800 00:21:07.159 Patrik: another hour or 2 to get the RAG running and tested, and then

224 00:21:08.180 00:21:09.850 Patrik: understanding quality.

225 00:21:11.420 00:21:17.520 Patrik: And then maybe another, like, 2 to 3 hours to

226 00:21:17.760 00:21:21.110 Patrik: actually do the NAN implementation

227 00:21:23.417 00:21:25.919 Miguel de Veyra: So thanks, Patrik. Amber, because

228 00:21:25.920 00:21:26.500 Patrik: Yeah.

229 00:21:26.890 00:21:31.709 Miguel de Veyra: Utam said the response times are, like, between 5 to 10 seconds, right, Casie?

230 00:21:33.240 00:21:34.500 Casie Aviles: Oh, yeah. Yeah.

231 00:21:35.060 00:21:35.410 Amber Lin: Yeah.

232 00:21:35.410 00:21:40.450 Miguel de Veyra: Last time we were with Utam, he said that should be fine for now, because the... Yvette

233 00:21:40.450 00:21:48.930 Miguel de Veyra: originally wanted 30. Steve wanted... was it Scott? Scott wanted like 50. And then, what's his name?

234 00:21:49.150 00:21:50.830 Miguel de Veyra: I forgot. There’s so many names.

235 00:21:50.990 00:21:56.809 Miguel de Veyra: But our partner basically said 5. So I think we're where we want to be, anyways.

236 00:21:57.490 00:22:21.680 Amber Lin: Okay, sounds good. Patrik, I know you're, like, super busy, and it's amazing you had time to explore all of this. Probably this would also apply to some more internal chatbots eventually. Right now, I think that's all we wanted to do with this. I put the meeting bot in the waiting room. But that's all we probably wanted to do,

237 00:22:22.520 00:22:46.450 Amber Lin: for now, because we're over-allocated, so that means we're not making money right now. So after this, our AI team, me and Miguel, will talk about the internal AI backlogs, and probably we'll need your help on that as well. So I think we'll tag you, or we'll have another meeting when that comes up.

238 00:22:47.860 00:22:55.250 Patrik: All right, that sounds good. Yeah, I mean, I definitely wanna, like, write it down. Let me

239 00:22:55.370 00:23:10.750 Patrik: put all this information down, so there's actually, like, a knowledge share for the rest of you guys on what I've learned. But yeah, I mean, ideally, you know, we can apply this to all the other workflows and things like that.

240 00:23:10.750 00:23:19.020 Amber Lin: Yeah, sounds good. So I think in Linear, just to keep things clean, I will change your project just to "update knowledge base," and that will be it.

241 00:23:20.630 00:23:21.460 Patrik: Sounds good

242 00:23:21.800 00:23:26.609 Amber Lin: Thank you guys. And also, just for the team, I think

243 00:23:27.800 00:23:35.290 Amber Lin: we are probably going to have a separate ABC standup versus AI standup. So

244 00:23:35.300 00:23:58.530 Amber Lin: I will text in the Slack how that goes, because if we have any internal AI discussions, I want them to be in our separate meeting. And also, probably moving forward with the Friday client meetings: I think this week we still want Casie to present the dashboard, but moving forward, probably the client meetings, you, the engineers, shouldn't have to go to.

245 00:24:01.700 00:24:02.360 Casie Aviles: Okay.

246 00:24:03.080 00:24:08.729 Amber Lin: Yeah. But probably this Friday, we still need you guys to be there. Okay, that’s all for me.

247 00:24:08.870 00:24:16.759 Miguel de Veyra: Amber, I think the other thing is: when do you want to start what we talked about yesterday, about the end-of-day stuff, so we can send it all to Utam?

248 00:24:17.630 00:24:22.910 Amber Lin: Sure! I'll Slack you. We can meet just whenever.

249 00:24:22.910 00:24:23.949 Miguel de Veyra: Okay. Okay. Sure.

250 00:24:24.180 00:24:25.810 Amber Lin: Okay, thanks guys.

251 00:24:25.810 00:24:27.250 Miguel de Veyra: Thanks. Everyone have a good day.

252 00:24:27.500 00:24:28.470 Amber Lin: Alright!