Meeting Title: Friday Brainforge Demos & Retro Date: 2025-05-09 Meeting participants: Luke Daque, Anne, Uttam Kumaran, Amber Lin, Ryan Brosas, Hannah Wang, Miguel De Veyra, Casie Aviles, Robert Tseng, Caio Velasco


WEBVTT

1 00:01:04.769 00:01:05.750 Miguel de Veyra: Hello! Everyone.

2 00:01:05.750 00:01:06.340 Uttam Kumaran: There you go!

3 00:01:09.100 00:01:10.039 Luke Daque: Hi guys.

4 00:01:10.930 00:01:11.800 Miguel de Veyra: I look.

5 00:01:19.480 00:01:20.880 Luke Daque: How’s everyone?

6 00:01:24.830 00:01:25.540 Uttam Kumaran: Good.

7 00:01:26.080 00:01:26.449 Luke Daque: Okay.

8 00:01:29.970 00:01:31.020 Uttam Kumaran: Aye.

9 00:01:34.680 00:01:36.540 Luke Daque: And you go to sleep already.

10 00:02:28.070 00:02:34.610 Uttam Kumaran: Yeah, I think the Eden team is gonna be late. So I guess Hannah and Amber, we can

11 00:02:35.620 00:02:42.280 Uttam Kumaran: push the schedule back a little bit, or we could do something else for the first bit of time.

12 00:02:43.060 00:02:47.890 Amber Lin: Oh, I didn’t know that. They didn’t move this, the retro?

13 00:02:48.330 00:02:49.410 Uttam Kumaran: Yeah.

14 00:02:49.920 00:02:51.090 Amber Lin: -Oh.

15 00:02:51.460 00:02:53.819 Luke Daque: Is it gonna be just us for now?

16 00:02:58.830 00:02:59.980 Uttam Kumaran: Yeah.

17 00:03:01.640 00:03:03.650 Amber Lin: Oh, damn there!

18 00:03:04.720 00:03:06.620 Amber Lin: 30 min.

19 00:03:12.020 00:03:16.979 Uttam Kumaran: We could. I mean, we could run it and just call it a wash, or we could.

20 00:03:18.220 00:03:21.389 Uttam Kumaran: I don’t know if everyone here is down to just

21 00:03:21.840 00:03:26.669 Uttam Kumaran: run this back in 30 min. I know it’s late for some of you guys. So

22 00:03:30.010 00:03:31.220 Uttam Kumaran: more like a.

23 00:03:31.350 00:03:37.359 Amber Lin: Usually we’re starting an hour later, so we can just come back in 30 min.

24 00:03:39.910 00:03:45.179 Luke Daque: Damn. Yeah, Miguel, Casie, are you guys good with that?

25 00:03:46.590 00:03:47.500 Casie Aviles: Yeah, sure.

26 00:03:47.780 00:03:49.610 Miguel de Veyra: Yeah, I think that should be okay.

27 00:03:51.060 00:04:01.749 Uttam Kumaran: I’m down to stay. I mean, I’m just gonna stay on. I’m doing work, so I can just hang out. I mean, I don’t know, otherwise I’ll be hanging out in silence if anyone needs help with anything, but

28 00:04:02.360 00:04:06.170 Uttam Kumaran: answering the emails and stuff. So I’ll be here.

29 00:04:07.510 00:04:08.150 Amber Lin: Okay.

30 00:04:08.150 00:04:12.410 Uttam Kumaran: But everyone else can feel free to drop and join again in 30 min.

31 00:04:12.410 00:04:14.960 Amber Lin: Okay, see you guys in 30 min.

32 00:04:14.960 00:04:16.050 Uttam Kumaran: Okay, okay.

33 00:04:25.276 00:04:29.649 Uttam Kumaran: Caio, the Eden team has a conflict. So

34 00:04:29.800 00:04:32.850 Uttam Kumaran: we were gonna officially start this in 30 min.

35 00:04:34.420 00:04:37.579 Uttam Kumaran: But I’m gonna I’m gonna sit here and do some work, and

36 00:04:38.210 00:04:41.439 Uttam Kumaran: oh, maybe I’ll play some music so.

37 00:04:41.440 00:04:41.980 Caio Velasco: Problem.

38 00:04:41.980 00:04:44.989 Uttam Kumaran: No pressure. If you want to join. Okay? Okay, yeah.

39 00:04:44.990 00:04:46.620 Caio Velasco: No worries. Thank you. Thank you.

40 00:04:47.280 00:04:50.389 Miguel de Veyra: Oh, here I am, now you have to explain again, Uttam.

41 00:04:50.390 00:04:52.029 Uttam Kumaran: No dude. You explain it, you explain it.

42 00:04:52.612 00:05:05.720 Miguel de Veyra: Yeah. So basically, the Eden team will be a bit late, by 30 min. So Uttam is gonna stay on the call, and some people will stay with him. Feel free to leave, though, if you want. Then just come back 30 min after.

43 00:05:06.950 00:05:07.900 Ryan Brosas: Oh, okay.

44 00:05:14.080 00:05:21.020 Uttam Kumaran: I guess, let me think about what I want while I have everybody on the call, like what I can do. Yeah. So, how is

45 00:05:22.260 00:05:24.379 Uttam Kumaran: Walk me through this pre-retrieval stuff?

46 00:05:25.080 00:05:29.489 Miguel de Veyra: Oh, yeah, remember that stuff you sent like over a month ago or 2 months ago.

47 00:05:30.840 00:05:34.620 Miguel de Veyra: I did that. But it’s all via code, I think wait. I think.

48 00:05:35.170 00:05:40.010 Uttam Kumaran: Which one? That course, or that one where that guy was like doing the whole thing?

49 00:05:40.010 00:05:43.460 Miguel de Veyra: No, no, the. It’s a blog, some sort of a blog.

50 00:05:43.460 00:05:44.490 Uttam Kumaran: Yeah, yeah, yeah.

51 00:05:45.390 00:05:47.330 Miguel de Veyra: The advanced RAG thing.

52 00:05:47.570 00:05:48.300 Uttam Kumaran: Yeah.

53 00:05:48.980 00:05:50.540 Miguel de Veyra: Yeah, that one. So basically.

54 00:05:50.540 00:05:51.709 Uttam Kumaran: Send that to me again.

55 00:05:51.940 00:05:52.479 Miguel de Veyra: Yeah, sure.

56 00:05:55.090 00:05:56.260 Miguel de Veyra: Aid.

57 00:05:58.890 00:05:59.999 Miguel de Veyra: Oh, yeah, there you go.

58 00:06:00.310 00:06:01.849 Miguel de Veyra: I sent it on AI team.

59 00:06:03.330 00:06:03.710 Uttam Kumaran: Oh, yeah.

60 00:06:03.710 00:06:13.720 Miguel de Veyra: Basically, yeah, the main problem that we had, that’s why we couldn’t figure it out, was we were depending solely on Supabase, and, what, on n8n even, and Supabase

61 00:06:14.670 00:06:18.809 Miguel de Veyra: like we were. You know, we were just trying to do that there.

62 00:06:19.240 00:06:23.089 Miguel de Veyra: So I took your advice last Tuesday: GPT, Google.

63 00:06:23.440 00:06:41.310 Miguel de Veyra: And now, but yeah, basically, what happens is, there’s a chunking step. It’s all laid out in the documentation. But you know, the large embedding has a maximum chunk size of 8,191 tokens. So we made sure it’s way below that

64 00:06:43.291 00:06:47.510 Miguel de Veyra: and then there has to be some sort of sliding window or basically overlap.
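A minimal sketch of the chunking-with-overlap step being described (the function name is illustrative, plain Python lists stand in for real tokenizer output such as tiktoken's, and the default cap reflects the large-embedding input limit referenced above):

```python
# Token-capped chunking with a sliding-window overlap. Real token counts
# would come from the embedding model's tokenizer; any list works here.
def chunk_with_overlap(tokens, max_tokens=8191, overlap=500):
    """Split tokens into chunks of at most max_tokens, where each chunk
    repeats the last `overlap` tokens of the previous one."""
    if overlap >= max_tokens:
        raise ValueError("overlap must be smaller than max_tokens")
    chunks, start = [], 0
    while start < len(tokens):
        end = min(start + max_tokens, len(tokens))
        chunks.append(tokens[start:end])
        if end == len(tokens):
            break
        start = end - overlap  # slide the window back to create the overlap
    return chunks
```

Each chunk then stays under the embedding model's input limit, and the overlap keeps context that straddles a chunk boundary retrievable from either side.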

65 00:06:47.900 00:06:53.790 Miguel de Veyra: But yeah, I think the biggest part would be pre-retrieval. Retrieval is still a bit on the embedding part

66 00:06:54.200 00:07:02.839 Miguel de Veyra: and then post is at the default. So pre and post are the ones that I will prioritize. But yeah, pre is technically almost done. I’m just

67 00:07:03.370 00:07:08.329 Miguel de Veyra: waiting to discuss with Casie, probably. I was hoping this time, but yeah, after the call, then, I guess.

68 00:07:08.900 00:07:15.050 Uttam Kumaran: Oh, so basically, what you did is you took the Javi messages and then Travi trans

69 00:07:15.790 00:07:19.609 Uttam Kumaran: Javi transformed is has just like.

70 00:07:20.610 00:07:22.589 Uttam Kumaran: I see what you mean. Okay.

71 00:07:22.590 00:07:23.360 Miguel de Veyra: Yes.

72 00:07:24.180 00:07:27.310 Uttam Kumaran: Nice dude. This metadata column is great.

73 00:07:27.490 00:07:30.060 Uttam Kumaran: Oh, dude! The keyword column is great.

74 00:07:30.260 00:07:31.220 Miguel de Veyra: Yes.
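A sketch of what a pre-processed chunk row with metadata and keyword columns like the ones being shown might look like (the field names and the naive keyword extraction are illustrative assumptions, not the actual Supabase schema from the call):

```python
from dataclasses import dataclass, field

@dataclass
class ChunkRow:
    chunk_id: str
    source: str                                   # e.g. "slack", "zoom", "linear", "github"
    content: str                                  # the chunk text that gets embedded
    metadata: dict = field(default_factory=dict)  # channel, date range, authors, part number...
    keywords: list = field(default_factory=list)  # terms for keyword (non-vector) search

def extract_keywords(text, stopwords=frozenset({"the", "a", "an", "and", "to", "of"})):
    """Naive keyword extraction: unique lowercased non-stopword tokens, in order."""
    seen, out = set(), []
    for word in text.lower().split():
        word = word.strip(".,!?")
        if word and word not in stopwords and word not in seen:
            seen.add(word)
            out.append(word)
    return out
```

Keeping keywords alongside the embedding lets retrieval mix vector similarity with plain keyword filtering at query time.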

75 00:07:32.050 00:07:32.870 Uttam Kumaran: Wow!

76 00:07:32.870 00:07:35.659 Miguel de Veyra: See, when we avoid the n8n limitation.

77 00:07:35.940 00:07:36.470 Uttam Kumaran: And that’s.

78 00:07:36.470 00:07:37.069 Miguel de Veyra: Oh, good!

79 00:07:37.070 00:07:39.460 Uttam Kumaran: I’m literally like so happy seeing this.

80 00:07:40.620 00:07:44.239 Uttam Kumaran: Sorry. I’m just kind of tired, but I I swear I’m very happy seeing this.

81 00:07:45.730 00:07:47.909 Uttam Kumaran: Wow! This is amazing dude.

82 00:07:48.970 00:08:02.430 Miguel de Veyra: And then there’s chunking strategies. Originally we were planning to do it like by week, by 2 weeks, but then some messages in the Javi are like 200 back and forth, so it can’t be that, you know. Basically, it’s too long.

83 00:08:02.860 00:08:04.110 Uttam Kumaran: 200.

84 00:08:04.110 00:08:07.840 Miguel de Veyra: Yeah, like, you can actually see in there, like, the message count.

85 00:08:08.150 00:08:12.940 Miguel de Veyra: There, there’s 162 there.

86 00:08:12.940 00:08:13.399 Uttam Kumaran: As well as.

87 00:08:13.400 00:08:15.320 Miguel de Veyra: 34, 168.

88 00:08:19.520 00:08:27.639 Miguel de Veyra: So basically, what had to happen was we had to separate that into different parts. And then you can see, there’s a part section and 4.
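The part-splitting being described could look something like this sketch (the 50-message cap and the field names are illustrative assumptions; the call only mentions a part section and per-part message counts):

```python
def split_into_parts(messages, max_messages=50):
    """Split an over-long conversation into fixed-size parts, each tagged
    with its part number and message count."""
    parts = []
    for i in range(0, len(messages), max_messages):
        batch = messages[i:i + max_messages]
        parts.append({
            "part": i // max_messages + 1,   # 1-based part index
            "message_count": len(batch),
            "messages": batch,
        })
    return parts
```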

89 00:08:27.640 00:08:32.150 Uttam Kumaran: Dude 91 messages seems kind of insane. Are you really.

90 00:08:32.590 00:08:34.630 Miguel de Veyra: I mean, you can see the conversations there.

91 00:08:38.880 00:08:39.720 Uttam Kumaran: Oh!

92 00:08:59.370 00:09:01.480 Uttam Kumaran: Oh! This is for the chunk.

93 00:09:01.880 00:09:02.770 Miguel de Veyra: Yes.

94 00:09:03.210 00:09:09.750 Uttam Kumaran: Oh, okay, okay, so this isn’t necessarily one thread. Okay?

95 00:09:09.950 00:09:11.789 Miguel de Veyra: Yeah, yeah, yeah. That’s for the show.

96 00:09:11.790 00:09:14.989 Uttam Kumaran: Dude, I’m very jealous. So, so what this guy said worked?

97 00:09:16.050 00:09:17.350 Miguel de Veyra: So far. Yes.

98 00:09:18.760 00:09:25.239 Miguel de Veyra: we’re running pretty limited, because I wasn’t able to work on this yesterday, just today, most of it. But

99 00:09:25.830 00:09:28.460 Miguel de Veyra: yeah, cause we had to deploy some agents yesterday.

100 00:09:29.050 00:09:32.049 Uttam Kumaran: And then where is the code running for this.

101 00:09:33.720 00:09:37.880 Miguel de Veyra: The pre-retrieval is. Wait, let me, I think. Did I push it already? I’m not sure.

102 00:09:38.030 00:09:41.240 Uttam Kumaran: You should run this. This is like something great you can run in Dagster.

103 00:09:42.660 00:09:44.349 Miguel de Veyra: Yeah, wait. Let me check.

104 00:09:47.920 00:09:50.819 Miguel de Veyra: Yeah, we transform. No, I haven’t pushed it.

105 00:09:53.620 00:10:02.460 Miguel de Veyra: But yeah, of course, right, the code for this will be completely different for Zoom, for Linear, for GitHub.

106 00:10:02.460 00:10:03.500 Uttam Kumaran: Yeah, yeah, yeah.

107 00:10:04.440 00:10:10.890 Uttam Kumaran: So this is where you should work with a wish. And once you get whatever the pre-retrieval step is, we should add that to Dagster

108 00:10:12.000 00:10:15.370 Uttam Kumaran: or cause he’s pulling the data in.

109 00:10:15.370 00:10:15.710 Miguel de Veyra: Yeah.

110 00:10:16.027 00:10:20.149 Uttam Kumaran: But perfect dude. This is like what I wanted to see. Okay, great.

111 00:10:22.710 00:10:24.383 Miguel de Veyra: Nice good thing.

112 00:10:25.110 00:10:27.750 Uttam Kumaran: No, it’s not. Finally, it’s more like.

113 00:10:29.420 00:10:32.250 Uttam Kumaran: you know, I knew it’s just gonna take some like.

114 00:10:33.020 00:10:36.370 Uttam Kumaran: it’s just this is the next level. You know, this is the next level.

115 00:10:36.370 00:10:37.450 Miguel de Veyra: Yes.

116 00:10:38.010 00:10:46.379 Miguel de Veyra: but yeah, actually me and Casie worked on this. We just hopped on a call, we were discussing ideas, yada yada. He had his, I had mine

117 00:10:46.760 00:10:47.740 Miguel de Veyra: back and forth.

118 00:10:48.660 00:10:50.030 Uttam Kumaran: Hmm, nice. Okay.

119 00:10:56.970 00:11:00.769 Miguel de Veyra: But this this hasn’t been embedded yet, because I want to try something

120 00:11:01.100 00:11:05.320 Miguel de Veyra: that’s cause you know. What I wanna avoid is, if there’s like no

121 00:11:06.390 00:11:11.229 Miguel de Veyra: overlap. I want to have like a 500 to a thousand token overlap, so I’m still at it.

122 00:11:11.230 00:11:12.240 Uttam Kumaran: I see.

123 00:12:13.790 00:12:17.930 Miguel de Veyra: Uttam, since we’re gonna be a bit delayed, I’ll just get coffee downstairs.

124 00:12:18.160 00:12:19.130 Uttam Kumaran: Oh, yeah. Yeah. Yeah.

125 00:12:19.130 00:12:20.480 Miguel de Veyra: If I want beef I’ll just meat.

126 00:12:20.600 00:12:22.310 Miguel de Veyra: See, you guys, bye, bye.

127 00:12:31.380 00:12:37.379 Uttam Kumaran: Okay, guys. Well, all the landing pages are done. Fire, basically right? Most of them.

128 00:12:41.961 00:12:46.620 Hannah Wang: like 4 of them. But they’re not like perfect.

129 00:12:46.930 00:12:48.767 Hannah Wang: But it works.

130 00:12:50.930 00:12:53.750 Hannah Wang: Yeah, I mean, yeah.

131 00:12:55.500 00:13:00.369 Hannah Wang: yeah, I think once we have like the foundations down. I can go ahead and change like

132 00:13:00.780 00:13:02.280 Hannah Wang: copy and stuff.

133 00:13:02.780 00:13:07.159 Uttam Kumaran: Yeah, yeah, I agree. That’s why I was like, get it out, because I can go in and make Webflow copy changes.

134 00:13:07.160 00:13:16.510 Hannah Wang: Yeah. So I think Haleem’s, I mean, all of us are basically on that Figma file right now, like modifying stuff. So I think Haleem’s working on it.

135 00:13:19.500 00:13:22.579 Hannah Wang: Oh, we have like 4 out, I think.

136 00:23:23.540 00:23:24.420 Miguel de Veyra: Hey! It’s.

137 00:23:25.700 00:23:26.450 Uttam Kumaran: Hello!

138 00:23:27.300 00:23:27.980 Miguel de Veyra: Alright!

139 00:24:00.220 00:24:07.190 Uttam Kumaran: So I want to. I want to try to centralize some of the nodes so like, for example, the slack formatter.

140 00:24:08.210 00:24:13.819 Uttam Kumaran: Oh, so you already created a slack tool. So can I just go. I can just go create a couple more tools right? Like similar to that.

141 00:24:16.420 00:24:17.350 Miguel de Veyra: Yes.

142 00:24:18.300 00:24:18.850 Uttam Kumaran: Okay.

143 00:24:24.630 00:24:28.870 Miguel de Veyra: And then, yeah, I think maybe next week that should be a job for me and Casie,

144 00:24:29.850 00:24:33.249 Miguel de Veyra: basically use folders. I don’t know why we’re not using folders.

145 00:24:35.800 00:24:37.459 Uttam Kumaran: Oh, yeah. The folders thing. Yeah.

146 00:34:09.460 00:34:11.050 Hannah Wang: I don’t know if we should wait

147 00:34:11.159 00:34:18.830 Hannah Wang: longer, or if we should just start, because maybe their meeting is running late or something.

148 00:34:18.830 00:34:24.340 Uttam Kumaran: Okay, yeah, it looks like folks are bash, maybe.

149 00:34:24.340 00:34:27.110 Hannah Wang: Regular crew that was here.

150 00:34:27.420 00:34:32.990 Uttam Kumaran: Maybe, I think we just begin and they stream in. That’s fine.

151 00:34:39.570 00:34:48.120 Hannah Wang: Yeah. So I don’t know if we wanna like skip the icebreaker so we can get to the other content or.

152 00:34:49.120 00:34:51.120 Uttam Kumaran: I think we should do the icebreaker.

153 00:34:51.340 00:34:52.370 Hannah Wang: Okay.

154 00:34:53.870 00:34:56.719 Uttam Kumaran: The other content is boring.

155 00:34:57.880 00:35:01.900 Luke Daque: Yeah, I guess we can start. Then it’s just gonna be us. But

156 00:35:02.710 00:35:05.129 Luke Daque: yeah, let me share my screen.

157 00:35:09.410 00:35:11.230 Luke Daque: Can you see my screen?

158 00:35:12.990 00:35:13.890 Luke Daque: Damn.

159 00:35:14.080 00:35:14.550 Uttam Kumaran: Yes.

160 00:35:14.550 00:35:18.099 Luke Daque: Let’s do this. Let’s go with the icebreaker.

161 00:35:19.097 00:35:21.100 Luke Daque: Yeah, this is.

162 00:35:22.650 00:35:34.350 Luke Daque: So let’s maybe just, well, it’s just the 5 or 6 of us. So maybe each of us can share, like, an interesting non-work-related tab that’s currently open in our browsers right now.

163 00:35:34.350 00:35:37.449 Amber Lin: Okay, let me go find it.

164 00:35:37.760 00:35:38.740 Hannah Wang: That’s funny.

165 00:35:38.740 00:35:39.979 Uttam Kumaran: Oh, wow!

166 00:35:40.940 00:35:41.650 Luke Daque: Fleet’s not.

167 00:35:41.650 00:35:42.920 Miguel de Veyra: Circulated.

168 00:35:43.375 00:35:43.830 Luke Daque: Yeah.

169 00:35:44.180 00:35:46.629 Miguel de Veyra: Yeah, mine’s actually in my other screen.

170 00:35:46.860 00:35:49.610 Miguel de Veyra: It’s... does anyone play League?

171 00:35:50.970 00:35:51.780 Amber Lin: Congratulations.

172 00:35:52.630 00:36:02.329 Miguel de Veyra: Yeah, basically, it’s why no one will ever beat Faker. It’s basically a YouTube video of him, of glazing him or something.

173 00:36:02.520 00:36:06.109 Amber Lin: Oh, so I’ve been following the guy since like 2014.

174 00:36:07.310 00:36:08.220 Luke Daque: Nice.

175 00:36:08.420 00:36:09.370 Ryan Brosas: No.

176 00:36:11.430 00:36:16.414 Uttam Kumaran: I don’t know if I like, ’cause my Arc, I have Arc remove

177 00:36:17.580 00:36:21.710 Uttam Kumaran: tabs after like 12 hours, ’cause I just get like

178 00:36:23.360 00:36:28.460 Uttam Kumaran: I do have some weird bookmarks, though. Does that kind of does that count? Can I like.

179 00:36:28.960 00:36:32.389 Luke Daque: I guess so. Yeah, let’s see if I can.

180 00:36:32.390 00:36:33.600 Luke Daque: What’s yours?

181 00:36:35.860 00:36:44.110 Luke Daque: Mine’s essentially, mine is this TradingView.

182 00:36:44.520 00:36:45.820 Amber Lin: Oh dear!

183 00:36:45.820 00:36:46.210 Hannah Wang: Oh!

184 00:36:47.730 00:36:52.250 Luke Daque: It’s been like rallying these past few days, like, almost.

185 00:36:52.250 00:36:58.510 Amber Lin: Oh, wait! Look! How how deep did it drop last time, like.

186 00:36:58.510 00:36:59.870 Luke Daque: Not so rash.

187 00:36:59.870 00:37:05.180 Luke Daque: It is like 32%, 30%. And now it’s like, up.

188 00:37:05.180 00:37:06.360 Amber Lin: Oh, it’s back!

189 00:37:06.530 00:37:13.289 Luke Daque: Up again like 39%, 40% from the recent bill.

190 00:37:13.290 00:37:18.650 Amber Lin: The amount of people that have just probably just took the loss at that point.

191 00:37:19.340 00:37:24.710 Luke Daque: Yeah, it’s very dull at times. But yeah, I keep tabs on the stuff.

192 00:37:24.710 00:37:25.990 Amber Lin: So interesting.

193 00:37:25.990 00:37:26.690 Luke Daque: Trading.

194 00:37:27.670 00:37:35.436 Amber Lin: That’s so interesting. I was almost gonna work at a crypto hedge fund, and then the crypto market went really bad.

195 00:37:36.870 00:37:37.550 Luke Daque: Okay.

196 00:37:39.728 00:37:44.629 Amber Lin: I shared my tab. I was looking at the different cities to

197 00:37:44.970 00:37:50.520 Amber Lin: move to these days, because my lease is ending in mid-August, and I was like.

198 00:37:50.520 00:37:51.710 Hannah Wang: Okay.

199 00:37:52.120 00:37:59.210 Amber Lin: Maybe I don’t have to stay in LA, so I was looking at different places to escape to in the summer. But then I realized

200 00:37:59.600 00:38:05.760 Amber Lin: like July is the hottest month. So I’m kind of still stuck when it’s the hottest.

201 00:38:07.150 00:38:14.080 Amber Lin: But maybe maybe I’ll go to New York. I don’t know yet. Maybe I’ll go to San Francisco.

202 00:38:16.140 00:38:19.759 Hannah Wang: No, LA has the best weather. Stay here.

203 00:38:19.760 00:38:20.370 Amber Lin: Thank you.

204 00:38:23.910 00:38:38.469 Hannah Wang: Mine is non-work-related, but it feels like work because there’s data in it. And it’s my first time, like, looking at metrics and stuff. So I hooked up my website to Google Analytics.

205 00:38:38.830 00:38:39.170 Uttam Kumaran: No.

206 00:38:39.170 00:38:41.329 Hannah Wang: I don’t know what any of this really is.

207 00:38:41.650 00:38:42.510 Hannah Wang: What.

208 00:38:42.830 00:38:44.460 Amber Lin: Can we see your website?

209 00:38:44.460 00:38:49.899 Hannah Wang: Oh, I guess that’s a link. Oh, it’s just like kind of vulnerable, I guess. But sure you can.

210 00:38:50.610 00:38:53.226 Amber Lin: We need some boosted traffic.

211 00:38:53.750 00:38:54.610 Hannah Wang: Engagement.

212 00:38:54.610 00:38:54.990 Amber Lin: Badge.

213 00:38:54.990 00:38:56.461 Hannah Wang: With my site. Please.

214 00:38:56.830 00:38:58.740 Amber Lin: Drive and reshare.

215 00:38:58.740 00:39:00.020 Uttam Kumaran: Yeah.

216 00:39:00.200 00:39:00.680 Amber Lin: Oh!

217 00:39:00.680 00:39:01.220 Hannah Wang: Awesome.

218 00:39:01.220 00:39:06.880 Amber Lin: Thank you, really. Dang, that looks like stock photos, the one with the shadows.

219 00:39:07.710 00:39:13.600 Hannah Wang: It’s not. Yeah, that’s my website.

220 00:39:16.370 00:39:24.389 Hannah Wang: It was really easy making it. It’s on a platform called Showit. I find it a lot easier than Webflow and

221 00:39:24.620 00:39:25.090 Amber Lin: Hmm.

222 00:39:25.090 00:39:29.690 Hannah Wang: Like all the other software or website building tools.

223 00:39:30.890 00:39:31.680 Hannah Wang: Yeah.

224 00:39:31.680 00:39:36.199 Amber Lin: Mine is unfortunately stuck on Wix, and I don’t like it.

225 00:39:36.200 00:39:43.170 Hannah Wang: Oh, use Showit. I mean, it’s kind of steep, the price is kind of steep, but I think it’s worth it if you

226 00:39:43.390 00:39:44.270 Hannah Wang: yeah.

227 00:39:44.700 00:39:51.000 Hannah Wang: Well, thanks for, like, thoroughly clicking through everything, showcasing my website.

228 00:39:51.000 00:39:51.640 Amber Lin: Good.

229 00:39:51.730 00:39:52.690 Hannah Wang: Thank you.

230 00:39:52.690 00:39:56.980 Amber Lin: Let me pull up my website.

231 00:39:59.157 00:40:01.959 Amber Lin: I haven’t updated in a while.

232 00:40:02.090 00:40:04.920 Amber Lin: Okay, dropped it in the chat.

233 00:40:08.550 00:40:09.730 Hannah Wang: Oh, I’m in.

234 00:40:09.730 00:40:18.909 Amber Lin: Talking about, like, the League game, me and my girlfriend were watching Apex, I think, Nationals the other day.

235 00:40:19.110 00:40:20.940 Amber Lin: That was very interesting.

236 00:40:21.620 00:40:23.179 Luke Daque: Apex Legends?

237 00:40:25.480 00:40:26.380 Amber Lin: Oh.

238 00:40:30.505 00:40:35.910 Amber Lin: yeah, it’s a battle royale. It’s like a combination of controller and keyboard games.

239 00:40:35.910 00:40:36.460 Miguel de Veyra: Yeah.

240 00:40:36.900 00:40:37.700 Amber Lin: Yeah.

241 00:40:38.880 00:40:41.709 Luke Daque: Oh, you do illustration too?

242 00:40:41.710 00:40:42.700 Amber Lin: Do a lot of.

243 00:40:42.700 00:40:44.739 Hannah Wang: Oh, wow! Thanks!

244 00:40:44.740 00:40:47.840 Luke Daque: Is this the actual painting, painting.

245 00:40:48.217 00:40:58.020 Amber Lin: This is digital. I used to do actual painting, but I moved around so much I can realistically only keep an iPad and not anything else.

246 00:40:58.720 00:40:59.300 Amber Lin: So.

247 00:40:59.678 00:41:02.329 Luke Daque: This is great, though, like you, yeah.

248 00:41:03.270 00:41:04.220 Amber Lin: I appreciate it.

249 00:41:06.040 00:41:07.240 Luke Daque: Awesome.

250 00:41:07.410 00:41:09.786 Amber Lin: Yeah, this is what I do outside of work.

251 00:41:10.050 00:41:10.735 Hannah Wang: Yeah,

252 00:41:14.690 00:41:18.010 Luke Daque: Yeah. Anybody else want to share their

253 00:41:18.220 00:41:21.230 Luke Daque: interesting non-work tabs? Uttam, you were saying.

254 00:41:21.230 00:41:37.669 Uttam Kumaran: Yeah, I can share. Yeah, I was looking, ’cause like, when I’m on my machine, it’s like, actually mostly work. It’s kind of boring. But I was looking at what’s in my Safari tabs, and there’s a local, like, thrift store here in Austin that I may go to tomorrow.

255 00:41:37.670 00:41:38.330 Hannah Wang: Oh!

256 00:41:38.330 00:41:39.350 Amber Lin: Yay!

257 00:41:39.870 00:41:42.139 Uttam Kumaran: And they have great stuff like.

258 00:41:42.140 00:41:42.800 Hannah Wang: Yeah, okay.

259 00:41:42.800 00:41:44.800 Uttam Kumaran: More like curated thrift.

260 00:41:45.090 00:41:49.892 Uttam Kumaran: So it’s like a little bit pricey, but like, just like

261 00:41:50.560 00:41:54.750 Uttam Kumaran: really cool, like, I kind of like going for jackets, or like graphic tees

262 00:41:57.280 00:41:59.890 Uttam Kumaran: and they just have like a lot of stuff. So

263 00:42:00.040 00:42:03.260 Uttam Kumaran: I just sit and like kind of go through every single one, and like.

264 00:42:03.820 00:42:05.928 Uttam Kumaran: see if there’s anything cool.

265 00:42:07.420 00:42:08.669 Amber Lin: Who was that?

266 00:42:09.400 00:42:11.730 Uttam Kumaran: So let’s find anything.

267 00:42:12.330 00:42:14.260 Uttam Kumaran: I will, I will! I will.

268 00:42:15.540 00:42:18.190 Amber Lin: Oh! Anne said, try Figma Make.

269 00:42:18.840 00:42:19.880 Uttam Kumaran: Try to what.

270 00:42:20.090 00:42:23.999 Amber Lin: The new Figma Make. Is it, is it automation for Figma?

271 00:42:25.030 00:42:29.460 Uttam Kumaran: Wait, there. No, there’s a new Figma thing. It’s basically like Bolt for Figma.

272 00:42:30.680 00:42:32.600 Amber Lin: Oh, I see, I see

273 00:42:33.810 00:42:40.110 Amber Lin: I was looking for one when I was doing UX design, but then they didn’t do it really well.

274 00:42:40.310 00:42:41.519 Amber Lin: Is it good?

275 00:42:44.630 00:42:45.950 Uttam Kumaran: I haven’t tried it yet.

276 00:42:48.000 00:42:49.690 Hannah Wang: I haven’t either, Anne. Happy.

277 00:42:49.690 00:42:50.250 Amber Lin: It’s.

278 00:42:51.060 00:42:58.449 Anne: I know, I meant the Figma Draw, because you said earlier that you’re drawing on iPad,

279 00:42:58.610 00:43:00.220 Anne: so maybe you should try it.

280 00:43:02.350 00:43:02.970 Hannah Wang: Oh! Fantastic!

281 00:43:02.970 00:43:07.050 Anne: Instead of, yeah, yeah, instead of the Adobe Illustrator.

282 00:43:07.970 00:43:09.000 Amber Lin: Oh!

283 00:43:10.490 00:43:15.210 Hannah Wang: Dang, Figma is just trying to become an all-in-one platform.

284 00:43:16.940 00:43:26.080 Uttam Kumaran: No, some of their stuff is great, I mean. Look, I don’t know. It depends on if Haleem is like, we should move this to Figma Sites, then we’ll move it. I just feel like it’s probably not as good yet, but like

285 00:43:26.290 00:43:30.069 Uttam Kumaran: I would do everything in Figma if we could. I mean, we’re paying a lot of money for it.

286 00:43:30.640 00:43:31.470 Hannah Wang: Yeah.

287 00:43:36.440 00:43:39.540 Luke Daque: Cool. Yeah, I think that’s it for the icebreaker.

288 00:43:39.690 00:43:41.960 Luke Daque: unless, Anne, you want to share something.

289 00:43:45.480 00:43:49.739 Anne: No, okay, that’s so bad for maybe yes.

290 00:43:51.430 00:43:54.590 Anne: I think you don’t want to see that it’s

291 00:43:56.240 00:43:59.569 Anne: I want to renovate my bathroom. So.

292 00:44:00.420 00:44:01.529 Hannah Wang: Do you have a pinterest board.

293 00:44:03.634 00:44:06.180 Anne: None so far. Just browsing, maybe.

294 00:44:07.660 00:44:08.490 Uttam Kumaran: Nice.

295 00:44:10.000 00:44:10.640 Luke Daque: Cool.

296 00:44:11.720 00:44:17.029 Luke Daque: Yeah. So maybe we can proceed with the next part, which is the lab share.

297 00:44:17.450 00:44:24.649 Luke Daque: And so I’m gonna share something about continuous improvement, which is something I

298 00:44:26.770 00:44:31.980 Luke Daque: haven’t done in the past, basically. So practicing continuous improvement at work and beyond.

299 00:44:32.590 00:44:36.160 Luke Daque: So for anybody who doesn’t know what continuous improvement is.

300 00:44:36.160 00:44:37.809 Amber Lin: Whoa! Whoa!

301 00:44:38.858 00:44:55.440 Luke Daque: It basically originated in Japan, like Kaizen. It’s a philosophy meaning “change for the better,” basically. And it’s like improvement as an ongoing effort. So that’s why it’s called continuous, because, like

302 00:44:56.140 00:45:05.119 Luke Daque: small, incremental steps, and that leads to significant results over time, which is very interesting, really, because, like

303 00:45:05.430 00:45:24.839 Luke Daque: human nature is really not very good at adapting to drastic changes. But when you do, like, small, consistent changes over time, that’s where, like, results really happen. Like, for example, losing weight. If you do a drastic diet change, that’s not gonna work. But if you, like,

304 00:45:25.270 00:45:29.409 Luke Daque: do a diet change incrementally, slowly, like removing

305 00:45:29.640 00:45:33.959 Luke Daque: 50 calories a day, for example, or like exercising, like

306 00:45:34.620 00:45:38.010 Luke Daque: having 500 more steps in like

307 00:45:38.380 00:45:42.990 Luke Daque: every day that would like lead to better results over time.

308 00:45:45.710 00:45:51.370 Luke Daque: So a little bit of history, and like how I discovered or not really discovered, but like

309 00:45:53.240 00:46:11.790 Luke Daque: like how I got into continuous improvement. So I used to work at Lexmark as a process analyst. And that’s basically my role there. As a process analyst, I was, like, mapping workflows, identifying bottlenecks, and, like, recommending changes to improve the processes,

310 00:46:12.100 00:46:26.009 Luke Daque: and basically saw firsthand, like, how that had a big impact in terms of, like, eliminating waste and getting process efficiencies and stuff like that. And on a side note, that’s actually also where I

311 00:46:26.710 00:46:32.260 Luke Daque: incidentally fell in love with data, basically, because we need data to measure

312 00:46:32.903 00:46:41.980 Luke Daque: improvements. And without it, we really can’t measure improvements objectively. So it’s like data-driven

313 00:46:43.110 00:46:44.220 Luke Daque: results.

314 00:46:45.070 00:46:55.250 Luke Daque: So yeah, so we can apply continuous improvement at work. There’s a lot of things that we can improve like how we can improve our communication, for example. And that’s

315 00:46:55.390 00:47:07.410 Luke Daque: exactly what we are already starting to do. Like documenting everything trying to automate stuff, especially with the advent of AI. There’s a lot that we can improve, and that’s that’s really great.

316 00:47:09.170 00:47:13.789 Luke Daque: But I’m like for for this change session. I’m like trying to

317 00:47:14.350 00:47:22.589 Luke Daque: show like how we can also do it outside work, like in terms of hobbies, and like even just general improvement.

318 00:47:24.210 00:47:26.439 Luke Daque: On yourself. So like

319 00:47:26.590 00:47:48.399 Luke Daque: these are, like, examples of how I try to implement it in my hobbies. Like in table tennis, for example, I try to, like, record my matches, then review my mistakes, and practice to target my weaknesses, to improve over there. Or, like, speedcubing. I’m not sure if anybody’s familiar with speedcubing, it’s basically, like, solving

320 00:47:48.510 00:47:50.059 Luke Daque: the Rubik’s cube as fast as.

321 00:47:50.060 00:47:51.670 Uttam Kumaran: What’s your fastest time?

322 00:47:52.797 00:47:56.839 Luke Daque: Still just 11 seconds, not even sub-10.

323 00:47:57.240 00:48:00.810 Uttam Kumaran: Nice, sub-10 seconds.

324 00:48:00.810 00:48:03.910 Uttam Kumaran: This was, I think, 30 seconds.

325 00:48:04.560 00:48:06.400 Luke Daque: That’s that’s good enough.

326 00:48:06.400 00:48:09.349 Amber Lin: I’ve never done it under 30 min.

327 00:48:09.350 00:48:11.280 Hannah Wang: Can’t even do it. Yeah. So.

328 00:48:11.280 00:48:23.190 Uttam Kumaran: I was only using beginner strategy. I didn’t even go, like, advanced. Like, I didn’t look at, go to the next, like, F2L, like, those steps. I mean, I had it, but they were longer. They’re like 4 or 5 sequence steps.

329 00:48:23.660 00:48:26.689 Luke Daque: Yeah, they’re even longer steps. But yeah.

330 00:48:27.215 00:48:34.609 Luke Daque: I got addicted to that, especially during the pandemic, like, just nothing to do. So I just solved cubes and stuff like that. So

331 00:48:35.000 00:48:42.139 Luke Daque: yeah, also, like, trading. Yeah, back-testing strategies, improving. So it’s basically the same cycle

332 00:48:42.570 00:48:44.219 Luke Daque: private. Second to

333 00:48:45.150 00:49:01.219 Luke Daque: so yeah, essentially continuous improvement is the same as the growth mindset if you apply it to yourself and, like, learning. It enforces the belief that you don’t need to be great to start, but you need to start to be great, and it keeps us humble as well, and.

334 00:49:01.220 00:49:02.549 Uttam Kumaran: Nice. I like that.

335 00:49:02.850 00:49:03.570 Luke Daque: Yeah.

336 00:49:04.280 00:49:12.400 Luke Daque: it’s not about perfection, but an ongoing process to improve little by little every day. So, incremental improvements, basically.

337 00:49:13.380 00:49:19.820 Luke Daque: And these are some examples of what you can ask yourself to improve, basically.

338 00:49:20.620 00:49:23.639 Uttam Kumaran: It’s sort of how we do the retros, right? Like.

339 00:49:23.890 00:49:25.160 Luke Daque: Exactly. Yeah.

340 00:49:26.300 00:49:26.750 Uttam Kumaran: Yeah.

341 00:49:26.750 00:49:28.740 Luke Daque: Yeah, this is like small.

342 00:49:29.780 00:49:33.170 Luke Daque: Some quote I found somewhere. One second.

343 00:49:33.740 00:49:35.799 Luke Daque: yeah, any thoughts on those, guys?

344 00:49:35.800 00:49:40.039 Amber Lin: When I was 1st learning project management, I was looking at a lot of the

345 00:49:40.250 00:50:08.350 Amber Lin: like supply chain management, and I know Lean Six Sigma and all that is a big part of supply chain management, so this reminds me of that. And then also, I just wanna echo what you said: I enjoy doing this every single day, currently more on personal stuff than on work. I don’t know, for me it just feels like the natural thing. So it’s pretty cool to see that

346 00:50:08.510 00:50:11.519 Amber Lin: like this is something also very important to you, too.

347 00:50:13.170 00:50:13.920 Luke Daque: Nice.

348 00:50:14.670 00:50:15.480 Amber Lin: Yeah.

349 00:50:17.010 00:50:23.429 Amber Lin: and we have a lot of processes in the company where we can use some of that insight.

350 00:50:24.070 00:50:25.870 Luke Daque: Yeah, definitely, okay.

351 00:50:26.940 00:50:31.260 Uttam Kumaran: Yeah, I think a lot about just like iteration cycles, too, like in a company, you know.

352 00:50:31.370 00:50:46.989 Uttam Kumaran: Probably the easiest form that we can all understand is sprints, right? So we have a start, we have our dailies, and then we have the end, and that is a 2-week cycle, right? But commonly we’re only as fast as the quickest way we can

353 00:50:47.290 00:50:49.589 Uttam Kumaran: do something, test it, and then improve.

354 00:50:49.790 00:51:04.209 Uttam Kumaran: And so I think about how we can shorten that to days, right? Like we’re maybe just one 30-minute meeting away from learning something. And so for me, it’s always about: how do you increase the iteration cycles? How do we do more cycles?

355 00:51:05.840 00:51:08.240 Uttam Kumaran: We’re only human. But I also think that

356 00:51:08.390 00:51:14.219 Uttam Kumaran: commonly in businesses there’s just so much going on that things get pushed, and then for the business

357 00:51:14.390 00:51:21.379 Uttam Kumaran: The iteration cycle is a lot slower, right? And so I think about every week is the business getting 1% better.

358 00:51:21.500 00:51:25.099 Uttam Kumaran: which means at the end of the year we should be 300% better. Right?

359 00:51:25.600 00:51:29.809 Uttam Kumaran: That’s, for me, when I reflect on the week, the question is:

360 00:51:30.360 00:51:32.200 Uttam Kumaran: are we 1% better, you know. So.
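[Editor's note: a quick check of the 1%-per-week arithmetic mentioned above. Weekly gains compound multiplicatively, so 1% per week works out to roughly 68% over a 52-week year rather than 300%; reaching 4x in a year would take about 2.7% per week. A minimal sketch:]

```python
# Compound weekly improvement: each week multiplies the baseline by (1 + rate).
def yearly_growth(weekly_rate: float, weeks: int = 52) -> float:
    """Return the total fractional improvement after `weeks` weeks."""
    return (1 + weekly_rate) ** weeks - 1

print(f"1% per week -> {yearly_growth(0.01):.0%} per year")    # ~68%
print(f"2.7% per week -> {yearly_growth(0.027):.0%} per year")  # ~300%
```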

361 00:51:35.680 00:51:42.330 Luke Daque: Yes, yeah. Any other thoughts, guys?

362 00:51:42.900 00:51:48.019 Luke Daque: If not, we can proceed with the next part, which is the client health.

363 00:51:49.560 00:51:51.400 Uttam Kumaran: Do you want to refresh this, Luke.

364 00:51:52.190 00:51:52.920 Luke Daque: Sure.

365 00:51:54.020 00:51:54.930 Uttam Kumaran: Okay, cool.

366 00:51:55.730 00:51:56.830 Uttam Kumaran: So

367 00:51:57.450 00:52:04.081 Uttam Kumaran: yeah. And I would say, for the most part, except for Javi, I think we’re doing really well.

368 00:52:04.850 00:52:18.440 Uttam Kumaran: So maybe I’ll just focus on that, because I think on everything else we’re going pretty strong. Or maybe someone from the Eden team can comment on Eden. But

369 00:52:18.993 00:52:27.496 Uttam Kumaran: on Javi: yeah, so we’re basically in renewal talks with them, but I don’t know, they’re sort of dragging their feet. So let’s see. I think this is

370 00:52:28.090 00:52:32.890 Uttam Kumaran: the tough part of what we do, always trying to share what we know.

371 00:52:33.410 00:52:41.659 Uttam Kumaran: We’re always up against people trying to do this on their own, and so ultimately there are times where clients graduate from us.

372 00:52:43.090 00:52:58.469 Uttam Kumaran: you know. So I think this is an opportunity for us to see, like, hey, in what ways can we still be helpful for them? And in what ways are they interested in taking on stuff themselves? It’s also a very hard time for e-commerce businesses,

373 00:53:00.040 00:53:07.040 Uttam Kumaran: just given that shipping overseas, mostly from China, is really affected.

374 00:53:07.495 00:53:22.350 Uttam Kumaran: So given that our company is working with a lot of e-commerce companies, it’s important for us to empathize and then find ways to pivot, right? And I think Pool Parts is a good example: we heard that it’s affecting their business, and we found a couple of ways to improve.

375 00:53:22.934 00:53:30.610 Uttam Kumaran: But overall I feel pretty good about all of our companies. The only other company I don’t have here is Off the Record, which we started with this week.

376 00:53:31.437 00:53:35.730 Uttam Kumaran: So we’re working on AI optimizations for them.

377 00:53:35.910 00:53:37.370 Uttam Kumaran: And we have

378 00:53:37.510 00:53:44.319 Uttam Kumaran: 2 proposals out for 2 new clients. Not proposals, actually; those are actually both in signing. So

379 00:53:44.490 00:53:49.714 Uttam Kumaran: this may expand by another 2. Hopefully by Tuesday.

380 00:53:51.130 00:53:59.850 Uttam Kumaran: so overall, I feel pretty good. I think the next big milestone that I kind of want to call out on this call is for Urban Stems. I want to really try to lock down

381 00:54:00.080 00:54:02.880 Uttam Kumaran: the proposal doc, if we can,

382 00:54:03.371 00:54:07.730 Uttam Kumaran: so that I can get that over to Zack and get them to review it.

383 00:54:08.185 00:54:15.910 Uttam Kumaran: So that’s something I want to do. And then for Pool Parts, same thing for the AI work that they asked for: I want to make sure that

384 00:54:16.150 00:54:19.639 Uttam Kumaran: that proposal is ready to go, so I can send it to them to approve.

385 00:54:20.396 00:54:28.239 Uttam Kumaran: And then I think the last piece, across the board on our clients: I think we’ve now been able to nail data pipelines,

386 00:54:28.873 00:54:37.249 Uttam Kumaran: data modeling, and dashboards, and the next layer is gonna be analysis and insights.

387 00:54:37.667 00:54:52.330 Uttam Kumaran: You know, I was talking to Kyle this morning, and I mentioned to him that often businesses don’t ask us for this directly. They mainly say, we want a dashboard. But what they actually want is an answer. They want answers, and sometimes they don’t even know what questions to ask.

388 00:54:52.430 00:55:12.109 Uttam Kumaran: And I think Amber did some analysis this week; I know Robert did some as well. So I think this is the next thing in data that I’m asking all the folks on the data team about, if anyone’s interested, because across all of these clients we have data modeled in a good place, but we need this sort of

389 00:55:12.250 00:55:15.040 Uttam Kumaran: going deep and finding some insights.

390 00:55:15.477 00:55:25.309 Uttam Kumaran: I don’t know whether this is necessarily another team; I don’t know whether we need a weekly thing to discuss how to do this. I mean, maybe.

391 00:55:25.960 00:55:31.699 Uttam Kumaran: For the analysts on the call, like Amber, Anne, anyone else who’s done this work:

392 00:55:32.050 00:55:42.289 Uttam Kumaran: my perspective is, in the past when I’ve done this work, it’s sort of different, because I have to have all the data ready. But then I come in trying to ask some questions, and I just continue to peel the onion back.

393 00:55:42.430 00:55:47.959 Uttam Kumaran: For example, if I’m trying to answer like, Hey, how do we get this company to make more money per product?

394 00:55:48.130 00:56:02.810 Uttam Kumaran: then I’m gonna start looking at, hey, what are the costs? Have the costs changed over time? Is there anything in the highest costs that seems like something we can remove? Those are the sorts of things I do. I just continue to go one more step, one more step,

395 00:56:03.360 00:56:10.399 Uttam Kumaran: but you sort of have to chase curiosity a little bit. It’s a different sort of thing than having a clear, you know, playbook, but

396 00:56:11.410 00:56:14.509 Uttam Kumaran: that’s the next thing that’s on my mind for data work.

397 00:56:22.240 00:56:23.190 Luke Daque: Sounds good

398 00:56:26.239 00:56:30.800 Luke Daque: We can go to the next slide, which is the demos.

399 00:56:31.320 00:56:36.640 Luke Daque: So I guess I’ll be demoing for the data team for now, because, like, our age is off.

400 00:56:37.221 00:56:44.860 Luke Daque: So basically, we were going to demo what we started working on, which is connecting to Metaplane.

401 00:56:45.510 00:56:53.830 Luke Daque: For those who don’t know what Metaplane is, it is essentially a,

402 00:56:56.480 00:57:04.199 Luke Daque: what do you call it, a data observability tool. And it’s

403 00:57:05.330 00:57:18.700 Luke Daque: it’s not that hard to set up, I would say, although it still needs a bit of technical knowledge and stuff like that. And the cool thing about Metaplane is you can connect it to different parts of the data stack, whether it’s the data warehouse; in this

404 00:57:19.100 00:57:33.679 Luke Daque: example, we were testing it out on Eden. They’re using BigQuery, so we were able to connect it with BigQuery, their data visualization tool Tableau, and dbt Core, which is what we are using for data modeling.

405 00:57:33.970 00:57:35.650 Luke Daque: And so, yeah, this is.

406 00:57:35.850 00:57:43.855 Luke Daque: you can see even here, in the dbt tests that we ran, it’s essentially like Elementary, right?

407 00:57:44.430 00:57:51.519 Luke Daque: where we can see the models that were run, and the tests that were

408 00:57:51.770 00:58:06.770 Luke Daque: run as well, and whether there are any failing tests in here, and how long it took for the models to run. This way we can visualize and see if there are models that are running for too long, where we might need to improve

409 00:58:07.030 00:58:10.770 Luke Daque: the efficiency of those models, or tests that have failed.
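[Editor's note: the failed-test and slow-model triage described above can also be scripted straight off dbt's own artifact. Every dbt run writes `target/run_results.json`, whose `results` entries carry a `status` and an `execution_time`. A minimal sketch; the 60-second threshold and the sample node names are made up for illustration, and this is not the Metaplane integration itself:]

```python
SLOW_SECONDS = 60.0  # arbitrary example threshold for "needs attention"

def triage(run_results: dict) -> dict:
    """Split dbt run results into failed nodes and slow-but-passing nodes."""
    failed, slow = [], []
    for r in run_results.get("results", []):
        if r["status"] in ("error", "fail"):
            failed.append(r["unique_id"])
        elif r.get("execution_time", 0.0) > SLOW_SECONDS:
            slow.append(r["unique_id"])
    return {"failed": failed, "slow": slow}

# Hypothetical sample in the shape dbt writes to target/run_results.json:
sample = {"results": [
    {"unique_id": "model.proj.orders", "status": "success", "execution_time": 95.2},
    {"unique_id": "test.proj.not_null_orders_id", "status": "fail", "execution_time": 1.1},
]}
print(triage(sample))  # {'failed': ['test.proj.not_null_orders_id'], 'slow': ['model.proj.orders']}
```

A report like this could then be posted to Slack on a schedule, which is the alerting question raised later in the call.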

410 00:58:11.572 00:58:14.459 Luke Daque: And we’re still in the early

411 00:58:14.580 00:58:26.640 Luke Daque: part of exploring this, because there’s a lot of stuff we can do, like adding monitors here: source freshness monitors, or data anomaly monitors as well,

412 00:58:27.240 00:58:28.539 Luke Daque: which is pretty cool.

413 00:58:29.860 00:58:32.140 Luke Daque: We can also see the lineage

414 00:58:33.040 00:58:36.720 Luke Daque: for all the data models that we have

415 00:58:37.410 00:58:46.740 Luke Daque: like this one, for example. You can see where the data is coming from, and whether there are also downstream models that are

416 00:58:46.850 00:58:48.440 Luke Daque: being used for these.

417 00:58:48.570 00:58:55.049 Luke Daque: Well, actually, these are the downstream, and these are the upstream; mostly it flows from right to left, it looks like.

418 00:58:56.490 00:59:00.700 Luke Daque: so yeah, that’s really amazing

419 00:59:00.900 00:59:06.290 Luke Daque: if you ask me, because, like, we have one tool that shows all the different

420 00:59:06.490 00:59:12.329 Luke Daque: data sources that we have, or the data stack that we have, and you can monitor it all in just one place.

421 00:59:13.917 00:59:17.580 Luke Daque: So yeah, any questions on this so far?

422 00:59:18.230 00:59:33.079 Amber Lin: I kinda know the answer, but I just wanna hear from you guys: what is the ultimate impact of this? How is this gonna make our team’s lives better, or improve things for our clients? Just more on the impact side.

423 00:59:35.970 00:59:40.960 Luke Daque: Yeah, so basically, this is going to help us with,

424 00:59:41.560 00:59:58.579 Luke Daque: hopefully, faster issue detection. If there are tests that fail, or dbt runs that fail, hopefully we should be able to use this to fix those issues before the stakeholder, the client, is able to see them.

425 00:59:59.111 01:00:19.598 Luke Daque: So yeah, it basically cuts down the firefighting, before a dashboard user sees the bugs or data anomalies on their end, like what we’re currently having in Pool Parts, right? They are seeing the data anomalies first, before us, and that’s not really good.

426 01:00:20.180 01:00:27.600 Amber Lin: So is this something that we have to check actively each week, or does it alert us?

427 01:00:27.930 01:00:30.380 Amber Lin: So it’s more of an active thing to do.

428 01:00:30.730 01:00:47.100 Luke Daque: Yeah, that’s a process that we’ll have to decide on. We haven’t talked about that yet, but we can do it either way: somebody can check this once a day or whatever, or we can have it send us a Slack message or something.

429 01:00:47.554 01:00:51.810 Luke Daque: But we’ll have to figure out the best approach to the monitoring side.

430 01:00:53.230 01:01:03.030 Amber Lin: Very exciting, very exciting, because most of our work right now is patching. So if we can avoid that from the start, that would be awesome.

431 01:01:04.050 01:01:05.800 Luke Daque: Yeah, I agree.

432 01:01:05.800 01:01:11.139 Uttam Kumaran: Yeah, I think there’s basically some housekeeping stuff, too: like every month, someone should go

433 01:01:11.250 01:01:14.280 Uttam Kumaran: prune models, change stuff to incremental.

434 01:01:14.890 01:01:20.840 Uttam Kumaran: So that’s something that we should do like once a month, basically across clients ideally.

435 01:01:22.600 01:01:27.460 Luke Daque: Right, like tech debt, essentially.

436 01:01:27.460 01:01:28.210 Uttam Kumaran: Yeah.

437 01:01:32.030 01:01:35.600 Luke Daque: So that’s essentially it for the data side.

438 01:01:37.020 01:01:44.320 Luke Daque: Does anybody from the AI or sales side want to do a demo for this week?

439 01:01:45.530 01:01:49.949 Uttam Kumaran: Yeah, I mean, Miguel, I kinda wanted to ask:

440 01:01:51.430 01:01:58.140 Uttam Kumaran: did we make any of the quick fixes from the AI stuff yesterday?

441 01:01:59.140 01:02:07.639 Miguel de Veyra: The quick fixes? Wait, I don’t think so; at least I didn’t do it. I worked on the GitHub stuff yesterday.

442 01:02:08.700 01:02:12.360 Uttam Kumaran: I think the one thing I want to share, and then

443 01:02:12.999 01:02:26.490 Uttam Kumaran: or maybe, do you mind sharing the improvements you guys did in Supabase for the embeddings? And then, if you can, also send in the Zoom chat the article that we’re using as sort of the foundation.

444 01:02:26.680 01:02:33.330 Uttam Kumaran: And, you know, we have a lot of data folks on the call, so I think they’d be interested to hear more about vector embeddings,

445 01:02:34.241 01:02:36.329 Uttam Kumaran: pre-filtering, things like that.

446 01:02:37.520 01:02:39.600 Miguel de Veyra: Okay, yeah, sure. Let me.

447 01:02:40.480 01:02:41.999 Miguel de Veyra: Should I share my screen? Then?

448 01:02:42.850 01:02:43.810 Uttam Kumaran: Yes, please.

449 01:02:45.260 01:02:46.426 Miguel de Veyra: Okay, yeah.

450 01:02:47.230 01:02:48.060 Miguel de Veyra: Here.

451 01:02:49.400 01:02:51.810 Miguel de Veyra: Can everyone see it? Am I sharing the right screen?

452 01:02:54.360 01:02:55.010 Uttam Kumaran: Yes.

453 01:02:55.330 01:02:58.297 Miguel de Veyra: Okay, yeah, so basically,

454 01:02:59.190 01:03:00.560 Uttam Kumaran: Just zoom in a little bit.

455 01:03:00.980 01:03:01.980 Miguel de Veyra: Okay, yeah, we can.

456 01:03:02.980 01:03:03.920 Miguel de Veyra: There you go.

457 01:03:05.590 01:03:06.869 Miguel de Veyra: I think this is the important.

458 01:03:06.870 01:03:09.109 Uttam Kumaran: It didn’t do it, didn’t. Oh, yeah, okay, there it is, great.

459 01:03:09.110 01:03:14.510 Miguel de Veyra: Okay. So these were the messages from S3. Basically, they’re just, you know, raw.

460 01:03:14.510 01:03:18.780 Uttam Kumaran: Wait, start from what this project even is.

461 01:03:19.150 01:03:20.359 Miguel de Veyra: Okay, okay, yeah.

462 01:03:20.710 01:03:26.369 Miguel de Veyra: And so the project is, basically, we have this client hub where

463 01:03:26.510 01:03:37.490 Miguel de Veyra: the ideal situation would be, it has access to all Slack messages, all Zoom meetings, all Linear tickets, and even GitHub code.

464 01:03:37.620 01:03:46.549 Miguel de Veyra: So right now we’re throwing it all into context, which obviously is, you know, not a good idea, because it takes a lot of tokens,

465 01:03:46.800 01:03:52.529 Miguel de Veyra: and it’s very expensive to run, and just not the most efficient way to do it. It’s just the fastest way to do it.

466 01:03:53.424 01:04:09.289 Miguel de Veyra: So yeah, we came up with this. Basically, what happens is, we get all the messages from Slack, and then we throw them into S3, and then from S3 we basically get them into here.

467 01:04:10.710 01:04:15.900 Miguel de Veyra: There’s like 4,000 messages in the Javi Slack channels.

468 01:04:16.320 01:04:24.000 Miguel de Veyra: And then what I did is basically the pre-retrieval part of that. Here’s the documentation:

469 01:04:25.215 01:04:34.009 Miguel de Veyra: the pre-retrieval part is basically just collecting all those messages. We divided them into weeks, and then we basically put each week into one context, like this.

470 01:04:35.700 01:04:45.980 Miguel de Veyra: Oh wait, let me just, yeah. So it’s gonna look something like this. And this is the only time we’ll embed it, instead of, you know, just throwing it all into one place. So in this way,

471 01:04:46.270 01:04:57.240 Miguel de Veyra: it’s a bit more structured. And then, when the retrieval model pulls data in, it actually has the entire context of the week, instead of, you know,

472 01:04:57.400 01:04:59.979 Miguel de Veyra: having one message per context.
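[Editor's note: the pre-retrieval step described above, grouping messages into one document per ISO week before embedding each chunk once, can be sketched as below. The message fields and contents are invented for illustration, and the embedding call itself is omitted; this is not the actual Supabase pipeline:]

```python
from collections import defaultdict
from datetime import date

def chunk_by_week(messages: list[dict]) -> dict[tuple[int, int], str]:
    """Join all messages from the same ISO week into one chunk of text."""
    weeks = defaultdict(list)
    for m in messages:
        iso = date.fromisoformat(m["date"]).isocalendar()
        weeks[(iso.year, iso.week)].append(f"{m['user']}: {m['text']}")
    return {week: "\n".join(lines) for week, lines in weeks.items()}

# Hypothetical messages: two in one week, one the following week.
msgs = [
    {"date": "2025-05-05", "user": "amber", "text": "dashboard refresh failed"},
    {"date": "2025-05-09", "user": "luke", "text": "fix is deployed"},
    {"date": "2025-05-12", "user": "luke", "text": "all green this week"},
]
chunks = chunk_by_week(msgs)
print(len(chunks))  # 2 chunks (ISO weeks 19 and 20 of 2025)
# Each chunk would then be embedded once and stored alongside its vector.
```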

473 01:05:01.340 01:05:03.380 Miguel de Veyra: So, yeah, I mean, this is

474 01:05:03.630 01:05:08.509 Miguel de Veyra: this is the way we should have done it a couple of months back, basically.

475 01:05:08.750 01:05:12.210 Miguel de Veyra: But yeah, oh, go ahead.

476 01:05:12.690 01:05:26.869 Amber Lin: Can you explain to us what you mean by embedding? I think some of the folks here, maybe including me, need a more detailed explanation of what you mean by embedding,

477 01:05:27.030 01:05:29.680 Amber Lin: and that also ties into the article too.

478 01:05:30.150 01:05:31.799 Miguel de Veyra: Okay, sure. Casie, go ahead.

479 01:05:32.590 01:05:42.529 Casie Aviles: Yeah. So I like to think of embeddings as basically just a numerical representation of the semantics, or the meaning, of a word.

480 01:05:42.970 01:05:46.640 Casie Aviles: So like, I think one of the things I learned before is

481 01:05:47.684 01:05:57.429 Casie Aviles: basically, we’re transforming a word, for example “royalty”, into a vector embedding. So it’s going to be a numerical representation.

482 01:05:58.595 01:06:06.609 Casie Aviles: And then you could kind of add 2 embeddings together, like “royalty” plus the embedding of

483 01:06:06.820 01:06:12.340 Casie Aviles: “male” or “female”, and when you add them, the resulting embedding would be

484 01:06:12.940 01:06:18.369 Casie Aviles: like “king” or “queen”, something like that, right? You could perform

485 01:06:19.515 01:06:22.710 Casie Aviles: operations. Basically, you’re treating words as numbers, kind of.

486 01:06:23.020 01:06:25.659 Casie Aviles: So that’s what the embedding part is.

487 01:06:25.840 01:06:30.880 Casie Aviles: And one cool thing that we could also do with that is, we could match

488 01:06:32.095 01:06:39.340 Casie Aviles: embeddings together. So, for example, we take a user input, say it asks a question like,

489 01:06:41.200 01:06:45.239 Casie Aviles: yeah, like, “what is AI” or something, and then we vectorize that,

490 01:06:45.420 01:06:49.990 Casie Aviles: which just means we’re turning it into a numerical representation, right?

491 01:06:50.090 01:06:58.969 Casie Aviles: And then, based on that, we’re going to match the numerical representation of this query to something that is already in the database.

492 01:06:59.520 01:07:08.999 Casie Aviles: So let’s say you ask “what is AI”, and then we find, like, a vectorized section from a document where it says AI is this and that, something like that. And then,

493 01:07:09.720 01:07:13.150 Casie Aviles: yeah, basically, that’s how it works. I’m not sure if

494 01:07:13.960 01:07:17.260 Casie Aviles: I managed to simplify it, but that’s how I understood it.
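[Editor's note: Casie's royalty example can be made concrete with toy 3-dimensional vectors. Real embedding models learn vectors with hundreds or thousands of dimensions from data; these particular numbers are invented purely to show the vector arithmetic and nearest-neighbor lookup:]

```python
import math

# Hand-made toy vectors; a real model (word2vec, a hosted embedding API, etc.)
# would learn these rather than have them written by hand.
VOCAB = {
    "royalty": [1.0, 0.0, 0.0],
    "female":  [0.0, 1.0, 0.0],
    "male":    [0.0, 0.0, 1.0],
    "queen":   [1.0, 1.0, 0.0],
    "king":    [1.0, 0.0, 1.0],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, 0.0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def add(a, b):
    return [x + y for x, y in zip(a, b)]

def nearest(vec):
    """Vocabulary word whose vector is most similar to `vec`."""
    return max(VOCAB, key=lambda w: cosine(vec, VOCAB[w]))

print(nearest(add(VOCAB["royalty"], VOCAB["female"])))  # queen
print(nearest(add(VOCAB["royalty"], VOCAB["male"])))    # king
```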

495 01:07:17.630 01:07:26.580 Miguel de Veyra: Yeah, yeah, that’s a bit technical, but yes. To summarize it, embedding is basically the best way

496 01:07:26.820 01:07:30.879 Miguel de Veyra: for the AI to read it without having to pass through all the context.

497 01:07:31.600 01:07:34.239 Uttam Kumaran: Yeah. So one way I like to think about it:

498 01:07:34.890 01:07:39.680 Uttam Kumaran: it’s just a number representation of text.

499 01:07:39.950 01:07:44.480 Uttam Kumaran: And it’s a vector embedding. And if you’re familiar with vectors,

500 01:07:45.031 01:08:07.890 Uttam Kumaran: basically, you just want a numerical representation of 2 things, so you can see how far they are from each other. So what this is doing is adding dimensionality. So say you have the word “dog” and you have the word “pizza”; we have a framework to actually embed those as numbers, and

501 01:08:08.210 01:08:24.170 Uttam Kumaran: interpret some distance between them. And so, when we try to ask something like, hey, is this Slack message relevant to this other Slack message, there is a mathematical equation to run that finds that distance.

502 01:08:24.674 01:08:28.140 Uttam Kumaran: So in order for the AI to do what’s called retrieval

503 01:08:28.625 01:08:36.609 Uttam Kumaran: we need to be able to say: first you ask the question, then we use the embeddings to go find the relevant Slack messages for that question,

504 01:08:36.760 01:08:42.950 Uttam Kumaran: and then we pull those messages out and add them to your prompt. That’s the rough flow.

505 01:08:44.438 01:08:53.839 Amber Lin: So I posted an image in our chat. I think, if I’m clear on what this is, the embedding is the

506 01:08:54.029 01:09:08.079 Amber Lin: relationship between 2 concepts, essentially, right? Or, if we think about it in a relational 3D space, it’s how far things are from each other.

507 01:09:08.330 01:09:16.240 Uttam Kumaran: So embeddings themselves don’t have the notion of distance, right? Embedding is

508 01:09:16.340 01:09:20.810 Uttam Kumaran: converting these words to a series of numbers.

509 01:09:21.300 01:09:38.609 Uttam Kumaran: When you then compare them together, because now they are numbers, you can find the distance between them in, like, a vector space. And so again, as you see at the bottom of that image, a vector, in math, is, as they say, like a

510 01:09:38.750 01:09:46.159 Uttam Kumaran: line with a direction, right? And so what you’re actually trying to find is the distance between them.

511 01:09:46.380 01:09:54.360 Uttam Kumaran: And the thing is, a cat and a kitten are closer in embedding, their vectors are closer, than adult is to cat.

512 01:09:54.480 01:10:01.889 Uttam Kumaran: Right? I think the main challenge I have is there’s a lot of math that goes into this embedding model. It’s just a black box.

513 01:10:02.460 01:10:08.009 Uttam Kumaran: But what you’re just trying to do is find a framework to actually do distance.

514 01:10:08.330 01:10:16.319 Uttam Kumaran: If you were to say, what’s the distance between adult and child, and you just look at the letters, there’s actually no inherent

515 01:10:16.864 01:10:34.949 Uttam Kumaran: way you can say the letters A-D-U-L-T put together are close to C-H-I-L-D. The only thing you could do is use what’s called a distance function, where you count how many letters have to change. But then you’re basically not looking at what the word means.
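[Editor's note: the letter-counting distance mentioned above is the classic Levenshtein edit distance, and it illustrates the point: it measures spelling only, not meaning. A minimal sketch:]

```python
def levenshtein(s: str, t: str) -> int:
    """Minimum number of single-character edits (insert, delete, substitute)."""
    prev = list(range(len(t) + 1))  # distances from "" to each prefix of t
    for i, cs in enumerate(s, start=1):
        cur = [i]
        for j, ct in enumerate(t, start=1):
            cur.append(min(
                prev[j] + 1,               # delete cs
                cur[j - 1] + 1,            # insert ct
                prev[j - 1] + (cs != ct),  # substitute (free if equal)
            ))
        prev = cur
    return prev[-1]

print(levenshtein("kitten", "sitting"))  # 3
print(levenshtein("adult", "child"))     # 4 edits, despite the related meanings
```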

516 01:10:35.090 01:10:36.180 Uttam Kumaran: Right? So

517 01:10:36.300 01:10:42.540 Uttam Kumaran: just because one starts with A and one starts with C doesn’t mean anything, right? The word adult is more than the combination of its letters.

518 01:10:42.750 01:11:02.399 Uttam Kumaran: So what the embedding model does, and the embedding model is really one of the secret sauces of these AI companies, is it allows you to say: no, actually, the word adult, and the meaning behind it, is closer to the word child, and the meaning behind it, than it is to cat.

519 01:11:04.100 01:11:07.870 Uttam Kumaran: There’s a whole theory about like how to build these embedding models.

520 01:11:08.220 01:11:12.810 Uttam Kumaran: But to put it simply, that’s what’s happening.

521 01:11:16.530 01:11:21.070 Uttam Kumaran: So what ultimately happens is, when I ask a question like, tell me,

522 01:11:21.250 01:11:25.900 Uttam Kumaran: tell me who the main client on the Javi side is, like, who is the main stakeholder,

523 01:11:26.960 01:11:29.940 Uttam Kumaran: it takes my question,

524 01:11:30.080 01:11:38.939 Uttam Kumaran: embeds it, and then compares it to all the vectors to find Slack messages that could answer it, and then it pulls those into your prompt,

525 01:11:39.210 01:11:41.199 Uttam Kumaran: and then it finds the answer.
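[Editor's note: the question → embed → compare → prompt flow described above can be sketched end to end. Here the "embedding" is a crude keyword-count stand-in and the messages are invented, just to show the shape of retrieval-augmented prompting; a real system would use a learned embedding model and a vector store such as Supabase's pgvector:]

```python
import math

KEYWORDS = ["stakeholder", "dashboard", "shipping"]  # toy vocabulary

def embed(text: str) -> list[float]:
    """Stand-in embedding: count of each keyword in the text."""
    low = text.lower()
    return [float(low.count(k)) for k in KEYWORDS]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

MESSAGES = [  # hypothetical Slack messages
    "Maria is the main stakeholder on this account",
    "The dashboard refresh failed again last night",
    "Overseas shipping costs went up this quarter",
]
INDEX = [(m, embed(m)) for m in MESSAGES]  # embed the corpus once, up front

def build_prompt(question: str, top_k: int = 1) -> str:
    """Embed the question, rank stored messages by similarity, keep top_k."""
    qv = embed(question)
    ranked = sorted(INDEX, key=lambda pair: cosine(qv, pair[1]), reverse=True)
    context = "\n".join(m for m, _ in ranked[:top_k])
    return f"Context:\n{context}\n\nQuestion: {question}"

print(build_prompt("Who is the main stakeholder?"))
```

Only the retrieved message reaches the prompt; the unrelated dashboard and shipping messages are filtered out, which is where the token savings discussed next come from.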

526 01:11:50.170 01:11:56.799 Luke Daque: So I have a question, though: how is this helping in terms of token usage? Like, why does it

527 01:11:57.900 01:12:02.579 Luke Daque: cost fewer tokens to do it this way, then? Yeah.

528 01:12:03.111 01:12:32.320 Miguel de Veyra: Yeah. So basically, this entire thing, all the messages, if we just put it into context, it’s around 258,000 tokens. So if we put it into context, we basically have to run through all those messages, right? So even if the question is unrelated to the executive team, the AI will still go through the executive team messages; if it’s about yesterday’s updates, it’ll still

529 01:12:32.880 01:12:37.010 Miguel de Veyra: look into the messages, basically, from a year ago.

530 01:12:37.490 01:12:40.448 Miguel de Veyra: So basically, with embeddings,

531 01:12:41.640 01:12:52.910 Miguel de Veyra: it only gets what’s related. So instead of, you know, going through all of that, I actually added the token counts here. It also depends on, you know,

532 01:12:53.120 01:12:59.210 Miguel de Veyra: how many chunks we want, but usually I just do 10 or 15. So instead of having to run through, you know,

533 01:12:59.340 01:13:05.829 Miguel de Veyra: 100,000 or 200,000 tokens, it’s just gonna run on, for example, these 10 only, right?

534 01:13:06.660 01:13:11.030 Miguel de Veyra: So it’s a lot less than, you know, running through everything.

535 01:13:16.780 01:13:19.060 Casie Aviles: Basically, we’re just guessing what we need.

536 01:13:19.250 01:13:22.090 Miguel de Veyra: Yes, but that’s the hard part.

537 01:13:22.090 01:13:27.699 Uttam Kumaran: We have, you know, hundreds of thousands of Slack messages. So the problem is,

538 01:13:28.040 01:13:33.810 Uttam Kumaran: we don’t need to put it all in. It’s basically like if you were to copy and paste it all into your ChatGPT:

539 01:13:33.950 01:13:34.540 Miguel de Veyra: Yes.

540 01:13:34.540 01:13:48.519 Uttam Kumaran: it’s either gonna tell you it’s too much, or it’s gonna not have the perfect answer. So what we’re doing is pre-filtering: you’re saying, I only want to take the records that are relevant to the ask. So you basically can

541 01:13:48.730 01:13:56.700 Uttam Kumaran: throw away 99% of the garbage. This is also what happens, for example, if you go to ChatGPT and you paste in, like, a contract, and you’re like, hey,

542 01:13:56.810 01:14:14.360 Uttam Kumaran: you’re an expert lawyer, tell me about this contract. It’s gonna consider the whole thing. And even if you say, you’re an expert lawyer, just tell me about clause 5, it’s still gonna read the whole thing, and that may change what it answers. So instead, we actually only want clause 5 to be included in the prompt.

543 01:14:14.670 01:14:17.658 Uttam Kumaran: Do you have Contextual open, by the way, Miguel?

544 01:14:18.665 01:14:22.940 Uttam Kumaran: Maybe you can share that. I think the visuals are really nice.

545 01:14:24.670 01:14:27.040 Miguel de Veyra: Let me pull that up.

546 01:14:27.140 01:14:28.660 Miguel de Veyra: No, I don’t have it open.

547 01:14:29.190 01:14:32.109 Miguel de Veyra: Let me find the link. That’s fine. Sorry.

548 01:14:34.028 01:14:41.260 Uttam Kumaran: I just want to share one more example, because it’s a visual example of how this works on documents, which is a bit easier to understand than

549 01:14:42.816 01:14:44.089 Uttam Kumaran: text embeddings.

550 01:14:44.460 01:14:47.299 Miguel de Veyra: Yes, yes. Text embeddings are a bit hard to explain.

551 01:14:51.200 01:14:55.520 Miguel de Veyra: Okay, yeah. Here, okay, give me 1 min.

552 01:14:59.970 01:15:00.750 Uttam Kumaran: So the

553 01:15:00.750 01:15:15.380 Uttam Kumaran: the demo that we’re gonna show is basically, we’re partnering with this company that does this whole piece of software. They do this thing where you pass in a document, and it actually will go find, like, maybe pick the one on the right, or

554 01:15:15.680 01:15:19.180 Uttam Kumaran: I don’t know, their examples are maybe pretty rich.

555 01:15:22.508 01:15:25.089 Uttam Kumaran: So go ahead. Just type something. Yeah.

556 01:15:26.220 01:15:38.820 Uttam Kumaran: So when you ask a question like this to the AI, it needs to go through and basically first find the relevant documents, and then, inside the document, find the answer. So click on any of the references, Miguel, and zoom in a little bit.

557 01:15:40.820 01:15:47.109 Uttam Kumaran: So what’s happening here is, when you ask this question, what is the best, and this is just a bunch of documents about,

558 01:15:47.410 01:15:49.380 Uttam Kumaran: like car and driver.

559 01:15:49.590 01:15:52.420 Uttam Kumaran: It's like 4 documents. So

560 01:15:52.520 01:16:04.150 Uttam Kumaran: it actually needs to 1st say: out of the 100 documents I have, what are the 10 or 15, what are the few that matter? And then within them it goes and actually does both an image and a text retrieval.

561 01:16:04.300 01:16:09.400 Uttam Kumaran: So it, like, goes and builds this bounding box.

562 01:16:09.570 01:16:23.139 Uttam Kumaran: and then pulls out that answer. So then imagine, for example, if you were to take this screenshot and put it into ChatGPT, you'd get the answer. But the hard part isn't that; it's actually finding that this document, and this part of the document, is what's relevant.

563 01:16:24.450 01:16:28.649 Uttam Kumaran: This is basically what retrieval augmented generation is.
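[Editor's note: the retrieval step described above can be sketched roughly as follows. This is a toy illustration only, not the partner's actual system: the bag-of-words `embed` function and the sample clauses are made up for demonstration, and real RAG pipelines use dense neural embeddings and a vector index instead.]

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy "embedding": a bag-of-words count vector.
    # Real systems use dense neural embeddings instead.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    # Rank all documents by similarity to the query and keep only
    # the top k, so only the relevant passage (e.g. just Clause 5)
    # gets included in the prompt.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

docs = [
    "Clause 5: the contractor may terminate this agreement with 30 days notice.",
    "Clause 2: payment is due within 15 days of each invoice.",
    "Appendix A: list of approved subcontractors and their rates.",
]
top = retrieve("What does clause 5 say about termination?", docs)
print(top[0])  # the Clause 5 passage scores highest
```

The point of the sketch is the two-stage shape mentioned in the demo: first narrow the document set by similarity, then answer only from the retrieved passage rather than the whole corpus.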

564 01:16:38.050 01:16:38.600 Luke Daque: What is.

565 01:16:38.600 01:16:39.180 Uttam Kumaran: Cool.

566 01:16:42.510 01:16:44.150 Luke Daque: Yeah, thanks for that.

567 01:16:45.000 01:16:45.930 Luke Daque: There you go.

568 01:16:52.690 01:16:59.790 Luke Daque: We can proceed, I guess, to the next slides, which are the stretch possibilities. Oh, yeah.

569 01:17:01.910 01:17:15.069 Uttam Kumaran: Yeah, I just wanted to highlight 2 folks that are doing some stretching. Kyle volunteered and is gonna start working directly with Ryan on

570 01:17:15.170 01:17:17.100 Uttam Kumaran: publishing some content.

571 01:17:17.210 01:17:18.980 Uttam Kumaran: I know in the past

572 01:17:19.340 01:17:34.610 Uttam Kumaran: you've gone viral for some good architecture posts. So I think we'll basically assist you in creating some of those. So appreciate that. And then Ryan from the marketing team is gonna be helping a little bit with go-to-market automation,

573 01:17:34.730 01:17:38.043 Uttam Kumaran: some of the stuff we're doing with Clay and things like that. So

574 01:17:38.764 01:17:47.280 Uttam Kumaran: I'm really happy to see that people are stretching into things outside of their world, or slightly within their world, just to get a

575 01:17:47.390 01:18:04.640 Uttam Kumaran: different flavor. And then, yeah, we're still open to folks. If anyone's interested in taking any of these items, a lot of them are again really focused on sales. And then I think the one thing that's more focused on sort of retention is going to be this engagement lead

576 01:18:05.138 01:18:10.860 Uttam Kumaran: but still a lot of opportunity. I would say that we definitely need more people. If we want to create content,

577 01:18:10.980 01:18:17.719 Uttam Kumaran: we have a great workflow and playbook to do that. So yes, okay.

578 01:18:25.010 01:18:29.499 Luke Daque: Oh, I think next we have the executive app updates.

579 01:18:30.190 01:18:37.299 Uttam Kumaran: Yeah, I feel like we already shared a good amount about how things are going. I think the only thing on my side is

580 01:18:37.828 01:18:48.419 Uttam Kumaran: over the past 3 weeks I've really changed up my schedule. So the 1st half of the day is really focused on sales, and trying to keep like 60 to 70% of the day free

581 01:18:48.953 01:18:53.080 Uttam Kumaran: to basically work on follow ups from those meetings. That’s, I think.

582 01:18:53.280 01:18:58.479 Uttam Kumaran: allowed us to really build a healthy pipeline of new business, new potential business.

583 01:18:59.870 01:19:02.821 Uttam Kumaran: Yeah, I’m just continuing to double down there.

584 01:19:04.460 01:19:13.339 Uttam Kumaran: I think the other sort of main change that I messaged about is that I'll be working more closely with the AI team to sort of bring

585 01:19:13.796 01:19:18.649 Uttam Kumaran: all of our ideas really to fruition, hopefully within the next few weeks.

586 01:19:18.780 01:19:21.580 Uttam Kumaran: Things like the AI agents that you’re seeing.

587 01:19:22.003 01:19:30.210 Uttam Kumaran: You know, and a few other things on the sales side. So I'm really excited. It will allow us to keep costs low and continue to scale.

588 01:19:32.030 01:19:35.939 Uttam Kumaran: But yeah, that's the main stuff for me. I don't know, Robert, if you had anything else.

589 01:19:38.177 01:19:40.470 Robert Tseng: Yeah, no, I don’t have anything else.

590 01:19:40.970 01:19:41.420 Uttam Kumaran: Okay.

591 01:19:44.330 01:19:48.559 Luke Daque: Next we have shout outs. Anybody want to shout out anybody?

592 01:19:52.450 01:19:57.929 Luke Daque: I guess we've already shouted out Ryan and Kyle for their stretch assignments.

593 01:19:58.670 01:19:59.400 Uttam Kumaran: Yes.

594 01:20:10.700 01:20:11.300 Luke Daque: No, but.

595 01:20:13.400 01:20:15.909 Miguel de Veyra: I guess shout out to Avesh again.

596 01:20:16.660 01:20:19.570 Miguel de Veyra: He's not here, but yeah, shout out to him. And Casie.

597 01:20:22.730 01:20:23.254 Uttam Kumaran: Nice.

598 01:20:25.090 01:20:26.119 Miguel de Veyra: And Uttam.

599 01:20:27.260 01:20:27.969 Uttam Kumaran: Thank you.

600 01:20:27.970 01:20:29.699 Miguel de Veyra: For telling us to Google that.

601 01:20:35.570 01:20:46.379 Luke Daque: I was gonna shout out to Avesh as well, but he's also not here, for the Metaplane stuff. We've been discussing that, doing back and forth communication

602 01:20:46.900 01:20:50.169 Luke Daque: for this. And yeah, it’s pretty cool.

603 01:20:54.590 01:21:01.279 Luke Daque: Yeah, if there's nothing else, we have the next part, which is just Q&A. If anybody has any questions,

604 01:21:02.340 01:21:04.670 Luke Daque: go ahead and ask.

605 01:21:14.900 01:21:17.820 Uttam Kumaran: No questions. Okay, I’m happy with that.

606 01:21:19.860 01:21:30.009 Luke Daque: If none, then I guess that concludes our meeting today. Thanks, everyone, for joining.

607 01:21:31.360 01:21:32.450 Uttam Kumaran: Thank you for hosting.

608 01:21:32.450 01:21:33.350 Hannah Wang: Staying.

609 01:21:33.600 01:21:34.790 Hannah Wang: Yeah, we’re coming.

610 01:21:35.530 01:21:36.470 Miguel de Veyra: Have a nice.

611 01:21:36.470 01:21:40.389 Uttam Kumaran: I love the lab share. That was great.

612 01:21:41.820 01:21:43.450 Luke Daque: Cool. Thanks. Yeah, I think

613 01:21:43.580 01:21:49.190 Luke Daque: should be in the slides, but it's pretty messy, because, like, I added some of,

614 01:21:50.160 01:21:56.240 Luke Daque: what do you call it, animations. So, yeah.

615 01:21:59.440 01:22:00.020 Uttam Kumaran: Cool.

616 01:22:00.320 01:22:02.620 Uttam Kumaran: Okay, thanks. Everyone. Appreciate it.

617 01:22:02.840 01:22:05.359 Luke Daque: Sounds good. Thanks. See you bye, bye.