Meeting Title: Daily AI Team Sync
Date: 2025-02-27
Participants: Janna Wong, Uttam Kumaran, Miguel de Veyra, Casie Aviles


WEBVTT

1 00:01:27.920 00:01:31.269 Miguel de Veyra: Hey, Janna? Yeah. We talked all along. So, yeah.

2 00:01:35.140 00:01:36.579 Miguel de Veyra: Hello! Can you hear me?

3 00:01:38.030 00:01:39.000 Janna Wong: Oh yes, yes.

4 00:01:39.000 00:01:47.470 Miguel de Veyra: Yeah. With the ABC document... okay, the document sheets.

5 00:01:50.190 00:01:52.280 Miguel de Veyra: So, for example.

6 00:01:58.160 00:02:03.410 Miguel de Veyra: there you go, like, for example.

7 00:02:04.310 00:02:11.080 Miguel de Veyra: is this one or you know, Canada or multiple program free.

8 00:02:12.010 00:02:14.520 Miguel de Veyra: So in analog and S,

9 00:02:15.070 00:02:29.699 Miguel de Veyra: my switch, and then like, match this atx, and then the rest will just put there. Here, put it there and then. Code. But yeah, basically, you know, for now we can skip atx. It’s just a matter of multiplying these.

10 00:02:29.920 00:02:33.320 Miguel de Veyra: doing everything. And then rename the parameter line as well.

11 00:02:33.490 00:02:34.690 Janna Wong: - okay.

12 00:02:34.860 00:02:40.919 Miguel de Veyra: Be. And then, of course, we want to add context details. You know, we want to add it here.

13 00:02:43.210 00:02:48.250 Miguel de Veyra: But yeah, I I guess that should be pretty easy, right?

14 00:02:49.120 00:02:50.089 Janna Wong: Yeah, yeah, okay.

15 00:02:50.670 00:02:58.340 Miguel de Veyra: Actually, let’s see if indeed, over there, code one

16 00:03:03.130 00:03:05.620 Miguel de Veyra: pasa hat.

17 00:03:05.620 00:03:14.100 Janna Wong: Input all for the column. Like, what if it's only one node?

18 00:03:14.800 00:03:16.110 Miguel de Veyra: What do you mean, just one node?

19 00:03:16.800 00:03:20.979 Janna Wong: Codex, Zip.

20 00:03:22.240 00:03:23.060 Miguel de Veyra: That’s up!

21 00:03:23.060 00:03:23.890 Janna Wong: -Oh.

22 00:03:25.000 00:03:26.300 Janna Wong: And double click.

23 00:03:27.500 00:03:28.210 Miguel de Veyra: Which one.

24 00:03:28.470 00:03:30.849 Janna Wong: Eong, Atxn, Seba.

25 00:03:31.350 00:03:32.000 Miguel de Veyra: As well.

26 00:03:32.820 00:03:36.420 Janna Wong: No, no, double-click the code so you can see the code

27 00:03:38.860 00:03:59.860 Janna Wong: and then scroll down. Not sure... the input data, the value... input.all, that was all.

28 00:04:00.790 00:04:01.880 Miguel de Veyra: A little.

29 00:04:03.407 00:04:07.990 Janna Wong: It's really all the nodes.

30 00:04:07.990 00:04:08.440 Miguel de Veyra: Yeah.

31 00:04:08.440 00:04:08.890 Janna Wong: Hello!

32 00:04:11.100 00:04:13.939 Miguel de Veyra: Because it wasn't a problem; we didn't have any case like this before, yeah.

33 00:04:14.260 00:04:22.090 Miguel de Veyra: So before, before Shabu Maliks added, M. Chain.

34 00:04:22.580 00:04:23.670 Janna Wong: To to Google.

35 00:04:25.295 00:04:34.279 Miguel de Veyra: Call Austin contest, and then this one will be

36 00:04:37.830 00:04:40.609 Miguel de Veyra: awesome, being the product.

37 00:04:48.240 00:04:58.190 Miguel de Veyra: Don’t diamond field atx context, and then it’ll learn next. How do we then drag this

38 00:05:00.600 00:05:02.179 Miguel de Veyra: like all of these? Right?

39 00:05:02.480 00:05:04.309 Miguel de Veyra: Why don't we do that again? JSON.

40 00:05:05.070 00:05:06.669 Janna Wong: Developer is in both.

41 00:05:06.880 00:05:08.909 Janna Wong: That was that all.

42 00:05:10.820 00:05:11.610 Casie Aviles: Hey? Uttam.

43 00:05:12.330 00:05:12.810 Miguel de Veyra: Okay. Okay.

44 00:05:12.810 00:05:13.480 Janna Wong: Them.

45 00:05:14.330 00:05:15.140 Uttam Kumaran: Hey, guys.

46 00:05:16.000 00:05:16.750 Miguel de Veyra: And so.

47 00:05:17.580 00:05:18.880 Janna Wong: Parentheses.

48 00:05:23.790 00:05:26.420 Janna Wong: and then no, after that all.

49 00:05:32.280 00:05:32.860 Miguel de Veyra: oh, no!

50 00:05:32.860 00:05:40.979 Janna Wong: I think you need the dollar sign at the start of that one. Yeah. And okay.

51 00:05:41.298 00:05:44.400 Miguel de Veyra: There you go. Okay, let’s see.

52 00:05:47.820 00:05:48.650 Miguel de Veyra: Okay.

53 00:05:48.960 00:05:49.490 Janna Wong: There!

54 00:05:49.830 00:05:51.110 Miguel de Veyra: Okay. Yeah. Nice.

55 00:05:51.900 00:05:52.320 Janna Wong: Nice.

56 00:05:52.320 00:05:55.560 Miguel de Veyra: And and then just so, it’s not like confusing.

57 00:05:56.770 00:06:00.700 Miguel de Veyra: So it’s a different. You know the hell.

58 00:06:02.920 00:06:09.420 Miguel de Veyra: Yeah. And then, since you have the last step schema,

59 00:06:10.250 00:06:15.860 Miguel de Veyra: the atx context. And then now, you know, you just pass it here.

60 00:06:29.500 00:06:30.260 Miguel de Veyra: there you go.

61 00:06:31.750 00:06:32.280 Janna Wong: Great.

62 00:06:34.630 00:06:37.470 Miguel de Veyra: Let me save this, and then we just do the same for everyone. Then.

63 00:06:37.930 00:06:39.030 Janna Wong: Yeah, yeah, okay.

64 00:06:42.400 00:06:44.289 Miguel de Veyra: Yeah, I think that’s it. With this one.

65 00:06:44.720 00:06:45.539 Janna Wong: Thank you.

66 00:06:48.050 00:06:54.440 Miguel de Veyra: And then, yeah, I guess the next one would be... oh, actually! Why did I stop sharing my screen? My bad.

67 00:06:57.093 00:07:02.930 Miguel de Veyra: I ha! I hopped on a call with the event in Janice yesterday. It’s just a pretty

68 00:07:03.690 00:07:08.859 Miguel de Veyra: good call, so I asked for their FAQ, and they sent this.

69 00:07:09.690 00:07:15.829 Miguel de Veyra: And I’m currently adding it to, where’s the golden sheet?

70 00:07:18.180 00:07:21.417 Miguel de Veyra: Yeah, basically here. And then some

71 00:07:22.470 00:07:26.270 Miguel de Veyra: answers even they don't know, so they have to double-check.

72 00:07:27.460 00:07:29.530 Miguel de Veyra: I’ll follow up with them today.

73 00:07:30.267 00:07:33.540 Miguel de Veyra: But yeah, some of them were given, like, an answer already,

74 00:07:33.850 00:07:36.320 Miguel de Veyra: but some like this are still, you know, to follow.

75 00:07:37.980 00:07:41.860 Miguel de Veyra: and then I'm just adding the FAQs here, just so we have more data.

76 00:07:48.240 00:07:53.089 Miguel de Veyra: And then, as for Tim, I think, he replied. We’ll probably meet with him tomorrow.

77 00:07:53.490 00:07:55.980 Uttam Kumaran: Yeah, did I book that meeting?

78 00:07:57.167 00:07:59.430 Miguel de Veyra: Let me check. Where’s my calendar?

79 00:08:00.485 00:08:03.909 Uttam Kumaran: Yeah, okay, I'll book it. I'll put a time on for tomorrow.

80 00:08:04.160 00:08:05.100 Miguel de Veyra: Okay. Okay.

81 00:08:07.344 00:08:09.619 Uttam Kumaran: Yeah, basically, I just wanna.

82 00:08:10.510 00:08:15.310 Uttam Kumaran: I mean, do we have, like, the steps on our side that we need to do? I think maybe Janna

83 00:08:15.480 00:08:16.930 Uttam Kumaran: probably just need to.

84 00:08:16.930 00:08:18.130 Miguel de Veyra: Create a Loom video.

85 00:08:19.010 00:08:23.059 Uttam Kumaran: Yeah, or like a document on, like how we set it up on our side.

86 00:08:29.050 00:08:31.099 Miguel de Veyra: No, we can't hear you, Janna, if you're talking.

87 00:08:31.960 00:08:35.410 Janna Wong: I'm sorry. Oh, yeah, I can create a Loom video for that.

88 00:08:36.450 00:08:37.030 Uttam Kumaran: Cool.

89 00:08:37.690 00:08:41.479 Miguel de Veyra: Yeah, I think that’s pretty much it. We’re pretty much ready for tomorrow

90 00:08:41.750 00:08:42.549 Uttam Kumaran: Nice.

91 00:08:42.559 00:08:55.259 Miguel de Veyra: I don't wanna give this to them yet until we've tested this, so we'll just stick to their context. Because as per our last meeting yesterday, they're still cleaning a lot of stuff up on their end.

92 00:08:56.769 00:09:04.459 Miguel de Veyra: Like, they are slowly getting there, but it's nowhere close to a product yet. So yeah.

93 00:09:05.230 00:09:08.730 Uttam Kumaran: We also got a ton of calls.

94 00:09:09.240 00:09:15.383 Uttam Kumaran: Oh, yeah, thinking about throwing those into ElevenLabs or into

95 00:09:18.280 00:09:21.910 Uttam Kumaran: yeah, probably into ElevenLabs or something where

96 00:09:22.200 00:09:26.170 Uttam Kumaran: or Whisper, where I'm doing, like, speech-to-text.

97 00:09:28.210 00:09:33.739 Uttam Kumaran: And then we can start... we can throw that into AI and basically have it

98 00:09:34.370 00:09:38.970 Uttam Kumaran: write evals for us and basically create test cases.
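
The pipeline described here (transcribe recorded calls, then have an AI draft eval cases from the transcripts) could be sketched roughly like this. The function names, prompt wording, and output format are illustrative assumptions, not the team's actual setup; the real version would call a speech-to-text service (Whisper or ElevenLabs) and an LLM.

```python
# Hypothetical sketch: turn a call transcript into candidate eval cases.
# In the real pipeline, `transcript` would come from Whisper/ElevenLabs
# and `model_output` from an LLM call; both are stubbed here.

def build_eval_prompt(transcript: str, max_cases: int = 5) -> str:
    """Wrap a call transcript in a prompt asking the model for eval cases."""
    return (
        f"From the call transcript below, write up to {max_cases} eval cases.\n"
        "Format each as: QUESTION: ... | EXPECTED: ...\n\n"
        f"Transcript:\n{transcript}"
    )

def parse_eval_cases(model_output: str) -> list[dict]:
    """Turn 'QUESTION: ... | EXPECTED: ...' lines into test-case dicts."""
    cases = []
    for line in model_output.splitlines():
        if line.startswith("QUESTION:") and "| EXPECTED:" in line:
            q, e = line.split("| EXPECTED:", 1)
            cases.append({
                "question": q.removeprefix("QUESTION:").strip(),
                "expected": e.strip(),
            })
    return cases

# Example with a canned model response (no API call):
sample = "QUESTION: What is the refund window? | EXPECTED: 30 days"
print(parse_eval_cases(sample))
```

The parsed cases could then be stored as the "transcript evals" mentioned below and replayed against the bot as regression tests.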

99 00:09:48.540 00:09:49.840 Uttam Kumaran: yeah, okay.

100 00:09:49.950 00:09:59.049 Uttam Kumaran: yeah. And I was playing around. I need a little bit more time with it; I was running into some bugs yesterday, trying to set it up.

101 00:10:00.390 00:10:02.270 Miguel de Veyra: What do you call the second transcripts.

102 00:10:04.250 00:10:08.650 Uttam Kumaran: Yeah, we got yeah, just like, transcript evals. Yeah.

103 00:10:09.250 00:10:09.900 Miguel de Veyra: Okay.

104 00:10:10.780 00:10:17.239 Miguel de Veyra: How does this work, by the way, with them? After March 3, do we still, like, spend hours on

105 00:10:17.470 00:10:18.399 Miguel de Veyra: on them?

106 00:10:19.220 00:10:21.650 Uttam Kumaran: Yeah, basically, I, I mean.

107 00:10:21.840 00:10:27.130 Uttam Kumaran: our goal was like, we have a fixed price for the 1st phase. And then

108 00:10:27.790 00:10:30.440 Uttam Kumaran: we basically charge as soon as we

109 00:10:30.880 00:10:38.350 Uttam Kumaran: sort of get past this. Okay? So next week I'm gonna put together a proposal.

110 00:10:41.200 00:10:45.359 Uttam Kumaran: That’s why next week it’s really important for us to try to start doing some testing

111 00:10:45.580 00:10:48.889 Uttam Kumaran: because then I'll be able to put a proposal

112 00:10:49.410 00:10:54.449 Uttam Kumaran: in front of them for the next time. But if we need more time, it's okay.

113 00:10:54.730 00:10:56.310 Uttam Kumaran: We just.

114 00:10:56.640 00:11:05.220 Uttam Kumaran: I probably won't charge, like, an additional amount. But I want to put a proposal in front of them for the usage-based fees. So, yep.

115 00:11:05.910 00:11:13.429 Miguel de Veyra: Yeah. Okay. Also, they sent like a lot of docs yesterday, discounts and stuff. So we’ll probably have to add this, and then.

116 00:11:13.430 00:11:14.540 Uttam Kumaran: Oh, really.

117 00:11:14.540 00:11:16.579 Miguel de Veyra: Yeah, like this one,

118 00:11:17.200 00:11:23.659 Miguel de Veyra: 'cause I kind of knew they wouldn't have, like, the answers on the sheets. That's why they haven't filled it out.

119 00:11:23.850 00:11:24.940 Miguel de Veyra: So I asked.

120 00:11:24.940 00:11:26.099 Uttam Kumaran: They wrote these?

121 00:11:26.100 00:11:27.170 Miguel de Veyra: Yeah, yeah, yeah.

122 00:11:28.090 00:11:29.410 Uttam Kumaran: Can you open one up.

123 00:11:29.845 00:11:30.280 Miguel de Veyra: Here.

124 00:11:34.550 00:11:35.990 Uttam Kumaran: Oh, damn okay.

125 00:11:36.650 00:11:38.799 Miguel de Veyra: It was last updated 4 years ago.

126 00:11:41.070 00:11:43.719 Uttam Kumaran: So are you gonna put these into the.

127 00:11:43.720 00:11:45.070 Miguel de Veyra: Golden sheet, yes.

128 00:11:45.070 00:11:46.370 Uttam Kumaran: Yeah, okay.

129 00:11:46.370 00:11:46.960 Miguel de Veyra: Yeah.

130 00:11:48.110 00:11:54.750 Miguel de Veyra: And then some docs. I told them... I'm not sure if they added them already, the internal company ones, yeah, 'cause

131 00:11:54.880 00:11:59.909 Miguel de Veyra: what we wanna avoid is that we mix everything that's already here, and then, you know,

132 00:12:00.110 00:12:03.580 Miguel de Veyra: just double-implement everything. So they're gonna add it here,

133 00:12:03.970 00:12:08.560 Miguel de Veyra: the new docs that they're rewriting. They did really like the central doc idea.

134 00:12:08.820 00:12:09.410 Uttam Kumaran: Okay.

135 00:12:09.800 00:12:13.940 Miguel de Veyra: And then they said it's a lot, you know, cleaner than the previous one, the Bible,

136 00:12:14.500 00:12:16.190 Miguel de Veyra: because this one I cleaned a bit.

137 00:12:18.590 00:12:20.259 Miguel de Veyra: Yeah, as you can see there.

138 00:12:21.860 00:12:27.710 Miguel de Veyra: So yeah, I think for them, it’s pretty much good. I really wanna test this out.

139 00:12:28.370 00:12:32.330 Miguel de Veyra: and then I’ll check out the one you sent forever.

140 00:12:33.980 00:12:37.310 Miguel de Veyra: Sorry, my voice isn't the best right now.

141 00:12:37.640 00:12:38.940 Uttam Kumaran: No, no, no problem.

142 00:12:39.227 00:12:44.850 Miguel de Veyra: But yeah, I think that's it for this one. Casie, can you take over?

143 00:12:45.290 00:12:46.179 Miguel de Veyra: Thank you.

144 00:12:46.490 00:12:47.100 Casie Aviles: Sure.

145 00:12:47.790 00:12:49.380 Uttam Kumaran: Oh, sure!

146 00:12:52.375 00:12:58.950 Casie Aviles: Yeah. So yesterday I was more focused on just adding the tickets, the Notion tickets, as context for

147 00:12:59.630 00:13:05.819 Casie Aviles: the bot. So yeah, let me just show some of the tests that I have.

148 00:13:07.700 00:13:08.540 Casie Aviles: Yes, it.

149 00:13:08.870 00:13:10.370 Casie Aviles: AI test channel.

150 00:13:15.140 00:13:18.199 Casie Aviles: Yeah. So here, for example, I asked.

151 00:13:19.880 00:13:23.670 Casie Aviles: I asked the bot, are there any tickets that are currently blocked? So

152 00:13:24.110 00:13:31.410 Casie Aviles: yeah, it's able to pull the tickets from Notion, but just, like, the properties.

153 00:13:31.630 00:13:35.309 Casie Aviles: So the actual contents of the tickets are not there yet.

154 00:13:36.400 00:13:37.000 Uttam Kumaran: Okay.

155 00:13:38.645 00:13:46.929 Casie Aviles: And yeah. So I asked it some questions like this, although, there are instances where the answers are not perfect yet, like

156 00:13:48.110 00:13:55.260 Casie Aviles: I asked it for tickets assigned to you, and it only gave, like, just a few, but not the complete

157 00:13:55.650 00:14:00.700 Casie Aviles: tickets. So yeah, that’s that’s still something that needs to be worked on.
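
The workflow described above (pull Notion ticket properties and feed them to the bot as context, e.g. to answer "which tickets are blocked?") could be sketched as follows. The ticket shape and field names here are assumptions; the real Notion API returns a richer object, and the content-fetching step the speaker says is still missing is not shown.

```python
# Hypothetical sketch: filter Notion ticket properties and render them as
# plain-text context for the bot. Ticket dicts are a stand-in for what a
# Notion database query would return.

def blocked_tickets(tickets: list[dict]) -> list[dict]:
    """Keep only tickets whose status property is 'Blocked'."""
    return [t for t in tickets if t.get("status") == "Blocked"]

def tickets_to_context(tickets: list[dict]) -> str:
    """Render ticket properties as plain text the bot can use as context."""
    return "\n".join(
        f"- {t['title']} (status: {t['status']}, "
        f"assignee: {t.get('assignee', 'unassigned')})"
        for t in tickets
    )

tickets = [
    {"title": "Fix login bug", "status": "Blocked", "assignee": "Casie"},
    {"title": "Add FAQ data", "status": "In Progress", "assignee": "Miguel"},
]
print(tickets_to_context(blocked_tickets(tickets)))
```

The incomplete-results issue mentioned below (only a few assigned tickets returned) would typically come from pagination or a too-narrow filter on the query side, not from this formatting step.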

158 00:14:02.410 00:14:07.474 Casie Aviles: Okay? And then, for the... I mean, since we're

159 00:14:08.970 00:14:12.560 Casie Aviles: yeah, just looking at this, some of the features for the

160 00:14:13.790 00:14:16.760 Casie Aviles: for the initiative, for the Junior PM side.

161 00:14:17.840 00:14:26.110 Casie Aviles: A lot here also pertains to, like, Notion tickets. I'd say it's dependent on having the Notion tickets context.

162 00:14:26.110 00:14:26.950 Uttam Kumaran: Yes.

163 00:14:27.850 00:14:36.352 Casie Aviles: Yeah, that's why I worked on that yesterday. And another workflow that I'm still... it's not yet

164 00:14:37.540 00:14:44.150 Casie Aviles: done. But I have this initial version of just this automated email summary. So... sorry.

165 00:14:44.620 00:14:49.599 Casie Aviles: Yeah. For... oh, okay, great. Producing the client email summary. So

166 00:14:50.090 00:14:56.730 Casie Aviles: basically, how it works is, we get the context from the Notion tickets and Slack communications,

167 00:14:56.970 00:14:58.700 Casie Aviles: and then the AI step.

168 00:14:59.420 00:14:59.750 Uttam Kumaran: Okay.

169 00:14:59.750 00:15:00.920 Casie Aviles: And then it creates.

170 00:15:00.920 00:15:07.649 Uttam Kumaran: I don't need the email. I don't need it to be an email; like, it could just get sent to the

171 00:15:07.840 00:15:09.799 Uttam Kumaran: it could just get sent to like

172 00:15:12.870 00:15:17.360 Uttam Kumaran: Or basically, it just needs to get produced ad hoc, honestly,

173 00:15:17.610 00:15:23.890 Uttam Kumaran: because basically on, like, Friday, I would like to just basically say,

174 00:15:24.820 00:15:27.470 Uttam Kumaran: Okay, let’s produce a summary of what we did this week.

175 00:15:28.670 00:15:32.550 Uttam Kumaran: So I don’t think the email step draft is is so necessary.

176 00:15:34.990 00:15:36.600 Casie Aviles: Hmm, okay, okay. So

177 00:15:37.057 00:15:43.660 Casie Aviles: ideally, it would be like, you’re you’re just going to ask the bot or like, there’s a button or something like a trigger

178 00:15:43.990 00:15:46.499 Casie Aviles: to automatically create the emails right?

179 00:15:54.090 00:15:54.990 Uttam Kumaran: Hmm.

180 00:15:56.010 00:16:04.909 Uttam Kumaran: Yeah, I basically want it to just be able to return the text. Like, having it draft the email, that's, like, overkill.

181 00:16:05.380 00:16:05.740 Casie Aviles: Okay.

182 00:16:05.740 00:16:10.740 Uttam Kumaran: That's fine. I think more important is, can you work with Nico and

183 00:16:10.960 00:16:16.229 Uttam Kumaran: ask him about what our format is? He'll give you the best output format.

184 00:16:17.000 00:16:18.150 Casie Aviles: Okay. Yeah. Sure.

185 00:16:18.500 00:16:19.080 Uttam Kumaran: Okay.

186 00:16:22.975 00:16:26.260 Casie Aviles: But yeah, that’s yeah. That’s my progress so far.
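
The ad-hoc summary workflow agreed on above (gather the week's Notion tickets and Slack messages, build one prompt, return plain text with no email-draft step) could be sketched like this. The function name, prompt wording, and output format are placeholders until Nico's preferred format is known.

```python
# Hypothetical sketch: build a single summarization prompt from both context
# sources. The LLM call that would consume this prompt is not shown; the
# result would just be returned as text, per the discussion above.

def weekly_summary_prompt(tickets: list[str], slack_msgs: list[str]) -> str:
    """Combine ticket and Slack context into one summarization prompt."""
    return (
        "Summarize what the team did this week for the client.\n\n"
        "Tickets:\n" + "\n".join(f"- {t}" for t in tickets) + "\n\n"
        "Slack highlights:\n" + "\n".join(f"- {m}" for m in slack_msgs)
    )

prompt = weekly_summary_prompt(
    ["Shipped Notion ticket context for the bot"],
    ["Bot answered blocked-ticket questions in the AI test channel"],
)
print(prompt.splitlines()[0])
```

Triggering it ad hoc (a Slack command or a button, as discussed) just means calling this on demand instead of on a schedule.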

187 00:16:29.570 00:16:30.370 Uttam Kumaran: Okay.

188 00:16:31.490 00:16:34.490 Uttam Kumaran: Okay. Another piece is, I sent 2 articles.

189 00:16:34.490 00:16:34.870 Miguel de Veyra: Yep.

190 00:16:34.870 00:16:41.439 Uttam Kumaran: I honestly would recommend you guys, and maybe even just spend yeah, like.

191 00:16:41.660 00:16:47.080 Uttam Kumaran: just spend like an hour or 2 reading these I was. I couldn’t sleep.

192 00:16:47.080 00:16:49.610 Miguel de Veyra: Oh, yeah, you were online, like, 3 hours ago.

193 00:16:49.610 00:16:53.164 Uttam Kumaran: I was up, like, at 4 AM reading. And

194 00:16:54.090 00:16:58.830 Uttam Kumaran: yeah, this article is insane. Like, the way the guy breaks it down.

195 00:16:59.190 00:17:06.630 Uttam Kumaran: He’s doing sort of like what we’re doing except like one step ahead, where he explains the reason for like

196 00:17:06.950 00:17:16.869 Uttam Kumaran: having like, why we should consider adding inference, why we should like, consider different modes of context. And then he has a very good article on

197 00:17:17.230 00:17:20.060 Uttam Kumaran: on how... on RAG strategies.

198 00:17:20.339 00:17:28.709 Uttam Kumaran: And Casie, he basically has, like, a 4-part series on building what he calls a second brain, which is literally our

199 00:17:29.090 00:17:30.649 Uttam Kumaran: our Notion work.

200 00:17:30.990 00:17:33.429 Uttam Kumaran: I think it would be really important.

201 00:17:33.917 00:17:43.439 Uttam Kumaran: And I'm gonna try to read the whole thing when I get time this afternoon, but I think it would be really important for all of us

202 00:17:43.980 00:17:51.949 Uttam Kumaran: to read it, and maybe I'll even schedule time for tomorrow. During the retro meeting we can talk about

203 00:17:52.120 00:17:53.580 Uttam Kumaran: things we learned.

204 00:17:54.135 00:17:58.869 Uttam Kumaran: If we like it, I may even consider... he has, like, a book

205 00:17:59.420 00:18:06.840 Uttam Kumaran: that was very highly recommended, that I may go read. I don't know, I just thought it was extremely practical and

206 00:18:07.220 00:18:12.300 Uttam Kumaran: had almost like all the core pieces that we’re using

207 00:18:14.270 00:18:20.060 Uttam Kumaran: and basically, like, he has the same goal of trying to use Notion as, like, the second brain.

208 00:18:20.210 00:18:26.960 Uttam Kumaran: And then, how do we load that all into different types of context, different RAG strategies?

209 00:18:27.640 00:18:28.970 Uttam Kumaran: It’s really good.

210 00:18:31.710 00:18:34.449 Casie Aviles: Okay, yeah, I’ll spend some time reading this as well.

211 00:18:35.930 00:18:39.088 Uttam Kumaran: So maybe during tomorrow’s retro, we can.

212 00:18:43.540 00:18:48.889 Uttam Kumaran: Maybe we could talk about it during tomorrow's retro. And then also, like,

213 00:18:49.170 00:18:54.519 Uttam Kumaran: maybe we can do the first part about the article, and then the second part we can do a retro.

214 00:18:57.390 00:18:58.050 Casie Aviles: Okay.

215 00:19:00.280 00:19:01.460 Uttam Kumaran: Okay. Awesome.

216 00:19:06.240 00:19:07.170 Uttam Kumaran: Okay.

217 00:19:07.380 00:19:08.300 Uttam Kumaran: Great.

218 00:19:11.390 00:19:13.099 Miguel de Veyra: Okay, yeah, I think that’s pretty much it.

219 00:19:13.930 00:19:17.509 Uttam Kumaran: Okay, cool, alright. Well, let me know if

220 00:19:18.160 00:19:21.969 Uttam Kumaran: if there's anything else. And, yeah, Casie, tag me if I can test the

221 00:19:22.250 00:19:28.069 Uttam Kumaran: the bot at all. Output formats, etc.

222 00:19:28.980 00:19:33.009 Casie Aviles: Yeah, you can test the Javi bot already on the AI test channel.

223 00:19:33.550 00:19:34.280 Uttam Kumaran: Okay.

224 00:19:36.020 00:19:39.490 Uttam Kumaran: And I think Nico should be reaching out about demos tomorrow, too.

225 00:19:40.010 00:19:40.680 Casie Aviles: Oh, yeah, give it.

226 00:19:40.680 00:19:41.370 Miguel de Veyra: Was this.

227 00:19:41.650 00:19:43.639 Casie Aviles: I'm going to record a Loom for that.

228 00:19:44.600 00:19:46.729 Uttam Kumaran: Okay. Great. Do we want a demo?

229 00:19:46.880 00:19:48.930 Uttam Kumaran: Anything on the ABC side?

230 00:19:52.150 00:19:57.389 Miguel de Veyra: Oh, did we demo that? I don't... I'm not sure.

231 00:19:57.830 00:19:58.540 Uttam Kumaran: Okay.

232 00:19:59.140 00:20:04.329 Miguel de Veyra: We could probably demo the integration to Google. But I think everything else is pretty basic.

233 00:20:04.770 00:20:05.460 Uttam Kumaran: Okay.

234 00:20:06.000 00:20:08.279 Miguel de Veyra: It’s just more. A lot of context.

235 00:20:08.730 00:20:09.750 Uttam Kumaran: Yeah, yeah.

236 00:20:09.890 00:20:12.170 Miguel de Veyra: It’s more on organization of data.

237 00:20:12.620 00:20:13.310 Uttam Kumaran: Yeah.

238 00:20:15.370 00:20:17.599 Miguel de Veyra: Okay. Thanks. Everyone. Okay.

239 00:20:17.600 00:20:18.360 Uttam Kumaran: Thank you.

240 00:20:19.320 00:20:20.010 Casie Aviles: Thank you. Bye.

241 00:20:20.010 00:20:20.540 Uttam Kumaran: Thank you.