Meeting Title: Daily AI Team Sync Date: 2025-03-06 Meeting participants: Uttam Kumaran, Amber Lin, Miguel De Veyra, Casie Aviles


WEBVTT

1 00:00:08.890 00:00:10.420 Uttam Kumaran: Hey, Casie! Good evening.

2 00:00:11.470 00:00:12.717 Casie Aviles: Hey, Uttam! Good morning.

3 00:00:38.410 00:00:39.680 Miguel de Veyra: Hey? Guys, good morning.

4 00:00:40.610 00:00:41.620 Uttam Kumaran: Hey! Good morning!

5 00:00:46.430 00:00:48.239 Amber Lin: Good morning, everyone.

6 00:00:49.980 00:00:51.010 Uttam Kumaran: Hey! Good morning!

7 00:00:51.480 00:00:53.031 Amber Lin: Hi! I made it.

8 00:00:54.043 00:00:58.076 Uttam Kumaran: Amazing. Thank you so much. Well, I guess I’ll do.

9 00:00:58.700 00:01:05.419 Uttam Kumaran: I’ll do some quick intros. And maybe we can.

10 00:01:06.560 00:01:27.729 Uttam Kumaran: maybe we can just kick things off. So I guess I wanted to introduce Amber. Amber is joining from LA, and is joining on to take over project management and sort of client management for ABC, and of course, ideally, several other clients. And I'm super, super excited. She is—

11 00:01:27.840 00:01:32.686 Uttam Kumaran: She's extremely interested in AI — has actually, you know, that, that LLM—

12 00:01:33.900 00:01:40.230 Uttam Kumaran: the LLM thing I sent the other day, Miguel, Casie — that, like, Substack

13 00:01:42.250 00:01:49.190 Uttam Kumaran: with, like, the inference. And the — yeah. So she actually went and did that earlier this week, or, like, part of it, or most of it.

14 00:01:49.190 00:01:52.430 Amber Lin: No, I'm at Module 2 right now.

15 00:01:52.600 00:01:59.869 Uttam Kumaran: Alright. Well, that's more progress than I've done. And it's funny, because I was like, oh, I literally just sent this

16 00:02:00.010 00:02:03.589 Uttam Kumaran: to the team, and we're going through the same thing. So, long story short, I think

17 00:02:04.192 00:02:14.329 Uttam Kumaran: Amber has, like, multiple backgrounds — in project management and data, and, like, people stuff and recruiting. So I think she's super, super well equipped.

18 00:02:14.776 00:02:26.650 Uttam Kumaran: Probably the only thing we'll have to figure out moving forward is, like, the timing of these meetings — all 3 of our groups are now in way different time zones. I guess you guys are

19 00:02:27.130 00:02:31.260 Uttam Kumaran: like, the Manila, Philippines crew — you guys are on the most different. But—

20 00:02:31.260 00:02:33.200 Miguel de Veyra: 6 a.m. her time, no?

21 00:02:33.540 00:02:53.985 Uttam Kumaran: It's 6 — it's 6:30, or 6. Yeah, it's 8 — it's 8:30 my time, which is still early. And I know it's late your guys' time. But that's why I wanted to move this a little bit. But I think as things get smoother, again, this meeting will go better. I know we talk every day,

22 00:02:54.490 00:03:00.650 Uttam Kumaran: but yeah, I sort of briefed her yesterday on everything ABC, mostly in terms of the problems.

23 00:03:01.320 00:03:20.950 Uttam Kumaran: the problems we're trying to solve. I sent her sort of all the meetings, all the documents, but I didn't give her a great overview of the technical solution. Not because I don't think I could have — well, actually, I don't know if I could at this point, it's kind of complicated — but I would rather, Miguel, you and Casie sort of provide that for her.

24 00:03:21.429 00:03:39.340 Uttam Kumaran: Because you guys know the challenges and, like, the core pieces of logic. I've actually given her access to Miro. I've given her access to the Drive. Of course, she's in the channel. And for Friday — for tomorrow — ideally, Amber will be presenting

25 00:03:39.450 00:03:44.050 Uttam Kumaran: the roadmap view to ABC.

26 00:03:44.170 00:03:59.170 Uttam Kumaran: And sort of start to take over a little bit on, like, the communications, project management side. So I guess I'll leave that there. Beyond this stuff, I also think she's gonna be a great resource for our internal AI efforts around the AI PM.

27 00:03:59.450 00:04:07.390 Uttam Kumaran: Additionally, this team is gonna start to have — like, already today, the engineering team is asking me about how we can use AI for a few things.

28 00:04:07.510 00:04:15.629 Uttam Kumaran: So this team is going to get a lot of love from operations, from sales and from engineering on future automations.

29 00:04:16.179 00:04:26.050 Uttam Kumaran: And so I really, really am — I'm looking forward to Amber sort of being able to lead this team, to take those on, and build — sort of have that sort of client,

30 00:04:26.230 00:04:36.379 Uttam Kumaran: you know, sort of, like, client-contractor relationship with our internal teams — being like, so what requirements do we need, making sure that they can adopt the tools, things like that. So

31 00:04:36.670 00:04:38.132 Uttam Kumaran: I’ll pause there.

32 00:04:38.870 00:04:50.690 Uttam Kumaran: we can run this like a normal standup. But also I definitely would like for y'all to spend some time together doing a debrief of the project. So we can take this wherever we want to go.

33 00:04:51.820 00:04:54.599 Amber Lin: I’m also meeting with Miguel right after this meeting.

34 00:04:54.600 00:04:55.760 Uttam Kumaran: Oh, cool. Okay.

35 00:04:55.760 00:05:02.230 Amber Lin: Yeah. So if we want to get some stuff moving, I can catch up in my meeting with Miguel.

36 00:05:02.670 00:05:08.390 Uttam Kumaran: Okay, so how about we — maybe let's just talk about — let's just do a normal standup, Miguel.

37 00:05:08.390 00:05:09.090 Miguel de Veyra: Okay.

38 00:05:10.720 00:05:18.540 Uttam Kumaran: yeah, I'm interested in how stuff went yesterday. I have some updates on getting you guys some back-end help. So yeah, let's just run it normally.

39 00:05:19.460 00:05:23.959 Miguel de Veyra: Yeah. So basically, we just need Braintrust to

40 00:05:25.690 00:05:28.729 Miguel de Veyra: go enterprise, because there's no other paid plan,

41 00:05:29.691 00:05:37.168 Miguel de Veyra: 'cause we can't — we ran out of tokens, basically, to be able to use them. But it's fully functional now.

42 00:05:39.110 00:05:47.639 Miguel de Veyra: We've already messaged — you've already messaged the Vellum people to cancel. So what we're prioritizing right now is

43 00:05:47.970 00:05:54.440 Miguel de Veyra: to move the — what do you call this — the anit and workflow API directly to Google Chat.

44 00:05:56.240 00:06:01.720 Miguel de Veyra: So yeah, Casie and Janna are currently working on that together. Just 'cause, you know,

45 00:06:02.010 00:06:05.909 Miguel de Veyra: that's top priority while we're waiting for Braintrust, basically.

46 00:06:05.910 00:06:11.689 Uttam Kumaran: So they — so Tim said that those 2 people have access, right? What else do we need to do there?

47 00:06:14.844 00:06:20.169 Miguel de Veyra: If we're gonna cancel Vellum, that won't work. So we're gonna — you're migrating. Okay? So—

48 00:06:21.120 00:06:25.859 Uttam Kumaran: So you need me to go try to — is there a self-service option for that, or no?

49 00:06:25.860 00:06:28.210 Miguel de Veyra: No, no, you have to contact them. I checked.

50 00:06:28.600 00:06:29.920 Uttam Kumaran: Did you contact them yet.

51 00:06:30.865 00:06:33.540 Miguel de Veyra: No. You said you would yesterday.

52 00:06:33.540 00:06:34.949 Uttam Kumaran: Okay, yeah. I didn’t. Okay.

53 00:06:37.710 00:06:38.380 Uttam Kumaran: Okay.

54 00:06:38.500 00:06:41.850 Miguel de Veyra: Because I think it starts with them, right? Like, he knows the people.

55 00:06:42.850 00:06:51.046 Uttam Kumaran: Yeah, I asked him. He didn't get back to me either. We're both twins in that — we both dropped the ball and everything. So I will.

56 00:06:52.110 00:06:55.849 Uttam Kumaran: Did you sign up with a particular email? Or — it's with my email?

57 00:06:57.362 00:06:59.360 Miguel de Veyra: We use the invite you sent.

58 00:06:59.360 00:07:02.370 Uttam Kumaran: Okay, okay, let me do that right now, while we’re on the call. Let’s continue.

59 00:07:02.370 00:07:19.429 Miguel de Veyra: Okay. And then one of the other big things, I guess, is the — we have the RAG thing, the advanced RAG. Basically, it's not yet done, but it is kind of working. So what I'm working on right now, while John and Casie are working on that, is basically comparing, you know, context paste versus

60 00:07:19.710 00:07:20.570 Miguel de Veyra: RAG.

61 00:07:20.890 00:07:32.269 Miguel de Veyra: But that was working pretty well, since, you know, it was organized. But yeah, I'll create maybe, like, a spreadsheet with 10 questions, or maybe 20 questions each, and then just see what the difference in the

62 00:07:33.222 00:07:35.760 Miguel de Veyra: responses are from the two, basically.

63 00:07:37.700 00:07:41.960 Uttam Kumaran: Okay. How is the — so the response time is getting better?

64 00:07:41.960 00:07:43.270 Miguel de Veyra: Yes, yes.

65 00:07:43.270 00:07:46.879 Uttam Kumaran: Do you have — do you have something I can share with them this morning about that,

66 00:07:48.360 00:07:51.459 Uttam Kumaran: like about what the new response times are.

67 00:07:54.730 00:07:59.430 Miguel de Veyra: Probably by end of day I can send you something — not this morning.

68 00:07:59.430 00:08:00.270 Uttam Kumaran: Like

69 00:08:00.610 00:08:07.409 Uttam Kumaran: if I was to say I want to send something in the next hour — do you have, like, anything I can sort of give them?

70 00:08:08.680 00:08:13.019 Miguel de Veyra: I can run 20 or 30 tests, and then we can, you know, send it to them, you guys.

71 00:08:14.600 00:08:18.610 Uttam Kumaran: Okay. But right now, you just have sort of, like — it seems faster?

72 00:08:18.610 00:08:19.619 Miguel de Veyra: Yes, yes, yes.

73 00:08:19.620 00:08:21.159 Uttam Kumaran: Okay, okay, all right.

74 00:08:23.870 00:08:29.159 Uttam Kumaran: Is there anything from Braintrust, like a screenshot, that I can send over to them?

75 00:08:30.520 00:08:33.910 Miguel de Veyra: Oh, wait! Let me check my logs.

76 00:08:40.340 00:08:41.600 Miguel de Veyra: Alright! Let me check.

77 00:08:48.510 00:08:52.720 Miguel de Veyra: I'll send — I'm running something. I'll send you the error once it's done.

78 00:08:55.210 00:08:56.479 Miguel de Veyra: Here you go.

79 00:08:56.900 00:08:58.220 Miguel de Veyra: Okay, I’ll send this to you.

80 00:09:11.930 00:09:13.929 Miguel de Veyra: I sent it to ait.

81 00:09:15.660 00:09:16.390 Uttam Kumaran: Okay.

82 00:09:24.970 00:09:27.589 Miguel de Veyra: Okay? And then, yeah, I think that’s pretty much it.

83 00:09:30.880 00:09:31.600 Uttam Kumaran: Okay,

84 00:09:35.900 00:09:45.409 Uttam Kumaran: okay, great. So the other thing on my side: I have a guy that's gonna solve all of our back-end optimization problems. He's

85 00:09:45.560 00:09:48.849 Uttam Kumaran: my really really good friend. His name is Patrick Devlin.

86 00:09:49.689 00:10:01.839 Uttam Kumaran: Don't want this to be, like, an understatement, but maybe, like, the most talented engineer — like, there's probably one other person, and he's, like, probably 55.

87 00:10:02.410 00:10:07.340 Uttam Kumaran: Patrick is, like, my age, but probably the most talented engineer I've ever worked with. Like,

88 00:10:07.920 00:10:13.821 Uttam Kumaran: full stack, like I don’t know what’s fuller stack, or like every stack sort of like

89 00:10:14.860 00:10:20.340 Uttam Kumaran: like — he's an extremely talented back-end, front-end, full-stack, streaming, real-time—

90 00:10:20.610 00:10:33.429 Uttam Kumaran: He's worked on my crypto stuff — like, really, really talented person. I asked him to come basically just sort of help you guys and be a resource on this optimization problem.

91 00:10:34.810 00:10:39.149 Uttam Kumaran: And so I'm gonna start to — I'm gonna include him on stuff.

92 00:10:39.230 00:11:04.825 Uttam Kumaran: I gave him the overview — I haven't yet. Again, I think, Miguel, I'll connect you with him, because you'll give him a great technical overview of everything. And his job is basically, one, to see around corners — to, like, basically question our current assumptions: Is Google Cloud Functions the best way to do this? Is there anything else we could do in Supabase? What else can we optimize? He's gonna be the guy.

93 00:11:05.570 00:11:14.210 Uttam Kumaran: That way, it frees you guys to continue working on the core LLM logic. And then basically anything around deployment,

94 00:11:14.720 00:11:18.680 Uttam Kumaran: shit like that — just ask him for it. Yeah, he's—

95 00:11:19.040 00:11:20.639 Miguel de Veyra: Yeah — like, anything DevOps.

96 00:11:21.010 00:11:26.379 Uttam Kumaran: Anything. Yeah, basically anything. I mean, even the LLM stuff, I'm sure he'll figure it out. But, like,

97 00:11:27.080 00:11:40.130 Uttam Kumaran: I would — his time would be most valuable to us used on anything related to deployment, DevOps, cloud functions, cloud databases, APIs — like,

98 00:11:40.340 00:11:41.210 Uttam Kumaran: yeah.

99 00:11:43.040 00:11:56.142 Uttam Kumaran: And he's so nice — like, you'll really enjoy spending time with him. I also think — try to build a good relationship with him. I think I want him to sort of help us on even internal stuff, and, like, next projects.

100 00:11:56.820 00:12:18.903 Uttam Kumaran: He's currently working at another company — it's a company owned by a friend of mine, who I actually referred Patrick to, because I couldn't afford to hire him when I started the company. But I'm really, really — I'm so pumped that he has an opportunity to work for us. So I will make the connection today. And,

101 00:12:19.860 00:12:30.020 Uttam Kumaran: yeah, I'm hoping that he can be a good resource. As we have the core logic done here, I think a lot of it is optimization. And I also think he'll be helpful on this RAG—

102 00:12:30.460 00:12:36.159 Uttam Kumaran: like, RAG, inference — all that, like, complicated stuff, basically.

103 00:12:41.490 00:12:47.939 Uttam Kumaran: Okay, I'm just submitting the Braintrust thing.

104 00:12:49.136 00:12:50.050 Uttam Kumaran: Let’s see.

105 00:12:52.920 00:12:55.029 Miguel de Veyra: Okay, Uttam, I'm sorry — just a quick question:

106 00:12:55.160 00:13:00.629 Miguel de Veyra: what do you need — what do you need about the response time? Is it basically something like a Google Sheet,

107 00:13:00.800 00:13:02.950 Miguel de Veyra: like, what was it before? What is it now?

108 00:13:03.880 00:13:09.049 Uttam Kumaran: Oh, yeah — literally, I just need you to tell me 2 numbers. Yeah.

109 00:13:09.050 00:13:09.680 Uttam Kumaran: how are you doing?

110 00:13:09.680 00:13:11.860 Uttam Kumaran: I don’t need anything. I just want to tell them something.

111 00:13:11.860 00:13:13.677 Uttam Kumaran: Oh, okay, okay, okay, yeah. Okay.

112 00:13:13.980 00:13:18.789 Uttam Kumaran: Just to give me a sense of, like: before we were averaging this, now we're averaging this.

113 00:13:19.840 00:13:23.149 Miguel de Veyra: Okay. I thought you wanted that, you know.

114 00:13:23.150 00:13:26.209 Uttam Kumaran: No, no, it’s good. It’s a good question. It’s good. It’s good to clarify.

115 00:13:26.210 00:13:31.930 Miguel de Veyra: Okay, okay, yeah. Let me just finish my meeting with Amber after, and then I'll hop on with Casie and then send it to you.

116 00:13:32.520 00:13:33.230 Uttam Kumaran: Okay.

117 00:13:38.170 00:13:38.900 Miguel de Veyra: Okay.

118 00:13:40.020 00:13:41.579 Uttam Kumaran: Okay, great. Anything else?

119 00:13:42.594 00:13:45.639 Miguel de Veyra: No, I think that’s pretty much it on my end.

120 00:13:46.730 00:13:50.030 Uttam Kumaran: Anything else, Amber, that we wanted to discuss?

121 00:13:50.811 00:13:58.050 Amber Lin: I just want — since we still have a teeny bit of time — just a quick overview of what we presented last week,

122 00:13:58.360 00:14:01.820 Amber Lin: and then, what are we gonna present this week?

123 00:14:02.280 00:14:10.150 Uttam Kumaran: Yeah. So last week — what did we present last week? So last week — oh.

124 00:14:10.150 00:14:12.199 Miguel de Veyra: More on data, their data.

125 00:14:12.750 00:14:14.849 Uttam Kumaran: Oh, yeah, we talked about

126 00:14:17.260 00:14:19.919 Uttam Kumaran: We talked about what metrics we're collecting.

127 00:14:20.290 00:14:22.490 Uttam Kumaran: We talked a little bit about.

128 00:14:24.680 00:14:28.930 Uttam Kumaran: shit! I don't know, dude — I have to check the email. I have to check the video, actually.

129 00:14:28.930 00:14:33.810 Miguel de Veyra: Yeah. But yeah, from what I can remember, it was mostly

130 00:14:34.636 00:14:41.370 Miguel de Veyra: around their data, 'cause it's still not up to date, most of it. And then, yeah.

131 00:14:41.370 00:14:43.870 Uttam Kumaran: Oh, yeah — we should — you should ask them—

132 00:14:44.200 00:14:52.650 Uttam Kumaran: Ask Patrick about the evolve stuff, too.

133 00:14:54.950 00:14:56.020 Amber Lin: Nice.

134 00:14:56.940 00:15:04.710 Uttam Kumaran: When you guys talk with him, ask him about how we want to do the evolve. And I explained to him the whole project, so he kind of gets it. But—

135 00:15:04.710 00:15:05.360 Miguel de Veyra: Okay.

136 00:15:08.430 00:15:10.600 Miguel de Veyra: when when’s Patrick coming in? By the way.

137 00:15:10.600 00:15:15.150 Uttam Kumaran: As soon as — as soon as I get him on Slack.

138 00:15:15.586 00:15:16.460 Miguel de Veyra: Okay, okay.

139 00:15:23.480 00:15:25.820 Miguel de Veyra: But I also have to reply to Hannah.

140 00:15:43.240 00:15:45.479 Uttam Kumaran: Okay. Anything else.

141 00:15:46.323 00:15:47.890 Miguel de Veyra: No, I think that’s pretty much it.

142 00:15:47.890 00:15:54.479 Uttam Kumaran: I can — yeah, I guess you guys can probably pop on earlier or whatever. But I think I need this meeting link. So—

143 00:15:55.200 00:15:55.930 Amber Lin: Okay.

144 00:15:56.490 00:15:56.960 Uttam Kumaran: Okay.

145 00:15:57.220 00:15:59.049 Miguel de Veyra: Thanks. Everyone have a good day.

146 00:15:59.050 00:16:00.039 Amber Lin: Thank you. Talk soon.

147 00:16:00.040 00:16:03.559 Amber Lin: Thank you, everyone, Miguel. I’ll go to our meeting room.

148 00:16:03.560 00:16:05.159 Miguel de Veyra: Okay, okay, sure. I’ll hop on now.

149 00:16:05.450 00:16:06.220 Amber Lin: Bye, guys.