Meeting Title: AI Team Weekly Planning Date: 2025-02-03 Meeting participants: Miguel De Veyra, Casie Aviles, Uttam Kumaran
WEBVTT
1 00:05:59.760 ⇒ 00:06:01.930 Miguel de Veyra: You, you, you stop!
2 00:06:03.500 ⇒ 00:06:04.559 Casie Aviles: Hey! Hey!
3 00:06:06.610 ⇒ 00:06:08.720 Miguel de Veyra: Didn’t move the message without.
4 00:06:13.180 ⇒ 00:06:14.120 Miguel de Veyra: hey? What’s up?
5 00:06:14.500 ⇒ 00:06:15.190 Uttam Kumaran: Hey!
6 00:06:15.410 ⇒ 00:06:16.210 Casie Aviles: Yeah. Done.
7 00:06:16.650 ⇒ 00:06:17.310 Uttam Kumaran: Hey!
8 00:06:19.850 ⇒ 00:06:24.089 Miguel de Veyra: And that dog feels so relaxed now he looks so relaxed.
9 00:06:28.950 ⇒ 00:06:34.146 Uttam Kumaran: I know. I wish I was like that. I feel so stressed.
10 00:06:36.550 ⇒ 00:06:39.140 Uttam Kumaran: Is Janet coming today or no?
11 00:06:39.650 ⇒ 00:06:41.110 Miguel de Veyra: No, I don’t think so.
12 00:06:42.020 ⇒ 00:06:43.400 Miguel de Veyra: It’s 2 AM.
13 00:06:43.860 ⇒ 00:06:44.280 Uttam Kumaran: Okay.
14 00:06:44.700 ⇒ 00:06:46.070 Miguel de Veyra: She’ll be there tomorrow, though.
15 00:06:53.270 ⇒ 00:06:58.039 Uttam Kumaran: Okay, perfect. Yeah. Where do we want to start?
16 00:06:59.740 ⇒ 00:07:06.860 Miguel de Veyra: I guess we start with the OKR, because that’s, you know, pretty solid. And then we decide how to go with the ABC stuff.
17 00:07:07.410 ⇒ 00:07:09.950 Uttam Kumaran: Okay, cool, perfect. Let me pull it up.
18 00:08:34.720 ⇒ 00:08:43.089 Uttam Kumaran: Okay, okay, so yeah, this is our task for
19 00:08:43.370 ⇒ 00:08:53.710 Uttam Kumaran: 2 things on the AI side. So one is, every client needs to receive a high quality message. So short term is, I need some help to bring in Slack data
20 00:08:56.220 ⇒ 00:08:58.770 Uttam Kumaran: into our internal Snowflake.
21 00:08:59.265 ⇒ 00:09:08.280 Uttam Kumaran: So that’s 1 task I’m gonna put up on the AI team board in order to work on that, and let me just pull up the overall OKRs here, so you can see them.
22 00:09:08.780 ⇒ 00:09:09.760 Uttam Kumaran: Oh.
23 00:09:16.770 ⇒ 00:09:21.140 Uttam Kumaran: so this is really the.
24 00:09:21.340 ⇒ 00:09:25.559 Uttam Kumaran: This is really the core task right now that we need to take on, which is
25 00:09:25.970 ⇒ 00:09:30.039 Uttam Kumaran: measure the quality and the number of messages that we’re sending to clients.
26 00:09:30.330 ⇒ 00:09:36.160 Uttam Kumaran: So one thing that I’m gonna work on is basically trying to see how we can bring in those Slack messages
27 00:09:36.470 ⇒ 00:09:38.089 Uttam Kumaran: into the data warehouse.
28 00:09:38.200 ⇒ 00:09:44.750 Uttam Kumaran: I’ll be hitting the Slack API and getting them. And second is, we want to try to use AI to basically
29 00:09:44.900 ⇒ 00:09:49.420 Uttam Kumaran: understand like what messages we’re sending to clients
30 00:09:49.570 ⇒ 00:09:53.509 Uttam Kumaran: and make sure that we are getting at least one every day to each client.
31 00:09:55.220 ⇒ 00:10:01.900 Miguel de Veyra: And then set up another. Yeah, yeah. And then, like, set up some alert. That, hey? You haven’t messaged this client today. Something like that.
32 00:10:01.900 ⇒ 00:10:02.540 Uttam Kumaran: Correct.
33 00:10:04.120 ⇒ 00:10:06.269 Miguel de Veyra: Okay, yeah, more on analytics.
34 00:10:07.700 ⇒ 00:10:11.199 Uttam Kumaran: Yeah, so partly data. And then partly, we’ll do some Zapier work.
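[Editor’s sketch] The flow discussed above — pull Slack messages via the API, land them in the warehouse, then alert on clients with no message today — could look roughly like this. All names here are illustrative, not the team’s actual implementation; the real fetch would go through Slack’s conversations.history endpoint and the load through a Snowflake connector, both stubbed out so only the pure transform/check logic is shown:

```python
"""Sketch of the Slack -> Snowflake -> alert flow (hypothetical names)."""
from datetime import datetime, timezone


def messages_to_rows(channel_id, messages):
    """Flatten raw Slack message payloads into (channel, user, ts, text)
    rows suitable for a warehouse table. In a real pipeline `messages`
    would come from Slack's conversations.history API."""
    rows = []
    for m in messages:
        if m.get("type") != "message" or m.get("subtype"):
            continue  # skip joins, bot notices, and other non-chat events
        rows.append((
            channel_id,
            m.get("user", ""),
            datetime.fromtimestamp(float(m["ts"]), tz=timezone.utc),
            m.get("text", ""),
        ))
    return rows


def clients_missing_message(last_message_ts, now):
    """Return clients with no message so far today (UTC) — the
    "you haven't messaged this client today" alert check."""
    today = now.date()
    return sorted(
        client for client, ts in last_message_ts.items()
        if ts is None or ts.date() < today
    )
```

The alert half could then feed a Zapier (or similar) notification step keyed on the returned client list.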
35 00:10:11.320 ⇒ 00:10:14.950 Miguel de Veyra: Yeah, which which client is like.
36 00:10:15.710 ⇒ 00:10:18.779 Miguel de Veyra: do you wanna test or run a test in this one.
37 00:10:19.780 ⇒ 00:10:22.679 Uttam Kumaran: We wanna we wanna try to test with Javi first.
38 00:10:22.680 ⇒ 00:10:23.420 Miguel de Veyra: Okay.
39 00:10:24.878 ⇒ 00:10:29.059 Uttam Kumaran: So let’s let me create this ticket in our board.
40 00:10:31.350 ⇒ 00:10:36.320 Miguel de Veyra: Is there like a what do you call this client, Javi? I don’t think I’m part of it.
41 00:10:36.320 ⇒ 00:10:37.602 Uttam Kumaran: Yes, there is
42 00:10:38.320 ⇒ 00:10:40.039 Uttam Kumaran: I can add you to those.
43 00:10:40.200 ⇒ 00:10:42.450 Uttam Kumaran: I can add you to those. Once we
44 00:10:43.280 ⇒ 00:10:44.800 Uttam Kumaran: once we start working on it.
45 00:10:44.800 ⇒ 00:10:45.500 Miguel de Veyra: Yep. Yep.
46 00:10:47.070 ⇒ 00:10:52.539 Uttam Kumaran: So let’s take a look at these. Are any. Can we close out any of these.
47 00:10:55.541 ⇒ 00:10:58.610 Miguel de Veyra: These breakup tickets we can close, to be honest.
48 00:11:01.430 ⇒ 00:11:06.280 Casie Aviles: I think we can also close the summarizer manual transcript entry.
49 00:11:12.620 ⇒ 00:11:17.000 Miguel de Veyra: And then this updating new tickets based on.
50 00:11:30.860 ⇒ 00:11:32.960 Uttam Kumaran: So do we have all the databases then?
51 00:11:34.352 ⇒ 00:11:36.000 Miguel de Veyra: The tasks! Not yet.
52 00:11:36.700 ⇒ 00:11:39.100 Miguel de Veyra: I’m still trying to figure out how to do that.
53 00:11:39.200 ⇒ 00:11:40.520 Miguel de Veyra: That’s why it’s in blocked.
54 00:11:42.570 ⇒ 00:11:44.780 Uttam Kumaran: But which ones do we still have to bring in.
55 00:11:46.150 ⇒ 00:11:47.749 Miguel de Veyra: The tasks. Only the tasks.
56 00:11:47.980 ⇒ 00:11:48.990 Uttam Kumaran: Oh, okay. Okay.
57 00:11:51.260 ⇒ 00:11:54.970 Miguel de Veyra: So yeah, generally, I think it should be okay.
58 00:11:59.230 ⇒ 00:12:02.529 Miguel de Veyra: the Notion guys replied. But it’s kinda
59 00:12:03.350 ⇒ 00:12:06.670 Miguel de Veyra: general support type of reply. So it’s not really helpful.
60 00:12:07.230 ⇒ 00:12:08.110 Uttam Kumaran: Oh, okay.
61 00:12:09.220 ⇒ 00:12:10.580 Miguel de Veyra: So I think I’ll follow up.
62 00:12:11.580 ⇒ 00:12:12.789 Uttam Kumaran: And how about this one.
63 00:12:13.553 ⇒ 00:12:17.320 Miguel de Veyra: This one. I’m not sure how we wanna proceed with this, because
64 00:12:17.670 ⇒ 00:12:20.100 Miguel de Veyra: this was the way to update tickets. But.
65 00:12:21.990 ⇒ 00:12:23.849 Uttam Kumaran: You need some specific examples.
66 00:12:24.537 ⇒ 00:12:29.280 Miguel de Veyra: That one. And is this a priority right now?
67 00:12:30.680 ⇒ 00:12:32.539 Uttam Kumaran: Yeah, let’s move it to plan.
68 00:12:32.540 ⇒ 00:12:33.040 Miguel de Veyra: Yeah.
69 00:12:44.940 ⇒ 00:12:47.959 Uttam Kumaran: yeah. And then. Now, this one is basically done.
70 00:12:48.770 ⇒ 00:12:49.350 Miguel de Veyra: Yeah.
71 00:12:51.980 ⇒ 00:12:59.130 Uttam Kumaran: This is sort of like, yeah, this is, gonna I’m gonna turn this into automating junior, you know.
72 00:12:59.130 ⇒ 00:13:00.100 Miguel de Veyra: PM, yeah.
73 00:13:00.490 ⇒ 00:13:04.379 Uttam Kumaran: And then I’ll I’ll finish this one up soon as I can.
74 00:13:07.070 ⇒ 00:13:09.509 Uttam Kumaran: All of these are not so urgent.
75 00:13:16.330 ⇒ 00:13:22.340 Miguel de Veyra: Do you? Do you think we should add any new demos, since we’re reaching out to law offices or.
76 00:13:23.740 ⇒ 00:13:28.060 Uttam Kumaran: I think the biggest thing right now is I wanna prioritize the work for
77 00:13:29.920 ⇒ 00:13:32.239 Uttam Kumaran: well, once I want to do the bring in.
78 00:13:40.047 ⇒ 00:13:41.360 Miguel de Veyra: Slack into Snowflake
79 00:13:45.590 ⇒ 00:13:47.939 Miguel de Veyra: is Snowflake some sort of database?
80 00:13:48.060 ⇒ 00:13:48.670 Uttam Kumaran: Yeah.
81 00:13:49.553 ⇒ 00:13:50.139 Miguel de Veyra: I see.
82 00:13:50.850 ⇒ 00:13:51.890 Miguel de Veyra: SQL.
83 00:13:54.220 ⇒ 00:13:57.029 Miguel de Veyra: Wait! Let me check. If they’re in Notion they should be, right?
84 00:13:58.640 ⇒ 00:14:04.520 Uttam Kumaran: So create alert workflow for missed client.
85 00:14:15.510 ⇒ 00:14:17.210 Miguel de Veyra: Oh, Snowflake is in Notion.
86 00:14:22.110 ⇒ 00:14:26.789 Casie Aviles: What kind of data are we pulling from Slack? Is it going to be mostly messages?
87 00:14:27.200 ⇒ 00:14:28.669 Uttam Kumaran: Yeah. Messages.
88 00:14:50.890 ⇒ 00:14:52.880 Miguel de Veyra: Okay, I’m kind of seeing the vision.
89 00:14:54.000 ⇒ 00:14:56.680 Uttam Kumaran: The next thing is also, we want to start to measure
90 00:14:56.810 ⇒ 00:15:03.430 Uttam Kumaran: how many, we want to start to get the data from. We just don’t have, we don’t have the logs from n8n, right?
91 00:15:05.040 ⇒ 00:15:08.460 Uttam Kumaran: I don’t know, like, this is where I think maybe we should start to move everything to Vellum.
92 00:15:10.920 ⇒ 00:15:13.579 Uttam Kumaran: because otherwise, how are we gonna do this next OKR?
93 00:15:14.720 ⇒ 00:15:15.420 Casie Aviles: Yeah.
94 00:15:17.520 ⇒ 00:15:20.599 Miguel de Veyra: Now we can try, because
95 00:15:21.050 ⇒ 00:15:27.040 Miguel de Veyra: n8n, hey? n8n has this feature where there’s this custom package for you, if you know,
96 00:15:27.210 ⇒ 00:15:30.889 Miguel de Veyra: if you only want something, we can ask for the data, right?
97 00:15:31.670 ⇒ 00:15:33.960 Uttam Kumaran: No, I asked them, but they said, they can’t do that.
98 00:15:34.330 ⇒ 00:15:35.759 Miguel de Veyra: Oh, do! Do you set all right?
99 00:15:35.990 ⇒ 00:15:36.650 Miguel de Veyra: Sure.
100 00:15:38.080 ⇒ 00:15:40.700 Uttam Kumaran: I mean, how do you guys feel about moving everything to Vellum?
101 00:15:47.570 ⇒ 00:15:51.260 Miguel de Veyra: I mean, Casie had more experience with it, so I.
102 00:15:51.260 ⇒ 00:15:51.580 Casie Aviles: Thank you.
103 00:15:51.580 ⇒ 00:15:53.889 Miguel de Veyra: Is the better one to speak on this.
104 00:15:55.140 ⇒ 00:16:01.449 Casie Aviles: I mean, yeah, for me. I I didn’t get to like, test it thoroughly. But
105 00:16:02.481 ⇒ 00:16:04.648 Casie Aviles: I guess the 1st thing is
106 00:16:05.680 ⇒ 00:16:10.509 Casie Aviles: I just, you know, for me. It’s maybe I’m a bit biased with n8n, but
107 00:16:11.070 ⇒ 00:16:17.279 Casie Aviles: initially like I had to get used to it. But of course. That’s just for the initial part.
108 00:16:18.060 ⇒ 00:16:23.630 Casie Aviles: what I do like about vellum is, you know, we could be more confident with the solutions that we’re building.
109 00:16:23.900 ⇒ 00:16:30.649 Casie Aviles: you know, because there’s like evals and version control. And you know, observability. So I I do think that’s a huge plus
110 00:16:31.563 ⇒ 00:16:34.670 Casie Aviles: that said, Yeah, it’s gonna take some
111 00:16:35.410 ⇒ 00:16:41.520 Casie Aviles: time, I guess, to get used to transferring from n8n to Vellum.
112 00:16:42.600 ⇒ 00:16:49.079 Casie Aviles: And yeah, like, there are some things that’s going to take more time, because it’s more granular, like with the workflow builder. It’s
113 00:16:50.322 ⇒ 00:16:54.309 Casie Aviles: there’s less abstraction. So not a lot, there’s less integrations
114 00:16:54.870 ⇒ 00:17:00.040 Casie Aviles: compared with n8n. So yeah, I guess.
115 00:17:00.040 ⇒ 00:17:07.810 Miguel de Veyra: Okay. Remember, Casie, you showed me something like it’s 1 step in n8n, and there’s like 4 or 5 steps in Vellum. Right?
116 00:17:09.578 ⇒ 00:17:15.159 Casie Aviles: Yeah, basically, like, you know, like the blocks. Or I guess that’s their equivalent of nodes.
117 00:17:15.339 ⇒ 00:17:21.409 Casie Aviles: It’s going to be more. They’re more basic. So you know, unlike with n8n where you just
118 00:17:21.609 ⇒ 00:17:27.310 Casie Aviles: where a node just does a, it’s like an object, right, for n8n. But
119 00:17:27.560 ⇒ 00:17:32.339 Casie Aviles: with vellum, it’s more like controlling the the flow of execution. So
120 00:17:33.020 ⇒ 00:17:41.370 Casie Aviles: yeah, not sure if it’s making sense. But yeah, that’s how I felt about building in Vellum. So.
121 00:17:43.180 ⇒ 00:17:50.119 Uttam Kumaran: I mean, I can let you guys decide. But I think now we have actually like a constraint where one
122 00:17:50.310 ⇒ 00:17:55.579 Uttam Kumaran: we want to build. We need to be able to measure the amount of requests
123 00:17:55.740 ⇒ 00:18:02.510 Uttam Kumaran: that we’re that we’re getting. Second, we’re we’re gonna we need to do evals and testing. So
124 00:18:03.860 ⇒ 00:18:08.419 Uttam Kumaran: we need to make a decision ideally in the next, like day or 2
125 00:18:09.600 ⇒ 00:18:12.049 Uttam Kumaran: on what we want to do longer term, because
126 00:18:13.030 ⇒ 00:18:16.840 Uttam Kumaran: it doesn’t seem clear to me that n8n is going to be our production platform.
127 00:18:18.620 ⇒ 00:18:20.639 Casie Aviles: Yeah, yeah, that’s I understand that.
128 00:18:21.890 ⇒ 00:18:22.390 Miguel de Veyra: I mean.
129 00:18:22.390 ⇒ 00:18:23.110 Casie Aviles: Have like.
130 00:18:23.560 ⇒ 00:18:28.220 Miguel de Veyra: I mean the the things they’re coming out, though, like the 1st quarter
131 00:18:28.360 ⇒ 00:18:30.539 Miguel de Veyra: from what we watched last Friday.
132 00:18:31.460 ⇒ 00:18:35.600 Miguel de Veyra: So it’s gonna be part of that, so I’m not sure.
133 00:18:38.050 ⇒ 00:18:46.099 Uttam Kumaran: I just don’t feel comf, like, I don’t believe in any of the product demos, like, coming out anytime soon with everything we need
134 00:18:49.480 ⇒ 00:18:56.970 Uttam Kumaran: And honestly for me the number that when we’re working with clients, the the thing that matters most is resiliency and testing.
135 00:18:58.660 ⇒ 00:19:04.990 Uttam Kumaran: Well, I don’t know. Like I can give you guys another 2 days to sort of figure it out. But I would like to know
136 00:19:05.410 ⇒ 00:19:08.909 Uttam Kumaran: what we’re what the plan is for evals.
137 00:19:11.340 ⇒ 00:19:15.130 Uttam Kumaran: Like model testing, etc.
138 00:19:16.080 ⇒ 00:19:17.120 Miguel de Veyra: Okay. Yeah.
139 00:19:18.090 ⇒ 00:19:21.919 Casie Aviles: Yeah, with evals I was able to try like the semantic similarity
140 00:19:22.170 ⇒ 00:19:24.530 Casie Aviles: metric. I think that’s what we can
141 00:19:24.970 ⇒ 00:19:30.070 Casie Aviles: get the most value out of, or like we could build like, I guess, a golden
142 00:19:31.258 ⇒ 00:19:35.740 Casie Aviles: output and compare it against the AI’s generated output.
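[Editor’s sketch] The golden-output eval described above, in toy form. In a real setup the embedding would come from an actual embedding model (whichever the eval platform provides); here a bag-of-words vector stands in for it so the scoring logic is runnable, and all names are hypothetical:

```python
"""Toy semantic-similarity eval: golden output vs. generated output."""
import math
from collections import Counter


def embed(text):
    # Hypothetical stand-in for a real embedding model:
    # a bag-of-words count vector over lowercased tokens.
    return Counter(text.lower().split())


def cosine_similarity(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def eval_against_golden(golden, generated, threshold=0.8):
    """Score a generated message against the golden one; pass/fail
    on a similarity threshold."""
    score = cosine_similarity(embed(golden), embed(generated))
    return {"score": score, "passed": score >= threshold}
```

Identical texts score 1.0 and texts with no shared tokens score 0.0; the threshold is the knob a real eval suite would tune.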
143 00:19:40.610 ⇒ 00:19:49.159 Uttam Kumaran: I mean, like, do we? I mean, there’s a there’s a couple of other competitors that we could keep looking at. Traceloop, I didn’t feel really great about
144 00:19:50.006 ⇒ 00:19:56.330 Uttam Kumaran: I mean, Arize was the only other one that I know is really really good, like,
145 00:19:56.940 ⇒ 00:20:00.010 Uttam Kumaran: do you want to try? Should we try to do Arize?
146 00:20:11.180 ⇒ 00:20:20.070 Miguel de Veyra: Wait. Sorry, what type of, like? Because we we talk about evals. But specifically, what type of evals are we, like, what do we want to actually see?
147 00:20:21.120 ⇒ 00:20:23.500 Uttam Kumaran: Yeah, I want to do like semantic similarity.
148 00:20:25.050 ⇒ 00:20:26.080 Uttam Kumaran: Primarily.
149 00:20:27.640 ⇒ 00:20:29.379 Miguel de Veyra: Okay, wait. Let me take note of that.
150 00:20:32.400 ⇒ 00:20:33.820 Uttam Kumaran: But if you go to like.
151 00:20:37.270 ⇒ 00:20:39.790 Uttam Kumaran: yeah, I think this is a really good
152 00:20:40.370 ⇒ 00:20:45.060 Uttam Kumaran: page. But if you go here like, this is some.
153 00:20:46.410 ⇒ 00:20:51.620 Uttam Kumaran: This is some good info on like how to run evals.
154 00:20:58.210 ⇒ 00:21:01.330 Uttam Kumaran: so I don’t know. Maybe we should try Arize too, like.
155 00:21:02.930 ⇒ 00:21:08.360 Miguel de Veyra: Then I can probably ask in their Discord if someone is probably thinking like this,
156 00:21:09.410 ⇒ 00:21:12.360 Miguel de Veyra: in the n8n Discord, so I’ll check it out there.
157 00:21:12.940 ⇒ 00:21:18.539 Uttam Kumaran: I just feel like n8n, they’re start, they’re trying to go into enterprise. These guys started as like enterprise. So
158 00:21:21.120 ⇒ 00:21:26.570 Uttam Kumaran: I don’t really know a better one apart from Arize, you know.
159 00:22:01.900 ⇒ 00:22:03.899 Uttam Kumaran: I mean, we thought about Helicone, right?
160 00:22:06.620 ⇒ 00:22:10.990 Uttam Kumaran: But Helicone, I don’t feel like, wasn’t it? Wasn’t for AI builders.
161 00:22:11.160 ⇒ 00:22:13.049 Uttam Kumaran: It was just for monitoring.
162 00:22:15.450 ⇒ 00:22:20.920 Casie Aviles: Yeah, yeah, like I said, the tricky part was actually integrating it with n8n
163 00:22:26.040 ⇒ 00:22:29.249 Casie Aviles: like, for most of the events that I’ve checked.
164 00:22:29.700 ⇒ 00:22:35.450 Casie Aviles: So the other one that I was, that I also tried, was Langfuse, but for Langfuse
165 00:22:35.560 ⇒ 00:22:38.910 Casie Aviles: I had to like use the SDK, so
166 00:22:39.110 ⇒ 00:22:40.939 Casie Aviles: had to use LangChain for that.
167 00:22:41.370 ⇒ 00:22:44.050 Casie Aviles: So it’s not integrated with n8n either.
168 00:22:47.100 ⇒ 00:22:48.210 Uttam Kumaran: Yeah.
169 00:23:18.120 ⇒ 00:23:19.635 Uttam Kumaran: I don’t know.
170 00:24:02.050 ⇒ 00:24:08.280 Uttam Kumaran: I mean, how about I let, how about I give you guys, like, maybe we think about it until tomorrow, and let me know what we want to do.
171 00:24:08.450 ⇒ 00:24:09.660 Miguel de Veyra: Yeah, let’s, I think.
172 00:24:09.660 ⇒ 00:24:11.430 Uttam Kumaran: Because we have to make a decision.
173 00:24:11.430 ⇒ 00:24:13.999 Miguel de Veyra: Yeah, yeah, let’s sit on a spike, I guess.
174 00:24:26.130 ⇒ 00:24:30.220 Miguel de Veyra: But honestly, I think it could be a good idea if we could. You know.
175 00:24:30.220 ⇒ 00:24:31.020 Uttam Kumaran: Do.
176 00:24:31.290 ⇒ 00:24:37.590 Miguel de Veyra: To to do Vellum, because we have a client right now. So it’s not out of pocket, right?
177 00:24:38.470 ⇒ 00:24:39.459 Miguel de Veyra: So if we’re gonna.
178 00:24:39.460 ⇒ 00:24:40.130 Uttam Kumaran: Agree.
179 00:24:40.290 ⇒ 00:24:40.640 Casie Aviles: Yeah.
180 00:24:40.640 ⇒ 00:24:44.389 Casie Aviles: And yeah, everything we need will be in one place for that long.
181 00:24:44.750 ⇒ 00:24:45.700 Miguel de Veyra: Yeah.
182 00:24:46.270 ⇒ 00:24:50.959 Miguel de Veyra: I don’t know. I I kinda wanna cause it’s like 500 bucks a month. So I kinda wanna
183 00:24:51.400 ⇒ 00:24:55.369 Miguel de Veyra: I don’t know. I’ll I’ll watch some videos tonight on Youtube or whatever.
184 00:25:01.520 ⇒ 00:25:04.097 Uttam Kumaran: Yeah, I like, I don’t know. I think
185 00:25:08.750 ⇒ 00:25:09.739 Uttam Kumaran: I like
186 00:25:10.870 ⇒ 00:25:14.879 Uttam Kumaran: Vellum. I think they had a lot. I think you’re right that they probably don’t have all the.
187 00:25:15.410 ⇒ 00:25:17.940 Uttam Kumaran: They don’t have all the like integrations, but
188 00:25:19.580 ⇒ 00:25:24.269 Uttam Kumaran: they’re the only ones that I found that have like a lot of stuff in there, and can do.
189 00:25:24.270 ⇒ 00:25:24.759 Miguel de Veyra: Built in.
190 00:25:24.760 ⇒ 00:25:26.060 Uttam Kumaran: Can do everything.
191 00:25:27.960 ⇒ 00:25:29.180 Casie Aviles: Yeah, that’s going to be achieved.
192 00:25:29.180 ⇒ 00:25:30.050 Uttam Kumaran: You said they’re in.
193 00:25:30.400 ⇒ 00:25:32.880 Uttam Kumaran: Yeah, like, what do you think is the major trade off.
194 00:25:34.090 ⇒ 00:25:42.400 Casie Aviles: Yeah, like, I think it’s good that we have more granularity with development like everything is there like we already have that you don’t have to worry about
195 00:25:42.540 ⇒ 00:25:47.850 Casie Aviles: looking for evals because it’s there and prompt management. We don’t have to use like Google Docs anymore.
196 00:25:48.490 ⇒ 00:25:50.410 Casie Aviles: It’s just there. And then we could just pull it
197 00:25:50.910 ⇒ 00:25:53.629 Casie Aviles: from the repo and into the workflow.
198 00:25:53.900 ⇒ 00:25:55.180 Casie Aviles: So those are the good.
199 00:25:56.010 ⇒ 00:25:57.540 Casie Aviles: Yeah, like the the pro.
200 00:25:57.540 ⇒ 00:25:59.929 Miguel de Veyra: But I think it’s a double edged sword, no?
201 00:26:02.740 ⇒ 00:26:06.640 Casie Aviles: Hmm, yeah, yeah, that’s the trade off. Basically, I mean.
202 00:26:06.640 ⇒ 00:26:14.560 Uttam Kumaran: I mean, I don’t know. Like right now, I I don’t think n8n is, like, cause even for us, like I’m trying to, I’m trying to get a sense of
203 00:26:14.970 ⇒ 00:26:25.800 Uttam Kumaran: like, can we run evals for our own agents? Can we get data out? And that doesn’t seem really clear that we can do that on n8n. They have 850 a month, and they’re adding it soon. But it’s not there right now.
204 00:26:38.810 ⇒ 00:26:47.329 Uttam Kumaran: and like overall, like, we’re also going to be adding more people, Miguel, to the development process. Like, I don’t think, I think n8n is a little bit more complicated
205 00:26:49.170 ⇒ 00:26:51.889 Uttam Kumaran: like this is a good place to add people into.
206 00:26:52.130 ⇒ 00:26:52.910 Miguel de Veyra: Yeah.
207 00:26:58.890 ⇒ 00:27:02.140 Miguel de Veyra: why don’t they have like a hundred dollar plan? No. But yeah, I think.
208 00:27:02.670 ⇒ 00:27:06.389 Uttam Kumaran: No, I mean, I actually don’t mind. The price is fair.
209 00:27:07.360 ⇒ 00:27:10.510 Uttam Kumaran: because right now we’re not. We’re not paying. We’re paying like 20 bucks a month.
210 00:27:11.224 ⇒ 00:27:17.719 Uttam Kumaran: I would say, don’t worry about the price. But even if we were to get the n8n thing, we still can’t do a lot of stuff.
211 00:27:19.310 ⇒ 00:27:20.030 Miguel de Veyra: Yeah.
212 00:27:20.470 ⇒ 00:27:25.120 Miguel de Veyra: And until they at least release those, you know, promised stuff.
213 00:27:25.460 ⇒ 00:27:28.679 Uttam Kumaran: And like, Arize is the only other one
214 00:27:28.800 ⇒ 00:27:31.220 Uttam Kumaran: like, I don’t know. Maybe we can do.
215 00:27:32.590 ⇒ 00:27:38.270 Uttam Kumaran: I don’t know how we can integrate. Yeah, I think these are probably just like different ones, you know. But
216 00:27:41.080 ⇒ 00:27:45.669 Uttam Kumaran: the thing is, it doesn’t seem like, in Arize, you can build agents.
217 00:27:45.670 ⇒ 00:27:46.279 Miguel de Veyra: Yeah, yeah.
218 00:27:46.280 ⇒ 00:27:46.660 Casie Aviles: Yes.
219 00:27:46.660 ⇒ 00:27:48.879 Miguel de Veyra: It’s just an observability platform.
220 00:27:51.060 ⇒ 00:27:52.140 Uttam Kumaran: So.
221 00:27:53.300 ⇒ 00:27:54.770 Casie Aviles: Yeah, they still limit. 3.
222 00:28:04.144 ⇒ 00:28:09.009 Miguel de Veyra: This might be a stupid idea. I don’t know, Casie. Let’s discuss it later.
223 00:28:09.370 ⇒ 00:28:10.030 Casie Aviles: Sure.
224 00:28:10.340 ⇒ 00:28:11.930 Miguel de Veyra: But I have like a I don’t know
225 00:28:13.070 ⇒ 00:28:17.159 Miguel de Veyra: cheesy idea. I don’t know. I’ll think about it more before I propose it.
226 00:28:25.020 ⇒ 00:28:28.509 Uttam Kumaran: Okay, so do you wanna guys, you wanna let me know tomorrow.
227 00:28:28.510 ⇒ 00:28:33.579 Miguel de Veyra: Yeah, yeah, we’ll discuss today, and then I’ll let you know tomorrow.
228 00:28:34.630 ⇒ 00:28:35.220 Uttam Kumaran: Okay.
229 00:28:36.820 ⇒ 00:28:43.490 Casie Aviles: Yeah, cause, I mean, the best I could think of are like, you know, good solutions to the logging and observability, like.
230 00:28:44.390 ⇒ 00:28:45.250 Casie Aviles: yeah.
231 00:28:45.250 ⇒ 00:28:49.740 Miguel de Veyra: Yeah. Well, then, can you add to that ticket like, what? What data we wanna
232 00:28:50.260 ⇒ 00:28:54.010 Miguel de Veyra: see except for what was the data you mentioned?
233 00:28:54.200 ⇒ 00:28:55.960 Miguel de Veyra: Semantic similarity?
234 00:29:23.400 ⇒ 00:29:24.490 Miguel de Veyra: Okay, yeah.
235 00:29:25.460 ⇒ 00:29:26.060 Uttam Kumaran: Okay.
236 00:29:39.300 ⇒ 00:29:43.550 Uttam Kumaran: okay, great. So I think that’s the step one.
237 00:29:46.220 ⇒ 00:29:49.060 Uttam Kumaran: How do you, do you want to talk about ABC stuff?
238 00:29:49.500 ⇒ 00:29:50.430 Miguel de Veyra: No.
239 00:29:50.680 ⇒ 00:29:53.230 Miguel de Veyra: Are they gonna start this week or not?
240 00:29:53.474 ⇒ 00:30:08.630 Uttam Kumaran: No, but I sort of want to have. I wanna go through what you said. And then I’m chatting with Scott today to see, like what his involvement is is gonna be. And then, basically, I’m gonna send them all the contract stuff later today to sign. So we’ll we’ll most likely start either end of this week or early next week.
241 00:30:08.630 ⇒ 00:30:12.140 Miguel de Veyra: Monday. Yeah. Oh, yeah. Monday, Monday. I’m not gonna be here, too.
242 00:30:12.773 ⇒ 00:30:15.560 Miguel de Veyra: Yeah. I think we could discuss it. We still have time.
243 00:30:17.630 ⇒ 00:30:18.020 Uttam Kumaran: Okay.
244 00:30:34.146 ⇒ 00:30:35.480 Miguel de Veyra: Stop sharing.
245 00:30:35.480 ⇒ 00:30:38.140 Uttam Kumaran: Do you want? Oh, yeah, I thought, I don’t know if you want to share your Miro.
246 00:30:38.856 ⇒ 00:30:39.549 Miguel de Veyra: Okay, sure.
247 00:30:40.340 ⇒ 00:30:42.310 Miguel de Veyra: Oh, it was my Miro.
248 00:30:47.490 ⇒ 00:30:48.490 Miguel de Veyra: Okay, here.
249 00:30:52.690 ⇒ 00:30:58.160 Miguel de Veyra: So I I think honestly, the phase one is pretty simple on the deliverable side.
250 00:30:58.340 ⇒ 00:31:00.710 Miguel de Veyra: So it’s just a knowledge. It’s a knowledge base. Yeah.
251 00:31:01.800 ⇒ 00:31:08.820 Miguel de Veyra: And then I guess this will change, depending on, you know, if we stick with our current setup, or if we go to vellum.
252 00:31:12.190 ⇒ 00:31:16.160 Miguel de Veyra: because I’m pretty sure we don’t want to upload their files into Vellum, right?
253 00:31:19.216 ⇒ 00:31:26.269 Uttam Kumaran: I’m not 100% sure. But yeah, we’ll need to think about where we ingest it, we can ingest into Supabase. Yeah.
254 00:31:26.690 ⇒ 00:31:30.549 Miguel de Veyra: Yeah, I think I I think Supabase is performing really well right now. So
255 00:31:30.880 ⇒ 00:31:33.759 Miguel de Veyra: I’ll just, I’ll just recommend we stay here.
256 00:31:34.190 ⇒ 00:31:43.180 Miguel de Veyra: And then this one we delayed. And then we want to know, of course, all the documents they have, like how many pages and stuff, because we might.
257 00:31:43.340 ⇒ 00:31:51.169 Miguel de Veyra: you know, need to create multiple knowledge bases, multiple tools, and then just redirect based on that.
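[Editor’s sketch] The "multiple knowledge bases, redirect based on that" idea above, in minimal form. Everything here is hypothetical (the function, the KB labels); a real build would likely route on embeddings or an LLM tool call rather than keyword overlap:

```python
"""Toy router: pick a knowledge base for a query (hypothetical names)."""


def route_query(query, kb_keywords):
    """Pick the knowledge base whose keyword list best overlaps the query.

    kb_keywords maps a KB name to a list of keywords; returns None when
    nothing matches, so the caller can fall back to a default KB.
    """
    words = set(query.lower().split())
    best_kb, best_hits = None, 0
    for kb, keywords in kb_keywords.items():
        hits = len(words & set(keywords))
        if hits > best_hits:
            best_kb, best_hits = kb, hits
    return best_kb
```

The selected KB name would then decide which retrieval tool the chatbot calls.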
258 00:31:52.150 ⇒ 00:31:56.160 Miguel de Veyra: And then I guess I just called it point blank.
259 00:31:56.410 ⇒ 00:32:03.689 Miguel de Veyra: for I’m not sure, because it didn’t say here that they have to interact with it, but I’m pretty sure they do right?
260 00:32:04.000 ⇒ 00:32:11.050 Miguel de Veyra: So this could be n8n again, or Vellum
261 00:32:14.440 ⇒ 00:32:20.759 Miguel de Veyra: tentative and then the other thing was, how will we build it for them? Because
262 00:32:22.410 ⇒ 00:32:28.389 Miguel de Veyra: right, do we need like to create a different UI, and then.
263 00:32:28.390 ⇒ 00:32:34.319 Uttam Kumaran: Yeah. So that’s that’s an open question is like, do they want this to be within
264 00:32:34.420 ⇒ 00:32:40.320 Uttam Kumaran: one of their systems? Or does this live outside? Basically.
265 00:32:41.050 ⇒ 00:32:44.560 Miguel de Veyra: I think either way we have to host it, cause
266 00:32:45.460 ⇒ 00:32:50.029 Miguel de Veyra: most probably the way they’ll use this, they will just iframe this into somewhere
267 00:32:53.400 ⇒ 00:32:55.229 Miguel de Veyra: so that their people can use it.
268 00:32:57.860 ⇒ 00:33:01.269 Miguel de Veyra: Or you know, they’ll ask us for something like this.
269 00:33:05.600 ⇒ 00:33:12.170 Miguel de Veyra: So yeah, I think this will be the result tomorrow, and then this one. I guess we could talk to them right
270 00:33:12.960 ⇒ 00:33:19.320 Miguel de Veyra: before signing. This one is a bit focused already. I mean point blank already.
271 00:33:21.450 ⇒ 00:33:23.950 Miguel de Veyra: But yeah, I think that’s pretty much it for this one.
272 00:33:24.520 ⇒ 00:33:27.992 Uttam Kumaran: So we so can we take some notes? So one
273 00:33:29.080 ⇒ 00:33:32.630 Uttam Kumaran: who do we want on our team to sort of work on this?
274 00:33:32.960 ⇒ 00:33:34.980 Uttam Kumaran: Primarily like
275 00:33:35.610 ⇒ 00:33:43.389 Uttam Kumaran: I’m I’m we don’t have a we don’t have like a project manager for this right now. So it may just be.
276 00:33:43.820 ⇒ 00:33:49.230 Uttam Kumaran: I mean, it may potentially just be you, Miguel, as like the the Pm. Running this with the client.
277 00:33:49.230 ⇒ 00:33:49.800 Miguel de Veyra: Yep.
278 00:33:53.090 ⇒ 00:33:53.660 Miguel de Veyra: Oh!
279 00:33:53.660 ⇒ 00:33:54.170 Uttam Kumaran: But.
280 00:33:55.690 ⇒ 00:34:00.809 Miguel de Veyra: Honestly, there was a part of me that wanted to throw this to John. I’m not sure.
281 00:34:01.880 ⇒ 00:34:04.490 Miguel de Veyra: because it’s fairly new, and it’s fairly simple.
282 00:34:04.630 ⇒ 00:34:06.630 Miguel de Veyra: It’s a way to get your feet wet.
283 00:34:08.469 ⇒ 00:34:15.540 Miguel de Veyra: What do you think? Cause if we throw this to Casie, then he has to stop on the the more complicated stuff he’s working on right now.
284 00:34:16.730 ⇒ 00:34:18.199 Uttam Kumaran: That’s fine. Okay?
285 00:34:19.730 ⇒ 00:34:20.770 Uttam Kumaran: So
286 00:34:21.010 ⇒ 00:34:26.570 Uttam Kumaran: how like, I guess the one thing to we need to, we just have a list of questions we need to answer from their side
287 00:34:26.770 ⇒ 00:34:32.130 Uttam Kumaran: like, do they want this to, do they want to host this, are they okay using cloud tools?
288 00:34:35.409 ⇒ 00:34:41.660 Uttam Kumaran: Second, is, we need we need to think about like, what are, what are the what’s the initial
289 00:34:42.234 ⇒ 00:34:45.059 Uttam Kumaran: set of documents that we want to go after?
290 00:34:55.510 ⇒ 00:34:59.009 Uttam Kumaran: And you kind of you get what the project is right overall.
291 00:34:59.010 ⇒ 00:35:00.040 Miguel de Veyra: Yeah, yeah, yeah.
292 00:35:01.060 ⇒ 00:35:03.560 Miguel de Veyra: I’m surprised by how simple phase one is, though.
293 00:35:05.900 ⇒ 00:35:07.609 Uttam Kumaran: Yeah, it’s just a month. So.
294 00:35:14.920 ⇒ 00:35:18.439 Miguel de Veyra: Oh, wow! My favorites are messed up.
295 00:35:18.810 ⇒ 00:35:20.850 Miguel de Veyra: And then what else do we need to?
296 00:35:22.910 ⇒ 00:35:25.879 Miguel de Veyra: Oh, yeah, I mean, I listed one here. Wait, let me check.
297 00:35:28.700 ⇒ 00:35:34.280 Miguel de Veyra: and then I know that’s probably it knowledge base. Oh, yeah.
298 00:35:36.920 ⇒ 00:35:39.190 Miguel de Veyra: this is just for us, for the team.
299 00:35:43.200 ⇒ 00:35:47.849 Casie Aviles: So are we just going to start with the with the RAG chatbot with them?
300 00:35:48.720 ⇒ 00:35:50.159 Miguel de Veyra: Yeah, that’s I think that would.
301 00:35:50.160 ⇒ 00:35:51.410 Miguel de Veyra: Yeah, I do.
302 00:35:52.190 ⇒ 00:36:03.980 Uttam Kumaran: I think we need, we basically need to think about, like, what, how do they, how are they gonna, how are they gonna interact with this, right? Like, do they want to interact with a website, or do they want to interact with this in a chat environment?
303 00:36:04.140 ⇒ 00:36:05.789 Uttam Kumaran: That’s another question. Like
304 00:36:05.970 ⇒ 00:36:10.410 Uttam Kumaran: they previously said, they may be interested in interacting with us in in Google Chat, for example.
305 00:36:12.060 ⇒ 00:36:13.370 Miguel de Veyra: And yeah.
306 00:36:14.010 ⇒ 00:36:16.940 Miguel de Veyra: And then we want to know who’s the main users. I guess.
307 00:36:25.750 ⇒ 00:36:30.070 Uttam Kumaran: Yeah, like, who is gonna be the primary stakeholder for for feedback.
308 00:36:32.080 ⇒ 00:36:37.250 Miguel de Veyra: And then platform platform. They’ll use it on.
309 00:36:41.360 ⇒ 00:36:43.580 Miguel de Veyra: I guess this is a bit connected. Now.
310 00:36:59.010 ⇒ 00:37:06.610 Miguel de Veyra: yeah, I think, who’s gonna be my main contact. Here is it gonna be Steven or.
311 00:37:07.467 ⇒ 00:37:12.079 Uttam Kumaran: No, it’s gonna be. That’s also what we’re gonna determine on our 1st kickoff call.
312 00:37:12.260 ⇒ 00:37:13.079 Uttam Kumaran: Okay, who’s gonna be their
313 00:37:13.080 ⇒ 00:37:14.430 Uttam Kumaran: primary contact.
314 00:37:15.140 ⇒ 00:37:18.660 Miguel de Veyra: This one is more on the team side. So I’ll just change this to blue.
315 00:37:20.240 ⇒ 00:37:21.380 Miguel de Veyra: Yeah.
316 00:37:30.350 ⇒ 00:37:31.140 Miguel de Veyra: okay.
317 00:37:40.870 ⇒ 00:37:44.180 Miguel de Veyra: We probably have to do daily — what’s, like, the —
318 00:37:44.890 ⇒ 00:37:47.109 Miguel de Veyra: how does Robert put it — like, the
319 00:37:48.670 ⇒ 00:37:56.120 Miguel de Veyra: amount of communication? Because I’m pretty sure we’re gonna have a Slack channel with them. Do we need to send them daily updates and stuff like that?
320 00:37:56.580 ⇒ 00:38:00.549 Uttam Kumaran: Yeah, I think we’ll have something around updates, like, we’ll establish a cadence for updates.
321 00:38:01.160 ⇒ 00:38:02.170 Miguel de Veyra: Okay, okay.
322 00:38:06.290 ⇒ 00:38:09.020 Miguel de Veyra: Yeah, I think that’s it. Casie, do you have anything to add?
323 00:38:10.620 ⇒ 00:38:15.880 Casie Aviles: No, not really. I think the initial questions are also the ones that I had in mind.
324 00:38:20.080 ⇒ 00:38:23.899 Miguel de Veyra: I think the big thing here with them is, what if they wanted to create, like, an app now?
325 00:38:26.290 ⇒ 00:38:30.720 Uttam Kumaran: Oh, I don’t think it’ll be an app. I think we just need to think about, also, like —
326 00:38:31.540 ⇒ 00:38:35.290 Uttam Kumaran: these are all documents that currently live in, like, Google Drive. So
327 00:38:35.640 ⇒ 00:38:39.549 Uttam Kumaran: where do they want — like, do we need to do this in Supabase, or
328 00:38:43.070 ⇒ 00:38:47.309 Uttam Kumaran: do they keep using Google Drive, for example, and we just build a Google Drive integration?
329 00:38:47.310 ⇒ 00:38:49.060 Miguel de Veyra: Yeah, with the one we had before.
330 00:38:53.590 ⇒ 00:38:56.039 Miguel de Veyra: And yeah, I think that’s another question
331 00:38:56.920 ⇒ 00:39:02.629 Miguel de Veyra: that we need to ask them just here, is it?
332 00:39:03.570 ⇒ 00:39:06.130 Miguel de Veyra: does the knowledge base have to be dynamic?
333 00:39:19.540 ⇒ 00:39:24.240 Miguel de Veyra: Or is it gonna be like, “Hey, here’s a hundred documents, this is what I want it to be trained on”?
334 00:39:35.930 ⇒ 00:39:37.829 Miguel de Veyra: opera spelling. Probably.
335 00:39:58.900 ⇒ 00:39:59.800 Miguel de Veyra: Okay.
336 00:40:00.630 ⇒ 00:40:02.500 Miguel de Veyra: What about you, Uttam? Any more questions?
337 00:40:07.540 ⇒ 00:40:10.739 Uttam Kumaran: Hmm, okay, yeah, I think this is
338 00:40:10.860 ⇒ 00:40:15.210 Uttam Kumaran: enough. I mean, like, how do you feel about PMing this,
339 00:40:15.450 ⇒ 00:40:19.510 Uttam Kumaran: Miguel — like, just sort of meeting with the client and moving things forward?
340 00:40:21.360 ⇒ 00:40:29.120 Uttam Kumaran: and then basically working with Jana to execute. I mean, I’ll still be on every call. But how do you feel about like taking that role.
341 00:40:29.510 ⇒ 00:40:33.349 Miguel de Veyra: Oh, yeah, sure, although, can you guide me like for the 1st few times? Because.
342 00:40:33.350 ⇒ 00:40:34.420 Uttam Kumaran: Yeah, yeah.
343 00:40:34.420 ⇒ 00:40:35.030 Miguel de Veyra: Yeah.
344 00:40:35.330 ⇒ 00:40:35.910 Uttam Kumaran: I think the big.
345 00:40:35.910 ⇒ 00:40:36.250 Miguel de Veyra: It’s been.
346 00:40:36.250 ⇒ 00:40:47.430 Uttam Kumaran: The biggest thing is just, like, sitting with the client. And I’m gonna see whether, like — if we bring in Televera, I’ll probably need to bring on one more PM to sort of help work through these.
347 00:40:47.660 ⇒ 00:40:49.999 Miguel de Veyra: All the planning related to this, so.
348 00:40:55.650 ⇒ 00:40:56.280 Miguel de Veyra: Then.
349 00:40:56.280 ⇒ 00:41:02.589 Uttam Kumaran: Overall, again, the biggest thing is just getting the tickets in, then executing on them, and then delivering for feedback.
350 00:41:02.840 ⇒ 00:41:03.400 Miguel de Veyra: Yeah.
351 00:41:06.740 ⇒ 00:41:08.219 Miguel de Veyra: Yeah. That should be fine.
352 00:41:11.840 ⇒ 00:41:12.630 Uttam Kumaran: Okay.
353 00:41:12.630 ⇒ 00:41:14.738 Miguel de Veyra: I was really excited over the weekend.
354 00:41:16.060 ⇒ 00:41:18.959 Uttam Kumaran: No, I’m glad I’m excited. I want this to sort of finish up.
355 00:41:19.090 ⇒ 00:41:29.859 Uttam Kumaran: So I’m also gonna meet with Scott. He wants to be involved in some way, but I’m not really interested in him being involved on, like, the execution side, because he doesn’t sound like a developer.
356 00:41:30.566 ⇒ 00:41:31.279 Uttam Kumaran: But I.
357 00:41:31.280 ⇒ 00:41:33.159 Miguel de Veyra: Yeah. But he didn’t even give feedback.
358 00:41:34.270 ⇒ 00:41:46.550 Uttam Kumaran: But he may be helpful on the PM side. But he’s not an employee, so I don’t know. I want to figure out — I’m gonna meet with him in an hour and sort of figure out what the process is, how he wants to do this. Yeah.
359 00:41:50.230 ⇒ 00:41:52.063 Miguel de Veyra: Maybe he can invest now.
360 00:41:54.010 ⇒ 00:41:55.869 Uttam Kumaran: Yeah, I just think like
361 00:41:56.070 ⇒ 00:42:04.290 Uttam Kumaran: I mean, this is kind of a tangent, but when you feel stuff on the sales side, better just to ask questions about what we’re doing, because
362 00:42:04.430 ⇒ 00:42:09.550 Uttam Kumaran: whenever we bring another person in, it’s not always a good thing —
363 00:42:11.200 ⇒ 00:42:21.309 Uttam Kumaran: like, the fewer people we can do things with, the better. So I’m actually not very interested in his involvement from the AI engineering side. He’s not an AI engineer.
364 00:42:21.310 ⇒ 00:42:21.640 Miguel de Veyra: Yeah.
365 00:42:22.057 ⇒ 00:42:26.550 Uttam Kumaran: Like he’s. He’s helpful for bringing us the client, but like.
366 00:42:26.550 ⇒ 00:42:27.870 Miguel de Veyra: He’s a connect.
367 00:42:28.900 ⇒ 00:42:32.139 Uttam Kumaran: Yeah, like he can’t do the engineering. So what do we need him there for?
368 00:42:35.000 ⇒ 00:42:40.240 Uttam Kumaran: So that’s the sort of thing I don’t wanna — the more people that get added, the slower things get.
369 00:42:40.460 ⇒ 00:42:41.219 Miguel de Veyra: Yeah, cause there’s
370 00:42:42.710 ⇒ 00:42:54.379 Uttam Kumaran: Yeah, that’s just life. So I wanna be very conscious of not adding people, you know. That’s what I said even last year — the more people that get added,
371 00:42:55.100 ⇒ 00:42:59.290 Uttam Kumaran: the less there is to go around, you know. So
372 00:42:59.500 ⇒ 00:43:08.619 Uttam Kumaran: it’s not necessarily a positive thing when we have to do it. And ideally, again, we want to be multiplying in that —
373 00:43:09.040 ⇒ 00:43:11.900 Uttam Kumaran: like, if you look at the revenue of the company
374 00:43:12.280 ⇒ 00:43:16.329 Uttam Kumaran: the ratio of revenue to people should still be really high. Right? That’s our goal.
375 00:43:16.540 ⇒ 00:43:18.020 Uttam Kumaran: That’s the way we balance that.
376 00:43:18.020 ⇒ 00:43:19.759 Miguel de Veyra: Yeah. Something like Steam, no?
377 00:43:21.560 ⇒ 00:43:22.480 Uttam Kumaran: What do you mean?
378 00:43:22.610 ⇒ 00:43:25.929 Miguel de Veyra: Average salary for Steam employees is 800,000.
379 00:43:26.820 ⇒ 00:43:30.650 Miguel de Veyra: Oh, yeah, I mean, like, well, average salary.
380 00:43:31.350 ⇒ 00:43:32.100 Miguel de Veyra: Yeah.
381 00:43:32.330 ⇒ 00:43:32.909 Miguel de Veyra: Cause we have.
382 00:43:32.910 ⇒ 00:43:34.050 Uttam Kumaran: Revenue per employee.
383 00:43:34.050 ⇒ 00:43:39.030 Miguel de Veyra: No, no, no — salary per year. Cause they have, like, 40 or so employees, I don’t know.
384 00:43:39.770 ⇒ 00:43:41.310 Uttam Kumaran: Oh, damn! That’s crazy!
385 00:43:41.310 ⇒ 00:43:42.319 Casie Aviles: Oh, like valve.
386 00:43:42.660 ⇒ 00:43:46.310 Miguel de Veyra: Yeah, no, yeah. Yeah. Steam. I’m not sure. I just saw it somewhere.
387 00:43:46.310 ⇒ 00:43:47.130 Uttam Kumaran: Valve.
388 00:43:47.340 ⇒ 00:43:48.000 Miguel de Veyra: Yeah.
389 00:43:50.740 ⇒ 00:43:51.989 Uttam Kumaran: We’ll get there.
390 00:43:53.890 ⇒ 00:43:57.849 Miguel de Veyra: And yeah, I think that’s pretty much it for that. And then.
391 00:43:58.350 ⇒ 00:44:01.550 Uttam Kumaran: The one thing I wanna discuss is that guy seems.
392 00:44:01.740 ⇒ 00:44:03.199 Miguel de Veyra: Seems really good, no?
393 00:44:04.020 ⇒ 00:44:04.590 Uttam Kumaran: Who.
394 00:44:04.590 ⇒ 00:44:06.100 Miguel de Veyra: Anton? Was it Anton?
395 00:44:07.310 ⇒ 00:44:13.299 Uttam Kumaran: Yeah, I’m gonna ask him whether he’s interested in sort of working on this in particular — this project.
396 00:44:13.300 ⇒ 00:44:13.820 Miguel de Veyra: Yeah.
397 00:44:13.820 ⇒ 00:44:21.109 Uttam Kumaran: So I’m gonna call him on Wednesday, and I’ll see. If you want to join, I’ll add you — we can both meet him.
398 00:44:21.420 ⇒ 00:44:22.110 Miguel de Veyra: Yeah.
399 00:44:22.720 ⇒ 00:44:28.729 Miguel de Veyra: Yeah, just in case I can’t go. And there’s one question, because he has, like, a lot of machine learning stuff:
400 00:44:28.920 ⇒ 00:44:32.800 Miguel de Veyra: does he know how to actually train models?
401 00:44:33.600 ⇒ 00:44:36.070 Miguel de Veyra: I think that’s the thing.
402 00:44:36.970 ⇒ 00:44:38.640 Uttam Kumaran: Yeah. I can ask him.
403 00:44:38.640 ⇒ 00:44:47.040 Miguel de Veyra: Cause I tried it in OpenAI — like, you can do it, but not with an Assistant. You can’t use an Assistant. Me and Casie tried.
404 00:44:48.110 ⇒ 00:44:53.470 Miguel de Veyra: But I think Azure has it. But my God, Azure is so complicated — like, the UI.
405 00:44:53.470 ⇒ 00:44:59.789 Uttam Kumaran: Hmm, yeah. Azure is very complicated. Okay, I’ll see — I’ll ask him.
406 00:45:00.080 ⇒ 00:45:03.610 Miguel de Veyra: Because if he knows that, then yeah, that’s a game changer for us.
407 00:45:04.040 ⇒ 00:45:04.990 Uttam Kumaran: Yeah.
408 00:45:04.990 ⇒ 00:45:10.160 Miguel de Veyra: Because we don’t even need knowledge bases. We can train the AI on actual data.
409 00:45:10.970 ⇒ 00:45:12.210 Uttam Kumaran: Yeah, I agree.
410 00:45:15.640 ⇒ 00:45:24.009 Uttam Kumaran: Okay, cool. So I’m gonna schedule that for Wednesday, and I’ll add you. And then, yeah, the 2 big things: I want to finish this decision on platform
411 00:45:24.010 ⇒ 00:45:24.889 Miguel de Veyra: Platform, yeah.
412 00:45:24.890 ⇒ 00:45:27.810 Uttam Kumaran: In the next 2 days, ideally, tomorrow.
413 00:45:30.070 ⇒ 00:45:32.060 Uttam Kumaran: And then I want to
414 00:45:32.330 ⇒ 00:45:37.680 Uttam Kumaran: basically run towards our OKR, which I think Casie will primarily take. And then, yeah, I basically want
415 00:45:38.197 ⇒ 00:45:43.482 Uttam Kumaran: Miguel, you and Jana to just try to own ABC entirely. And then —
416 00:45:44.460 ⇒ 00:45:45.160 Miguel de Veyra: Okay.
417 00:45:45.160 ⇒ 00:45:46.569 Uttam Kumaran: Yeah, we’ll make it happen for them.
418 00:45:47.250 ⇒ 00:45:48.130 Miguel de Veyra: Okay. Yeah.
419 00:45:49.650 ⇒ 00:45:51.290 Uttam Kumaran: Okay. Cool any other questions.
420 00:45:54.130 ⇒ 00:45:55.890 Miguel de Veyra: No, I think that’s pretty much it.
421 00:45:56.070 ⇒ 00:45:57.360 Casie Aviles: Yeah. No.
422 00:45:58.730 ⇒ 00:45:59.350 Uttam Kumaran: Thanks guys.
423 00:45:59.350 ⇒ 00:46:00.030 Miguel de Veyra: Thanks, guys. Have a —
424 00:46:00.030 ⇒ 00:46:01.080 Uttam Kumaran: Back in Slack.
425 00:46:01.460 ⇒ 00:46:01.849 Casie Aviles: Thank you.