Meeting Title: ABC | backlog grooming Date: 2025-04-03 Meeting participants: Amber Lin, Miguel De Veyra, Casie Aviles
WEBVTT
1 00:00:31.380 ⇒ 00:00:34.739 Amber Lin: I’ll ping him with the link.
2 00:00:35.650 ⇒ 00:00:39.689 Amber Lin: How’s everything going with the internal AI stuff?
3 00:00:42.084 ⇒ 00:00:50.249 Casie Aviles: For the internal AI stuff: yesterday I was just on a call with Miguel, and we were figuring out the tickets, and
4 00:00:50.250 ⇒ 00:00:52.019 Miguel de Veyra: Wait. Which meeting is this?
5 00:00:52.150 ⇒ 00:00:54.839 Miguel de Veyra: Okay? I’m getting confused
6 00:00:55.380 ⇒ 00:01:02.260 Amber Lin: Yeah, I know, I know. I talked a while with Janice. Apparently her son’s going into the Air Force.
7 00:01:03.650 ⇒ 00:01:04.739 Amber Lin: Yeah.
8 00:01:05.090 ⇒ 00:01:16.510 Amber Lin: And apparently her son wants to do cybersecurity and computer engineering. He’s very, very shy, but has decided to go into the Air Force. So that was very interesting.
9 00:01:18.330 ⇒ 00:01:19.940 Miguel de Veyra: He likes complicated stuff.
10 00:01:20.965 ⇒ 00:01:23.145 Miguel de Veyra: He likes complicated stuff.
11 00:01:23.690 ⇒ 00:01:24.990 Amber Lin: Probably, yeah.
12 00:01:25.878 ⇒ 00:01:28.030 Miguel de Veyra: Utam is gonna be joining this call right?
13 00:01:28.600 ⇒ 00:01:33.159 Amber Lin: Ideally, ideally, I don’t see him here, but
14 00:01:36.040 ⇒ 00:01:37.389 Miguel de Veyra: I don’t think so. He’s gonna be
15 00:01:37.390 ⇒ 00:01:38.339 Amber Lin: I don’t think so.
16 00:01:38.340 ⇒ 00:01:45.530 Miguel de Veyra: Sorry, just a different question, Amber. Do we have an AI team grooming that I’m not invited to?
17 00:01:45.920 ⇒ 00:01:46.310 Amber Lin: No
18 00:01:46.310 ⇒ 00:01:46.950 Miguel de Veyra: Probably not helpful.
19 00:01:47.630 ⇒ 00:01:52.409 Amber Lin: Because we kind of groomed it this Monday. I can schedule it for us. When do we want it?
20 00:01:54.061 ⇒ 00:01:59.680 Miguel de Veyra: No, no, I was just asking, because I remember last week we had one Wednesday.
21 00:02:00.330 ⇒ 00:02:05.770 Miguel de Veyra: I don’t know. On Thursday, the one with Utam and the team, like, basically the Xx and the VMs
22 00:02:07.875 ⇒ 00:02:08.900 Amber Lin: Huh!
23 00:02:08.900 ⇒ 00:02:11.280 Miguel de Veyra: What was that meeting called with? Let me get the mic
24 00:02:11.280 ⇒ 00:02:13.460 Amber Lin: Oh, that’s the allocation meeting right
25 00:02:13.460 ⇒ 00:02:15.579 Miguel de Veyra: Yeah, yeah, I thought that was weekly
26 00:02:16.310 ⇒ 00:02:25.820 Amber Lin: It kind of is. It is on Thursday. I think it’s mostly a PM thing.
27 00:02:26.520 ⇒ 00:02:27.509 Miguel de Veyra: Oh, okay. Okay.
28 00:02:27.847 ⇒ 00:02:37.640 Amber Lin: We already decided on that, so we’re just gonna keep everyone on track. Every Thursday we’ll check in. So it’s more of a PM meeting.
29 00:02:37.640 ⇒ 00:02:38.880 Miguel de Veyra: Okay, okay, makes sense.
30 00:02:38.880 ⇒ 00:02:47.650 Amber Lin: Yeah, for the AI roadmap, I think we could do grooming, say, maybe Mondays.
31 00:02:47.980 ⇒ 00:02:48.610 Amber Lin: We can’t
32 00:02:48.610 ⇒ 00:02:48.980 Miguel de Veyra: And
33 00:02:48.980 ⇒ 00:02:51.910 Amber Lin: You can, I’ll check when everybody’s free.
34 00:02:53.020 ⇒ 00:02:57.710 Amber Lin: Yeah. But you guys were talking about the AI team tickets. I’m just catching up on that.
35 00:02:58.480 ⇒ 00:03:04.270 Miguel de Veyra: Okay, yeah. So how should we do this? But it’s only for ABC number
36 00:03:06.120 ⇒ 00:03:11.150 Amber Lin: Yeah, just run through it really quick. I’m just on the AI team tickets.
37 00:03:12.258 ⇒ 00:03:16.920 Amber Lin: How are the sources going? Do they still need escalation?
38 00:03:16.920 ⇒ 00:03:19.040 Miguel de Veyra: Yeah, we basically need.
39 00:03:19.280 ⇒ 00:03:26.448 Miguel de Veyra: we basically need Patrick there, because he’s the only one who has experience with dlt Hub; that’s the one Utam wants to use.
40 00:03:27.430 ⇒ 00:03:33.190 Miguel de Veyra: We already have a working version; it’s just not using the one that Utam does
41 00:03:33.700 ⇒ 00:03:40.689 Miguel de Veyra: want to use. That’s why we’re trying to reach out to Patrick, to see when he can be available to speak with him.
42 00:03:41.170 ⇒ 00:03:46.829 Amber Lin: I see, I see. Is there something else? Because that’s for the Zoom meetings, right?
43 00:03:47.460 ⇒ 00:03:50.449 Miguel de Veyra: Yeah. But Utam wants to use dlt for everything
44 00:03:50.850 ⇒ 00:03:52.960 Amber Lin: Oh, I see so
45 00:03:52.960 ⇒ 00:03:56.509 Miguel de Veyra: And that’s like, basically that blocks the entire agent stuff
46 00:03:58.910 ⇒ 00:04:01.939 Miguel de Veyra: Unless there’s stuff you discussed with
47 00:04:02.390 ⇒ 00:04:06.890 Miguel de Veyra: Robert that we need and can work on right now, we’re pretty much blocked.
48 00:04:06.890 ⇒ 00:04:07.559 Amber Lin: Okay. I see.
49 00:04:07.560 ⇒ 00:04:07.920 Miguel de Veyra: I’m doing
50 00:04:07.920 ⇒ 00:04:14.309 Amber Lin: So can we ask Utam also on that, to see if there’s anything he wants to do?
51 00:04:14.667 ⇒ 00:04:17.879 Miguel de Veyra: Yeah, I’ve tagged him, basically because he wanted to
52 00:04:18.089 ⇒ 00:04:21.430 Miguel de Veyra: basically separate it into different tickets, which we did.
53 00:04:21.430 ⇒ 00:04:22.000 Amber Lin: Hmm.
54 00:04:22.420 ⇒ 00:04:25.139 Miguel de Veyra: And then we’re currently running a spike for dlt.
55 00:04:26.080 ⇒ 00:04:32.540 Amber Lin: Can I put the spike in escalation? Like, can I move some tickets into escalation?
56 00:04:32.540 ⇒ 00:04:35.169 Miguel de Veyra: Oh, yeah, I think it should. It could be. Yes.
57 00:04:35.390 ⇒ 00:04:38.700 Amber Lin: Okay, I’ll move the spike one there. What about the meeting?
58 00:04:38.820 ⇒ 00:04:42.939 Amber Lin: Existing videos? Mostly, I think I edited that.
59 00:04:46.081 ⇒ 00:04:47.649 Miguel de Veyra: Sorry. Can you share your screen?
60 00:04:47.980 ⇒ 00:04:54.599 Amber Lin: Yeah, I will. Sorry share screen here. What about this one?
61 00:04:55.650 ⇒ 00:05:04.770 Amber Lin: I think I edited this title, so I don’t even know what I was talking about. Is this something that we’re still working on? Like, what is this?
62 00:05:07.100 ⇒ 00:05:08.439 Casie Aviles: Oh, yeah, I did.
63 00:05:08.660 ⇒ 00:05:11.760 Casie Aviles: I did add the to-do there. So
64 00:05:11.940 ⇒ 00:05:13.410 Amber Lin: Oh, wait! Sorry!
65 00:05:13.770 ⇒ 00:05:15.760 Miguel de Veyra: And so but in progress
66 00:05:18.545 ⇒ 00:05:20.739 Casie Aviles: Yeah, I did put it in.
67 00:05:21.060 ⇒ 00:05:34.709 Casie Aviles: Sorry. I mean, I guess the tickets are a little confusing. But what I basically did there: the 1st part was just, you know, getting the upcoming meetings from Zoom to S3.
68 00:05:35.200 ⇒ 00:05:36.530 Casie Aviles: But then
69 00:05:36.630 ⇒ 00:05:44.229 Casie Aviles: that is using the Windmill implementation that we have, and apparently we want to switch to dlt Hub.
70 00:05:45.050 ⇒ 00:05:45.519 Casie Aviles: Oh, I see
71 00:05:45.520 ⇒ 00:05:53.200 Miguel de Veyra: So yeah, I have a suggestion here, actually, Casie, for the existing ones. I don’t think we actually need to use it.
72 00:05:53.380 ⇒ 00:06:01.410 Miguel de Veyra: I don’t think it matters what we use, really, for the existing videos. Right? So while we’re waiting, can we start moving those? Because there’s a lot of them, right?
73 00:06:01.410 ⇒ 00:06:08.439 Casie Aviles: Yes, exactly. That’s the second part of the task, which, yeah, you can go click on the ticket.
74 00:06:09.010 ⇒ 00:06:25.500 Casie Aviles: I kind of did this because I can see 2 things that we need to work on. So the 1st one is, yeah, the one that I did, which is the upcoming meetings. And then the second one is bulk transfer, which is, we have a lot of existing recordings, right? And
75 00:06:25.850 ⇒ 00:06:35.010 Casie Aviles: now it’s a different problem, because the thing is, I have to figure out how to transfer all of it to dlt Hub, and
76 00:06:35.800 ⇒ 00:06:42.540 Casie Aviles: like, I won’t be doing that manually, since there’s like hundreds of recordings, so I’ll have to figure out some
77 00:06:43.390 ⇒ 00:06:46.430 Miguel de Veyra: Are we still gonna use dlt for this? Or no, I don’t think we need to, right?
78 00:06:47.290 ⇒ 00:06:54.110 Casie Aviles: That’s the thing. I’m not sure how we could use dlt Hub here for this use case. So
79 00:06:54.110 ⇒ 00:06:56.909 Miguel de Veyra: Yeah for existing. I don’t think we need to use dlt
80 00:06:57.510 ⇒ 00:06:59.980 Casie Aviles: I mean at the top of my head. Hey? Sorry.
81 00:07:01.110 ⇒ 00:07:10.790 Miguel de Veyra: Yeah, go ahead. Because I think it’s a waste of time, because we’re gonna get blocked again. Ideally, we wanna work on this while we’re blocked. So we’re not, you know, completely
82 00:07:11.470 ⇒ 00:07:12.340 Miguel de Veyra: cost
83 00:07:12.900 ⇒ 00:07:13.460 Amber Lin: Oh!
84 00:07:13.460 ⇒ 00:07:23.879 Casie Aviles: Yes. So the idea I have is to just use, I guess, Python, just another custom script to transfer everything.
85 00:07:24.070 ⇒ 00:07:25.760 Miguel de Veyra: Yeah, existing ones.
86 00:07:26.520 ⇒ 00:07:27.200 Amber Lin: Okay?
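The custom script Casie describes could look something like this minimal sketch. The Zoom listing, download, and S3 upload calls are stubbed as injected functions, since no actual API details, credentials, or bucket layout were specified on the call; the loop structure is the point.

```python
def bulk_transfer(list_recordings, download, upload, dest_prefix="zoom-recordings/"):
    """Copy every existing recording to object storage, one by one.

    list_recordings() -> iterable of dicts with 'id' and 'file_name'
                         (in practice, pages of the Zoom recordings API)
    download(rec)     -> bytes of one recording file
    upload(key, data) -> write data under key (in practice, an S3 put)
    Returns the list of destination keys written, for progress logging.
    """
    written = []
    for rec in list_recordings():
        key = f"{dest_prefix}{rec['id']}/{rec['file_name']}"
        upload(key, download(rec))
        written.append(key)
    return written
```

With hundreds of recordings, a real version would likely add retries and resume-from-last-key, but the plain loop matches the "just another custom script" idea.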
87 00:07:27.340 ⇒ 00:07:35.420 Amber Lin: Sure. I just made a ticket, so we can... Well, since this one is blocked right now because we want to use dlt, we can work on the other one. So I just said
88 00:07:35.420 ⇒ 00:07:39.280 Miguel de Veyra: We have a ticket for that already. Utam didn’t want to use sub-tickets anymore.
89 00:07:43.210 ⇒ 00:07:49.260 Miguel de Veyra: I created some tickets under, like, the Zoom data sources. And he told me basically not to use it.
90 00:07:50.450 ⇒ 00:07:51.740 Amber Lin: So interesting.
91 00:07:53.318 ⇒ 00:07:54.631 Miguel de Veyra: I’m confused, too.
92 00:07:55.760 ⇒ 00:08:01.220 Amber Lin: Yeah, let me confirm that with him, because sometimes we just kind of need sub-tickets.
93 00:08:01.720 ⇒ 00:08:03.720 Miguel de Veyra: But maybe you go to
94 00:08:03.720 ⇒ 00:08:05.350 Amber Lin: In a project
95 00:08:05.350 ⇒ 00:08:10.890 Miguel de Veyra: Yeah. So if you go to, I think, In Review, like the tickets for review,
96 00:08:11.220 ⇒ 00:08:12.569 Miguel de Veyra: and scroll to the left?
97 00:08:12.860 ⇒ 00:08:19.240 Miguel de Veyra: So, data sources. These 3 tickets under data sources are all under Zoom, but they’re not sub-issues.
98 00:08:20.030 ⇒ 00:08:27.559 Miguel de Veyra: Because I tried making them sub-issues, but, you know, they’re pretty big tickets on their own. So they shouldn’t be sub-issues.
99 00:08:28.890 ⇒ 00:08:30.470 Amber Lin: Hmm, okay.
100 00:08:30.470 ⇒ 00:08:37.450 Miguel de Veyra: That’s why I just tagged them, you know, Zoom, extract data using dlt Hub. So that’s blocked right now.
101 00:08:37.450 ⇒ 00:08:38.270 Amber Lin: Okay.
102 00:08:38.480 ⇒ 00:08:41.499 Miguel de Veyra: And then load assets. You know, Yada Yadda, whatever
103 00:08:41.500 ⇒ 00:08:47.100 Amber Lin: Yeah, I mean, we can make a project for even just Zoom and Slack together.
104 00:08:48.460 ⇒ 00:08:50.320 Amber Lin: But we could.
105 00:08:51.360 ⇒ 00:08:55.146 Amber Lin: Yeah, I mean, this, the bulk transfer?
106 00:08:56.690 ⇒ 00:09:02.819 Amber Lin: Oh, yeah, let me... We can move that to To Do. I think we can do that one
107 00:09:03.360 ⇒ 00:09:04.400 Amber Lin: right now.
108 00:09:05.275 ⇒ 00:09:11.070 Amber Lin: I’m working on these 2. Let me, guys, let me share the
109 00:09:11.890 ⇒ 00:09:16.820 Amber Lin: 2 discovery results. I worked on them last night.
110 00:09:16.970 ⇒ 00:09:21.780 Amber Lin: So I think we I want. But the thing is we’re
111 00:09:22.724 ⇒ 00:09:23.669 Miguel de Veyra: Thing.
112 00:09:24.150 ⇒ 00:09:24.810 Amber Lin: Huh!
113 00:09:25.960 ⇒ 00:09:28.670 Miguel de Veyra: Oh, it’s an invite. Okay, okay, I’m tripping
114 00:09:28.670 ⇒ 00:09:33.830 Amber Lin: Yeah, so I have. Oh, not that.
115 00:09:34.420 ⇒ 00:09:43.679 Amber Lin: Here, I have a roadmap for data and sales. The only thing is, we’re kind of also stuck
116 00:09:43.680 ⇒ 00:09:44.290 Miguel de Veyra: Are you speaking
117 00:09:44.290 ⇒ 00:09:47.420 Amber Lin: Until we, until we can... Can you see my
118 00:09:48.020 ⇒ 00:09:49.200 Miguel de Veyra: Casey, let me know
119 00:09:50.200 ⇒ 00:09:51.580 Casie Aviles: Yeah, yeah, we can hear you.
120 00:09:51.920 ⇒ 00:09:56.739 Amber Lin: Okay, so we’re kind of stuck here until Utam confirms
121 00:09:57.138 ⇒ 00:09:59.129 Miguel de Veyra: Can’t hear anything. Let me.
122 00:09:59.130 ⇒ 00:09:59.770 Miguel de Veyra: Miguel doesn’t.
123 00:09:59.770 ⇒ 00:10:00.579 Miguel de Veyra: So there you go
124 00:10:00.920 ⇒ 00:10:01.570 Amber Lin: Oh, okay.
125 00:10:01.720 ⇒ 00:10:03.570 Casie Aviles: I don’t think Miguel can hear us
126 00:10:03.570 ⇒ 00:10:06.050 Amber Lin: I see? That’s okay. That’s okay. We’ll wait for him.
127 00:10:09.920 ⇒ 00:10:16.760 Amber Lin: Yeah. So I’ll catch you up, Casie. Hello! You’re back.
128 00:10:17.610 ⇒ 00:10:18.470 Amber Lin: Can you hear us?
129 00:10:18.520 ⇒ 00:10:21.920 Miguel de Veyra: Thank you. So give me a moment
130 00:10:22.670 ⇒ 00:10:25.429 Amber Lin: Hmm, Casie, I’ll just catch you up.
131 00:10:26.113 ⇒ 00:10:39.599 Amber Lin: With the data team and the sales team. Those are the 2 things we’re focusing on for the internal clients, because the data team is where we have costs, and the sales team is where we get revenue. So we want to improve those before we do anything for other teams.
132 00:10:40.090 ⇒ 00:10:40.920 Casie Aviles: Okay.
133 00:10:41.290 ⇒ 00:10:48.260 Amber Lin: I think the sales team is a lot easier to
134 00:10:48.410 ⇒ 00:10:56.020 Amber Lin: do. But essentially, the data team, their pain point is a lot of firefighting, so a lot of ad hoc tasks.
135 00:10:56.360 ⇒ 00:10:58.080 Miguel de Veyra: Okay, there you go. Finally.
136 00:10:58.840 ⇒ 00:10:59.770 Casie Aviles: Can you hear us now?
137 00:10:59.770 ⇒ 00:11:00.920 Amber Lin: Can you hear us?
138 00:11:02.450 ⇒ 00:11:04.320 Miguel de Veyra: Nolly. Oh, what is this
139 00:11:06.270 ⇒ 00:11:07.810 Casie Aviles: Sorry, Miguel, can you hear us
140 00:11:07.810 ⇒ 00:11:09.410 Miguel de Veyra: Yeah, yeah. Now, I can hear you guys, sorry.
141 00:11:09.410 ⇒ 00:11:37.020 Amber Lin: Okay, great. So I was telling Casie just now: we had 2 discovery calls with these 2 main teams, and I made a roadmap for each based on their priorities, right? And I want to confirm with you guys, and I want to confirm with Utam before we do anything. But there are probably a few things that, foundationally, we can do regardless. So we’re looking at the data team: they have a lot of ad hoc firefighting,
142 00:11:37.742 ⇒ 00:11:47.220 Amber Lin: handing off is very hard because there’s not really any documentation. These are their common repetitive works, and these are their common bottlenecks
143 00:11:47.440 ⇒ 00:11:49.630 Amber Lin: and boundaries.
144 00:11:50.250 ⇒ 00:11:56.230 Amber Lin: And so I was thinking of the short term roadmap.
145 00:11:56.650 ⇒ 00:12:10.388 Amber Lin: This is something I’m emphasizing: having AI education. So I’m doing a separate plan for education, but it’s here as well. So, educating them on how to use the existing AI tools better,
146 00:12:11.230 ⇒ 00:12:20.200 Amber Lin: so essentially helping bugs and bad data not go unnoticed, so essentially flagging, and helping them create documentation,
147 00:12:22.340 ⇒ 00:12:33.750 Amber Lin: making sure nothing falls through the cracks. That’s kind of like the ticket creator, like from Zoom and Slack, just to check what kind of requests have come in.
148 00:12:34.060 ⇒ 00:12:37.700 Amber Lin: and then long term roadmap. We’ll skip that for now.
149 00:12:37.920 ⇒ 00:12:43.829 Amber Lin: and so I’ve ranked it based on easy wins to harder things. I think
150 00:12:43.990 ⇒ 00:12:59.900 Amber Lin: we already have kind of a request summarizer, right? Because they were talking about how their clients send them requests either ad hoc, just randomly through Slack, those smaller requests, or
151 00:12:59.900 ⇒ 00:13:00.330 Casie Aviles: They do?
152 00:13:00.330 ⇒ 00:13:18.510 Amber Lin: talk about it in Zoom, and it’s a bigger request. So these are the 2 main sources where requests come in, and sometimes things fall through the cracks because there’s a lot of things on their plate and they might lose track. So they were asking if we can have a
153 00:13:19.140 ⇒ 00:13:26.030 Amber Lin: summarizer to essentially put those tasks together.
154 00:13:26.730 ⇒ 00:13:35.049 Amber Lin: And that’s 1 thing I want to confirm with Utam, if this is something that’s really influential, because we only talked to Demolati.
155 00:13:35.950 ⇒ 00:13:42.740 Miguel de Veyra: Yeah. And then, did you get from Demolati the client we wanna work with? Because I think he has to also clarify with them, right?
156 00:13:42.999 ⇒ 00:13:46.370 Amber Lin: No, he doesn’t know. Demolati has been here for a month.
157 00:13:47.070 ⇒ 00:13:52.119 Amber Lin: So I think for the data team, we probably need to talk to a wage as well.
158 00:13:53.180 ⇒ 00:14:00.429 Amber Lin: because I don’t think Demolati’s view is completely representative of other people’s pains.
159 00:14:00.920 ⇒ 00:14:01.790 Miguel de Veyra: License the
160 00:14:01.790 ⇒ 00:14:06.839 Miguel de Veyra: so a quick note on this: the Slack and Zoom summarizer is kind of blocked by the dlt stuff.
161 00:14:07.100 ⇒ 00:14:08.230 Amber Lin: Hmm, okay.
162 00:14:09.090 ⇒ 00:14:10.210 Amber Lin: Sounds good
163 00:14:11.300 ⇒ 00:14:20.560 Amber Lin: And also, I was like, this could be in Linear. This probably doesn’t need to be that, but might be helpful for PMs.
164 00:14:23.290 ⇒ 00:14:27.960 Amber Lin: One more question about the data team.
165 00:14:30.010 ⇒ 00:14:34.319 Amber Lin: And this is just creating documentation.
166 00:14:35.030 ⇒ 00:14:52.160 Amber Lin: So, probably through Cursor, this is probably not going to be that hard. We can just write out a template, or find a good template that they can use. So I do think that’s not gonna be that hard.
167 00:14:52.680 ⇒ 00:14:57.739 Miguel de Veyra: Yeah, this is not really on development. It’s just more on education.
168 00:14:57.740 ⇒ 00:14:59.839 Amber Lin: Yeah, yeah, yeah, okay, more.
169 00:14:59.840 ⇒ 00:15:00.430 Casie Aviles: Dude.
170 00:15:00.580 ⇒ 00:15:03.240 Casie Aviles: So the data team aren’t using Cursor, is that it?
171 00:15:03.550 ⇒ 00:15:04.330 Miguel de Veyra: They are, they are.
172 00:15:04.330 ⇒ 00:15:04.690 Amber Lin: They are.
173 00:15:05.360 ⇒ 00:15:06.000 Casie Aviles: Oh, okay.
174 00:15:06.000 ⇒ 00:15:08.579 Amber Lin: I don’t know if they’re using it
175 00:15:09.080 ⇒ 00:15:19.869 Amber Lin: optimally. So I wanted to check, like, maybe we can give them templates, teach them what better prompting looks like. So that type of stuff.
176 00:15:20.130 ⇒ 00:15:21.039 Casie Aviles: I see you too.
177 00:15:21.040 ⇒ 00:15:21.929 Amber Lin: Yeah, and that
178 00:15:21.930 ⇒ 00:15:25.750 Miguel de Veyra: I think what they wanted more from, I think
179 00:15:27.070 ⇒ 00:15:34.460 Miguel de Veyra: Amber, you mentioned this on the call with them and Demolati: if we can have Cursor, what was it again, have the entire context.
180 00:15:35.170 ⇒ 00:15:42.150 Miguel de Veyra: Model Context Protocol, something like that. I think that could be the thing that’s very helpful for them.
181 00:15:42.150 ⇒ 00:15:47.090 Amber Lin: Okay, yeah, I think that’ll probably help
182 00:15:47.390 ⇒ 00:15:54.000 Amber Lin: do the documentation as well, because they’ll have access to more things.
183 00:15:55.490 ⇒ 00:16:06.489 Miguel de Veyra: Yeah, because looking at it, like, I use Cursor a lot too. When I code, I’ll never use templates, right, if I can just tell it what to do. So I would think, because
184 00:16:06.820 ⇒ 00:16:11.700 Miguel de Veyra: because this is one of the things that Utam says: it’s up to us to get them to use it.
185 00:16:11.700 ⇒ 00:16:12.030 Amber Lin: Yeah.
186 00:16:12.030 ⇒ 00:16:19.770 Miguel de Veyra: Imagine if you’re coding, and you have to go to a document in Notion, get a template, and paste something in. You’re not gonna do it. I’m not gonna.
187 00:16:20.215 ⇒ 00:16:24.670 Amber Lin: Oh, right, okay. So we should experiment with MCP ourselves.
188 00:16:24.930 ⇒ 00:16:28.209 Miguel de Veyra: Yes, yes, MCP would be the thing to do, I guess.
189 00:16:28.210 ⇒ 00:16:29.590 Amber Lin: Okay, awesome, awesome.
190 00:16:29.900 ⇒ 00:16:33.720 Amber Lin: But still, it’s an installation. So it will
191 00:16:33.720 ⇒ 00:16:34.310 Miguel de Veyra: Yes.
192 00:16:34.310 ⇒ 00:16:34.750 Amber Lin: Pretty easy.
193 00:16:35.032 ⇒ 00:16:36.160 Miguel de Veyra: I’ll check it out
194 00:16:36.160 ⇒ 00:16:42.440 Amber Lin: Okay? There’s also the checklist.
195 00:16:43.234 ⇒ 00:16:55.909 Amber Lin: You know, remember he said the analysts keep pinging them with, like, tasks that are not the AEs’ job. So this is also more operational. I don’t know how we’re gonna do this, but
196 00:16:55.910 ⇒ 00:17:02.829 Miguel de Veyra: Our analysts should know how to check before pinging, because I don’t know their workflows. That’s the thing. What are they pinging them for?
197 00:17:03.310 ⇒ 00:17:04.299 Miguel de Veyra: Did the model right
198 00:17:04.300 ⇒ 00:17:07.390 Amber Lin: Remember last time let me scroll up here.
199 00:17:08.680 ⇒ 00:17:13.269 Amber Lin: Here, right? So down the pipeline
200 00:17:13.760 ⇒ 00:17:23.369 Amber Lin: you create the data models, you put them in the warehouse, and the analysts take them from the warehouse to build visualizations, right?
201 00:17:23.550 ⇒ 00:17:30.639 Amber Lin: So I think it’s the separation here, and making the analysts
202 00:17:31.060 ⇒ 00:17:35.710 Amber Lin: check with themselves before they ask an AE, like
203 00:17:35.710 ⇒ 00:17:36.769 Miguel de Veyra: Oh, okay, so.
204 00:17:36.770 ⇒ 00:17:38.459 Amber Lin: But I don’t know how we’re gonna do that
205 00:17:38.720 ⇒ 00:17:45.620 Miguel de Veyra: Engineers create the models. Can we map this out somewhere, you know? So engineers create
206 00:17:46.690 ⇒ 00:17:50.620 Miguel de Veyra: engineers create the models. I guess that’s the term
207 00:17:51.300 ⇒ 00:17:52.090 Amber Lin: Hmm.
208 00:17:52.090 ⇒ 00:17:54.610 Miguel de Veyra: And then analysts create the dashboard
209 00:17:55.560 ⇒ 00:17:56.290 Amber Lin: Yeah.
210 00:17:56.550 ⇒ 00:17:57.290 Miguel de Veyra: Oh, okay. Okay.
211 00:17:57.290 ⇒ 00:18:04.290 Amber Lin: Probably start because you guys also work with drill now. So they probably start from Snowflake.
212 00:18:04.430 ⇒ 00:18:08.139 Amber Lin: They don’t create the snowflake stuff. They take the snowflake stuff.
213 00:18:08.280 ⇒ 00:18:16.169 Amber Lin: and then they pick the tables they need, pick the rows, and they make it into a dashboard, probably in drill, most likely.
214 00:18:17.053 ⇒ 00:18:22.499 Miguel de Veyra: I think, because basically, we want
215 00:18:23.030 ⇒ 00:18:25.560 Miguel de Veyra: an intermediary between the 2 of them.
216 00:18:25.970 ⇒ 00:18:29.369 Miguel de Veyra: That’s basically an agent. I think that we could look into building
217 00:18:29.370 ⇒ 00:18:29.980 Amber Lin: Hmm.
218 00:18:30.500 ⇒ 00:18:34.309 Miguel de Veyra: Right. But I think we need to understand the process a bit more, because
219 00:18:34.450 ⇒ 00:18:36.519 Miguel de Veyra: I’m not that familiar with the entire data
220 00:18:36.520 ⇒ 00:18:37.790 Amber Lin: Totally, totally.
221 00:18:37.790 ⇒ 00:18:43.940 Miguel de Veyra: Maybe a PM would be better for this, because if we ask an AE, they’re gonna give the AE’s perspective.
222 00:18:44.330 ⇒ 00:18:45.560 Amber Lin: Huh, yeah.
223 00:18:45.912 ⇒ 00:18:49.439 Miguel de Veyra: Then they’re gonna get asked, yeah, they’re gonna give da
224 00:18:49.440 ⇒ 00:18:52.830 Amber Lin: Gosh, just back.
225 00:18:54.240 ⇒ 00:18:56.039 Amber Lin: Okay? Or Utam.
226 00:18:56.809 ⇒ 00:19:06.589 Amber Lin: Yeah. And I don’t know how urgent that is like, if it’s a quick fix, then we can do it, but if it’s not, we could probably just tell the analysts to check with themselves
227 00:19:06.920 ⇒ 00:19:07.830 Miguel de Veyra: Okay.
228 00:19:07.830 ⇒ 00:19:08.580 Amber Lin: Yeah.
229 00:19:09.661 ⇒ 00:19:14.539 Amber Lin: And then what is this? Blah blah blah?
230 00:19:14.690 ⇒ 00:19:22.660 Amber Lin: Oh, how to use ChatGPT better, or maybe how to use Cursor. But same thing, like AI education.
231 00:19:23.890 ⇒ 00:19:33.219 Amber Lin: Yeah, this is probably where it would be helpful to build, like, an agent, because basically, what we can do is: hey, here’s, like, context of good SQL code.
232 00:19:33.470 ⇒ 00:19:39.159 Miguel de Veyra: You know. Yeah, build a GPT agent, or
233 00:19:39.160 ⇒ 00:19:42.879 Amber Lin: It wouldn’t take a long time; we just feed it some context.
234 00:19:43.560 ⇒ 00:19:48.271 Miguel de Veyra: Yeah, yeah, technically, if we have same issue with the ABC
235 00:19:50.190 ⇒ 00:19:51.800 Miguel de Veyra: hard, if we had the data
236 00:19:52.978 ⇒ 00:20:08.561 Amber Lin: It’d be easy if we had the data. Well, yeah, I mean, a wish and knows what’s good code. So we’ll just have them feed us some code, and then we’ll feed it into the agent.
237 00:20:09.350 ⇒ 00:20:10.469 Miguel de Veyra: Technically, yeah, we can
238 00:20:10.470 ⇒ 00:20:16.190 Amber Lin: Okay, yeah. And the medium term, I was thinking.
239 00:20:17.231 ⇒ 00:20:23.810 Amber Lin: you know, this is the one where they wanted context on everything, and monitoring,
240 00:20:24.150 ⇒ 00:20:29.190 Amber Lin: like flagging different things, right? If things change
241 00:20:29.905 ⇒ 00:20:38.080 Amber Lin: alerting them if there’s bad data or whatever happened.
242 00:20:38.460 ⇒ 00:20:43.100 Amber Lin: I think that’s what I was trying to get at. I don’t know if this wording is very clear
243 00:20:46.250 ⇒ 00:20:51.250 Miguel de Veyra: So, yeah, I think this is the one we discussed with them a lot, where they want someone to
244 00:20:51.790 ⇒ 00:20:54.160 Miguel de Veyra: make a PR review, or QA the code.
245 00:20:55.420 ⇒ 00:20:55.940 Amber Lin: Yeah.
246 00:20:55.940 ⇒ 00:20:58.920 Miguel de Veyra: Something like that. Did you run this through Demolati?
247 00:20:59.680 ⇒ 00:21:11.212 Amber Lin: I think I understood him. I think for this one, they’re talking about: sometimes there’s errors or there’s bad data, but they don’t realize it. So, just like with
248 00:21:11.800 ⇒ 00:21:28.470 Amber Lin: well, parts, I think we weren’t checking the data for 10-something days. And there wasn’t data for, like, half a month, and we didn’t know that. And the client was like, I’m pissed, this dashboard is not working.
249 00:21:28.730 ⇒ 00:21:39.809 Amber Lin: and we need to catch those before the client does. So, something that automatically pings us if something’s not going right, like that.
250 00:21:40.160 ⇒ 00:21:43.880 Casie Aviles: Yeah, kind of the same problem with the data we had with ABC,
251 00:21:47.600 ⇒ 00:21:48.900 Casie Aviles: Take a lot of bad data
252 00:21:48.900 ⇒ 00:21:56.170 Miguel de Veyra: Because I think this is more of like a quality assurance job, right? Because it says there in the sentence, we don’t have tests
253 00:21:56.400 ⇒ 00:21:59.629 Miguel de Veyra: to catch it. Test cases, an SQA job.
254 00:22:00.030 ⇒ 00:22:07.210 Amber Lin: Hmm, okay, yeah. I guess he understood it better than I did. It said QA agent.
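The freshness check Amber describes, catching data that silently stopped arriving before the client notices, can be sketched in a few lines. The table list and the alerting hook are hypothetical placeholders; in practice the timestamps would come from a warehouse query and the result would ping a Slack channel.

```python
from datetime import datetime, timedelta

def find_stale_tables(latest_rows, max_age_days=2, now=None):
    """Flag tables whose newest row is older than max_age_days.

    latest_rows: dict mapping table name -> datetime of its newest row
                 (in practice, the result of a MAX(loaded_at) query).
    Returns the table names that should trigger an alert.
    """
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=max_age_days)
    return [table for table, newest in latest_rows.items() if newest < cutoff]
```

Run on a daily schedule, anything returned here is exactly the "we weren’t checking the data for 10-something days" case, caught on day two instead of day ten.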
255 00:22:13.280 ⇒ 00:22:14.450 Miguel de Veyra: One triage.
256 00:22:14.870 ⇒ 00:22:18.729 Amber Lin: Yeah, I bet this will take a little bit more time than whatever’s up there
257 00:22:18.990 ⇒ 00:22:19.600 Miguel de Veyra: Yep.
258 00:22:20.640 ⇒ 00:22:23.959 Amber Lin: And then smart requests.
259 00:22:26.250 ⇒ 00:22:29.459 Amber Lin: Oh, so this is like a linear bot, I suppose
260 00:22:32.260 ⇒ 00:22:37.379 Miguel de Veyra: I think this could be the thing that we we work on 1st to be honest.
261 00:22:38.000 ⇒ 00:22:46.560 Miguel de Veyra: things fall through. So, for example, inbound requests. So, for example, in channels, we can just monitor it on like a daily basis.
262 00:22:48.530 ⇒ 00:22:55.919 Miguel de Veyra: And then, you know, if the what requests are there, we just have an AI that process it and stuff like that, and basically create linear tickets
263 00:22:56.110 ⇒ 00:22:57.340 Amber Lin: So this is
264 00:22:57.500 ⇒ 00:23:05.010 Amber Lin: I don’t think at 1st we even need to create Linear tickets. We probably just need to text them in the channel, you know.
265 00:23:05.510 ⇒ 00:23:07.630 Miguel de Veyra: Yeah, like, the questions for today
266 00:23:07.790 ⇒ 00:23:14.070 Amber Lin: And then later on we add Linear, because I think these other things we also need to do. So probably have that
267 00:23:14.170 ⇒ 00:23:22.480 Amber Lin: chat-like message thing, and then we have these done, which is pretty quick, and then we add it to Linear, because we also have other clients.
268 00:23:22.900 ⇒ 00:23:27.679 Amber Lin: So I think if it works just bare-bones, we’ll just let it run for now.
269 00:23:29.030 ⇒ 00:23:32.490 Miguel de Veyra: Yeah, yeah, we can do it through Slack 1st. Slack to Slack.
270 00:23:33.030 ⇒ 00:23:39.470 Amber Lin: Yeah, yes, Slack. Maybe later Zoom to Slack, but Slack will be easier for now, I suppose.
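The bare-bones Slack-to-Slack version discussed here could be sketched like this: scan a channel’s messages once a day, pick out the ones that look like requests, and post a summary back to the same channel, with no Linear tickets yet. The fetch/post functions and the keyword heuristic are placeholders; the real version would call the Slack API and likely an LLM instead of keyword matching.

```python
# Naive hint words standing in for an AI classifier.
REQUEST_HINTS = ("can you", "could you", "please", "need", "request")

def looks_like_request(text):
    lowered = text.lower()
    return any(hint in lowered for hint in REQUEST_HINTS)

def daily_digest(fetch_messages, post_message):
    """Collect request-looking messages and post a digest back.

    fetch_messages() -> list of {'user': ..., 'text': ...} dicts
                        (in practice, a day's channel history)
    post_message(text) sends the digest text to the channel.
    Returns the messages that were flagged as requests.
    """
    requests = [m for m in fetch_messages() if looks_like_request(m["text"])]
    if requests:
        lines = [f"- {m['user']}: {m['text']}" for m in requests]
        post_message("Requests spotted today:\n" + "\n".join(lines))
    return requests
```

Swapping `post_message` for a ticket-creation call later is the "add Linear" step Amber mentions, without changing the rest of the loop.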
271 00:23:40.462 ⇒ 00:24:01.239 Amber Lin: That one, column metric lookup. Oh, yeah, this is on the documentation, because a lot of times they don’t have documentation, and a lot of times the engineers are onboarded to a project mid-project. And they’re like, what does this even mean? What does this column mean? Where did this come from?
272 00:24:01.480 ⇒ 00:24:04.990 Amber Lin: So what Patrick Peter suggested
273 00:24:05.430 ⇒ 00:24:09.499 Miguel de Veyra: Is it the one you did before? Is it called? Can it do this? The one for Javi
274 00:24:10.430 ⇒ 00:24:19.569 Casie Aviles: Ideally. That was the problem that we wanted to solve. But yeah, like I mentioned, it’s still not there yet, so
275 00:24:20.110 ⇒ 00:24:26.460 Casie Aviles: we still need to make some improvements on that. But at least we have, like, an initial idea of how we want to do it.
276 00:24:26.460 ⇒ 00:24:31.400 Amber Lin: Hmm, okay, let me comment 7, 1 second.
277 00:24:31.400 ⇒ 00:24:37.799 Casie Aviles: So honestly, it’s just a matter of, you know, good context. And yeah, how we manage that with the bot
278 00:24:43.970 ⇒ 00:24:46.180 Amber Lin: Yeah, and this is
279 00:24:46.720 ⇒ 00:24:48.990 Miguel de Veyra: This one’s probably we don’t. Wanna
280 00:24:48.990 ⇒ 00:24:51.879 Amber Lin: No, this is very long-term; probably not thinking about it.
281 00:24:51.880 ⇒ 00:24:52.490 Miguel de Veyra: Yeah, not yet.
282 00:24:52.828 ⇒ 00:24:58.590 Amber Lin: Context-aware. This takes a lot more work, I assume, because we need a lot more context.
283 00:24:58.910 ⇒ 00:25:03.320 Amber Lin: I don’t know. Or I think this is a little bit less effort, and
284 00:25:03.780 ⇒ 00:25:04.700 Miguel de Veyra: Yeah. It’s
285 00:25:06.050 ⇒ 00:25:07.680 Amber Lin: Yeah, this is not
286 00:25:07.680 ⇒ 00:25:08.330 Miguel de Veyra: Not yet
287 00:25:08.330 ⇒ 00:25:09.150 Amber Lin: Didn’t.
288 00:25:09.380 ⇒ 00:25:17.690 Miguel de Veyra: Summary table request, auto-document dbt models. Yeah, I wouldn’t spend time on the long-term, complex initiatives, right?
289 00:25:19.220 ⇒ 00:25:21.289 Miguel de Veyra: Maybe we could even move this to.
290 00:25:21.500 ⇒ 00:25:22.700 Miguel de Veyra: I don’t know. Q. 3.
291 00:25:22.920 ⇒ 00:25:24.729 Miguel de Veyra: There’s a lot of stuff that we need to.
292 00:25:24.730 ⇒ 00:25:29.040 Miguel de Veyra: Yeah, totally, this is definitely long term. I’m just gonna
293 00:25:32.510 ⇒ 00:25:35.400 Amber Lin: Here, there we go.
294 00:25:36.520 ⇒ 00:25:42.739 Amber Lin: And summarizing tables. Yeah, we went through that. Just some random metrics
295 00:25:42.900 ⇒ 00:25:46.209 Amber Lin: of how we define success. Right? How did we
296 00:25:46.420 ⇒ 00:25:53.880 Amber Lin: measure the AI enablement? How do we measure the data quality and the preservabilities?
297 00:25:54.860 ⇒ 00:26:01.319 Amber Lin: Just some ideas that I put there. But I do want to run you guys through the sales one that I did,
298 00:26:01.480 ⇒ 00:26:04.489 Amber Lin: So we can discuss them together.
299 00:26:04.990 ⇒ 00:26:11.860 Miguel de Veyra: And then how should we proceed with this? Should we add, like, 3 or 4 to 8 tickets to the backlog?
300 00:26:12.390 ⇒ 00:26:14.940 Amber Lin: Yeah, I I would
301 00:26:16.240 ⇒ 00:26:24.719 Amber Lin: Like, I can flesh them out. Maybe everything in the easy wins, we put them in the backlog immediately
302 00:26:25.462 ⇒ 00:26:30.150 Amber Lin: and get the tickets started, and maybe the medium ones we put in the backlog,
303 00:26:30.370 ⇒ 00:26:33.680 Amber Lin: like, I can just drop all of these into tickets.
304 00:26:34.360 ⇒ 00:26:37.800 Amber Lin: You know, the way I said it, you know, these,
305 00:26:38.110 ⇒ 00:26:41.049 Amber Lin: like, each one of them I make a ticket for,
306 00:26:42.750 ⇒ 00:26:46.400 Amber Lin: and then if they get bigger, we can convert them to a project. What do you think
307 00:26:49.950 ⇒ 00:26:52.960 Miguel de Veyra: Yeah, I think, yeah, I think can be done.
308 00:26:53.310 ⇒ 00:26:53.960 Amber Lin: Sure.
309 00:26:54.990 ⇒ 00:27:05.859 Amber Lin: Yeah, 'cause they also want faster results. So if we start with easy wins, we're gonna get some cool results without having to do that much work.
310 00:27:09.460 ⇒ 00:27:13.933 Amber Lin: so let me run you through sales. Sales is pretty straightforward.
311 00:27:14.560 ⇒ 00:27:15.620 Amber Lin: So
312 00:27:16.360 ⇒ 00:27:31.320 Amber Lin: Their key pains are pretty similar: leads are falling through the cracks, because Robert is a one-person sales team, so he can't do that much, and the inactive leads are not getting that much love.
313 00:27:32.420 ⇒ 00:27:41.280 Amber Lin: And then the other part is, say, manual CRM, blah blah, that's okay.
314 00:27:45.050 ⇒ 00:27:48.369 Amber Lin: I think that I’ll move that bit later.
315 00:27:49.390 ⇒ 00:27:50.660 Amber Lin: Oh, gosh.
316 00:27:58.460 ⇒ 00:28:00.250 Amber Lin: yeah, I think their
317 00:28:00.370 ⇒ 00:28:10.690 Amber Lin: priorities are these. I asked Robert, and he's very clear on this. He's like, before we do any quality improvement, we have to
318 00:28:11.443 ⇒ 00:28:20.766 Amber Lin: have the process and have nets in place to catch everything. So, an example of quality... oh, it's not here.
319 00:28:23.180 ⇒ 00:28:30.000 Amber Lin: so to show you guys what quality means: like, pipeline, these are quality,
320 00:28:30.840 ⇒ 00:28:55.840 Amber Lin: right? Scoring the leads, so enriching the quality of the data, giving the leads more information, scoring it, that's quality. Forecasting, predicting the likelihood of close, also adding more data to a certain lead, that's also quality. So what he says is, deal with this stuff later, we don't even have the foundation in place. Does that make sense?
321 00:28:56.170 ⇒ 00:28:56.840 Miguel de Veyra: Okay.
322 00:28:57.010 ⇒ 00:29:02.040 Amber Lin: Yeah. So to make sure, nothing falls through the cracks.
323 00:29:02.716 ⇒ 00:29:08.340 Amber Lin: Creating lead lists from events, that's kind of what John was working on, but it went
324 00:29:08.520 ⇒ 00:29:17.710 Amber Lin: very, very slow. It should be fast enough, like, at different events, LinkedIn interactions, that should become a lead list.
325 00:29:18.030 ⇒ 00:29:26.599 Amber Lin: And proposal pitching takes time. So those are the 3 main pain points, or priorities, that he said. Is that clear to you guys?
326 00:29:27.370 ⇒ 00:29:31.460 Miguel de Veyra: Sorry again, my God! So the top 3 priorities are
327 00:29:32.670 ⇒ 00:29:40.009 Miguel de Veyra: making sure nothing falls through the cracks; creating lead lists from events, so we can quickly convert that into a campaign;
328 00:29:40.240 ⇒ 00:29:46.319 Miguel de Veyra: proposal, pitching, taking time pitching to upward. And that, okay, okay, okay.
329 00:29:46.320 ⇒ 00:29:49.850 Amber Lin: Yeah. So reducing whatever it takes to
330 00:29:49.960 ⇒ 00:30:01.210 Amber Lin: make that faster, right? And maybe assets, I think that part is where our client bots come in, because he wants to create case studies, right?
331 00:30:02.540 ⇒ 00:30:04.560 Amber Lin: These from client
332 00:30:05.030 ⇒ 00:30:13.129 Amber Lin: bots, and for client bots we probably have enough data. But anyways, here's again the ranked,
333 00:30:13.380 ⇒ 00:30:16.700 Amber Lin: the ranked list of:
334 00:30:17.390 ⇒ 00:30:26.900 Amber Lin: a follow-up tracker, that's just in Notion, so it could be pretty fast. We look at this stuff, and we send a notification,
335 00:30:27.160 ⇒ 00:30:37.829 Amber Lin: like, the structure is there, everything is tagged. So I don't think we even need to train the agent. We just say, if you see this tag, check in at this frequency.
336 00:30:38.840 ⇒ 00:30:40.490 Amber Lin: So that’s 1 thing
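The tag-plus-frequency check Amber describes could be sketched like this; the tag names, cadences, and lead fields are all made-up stand-ins for whatever is actually tagged in the Notion workspace:

```python
from datetime import date, timedelta

# Hypothetical check-in cadence per tag, in days. The real tags and
# frequencies would come from the Notion lead tracker.
CHECKIN_DAYS = {"hot-lead": 3, "warm-lead": 7, "inactive": 30}

def leads_needing_followup(leads, today):
    """Return names of leads whose last contact is older than their tag's cadence."""
    due = []
    for lead in leads:
        cadence = CHECKIN_DAYS.get(lead["tag"])
        if cadence is None:
            continue  # untagged leads are skipped rather than alerted on
        if today - lead["last_contacted"] >= timedelta(days=cadence):
            due.append(lead["name"])
    return due
```

As described, no trained agent is needed: the logic is a pure lookup over tags that already exist in the data.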
337 00:30:41.400 ⇒ 00:30:43.219 Miguel de Veyra: Yeah, that’s pretty fast.
338 00:30:43.630 ⇒ 00:30:43.960 Amber Lin: Yeah.
339 00:30:43.960 ⇒ 00:30:45.580 Miguel de Veyra: Those AI label
340 00:30:46.910 ⇒ 00:30:57.460 Amber Lin: I think he already uses it. So maybe this is how, like, maybe how
341 00:30:57.900 ⇒ 00:31:05.199 Amber Lin: a sales AI, like, asset, my FAQs. But I'm also writing that already.
342 00:31:05.910 ⇒ 00:31:17.569 Amber Lin: Proposal asset library. Yeah, that one. So, creating from Client Hub bots. I tried this with our ABC bot the other day, it was pretty good. You guys want to see?
343 00:31:18.460 ⇒ 00:31:19.040 Miguel de Veyra: Hey!
344 00:31:19.040 ⇒ 00:31:22.179 Amber Lin: Let me share. It was, it was nice.
345 00:31:26.080 ⇒ 00:31:26.950 Amber Lin: Oh.
346 00:31:30.040 ⇒ 00:31:32.340 Amber Lin: AI agents.
347 00:31:32.760 ⇒ 00:31:39.700 Amber Lin: So the other day I said, I’m a company looking for da-da-da. Do this
348 00:31:39.990 ⇒ 00:31:42.950 Amber Lin: case study gave them a format.
349 00:31:43.310 ⇒ 00:31:58.500 Amber Lin: And then, yeah, it answered pretty well. So it was like summary, background information, problem statement, objectives,
350 00:31:59.130 ⇒ 00:32:03.379 Amber Lin: methodology, like, what do we, how do we do it,
351 00:32:03.540 ⇒ 00:32:11.329 Amber Lin: solutions, results, visuals, testimonials?
352 00:32:12.313 ⇒ 00:32:16.189 Amber Lin: That is kind of, I guess that’s true.
353 00:32:16.490 ⇒ 00:32:23.551 Amber Lin: And yeah, I don’t even know if I said that.
354 00:32:25.306 ⇒ 00:32:34.470 Amber Lin: Yeah. So if we have more of these bots, I guess case studies can be a lot easier, 'cause Robert right now probably doesn't...
355 00:32:34.690 ⇒ 00:32:40.459 Amber Lin: He has to manually put those contexts together into, say, maybe GPT.
356 00:32:40.460 ⇒ 00:32:46.160 Miguel de Veyra: Yeah, I think this is actually, this is part of the client agent like that project
357 00:32:46.340 ⇒ 00:32:46.770 Amber Lin: Yeah.
358 00:32:46.770 ⇒ 00:33:00.149 Miguel de Veyra: Like, if you go to Linear. 'Cause yeah, that's the thing, though, because in Linear it's all based on that client, because that's the main thing we actually want to build, like, the client agent. So the vision,
359 00:33:00.350 ⇒ 00:33:04.390 Miguel de Veyra: from when we last talked, is that there's gonna be a Brainforge agent.
360 00:33:04.540 ⇒ 00:33:14.740 Miguel de Veyra: Basically, that's our central agent. And then we're gonna have, for example, this ABC agent, there's gonna be a Jaffe agent. And then each client agent
361 00:33:14.860 ⇒ 00:33:19.869 Miguel de Veyra: not only has
362 00:33:21.070 ⇒ 00:33:37.279 Miguel de Veyra: context for, you know, the development, but also for sales, also for everything else, basically. So you can ask it anything about that. Hey, what's the use case, what's the point of Brainforge for this client? It can answer that. Or, for example, hey, where does this
363 00:33:38.500 ⇒ 00:33:43.179 Miguel de Veyra: column come from? It can also answer that. That's, like, the general idea.
364 00:33:43.180 ⇒ 00:34:07.789 Amber Lin: Right? I think completing that is a very big task, so it might take a long time. I think in between, we should drop some outputs for our clients here and there, so they're excited to work with us, instead of, say, having to wait 3 months. So that's the only reason why I put this there. Because I know we're working on it, so this will be faster.
365 00:34:11.067 ⇒ 00:34:23.500 Amber Lin: Medium difficulty: events to campaign. That's kind of what John was working on, so we do have some scraping stuff there. Drafting proposals,
366 00:34:23.639 ⇒ 00:34:24.130 Amber Lin: that
367 00:34:24.139 ⇒ 00:34:27.009 Miguel de Veyra: Yeah, we have something like that before
368 00:34:27.010 ⇒ 00:34:34.260 Amber Lin: Right. I don't think it's the greatest, 'cause Robert said he stopped using it because the agent wasn't
369 00:34:34.620 ⇒ 00:34:40.709 Amber Lin: that flexible, because it was using the same case studies of cocoa forever. So
370 00:34:40.930 ⇒ 00:34:41.150 Miguel de Veyra: Yeah.
371 00:34:41.159 ⇒ 00:34:44.149 Casie Aviles: That was for the lead research agent
372 00:34:44.150 ⇒ 00:34:45.139 Amber Lin: Yeah, yeah.
373 00:34:46.010 ⇒ 00:34:46.830 Casie Aviles: Yeah, so.
374 00:34:46.830 ⇒ 00:34:51.794 Miguel de Veyra: I don't think that was designed to draft proposals, 'cause we did, like, a
375 00:34:52.070 ⇒ 00:34:52.480 Amber Lin: Oh!
376 00:34:52.489 ⇒ 00:34:59.200 Miguel de Veyra: demo for a client before, HPI, which basically drafts proposals, and
377 00:34:59.360 ⇒ 00:35:04.449 Amber Lin: Then I don't think Robert even knows that.
378 00:35:04.450 ⇒ 00:35:08.740 Miguel de Veyra: I don't think that... HPI. It's Polygon
379 00:35:10.070 ⇒ 00:35:10.940 Amber Lin: Quality.
380 00:35:10.940 ⇒ 00:35:15.109 Miguel de Veyra: Oh, sorry, no, no, sorry. It's HPI, not B, sorry.
381 00:35:15.305 ⇒ 00:35:15.500 Amber Lin: I
382 00:35:15.500 ⇒ 00:35:16.330 Miguel de Veyra: Yeah, yeah.
383 00:35:16.330 ⇒ 00:35:17.030 Casie Aviles: Yes.
384 00:35:18.040 ⇒ 00:35:25.619 Miguel de Veyra: Though I don't think we pursued that client, because it didn't push through, basically.
385 00:35:26.600 ⇒ 00:35:34.359 Amber Lin: I see. Okay, we'll look at that. If that's already something there, it might be pretty fast. But I think he wants this.
386 00:35:34.780 ⇒ 00:35:38.189 Amber Lin: He wants this above anything else.
387 00:35:38.600 ⇒ 00:35:43.700 Amber Lin: He wants that. That could be later, that doesn't really matter for now.
388 00:35:44.000 ⇒ 00:35:47.180 Miguel de Veyra: Okay? Then, I guess we prioritize this ticket
389 00:35:47.320 ⇒ 00:35:59.549 Amber Lin: Yeah, and that's probably not even an agent. Maybe it can become like, you see it, and you draft an AI message to send to the person.
390 00:35:59.890 ⇒ 00:36:08.789 Amber Lin: but that’s something down the line. But right now there’s a lot of leads lying in the database that we don’t contact.
391 00:36:10.380 ⇒ 00:36:12.839 Amber Lin: So we want to get that
392 00:36:13.320 ⇒ 00:36:14.110 Miguel de Veyra: Okay.
393 00:36:14.110 ⇒ 00:36:14.590 Amber Lin: Right, so
394 00:36:14.590 ⇒ 00:36:16.779 Miguel de Veyra: Should we decide the backlogs now?
395 00:36:17.370 ⇒ 00:36:17.940 Amber Lin: Hmm.
396 00:36:18.160 ⇒ 00:36:21.700 Miguel de Veyra: Should we decide, like, what we will add to the backlog?
397 00:36:22.190 ⇒ 00:36:29.360 Amber Lin: I mean, I’ll add everything to the backlog, and we can pick which ones we want to move to ready for development right?
398 00:36:29.360 ⇒ 00:36:29.900 Miguel de Veyra: Okay. Okay.
399 00:36:29.900 ⇒ 00:36:30.750 Amber Lin: Yeah, they
400 00:36:30.750 ⇒ 00:36:32.399 Miguel de Veyra: How about for ABC, though?
401 00:36:33.660 ⇒ 00:36:39.240 Amber Lin: Good question. I think we’re both we’re done with is, yeah. What do you
402 00:36:39.240 ⇒ 00:36:41.719 Miguel de Veyra: Next cycle is gonna be next week, right
403 00:36:41.720 ⇒ 00:36:49.309 Amber Lin: Yeah, yeah. Sorry. Thank you for bringing that up. Let’s talk about ABC, so ABC, right now, we’re rolling things out.
404 00:36:49.560 ⇒ 00:36:51.280 Amber Lin: So it’s
405 00:36:52.170 ⇒ 00:37:17.339 Amber Lin: roll out to 5 CSRs. We're pretty much done with that this week, so just monitoring errors. And then I think sometime later next week we're gonna roll out to everybody. But it's the same process, so I don't think there's that much more in rollout. The main work will be the document update agent, which I know we sort of paused.
406 00:37:17.600 ⇒ 00:37:20.979 Amber Lin: So I guess we can work on the backlog for that. What do you think
407 00:37:21.788 ⇒ 00:37:31.539 Miguel de Veyra: Yeah, that one. And then the other thing that we want to add next cycle is, so, for example, there's errors. We want, basically, a bot that recommends.
408 00:37:32.210 ⇒ 00:37:37.720 Miguel de Veyra: It's gonna be part of the same workflow, but recommends what, you know, could be the fix for that error.
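The fix-recommendation step Miguel describes might slot into the error workflow roughly like this; the error codes and lookup table are illustrative stand-ins, and in practice the lookup would likely be an LLM call with the error and its surrounding context in the prompt:

```python
# Placeholder knowledge of known error patterns mapped to suggested fixes.
# These codes and messages are invented for illustration; the real workflow
# would draw on the CSR bot's actual error taxonomy.
KNOWN_FIXES = {
    "missing_field": "Check that the CSR form submitted all required fields.",
    "stale_doc": "Flag the section for the document update agent.",
}

def recommend_fix(error_code):
    """Return a suggested fix for a detected error, or a safe fallback."""
    return KNOWN_FIXES.get(error_code, "No known fix; route to a human reviewer.")
```

The point is that the recommendation is an extra step appended to the existing error-detection workflow, not a separate bot.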
409 00:37:37.890 ⇒ 00:37:40.290 Amber Lin: Oh, I see.
410 00:37:40.810 ⇒ 00:37:48.010 Amber Lin: Let me add that to the backlog. Right, let me share my screen.
411 00:37:48.530 ⇒ 00:37:49.300 Amber Lin: That’s been here
412 00:37:54.190 ⇒ 00:38:01.919 Amber Lin: Bot that suggests updates based on errors.
413 00:38:03.940 ⇒ 00:38:10.209 Amber Lin: Alright, this is the CSR bots. I'll put
414 00:38:11.670 ⇒ 00:38:13.690 Miguel de Veyra: I’ll put it there, whatever.
415 00:38:13.690 ⇒ 00:38:15.239 Miguel de Veyra: Yeah. Bot improvements.
416 00:38:16.540 ⇒ 00:38:17.920 Amber Lin: There’s that.
417 00:38:18.780 ⇒ 00:38:20.140 Amber Lin: And
418 00:38:20.450 ⇒ 00:38:29.770 Amber Lin: that's not that important. Oh, I mean, yeah. And also working on the data team, that's also something that's on our plate:
419 00:38:30.080 ⇒ 00:38:36.959 Amber Lin: to get all the call data and get it linked to the APIs.
420 00:38:37.430 ⇒ 00:38:40.230 Amber Lin: I think that's more of a one-time thing, like,
421 00:38:40.430 ⇒ 00:38:43.570 Amber Lin: as long as we do it once, we should be fine.
422 00:38:43.570 ⇒ 00:38:45.549 Miguel de Veyra: Isn't that the one that Annie's gonna work on?
423 00:38:45.570 ⇒ 00:38:48.279 Amber Lin: Yeah, Annie's gonna work on that. So
424 00:38:48.680 ⇒ 00:38:49.689 Miguel de Veyra: Okay, okay, so.
425 00:38:49.690 ⇒ 00:39:03.289 Amber Lin: We'll assign her to that. Right now we're letting her learn real and figure out the canvas, which I don't think is that hard, because I figured it out too. But I'm not the data analyst here, so she's gonna have to figure it out. I think she did.
426 00:39:04.170 ⇒ 00:39:04.850 Miguel de Veyra: Okay.
427 00:39:04.850 ⇒ 00:39:16.299 Amber Lin: Yeah. So we'll have the canvas, we'll have her link with Brian and get their data, put it in Snowflake. That's gonna be mostly her job.
428 00:39:16.490 ⇒ 00:39:24.890 Amber Lin: And for us, there’s not gonna do that. So whatever the remote deployment
429 00:39:25.340 ⇒ 00:39:30.680 Amber Lin: Update, yeah, for the update bot, let's talk about that. Where's the trainer bot?
430 00:39:32.440 ⇒ 00:39:33.310 Miguel de Veyra: Yeah.
431 00:39:33.310 ⇒ 00:39:36.400 Amber Lin: Add more tickets here, or some backlog items.
432 00:39:36.400 ⇒ 00:39:44.980 Miguel de Veyra: This one, the update bot, because it's technically done, the update-and-add-new-knowledge part. But that's the problem, though,
433 00:39:45.320 ⇒ 00:39:50.140 Miguel de Veyra: 'cause it's not gonna be on the central doc. There's no way we can do that there. It's just not
434 00:39:50.390 ⇒ 00:39:52.100 Miguel de Veyra: technically feasible.
435 00:39:53.060 ⇒ 00:39:54.269 Amber Lin: What do you mean?
436 00:39:55.630 ⇒ 00:40:02.260 Miguel de Veyra: 'Cause the way the update bot is designed is that it's gonna be on a database.
437 00:40:02.610 ⇒ 00:40:09.179 Miguel de Veyra: It's not on the Google Doc. So we have to create, basically, a UI,
438 00:40:10.120 ⇒ 00:40:12.830 Miguel de Veyra: because it has to be a record, right, in a database.
439 00:40:13.560 ⇒ 00:40:15.290 Miguel de Veyra: It can’t be.
440 00:40:16.080 ⇒ 00:40:20.360 Miguel de Veyra: It can't be in Google Docs, sorry,
441 00:40:22.430 ⇒ 00:40:24.719 Miguel de Veyra: there's just no way it can be in Google Docs.
442 00:40:25.210 ⇒ 00:40:30.375 Amber Lin: I see. I mean, they have to approve it, right? I don't think we're...
443 00:40:31.490 ⇒ 00:40:39.990 Amber Lin: If we suggest certain updates, how do we go from there to the Google Docs? It doesn't have to be in the same workflow. But is there a way to go from
444 00:40:40.330 ⇒ 00:40:43.360 Amber Lin: there to Google Docs eventually?
445 00:40:44.870 ⇒ 00:40:59.890 Miguel de Veyra: I mean, we can, basically. But if they change something on Google Docs, it's gonna be basically just for display. It's never gonna reflect on the actual bot, because the bot, instead of Google Docs, is now gonna use the database.
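What Miguel is describing is a one-way sync: the database is the source of truth, and the doc is regenerated purely for display, so edits made directly in the doc never flow back. A minimal sketch, with illustrative field names:

```python
def render_display_doc(records):
    """Rebuild the display-only doc text from the database records.

    Any manual edit made to the rendered doc is lost on the next render,
    which is why changes have to go through the update bot instead.
    The '===' separator mirrors the divider convention used in the doc.
    """
    return "\n===\n".join(r["content"] for r in records)
```

This is the sense in which the Google Doc becomes "just for display": it is a render target, never an input.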
446 00:41:04.320 ⇒ 00:41:11.809 Casie Aviles: But sorry I I believe there was a feature where the bot overrides the Central doc, though right
447 00:41:12.696 ⇒ 00:41:15.759 Miguel de Veyra: Yeah. But the thing is they they edit it on.
448 00:41:15.900 ⇒ 00:41:23.380 Miguel de Veyra: Sometimes they edit on Google Docs. But we don't read Google Docs. It's just updating it for display, basically.
449 00:41:25.410 ⇒ 00:41:26.660 Casie Aviles: Oh, I see! I see!
450 00:41:27.570 ⇒ 00:41:34.609 Miguel de Veyra: Right. We might even have to replace Google Docs, because technically we don't need it at all, if we create, like, a UI where they can see it.
451 00:41:37.930 ⇒ 00:41:45.080 Miguel de Veyra: So I'm not sure how we should proceed with that, because of the way it's built right now. Let me share my screen.
452 00:41:45.080 ⇒ 00:41:46.040 Amber Lin: Yeah, okay.
453 00:41:46.450 ⇒ 00:41:48.949 Miguel de Veyra: Anything. Let me just load up
454 00:41:50.330 ⇒ 00:41:57.070 Miguel de Veyra: 'cause we clarified this with, what's her name?
455 00:41:58.620 ⇒ 00:42:00.510 Amber Lin: Jenny saved, hosted.
456 00:42:00.510 ⇒ 00:42:02.429 Miguel de Veyra: Jenny. Sorry, Yvette, I forgot, but
457 00:42:02.430 ⇒ 00:42:05.956 Amber Lin: And then Grace is the one that might do updating as well.
458 00:42:06.250 ⇒ 00:42:08.129 Miguel de Veyra: Yeah, I think it was Yvette.
459 00:42:08.130 ⇒ 00:42:08.820 Amber Lin: Hmm.
460 00:42:08.820 ⇒ 00:42:14.799 Miguel de Veyra: So basically, they don't really need this update bot to be on Google Docs.
461 00:42:15.150 ⇒ 00:42:15.750 Amber Lin: Yeah.
462 00:42:16.500 ⇒ 00:42:19.530 Miguel de Veyra: Because the way this works is, if we go to Supabase.
463 00:42:22.460 ⇒ 00:42:28.720 Miguel de Veyra: let me pull up the central doc here, minimize it for now, rest in peace for now.
464 00:42:30.743 ⇒ 00:42:33.750 Miguel de Veyra: I think it’s ABC of here.
465 00:42:36.710 ⇒ 00:42:40.760 Miguel de Veyra: the real table, ABC general doc, I guess it's this one.
466 00:42:44.030 ⇒ 00:42:45.539 Miguel de Veyra: So yeah, here.
467 00:42:46.580 ⇒ 00:42:50.050 Miguel de Veyra: So basically, what we did is
468 00:42:50.330 ⇒ 00:43:03.809 Miguel de Veyra: this was way before, so this is in no way up to date. What Casey did was he put this stuff in here. So, for example, for this title, for this updating-the-program one,
469 00:43:04.520 ⇒ 00:43:11.730 Miguel de Veyra: we basically put this here. This is the divider, these 3 equal signs. And then this is a record in here.
470 00:43:11.920 ⇒ 00:43:17.870 Miguel de Veyra: This is the only way we can update it. Because if we, for example, talk to it... there's no way I can talk here,
471 00:43:18.070 ⇒ 00:43:21.570 Miguel de Veyra: but basically we’re gonna base it off here
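The divider scheme Miguel walks through, where the central-doc text is split on three equal signs and each section becomes one database record with its own ID, could look roughly like this (the column names are assumptions, not the actual Supabase schema):

```python
def split_into_records(doc_text):
    """Split central-doc text on the '===' divider into one record per section.

    Each section gets a stable ID; in the real setup this would be the
    row ID in the Supabase table, which is what the bot references.
    """
    sections = [s.strip() for s in doc_text.split("===") if s.strip()]
    return [{"id": i, "content": s} for i, s in enumerate(sections, start=1)]
```

This one-off parse is why the split had to be done manually: free-flowing doc text carries no IDs of its own, so the dividers are the only structure available.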
472 00:43:21.770 ⇒ 00:43:23.020 Amber Lin: So, hey?
473 00:43:23.160 ⇒ 00:43:26.110 Miguel de Veyra: You know which one I want to update this
474 00:43:27.290 ⇒ 00:43:28.880 Miguel de Veyra: We can only update here
475 00:43:29.090 ⇒ 00:43:30.050 Amber Lin: Yeah, okay.
476 00:43:30.490 ⇒ 00:43:35.540 Miguel de Veyra: So if they make changes here, it's not gonna reflect.
477 00:43:36.110 ⇒ 00:43:44.980 Amber Lin: How do we then connect the central doc to the Supabase thing? Can we refresh every, say, 24 hours?
478 00:43:46.854 ⇒ 00:43:49.929 Miguel de Veyra: We can't, because these are all text.
479 00:43:50.380 ⇒ 00:44:02.869 Miguel de Veyra: These are text, right? So there's no, how do you say it, there's no divider. There's no way for us to tell that this is for this, this goes here. There's no ID, basically.
480 00:44:03.240 ⇒ 00:44:04.810 Amber Lin: Oh!
481 00:44:04.810 ⇒ 00:44:07.679 Miguel de Veyra: So we had to create it like manually, for each
482 00:44:07.680 ⇒ 00:44:11.609 Amber Lin: Can't we edit the central doc to give it IDs?
483 00:44:13.191 ⇒ 00:44:15.050 Miguel de Veyra: Even then it won’t be.
484 00:44:17.020 ⇒ 00:44:24.700 Miguel de Veyra: I don't know, there's no point, because the IDs are here. Like, these are just... technically, this is just for display.
485 00:44:25.120 ⇒ 00:44:26.189 Amber Lin: Let’s see.
486 00:44:28.180 ⇒ 00:44:32.849 Miguel de Veyra: And I mean, in here, I think there's "clear entire doc, update with new info",
487 00:44:33.000 ⇒ 00:44:37.020 Miguel de Veyra: "get current doc", I think it's this one that I'm using. But
488 00:44:41.580 ⇒ 00:44:44.700 Miguel de Veyra: Yeah, so it's gonna look like this. So it's not good at all.
489 00:44:49.910 ⇒ 00:44:54.059 Miguel de Veyra: Like, yeah, this is 52 pages. As you can see, it's not up to date. Like, this is, I think,
490 00:44:54.290 ⇒ 00:44:55.590 Miguel de Veyra: 74
491 00:44:56.280 ⇒ 00:44:56.980 Amber Lin: Okay?
492 00:44:59.460 ⇒ 00:45:08.609 Amber Lin: I think my question is mostly, how do we make sure that Supabase is up to date with the Google Docs?
493 00:45:09.677 ⇒ 00:45:12.949 Miguel de Veyra: Have to do it again manually, basically
494 00:45:12.950 ⇒ 00:45:16.380 Amber Lin: Can that be a scheduled thing,
495 00:45:17.400 ⇒ 00:45:22.180 Amber Lin: like, can it be automated to do that, say, every 24 hours?
496 00:45:22.700 ⇒ 00:45:27.679 Amber Lin: 'Cause it'll be tedious, and there'll be a lot of errors, if we have to manually update it.
497 00:45:28.290 ⇒ 00:45:43.490 Miguel de Veyra: Yeah, that's the thing, though. That's why we're developing the agent. So basically, the doc itself will never be touched by a human, right? Everything that updates this is AI-augmented, basically.
498 00:45:43.490 ⇒ 00:45:44.050 Amber Lin: Good chat.
499 00:45:44.050 ⇒ 00:45:48.540 Miguel de Veyra: That's the idea. So technically, no one should ever touch the document.
500 00:45:48.540 ⇒ 00:45:56.989 Amber Lin: That's a good idea. So we suggest the update,
501 00:45:57.110 ⇒ 00:46:01.990 Amber Lin: and then Yvette, or Janice, needs to approve the update.
502 00:46:03.620 ⇒ 00:46:08.050 Amber Lin: So, and how do we plan to do that?
503 00:46:08.790 ⇒ 00:46:12.820 Miguel de Veyra: Yeah, that's where, like, the entire UI comes in, because,
504 00:46:13.040 ⇒ 00:46:19.079 Miguel de Veyra: ideally, what's gonna happen... this is where there's a lot of gray area, because I'm not sure how to proceed,
505 00:46:19.590 ⇒ 00:46:22.290 Miguel de Veyra: because this will take a lot of development time
506 00:46:22.820 ⇒ 00:46:23.770 Amber Lin: Hmm.
507 00:46:25.510 ⇒ 00:46:26.590 Miguel de Veyra: ABC. Home
508 00:46:28.270 ⇒ 00:46:31.340 Amber Lin: I mean right now they’re pretty happy with just
509 00:46:31.620 ⇒ 00:46:34.549 Amber Lin: find, like Control-F, and
510 00:46:34.750 ⇒ 00:46:35.360 Miguel de Veyra: Yeah.
511 00:46:35.360 ⇒ 00:46:38.339 Amber Lin: talk, and then editing is in the central doc.
512 00:46:38.760 ⇒ 00:46:51.919 Miguel de Veyra: And then the other good thing that will allow us: for example, we already gave a reply, right? What now we can do, because we have the IDs, is say, hey, if it lacks details, here's the ID, you can then read the entire thing.
513 00:46:52.450 ⇒ 00:46:56.200 Amber Lin: That's something that, right now, I don't think we can do, right, Casey?
514 00:46:57.170 ⇒ 00:46:57.969 Casie Aviles: Sorry, what
515 00:46:58.240 ⇒ 00:46:59.220 Miguel de Veyra: Like, for example.
516 00:46:59.220 ⇒ 00:46:59.950 Casie Aviles: Exactly.
517 00:47:00.110 ⇒ 00:47:14.160 Miguel de Veyra: So, for example, right now we have IDs, right? So, for example, this was found, this was the row we used. Now we can tell them, hey, you know, if you need more details, you can read the entire documentation here.
518 00:47:16.830 ⇒ 00:47:17.590 Casie Aviles: Okay.
519 00:47:18.390 ⇒ 00:47:21.669 Miguel de Veyra: Because right now we can't do that in the existing CSR bot.
520 00:47:25.136 ⇒ 00:47:31.060 Miguel de Veyra: But yeah, the thing now we need to do is, we need to have a UI,
521 00:47:32.700 ⇒ 00:47:36.060 Miguel de Veyra: but I'm not sure Utam wants to, because this is a shit ton of time.
522 00:47:36.890 ⇒ 00:47:40.620 Casie Aviles: Yeah, I mean, and there’s just we don’t have a lot of developers for that
523 00:47:40.620 ⇒ 00:47:44.859 Miguel de Veyra: Yeah. And then, technically, the only one who can really code. The Ui is me.
524 00:47:45.440 ⇒ 00:47:46.250 Casie Aviles: Yeah.
525 00:47:47.490 ⇒ 00:47:53.619 Amber Lin: Can it be... can we live without a UI? Like, can it live in Google Chat?
526 00:47:55.290 ⇒ 00:47:57.950 Miguel de Veyra: That has its own problems, but also
527 00:47:58.930 ⇒ 00:48:02.019 Casie Aviles: That was where I was blocked with integrating the Google Chat
528 00:48:02.020 ⇒ 00:48:06.299 Amber Lin: Oh, I see. So we probably don’t want it in Google Chat
529 00:48:06.560 ⇒ 00:48:10.390 Miguel de Veyra: And then, yeah, cause also, one thing that they want to do is, for example.
530 00:48:10.935 ⇒ 00:48:26.680 Miguel de Veyra: the bot suggests a reply, right? Here's the changes, here's the list of changes, for example, and then it goes yada yada yada. And then ideally, what we want to do is give them the chance to edit it manually, like, the AI output,
531 00:48:27.320 ⇒ 00:48:33.029 Miguel de Veyra: like, Hey, I wanna change this to instead of you know, maybe it’s like term cell.
532 00:48:33.290 ⇒ 00:48:34.130 Miguel de Veyra: I mean.
533 00:48:35.120 ⇒ 00:48:45.690 Miguel de Veyra: I don't know. This is, for example, they can go into the AI output, and then, you know, just correct it. And then, okay, this is good, then they click something, and then it updates it.
534 00:48:46.500 ⇒ 00:48:51.189 Amber Lin: Or maybe not there, right? Because this is also one of the bigger problems.
535 00:48:51.879 ⇒ 00:49:17.429 Miguel de Veyra: Once Shannon, or whoever, submits the request, hey, I'm changing this, then we can create, like, a list of changes or updates that were requested. And then there's gonna be, basically, a list of updates that only Janice or Yvette can see, and only once they approve does it actually change these.
536 00:49:17.888 ⇒ 00:49:25.390 Miguel de Veyra: So no one can actually just delete, you know, or edit stuff out. Like right now, if I delete this, no one knows, nothing.
537 00:49:25.390 ⇒ 00:49:29.939 Amber Lin: Yeah, great. I think that's the ideal workflow.
538 00:49:30.530 ⇒ 00:49:35.009 Miguel de Veyra: Then a list of updates, and then they approve,
539 00:49:37.420 ⇒ 00:49:42.420 Miguel de Veyra: and only then do we update,
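The approval flow being proposed (requests queue up, only Janice or Yvette can approve, and a record changes only on approval) might be sketched like this; the table and column names are hypothetical stand-ins for the Supabase schema, using an in-memory SQLite database for illustration:

```python
import sqlite3

# In-memory stand-in for the Supabase tables (names are hypothetical).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE doc_records (id INTEGER PRIMARY KEY, content TEXT)")
db.execute("""CREATE TABLE pending_updates (
    id INTEGER PRIMARY KEY, record_id INTEGER, new_content TEXT,
    requested_by TEXT, status TEXT DEFAULT 'pending')""")

APPROVERS = {"Janice", "Yvette"}  # only these people can approve updates

def request_update(record_id, new_content, requested_by):
    """Anyone can queue a change; nothing touches the doc record yet."""
    db.execute(
        "INSERT INTO pending_updates (record_id, new_content, requested_by) "
        "VALUES (?, ?, ?)",
        (record_id, new_content, requested_by),
    )

def approve_update(update_id, approver):
    """Only on approval by an authorized person does the record change."""
    if approver not in APPROVERS:
        raise PermissionError(f"{approver} cannot approve updates")
    row = db.execute(
        "SELECT record_id, new_content FROM pending_updates "
        "WHERE id = ? AND status = 'pending'",
        (update_id,),
    ).fetchone()
    if row is None:
        return False  # already handled, or no such request
    db.execute("UPDATE doc_records SET content = ? WHERE id = ?", (row[1], row[0]))
    db.execute("UPDATE pending_updates SET status = 'approved' WHERE id = ?",
               (update_id,))
    return True
```

The key property Miguel is after falls out of the structure: no direct writes to `doc_records` ever happen, so nobody can silently delete or edit content outside the approval queue.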
540 00:49:45.160 ⇒ 00:49:50.499 Miguel de Veyra: right? So this is the advantage of the UI, because I don't think this is possible via the chat. But,
541 00:49:50.750 ⇒ 00:49:55.199 Miguel de Veyra: like, the Google Chat, right, Casey? Yeah, I don't think it's possible. There's no way we can do it.
542 00:49:55.646 ⇒ 00:49:56.540 Casie Aviles: I mean
543 00:49:57.215 ⇒ 00:50:00.939 Amber Lin: There are different tools that can generate UI code.
544 00:50:01.420 ⇒ 00:50:04.289 Amber Lin: I mean, what was it, the Bolt stack?
545 00:50:04.290 ⇒ 00:50:04.800 Casie Aviles: Yeah, right.
546 00:50:04.800 ⇒ 00:50:10.430 Amber Lin: They do front end and back end.
547 00:50:10.930 ⇒ 00:50:13.939 Amber Lin: They generate that, like, we can ask
548 00:50:14.190 ⇒ 00:50:16.820 Amber Lin: if they can do it for us.
549 00:50:18.730 ⇒ 00:50:22.539 Casie Aviles: They're probably gonna charge more than what we charge ABC.
550 00:50:22.720 ⇒ 00:50:34.729 Amber Lin: Oh, I mean, they're kind of our clients. But I know we can try those tools. If we have a baseline of code, you can edit it. I don't know.
551 00:50:35.090 ⇒ 00:50:37.320 Amber Lin: I don't know that much about UI.
552 00:50:39.800 ⇒ 00:50:41.859 Miguel de Veyra: Yeah, this is, this is like the problem. Now.
553 00:50:41.860 ⇒ 00:50:44.319 Amber Lin: Front and back end development, right
554 00:50:44.320 ⇒ 00:50:45.230 Miguel de Veyra: Yeah, yeah.
555 00:50:45.390 ⇒ 00:50:46.030 Amber Lin: Okay.
556 00:50:46.030 ⇒ 00:50:52.160 Miguel de Veyra: And it's not actually clear with Janice or Yvette, like, how they want this update thing to work,
557 00:50:52.330 ⇒ 00:50:56.940 Miguel de Veyra: because if they want it through Google Chat, I wouldn't recommend it, basically.
558 00:50:56.940 ⇒ 00:51:02.489 Amber Lin: Yeah, I don’t think they care. It’s mostly what we tell them would work. They’re like, Oh, okay, we’ll do that
559 00:51:02.790 ⇒ 00:51:04.159 Miguel de Veyra: Okay. Yeah. Then.
560 00:51:04.160 ⇒ 00:51:14.249 Amber Lin: The idea is that it isn't going to be in Google Chat. I don't think they remember what we told them anymore, so it's okay. Whatever we decide on that works, we'll tell them.
561 00:51:14.410 ⇒ 00:51:18.850 Miguel de Veyra: Yeah. And then also, there's, like, this problem if they want it on Google Chat:
562 00:51:20.440 ⇒ 00:51:24.770 Miguel de Veyra: the updating takes a lot of time, because this workflow alone
563 00:51:24.920 ⇒ 00:51:35.630 Miguel de Veyra: has to do this, do that, and then, you know, there's a lot of steps that need to be done. But if we have a UI, we can actually skip these steps.
564 00:51:36.680 ⇒ 00:51:39.530 Miguel de Veyra: Oh, okay, I see
565 00:51:40.580 ⇒ 00:51:45.479 Amber Lin: Why don't we make a wireframe for the UI, like, run a spike?
566 00:51:45.690 ⇒ 00:51:47.620 Amber Lin: Run a spike, make a wireframe,
567 00:51:48.980 ⇒ 00:51:49.530 Miguel de Veyra: Oh!
568 00:51:49.530 ⇒ 00:51:53.690 Amber Lin: and then see what's needed to code it, and
569 00:51:55.660 ⇒ 00:51:56.500 Miguel de Veyra: Yeah, we could
570 00:51:56.767 ⇒ 00:51:58.639 Amber Lin: Until then, we don't have a UI, right?
571 00:51:58.890 ⇒ 00:51:59.820 Miguel de Veyra: Yeah.
572 00:52:02.300 ⇒ 00:52:04.330 Amber Lin: The process is pretty straightforward.
573 00:52:04.500 ⇒ 00:52:12.280 Miguel de Veyra: Yeah, ’cause I remember, I’m not sure, Casie, if Utam mentioned this before, but he doesn’t want to code it. That’s where I’m confused
574 00:52:13.170 ⇒ 00:52:14.530 Casie Aviles: Sorry doesn’t matter.
575 00:52:14.530 ⇒ 00:52:15.940 Miguel de Veyra: Not code a UI, because
576 00:52:15.940 ⇒ 00:52:16.390 Casie Aviles: Yeah.
577 00:52:16.390 ⇒ 00:52:17.660 Miguel de Veyra: Development, Time
578 00:52:18.320 ⇒ 00:52:18.930 Casie Aviles: Yeah.
579 00:52:19.436 ⇒ 00:52:20.130 Amber Lin: We can
580 00:52:20.130 ⇒ 00:52:20.640 Casie Aviles: Good.
581 00:52:21.070 ⇒ 00:52:27.299 Amber Lin: Okay, let’s make the ticket. We’ll confirm with him. He’ll say yes or no.
582 00:52:27.530 ⇒ 00:52:39.680 Miguel de Veyra: Yeah, I guess we can just create like a ticket for this now, the update agent, and then the only content there is: how should we proceed with this? Because it can’t be Google Docs, like, I’ve already explained that to them before.
583 00:52:40.660 ⇒ 00:52:46.519 Miguel de Veyra: So it has to be Supabase, because also, one of the advantages of this is we can use vector embeddings
584 00:52:47.230 ⇒ 00:52:48.230 Amber Lin: Oh!
585 00:52:49.920 ⇒ 00:52:56.690 Miguel de Veyra: And then I think I also even set it up so that, for example, if they update something, I already did the vectors, I think, yeah.
586 00:52:58.000 ⇒ 00:53:02.869 Miguel de Veyra: So it’s pretty much done, to be honest. It’s just, yeah.
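The Supabase setup Miguel describes, re-embedding the doc whenever something is updated, could look roughly like this sketch. The table shape (`doc_id`, `chunk_index`, `content`), the chunk size, and the stubbed-out embedding/upsert step are all assumptions, not the team’s actual code:

```python
# Sketch: prepare updated knowledge-base text for vector storage.
# Assumes a Supabase table (e.g. `kb_chunks`) with a pgvector column;
# the embedding call and the upsert itself are stubbed so only the
# chunking logic is shown.

def chunk_text(text: str, max_chars: int = 800) -> list[str]:
    """Split a document into paragraph-aligned chunks for embedding."""
    chunks, current = [], ""
    for para in text.split("\n\n"):
        if len(current) + len(para) > max_chars and current:
            chunks.append(current.strip())
            current = ""
        current += para + "\n\n"
    if current.strip():
        chunks.append(current.strip())
    return chunks

def rows_for_upsert(doc_id: str, text: str) -> list[dict]:
    """Build the rows that would be upserted into Supabase after an
    update. In production each row would also get an `embedding`
    value (e.g. from an embeddings API) before the upsert."""
    return [
        {"doc_id": doc_id, "chunk_index": i, "content": chunk}
        for i, chunk in enumerate(chunk_text(text))
    ]
```

In a real pipeline the missing step is filling the pgvector `embedding` column and calling the Supabase client’s upsert; the row layout above is just one plausible shape.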
587 00:53:04.060 ⇒ 00:53:06.220 Miguel de Veyra: How should we? How should we deploy it?
588 00:53:06.940 ⇒ 00:53:07.800 Miguel de Veyra: What should it look like
589 00:53:07.800 ⇒ 00:53:14.069 Amber Lin: So the the model itself is created. But we need something client facing.
590 00:53:14.870 ⇒ 00:53:18.170 Amber Lin: This is not gonna work for the client. The Supabase part
591 00:53:18.170 ⇒ 00:53:18.660 Miguel de Veyra: Yes.
592 00:53:18.660 ⇒ 00:53:21.479 Amber Lin: Google Chat apparently does not work.
593 00:53:23.340 ⇒ 00:53:30.440 Casie Aviles: I mean, we could spend some time trying to debug that. It’s just I hit a wall right now.
594 00:53:30.440 ⇒ 00:53:49.740 Amber Lin: I feel like, even with Google Chat, it doesn’t give enough capabilities because we want them to be able to pull it up right? So it’s more like a ui rather than a chat, right? Because we probably want them to see the documents. We want them to see the list of things that’s probably needed to be approved
595 00:53:49.740 ⇒ 00:53:56.500 Casie Aviles: Yeah, totally. I mean, that was our first pitch in the first place to Utam. But
596 00:53:56.900 ⇒ 00:53:58.549 Casie Aviles: I’m not sure really
597 00:53:58.550 ⇒ 00:54:03.810 Miguel de Veyra: I think that was the pitch we had to him before. But yeah, it was here.
598 00:54:05.810 ⇒ 00:54:11.900 Miguel de Veyra: ‘Google Chat record to update’, yeah, this one he didn’t want to do
599 00:54:12.740 ⇒ 00:54:14.190 Amber Lin: Okay.
600 00:54:15.210 ⇒ 00:54:17.019 Amber Lin: I see. I see.
601 00:54:17.860 ⇒ 00:54:25.580 Amber Lin: I mean, even from the most simple part, let’s roll it back. Right now they’re updating it directly in Google Docs.
602 00:54:25.580 ⇒ 00:54:27.639 Miguel de Veyra: Yeah. And I don’t think there’s a problem anyways.
603 00:54:27.640 ⇒ 00:54:30.430 Amber Lin: There’s nothing wrong with that
604 00:54:30.720 ⇒ 00:54:31.430 Miguel de Veyra: Yeah, true.
605 00:54:32.310 ⇒ 00:54:35.190 Casie Aviles: I mean, yeah, honestly, but I don’t
606 00:54:35.680 ⇒ 00:54:42.159 Amber Lin: Do you think we could just have a Google Docs API
607 00:54:43.090 ⇒ 00:54:47.170 Miguel de Veyra: Or, no, the Google Docs API won’t work, it won’t work. We tried it out
608 00:54:47.170 ⇒ 00:54:48.659 Amber Lin: Oh, damn it!
609 00:54:48.660 ⇒ 00:54:54.609 Miguel de Veyra: I think, honestly, the only update agent I would do is one that doesn’t actually update this.
610 00:54:54.930 ⇒ 00:55:11.789 Miguel de Veyra: It just helps them. It’s kind of bullshit, but basically my idea to make it simpler is: it’s still on Google Chat, right, or whatever UI, or wherever. But it doesn’t need to update anything. It basically just helps them format it correctly.
611 00:55:12.570 ⇒ 00:55:28.980 Miguel de Veyra: So, for example, for this one, the billing. Say we have a new, I think, easy pay. For example, now they’re accepting Stripe, right? So instead of having to think, you know, basically, about
612 00:55:29.533 ⇒ 00:55:34.190 Miguel de Veyra: what the contents for Stripe should be, they just provide it: here, update this document.
613 00:55:34.810 ⇒ 00:55:39.060 Miguel de Veyra: I need to add Stripe into this. And then, for example, let’s actually try this here.
614 00:55:39.430 ⇒ 00:55:43.830 Miguel de Veyra: So I can better show you guys. So for example.
615 00:55:44.702 ⇒ 00:55:52.659 Miguel de Veyra: for title, this billing. Right? We’re just gonna copy this. And then here, and then we’re gonna do.
616 00:55:53.900 ⇒ 00:55:58.679 Miguel de Veyra: So basically, the agent will be, the system... I don’t know, this is just for
617 00:55:59.445 ⇒ 00:56:04.630 Miguel de Veyra: for the LLM. The system prompt is gonna be, how do you say this?
618 00:56:05.840 ⇒ 00:56:10.290 Miguel de Veyra: I need to add, oops, Stripe.
619 00:56:10.540 ⇒ 00:56:13.700 Miguel de Veyra: Actually, this is a user prompt. What am I doing? User prompt.
620 00:56:14.200 ⇒ 00:56:18.269 Miguel de Veyra: I need to add Stripe because we accept Stripe. We now accept Stripe
621 00:56:18.270 ⇒ 00:56:22.349 Amber Lin: I suppose the system prompt would be like, keep the other, like, keep the contents
622 00:56:22.510 ⇒ 00:56:28.589 Miguel de Veyra: Yeah, yeah. And the system prompt would be, yeah, let’s just do it again. Although this is the wrong way to do a system prompt.
623 00:56:29.935 ⇒ 00:56:30.730 Miguel de Veyra: Yeah.
624 00:56:30.730 ⇒ 00:56:34.889 Amber Lin: I learned a bit about Nas, so I kind of learned a little bit about all the
625 00:56:34.890 ⇒ 00:56:35.460 Miguel de Veyra: You know.
626 00:56:35.460 ⇒ 00:56:36.340 Amber Lin: Prompting things.
627 00:56:39.590 ⇒ 00:56:42.860 Miguel de Veyra: Just 5% more expensive.
628 00:56:43.180 ⇒ 00:56:44.360 Miguel de Veyra: I don’t know
629 00:56:44.650 ⇒ 00:56:46.150 Amber Lin: Okay, let’s see what it does.
630 00:56:46.990 ⇒ 00:56:53.430 Amber Lin: So we’re essentially just taking this chat interface into an an, yeah, technically.
631 00:56:53.430 ⇒ 00:56:55.950 Amber Lin: yeah. And that’s okay. They don’t care
632 00:56:56.450 ⇒ 00:56:58.370 Miguel de Veyra: Yeah, I don’t, you know.
633 00:56:58.560 ⇒ 00:57:00.680 Miguel de Veyra: I doubt they give a shit.
634 00:57:00.940 ⇒ 00:57:02.219 Miguel de Veyra: Yeah, there you go.
635 00:57:03.140 ⇒ 00:57:06.479 Miguel de Veyra: Right? It. Basically, it’s gonna produce something like this.
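The formatting-only agent being demoed here, a system prompt that keeps the contents and just formats the trainer’s update, might be wired up like this. The exact prompt wording and message layout are assumptions; the chat-completion call itself is left out:

```python
# Sketch of the formatting helper Miguel demos: it never writes to the
# doc, it only reformats what the trainer types to match the standard.
# The wording below is illustrative; the LLM call is stubbed.

SYSTEM_PROMPT = (
    "You help trainers draft updates for the central training document. "
    "Keep all existing content intact; only format the new information "
    "to match the document's standard structure. Do not invent details."
)

def build_messages(section_title: str, section_text: str, request: str) -> list[dict]:
    """Assemble the chat messages for one formatting request,
    e.g. request = 'I need to add Stripe because we now accept Stripe.'"""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": (
            f"Section: {section_title}\n\n"
            f"Current content:\n{section_text}\n\n"
            f"Update request: {request}"
        )},
    ]
```

In practice these messages would go to a chat-completions endpoint (or an n8n LLM node), and the reply is what the trainer copies into the Google Doc.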
636 00:57:06.480 ⇒ 00:57:09.270 Amber Lin: Yeah. And then they copy and paste it into Google Docs.
637 00:57:09.270 ⇒ 00:57:12.024 Miguel de Veyra: Yeah, cause they’re more than happy to do Google chat
638 00:57:12.300 ⇒ 00:57:19.870 Amber Lin: Yeah, it’s so easy. And the thing is, a lot of times, if you have another chatbot, they have to learn the chatbot, and they don’t understand
639 00:57:20.310 ⇒ 00:57:21.039 Casie Aviles: Yeah, there we go.
640 00:57:21.040 ⇒ 00:57:21.950 Casie Aviles: Very little. Sec.
641 00:57:22.350 ⇒ 00:57:23.700 Amber Lin: Yeah, let’s just do this.
642 00:57:24.110 ⇒ 00:57:26.970 Miguel de Veyra: This is what I would suggest. Right?
643 00:57:27.070 ⇒ 00:57:31.119 Miguel de Veyra: I think this is low fidelity, high impact
644 00:57:31.120 ⇒ 00:57:34.339 Amber Lin: I know. Let’s just do that. I mean you
645 00:57:34.340 ⇒ 00:57:34.799 Miguel de Veyra: Or what do you
646 00:57:36.970 ⇒ 00:57:46.459 Miguel de Veyra: What we could do is, let’s create... I’m gonna create like a quick bot in, what do you call this, in
647 00:57:46.840 ⇒ 00:57:49.420 Miguel de Veyra: and then, basically, just, you know, build one
648 00:57:49.750 ⇒ 00:58:04.110 Miguel de Veyra: that does this, and then we can send it to Janice tomorrow and ask her for feedback, or maybe on Monday, right? Like, hey, we created this demo, this bot that could help. Basically, instead of having to think
649 00:58:04.560 ⇒ 00:58:07.929 Miguel de Veyra: about what to put here, the bot can... or should I delete them?
650 00:58:08.390 ⇒ 00:58:14.780 Miguel de Veyra: No, no, okay. Instead of having to think, you just tell it what you want, and then it’s gonna format it up to standard
651 00:58:15.581 ⇒ 00:58:23.090 Amber Lin: Yeah, that was one of the things that we promised to do for them, right? Helping them create standardized training documents
652 00:58:23.090 ⇒ 00:58:25.309 Miguel de Veyra: Yeah, so this is very, yeah.
653 00:58:25.760 ⇒ 00:58:33.190 Miguel de Veyra: Because, looking back, the thing we sold to them is actually this.
654 00:58:35.210 ⇒ 00:58:40.190 Miguel de Veyra: I think it’s this knowledge creation agent. It’s basically to help them build their knowledge bases.
655 00:58:40.748 ⇒ 00:58:47.219 Miguel de Veyra: It doesn’t update anything, it just helps them build. This is the one we used to sell, to get them to buy from us.
656 00:58:47.590 ⇒ 00:58:50.170 Miguel de Veyra: so, you know, make something work with them.
657 00:58:50.470 ⇒ 00:58:52.428 Amber Lin: Okay, yeah, that’s so easy.
658 00:58:54.320 ⇒ 00:58:55.903 Miguel de Veyra: Like, don’t overthink it.
659 00:58:56.930 ⇒ 00:59:01.549 Amber Lin: ’Cause the thing is to put this into a use case, right.
660 00:59:01.770 ⇒ 00:59:18.270 Amber Lin: Our AI, like, our coding effort is very low. But let’s think about how they’re gonna use it, right? They have something they want to update. They also need to find it in the Google Doc, and finding it in the Google Doc is kind of like a Ctrl+F search or whatever.
661 00:59:18.270 ⇒ 00:59:19.979 Amber Lin: Yeah, yeah, it’s very
662 00:59:19.980 ⇒ 00:59:26.330 Amber Lin: AI will make it easier. I think the only other step for them to do is to help them find where it is
663 00:59:26.780 ⇒ 00:59:27.470 Miguel de Veyra: Yeah.
664 00:59:28.340 ⇒ 00:59:33.150 Amber Lin: And then, once they find it, they can just copy and paste the suggestions, and then evaluate and improve them
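Amber’s “help them find where it is” step could reuse the vector embeddings Miguel mentioned: rank the doc’s sections by similarity to the trainer’s request. A minimal sketch, assuming the section embeddings already exist (the vectors here are placeholders, not real embeddings):

```python
import math

# Rank stored section embeddings (e.g. pulled from Supabase) against
# the embedding of the trainer's request, and point the trainer at the
# most likely place in the central doc.

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def best_section(query_vec: list[float], sections: dict[str, list[float]]) -> str:
    """Return the section title whose embedding is closest to the query."""
    return max(sections, key=lambda title: cosine(query_vec, sections[title]))
```

In the real flow `query_vec` would come from embedding the trainer’s message with the same model used for the stored chunks.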
665 00:59:33.550 ⇒ 00:59:41.429 Miguel de Veyra: We did, I think, yeah, I mean, yeah, let’s not overthink it, because ideally, the the only next thing we wanna do is
666 00:59:41.540 ⇒ 00:59:44.850 Miguel de Veyra: segment. This. Basically like, for example, this is.
667 00:59:45.040 ⇒ 00:59:54.890 Miguel de Veyra: they didn’t have this before. I was the one who added this, right? That’s why I tagged it here, ‘Start of PPTs’, because they didn’t have documentation for their PowerPoints, just PowerPoints.
668 00:59:57.730 ⇒ 00:59:58.959 Miguel de Veyra: So you know.
669 00:59:59.390 ⇒ 01:00:01.389 Miguel de Veyra: So just cleaning up the central doc.
670 01:00:01.590 ⇒ 01:00:05.779 Miguel de Veyra: yeah, yeah, basically, it’s just about standardizing their stuff
671 01:00:06.125 ⇒ 01:00:14.760 Amber Lin: Yeah, let me... let’s just add a ticket, check it out. I think he’s gonna like this because it’s very low effort. We can just
672 01:00:14.760 ⇒ 01:00:19.293 Miguel de Veyra: Because, yeah. And then this one that definitely we can do in the next cycle, right? Because
673 01:00:19.560 ⇒ 01:00:20.030 Amber Lin: We’re not.
674 01:00:21.420 ⇒ 01:00:32.570 Miguel de Veyra: And anything with update-and-add-document is probably gonna take, with the hours we have, probably more than, I don’t know, 6 or 7 cycles
675 01:00:32.570 ⇒ 01:00:37.640 Amber Lin: Yeah, I don’t think we’re gonna I think this is what we’re gonna do.
676 01:00:40.170 ⇒ 01:00:44.370 Amber Lin: Can you write the ticket? I don’t know what I’m writing
677 01:00:45.422 ⇒ 01:00:47.959 Miguel de Veyra: Yeah, sure go to limit
678 01:00:52.220 ⇒ 01:00:56.810 Amber Lin: So I’m gonna move the ‘integrate into Google Chat’ ticket back to the backlog
679 01:00:57.240 ⇒ 01:00:59.030 Miguel de Veyra: No issues.
680 01:00:59.200 ⇒ 01:01:02.900 Amber Lin: ‘Trainer interacts with the bot’, we kind of did that.
681 01:01:04.080 ⇒ 01:01:09.440 Amber Lin: But we should, we should validate that essentially.
682 01:01:09.580 ⇒ 01:01:12.789 Amber Lin: Can you also send the Mural board?
683 01:01:13.240 ⇒ 01:01:16.749 Amber Lin: Or do I... is it the ABC Mural board? I’ll just
684 01:01:16.750 ⇒ 01:01:19.770 Miguel de Veyra: Yeah, everything wait. Let me just go
685 01:01:21.440 ⇒ 01:01:23.770 Miguel de Veyra: here. I don’t know what the
686 01:01:26.900 ⇒ 01:01:29.870 Miguel de Veyra: I think. It is. More updated again.
687 01:01:32.810 ⇒ 01:01:34.290 Miguel de Veyra: Field agent.
688 01:01:38.280 ⇒ 01:01:41.019 Miguel de Veyra: Oh, so much stuff on the Mural board.
689 01:01:41.870 ⇒ 01:01:46.180 Miguel de Veyra: No need to let me inform
690 01:01:46.240 ⇒ 01:01:48.790 Casie Aviles: We’re mostly just brain dumping there.
691 01:01:51.337 ⇒ 01:01:53.640 Amber Lin: Where is it?
692 01:01:55.700 ⇒ 01:01:56.350 Amber Lin: Oh.
693 01:01:56.890 ⇒ 01:01:59.169 Casie Aviles: Which one are you looking for like
694 01:01:59.503 ⇒ 01:02:05.830 Amber Lin: The one we just wrote on the Mural board. I know it’s all yellow. I’m trying to find it
695 01:02:05.950 ⇒ 01:02:07.630 Miguel de Veyra: Standardized.
696 01:02:11.490 ⇒ 01:02:12.440 Amber Lin: Fall.
697 01:02:13.600 ⇒ 01:02:16.520 Amber Lin: Cool, interesting.
698 01:02:34.670 ⇒ 01:02:40.109 Amber Lin: Oh, there it is. Thank you. I tracked your mouse. I’m just gonna screenshot this and put it in
699 01:02:50.210 ⇒ 01:02:52.509 Miguel de Veyra: Typically turn on rainfall
700 01:02:55.130 ⇒ 01:03:05.990 Amber Lin: So let’s let’s make a list with a new suggestion. Right? The they
701 01:03:06.210 ⇒ 01:03:11.269 Amber Lin: Do they still interact with it in Google Chat, like, if we do the
702 01:03:11.270 ⇒ 01:03:13.749 Miguel de Veyra: Yeah, right now, it doesn’t really matter
703 01:03:14.100 ⇒ 01:03:17.789 Miguel de Veyra: for the introduction, Google Chat would ideally be good
704 01:03:18.819 ⇒ 01:03:19.599 Amber Lin: Yeah.
705 01:03:20.490 ⇒ 01:03:23.030 Amber Lin: Same pass in.
706 01:03:23.510 ⇒ 01:03:33.870 Amber Lin: Oh, Chat, I have. The subway agent helps right? It
707 01:04:01.805 ⇒ 01:04:15.220 Miguel de Veyra: Is, unless discussed. Otherwise, otherwise agent want to look into any document.
708 01:04:15.500 ⇒ 01:04:17.110 Miguel de Veyra: Hey, son?
709 01:04:17.280 ⇒ 01:04:20.180 Miguel de Veyra: Users will just will. Just
710 01:04:25.880 ⇒ 01:04:28.689 Miguel de Veyra: Chatbot will generate.
711 01:04:28.980 ⇒ 01:04:34.040 Miguel de Veyra: Understand? There you go.
712 01:04:37.810 ⇒ 01:04:40.670 Miguel de Veyra: Okay, there you go. They should reach out.
713 01:04:41.090 ⇒ 01:04:44.389 Miguel de Veyra: Okay, yeah, this is fine. This is better. Ui trigger.
714 01:04:50.930 ⇒ 01:04:52.370 Miguel de Veyra: Okay, yeah. This is fine.
715 01:04:53.745 ⇒ 01:04:55.819 Miguel de Veyra: Sorry. Can you guys hear me?
716 01:04:55.820 ⇒ 01:04:58.750 Amber Lin: Yes, I can hear you. I let
717 01:04:58.750 ⇒ 01:04:59.730 Miguel de Veyra: There you go!
718 01:05:00.750 ⇒ 01:05:02.219 Amber Lin: Oh, wow.
719 01:05:02.750 ⇒ 01:05:08.986 Amber Lin: yeah, every time I generate a ticket with AI, I’m like, wow! This would have taken me 20 min
720 01:05:09.543 ⇒ 01:05:16.109 Miguel de Veyra: Yeah, I mean, you just have to, basically, because the first two we shipped, so you have to, like, basically give it the requirements
721 01:05:16.110 ⇒ 01:05:16.660 Amber Lin: Yeah.
722 01:05:16.660 ⇒ 01:05:17.360 Miguel de Veyra: What’s this?
723 01:05:17.360 ⇒ 01:05:19.790 Amber Lin: Make up his system.
724 01:05:20.290 ⇒ 01:05:27.609 Miguel de Veyra: Oh, shit, fuck! Sorry. Can you review these tickets while we’re here?
725 01:05:27.720 ⇒ 01:05:29.578 Miguel de Veyra: Because Utam is never here
726 01:05:35.040 ⇒ 01:05:43.440 Amber Lin: I’m just gonna make a Mural for the workflows.
727 01:05:44.230 ⇒ 01:05:44.890 Amber Lin: Let’s see
728 01:06:06.430 ⇒ 01:06:08.780 Miguel de Veyra: Is it Polytomic we can’t use, because
729 01:06:09.470 ⇒ 01:06:11.929 Miguel de Veyra: they don’t support large files, right?
730 01:06:12.130 ⇒ 01:06:13.289 Miguel de Veyra: Was that what we found out
731 01:06:13.910 ⇒ 01:06:16.950 Casie Aviles: I mean, that’s what... yeah, that’s what
732 01:06:16.950 ⇒ 01:06:17.300 Miguel de Veyra: Be this.
733 01:06:17.300 ⇒ 01:06:23.149 Casie Aviles: they told us. But I still want to just, you know, take a look for myself first.
734 01:06:23.150 ⇒ 01:06:27.989 Casie Aviles: Yeah, yeah, okay. Okay. I’m trying to set a meeting with them now
735 01:06:28.240 ⇒ 01:06:32.240 Miguel de Veyra: Just, you know. Tell him what we found out, and then we’re still looking into it.
736 01:06:32.360 ⇒ 01:06:33.810 Miguel de Veyra: but they’re probably not wrong.
737 01:06:35.150 ⇒ 01:06:38.969 Casie Aviles: Yeah, I mean, what Utam said here on the AI team, that
738 01:06:39.130 ⇒ 01:06:43.580 Casie Aviles: apparently Polytomic doesn’t support Zoom, but
739 01:06:43.850 ⇒ 01:06:46.359 Casie Aviles: he encourages us to still check, so
740 01:06:48.290 ⇒ 01:06:53.470 Miguel de Veyra: He probably knows that it’s not. He just wants to see if we checked properly
741 01:06:55.740 ⇒ 01:06:56.710 Amber Lin: No.
742 01:07:04.880 ⇒ 01:07:11.999 Amber Lin: yeah, I don’t know how much AI is in just yeah. Let me
743 01:07:12.380 ⇒ 01:07:15.029 Amber Lin: let me share my screen to show you guys.
744 01:07:15.680 ⇒ 01:07:20.979 Amber Lin: this is essentially what we’re gonna be doing like.
745 01:07:21.500 ⇒ 01:07:31.580 Amber Lin: So the trainer asks in Google Chat, like, help me write it, and GPT returns a response. Manually, right?
746 01:07:32.030 ⇒ 01:07:36.390 Amber Lin: Manually find the place to update the central doc.
747 01:07:36.870 ⇒ 01:07:38.859 Amber Lin: This is also manual
748 01:07:39.270 ⇒ 01:07:41.649 Miguel de Veyra: Not gonna be finding anything anymore
749 01:07:41.990 ⇒ 01:07:43.499 Amber Lin: Oh, I know they find it.
750 01:07:44.090 ⇒ 01:07:45.630 Miguel de Veyra: Okay. Okay. Manually. Okay. Okay.
751 01:07:45.630 ⇒ 01:07:48.560 Amber Lin: Trainer, trainer.
752 01:07:49.340 ⇒ 01:07:51.340 Miguel de Veyra: Manually update the solution
753 01:07:53.906 ⇒ 01:08:04.690 Amber Lin: That’s the only step that’s AI, that’s the only thing. And then this is in Google Docs, maybe. And then for the manager, right, they maybe see a list of updates.
754 01:08:05.200 ⇒ 01:08:08.960 Amber Lin: They approve it manually still. This is
755 01:08:09.450 ⇒ 01:08:13.390 Amber Lin: so, this is just, if they click a button, it is approved.
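The manager flow Amber sketches, a list of pending updates with one-click approval, could be modeled as simply as this. The field names are illustrative, not an agreed schema:

```python
from dataclasses import dataclass

# Sketch of the approval list for managers: each trainer update
# becomes a pending record, and "click a button" flips it to approved.

@dataclass
class PendingUpdate:
    section: str
    proposed_text: str
    submitted_by: str
    approved: bool = False

def approve(queue: list[PendingUpdate], index: int) -> PendingUpdate:
    """The manager's one-click approve step."""
    queue[index].approved = True
    return queue[index]
```

The eventual “announce to the CSRs” step would then consume only the approved records.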
756 01:08:13.550 ⇒ 01:08:23.729 Amber Lin: And then, like, eventually, we want to announce to the CSRs.
757 01:08:25.910 ⇒ 01:08:26.850 Amber Lin: So let’s see
758 01:08:26.859 ⇒ 01:08:33.509 Miguel de Veyra: Oh, sure. Sorry, Amber. Casie, do we need Casie on this meeting? Because I think he’s been in meetings for like 2 hours already.
759 01:08:33.510 ⇒ 01:08:35.379 Amber Lin: Oh, my goodness I think
760 01:08:35.380 ⇒ 01:08:36.970 Miguel de Veyra: Yeah, I’m fine, because I think we
761 01:08:36.970 ⇒ 01:08:37.310 Amber Lin: Got it.
762 01:08:37.920 ⇒ 01:08:39.309 Amber Lin: No, I have no
763 01:08:39.460 ⇒ 01:08:42.050 Miguel de Veyra: Or not the number we we haven’t looked at the
764 01:08:42.050 ⇒ 01:08:44.460 Amber Lin: Oh, my goodness, really, okay.
765 01:08:44.859 ⇒ 01:08:52.779 Amber Lin: this is good. I think you have a good idea of what we’re gonna do. It was nice to have your input. Me and Miguel will work on the tickets together.
766 01:08:52.779 ⇒ 01:08:55.789 Miguel de Veyra: Yeah, yeah, Casie, so you can work on something else. Sorry, man.
767 01:08:56.109 ⇒ 01:08:56.439 Amber Lin: Ian.
768 01:08:56.439 ⇒ 01:09:00.069 Amber Lin: Oh, sure, sure, thanks for mentioning that, I totally forgot
769 01:09:01.024 ⇒ 01:09:01.679 Miguel de Veyra: Alright!
770 01:09:01.950 ⇒ 01:09:02.470 Casie Aviles: Bye, guys.
771 01:09:02.479 ⇒ 01:09:04.489 Miguel de Veyra: Thanks, Casey. I’ll speak to you later.
772 01:09:04.670 ⇒ 01:09:05.450 Amber Lin: Bye.
773 01:09:05.930 ⇒ 01:09:17.289 Miguel de Veyra: ’Cause I think, Amber, one of the things we also need to look into right now, which I already asked Utam to review, is the tickets in Linear for the AI team. Do you wanna help? ’Cause I think this is pretty straightforward.
774 01:09:18.050 ⇒ 01:09:20.400 Miguel de Veyra: Or do you want to finish this first, and then we hop there
775 01:09:21.940 ⇒ 01:09:26.660 Amber Lin: Hop where? It’s all Linear tickets. Yeah.
776 01:09:26.660 ⇒ 01:09:34.999 Miguel de Veyra: The AI team Linear tickets, like the ones in requirements and review, because I’m not sure what else to do there. I’ve already added everything
777 01:09:36.410 ⇒ 01:09:37.640 Miguel de Veyra: So I’m not sure that
778 01:09:37.640 ⇒ 01:09:41.279 Amber Lin: Blocked right? We’re just blocked by certain things
779 01:09:42.044 ⇒ 01:09:48.999 Miguel de Veyra: Yeah. But can you check like AI 92, 93 and 89? Make sure I’m not tripping
780 01:09:49.728 ⇒ 01:09:52.471 Amber Lin: Okay, let me get there.
781 01:09:53.020 ⇒ 01:09:59.389 Miguel de Veyra: ’Cause Utam said, you know, the Zoom Meetings data sources, AI-70, it’s too big. So I separated it into 3 tickets
782 01:10:01.520 ⇒ 01:10:03.940 Miguel de Veyra: I’m not seeing your screen. By the way, I can only see
783 01:10:03.940 ⇒ 01:10:07.539 Amber Lin: Oh, my bad, my bad! Let me
784 01:10:11.300 ⇒ 01:10:18.579 Amber Lin: The AI issues in ‘ready for requirements and review’, right
785 01:10:18.580 ⇒ 01:10:21.590 Miguel de Veyra: Yes, yes. So 92, 93 and 89
786 01:10:22.715 ⇒ 01:10:28.409 Miguel de Veyra: since I think these are pretty clear already. I’ve adjusted these based on the transcripts
787 01:10:29.410 ⇒ 01:10:30.150 Amber Lin: Hmm.
788 01:10:32.140 ⇒ 01:10:35.449 Miguel de Veyra: Because if you go scroll down to the bottom, he has comments there.
789 01:10:48.600 ⇒ 01:10:53.609 Miguel de Veyra: So basically, for each technical requirement, I created a separate ticket
790 01:10:53.610 ⇒ 01:10:58.350 Amber Lin: Hmm, great. So each of them, okay, that
791 01:10:58.890 ⇒ 01:11:01.490 Amber Lin: I think this should be a project
792 01:11:02.880 ⇒ 01:11:06.999 Miguel de Veyra: Yeah. So if you go like one step back, go back to AI team.
793 01:11:07.120 ⇒ 01:11:12.900 Miguel de Veyra: The ones in, for example, yeah, ‘Extract Zoom’, it has its own goal, ACs
794 01:11:12.900 ⇒ 01:11:13.660 Amber Lin: Oh!
795 01:11:13.660 ⇒ 01:11:16.880 Miguel de Veyra: And yeah, so if you click anything there, it’s actually complete.
796 01:11:16.880 ⇒ 01:11:21.670 Amber Lin: So this is blocked. This is this. One is blocked by the dlt
797 01:11:21.830 ⇒ 01:11:22.690 Miguel de Veyra: Yes, every
798 01:11:22.690 ⇒ 01:11:23.660 Amber Lin: That’s also
799 01:11:23.660 ⇒ 01:11:27.040 Miguel de Veyra: See block by dlt, so
800 01:11:28.110 ⇒ 01:11:36.990 Amber Lin: Yeah, and so he hasn’t moved any of these to ready for development.
801 01:11:37.720 ⇒ 01:11:38.900 Amber Lin: Hmm!
802 01:11:40.320 ⇒ 01:11:45.939 Amber Lin: Let me check. Only didn’t he have a comment on that one
803 01:11:45.940 ⇒ 01:11:49.360 Miguel de Veyra: Yes, even if Utam moves it to ready for development, it can’t be done.
804 01:11:49.530 ⇒ 01:11:50.340 Miguel de Veyra: Yeah.
805 01:11:51.120 ⇒ 01:11:53.980 Amber Lin: Yeah, so does this work
806 01:11:55.500 ⇒ 01:12:03.520 Miguel de Veyra: This one, I’ll probably tell them it’s not possible, because Polytomic doesn’t support large data, so might as well just stick to dlt
807 01:12:06.050 ⇒ 01:12:07.990 Miguel de Veyra: Yeah, I already mentioned it there.
808 01:12:07.990 ⇒ 01:12:09.499 Amber Lin: Okay. Sounds good.
809 01:12:12.840 ⇒ 01:12:22.690 Amber Lin: You know... oh, I mean, the good thing is that we do have the different
810 01:12:24.000 ⇒ 01:12:35.229 Amber Lin: things from the two Notion docs that we now have. We can make those into tickets, and so we can get something moving, because this is kind of stuck, and I don’t think he likes that.
811 01:12:35.340 ⇒ 01:12:55.760 Amber Lin: And he doesn’t know what’s really going on down in here. So if he sees that nothing has moved, he’s not going to be very happy. So why don’t we make those new tickets, get them moving, spit them out in an hour or so? Some of them are really simple. Not an hour, sorry, at least 2 or 3 hours, and then we can say, hey, we did this this week.
812 01:12:56.200 ⇒ 01:12:57.140 Miguel de Veyra: Yeah, yeah.
813 01:12:57.140 ⇒ 01:13:01.600 Amber Lin: Okay, let me, can I share my whole?
814 01:13:03.410 ⇒ 01:13:05.670 Amber Lin: How do I share my whole screen.
815 01:13:11.740 ⇒ 01:13:12.670 Amber Lin: Paul?
816 01:13:17.640 ⇒ 01:13:19.500 Amber Lin: Sharing notion.
817 01:13:23.280 ⇒ 01:13:35.789 Amber Lin: Yeah, I’ll make a... I’ll make a sales project, and then... why are there 2?
818 01:13:38.170 ⇒ 01:13:39.040 Amber Lin: Paul?
819 01:13:39.850 ⇒ 01:13:43.580 Amber Lin: Wait, Mango! Why are there 2 client agent projects
820 01:13:44.650 ⇒ 01:13:46.500 Miguel de Veyra: Sorry I can’t see your screen
821 01:13:46.500 ⇒ 01:13:50.079 Amber Lin: Oh, it’s okay. One of them is empty. I’ll just delete that
822 01:13:55.460 ⇒ 01:13:57.849 Miguel de Veyra: It’s gonna be available too late for me
823 01:13:58.980 ⇒ 01:13:59.410 Amber Lin: Oh!
824 01:13:59.410 ⇒ 01:14:02.790 Miguel de Veyra: But yeah, I’ll try to stay up, see what can happen
825 01:14:03.710 ⇒ 01:14:04.350 Amber Lin: Wait!
826 01:14:04.540 ⇒ 01:14:07.160 Amber Lin: If that’s too late for you, I can create the tickets
827 01:14:07.410 ⇒ 01:14:17.360 Miguel de Veyra: No, no, we’re gonna discuss it, because I want to basically just speak with him. If it’s still not good, I want to hop on a call with him, make sure everything’s right
828 01:14:17.360 ⇒ 01:14:18.450 Amber Lin: Okay.
829 01:14:19.480 ⇒ 01:14:23.260 Miguel de Veyra: So I think we’re gonna get scolded if we still don’t move this into ready for development.
830 01:14:23.260 ⇒ 01:14:34.040 Amber Lin: Yeah, okay. I’m gonna change the client agent into, like, foundations. Because right now it’s all like foundations, data sources, right?
831 01:14:34.740 ⇒ 01:14:37.400 Miguel de Veyra: And then, if you could follow up with
832 01:14:38.130 ⇒ 01:14:41.010 Miguel de Veyra: what’s his name? Oh, no, no, Patrick.
833 01:14:42.310 ⇒ 01:14:44.810 Amber Lin: Yeah, I messaged him. He hasn’t replied so
834 01:14:44.810 ⇒ 01:14:45.340 Miguel de Veyra: Yes.
835 01:14:45.360 ⇒ 01:14:47.659 Amber Lin: Think he’s 50 with his job.
836 01:14:50.600 ⇒ 01:14:56.639 Amber Lin: Oh, I think that will just have to stay stuck. Let’s work on the smaller stuff
837 01:14:57.030 ⇒ 01:15:08.039 Amber Lin: and get that going. I will make some sales tickets.
838 01:15:12.750 ⇒ 01:15:18.670 Amber Lin: Hmm, okay, where is this?
839 01:15:30.900 ⇒ 01:15:37.769 Amber Lin: I mean, I don’t think you have to stay on the call with me. I’m just. I’m just making these tickets
840 01:15:38.170 ⇒ 01:15:38.680 Amber Lin: and
841 01:15:39.180 ⇒ 01:15:39.680 Miguel de Veyra: Sure.
842 01:15:39.680 ⇒ 01:15:43.939 Amber Lin: Yeah, we decided we’re gonna first do the lead follow-up tracker, right,
843 01:15:44.050 ⇒ 01:15:49.510 Amber Lin: for the sales. That’s just from Notion. Do you know the details about that?
844 01:15:51.930 ⇒ 01:15:58.930 Miguel de Veyra: Actually, no, no, but yeah. Is it still the same sales database
845 01:15:58.930 ⇒ 01:16:02.630 Amber Lin: Yeah, it’s in Notion, like, everything for sales is in Notion
846 01:16:02.630 ⇒ 01:16:03.580 Miguel de Veyra: Okay. Yeah. Good.
847 01:16:03.580 ⇒ 01:16:08.269 Amber Lin: Robert has everything really clean. Because, what is Casie working on today?
848 01:16:09.215 ⇒ 01:16:13.970 Miguel de Veyra: dlt hub. Basically the Zoom migrations, the ones that are existing
849 01:16:13.970 ⇒ 01:16:14.780 Amber Lin: Oh, okay.
850 01:16:14.780 ⇒ 01:16:15.520 Miguel de Veyra: The first three,
851 01:16:15.520 ⇒ 01:16:21.289 Amber Lin: Sure, sounds good. And you, what are you mostly focusing on today?
852 01:16:22.440 ⇒ 01:16:24.090 Miguel de Veyra: I’m done for the day. But yeah, I work
853 01:16:24.090 ⇒ 01:16:24.430 Amber Lin: Oh.
854 01:16:25.610 ⇒ 01:16:38.979 Amber Lin: sounds good. I will make these, then I will ping Utam to review, and once he reviews, we’ll have something to do when you guys wake up tomorrow. Because...
855 01:16:39.080 ⇒ 01:16:41.809 Amber Lin: do you want us to have something for Friday?
856 01:16:41.950 ⇒ 01:16:48.239 Amber Lin: Because right now we don’t really have anything to show like we have some foundational stuff, but nothing people can see
857 01:16:48.700 ⇒ 01:16:51.119 Miguel de Veyra: Yeah, yeah, I mean, we got blocked by Dlt.
858 01:16:51.330 ⇒ 01:17:00.480 Amber Lin: Yeah, okay. I will just message you, or message Utam in the channel, once I have the tickets
859 01:17:01.000 ⇒ 01:17:01.830 Miguel de Veyra: Okay. Sure. Sure.
860 01:17:01.830 ⇒ 01:17:03.640 Amber Lin: Okay, thanks for the call.
861 01:17:04.050 ⇒ 01:17:04.739 Miguel de Veyra: Thanks, bye, bye.
862 01:17:04.740 ⇒ 01:17:06.939 Amber Lin: Long call for all of us. Bye, bye.
863 01:17:06.940 ⇒ 01:17:08.020 Miguel de Veyra: Yeah, bye, bye.