Meeting Title: ABC working session Date: 2026-02-04 Meeting participants: Mustafa Raja, Casie Aviles, Pranav Narahari, Samuel Roberts, Amber Lin
WEBVTT
1 00:01:21.920 ⇒ 00:01:24.370 Mustafa Raja: Hey, hey Casie, I have a question.
2 00:01:26.290 ⇒ 00:01:28.009 Casie Aviles: Oh, hey, hey, hey, yeah.
3 00:01:28.440 ⇒ 00:01:33.090 Mustafa Raja: So we want to move all of the infrastructure, right?
4 00:01:33.720 ⇒ 00:01:35.020 Mustafa Raja: Or for ABC?
5 00:01:35.650 ⇒ 00:01:39.720 Casie Aviles: Yeah, I mean, everything that’s… that we have on our side.
6 00:01:39.720 ⇒ 00:01:40.620 Mustafa Raja: I think.
7 00:01:40.780 ⇒ 00:01:42.560 Casie Aviles: Yeah, we moved it to them.
8 00:01:42.730 ⇒ 00:01:49.510 Mustafa Raja: Yeah, so we… so in Cloud SQL, we will also be moving the tables for DB2, right?
9 00:01:52.040 ⇒ 00:01:54.879 Casie Aviles: Oh, wait, sorry, what was that again? You mean, like, the…
10 00:01:55.100 ⇒ 00:01:58.759 Mustafa Raja: The tables that we are using for the admin UI.
11 00:02:00.150 ⇒ 00:02:03.390 Casie Aviles: The tables… oh, yeah, yeah, yeah. I mean, ideally…
12 00:02:03.390 ⇒ 00:02:04.140 Mustafa Raja: is right.
13 00:02:05.200 ⇒ 00:02:07.569 Casie Aviles: Yes, currently it’s in our Supabase.
14 00:02:07.570 ⇒ 00:02:09.859 Mustafa Raja: Okay, okay, yeah, just wanted to confirm that.
15 00:02:10.570 ⇒ 00:02:12.899 Casie Aviles: Yeah, ideally we also move it there.
16 00:02:13.720 ⇒ 00:02:14.330 Mustafa Raja: Okay.
17 00:02:18.410 ⇒ 00:02:19.340 Samuel Roberts: How’s it going, Al?
18 00:02:20.230 ⇒ 00:02:21.080 Mustafa Raja: Good, how are you?
19 00:02:21.170 ⇒ 00:02:23.190 Casie Aviles: Yeah, we were just talking about…
20 00:02:23.530 ⇒ 00:02:28.350 Casie Aviles: Whether we’re going to move the Supabase tables as well to there.
21 00:02:29.320 ⇒ 00:02:31.540 Casie Aviles: Cloud SQL instance.
22 00:02:36.430 ⇒ 00:02:39.189 Samuel Roberts: Good point. Probably eventually, but I don’t.
23 00:02:39.190 ⇒ 00:02:39.580 Casie Aviles: Yeah.
24 00:02:39.580 ⇒ 00:02:41.139 Samuel Roberts: think that’s a huge priority, you know.
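The Supabase-to-Cloud SQL move discussed here would, at its simplest, be a table-level export and import. A minimal sketch, assuming Postgres on both sides; the connection strings and admin-UI table names below are placeholders, not real project values:

```python
# Hypothetical sketch: build pg_dump/pg_restore commands to move the
# admin-UI tables from Supabase (Postgres) to a Cloud SQL Postgres target.
# Table names and DSNs are invented for illustration.
import shlex

ADMIN_UI_TABLES = ["admin_users", "admin_audit_log"]  # placeholder names

def dump_command(source_dsn: str, tables: list, outfile: str) -> str:
    """Build a pg_dump command that exports only the listed tables."""
    table_flags = " ".join(f"--table={shlex.quote(t)}" for t in tables)
    return (f"pg_dump --format=custom {table_flags} "
            f"--file={shlex.quote(outfile)} {shlex.quote(source_dsn)}")

def restore_command(target_dsn: str, infile: str) -> str:
    """Build the matching pg_restore command for the Cloud SQL target."""
    return (f"pg_restore --no-owner --dbname={shlex.quote(target_dsn)} "
            f"{shlex.quote(infile)}")

cmd = dump_command("postgresql://user@supabase-host/db",
                   ADMIN_UI_TABLES, "admin_ui.dump")
```

In practice the restore would point at the Cloud SQL instance, with downtime or a sync step depending on how live the admin UI is.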
25 00:02:42.420 ⇒ 00:02:46.280 Mustafa Raja: Yeah, I was just drafting the email… So I just wanted.
26 00:02:46.280 ⇒ 00:02:48.510 Samuel Roberts: Oh, right, right, right. Okay.
27 00:02:48.830 ⇒ 00:02:49.420 Samuel Roberts: Yeah, I…
28 00:02:49.420 ⇒ 00:02:50.759 Mustafa Raja: So I’ve been in a meeting…
29 00:02:50.760 ⇒ 00:02:54.179 Samuel Roberts: I’m just… I haven’t seen anything on Slack, so if there was updates, I didn’t see anything.
30 00:02:54.900 ⇒ 00:03:00.900 Mustafa Raja: Oh yeah, I made a draft, you know, drafting me?
31 00:03:01.190 ⇒ 00:03:08.260 Mustafa Raja: put Cloud SQL in the thread, so if you would look at that… That would be nice.
32 00:03:11.250 ⇒ 00:03:12.819 Samuel Roberts: Okay, yeah, I’ll do that.
33 00:03:15.230 ⇒ 00:03:16.060 Samuel Roberts: Mr.
34 00:03:16.190 ⇒ 00:03:20.699 Samuel Roberts: this, or during this working session, but let’s get… Amber says she’ll be a little late.
35 00:03:24.710 ⇒ 00:03:26.640 Mustafa Raja: I think she’s in a meeting with Udam.
36 00:03:27.790 ⇒ 00:03:32.869 Samuel Roberts: Yeah, she said she’d be 15 minutes late. Okay. Okay. So, I think…
37 00:03:33.490 ⇒ 00:03:42.959 Samuel Roberts: Plan was to start talking about the… automation ideas for… The central docs.
38 00:03:43.210 ⇒ 00:03:48.750 Samuel Roberts: Has everyone seen that?
39 00:03:49.130 ⇒ 00:03:49.760 Mustafa Raja: Nope.
40 00:03:50.510 ⇒ 00:03:52.870 Samuel Roberts: Okay, let me find it…
41 00:03:56.320 ⇒ 00:03:59.490 Casie Aviles: Yeah, I only saw the email that… 100%.
42 00:04:00.720 ⇒ 00:04:07.879 Samuel Roberts: Okay, yeah, that was… mmm… Came after this… where’s the share button?
43 00:04:10.970 ⇒ 00:04:12.759 Samuel Roberts: where are they?
44 00:04:13.030 ⇒ 00:04:18.019 Samuel Roberts: this… Notion page…
45 00:04:21.760 ⇒ 00:04:26.349 Samuel Roberts: So… Just for context, the idea is that
46 00:04:27.500 ⇒ 00:04:31.419 Samuel Roberts: The central docs need to be improved in terms of the actual
47 00:04:32.440 ⇒ 00:04:37.999 Samuel Roberts: documents themselves, first and foremost. And doing that with them.
48 00:04:38.230 ⇒ 00:04:41.499 Samuel Roberts: Amber did a lot of work with the pest one, I believe?
49 00:04:42.570 ⇒ 00:04:44.560 Samuel Roberts: Correct me if I’m wrong on any of this, but…
50 00:04:44.560 ⇒ 00:04:45.040 Casie Aviles: Yes.
51 00:04:45.970 ⇒ 00:04:53.480 Samuel Roberts: I… She said that was just, like, very time-consuming, so the thought was, how do we… leverage
52 00:04:54.020 ⇒ 00:05:03.669 Samuel Roberts: tools to speed that up. And then also, ongoing improvement, so…
53 00:05:05.870 ⇒ 00:05:09.199 Samuel Roberts: just a little walkthrough of this notion. I can share a screen if you guys want.
54 00:05:09.530 ⇒ 00:05:10.090 Mustafa Raja: Yeah.
55 00:05:11.210 ⇒ 00:05:13.509 Samuel Roberts: Let me get that in the right place, sorry.
56 00:05:15.290 ⇒ 00:05:19.480 Samuel Roberts: Share… Okay.
57 00:05:20.040 ⇒ 00:05:24.579 Samuel Roberts: So… This was, yeah, we started this doc the other day.
58 00:05:25.680 ⇒ 00:05:30.570 Samuel Roberts: basically, Utam, like, had us list out, like, how it all… like, the process of it.
59 00:05:32.530 ⇒ 00:05:35.230 Samuel Roberts: and what we’re using AI for today…
60 00:05:36.080 ⇒ 00:05:37.550 Samuel Roberts: And how it was used.
61 00:05:38.420 ⇒ 00:05:40.720 Samuel Roberts: All ready to, like, figure out the structure.
62 00:05:41.550 ⇒ 00:05:44.560 Samuel Roberts: what’s manual? And then we just kind of…
63 00:05:47.740 ⇒ 00:05:53.480 Samuel Roberts: Came up with some ideas of how to move forward with this, and then… No…
64 00:05:54.200 ⇒ 00:05:57.359 Samuel Roberts: If that got overwritten by the other stuff…
65 00:05:58.300 ⇒ 00:06:03.630 Samuel Roberts: But basically, like, we started listing out these sorts of things, and then Amber took that with
66 00:06:05.790 ⇒ 00:06:11.400 Samuel Roberts: GPT-5.2 and Opus, and just… Tried to come up with some more ideas for…
67 00:06:11.680 ⇒ 00:06:14.029 Samuel Roberts: ways to improve the central docs.
68 00:06:15.630 ⇒ 00:06:19.149 Samuel Roberts: And then that’s where I think the email came out of, I don’t know if it’s in here or not, but…
69 00:06:19.710 ⇒ 00:06:21.259 Samuel Roberts: It might have been separate, yeah.
70 00:06:22.140 ⇒ 00:06:26.800 Samuel Roberts: So, the thought for this meeting, I guess, is to…
71 00:06:27.790 ⇒ 00:06:30.809 Samuel Roberts: Think through how we can start doing some of these things.
72 00:06:31.060 ⇒ 00:06:35.930 Samuel Roberts: Probably, you know, things that are…
73 00:06:36.070 ⇒ 00:06:41.520 Samuel Roberts: you know, lowest effort, highest impact ratio. I guess the quick wins, they’re kind of called out here.
74 00:06:42.360 ⇒ 00:06:45.980 Samuel Roberts: So, I guess we can start talking about that.
75 00:06:47.460 ⇒ 00:06:53.309 Samuel Roberts: Things like flagging unanswered questions.
76 00:06:54.630 ⇒ 00:06:57.850 Pranav Narahari: We sort of started this, but we should do this on a more regular…
77 00:06:59.600 ⇒ 00:07:02.620 Samuel Roberts: Regular cadence.
78 00:07:03.960 ⇒ 00:07:09.529 Pranav Narahari: Is the idea of this meeting to kind of rank these and figure out, like, order of operations, which ones we do?
79 00:07:10.580 ⇒ 00:07:15.790 Samuel Roberts: Yeah, I think that’s probably the outcome that we’ll get to here. I don’t think…
80 00:07:16.980 ⇒ 00:07:20.820 Pranav Narahari: I think those two that you just mentioned were the ones I was gonna mention.
81 00:07:20.820 ⇒ 00:07:23.189 Samuel Roberts: Yeah, those are the quick wins, let’s see there.
82 00:07:24.360 ⇒ 00:07:25.040 Samuel Roberts: Okay.
83 00:07:25.990 ⇒ 00:07:27.210 Samuel Roberts: So…
84 00:07:30.580 ⇒ 00:07:36.079 Samuel Roberts: Just trying to think, because some of these bigger ones are also going to be very high impact, or medium ones.
85 00:07:42.920 ⇒ 00:07:43.690 Samuel Roberts: Okay.
86 00:07:45.110 ⇒ 00:07:46.390 Samuel Roberts: Priority score.
87 00:07:50.180 ⇒ 00:07:54.460 Samuel Roberts: I mean, does anyone have any questions about this doc yet, or… I just want to make sure you guys are up to speed.
88 00:07:56.600 ⇒ 00:07:57.120 Casie Aviles: I don’t know.
89 00:07:57.120 ⇒ 00:07:57.830 Pranav Narahari: I’m up to speed.
90 00:07:57.830 ⇒ 00:07:58.430 Casie Aviles: head.
91 00:07:59.450 ⇒ 00:08:02.630 Casie Aviles: Okay. I’m just reading through, for now, this table.
92 00:08:02.630 ⇒ 00:08:03.470 Samuel Roberts: Okay, yeah.
93 00:08:03.910 ⇒ 00:08:08.040 Samuel Roberts: Yeah, okay, just take, like, maybe a few minutes and just look through it a little bit.
94 00:08:08.720 ⇒ 00:08:12.409 Samuel Roberts: If you have any other ideas as well, you guys have been closer to it.
95 00:08:15.950 ⇒ 00:08:21.740 Samuel Roberts: You know, you can… And that… I might…
96 00:08:22.090 ⇒ 00:08:26.209 Samuel Roberts: make a quick cup of coffee, if that’s cool, because I just jumped off another meeting.
97 00:08:26.320 ⇒ 00:08:27.020 Mustafa Raja: Oh, that’s cool.
98 00:08:28.130 ⇒ 00:08:32.479 Samuel Roberts: Okay. So yeah, just take a quick look, and if you guys want to, you know…
99 00:08:32.659 ⇒ 00:08:35.710 Samuel Roberts: Start spitballing a little bit about what we can get going on.
100 00:08:36.090 ⇒ 00:08:40.279 Samuel Roberts: You know, there’s some interesting things here that could be pretty good, like…
101 00:08:40.960 ⇒ 00:08:44.890 Samuel Roberts: Learning the formats and trying to auto-format things.
102 00:08:46.940 ⇒ 00:08:51.749 Samuel Roberts: Yeah, make sure no content goes live, that’s, like, bad, so…
103 00:08:53.090 ⇒ 00:08:58.570 Samuel Roberts: yeah, take a look at it a little bit, we can spitball some more ideas. You know, the end result is that we want
104 00:08:58.840 ⇒ 00:09:03.129 Samuel Roberts: good… human and AI-readable central docs, I guess?
105 00:09:03.330 ⇒ 00:09:08.690 Samuel Roberts: mostly the AI, because eventually we don’t want them going to the central doc, but for now,
106 00:09:09.140 ⇒ 00:09:10.990 Samuel Roberts: How do we get the central doc?
107 00:09:11.590 ⇒ 00:09:18.509 Samuel Roberts: to be a better document for us and them, and kind of help them do that. Yeah.
108 00:09:19.150 ⇒ 00:09:19.870 Samuel Roberts: Okay.
109 00:09:20.420 ⇒ 00:09:23.700 Samuel Roberts: I’m gonna… Be right back.
110 00:09:24.110 ⇒ 00:09:24.690 Mustafa Raja: Okay.
111 00:09:25.530 ⇒ 00:09:26.190 Samuel Roberts: Alright.
112 00:09:27.340 ⇒ 00:09:44.070 Pranav Narahari: Mustafa, Casey, quick question on the central doc and how it’s being used. So, yeah, my understanding before Sam just said that piece was that it was only being used by AI, but does the client also, if they don’t get the answer from the AI, they go into the central doc to look for it?
113 00:09:45.170 ⇒ 00:09:48.100 Casie Aviles: Yeah, they also read through the central doc.
114 00:09:49.130 ⇒ 00:09:50.410 Pranav Narahari: Gotcha. Okay.
115 00:09:51.590 ⇒ 00:09:59.029 Pranav Narahari: It might be worth… having some… I don’t know, if…
116 00:10:00.020 ⇒ 00:10:07.479 Pranav Narahari: Google Docs is the right place to put it, but when they’re going into the central doc, just like
117 00:10:07.970 ⇒ 00:10:10.959 Pranav Narahari: Monitoring where they’re looking.
118 00:10:11.290 ⇒ 00:10:13.980 Pranav Narahari: Could be, like, really insightful for us.
119 00:10:14.160 ⇒ 00:10:15.700 Pranav Narahari: And how they’re getting to the point.
120 00:10:15.810 ⇒ 00:10:22.680 Pranav Narahari: like, actually finding the answer that they’re looking for. I don’t know if there’s, like, a clean solution for that, to be honest. It seems kind of complex.
121 00:10:25.110 ⇒ 00:10:26.080 Casie Aviles: Yeah.
122 00:10:26.080 ⇒ 00:10:26.740 Pranav Narahari: Yeah.
123 00:10:27.710 ⇒ 00:10:29.030 Pranav Narahari: That’s something I’ll think about.
124 00:10:32.810 ⇒ 00:10:34.970 Casie Aviles: Yeah, I think that’s, that’s,
125 00:10:35.840 ⇒ 00:10:43.340 Casie Aviles: it’s not, like, a bad idea. I mean, I just… for, like, the central doc, yeah, you’re right, Google Doc may not be the best, like.
126 00:10:43.480 ⇒ 00:10:45.860 Casie Aviles: We can’t really track that, I think.
127 00:10:46.140 ⇒ 00:10:48.400 Casie Aviles: You know, like, where they’re looking at.
128 00:10:48.920 ⇒ 00:10:51.680 Casie Aviles: Or, like, how they’re using the docs, but…
129 00:10:53.540 ⇒ 00:11:03.020 Casie Aviles: Yeah, ideally, they… the chatbot, Andy would just get that… get everything… From that document, and then…
130 00:11:03.490 ⇒ 00:11:07.600 Casie Aviles: Or, like, answer their questions based on that document.
131 00:11:08.950 ⇒ 00:11:10.750 Pranav Narahari: Yeah, that makes sense. And…
132 00:11:10.880 ⇒ 00:11:20.339 Pranav Narahari: Is it fair to say that, like, the first place that they’ll go to is to Andy, and then if they don’t get the right answer from Andy, then they go to the central doc?
133 00:11:20.580 ⇒ 00:11:23.829 Casie Aviles: Yes, yes, that’s, that’s, that’s pretty much what…
134 00:11:24.590 ⇒ 00:11:28.650 Casie Aviles: should happen, you know, they would ask Andy first before they… because
135 00:11:29.280 ⇒ 00:11:37.939 Casie Aviles: they should be getting, like, quick responses, since they would be using Andy. Like, for example, they’re in the middle of a call.
136 00:11:38.280 ⇒ 00:11:39.939 Casie Aviles: they would chat with Andy.
137 00:11:40.080 ⇒ 00:11:41.730 Casie Aviles: And ideally, they get, like.
138 00:11:41.880 ⇒ 00:11:47.970 Casie Aviles: the answer really quickly, so that should be, like, the value that Andy or the chatbot adds.
139 00:11:48.870 ⇒ 00:11:49.550 Casie Aviles: So…
140 00:11:49.550 ⇒ 00:11:50.150 Pranav Narahari: Yep.
141 00:11:50.450 ⇒ 00:11:56.619 Casie Aviles: If they have to, like, go to the doc, if they don’t get the right answer from Andy, then that adds, like.
142 00:11:57.700 ⇒ 00:12:00.639 Casie Aviles: Time, you know, to look for the answer.
143 00:12:01.440 ⇒ 00:12:02.110 Pranav Narahari: Yeah.
144 00:12:04.660 ⇒ 00:12:05.529 Samuel Roberts: What’d I miss?
145 00:12:09.980 ⇒ 00:12:10.929 Samuel Roberts: How we doing?
146 00:12:12.480 ⇒ 00:12:19.399 Casie Aviles: Oh, we were just talking about, like, how the CSRs… are using the central doc.
147 00:12:21.880 ⇒ 00:12:23.039 Casie Aviles: So I was just… Okay.
148 00:12:23.040 ⇒ 00:12:24.010 Samuel Roberts: Oh, yeah.
149 00:12:24.460 ⇒ 00:12:27.189 Casie Aviles: Yeah, I was just explaining that ideally.
150 00:12:27.790 ⇒ 00:12:31.570 Casie Aviles: the answer… the questions they have would go through Andy first, but…
151 00:12:32.100 ⇒ 00:12:37.560 Casie Aviles: You know, there are times where if they don’t get it, then they have to reference the central doc again.
152 00:12:39.780 ⇒ 00:12:40.470 Samuel Roberts: Yes.
153 00:12:40.830 ⇒ 00:12:52.380 Pranav Narahari: And in all cases, they feel like the central doc has the answer, or there are also cases that we get feedback that Andy didn’t provide the right answer, and the central doc didn’t have the relevant info?
154 00:12:52.920 ⇒ 00:12:59.659 Casie Aviles: Yeah, there are cases like that, so that means they did not update the central doc, or, like, it wasn’t worded
155 00:12:59.940 ⇒ 00:13:04.770 Casie Aviles: Very, you know, in a way that’s helpful to answering their questions.
156 00:13:05.670 ⇒ 00:13:12.710 Pranav Narahari: Yeah, like, so, the central docs just wasn’t worded in a way where they could find it. That could also be a possibility, I guess.
157 00:13:13.360 ⇒ 00:13:20.910 Casie Aviles: Yeah, that’s a factor as well, so I think that’s also part of why we’re trying to optimize how it’s structured, so it should be able to.
158 00:13:20.910 ⇒ 00:13:21.520 Samuel Roberts: Yeah.
159 00:13:22.760 ⇒ 00:13:23.450 Pranav Narahari: Gotcha.
160 00:13:25.690 ⇒ 00:13:26.530 Samuel Roberts: Amber.
161 00:13:30.010 ⇒ 00:13:33.650 Amber Lin: Hello! Where are we at with the discussion?
162 00:13:34.690 ⇒ 00:13:41.039 Samuel Roberts: Yeah, we were just kind of getting them up to speed on the notion, and then I stepped away to get some coffee, because I just jumped off another meeting, and now we’re back.
163 00:13:42.190 ⇒ 00:13:44.800 Amber Lin: Gotcha, okay.
164 00:13:45.280 ⇒ 00:13:55.859 Amber Lin: as you guys keep discussing, I’ll add it to the Gantt, and then we can… we can look at it and see, okay, which one are we doing first, which one has higher priority, so we can assign it there.
165 00:13:56.150 ⇒ 00:14:03.830 Amber Lin: I know there’s also some transcript stuff that we’re working on, so, like, I’ll also add that as well.
166 00:14:09.140 ⇒ 00:14:13.359 Samuel Roberts: Okay, so… I guess let’s start… Excuse me.
167 00:14:16.040 ⇒ 00:14:22.179 Samuel Roberts: The quick wins, so… Unanswered question report.
168 00:14:22.430 ⇒ 00:14:27.799 Samuel Roberts: Seems like a good one. What do you say, the thumbs-down analysis we could also do, because we have that data.
169 00:14:27.910 ⇒ 00:14:30.350 Samuel Roberts: We’ve done that a little bit, but I think we want to…
170 00:14:31.410 ⇒ 00:14:34.499 Mustafa Raja: I think Rel might already have that, I’m not sure, though.
171 00:14:37.200 ⇒ 00:14:41.440 Samuel Roberts: Oh, like, it’ll… yeah, it’ll have the… like…
172 00:14:41.850 ⇒ 00:14:43.709 Samuel Roberts: It’ll have patterns and stuff, though?
173 00:14:44.880 ⇒ 00:14:45.330 Casie Aviles: I don’t.
174 00:14:45.440 ⇒ 00:14:46.080 Mustafa Raja: Ding.
175 00:14:46.240 ⇒ 00:14:48.350 Casie Aviles: So, not yet, but…
176 00:14:48.350 ⇒ 00:14:51.510 Samuel Roberts: Yeah, I think we’re gonna wanna do, like, a categorization kind of thing.
177 00:14:51.710 ⇒ 00:14:53.290 Casie Aviles: Yeah, we haven’t done that.
178 00:14:53.290 ⇒ 00:14:55.530 Samuel Roberts: done manually, but I think the idea is to do it.
179 00:14:56.180 ⇒ 00:14:56.860 Casie Aviles: Ew.
180 00:14:57.330 ⇒ 00:15:09.969 Samuel Roberts: Automated basis, so… Okay, I guess let’s talk about the unanswered questions report, so… Basically, that would be…
181 00:15:10.350 ⇒ 00:15:15.219 Samuel Roberts: I guess it’s similar to the thumbs down, in that if they… Don’t like the answer.
182 00:15:15.540 ⇒ 00:15:20.209 Samuel Roberts: we would look at it, see if it said, sorry, I don’t know that, and then figure out
183 00:15:21.340 ⇒ 00:15:23.239 Samuel Roberts: What type of question it was.
184 00:15:24.260 ⇒ 00:15:28.270 Samuel Roberts: maybe… Where it should have come from.
185 00:15:32.420 ⇒ 00:15:34.920 Samuel Roberts: So that would be something that runs…
186 00:15:37.250 ⇒ 00:15:42.120 Samuel Roberts: On a trigger of something being added to… What, Snowflake right now?
187 00:15:47.850 ⇒ 00:15:49.960 Mustafa Raja: Unanswered question.
188 00:15:50.110 ⇒ 00:15:50.810 Mustafa Raja: Hmm.
189 00:15:52.500 ⇒ 00:15:56.860 Mustafa Raja: Yeah, I’m just thinking, if you want that on feedback, or on every…
190 00:15:58.020 ⇒ 00:15:59.000 Casie Aviles: She goes.
191 00:16:01.350 ⇒ 00:16:03.980 Samuel Roberts: Oh, you think there’s unanswered questions they’re not leaving feedback for?
192 00:16:04.650 ⇒ 00:16:14.790 Mustafa Raja: I mean, I guess for the bad responses, for each bad response, they would, you know, leave feedback, so we might only want to look into that.
193 00:16:14.900 ⇒ 00:16:15.870 Mustafa Raja: Something?
194 00:16:15.870 ⇒ 00:16:16.470 Samuel Roberts: Yeah.
195 00:16:18.130 ⇒ 00:16:20.929 Samuel Roberts: Yeah, okay, so let’s make a plan for that one.
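The unanswered-question report sketched in this exchange — watch the thumbs-down feedback for answers that look like refusals — could start as a simple filter. The field names (`rating`, `answer`) and refusal phrases below are assumptions, not the real schema:

```python
# Sketch of the "unanswered questions report": among thumbs-down feedback,
# flag the rows where the bot's answer reads like a refusal.
REFUSAL_MARKERS = ("i don't know", "i do not have", "sorry, i don't")

def unanswered_report(feedback_rows):
    """Return thumbs-down rows whose bot answer looks like a refusal."""
    flagged = []
    for row in feedback_rows:
        if row.get("rating") != "thumbs_down":
            continue  # per the discussion, start from feedback only
        answer = row.get("answer", "").lower()
        if any(marker in answer for marker in REFUSAL_MARKERS):
            flagged.append(row)
    return flagged

rows = [
    {"question": "What are the hours?", "answer": "Sorry, I don't know that.",
     "rating": "thumbs_down"},
    {"question": "How do I reset?", "answer": "Click Settings > Reset.",
     "rating": "thumbs_down"},
]
report = unanswered_report(rows)
```

A scheduled job over the feedback table (in Snowflake, per the discussion) could emit this report on a regular cadence.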
196 00:16:22.690 ⇒ 00:16:27.499 Samuel Roberts: Let’s make a plan for… Ongoing, thumbs-down analysis.
197 00:16:29.630 ⇒ 00:16:34.670 Samuel Roberts: What other ones do we think are… Important.
198 00:16:35.050 ⇒ 00:16:36.290 Samuel Roberts: most important.
199 00:16:37.050 ⇒ 00:16:43.780 Samuel Roberts: I think… this… is important.
200 00:16:45.340 ⇒ 00:16:48.099 Samuel Roberts: But I don’t know if this is gonna be as helpful until we get…
201 00:16:49.140 ⇒ 00:16:50.070 Mustafa Raja: Are you sharing your screen?
202 00:16:50.070 ⇒ 00:16:57.799 Samuel Roberts: docs a little more straightened out. Oh my goodness, I thought I was, I’m so sorry. I forgot. Yeah, yeah, okay. I was saying the bi-weekly AI doc audit.
203 00:16:58.550 ⇒ 00:17:02.549 Mustafa Raja: Oh yeah, yeah, that’s the, that’s the major idea shared also, right?
204 00:17:03.320 ⇒ 00:17:04.520 Mustafa Raja: with Opus.
205 00:17:06.260 ⇒ 00:17:13.689 Mustafa Raja: Yeah, I think that that’s going to help, because in the long one, they have messed up headings, which messes up the answers.
206 00:17:14.099 ⇒ 00:17:18.440 Mustafa Raja: So, Opus should be pretty good with that.
207 00:17:20.020 ⇒ 00:17:20.650 Samuel Roberts: Okay.
208 00:17:24.900 ⇒ 00:17:27.950 Samuel Roberts: What else? What else do we have highlighted here?
209 00:17:28.040 ⇒ 00:17:33.650 Mustafa Raja: So, I think, so, for bi-weekly, do we… do we want this automated some… somehow?
210 00:17:35.300 ⇒ 00:17:42.800 Samuel Roberts: Yes, yeah, yeah, I think that’s the… yeah, we’re gonna… we’re gonna need to build, like, a few, kind of, agents in the background that are gonna be doing this and surfacing.
211 00:17:43.000 ⇒ 00:17:46.159 Samuel Roberts: Changes or making changes with the approval and stuff.
212 00:17:47.030 ⇒ 00:17:47.929 Mustafa Raja: Like it.
213 00:17:51.190 ⇒ 00:17:53.849 Samuel Roberts: So, I guess we got… how many are here?
214 00:17:54.160 ⇒ 00:17:57.979 Samuel Roberts: 30… Is this a table that I can sort? No, okay.
215 00:17:58.620 ⇒ 00:18:01.199 Samuel Roberts: So what are all the low-effort ones that are…
216 00:18:03.670 ⇒ 00:18:06.519 Mustafa Raja: I think it’s 5, 7, 15, and 25, right?
217 00:18:07.110 ⇒ 00:18:10.749 Samuel Roberts: Yeah, I just wanna see, I just want to make sure, missing info predictor?
218 00:18:14.340 ⇒ 00:18:17.240 Samuel Roberts: Don’t know what that Gaps First Pest Doc template…
219 00:18:21.000 ⇒ 00:18:29.790 Samuel Roberts: Tag metadata generator… We kinda have that, don’t we? But it doesn’t work on all of the…
220 00:18:29.970 ⇒ 00:18:32.320 Mustafa Raja: The metadata one, let me see…
221 00:18:33.330 ⇒ 00:18:34.340 Samuel Roberts: Number 15.
222 00:18:34.340 ⇒ 00:18:36.350 Mustafa Raja: Sections like searching…
223 00:18:43.410 ⇒ 00:18:48.569 Mustafa Raja: This would be different than the metadata in Supabase, right, though?
224 00:18:49.580 ⇒ 00:18:51.550 Samuel Roberts: I… I don’t know. I mean, maybe, I guess we could test.
225 00:18:51.550 ⇒ 00:19:02.040 Mustafa Raja: Yeah, I think, I think the, I think we… this is, this is more like, adding metadata to each, each feedback, or each response, so we can skim through.
226 00:19:02.340 ⇒ 00:19:04.949 Mustafa Raja: Skim through it better, you know?
227 00:19:05.650 ⇒ 00:19:07.170 Mustafa Raja: Maybe, maybe she…
228 00:19:07.170 ⇒ 00:19:08.880 Samuel Roberts: Responses, or the central doc?
229 00:19:09.590 ⇒ 00:19:16.430 Mustafa Raja: responses in general, this isn’t either… this isn’t directly with the embeddings, I feel.
230 00:19:16.720 ⇒ 00:19:17.879 Mustafa Raja: This is not talking about.
231 00:19:17.880 ⇒ 00:19:18.930 Samuel Roberts: the content.
232 00:19:21.090 ⇒ 00:19:40.060 Amber Lin: I think when I generated this, I think what they’re saying is adding tags directly in the SOPs, because they’re, like, we know… they would know the clearest when they were adding content, or where it belongs, and, like, that will help us add metadata in the…
233 00:19:40.290 ⇒ 00:19:44.570 Amber Lin: In the Supabase tags as well.
234 00:19:45.330 ⇒ 00:19:50.020 Casie Aviles: Well, so basically we’re enriching how we query with metadata, right?
235 00:19:50.850 ⇒ 00:19:55.900 Amber Lin: I guess we’re enriching the content with tags, so that.
236 00:19:55.900 ⇒ 00:19:56.350 Casie Aviles: Oh, yeah.
237 00:19:56.350 ⇒ 00:19:58.440 Amber Lin: The data is more accurate.
238 00:19:58.690 ⇒ 00:19:59.839 Casie Aviles: Okay, okay.
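The tag/metadata idea Amber describes — tagging SOP content at the source so the Supabase metadata is richer — could begin with something as small as a keyword-to-tag map run over each chunk. The tag vocabulary here is invented for illustration:

```python
# Sketch of a tag/metadata generator: derive topic tags for each SOP chunk
# so metadata columns (e.g., in Supabase) can be populated alongside it.
# Tags and keywords below are placeholders, not the real taxonomy.
TAG_KEYWORDS = {
    "billing": ["invoice", "payment", "refund"],
    "scheduling": ["appointment", "reschedule", "duration"],
}

def tag_chunk(text: str) -> list:
    """Return the sorted tags whose keywords appear in the chunk text."""
    lowered = text.lower()
    return sorted(tag for tag, words in TAG_KEYWORDS.items()
                  if any(w in lowered for w in words))

tags = tag_chunk("To issue a refund, open the payment record...")
```

As noted in the discussion, the authors adding content would know the right tags best, so a generated tag would ideally be a suggestion they confirm.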
239 00:20:00.800 ⇒ 00:20:02.430 Samuel Roberts: Okay, well, the things I like…
240 00:20:02.660 ⇒ 00:20:07.290 Samuel Roberts: I like the most, I think, are the unanswered questions, so we’ll analyze all the thumbs-downs.
241 00:20:07.600 ⇒ 00:20:11.079 Samuel Roberts: That are unanswered, and figure out if there’s content missing or whatever.
242 00:20:11.660 ⇒ 00:20:12.610 Samuel Roberts: I think…
243 00:20:13.350 ⇒ 00:20:24.619 Samuel Roberts: The thumbs-down analysis is also good, because we’ll… if there are, you know, other wrong answers, we can figure out why on a regular basis, and we have a little bit of an idea of how to do that with the categories we put together, right?
244 00:20:25.850 ⇒ 00:20:28.559 Samuel Roberts: Like, for what we did kind of manually.
245 00:20:29.540 ⇒ 00:20:37.320 Samuel Roberts: I think the… SOP Auto Formatter, number 13, is interesting, too.
246 00:20:40.780 ⇒ 00:20:44.340 Samuel Roberts: I don’t know how that necessarily fits in timeline-wise, like…
247 00:20:46.600 ⇒ 00:20:48.569 Samuel Roberts: How can we get the doc… you know, like.
248 00:20:51.900 ⇒ 00:20:52.500 Samuel Roberts: what changes.
249 00:20:52.500 ⇒ 00:20:53.020 Mustafa Raja: That’s what you think.
250 00:20:53.020 ⇒ 00:20:54.690 Samuel Roberts: Just with that auto-formatter.
251 00:20:54.920 ⇒ 00:20:56.950 Mustafa Raja: What does this mean, though, sorry?
252 00:20:59.760 ⇒ 00:21:01.060 Samuel Roberts: the SOP Auto Formatter?
253 00:21:01.060 ⇒ 00:21:01.840 Mustafa Raja: Yeah.
254 00:21:02.610 ⇒ 00:21:04.240 Samuel Roberts: Yeah, so, like, I think…
255 00:21:05.560 ⇒ 00:21:09.100 Samuel Roberts: Amber, you may be able to speak to this better, but that, like, they don’t have a good…
256 00:21:09.800 ⇒ 00:21:14.230 Samuel Roberts: Consistent way to… List, you know, how to do things.
257 00:21:14.930 ⇒ 00:21:34.670 Amber Lin: Yeah, so it’s like, oh, how do we… their SOPs are kind of formatted all differently, like, their timings, for example, duration information is scattered throughout. There’s, like, some here, some in examples, some in the table, so, like, it… like, the formatting’s not consistent, and that makes their updates
258 00:21:35.180 ⇒ 00:21:41.170 Amber Lin: Even harder to do, because they don’t even know where to put it, so it gets worse and worse throughout time.
259 00:21:42.100 ⇒ 00:21:42.900 Mustafa Raja: Okay.
260 00:21:44.680 ⇒ 00:21:48.350 Samuel Roberts: So the idea is that something that would Read what they have.
261 00:21:48.480 ⇒ 00:21:49.740 Samuel Roberts: And then…
262 00:21:50.360 ⇒ 00:21:50.920 Mustafa Raja: ingested.
263 00:21:51.230 ⇒ 00:21:53.300 Samuel Roberts: Better way to do it?
264 00:21:54.230 ⇒ 00:21:54.880 Mustafa Raja: didn’t get approved.
265 00:21:54.880 ⇒ 00:21:57.950 Samuel Roberts: Approval to be added or replaced.
266 00:21:58.730 ⇒ 00:21:59.629 Mustafa Raja: Okay, okay.
267 00:22:01.110 ⇒ 00:22:02.120 Samuel Roberts: Which one?
268 00:22:02.120 ⇒ 00:22:03.569 Amber Lin: Are we talking about?
269 00:22:03.570 ⇒ 00:22:04.390 Mustafa Raja: 13…
270 00:22:04.390 ⇒ 00:22:05.490 Samuel Roberts: Number 13?
271 00:22:05.900 ⇒ 00:22:07.139 Amber Lin: Number 13.
272 00:22:07.340 ⇒ 00:22:08.940 Amber Lin: I see, okay.
273 00:22:10.240 ⇒ 00:22:15.690 Samuel Roberts: I don’t know how much of the central doc that would… something that would cover, but I feel like that would help a lot in terms of…
274 00:22:17.390 ⇒ 00:22:20.029 Samuel Roberts: SOPs, specifically, which I imagine is a chunk of it.
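The SOP auto-formatter (item 13) amounts to checking each SOP against one canonical layout and surfacing the differences for approval before anything is rewritten. A sketch, assuming a made-up canonical section list:

```python
# Sketch of the SOP auto-formatter's first step: audit each SOP's section
# headings against a single canonical layout and report what is missing or
# out of order. The canonical section names are assumptions.
CANONICAL_SECTIONS = ["Purpose", "Steps", "Duration", "Examples"]

def audit_sop(sections: list) -> dict:
    """Compare an SOP's section headings against the canonical layout."""
    missing = [s for s in CANONICAL_SECTIONS if s not in sections]
    known = [s for s in sections if s in CANONICAL_SECTIONS]
    in_order = known == [s for s in CANONICAL_SECTIONS if s in known]
    return {"missing": missing, "in_order": in_order}

result = audit_sop(["Purpose", "Examples", "Steps"])
```

The actual reformatting (e.g., pulling scattered duration info into one Duration section) would then go through the approval step mentioned above rather than being applied automatically.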
275 00:22:21.800 ⇒ 00:22:27.570 Amber Lin: I just added it to the… the Gantt, so.
276 00:22:27.570 ⇒ 00:22:28.550 Samuel Roberts: Okay, cool.
277 00:22:28.550 ⇒ 00:22:38.430 Amber Lin: let’s just, like, tag their priorities there, and then we can give a rough timeline, because I think this is what we’re going to start as soon as possible, so…
278 00:22:38.430 ⇒ 00:22:38.940 Samuel Roberts: We can find…
279 00:22:38.940 ⇒ 00:22:42.470 Amber Lin: Find out which one’s most important. We can just add them to the timeline.
280 00:22:42.490 ⇒ 00:22:43.110 Mustafa Raja: Yup.
281 00:22:43.110 ⇒ 00:22:46.880 Samuel Roberts: Okay, yeah, I would say… Let’s add.
282 00:22:49.700 ⇒ 00:22:51.969 Amber Lin: So this starts, like, I think this…
281 00:22:51.970 ⇒ 00:22:57.839 Samuel Roberts: Yeah, bi-weekly Audit Doc, duplicate content detector, Contradictory Info Finder, and an ambiguity score calculator.
284 00:22:58.550 ⇒ 00:23:04.140 Samuel Roberts: Should we start with, like, a… should we start in the audit section, or…
285 00:23:04.320 ⇒ 00:23:08.650 Amber Lin: Oh, sorry, let’s… let’s do importance first. Sorry, I’m jumping ahead.
286 00:23:08.650 ⇒ 00:23:09.450 Samuel Roberts: Okay, yeah.
287 00:23:09.450 ⇒ 00:23:12.710 Amber Lin: I don’t know which one to mark. Let’s go one by one.
288 00:23:18.070 ⇒ 00:23:18.980 Amber Lin: Yeah.
289 00:23:19.390 ⇒ 00:23:25.990 Samuel Roberts: I would say… What are our levels? Like, 1, 3, 5, is that what we’re gonna do?
290 00:23:27.340 ⇒ 00:23:34.760 Amber Lin: Yeah, starting from… I’m starting from here.
291 00:23:38.490 ⇒ 00:23:42.040 Samuel Roberts: I would say that’s, like, a… Three, maybe?
292 00:23:43.190 ⇒ 00:23:47.110 Samuel Roberts: I feel like it’s gonna be helpful, but it’ll be more helpful once we get things a little cleaned up, so…
293 00:23:48.060 ⇒ 00:23:50.909 Samuel Roberts: Duplicated content strikes me as a… oh, go ahead.
294 00:23:51.500 ⇒ 00:23:59.429 Mustafa Raja: I think we can merge 2 and 3. What do you think about that? It’s going to be an AI agent combing through it, right?
295 00:24:01.000 ⇒ 00:24:03.100 Samuel Roberts: Contradictory and duplicate, yeah.
296 00:24:05.980 ⇒ 00:24:09.420 Samuel Roberts: Yeah, I would say this one is a high… one of the higher priority ones to me.
297 00:24:09.420 ⇒ 00:24:16.509 Amber Lin: So, should I just say this is, like, an initial? I mean, initial… I was wanting to put an initial audit one, but I feel like…
298 00:24:16.810 ⇒ 00:24:20.569 Amber Lin: this will be our initial audit, so I’m gonna put it as 5.
299 00:24:20.970 ⇒ 00:24:25.910 Samuel Roberts: Yeah, I think if we can build an agent that can go through And…
300 00:24:27.760 ⇒ 00:24:31.920 Samuel Roberts: Find that kind of information, surface it to us, we can…
301 00:24:32.730 ⇒ 00:24:35.930 Samuel Roberts: Cool. Make those changes, we can figure out how we can make the changes, yeah.
302 00:24:36.250 ⇒ 00:24:36.940 Mustafa Raja: Great.
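Merging items 2 and 3 into one agent, as suggested, could start as an embedding-similarity sweep that surfaces near-duplicate chunk pairs for human review; contradiction checking would then be a second pass over those pairs. The vectors below are toy values; real ones would come from whatever embedding model the chatbot already uses:

```python
# Sketch of the merged duplicate/contradiction detector: flag pairs of doc
# chunks whose embeddings are above a cosine-similarity threshold, so a
# reviewer (or a second model pass) can decide duplicate vs. contradiction.
from itertools import combinations
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

def near_duplicates(chunks, threshold=0.9):
    """Return chunk-id pairs whose embeddings are suspiciously similar."""
    return [(i, j) for (i, a), (j, b) in combinations(chunks.items(), 2)
            if cosine(a["embedding"], b["embedding"]) >= threshold]

chunks = {  # toy 3-dimensional embeddings for illustration
    "sop-1": {"embedding": [1.0, 0.0, 0.1]},
    "sop-2": {"embedding": [0.9, 0.0, 0.12]},
    "sop-3": {"embedding": [0.0, 1.0, 0.0]},
}
pairs = near_duplicates(chunks)
```

Surfacing the flagged pairs, rather than auto-editing, matches the approval workflow discussed earlier in the meeting.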
303 00:24:37.360 ⇒ 00:24:45.419 Amber Lin: Alright, so I guess this is a little bit lower? Is this not as urgent? Or… I don’t… because I don’t know how we’re gonna do that.
304 00:24:46.010 ⇒ 00:24:47.580 Mustafa Raja: ambiguity score.
305 00:24:49.620 ⇒ 00:24:53.230 Samuel Roberts: Yeah, that’s a little tough, because it doesn’t have all the context that they might have.
306 00:24:56.160 ⇒ 00:24:57.379 Samuel Roberts: Yeah, that’s fine for now.
307 00:24:58.900 ⇒ 00:24:59.300 Amber Lin: Okay.
308 00:24:59.300 ⇒ 00:25:02.809 Mustafa Raja: I think that’s really dependent on 2 and 3 ones, right?
309 00:25:04.560 ⇒ 00:25:08.639 Mustafa Raja: Those will be the ones that’s causing the ambiguity, and then we can score that.
310 00:25:09.500 ⇒ 00:25:10.220 Samuel Roberts: That’s true.
311 00:25:11.110 ⇒ 00:25:11.680 Amber Lin: Cool.
312 00:25:11.930 ⇒ 00:25:16.680 Amber Lin: And then let’s look at the query analysis and feedback.
313 00:25:17.720 ⇒ 00:25:23.240 Samuel Roberts: I would say… Unanswered questions and thumbs down are high priority.
314 00:25:26.080 ⇒ 00:25:28.650 Casie Aviles: Yeah, take tons, tons, kind of divider.
315 00:25:30.010 ⇒ 00:25:42.919 Samuel Roberts: I’d say query categorization is somewhat related to the thumbs-down analysis, because we’re going to want to do… and unanswered questions. Basically, like, if we build something that can categorize queries, that’ll be helpful for doing other sorts of analysis, right?
316 00:25:42.920 ⇒ 00:25:46.660 Amber Lin: Yeah, I think this will have to do for all questions, because that’s…
317 00:25:46.660 ⇒ 00:25:49.050 Samuel Roberts: That’s what I mean, so that’s probably the first thing, then.
318 00:25:49.320 ⇒ 00:25:50.030 Amber Lin: Yeah, husband.
319 00:25:50.030 ⇒ 00:25:50.560 Samuel Roberts: asking, but…
320 00:25:50.560 ⇒ 00:25:59.719 Amber Lin: So that will be: all questions, thumbs-downs, unanswered.
321 00:26:00.200 ⇒ 00:26:00.880 Amber Lin: Yes.
322 00:26:01.260 ⇒ 00:26:06.600 Amber Lin: So, essentially, all questions. Cool, so I'm gonna put this as…
323 00:26:06.600 ⇒ 00:26:13.369 Mustafa Raja: So for categorization, what categories would we be creating, you know?
324 00:26:13.800 ⇒ 00:26:16.979 Amber Lin: I guess we’ll have to run that through our AI to see what we have.
325 00:26:16.980 ⇒ 00:26:17.420 Samuel Roberts: Yeah, I mean…
326 00:26:17.420 ⇒ 00:26:20.399 Amber Lin: Let me add a task of, like,
327 00:26:20.690 ⇒ 00:26:21.990 Casie Aviles: It’d be fine.
328 00:26:21.990 ⇒ 00:26:24.730 Amber Lin: All questions asked?
329 00:26:25.920 ⇒ 00:26:27.320 Samuel Roberts: I wouldn't have that as a Gantt item.
330 00:26:27.500 ⇒ 00:26:28.870 Samuel Roberts: That's a Linear ticket.
331 00:26:29.750 ⇒ 00:26:30.460 Amber Lin: Okay.
332 00:26:30.660 ⇒ 00:26:34.409 Samuel Roberts: You know what I mean? I don’t think that’s, like, a… like, that’ll be part of one of these things, yeah.
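The query-categorization work discussed above could be sketched roughly as follows. This is a minimal, hypothetical illustration: the category names and keywords are placeholders, since the real category set would come from running the query log through the AI first, as Amber suggests. The key design point is that one categorizer feeds all three analyses (all questions, thumbs-downs, unanswered).

```python
# Hypothetical sketch: bucket logged queries into coarse categories so the
# same buckets can drive all-question, thumbs-down, and unanswered analysis.
# Category names/keywords are placeholders, not the team's actual taxonomy.
from collections import Counter

CATEGORY_KEYWORDS = {
    "billing": ("invoice", "charge", "refund"),
    "how-to": ("how do i", "how to", "steps"),
    "troubleshooting": ("error", "fail", "broken"),
}

def categorize(query: str) -> str:
    """Return the first category whose keywords appear in the query."""
    q = query.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(k in q for k in keywords):
            return category
    return "other"

def category_breakdown(queries):
    """Count queries per category; reusable on any subset of the log
    (all questions, thumbs-down only, unanswered only)."""
    return Counter(categorize(q) for q in queries)
```

In practice the keyword table would be replaced by an LLM classification call, but the breakdown function stays the same regardless of how `categorize` is implemented, which is what makes the categorizer "the first thing" to build.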
333 00:26:34.410 ⇒ 00:26:41.900 Amber Lin: Yeah, sounds good. So I’m gonna put… I guess I’ll put this as slower? Like, these are… these feel like smaller tasks.
334 00:26:42.170 ⇒ 00:26:43.970 Amber Lin: Like, under the overall…
335 00:26:43.970 ⇒ 00:26:46.210 Samuel Roberts: Yeah, I mean, a little bit.
336 00:26:46.210 ⇒ 00:26:46.830 Amber Lin: time.
337 00:26:47.330 ⇒ 00:26:48.510 Amber Lin: Oh, okay.
338 00:26:49.950 ⇒ 00:26:54.889 Amber Lin: And then longest query analysis.
339 00:26:55.380 ⇒ 00:26:57.469 Amber Lin: What’s… what’s that gonna be?
340 00:26:59.210 ⇒ 00:27:02.170 Samuel Roberts: I think that’s even lower right now. I don’t know if that’s…
341 00:27:02.170 ⇒ 00:27:02.760 Amber Lin: I just put it…
342 00:27:02.760 ⇒ 00:27:10.439 Samuel Roberts: Yeah, I feel… slow responses are frustrating, but that's not necessarily a central dog thing either, that's more…
343 00:27:10.440 ⇒ 00:27:11.990 Amber Lin: Yeah, yeah.
344 00:27:12.520 ⇒ 00:27:13.549 Amber Lin: Cool, okay.
345 00:27:13.550 ⇒ 00:27:14.329 Samuel Roberts: I mean, it’s not unrelated.
346 00:27:14.330 ⇒ 00:27:28.529 Amber Lin: Let's look at… yeah, let's look at the RAG and retrieval, then. I feel like this is a later thing. They're important, but this probably comes after the minor small changes we'll do first.
347 00:27:28.530 ⇒ 00:27:32.839 Samuel Roberts: Yeah, they're also, like, high and medium effort, according to this, which I think is probably true.
348 00:27:32.840 ⇒ 00:27:33.770 Amber Lin: I see.
349 00:27:37.710 ⇒ 00:27:40.569 Amber Lin: Cool, so how would we do this?
350 00:27:40.690 ⇒ 00:27:42.710 Amber Lin: Should I move this to the bottom?
351 00:27:45.070 ⇒ 00:27:48.490 Samuel Roberts: What’s the next one? Content formatting is definitely more important. Yeah, workflow maintenance.
352 00:27:48.490 ⇒ 00:27:57.729 Amber Lin: So, move all the rag retrieval stuff to the bottom. Okay, let’s do the content formatting, then.
353 00:27:59.310 ⇒ 00:28:04.319 Samuel Roberts: I think the SOP auto-formatter is… pretty high.
354 00:28:05.690 ⇒ 00:28:12.919 Amber Lin: Yeah, how would we detect… so would this 95, would that detect what we need to format here?
355 00:28:17.590 ⇒ 00:28:24.949 Amber Lin: Or is this just one… each individual one, and we manually put in what we want to format? So how would that…
356 00:28:24.950 ⇒ 00:28:27.170 Samuel Roberts: No, I think it’ll… I think we’d have to…
357 00:28:27.380 ⇒ 00:28:30.720 Samuel Roberts: I mean, so the pest stock is kind of our gold standard, right?
358 00:28:32.580 ⇒ 00:28:34.850 Samuel Roberts: So, if we can extract, like…
359 00:28:35.880 ⇒ 00:28:41.670 Samuel Roberts: from that, a way to tell AI what we like about the SOPs?
360 00:28:43.100 ⇒ 00:28:47.049 Samuel Roberts: In that one, and then run that against the other ones.
361 00:28:47.420 ⇒ 00:28:48.150 Samuel Roberts: So…
362 00:28:49.280 ⇒ 00:28:50.000 Amber Lin: Okay.
363 00:28:50.820 ⇒ 00:28:57.370 Samuel Roberts: And that might be a little bit of human input of, like, you know, this isn’t good, this is good, you know, it can maybe, you know, tune the prompt from that.
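The auto-formatter loop Samuel describes (extract what's good about the gold-standard SOP, turn that into instructions, run those against the other SOPs, and fold human good/bad feedback back into the prompt) could be sketched like this. Everything here is a hypothetical stand-in: the guideline strings are invented examples, and `extract_guidelines` stubs out what would really be an LLM call.

```python
# Hypothetical sketch of the SOP auto-formatter loop:
# 1) derive style guidelines from the gold-standard SOP,
# 2) build a reformatting prompt from those guidelines,
# 3) refine the guideline list from human good/bad feedback.

def extract_guidelines(gold_standard_sop: str) -> list[str]:
    # Placeholder: in practice this would ask the model what it
    # likes about the gold-standard document.
    return [
        "Start with a one-line purpose statement.",
        "Use numbered steps for procedures.",
    ]

def build_prompt(guidelines: list[str], sop_text: str) -> str:
    """Assemble a reformatting prompt to run against any other SOP."""
    rules = "\n".join(f"- {g}" for g in guidelines)
    return (
        "Reformat the SOP below to follow these style rules:\n"
        f"{rules}\n\nSOP:\n{sop_text}"
    )

def refine(guidelines: list[str],
           feedback: list[tuple[str, bool]]) -> list[str]:
    """Append an avoid-rule for each output a reviewer flagged as bad,
    so the next prompt steers away from it."""
    return guidelines + [
        f"Avoid output like: {text!r}" for text, ok in feedback if not ok
    ]
```

The human-in-the-loop step is just the `refine` call: each "this isn't good" judgment becomes another rule in the prompt, which matches the prompt-tuning idea mentioned above without requiring any model retraining.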
364 00:28:58.650 ⇒ 00:28:59.040 Amber Lin: Okay.
365 00:28:59.040 ⇒ 00:29:04.230 Samuel Roberts: I don't know if it's necessarily the same as the contradictory and duplicate detector, but…
366 00:29:06.400 ⇒ 00:29:11.429 Samuel Roberts: I mean, the auto-formatter is probably the end result of doing some more, like, analysis on…
367 00:29:11.830 ⇒ 00:29:13.350 Amber Lin: Yeah.
368 00:29:13.350 ⇒ 00:29:14.200 Samuel Roberts: anyway, so I think…
369 00:29:14.200 ⇒ 00:29:15.349 Amber Lin: These two sets.
370 00:29:15.350 ⇒ 00:29:16.030 Samuel Roberts: sections, or…
371 00:29:16.030 ⇒ 00:29:26.960 Amber Lin: We’ll know, oh, a lot of the stuff points to this section, a lot of that stuff points to that section, then we’ll be able to say, okay, where will we apply these tools?
372 00:29:28.740 ⇒ 00:29:29.510 Amber Lin: Okay.
373 00:29:29.850 ⇒ 00:29:33.909 Amber Lin: So I’ll put them all as…
374 00:29:34.960 ⇒ 00:29:40.270 Amber Lin: Two? Or, I mean, they’re import… important, so how should I prioritize these?
375 00:29:40.270 ⇒ 00:29:45.640 Samuel Roberts: I put the SOP as a high… Okay.
376 00:29:45.790 ⇒ 00:29:46.960 Samuel Roberts: In QA.
377 00:29:46.960 ⇒ 00:29:48.199 Amber Lin: The other ones are…
378 00:29:50.690 ⇒ 00:29:51.310 Samuel Roberts: Is it…