Meeting Title: Miguel <> Hannah - Pool Parts Case Study Date: 2025-07-16 Meeting participants: Hannah Wang, Miguel de Veyra


WEBVTT

1 00:01:10.120 00:01:11.190 Miguel de Veyra: Hey, Hannah!

2 00:01:12.130 00:01:13.430 Hannah Wang: Hi! How’s it going.

3 00:01:17.360 00:01:18.150 Hannah Wang: Oh no!

4 00:01:18.550 00:01:19.960 Miguel de Veyra: Okay, I guess.

5 00:01:20.750 00:01:23.610 Hannah Wang: Oh, is there anything I can do to help.

6 00:01:25.060 00:01:32.000 Miguel de Veyra: No, not really. That’s fine. Yeah. Sorry, meeting about pool parts, right? Sorry.

7 00:01:33.660 00:01:38.080 Hannah Wang: Yeah, no worries. This should.

8 00:01:39.390 00:01:45.416 Hannah Wang: Well, well, I just wanna say, like, if you’re not doing well.

9 00:01:46.280 00:01:49.919 Hannah Wang: it’s okay to like, take some time for yourself. And like.

10 00:01:50.130 00:01:53.640 Hannah Wang: Yeah, I I think your health. And you know your well-being is

11 00:01:53.780 00:01:56.871 Hannah Wang: the most important thing. So

12 00:01:58.020 00:02:00.019 Hannah Wang: yeah, it’s okay to not feel.

13 00:02:00.020 00:02:00.630 Miguel de Veyra: Yeah.

14 00:02:00.630 00:02:01.090 Hannah Wang: Yeah.

15 00:02:01.550 00:02:03.110 Miguel de Veyra: It should be okay.

16 00:02:03.430 00:02:03.750 Hannah Wang: Okay.

17 00:02:03.750 00:02:05.090 Miguel de Veyra: You know a couple of.

18 00:02:07.060 00:02:12.330 Miguel de Veyra: although I’m not sure this is the best time to do this, because we’re actually

19 00:02:12.630 00:02:19.290 Miguel de Veyra: gonna come out with like another feature for them, and I’m proposing something with with them. I think that might be a bit more interesting.

20 00:02:19.470 00:02:21.710 Miguel de Veyra: because, remember, Contextual Demo.

21 00:02:22.810 00:02:24.909 Miguel de Veyra: Or it shows like the.

22 00:02:25.376 00:02:26.310 Hannah Wang: Boxes. Yeah.

23 00:02:26.310 00:02:41.640 Miguel de Veyra: Yeah, cause, basically, what I’m working on next right now is actually like a DIY, like top 5 DIY stuff for them. So, for example, there’s people who want to install something on their pool or repair something.

24 00:02:43.006 00:02:55.770 Miguel de Veyra: Basically they have. I’m thinking, if they could like create like a document, or we could help them create that document. And then it will be something like contextual right? And where we show them the pictures of actually how to do it

25 00:02:56.410 00:02:59.470 Miguel de Veyra: like how it would look like. I think that would be like a lot

26 00:02:59.730 00:03:02.359 Miguel de Veyra: better for you, for your use case.

27 00:03:02.470 00:03:06.580 Miguel de Veyra: because then that’s like a lot different than just ChatGPT, right, where it’s just text.

28 00:03:07.310 00:03:12.579 Hannah Wang: I see. When is that feature like, what’s the timeline roadmap for that.

29 00:03:13.840 00:03:27.509 Miguel de Veyra: Well, we’re doing, like, the DIY itself is, like, actually, I’m done with the bot, I’m just implementing it on the UI. But the contextual one is, cause I’m gonna, I just came up with the idea like a few

30 00:03:27.690 00:03:34.140 Miguel de Veyra: hours ago. And I was like, yeah, if you know, if it’s like a DIY thing, then it’s definitely worth it to have pictures there.

31 00:03:35.300 00:03:41.070 Hannah Wang: I see, so so do you have like an estimate of when that work would be finished.

32 00:03:41.940 00:03:44.820 Miguel de Veyra: Probably for the contextual.

33 00:03:46.087 00:03:47.460 Hannah Wang: For pool parts.

34 00:03:48.360 00:03:57.280 Miguel de Veyra: Without the contextual. I think I can finish it by tomorrow, like I should be able to send stuff cause we’re gonna have 2 bots for them, one for basically, you know.

35 00:03:59.050 00:04:02.149 Miguel de Veyra: installing. And then general pool questions.

36 00:04:02.470 00:04:06.049 Hannah Wang: Okay. But then integrating contextual. When do you think that would be done?

37 00:04:07.300 00:04:12.100 Miguel de Veyra: I’m not sure, because they have to create like a document first, you know.

38 00:04:12.180 00:04:12.660 Miguel de Veyra: Oh, okay.

39 00:04:12.790 00:04:17.980 Miguel de Veyra: yeah. So it’s probably gonna be next week. If that even pushes through. But I think they’d be interested.

40 00:04:18.470 00:04:19.075 Hannah Wang: Okay.

41 00:04:19.680 00:04:20.799 Miguel de Veyra: Build them or ours.

42 00:04:22.019 00:04:28.291 Hannah Wang: Well, I still think it’s worth a shot just to have like case studies. And then, if it’s the case that

43 00:04:28.969 00:04:48.129 Hannah Wang: you integrate contextual, and that works well, like, I can, if it’s okay, I can meet with you again and just go over that demo case study. So I’m not entirely sure about the work that you did for pool parts, or what Tom is talking about. He just said to reach out to you guys for a pool parts

44 00:04:49.027 00:04:51.089 Hannah Wang: case study, so.

45 00:04:51.500 00:04:57.900 Miguel de Veyra: Yeah, cause. Honestly, that’s why I’m not sure that there’s like a really deep case. I’m not sure.

46 00:04:58.160 00:05:05.529 Miguel de Veyra: because, basically, all we did is like a ChatGPT for, you know, chemistry questions.

47 00:05:06.530 00:05:20.379 Miguel de Veyra: So it’s nothing innovative, right? Like, you know, it’s there. It’s just, we basically just rebranded it. And then, you know, we added SerpAPI, which, you know, ChatGPT has that. And then we scraped Facebook and Reddit.

48 00:05:20.690 00:05:21.020 Hannah Wang: Which is.

49 00:05:21.020 00:05:24.890 Miguel de Veyra: You know, ChatGPT can do it. Maybe not Facebook.

50 00:05:25.010 00:05:33.349 Miguel de Veyra: But you know, like, it’s not cutting edge. That’s why I think once we have contextual, it’s a lot, you know, it’s a lot more

51 00:05:35.325 00:05:37.589 Miguel de Veyra: valuable for the use case.

52 00:05:37.860 00:06:07.447 Hannah Wang: I see. Well, how about we just, how about I just, like, quote-unquote interview you right now, and then I’ll ask Tom what he thinks, if it’s worth waiting until we have contextual integrated. And then, if so, like, we just won’t use this case study, and then we’ll use the next generation. Okay, cool. So if you could, like, share your screen with me of, like, whatever demo exists for pool parts, and then I can start asking you questions.

53 00:06:08.390 00:06:10.299 Miguel de Veyra: Yeah, thank, you.

54 00:06:11.130 00:06:13.930 Hannah Wang: Awesome. It’s loading on my side.

55 00:06:14.790 00:06:17.900 Miguel de Veyra: Yeah, this should be. Cause this is all.

56 00:06:18.180 00:06:21.540 Miguel de Veyra: We actually didn’t design this for desktop at all.

57 00:06:22.440 00:06:23.670 Hannah Wang: So just mobile.

58 00:06:23.880 00:06:25.020 Miguel de Veyra: Yeah, yeah.

59 00:06:25.020 00:06:27.210 Hannah Wang: Okay, okay,

60 00:06:28.630 00:06:34.359 Hannah Wang: So the 1st kind of section is the context of why this was built in the 1st place. So

61 00:06:36.610 00:06:48.210 Hannah Wang: I had AI come up with questions for me to ask, but I don’t know how relevant, or I don’t know how to phrase it in a way that makes sense for the current pool parts study. But I guess

62 00:06:50.520 00:06:56.679 Hannah Wang: like, what issue, basically, were you trying to solve with this chatbot?

63 00:06:58.550 00:07:02.300 Miguel de Veyra: Hmm, wait. Let me think.

64 00:07:04.390 00:07:12.320 Miguel de Veyra: yeah. Okay. So basically, what they wanted was, you know, a chatbot that could, because everyone’s having chatbots now, that people could, you know,

65 00:07:12.680 00:07:19.309 Miguel de Veyra: people could use, especially potential clients, or even existing clients, can use to basically ask general,

66 00:07:19.980 00:07:29.819 Miguel de Veyra: you know, pool questions, such as, why is my pool green? What should I do? You know, what should I buy? Where’s, like, the nearest store?

67 00:07:30.720 00:07:42.490 Miguel de Veyra: And then, you know, and then that’s the initial version of it. And then eventually, of course, we’re gonna, as you see, like, over here, like, we’re doing the DIY. And then.

68 00:07:44.320 00:07:57.150 Miguel de Veyra: yeah. And then eventually we’re gonna add their product recommendations. Like, so, for example, if someone asks, hey, you know, my pool pump is broken, and then, hey, we actually have this pool pump, yada yada, you might be interested in it.

69 00:07:58.070 00:08:04.950 Miguel de Veyra: So eventually it’s gonna be like a product recommendation chat tool, but it’s, like, branded for them, and not just ChatGPT.
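The planned product-recommendation behavior described above, mapping a problem like "my pool pump is broken" to a matching catalog item, could start as simple keyword matching. A minimal sketch; the catalog, SKUs, and matching rule are made up for illustration, not the actual Pool Parts code:

```javascript
// Toy matcher for the planned recommendation feature: find catalog
// items whose keywords appear in the user's described problem.
const CATALOG = [
  { sku: "PUMP-15HP", name: "1.5 HP Pool Pump", keywords: ["pump"] },
  { sku: "CHLOR-10", name: "Chlorine Tablets", keywords: ["green", "algae", "chlorine"] },
];

function recommend(problem, catalog = CATALOG) {
  const text = problem.toLowerCase();
  // Keep every item with at least one keyword hit in the problem text.
  return catalog.filter((item) => item.keywords.some((k) => text.includes(k)));
}
```

A real version would presumably rank by relevance or let the agent choose, but the input/output shape would be similar.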

70 00:08:05.450 00:08:15.350 Hannah Wang: So this chatbot, does it exist on the pool parts website anywhere? Or is it like a separate link that customers have to go to, or.

71 00:08:15.350 00:08:15.939 Miguel de Veyra: Do you.

72 00:08:15.940 00:08:19.710 Hannah Wang: Is it internal pool parts team, or is it going out to their customers.

73 00:08:19.860 00:08:24.470 Miguel de Veyra: Clients. This is for clients, I believe, but it’s deployed on their website.

74 00:08:25.000 00:08:32.840 Miguel de Veyra: It’s this one, pptg.ai. I think they’re gonna create, like, a branding or marketing push around it sooner or later.

75 00:08:33.880 00:08:36.919 Hannah Wang: So this is for parts to Go’s clients.

76 00:08:37.169 00:08:38.099 Miguel de Veyra: Yes, yes.

77 00:08:38.100 00:08:45.019 Hannah Wang: Okay, cool. And do you know, like, did you, do you happen to know, I guess, like,

78 00:08:46.731 00:08:57.418 Hannah Wang: why they wanted to create this, and I guess, what were customers doing before that made them want to, like, build this chatbot?

79 00:08:58.785 00:09:03.639 Miguel de Veyra: From what I heard from, I think it was Ben is that, you know, there’s like a lot of

80 00:09:03.750 00:09:15.440 Miguel de Veyra: rubbish inputs, because people don’t usually know where to go online, right? People who own, because, if you own a pool, you’re probably rich and kind of old.

81 00:09:15.560 00:09:31.850 Miguel de Veyra: So they’re asking, you know, there’s Facebook groups, I’ve been into that because I had to scrape it, Reddit threads. And then it’s all, you know, it’s all general answers. Nothing standardized, you know. And every pool, as Ben mentioned, is, like, a lot different. So there’s no one-size-fits-all.

82 00:09:32.300 00:09:42.810 Miguel de Veyra: So that’s why, you know, they created this one. So you could describe what’s happening to your pool, you can upload pictures to it, and then the bot will basically, you know, analyze it,

83 00:09:43.281 00:09:48.359 Miguel de Veyra: give you what needs to be done, or what’s the problem, or stuff like that.

84 00:09:48.620 00:09:52.083 Hannah Wang: Okay? And do you know, if pool parts to go

85 00:09:52.580 00:09:58.560 Hannah Wang: had tried to implement anything before to solve this problem? Or is.

86 00:09:58.560 00:09:59.299 Miguel de Veyra: It’s just like the.

87 00:09:59.300 00:10:00.460 Hannah Wang: 1st iteration.

88 00:10:01.510 00:10:05.010 Miguel de Veyra: As far as I know. No, they don’t have it.

89 00:10:05.740 00:10:06.090 Hannah Wang: Okay.

90 00:10:06.090 00:10:08.589 Miguel de Veyra: We don’t have any prior tries.

91 00:10:08.880 00:10:09.580 Hannah Wang: Okay?

92 00:10:11.112 00:10:14.800 Hannah Wang: Okay, let’s see

93 00:10:19.420 00:10:29.460 Hannah Wang: And do you know if they wanted to build this chatbot to, like,

94 00:10:29.630 00:10:38.920 Hannah Wang: increase sales? I mean, eventually, yeah, it’s gonna recommend their products, right? So is this just like the 1st initiative to get to that next stage?

95 00:10:40.210 00:10:40.890 Hannah Wang: Okay?

96 00:10:42.615 00:10:43.830 Hannah Wang: Okay.

97 00:10:43.830 00:10:45.089 Miguel de Veyra: Phase by phase.

98 00:10:45.510 00:10:48.158 Hannah Wang: Hmm, got it, phase by phase.

99 00:10:49.390 00:11:03.999 Hannah Wang: okay. So I’m gonna ask you questions about how you implemented the solution. So what tools and, like, APIs and stuff did you use? You can just walk me through, like, how you developed this chatbot. Yeah.

100 00:11:04.760 00:11:13.900 Miguel de Veyra: So this front end is built on MERN stack. It’s a lot of code. So I don’t think, yeah, MERN is basically React, Express.

101 00:11:14.810 00:11:18.370 Miguel de Veyra: MERN is, what’s the R,

102 00:11:22.027 00:11:26.410 Miguel de Veyra: it’s Mongo, Express, React, and Node, sorry.

103 00:11:26.410 00:11:26.730 Hannah Wang: Yeah.

104 00:11:26.900 00:11:43.929 Miguel de Veyra: Okay, my bad. And then, basically, that’s, you know, the front end and back end. Which is, you know, and I think everyone knows that by now. I think, though, the magic is here, actually. So if we go, is this the AI agent? Yeah, it’s this one.

105 00:11:44.210 00:11:50.270 Miguel de Veyra: So basically, what happens is, you know, we get the webhook UI over here. This is when you message

106 00:11:50.820 00:11:55.429 Miguel de Veyra: with pool parts. Okay, let me move this here. So when you message something, it goes here.

107 00:11:55.700 00:12:09.869 Miguel de Veyra: From here, we receive basically a session ID, what the input was. And then golden data sheet is basically this huge chunk of data, the scraped ones, the important ones.

108 00:12:10.250 00:12:13.079 Miguel de Veyra: I just put it there for now, and then, basically, the time.

109 00:12:13.540 00:12:15.619 Miguel de Veyra: So it’s always aware, oh,

110 00:12:16.970 00:12:21.210 Miguel de Veyra: aware of the time. And then basically, we have the main agent

111 00:12:21.720 00:12:26.839 Miguel de Veyra: which has a couple of tools right? This is where

112 00:12:27.030 00:12:33.660 Miguel de Veyra: most of the scraped data is. The ones in here are, like, pretty golden

113 00:12:33.880 00:12:35.669 Miguel de Veyra: FAQs. Yeah, there you go. That’s how.

114 00:12:35.670 00:12:36.459 Hannah Wang: You call it.

115 00:12:36.650 00:12:40.219 Miguel de Veyra: And then this is basically the ones we scraped from Reddit and Facebook.

116 00:12:40.660 00:12:44.640 Miguel de Veyra: It’s actually, I think over here, yeah, it’s in Google docs.

117 00:12:45.000 00:12:46.552 Miguel de Veyra: So it’s over here.

118 00:12:47.980 00:12:52.200 Miguel de Veyra: so that’s the tool this agent uses. If there’s like, it’s a pretty complex question.

119 00:12:53.100 00:13:01.590 Miguel de Veyra: And then SerpAPI is what the agent uses to browse the web. So it has that same capability as, you know, ChatGPT.

120 00:13:01.720 00:13:07.609 Miguel de Veyra: So this is useful for, hey, you know, where can I buy pool parts,

121 00:13:08.160 00:13:10.759 Miguel de Veyra: pool parts, pool parts in

122 00:13:11.010 00:13:19.779 Miguel de Veyra: this location, or, for New York, what’s the nearest store to my address? This will then, you know, do a search, and then the agent will know it.
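The routing described above — a main agent that reaches for the scraped knowledge base (golden data sheet, Reddit/Facebook docs) on complex pool questions, and a SerpAPI-style web search for "where can I buy / nearest store" queries — can be sketched as a simple tool picker. The tool names and heuristics below are illustrative, not the actual Pool Parts workflow:

```javascript
// Illustrative tool router for the main agent. The real project wires
// this up in a visual agent builder; this just mirrors the decision
// Miguel describes in prose.
const TOOLS = {
  KNOWLEDGE_BASE: "golden_data_sheet", // scraped Reddit/Facebook + golden FAQs
  WEB_SEARCH: "serpapi_search",        // "where can I buy", nearest store, etc.
  NONE: "answer_directly",
};

function pickTool(userMessage) {
  const msg = userMessage.toLowerCase();
  // Location / shopping intents go to web search, like ChatGPT browsing.
  if (/\b(where|nearest|buy|store|location)\b/.test(msg)) {
    return TOOLS.WEB_SEARCH;
  }
  // Longer troubleshooting questions hit the scraped knowledge base.
  if (msg.split(/\s+/).length > 6 || /\b(why|how|green|broken)\b/.test(msg)) {
    return TOOLS.KNOWLEDGE_BASE;
  }
  return TOOLS.NONE;
}
```

In practice the LLM itself decides which tool to call from the tool descriptions; the point here is only the split between the knowledge-base tool and the search tool.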

123 00:13:20.510 00:13:26.319 Miguel de Veyra: And then, after that, that’s basically it. And then the way we pass images is actually,

124 00:13:27.371 00:13:45.029 Miguel de Veyra: so if we go here, we just, because we process the image, not here. We actually process it on the front, on the back end. Because if we do it here, it’s, like, a lot more complicated, and it’s a lot slower. That’s why we do it there, and then we just pass it here. So this agent, if you see,

125 00:13:46.740 00:13:55.590 Miguel de Veyra: built-in vision, there you go. So basically, we just instructed the agent that, hey, you know, you will receive an analysis of the image that was uploaded. Don’t,

126 00:13:55.910 00:14:08.560 Miguel de Veyra: cause sometimes, initially, like, it’s gonna describe the image, because, you know, it’s not its own. But basically, what this does is, hey, you actually own this image. This was processed by us. You’re part of a bigger application.
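The hand-off described here — the back end runs the vision analysis, and the agent only receives the result as text, framed so it treats the analysis as its own observation rather than narrating "the image shows…" — could look roughly like this (function and framing are hypothetical):

```javascript
// The image is analyzed on the back end; the agent only sees the
// resulting text, framed so it "owns" the analysis instead of
// describing it as someone else's output.
function buildAgentMessage(userText, imageAnalysis) {
  if (!imageAnalysis) return userText; // no image uploaded, pass through
  return [
    userText,
    `[Image analysis, performed upstream by this application: ${imageAnalysis}]`,
  ].join("\n");
}
```

This keeps the agent workflow simple and, as Miguel notes, faster than running vision inside the workflow itself.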

127 00:14:10.440 00:14:13.829 Miguel de Veyra: So yeah, that’s basically the main core of it. And then

128 00:14:14.370 00:14:20.210 Miguel de Veyra: we have an analytics agent. This basically, you know, analyzes what’s happening to it. Sorry.

129 00:14:21.090 00:14:29.719 Miguel de Veyra: Basically, if the output was correct, did it follow the standards, yeah, the formatting and stuff like that,

130 00:14:31.210 00:14:32.730 Miguel de Veyra: basically just grading.

131 00:14:33.360 00:14:34.160 Hannah Wang: Okay.

132 00:14:34.160 00:14:40.300 Miguel de Veyra: And then we store every conversation into S3. We convert it into CSV, and then we store it into a bucket.

133 00:14:40.610 00:14:47.200 Miguel de Veyra: It’s basically our source of truth. And then we basically upload this into Sheets, every conversation.

134 00:14:47.370 00:14:49.220 Miguel de Veyra: I’m not sure if the link is here.

135 00:14:49.540 00:15:00.869 Miguel de Veyra: Yeah, but basically, we upload it here. You know, we stored what the message was, what the reply of the bot was, the timestamp, the session ID, and then the score,
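The logging pipeline described here — each exchange serialized to CSV for the S3 bucket (the source of truth) and mirrored to a sheet with message, reply, timestamp, session ID, and score — might serialize like this. A sketch only; the real column set and escaping are whatever the workflow writes:

```javascript
// Serialize conversation exchanges to CSV, matching the columns Miguel
// lists: message, bot reply, timestamp, session id, grading score.
function toCsv(exchanges) {
  // Quote fields containing commas, quotes, or newlines (RFC 4180 style).
  const esc = (v) =>
    /[",\n]/.test(String(v)) ? `"${String(v).replace(/"/g, '""')}"` : String(v);
  const header = "message,reply,timestamp,session_id,score";
  const rows = exchanges.map((e) =>
    [e.message, e.reply, e.timestamp, e.sessionId, e.score].map(esc).join(",")
  );
  return [header, ...rows].join("\n");
}
```

The resulting string would then be uploaded with an S3 `PutObject` call and appended to the sheet.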

136 00:15:01.420 00:15:06.069 Miguel de Veyra: basically. And then the way this works is basically we’re just

137 00:15:07.132 00:15:09.730 Miguel de Veyra: cleaning doesn’t really matter. Honestly, this.

138 00:15:10.520 00:15:13.710 Miguel de Veyra: They’re basically just cleaners. I put them there just to clean stuff.

139 00:15:13.900 00:15:14.650 Hannah Wang: Okay.

140 00:15:15.470 00:15:18.669 Miguel de Veyra: And then, yeah, that’s basically this one.

141 00:15:21.460 00:15:24.970 Hannah Wang: And then, from Google Sheets, is that somehow,

142 00:15:25.320 00:15:31.109 Hannah Wang: like, connected to the pool parts chatbot, and it displays the message there?

143 00:15:31.110 00:15:38.700 Miguel de Veyra: No, no. Basically, using the session ID, it just, you know, it’s here. It’s just that we don’t have persistent memory for now.

144 00:15:38.700 00:15:39.180 Hannah Wang: I see.

145 00:15:39.180 00:15:43.269 Miguel de Veyra: Once you if you’re chatting with it and you reload, it’s basically gone forever.
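With no persistent memory, the behavior Miguel describes amounts to an in-memory store keyed by session ID: a reload produces a fresh ID, so the old history is simply unreachable. A toy version under that assumption (not the actual code):

```javascript
// Toy in-memory session store: history lives only as long as the
// session id does. A page reload generates a fresh id, so the old
// conversation is, as Miguel puts it, "basically gone forever".
const sessions = new Map();

function remember(sessionId, role, text) {
  if (!sessions.has(sessionId)) sessions.set(sessionId, []);
  sessions.get(sessionId).push({ role, text });
  return sessions.get(sessionId);
}

function history(sessionId) {
  return sessions.get(sessionId) ?? []; // unknown id -> empty history
}
```

The Sheets/S3 log is separate record-keeping, not a memory the bot reads back.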

146 00:15:44.220 00:15:47.849 Hannah Wang: Then why do you? Why do you store it in Google sheets all the you know.

147 00:15:47.850 00:15:50.130 Miguel de Veyra: So we can keep track of the records because we don’t.

148 00:15:50.130 00:15:51.590 Miguel de Veyra: Oh, with right now.

149 00:15:51.590 00:15:51.950 Hannah Wang: Okay.

150 00:15:51.950 00:15:53.649 Miguel de Veyra: Of the conversations. Yeah.

151 00:15:53.650 00:15:54.480 Hannah Wang: I see.

152 00:15:55.570 00:15:56.470 Hannah Wang: Okay.

153 00:15:57.790 00:16:02.880 Miguel de Veyra: And yeah, I think we should also like while we’re at it right.

154 00:16:03.918 00:16:28.889 Miguel de Veyra: I think this one is, like, the, this is what they’re very interested in. So, for example, you message it, you know, I just bought a 1.5 horsepower pump, can you walk me through how to install it myself? It basically follows the same flow. It’s only that the agent has a different prompting. So it has, for example, installation, it has this prompt for, or is it installation, right? So there’s 6 steps.

155 00:16:31.025 00:16:36.829 Miguel de Veyra: Basically, the agent will guide you through it. So, for example, hey? So step 0 is safety prep.

156 00:16:37.020 00:16:41.369 Miguel de Veyra: right? And then you say, Okay, done. And then it’s gonna move you to the next step

157 00:16:41.875 00:16:47.529 Miguel de Veyra: second, third, and so on and so forth. Of course, for this one, for the testing, I just did happy path for now,

158 00:16:48.120 00:16:55.359 Miguel de Veyra: just to see if it’s, you know, following the steps. Which it did, pretty much. So yeah.

159 00:16:55.790 00:17:05.179 Miguel de Veyra: And then even it even says, you know, ensure that the power is off while making disconnections. So it’s also like, very considerate of the user safety.
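The DIY flow demoed above — a fixed sequence of steps starting with safety prep, where the agent advances only when the user confirms the current step is done — is essentially a small state machine. A happy-path sketch; the step titles are illustrative, not the actual 6-step prompt:

```javascript
// Happy-path DIY stepper: the agent walks a fixed list of steps and
// only advances when the user confirms the current one is done.
const PUMP_INSTALL_STEPS = [
  "Step 0: Safety prep - ensure the power is off before disconnecting anything.",
  "Step 1: Remove the old pump.",
  // ...the remaining steps would follow; titles here are illustrative.
];

function createStepper(steps) {
  let i = 0;
  return {
    current: () => steps[i],
    confirmDone: () => {
      if (i < steps.length - 1) i += 1; // advance on "okay, done"
      return steps[i];
    },
    finished: () => i === steps.length - 1,
  };
}
```

In the real bot the sequencing lives in the agent's prompt rather than in code, which is also why only the happy path was tested so far.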

160 00:17:06.420 00:17:11.990 Hannah Wang: So this is like another chatbot that you’re building out for them, basically? Or like a tutorial-based.

161 00:17:12.579 00:17:18.219 Miguel de Veyra: Yes, yes, this is also for the users. Actually, we were thinking initially of, you know, if we should

162 00:17:18.539 00:17:26.509 Miguel de Veyra: consolidate the 2. But then I recommended to them to just keep them separated for now. It’s a lot easier than to

163 00:17:27.309 00:17:28.879 Miguel de Veyra: merge the 2 and then, you know,

164 00:17:29.589 00:17:35.889 Miguel de Veyra: break something that’s already working. So, basically, cause they’re basically 2 different use cases,

165 00:17:36.321 00:17:39.849 Miguel de Veyra: right? So might as well just separate them for now, and then just

166 00:17:41.122 00:17:43.699 Miguel de Veyra: build from there, and then connect them together.

167 00:17:43.700 00:17:44.690 Hannah Wang: I see?

168 00:17:45.980 00:17:51.160 Hannah Wang: And is this one deployed yet? Or did you give it to pool parts yet? Or is this still a working.

169 00:17:51.160 00:17:54.390 Miguel de Veyra: No, no, this was started yesterday.

170 00:17:54.390 00:17:56.919 Hannah Wang: Oh, okay, got it?

171 00:17:57.940 00:18:03.039 Hannah Wang: Oh, by the way, how long did the chat, the other chatbot, take you to build,

172 00:18:03.400 00:18:05.650 Hannah Wang: or like? How, how long was the project?

173 00:18:07.380 00:18:09.820 Miguel de Veyra: It was, I think, almost

174 00:18:10.920 00:18:20.780 Miguel de Veyra: the chatbot itself, like, took a day or 2, maybe 2 days. It’s the data, the scraping, and, you know, everything, building the UI, that took a lot of time.

175 00:18:21.440 00:18:23.749 Hannah Wang: Do you know approximately how long?

176 00:18:25.580 00:18:32.139 Miguel de Veyra: Probably 2 weeks, all in all, cause we also had to set, cause it has, like, a lot of parts.

177 00:18:34.350 00:18:36.170 Miguel de Veyra: The main agent, of course.

178 00:18:36.772 00:18:53.400 Miguel de Veyra: This one was basically, we had to scrape Reddit and Facebook, we had to run a script, so that’s a different code. And then the image analysis for the user-uploaded images is also a different script that runs on the back end. And then, of course, the UI is also a different script.

179 00:18:54.980 00:19:00.990 Miguel de Veyra: and then the analytics is also a different agent. The upload to S3, this took a bit of time to do.

180 00:19:01.280 00:19:13.659 Miguel de Veyra: I know it looks simple now, but it was, you know, kind of hard to organize that. And then, yeah, this one was basically the logging stuff. This was the easiest part, I would say,

181 00:19:14.040 00:19:18.819 Miguel de Veyra: but I think what took the most time is this one, the scraping.

182 00:19:18.820 00:19:20.020 Hannah Wang: Scraping. I see.

183 00:19:20.020 00:19:29.699 Miguel de Veyra: Yeah. And then we also had this bug where someone messaged us that, you know, hey, your API keys are leaking. So we had to move everything to the back end, and.

184 00:19:29.700 00:19:30.080 Hannah Wang: Yeah.

185 00:19:30.080 00:19:31.550 Miguel de Veyra: Added time. Of course.

186 00:19:31.550 00:19:32.260 Hannah Wang: Right.

187 00:19:32.430 00:19:37.591 Miguel de Veyra: So we had to clean that, because, basically, if they have access to our OpenAI keys, we’re kind of.

188 00:19:37.850 00:19:43.690 Hannah Wang: Yes, yeah, always put those in the back end somehow.
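The bug discussed here — OpenAI keys visible in the front end — is typically fixed exactly as described: move everything behind the backend so the browser calls your own endpoint and only the server ever holds the key. A minimal sketch; the env var name and model are assumptions, not the actual project configuration:

```javascript
// Keep the OpenAI key server-side: the browser talks to our backend,
// and the backend attaches the secret when calling the upstream API.
// Nothing secret is ever shipped in client-side JavaScript.
function buildUpstreamRequest(userMessage, env = process.env) {
  const key = env.OPENAI_API_KEY; // lives only on the server
  if (!key) throw new Error("OPENAI_API_KEY not set on the server");
  return {
    url: "https://api.openai.com/v1/chat/completions",
    headers: { Authorization: `Bearer ${key}` }, // never sent to the browser
    body: JSON.stringify({
      model: "gpt-4o-mini", // illustrative model name
      messages: [{ role: "user", content: userMessage }],
    }),
  };
}
```

The front end then posts only the user message to the backend route; a leaked bundle reveals nothing.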

189 00:19:45.140 00:19:51.329 Hannah Wang: okay, so, do you know if this is currently being used by their customers right now?

190 00:19:52.040 00:19:55.149 Miguel de Veyra: I think they honestly, I don’t think so.

191 00:19:55.320 00:19:58.200 Miguel de Veyra: They are testing it internally, like here.

192 00:19:58.370 00:19:59.040 Hannah Wang: Hmm.

193 00:19:59.530 00:20:05.740 Miguel de Veyra: So they are testing it. But I think what they want to do is they want to deploy this once the DIY is done, and then.

194 00:20:05.740 00:20:06.270 Hannah Wang: Nice to do.

195 00:20:06.270 00:20:07.820 Miguel de Veyra: Product recommendations.

196 00:20:07.970 00:20:11.960 Miguel de Veyra: I think they’re starting to deploy it with, like, their close friends and family.

197 00:20:11.960 00:20:12.580 Hannah Wang: Yeah.

198 00:20:13.110 00:20:16.439 Miguel de Veyra: Like, but I don’t think it’s publicly deployed yet.

199 00:20:16.810 00:20:18.070 Hannah Wang: So it’s just internal.

200 00:20:18.360 00:20:28.610 Miguel de Veyra: Yeah, even if you look at their mobile website, the mobile one, they don’t want to deploy this, pretty sure, because, you know, as you can see, like, it’s very basic.

201 00:20:28.610 00:20:29.660 Hannah Wang: Yeah, got it?

202 00:20:29.660 00:20:39.469 Miguel de Veyra: They actually wanna redesign this. But I’m not sure, because it’s gonna come from their designers, I think, not ours. So I’m not aware if they’ve had any progress there.

203 00:20:39.770 00:20:42.735 Hannah Wang: Okay? And do you know, if

204 00:20:43.430 00:20:50.269 Hannah Wang: like, within the internal testing, if people gave any feedback about the chatbot.

205 00:20:51.825 00:20:59.880 Miguel de Veyra: There were some bug reports about, you know, scrolling and stuff like that initially, but so far they’ve been pretty happy with it.

206 00:21:00.320 00:21:12.320 Hannah Wang: Okay. Do you know if it’s possible to get me, like, some quotes, or just, like, copy-paste some messages for me on, like, their feedback and what they said?

207 00:21:12.720 00:21:18.709 Miguel de Veyra: Yeah, yeah, definitely, I think, what’s his name? Dan? Was it, Dan?

208 00:21:22.090 00:21:26.679 Miguel de Veyra: I’ll check my email. But I’m pretty sure it’s like over here.

209 00:21:26.680 00:21:27.400 Hannah Wang: Okay.

210 00:21:28.520 00:21:33.670 Miguel de Veyra: Yeah, it’s like, there you go. They have a lot of feedback, so I’ll probably send this to you.

211 00:21:33.890 00:21:49.799 Hannah Wang: Okay, can you, yeah, forward that to me? Just so I can look at it. And then also, if I wanna add, like, screenshots, just to know, like, what I can message the chatbot, and stuff like that. That screenshot is really good, whatever Tom sent.

212 00:21:51.560 00:21:51.950 Miguel de Veyra: This one.

213 00:21:51.950 00:21:53.710 Hannah Wang: Forward me that email, yeah.

214 00:21:53.710 00:21:57.370 Miguel de Veyra: Okay? Do I? Just how do you forward this?

215 00:21:57.964 00:21:59.779 Miguel de Veyra: Is it this one.

216 00:22:00.371 00:22:05.539 Hannah Wang: All the way at the bottom all the way. At the bottom there’s 3 buttons on the left

217 00:22:05.980 00:22:09.779 Hannah Wang: all the way. Yeah. So on the right. It go down a little bit.

218 00:22:09.780 00:22:10.460 Miguel de Veyra: Here.

219 00:22:11.100 00:22:12.870 Hannah Wang: No! Where is my pen?

220 00:22:12.870 00:22:13.739 Miguel de Veyra: Hey! This one!

221 00:22:13.740 00:22:14.200 Hannah Wang: Yes.

222 00:22:14.200 00:22:16.030 Miguel de Veyra: I don’t think so. Is it this one.

223 00:22:16.580 00:22:17.760 Hannah Wang: I thought so.

224 00:22:21.380 00:22:22.739 Miguel de Veyra: No, it’s not this one.

225 00:22:24.120 00:22:24.740 Hannah Wang: Oh!

226 00:22:27.830 00:22:32.009 Miguel de Veyra: Yeah, maybe because I’m just CC’d here also.

227 00:22:32.500 00:22:33.730 Hannah Wang: Oh!

228 00:22:33.990 00:22:35.440 Miguel de Veyra: I’m not sure I’ll ask Uta.

229 00:22:35.650 00:22:37.969 Hannah Wang: Okay. Okay. No. Worries.

230 00:22:37.970 00:22:39.459 Miguel de Veyra: Or is it here?

231 00:22:40.280 00:22:41.860 Miguel de Veyra: Forward! More.

232 00:22:41.860 00:22:45.220 Hannah Wang: Forward all. I don’t know what that does all the way at the bottom.

233 00:22:45.620 00:22:46.380 Miguel de Veyra: This one.

234 00:22:46.690 00:22:49.600 Miguel de Veyra: Oh, forward! Oh, there you go! I think it’s this one. Then.

235 00:22:49.630 00:22:51.010 Miguel de Veyra: Oh, okay.

236 00:22:52.890 00:22:54.859 Hannah Wang: Hannah dot wang yeah.

237 00:22:55.030 00:22:56.009 Miguel de Veyra: There you go!

238 00:22:57.000 00:22:57.580 Hannah Wang: Cool.

239 00:22:57.580 00:22:59.990 Miguel de Veyra: Will they know I forwarded it? I think that should be fine, though.

240 00:22:59.990 00:23:03.109 Hannah Wang: I don’t think so. Yeah, it’s just between you and me.

241 00:23:05.010 00:23:06.820 Hannah Wang: Oh, it took they did it.

242 00:23:07.560 00:23:09.920 Hannah Wang: Oh, it’s just to me, though, like the forward and.

243 00:23:09.920 00:23:10.870 Miguel de Veyra: Oh, okay.

244 00:23:10.870 00:23:12.829 Hannah Wang: Only to me if you click on.

245 00:23:13.430 00:23:17.180 Hannah Wang: Yeah, no worries. I know email is always so scary.

246 00:23:17.790 00:23:20.940 Hannah Wang: Once you send it, you can’t take it back. So.

247 00:23:20.940 00:23:24.600 Miguel de Veyra: Especially this is, this is CEO and CEO.

248 00:23:24.600 00:23:27.910 Hannah Wang: Yeah, yeah, it’s it’s terrifying. I understand.

249 00:23:29.090 00:23:32.140 Hannah Wang: Okay, cool. Let’s see,

250 00:23:36.430 00:23:40.750 Miguel de Veyra: And if you wanna, the site is, it’s just this: pptg.ai.

251 00:23:40.750 00:23:46.279 Hannah Wang: Okay, yeah, I think I’ll remember. But you can message it to me right now in Slack or Zoom.

252 00:23:49.720 00:23:51.360 Hannah Wang: let’s see.

253 00:23:53.060 00:24:09.500 Hannah Wang: And then, I probably already asked you this, but just to make sure the AI transcript picks up on it: this chatbot, the goal is to roll it out to their entire customer base on their website, right? Like, it’s gonna be on their website.

254 00:24:10.260 00:24:12.900 Miguel de Veyra: Yes, I think so. Yes.

255 00:24:12.900 00:24:13.610 Hannah Wang: Okay.

256 00:24:13.800 00:24:19.199 Miguel de Veyra: But it’s basically their ChatGPT. It’s basically Pool Parts GPT, is what they want.

257 00:24:19.870 00:24:26.429 Hannah Wang: Is it gonna be like a little chat bot on their pptg website? Or is it gonna be that separate domain.

258 00:24:26.430 00:24:28.279 Miguel de Veyra: Probably a separate app. Yeah.

259 00:24:28.430 00:24:30.960 Hannah Wang: Oh, separate! Oh, it’s gonna be an application.

260 00:24:31.640 00:24:33.019 Miguel de Veyra: It’s a web app, sorry, web app, separate.

261 00:24:33.020 00:24:34.900 Hannah Wang: Well, that yeah. Separate domain.

262 00:24:35.210 00:24:36.040 Hannah Wang: Okay, cool.

263 00:24:36.770 00:24:50.449 Hannah Wang: Okay. I think this is good for now. And then, basically, what I’m gonna do is I’m gonna format this conversation into the format we use for case studies, and then I’ll send it to

264 00:24:50.975 00:25:02.840 Hannah Wang: you and Utam to review if everything’s correct. And then I think Utam can help fill in whatever is incorrect or missing from what I typed out. So

265 00:25:03.070 00:25:04.490 Hannah Wang: I think this is good.

266 00:25:05.030 00:25:06.060 Miguel de Veyra: Okay. Yeah.

267 00:25:08.910 00:25:13.739 Hannah Wang: Wow! There’s like so many metrics that’s crazy. I don’t even know what those mean.

268 00:25:15.000 00:25:17.035 Miguel de Veyra: Most of it is, I don’t know either.

269 00:25:19.020 00:25:22.940 Miguel de Veyra: Okay, yeah. I think that’s yeah.

270 00:25:23.100 00:25:32.669 Hannah Wang: Okay, cool. Yeah. And keep me updated on the contextual one. I think that’d be pretty cool, and that’s more visually appealing.

271 00:25:32.990 00:25:34.059 Miguel de Veyra: Yeah. Yeah. Yeah.

272 00:25:34.060 00:25:34.839 Miguel de Veyra: Oh, yeah.

273 00:25:35.040 00:25:35.730 Hannah Wang: Alright!

274 00:25:35.730 00:25:37.489 Miguel de Veyra: I proposed on the ticket, not sure.

275 00:25:37.490 00:25:38.200 Hannah Wang: Okay.

276 00:25:38.620 00:25:42.749 Miguel de Veyra: Yeah, there’s yeah, nothing else. Nothing. Okay.

277 00:25:43.250 00:25:44.850 Hannah Wang: Alright! Thank you, Miguel.

278 00:25:44.850 00:25:46.590 Miguel de Veyra: Okay, thanks, Hannah. Have a good day. Bye-bye.

279 00:25:46.590 00:25:47.000 Hannah Wang: Bye.