Meeting Title: ABC Home Demo Runthrough Date: 2024-12-10 Meeting participants: Uttam Kumaran, Miguel De Veyra, Casie Aviles, Scott_Harmon
WEBVTT
1 00:01:09.460 ⇒ 00:01:10.300 Uttam Kumaran: Here we go!
2 00:01:12.350 ⇒ 00:01:13.270 Miguel de Veyra: Hey, Tim?
3 00:01:16.750 ⇒ 00:01:17.530 Miguel de Veyra: We had.
4 00:01:17.530 ⇒ 00:01:18.359 Uttam Kumaran: You’re like.
5 00:01:19.590 ⇒ 00:01:20.380 Miguel de Veyra: Sorry, what.
6 00:01:20.700 ⇒ 00:01:24.209 Uttam Kumaran: I say I joined early, because I know you’re you got on early.
7 00:01:24.410 ⇒ 00:01:26.969 Miguel de Veyra: Yeah, yeah, I always join early.
8 00:01:32.380 ⇒ 00:01:33.370 Uttam Kumaran: Hey! Scott!
9 00:01:34.870 ⇒ 00:01:36.009 Scott_Harmon: Hey? How you doing, Buddy?
10 00:01:36.390 ⇒ 00:01:37.619 Uttam Kumaran: Hey? Good! How are you?
11 00:01:39.550 ⇒ 00:01:40.639 Scott_Harmon: Not too bad.
12 00:01:42.770 ⇒ 00:01:44.610 Scott_Harmon: How’s life over in East Austin?
13 00:01:46.360 ⇒ 00:01:50.180 Uttam Kumaran: It’s good, but it’s cold, this weekend insane.
14 00:01:50.960 ⇒ 00:01:53.119 Scott_Harmon: Yeah, yeah. It was indeed, indeed.
15 00:01:53.120 ⇒ 00:01:54.649 Uttam Kumaran: Bell on the east coast.
16 00:01:55.700 ⇒ 00:02:01.450 Scott_Harmon: I think I forgot to tell you, I think, was it 2 weeks ago? Maybe it was like the week before Thanksgiving.
17 00:02:01.490 ⇒ 00:02:06.700 Scott_Harmon: I think you and I had talked on the phone, you know, briefly or something, and
18 00:02:07.060 ⇒ 00:02:14.449 Scott_Harmon: you know how you butt dial someone when your phone’s in your pocket, you butt dial me. It must have been from a club, because it was like 1230 at night.
19 00:02:14.680 ⇒ 00:02:15.479 Uttam Kumaran: It’s coffee.
20 00:02:15.480 ⇒ 00:02:19.999 Scott_Harmon: I get this call from Utah. I was asleep. Oh, Utah’s calling me, you know. I grab it.
21 00:02:20.010 ⇒ 00:02:23.846 Scott_Harmon: and I can hear some pretty cool music playing. So I figured, oh, okay.
22 00:02:24.130 ⇒ 00:02:26.667 Uttam Kumaran: It was definitely a butt dial.
23 00:02:27.090 ⇒ 00:02:31.460 Scott_Harmon: Yeah, I didn’t. I didn’t think you were calling me to like, you know, loop me into the club for the club or something.
24 00:02:35.410 ⇒ 00:02:37.308 Scott_Harmon: I got a kick out of it
25 00:02:39.270 ⇒ 00:02:42.570 Scott_Harmon: thankfully. I didn’t hear you say anything embarrassing. You don’t have to worry.
26 00:02:43.400 ⇒ 00:02:44.100 Scott_Harmon: You’re like, if you.
27 00:02:44.100 ⇒ 00:02:44.530 Uttam Kumaran: Yeah. No.
28 00:02:44.530 ⇒ 00:02:49.580 Scott_Harmon: When you’re out in the bars after 12. You don’t really want you don’t. Don’t really want yourself.
29 00:02:50.090 ⇒ 00:02:57.430 Uttam Kumaran: No, no, we’re probably we’re probably just yeah. And we’re probably there’s this couple of these East Austin bars that very good music.
30 00:02:57.980 ⇒ 00:03:03.679 Uttam Kumaran: But we were probably just dancing or something. Yeah, thankfully. No, no company secrets.
31 00:03:05.172 ⇒ 00:03:08.180 Uttam Kumaran: No, no IP secrets.
32 00:03:08.410 ⇒ 00:03:10.658 Scott_Harmon: No, you’re very good. You’re very good.
33 00:03:13.760 ⇒ 00:03:14.350 Scott_Harmon: Hey, Miguel.
34 00:03:14.350 ⇒ 00:03:14.920 Uttam Kumaran: Well, we.
35 00:03:15.650 ⇒ 00:03:16.460 Miguel de Veyra: Hey! There!
36 00:03:17.350 ⇒ 00:03:23.979 Uttam Kumaran: And Casey’s on as well. Casey’s also, I think you you may have met Casey. He’s on the AI team and he works. I think
37 00:03:24.050 ⇒ 00:03:26.950 Uttam Kumaran: Miguel Casey worked on the ABC. Stuff right?
38 00:03:27.850 ⇒ 00:03:29.320 Uttam Kumaran: Or what did you handle that.
39 00:03:30.065 ⇒ 00:03:31.570 Miguel de Veyra: It’s kind of both both of us.
40 00:03:31.570 ⇒ 00:03:32.310 Uttam Kumaran: Okay.
41 00:03:32.550 ⇒ 00:03:37.045 Uttam Kumaran: yeah. So I I send it to Scott and basically also sent Scott the knowledge base that we worked on
42 00:03:37.818 ⇒ 00:03:48.811 Uttam Kumaran: but basically for this meeting. We just wanted to walk through like what we wanted to run through for the demo basically also have, like a couple of examples just written down that we know work super well. So we’re not
43 00:03:49.390 ⇒ 00:03:50.990 Uttam Kumaran: sort of going off the cuff
44 00:03:52.900 ⇒ 00:03:56.910 Uttam Kumaran: and then we can just kind of send it out, write a document out with everything
45 00:03:57.040 ⇒ 00:04:03.859 Uttam Kumaran: like that. And then, if there’s any sort of advanced use cases we want to share. We have a few days before the demo that we can try to pull something together.
46 00:04:04.870 ⇒ 00:04:07.790 Scott_Harmon: Okay, perfect. Yeah, that sounds exactly right. Like.
47 00:04:07.970 ⇒ 00:04:12.590 Scott_Harmon: yeah, I think that’s great. Because sometimes when we do, you do the demos.
48 00:04:13.910 ⇒ 00:04:19.739 Scott_Harmon: what you say is you? You go? You ask the client to like oh, what kind of question would you like to ask? And they don’t.
49 00:04:19.829 ⇒ 00:04:21.389 Miguel de Veyra: They don’t think very.
50 00:04:21.459 ⇒ 00:04:24.589 Scott_Harmon: Creatively on their feet. So it’s just better to actually
51 00:04:25.000 ⇒ 00:04:30.020 Scott_Harmon: have a little bit more of a can dialogue ready to show.
52 00:04:30.190 ⇒ 00:04:34.580 Scott_Harmon: And and if we want, we can actually email Steve.
53 00:04:34.980 ⇒ 00:04:39.719 Scott_Harmon: to get some questions if you want, you know. So like before the meeting. So we’ve already.
54 00:04:40.740 ⇒ 00:04:43.229 Scott_Harmon: you know, we’ve already got the questions ready.
55 00:04:43.742 ⇒ 00:04:49.269 Scott_Harmon: But let’s let’s let’s see what you got, and then, you know, we can go from there.
56 00:04:50.150 ⇒ 00:04:51.666 Uttam Kumaran: Yeah, Casey, do you wanna
57 00:04:51.970 ⇒ 00:04:52.780 Miguel de Veyra: Area.
58 00:04:53.330 ⇒ 00:04:54.470 Uttam Kumaran: Yeah, do you want to run it?
59 00:04:54.700 ⇒ 00:05:00.861 Casie Aviles: Yeah, sure, yeah. So here’s the
60 00:05:01.420 ⇒ 00:05:03.849 Casie Aviles: the interface of the agent. And
61 00:05:05.044 ⇒ 00:05:14.089 Casie Aviles: yeah, we could. Basically, we did some a few tests earlier. And so some of the questions that we could ask are like, so we just prepared like generic questions. For now.
62 00:05:14.830 ⇒ 00:05:17.920 Casie Aviles: So we have. We can try something like this.
63 00:05:21.160 ⇒ 00:05:27.530 Scott_Harmon: And the knowledge source you’re using is the Sop website. Luton.
64 00:05:28.050 ⇒ 00:05:28.830 Uttam Kumaran: Yes.
65 00:05:30.345 ⇒ 00:05:30.610 Casie Aviles: Yes.
66 00:05:30.610 ⇒ 00:05:31.710 Miguel de Veyra: This one, yeah.
67 00:05:32.040 ⇒ 00:05:34.859 Scott_Harmon: So this one, yeah, I I ended up like I said.
68 00:05:35.170 ⇒ 00:05:41.039 Scott_Harmon: try. I pulled it myself and dropped it in the notebook. Lm is kind of hard to get. But yeah, okay, so I think
69 00:05:41.090 ⇒ 00:05:43.669 Scott_Harmon: we’re adding this, you were using the same source. Okay.
70 00:05:44.193 ⇒ 00:05:52.820 Miguel de Veyra: It kind of takes a while, because, you know, there’s a lot of Pdf files there, and we’re just using the in the built in. We didn’t really put it on any 10 yet.
71 00:05:53.010 ⇒ 00:05:53.340 Uttam Kumaran: Yeah.
72 00:05:53.340 ⇒ 00:05:59.070 Scott_Harmon: You didn’t you? Didn’t. You didn’t use the images, did you? There’s a there’s 1 section on there that’s actual that
73 00:05:59.230 ⇒ 00:06:01.380 Scott_Harmon: the images of the past. I
74 00:06:02.900 ⇒ 00:06:03.300 Casie Aviles: Oh, yeah.
75 00:06:03.300 ⇒ 00:06:08.380 Scott_Harmon: You go back there the llm like if you go down like go
76 00:06:10.580 ⇒ 00:06:14.810 Scott_Harmon: along the top. See? Click on. Nope, go back.
77 00:06:15.285 ⇒ 00:06:17.660 Miguel de Veyra: Customer services, bro services, protocols.
78 00:06:18.470 ⇒ 00:06:23.279 Scott_Harmon: Yeah, under under protocols and pest images click on that button on the top.
79 00:06:23.400 ⇒ 00:06:24.230 Scott_Harmon: Nav.
80 00:06:25.200 ⇒ 00:06:25.669 Casie Aviles: Test
81 00:06:26.720 ⇒ 00:06:29.059 Scott_Harmon: At the top, very top nav up at the top of the.
82 00:06:29.060 ⇒ 00:06:30.390 Miguel de Veyra: Not bar-off bar.
83 00:06:30.750 ⇒ 00:06:34.459 Scott_Harmon: No, yeah. Over to the left protocols and Nope, Nope, over to the left.
84 00:06:35.230 ⇒ 00:06:37.389 Miguel de Veyra: Protocols and past images.
85 00:06:37.770 ⇒ 00:06:40.820 Scott_Harmon: Yeah, I I actually played around with like, there’s a
86 00:06:41.100 ⇒ 00:06:44.590 Scott_Harmon: there’s a pest. Images click on pest images like, I tried to
87 00:06:45.250 ⇒ 00:06:49.900 Scott_Harmon: import that. I think I thought it would kind of be cool that that would be for later.
88 00:06:50.220 ⇒ 00:06:54.280 Scott_Harmon: But the Llm I was using, or the I was using Google didn’t really
89 00:06:54.950 ⇒ 00:06:58.629 Scott_Harmon: absorb those. Very well. I thought you could eventually even add, like a
90 00:06:59.430 ⇒ 00:07:04.079 Scott_Harmon: Oh, what is this kind of a thing to your AI to recognize a pest? But
91 00:07:05.772 ⇒ 00:07:08.990 Miguel de Veyra: Yeah, I think that’s yeah. That’s a bit comp more complicated.
92 00:07:08.990 ⇒ 00:07:14.960 Casie Aviles: I think something we can do for that is to just give it to chat, gpt, and although it will create like a
93 00:07:15.580 ⇒ 00:07:18.480 Casie Aviles: what do you call this? A text description of the.
94 00:07:18.910 ⇒ 00:07:24.889 Casie Aviles: And then we could use that to as context for the AI. So that’s something we could also explore. Yeah.
95 00:07:24.890 ⇒ 00:07:29.309 Scott_Harmon: Okay? Yeah. Again, I think that’s that’s for later. I was just curious. What? What all you
96 00:07:29.960 ⇒ 00:07:34.170 Scott_Harmon: what all you pulled pulled in for your for your rag data.
97 00:07:34.390 ⇒ 00:07:35.959 Miguel de Veyra: So here’s the.
98 00:07:36.310 ⇒ 00:07:39.780 Scott_Harmon: You use the sop, just the sop answers. It sounded like, Okay.
99 00:07:39.780 ⇒ 00:07:40.390 Miguel de Veyra: Yep.
100 00:07:41.190 ⇒ 00:07:48.530 Miguel de Veyra: So here’s like one of the answers. We wanted to keep it conversational. So you know, it’s short maximum of around 150 characters.
101 00:07:48.840 ⇒ 00:07:51.419 Miguel de Veyra: because I’m assuming this would be like a Chatbot, right?
102 00:07:51.830 ⇒ 00:07:53.659 Miguel de Veyra: So we don’t want it.
103 00:07:53.660 ⇒ 00:08:00.849 Scott_Harmon: The main. I think the main app is a chat bot. And specifically, there’s 2 different we, and we’ll find out more on the call.
104 00:08:01.020 ⇒ 00:08:03.880 Scott_Harmon: But there’s there’s 2 different possible
105 00:08:04.580 ⇒ 00:08:07.909 Scott_Harmon: users or personas for the chat bot.
106 00:08:07.950 ⇒ 00:08:12.960 Scott_Harmon: One would be an actual end. Customer like a residential homeowner.
107 00:08:12.960 ⇒ 00:08:13.350 Miguel de Veyra: Yeah.
108 00:08:13.350 ⇒ 00:08:23.010 Scott_Harmon: That, you know. Would that that was gonna buy, or, you know, subscribe to the service, or had a question about their service. The other one would be one of their call center technicians.
109 00:08:23.760 ⇒ 00:08:28.580 Scott_Harmon: That’s answering it. That’s taking a question from a home homeowner.
110 00:08:29.333 ⇒ 00:08:30.300 Miguel de Veyra: Okay, yeah.
111 00:08:31.229 ⇒ 00:08:37.820 Miguel de Veyra: This one. Yeah, this one was built more towards, like a client facing instead of like a more of an internal tool.
112 00:08:38.390 ⇒ 00:08:45.309 Scott_Harmon: Okay, we can. We can. Every time. I just think it’s important that we we position that you could be. You could tune this
113 00:08:46.160 ⇒ 00:08:51.650 Scott_Harmon: very easily to to respond appropriately to either an external customer
114 00:08:52.117 ⇒ 00:08:54.739 Scott_Harmon: or or an ABC employee, because
115 00:08:55.080 ⇒ 00:08:59.460 Scott_Harmon: because you’d want those responses to be a little bit different, like a different voice.
116 00:09:00.073 ⇒ 00:09:00.899 Scott_Harmon: You know.
117 00:09:00.900 ⇒ 00:09:03.280 Uttam Kumaran: So we haven’t talked to. We didn’t talk
118 00:09:03.460 ⇒ 00:09:09.040 Uttam Kumaran: click like specifically to them about like, was it going to be? Externally? I know we talked a little bit about, almost like a co-pilot
119 00:09:09.090 ⇒ 00:09:14.430 Uttam Kumaran: for us. That’s a couple of different like we, there’s a different basically set of expectations that we would have
120 00:09:14.440 ⇒ 00:09:16.160 Uttam Kumaran: on the way we build it, depending on.
121 00:09:16.160 ⇒ 00:09:16.490 Scott_Harmon: Yeah.
122 00:09:16.490 ⇒ 00:09:17.600 Uttam Kumaran: Only externally, so.
123 00:09:17.600 ⇒ 00:09:22.960 Scott_Harmon: Yeah, I’d like to discover on the call on Thursday, which one they’d be more excited about
124 00:09:23.896 ⇒ 00:09:26.399 Scott_Harmon: and then we can come back and tweak.
125 00:09:26.940 ⇒ 00:09:33.410 Scott_Harmon: I I think there’s 2 differences. One of them is probably, you know, if it’s external clients, you want a few other sources.
126 00:09:33.840 ⇒ 00:09:36.539 Scott_Harmon: and then there’s there’s also some guardrails
127 00:09:36.860 ⇒ 00:09:42.730 Scott_Harmon: of information I think you wouldn’t want to share with a client that maybe you would share.
128 00:09:42.880 ⇒ 00:09:43.250 Miguel de Veyra: Yeah.
129 00:09:43.250 ⇒ 00:09:47.762 Scott_Harmon: You know, to to an employee. So we’ll get that. We’ll get that
130 00:09:48.790 ⇒ 00:09:51.860 Scott_Harmon: clarified on Thursday, when we have our call with them.
131 00:09:51.860 ⇒ 00:10:01.349 Miguel de Veyra: Would you like Scott, if you like, build one for client and one for because basically, it accesses the same set of knowledge base? We just have to change the instructions to be a bit more descriptive.
132 00:10:01.790 ⇒ 00:10:09.529 Scott_Harmon: Yeah, I I think you know, we’ll find out on Thursday what my goal on Thursday is to get them to fund a project.
133 00:10:09.630 ⇒ 00:10:14.040 Scott_Harmon: and and and then to ask them which client you know, which persona
134 00:10:14.270 ⇒ 00:10:19.169 Scott_Harmon: they would want to start with, so that we have. We have clarity, and then from there.
135 00:10:19.420 ⇒ 00:10:23.249 Scott_Harmon: you know you can. We can build the right guardrails. You could build the right voice.
136 00:10:23.270 ⇒ 00:10:26.080 Scott_Harmon: And then there’s other information that you’d wanna
137 00:10:26.410 ⇒ 00:10:29.659 Scott_Harmon: you’d want to pull in, depending on which client, so we’ll
138 00:10:29.820 ⇒ 00:10:32.689 Scott_Harmon: we’ll know that after Thursday. I think, Miguel.
139 00:10:33.070 ⇒ 00:10:33.970 Miguel de Veyra: Okay, yeah.
140 00:10:35.653 ⇒ 00:10:47.789 Miguel de Veyra: But for the demo, because I’m assuming we’re gonna demo something to them. Right? Would you like also something for cause? This is client facing. We could quickly prop something up for internal. Would you like also to do that?
141 00:10:48.050 ⇒ 00:10:56.440 Scott_Harmon: No, I think this is fine for Thursday. We could just we could just explain that we could change it. But I think we can use what you’ve got here, you know, client, facing.
142 00:10:56.440 ⇒ 00:10:57.540 Miguel de Veyra: Okay, okay, sure.
143 00:10:59.930 ⇒ 00:11:08.360 Miguel de Veyra: And then as for your question to them earlier about quickly adding, like, an external source, basically Casey, can you show it.
144 00:11:09.280 ⇒ 00:11:13.750 Casie Aviles: Oh, sure, sure! So I think this is the Pdf. That we use.
145 00:11:13.750 ⇒ 00:11:14.240 Scott_Harmon: Yeah.
146 00:11:14.240 ⇒ 00:11:16.610 Casie Aviles: Earlier. Yeah, so what we.
147 00:11:16.610 ⇒ 00:11:20.159 Scott_Harmon: I did the same. I did the same thing. Yeah, yeah, I found the same thing.
148 00:11:20.800 ⇒ 00:11:25.309 Casie Aviles: Yeah. So we have like a tool for that. So
149 00:11:26.850 ⇒ 00:11:34.330 Casie Aviles: basically, we will just take the file, URL, the URL. And we could.
150 00:11:34.440 ⇒ 00:11:36.790 Casie Aviles: We’re going to paste it here
151 00:11:37.260 ⇒ 00:11:39.950 Casie Aviles: perfect. And then and then it should
152 00:11:40.489 ⇒ 00:11:45.240 Casie Aviles: produce something like this, so it gets the con the contents, the text contents of the file.
153 00:11:46.020 ⇒ 00:11:47.389 Scott_Harmon: Yeah, perfect.
154 00:11:47.730 ⇒ 00:11:52.370 Casie Aviles: Yeah. And it adds that just kidding it. And yeah.
155 00:11:52.370 ⇒ 00:11:53.640 Scott_Harmon: Yeah, I found it’s
156 00:11:53.810 ⇒ 00:12:00.459 Scott_Harmon: that’s great. I I crawled around in that same basf website that seems to be they use a lot of their products.
157 00:12:02.420 ⇒ 00:12:06.580 Scott_Harmon: I I think one of the questions we should have Utam should show
158 00:12:07.370 ⇒ 00:12:13.669 Scott_Harmon: that the the bot can pull information from both the Sop. And
159 00:12:14.280 ⇒ 00:12:18.409 Scott_Harmon: that BA. Sf, so like we want a question that illustrates
160 00:12:19.140 ⇒ 00:12:22.190 Scott_Harmon: that it was able to pull knowledge from.
161 00:12:22.330 ⇒ 00:12:23.370 Uttam Kumaran: One or the other.
162 00:12:23.900 ⇒ 00:12:25.940 Scott_Harmon: Or or both, right like
163 00:12:26.665 ⇒ 00:12:29.750 Scott_Harmon: you you know what I mean so that that we can go. Hey, look! It’s
164 00:12:30.120 ⇒ 00:12:34.459 Scott_Harmon: you know this this customer asked a question, and the the answer was, actually.
165 00:12:35.180 ⇒ 00:12:40.909 Scott_Harmon: you know, either in the your sop or maybe it was in the this Basf data, you know information.
166 00:12:41.550 ⇒ 00:12:45.630 Scott_Harmon: And the bot just knew it. Because, you know, that’s like, so we want to question that
167 00:12:46.410 ⇒ 00:12:48.869 Scott_Harmon: that would pull from that.
168 00:12:51.510 ⇒ 00:12:52.490 Miguel de Veyra: So, yeah, so.
169 00:12:53.890 ⇒ 00:12:54.520 Scott_Harmon: That’s okay. Yeah.
170 00:12:54.520 ⇒ 00:12:54.940 Scott_Harmon: Case.
171 00:12:54.940 ⇒ 00:13:00.189 Uttam Kumaran: In relevance. When you ask a question, can we see the knowledge that it pulled from Casey
172 00:13:01.160 ⇒ 00:13:03.460 Uttam Kumaran: like in the actual flow.
173 00:13:03.750 ⇒ 00:13:04.100 Miguel de Veyra: Yep.
174 00:13:04.363 ⇒ 00:13:08.839 Casie Aviles: Yeah. Here, I think with the tool, we could take a look at the the output here.
175 00:13:08.840 ⇒ 00:13:11.560 Scott_Harmon: So you could take a look at the source right there. Yeah, yeah.
176 00:13:11.560 ⇒ 00:13:11.970 Uttam Kumaran: Yeah.
177 00:13:12.420 ⇒ 00:13:17.979 Scott_Harmon: So so Utam, we’ll just. And I, you could probably just read that source right there, like, we just want a question
178 00:13:18.370 ⇒ 00:13:22.680 Scott_Harmon: that show. Or you know, questions that show that we’re pulling from both sources.
179 00:13:23.210 ⇒ 00:13:28.119 Uttam Kumaran: Okay? So we’ll do one. That’s like a very simple question, one that pulls from like
180 00:13:28.760 ⇒ 00:13:31.780 Uttam Kumaran: both sources or considers both sources.
181 00:13:31.780 ⇒ 00:13:37.729 Uttam Kumaran: Yeah, is there? Are there any? And then, is there any other demo we want to try
182 00:13:38.290 ⇒ 00:13:40.230 Uttam Kumaran: to? Well, we all.
183 00:13:40.230 ⇒ 00:13:40.840 Scott_Harmon: I noticed.
184 00:13:40.840 ⇒ 00:13:43.830 Uttam Kumaran: Booking one. We we did a we did a little booking tool.
185 00:13:44.180 ⇒ 00:13:47.320 Scott_Harmon: Yeah, I was just gonna say, the booking tool. I see it down there
186 00:13:47.560 ⇒ 00:13:49.630 Scott_Harmon: that you’ve connect. Can you show me?
187 00:13:50.080 ⇒ 00:13:51.859 Scott_Harmon: Can you show me how that would work.
188 00:13:53.336 ⇒ 00:13:54.409 Miguel de Veyra: Yeah. So.
189 00:13:55.090 ⇒ 00:14:00.350 Miguel de Veyra: Casey, try if, for example, you’re an interested client, and then you just want to book like an appointment, basically.
190 00:14:03.430 ⇒ 00:14:05.369 Miguel de Veyra: or a rodent something. Oh, yeah, yeah.
191 00:14:05.930 ⇒ 00:14:06.610 Scott_Harmon: Yeah, yeah.
192 00:14:06.850 ⇒ 00:14:16.280 Miguel de Veyra: But right now, Scott, the way the booking works is we we only basically if someone books, we just add it to a Google sheet, because we don’t really have. You know.
193 00:14:16.770 ⇒ 00:14:18.720 Miguel de Veyra: we don’t really know where to book it yet.
194 00:14:19.850 ⇒ 00:14:22.210 Miguel de Veyra: but then, you know, we can move it anywhere.
195 00:14:22.532 ⇒ 00:14:25.109 Scott_Harmon: So this, this is the information that the
196 00:14:26.390 ⇒ 00:14:31.889 Scott_Harmon: the booking app requires. Are those the fields? The contact number? Okay? Got it.
197 00:14:34.030 ⇒ 00:14:38.679 Scott_Harmon: Yeah. And I suppose you, the way you do that is just develop a tool that would interactively.
198 00:14:38.940 ⇒ 00:14:39.770 Miguel de Veyra: Yeah, yeah.
199 00:14:40.820 ⇒ 00:14:49.919 Uttam Kumaran: Yeah. And then basically like, this is a dummy tool. But this would just call like, what if they had an Api for their booking. Or this would just basically hit that and submit a request.
200 00:14:49.920 ⇒ 00:14:50.619 Miguel de Veyra: They have a seat.
201 00:14:50.620 ⇒ 00:14:56.930 Uttam Kumaran: Or or potentially, if that doesn’t exist, then this would just route it to whoever handles that basically.
202 00:14:57.160 ⇒ 00:14:57.630 Miguel de Veyra: Yeah, maybe.
203 00:14:57.630 ⇒ 00:15:01.859 Scott_Harmon: Yeah. The only thing that we obviously want to add is just do it in real time. So that
204 00:15:01.990 ⇒ 00:15:07.250 Scott_Harmon: I assume that the booking tool, you know, checks availability. Right? So
205 00:15:08.460 ⇒ 00:15:12.799 Scott_Harmon: another, you know, like, when we would do this for real, you’d say
206 00:15:13.230 ⇒ 00:15:21.129 Scott_Harmon: what days work for you, and then it would come, like, you know. Oh, Monday, Tuesday, Wednesday. Great, hey? We have these following availabilities on Monday, or, you know, like you’d
207 00:15:22.080 ⇒ 00:15:24.196 Scott_Harmon: yeah, I I think that’s
208 00:15:26.350 ⇒ 00:15:29.866 Scott_Harmon: I, yeah, we could just leave this dummy tool here for now.
209 00:15:30.160 ⇒ 00:15:35.520 Miguel de Veyra: Cause. That would depend on what you know. Book calendar they use of, if or if they use a Crm.
210 00:15:35.870 ⇒ 00:15:38.189 Miguel de Veyra: that’s gonna be where it depends a lot on.
211 00:15:39.270 ⇒ 00:15:42.549 Scott_Harmon: Yeah, I had. Yeah, yeah. So I think
212 00:15:43.130 ⇒ 00:15:45.120 Scott_Harmon: I think this is gonna be.
213 00:15:45.880 ⇒ 00:15:50.209 Scott_Harmon: all we need is just the list of actual questions.
214 00:15:50.860 ⇒ 00:15:56.050 Scott_Harmon: And I think we should have those prepared in advance of time. That, okay, yeah.
215 00:15:56.050 ⇒ 00:15:56.460 Scott_Harmon: You, too.
216 00:15:56.460 ⇒ 00:15:59.619 Uttam Kumaran: Put it all in a notion, basically. So we can just copy paste it in. There.
217 00:16:00.300 ⇒ 00:16:04.779 Scott_Harmon: Yeah, yeah, yeah. And say, Hey, look, we just thought up our own set of questions
218 00:16:04.920 ⇒ 00:16:07.759 Scott_Harmon: just to show how easy it is for the
219 00:16:07.800 ⇒ 00:16:13.680 Scott_Harmon: for the bot to consult these different knowledge sources, even even to the point of booking a meeting.
220 00:16:13.860 ⇒ 00:16:16.740 Scott_Harmon: and then and then let them ask their own.
221 00:16:16.750 ⇒ 00:16:20.659 Scott_Harmon: you know, after after we do like our 3 or 4 questions.
222 00:16:20.780 ⇒ 00:16:21.260 Uttam Kumaran: Sure.
223 00:16:23.840 ⇒ 00:16:24.400 Scott_Harmon: I
224 00:16:25.540 ⇒ 00:16:28.176 Scott_Harmon: The only thing I could like. I said I was
225 00:16:30.220 ⇒ 00:16:33.520 Scott_Harmon: scraping around a little. I’m using the just a
226 00:16:33.720 ⇒ 00:16:36.150 Scott_Harmon: play with the same information I use the
227 00:16:36.330 ⇒ 00:16:40.639 Scott_Harmon: have any of you used the notebook? Lm, tool from Google.
228 00:16:42.530 ⇒ 00:16:43.280 Miguel de Veyra: No, no.
229 00:16:43.280 ⇒ 00:16:49.440 Uttam Kumaran: Miguel, have you? Did you hear about that? I think I might have sent it internally like, it’s the thing that it basically turns any document into a podcast.
230 00:16:50.160 ⇒ 00:16:50.780 Scott_Harmon: It’s pretty
231 00:16:50.780 ⇒ 00:16:56.227 Scott_Harmon: cool. Yeah, just let me share my screen real quick. I’ll show it to you. Just just so.
232 00:16:58.380 ⇒ 00:17:00.220 Scott_Harmon: yeah, let me show.
233 00:17:01.970 ⇒ 00:17:03.660 Scott_Harmon: Let’s see.
234 00:17:16.619 ⇒ 00:17:20.919 Scott_Harmon: So where do you go? I think it’s here. Yeah. So can you see my screen? Okay.
235 00:17:20.920 ⇒ 00:17:21.480 Miguel de Veyra: Yep.
236 00:17:22.700 ⇒ 00:17:25.719 Scott_Harmon: Yeah. So this is Google’s, it’s just a free tool for
237 00:17:25.829 ⇒ 00:17:30.120 Scott_Harmon: you. Basically, you just import whatever rag sources you want over here on the left.
238 00:17:30.820 ⇒ 00:17:35.389 Scott_Harmon: And then, you know, you can ask a bunch of questions. It’s like a group tool.
239 00:17:35.710 ⇒ 00:17:37.940 Scott_Harmon: But what’s really interesting is, you can
240 00:17:38.030 ⇒ 00:17:44.059 Scott_Harmon: tell it to generate a podcast from the from the sources.
241 00:17:44.660 ⇒ 00:17:47.949 Scott_Harmon: And then you can just tell it, like, what kind of
242 00:17:48.560 ⇒ 00:17:52.900 Scott_Harmon: you know, like what the general theme of the podcast should be
243 00:17:53.860 ⇒ 00:17:55.950 Scott_Harmon: and it and it generate like.
244 00:17:56.060 ⇒ 00:17:57.849 Scott_Harmon: I ask it to just generate a.
245 00:18:06.820 ⇒ 00:18:08.020 Miguel de Veyra: No, we can’t hear you.
246 00:18:10.010 ⇒ 00:18:13.140 Scott_Harmon: So it generates this 2 person conversational. Podcast.
247 00:18:13.150 ⇒ 00:18:14.990 Scott_Harmon: And you can see I’ve got the.
248 00:18:15.640 ⇒ 00:18:18.680 Scott_Harmon: These are the sap. This is the Sops.
249 00:18:19.860 ⇒ 00:18:25.929 Scott_Harmon: And then I just imported, like you did some of the Basf product. Information like this is their.
250 00:18:27.090 ⇒ 00:18:28.839 Scott_Harmon: you know, their Pdf stuff.
251 00:18:29.540 ⇒ 00:18:36.330 Scott_Harmon: And I even found some training content that, like a video
252 00:18:36.490 ⇒ 00:18:40.059 Scott_Harmon: where they train their technicians on how to do the pest control.
253 00:18:40.320 ⇒ 00:18:45.620 Scott_Harmon: So I just scraped the scrape, the
254 00:18:46.250 ⇒ 00:18:50.700 Scott_Harmon: the voice from the from the video, and you can now ask questions about any of it. So
255 00:18:50.920 ⇒ 00:18:53.750 Scott_Harmon: you know not probably not as slick as your tool, but
256 00:18:54.890 ⇒ 00:18:58.439 Scott_Harmon: just and then the other thing. By the way, I pulled in a
257 00:18:58.920 ⇒ 00:19:04.489 Scott_Harmon: a statement that we got because we’re an ABC. Customer like we use ABC Pest control here at my house.
258 00:19:05.230 ⇒ 00:19:07.900 Scott_Harmon: So I I got her a monthly bill.
259 00:19:08.560 ⇒ 00:19:11.479 Scott_Harmon: So this was our our monthly statement.
260 00:19:12.310 ⇒ 00:19:14.990 Scott_Harmon: We just happened to get it yesterday from ABC,
261 00:19:15.020 ⇒ 00:19:17.329 Scott_Harmon: and so I was able to ask.
262 00:19:17.330 ⇒ 00:19:17.890 Uttam Kumaran: Oh, nice!
263 00:19:18.201 ⇒ 00:19:23.330 Scott_Harmon: Yeah. So I was able to ask like, hey, what
264 00:19:23.360 ⇒ 00:19:29.940 Scott_Harmon: like? What are we? Look at this bill and tell me what are the services that I pay for.
265 00:19:30.380 ⇒ 00:19:35.569 Scott_Harmon: And so it it’s it. We pay for 2 services. So it went up of like.
266 00:19:36.090 ⇒ 00:19:40.009 Scott_Harmon: here’s here’s what you get for the services. Here’s what’s included.
267 00:19:41.170 ⇒ 00:19:45.000 Scott_Harmon: you know. And I could, you know, when did I last pay? Is my bill current? So
268 00:19:45.050 ⇒ 00:19:47.289 Scott_Harmon: it just kind of did it all automatically.
269 00:19:48.355 ⇒ 00:19:54.639 Scott_Harmon: Despite, you know, smashing all of these different sources together on, you know, underneath one kind of rag
270 00:19:55.350 ⇒ 00:20:01.279 Scott_Harmon: umbrella. So anyway, it’s just kind of a cool gee whiz thing. I
271 00:20:01.360 ⇒ 00:20:05.959 Scott_Harmon: I think they might be able to use the podcasting for some sort of like a training.
272 00:20:07.290 ⇒ 00:20:09.650 Scott_Harmon: I know they’re very interested in training
273 00:20:11.017 ⇒ 00:20:14.359 Scott_Harmon: and and and so maybe later on.
274 00:20:14.500 ⇒ 00:20:21.490 Scott_Harmon: if we can get going, they could actually use a tool like this to generate like training audios.
275 00:20:22.040 ⇒ 00:20:23.226 Uttam Kumaran: For people,
276 00:20:23.820 ⇒ 00:20:26.799 Scott_Harmon: Yeah, yeah. But I just thought it was for me.
277 00:20:27.307 ⇒ 00:20:34.000 Uttam Kumaran: I’ll send you the link to. It’s interesting. I mean, it just sounds like it’s literally a podcast. About a topic. I think for trainings.
278 00:20:34.020 ⇒ 00:20:35.450 Uttam Kumaran: It’d be really really great.
279 00:20:36.040 ⇒ 00:20:36.560 Miguel de Veyra: Okay. Yeah.
280 00:20:36.560 ⇒ 00:20:40.130 Scott_Harmon: Yeah. Yeah. And, by the way, I don’t know why I tried to share sure.
281 00:20:40.130 ⇒ 00:20:43.979 Uttam Kumaran: Yeah, maybe it’s something in my organization or something I don’t know. I’ll have to check.
282 00:20:43.980 ⇒ 00:20:46.160 Scott_Harmon: Do you have it like a gmail account.
283 00:20:46.955 ⇒ 00:20:52.690 Uttam Kumaran: Yeah. If you just type in UTTA, m KUMA,
284 00:20:53.410 ⇒ 00:20:54.670 Uttam Kumaran: RAN.
285 00:20:56.130 ⇒ 00:20:56.910 Scott_Harmon: 17.
286 00:20:57.370 ⇒ 00:21:01.790 Uttam Kumaran: And it RAN. 17@gmail.com.
287 00:21:05.140 ⇒ 00:21:05.750 Scott_Harmon: Oops.
288 00:21:06.000 ⇒ 00:21:08.699 Scott_Harmon: Yeah, see if you get this and
289 00:21:14.520 ⇒ 00:21:18.240 Uttam Kumaran: Cause I didn’t even get a notification last time. But yeah, I’ll check for this.
290 00:21:18.240 ⇒ 00:21:23.439 Scott_Harmon: Yeah, I think it gets kind of funky when it likes. Seems to want Gmail addresses. I guess.
291 00:21:24.006 ⇒ 00:21:28.050 Scott_Harmon: It’s like a prototype product that they just sort of released early.
292 00:21:29.192 ⇒ 00:21:34.230 Scott_Harmon: It’s really meant for teachers that are teaching classes.
293 00:21:34.320 ⇒ 00:21:36.250 Scott_Harmon: So it has like built in
294 00:21:36.770 ⇒ 00:21:40.940 Scott_Harmon: fields. For like generate me, a class outline generate me a discussion.
295 00:21:41.180 ⇒ 00:21:41.840 Uttam Kumaran: Hmm.
296 00:21:41.840 ⇒ 00:21:47.819 Scott_Harmon: Like a like a discussion note, generate a quiz like it automatically generates these
297 00:21:47.870 ⇒ 00:21:50.520 Scott_Harmon: artifacts. If you were going to be teaching a class
298 00:21:52.150 ⇒ 00:21:55.189 Scott_Harmon: from. And you just keep, you know. And then you can.
299 00:21:55.240 ⇒ 00:22:05.369 Scott_Harmon: you know, generate a FAQ like it just does all these things automatically really designed for teachers, I think, is pretty cool, so I’ve been loading up like AI dot like AI papers.
300 00:22:06.510 ⇒ 00:22:07.550 Scott_Harmon: and then.
301 00:22:07.550 ⇒ 00:22:10.610 Uttam Kumaran: Oh, nice! And then having it just like, do a little podcast.
302 00:22:11.080 ⇒ 00:22:18.979 Scott_Harmon: Yeah, that would like, let’s let’s say I’m sort of interested, interested in like chain of thought reasoning. I found 4 papers on chain of thought, reasoning that are like
303 00:22:19.230 ⇒ 00:22:23.779 Scott_Harmon: related, but different. And then I had it generate a podcast comparing and contrasting the.
304 00:22:24.440 ⇒ 00:22:25.210 Uttam Kumaran: Oh, interesting!
305 00:22:25.360 ⇒ 00:22:29.130 Scott_Harmon: Like a whole bunch of stuff like, I can’t read. The papers are too technical.
306 00:22:30.790 ⇒ 00:22:33.749 Scott_Harmon: But it’s, I said, just generate me, a 20 min podcast
307 00:22:34.210 ⇒ 00:22:40.420 Scott_Harmon: summarizing these different approaches of what they’re good for and what you know. And it was incredibly informative. So.
308 00:22:40.690 ⇒ 00:22:50.240 Uttam Kumaran: Yeah, yeah, they’re on a on 60 min this weekend they put there was a segment on this thing, you know, Khan Academy, like the
309 00:22:50.735 ⇒ 00:22:51.230 Uttam Kumaran: famously.
310 00:22:51.230 ⇒ 00:22:58.260 Uttam Kumaran: Yeah. So the guy they they’ve pivoted. And they’ve they’re starting to build agents for
311 00:22:58.909 ⇒ 00:23:08.800 Uttam Kumaran: schools. And they’re they’ve deployed them now. And they did a big segment on it. And it’s really really cool like, worth watching. I think it’s on their Youtube,
312 00:23:09.370 ⇒ 00:23:26.889 Uttam Kumaran: on 60 min about like how they’re deploying agents to every single student. And then they have, like a teacher that can see all the interactions that the students are having with the agent and the agent is really helpful. It doesn’t like give you all the answers. It like coaches you through stuff. And then every kid, basically, because most kids. Now they get a laptop
313 00:23:26.970 ⇒ 00:23:29.409 Uttam Kumaran: like a chromebook or something issued by the school. They.
314 00:23:29.760 ⇒ 00:23:31.700 Uttam Kumaran: these kids. Now, I think it’s like
315 00:23:31.850 ⇒ 00:23:35.309 Uttam Kumaran: $10 per kid per year they get access to the
316 00:23:35.350 ⇒ 00:23:38.831 Uttam Kumaran: to the it’s called Conmigo, and they can just like talk and use it.
317 00:23:40.300 ⇒ 00:23:44.330 Scott_Harmon: Yeah, I’ll I’ll look it up. I mean, that’s like again. I don’t mean to take us off
318 00:23:44.470 ⇒ 00:23:49.139 Scott_Harmon: in the weeds, but I think something similar like that for corporate training. Like, if you think about.
319 00:23:49.140 ⇒ 00:23:50.330 Uttam Kumaran: Yeah.
320 00:23:50.330 ⇒ 00:23:55.300 Scott_Harmon: Think about like a personalized training bot for every employee.
321 00:23:55.500 ⇒ 00:23:59.159 Scott_Harmon: And you know, when you when ABC hires
322 00:23:59.660 ⇒ 00:24:03.359 Scott_Harmon: someone, they kind of get this training. Bot. You know that
323 00:24:03.480 ⇒ 00:24:08.220 Scott_Harmon: it says, Okay, here’s the topics that we need to learn. And you know, kind of more like a self.
324 00:24:08.810 ⇒ 00:24:11.940 Scott_Harmon: because normally, the yeah, where it says, here’s 17 videos.
325 00:24:11.940 ⇒ 00:24:12.520 Uttam Kumaran: Here’s a video.
326 00:24:12.520 ⇒ 00:24:13.139 Scott_Harmon: Watch him.
327 00:24:13.140 ⇒ 00:24:20.679 Uttam Kumaran: Everyone skips through. Nobody watches anything. Yeah, you’re totally right. And it also on board them to how to use AI basically like even using
328 00:24:21.130 ⇒ 00:24:34.960 Uttam Kumaran: cause at our company like I mean, I don’t know, Miguel, we should probably that’s a good idea for us to do we? So we have a channel. Now, Scott, with AI agents in them. So we have a couple of AI agents that now everybody has access to. We have a lead researcher agent.
329 00:24:35.249 ⇒ 00:24:39.599 Uttam Kumaran: which basically, you can type in. And maybe here, I’ll I’ll just. I’ll just even show you, because I think you’ll find it
330 00:24:41.260 ⇒ 00:24:48.919 Uttam Kumaran: really interesting. Basically, I was like, look, we have a couple of agents that we’re all using. Let’s make sure that they’re all available in slack
331 00:24:49.466 ⇒ 00:24:55.760 Uttam Kumaran: that way. Anyone who wants to leverage them can use it. And then we’re also doing a couple of other things. So
332 00:24:55.820 ⇒ 00:24:58.720 Uttam Kumaran: let me just share this.
333 00:24:58.830 ⇒ 00:25:01.570 Uttam Kumaran: So this is like our internal slack
334 00:25:02.039 ⇒ 00:25:16.699 Uttam Kumaran: and in particular, we have like this lead researcher agent. Basically. What you can do is you can send it like a a linkedin of a person or a company. And it’ll actually provide you with basically like, here’s the overview. Here’s like a couple of key contacts
335 00:25:16.810 ⇒ 00:25:21.190 Uttam Kumaran: stuff about them. And then recommendations for like how we can help them, because I.
336 00:25:21.190 ⇒ 00:25:21.589 Scott_Harmon: Did you?
337 00:25:21.590 ⇒ 00:25:22.860 Scott_Harmon: It’s not that agent.
338 00:25:23.120 ⇒ 00:25:34.789 Uttam Kumaran: Yes, we developed this. Yeah, yeah. And it’s hooked up through Zapier and it then. So that way, nobody has to go into like a separate portal. It all can happen here. The other thing we’re developing is a zoom agent.
339 00:25:35.072 ⇒ 00:25:41.139 Uttam Kumaran: So one thing is all of our Zoom Meetings we’re currently recording. But they’re all getting shoved into Google Drive. And then we’re building.
340 00:25:41.140 ⇒ 00:25:41.520 Scott_Harmon: Oh, yeah.
341 00:25:41.520 ⇒ 00:25:48.180 Uttam Kumaran: That you can basically say, like, what happened that last meeting or playing with the meeting for Tuesday, or give me some trends?
342 00:25:48.688 ⇒ 00:25:53.469 Uttam Kumaran: Because for for me, like on the sales side, I get hit with like, Hey.
343 00:25:53.810 ⇒ 00:26:08.609 Uttam Kumaran: so and so is like a potential lead. Or can you go talk to this person and quickly, we want to find out information about them. But the I think the bigger thing about this is like we’re gonna basically just have a set almost like it’ll just basically be like other employees that you could just ping here.
344 00:26:08.610 ⇒ 00:26:08.930 Scott_Harmon: Yes.
345 00:26:08.930 ⇒ 00:26:10.600 Uttam Kumaran: Have agents that do everything.
346 00:26:12.480 ⇒ 00:26:14.240 Scott_Harmon: Yeah, yeah, that’s perfect. That.
347 00:26:14.440 ⇒ 00:26:18.359 Scott_Harmon: Yeah. By the way, that I introduce you to Greg, that Greg is just
348 00:26:18.390 ⇒ 00:26:25.120 Scott_Harmon: that’s what he’s building as a whole sales, you know. Agent thing. So, but what you’re doing there is super useful, and
349 00:26:25.450 ⇒ 00:26:30.740 Scott_Harmon: the idea of just keep building agents and keep using them internally, is a great idea.
350 00:26:30.740 ⇒ 00:26:43.490 Uttam Kumaran: Yeah, and it and it again, the stuff we’re doing for all these Demos, too. I’m like, okay, we have an idea like, even, for when you mentioned the training thing I’m like, well, we’re gonna have to. We’re gonna start thinking about training a lot more next year. And yeah, it’s just have some.
351 00:26:43.490 ⇒ 00:26:50.239 Scott_Harmon: What I what I really like about this Google notebook thing. And I, you know, maybe you have topics like this within brain forge. But
352 00:26:51.550 ⇒ 00:26:55.089 Scott_Harmon: let’s let’s say you’re you’re studying kind of an advanced topic.
353 00:26:55.710 ⇒ 00:26:58.439 Scott_Harmon: and it, you know, you have to read papers, and
354 00:26:58.530 ⇒ 00:27:01.079 Scott_Harmon: you know, and you want to just kind of
355 00:27:01.330 ⇒ 00:27:03.680 Scott_Harmon: debate it a little bit and understand it.
356 00:27:04.300 ⇒ 00:27:04.640 Uttam Kumaran: And you can.
357 00:27:04.640 ⇒ 00:27:07.840 Scott_Harmon: Create, create a notebook, invite everybody to it.
358 00:27:08.240 ⇒ 00:27:11.820 Scott_Harmon: and say, Look, here’s 5 websites and 5 papers. You know that
359 00:27:12.010 ⇒ 00:27:16.879 Scott_Harmon: on this topic, and then you can have the AI kind of lead. The discussion.
360 00:27:17.050 ⇒ 00:27:22.749 Uttam Kumaran: You know, like, what? How should we use this? Is this better to use for this or that, like a.
361 00:27:22.800 ⇒ 00:27:27.300 Scott_Harmon: That are more of a technical, you know, Forum.
362 00:27:27.480 ⇒ 00:27:32.439 Scott_Harmon: you know a lot of times in R&D, you, just you just need to debate AI. Should we use this or that, or.
363 00:27:32.440 ⇒ 00:27:33.240 Uttam Kumaran: No! Totally.
364 00:27:33.240 ⇒ 00:27:36.010 Scott_Harmon: Out this new thing, and it’s just a good place to
365 00:27:36.470 ⇒ 00:27:39.219 Scott_Harmon: to discuss and look at alternatives
366 00:27:39.860 ⇒ 00:27:43.510 Scott_Harmon: as a team have with AI kind of acting.
367 00:27:43.510 ⇒ 00:27:44.190 Uttam Kumaran: Facilitating.
368 00:27:44.480 ⇒ 00:27:49.840 Scott_Harmon: Facilitator. And you know, you can do all the normal stuff like, hey, create a pros and cons
369 00:27:50.180 ⇒ 00:27:53.839 Scott_Harmon: document between, you know, using this tool, or that tool and.
370 00:27:53.840 ⇒ 00:27:54.400 Uttam Kumaran: Yeah.
371 00:27:54.400 ⇒ 00:27:57.150 Scott_Harmon: I think it would be just really interesting to
372 00:27:57.390 ⇒ 00:28:00.140 Scott_Harmon: as a place to kind of deep dive on like a
373 00:28:01.690 ⇒ 00:28:04.590 Scott_Harmon: yeah on a given topic. Or, yeah, right?
374 00:28:05.030 ⇒ 00:28:09.090 Uttam Kumaran: Yeah, that’s really, really interesting, like something like a meeting facilitator.
375 00:28:09.470 ⇒ 00:28:17.290 Uttam Kumaran: or even just like a pre meeting like, listen to this, 5 min podcast before you attend a meeting. Because, you know, Pre reads like a lot of people.
376 00:28:17.360 ⇒ 00:28:22.179 Uttam Kumaran: I feel like, I want to start doing them. But it’s gonna be tough, but I feel like Pre. Listens.
377 00:28:22.180 ⇒ 00:28:22.840 Scott_Harmon: Yeah.
378 00:28:22.840 ⇒ 00:28:23.709 Uttam Kumaran: It’s not that bad.
379 00:28:23.710 ⇒ 00:28:27.970 Scott_Harmon: Great idea you could do like. Okay, you know, here’s the company information.
380 00:28:28.100 ⇒ 00:28:34.489 Scott_Harmon: you know. Here’s who we’re meeting with. Here’s the thing, you know, like again, just just a nice focused discussion area
381 00:28:35.060 ⇒ 00:28:37.250 Scott_Harmon: where you’re using the AI to kind of
382 00:28:37.940 ⇒ 00:28:41.399 Scott_Harmon: summarize a bunch of information that would take you.
383 00:28:41.910 ⇒ 00:28:46.040 Scott_Harmon: You know any one of your folks. Gosh! That would take me 2 h to read all that stuff.
384 00:28:46.040 ⇒ 00:28:47.250 Uttam Kumaran: Yeah, yeah, yeah.
385 00:28:47.610 ⇒ 00:28:48.740 Scott_Harmon: Okay. All right.
386 00:28:48.740 ⇒ 00:28:52.979 Uttam Kumaran: Yeah, look back back to ABC. I I think if you could just share.
387 00:28:53.140 ⇒ 00:28:53.710 Scott_Harmon: You know, even.
388 00:28:53.710 ⇒ 00:29:00.859 Uttam Kumaran: Yeah, we’re gonna throw. We have a yeah. Well, I’ll throw it into a notion, because it’ll be there’s a place where we already have information about the architecture.
389 00:29:01.280 ⇒ 00:29:01.610 Scott_Harmon: Yeah, yeah.
390 00:29:01.610 ⇒ 00:29:08.170 Uttam Kumaran: Add in like a demo section that just shows like, here are the 3 questions to ask, or 3 or 5 questions that we’re gonna ask that way in a demo.
391 00:29:08.170 ⇒ 00:29:09.700 Uttam Kumaran: Yeah, we could copy paste as in.
392 00:29:09.960 ⇒ 00:29:11.129 Uttam Kumaran: And we know they were.
393 00:29:11.130 ⇒ 00:29:14.729 Scott_Harmon: I just want to make sure that that and you could drive, of course.
394 00:29:14.930 ⇒ 00:29:20.559 Scott_Harmon: that that we that we point out that, hey? This question was able to get its answer.
395 00:29:20.640 ⇒ 00:29:23.929 Scott_Harmon: you know, to Casey’s point, because it, you know, it was actually.
396 00:29:24.330 ⇒ 00:29:27.819 Scott_Harmon: you know, maybe looked in the basf, Doc, you know, like.
397 00:29:27.940 ⇒ 00:29:32.160 Scott_Harmon: I think one of the things I really want to help them understand is how easy it is
398 00:29:32.720 ⇒ 00:29:39.020 Scott_Harmon: to integrate additional sources of knowledge into the agent, and.
399 00:29:39.020 ⇒ 00:29:39.410 Uttam Kumaran: Yeah.
400 00:29:39.410 ⇒ 00:29:43.220 Scott_Harmon: Because, remember the way this got going, Miguel and Casey is they were.
401 00:29:43.360 ⇒ 00:29:48.100 Scott_Harmon: They were originally thinking that they wanted to buy a knowledge based tool.
402 00:29:48.520 ⇒ 00:29:53.130 Scott_Harmon: you know, like a just a big, just a big Cms or
403 00:29:53.290 ⇒ 00:29:56.109 Scott_Harmon: housing all kinds of knowledge. You know that
404 00:29:56.170 ⇒ 00:29:58.420 Scott_Harmon: that they need to run their company.
405 00:29:59.020 ⇒ 00:30:03.989 Scott_Harmon: And I told them, don’t don’t buy a knowledge based tool or a Cms like AI is so much better
406 00:30:04.480 ⇒ 00:30:08.720 Scott_Harmon: at just, you know, using rag, basically to.
407 00:30:09.320 ⇒ 00:30:15.889 Scott_Harmon: you know, to integrate information in interesting ways. So that’s kind of how this project started.
408 00:30:16.130 ⇒ 00:30:21.210 Scott_Harmon: And so I just wanna make sure that we point out how easy it was
409 00:30:21.740 ⇒ 00:30:27.950 Scott_Harmon: to pull in several different sources of knowledge into the into the context window.
410 00:30:31.190 ⇒ 00:30:32.100 Scott_Harmon: And
411 00:30:32.290 ⇒ 00:30:40.930 Scott_Harmon: I can guarantee you they’re gonna come up with 5. 0, could you integrate this? Could you integrate that which I think is what we want you, Tom? Right, we want them going. Oh, what if you.
412 00:30:40.930 ⇒ 00:30:41.750 Uttam Kumaran: Yeah, exactly.
413 00:30:41.750 ⇒ 00:30:44.980 Scott_Harmon: You know, we have this additional information. Could you
414 00:30:45.040 ⇒ 00:30:50.820 Scott_Harmon: like? It’s it’s just that’s kind of where we want to land. So again, the objective for the meeting is just to
415 00:30:51.590 ⇒ 00:30:57.090 Scott_Harmon: to get them to identify a particular persona and application that they want to start with.
416 00:30:57.650 ⇒ 00:31:03.769 Scott_Harmon: and that they could fund an initial project with no, and you know.
417 00:31:04.020 ⇒ 00:31:09.529 Scott_Harmon: basically, you know, qualify them financially and make sure that this is something that they’re ready to spend money on
418 00:31:09.700 ⇒ 00:31:13.250 Scott_Harmon: and and from there we can get a documented.
419 00:31:13.590 ⇒ 00:31:16.289 Scott_Harmon: You know more of a documented project plan in place.
420 00:31:16.770 ⇒ 00:31:17.330 Uttam Kumaran: Cool.
421 00:31:17.510 ⇒ 00:31:24.859 Uttam Kumaran: Okay? And then, yeah, we’re looking forward to the stuff from televero as well. And then, yeah, we’re kind of pause on Hpi until we get a good sense. There.
422 00:31:25.120 ⇒ 00:31:27.590 Uttam Kumaran: Then.
423 00:31:29.470 ⇒ 00:31:35.249 Uttam Kumaran: yeah, I guess anything else that comes up where we want to start to do some Demos. We’re putting together a little bit of like
424 00:31:35.320 ⇒ 00:31:39.739 Uttam Kumaran: a service offering deck, which will have like kind of all these Demos in one place.
425 00:31:39.830 ⇒ 00:31:41.579 Uttam Kumaran: so that maybe that’ll help
426 00:31:41.940 ⇒ 00:31:46.610 Uttam Kumaran: when you go to a new person. And it’s like, what are all the things we’ve basically isolated to like
427 00:31:46.780 ⇒ 00:32:06.569 Uttam Kumaran: 10 different services. There’s like internal chat. There’s like internal internal operations and automation. And there’s like external. We’ve basically separated them. It’s like, okay, you have voice, we have chat bots. We have this technology base. So we’re kind of creating this deck that maybe it’ll help just accelerate some of these conversations, create a capabilities deck basically.
428 00:32:06.910 ⇒ 00:32:14.229 Scott_Harmon: Yeah, I think I can help with a lot is obviously introductions and meetings of people I know on my network. But then, as we go deeper down verticals
429 00:32:14.840 ⇒ 00:32:15.470 Scott_Harmon: again.
430 00:32:15.470 ⇒ 00:32:15.860 Uttam Kumaran: Yes.
431 00:32:15.860 ⇒ 00:32:17.559 Scott_Harmon: State or or healthcare.
432 00:32:18.440 ⇒ 00:32:27.499 Scott_Harmon: you know. That’s where I think I could add a lot of value. And you know I have a lot just by mining contacts. So as you develop a point of view on verticals.
433 00:32:27.760 ⇒ 00:32:33.420 Scott_Harmon: and your demos start to show value in a particular vertical.
434 00:32:34.284 ⇒ 00:32:34.739 Uttam Kumaran: Yeah.
435 00:32:34.740 ⇒ 00:32:37.110 Scott_Harmon: That’s where I that’s where I think we could
436 00:32:37.420 ⇒ 00:32:42.660 Scott_Harmon: make, hey? You know, the the more the Demos can can
437 00:32:42.840 ⇒ 00:32:46.650 Scott_Harmon: go beyond. Just kind of a Gee whiz, demo. One of the things about yeah.
438 00:32:46.650 ⇒ 00:32:49.949 Scott_Harmon: Demos is, you get these gweez effect like, Wow, he’s an AI cool.
439 00:32:50.140 ⇒ 00:32:57.139 Scott_Harmon: And and I think that’s that’s nice. I mean, I think it’s good that people get excited about AI. But where? Where? I think you’ll really
440 00:32:57.410 ⇒ 00:33:00.740 Scott_Harmon: have a bigger impact is when you connect it to like a financial
441 00:33:01.510 ⇒ 00:33:04.740 Scott_Harmon: like a financial bottom line kind of an issue.
442 00:33:04.740 ⇒ 00:33:05.120 Uttam Kumaran: Yeah.
443 00:33:05.120 ⇒ 00:33:10.100 Scott_Harmon: Like. Oh, my God! I spend like 50 grand a year on this, or a hundred grand a year on that, or.
444 00:33:10.100 ⇒ 00:33:23.649 Uttam Kumaran: No. And we want, I wanna buy. I wanna bias towards the clear outcomes where there’s clear where we’ve affected cause that helps us sell this even more in the future. And if it works, which that’s we’re betting on, then it’s like everybody agrees like the impact.
445 00:33:23.690 ⇒ 00:33:27.930 Uttam Kumaran: It’s when the impact isn’t like really clearly measurable. Or we didn’t really agree on. Like, where
446 00:33:28.300 ⇒ 00:33:30.420 Uttam Kumaran: how does this gonna impact stuff? Then?
447 00:33:30.670 ⇒ 00:33:31.699 Uttam Kumaran: Yeah, it’s.
448 00:33:32.290 ⇒ 00:33:41.049 Uttam Kumaran: And also with the AI stuff again. It’s a lot more of like, can, are we? We’re we’re replacing like what someone would do. So you want to almost like price towards that outcome
449 00:33:41.130 ⇒ 00:33:42.340 Uttam Kumaran: is what we’re finding out.
450 00:33:42.340 ⇒ 00:33:47.319 Scott_Harmon: Yeah. You took the words out of my mouth on that. I think the more that you can get concrete about
451 00:33:47.610 ⇒ 00:33:51.229 Scott_Harmon: price. You know to your point based on outcome. And there’s
452 00:33:52.020 ⇒ 00:33:56.829 Scott_Harmon: underneath that heading. I think there’s 2 different ways to do it, and I think you can do both, or either
453 00:33:57.070 ⇒ 00:34:02.859 Scott_Harmon: one of them is price based on work performed.
454 00:34:03.450 ⇒ 00:34:11.000 Scott_Harmon: So, for example, in the ABC. Case, it may be question like we answered 150,000 questions. You know some number of questions.
455 00:34:11.880 ⇒ 00:34:12.920 Scott_Harmon: and
456 00:34:13.340 ⇒ 00:34:18.889 Scott_Harmon: and then you leave it up to ABC to decide. Well, what what’s the financial value of that? And you, basically, you pay.
457 00:34:19.230 ⇒ 00:34:22.270 Scott_Harmon: you pay based on how many questions that you answer.
458 00:34:24.030 ⇒ 00:34:28.840 Scott_Harmon: The the way those kind of systems work. You’re probably aware of this. But there’s a whole
459 00:34:29.219 ⇒ 00:34:33.220 Scott_Harmon: big industry precedent that involves in the sourcing world where
460 00:34:34.670 ⇒ 00:34:40.930 Scott_Harmon: like. If you hire an outsource call center, you know, offshore, they’ll have a complicated rate schedule.
461 00:34:41.330 ⇒ 00:34:46.899 Scott_Harmon: the level of the question, what what it had to do with how long it took to answer. They’ll have a bunch of
462 00:34:48.739 ⇒ 00:34:50.989 Scott_Harmon: performance metrics on it, like.
463 00:34:51.250 ⇒ 00:34:56.130 Scott_Harmon: you know. Did you close it within 2 min. Did you close it within 5 min, and they pay different rates. So it’s like a rate.
464 00:34:56.139 ⇒ 00:34:56.859 Uttam Kumaran: Oh, okay.
465 00:34:57.350 ⇒ 00:35:01.189 Scott_Harmon: Like. So that’s 1 way to do to do it. And and it’s
466 00:35:02.018 ⇒ 00:35:06.200 Scott_Harmon: I could even give you examples of how call center pricing schedule.
467 00:35:06.200 ⇒ 00:35:11.050 Uttam Kumaran: That would be helpful, because I feel like that’s the easiest to be like. Compare apples to apples, and be like well.
468 00:35:11.050 ⇒ 00:35:12.049 Scott_Harmon: If you just do that.
469 00:35:12.050 ⇒ 00:35:13.019 Uttam Kumaran: Price, then.
470 00:35:13.410 ⇒ 00:35:14.849 Scott_Harmon: Yeah, just like a call center.
471 00:35:14.850 ⇒ 00:35:21.870 Uttam Kumaran: Because the marginal cost on the AI side is is like it. It really doesn’t scale like that.
472 00:35:21.920 ⇒ 00:35:23.059 Uttam Kumaran: So it’s.
473 00:35:23.060 ⇒ 00:35:27.199 Scott_Harmon: Make it 20% the price. And say, you know what you know, it’s easy like, that’s
474 00:35:27.460 ⇒ 00:35:31.909 Scott_Harmon: that’s 1 way to do it. And then the other way to do it in some of these domains is
475 00:35:32.500 ⇒ 00:35:36.369 Scott_Harmon: is actually tie it to some business outcome.
476 00:35:36.680 ⇒ 00:35:37.040 Uttam Kumaran: Yeah.
477 00:35:37.040 ⇒ 00:35:39.790 Scott_Harmon: You know, like new clients signed up, or
478 00:35:41.370 ⇒ 00:35:46.569 Scott_Harmon: I’m trying to think what they would be like, I’m sure, in our call with televero.
479 00:35:47.150 ⇒ 00:35:50.870 Scott_Harmon: You know, there could be other say they’re called success metrics, right? Where.
480 00:35:51.130 ⇒ 00:35:54.310 Scott_Harmon: you know, you basically say, we want to see this many
481 00:35:55.530 ⇒ 00:36:06.019 Scott_Harmon: business outcomes a month as long as those are defined. And then the key to it. Time is that you have to have the AI tool actually measure it itself like. So it’s actually keeping its own score.
482 00:36:06.370 ⇒ 00:36:11.820 Scott_Harmon: So at the end of the every month the AI tool goes. Look, I did this. Many things like this is what I did this month.
483 00:36:12.620 ⇒ 00:36:17.200 Scott_Harmon: Yeah, you agreed. You’re gonna pay me. You know this much money.
484 00:36:17.230 ⇒ 00:36:23.989 Scott_Harmon: and it’s very compelling, because then you can go. Well, look if if we didn’t really sign up new customers, then you don’t have to pay very much, you know. It’s more.
485 00:36:23.990 ⇒ 00:36:24.730 Uttam Kumaran: Yeah.
486 00:36:25.060 ⇒ 00:36:27.709 Scott_Harmon: It’s pay for value. So that’s that’s the other way.
487 00:36:27.810 ⇒ 00:36:30.479 Scott_Harmon: You want to build these pricing regimens and and.
488 00:36:30.480 ⇒ 00:36:31.030 Uttam Kumaran: Yeah.
489 00:36:31.400 ⇒ 00:36:33.659 Scott_Harmon: The folks at ABC in particular.
490 00:36:33.840 ⇒ 00:36:41.759 Scott_Harmon: really, really like that philosophy. So we should mention that on Thursday they like pay for result. They really.
491 00:36:41.760 ⇒ 00:36:54.940 Uttam Kumaran: I prefer that, too, because it pushes us to make sure it works. I get that. There’s probably some like. Again, we can kind of consider. Look if there’s like implementation fee or something, just to get things going, but then having the ongoing price also helps us
492 00:36:55.030 ⇒ 00:36:58.870 Uttam Kumaran: to charge something ongoing. But then again, it’s really aligned towards
493 00:36:58.970 ⇒ 00:37:07.269 Uttam Kumaran: the outcome. And then we kind of move towards more complex procedures. And then price by like, okay, what would you pay someone to do this sort of complex procedure.
494 00:37:07.310 ⇒ 00:37:12.939 Uttam Kumaran: And then we price per outcome, whether it’s like, yeah, like a meeting booked or an issue resolved
495 00:37:14.230 ⇒ 00:37:17.450 Uttam Kumaran: or like non issue escalations, or something like that.
496 00:37:17.450 ⇒ 00:37:25.375 Scott_Harmon: Yep, I don’t know. Yep, yep, yep, okay. Look. I think we’re perfectly all set. I think the Demo Miguel, in in case he looks great.
497 00:37:26.200 ⇒ 00:37:29.820 Scott_Harmon: and I think the it seems like you’ve set up a really nice
498 00:37:29.970 ⇒ 00:37:32.520 Scott_Harmon: system instead of tooling infrastructure to
499 00:37:32.530 ⇒ 00:37:38.050 Scott_Harmon: start to turn these things out super impressive. Yeah, and
500 00:37:38.804 ⇒ 00:37:44.679 Scott_Harmon: yeah, I look forward to the call on Thursday, hopefully, we’ll get a project green light on something and we’ll go from there.
501 00:37:45.180 ⇒ 00:37:45.850 Uttam Kumaran: Cool.
502 00:37:46.050 ⇒ 00:37:46.800 Uttam Kumaran: Okay.
503 00:37:46.930 ⇒ 00:37:48.130 Scott_Harmon: Thanks. Everyone.
504 00:37:48.710 ⇒ 00:37:49.269 Scott_Harmon: Thanks. Everyone.
505 00:37:49.270 ⇒ 00:37:50.350 Casie Aviles: Thanks, everyone talk, soon.
506 00:37:50.600 ⇒ 00:37:51.130 Scott_Harmon: Cheers.
507 00:37:51.400 ⇒ 00:37:51.770 Uttam Kumaran: Bye, guys.
508 00:37:51.770 ⇒ 00:37:52.410 Casie Aviles: Thank you.