Meeting Title: Client Demo Planning with Clarence Date: 2026-03-17 Meeting participants: Clarence Stone, Pranav


WEBVTT

1 00:01:43.120 00:01:44.760 Pranav: Yo, yo, yo.

2 00:01:49.440 00:01:51.120 Pranav: Hey, can’t hear you.

3 00:02:02.230 00:02:03.260 Clarence Stone: How about now?

4 00:02:03.660 00:02:05.210 Pranav: Yeah, not again. Cool.

5 00:02:05.210 00:02:09.060 Clarence Stone: It’s so weird. If I don’t open my laptop, it won’t work.

6 00:02:09.639 00:02:18.840 Clarence Stone: I was just worried that you were coding again. I was like, no, like, let’s not waste any time, like, we should just talk about the client. So I was like…

7 00:02:19.050 00:02:24.970 Clarence Stone: So, like, pro tip, like, last time, you know, something was, like.

8 00:02:25.300 00:02:30.079 Clarence Stone: like, a disconnect that we needed to align on, and we didn’t have that call. And I’m just like…

9 00:02:30.190 00:02:32.220 Clarence Stone: I hope that’s not happening again.

10 00:02:33.940 00:02:38.209 Pranav: With Danny, you’re saying? Or, I mean, I think you’re referring to, like, Lilo, right? Like…

11 00:02:38.210 00:02:39.070 Clarence Stone: Yeah.

12 00:02:39.070 00:02:39.660 Pranav: Yeah.

13 00:02:39.970 00:02:46.720 Clarence Stone: Yeah, so, like, if there’s just, like, a misalignment, we just hop on a 5-10 minute call. Like, just knock it out.

14 00:02:47.530 00:02:55.540 Pranav: Yeah, so this… I’m less concerned about because it’s, like, you know, it’s a lead.

15 00:02:55.840 00:03:00.120 Pranav: and, like, Robert has, like, all the context here. I’m not really, like…

16 00:03:00.470 00:03:06.850 Pranav: I feel like things are kind of just out of my hands. I’m just actually just trying to be, like, proactive in helping him.

17 00:03:07.100 00:03:18.729 Pranav: But yeah, I totally agree, like, that’s why, like, Utam kind of talked to me about this, too, is just, like, with, like, your CSO, like, delivery is your number one priority. So that’s why, like, in my calendar, too, like.

18 00:03:19.190 00:03:26.539 Pranav: I’ve just, like, blocked off, like, the beginning of the day to just, like, do all of my delivery work, you know? Like…

19 00:03:26.890 00:03:33.800 Pranav: turn, like, just kind of, like, tie a bow on, like, all that stuff. And yeah, I think that’s worked out well.

20 00:03:34.030 00:03:36.579 Pranav: But, yeah, no, I totally agree, like, if…

21 00:03:36.810 00:03:40.900 Pranav: thing, like, say if, like, something happened, like, with ABC, kind of, like, with…

22 00:03:41.510 00:03:45.599 Pranav: where I needed to, like, talk to you, like, I’ll be… I’ll do that way ahead of time, and…

23 00:03:45.880 00:03:49.659 Pranav: kind of… Lilo is definitely, like, a learning experience for me, too.

24 00:03:50.310 00:03:51.240 Pranav: But, yeah.

25 00:03:51.620 00:03:59.780 Clarence Stone: Yeah, so, with… like, client pursuits too, they do have an expiration date, like…

26 00:04:00.520 00:04:03.739 Clarence Stone: You probably forgot what you had for breakfast a week ago.

27 00:04:05.120 00:04:06.610 Clarence Stone: Do you remember, Pranav?

28 00:04:06.730 00:04:24.330 Clarence Stone: You don’t, right? And, you know, our clients aren’t going to remember us if we don’t follow up in an appropriate amount of time, either. So, I took a look at that email, and it’s been a minute, right? So, there is a sense of urgency there, and, like, if you can’t do it because you have client priorities.

29 00:04:24.330 00:04:34.839 Clarence Stone: totally get it. You should tell Robert, right? And say, like, hey, can you make sure you keep the client lead warmed? Like, we need a few more days, right? And this is, you know, our… we’re doing that.

30 00:04:35.180 00:04:36.330 Pranav: So, my thing, too, with this.

31 00:04:36.330 00:04:43.450 Clarence Stone: But, like, you just get forgotten, you send a follow-up, and then, like, you’re gonna tell me, oh, they didn’t answer, and we worked so hard on these demos.

32 00:04:44.110 00:04:48.280 Pranav: Yeah, so, I kinda wanna align on this, too, with, like, this client, it’s just, like.

33 00:04:48.650 00:05:03.609 Pranav: I… I’m, like, checking with Robert, like, all the time. He’s like, oh, I haven’t gotten to this yet either. So he’s kind of the one that’s taking point, and so, like, I’m just making sure, like, okay, he gives me something to do, I’m doing it immediately. This thing for, like…

34 00:05:03.980 00:05:06.229 Pranav: It’s just kind of like, he hasn’t…

35 00:05:06.590 00:05:13.159 Pranav: even give me any additional direction yet, because I’ve given him, like, the case studies, like, the demos,

36 00:05:14.150 00:05:16.520 Pranav: And so… Yes.

37 00:05:16.520 00:05:29.730 Clarence Stone: So, let me… let me kind of… don’t take this any wrong way, I’ll just give it to you, like, the truth. It’s like, this client is not like the clients you’ve worked on. If you just read the email, you can tell that this is a completely different build.

38 00:05:29.730 00:05:43.650 Clarence Stone: it takes a little bit of understanding on legal practice, and this is not like an e-com company or, like, the marketing-facing things that you’ve seen. Like, they’re working on serious infrastructure requirements, right?

39 00:05:43.740 00:05:53.340 Clarence Stone: So, like, the reason why Robert hasn’t really responded to you yet is, like, the demos you gave him didn’t really, you know, hit home.

40 00:05:54.320 00:05:55.300 Pranav: Gotcha.

41 00:05:55.320 00:06:08.780 Clarence Stone: So, like, it’s probably like, yo, I don’t want to upset him, I don’t want to, like… like, the last thing you want to do is, like, make someone feel like, you know, they didn’t make a meaningful contribution, right? Like, we don’t want to bum people out.

42 00:06:08.780 00:06:32.890 Clarence Stone: But, like, okay, then how do I phrase it for him? How do I, you know, communicate what does need to be done, right? And then, like, that slows down the cycle. So, non-response is really, like, hey, how do we let him down nicely? Because, like, you put in work, you’re vibe coding, like, that’s what we want in terms of behaviors, right? Like, you didn’t do anything wrong, it’s just, like, this is a different client, and

43 00:06:32.890 00:06:34.799 Clarence Stone: You know, my feedback to them is, like.

44 00:06:34.850 00:06:48.439 Clarence Stone: Somebody’s gotta sit down with you guys and mentor you guys and help you figure this stuff out. Like, you can’t just expect somebody to just be passed this email and figure things out. But here’s what’s crazy. I can. I saw that email and I said, I know what to do.

45 00:06:49.530 00:06:59.740 Clarence Stone: Right, and… and it’s not fair to just be like, oh, Clarence can do it, then everyone else can do it, or, like, Clarence just do it, because that doesn’t make it scalable.

46 00:07:00.340 00:07:08.730 Clarence Stone: Right? And nobody’s learning, and you’re probably just going to be on the same clients over and over again, and not, you know, expanding your AI capabilities.

47 00:07:09.060 00:07:14.529 Clarence Stone: So, what I want to do with you, hopefully you have more than 30 minutes, is actually.

48 00:07:14.530 00:07:19.710 Pranav: I have a meeting right after this. Okay. But I’m totally free.

49 00:07:20.400 00:07:26.109 Clarence Stone: Okay, so, let’s dissect the email then, right? Let me get the email up.

50 00:07:29.470 00:07:36.690 Clarence Stone: So, number one, like, before even looking at the email, I know that this is a…

51 00:07:37.610 00:07:40.899 Clarence Stone: Like, deployment structure that you’ve probably never seen before.

52 00:07:42.250 00:07:44.450 Pranav: Yeah, building something, like, on-prem.

53 00:07:44.900 00:07:45.640 Clarence Stone: Yeah.

54 00:07:46.410 00:07:48.499 Clarence Stone: Yep. Right. So, like…

55 00:07:48.730 00:07:59.909 Clarence Stone: there’s a lot of considerations that need to be happening, because on-prem can mean a lot of different things. Like, is it on-prem on a laptop? Well, then you have to think about

56 00:08:00.260 00:08:11.520 Clarence Stone: what models can possibly actually run on a laptop? What kind of horsepower are they working with, right? Can we give them, like, a Mac Studio, and say, this is, you know, what you plug into?

57 00:08:11.990 00:08:12.640 Pranav: Yep.

58 00:08:13.580 00:08:20.979 Clarence Stone: Okay, so let’s look through this. And this is why I was like, hey, we really need to just talk about the client first and get a good understanding of it.

59 00:08:21.040 00:08:34.309 Clarence Stone: Yeah. And then we can build something that’s meaningful, right? So, it says, hey, you know, they have terabytes of evidence collected over long periods of time using Twitter scraping.

60 00:08:34.590 00:08:40.880 Clarence Stone: Right. And Twitter scraping specifically has some tool requirements and usage.

61 00:08:41.870 00:08:50.759 Clarence Stone: Right? Because X.com and their MCP does better Twitter reading and scraping and searching than anything else.

62 00:08:51.680 00:08:58.439 Clarence Stone: Right. Social media, well, that might be something like an MCP for Manis, or Facebook.

63 00:08:58.700 00:09:05.400 Clarence Stone: Right? So, like, just in this, like, 3 lines, I’m already extrapolating all of these requirements that the client would want to see.

64 00:09:07.000 00:09:07.540 Pranav: Yeah.

65 00:09:07.540 00:09:20.529 Clarence Stone: Right? So, like, one, you’ve got terabytes of data, right, that are all sorts of different social media, and you have to run some sort of indexing and search for a large database.
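
The index-and-search requirement Clarence describes here could be prototyped with SQLite’s built-in FTS5 full-text index. This is a minimal sketch of the idea, not the production architecture; the posts and handles are invented stand-ins for the real evidence store:

```python
import sqlite3

# Tiny stand-in for "terabytes of social media evidence"; a real build
# would use a streaming ingest, not an in-memory list.
posts = [
    ("PokemonFan1736", "x", "heading to the lake house this weekend"),
    ("jim_r", "reddit", "I never told anyone about the lake house"),
    ("acct42", "x", "great tacos in Austin today"),
]

db = sqlite3.connect(":memory:")
# FTS5 builds the inverted index for us at insert time.
db.execute("CREATE VIRTUAL TABLE evidence USING fts5(handle, platform, body)")
db.executemany("INSERT INTO evidence VALUES (?, ?, ?)", posts)

# Phrase query: surface every post, from any platform, mentioning the phrase.
hits = db.execute(
    "SELECT handle, platform FROM evidence WHERE evidence MATCH ?",
    ('"lake house"',),
).fetchall()
print(hits)  # the X post and the Reddit post both surface
```

The point of the sketch is that cross-platform search over one indexed store is the primitive everything else builds on; the actual engine for terabytes would be something far heavier.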

66 00:09:21.740 00:09:26.780 Clarence Stone: Right. That is a huge challenge when you layer on top of that.

67 00:09:26.880 00:09:30.989 Clarence Stone: You know, it needs to be some sort of secure inference device.

68 00:09:32.240 00:09:32.950 Clarence Stone: Right.

69 00:09:33.110 00:09:34.800 Clarence Stone: So, like.

70 00:09:35.280 00:09:53.620 Clarence Stone: regardless of whether, like… we’ll dig into architecture later on, because I can explain that to you, but, like, what this communicates to me directly is we should show the client a way to find meaningful things out of, like, terabytes of Twitter feeds.

71 00:09:54.510 00:10:11.159 Clarence Stone: Right? Because, like, regardless of whether we do this demo on cloud or on-prem, it doesn’t matter, right? Yeah. We can prove that we can, like, really intelligently find something from social media that’s meaningful for investigative work.

72 00:10:12.290 00:10:13.050 Pranav: Gotcha.

73 00:10:13.300 00:10:14.109 Pranav: Right? Yeah.

74 00:10:14.110 00:10:16.510 Clarence Stone: So, that’s what I read from here.

75 00:10:16.920 00:10:26.349 Clarence Stone: And then, like, the real concerns around AI and investigative work are that they’re doing multiple cases at the same time.

76 00:10:26.720 00:10:34.969 Clarence Stone: Right, so the implicit demand here is that you need to have data barriers from case to case.

77 00:10:35.750 00:10:43.330 Clarence Stone: Right. So, it’s sort of like in Lilo, they were saying, we want a different container for every client.

78 00:10:43.950 00:11:00.579 Clarence Stone: Right? And then, like, I want you to wrap it all in, and then find the patterns across all the clients, give us a consolidated report. And that’s sort of the same pattern here, right? You’ve got a terabyte of data from this case, you have another terabyte of data from that case, right? But those things should not touch.

79 00:11:02.400 00:11:04.320 Clarence Stone: Right? So…

80 00:11:04.560 00:11:14.769 Clarence Stone: The natural way I built work was in the app, when you set up a workspace, it is a data barrier to any other workspace.

81 00:11:15.600 00:11:23.760 Clarence Stone: So that’s already a value prop that this industry needed, and I knew about, and I built specifically to the way the architecture works.
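
The per-case data barrier Clarence describes could be sketched as one fully separate store per workspace, so a query in one case physically cannot see another case’s evidence. The class and schema here are hypothetical, not the app’s actual implementation:

```python
import sqlite3

class Workspace:
    """One isolated evidence store per case; no shared handles between cases."""

    def __init__(self, case_id: str):
        self.case_id = case_id
        # A separate connection (a separate file in production, in-memory
        # here) per case IS the barrier: there is no query path that can
        # reach another workspace's rows.
        self._db = sqlite3.connect(":memory:")
        self._db.execute("CREATE TABLE evidence (source TEXT, body TEXT)")

    def ingest(self, source: str, body: str) -> None:
        self._db.execute("INSERT INTO evidence VALUES (?, ?)", (source, body))

    def search(self, term: str):
        return self._db.execute(
            "SELECT source, body FROM evidence WHERE body LIKE ?",
            (f"%{term}%",),
        ).fetchall()

case_a = Workspace("case-a")
case_b = Workspace("case-b")
case_a.ingest("x", "suspect seen near the warehouse")
case_b.ingest("reddit", "unrelated confession")

# Case B's search cannot surface Case A's evidence.
assert case_b.search("warehouse") == []
assert len(case_a.search("warehouse")) == 1
```

Isolation-by-construction like this is the property to demo, rather than access-control checks bolted onto one shared database.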

82 00:11:23.970 00:11:36.429 Clarence Stone: Right? So, that’s another, like, feature point. So, like, I’m also extrapolating, like, what would they want to hear as a user benefit after reading this?

83 00:11:37.560 00:11:38.480 Clarence Stone: Right?

84 00:11:39.460 00:11:52.809 Clarence Stone: And then two is, we’re not proposing building a product that replaces the human diligence. The goal is a tool that supports analysts and attorneys under human verification every step. So, this…

85 00:11:53.500 00:12:06.610 Clarence Stone: is a research topic for you, right? We need to punch into AI how analysts and attorneys, or their teams, may want to use a research goal

86 00:12:06.820 00:12:08.869 Clarence Stone: And how can we create

87 00:12:08.990 00:12:14.660 Clarence Stone: a step-by-step that says, hey, what are you looking to research? And instead of just doing it.

88 00:12:14.950 00:12:19.940 Clarence Stone: Right? We have to give them follow-on questions or some sort of precision

89 00:12:20.290 00:12:25.509 Clarence Stone: So that, like, they’re not just getting a bunch of results that aren’t useful.

90 00:12:26.890 00:12:27.949 Clarence Stone: Does that make sense?

91 00:12:29.070 00:12:32.080 Pranav: Yeah, yeah. Like, the AI should…

92 00:12:32.840 00:12:42.210 Pranav: it can’t just be giving them, like, worse insights than they’re getting currently. Like, even if it does it faster, it’s not… that’s not helpful.

93 00:12:42.210 00:12:51.939 Clarence Stone: Yeah, so, like, you gotta realize, like, these are non-technical people, so they might just say, you know, I’m not gonna use this legal context, right? I’ll say, like, for now, if you come to me and you say you want to buy a home.

94 00:12:52.290 00:13:03.590 Clarence Stone: Right. If I just went to Zillow and just gave you, like, 3 pages of Zillow listings in Austin, are you gonna be like, yo, that’s awesome, or what the fuck is this trash?

95 00:13:03.890 00:13:05.149 Pranav: Right, yeah, exactly.

96 00:13:05.380 00:13:23.289 Clarence Stone: Right, so we need to know, in that second layer, what are the follow-up questions that these analysts and attorneys need to be able to answer to get a really good response? Well, I will probably, instead of giving you, you know, like, 500 homes in Austin for now, I would say, yo, my dude, like, how many garages do you need?

97 00:13:23.660 00:13:24.310 Pranav: Yeah.

98 00:13:24.560 00:13:27.629 Clarence Stone: Right? How many bedrooms are you looking for? What is your price point?

99 00:13:28.080 00:13:35.130 Clarence Stone: Right? So instead of answering the question, it should follow up and say, hey, I’m… we’re gonna start this deep research project.

100 00:13:35.290 00:13:36.180 Clarence Stone: Right?

101 00:13:36.530 00:13:39.279 Clarence Stone: Tell me more about what you want to investigate.

102 00:13:40.750 00:13:41.730 Clarence Stone: Right?

103 00:13:42.150 00:13:45.990 Clarence Stone: And then, like, privacy and chain of custody. Well…

104 00:13:46.980 00:13:57.280 Clarence Stone: This is more of a digital forensic thing, there’s not really much, like, AI here, but it could be, like, being able to trace the same people across social media.

105 00:13:59.040 00:13:59.770 Pranav: Hmm.

106 00:14:00.110 00:14:03.890 Clarence Stone: Right? Like, is this person Jim?

107 00:14:04.250 00:14:09.220 Clarence Stone: like, Pokemon Fan 1736 on X.

108 00:14:09.710 00:14:10.510 Pranav: Yeah.

109 00:14:10.510 00:14:13.119 Clarence Stone: Right? That’s what they need to be able to trace.

110 00:14:13.990 00:14:23.010 Clarence Stone: And… Like, be able to understand that all of that’s happening on, like, a consolidated, like, container.

111 00:14:23.220 00:14:32.349 Clarence Stone: Right? So that’s, like, 3 demos for you right there. One is, like, finding really good details that are precise in a large set of data.

112 00:14:34.200 00:14:39.629 Clarence Stone: Right? Two is doing investigative work. So this is sort of like prospecting.

113 00:14:40.330 00:14:49.930 Clarence Stone: Right? Like, you saying, like, hey, Clarence, I only want to buy a blue house. Like, don’t even, like, just tell me where all the blue houses are in Austin.

114 00:14:50.540 00:14:51.200 Pranav: Yep.

115 00:14:51.200 00:14:53.270 Clarence Stone: Right? That’s really fucking hard.

116 00:14:54.640 00:15:04.709 Clarence Stone: Right? And not just in, like, I’m not gonna just look on Zillow, I’m probably gonna look on Trulia, I’m probably gonna look on Remax, probably look at MLS, right? And that is what this task is here.

117 00:15:04.840 00:15:15.429 Clarence Stone: Here, it’s saying, hey, like, the goal is to support analysts and attorneys with human verification at every step. It’s like, okay, I know about all these blue houses.

118 00:15:15.530 00:15:24.949 Clarence Stone: Right? But now I want to investigate into each of these blue houses, like, who owned it, right? Or, like, what it might be. This… this helps them zoom in.

119 00:15:26.180 00:15:32.149 Clarence Stone: Right, so it could be, hey, you have a lead, you want to explore it, right, let’s take a look.

120 00:15:34.110 00:15:34.630 Pranav: show.

121 00:15:35.740 00:15:45.249 Clarence Stone: just from this, we’ve got some homework to do, because one, like, there’s a bunch of really cool tools that have come out recently, like the Palantir-style data lookups.

122 00:15:45.390 00:15:51.649 Clarence Stone: Right. You might be able to just find on Hugging Face a bunch of social media data.

123 00:15:52.660 00:16:08.380 Clarence Stone: to enrich your environment. And then, like, I would just come up with two skills, right, that show, like, two containerized environments: one that’s just doing a massive data lookup, two that’s, like, helping them with, like, specific research.

124 00:16:10.160 00:16:10.720 Pranav: Yeah.

125 00:16:10.900 00:16:21.009 Clarence Stone: Right? And creating, like, this deep research report that says, oh, Pranav, you’re looking for a 3-bedroom house with 2 baths and a garage in Austin, that is blue.

126 00:16:21.010 00:16:34.840 Clarence Stone: Right? These are all the places we found your homes. This is your match criteria. This is, you know, the links in which we found that information, so you can double check that those houses are still existing on the market.

127 00:16:35.270 00:16:35.900 Pranav: Yeah.

128 00:16:36.740 00:16:49.350 Clarence Stone: And, like, create that output report to say, this is what, you know, an analyst would, you know, be able to output just by following the step-by-step verification process.

129 00:16:50.690 00:16:51.370 Pranav: Yeah.

130 00:16:51.960 00:16:57.990 Pranav: Agreed, like… Okay, so then, my follow-up to this, right, is like, yeah, so I…

131 00:16:58.120 00:17:06.570 Pranav: I basically built a much lazier version of a demo. I didn’t even, like, think, like, line by line, like, okay, exactly…

132 00:17:06.770 00:17:09.550 Pranav: Like, going into this type of analysis, which…

133 00:17:09.880 00:17:17.749 Pranav: makes a lot of sense, like, what a… like, what a better demo would look like to me now. But then my question is, like.

134 00:17:18.300 00:17:24.230 Pranav: I’m still building just, like, a simulated environment, right? Like, so I am not… okay.

135 00:17:24.670 00:17:34.579 Clarence Stone: Yeah, absolutely. Like, you don’t have any legal data, so I don’t expect you to build anything that’s real, but maybe on Hugging Face, there are some data sets. Let’s actually look at it right now together.

136 00:17:34.920 00:17:37.409 Clarence Stone: We have time, right? We have 15 minutes.

137 00:17:37.650 00:17:38.220 Pranav: Yeah.

138 00:17:38.370 00:17:43.279 Pranav: And so, should it be functional, I guess, in that way? Like, it should actually, like, pull from that data?

139 00:17:43.470 00:17:54.490 Clarence Stone: Yes. So, like, the implied task here now is that if we want a really, like, seriously functional demo, like, we should have Sam clone that environment that he has.

140 00:17:55.290 00:17:56.030 Pranav: Hmm.

141 00:17:56.030 00:18:03.900 Clarence Stone: Right? And I should have a different, like, repo so that I can give you a front end that looks like DNY.

142 00:18:05.310 00:18:15.480 Clarence Stone: And then you can build up the skills, and honestly, it wouldn’t take us more than a day, because all we have to do is do, like, a really good search and a really good guided research plan.

143 00:18:16.220 00:18:16.880 Pranav: Right.

144 00:18:17.560 00:18:22.099 Clarence Stone: Okay, so I’m on Hugging Face, I’m looking at datasets, search datasets, so forth.

145 00:18:22.330 00:18:24.250 Clarence Stone: Let’s just see if there’s anything there.

146 00:18:24.530 00:18:26.460 Clarence Stone: Okay, it gives me a plan.

147 00:18:26.630 00:18:28.929 Clarence Stone: Hugging Face Search is fucking ass.

148 00:18:32.160 00:18:34.580 Clarence Stone: Yeah, that’s… yeah, here.

149 00:18:37.510 00:18:39.230 Clarence Stone: And it’s in Spanish.

150 00:18:39.360 00:18:41.460 Clarence Stone: Let’s try again.

151 00:18:43.170 00:18:44.409 Clarence Stone: Okay, here we go.

152 00:18:45.130 00:18:48.809 Clarence Stone: Social bias frames, social evals,

153 00:18:49.900 00:18:55.509 Clarence Stone: Okay, 1 million Reddit confessions. That is fucking awesome.

154 00:18:56.800 00:19:08.099 Clarence Stone: Right? And if you took this and matched it with maybe, an X feed, and be like, we can match who said something on X based on a confession that’s in Reddit.

155 00:19:08.850 00:19:09.520 Pranav: Hmm.

156 00:19:12.040 00:19:15.449 Clarence Stone: Right. That’s how you get customers to say.

157 00:19:15.720 00:19:17.899 Clarence Stone: Like, just take my money.

158 00:19:18.800 00:19:19.500 Pranav: Yeah.

159 00:19:20.870 00:19:21.690 Clarence Stone: Right?

160 00:19:22.530 00:19:24.890 Clarence Stone: So, I mean, this is one dataset.

161 00:19:26.690 00:19:28.539 Clarence Stone: We’d probably find another one.

162 00:19:31.320 00:19:34.809 Clarence Stone: Internship, there’s a lot of Reddit, holy shit.

163 00:19:35.270 00:19:36.729 Pranav: And so, in this demo.

164 00:19:36.730 00:19:38.519 Clarence Stone: Well, boom, here’s another one.

165 00:19:39.340 00:19:41.079 Clarence Stone: Take that and link it.

166 00:19:47.730 00:19:48.750 Pranav: Gotcha.

167 00:19:49.100 00:19:54.229 Clarence Stone: Right? You should be like, hey, somebody confessed to this. Who on the social media might have said it?
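
The confession-to-post matching demo could start with nothing fancier than bag-of-words cosine similarity; a real build would likely use embeddings. All posts and handles below are fabricated:

```python
import math
from collections import Counter

def cosine(a: str, b: str) -> float:
    """Bag-of-words cosine similarity between two texts."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

confession = "i lied about where i was the night of the fire"
x_posts = {
    "PokemonFan1736": "no one knows where i was the night of the fire",
    "acct42": "best tacos in austin, come through",
}

# Rank candidate accounts by similarity to the confession, best match first.
ranked = sorted(x_posts, key=lambda h: cosine(confession, x_posts[h]),
                reverse=True)
print(ranked[0])  # PokemonFan1736
```

For the lawyer-facing report, each ranked match would carry its score and the links to the posts it was derived from, so a human can verify the correlation.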

168 00:19:56.430 00:20:01.720 Pranav: And so, it’s kind of… this demo, right? Because how I was thinking about…

169 00:20:02.410 00:20:13.760 Pranav: demos in the past for, like, smaller ticket clients, I guess. Like, maybe ones that are gonna be, like, 10K, 15K. It’s just, like, doing this much due diligence, like.

170 00:20:14.060 00:20:28.830 Pranav: I don’t know if it’s the right thing to do, like, Utam was also saying, like, you know, maybe, like, you know, don’t spend a ton of time on those type of things. But then on this type of client, right, like, where it’s… where Robert’s, like, it can literally be, like, 100K per month.

171 00:20:29.180 00:20:33.580 Pranav: Yeah, I definitely should be putting in more effort for the demo.

172 00:20:34.480 00:20:42.550 Pranav: is this kind of, like, is it informing, like, kind of your decision, too, like, for why I’m, like, going… like, I should go in much harder with, like, spending a day on, like, a.

173 00:20:42.710 00:20:52.710 Clarence Stone: So, welcome to my world. This is the segment I play in. I don’t wake up unless I’m being paid, like, more than 300 bucks an hour.

174 00:20:53.780 00:21:01.380 Clarence Stone: And… I, like, I can’t charge that kind of stuff unless I command this kind of service.

175 00:21:01.810 00:21:15.709 Clarence Stone: Gotcha. So this is a different world. And if you guys want to play in this world, and by the way, Tom and Robert and I have multiple conversations about this. This is where they want to be, right? Having larger clients and bigger accounts, this is how we have to go to market.

176 00:21:16.720 00:21:17.320 Pranav: Okay.

177 00:21:17.490 00:21:18.150 Pranav: Gotcha.

178 00:21:18.150 00:21:22.050 Clarence Stone: Very different than that 10 to 20 space you’ve been playing in, for sure.

179 00:21:22.050 00:21:26.890 Pranav: Yeah, and I… looking back, I built those demos, like, super fast.

180 00:21:27.050 00:21:31.420 Pranav: But, like, in an effort… and I guess… kind of…

181 00:21:32.290 00:21:47.489 Pranav: my thinking was, like, okay, let me just build them multiple demos, let me give them four case studies, and, like, we’ll have something built out in, like, OpenWork 2 to kind of just, like, be another thing. So it’s kind of volume, but yeah, I mean, I like this way better, where we just, like.

182 00:21:47.550 00:21:58.489 Pranav: Have a few demos, maybe… maybe less material that we show them, but, like, something that has real, just, like, wow factor, and something with actual functionality, not just, like, a click-through, like, interface.

183 00:21:59.120 00:22:18.990 Clarence Stone: Yeah, so I’m gonna give you 3 ideas, and I want you to create 3 plans on how you might create those skills. Look for pre-made things on X. I’m sorry, on X or, like, GitHub, right? Like, let’s not start from scratch, but, like, one is finding something in a large database, right? If you had a bunch of social media posts.

184 00:22:18.990 00:22:25.249 Clarence Stone: And you wanted to find some correlations, like, tell me about somebody who talked about this topic, right?

185 00:22:25.710 00:22:28.870 Clarence Stone: Like, give me their names and who they are, right?

186 00:22:28.870 00:22:29.390 Pranav: Yep.

187 00:22:29.390 00:22:35.720 Clarence Stone: Two is, like, I found this confession on Reddit, match it to whose X post it might be.

188 00:22:36.200 00:22:45.970 Clarence Stone: Right? And 3 is, if you had somebody’s account, right, just like their entire ex-profile, or any social media, I don’t really care what it is, right?

189 00:22:45.970 00:22:58.520 Clarence Stone: I want you to be able to apply a skill that says, give me a mental profile analysis of this person. What is their headspace? What is their motivation? And highlight to me where you are finding that information.

190 00:23:01.000 00:23:07.339 Clarence Stone: Right? Those are the things your client wants, and from that email, I figured it all out.

191 00:23:07.340 00:23:08.770 Pranav: Yeah, yeah.

192 00:23:08.770 00:23:09.160 Clarence Stone: Right.

193 00:23:09.160 00:23:12.259 Pranav: This is what Robert was looking for me to do, too.

194 00:23:12.410 00:23:13.040 Pranav: Yeah. For sure.

195 00:23:13.040 00:23:29.280 Clarence Stone: But, like, I was telling them back last night, I was like, you can expect that, like, this is a, you know, skill set that gets honed, somebody has to explain it, right? Like, this isn’t something that, you know, you just expect somebody new to do. Like.

196 00:23:29.280 00:23:29.880 Pranav: Yeah.

197 00:23:29.880 00:23:32.639 Clarence Stone: I didn’t wake up one day and just be able to do this.

198 00:23:33.040 00:23:33.640 Clarence Stone: Right.

199 00:23:33.740 00:23:34.500 Pranav: Yeah.

200 00:23:35.540 00:23:35.870 Clarence Stone: Okay.

201 00:23:35.870 00:23:40.840 Pranav: No, that makes sense. Okay, cool.

202 00:23:40.950 00:23:44.270 Pranav: Yeah, one…

203 00:23:44.270 00:23:57.800 Clarence Stone: For OpenWork, just remember that it’s just an open harness, right? So, if you think about how Claude Code was, and the way you were building stuff for Lilo, it was figuring out skills, MCPs, connectors, and data.

204 00:23:58.760 00:24:04.519 Clarence Stone: Right. It might not be all four, it might be a combination of two or three of them, right, whatever it is.

205 00:24:04.900 00:24:10.619 Clarence Stone: That’s the same thing for OpenWork, except we have even more, like, control over it.

206 00:24:11.980 00:24:18.240 Clarence Stone: Right? So, for each of these things, I would say, hey, like, what dataset would I use to demo this?

207 00:24:18.920 00:24:23.550 Clarence Stone: Right? What skill would I trigger? How would that experience be?

208 00:24:23.650 00:24:28.399 Clarence Stone: Right? User will go here, click here, and this is what the output should be.

209 00:24:28.730 00:24:29.220 Pranav: Hmm.

210 00:24:29.220 00:24:29.920 Clarence Stone: Right.

211 00:24:30.130 00:24:35.079 Clarence Stone: And then… and then, like, what other, like, outputs

212 00:24:35.430 00:24:42.689 Clarence Stone: does the end state look like, right? Like, for the Reddit confessions, I said, like, I want a deep research report, right?

213 00:24:42.690 00:24:43.300 Pranav: Yep.

214 00:24:43.300 00:25:01.770 Clarence Stone: It’s most likely, you know, this X profile with a 90% match, here are some links to the post and why we correlated. Here’s this other profile, it’s a 50% match, right? Like, legal, you know, work requires that, right? You can’t just say, oh, it’s Clarence, he did it.

215 00:25:01.770 00:25:12.069 Clarence Stone: Right? Like, people’s lives are on the line, and, you know, lawyers can be held accountable. They need to be able to show direct proof for those things. Mental profile analysis, right? Like.

216 00:25:12.760 00:25:17.080 Clarence Stone: What could you do to make that useful for a lawyer?

217 00:25:19.480 00:25:27.390 Clarence Stone: Right? And… So, you would write out… What is the objective?

218 00:25:28.550 00:25:32.070 Clarence Stone: Why it would be useful to your user.

219 00:25:32.630 00:25:35.769 Clarence Stone: What that experience looks like.

220 00:25:37.740 00:25:41.500 Clarence Stone: And… Cool.

221 00:25:41.720 00:25:55.220 Clarence Stone: And, well, we got 5 minutes, I’m gonna let you go, but, like, this is the task. If you want to huddle with me and do the first one together, I’m more than happy. Like, like, I don’t expect you to hit it out of the park right away. This is hard, right?

222 00:25:55.220 00:25:55.920 Pranav: Yeah.

223 00:25:55.920 00:25:58.810 Clarence Stone: And now you see the difference between the price brackets.

224 00:25:59.280 00:26:02.219 Pranav: Yeah. No, this is good for me to know, because, like.

225 00:26:02.800 00:26:06.060 Pranav: I wouldn’t have felt comfortable doing all of this.

226 00:26:06.340 00:26:21.130 Pranav: before, unless, like, someone told me to, because I would have been like, no, I’m doing overkill, like, this is client work that should be, like, after we actually get paid. But, yeah, you telling me that, like, you know, when you operate at this, like, type of level, for, like, in terms of…

227 00:26:21.250 00:26:27.510 Pranav: like, contract size, like, you need to deliver this type of demo? Like, that makes total sense to me.

228 00:26:27.510 00:26:33.220 Clarence Stone: Yeah, I mean, it’s a law firm, right? Like, they’re just not gonna randomly hire somebody because you’re good for it.

229 00:26:33.990 00:26:35.889 Clarence Stone: It’s like, yo, show me the goods.

230 00:26:37.280 00:26:39.080 Pranav: Yeah, makes sense. Yeah.

231 00:26:39.080 00:26:39.840 Clarence Stone: Right.

232 00:26:40.190 00:26:59.060 Clarence Stone: So, one, we’re gonna say, hey, we can show you that we can do it. Two, we need to have a deeper architectural discussion, and I’ll teach you the architectural stuff on why, like, local AI is slightly different, and, you know, the issues that you’re gonna run into. Like, there’s huge benefits, but also huge issues on, like, different architectures.

233 00:27:00.240 00:27:20.010 Clarence Stone: But, this is a fun one, because, like, if you lock this type of client in, you will be working for years, and just, like, getting completely amortized a ton of money, because government contracts don’t go away. You lock one in, you’re just gonna have a great client for a while. It’s worth, you know, the juice is worth the squeeze.

234 00:27:21.460 00:27:26.640 Pranav: To… yeah, so, yeah, what I’m gonna do is I’m… I’m just gonna be like… First off, like…

235 00:27:26.830 00:27:34.490 Pranav: Tonight, I’ll just kind of go crazy on, like, trying to build out these demos, and I’ll send something to you, there’ll probably be, like, a bunch of, like, edits, or just, like, comments.

236 00:27:34.490 00:27:38.160 Clarence Stone: Don’t build anything. Literally plan it.

237 00:27:38.160 00:27:39.260 Pranav: Okay, plan it, yeah, yeah.

238 00:27:39.510 00:27:47.820 Clarence Stone: have a conversation with AI, saying, like, hey, I’m working for a law firm that wants to make an AI assistant for mental profile analysis.

239 00:27:47.920 00:28:01.959 Clarence Stone: Right. Where are some data points that I can start with, right? Like, you can just scroll through it and just be like, actually, I want to use this exact one, tell me how I can leverage it. Right? Or you’ll look into these datasets and realize, like, hey, it’s missing certain things.

240 00:28:02.980 00:28:08.260 Clarence Stone: So, it might not be the right dataset. And you can just ask AI, like, what is a good data set, then?

241 00:28:08.680 00:28:09.410 Pranav: Yeah.

242 00:28:09.410 00:28:15.309 Clarence Stone: Right? Like, this one for political social, like, this is probably really good for mental profile analysis.

243 00:28:15.800 00:28:16.440 Pranav: Hmm.

244 00:28:16.720 00:28:26.910 Clarence Stone: Right? Because you can then match, like, sentiment negative, sentiment neutral, like, all of this will help you figure out what is their mental state.

245 00:28:28.220 00:28:28.900 Pranav: Yeah.

246 00:28:29.970 00:28:43.189 Clarence Stone: So, like, the recon part is going to save you so much time on the tail end. Like, when… if you build something and it’s just, like, useless, like, you’ll realize that, man, I should have just, like, made sure I had a sound plan.

247 00:28:44.110 00:28:45.500 Pranav: Yeah. Yep.

248 00:28:46.950 00:28:57.080 Clarence Stone: Yeah, so don’t build anything, make a plan, and present it to me, like, we’ll talk through it, right? And then we can execute from there. But,

249 00:28:57.910 00:29:02.920 Clarence Stone: I mean, if you want to go through the first one together after you get off your next meeting, we can.

250 00:29:03.030 00:29:06.939 Clarence Stone: It’s… it’s not gonna be crazy, it’s probably gonna take you, like, 30 minutes each.

251 00:29:08.290 00:29:11.150 Clarence Stone: This is fun work. This is fun thinking work.

252 00:29:11.330 00:29:22.019 Pranav: Yeah, I agree. This is, like, I feel like a lot of, like, the other stuff that we’re doing, it just kind of, like, seems like I’m just doing the same thing over again, like, saying the same things over and over again, but, like, this is a unique problem.

253 00:29:22.240 00:29:26.500 Pranav: And a very unique client, so, like, Yeah, this is definitely something.

254 00:29:26.500 00:29:28.700 Clarence Stone: This is where the money is.

255 00:29:29.040 00:29:30.839 Pranav: Yeah, that’s what it seems like, yeah.

256 00:29:31.810 00:29:32.560 Pranav: Sick.

257 00:29:32.850 00:29:36.890 Pranav: Okay, Clarence, yeah, let me hop off so I can just get a couple minutes to set up for this call.

258 00:29:37.110 00:29:40.090 Clarence Stone: And then ping me, we’ll drive through the first one.

259 00:29:40.340 00:29:42.380 Pranav: Perfect. Yeah, that sounds good. Cool.

260 00:29:43.050 00:29:44.119 Pranav: Thanks for the help.

261 00:29:44.120 00:29:44.790 Clarence Stone: Yep.