WEBVTT
Meeting Title: Brainforge Office Hours Planning Session Date: 2026-03-04 Meeting participants: Hannah Wang, Ryan Brosas, Pranav Narahari, Luke Scorziell, Uttam Kumaran
1 00:00:56.840 ⇒ 00:00:57.920 Pranav Narahari: Hey guys.
2 00:01:01.230 ⇒ 00:01:02.570 Hannah Wang: Hey, how’s it going?
3 00:01:03.570 ⇒ 00:01:04.709 Pranav Narahari: Pretty good, pretty good.
4 00:01:28.360 ⇒ 00:01:31.070 Pranav Narahari: Sorry, I was just looking through my tabs here.
5 00:01:31.250 ⇒ 00:01:35.020 Pranav Narahari: I just got back home. I’m on the East Coast, I don’t know where you guys are based out of.
6 00:01:36.360 ⇒ 00:01:39.220 Hannah Wang: Oh, I’m in… I’m in LA, so…
7 00:01:39.570 ⇒ 00:01:42.430 Hannah Wang: A few hours behind, behind you.
8 00:01:43.590 ⇒ 00:01:47.660 Hannah Wang: Are you originally in the East Coast? From the East Coast?
9 00:01:48.180 ⇒ 00:01:53.609 Pranav Narahari: Yeah, I grew up here. I was in Austin for the last 2 months before last week.
10 00:01:53.780 ⇒ 00:01:58.050 Pranav Narahari: But yeah, I’m, like, based out of Massachusetts.
11 00:01:58.660 ⇒ 00:02:00.829 Hannah Wang: Okay, why were you in Austin?
12 00:02:01.370 ⇒ 00:02:08.059 Pranav Narahari: So I used to live there, so I have, like, a lot of, like, friends and just, connections there.
13 00:02:08.280 ⇒ 00:02:13.689 Pranav Narahari: It was also, like, nice that Uttam was out there too, like, kind of being able to go to WeWorks and stuff, so…
14 00:02:13.690 ⇒ 00:02:15.870 Hannah Wang: Just on Airbnb for a little bit?
15 00:02:18.050 ⇒ 00:02:18.390 Pranav Narahari: Yeah.
16 00:02:18.390 ⇒ 00:02:20.790 Hannah Wang: Is it cold over there? And…
17 00:02:20.790 ⇒ 00:02:31.420 Pranav Narahari: Yeah, I came at the worst time. Like, the snow, like, is insane. Like, when I was driving, like, I had to change my, like, my travel plans a little bit, just because of the snow.
18 00:02:31.550 ⇒ 00:02:32.809 Hannah Wang: Oh my gosh.
19 00:02:33.010 ⇒ 00:02:34.730 Hannah Wang: I can’t relate. I…
20 00:02:35.830 ⇒ 00:02:39.119 Pranav Narahari: Yeah, it’s sunny here.
21 00:02:39.120 ⇒ 00:02:40.269 Luke Scorziell: It’s always perfect over there
22 00:02:43.640 ⇒ 00:02:45.929 Luke Scorziell: Hey, Luke. Hey, how’s it going?
23 00:02:46.880 ⇒ 00:02:48.320 Pranav Narahari: Pretty good, pretty good.
24 00:02:49.830 ⇒ 00:02:56.060 Luke Scorziell: Alright, well, we’ve officially… Invited a ton of people, and have people attending.
25 00:02:56.470 ⇒ 00:02:58.820 Pranav Narahari: 8 people is crazy, I didn’t think we’d get 8.
26 00:02:58.820 ⇒ 00:03:00.220 Luke Scorziell: It’s 10, we have 10 now.
27 00:03:00.390 ⇒ 00:03:02.450 Pranav Narahari: Oh, 10? Oh my god.
28 00:03:03.100 ⇒ 00:03:06.180 Luke Scorziell: So I can kind of… like…
29 00:03:07.930 ⇒ 00:03:13.060 Luke Scorziell: Yeah, I’m just like, alright, we’ll see how this goes. I mean, I think it… I think it’s… I’m really excited, I think this is, like.
30 00:03:13.440 ⇒ 00:03:15.860 Luke Scorziell: the right momentum.
31 00:03:16.040 ⇒ 00:03:16.490 Pranav Narahari: Yep.
32 00:03:16.490 ⇒ 00:03:18.849 Luke Scorziell: for us, because I think that,
33 00:03:20.160 ⇒ 00:03:29.720 Luke Scorziell: Yeah, I mean, this is 8 MQLs, several are already, like, like a Jeff McDonald is coming, who we’ve been talking to from movers and shakers.
34 00:03:29.950 ⇒ 00:03:38.659 Luke Scorziell: And then this gives us just, like, a really kind of, like, solid playbook, I think, to kind of do this again, either with
35 00:03:39.290 ⇒ 00:03:44.189 Luke Scorziell: Yeah, more agency people, or… yeah, so, so we’ll just see, but,
36 00:03:46.350 ⇒ 00:03:49.379 Luke Scorziell: I guess hope for today,
37 00:03:50.020 ⇒ 00:03:57.250 Luke Scorziell: is I kind of… I put together what I think is, like, a pretty good…
38 00:03:59.650 ⇒ 00:04:03.360 Luke Scorziell: Presentation… or, no, it’s not only a presentation.
39 00:04:03.720 ⇒ 00:04:07.769 Luke Scorziell: So I can share my screen.
40 00:04:09.930 ⇒ 00:04:15.360 Luke Scorziell: I know, Hannah, you looked through it. I don’t know if Pranav, you got a chance to yet. And then Uttam was maybe gonna join, but…
41 00:04:15.730 ⇒ 00:04:17.399 Pranav Narahari: Yeah, I didn’t get a chance to look through it.
42 00:04:21.630 ⇒ 00:04:30.179 Luke Scorziell: So… Yeah, basically, I’m thinking, like, the goal of this…
43 00:04:30.320 ⇒ 00:04:32.330 Luke Scorziell: I kind of sent this in the chat, too.
44 00:04:32.900 ⇒ 00:04:40.630 Luke Scorziell: It’s kind of like… to… I mean, we want to add, like, as much value as we can.
45 00:04:41.090 ⇒ 00:04:46.090 Luke Scorziell: To these people, as we’re on,
46 00:04:46.640 ⇒ 00:04:50.190 Luke Scorziell: as they’re on this call, like, I hope that they come away feeling like…
47 00:04:50.600 ⇒ 00:04:59.040 Luke Scorziell: oh man, like, that was super dope, like, I want to come back to another one of these. And I don’t… yeah, so… in that sense, like, I want it to be, like, engaging.
48 00:04:59.160 ⇒ 00:05:02.330 Luke Scorziell: The hope is not that it’s, like, us talking at them.
49 00:05:02.700 ⇒ 00:05:05.160 Luke Scorziell: As much as it’s kind of a…
50 00:05:05.620 ⇒ 00:05:12.690 Luke Scorziell: facilitating… but not, like, fully facilitating, but kind of, like, including them in the conversation, I guess.
51 00:05:14.460 ⇒ 00:05:19.429 Luke Scorziell: And so, in that sense, it’s, like, building relationships, and then, yeah, Ryan, the hope is that we can, like.
52 00:05:19.930 ⇒ 00:05:24.000 Luke Scorziell: pull some good content, out of this. So…
53 00:05:24.250 ⇒ 00:05:29.979 Luke Scorziell: That’s kind of the goals. I also sent that message in the chat, I don’t know if anyone has any, like, thoughts, or, like…
54 00:05:30.980 ⇒ 00:05:34.650 Luke Scorziell: Anything off of that?
55 00:05:34.860 ⇒ 00:05:36.990 Luke Scorziell: But otherwise, I can kind of keep moving along.
56 00:05:36.990 ⇒ 00:05:45.880 Uttam Kumaran: Yeah, I put some… I put a couple comments. I think, like, just think about webinars you guys had gone to. They most… they mostly sucked.
57 00:05:46.080 ⇒ 00:05:51.659 Uttam Kumaran: And so, just try to do the opposite of whatever the webinar you’ve most recently been to has done, like…
58 00:05:51.710 ⇒ 00:06:08.129 Uttam Kumaran: actually have fun, try to be, like… try to… there’s only 8 people, so, like, call people out, say hi, like, get people to turn their video on if they can, like, make an impression that lasts, and, like, do it kind of the way we do it, like, build a… don’t…
59 00:06:08.450 ⇒ 00:06:18.929 Uttam Kumaran: the webinar is but a name for, like, a really, like, kind of, like, horribly misused medium. Like, you guys are basically able to run a global event
60 00:06:19.120 ⇒ 00:06:34.930 Uttam Kumaran: from your laptop. And so, treat it kind of, like, with the same importance and fun that we would do in an in-person event. Like, we’re gonna do this happy hour here in Austin tomorrow, and, like, I’m gonna go shake everybody’s hand, have a great time, buy people’s… like, it’s gonna be great, and so…
61 00:06:35.200 ⇒ 00:06:46.560 Uttam Kumaran: I think, like, don’t let the… don’t let, like, the name of, like, webinar set the stage for, like, what this can be. So, like, some of the comments I said is, like, talk about, like, I think…
62 00:06:46.660 ⇒ 00:06:51.020 Uttam Kumaran: Luke, like, you joined Brainforge really recently, and you went from, like.
63 00:06:51.400 ⇒ 00:07:08.409 Uttam Kumaran: you know, things like ChatGPT to, like, where you are now, you’re literally building apps in Cursor, like, talk about that. Talk about how, like, you… you also, you maybe, like, didn’t believe that you would ever get here, like, even, like, 6 weeks ago, and you were, you know, you were intimidated. Talk about those types of things, because that’s what’s, like.
64 00:07:08.730 ⇒ 00:07:15.119 Uttam Kumaran: that’s what’s, relatable, you know? Versus, like, every webinar right now is, like.
65 00:07:15.360 ⇒ 00:07:18.930 Uttam Kumaran: Either complete jargon or complete technical.
66 00:07:19.050 ⇒ 00:07:22.119 Uttam Kumaran: So we were somewhere in the middle, where, like, more practical.
67 00:07:22.230 ⇒ 00:07:32.129 Uttam Kumaran: Right? Like, it’s a small crowd, so that’s kind of, like, what I… a lot of my comments are, which is, like, talk about where you were before and now, like, Pranav, you can do the same.
68 00:07:32.440 ⇒ 00:07:47.160 Uttam Kumaran: highlight, like, literally the things that two months ago you were doing manually, and now you use AI for, and then also talk about, like, how that’s impacted you. Like, you can literally be like, hey, I’m, like, a one-man go-to-market team. I need to hit, like, a quarter million in, like.
69 00:07:47.230 ⇒ 00:07:56.179 Uttam Kumaran: monthly bookings, or whatever, and, like, I’m, like, I don’t, like, I have to, like, that’s just, like, what our goal is, and I haven’t been given any more budget, and so that’s what I’m dealing with, like…
70 00:07:56.690 ⇒ 00:08:02.470 Uttam Kumaran: just make it really frank, because there’s… what do we have to hide? Like, I think it’s a really interesting story, you know? So…
71 00:08:02.660 ⇒ 00:08:06.040 Uttam Kumaran: Otherwise, my comments were just, like, any other
72 00:08:06.040 ⇒ 00:08:10.619 Uttam Kumaran: ways of explaining some of the things that you had, like, I describe us as… I describe it as, like.
73 00:08:10.890 ⇒ 00:08:13.449 Uttam Kumaran: Plumbing in the wall, you can talk about, like.
74 00:08:13.950 ⇒ 00:08:22.120 Uttam Kumaran: when you automate, like, what does that free you up to do? Well, it’s like, yo, you can spend time with existing clients, your team, and, like, ideally spend more time with future clients, right?
75 00:08:22.900 ⇒ 00:08:38.189 Uttam Kumaran: And the discussion piece is where I think you guys, like, it’s, like, sort of, that’s where, like, I think it’s, like, make or break. If you guys can really get people engaged, let people air out their things, don’t let one person, like, steal the spotlight, right? Like, see if you can use…
76 00:08:38.309 ⇒ 00:08:43.239 Uttam Kumaran: someone’s question to, like, be like, oh, that’s great, like, I wonder how X person, you think about this, too.
77 00:08:43.890 ⇒ 00:08:49.230 Uttam Kumaran: Right, so treat it like you would if you just had these, like, 10 people in front of you, right? Like…
78 00:08:49.540 ⇒ 00:08:51.079 Uttam Kumaran: I think that’s…
79 00:08:51.320 ⇒ 00:09:01.529 Uttam Kumaran: that’s, like, gonna be the key here, but I really think you just gotta just, like, have fun and actually just, like, really lower the stakes and build, like, a relationship with everybody, because
80 00:09:01.920 ⇒ 00:09:07.659 Uttam Kumaran: Nobody’s gonna learn, like… it takes time to learn all this stuff, so you… so, like, people today are…
81 00:09:07.930 ⇒ 00:09:25.680 Uttam Kumaran: during that, they want to find out that we are trusted, they want to be like, wow, I know if I keep coming to these, or if I even just, like, talk to these folks, I’m going to learn something, because they speak my language, or they’re where I am now, you know, they kind of hit that on the head. Nobody’s going to learn a lot of stuff in 45 minutes, right? So…
82 00:09:26.160 ⇒ 00:09:34.750 Uttam Kumaran: What we want to show is that, like, hey, we’re gonna do more of these, we want to build trust, and if there’s any way we can be helpful, happy to talk.
83 00:09:35.500 ⇒ 00:09:36.130 Uttam Kumaran: That’s it.
84 00:09:36.130 ⇒ 00:09:36.800 Luke Scorziell: Hmm.
85 00:09:36.800 ⇒ 00:09:37.460 Uttam Kumaran: You know?
86 00:09:38.050 ⇒ 00:09:42.909 Uttam Kumaran: to leave it at that, you don’t have to be like, oh, if you want to hire us, call me, otherwise don’t call me, like…
87 00:09:43.010 ⇒ 00:09:48.809 Uttam Kumaran: I… I literally, when I talk to people these days, I say, okay, tell me how I can be helpful.
88 00:09:48.960 ⇒ 00:09:53.700 Uttam Kumaran: I just leave it at that, because you don’t know whether they want to refer you to someone else, they just want to call you because
89 00:09:53.870 ⇒ 00:09:57.289 Uttam Kumaran: They think that maybe you can help them, and that turns into a deal, like…
90 00:09:57.420 ⇒ 00:10:02.170 Uttam Kumaran: So, lead with being helpful until you are like me and run out of… basically run out of time.
91 00:10:02.560 ⇒ 00:10:08.100 Uttam Kumaran: then you can start being like, okay, I’m gonna only… if this is gonna make money, I’ll spend time on it.
92 00:10:08.260 ⇒ 00:10:15.500 Uttam Kumaran: But these are all these people, like, some people I’ve messaged some people, some… there should be more people attending, like, from all different walks, all different roles.
93 00:10:15.610 ⇒ 00:10:17.570 Uttam Kumaran: So, I think it’ll be awesome.
94 00:10:19.590 ⇒ 00:10:22.739 Luke Scorziell: Yeah, I think,
95 00:10:34.790 ⇒ 00:10:38.609 Luke Scorziell: Do you think weekly is, like, too often, or should we do monthly for…
96 00:10:39.290 ⇒ 00:10:43.679 Uttam Kumaran: Oh, I think, or every other week? I think you should do, like, a series.
97 00:10:44.210 ⇒ 00:10:46.789 Uttam Kumaran: And then have a break. Like, do a season.
98 00:10:47.040 ⇒ 00:10:51.610 Uttam Kumaran: Is what I was thinking, where you could be like, I’m gonna do a C- we’re gonna do this just on, like.
99 00:10:52.210 ⇒ 00:11:01.109 Uttam Kumaran: I mean, I think you’re… this… this… this one is kind of a little bit focused on, like, marketing or brand, but, like, maybe we’re doing, like, a 5-episode weekly.
100 00:11:01.490 ⇒ 00:11:06.060 Uttam Kumaran: thing, and then we’re gonna take, like, two weeks, because then that way you get 2 weeks to sort of look at how it went.
101 00:11:06.260 ⇒ 00:11:07.580 Uttam Kumaran: readjust.
102 00:11:07.890 ⇒ 00:11:09.729 Uttam Kumaran: And then come back even harder.
103 00:11:10.560 ⇒ 00:11:13.929 Uttam Kumaran: And so, maybe it’d be like you were running, like, a three-part thing.
104 00:11:14.220 ⇒ 00:11:18.659 Uttam Kumaran: If anyone, like, we… this is our first pilot episode, but if people are interested, then, like.
105 00:11:18.830 ⇒ 00:11:23.500 Uttam Kumaran: we’re gonna… we are slated to run these for the next 3 weeks, like, you can suss it out.
106 00:11:23.880 ⇒ 00:11:29.019 Uttam Kumaran: But I think running, like, at least 3 or 4, and then pausing for a week or two to, like.
107 00:11:29.580 ⇒ 00:11:31.060 Uttam Kumaran: Be like, okay.
108 00:11:31.180 ⇒ 00:11:36.240 Uttam Kumaran: We need to, like, use a different software, we need to come pre-prepared with questions, we need to get it right, like…
109 00:11:36.930 ⇒ 00:11:42.950 Uttam Kumaran: Again, it’s only… we’re a small team, so, like, it’s gonna be hard if you’re, like, preparing for the next one and, like, realigning.
110 00:11:43.510 ⇒ 00:11:49.229 Uttam Kumaran: And you don’t wanna… I don’t want to sign up for 52 weeks of this right now, until you figure out that it’s actually really worth it, you know?
111 00:11:50.260 ⇒ 00:11:50.880 Luke Scorziell: Yeah.
112 00:11:50.960 ⇒ 00:11:51.890 Uttam Kumaran: Yeah.
113 00:11:54.750 ⇒ 00:11:56.059 Uttam Kumaran: What’s the giveaway?
114 00:11:57.180 ⇒ 00:12:00.890 Luke Scorziell: I think, bye.
115 00:12:01.050 ⇒ 00:12:04.389 Luke Scorziell: Kind of, so I’ll show you guys, too.
116 00:12:05.070 ⇒ 00:12:10.830 Luke Scorziell: I… I was just telling Tom about this on the phone. Kind of built, like, to…
117 00:12:12.180 ⇒ 00:12:14.170 Luke Scorziell: Oh gosh, where did it go?
118 00:12:15.160 ⇒ 00:12:23.770 Luke Scorziell: Oh, oh, it’s right here. It’s funny, added a link. So, one of the people… I mean, maybe…
119 00:12:23.910 ⇒ 00:12:26.819 Luke Scorziell: I don’t know if you want to spend time on this or knock it off, but,
120 00:12:27.730 ⇒ 00:12:40.460 Luke Scorziell: I’m, like, one of them asked, basically, for, like, a Slack bot, or she didn’t say a Slack bot, but she wants something that, like, automatically posts, like, the latest Google, like, kind of what’s happening on Google every day.
121 00:12:40.590 ⇒ 00:12:43.520 Luke Scorziell: And so I, I just had…
122 00:12:44.190 ⇒ 00:12:51.060 Luke Scorziell: had Cursor put this together, so I’m kind of thinking, like, I might make it a little bit more interactive, so that she could actually, like.
123 00:12:51.660 ⇒ 00:12:54.950 Luke Scorziell: Maybe, like, add her own channel, and just,
124 00:12:55.240 ⇒ 00:13:01.759 Luke Scorziell: just kind of something where, like, yesterday with David and Goliath, I sent them this demo, and it basically…
125 00:13:02.020 ⇒ 00:13:09.339 Luke Scorziell: I designed it to be like, hey, go share this with people, and then it will literally teach them in advance how to use the system that we could build for you.
126 00:13:09.440 ⇒ 00:13:12.560 Luke Scorziell: And I… I think that’s, like, kind of interesting, so…
127 00:13:13.050 ⇒ 00:13:17.129 Luke Scorziell: Yeah, so I think this will be maybe part of the giveaway, is like, hey.
128 00:13:17.340 ⇒ 00:13:20.439 Luke Scorziell: You know, for the couple people that put in questions, we actually…
129 00:13:20.900 ⇒ 00:13:25.389 Luke Scorziell: created a, I mean, we can change it too, but then… then someone else asked…
130 00:13:25.720 ⇒ 00:13:27.780 Luke Scorziell: About, like, how can I streamline ops?
131 00:13:28.070 ⇒ 00:13:34.180 Luke Scorziell: And… I haven’t worked on this one too much, but it’s like… you know…
132 00:13:34.540 ⇒ 00:13:37.140 Luke Scorziell: Your calls get recorded in here.
133 00:13:37.360 ⇒ 00:13:39.130 Luke Scorziell: that analyzes them.
134 00:13:39.290 ⇒ 00:13:42.189 Luke Scorziell: So you can ask it, like, an analyzed question.
135 00:13:42.490 ⇒ 00:13:53.160 Luke Scorziell: Kind of tells you, and then site, like, and then you can create, like, a strategy. So, this was, like, for the PR agency that… a couple PR people in here, in the call, but…
136 00:13:53.380 ⇒ 00:13:56.890 Luke Scorziell: So I’m kind of thinking, like, we could just be like, hey, you know.
137 00:13:57.090 ⇒ 00:14:03.130 Luke Scorziell: What are their names again? Trying to clean up my…
138 00:14:05.720 ⇒ 00:14:20.940 Luke Scorziell: Yeah, hey, Alice and Kayla, like, we actually, like, were able to design you something, based on what you asked for. If… if you, like, whoever comes next week, like, you know, we’ll… we’ll randomly decide to create one for you.
139 00:14:21.190 ⇒ 00:14:26.440 Luke Scorziell: And then… then, so then they’re getting, like, a personalized, lead magnet.
140 00:14:27.040 ⇒ 00:14:33.320 Luke Scorziell: So, that… I don’t know, I don’t know if you guys have thoughts, opinions on that, but,
141 00:14:44.070 ⇒ 00:14:45.050 Luke Scorziell: Any thoughts?
142 00:14:49.210 ⇒ 00:14:50.360 Uttam Kumaran: You heard my thoughts.
143 00:14:53.490 ⇒ 00:14:54.170 Luke Scorziell: I need to…
144 00:14:56.000 ⇒ 00:14:57.639 Pranav Narahari: Yeah, I’m thinking,
145 00:15:01.630 ⇒ 00:15:09.490 Pranav Narahari: I mean, I like the idea. It’s… it’s… I kind of… I mean, actually, I really like it because…
146 00:15:11.160 ⇒ 00:15:19.340 Pranav Narahari: it’s kind of just gonna engage them more, you know, like the prize. So, by creating that for them, it’s just like… I think it just…
147 00:15:20.270 ⇒ 00:15:22.099 Pranav Narahari: I’m wondering if, like…
148 00:15:22.880 ⇒ 00:15:28.199 Pranav Narahari: It’s not even necessarily, like, a prize, we can just do it for, like, whoever engages with us, you know?
149 00:15:29.560 ⇒ 00:15:31.299 Luke Scorziell: Like, everyone gets their own…
150 00:15:32.490 ⇒ 00:15:36.180 Pranav Narahari: Yeah, we don’t need to just do it for everybody, we’ll kind of be able to just, like.
151 00:15:36.680 ⇒ 00:15:42.339 Pranav Narahari: See, like, who’s actually, like, engaging with us in the meeting, and then…
152 00:15:43.240 ⇒ 00:15:49.830 Pranav Narahari: Based on that, we give them something to play with to, like, engage them further, so they don’t just, like, forget about us after the meeting.
153 00:15:50.230 ⇒ 00:15:53.739 Pranav Narahari: Does that make sense?
154 00:15:55.910 ⇒ 00:15:56.540 Luke Scorziell: Yeah.
155 00:15:56.850 ⇒ 00:15:59.369 Luke Scorziell: I mean, I guess my hope with this is, like.
156 00:15:59.720 ⇒ 00:16:04.640 Luke Scorziell: This is, like, a common issue, especially this, like, mon… it sounds like the monitoring,
157 00:16:05.390 ⇒ 00:16:12.340 Luke Scorziell: Social monitoring is… so she said we do daily monitoring for our client, searching a list of related terms on Google manually.
158 00:16:13.020 ⇒ 00:16:20.469 Luke Scorziell: Terms related to our product category, risk, etc. So they’d like to streamline this. So this seems like an issue.
159 00:16:21.290 ⇒ 00:16:27.489 Luke Scorziell: that more than one person is struggling with in this agency. So for me, I’m like, if we could kind of build something that…
160 00:16:27.740 ⇒ 00:16:31.569 Luke Scorziell: just shows them that it’s a possibility to, like, do this in Slack, and then…
161 00:16:31.730 ⇒ 00:16:35.519 Luke Scorziell: I think we just put on, like… I could just make a Brainforge,
162 00:16:37.110 ⇒ 00:16:39.249 Luke Scorziell: Kind of thing in here that just says, like.
163 00:16:39.890 ⇒ 00:16:46.760 Luke Scorziell: Oh, oh, it says powered by Brainforge. But maybe I’ll make, like, a, hey, do you want this for your agency?
164 00:16:47.140 ⇒ 00:16:52.409 Luke Scorziell: get in contact with Brainforge or something, you know, I don’t know. So, I like that idea.
165 00:16:54.390 ⇒ 00:16:55.540 Pranav Narahari: Cool, yeah.
166 00:16:55.540 ⇒ 00:16:58.529 Luke Scorziell: Any other? Ryan, Hannah, you guys have?
167 00:17:00.940 ⇒ 00:17:02.770 Luke Scorziell: Questions, concerns?
168 00:17:05.960 ⇒ 00:17:10.089 Hannah Wang: About the… Giveaway, specifically?
169 00:17:10.250 ⇒ 00:17:12.299 Luke Scorziell: Yeah, I agree.
170 00:17:12.500 ⇒ 00:17:15.599 Hannah Wang: I have no thoughts. I have no thoughts in my brain.
171 00:17:17.300 ⇒ 00:17:18.190 Luke Scorziell: No worries.
172 00:17:18.190 ⇒ 00:17:20.759 Hannah Wang: I like the giveaway, though. It’s good.
173 00:17:24.310 ⇒ 00:17:33.110 Pranav Narahari: One worry that I have with, like, 10 people showing up is, like, are we going to be able to, like, engage with all those people, like…
174 00:17:33.710 ⇒ 00:17:34.330 Pranav Narahari: Or maybe I.
175 00:17:34.330 ⇒ 00:17:34.970 Luke Scorziell: I’m just…
176 00:17:35.840 ⇒ 00:17:40.089 Pranav Narahari: Maybe there’s just gonna be, like, half the people that just don’t say anything, they’re just flies on the wall, maybe they…
177 00:17:40.500 ⇒ 00:17:46.609 Pranav Narahari: you know, just stop in for, like, 20 minutes or whatever. Yeah, I’m trying to…
178 00:17:46.610 ⇒ 00:17:47.350 Luke Scorziell: Yeah.
179 00:17:47.540 ⇒ 00:17:50.680 Pranav Narahari: We get a lot of engagement. How do we just…
180 00:17:51.750 ⇒ 00:17:55.710 Pranav Narahari: Like, I don’t want one person to just, like, steal the show and we talk to them for, like, 20 minutes, you know?
181 00:17:56.450 ⇒ 00:18:08.729 Luke Scorziell: Yeah, yeah. So I feel pretty comfortable. I guess I know, Hannah, we kind of… you said you could moderate, too. I’m, like, very comfortable, calling on people and making it, like, making sure that everyone’s…
182 00:18:08.970 ⇒ 00:18:11.750 Luke Scorziell: Speaking, so I think it would be interesting to.
183 00:18:11.750 ⇒ 00:18:16.800 Uttam Kumaran: Yeah, I think, Luke, you’re right. Like, that’s your superpower, you should just… you should take that on.
184 00:18:17.230 ⇒ 00:18:17.900 Luke Scorziell: That sort of…
185 00:18:18.250 ⇒ 00:18:19.250 Uttam Kumaran: facilitate.
186 00:18:19.590 ⇒ 00:18:24.640 Luke Scorziell: I think we could totally do, like, intros, and then, like, one word.
187 00:18:24.920 ⇒ 00:18:30.339 Luke Scorziell: We’re like… Like, name… And what do you do?
188 00:18:30.490 ⇒ 00:18:32.559 Luke Scorziell: And then, like, if you were an AI…
189 00:18:33.520 ⇒ 00:18:36.679 Luke Scorziell: Or, if you don’t even know that, if… if you were…
190 00:18:37.780 ⇒ 00:18:40.149 Luke Scorziell: an AI, who would you be?
191 00:18:40.600 ⇒ 00:18:44.090 Uttam Kumaran: Who would you be? What do you want to say? Like, what are the answers?
192 00:18:44.490 ⇒ 00:18:47.350 Luke Scorziell: I thought… I don’t know, like, would you be, like.
193 00:18:47.350 ⇒ 00:18:47.810 Uttam Kumaran: Claude.
194 00:18:48.230 ⇒ 00:18:49.200 Luke Scorziell: Chat.
195 00:18:49.810 ⇒ 00:18:50.539 Uttam Kumaran: Good to see you.
196 00:18:50.540 ⇒ 00:18:51.730 Luke Scorziell: Okay, fine.
197 00:18:51.730 ⇒ 00:18:52.670 Uttam Kumaran: That’s funny.
198 00:18:52.670 ⇒ 00:18:56.230 Luke Scorziell: One might say, like, Claude Bot, or whatever it’s called.
199 00:18:57.090 ⇒ 00:18:57.770 Uttam Kumaran: Yeah.
200 00:18:57.770 ⇒ 00:19:05.969 Luke Scorziell: It would change the name. But, you know, and then we can kind of, like, gauge, you know, maybe I’m, like, Cursor, I don’t know, for tops Cursor. So,
201 00:19:06.190 ⇒ 00:19:10.400 Luke Scorziell: Yeah, and then if they’re like, I don’t know, then obviously no, like, they’re not that familiar with AI.
202 00:19:11.200 ⇒ 00:19:16.810 Luke Scorziell: And then, this is, like… we could just say one word. So then it’s kind of a…
203 00:19:18.810 ⇒ 00:19:20.800 Luke Scorziell: Like, it forces them all to engage?
204 00:19:21.110 ⇒ 00:19:24.520 Luke Scorziell: And then break the ice of, like, just…
205 00:19:24.970 ⇒ 00:19:30.379 Luke Scorziell: Yeah, turning on their microphone and, and camera, hopefully.
206 00:19:30.500 ⇒ 00:19:36.759 Luke Scorziell: Yeah. And then… yeah, a lot of this stuff’s just from Cursor, I don’t… I feel like it’s a little bit, like…
207 00:19:37.210 ⇒ 00:19:39.640 Luke Scorziell: It’s just overly formal.
208 00:19:41.510 ⇒ 00:19:52.839 Luke Scorziell: Yeah, and then I guess, like, kind of just, like, a brief presentation, or not even a really presentation, but just, like, hey, you know, what we’re noticing, as we work with agencies and a lot of our other clients, just that
209 00:19:53.100 ⇒ 00:20:05.400 Luke Scorziell: AI really works when you stop adding new tools and start building into the tools that you’re already using, and so that’s kind of what we want to talk about today, and, you know, then it can be very natural, like.
210 00:20:05.510 ⇒ 00:20:12.090 Luke Scorziell: I mean, I don’t know if I should bring up, hey, Jeff, I know we talked to you, and, like, this is a problem that you’re, dealing with in front of the group, but…
211 00:20:12.300 ⇒ 00:20:20.199 Luke Scorziell: That can be something, and I know, like, Jake Meaney is also working with, like, a sales team that is pretty old school.
212 00:20:20.310 ⇒ 00:20:28.159 Luke Scorziell: And, like, they’re not really using AI, and so… I think this is, like, a pretty common pain point that probably a lot of them don’t know that they have.
213 00:20:28.760 ⇒ 00:20:33.220 Luke Scorziell: and so, yeah, I think, like, I can kind of go through, like.
214 00:20:33.700 ⇒ 00:20:45.640 Luke Scorziell: Couple principles, I could put… I could have Cursor, I guess, make me a slide, probably, but, like, yeah, put… so we want to build your data, or build your… build your AI where the team is already working, so it’s not, like, building a whole entire other tool.
215 00:20:45.880 ⇒ 00:20:52.199 Luke Scorziell: I’m gonna edit this, because this stuff throws me off when there’s, like, whole sentences. I don’t do well with reading.
216 00:20:52.470 ⇒ 00:20:53.650 Luke Scorziell: Line for line.
217 00:20:56.150 ⇒ 00:21:05.239 Luke Scorziell: Yeah, let me know what’s helpful for you guys, too, because it’s… this is, like… like, whenever Cursor generates just massive amounts of text, I, like, ignore half of it.
218 00:21:06.110 ⇒ 00:21:06.700 Pranav Narahari: Fantastic.
219 00:21:07.820 ⇒ 00:21:15.499 Luke Scorziell: So, yeah, and then… Kind of all this is pointing towards just, like, consolidate your tools and work
220 00:21:15.650 ⇒ 00:21:17.730 Luke Scorziell: On the stuff that really matters, so…
221 00:21:17.850 ⇒ 00:21:22.129 Luke Scorziell: Yeah. It’s not about, like, automating people’s jobs away as much as it is about,
222 00:21:22.560 ⇒ 00:21:37.179 Luke Scorziell: like, really helping you to focus on the time doing strategic and creative work, and then spending time with clients if you’re a salesperson, spending time on calls, and whatnot. And then, probably rephrase this one a little bit, but I think it’s, like.
223 00:21:37.300 ⇒ 00:21:41.380 Luke Scorziell: You have, like, you have so much more data than you need, or than you know.
224 00:21:41.720 ⇒ 00:21:43.599 Luke Scorziell: It’s probably a better way.
225 00:21:45.160 ⇒ 00:21:46.319 Luke Scorziell: Just, it’s like…
226 00:21:46.650 ⇒ 00:21:53.730 Luke Scorziell: Your call transcripts, like, those that can be turned into data that you can leverage, like, any kind of,
227 00:21:54.190 ⇒ 00:21:59.779 Luke Scorziell: Yeah, anything that you’ve been doing that is, like, in the computer, I guess, is something that you can leverage for data.
228 00:22:00.050 ⇒ 00:22:05.799 Luke Scorziell: And so… Yeah, so that’s kind of what I’m…
229 00:22:06.060 ⇒ 00:22:11.910 Luke Scorziell: Thinking for this, I’ll clean it up and make it a little more… Funny, or fun,
230 00:22:12.260 ⇒ 00:22:19.130 Luke Scorziell: And then we could do, like, some kind of, like, questions… But…
231 00:22:22.230 ⇒ 00:22:30.459 Luke Scorziell: And then… Or I can, like, ask specific questions, too.
232 00:22:33.180 ⇒ 00:22:35.550 Luke Scorziell: And then Pranav, I don’t know if you’d want to, like…
233 00:22:36.690 ⇒ 00:22:43.479 Luke Scorziell: kind of run the demo. I know I built it, so… but you can, like… I can give it to you, and if you want to add some, like.
234 00:22:44.360 ⇒ 00:22:50.819 Luke Scorziell: more functionality, or if you just feel comfortable, like, kind of running it. I tried to build it on Slack, but there were a lot of issues with, like, the…
235 00:22:51.690 ⇒ 00:22:58.030 Luke Scorziell: API keys, and I was getting to the point where I was just copy-pasting code errors from… That one…
236 00:22:58.290 ⇒ 00:23:00.350 Luke Scorziell: one AI to the next.
237 00:23:00.430 ⇒ 00:23:02.169 Pranav Narahari: So… Yeah.
238 00:23:04.430 ⇒ 00:23:06.660 Pranav Narahari: Okay, yeah, I can… I think…
239 00:23:07.220 ⇒ 00:23:14.480 Pranav Narahari: I’ll look at the questions, too. You said, yeah, Alice specifically asked for, like, the Slack bot type thing, right? So.
240 00:23:14.880 ⇒ 00:23:20.100 Luke Scorziell: Or, I mean, she didn’t say a Slack bot, but just kind of in line with, like, we want to build something for you that’s, like.
241 00:23:22.660 ⇒ 00:23:28.759 Luke Scorziell: So this is what she said. She said, we do daily monitoring for a client. So she works at a PR agency, so probably
242 00:23:29.190 ⇒ 00:23:31.970 Luke Scorziell: That’s, like, they work for…
243 00:23:32.080 ⇒ 00:23:35.360 Luke Scorziell: Brainforge, and every day they’re looking at, like.
244 00:23:36.270 ⇒ 00:23:46.550 Luke Scorziell: different AI terms and whatnot, to make… and then they’re looking at, like, probably their founder and other people around them to make sure that, like.
245 00:23:47.350 ⇒ 00:23:54.829 Luke Scorziell: Yeah, nothing crazy is getting… is happening, or no one’s, like, yeah, reporting on them in a wrong, like, wrong way, so… so this would be, like.
246 00:23:55.660 ⇒ 00:24:05.490 Luke Scorziell: Yeah, just keeping… keeping themselves apprised every morning of, like, what’s happening, so that if something is happening, then they can immediately act on it. And so, the pain point that we’d be solving here is, like.
247 00:24:05.650 ⇒ 00:24:14.539 Luke Scorziell: If it takes me, like, an hour to go through every client’s Google.
248 00:24:14.890 ⇒ 00:24:17.900 Luke Scorziell: Like, what’s happening on Google around them?
249 00:24:19.060 ⇒ 00:24:20.600 Luke Scorziell: And something happened.
250 00:24:20.710 ⇒ 00:24:28.799 Luke Scorziell: an hour ago, and then… but they’re my third client, and so now, 3 hours later, I’m not getting to that term until…
251 00:24:28.950 ⇒ 00:24:30.530 Luke Scorziell: like…
252 00:24:31.150 ⇒ 00:24:38.910 Luke Scorziell: you know, 3 hours later, and then the Twitter world has already had, like, a chance to comment on it, so I’m, like, whatever, like.
253 00:24:39.400 ⇒ 00:24:53.319 Luke Scorziell: I guess we’re a little bit out of that at this point, but… but it’s, like, they can do this faster, and it’s less risk for their clients, and, like, a bigger value proposition to be able to say, like, hey, we have a… we have a custom in-house solution that
254 00:24:55.350 ⇒ 00:24:56.630 Luke Scorziell: monitors…
255 00:24:57.500 ⇒ 00:24:58.900 Uttam Kumaran: Keywords, like…
256 00:24:58.930 ⇒ 00:25:01.220 Luke Scorziell: At the top of every hour, to make sure that…
257 00:25:01.680 ⇒ 00:25:09.280 Luke Scorziell: no other things are happening. So that’s… yeah, there’s a lot of value, I guess, built into this, actually. So, like, I mean, she’s not very senior at all, so…
258 00:25:09.470 ⇒ 00:25:11.519 Luke Scorziell: We can… we don’t have to, like, fully
259 00:25:12.410 ⇒ 00:25:13.950 Luke Scorziell: are on the call, but…
260 00:25:14.300 ⇒ 00:25:14.650 Uttam Kumaran: You betcha.
261 00:25:14.650 ⇒ 00:25:26.990 Luke Scorziell: Yeah, so, I mean, I can also walk through that, if that’s something… but I guess, like, want to find where you fit really well, too. I guess it could be in, like, the questions and answers, too, as, like, people ask more, like, technical…
262 00:25:27.570 ⇒ 00:25:28.000 Pranav Narahari: Yeah.
263 00:25:28.900 ⇒ 00:25:30.590 Luke Scorziell: Technically geared questions.
264 00:25:33.300 ⇒ 00:25:46.509 Pranav Narahari: Yeah, I’ll… I feel like that’s probably, like, my best fit. I could drive the demo, too. Is this the… is this Slack, thing, like, the thing that you just showed? Is that the one thing that’s gonna be demoed?
265 00:25:46.620 ⇒ 00:25:48.090 Pranav Narahari: Or is there other things?
266 00:25:49.970 ⇒ 00:25:53.279 Luke Scorziell: I think I was… I think it just depends on, like, the time.
267 00:25:53.680 ⇒ 00:25:58.190 Luke Scorziell: I don’t… how… I mean, I can build, another…
268 00:25:59.670 ⇒ 00:26:06.209 Luke Scorziell: I mean, we can show, like, some of the sales pipeline stuff that we have, but I think, like, I’d rather leave time for, like, discussion, too.
269 00:26:06.560 ⇒ 00:26:07.480 Luke Scorziell: 100%.
270 00:26:07.980 ⇒ 00:26:12.920 Luke Scorziell: Because… and kind of, like, ask, like, hey, like, does this…
271 00:26:13.220 ⇒ 00:26:16.200 Luke Scorziell: spark any new ideas for you?
272 00:26:17.420 ⇒ 00:26:21.840 Luke Scorziell: Ew, that is the sprint question, sorry.
273 00:26:24.480 ⇒ 00:26:25.969 Luke Scorziell: What. Would.
274 00:26:34.300 ⇒ 00:26:47.719 Luke Scorziell: I mean, I’ve even thought about, like, this is the type of environment where it’s like, if I’m talking about something, and someone’s like, hey, like, yeah, I’m kind of interested in this, and then, like, we move on to another point or something, and then you spend, like, 5 minutes during the…
275 00:26:47.720 ⇒ 00:26:55.379 Luke Scorziell: I’m like, you’re like, kind of what you did in the call, actually, with David and Goliath, where you’re like, oh, hey, I was actually looking, and actually we can build this. So if someone’s like.
276 00:26:55.570 ⇒ 00:27:11.429 Luke Scorziell: Yeah, I just really want, like, that demo, but, like, I don’t know, would it work in, like, Teams, or, like, for us, what’s most important is we want, like, a daily update of, like, leads, and then you’re like, okay, like, I’ll just spin that up real quick. I mean, no pressure to have to do that or anything, but, like, things like that are, like, I feel like totally within realm, because.
277 00:27:11.660 ⇒ 00:27:12.300 Pranav Narahari: Yeah.
278 00:27:12.300 ⇒ 00:27:13.080 Luke Scorziell: It’s like…
279 00:27:13.630 ⇒ 00:27:18.310 Luke Scorziell: like, again, I kind of want them to walk away from this feeling like, oh, whoa, like, that kind of blew my mind.
280 00:27:18.580 ⇒ 00:27:19.489 Luke Scorziell: What.
281 00:27:19.490 ⇒ 00:27:26.440 Pranav Narahari: Yeah, that made me think of the message, I think, Uttam, like, that you sent, like… Basically, it was…
282 00:27:27.180 ⇒ 00:27:32.069 Pranav Narahari: I didn’t fully look into it yet, but it was, like, it was, like, creating diagrams.
283 00:27:32.400 ⇒ 00:27:33.340 Uttam Kumaran: Yes.
284 00:27:33.550 ⇒ 00:27:34.820 Pranav Narahari: Yeah, so…
285 00:27:34.820 ⇒ 00:27:36.380 Uttam Kumaran: Yeah, we’re using it now.
286 00:27:37.140 ⇒ 00:27:44.170 Pranav Narahari: Yeah, basically, like, something like that, Luke, like, they give us some insight, like, I go silent for a little bit, and then just…
287 00:27:44.400 ⇒ 00:27:48.510 Pranav Narahari: Create, like, an architecture for, like, what that would look like on… Using, like, the.
288 00:27:48.510 ⇒ 00:27:50.849 Uttam Kumaran: I could just send it in the Zoom chat or something, too.
289 00:27:53.150 ⇒ 00:27:53.690 Pranav Narahari: Yeah.
290 00:27:53.690 ⇒ 00:27:59.510 Luke Scorziell: Yeah, I mean, this is, like, the new kind of writing, if that makes sense, so it’s like…
291 00:27:59.770 ⇒ 00:28:01.080 Luke Scorziell: Yeah. Okay.
292 00:28:01.080 ⇒ 00:28:01.710 Uttam Kumaran: It’s notable.
293 00:28:02.120 ⇒ 00:28:09.660 Luke Scorziell: I mean, we could ask some, like, targeted questions to try to, like, get that going, or, like, give you a bit of a head start also. I don’t know.
294 00:28:10.630 ⇒ 00:28:11.050 Pranav Narahari: Yeah.
295 00:28:11.710 ⇒ 00:28:21.110 Luke Scorziell: like, Leah, this dude is super, he’s, like, a marketing genius, as far as, like, paid ads and digital.
296 00:28:21.800 ⇒ 00:28:28.730 Luke Scorziell: Like, yeah, just, gee, he’s insane. And so…
297 00:28:29.680 ⇒ 00:28:34.670 Luke Scorziell: Yeah, so he’s done, like, a ton of ad spend, stuff like that, so I don’t know what kind of tasks
298 00:28:35.350 ⇒ 00:28:40.029 Luke Scorziell: Like, he would also be a really interesting person, actually, to build out, like, a custom demo for, so, like…
299 00:28:40.460 ⇒ 00:28:42.800 Luke Scorziell: Well, what would it look like to, like, have…
300 00:28:43.240 ⇒ 00:28:49.110 Luke Scorziell: Actually, reporting would be really good for him, if we wanted to build out, like, a reporting
301 00:28:50.450 ⇒ 00:28:52.329 Luke Scorziell: Yeah, I mean, so, like, this…
302 00:28:52.510 ⇒ 00:28:55.000 Luke Scorziell: Dude, I freaking kind of love AI.
303 00:28:56.130 ⇒ 00:28:58.309 Luke Scorziell: Not in, like, a romantic way, but…
304 00:29:04.160 ⇒ 00:29:12.590 Luke Scorziell: Okay, so… Yeah, I mean, I could just try to… And I’m, like…
305 00:29:13.570 ⇒ 00:29:19.169 Pranav Narahari: I mean, in that case, it might make sense for you to then just drive, like, the demo, and then when they
306 00:29:19.700 ⇒ 00:29:23.009 Pranav Narahari: questions, like, I can then be, like.
307 00:29:23.730 ⇒ 00:29:42.259 Pranav Narahari: like, basically what we just did with, David and Goliath, like, just looking up things just to, like, so I can give them, like, not like, oh, I… we could do that, I just don’t know exactly how, but I know we can, I just don’t know the tools exactly. Instead of that, like, me just taking the 5 minutes of research to look into, like, okay.
308 00:29:42.380 ⇒ 00:29:47.070 Pranav Narahari: what would we use exactly? Just so I can sound more confident in my answer.
309 00:29:47.190 ⇒ 00:29:48.090 Pranav Narahari: like…
310 00:29:48.450 ⇒ 00:29:49.390 Luke Scorziell: Yeah.
311 00:29:50.930 ⇒ 00:29:51.550 Pranav Narahari: Yeah.
312 00:29:52.640 ⇒ 00:29:53.119 Pranav Narahari: And then I.
313 00:29:53.120 ⇒ 00:29:59.279 Luke Scorziell: I think it would be great, because, again, I think probably this is… this is… yeah, go for it. Sorry, I think my internet’s lagging.
314 00:29:59.880 ⇒ 00:30:11.599 Pranav Narahari: Oh, no, you’re good. I was just gonna say, like, yeah, in that question and answers part, I’ll probably be talking a lot. That’s kind of, like, how the sales calls go too, right? Like, the question and answer part, that’s where I start, like, talking a lot more.
315 00:30:11.970 ⇒ 00:30:12.620 Luke Scorziell: Yeah.
316 00:30:12.770 ⇒ 00:30:16.429 Pranav Narahari: that’s probably where I would give the most value in…
317 00:30:16.750 ⇒ 00:30:20.780 Pranav Narahari: In the office hours. But I think you would probably do the best job with the demo.
318 00:30:24.240 ⇒ 00:30:25.190 Pranav Narahari: Yeah.
319 00:30:27.520 ⇒ 00:30:30.420 Luke Scorziell: And then, Hannah, how much do you feel like you want to…
320 00:30:31.040 ⇒ 00:30:33.880 Luke Scorziell: Like, host, or what… what feels…
321 00:30:34.460 ⇒ 00:30:38.720 Luke Scorziell: Good to you. Do you want to host? Do you feel like you’re okay not hosting? Yeah.
322 00:30:38.720 ⇒ 00:30:43.080 Hannah Wang: I… I don’t mind, like, opening…
323 00:30:43.870 ⇒ 00:30:49.779 Hannah Wang: saying opening remarks and then, like, transitioning from thing to thing. I just need, like, a…
324 00:30:50.350 ⇒ 00:30:56.289 Hannah Wang: Signal that we’re ready to… Or if it naturally segues, like, it’s fine too, but
325 00:30:57.700 ⇒ 00:31:05.490 Hannah Wang: I don’t mind being like, okay, now we’re gonna go into kind of, like, a Q&A discussion type format.
326 00:31:07.080 ⇒ 00:31:11.800 Hannah Wang: If someone’s quiet. Yeah, like, I don’t mind facilitating it that way.
327 00:31:12.250 ⇒ 00:31:20.620 Hannah Wang: I also need to be, like… I don’t want to be there and then not talk, but I need to be there, because I need to record the…
328 00:31:21.190 ⇒ 00:31:26.439 Hannah Wang: the thing from, like, an OB… from the OBS software, for higher quality, but…
329 00:31:27.850 ⇒ 00:31:32.230 Hannah Wang: I mean, yeah, I’m fine facilitating. It’s not a problem.
330 00:31:32.720 ⇒ 00:31:33.970 Luke Scorziell: Could be, could do, like…
331 00:31:37.020 ⇒ 00:31:51.339 Hannah Wang: Like, I’ll just intro you guys. I’ll be like, hi, like, we’re Brainforge, like, I’m the creative and lead engagement, creative and engagement lead. This is Luke and Pranav, Luke is the go-to-market lead.
332 00:31:51.730 ⇒ 00:32:01.660 Hannah Wang: Pranav is the AI engineer. I encourage you to all turn on your videos and talk. This is, like, a discussion more than, you know, a presentation, so…
333 00:32:01.870 ⇒ 00:32:05.329 Hannah Wang: Yeah, you know, stuff like that. And then…
334 00:32:06.320 ⇒ 00:32:10.050 Luke Scorziell: Maybe you could, like, lead the… could you lead this part, too? Like…
335 00:32:11.190 ⇒ 00:32:13.750 Hannah Wang: Yeah, sure, I’ll call on people.
336 00:32:13.750 ⇒ 00:32:19.250 Luke Scorziell: you could just be like, hey, Luke and… or maybe you can give, like… I’ll just say, like…
337 00:32:20.700 ⇒ 00:32:21.320 Luke Scorziell: What do you…
338 00:32:21.320 ⇒ 00:32:25.400 Hannah Wang: I also like scripts. I like sentences, so if…
339 00:32:26.560 ⇒ 00:32:35.660 Hannah Wang: you literally, word for word, put what I should say, I thrive with that. I don’t like bullet points, or just topics, so I’m the opposite of you.
340 00:32:39.900 ⇒ 00:32:41.110 Luke Scorziell: Maybe it’s just, like…
341 00:32:45.060 ⇒ 00:32:49.789 Luke Scorziell: Okay, yeah, maybe we can work on that, or maybe if you want to add in…
342 00:32:49.980 ⇒ 00:32:52.450 Luke Scorziell: My brain just, like, doesn’t really compute.
343 00:32:52.450 ⇒ 00:32:55.729 Hannah Wang: Okay, yeah, I mean, I can also… I got it, yeah.
344 00:32:55.730 ⇒ 00:33:03.140 Luke Scorziell: Okay, yeah, I mean, if you just wanted to, like, run through this almost like you were, like, kind of interviewing us, or like, like, hey, Luke, like, you know,
345 00:33:03.790 ⇒ 00:33:14.539 Luke Scorziell: you know, could you tell us a little bit about yourself? Then I can be like, oh, yeah, like, I’m Luke Scorziell, I grew up in, like, Arrowhead, maybe no one’s gonna care, I don’t… but then, like, skill level…
346 00:33:14.820 ⇒ 00:33:24.790 Luke Scorziell: like, kind of tell the little short story for, like, 2 minutes, maybe, like, a minute, and then kind of, like, okay, what about Pranav? Could you talk about,
347 00:33:25.290 ⇒ 00:33:26.600 Luke Scorziell: You, and then…
348 00:33:26.820 ⇒ 00:33:27.720 Hannah Wang: Talk about…
349 00:33:27.850 ⇒ 00:33:29.000 Luke Scorziell: Brainforge.
350 00:33:29.140 ⇒ 00:33:30.770 Luke Scorziell: And then…
351 00:33:31.790 ⇒ 00:33:38.789 Luke Scorziell: like, this can be really brief, and then just, like, a quick, like, I know I want to turn it to you guys. I don’t know, I don’t know, I guess, yeah, I don’t want to spend too much time
352 00:33:39.300 ⇒ 00:33:40.330 Luke Scorziell: on us.
353 00:33:40.450 ⇒ 00:33:45.800 Luke Scorziell: I’d rather spend, like, more time on… the people.
354 00:33:46.710 ⇒ 00:33:49.830 Hannah Wang: Yeah, I… I just feel like people…
355 00:33:51.070 ⇒ 00:33:57.829 Hannah Wang: Like, my main concern is that it’s not gonna stay less than 5 minutes. Like, I feel like it’s gonna be 10 if we’re calling on everyone.
356 00:33:58.230 ⇒ 00:34:00.500 Hannah Wang: Because people like to talk, so…
357 00:34:00.500 ⇒ 00:34:12.840 Luke Scorziell: like… So… we know… R, Arcan… people here, so… Let’s do this quick.
358 00:34:13.260 ⇒ 00:34:18.650 Luke Scorziell: What do we… New describe… My AI service.
359 00:34:19.560 ⇒ 00:34:22.700 Luke Scorziell: Would be… hold on where you were…
360 00:34:26.909 ⇒ 00:34:29.389 Luke Scorziell: So, just kind of giving them, like, the…
361 00:34:33.170 ⇒ 00:34:40.230 Pranav Narahari: I feel like one benefit of this is, like, okay, they’re speaking, but then one con of it is, like.
362 00:34:40.560 ⇒ 00:34:46.979 Pranav Narahari: we have less control over how long they speak for, so, like, we could have them type in the chat, or…
363 00:34:46.989 ⇒ 00:34:47.549 Hannah Wang: Yeah.
364 00:34:47.659 ⇒ 00:34:49.329 Pranav Narahari: That’s what I was thinking.
365 00:34:49.790 ⇒ 00:34:55.050 Pranav Narahari: But then I think that has the drawback of, like, okay, I think…
366 00:34:55.170 ⇒ 00:34:59.350 Pranav Narahari: Their microphones just might stay muted the entire meeting, so…
367 00:34:59.350 ⇒ 00:34:59.780 Hannah Wang: Yeah.
368 00:34:59.780 ⇒ 00:35:00.780 Pranav Narahari: Up to y’all.
369 00:35:03.530 ⇒ 00:35:13.980 Luke Scorziell: Yeah, I just don’t want, like, the, like, the cameras off. They’re probably, like, not even at their computer, and they’re just, like, putting in chat that it’s, like.
370 00:35:19.020 ⇒ 00:35:25.399 Luke Scorziell: I guess we could also have them put it in the chat and then call on people and be like, hey, why’d you say ChatGPT?
371 00:35:26.340 ⇒ 00:35:27.460 Pranav Narahari: Yeah, we could do that.
372 00:35:28.320 ⇒ 00:35:35.679 Hannah Wang: Or I also know you, like, call on people towards the end, so, like, you can just call on the people that don’t speak throughout.
373 00:35:38.210 ⇒ 00:35:44.309 Hannah Wang: the thing. Like, we can do chat intros, and then you can just call on people at the end.
374 00:35:47.130 ⇒ 00:35:48.529 Hannah Wang: Okay, yeah.
375 00:35:50.900 ⇒ 00:35:58.889 Hannah Wang: I just feel like that’d be a lot faster. Not that it’s not important, but I think they came for, like, you know…
376 00:35:59.150 ⇒ 00:36:01.970 Hannah Wang: To learn something and to see a demo, so…
377 00:36:02.570 ⇒ 00:36:03.760 Luke Scorziell: Yeah, okay.
378 00:36:08.050 ⇒ 00:36:12.990 Hannah Wang: And then, like, if… You notice that someone didn’t put
379 00:36:13.490 ⇒ 00:36:16.139 Hannah Wang: their AI in the chat.
380 00:36:16.490 ⇒ 00:36:23.980 Hannah Wang: you can call on them later. So, you can weed out, like, who’s not vocal and who is.
381 00:36:25.100 ⇒ 00:36:28.019 Luke Scorziell: We can play this by ear, too, like, if 5 people show up, then…
382 00:36:28.200 ⇒ 00:36:29.179 Hannah Wang: Yeah. True.
383 00:36:29.180 ⇒ 00:36:30.669 Pranav Narahari: We can just… Yeah.
384 00:36:31.920 ⇒ 00:36:32.900 Luke Scorziell: So…
385 00:36:32.900 ⇒ 00:36:39.999 Pranav Narahari: office hours, like, I was thinking people were gonna jump in and jump out, not necessarily be there from, like, beginning to end, so…
386 00:36:40.840 ⇒ 00:36:41.620 Luke Scorziell: Oh, true.
387 00:36:43.210 ⇒ 00:36:46.099 Luke Scorziell: Yeah, well, we’ll see.
388 00:36:47.620 ⇒ 00:36:49.210 Hannah Wang: And then…
389 00:36:53.040 ⇒ 00:36:59.509 Luke Scorziell: I don’t know, I can make this, like, more concise. I feel like this is just, like, not that important.
390 00:37:00.550 ⇒ 00:37:02.929 Luke Scorziell: I guess it’s important, but,
391 00:37:05.930 ⇒ 00:37:11.100 Luke Scorziell: Yeah, again, I kind of want it to be more of a discussion, like, I want people to be talking and sharing and,
392 00:37:11.210 ⇒ 00:37:16.310 Luke Scorziell: Because then Ryan can also, like, pull transcripts from this and say, like.
393 00:37:16.970 ⇒ 00:37:22.170 Luke Scorziell: And he can do, like, a Robert post, like, we had a… or a post from me, like, we had a discussion
394 00:37:22.410 ⇒ 00:37:23.080 Luke Scorziell: What?
395 00:37:23.740 ⇒ 00:37:29.979 Luke Scorziell: 10 agency people, and… Here are the trends, you know, that kind of stuff.
396 00:37:30.200 ⇒ 00:37:34.150 Luke Scorziell: And then, I mean, we’re literally building products also off of this, so…
397 00:37:34.770 ⇒ 00:37:39.059 Pranav Narahari: Do we need to give a disclaimer about that? Like, hey…
398 00:37:39.300 ⇒ 00:37:43.300 Pranav Narahari: Or is that already in, like, the email that we sent to them about this, like…
399 00:37:43.570 ⇒ 00:37:44.910 Pranav Narahari: Oh, like, we might…
400 00:37:45.290 ⇒ 00:37:52.169 Pranav Narahari: you know, use this as content. I don’t know, I’m just triggered by, like, the Lilo situation.
401 00:37:54.190 ⇒ 00:37:55.140 Luke Scorziell: They’re recording…
402 00:37:55.140 ⇒ 00:38:02.760 Hannah Wang: You can just say, like, we’ll be recording. If you don’t feel comfortable with us repurposing, like, your face, then just let me know.
403 00:38:02.880 ⇒ 00:38:04.009 Hannah Wang: Like, email Hannah.
404 00:38:04.020 ⇒ 00:38:08.850 Pranav Narahari: I was thinking about the face, I was thinking about, like, if they bring up, like, their specific use case.
405 00:38:08.850 ⇒ 00:38:14.550 Hannah Wang: Oh… Yeah. Yeah, face and/or content and/or, like.
406 00:38:14.840 ⇒ 00:38:19.150 Hannah Wang: what you’re struggling with, like, let Hannah know, or something.
407 00:38:19.560 ⇒ 00:38:20.510 Pranav Narahari: Right, right.
408 00:38:22.300 ⇒ 00:38:25.280 Luke Scorziell: I don’t know, you could say, like, if you don’t feel comfortable with this being recorded.
409 00:38:25.920 ⇒ 00:38:27.459 Luke Scorziell: Hop off and I’ll send you this.
410 00:38:29.690 ⇒ 00:38:32.060 Hannah Wang: You should leave. I’m just kidding.
411 00:38:32.480 ⇒ 00:38:38.899 Luke Scorziell: I don’t know how to say that in a nice way, or, like, I wanna be, like, just… or we could just be like, just so you know, we’re gonna be recording, and…
412 00:38:39.070 ⇒ 00:38:42.460 Luke Scorziell: like… We’ll put this on YouTube.
413 00:38:42.760 ⇒ 00:38:48.609 Pranav Narahari: We could… I wonder if we just add it to, like, the invite or something, like, we don’t even need to say it.
414 00:38:49.100 ⇒ 00:38:52.709 Hannah Wang: Yeah, I think next time I’ll do that, I forgot. I forgot to do it.
415 00:38:53.020 ⇒ 00:38:53.660 Pranav Narahari: Hmm.
416 00:38:54.340 ⇒ 00:38:55.980 Hannah Wang: So yeah, moving… moving forward, we’ll.
417 00:38:55.980 ⇒ 00:39:05.120 Luke Scorziell: I mean, most webinars, I feel like it’s pretty normal that it would be recorded, like, I don’t think anyone is probably under the impression that it’s, like… Also, Zoom will tell them, I think, right?
418 00:39:05.720 ⇒ 00:39:08.729 Pranav Narahari: Yeah, it’ll show in the top right, yeah. So, honestly…
419 00:39:08.730 ⇒ 00:39:17.109 Hannah Wang: But then we… we framed this as an office hour, not a webinar, so I… maybe it’s misleading to say office hours.
420 00:39:17.740 ⇒ 00:39:21.619 Hannah Wang: Because office hours just reminds me of, like, undergrad office hours, where you just…
421 00:39:21.900 ⇒ 00:39:25.600 Hannah Wang: Come in and ask questions and leave, not, like, a full-on…
422 00:39:25.930 ⇒ 00:39:31.700 Hannah Wang: webinar type thing. But anyway, like, yeah, moving forward, we can adjust, obviously.
423 00:39:32.090 ⇒ 00:39:35.649 Hannah Wang: But I’ll just give, like, a disclaimer at the beginning.
424 00:39:37.100 ⇒ 00:39:44.210 Luke Scorziell: Yeah, and we can play it by ear, I mean, if people are gonna, like, not share as much because of recording, I’d rather have, like, honest conversations, too.
425 00:39:44.550 ⇒ 00:39:45.999 Pranav Narahari: True. Yeah. Yeah.
426 00:39:46.360 ⇒ 00:39:52.590 Pranav Narahari: And maybe we can actually, yeah, not even say it, and then if we’re trying to use some content, we just pass it by, like…
427 00:39:52.960 ⇒ 00:39:55.469 Pranav Narahari: someone after the fact, because, yeah, I don’t…
428 00:39:55.470 ⇒ 00:39:55.980 Hannah Wang: Oh, yeah.
429 00:39:56.070 ⇒ 00:39:58.179 Pranav Narahari: People might get, like, skittish right off of the…
430 00:39:58.280 ⇒ 00:40:02.980 Pranav Narahari: Right out the gate, about, like, sharing their information by saying that. So…
431 00:40:03.160 ⇒ 00:40:05.160 Pranav Narahari: Yeah, maybe we don’t… Yeah, maybe…
432 00:40:05.540 ⇒ 00:40:08.249 Hannah Wang: Yeah, maybe in a follow-up email, we can…
433 00:40:08.470 ⇒ 00:40:15.640 Hannah Wang: tell them… ask them if they’re okay, but yeah. Okay. I won’t… I won’t bring it up then, because…
434 00:40:16.120 ⇒ 00:40:20.140 Hannah Wang: You’re right, I don’t want to throw them off, or… Well, I just had to.
435 00:40:20.140 ⇒ 00:40:28.730 Luke Scorziell: Like, if they enter the room, and they click, and it says, this call is being recorded, and then they click, yes, I agree to enter, for me, that’s, like.
436 00:40:29.450 ⇒ 00:40:30.070 Pranav Narahari: Fair enough.
437 00:40:30.720 ⇒ 00:40:31.250 Pranav Narahari: Yeah.
438 00:40:31.250 ⇒ 00:40:36.030 Luke Scorziell: Yeah, personally, because I don’t want to, like, I don’t have to go back and ask everyone for permission to…
439 00:40:37.250 ⇒ 00:40:42.629 Pranav Narahari: Yeah, that’s true, actually. Like, with the Lilo situation, I guess it’s a little bit different.
440 00:40:42.630 ⇒ 00:40:46.200 Luke Scorziell: Yeah, I mean, what was the issue with them, like, the posts that we put out?
441 00:40:46.760 ⇒ 00:40:48.970 Pranav Narahari: Yeah, I didn’t even get to see the post, it was already gone by then.
442 00:40:48.970 ⇒ 00:40:50.819 Luke Scorziell: Yeah, or just, like, pissed in general was…
443 00:40:50.820 ⇒ 00:40:52.030 Pranav Narahari: Yeah, yeah.
444 00:40:52.030 ⇒ 00:40:54.880 Luke Scorziell: From, Tom and Robert.
445 00:40:54.990 ⇒ 00:40:55.670 Pranav Narahari: Yeah.
446 00:40:56.120 ⇒ 00:41:03.800 Luke Scorziell: So… And… Yeah, so we’ll play it by ear, but yeah, we should… I don’t know.
447 00:41:03.970 ⇒ 00:41:05.760 Pranav Narahari: We’re probably good to not say that, yeah.
448 00:41:07.200 ⇒ 00:41:08.040 Luke Scorziell: Like, we’re not…
449 00:41:08.040 ⇒ 00:41:10.250 Hannah Wang: So don’t give the disclaimer?
450 00:41:11.850 ⇒ 00:41:16.530 Luke Scorziell: I think if the meeting makes it clear that it’s being recorded… That’s fair.
451 00:41:16.530 ⇒ 00:41:16.850 Hannah Wang: Okay.
452 00:41:17.290 ⇒ 00:41:18.250 Pranav Narahari: I think it does, yeah.
453 00:41:18.250 ⇒ 00:41:18.645 Hannah Wang: Whatever.
454 00:41:24.770 ⇒ 00:41:26.369 Luke Scorziell: Are we just doing a normal Zoom meeting?
455 00:41:26.960 ⇒ 00:41:27.590 Hannah Wang: Yeah.
456 00:41:27.840 ⇒ 00:41:28.440 Luke Scorziell: Okay.
457 00:41:30.690 ⇒ 00:41:35.050 Luke Scorziell: And then… Okay.
458 00:41:36.340 ⇒ 00:41:38.270 Luke Scorziell: This is good that we’re going through this stuff now.
459 00:41:42.210 ⇒ 00:41:49.179 Luke Scorziell: So, this is just kind of meant to be, like, an introduction section on, like, pain points and whatnot. I can workshop it a little bit more.
460 00:41:49.340 ⇒ 00:41:55.770 Luke Scorziell: I don’t really know that I, like, I can give a little bit of a presentation. Maybe it’d be fun to present a little more of, like, what have we done for clients?
461 00:41:58.630 ⇒ 00:41:59.470 Luke Scorziell: Look.
462 00:42:02.170 ⇒ 00:42:02.950 Luke Scorziell: Perfect.
463 00:42:03.070 ⇒ 00:42:05.249 Luke Scorziell: a client that’s not called Lilo.
464 00:42:11.900 ⇒ 00:42:13.040 Luke Scorziell: I’m like,
465 00:42:19.580 ⇒ 00:42:20.620 Luke Scorziell: So…
466 00:42:21.350 ⇒ 00:42:25.890 Luke Scorziell: this, I guess, was what I was interested in, is, like, how much content do you guys feel like
467 00:42:27.370 ⇒ 00:42:30.019 Luke Scorziell: Or do you have any thoughts around, like, what…
468 00:42:30.800 ⇒ 00:42:32.900 Luke Scorziell: Yeah, just what might be the most…
469 00:42:33.860 ⇒ 00:42:38.620 Luke Scorziell: interesting or relevant, sure. I mean, this is fine, you can just kind of talk directly about this, but…
470 00:42:43.840 ⇒ 00:42:47.120 Pranav Narahari: Okay, so for the…
471 00:42:47.470 ⇒ 00:42:53.849 Pranav Narahari: That last point, like, present a little bit more about what we’ve done for clients, like… Should I…
472 00:42:55.080 ⇒ 00:43:02.419 Pranav Narahari: kind of… like, I have a good context on ABC, and I have good context on Lilo,
473 00:43:04.530 ⇒ 00:43:14.770 Pranav Narahari: I guess those are the two main AI apps that are comparable to what we’re doing here, so…
474 00:43:21.090 ⇒ 00:43:21.880 Pranav Narahari: Yeah.
475 00:43:22.350 ⇒ 00:43:22.910 Luke Scorziell: Good.
476 00:43:23.320 ⇒ 00:43:28.670 Luke Scorziell: chat number again. Yeah, I think… I think that’d be great, like, maybe if you could share, like.
477 00:43:29.220 ⇒ 00:43:36.050 Luke Scorziell: Or… Agencies we’ve built, like, or it was, like, brief generation.
478 00:43:37.490 ⇒ 00:43:38.740 Luke Scorziell: Like…
479 00:43:39.580 ⇒ 00:43:53.310 Pranav Narahari: forecasting… models… We can say, like, real-time reporting, yeah.
480 00:43:55.650 ⇒ 00:43:57.490 Luke Scorziell: And then what else were you gonna say?
481 00:43:59.870 ⇒ 00:44:06.010 Pranav Narahari: I want to say this in a probably less technical way, but, like, data warehousing, but maybe that’s fine… people understand what that means.
482 00:44:25.870 ⇒ 00:44:32.239 Pranav Narahari: And then we can say, like, what everyone’s heard about, but maybe doesn’t understand, like, knowledge bases and MCPs.
483 00:44:52.280 ⇒ 00:44:54.190 Luke Scorziell: It’d be helpful for you to, like, talk about.
484 00:44:54.460 ⇒ 00:44:55.180 Luke Scorziell: Look.
485 00:45:02.690 ⇒ 00:45:03.520 Pranav Narahari: Yeah.
486 00:45:04.250 ⇒ 00:45:06.310 Pranav Narahari: Yeah, I think that would actually be…
487 00:45:07.090 ⇒ 00:45:11.159 Pranav Narahari: super useful. Like, having, like, a moment where we’re like, okay.
488 00:45:11.340 ⇒ 00:45:15.599 Pranav Narahari: There’s a lot of just, like, noise about these things.
489 00:45:15.710 ⇒ 00:45:19.529 Pranav Narahari: Let me just do, like, some quick definitions for people, so, like, they understand a little.
490 00:45:19.530 ⇒ 00:45:21.320 Luke Scorziell: Yeah, that’s good, actually.
491 00:45:21.540 ⇒ 00:45:22.300 Pranav Narahari: Right?
492 00:45:22.590 ⇒ 00:45:28.999 Pranav Narahari: because I think, kind of, that’s where we can add value to them, like, with office hours, they probably, like, what is, like, the…
493 00:45:29.110 ⇒ 00:45:31.770 Pranav Narahari: Where can they learn a little bit?
494 00:45:33.530 ⇒ 00:45:35.500 Pranav Narahari: That would be great, and then I can, like…
495 00:45:35.800 ⇒ 00:45:41.910 Pranav Narahari: Talk about them at a tech… like, a very high technical level, but then also talk about, like, okay, what is the business impact?
496 00:45:42.500 ⇒ 00:45:43.940 Pranav Narahari: For these type of things.
497 00:45:47.180 ⇒ 00:45:53.139 Hannah Wang: I think if you could also, have Cursor generate, like, a slide…
498 00:45:53.540 ⇒ 00:45:57.520 Hannah Wang: one slide, that’d be helpful, because I think it’s easy for
499 00:45:57.780 ⇒ 00:46:02.170 Hannah Wang: All this… all these terms to go over their head, so… Just…
500 00:46:02.900 ⇒ 00:46:05.209 Luke Scorziell: Yeah, or I know there’s, like, what’s it called.
501 00:46:06.880 ⇒ 00:46:13.120 Pranav Narahari: We could even make this, like, a point of engagement, too. Like, what is a term that y’all have heard about but want to learn more about?
502 00:46:14.100 ⇒ 00:46:14.750 Luke Scorziell: Yeah.
503 00:46:16.750 ⇒ 00:46:19.600 Pranav Narahari: Like, I think Luke or somebody, like.
504 00:46:19.930 ⇒ 00:46:24.339 Pranav Narahari: did this for one of our, like, All Hands, like, one of these, like, apps where you just, like.
505 00:46:25.420 ⇒ 00:46:35.760 Pranav Narahari: like, we just ask them… we can make it interactive in that way, where they, like… I’d probably actually scratch that, like, the easiest way to do it is probably just, like, let them type it in the chat or say it out loud.
506 00:46:37.480 ⇒ 00:46:43.779 Luke Scorziell: Yeah. Okay, I like this a lot, actually, because it’s kind of helping them build, like, a foundation of what… how to talk about AI.
507 00:46:44.190 ⇒ 00:46:44.820 Pranav Narahari: Yeah.
508 00:46:52.710 ⇒ 00:46:55.959 Pranav Narahari: So should we run it as, like, a question, and they…
509 00:46:56.490 ⇒ 00:46:59.259 Pranav Narahari: Tell it, like, they tell us what are some terms, or…
510 00:46:59.420 ⇒ 00:47:09.219 Luke Scorziell: Maybe, because they might not even know what, like, these are… I mean, I’m just throwing out terms that I’ve heard at Brainforge, but I’ve kind of lost a little bit of touch with, like, what are the words that…
511 00:47:10.330 ⇒ 00:47:13.850 Luke Scorziell: Just everyone, or, like, the average public…
512 00:47:14.110 ⇒ 00:47:15.020 Luke Scorziell: knows very well.
513 00:47:15.790 ⇒ 00:47:16.680 Pranav Narahari: Yeah.
514 00:47:17.800 ⇒ 00:47:20.980 Luke Scorziell: What did Clawdbot turn into? What did they rename that?
515 00:47:22.420 ⇒ 00:47:24.559 Pranav Narahari: They renamed it? Oh, shoot.
516 00:47:25.100 ⇒ 00:47:32.509 Pranav Narahari: It’s spelled differently, it’s not actually, like, affiliated with Anthropic in that way, or it’s spelled differently, at least.
517 00:47:33.030 ⇒ 00:47:33.530 Pranav Narahari: That’s horrible.
518 00:47:33.530 ⇒ 00:47:34.440 Luke Scorziell: Ryan, do you know?
519 00:47:36.370 ⇒ 00:47:39.660 Ryan Brosas: It… Let me check here…
520 00:47:39.990 ⇒ 00:47:40.899 Pranav Narahari: Oh, OpenClaw?
521 00:47:40.900 ⇒ 00:47:42.610 Ryan Brosas: Claw? OpenClaw, yeah.
522 00:47:42.800 ⇒ 00:47:44.000 Pranav Narahari: OpenClaw, yeah.
523 00:47:48.360 ⇒ 00:47:49.770 Luke Scorziell: I could just ask her for, like…
524 00:47:58.070 ⇒ 00:48:01.570 Pranav Narahari: One thing that a lot of people talk about is, like, training a model.
525 00:48:02.020 ⇒ 00:48:03.120 Luke Scorziell: And, like…
526 00:48:03.230 ⇒ 00:48:09.779 Pranav Narahari: Right, they always are saying, like, oh, like, so you guys are gonna train… like, we need someone to train a model for us. I’m like, you probably don’t need…
527 00:48:10.120 ⇒ 00:48:14.380 Pranav Narahari: to train a model. And that’s not our specialty, right? We don’t have…
528 00:48:15.160 ⇒ 00:48:18.979 Pranav Narahari: we haven’t trained models here. But then…
529 00:48:19.140 ⇒ 00:48:22.969 Pranav Narahari: The business impact that they need is probably not something
530 00:48:23.140 ⇒ 00:48:25.010 Pranav Narahari: That requires us to train a model.
531 00:48:29.040 ⇒ 00:48:30.510 Luke Scorziell: Yeah, okay.
532 00:48:30.680 ⇒ 00:48:34.290 Luke Scorziell: I think this is good, like, maybe, like, top
533 00:48:36.760 ⇒ 00:48:37.680 Luke Scorziell: 5.
534 00:48:46.070 ⇒ 00:48:50.649 Pranav Narahari: Okay, sorry, so, like, how do you want to run this? Like, I’ll just talk about 5 of these?
535 00:48:53.000 ⇒ 00:48:56.360 Luke Scorziell: Yeah, I think maybe we could make a slide and just feel like.
536 00:48:58.440 ⇒ 00:49:06.179 Luke Scorziell: Sorry, I’m very P on the Myers-Briggs, which is, like, feel it out in the moment. I don’t know how familiar you guys are with that.
537 00:49:06.180 ⇒ 00:49:07.520 Pranav Narahari: Oh, yeah, yeah.
538 00:49:08.270 ⇒ 00:49:14.459 Hannah Wang: I’m super J, so… But I don’t know where, Pranav, you sit.
539 00:49:15.370 ⇒ 00:49:20.829 Pranav Narahari: I think that’s the one where I’m like, one time I take it, I’m a J, one time I take it, I’m a P.
540 00:49:21.130 ⇒ 00:49:22.819 Hannah Wang: Okay, you’re in the middle, that’s good.
541 00:49:23.040 ⇒ 00:49:25.660 Luke Scorziell: You could balance us out. Yeah, because I’m like…
542 00:49:25.830 ⇒ 00:49:33.220 Luke Scorziell: I’m like, I’ll just have this super janky document in front of me, and… Just use half of it.
543 00:49:33.410 ⇒ 00:49:37.559 Luke Scorziell: during the actual call, is, like, how I would run this, which I know is not helpful for…
544 00:49:38.250 ⇒ 00:49:42.900 Pranav Narahari: Well, to be honest, if you feel comfortable doing that, like, that’s fine.
545 00:49:43.110 ⇒ 00:49:45.459 Pranav Narahari: But this part, I think I’ll be running, right?
546 00:49:45.460 ⇒ 00:49:46.290 Luke Scorziell: Yeah, yeah.
547 00:49:46.290 ⇒ 00:49:47.779 Pranav Narahari: Okay. Yeah.
548 00:49:48.870 ⇒ 00:49:54.099 Luke Scorziell: It’s good to know that you’re very J, Hannah.
549 00:49:55.640 ⇒ 00:49:57.370 Luke Scorziell: My girlfriend is also very J.
550 00:49:57.370 ⇒ 00:49:58.190 Hannah Wang: Oh.
551 00:49:58.190 ⇒ 00:50:00.990 Luke Scorziell: Very common point of frustration.
552 00:50:00.990 ⇒ 00:50:03.440 Hannah Wang: Oh, yeah. Our relationship.
553 00:50:04.120 ⇒ 00:50:09.189 Luke Scorziell: She’s like, I’ll say something, and then I just… I’m like, it’s like an idea, and I don’t really…
554 00:50:09.710 ⇒ 00:50:14.100 Luke Scorziell: I’m not gonna follow through on it, but then she’ll, like, think that I want to turn into a plan.
555 00:50:14.580 ⇒ 00:50:18.860 Hannah Wang: Yeah, mmm… Compromise.
556 00:50:20.720 ⇒ 00:50:27.480 Luke Scorziell: Okay, so… why don’t we turn this part into more of, like, a…
557 00:50:32.950 ⇒ 00:50:37.110 Pranav Narahari: So this framing part, then, sounds like I’d be doing most of it.
558 00:50:37.110 ⇒ 00:50:38.930 Luke Scorziell: Yeah, I think so.
559 00:50:41.630 ⇒ 00:50:42.690 Luke Scorziell: And…
560 00:51:06.780 ⇒ 00:51:16.239 Luke Scorziell: I know, like, Jake works at an AI company that’s doing evals. I don’t… honestly, I don’t even know what an eval is. People say evals all the time, and I’m like, I don’t know what an eval is like…
561 00:51:16.640 ⇒ 00:51:18.040 Luke Scorziell: Measuring the model.
562 00:51:19.140 ⇒ 00:51:23.160 Ryan Brosas: It’s pretty much, the performance of the AI agents.
563 00:51:23.580 ⇒ 00:51:31.439 Ryan Brosas: It’s, it’s kind of, like, relative to the training model, but focusing on the
564 00:51:31.910 ⇒ 00:51:34.719 Ryan Brosas: Behavior of the agent itself.
565 00:51:35.710 ⇒ 00:51:36.920 Luke Scorziell: But what does that mean?
566 00:51:37.220 ⇒ 00:51:44.849 Luke Scorziell: like… It’s like seeing how it responds, like, that some agents respond differently than others.
567 00:51:45.260 ⇒ 00:51:46.080 Ryan Brosas: Yeah.
568 00:51:46.400 ⇒ 00:51:53.590 Ryan Brosas: the performance… But, yeah, it’s kind of, like, relative to the training model.
569 00:51:55.280 ⇒ 00:51:55.900 Luke Scorziell: Right.
570 00:51:56.760 ⇒ 00:52:02.930 Luke Scorziell: So, I mean, in this case, maybe the, like, definitions and stuff should come before.
571 00:52:03.600 ⇒ 00:52:07.940 Luke Scorziell: Or… or maybe actually this, of what we’ve done for clients.
572 00:52:08.670 ⇒ 00:52:10.930 Luke Scorziell: Comes, up here.
573 00:52:14.900 ⇒ 00:52:17.779 Luke Scorziell: And then, I think, Pranav, you could… you could lead this, too.
574 00:52:17.970 ⇒ 00:52:20.170 Luke Scorziell: But, like, this would be, like, the Brainforge.
575 00:52:44.310 ⇒ 00:52:50.199 Pranav Narahari: Do we, also want to have just, like, a section in the beginning where we just, like, tell them what the agenda is?
576 00:52:50.840 ⇒ 00:52:51.870 Luke Scorziell: Yeah, we do.
577 00:52:51.870 ⇒ 00:52:54.969 Pranav Narahari: Maybe, Hannah, like, just a couple sentences.
578 00:52:55.680 ⇒ 00:52:56.780 Pranav Narahari: In the beginning.
579 00:52:57.290 ⇒ 00:52:57.920 Hannah Wang: Yeah.
580 00:53:10.010 ⇒ 00:53:10.950 Pranav Narahari: Demo.
581 00:53:15.890 ⇒ 00:53:16.620 Pranav Narahari: Yeah.
582 00:53:21.420 ⇒ 00:53:22.270 Pranav Narahari: Nice.
583 00:53:29.220 ⇒ 00:53:36.660 Luke Scorziell: I think if we were to get to, like, this point, and it’s, like, we’re, like, 20 minutes in already, and then we think that the AI terms is gonna take, like.
584 00:53:38.090 ⇒ 00:53:41.259 Luke Scorziell: Too much longer, we could kind of cut or shorten this.
585 00:53:42.130 ⇒ 00:53:42.920 Pranav Narahari: Yeah.
586 00:53:42.920 ⇒ 00:53:46.569 Luke Scorziell: You don’t even need to mention it in the agenda.
587 00:53:46.590 ⇒ 00:53:47.790 Pranav Narahari: In that case.
588 00:53:58.000 ⇒ 00:54:01.040 Hannah Wang: They probably won’t even remember what I say, so it’s okay.
589 00:54:01.040 ⇒ 00:54:03.969 Pranav Narahari: That’s true. That’s true, that’s true.
590 00:54:07.580 ⇒ 00:54:08.939 Luke Scorziell: Wait, what did I do?
591 00:54:09.520 ⇒ 00:54:12.040 Pranav Narahari: Oh, I think you moved up the…
592 00:54:12.770 ⇒ 00:54:14.140 Luke Scorziell: Did I move it too far?
593 00:54:14.140 ⇒ 00:54:18.920 Pranav Narahari: Just, yeah, I think that… It pulled up the… what’s it called?
594 00:54:19.030 ⇒ 00:54:21.289 Pranav Narahari: Yeah, the AI fog stuff.
595 00:54:21.540 ⇒ 00:54:25.240 Pranav Narahari: I think you wanted to put it underneath… R.
596 00:54:25.240 ⇒ 00:54:26.229 Luke Scorziell: Oh, where are they?
597 00:54:26.580 ⇒ 00:54:31.929 Pranav Narahari: Yeah, yeah, yeah. Down, down, yeah, and then right above, there’s a lot of noise about a lot of terms.
598 00:54:33.230 ⇒ 00:54:34.000 Luke Scorziell: Oh, yeah.
599 00:54:34.490 ⇒ 00:54:35.460 Luke Scorziell: Thank you.
600 00:54:35.910 ⇒ 00:54:36.530 Pranav Narahari: Yep.
601 00:54:38.720 ⇒ 00:54:40.129 Luke Scorziell: I… I don’t wanna…
602 00:54:44.260 ⇒ 00:54:44.820 Pranav Narahari: Cool.
603 00:54:48.150 ⇒ 00:54:48.950 Luke Scorziell: Okay.
604 00:54:50.280 ⇒ 00:54:51.140 Luke Scorziell: So…
605 00:55:07.240 ⇒ 00:55:11.790 Luke Scorziell: Okay, so… Alright, make a slide…
606 00:55:18.320 ⇒ 00:55:20.199 Luke Scorziell: Yeah, this is coming together.
607 00:55:23.020 ⇒ 00:55:24.699 Pranav Narahari: Cool. Did you want me to make the slide?
608 00:55:25.620 ⇒ 00:55:31.189 Luke Scorziell: If, yeah, if you want to, or, sorry, we should probably give someone clear.
609 00:55:31.620 ⇒ 00:55:37.270 Luke Scorziell: I guess, like, how much of this do we feel like we need slides for? Do we just need that? Like, I feel like I could see a slide for, like.
610 00:55:38.970 ⇒ 00:55:42.750 Pranav Narahari: What I can do is just, like, I’ll create, like, kind of…
611 00:55:43.560 ⇒ 00:55:56.469 Pranav Narahari: just for each of those terms, just quick, like, definitions, and I’m obviously not just gonna read off a slide, right? But if you guys want to end up making a slide, then you can, but I’ll just send those over to you tonight.
612 00:55:56.840 ⇒ 00:56:00.640 Luke Scorziell: Okay, yeah, that sounds good. I mean, I think it’d be interesting to have a slide of, like.
613 00:56:00.890 ⇒ 00:56:08.610 Luke Scorziell: With the agenda… And… maybe just, like… but also, this is, like, not… quite…
614 00:56:08.830 ⇒ 00:56:13.629 Luke Scorziell: capable for us to do tomorrow. So, Hannah, I don’t know if that would be, like, something that you might do with Han.
615 00:56:14.140 ⇒ 00:56:19.410 Luke Scorziell: Yeah, we don’t need to do that.
616 00:56:20.550 ⇒ 00:56:21.170 Luke Scorziell: Okay.
617 00:56:21.750 ⇒ 00:56:22.460 Pranav Narahari: Okay.
618 00:56:22.460 ⇒ 00:56:24.810 Luke Scorziell: In the future, it could be nice to have, like, a…
619 00:56:25.230 ⇒ 00:56:29.920 Luke Scorziell: this on a slide, and then, like, an About Brainforge on a slide.
620 00:56:30.380 ⇒ 00:56:31.740 Hannah Wang: Yeah, okay.
621 00:56:32.330 ⇒ 00:56:34.220 Luke Scorziell: I think if we do have something that we…
622 00:56:34.520 ⇒ 00:56:40.380 Luke Scorziell: could put on a slide, then we could… that would be nice, but, or I could try to send it to Ann.
623 00:56:41.430 ⇒ 00:56:45.110 Luke Scorziell: But if…
624 00:56:50.740 ⇒ 00:56:52.419 Luke Scorziell: Okay, and then…
625 00:57:10.250 ⇒ 00:57:15.829 Luke Scorziell: And, like, I think a good tie into this would be, like, speaking of MCP servers, who even knows… who even knows what that means?
626 00:57:16.240 ⇒ 00:57:17.569 Pranav Narahari: Yeah, perfect.
627 00:57:22.380 ⇒ 00:57:25.959 Luke Scorziell: And I can be like… I don’t know.
628 00:57:28.350 ⇒ 00:57:29.110 Luke Scorziell: I think so.
629 00:57:33.970 ⇒ 00:57:36.280 Hannah Wang: Sorry, for the intro?
630 00:57:37.460 ⇒ 00:57:43.089 Hannah Wang: Are we still doing, like… Cause I see… oh.
631 00:57:43.430 ⇒ 00:57:46.209 Hannah Wang: Okay, wait. Are we, like, doing the icebreaker?
632 00:57:46.880 ⇒ 00:57:47.640 Hannah Wang: Visit.
633 00:57:48.010 ⇒ 00:57:55.739 Hannah Wang: Or just introducing brain… or I don’t… wait, where did the agenda go? Like, I… sorry.
634 00:57:56.390 ⇒ 00:57:58.669 Hannah Wang: There’s, like, two of them, so I…
635 00:57:58.670 ⇒ 00:57:59.959 Pranav Narahari: Oh, the welcome…
636 00:58:00.120 ⇒ 00:58:05.609 Pranav Narahari: parts, yeah. I think below there’s, like, a 0-5 minute welcome line that got copied.
637 00:58:06.020 ⇒ 00:58:06.680 Hannah Wang: Right here.
638 00:58:06.680 ⇒ 00:58:08.180 Pranav Narahari: Like, at the bottom of your screen?
639 00:58:09.090 ⇒ 00:58:09.750 Luke Scorziell: Oh.
640 00:58:09.960 ⇒ 00:58:10.290 Pranav Narahari: Yeah.
641 00:58:10.290 ⇒ 00:58:12.239 Luke Scorziell: Oh, oh, shoot. My bad.
642 00:58:20.500 ⇒ 00:58:25.090 Pranav Narahari: This’ll probably be longer than 5 minutes now, since we’ve added this.
643 00:58:25.750 ⇒ 00:58:28.350 Pranav Narahari: Like, sharing what we’ve done for clients.
644 00:58:28.850 ⇒ 00:58:29.839 Luke Scorziell: But I think…
645 00:58:29.850 ⇒ 00:58:30.440 Pranav Narahari: the frame.
646 00:58:30.440 ⇒ 00:58:32.149 Luke Scorziell: Can we just ditch the icebreaker?
647 00:58:33.020 ⇒ 00:58:39.029 Pranav Narahari: Well, I think the framing will take less time now, since we’re just really talking about,
648 00:58:39.690 ⇒ 00:58:43.749 Pranav Narahari: like, these key terms, like, I don’t think it’s gonna take 8 minutes. I think probably, like.
649 00:58:45.390 ⇒ 00:58:48.690 Pranav Narahari: 4 or 5 minutes. I don’t know. What are you thinking?
650 00:58:51.340 ⇒ 00:58:53.420 Hannah Wang: Well, people also might ask questions.
651 00:58:53.740 ⇒ 00:58:54.720 Hannah Wang: During…
652 00:58:54.830 ⇒ 00:59:01.760 Hannah Wang: the… when you give, like, the definitions and stuff, so I do feel like it could… there is a world where it’d be…
653 00:59:02.060 ⇒ 00:59:06.409 Hannah Wang: Like, 10 minutes, if people are confused, or something.
654 00:59:06.790 ⇒ 00:59:08.030 Pranav Narahari: Yeah, that’s true.
655 00:59:09.190 ⇒ 00:59:12.820 Luke Scorziell: I guess I don’t… I don’t want to be married to the definitions, just because I’m kind of like…
656 00:59:13.650 ⇒ 00:59:16.220 Pranav Narahari: True. It might be kind of basic for people that are, like.
657 00:59:16.760 ⇒ 00:59:17.340 Hannah Wang: Mmm.
658 00:59:17.340 ⇒ 00:59:20.360 Luke Scorziell: Like, Jeff is… Jeff is gonna know a lot about AI.
659 00:59:20.360 ⇒ 00:59:21.260 Pranav Narahari: 100%.
660 00:59:21.690 ⇒ 00:59:26.020 Luke Scorziell: Jake works at an AI company, Troy…
661 00:59:26.220 ⇒ 00:59:30.340 Luke Scorziell: is probably pretty up-to-date on AI. Leo’s up-to-date on AI.
662 00:59:31.790 ⇒ 00:59:35.859 Luke Scorziell: Alice, I have no clue. Kayla, don’t really know. But it’s like…
663 00:59:38.500 ⇒ 00:59:40.340 Luke Scorziell: I think we… maybe we could just be like…
664 00:59:41.100 ⇒ 00:59:44.610 Luke Scorziell: If you don’t know what any of these terms are at any point, we’re happy to kind of…
665 00:59:45.870 ⇒ 00:59:53.229 Pranav Narahari: Yeah, we could just… if we’re… we’ll check… take a look at the clock, I can just literally make it, like, 10 seconds, say just that.
666 00:59:53.550 ⇒ 00:59:55.539 Pranav Narahari: And then move on to the next point.
667 00:59:56.050 ⇒ 01:00:02.130 Luke Scorziell: Okay, so yeah, I’ll have you drive… or you drive the Brainforge and, like, client work that we’ve done?
668 01:00:02.130 ⇒ 01:00:02.780 Pranav Narahari: Yeah.
669 01:00:02.800 ⇒ 01:00:05.210 Luke Scorziell: This will just say, like.
670 01:00:13.700 ⇒ 01:00:17.490 Luke Scorziell: I don’t know, I’m, like, really close, just saying we could just skip it, but,
671 01:00:21.460 ⇒ 01:00:23.850 Luke Scorziell: You can see, we can see what the vibes are.
672 01:00:24.180 ⇒ 01:00:29.020 Luke Scorziell: And the demo…
673 01:00:31.550 ⇒ 01:00:37.509 Luke Scorziell: I don’t… it probably won’t take all that long, unless I build it into something a little more complex, right?
674 01:00:38.930 ⇒ 01:00:39.630 Luke Scorziell: Huh.
675 01:00:41.190 ⇒ 01:00:43.499 Luke Scorziell: Or I, like, build it on the spot with her.
676 01:00:44.390 ⇒ 01:00:45.120 Pranav Narahari: Hmm.
677 01:00:45.870 ⇒ 01:00:47.010 Luke Scorziell: What were you saying? For the demo?
678 01:00:47.430 ⇒ 01:00:48.420 Pranav Narahari: like.
679 01:00:48.620 ⇒ 01:01:02.040 Pranav Narahari: there might… like, I think I’ll just prep for it, because I feel like it’s definitely gonna tie into the discussion portion of things, when people are talking about, like, oh, what does this do, what does that do? What can I use for this? What can I use for that, like…
680 01:01:02.680 ⇒ 01:01:08.309 Pranav Narahari: I can be like, oh, we’ll build an MCP server, by the way, like… Who knows what that is?
681 01:01:08.440 ⇒ 01:01:10.650 Pranav Narahari: Or are you familiar with it?
682 01:01:11.200 ⇒ 01:01:12.530 Luke Scorziell: Like, instead of doing this.
683 01:01:13.060 ⇒ 01:01:13.920 Pranav Narahari: Yeah.
684 01:01:14.750 ⇒ 01:01:18.700 Luke Scorziell: Yeah, I’m okay with that. I think… I think we could kind of state, like.
685 01:01:19.700 ⇒ 01:01:22.160 Luke Scorziell: Where’s the beginning? Or maybe you could just say, like.
686 01:01:28.590 ⇒ 01:01:29.799 Luke Scorziell: By the way…
687 01:01:33.610 ⇒ 01:01:34.510 Luke Scorziell: Spam…
688 01:01:42.150 ⇒ 01:01:48.470 Luke Scorziell: Interesting. I mean, I’m in the process of… Let’s see… Really?
689 01:01:48.860 ⇒ 01:01:49.830 Luke Scorziell: Supply.
690 01:01:50.480 ⇒ 01:01:52.010 Luke Scorziell: Well, you’ll stop me.
691 01:01:55.520 ⇒ 01:01:56.630 Pranav Narahari: Yeah, perfect.
692 01:02:00.150 ⇒ 01:02:05.379 Pranav Narahari: Yeah, and honestly, like, feel free to stop me at any point and just be like, hey, let’s.
693 01:02:05.770 ⇒ 01:02:09.070 Luke Scorziell: Let’s get a little less technical there, talk a little bit more.
694 01:02:09.150 ⇒ 01:02:12.390 Pranav Narahari: like, business impacty on whatever I’m talking about.
695 01:02:12.390 ⇒ 01:02:15.339 Luke Scorziell: Yeah, I mean, I think… well, I think the thing that…
696 01:02:15.640 ⇒ 01:02:19.210 Luke Scorziell: We have that’s really valuable that you shouldn’t shy away from, is that…
697 01:02:21.880 ⇒ 01:02:26.160 Luke Scorziell: You are super technical, and you do know, and that’s a good thing.
698 01:02:26.460 ⇒ 01:02:26.810 Pranav Narahari: Yeah.
699 01:02:26.810 ⇒ 01:02:30.019 Luke Scorziell: Because a lot of these agencies that are out there right now are just, like.
700 01:02:30.490 ⇒ 01:02:42.919 Luke Scorziell: probably, honestly, people like me, just, like, hopping on and… and doing stuff that… so it’s, like, this is a… this is a value proposition that we have, so I don’t… I wouldn’t want… don’t… don’t feel like you have to shy away from.
701 01:02:43.500 ⇒ 01:02:45.030 Luke Scorziell: Yeah. That.
702 01:02:45.600 ⇒ 01:02:50.129 Pranav Narahari: I think it’s sometimes it’s a sell, too, to be like, okay, they know what they’re talking about, like…
703 01:02:50.710 ⇒ 01:03:00.510 Pranav Narahari: And you kind of get to know that by hearing some, like, technical stuff that, like, oh, you know, you just read about on, like, LinkedIn or Twitter, but, like, when someone’s, like.
704 01:03:00.750 ⇒ 01:03:08.099 Pranav Narahari: really talking about… or making the connection to you, like, how… and, like, you understand it a little bit better.
705 01:03:08.710 ⇒ 01:03:11.300 Pranav Narahari: Yeah, I think that will probably just, like.
706 01:03:11.640 ⇒ 01:03:17.849 Pranav Narahari: have them more… give… give them more trust in Brainforge, because, like, okay, they actually sound like… like…
707 01:03:18.780 ⇒ 01:03:20.390 Pranav Narahari: Like, experts in this field.
708 01:03:20.890 ⇒ 01:03:25.560 Luke Scorziell: Yeah, 100%. I agree. So…
709 01:03:26.130 ⇒ 01:03:29.920 Luke Scorziell: Yeah, and then… okay, so then I guess, like, as you walk through, like.
710 01:03:30.780 ⇒ 01:03:38.940 Luke Scorziell: this section, too, I guess I’m thinking, like, this is, like, really kind of making, like,
711 01:03:40.080 ⇒ 01:03:43.499 Luke Scorziell: Also, if either of you have to hop off at any point, I’ll probably need to…
712 01:03:43.850 ⇒ 01:03:46.149 Luke Scorziell: Trying to go soonish, but,
713 01:03:47.120 ⇒ 01:03:58.089 Luke Scorziell: Yeah, just, I think, like, talking about, like, the sky is really the limit, like, we… we can build really any… anything that annoys you, like, we can build, and then… and then maybe that could be, like.
714 01:03:58.930 ⇒ 01:04:01.950 Luke Scorziell: Like, and what?
715 01:04:03.040 ⇒ 01:04:04.630 Luke Scorziell: Why is the limit?
716 01:04:06.310 ⇒ 01:04:07.730 Luke Scorziell: Let me bring that…
717 01:04:28.920 ⇒ 01:04:31.539 Pranav Narahari: I think it’s, like, basically a guarantee. I’ll just be like.
718 01:04:32.390 ⇒ 01:04:35.299 Pranav Narahari: I would bet it can be helped with AI.
719 01:04:35.790 ⇒ 01:04:37.409 Luke Scorziell: Yeah, or, I mean, you could just say, like.
720 01:04:38.190 ⇒ 01:04:40.289 Luke Scorziell: Can be helpful, you know, to make. Yeah.
721 01:04:40.290 ⇒ 01:04:42.149 Pranav Narahari: Can definitely be held to that, yeah.
722 01:04:42.150 ⇒ 01:04:44.449 Luke Scorziell: Because I feel pretty confident saying that.
723 01:04:45.370 ⇒ 01:04:51.640 Luke Scorziell: Temple, and then I’ll be like, Thanks so much.
724 01:05:01.440 ⇒ 01:05:06.469 Hannah Wang: I think I’ll make slides. I’ll just do, like, the Friday slides.
725 01:05:06.640 ⇒ 01:05:10.049 Hannah Wang: Variation, like, a really simple one, because…
726 01:05:10.920 ⇒ 01:05:15.709 Hannah Wang: yeah, it’s… I think it’ll be helpful to guide through the agenda.
727 01:05:16.750 ⇒ 01:05:22.970 Hannah Wang: So I’ll just make a quick one in Google Slides, and then we can… use that as…
728 01:05:23.500 ⇒ 01:05:29.219 Hannah Wang: There’s a lot going on. And then, Pranav, whatever you build your slide on, like.
729 01:05:29.370 ⇒ 01:05:31.039 Hannah Wang: Doesn’t have to be Google.
730 01:05:31.580 ⇒ 01:05:37.300 Hannah Wang: slides, you can just, like, even screenshot something, and then I’ll just paste it into a slide or something.
731 01:05:38.180 ⇒ 01:05:42.419 Luke Scorziell: Perfect. Could also just say, yeah, yeah, the slides could also just be, like, demo.
732 01:05:42.760 ⇒ 01:05:43.729 Hannah Wang: Yeah,
733 01:05:44.160 ⇒ 01:05:45.530 Luke Scorziell: about Brainforge.
734 01:05:46.010 ⇒ 01:05:46.630 Hannah Wang: Yeah.
735 01:05:48.220 ⇒ 01:05:50.800 Luke Scorziell: Okay, thank you for offering to do that.
736 01:05:51.100 ⇒ 01:05:53.780 Luke Scorziell: Yeah. Also, I don’t know if you want to charge your laptop.
737 01:05:54.880 ⇒ 01:05:58.749 Hannah Wang: It might die soon. Okay, 8%, not bad.
738 01:05:59.930 ⇒ 01:06:04.359 Luke Scorziell: Let me go get my charger. It’s funny.
739 01:06:07.980 ⇒ 01:06:08.930 Luke Scorziell: Okay.
740 01:06:09.070 ⇒ 01:06:11.830 Luke Scorziell: Thanks so much, Pranav, for, like, what you’ve done with those clients.
741 01:06:14.550 ⇒ 01:06:15.500 Luke Scorziell: I like it.
742 01:06:31.240 ⇒ 01:06:32.020 Pranav Narahari: Yay.
743 01:06:35.540 ⇒ 01:06:39.400 Luke Scorziell: Okay, I can, I can finish this part, too.
744 01:06:39.510 ⇒ 01:06:40.340 Luke Scorziell: the gold.
745 01:06:42.410 ⇒ 01:06:43.330 Pranav Narahari: Oof.
746 01:06:48.890 ⇒ 01:06:51.510 Luke Scorziell: Screen…
747 01:07:21.120 ⇒ 01:07:22.160 Luke Scorziell: Okay, and then…
748 01:07:26.670 ⇒ 01:07:27.490 Pranav Narahari: Okay.
749 01:07:31.540 ⇒ 01:07:32.590 Luke Scorziell: October.
750 01:07:42.080 ⇒ 01:07:52.909 Pranav Narahari: Hannah, did that mixer end up happening last week? In Austin?
751 01:07:53.590 ⇒ 01:07:59.500 Hannah Wang: No, we had to… to cancel, because we only got, like, 3 sign-ups.
752 01:07:59.500 ⇒ 01:08:00.969 Pranav Narahari: Oh, gotcha, gotcha.
753 01:08:00.970 ⇒ 01:08:04.810 Hannah Wang: Yeah, it was, like, a whole… whole thing,
754 01:08:05.530 ⇒ 01:08:13.239 Hannah Wang: But we’re, like, hosting, a happy hour tomorrow, and we got more sign-ups for that, and that was, like, a super last-minute thing.
755 01:08:14.030 ⇒ 01:08:14.780 Pranav Narahari: Nuts.
756 01:08:15.470 ⇒ 01:08:20.040 Hannah Wang: Yeah, that… Kind of sad, because…
757 01:08:20.140 ⇒ 01:08:27.979 Hannah Wang: we obviously implement Mixpanel, and we’re partners with them, and… Had to cancel the event.
758 01:08:28.560 ⇒ 01:08:31.730 Hannah Wang: And then there’s this, like, whole thing with the venue.
759 01:08:34.229 ⇒ 01:08:34.989 Pranav Narahari: Right.
760 01:08:34.990 ⇒ 01:08:38.409 Hannah Wang: Yeah, they… Long story, but…
761 01:08:38.840 ⇒ 01:08:44.230 Hannah Wang: that’s still going on. Probably have to dispute something with Uttam’s credit card, but…
762 01:08:44.229 ⇒ 01:08:45.850 Luke Scorziell: Oh, really? It’s still going on.
763 01:08:46.470 ⇒ 01:08:57.899 Hannah Wang: Yeah, like, both of the hosts… the host and Peerspace are like, nope, you gotta pay all of it, and then Robert’s like, let’s just dispute it. I was like, okay.
764 01:09:01.560 ⇒ 01:09:04.370 Hannah Wang: Yeah, but thanks for asking.
765 01:09:04.729 ⇒ 01:09:08.709 Pranav Narahari: Oh man, I didn’t know it was… I didn’t know it was like that.
766 01:09:08.710 ⇒ 01:09:10.370 Luke Scorziell: No clue that was still happening. I’m sorry.
767 01:09:10.370 ⇒ 01:09:15.840 Hannah Wang: That’s okay. I… my husband is a lawyer, so I just make him write all the language.
768 01:09:16.170 ⇒ 01:09:24.629 Hannah Wang: I’m, like, very non-confrontational, and I hate doing stuff like that, so I’m like, okay, Eric, you help me fight them.
769 01:09:25.460 ⇒ 01:09:27.220 Hannah Wang: That’s a win.
770 01:09:38.359 ⇒ 01:09:44.909 Luke Scorziell: Okay, I can write the demo script later. But yeah, discussion, I guess?
771 01:09:46.999 ⇒ 01:09:52.429 Luke Scorziell: We’ll make sure everyone gets… I don’t… yeah, I don’t mind at all kind of guiding it, I guess, but…
772 01:09:52.939 ⇒ 01:09:56.649 Luke Scorziell: Yeah, they… I was thinking of, like, what is the…
773 01:10:00.000 ⇒ 01:10:02.369 Pranav Narahari: I think what you have here, like, is pretty good, like…
774 01:10:03.500 ⇒ 01:10:14.969 Pranav Narahari: seeding it, and then whenever there’s follow-up questions about, like, oh, how exactly would you implement this? If they do become technical, then I think I do find just…
775 01:10:15.280 ⇒ 01:10:16.989 Pranav Narahari: Answering those questions.
776 01:10:20.000 ⇒ 01:10:21.690 Pranav Narahari: But basically, it starts with you.
777 01:10:24.410 ⇒ 01:10:25.789 Hannah Wang: I think we lost Luke.
778 01:10:26.260 ⇒ 01:10:27.920 Pranav Narahari: Oh, okay, okay, I wasn’t sure if that was pink.
779 01:10:31.370 ⇒ 01:10:33.429 Luke Scorziell: Like, what are we trying to do with AI?
780 01:10:35.200 ⇒ 01:10:36.350 Luke Scorziell: some questions.
781 01:10:36.550 ⇒ 01:10:37.860 Luke Scorziell: Wait, can you guys hear me now?
782 01:10:38.040 ⇒ 01:10:38.820 Luke Scorziell: That’s tall.
783 01:10:40.230 ⇒ 01:10:41.879 Luke Scorziell: We can. Yeah, this is good.
784 01:10:41.880 ⇒ 01:10:42.610 Pranav Narahari: Yeah.
785 01:10:44.590 ⇒ 01:10:50.909 Hannah Wang: Oh, okay, well… Maybe he’ll… Maybe he’ll join again.
786 01:10:55.720 ⇒ 01:11:14.409 Pranav Narahari: I was gonna say, for, like, drafting, like, messages that have to be, like, somewhat, like, confrontational, like, I was using ChatGPT so much for that, because we were having an issue with, like, me and my roommates when I was, like, living in, like, a townhouse, like, in Austin, like.
787 01:11:14.910 ⇒ 01:11:19.310 Pranav Narahari: our, like, landlord just… just a terrible person. Like, just.
788 01:11:19.310 ⇒ 01:11:20.510 Hannah Wang: Oh, man.
789 01:11:20.950 ⇒ 01:11:27.730 Pranav Narahari: And so, we were just like… I was like, hey, this stuff is broken, how can you say this in, like, strict language?
790 01:11:27.730 ⇒ 01:11:28.640 Hannah Wang: Yeah.
791 01:11:28.640 ⇒ 01:11:33.509 Pranav Narahari: And then use, like, the lease as, like, context, like, doing all that stuff, but…
792 01:11:33.510 ⇒ 01:11:41.170 Hannah Wang: Yeah. Yeah, maybe I should’ve… used AI to… Well, I don’t know.
793 01:11:41.570 ⇒ 01:11:47.760 Hannah Wang: Yeah, next time, if… if I don’t have the resources. But, oh man, bad landlords are always…
794 01:11:48.140 ⇒ 01:11:56.269 Hannah Wang: I feel like tenant lawsuits are, like, a really common, common thing. And they’re always in favor of tenants, I feel like, so…
795 01:11:57.290 ⇒ 01:11:58.000 Pranav Narahari: Yeah.
796 01:11:58.660 ⇒ 01:11:59.260 Hannah Wang: boy.
797 01:11:59.550 ⇒ 01:12:01.370 Luke Scorziell: Depending on the state, I guess.
798 01:12:01.370 ⇒ 01:12:08.220 Pranav Narahari: Yeah, I think Texas maybe… or at least Austin was saying, like, yeah, you have a low likelihood chance of, like… Oh, what?
799 01:12:08.690 ⇒ 01:12:14.260 Luke Scorziell: California’s, like, super… it sucks to be a landlord in California, but it’s really nice to be a tenant.
800 01:12:14.260 ⇒ 01:12:14.910 Hannah Wang: Yeah.
801 01:12:15.790 ⇒ 01:12:18.890 Hannah Wang: Same thing, it’s nice to be an employee in California, but…
802 01:12:18.890 ⇒ 01:12:20.430 Luke Scorziell: A little harder to be a business.
803 01:12:21.550 ⇒ 01:12:22.270 Pranav Narahari: Yeah.
804 01:12:23.860 ⇒ 01:12:38.210 Luke Scorziell: Okay, I kind of have to… I’m at my grandparents, and they’re making dinner, and then I’m going to a game, so I might, I might dip, because I think we’re, like… I don’t know, do you guys feel confident? What is our confidence level?
805 01:12:38.660 ⇒ 01:12:44.109 Pranav Narahari: I feel pretty good, yeah. Hannah, at some point tonight, I’ll just send you, like, that screenshot.
806 01:12:44.240 ⇒ 01:12:45.000 Pranav Narahari: Fuck.
807 01:12:45.930 ⇒ 01:12:51.479 Pranav Narahari: And then, yeah, I feel pretty good other than that. I think that’s the only prep, really, I need to do.
808 01:12:54.620 ⇒ 01:12:55.869 Luke Scorziell: Per… yeah, sorry, go for it.
809 01:12:55.870 ⇒ 01:13:07.420 Hannah Wang: I was just gonna say I’ll… I’ll whip up something real quick, and then it should be… we have some time tomorrow morning to… to add finishing touches, so I don’t think it should be a huge problem.
810 01:13:08.080 ⇒ 01:13:08.760 Luke Scorziell: Yeah.
811 01:13:09.490 ⇒ 01:13:11.440 Luke Scorziell: Cool. Okay. I’ll get…
812 01:13:11.550 ⇒ 01:13:20.320 Luke Scorziell: Good job. And I think, yeah, I mean, this should be easier, and seems like something we can do again. So I guess, like, the final part, too, will be…
813 01:13:20.700 ⇒ 01:13:25.179 Luke Scorziell: Yeah, us sending, like, a follow-up email…
814 01:13:25.400 ⇒ 01:13:28.829 Luke Scorziell: Inviting them to go to another session.
815 01:13:29.260 ⇒ 01:13:32.430 Luke Scorziell: And maybe just sending, like, a couple of the lead magnets.
816 01:13:34.290 ⇒ 01:13:35.020 Hannah Wang: Yeah.
817 01:13:36.370 ⇒ 01:13:37.090 Hannah Wang: Sounds good.
818 01:13:37.580 ⇒ 01:13:39.380 Luke Scorziell: Cool. Cool. Okay.
819 01:13:40.930 ⇒ 01:13:42.569 Luke Scorziell: Thank you, guys.
820 01:13:43.630 ⇒ 01:13:44.850 Pranav Narahari: Yeah, thanks guys.
821 01:13:44.850 ⇒ 01:13:46.959 Hannah Wang: Alright, have a good evening. Bye.
822 01:13:46.960 ⇒ 01:13:47.580 Luke Scorziell: You too.