Meeting Title: BF StoryBrand Session Date: 2025-06-16 Meeting participants: Hannah Wang, Uttam Kumaran, Amber Lin
WEBVTT
1 00:00:27.660 ⇒ 00:00:28.200 Uttam Kumaran: Hello!
2 00:00:30.590 ⇒ 00:00:31.480 Hannah Wang: Hello!
3 00:00:33.290 ⇒ 00:00:34.585 Uttam Kumaran: How’s the day going.
4 00:00:36.080 ⇒ 00:00:37.559 Hannah Wang: Good. How’s yours?
5 00:00:38.510 ⇒ 00:00:45.002 Uttam Kumaran: It’s good just got a bunch of sales stuff done. And yeah, just moving
6 00:00:46.470 ⇒ 00:00:54.130 Uttam Kumaran: we are trying to bring on also like a sales coordinator and a Pm. Coordinator. So I’m just like moving that stuff along. And
7 00:00:55.240 ⇒ 00:00:59.076 Uttam Kumaran: yeah, it’s good. Had a good conversation, too, with,
8 00:01:00.050 ⇒ 00:01:03.760 Uttam Kumaran: I was talking to this. These folks that I don’t know. I sent it in the
9 00:01:05.209 ⇒ 00:01:11.550 Uttam Kumaran: inspiration channel interlude.
10 00:01:14.200 ⇒ 00:01:19.990 Uttam Kumaran: Yeah, I had a good conversation with Matthew. Who’s 1 of the head guys there? And Hi.
11 00:01:19.990 ⇒ 00:01:24.970 Uttam Kumaran: I told him I told him that. Hey? I told him that we’d with
12 00:01:25.110 ⇒ 00:01:26.910 Uttam Kumaran: maybe the Me. Him
13 00:01:27.050 ⇒ 00:01:34.730 Uttam Kumaran: and they have their other co-founders like a big brand designer guy, so I think it’d be cool to all talk because they do a lot of brand work for
14 00:01:36.170 ⇒ 00:01:38.819 Uttam Kumaran: for Vc. Back startups.
15 00:01:40.860 ⇒ 00:01:43.590 Hannah Wang: Oh, right? Yeah. Their website looked really good.
16 00:01:43.590 ⇒ 00:01:44.530 Uttam Kumaran: Yeah.
17 00:01:45.940 ⇒ 00:01:51.009 Hannah Wang: Okay, yeah. If you schedule a meeting with them, I’m happy to hop on as well for that.
18 00:01:51.010 ⇒ 00:01:51.560 Uttam Kumaran: Yeah.
19 00:01:53.980 ⇒ 00:01:55.500 Hannah Wang: Hi! Amber! How’s it going.
20 00:01:57.796 ⇒ 00:02:02.173 Amber Lin: I just jumped from another meeting, so I’m still switching my brain over.
21 00:02:02.510 ⇒ 00:02:05.290 Hannah Wang: That’s okay. You can switch slowly.
22 00:02:05.290 ⇒ 00:02:05.680 Amber Lin: Okay.
23 00:02:05.680 ⇒ 00:02:11.070 Hannah Wang: This should be. Oh, I don’t know how intense this will be, but I can drive, so
24 00:02:11.070 ⇒ 00:02:19.629 Hannah Wang: me pull up all the things. Let’s see what meeting was it for?
25 00:02:21.700 ⇒ 00:02:22.830 Hannah Wang: Well was it.
26 00:02:22.830 ⇒ 00:02:28.539 Amber Lin: I was meeting the ABC. Trainers. So the the people that go.
27 00:02:28.950 ⇒ 00:02:29.680 Hannah Wang: That’s
28 00:02:31.438 ⇒ 00:02:43.079 Amber Lin: That’s that’s helping me organize their internal documents and making sure we roll, making sure that people actually use them. So people on almost that’s on the floor, working so.
29 00:02:43.080 ⇒ 00:02:44.000 Hannah Wang: I see.
30 00:02:44.250 ⇒ 00:02:49.680 Amber Lin: With Janice and Shannon. So that was helpful and chaotic because they’re not organized.
31 00:02:49.800 ⇒ 00:02:51.046 Amber Lin: Oh, boy!
32 00:02:52.010 ⇒ 00:02:55.830 Hannah Wang: They maybe they need a am coordinator on their.
33 00:02:55.830 ⇒ 00:03:07.050 Amber Lin: Well, I mean, that’s what I that’s what I’m doing. And every time we meet we’re like, oh, this is so productive like. I I wonder why I had to be with you guys because it wasn’t done.
34 00:03:07.050 ⇒ 00:03:11.500 Uttam Kumaran: But it’s also like that type of work buys us more time. And yeah, work. Right? So.
35 00:03:11.500 ⇒ 00:03:14.840 Amber Lin: It is, it is is just that work alone, like
36 00:03:15.480 ⇒ 00:03:23.169 Amber Lin: like, not even development, makes our responses so much better because we just don’t have the content to make good responses with our AI.
37 00:03:23.470 ⇒ 00:03:32.780 Hannah Wang: And okay, that’s it’s like, so interesting hearing about client work. It’s like, Oh, yeah.
38 00:03:32.780 ⇒ 00:03:37.470 Hannah Wang: I can’t imagine. Oh, I I feel like it’s good. But it’s also it’s difficult.
39 00:03:37.470 ⇒ 00:03:44.050 Hannah Wang: because you’re dealing with people and clients and stuff, and that’s always tricky. So good job.
40 00:03:44.050 ⇒ 00:03:47.479 Amber Lin: Sometimes it’s fun, sometimes it’s intense.
41 00:03:47.480 ⇒ 00:03:47.800 Hannah Wang: Yeah.
42 00:03:48.066 ⇒ 00:03:50.459 Amber Lin: Like the variety. So I I’m good with that.
43 00:03:50.460 ⇒ 00:03:57.130 Hannah Wang: That’s good. That’s good. I don’t do well with variety. So it’s good. We have a mixture of people on this team.
44 00:03:58.150 ⇒ 00:04:02.500 Hannah Wang: Okay, so I just sent over the fig jam link.
45 00:04:02.880 ⇒ 00:04:16.539 Hannah Wang: and I’m also gonna send over a notion file that Utong started. I edited it last Friday or Thursday, let me share that
46 00:04:16.829 ⇒ 00:04:18.029 Hannah Wang: page.
47 00:04:19.790 ⇒ 00:04:24.729 Hannah Wang: So Tom, I, just like moved down your notes over here, cause you took notes, so I, just
48 00:04:25.030 ⇒ 00:04:28.159 Hannah Wang: okay track of it. But I just added, like
49 00:04:28.770 ⇒ 00:04:55.449 Hannah Wang: the fig jam link here. And then the each step in the story brand framework. So I just kind of combed through the tickets on the fig jam, and kind of took the ones that had the most votes and kind of move them over here, so maybe we can look through it and brainstorm but I think for the character. We agreed that it’d be like company execs. So I just paste that there? But
50 00:04:55.540 ⇒ 00:04:58.809 Hannah Wang: yeah, I think where we kind of left off last time was
51 00:04:59.270 ⇒ 00:05:03.171 Hannah Wang: 2, 2 through 7 is kind of what we need to hammer out.
52 00:05:03.450 ⇒ 00:05:04.090 Amber Lin: Okay.
53 00:05:04.090 ⇒ 00:05:05.424 Hannah Wang: So basically everything,
54 00:05:05.870 ⇒ 00:05:06.400 Amber Lin: Okay.
55 00:05:06.500 ⇒ 00:05:23.120 Hannah Wang: So I don’t know how we want to run this meeting exactly. Do you wanna hop on the fig jam and look at those notes like, look at everything. Or do you just kinda wanna look at kind of what I brainstormed here. Just talk through that, or how.
56 00:05:23.120 ⇒ 00:05:25.616 Uttam Kumaran: Yeah, I think that I think I would.
57 00:05:26.410 ⇒ 00:05:34.549 Uttam Kumaran: What you bring. What you put in the brain forge Storybrand is, did you just like copy over the couple the 4 things from
58 00:05:36.870 ⇒ 00:05:38.449 Uttam Kumaran: what we agreed on, or tell me.
59 00:05:38.450 ⇒ 00:05:39.720 Hannah Wang: Yeah, I.
60 00:05:39.720 ⇒ 00:05:41.160 Uttam Kumaran: It would be easier for you.
61 00:05:47.830 ⇒ 00:06:01.120 Hannah Wang: I just kinda took like, for example, Number 4 took the ones that people like upvoted a lot, cause we didn’t like talk through everything in detail. So I just took the ones that the most votes and kinda like structured it and
62 00:06:01.230 ⇒ 00:06:17.310 Hannah Wang: gave like options. So, for example, like the process plan. This is like potential plan number one, Potential plan number 2, agreement plans that just kind of copied over like the ones that were voted
63 00:06:18.840 ⇒ 00:06:22.900 Hannah Wang: highly voted, and for these 2 I just copied over
64 00:06:23.370 ⇒ 00:06:25.640 Hannah Wang: the text that were on the stickies.
65 00:06:25.850 ⇒ 00:06:28.514 Hannah Wang: So I feel like.
66 00:06:29.180 ⇒ 00:06:43.484 Uttam Kumaran: My 1st ask, because I think what would be helpful while we 3 of us have a discussion is for me to just like. Also, I’ll just have AI running, and I’ll also share. We can just both share our screens. And I could just have it there. But like, can we? Can we put, like
67 00:06:43.890 ⇒ 00:06:49.188 Uttam Kumaran: all of the notes from like the stickies we like at the bottom of this somewhere?
68 00:06:50.910 ⇒ 00:06:54.110 Hannah Wang: Okay, let’s see.
69 00:06:54.110 ⇒ 00:06:59.959 Hannah Wang: like we don’t necessarily need to like. I mean, we can keep my notes like all the way at the bottom. But I kinda wanna just have a section for.
70 00:07:00.380 ⇒ 00:07:05.670 Uttam Kumaran: Like all the notes from our right?
71 00:07:05.670 ⇒ 00:07:06.770 Uttam Kumaran: Yeah, okay.
72 00:07:11.430 ⇒ 00:07:12.440 Hannah Wang: Okay, cool.
73 00:07:14.020 ⇒ 00:07:17.170 Hannah Wang: Okay. Let’s see.
74 00:07:17.770 ⇒ 00:07:23.399 Hannah Wang: I mean, we didn’t really vote on the villain problem stuff.
75 00:07:23.900 ⇒ 00:07:31.052 Hannah Wang: Yeah. So should we just copy over everything
76 00:07:32.320 ⇒ 00:07:33.489 Hannah Wang: I mean, I think there won’t.
77 00:07:33.620 ⇒ 00:07:36.440 Uttam Kumaran: Ones that we had here.
78 00:07:39.910 ⇒ 00:07:44.790 Uttam Kumaran: Yeah, maybe let’s just copy over everything like, is there like, can you? Can I export this somehow?
79 00:07:46.162 ⇒ 00:07:48.877 Hannah Wang: I don’t know. Let me see.
80 00:07:53.090 ⇒ 00:07:56.029 Uttam Kumaran: I can export as like a Csv looks like.
81 00:07:56.620 ⇒ 00:07:58.870 Uttam Kumaran: And let me see what happens when I do that.
82 00:08:02.150 ⇒ 00:08:05.836 Uttam Kumaran: because I just want it to be like copy pasteable into AI, basically.
83 00:08:06.120 ⇒ 00:08:06.880 Hannah Wang: Yeah.
84 00:08:14.250 ⇒ 00:08:14.580 Uttam Kumaran: Okay.
85 00:09:25.990 ⇒ 00:09:30.609 Uttam Kumaran: I’m trying to open this taking a sec.
86 00:09:32.530 ⇒ 00:09:36.979 Uttam Kumaran: Maybe. Also, do you mind getting the transcripts for those 2 meetings that we had.
87 00:09:38.880 ⇒ 00:09:43.610 Hannah Wang: Oh, yeah, and you can just paste them into like some sort of toggle there as well.
88 00:09:44.610 ⇒ 00:09:46.640 Hannah Wang: Yeah, let me see.
89 00:09:48.550 ⇒ 00:09:50.699 Uttam Kumaran: Get that, probably from the zoom app.
90 00:09:51.090 ⇒ 00:09:51.750 Hannah Wang: Yeah.
91 00:10:11.620 ⇒ 00:10:18.968 Uttam Kumaran: It didn’t really work. So I’m just gonna probably just like copy paste it in.
92 00:10:20.050 ⇒ 00:10:22.720 Hannah Wang: Okay, I can help with that. After.
93 00:12:01.270 ⇒ 00:12:04.760 Uttam Kumaran: Sorry it’s actually
94 00:12:17.060 ⇒ 00:12:17.760 Uttam Kumaran: why
95 00:12:29.480 ⇒ 00:12:33.540 Uttam Kumaran: I’m trying to export it again. But.
96 00:12:34.120 ⇒ 00:12:38.960 Hannah Wang: You wanna maybe share your screen so I can maybe help and see.
97 00:12:39.620 ⇒ 00:12:44.160 Uttam Kumaran: Well, I’m I’m just. I just export this. Pdf, but it’s like kind of lagging. I don’t know.
98 00:12:44.160 ⇒ 00:12:44.980 Hannah Wang: Oh, okay.
99 00:12:46.580 ⇒ 00:12:49.780 Uttam Kumaran: My froze.
100 00:12:54.020 ⇒ 00:12:59.060 Uttam Kumaran: If you go to file and figma and go to file export as Pdf.
101 00:12:59.520 ⇒ 00:13:00.970 Uttam Kumaran: see if it works for you.
102 00:13:02.170 ⇒ 00:13:05.730 Hannah Wang: Like the whole ping, my file. Okay, yeah.
103 00:13:05.730 ⇒ 00:13:07.210 Uttam Kumaran: That. Okay.
104 00:13:10.290 ⇒ 00:13:11.080 Uttam Kumaran: share.
105 00:13:46.280 ⇒ 00:13:50.260 Uttam Kumaran: Okay, this I don’t know if it started glitching for me.
106 00:13:52.940 ⇒ 00:13:58.140 Hannah Wang: Yeah, mine’s mine’s still loading, but I’ll give it a minute or 2,
107 00:14:08.700 ⇒ 00:14:10.440 Hannah Wang: I think. Actually.
108 00:14:10.590 ⇒ 00:14:21.090 Hannah Wang: I think it’s because if you export the whole thing. It’s also gonna try to parse through like all the screenshots. So I’m gonna try to selectively export.
109 00:14:21.090 ⇒ 00:14:25.240 Uttam Kumaran: Oh, okay.
110 00:14:25.450 ⇒ 00:14:26.540 Hannah Wang: Yeah. Let me try.
111 00:14:26.540 ⇒ 00:14:31.825 Uttam Kumaran: I feel like if I just get it all as text, at least in notion, it’ll help this a bit.
112 00:14:32.270 ⇒ 00:14:32.690 Hannah Wang: Yeah.
113 00:14:42.750 ⇒ 00:14:46.909 Hannah Wang: cool as Pdf.
114 00:15:18.200 ⇒ 00:15:23.110 Hannah Wang: how did you get the ones in ocean right now, like the.
115 00:15:23.110 ⇒ 00:15:26.569 Uttam Kumaran: I just select. I just like selected it, copied and paste it.
116 00:15:27.110 ⇒ 00:15:28.130 Hannah Wang: Okay.
117 00:15:30.543 ⇒ 00:15:38.659 Uttam Kumaran: And then for this, like, I don’t. Actually the Vtt file isn’t best like, can you get it from the actual like our zoom app, because it’ll give it to us like.
118 00:15:39.125 ⇒ 00:15:43.670 Hannah Wang: Oh, kind of nicely formatted, and you can just click copy there.
119 00:16:26.820 ⇒ 00:16:30.514 Uttam Kumaran: This is my laptop is like not working with the channel.
120 00:16:31.220 ⇒ 00:16:39.369 Hannah Wang: Yeah, my! Mine’s not really loading. Wait! Can you help me? Wait? Where’s my zoom?
121 00:16:41.670 ⇒ 00:16:46.229 Uttam Kumaran: Yeah, if you go to the go to the zoom demo thing.
122 00:16:47.140 ⇒ 00:16:49.230 Hannah Wang: Like within zoom right.
123 00:16:52.290 ⇒ 00:16:53.349 Uttam Kumaran: No, no!
124 00:16:54.150 ⇒ 00:16:56.559 Uttam Kumaran: Like our the app that we built.
125 00:16:57.890 ⇒ 00:17:00.120 Hannah Wang: Oh yes!
126 00:17:00.120 ⇒ 00:17:02.819 Uttam Kumaran: Like a demo.brainforce.ai.
127 00:17:03.040 ⇒ 00:17:03.860 Uttam Kumaran: Yes.
128 00:17:04.050 ⇒ 00:17:05.810 Hannah Wang: Zoom search. Yeah. Okay.
129 00:17:06.109 ⇒ 00:17:07.180 Hannah Wang: I see.
130 00:17:07.680 ⇒ 00:17:08.139 Uttam Kumaran: Yeah.
131 00:17:09.660 ⇒ 00:17:16.290 Uttam Kumaran: And then, if you just search for whatever the the name of the meeting was, and then at the, you can just literally copy the Transcript, and we cleaned it up.
132 00:17:16.290 ⇒ 00:17:16.910 Hannah Wang: Right.
133 00:17:17.500 ⇒ 00:17:18.470 Uttam Kumaran: Like. Nice.
134 00:17:19.410 ⇒ 00:17:20.030 Hannah Wang: Yeah.
135 00:17:54.280 ⇒ 00:17:59.459 Hannah Wang: is it? Okay? If I put it into a txt because it’s too large to paste it.
136 00:17:59.460 ⇒ 00:18:00.340 Uttam Kumaran: Sure. Yeah, that’s fine.
137 00:18:00.340 ⇒ 00:18:01.210 Hannah Wang: Okay.
138 00:18:13.520 ⇒ 00:18:18.206 Uttam Kumaran: So what I’m gonna do is I’m gonna take all this
139 00:19:44.930 ⇒ 00:19:46.290 Uttam Kumaran: download, this one
140 00:20:55.080 ⇒ 00:20:56.140 Uttam Kumaran: 6.
141 00:20:56.610 ⇒ 00:20:58.819 Uttam Kumaran: I have to close this fake channel or nothing.
142 00:22:36.220 ⇒ 00:22:38.970 Uttam Kumaran: Okay, so like, these are all fine. Right?
143 00:22:40.460 ⇒ 00:22:48.139 Hannah Wang: Well, I I feel like the problem we were having was integrating AI into the problem, right?
144 00:22:49.300 ⇒ 00:22:53.510 Hannah Wang: So that, like just data wise, that looks great. But
145 00:22:53.740 ⇒ 00:22:58.740 Hannah Wang: maybe we can think about how to loop in AI. Somehow.
146 00:22:59.560 ⇒ 00:23:02.640 Uttam Kumaran: Yeah. So let me pull this up.
147 00:23:56.980 ⇒ 00:24:01.370 Uttam Kumaran: And this is this is probably the what we’re talking about. Right.
148 00:24:01.370 ⇒ 00:24:01.813 Hannah Wang: Yeah,
149 00:24:08.960 ⇒ 00:24:10.669 Uttam Kumaran: Okay, I feel like this is good. Then.
150 00:24:10.910 ⇒ 00:24:12.791 Uttam Kumaran: right? These are pretty good. So let’s
151 00:24:13.910 ⇒ 00:24:17.107 Uttam Kumaran: I guess, like, let’s talk about
152 00:24:19.450 ⇒ 00:24:26.690 Uttam Kumaran: So we have the villains. Fine, external.
153 00:24:28.280 ⇒ 00:24:31.150 Uttam Kumaran: Okay? So let’s here. Okay, so
154 00:24:34.570 ⇒ 00:24:37.750 Uttam Kumaran: confident, Roi, positive decisions
155 00:24:44.360 ⇒ 00:24:49.400 Uttam Kumaran: respected as a data forward leader, save time. Clean data in a league of their own.
156 00:24:50.340 ⇒ 00:24:53.549 Uttam Kumaran: Okay? Like, yeah, we basically landed kind of in here.
157 00:24:53.550 ⇒ 00:24:54.310 Hannah Wang: Yeah.
158 00:24:56.400 ⇒ 00:24:58.420 Uttam Kumaran: I’m just gonna paste these. So we have them.
159 00:25:03.941 ⇒ 00:25:10.120 Uttam Kumaran: Helps them avoid failure, wasting time and money missing decision process falling behind customer.
160 00:25:11.230 ⇒ 00:25:13.469 Uttam Kumaran: Okay, yep. Have this
161 00:25:18.040 ⇒ 00:25:22.400 Uttam Kumaran: Turdish transactional Cta landing page downloads.
162 00:25:23.120 ⇒ 00:25:26.689 Uttam Kumaran: not ready to chat healthy analytics checklist.
163 00:25:31.050 ⇒ 00:25:32.550 Uttam Kumaran: So we can talk about these
164 00:25:33.360 ⇒ 00:25:38.880 Uttam Kumaran: direct Cta schedule call book your strategy session. Yeah.
165 00:25:40.000 ⇒ 00:25:42.549 Uttam Kumaran: okay, I think these are good. I think.
166 00:25:43.330 ⇒ 00:25:51.770 Uttam Kumaran: Book a call book, a strategy call, yeah, we’re good.
167 00:25:52.040 ⇒ 00:25:54.589 Uttam Kumaran: Let’s talk about the agreement plan.
168 00:25:55.990 ⇒ 00:26:00.169 Uttam Kumaran: We don’t sell software, you’ll never be left with a pretty dashboard that doesn’t help.
169 00:26:01.540 ⇒ 00:26:03.429 Uttam Kumaran: Okay. I think these are
170 00:26:08.850 ⇒ 00:26:10.669 Uttam Kumaran: the worry Free Kickoff.
171 00:26:12.150 ⇒ 00:26:14.400 Hannah Wang: Oh, that’s just like I just came up with.
172 00:26:14.400 ⇒ 00:26:15.080 Uttam Kumaran: I like that.
173 00:26:15.080 ⇒ 00:26:15.655 Hannah Wang: AI!
174 00:26:16.230 ⇒ 00:26:18.009 Uttam Kumaran: Worry free kickoff.
175 00:26:19.890 ⇒ 00:26:20.810 Uttam Kumaran: Yeah.
176 00:26:20.940 ⇒ 00:26:22.389 Uttam Kumaran: So this. But like.
177 00:26:23.130 ⇒ 00:26:27.290 Uttam Kumaran: I think some people basically said that you want like they, they said you should. We should
178 00:26:27.510 ⇒ 00:26:32.960 Uttam Kumaran: either offer like, yeah, like a kickoff like this, or like an ex what’s called like an accelerator like A,
179 00:26:33.600 ⇒ 00:26:37.039 Uttam Kumaran: we offer our like patented AI accelerator.
180 00:26:37.310 ⇒ 00:26:37.760 Hannah Wang: Hmm.
181 00:26:37.880 ⇒ 00:26:40.110 Uttam Kumaran: Or yeah.
182 00:26:45.630 ⇒ 00:26:48.600 Uttam Kumaran: So maybe we can. Let’s just note down some like, open.
183 00:26:48.970 ⇒ 00:26:50.890 Uttam Kumaran: Maybe. Let’s note down like.
184 00:26:58.150 ⇒ 00:27:00.150 Uttam Kumaran: so this is our
185 00:27:13.940 ⇒ 00:27:23.990 Uttam Kumaran: open questions are like, for what should our agreement plan the call?
186 00:27:35.730 ⇒ 00:27:45.609 Uttam Kumaran: Okay? So then, process plan. Tell us what’s broken. We’ll fix it and clean the Miss Schedule call auto your setup, execute with your team. Okay, that’s pretty good.
187 00:27:47.130 ⇒ 00:27:47.920 Uttam Kumaran: Alright.
188 00:27:49.140 ⇒ 00:27:55.130 Uttam Kumaran: So what do we have? Okay, we have chaos to clarity the clear start plan chaos to clarity plan
189 00:27:58.150 ⇒ 00:27:59.530 Uttam Kumaran: schedule a call.
190 00:28:00.040 ⇒ 00:28:01.319 Uttam Kumaran: We audit.
191 00:28:04.160 ⇒ 00:28:08.159 Uttam Kumaran: You get a customized plan and we execute their team.
192 00:28:08.850 ⇒ 00:28:10.900 Uttam Kumaran: Yeah. So I kind of do like,
193 00:28:19.420 ⇒ 00:28:23.390 Uttam Kumaran: I kind of like this
194 00:28:27.330 ⇒ 00:28:33.099 Uttam Kumaran: oops like this is basically the entire plan. Right? So this is a 5,
195 00:28:33.700 ⇒ 00:28:37.790 Uttam Kumaran: 6, 6. So this is a 4, 5, right?
196 00:28:39.110 ⇒ 00:28:39.600 Uttam Kumaran: Yeah.
197 00:28:41.420 ⇒ 00:28:51.569 Hannah Wang: Yeah, although the book did mention like keeping it keeping the steps minimal. Because if you put too many steps, it’s a it’s a lot. It’s like overwhelming.
198 00:28:52.440 ⇒ 00:28:56.770 Hannah Wang: but I think it said 3 to 5, but forgot.
199 00:28:56.920 ⇒ 00:29:00.150 Hannah Wang: I feel like 3 is the the sweet, sweet spot.
200 00:29:02.820 ⇒ 00:29:08.140 Amber Lin: I’m looking at. Step 3 to 5. They can be one step.
201 00:29:11.380 ⇒ 00:29:12.849 Uttam Kumaran: This is chapter 4.
202 00:29:15.653 ⇒ 00:29:17.660 Hannah Wang: Oh, yeah.
203 00:29:17.660 ⇒ 00:29:19.519 Uttam Kumaran: Character. No, no character.
204 00:29:20.440 ⇒ 00:29:23.060 Uttam Kumaran: Oh, and cool
205 00:29:26.850 ⇒ 00:29:29.200 Uttam Kumaran: meets a guy. Oh, okay, this is, give some suggestion.
206 00:29:29.200 ⇒ 00:29:29.980 Uttam Kumaran: 7.
207 00:29:31.590 ⇒ 00:29:33.410 Hannah Wang: Yeah, yeah, yeah.
208 00:29:35.270 ⇒ 00:29:43.729 Uttam Kumaran: So process, plan, schedule appointment, create the plan, execute the plan together, download software, integrate your database, revolutionize interaction.
209 00:29:44.650 ⇒ 00:29:49.690 Uttam Kumaran: Okay, so, but what is the difference. Oh, okay.
210 00:29:51.040 ⇒ 00:29:54.509 Uttam Kumaran: what is the I still don’t get? What is the difference in agreement plan?
211 00:29:56.370 ⇒ 00:29:57.760 Uttam Kumaran: Oh, okay, never mind.
212 00:29:57.760 ⇒ 00:30:05.088 Amber Lin: Yeah, I was thinking about the agreement plan essentially promises we make. So essentially saying, we won’t.
213 00:30:05.850 ⇒ 00:30:10.730 Amber Lin: We’re not biased towards certain vendors. That’s a promise or agreement.
214 00:30:11.050 ⇒ 00:30:16.430 Amber Lin: or that promise that we will go from solving your business problems
215 00:30:17.150 ⇒ 00:30:20.320 Amber Lin: like that type. I think those are promises and agreements.
216 00:30:23.310 ⇒ 00:30:26.439 Uttam Kumaran: Okay, so how about this?
217 00:30:27.750 ⇒ 00:30:28.880 Uttam Kumaran: I guess
218 00:30:33.230 ⇒ 00:30:34.500 Uttam Kumaran: schedule call.
219 00:30:35.350 ⇒ 00:30:37.870 Uttam Kumaran: That’s create their plan.
220 00:30:38.840 ⇒ 00:30:43.120 Uttam Kumaran: And then we that you execute right.
221 00:30:43.390 ⇒ 00:30:43.840 Hannah Wang: Yeah.
222 00:30:47.870 ⇒ 00:30:54.030 Uttam Kumaran: I mean, we can. We can. Yeah, so kind of what’s unique here is gonna be like things like this, like, what what’s in these plans.
223 00:30:55.070 ⇒ 00:30:57.549 Uttam Kumaran: So that’s an open question, too, is like.
224 00:30:58.270 ⇒ 00:31:01.720 Uttam Kumaran: what is what is in each of the
225 00:31:04.890 ⇒ 00:31:06.599 Uttam Kumaran: bunch of people on Capitol Hill?
226 00:31:09.340 ⇒ 00:31:12.248 Uttam Kumaran: What is in the process.
227 00:31:29.110 ⇒ 00:31:35.233 Uttam Kumaran: Okay, so that’s these. Okay, it makes sense. So I’m just going to.
228 00:31:38.770 ⇒ 00:31:41.450 Uttam Kumaran: I’ll leave these. But I think this is fine.
229 00:31:41.680 ⇒ 00:31:42.400 Hannah Wang: Yeah.
230 00:32:01.950 ⇒ 00:32:08.959 Uttam Kumaran: Okay? And then, so let’s so I’m working kind of backwards. But I think it’s helpful because it gets better as it goes up. So
231 00:32:10.812 ⇒ 00:32:17.260 Uttam Kumaran: so empathy was, we know what it’s like to make big calls. We felt the stress.
232 00:32:17.820 ⇒ 00:32:20.620 Uttam Kumaran: okay, so this is probably good. Yeah.
233 00:32:25.430 ⇒ 00:32:26.100 Uttam Kumaran: okay.
234 00:32:33.810 ⇒ 00:32:39.600 Uttam Kumaran: so we’re this is what we’re building is the brand script. So this is the brand script right.
235 00:32:41.096 ⇒ 00:32:42.959 Hannah Wang: Yes, I think so.
236 00:32:45.790 ⇒ 00:32:48.230 Uttam Kumaran: That’s why keep the book keeps saying that word.
237 00:32:49.560 ⇒ 00:32:52.830 Uttam Kumaran: So client logos, case study certifications.
238 00:32:58.330 ⇒ 00:32:59.000 Uttam Kumaran: Okay.
239 00:33:11.139 ⇒ 00:33:34.509 Uttam Kumaran: percent of engineers on the team. And then maybe percent of engineers on the team. Okay, yeah. And then, yeah, we we don’t take kickbacks. Oh, so this is authority.
240 00:33:35.150 ⇒ 00:33:41.102 Uttam Kumaran: But like, what is competency and authority. So those are 2 different ones.
241 00:33:41.940 ⇒ 00:33:46.709 Hannah Wang: I think in v. 2, he changed the word to competency from authority.
242 00:33:46.770 ⇒ 00:33:47.730 Hannah Wang: Yeah.
243 00:33:48.210 ⇒ 00:33:49.140 Uttam Kumaran: 5.
244 00:33:50.750 ⇒ 00:34:03.065 Uttam Kumaran: Okay, okay? So I’m just gonna say, competency authority, we’re data operators percent. We don’t take kickbacks.
245 00:34:04.725 ⇒ 00:34:08.895 Uttam Kumaran: Oh, so that’s not an empathy.
246 00:34:09.989 ⇒ 00:34:15.500 Uttam Kumaran: okay. But then still, I think we can put this here right
247 00:34:17.929 ⇒ 00:34:21.910 Uttam Kumaran: data operators, client Logos, we know. Okay, that’s fine.
248 00:34:22.040 ⇒ 00:34:29.399 Uttam Kumaran: So I can’t get the data. I need to make decisions that drive. Okay, I feel frustrated and exposed making.
249 00:34:29.639 ⇒ 00:34:31.850 Uttam Kumaran: I mean, it kind of did this again, which is good.
250 00:34:32.929 ⇒ 00:34:39.050 Uttam Kumaran: right? So I feel frustrated, exposed, making high stakes decisions without knowing the data is right?
251 00:34:40.409 ⇒ 00:34:52.050 Uttam Kumaran: Recently, leadership. So it’s still, I think our problems are like, kind of both, like more data heavy.
252 00:34:52.739 ⇒ 00:34:57.489 Hannah Wang: Yeah, what was your prompt like? Can you scroll up a little bit.
253 00:34:58.120 ⇒ 00:34:59.920 Uttam Kumaran: Oh! I that.
254 00:34:59.920 ⇒ 00:35:03.760 Hannah Wang: Oh, okay, did you tell it like, try to combine AI.
255 00:35:03.760 ⇒ 00:35:05.879 Uttam Kumaran: Well, I think that’s what we’ll do next.
256 00:35:05.880 ⇒ 00:35:06.310 Hannah Wang: Okay.
257 00:35:06.310 ⇒ 00:35:10.639 Uttam Kumaran: It looks it, says he. It it found this out from our conversation, where.
258 00:35:11.100 ⇒ 00:35:13.819 Uttam Kumaran: Central attention is how to unify data. And AI,
259 00:35:14.982 ⇒ 00:35:18.429 Uttam Kumaran: the other thing that I want to
260 00:35:19.680 ⇒ 00:35:30.476 Uttam Kumaran: add to the meets a guide piece is. I want to add
261 00:35:33.580 ⇒ 00:35:39.700 Uttam Kumaran: one thing that’s unique about offer.
262 00:35:41.256 ⇒ 00:35:50.520 Uttam Kumaran: Basically fully deployed AI other data.
263 00:35:51.230 ⇒ 00:36:03.360 Uttam Kumaran: I love automation project managers to the and between.
264 00:36:03.980 ⇒ 00:36:05.110 Uttam Kumaran: And how many cents.
265 00:36:05.940 ⇒ 00:36:10.393 Uttam Kumaran: Right? So one of the things is like, you’re almost bringing us on to like
266 00:36:10.940 ⇒ 00:36:19.529 Uttam Kumaran: bringing us on to Spearhead. A new initiative, not just hiring.
267 00:36:20.120 ⇒ 00:36:31.684 Uttam Kumaran: just hiring more staffing resources. Okay, so yeah, okay, so
268 00:36:35.050 ⇒ 00:36:39.919 Uttam Kumaran: so here, it’s a senior leaders, mid market tech enabled companies.
269 00:36:40.660 ⇒ 00:36:50.110 Uttam Kumaran: I think that’s pretty fair wants clarity and control, and a reputation for leading.
270 00:36:51.110 ⇒ 00:36:56.590 Uttam Kumaran: I don’t know if we need to put that so, Zack, one level, if you see us trying to prove value grow influence.
271 00:36:57.260 ⇒ 00:36:59.489 Uttam Kumaran: I’m just gonna put these here.
272 00:37:00.740 ⇒ 00:37:01.520 Hannah Wang: Okay.
273 00:37:04.670 ⇒ 00:37:06.380 Uttam Kumaran: The system side.
274 00:37:17.630 ⇒ 00:37:26.569 Uttam Kumaran: Okay? So then let’s see, it gave us a couple of other things. So one is, yeah. Write our final brand script. Okay, cool. Decide on the villain metaphor. So
275 00:37:26.910 ⇒ 00:37:33.429 Uttam Kumaran: I think I mean, I still think both of these are good villains.
276 00:37:35.760 ⇒ 00:37:50.760 Uttam Kumaran: This is what makes it really hard for me to decide, because we are going to clients where they don’t have broken systems. They do, just don’t they? Just missing like sort of someone with the knowledge to come in and execute.
277 00:37:51.540 ⇒ 00:37:57.089 Uttam Kumaran: There are some people, though, with broken systems. So I wonder if if that’s where we should
278 00:37:58.180 ⇒ 00:38:00.249 Uttam Kumaran: have 2 or like
279 00:38:00.690 ⇒ 00:38:08.079 Uttam Kumaran: I don’t know. I mean, the broken systems may ultimately be because of the absent owner.
280 00:38:08.440 ⇒ 00:38:09.270 Hannah Wang: Yeah.
281 00:38:09.270 ⇒ 00:38:10.470 Uttam Kumaran: So it’s a result.
282 00:38:10.720 ⇒ 00:38:11.670 Hannah Wang: Right.
283 00:38:12.480 ⇒ 00:38:14.770 Uttam Kumaran: So let’s maybe I’ll write that down.
284 00:38:14.770 ⇒ 00:38:18.389 Hannah Wang: The broken system can be like an external problem.
285 00:38:22.420 ⇒ 00:38:34.980 Uttam Kumaran: So if I was to put this here compensation, broken system is typically
286 00:38:40.155 ⇒ 00:38:40.550 Uttam Kumaran: few
287 00:38:55.410 ⇒ 00:39:03.679 Uttam Kumaran: broken system issue that arises.
288 00:39:04.300 ⇒ 00:39:10.140 Uttam Kumaran: Question mark, okay, it is saying, it does feel more visceral visual.
289 00:39:10.940 ⇒ 00:39:12.293 Uttam Kumaran: I agree.
290 00:39:15.240 ⇒ 00:39:23.280 Uttam Kumaran: I just yeah, I feel like that is gonna be something that people bring up. Although sometimes people don’t have broken systems.
291 00:39:23.700 ⇒ 00:39:27.220 Uttam Kumaran: That they would blame, they would. They would just say, they don’t have the knowledge
292 00:39:27.360 ⇒ 00:39:36.320 Uttam Kumaran: like we have some leads. Clients just lack the knowledge already.
293 00:39:39.160 ⇒ 00:39:42.689 Uttam Kumaran: And then, yeah, let’s talk about the data versus AI positioning.
294 00:39:45.425 ⇒ 00:39:52.000 Uttam Kumaran: So yeah, I do think that it’s something like this closer.
295 00:39:52.150 ⇒ 00:40:02.680 Uttam Kumaran: like, I had a meeting today where basically, I said, like is commonly the easiest part
296 00:40:04.720 ⇒ 00:40:15.829 Uttam Kumaran: ensure. You have all the right context for the at the right time in the right tool like slack
297 00:40:17.180 ⇒ 00:40:25.309 Uttam Kumaran: for all of your numbers to use is the harder issue
298 00:40:25.910 ⇒ 00:40:28.060 Uttam Kumaran: said that today, 30 min ago.
299 00:40:28.360 ⇒ 00:40:28.800 Hannah Wang: Hmm.
300 00:40:29.635 ⇒ 00:40:35.480 Uttam Kumaran: I think this is good. To collapse is like a really nice strong word.
301 00:40:36.400 ⇒ 00:40:46.209 Uttam Kumaran: and it really pushes the data piece forward right without. And what is like. So like, what is a right data foundation
302 00:40:46.750 ⇒ 00:40:58.370 Uttam Kumaran: that is like centralized centralized source for all data, data, observability.
303 00:40:59.550 ⇒ 00:41:10.140 Uttam Kumaran: Kpi definitions, owners of owners of business area definitions.
304 00:41:10.810 ⇒ 00:41:13.970 Uttam Kumaran: This is for data models.
305 00:41:16.290 ⇒ 00:41:18.269 Uttam Kumaran: You know, things like that. So we have it.
306 00:41:18.700 ⇒ 00:41:22.979 Uttam Kumaran: We have, like what our definition of a right data foundation is this is like.
307 00:41:23.140 ⇒ 00:41:25.499 Uttam Kumaran: this is our like that diagram, basically.
308 00:41:26.150 ⇒ 00:41:28.542 Hannah Wang: In the flow. Diagram.
309 00:41:30.260 ⇒ 00:41:31.800 Uttam Kumaran: Test one liner
310 00:41:33.650 ⇒ 00:41:39.030 Uttam Kumaran: you mentioned this, consider sharper. We clean up the data mess. So you can make confident decisions.
311 00:41:39.860 ⇒ 00:41:42.139 Uttam Kumaran: get the clarity and systems you need.
312 00:41:43.240 ⇒ 00:41:47.900 Uttam Kumaran: Yeah. So it’s like what are like. And then I do agree, I think we need to understand, like.
313 00:41:48.070 ⇒ 00:41:50.380 Uttam Kumaran: what are our blurbs? Right?
314 00:41:50.950 ⇒ 00:41:52.190 Uttam Kumaran: The one liner.
315 00:41:52.330 ⇒ 00:41:57.060 Uttam Kumaran: What is the what is the 2 to what is the 2 to 3. Liner?
316 00:41:57.230 ⇒ 00:41:59.489 Uttam Kumaran: What is the paragraph? Right?
317 00:42:01.700 ⇒ 00:42:09.069 Uttam Kumaran: What’s what’s in each of these that isn’t in the other ones, right? Like above. And case studies, maybe
318 00:42:09.770 ⇒ 00:42:19.189 Uttam Kumaran: wins data points like, I just need to think about like what’s in this versus what’s not.
319 00:42:19.900 ⇒ 00:42:21.970 Uttam Kumaran: So maybe let me put. Let me put a
320 00:42:22.210 ⇒ 00:42:25.929 Uttam Kumaran: let me just paste these questions and any other questions that you guys think we still have.
321 00:42:27.570 ⇒ 00:42:30.850 Uttam Kumaran: that I can paste in, and we can work through.
322 00:42:37.550 ⇒ 00:42:42.740 Hannah Wang: no, I feel like the main thing for me is the data versus AI thing.
323 00:42:43.420 ⇒ 00:42:51.160 Hannah Wang: Because that’s yeah. That’s that’s how we’re gonna target the right people and clients and draw them in. So
324 00:42:51.800 ⇒ 00:42:53.730 Hannah Wang: think that’s the main one for me.
325 00:43:01.020 ⇒ 00:43:06.919 Amber Lin: Once you’ve run it through, I’ll take a look, and I’ll put any comments that I have.
326 00:43:08.480 ⇒ 00:43:10.039 Amber Lin: or questions I have.
327 00:43:10.850 ⇒ 00:43:12.140 Uttam Kumaran: Do you have a lot more?
328 00:43:20.200 ⇒ 00:43:23.460 Uttam Kumaran: So I’m gonna just paste these.
329 00:43:25.170 ⇒ 00:43:27.310 Uttam Kumaran: So that’s what let’s see.
330 00:43:32.190 ⇒ 00:43:35.950 Uttam Kumaran: Well, here first, st I’m gonna say, this is our branch script so far.
331 00:44:35.380 ⇒ 00:44:36.200 Uttam Kumaran: Let’s see.
332 00:44:37.470 ⇒ 00:44:38.130 Uttam Kumaran: Yeah.
333 00:44:55.130 ⇒ 00:44:55.860 Uttam Kumaran: Oh.
334 00:45:10.230 ⇒ 00:45:11.030 Uttam Kumaran: yeah.
335 00:45:29.510 ⇒ 00:45:38.779 Uttam Kumaran: Okay. So okay, what should our agreement plan call confidence guarantee, worry, free launch agreement, fast track guarantee.
336 00:45:44.600 ⇒ 00:45:45.260 Uttam Kumaran: Let’s see.
337 00:45:53.470 ⇒ 00:45:56.969 Uttam Kumaran: Use the confidence, guarantee and define what it covers.
338 00:45:57.870 ⇒ 00:45:59.719 Amber Lin: I like the confidence guarantee.
339 00:46:00.570 ⇒ 00:46:00.960 Hannah Wang: Hmm.
340 00:46:00.960 ⇒ 00:46:02.568 Uttam Kumaran: Okay, cool. So let me
341 00:46:05.200 ⇒ 00:46:07.960 Uttam Kumaran: I’m just gonna this here.
342 00:46:10.840 ⇒ 00:46:15.090 Uttam Kumaran: So I’m gonna say, let’s say.
343 00:46:18.110 ⇒ 00:46:28.200 Uttam Kumaran: 30 day or 14 day confidence guarantee
344 00:46:28.560 ⇒ 00:46:33.010 Uttam Kumaran: something like that and then define what it covers.
345 00:46:38.650 ⇒ 00:46:39.340 Uttam Kumaran: Yeah.
346 00:46:43.140 ⇒ 00:46:47.860 Uttam Kumaran: what’s in the process and agreement plan specific to your engagement
347 00:46:49.320 ⇒ 00:46:55.569 Uttam Kumaran: process plan. Diagnose the chaos. He tells us, broken weakness develop a tailored AI plan.
348 00:47:11.050 ⇒ 00:47:15.050 Uttam Kumaran: Okay, okay, so that’s our agreement plan.
349 00:47:22.750 ⇒ 00:47:24.750 Uttam Kumaran: So this is the.
350 00:47:25.540 ⇒ 00:47:29.720 Uttam Kumaran: oh, okay. So this is from the chaos. Oh, I like that chaos. The clarity plan.
351 00:47:38.080 ⇒ 00:47:40.170 Hannah Wang: Can I delete our old answers.
352 00:47:40.954 ⇒ 00:47:42.900 Uttam Kumaran: Yeah, that’s fine. We’ll just like.
353 00:47:42.900 ⇒ 00:47:44.140 Hannah Wang: Okay, or like.
354 00:47:44.140 ⇒ 00:47:51.980 Uttam Kumaran: If there’s yeah, or just deleted if we’re gonna override it. But just like some things, yeah.
355 00:47:52.283 ⇒ 00:47:55.619 Hannah Wang: I like chaos to clarity. That’s kind of what I brainstormed
356 00:47:56.060 ⇒ 00:48:01.259 Hannah Wang: as well. So I think it just took from what I had on the on the notion, Doc.
357 00:48:24.300 ⇒ 00:48:27.140 Uttam Kumaran: I think I like this these 3 steps.
358 00:48:28.000 ⇒ 00:48:28.670 Hannah Wang: Hmm.
359 00:48:29.400 ⇒ 00:48:29.980 Uttam Kumaran: Right.
360 00:48:31.220 ⇒ 00:48:34.640 Hannah Wang: Diagnose the chaos designed to fix.
361 00:48:36.250 ⇒ 00:48:40.720 Hannah Wang: I kind of like how it all starts with DI kind of like that alliteration.
362 00:48:40.920 ⇒ 00:48:41.460 Uttam Kumaran: Okay.
363 00:48:41.460 ⇒ 00:48:42.370 Hannah Wang: Yeah.
364 00:48:43.320 ⇒ 00:48:46.669 Uttam Kumaran: Diagnose the chaos design. The fix deploy. Yeah.
365 00:48:48.720 ⇒ 00:48:52.210 Amber Lin: We can call it a 3D. Plan from chaos to clarity
366 00:48:52.910 ⇒ 00:48:54.870 Amber Lin: really want to make it work. I like it.
367 00:48:54.870 ⇒ 00:48:55.510 Uttam Kumaran: I like it.
368 00:48:55.510 ⇒ 00:48:56.040 Amber Lin: Yeah.
369 00:48:56.410 ⇒ 00:48:57.550 Uttam Kumaran: For consulting. Yeah.
370 00:48:58.745 ⇒ 00:49:04.560 Uttam Kumaran: So what what materials do we need? Updated to create as a supplement? Okay.
371 00:49:05.134 ⇒ 00:49:09.684 Uttam Kumaran: visual aids from classic clarity? Yep, okay, that makes sense
372 00:49:13.670 ⇒ 00:49:14.560 Uttam Kumaran: hardwired.
373 00:49:35.340 ⇒ 00:49:39.030 Uttam Kumaran: And then we have these right.
374 00:49:40.470 ⇒ 00:49:42.920 Hannah Wang: Before after use case page.
375 00:49:49.120 ⇒ 00:49:51.910 Hannah Wang: I’m trying to think.
376 00:49:53.260 ⇒ 00:49:55.320 Uttam Kumaran: Case studies. And we have both of these.
377 00:49:56.860 ⇒ 00:49:59.699 Hannah Wang: Oh, oh, oh, oh, yeah. Case. Study.
378 00:50:00.210 ⇒ 00:50:02.929 Uttam Kumaran: This is just a sample sow we got this.
379 00:50:05.060 ⇒ 00:50:08.429 Uttam Kumaran: I mean my only suggestion here is we.
380 00:50:09.640 ⇒ 00:50:17.019 Uttam Kumaran: We should make an of the the urban stems plan.
381 00:50:19.700 ⇒ 00:50:28.740 Uttam Kumaran: I like this unique brain forge position. We’re not staff. Aug. We are your data. AI task force task force kind of military. I’m not sure I like that. But
382 00:50:33.240 ⇒ 00:50:35.699 Uttam Kumaran: why does this keep doing this?
383 00:50:49.390 ⇒ 00:50:52.119 Uttam Kumaran: Yes, this is our competition.
384 00:50:52.810 ⇒ 00:50:54.436 Uttam Kumaran: I agree.
385 00:50:58.530 ⇒ 00:50:59.230 Uttam Kumaran: Okay.
386 00:50:59.750 ⇒ 00:51:01.300 Hannah Wang: What’s slideware.
387 00:51:02.300 ⇒ 00:51:08.709 Uttam Kumaran: A slide where is like just you do proof of concept, or you do like a demo.
388 00:51:08.930 ⇒ 00:51:09.440 Hannah Wang: I sort of.
389 00:51:09.440 ⇒ 00:51:10.400 Uttam Kumaran: Eyes there.
390 00:51:10.400 ⇒ 00:51:10.930 Hannah Wang: I see.
391 00:51:10.930 ⇒ 00:51:11.989 Uttam Kumaran: Like. It’s not real.
392 00:51:12.350 ⇒ 00:51:13.010 Hannah Wang: Hmm.
393 00:51:14.130 ⇒ 00:51:17.869 Uttam Kumaran: Like, basically, oh, yeah, we just show a slide of like something we could do. And it’s like.
394 00:51:18.540 ⇒ 00:51:20.160 Uttam Kumaran: doesn’t end up actually working.
395 00:51:20.440 ⇒ 00:51:21.070 Hannah Wang: Hmm.
396 00:51:22.620 ⇒ 00:51:24.290 Uttam Kumaran: 3 or 4. Actually.
397 00:51:31.240 ⇒ 00:51:37.180 Uttam Kumaran: we’re not selling you a developer. Yeah, I kind of like this, although it’s kind of derogatory. But I like it.
398 00:51:38.940 ⇒ 00:51:45.079 Uttam Kumaran: I hate. When people say, like, how many did that like, I really dislike that when people are like, how many devs you have or like
399 00:51:46.070 ⇒ 00:51:49.099 Uttam Kumaran: developers? Yeah, I really like, think they’re like
400 00:51:49.710 ⇒ 00:51:59.836 Uttam Kumaran: talking about people in the wrong way, so I probably won’t use. I probably won’t. I don’t know if you guys are, for I’m down, but I just don’t like when people talk about people like that.
401 00:52:00.290 ⇒ 00:52:00.710 Amber Lin: And on.
402 00:52:00.710 ⇒ 00:52:06.080 Uttam Kumaran: Because people will come to me and they’ll be like, how many devs you have and shit like that? I’m like dude. They have names. And they’re humans. And they’re
403 00:52:06.560 ⇒ 00:52:08.650 Uttam Kumaran: team. So like.
404 00:52:08.650 ⇒ 00:52:11.359 Amber Lin: It doesn’t affect the quality of work we deliver.
405 00:52:12.430 ⇒ 00:52:17.460 Uttam Kumaran: It’s also I don’t like when people like. Oh, I have a hundred people here. I just don’t like that. It’s like not a good way to talk about people.
406 00:52:17.460 ⇒ 00:52:17.860 Amber Lin: Perfect.
407 00:52:17.860 ⇒ 00:52:21.589 Uttam Kumaran: I have a hundred people here like with 30. It’s like.
408 00:52:21.590 ⇒ 00:52:33.170 Amber Lin: That should be one of our values that we say on the website that will attract the people who’s resonating with that value. I think that’s a very unique and important way, and we should write that down.
409 00:52:33.360 ⇒ 00:52:36.079 Uttam Kumaran: Yeah, I mean, maybe we should say, like we don’t.
410 00:52:37.050 ⇒ 00:52:42.829 Uttam Kumaran: Hire like devs like we hire people, or something like that.
411 00:52:42.830 ⇒ 00:52:43.250 Hannah Wang: Hmm.
412 00:52:46.220 ⇒ 00:52:48.180 Uttam Kumaran: Background style or something like that.
413 00:52:48.800 ⇒ 00:52:51.450 Uttam Kumaran: But yes, that’s that could be good.
414 00:52:53.140 ⇒ 00:53:03.909 Uttam Kumaran: I mean, yeah, something like that could be good. Is broken villains, the villain, or the symptom. You’re right. Broken symptom is often a broken system is a symptom of no owner.
415 00:53:04.060 ⇒ 00:53:14.090 Uttam Kumaran: The ownership vacuum, the Frankenstack, the slideware factory. So lead with broken systems. But explain, it’s caused by a lack of ownership.
416 00:53:16.590 ⇒ 00:53:17.779 Uttam Kumaran: I like this.
417 00:53:18.620 ⇒ 00:53:19.170 Hannah Wang: And.
418 00:53:19.890 ⇒ 00:53:20.979 Uttam Kumaran: What do you all think.
419 00:53:20.980 ⇒ 00:53:50.819 Amber Lin: Yeah. And I really think this is where we can show off the framework we have. This is where we show off that we have this unique approach of how we look at things. And we don’t need to like hide this as if it’s IP, because, like people can know these frameworks. But it’s about the implementation. So we can put down how we diagnose things where we think the the main 3 categories of reasons are. And we can just put that straight out, and people will see the expertise in that.
420 00:53:52.110 ⇒ 00:53:52.930 Uttam Kumaran: Yeah.
421 00:53:53.250 ⇒ 00:53:58.139 Uttam Kumaran: I actually, I actually like this, because it’s like, Look, we know that you’re probably feeling the broken system.
422 00:53:58.250 ⇒ 00:54:00.970 Uttam Kumaran: But most likely it’s like this.
423 00:54:03.090 ⇒ 00:54:07.872 Uttam Kumaran: and we can tweak the language. But as long as we get, if we’re like, this is crisp, I feel like
424 00:54:09.150 ⇒ 00:54:12.630 Uttam Kumaran: like we can get AI to make this thinner or longer, or whatever
425 00:54:12.830 ⇒ 00:54:27.939 Uttam Kumaran: data versus AI. How should we position them? Your instincts are dead on best angle. We make AI useful by starting where it matters your data frame. AI is the outcome. AI is easy getting the right data to the right place. That’s the hard part
426 00:54:28.750 ⇒ 00:54:31.939 Uttam Kumaran: we help you build that foundation. So your AI actually works.
427 00:54:35.510 ⇒ 00:54:43.799 Hannah Wang: And that makes it kind of sound like AI is the end goal. But we have some clients where it’s just data, right? And not the AI part.
428 00:54:43.800 ⇒ 00:54:55.970 Uttam Kumaran: That’s like a that’s a it’s a great question, because that’s not. That’s what our current state is, not where we want to be. Right. So this is where, like our brand, may be different than what we, what we end up selling.
429 00:54:55.970 ⇒ 00:54:56.490 Hannah Wang: Okay.
430 00:54:56.490 ⇒ 00:54:59.350 Uttam Kumaran: As a brand is what we want to project to. People like, yeah.
431 00:54:59.350 ⇒ 00:55:02.840 Uttam Kumaran: comes to me and is like, we don’t want any AI at all. We’ll still do the work
432 00:55:03.250 ⇒ 00:55:03.630 Uttam Kumaran: that’s different.
433 00:55:03.630 ⇒ 00:55:05.529 Uttam Kumaran: And then what our brand projects.
434 00:55:05.530 ⇒ 00:55:08.499 Uttam Kumaran: Yeah, eventually, we may not take on that type of stuff.
435 00:55:08.670 ⇒ 00:55:19.570 Amber Lin: Yeah, I think on that. I think the goal is not AI, but what that will help us achieve. Right? What we’re doing for ABC. It’s not about giving them an AI system, but that
436 00:55:19.820 ⇒ 00:55:30.520 Amber Lin: this will help them do things faster. This will eliminate like churn and excess time. So I think.
437 00:55:33.552 ⇒ 00:55:37.410 Amber Lin: How do we incorporate that
438 00:55:38.690 ⇒ 00:55:44.480 Amber Lin: as a like as an introduction to why, we aim for AI like.
439 00:55:44.920 ⇒ 00:55:45.295 Hannah Wang: Hmm.
440 00:55:52.310 ⇒ 00:55:53.330 Uttam Kumaran: Yeah, I agree.
441 00:55:53.330 ⇒ 00:56:02.580 Amber Lin: Yeah cause. Ultimately, I think both for our data service and our AI service, we’re trying to achieve the same thing, a fast of supported decisions and improve business outcomes.
442 00:56:04.100 ⇒ 00:56:05.999 Hannah Wang: Hmm! I see.
443 00:56:08.220 ⇒ 00:56:11.009 Amber Lin: So I see as a as the same thing.
444 00:56:19.630 ⇒ 00:56:21.769 Amber Lin: No! Oh, I like the pyramid.
445 00:56:21.950 ⇒ 00:56:22.630 Amber Lin: Oh, you already.
446 00:56:22.630 ⇒ 00:56:24.030 Uttam Kumaran: Sort of a graph like this.
447 00:56:24.350 ⇒ 00:56:26.340 Uttam Kumaran: This is kind of similar to the graph.
448 00:56:26.560 ⇒ 00:56:28.129 Uttam Kumaran: We can make a pyramid.
449 00:56:28.420 ⇒ 00:56:32.450 Amber Lin: Yeah, we should tie it to like ultimately, what this is achieving.
450 00:56:32.490 ⇒ 00:56:33.419 Amber Lin: Yeah, I agree.
451 00:56:33.420 ⇒ 00:56:51.010 Amber Lin: And and then it wouldn’t matter if they don’t want. Like specific parts of these, we can say, Hey, you’re not maximizing your potential if you have all these data. But you don’t utilize this new tool, which is AI to help you achieve that capacity.
452 00:57:06.010 ⇒ 00:57:06.810 Uttam Kumaran: So
453 00:57:27.280 ⇒ 00:57:33.280 Uttam Kumaran: so I’m just gonna ask that, I’m gonna say, ultimately, just like data.
454 00:57:36.550 ⇒ 00:57:44.310 Uttam Kumaran: AI, our goal is revenue growth and saving time.
455 00:57:45.550 ⇒ 00:57:47.890 Uttam Kumaran: That’s AI for
456 00:57:51.610 ⇒ 00:58:03.934 Uttam Kumaran: ability to see real Roi and to build production grade systems. Not just
457 00:58:07.260 ⇒ 00:58:07.960 Uttam Kumaran: yeah.
458 00:58:13.750 ⇒ 00:58:14.480 Uttam Kumaran: yeah.
459 00:58:16.330 ⇒ 00:58:17.180 Uttam Kumaran: X,
460 00:59:10.280 ⇒ 00:59:11.440 Uttam Kumaran: I like this one.
461 00:59:20.450 ⇒ 00:59:21.650 Hannah Wang: AI!
462 00:59:26.460 ⇒ 00:59:28.379 Uttam Kumaran: Or it could be AI that pays for itself.
463 00:59:28.810 ⇒ 00:59:29.270 Hannah Wang: Yeah.
464 00:59:53.410 ⇒ 00:59:54.370 Uttam Kumaran: Like this one.
465 01:00:11.950 ⇒ 01:00:13.339 Uttam Kumaran: I like this, too.
466 01:00:22.060 ⇒ 01:00:22.870 Uttam Kumaran: Okay.
467 01:00:35.140 ⇒ 01:00:38.400 Uttam Kumaran: so okay, now, we’re at the hour. So
468 01:00:40.280 ⇒ 01:00:45.070 Uttam Kumaran: I feel pretty good. Now that we can actually probably lock some of these in
469 01:00:46.060 ⇒ 01:00:49.610 Uttam Kumaran: like, what do you think is the best thing for us to do? Should I? Just
470 01:00:51.300 ⇒ 01:00:53.830 Uttam Kumaran: so we just go to go one by one, and I can
471 01:00:54.030 ⇒ 01:00:57.830 Uttam Kumaran: pass at this and have it work on it like, what do you think is best.
472 01:00:58.630 ⇒ 01:01:02.760 Amber Lin: Do you mean internally as a team, or to test this with external stakeholders?
473 01:01:03.112 ⇒ 01:01:10.530 Uttam Kumaran: No, I mean, like, I kind of want to basically finish this brand script like I want to have like a locked version.
474 01:01:10.530 ⇒ 01:01:11.250 Amber Lin: Okay.
475 01:01:11.390 ⇒ 01:01:21.479 Uttam Kumaran: Then what we will, there’s a couple of next steps. One is the next sort of session for us to do is to walk through our homepage and walk through
476 01:01:21.830 ⇒ 01:01:33.269 Uttam Kumaran: like are the core set of documents that we send to all the people. And one have AI basic. I’ll what I’ll do is once we have the brand script, I’ll create a brand script reviewer, reviewer, prompt.
477 01:01:33.610 ⇒ 01:01:42.039 Uttam Kumaran: Or basically, I’ll modify the other sales. Copy, prompt that we have, and I’ll have it review all of our stuff. And then we we now look at it through this lens.
478 01:01:42.900 ⇒ 01:01:53.530 Amber Lin: awesome. I I think personally, I think we could each have a homework of to record a short video as, if we’re introducing our brand
479 01:01:53.670 ⇒ 01:02:08.359 Amber Lin: like that will force us to put this in words and put this in the clear format, and I want to do it anyways, and that will let us see what is missing in this script, and we can each review each other’s videos of walk like talking about who we are.
480 01:02:08.660 ⇒ 01:02:09.630 Amber Lin: How’s that.
481 01:02:11.020 ⇒ 01:02:12.350 Uttam Kumaran: I’m fine with that like.
482 01:02:12.770 ⇒ 01:02:18.500 Uttam Kumaran: is there? Is there like a thing that the Brand Script book tells us to do, or just research record walking through the whole thing.
483 01:02:19.597 ⇒ 01:02:31.899 Amber Lin: No, I I meant as in like a 30 second clip or 1 min clip of as if I’m playing role play as if I’m gonna introduce our brand to someone new like, how am I gonna talk about it?
484 01:02:33.220 ⇒ 01:02:48.399 Amber Lin: Right? It’s just say end result. And that forces us to do the thinking like, maybe we consolidate that into a 1 liner or like a 30 second intro, and everybody does that for their own version. And we can see like what’s missing.
485 01:02:48.810 ⇒ 01:02:50.259 Uttam Kumaran: Okay, yeah, see what’s next?
486 01:02:50.260 ⇒ 01:02:54.300 Hannah Wang: Oh, that’s that’s gonna be a good like the way.
487 01:02:54.300 ⇒ 01:02:55.250 Amber Lin: It’s hard, it’s hard.
488 01:02:55.250 ⇒ 01:02:57.430 Hannah Wang: It will be the parameter for how.
489 01:02:57.430 ⇒ 01:02:57.970 Uttam Kumaran: Yes.
490 01:02:57.970 ⇒ 01:03:04.840 Hannah Wang: Because I am more removed from all this stuff, so it’ll be interesting to see.
491 01:03:04.840 ⇒ 01:03:07.979 Uttam Kumaran: I think. Does this give you a little bit more of a sense of like
492 01:03:08.510 ⇒ 01:03:09.779 Uttam Kumaran: what? Who we are, who we.
493 01:03:09.780 ⇒ 01:03:15.669 Amber Lin: Yeah, yeah, I really like what we did for this past 2 sessions. But I’m not gonna speak for him.
494 01:03:16.383 ⇒ 01:03:22.059 Hannah Wang: I think so. It’s still a little too wordy for me, like every that’s there’s no like
495 01:03:22.180 ⇒ 01:03:24.210 Hannah Wang: top level statement.
496 01:03:24.878 ⇒ 01:03:34.010 Hannah Wang: And I’m still not really seeing the combination of data. And AI like to me, they feel very separate. The way we explain it. So.
497 01:03:34.010 ⇒ 01:03:34.800 Amber Lin: I see.
498 01:03:34.800 ⇒ 01:03:38.169 Hannah Wang: Yeah, I still feel a little bit iffy.
499 01:03:38.170 ⇒ 01:03:41.110 Uttam Kumaran: Good like, I think one piece to talk about that is
500 01:03:42.440 ⇒ 01:03:51.680 Uttam Kumaran: the the main assumption here. And I think you could be right. This is where like, maybe we should explain, is like, really like everybody’s. Now familiar with the word like context.
501 01:03:51.970 ⇒ 01:04:02.210 Uttam Kumaran: right? Like context is like data. Like context is what you send to the language model.
502 01:04:03.580 ⇒ 01:04:08.560 Uttam Kumaran: then process and predict the answer
503 01:04:09.133 ⇒ 01:04:14.149 Uttam Kumaran: right? Like, when I put this together. And I gave this to AI. This is data.
504 01:04:14.370 ⇒ 01:04:17.519 Hannah Wang: Right. This is having our Zoom Meetings. This is having all this.
505 01:04:17.740 ⇒ 01:04:34.009 Uttam Kumaran: Imagine if you’re asking some conversation above, out like on top of your Crm, you need to have all your Crm data. You need to have information about who you are. Your documents, right? So that’s all the data as sort of pipeline and engineering work. Maybe we do need to make it crystal clear, like.
506 01:04:34.720 ⇒ 01:04:37.079 Uttam Kumaran: what is data in this world? Well.
507 01:04:37.670 ⇒ 01:04:40.910 Uttam Kumaran: maybe maybe that is something we should just like make super clear.
508 01:04:43.340 ⇒ 01:04:47.270 Hannah Wang: Yeah, yeah, kind of explaining what that is. And also.
509 01:04:47.580 ⇒ 01:04:51.580 Hannah Wang: yeah, I mean, I keep saying this. But like the AI piece like, how does that
510 01:04:51.780 ⇒ 01:05:01.029 Hannah Wang: like? I understand, generally like kind of what we’re doing and the services we offer. But
511 01:05:01.380 ⇒ 01:05:11.749 Hannah Wang: to me they still feel very separate. I mean, I understand that in order to have good good AI, we you need good data like I understand that. But.
512 01:05:11.750 ⇒ 01:05:15.870 Uttam Kumaran: Tell me what? Tell me how? Tell me? Why, how that makes sense.
513 01:05:18.460 ⇒ 01:05:29.849 Hannah Wang: Well, you need to give AI like background information, or else it’s not gonna know, like how to pull in all the information that it needs in order to process your request like, it needs
514 01:05:30.080 ⇒ 01:05:36.519 Hannah Wang: background information and needs context. You need like AI needs to know.
515 01:05:37.590 ⇒ 01:05:46.230 Hannah Wang: Yeah, just background information. I don’t know what the right word is. I guess it is context, but that’s like a buzzword, and I’m trying to avoid using that. But.
516 01:05:46.230 ⇒ 01:05:48.129 Uttam Kumaran: Yeah. Okay.
517 01:05:48.600 ⇒ 01:05:50.830 Hannah Wang: Yeah. But to me, like, I guess.
518 01:05:51.130 ⇒ 01:06:07.729 Hannah Wang: like I, when I came into Brain Forge, like to me, we were selling 2 products like data. And AI. So to me, I can’t like unsee that or undo that in my head. So maybe that’s just a me problem like we’re selling 2 different types of services.
519 01:06:08.385 ⇒ 01:06:21.519 Hannah Wang: But but I also understand like they’re very intermingled and they work together. So I guess it’s just like what wording or like, what one liner can we use to make it? So that
520 01:06:21.940 ⇒ 01:06:35.344 Hannah Wang: yeah, it’s like, we’re not selling 2 different things like, I can’t get out of my head like, Oh, we have a data toggle and a AI toggle for the pricing page. We have like 2 different charts for those.
521 01:06:37.890 ⇒ 01:06:42.349 Hannah Wang: yeah, it it just feels very separate, but.
522 01:06:42.740 ⇒ 01:06:44.020 Hannah Wang: Yeah. Go ahead.
523 01:06:44.020 ⇒ 01:06:47.299 Amber Lin: I think how I changed my view on it is
524 01:06:47.470 ⇒ 01:06:50.930 Amber Lin: not to view it that we’re selling data, or that we’re selling AI, because.
525 01:06:51.340 ⇒ 01:07:06.039 Amber Lin: We’re not selling products. We’re selling a service. And we’re selling a very tailored tailored service. So ultimately, we’re selling a solution to a specific problem. And that problem that we’re selling that problem that we’re solving is
526 01:07:06.300 ⇒ 01:07:21.660 Amber Lin: the fact that they’re unused potential. The fact that there’s decisions. It’s unable to be made, because it doesn’t really matter if we change your data and they don’t, they don’t change and hence why, some of the work I’m doing is not
527 01:07:21.790 ⇒ 01:07:30.690 Amber Lin: directly data work. I’m working directly with the clients. But that’s why we’re a consultancy and why we’re not a product company is because we’re
528 01:07:31.270 ⇒ 01:07:34.850 Amber Lin: specifically solving targeted problems.
529 01:07:34.850 ⇒ 01:07:35.320 Hannah Wang: Yeah.
530 01:07:35.320 ⇒ 01:07:54.790 Amber Lin: So that’s why, if we change the lens of oh, we’re selling it data solution, we’re selling AI solution. And we’re selling a like a business solution if I have to really frame it that way. And it doesn’t matter what tool we use. It just so happens that we have a lot of data people and AI people under our toolkit.
531 01:07:55.380 ⇒ 01:07:56.610 Amber Lin: That’s how I see it.
532 01:07:56.610 ⇒ 01:08:10.805 Hannah Wang: I see. But are we still gonna use the language like, Oh, we’re selling data solutions. We’re selling AI solutions like to me. There needs to be like an umbrella term kind of what you said like a business solution.
533 01:08:11.160 ⇒ 01:08:13.977 Amber Lin: Should we say, tech forward business solutions.
534 01:08:14.380 ⇒ 01:08:16.339 Hannah Wang: But then that might be too vague right like for me.
535 01:08:16.710 ⇒ 01:08:33.471 Hannah Wang: Like, if I wasn’t a tech person I’d be like Bro. What does that mean like, what does tech? But but then again, like a CEO and a CTO might understand what that is. So we just need to make sure that we understand our audience, and that they know the language that we want to use.
536 01:08:33.760 ⇒ 01:08:41.709 Amber Lin: Yeah. Oh, you’re sure, right? Because if we’re selling to a tech person, they want to know, like, what what do you mean by business solution. But if we’re talking to the CEO,
537 01:08:41.880 ⇒ 01:08:48.430 Amber Lin: that’s not that connected to the daily operations, then it’s just okay. Are you gonna solve my problem?
538 01:08:48.439 ⇒ 01:08:49.269 Hannah Wang: Yeah, yeah.
539 01:08:49.270 ⇒ 01:08:55.309 Amber Lin: I I don’t care if you deal with the AI stuff where you deal with my data folks. I just care if you saw my problem.
540 01:08:55.310 ⇒ 01:08:56.100 Hannah Wang: Yeah.
541 01:08:56.220 ⇒ 01:08:57.420 Hannah Wang: So
542 01:08:58.670 ⇒ 01:09:08.170 Hannah Wang: I don’t know. Maybe we just get rid of data. And AI, oh, and but that’s like, how that’s our niche. So I understand we need that type of language in there. But
543 01:09:09.760 ⇒ 01:09:18.140 Hannah Wang: yeah, I don’t know that that was helpful. But yeah, maybe I just have to change my lens like, how I view our company?
544 01:09:19.490 ⇒ 01:09:29.789 Hannah Wang: yeah, cause even like the way we structure our internal teams. It’s like, Oh, you have, like the AI team, and they have, like the Da data engineering team. And it’s like, Oh, but like what we’re selling is
545 01:09:29.910 ⇒ 01:09:38.890 Hannah Wang: could be one or both of them. So maybe I just need to like, combine it in my head.
546 01:09:38.890 ⇒ 01:09:39.330 Amber Lin: Yeah.
547 01:09:39.330 ⇒ 01:09:41.340 Hannah Wang: Not separate entities.
548 01:09:42.010 ⇒ 01:09:44.340 Amber Lin: And they feed into each other so.
549 01:09:44.340 ⇒ 01:09:44.770 Hannah Wang: A story.
550 01:09:44.770 ⇒ 01:10:01.330 Amber Lin: From ABC. So we started out building their AI product. But soon we realized that internally they they their data is a mess. So we can’t really see if our AI product is doing well because they don’t have the right data metrics to prove that. So now we’re helping them to
551 01:10:01.610 ⇒ 01:10:06.100 Amber Lin: use to help them see what they can use for their data, and even.
552 01:10:06.100 ⇒ 01:10:06.699 Hannah Wang: I see.
553 01:10:06.700 ⇒ 01:10:27.969 Amber Lin: That’s 1 point and the other points as we build their system, we found that the data which is their training guides essentially is a complete mess. And therefore our AI can’t even give correct answers, because their data that we build it on is a mess. So that’s what I’m doing currently with the clients is to groom through that data to make sure it’s better for both the AI and the human.
554 01:10:28.140 ⇒ 01:10:42.600 Amber Lin: And so if we only sold AI solutions. I don’t think we’re gonna help them with that, because we’ll deliver a chat bot, and we’ll say, Hey, it’s your problem. We’re not gonna solve your data problem. But because we’re solving a business problem, we we kind of
555 01:10:42.840 ⇒ 01:10:46.409 Amber Lin: have to tackle all these parts in order for us to succeed.
556 01:10:47.710 ⇒ 01:10:48.629 Hannah Wang: I see so.
557 01:10:48.950 ⇒ 01:10:58.510 Hannah Wang: If you like, go to our pricing page, for example, like what solution did they? What? What did they click on like? Oh, I want this package like, was it
558 01:10:58.770 ⇒ 01:11:04.180 Hannah Wang: like, cause I know, on our website and stuff every, it’s like 2 separate toggles.
559 01:11:04.640 ⇒ 01:11:05.430 Hannah Wang: So like.
560 01:11:05.570 ⇒ 01:11:08.899 Hannah Wang: But we’re technically selling them both solutions, like data.
561 01:11:08.900 ⇒ 01:11:10.260 Amber Lin: I am true.
562 01:11:10.260 ⇒ 01:11:11.720 Amber Lin: Which one is it? Now?
563 01:11:11.940 ⇒ 01:11:19.309 Amber Lin: I guess they they begin. They started with AI, and then we realize we have a lot more data problems. So that’s how a.
564 01:11:19.310 ⇒ 01:11:19.690 Hannah Wang: Release.
565 01:11:19.690 ⇒ 01:11:20.890 Amber Lin: Went across. I think.
566 01:11:21.440 ⇒ 01:11:29.940 Amber Lin: That really tells us of how we should structure the customer journey on our website because our current customer journey leads them to think that these are separate things.
567 01:11:29.940 ⇒ 01:11:32.250 Hannah Wang: Yeah, that I mean, that’s how I thought of.
568 01:11:32.250 ⇒ 01:11:32.680 Amber Lin: Okay.
569 01:11:32.680 ⇒ 01:11:38.317 Hannah Wang: Like, if you think I’m a potential client like, Oh, yeah, to me, they’re very separate.
570 01:11:39.110 ⇒ 01:11:51.850 Hannah Wang: And now I understand you like telling me and explaining all these things, they do feed into each other. And I see that clearly. I understand that. But yeah, from the get go, like, if I just look at our website. That’s not super clear to me.
571 01:11:52.450 ⇒ 01:12:10.859 Amber Lin: I see that feedback is so helpful, because, like I, I’m so in the weeds, I can’t. I can’t really tell you much of how it’s like. So I guess, for instance, if you are a customer who’s looking to like help your photography business right? And you click in. And we say, oh, we have data service, AI service. You’re like.
572 01:12:10.860 ⇒ 01:12:11.510 Hannah Wang: What the heck.
573 01:12:11.510 ⇒ 01:12:27.560 Amber Lin: Well, like I don’t. I don’t think I need this, but if we start with most photography photographers spent, like most most of their time in finding the right images rather than like doing the I don’t know like
574 01:12:27.950 ⇒ 01:12:33.589 Amber Lin: like. So we start with a problem and say, Hey, we off. We’re gonna help you with that. Here’s
575 01:12:33.760 ⇒ 01:12:44.239 Amber Lin: step 1, 2, 3, 4. We’re gonna do. We’re gonna start with data and then build on top of that and then build on top of that, and you’ll have like. Each step will give you benefits, and at the end of it you’ll be like super empowered.
576 01:12:44.710 ⇒ 01:12:55.279 Hannah Wang: Yeah, like to me, it doesn’t really matter if you use day or data or AI like, I don’t care if you use both, or one, or either, like, I just want you to solve my problem.
577 01:12:55.983 ⇒ 01:12:58.975 Hannah Wang: I feel like. That’s how most Ceos feel.
578 01:12:59.350 ⇒ 01:13:05.180 Amber Lin: I think that’s that’s the core thing that we’ll combine our 2 lines of services.
579 01:13:05.610 ⇒ 01:13:12.700 Hannah Wang: Yeah. So I don’t know. Tom, was that helpful at all. Just like be blabbering about how I don’t understand stuff.
580 01:13:12.830 ⇒ 01:13:13.440 Hannah Wang: No.
581 01:13:13.440 ⇒ 01:13:16.669 Uttam Kumaran: No, I mean, this helps like I mean, if you look at like sort of what I just
582 01:13:16.790 ⇒ 01:13:18.510 Uttam Kumaran: was chatting with AI, for I think.
583 01:13:18.510 ⇒ 01:13:18.930 Hannah Wang: Yeah.
584 01:13:19.249 ⇒ 01:13:28.830 Uttam Kumaran: You’re right in that, I think. I don’t. I don’t think it’s good enough for us to say we sell, we, we help you make better decisions. But like.
585 01:13:28.830 ⇒ 01:13:34.330 Uttam Kumaran: okay, I think something like this. That sort of shows like we
586 01:13:34.470 ⇒ 01:13:40.870 Uttam Kumaran: AI for us looks like these. But these are just like the form factors like this is what we’re actually delivering.
587 01:13:41.370 ⇒ 01:13:44.083 Hannah Wang: Hmm, right? Contact.
588 01:13:46.210 ⇒ 01:13:53.050 Uttam Kumaran: Ultimately, I think. This is sort of it.
589 01:13:54.880 ⇒ 01:13:56.132 Hannah Wang: Context is data.
590 01:13:59.020 ⇒ 01:14:05.349 Uttam Kumaran: I think. We also have to do stuff like this, which is like what AI! What AI is when we say it.
591 01:14:05.350 ⇒ 01:14:05.790 Hannah Wang: Yeah.
592 01:14:06.440 ⇒ 01:14:07.489 Amber Lin: Lovely. I like that.
593 01:14:07.490 ⇒ 01:14:09.350 Uttam Kumaran: Segment by job to be done.
594 01:14:09.350 ⇒ 01:14:10.260 Hannah Wang: Yeah.
595 01:14:12.040 ⇒ 01:14:14.530 Uttam Kumaran: And this is like where the demo site comes into play right.
596 01:14:25.770 ⇒ 01:14:32.000 Hannah Wang: but even saying like context is data to me, I’m like
597 01:14:32.300 ⇒ 01:14:39.990 Hannah Wang: what does that mean, like, I, I know what it means because we’ve like talked about it. But if I was just to see that one line
598 01:14:40.936 ⇒ 01:14:42.570 Hannah Wang: it’s not.
599 01:14:42.570 ⇒ 01:14:46.380 Uttam Kumaran: This is where like well, this is where we have to assume that.
600 01:14:47.020 ⇒ 01:14:47.490 Hannah Wang: Oh!
601 01:14:47.490 ⇒ 01:14:49.110 Uttam Kumaran: People that are buying from us have at least.
602 01:14:49.110 ⇒ 01:14:49.720 Hannah Wang: Right.
603 01:14:49.720 ⇒ 01:14:53.419 Uttam Kumaran: Tried chat Gpt and are familiar with the word context.
604 01:14:53.420 ⇒ 01:14:54.609 Uttam Kumaran: Okay, yeah.
605 01:14:54.610 ⇒ 01:15:00.060 Uttam Kumaran: I also think at some point we are not selling to the average photographer.
606 01:15:00.060 ⇒ 01:15:01.360 Hannah Wang: Right, yeah.
607 01:15:01.540 ⇒ 01:15:05.739 Uttam Kumaran: We’re not selling to people that have 0 understanding of what this is. In fact.
608 01:15:05.870 ⇒ 01:15:09.280 Uttam Kumaran: we’ve wasted a lot of time trying to sell those people because they just didn’t bring.
609 01:15:09.780 ⇒ 01:15:14.140 Uttam Kumaran: We can’t. I can’t explain how fireworks
610 01:15:14.280 ⇒ 01:15:17.459 Uttam Kumaran: and like talk on a Zoom Meeting like about like how the way fire works like.
611 01:15:18.210 ⇒ 01:15:21.370 Uttam Kumaran: imagine you had like a thing that burns, and it gives you heat like.
612 01:15:21.370 ⇒ 01:15:22.079 Hannah Wang: Be like what the fuck.
613 01:15:22.490 ⇒ 01:15:27.639 Uttam Kumaran: Right? So some point need someone to go light a fire and then be like, okay, I get it, like, you know.
614 01:15:27.990 ⇒ 01:15:32.730 Uttam Kumaran: So part of this for me is like we. I think we need to have some level of like
615 01:15:33.000 ⇒ 01:15:40.510 Uttam Kumaran: what we assume our customer. We assume word prospects no.
616 01:15:41.610 ⇒ 01:15:42.250 Hannah Wang: Right.
617 01:15:43.980 ⇒ 01:15:45.590 Uttam Kumaran: I do think that, like
618 01:15:46.650 ⇒ 01:15:50.680 Uttam Kumaran: I do think this is helpful, though, which is like what is AI when we say it.
619 01:15:51.470 ⇒ 01:15:54.020 Uttam Kumaran: A lot of people will just say, not go beyond that.
620 01:15:54.500 ⇒ 01:16:00.280 Uttam Kumaran: or they’ll say it, and it’ll go into some really technical jargon like, I think this these are. This is a really good thing.
621 01:16:00.610 ⇒ 01:16:03.290 Uttam Kumaran: like here’s what it is when we say it.
622 01:16:04.180 ⇒ 01:16:07.780 Uttam Kumaran: And ultimately this is really clear.
623 01:16:08.440 ⇒ 01:16:14.160 Uttam Kumaran: But, like our per. Our our folks have got to know what the word context means.
624 01:16:14.160 ⇒ 01:16:14.600 Hannah Wang: Yeah.
625 01:16:14.600 ⇒ 01:16:18.870 Uttam Kumaran: Like the fact that, like you, pay something into AI, and then it gives you something.
626 01:16:18.870 ⇒ 01:16:19.720 Hannah Wang: Right.
627 01:16:25.840 ⇒ 01:16:27.719 Uttam Kumaran: I do think this is really good, though.
628 01:16:28.330 ⇒ 01:16:29.310 Hannah Wang: Hmm.
629 01:16:51.850 ⇒ 01:16:52.540 Amber Lin: Okay.
630 01:16:53.520 ⇒ 01:16:55.519 Amber Lin: When are we meeting next.
631 01:16:56.780 ⇒ 01:16:58.428 Uttam Kumaran: Yeah, I think, let’s
632 01:17:02.170 ⇒ 01:17:07.840 Uttam Kumaran: yeah, I mean, I I’m fine with this. I mean, I think the next meeting
633 01:17:08.290 ⇒ 01:17:13.330 Uttam Kumaran: we have, we should just basically rewrite these with and like lock it.
634 01:17:13.540 ⇒ 01:17:17.040 Uttam Kumaran: And then we should move on to starting to look at
635 01:17:18.770 ⇒ 01:17:20.920 Uttam Kumaran: what edits we want for the home page
636 01:17:21.740 ⇒ 01:17:24.070 Uttam Kumaran: and for our capabilities that, given this.
637 01:17:24.470 ⇒ 01:17:25.040 Amber Lin: Come on!
638 01:17:25.610 ⇒ 01:17:30.589 Hannah Wang: Hmm is next week, Monday, like, at the same time. Good.
639 01:17:30.590 ⇒ 01:17:34.719 Hannah Wang: I would try to do this week because I feel like we’re we’re basically at the finish line.
640 01:17:35.590 ⇒ 01:17:39.299 Amber Lin: Yeah, I’m open pretty open.
641 01:17:39.990 ⇒ 01:17:43.869 Amber Lin: So just book it for book. Put it on my calendar. I’ll join.
642 01:17:45.230 ⇒ 01:17:49.670 Uttam Kumaran: Yeah, cause I have a good feeling that the new web flow person, she’s gonna do. Okay? So I want her.
643 01:17:49.670 ⇒ 01:17:50.150 Hannah Wang: Yeah.
644 01:17:50.150 ⇒ 01:17:54.829 Uttam Kumaran: To be able to take it. So we want to do tomorrow. We can do tomorrow. If we wanna
645 01:17:55.090 ⇒ 01:17:58.640 Uttam Kumaran: leave a day we can do Wednesday, but I would probably say one of those.
646 01:17:59.960 ⇒ 01:18:05.060 Hannah Wang: Okay, I mean Wednesday. We have our finishing up the rest of the chapter
647 01:18:05.210 ⇒ 01:18:07.345 Hannah Wang: meeting in the morning.
648 01:18:07.880 ⇒ 01:18:12.509 Amber Lin: I. I’m done with the chapters, so if we need anything, I can.
649 01:18:13.640 ⇒ 01:18:15.999 Uttam Kumaran: Yeah, what’s like the like?
650 01:18:16.330 ⇒ 01:18:18.618 Uttam Kumaran: What are the last chapters? Oh.
651 01:18:19.720 ⇒ 01:18:24.559 Hannah Wang: It’s just like wrapping it up with a pretty bow and like positioning and stuff like that.
652 01:18:25.610 ⇒ 01:18:28.010 Uttam Kumaran: I don’t know. I’m kind of just like.
653 01:18:33.140 ⇒ 01:18:35.440 Hannah Wang: I mean, it’s like not necessary, I guess.
654 01:18:35.440 ⇒ 01:18:38.940 Uttam Kumaran: I guess that’s why, like we’re at the end. Maybe we just skip it.
655 01:18:39.230 ⇒ 01:18:40.610 Uttam Kumaran: I feel pretty good.
656 01:18:41.280 ⇒ 01:18:45.949 Hannah Wang: Then you want. Do you just want to spend that Wednesday? Okay, time slot.
657 01:18:46.383 ⇒ 01:18:47.250 Uttam Kumaran: Okay. Yeah.
658 01:18:47.800 ⇒ 01:18:50.309 Hannah Wang: Just tell people they don’t have to read it if they don’t have time.
659 01:18:50.310 ⇒ 01:18:54.119 Uttam Kumaran: I think I think, send them this. Say, this is like, basically, we’re like 80% of the way there.
660 01:18:54.120 ⇒ 01:18:54.840 Hannah Wang: Okay.
661 01:18:55.160 ⇒ 01:18:55.750 Uttam Kumaran: And.
662 01:18:56.340 ⇒ 01:18:58.439 Hannah Wang: Just have them look through it. Okay.
663 01:18:58.440 ⇒ 01:19:03.220 Uttam Kumaran: Because I think we’re there, I think. Let’s let’s let’s cement it of one version on Wednesday.
664 01:19:03.220 ⇒ 01:19:03.570 Hannah Wang: Okay.
665 01:19:03.870 ⇒ 01:19:11.889 Uttam Kumaran: And then I think we’re good to just start planning out edits. I so I feel pretty good.
666 01:19:12.340 ⇒ 01:19:14.850 Uttam Kumaran: I think at least we’ve gotten like.
667 01:19:15.540 ⇒ 01:19:18.430 Uttam Kumaran: like, I mean, we’re probably 80% of the way there.
668 01:19:18.730 ⇒ 01:19:23.779 Uttam Kumaran: except like we got the proms really done. Well, we got the character.
669 01:19:24.260 ⇒ 01:19:27.040 Uttam Kumaran: the billing a lot better. And then
670 01:19:27.970 ⇒ 01:19:35.140 Uttam Kumaran: we sort of have these plans. So we do have a lot to work on. I think it’s still some of this small messaging. But like we’re closer.
671 01:19:35.140 ⇒ 01:19:55.429 Hannah Wang: Yeah, I mean, I feel like 3 through 6 is 3 through 7 is good. It’s just 2 like that has a problem is that we need to kind of hammer down. I feel like we have all the pieces there in that notion. It’s just like it’s just making like just trying to summarize it. Basically, I think is the
672 01:19:56.000 ⇒ 01:19:58.950 Hannah Wang: next step. So, okay, yeah, we can.
673 01:19:59.120 ⇒ 01:20:03.200 Hannah Wang: Yeah, we can discuss this on Wednesday, and then move from there.
674 01:20:03.963 ⇒ 01:20:04.726 Uttam Kumaran: Okay.
675 01:20:05.490 ⇒ 01:20:09.159 Hannah Wang: Or anything else before we end. Thanks for staying on longer.
676 01:20:09.160 ⇒ 01:20:11.299 Amber Lin: Oh, what’s the homework?
677 01:20:12.830 ⇒ 01:20:16.189 Hannah Wang: Homework is record. Read through.
678 01:20:16.190 ⇒ 01:20:16.620 Hannah Wang: Okay.
679 01:20:16.620 ⇒ 01:20:23.899 Hannah Wang: you can record it. If that’s helpful. I feel like mine will not be helpful. But you can record it.
680 01:20:24.298 ⇒ 01:20:27.490 Uttam Kumaran: Try to record one, and then we can.
681 01:20:27.490 ⇒ 01:20:30.319 Amber Lin: I’ll try. I’ll try to if I remember it. Okay, sounds good.
682 01:20:30.320 ⇒ 01:20:30.610 Hannah Wang: Okay.
683 01:20:30.900 ⇒ 01:20:31.440 Hannah Wang: Alright!
684 01:20:31.440 ⇒ 01:20:33.040 Hannah Wang: Alrighty! Alright! Everyone.
685 01:20:33.040 ⇒ 01:20:33.660 Amber Lin: Bye, everyone.
686 01:20:33.890 ⇒ 01:20:34.470 Uttam Kumaran: Hi.
687 01:20:34.470 ⇒ 01:20:34.880 Hannah Wang: Right.