Meeting Title: ABC Trainer Bot | Placeholder Date: 2025-05-05 | Meeting participants: Uttam Kumaran, Amber Lin, Scott_Harmon


WEBVTT

1 00:01:25.710 00:01:27.080 Amber Lin: Hi Scott!

2 00:01:27.390 00:01:28.619 Scott_Harmon: Hi! Amber! How are you?

3 00:01:28.620 00:01:30.729 Amber Lin: I’m good. What are you eating?

4 00:01:30.880 00:01:33.620 Scott_Harmon: I just finished a late breakfast of

5 00:01:34.943 00:01:39.710 Scott_Harmon: my own concoction, scrambled eggs with jalapenos, and saw.

6 00:01:39.920 00:01:47.209 Amber Lin: Oh, that’s so cool! Wait! Did you grow your own jalapenos? Is that what you’re saying?

7 00:01:47.210 00:01:52.140 Scott_Harmon: No, no, although although we have in the past, they grow well here.

8 00:01:52.390 00:01:53.950 Amber Lin: Oh, really.

9 00:01:53.950 00:01:54.280 Scott_Harmon: We’ll.

10 00:01:54.280 00:01:57.510 Amber Lin: The climate is kind of similar to Mexico.

11 00:01:58.090 00:01:58.890 Scott_Harmon: Yeah.

12 00:01:59.310 00:02:00.170 Amber Lin: Oh, wow!

13 00:02:00.170 00:02:01.560 Scott_Harmon: Yeah, yeah, yeah, no, we.

14 00:02:01.560 00:02:03.369 Amber Lin: Do my meal prep. Too.

15 00:02:03.860 00:02:05.010 Scott_Harmon: I’m sorry.

16 00:02:05.180 00:02:07.329 Amber Lin: I was also just cooking.

17 00:02:07.330 00:02:08.410 Scott_Harmon: Oh, you were! Oh!

18 00:02:08.419 00:02:13.249 Amber Lin: Yeah, let me go to my kitchen. That’s why I’m like

19 00:02:13.359 00:02:18.259 Amber Lin: 2 min late because I was trying to cook. I have

20 00:02:21.430 00:02:23.170 Amber Lin: I have my.

21 00:02:23.410 00:02:24.549 Scott_Harmon: What are you making.

22 00:02:24.550 00:02:33.589 Amber Lin: Beef for my, I’m trying to make a burrito, so I’m cooking my meal prep. This is like taco seasoning with cilantro and onions.

23 00:02:33.590 00:02:34.170 Scott_Harmon: Huh!

24 00:02:34.170 00:02:40.889 Amber Lin: This is like a Chinese Taiwanese, like Teriyaki brown beef.

25 00:02:41.180 00:02:48.140 Amber Lin: and there I’m cooking like I’m making chicken.

26 00:02:49.180 00:02:52.129 Amber Lin: and that’s for, and those are ingredients for your burrito.

27 00:02:52.742 00:02:57.689 Amber Lin: No, the chicken is, I’m doing meal prep. So I freeze everything.

28 00:02:57.690 00:02:59.709 Scott_Harmon: So what’s the burrito? What do you get?

29 00:03:00.150 00:03:00.810 Scott_Harmon: Burrito?

30 00:03:01.441 00:03:04.329 Amber Lin: For the burrito. I think I’m gonna do the

31 00:03:04.450 00:03:15.010 Amber Lin: beef, ground beef, and then I have jalapenos. I bought my tortillas because it’s Cinco de Mayo today.

32 00:03:15.220 00:03:16.250 Scott_Harmon: Oh, yeah. Yeah.

33 00:03:16.250 00:03:18.769 Amber Lin: Yeah, celebrating that today.

34 00:03:18.930 00:03:22.570 Scott_Harmon: So you know, one of the big controversies between.

35 00:03:22.570 00:03:23.080 Amber Lin: Yeah.

36 00:03:23.080 00:03:24.869 Scott_Harmon: LA. And Texas.

37 00:03:25.000 00:03:30.170 Amber Lin: Whether or not the thing you get in the morning, with eggs in it, wrapped in a tortilla.

38 00:03:30.340 00:03:33.680 Scott_Harmon: Is, is a breakfast burrito, or breakfast. Taco.

39 00:03:35.080 00:03:37.713 Amber Lin: How is that? Even a breakfast? Taco.

40 00:03:38.090 00:03:42.090 Scott_Harmon: Because in Texas, in Texas there’s no such thing as a breakfast burrito.

41 00:03:42.380 00:03:46.600 Amber Lin: What like for me, a taco is not closed.

42 00:03:47.010 00:03:49.260 Amber Lin: If it’s closed, then it’s a burrito.

43 00:03:49.260 00:03:55.029 Scott_Harmon: So I told you my daughter lived in La for a while. Of course she, you know, went to school here and then, and then moved to La.

44 00:03:55.150 00:03:58.299 Scott_Harmon: This was maybe one of the biggest things for her.

45 00:03:58.890 00:04:02.709 Scott_Harmon: but she’s like Dad. They they call breakfast tacos, breakfast burritos here.

46 00:04:03.460 00:04:06.449 Scott_Harmon: and I’m like. Well, what do you mean? They call them breakfast burritos? She goes.

47 00:04:06.450 00:04:08.590 Amber Lin: Like. What do you mean? You told him, Tacos.

48 00:04:08.590 00:04:14.740 Scott_Harmon: Right. But I I think you have a point. I think, Burritos, I always think of as being

49 00:04:15.120 00:04:16.940 Scott_Harmon: completely wrapped up.

50 00:04:17.610 00:04:18.670 Amber Lin: Yeah.

51 00:04:18.670 00:04:22.770 Scott_Harmon: You know, and then Tacos, as being more like open in a.

52 00:04:22.930 00:04:28.340 Amber Lin: Yeah, is it not wrapped up in Texas, like, is.

53 00:04:28.586 00:04:32.780 Scott_Harmon: They come. They come both ways. They have like the street, you know. You could get them.

54 00:04:32.930 00:04:35.720 Scott_Harmon: you know, open, but then they’ll all. They also

55 00:04:36.250 00:04:43.680 Scott_Harmon: wrap them, and then you always order corn or flour. But even if they’re wrapped they’re breakfast tacos, there’s no such thing as a breakfast burrito in Texas, like.

56 00:04:43.680 00:04:50.110 Amber Lin: Wow. So I guess they started looking like a taco. And then someone thought of to wrap it. But they didn’t give it a new.

57 00:04:50.110 00:04:52.459 Scott_Harmon: I think that’s probably what happened.

58 00:04:52.460 00:04:53.060 Amber Lin: Yeah.

59 00:04:53.060 00:04:58.599 Scott_Harmon: And I don’t know if we have like a different Mexican diaspora.

60 00:04:58.890 00:04:59.849 Scott_Harmon: Yeah, you know.

61 00:04:59.850 00:05:00.480 Scott_Harmon: So then.

62 00:05:00.480 00:05:05.789 Amber Lin: It was an American thing, because I was looking it up when I looked at the recipe, and it was a

63 00:05:06.010 00:05:15.070 Amber Lin: Tex-mex or American version of Mexican food, so I guess there was nothing like that in Mexico. It was just completely American thing.

64 00:05:15.070 00:05:20.300 Scott_Harmon: Yeah. Tex-Mex is, is a unique vernacular, like, it’s a unique.

65 00:05:21.940 00:05:27.889 Amber Lin: Variation on, you know, and many people that migrate to Texas come from.

66 00:05:29.470 00:05:34.540 Scott_Harmon: Specific parts or states in Mexico. And so it just sort of has its own.

67 00:05:34.950 00:05:39.430 Amber Lin: You know, lineage. But Tex-Mex, you know, they added.

68 00:05:40.010 00:05:46.122 Scott_Harmon: Some local, you know. Quirks, I guess you call them so like you put

69 00:05:47.190 00:05:56.750 Scott_Harmon: like the chili con carne you put on top of an enchilada is very specific to Tex-Mex, you know, flavoring, and you know 4 or 5 other.

70 00:05:57.120 00:05:59.420 Scott_Harmon: You know, you’d almost think of them as like the.

71 00:05:59.720 00:06:07.980 Scott_Harmon: you know, like the trappings of a dish. There’s but they’re unique. They define, you know, Tex-Mex and

72 00:06:07.980 00:06:16.240 Scott_Harmon: and, but in the same way, California-Mex, I don’t... it’s not called that. But

73 00:06:16.430 00:06:23.189 Scott_Harmon: my daughter likewise said it was like very unique. And it’s also not Mexican like, because she spent a lot of time in Mexico.

74 00:06:23.380 00:06:24.520 Amber Lin: Oh! Hold on!

75 00:06:24.520 00:06:26.150 Scott_Harmon: So l so LA.

76 00:06:26.150 00:06:27.840 Amber Lin: Daughter, right now.

77 00:06:27.840 00:06:34.239 Scott_Harmon: Yeah, yeah. So she she spent a bunch of time in Mexico. And so she’s like, Look la has its own.

78 00:06:34.790 00:06:39.400 Scott_Harmon: You know. Twist on, you know Mexican food, and it’s uniquely la, it’s not.

79 00:06:39.610 00:06:44.020 Scott_Harmon: It’s not just Mexican. It’s like La Mexican, and

80 00:06:44.190 00:06:48.080 Scott_Harmon: I remember, she told me she described the 4 or 5 things that are unique.

81 00:06:48.670 00:06:59.299 Scott_Harmon: You know, that only Angelenos do to Mexican food, that you don’t find anywhere else. Now she lives in New York, and, you know, she’s like a real food culture person. So

82 00:06:59.700 00:07:01.330 Scott_Harmon: she notices all this stuff.

83 00:07:02.070 00:07:11.150 Amber Lin: That’s so cool like for me. I’m here for almost 2 years now, and my lease is ending. So I’m looking for.

84 00:07:11.280 00:07:19.010 Amber Lin: I originally was like, Okay, I’ll find a place to live in LA. But then I was like, Do I have to be in LA? No, I don’t have to be in LA

85 00:07:19.010 00:07:19.450 Amber Lin: Beautiful.

86 00:07:19.450 00:07:32.719 Amber Lin: especially. My job is remote now. So I was just thinking about okay, where all these places I can go probably stay like a month or 2, 3 months, and then move to another place within it.

87 00:07:32.720 00:07:37.870 Scott_Harmon: That’s a great lots of people do that. And it’s a it’s just a fantastic way to see. You know, it’s a big.

88 00:07:38.360 00:07:45.620 Scott_Harmon: is obviously a big country with tons of, you know, regional, you know, variety, and so.

89 00:07:45.620 00:07:48.149 Amber Lin: Yeah, what would you recommend?

90 00:07:48.530 00:07:49.470 Scott_Harmon: I’m sorry.

91 00:07:49.470 00:07:51.129 Amber Lin: Where would you recommend?

92 00:07:51.130 00:07:55.879 Scott_Harmon: I, I like... I’m kind of weird, and, like, I like every place, like, I think

93 00:07:57.300 00:07:59.719 Scott_Harmon: I mean, I like Texas a lot, but I live here, but.

94 00:08:00.550 00:08:04.849 Scott_Harmon: I might. I’ve always kind of felt like, because I lived in Boston when I was younger for a while, and.

95 00:08:04.850 00:08:05.470 Amber Lin: Oh!

96 00:08:05.470 00:08:08.839 Scott_Harmon: I grew up in the Midwest, and I’ve spent time a lot of time in La, and

97 00:08:09.190 00:08:12.519 Scott_Harmon: so I kind of feel like every

98 00:08:12.910 00:08:15.369 Scott_Harmon: place has a lot of really good things about it.

99 00:08:16.200 00:08:31.349 Scott_Harmon: And so they’re just different good things like, if you live in Florida, there’s 1 set of things that are really great, and if you live in New England, there’s a different set of things like my daughter lives in New York right now in Brooklyn, and there’s a whole bunch of great things about it.

100 00:08:31.700 00:08:34.700 Scott_Harmon: But then there’s a bunch of downsides, you know, it’s just like.

101 00:08:35.830 00:08:40.320 Scott_Harmon: but if you don’t have to live there for your whole life. You can just take advantage of the good things.

102 00:08:40.770 00:08:43.650 Amber Lin: I know. Where would you say I start? I’ve been in LA.

103 00:08:43.659 00:08:47.419 Scott_Harmon: My favorite city, just city city, you know, in.

104 00:08:47.689 00:08:50.619 Amber Lin: The Us. Is Boston, Boston.

105 00:08:50.769 00:08:54.599 Scott_Harmon: And partly because, to me, it’s the least like

106 00:08:56.829 00:09:00.139 Scott_Harmon: America. It’s it’s quite European. And it’s it’s just

107 00:09:00.800 00:09:05.299 Scott_Harmon: is just very different, culturally, than any other city I’ve been to in America.

108 00:09:07.650 00:09:16.660 Amber Lin: And my sister’s gonna be she’s gonna teach at a university, I think in Boston once she graduates for Phd, so I’ll probably just go visit her.

109 00:09:16.660 00:09:17.759 Scott_Harmon: Have you ever been.

110 00:09:18.170 00:09:26.430 Amber Lin: I’ve never been like I. We did a road trip from DC. To La, but we didn’t go. We didn’t go to.

111 00:09:26.430 00:09:27.860 Scott_Harmon: Oh, yeah, it’s way out of your way.

112 00:09:28.518 00:09:31.809 Amber Lin: Yeah, just in the middle.

113 00:09:31.810 00:09:35.789 Scott_Harmon: Well, Boston’s just so much like, you know. Nothing in America is very old, like it’s.

114 00:09:35.790 00:09:36.260 Amber Lin: Hmm.

115 00:09:36.260 00:09:38.230 Scott_Harmon: Are you? Did you grow up in Taiwan, or where did you grow up at.

116 00:09:38.230 00:09:47.450 Amber Lin: I grew up in China, but I spent a whole year in Europe, where I also just traveled around. So when you say European, I think I know what you’re talking about.

117 00:09:48.060 00:09:53.090 Scott_Harmon: Well, America just so young that they’re just, you know. There’s just not much, you know.

118 00:09:53.470 00:09:56.020 Scott_Harmon: deep history in a lot of American cities.

119 00:09:56.400 00:10:05.269 Scott_Harmon: But Boston and New York are kind of the 2 exceptions, you know, like Boston is a lot more. Oh, that’s 300 years old, and that cemetery is 300.

120 00:10:05.720 00:10:06.589 Amber Lin: Gold. And

121 00:10:06.590 00:10:12.340 Amber Lin: The first thing I learned about US history was like the Boston Tea Party. And it’s like, okay.

122 00:10:12.660 00:10:18.933 Scott_Harmon: Yeah, yeah, yeah, no, it’s. And it’s just very distinctive culture.

123 00:10:19.280 00:10:19.940 Amber Lin: Good.

124 00:10:20.320 00:10:22.169 Scott_Harmon: I guess I guess there are

125 00:10:22.300 00:10:24.830 Scott_Harmon: there in New York. If I were your age, and

126 00:10:25.090 00:10:27.110 Scott_Harmon: you know didn’t have a family, I’d probably

127 00:10:27.950 00:10:30.590 Scott_Harmon: I think there’s probably the most to do in those cities.

128 00:10:30.790 00:10:32.549 Amber Lin: Not even La like New York.

129 00:10:32.550 00:10:36.369 Scott_Harmon: Oh, no, I love I love la! But if you were, gonna go check someplace else out like.

130 00:10:36.370 00:10:37.080 Amber Lin: No.

131 00:10:37.570 00:10:41.300 Scott_Harmon: Like la, I think, gets for me. It would get old

132 00:10:42.670 00:10:46.490 Scott_Harmon: just having to drive everywhere, like even Kelsey said, like.

133 00:10:47.490 00:10:59.360 Scott_Harmon: you know, you can literally do anything in it like any. There’s it has everything, you know, everything. But you do have to drive sometimes an hour to get to it. And in New York you just get on a train, and

134 00:11:00.140 00:11:06.069 Scott_Harmon: you know, 22 min, and you’re you know you’re in the Bronx like it’s.

135 00:11:06.070 00:11:07.150 Amber Lin: Oh!

136 00:11:07.150 00:11:07.656 Scott_Harmon: You know.

137 00:11:07.910 00:11:15.189 Amber Lin: Yeah, I do feel that here in La, because I’ve been able to get around without a car, but it takes so long

138 00:11:15.430 00:11:17.680 Amber Lin: I don’t. I don’t drive so I’m taking.

139 00:11:17.680 00:11:19.350 Scott_Harmon: Or do you ride the train out there.

140 00:11:19.510 00:11:22.220 Amber Lin: Yeah, I ride. I ride it everywhere, like.

141 00:11:22.220 00:11:24.030 Scott_Harmon: See. Maybe I guess.

142 00:11:25.340 00:11:26.800 Scott_Harmon: I guess maybe it’s gotten better.

143 00:11:26.950 00:11:28.470 Scott_Harmon: I’m I know I’ve.

144 00:11:28.470 00:11:32.978 Amber Lin: Not really it, just I make do with it.

145 00:11:33.480 00:11:36.500 Scott_Harmon: And they’re probably places you can’t go then, because, like

146 00:11:37.390 00:11:42.270 Scott_Harmon: you know me, I guess the train, like in New York, the train just goes everywhere like there’s no place. The train doesn’t go like.

147 00:11:43.130 00:11:44.020 Scott_Harmon: you know.

148 00:11:44.140 00:11:45.410 Scott_Harmon: We just take the train.

149 00:11:45.610 00:11:47.690 Amber Lin: Have you been to San Francisco.

150 00:11:47.870 00:11:51.209 Scott_Harmon: Oh, yeah, I almost moved. There. I’ve spent half my career.

151 00:11:52.430 00:12:04.529 Amber Lin: Wait, really? What? I’m sorry. I knew a little bit about you from Uttam, but I never really got to know you, because every time we were in a group call I never got to talk to you. Like, what did you... what was your career in?

152 00:12:05.460 00:12:11.510 Scott_Harmon: It’s all it’s all been in like high tech startups. I did my just all venture capital and.

153 00:12:12.310 00:12:15.709 Scott_Harmon: In software, you know, hardware companies. I I started

154 00:12:16.260 00:12:19.279 Scott_Harmon: my career early on working for Hewlett, Packard and

155 00:12:19.970 00:12:26.230 Scott_Harmon: and and then and then moved into soft software industry and had lots of different jobs. And

156 00:12:26.500 00:12:31.080 Scott_Harmon: and then, from the last second half of my career, it was all

157 00:12:31.330 00:12:35.120 Scott_Harmon: working with startups, doing early stage startups. And so it’s all

158 00:12:35.370 00:12:40.030 Scott_Harmon: either you know, San Francisco, or but or or Austin, you know. And so.

159 00:12:41.410 00:12:46.760 Amber Lin: Were you scaling those startups, or were you more like an advisor? Oh, so you.

160 00:12:46.760 00:12:51.230 Scott_Harmon: Yeah, no, I I know. I ran 3 of them myself as as a founder.

161 00:12:51.550 00:12:57.170 Scott_Harmon: and I joined. I joined the 1st one as as a marketing executive, and then ran marketing.

162 00:12:57.680 00:13:01.300 Scott_Harmon: and then that that company got bought by Ibm.

163 00:13:01.860 00:13:06.649 Scott_Harmon: And then I work for Ibm for 2 years after that, and then as a marketing Vp. And then

164 00:13:07.240 00:13:12.489 Scott_Harmon: I left and founded my own company with 2 other people from Ibm.

165 00:13:12.950 00:13:17.509 Scott_Harmon: and I was then I was CEO of that. And then that company went public.

166 00:13:17.720 00:13:18.420 Amber Lin: Hmm.

167 00:13:18.610 00:13:24.460 Scott_Harmon: And then I ran it as a public company, CEO, for several years, and then and then I

168 00:13:25.450 00:13:29.620 Scott_Harmon: started and ran 2 other companies after that.

169 00:13:32.230 00:13:37.671 Scott_Harmon: Yeah. So it’s always been. It’s always been scaling and starting young companies.

170 00:13:38.090 00:13:46.100 Amber Lin: Really like the process where it’s like very little something. It’s not like nothing, but it’s a little little something to something bigger.

171 00:13:46.100 00:13:48.739 Scott_Harmon: Yeah, I think that’s right. I think that the

172 00:13:49.200 00:13:58.840 Scott_Harmon: that you know, I I think that the way I think about startups is that they’re. It’s a lot like an experimental, you know, science. And

173 00:13:59.290 00:14:03.370 Scott_Harmon: you want to run a lot of experiments. And then the experiments that work.

174 00:14:04.220 00:14:06.809 Scott_Harmon: you know, get more resources and

175 00:14:07.310 00:14:10.469 Scott_Harmon: more promise. And more people kind of go to that area.

176 00:14:10.570 00:14:16.339 Scott_Harmon: And so, you know, for the startups, that kind of prove out their concept.

177 00:14:17.250 00:14:23.430 Scott_Harmon: You know, a lot of times the term for is product market fit. You know. They demonstrate that they’ve got a unique niche or a unique.

178 00:14:24.190 00:14:26.530 Scott_Harmon: You know, something unique that the world wants.

179 00:14:27.110 00:14:31.040 Scott_Harmon: Then from that point into making them a you know, a

180 00:14:31.140 00:14:33.400 Scott_Harmon: a durable business. You know that.

181 00:14:34.490 00:14:36.399 Amber Lin: It’s very different, of.

182 00:14:36.400 00:14:38.229 Scott_Harmon: Is a very different set of skills. Yeah.

183 00:14:39.120 00:14:40.009 Scott_Harmon: But I like that.

184 00:14:40.010 00:14:42.710 Amber Lin: Grow and then hand off to someone to like. Just run the.

185 00:14:42.710 00:14:52.919 Scott_Harmon: Yeah, yeah, that’s kind of probably it. So generally, generally, that phase is kind of 0 to 100 million in revenue kind of that window would be where I’ve spent all my my time above

186 00:14:53.340 00:14:57.539 Scott_Harmon: anything above. That is a very different set of skills to run a very large.

187 00:14:57.540 00:14:57.960 Amber Lin: Yeah.

188 00:14:58.050 00:15:05.250 Scott_Harmon: New company. And it’s it’s just a lot about personnel and and you know.

189 00:15:05.580 00:15:17.550 Amber Lin: Yeah, I I feel, I mean, I’m very early in my career. This is my like, full first, st very full time job. I graduated in December. And so, like, I was thinking.

190 00:15:17.760 00:15:43.499 Amber Lin: the past few days or the past few months about, okay, what do I want to do in the career? I feel I really am like as you meant as you brought it up. Say it was Aha! Moment, say, oh, it’s a very different set of skills to scale from 0 to 100, and then 100 to like. Just keep going. I don’t think I like that like 100 after humongous company, systems are very set. I like to

191 00:15:43.750 00:15:58.470 Amber Lin: like, give the business different systems and help them scale. That’s essentially why I’m at Brainforge. And probably that’s why you’re helping with Brainforge, too. It’s just because they’re so early. And there’s so many things you can’t do.

192 00:15:59.020 00:16:01.339 Scott_Harmon: Yeah, I think there’s a lot of

193 00:16:02.310 00:16:06.750 Scott_Harmon: interesting, you know, when you’re discovering the potential and helping the company

194 00:16:06.910 00:16:10.100 Scott_Harmon: realize early potential, I think that’s a real

195 00:16:11.249 00:16:19.110 Scott_Harmon: you know, for me it’s a lot of it’s fun. It’s it’s a very dynamic sort of uncertain.

196 00:16:20.080 00:16:28.800 Scott_Harmon: you know, time. And so it’s just very interesting. I think it’s professionally for someone

197 00:16:29.270 00:16:33.610 Scott_Harmon: you know, like you, it’s or anybody, it’s riskier.

198 00:16:34.030 00:16:36.960 Scott_Harmon: you know, like, like, like those

199 00:16:37.470 00:16:42.969 Scott_Harmon: those ventures, you know don’t have. I mean, their odds of success really? Aren’t that? Aren’t that great?

200 00:16:43.320 00:16:43.830 Amber Lin: Yeah.

201 00:16:43.900 00:16:49.389 Scott_Harmon: But I think we got to a point a while ago, maybe 25 years ago, where.

202 00:16:49.930 00:16:55.290 Scott_Harmon: even if you work for a company, if it was a good company, and maybe it didn’t, you know, become.

203 00:16:55.970 00:16:58.800 Scott_Harmon: you know, massive. But you will learn so much.

204 00:17:00.110 00:17:04.630 Scott_Harmon: And that you would have so much that your your next opportunity would be

205 00:17:05.329 00:17:10.279 Scott_Harmon: wonderful, even if the company maybe didn’t do great because of everything you could learn

206 00:17:10.589 00:17:16.750 Scott_Harmon: by working, even if it was. You know, it’s a good idea, and like in it, one of the things I really like about America is

207 00:17:16.910 00:17:19.010 Scott_Harmon: you’re not really penalized.

208 00:17:19.660 00:17:25.650 Scott_Harmon: Even if the company maybe didn’t become Facebook as long as you worked at a good company did a good job.

209 00:17:25.880 00:17:26.980 Scott_Harmon: you know.

210 00:17:27.650 00:17:30.150 Amber Lin: How would other people know that though.

211 00:17:30.550 00:17:31.150 Scott_Harmon: All right.

212 00:17:31.420 00:17:34.669 Amber Lin: How would other people know that you did a good job because.

213 00:17:34.670 00:17:43.360 Scott_Harmon: Well, the way the way it works in practice is that the number one currency you can have and develop

214 00:17:44.249 00:17:52.780 Scott_Harmon: especially, you know, especially in early on in your career, is your reputational capital, and their reputation

215 00:17:53.450 00:17:56.270 Scott_Harmon: extends beyond your current. So like

216 00:17:56.630 00:18:00.579 Scott_Harmon: like the people that you know and the people you meet

217 00:18:00.910 00:18:08.389 Scott_Harmon: they’ll get called. Let’s say you go apply for a different job, you know. They’ll call 3 or 4 people that worked with you at Brainforge, and

218 00:18:08.920 00:18:09.930 Scott_Harmon: and

219 00:18:10.230 00:18:19.030 Scott_Harmon: somebody always knows some. Oh, I knew somebody that worked there, or whatever that has an outsized leverage on how on you getting your next job. So

220 00:18:19.150 00:18:22.780 Scott_Harmon: I got a lot of jobs, even if some didn’t work out.

221 00:18:23.050 00:18:27.279 Scott_Harmon: I worked with this super person, and they referred me to this job

222 00:18:27.500 00:18:33.120 Scott_Harmon: and and the best way to always get jobs is by reference to be referred for a role.

223 00:18:33.120 00:18:34.120 Amber Lin: Yeah, yeah.

224 00:18:34.120 00:18:37.320 Scott_Harmon: Right versus like applying for a role like like.

225 00:18:37.320 00:18:37.850 Amber Lin: You’re talking.

226 00:18:37.850 00:18:38.430 Scott_Harmon: Better.

227 00:18:38.430 00:18:44.049 Amber Lin: I applied for 3 months, and then this Robert reached out to me like I didn’t apply.

228 00:18:44.380 00:18:57.680 Scott_Harmon: Probably somebody you’ll meet at Brainforge will, you know, leave, or whatever, and they go, Oh, my gosh! I worked with this woman Amber, you’ve got to call her. So if you really, if you really look at successful careers, they have that pattern.

229 00:18:57.930 00:19:04.130 Scott_Harmon: They have people who the people they work with really, really love them and think really highly of them.

230 00:19:04.640 00:19:09.459 Scott_Harmon: Some of those people go on and get good jobs. It’s like seeds, you know, going, you know, kind of going out from plants.

231 00:19:09.460 00:19:10.530 Amber Lin: Oh!

232 00:19:10.530 00:19:15.400 Scott_Harmon: And then the 1st thing they do is go. Oh, you have to call Amber. She’s just fantastic and.

233 00:19:15.840 00:19:20.380 Amber Lin: By and large. The references people make are pretty good, like.

234 00:19:20.820 00:19:24.759 Scott_Harmon: So generally the top 50% of a company

235 00:19:25.090 00:19:28.269 Scott_Harmon: are the highest contributors, the bottom 50 not so much.

236 00:19:28.720 00:19:32.150 Amber Lin: And the top 50 almost always get.

237 00:19:32.510 00:19:35.169 Scott_Harmon: Referred to other organizations.

238 00:19:35.360 00:19:43.690 Scott_Harmon: And so, yeah, if you just look at the career of you know, people that do really well, that they have that they have that pattern.

239 00:19:43.800 00:19:46.030 Scott_Harmon: and consider.

240 00:19:46.030 00:19:52.980 Amber Lin: So important to hear like early on in my career. Thank you so much. I didn’t expect to get this.

241 00:19:52.980 00:19:59.130 Scott_Harmon: Yeah, 2 things you could do: work with really good people that you think are really good, like, Uttam is really good.

242 00:19:59.130 00:20:00.330 Amber Lin: I agree.

243 00:20:00.330 00:20:06.130 Scott_Harmon: And work on really good projects or products like that’s those are the 2 most important things you could do. And

244 00:20:06.350 00:20:11.309 Scott_Harmon: and if you don’t. If you don’t think really highly of the people, or

245 00:20:11.670 00:20:18.070 Scott_Harmon: even if it doesn’t work right, if that that you want to work with the top

246 00:20:18.400 00:20:22.119 Scott_Harmon: kind of 20% like it’s. It’s just a statistical game.

247 00:20:22.800 00:20:25.890 Amber Lin: And if you’re able to do that, then.

248 00:20:26.690 00:20:28.549 Scott_Harmon: There’s sort of this.

249 00:20:30.340 00:20:31.440 Amber Lin: Multiplying effect.

250 00:20:31.440 00:20:37.610 Scott_Harmon: Yes, yeah, yeah. Like it. It just you get pulled into that cohort and.

251 00:20:37.610 00:20:38.250 Amber Lin: Oh!

252 00:20:38.250 00:20:44.729 Scott_Harmon: You get pulled into it via relationship. So the biggest mistake you could make is just really working for kind of a

253 00:20:45.420 00:20:52.209 Scott_Harmon: you know. Hey? I needed the money, and you know I didn’t think they were that great, but it was a good opportunity. They gave me a rate like

254 00:20:52.480 00:20:56.070 Scott_Harmon: like the biggest mistake is working for a real average.

255 00:20:56.580 00:20:59.610 Scott_Harmon: or even below average team or

256 00:20:59.800 00:21:02.389 Scott_Harmon: or opportunity, even if it like.

257 00:21:02.990 00:21:07.829 Scott_Harmon: made sense financially. Or oh, I really wanted to move to Boston, so you know, it wasn’t a

258 00:21:08.300 00:21:13.749 Scott_Harmon: you know. I they there are a lot like bad reasons to take a job.

259 00:21:14.120 00:21:14.990 Amber Lin: Wow!

260 00:21:14.990 00:21:23.040 Scott_Harmon: And likewise really smart people, like really good like when I would hire people

261 00:21:23.380 00:21:26.080 Scott_Harmon: like. All I really cared about is where they worked

262 00:21:26.550 00:21:31.479 Scott_Harmon: and what their reputation is, and if it wasn’t a company I thought highly of, I just wouldn’t even enter like

263 00:21:32.290 00:21:34.610 Scott_Harmon: I wouldn’t even think about hiring them.

264 00:21:34.930 00:21:44.199 Amber Lin: So do you? Would you say it’s better to cause it’s always a debate of Do I want a bigger name company on my resume like. Does that matter to you when you.

265 00:21:44.200 00:21:49.330 Scott_Harmon: Not just the name. It’s kind of complicated amber. It’s the the fundamental.

266 00:21:50.240 00:21:52.749 Scott_Harmon: My fundamental belief about this is that

267 00:21:53.360 00:22:01.659 Scott_Harmon: pretty much everything you know in modern life is about the 80 20 rule. You know, there were 20%

268 00:22:02.250 00:22:07.330 Scott_Harmon: of the productivity come. 80% of the output comes from like 20%.

269 00:22:07.330 00:22:07.900 Amber Lin: Oh!

270 00:22:07.900 00:22:12.840 Scott_Harmon: No organization is uniform, you know. That’s true of sports teams. It’s true of

271 00:22:13.310 00:22:16.370 Scott_Harmon: militaries. It’s true of everything, and.

272 00:22:16.370 00:22:16.930 Amber Lin: Yeah.

273 00:22:17.330 00:22:20.200 Scott_Harmon: However, you do it, getting

274 00:22:20.500 00:22:27.860 Scott_Harmon: proximity to that top 20% building relationships. And sometimes they are the biggest name, you know, like.

275 00:22:28.630 00:22:30.190 Scott_Harmon: early on like.

276 00:22:31.020 00:22:37.630 Scott_Harmon: you know those. They’re just the names, everybody says. Oh, that’s the greatest company in the world, right? Not always, though. Right like.

277 00:22:37.850 00:22:38.460 Amber Lin: Oh!

278 00:22:38.610 00:22:40.960 Scott_Harmon: Like if you just do your homework

279 00:22:42.010 00:22:49.669 Scott_Harmon: on, like, Uttam or other people, and just get enough perspective, like, no, this, this is really a top-talent group of people.

280 00:22:49.820 00:22:58.630 Scott_Harmon: and he’s a really top talented leader, which I think he is. If you’re comfortable, that that’s true, right? Like I. But I think you have to be

281 00:22:59.090 00:23:02.530 Scott_Harmon: very discerning about like you have to be intentional.

282 00:23:02.890 00:23:03.890 Amber Lin: Yeah.

283 00:23:04.890 00:23:13.860 Scott_Harmon: Real choosy about who you’re gonna work like, who you will work with, like the the benefits of

284 00:23:14.310 00:23:16.759 Scott_Harmon: working with a top 20%

285 00:23:18.410 00:23:21.039 Scott_Harmon: team or people are so outsized

286 00:23:21.610 00:23:25.679 Scott_Harmon: that that’s probably the number. One thing I ever did in my career that helped it go well.

287 00:23:25.930 00:23:30.730 Scott_Harmon: was I was just able to work with super talented teams.

288 00:23:32.110 00:23:32.800 Amber Lin: Wow!

289 00:23:32.800 00:23:36.429 Scott_Harmon: And then they they just get to know you, and then they you become one of them. And

290 00:23:36.760 00:23:40.999 Scott_Harmon: it’s like a secret society, almost like like it’s it’s.

291 00:23:41.000 00:23:42.260 Amber Lin: So interesting.

292 00:23:42.260 00:23:44.879 Scott_Harmon: Yeah. So that’s that’s kind of the

293 00:23:45.490 00:23:49.750 Scott_Harmon: the pattern. What? What did? What did you graduate? Where’d you graduate from? And what did you study.

294 00:23:49.750 00:24:08.790 Amber Lin: I went to USC, so my school... so technically, I went to 3 schools. My program is a collaboration between 3 schools across 3 different continents. So there’s USC in LA, there’s HKUST, which is a really good school in Hong Kong

295 00:24:08.790 00:24:20.160 Amber Lin: in Asia as well. And then I went to Bocconi, so that’s in Milan, Italy. So I have... technically, I have 3 degrees, but they are all undergraduate business degrees.

296 00:24:20.550 00:24:26.570 Scott_Harmon: It was so. It’s kind of business, cross-disciplinary, but all business related.

297 00:24:26.950 00:24:38.270 Amber Lin: Yeah, business related. And mostly it’s about being the benefit of that is being in the country for a year, and then maybe having an internship in that country like building the network over there as well.

298 00:24:38.510 00:24:43.670 Amber Lin: So a lot of friends who now work in Europe and who work in Asia.

299 00:24:44.420 00:24:45.569 Scott_Harmon: Wow! What a name!

300 00:24:45.570 00:24:45.890 Amber Lin: Interesting.

301 00:24:45.890 00:24:46.890 Scott_Harmon: And opportunity.

302 00:24:47.432 00:24:53.600 Scott_Harmon: I know USC has a tremendous reputation. I don’t know the other schools so well. But

303 00:24:55.390 00:24:57.979 Scott_Harmon: yeah, I just think I think that

304 00:24:58.660 00:25:01.740 Scott_Harmon: just be real choosy about your opportunities.

305 00:25:02.560 00:25:09.940 Scott_Harmon: Even even if you find something that’s intellectually exciting. If you don’t think the people are good enough.

306 00:25:10.750 00:25:11.859 Amber Lin: Oh, wow!

307 00:25:11.860 00:25:15.100 Scott_Harmon: Don’t go there right like like

308 00:25:15.450 00:25:19.409 Scott_Harmon: it’s you. You really want to kind of figure out

309 00:25:20.790 00:25:23.720 Scott_Harmon: how you can be associated with the best teams.

310 00:25:24.710 00:25:31.990 Amber Lin: Oh, okay, that is, if I don’t remember anything that I will remember, says people above everything, the best people.

311 00:25:31.990 00:25:37.840 Scott_Harmon: It really is, and it’s just, you know, you probably know it when you see it, or if you ask enough questions, like

312 00:25:41.480 00:25:47.099 Scott_Harmon: like, I remember a friend of mine that I went on to co-found a company with him, but

313 00:25:47.370 00:25:54.759 Scott_Harmon: he’s now become a fairly prominent venture capitalist. He lives in Silicon Valley. He runs a big venture capital firm named Floodgate.

314 00:25:55.330 00:26:00.909 Scott_Harmon: and his name’s his name’s Mike Maples. He’s written a bunch of books. He’s pretty well known in the venture community.

315 00:26:01.260 00:26:04.589 Scott_Harmon: and he’s about 10 years younger than I am.

316 00:26:05.040 00:26:07.720 Scott_Harmon: and when he’s graduating from college

317 00:26:07.860 00:26:12.820 Scott_Harmon: he went to Harv, went to Harvard Business School. He’s graduating, and I ran.

318 00:26:13.230 00:26:19.270 Scott_Harmon: I ran marketing for that for that company that was that had been bought by Ibm.

319 00:26:19.870 00:26:26.959 Scott_Harmon: and he sent he sent me. I think he called me or sent me a letter, and he’s like, Look, I’ve done all my homework. There’s only 3 companies in the world

320 00:26:27.870 00:26:34.179 Scott_Harmon: that I want to work at, because everybody says they’re the best companies, and you’re one of them, and I’ll work there for free

321 00:26:34.810 00:26:40.360 Scott_Harmon: he goes. You never met me, he goes. I’ll work there for 6 months for free, which

322 00:26:40.530 00:26:43.149 Scott_Harmon: just a it’s a very provocative, you know. Kind of.

323 00:26:44.140 00:26:50.909 Scott_Harmon: Way to, you know... So I met him, and I thought he was super impressive. I hired him. He was great, and he didn’t have to work for free, but that was his

324 00:26:51.666 00:26:53.180 Scott_Harmon: good intro.

325 00:26:53.180 00:26:55.430 Scott_Harmon: He only targeted 3 companies.

326 00:26:55.430 00:26:56.110 Amber Lin: Hmm.

327 00:26:56.110 00:26:58.940 Scott_Harmon: Did a whole bunch of research, and he said, These are these are the

328 00:26:59.320 00:27:01.430 Scott_Harmon: the had, the best people in them.

329 00:27:01.560 00:27:05.810 Scott_Harmon: You know the most talent. I want to just go work there, and

330 00:27:06.630 00:27:11.999 Scott_Harmon: and you know one of them was salesforce.com, and you know one of them is Tidley. I forgot the third.

331 00:27:12.160 00:27:17.100 Scott_Harmon: But he, you know I made him an offer, and he came to work, came to work for us, did great, so

332 00:27:17.450 00:27:19.829 Scott_Harmon: that that was an example of

333 00:27:20.010 00:27:22.970 Scott_Harmon: you know how he was just sort of very intentional.

334 00:27:24.150 00:27:24.970 Amber Lin: Yeah.

335 00:27:24.970 00:27:26.600 Scott_Harmon: About his next opportunity.

336 00:27:27.950 00:27:55.630 Amber Lin: That’s amazing. Like, I already feel, I already feel that I made a very correct choice to work here, because my responsibilities have expanded so much. I was just a project manager, and now I’m helping them set the internal strategy of, like, okay, this is a service business, what do you need to do to do better customer service and client delivery? How do you improve employee satisfaction? How do you improve

337 00:27:55.630 00:28:13.640 Amber Lin: revenue and margins and financial performance? So, like, that is something that makes me really, really excited. So I already feel like I’ve grown a lot in this role. And I call Uttam every week, so I’ve already learned a lot from him. I just don’t know how, I don’t know how he knows so much at this age.

338 00:28:13.640 00:28:23.390 Scott_Harmon: He’s no, he’s great. He he’s, you know. I’ve only known him a short time, but was real impressed with him, and that’s why I decided to become advisor because he’s just a really impressive.

339 00:28:25.190 00:28:34.369 Scott_Harmon: really impressive background for, you know, a relatively young guy. And so, you know, what he’s trying to do is, I think, build a company in a very

340 00:28:35.140 00:28:36.040 Scott_Harmon: kind of specific.

341 00:28:36.040 00:28:36.410 Amber Lin: Modern.

342 00:28:36.410 00:28:39.619 Scott_Harmon: New way. Everybody, yeah. Modern is a great word, like.

343 00:28:40.260 00:28:45.480 Scott_Harmon: I’m sure, in business school you studied all kinds of different patterns for how to form and run companies.

344 00:28:45.800 00:28:49.220 Scott_Harmon: And you know, or different paradigms about how to think about them.

345 00:28:49.370 00:28:55.240 Scott_Harmon: And especially in technology, we have a set of patterns that everybody copies.

346 00:28:55.470 00:28:56.660 Amber Lin: Yeah, they’re good.

347 00:28:56.660 00:29:01.949 Scott_Harmon: Just templates about how to do a startup or whatever. So I think I think that he has some

348 00:29:03.350 00:29:10.410 Scott_Harmon: pretty pretty provocative and interesting newer ideas that kind of break some of those patterns.

349 00:29:11.240 00:29:13.609 Scott_Harmon: Which is, which is pretty neat, I think.

350 00:29:13.610 00:29:16.329 Amber Lin: I agree it was always talking about you.

351 00:29:16.670 00:29:17.635 Uttam Kumaran: Yeah.

352 00:29:19.170 00:29:20.159 Scott_Harmon: How are you, Buddy?

353 00:29:20.160 00:29:21.229 Amber Lin: Where you are.

354 00:29:21.230 00:29:22.480 Uttam Kumaran: Hey? I’m good.

355 00:29:22.760 00:29:28.949 Uttam Kumaran: Good this week is gonna be great. We have a lot like tons of stuff for like culminating this week. So.

356 00:29:29.210 00:29:29.830 Scott_Harmon: Okay.

357 00:29:29.830 00:29:35.560 Uttam Kumaran: Very, very exciting, although it’s rain. It’s raining here today. It’s kind of sad kind of gloomy, but.

358 00:29:36.410 00:29:40.340 Scott_Harmon: Dude. It hasn’t rained here in like a year. What are you talking about? I was driving.

359 00:29:40.720 00:29:44.859 Scott_Harmon: and it started to rain, and my windshield wipers came on. I’m like, what is this?

360 00:29:45.850 00:29:51.799 Scott_Harmon: I don’t remember driving in the rain like it. We’re in a big drought. Amber is like a multi-year drought.

361 00:29:52.380 00:29:53.410 Amber Lin: Wow!

362 00:29:53.410 00:29:57.129 Scott_Harmon: And so it’s just hasn’t rained much for like a year and a half.

363 00:29:57.510 00:30:00.590 Amber Lin: Are the crops surviving at all. If it’s a multi year drought.

364 00:30:00.590 00:30:01.980 Scott_Harmon: Not a big crop

365 00:30:02.470 00:30:10.210 Scott_Harmon: area. There’s there’s people don’t grow a lot of a lot. There’s some, but not much. Mostly it’s mostly like grazing land for cattle and stuff.

366 00:30:10.770 00:30:16.820 Scott_Harmon: And it’s been a little. It’s been a little tough. Yeah, I mean, what does grow? You kind of have to irrigate?

367 00:30:18.550 00:30:24.860 Scott_Harmon: You know, you have to use well water to irrigate the crops, but that’s not really so much in Central Texas. It’s mostly

368 00:30:25.980 00:30:27.070 Scott_Harmon: the plains.

369 00:30:27.400 00:30:29.989 Amber Lin: Interesting like that’s that’s my in

370 00:30:30.140 00:30:34.530 Amber Lin: bias and impression of Texas, just as just fields.

371 00:30:36.060 00:30:41.150 Scott_Harmon: It’s like anything else. Texas is like California, since it’s so big there’s like 5 different.

372 00:30:41.493 00:30:42.179 Amber Lin: You know.

373 00:30:42.180 00:30:48.429 Scott_Harmon: Unique climate and geographic like East Texas, where Houston is, is nothing like

374 00:30:48.980 00:30:53.490 Scott_Harmon: West Texas is nothing like the Rio Grande Valley, like they’re all just.

375 00:30:53.910 00:30:54.450 Amber Lin: Wow!

376 00:30:54.450 00:30:57.549 Uttam Kumaran: And then West Texas is like all desert, like very.

377 00:30:57.550 00:30:58.630 Amber Lin: Wow!

378 00:30:58.630 00:30:59.889 Uttam Kumaran: Uninhabited, like.

379 00:31:00.690 00:31:01.280 Scott_Harmon: What?

380 00:31:01.280 00:31:02.389 Amber Lin: Should I move there and.

381 00:31:02.680 00:31:05.299 Uttam Kumaran: Really like Wild West. Actually, it’s like all that.

382 00:31:05.590 00:31:16.439 Scott_Harmon: And nobody knows this, but there’s actually a fairly high mountain range in West Texas, but nobody goes. It’s called the Davis Mountains, and you know they have an Alpine, snow-covered.

383 00:31:16.640 00:31:17.720 Uttam Kumaran: Oh, wow!

384 00:31:17.720 00:31:21.129 Scott_Harmon: Nobody ever goes to, you know, a really, really neat.

385 00:31:22.270 00:31:26.469 Scott_Harmon: really neat trip. You should take it some time, Uttam, if you want to do a long weekend. It’s like a

386 00:31:27.600 00:31:34.149 Scott_Harmon: it’s probably 7 h drive. There’s an observatory out there, called the Mcdonald Observatory up in the Davis Mountains.

387 00:31:34.520 00:31:37.689 Scott_Harmon: and you can go out there. There’s all these like cool.

388 00:31:37.850 00:31:44.960 Scott_Harmon: just remote. But there are these motels you can stay in, and then you get to go up into the observatory, and they like teach this

389 00:31:45.320 00:31:48.820 Scott_Harmon: night stargazing class through this telescope thing.

390 00:31:50.980 00:31:59.560 Scott_Harmon: I think it’s the second-highest telescope in the US. But anyway, it’s so far away from any ambient light, there’s like, no, there’s just no light. So anyway, it’s this.

391 00:31:59.560 00:32:00.560 Amber Lin: Oh, so people!

392 00:32:00.560 00:32:01.000 Amber Lin: Cool!

393 00:32:01.000 00:32:05.139 Scott_Harmon: We’ll go there from all over the world to go. Do this this

394 00:32:05.880 00:32:10.159 Scott_Harmon: experience in the Mcdonald Observatory, and then you can make a weekend out of it.

395 00:32:10.500 00:32:12.729 Scott_Harmon: My sister used to do it every year.

396 00:32:13.730 00:32:15.520 Uttam Kumaran: Still haven’t gotten to Big Bend.

397 00:32:16.290 00:32:17.529 Scott_Harmon: Big Bend’s a trip.

398 00:32:17.840 00:32:23.258 Uttam Kumaran: Yeah, I’m I’m I don’t mind driving that distance. Actually, it’s just

399 00:32:24.560 00:32:26.934 Uttam Kumaran: you. We just have to commit.

400 00:32:27.330 00:32:29.509 Scott_Harmon: Well like, I said. It’s like a 3 day weekend kind of thing.

401 00:32:29.510 00:32:30.350 Uttam Kumaran: Yeah.

402 00:32:30.350 00:32:33.176 Scott_Harmon: Okay, like driving half of it, like,

403 00:32:34.820 00:32:41.909 Scott_Harmon: But but okay, California, like, there’s all this stuff in California that people like

404 00:32:43.060 00:32:47.339 Scott_Harmon: like most people in California have never been north of San Francisco.

405 00:32:47.810 00:32:49.690 Amber Lin: It’s sad, so sad.

406 00:32:49.910 00:32:53.399 Scott_Harmon: And I know you know you’re from there. So you obviously have. But

407 00:32:53.540 00:32:58.790 Scott_Harmon: it’s surprising how many people think San Francisco. Or maybe Napa is like.

408 00:32:59.380 00:33:02.099 Scott_Harmon: That’s the northern edge of the State.

409 00:33:02.100 00:33:05.940 Uttam Kumaran: Yeah, it’s it’s it’s really unfortunate, I mean. But

410 00:33:06.200 00:33:18.850 Uttam Kumaran: that’s the beauty of California is like, basically like from there. It’s like all preserved land like we will. We’ll go camping, I mean, in June. We’re gonna go camping somewhere in like Mendocino, and we’re.

411 00:33:18.850 00:33:19.290 Scott_Harmon: Really.

412 00:33:19.290 00:33:23.990 Uttam Kumaran: We’ve we’ve yeah. We’ve camped in Eureka, like all through there.

413 00:33:23.990 00:33:27.849 Scott_Harmon: It’s it’s unbelievable, but it’s also just so rural. And like you think it is.

414 00:33:27.850 00:33:28.830 Uttam Kumaran: Very rural.

415 00:33:28.830 00:33:31.120 Scott_Harmon: Half the time you think you’re in Alabama, you know, it’s

416 00:33:31.120 00:33:35.099 Scott_Harmon: yeah. Yeah. It’s like a dotted doesn’t feel like California.

417 00:33:35.100 00:33:39.179 Scott_Harmon: You know. You you go up to Lake Shasta and all that stuff, and like

418 00:33:39.500 00:33:45.869 Scott_Harmon: almost nobody’s ever been there that I know, you know, even like people live in the Bay Area like, what are you talking about like.

419 00:33:45.870 00:33:49.810 Uttam Kumaran: No, yeah, cause it’s 8 h to get to Eureka. It’s 8 h.

420 00:33:49.810 00:33:50.339 Amber Lin: That’s oh!

421 00:33:50.340 00:33:50.840 Uttam Kumaran: From the Bay.

422 00:33:50.840 00:33:53.694 Amber Lin: Yeah, it extends so far up.

423 00:33:54.170 00:33:58.319 Uttam Kumaran: Oh, yeah, it’s like, almost like feels like Maine feels like rural Maine up there.

424 00:33:58.320 00:34:02.580 Scott_Harmon: Weird. The weird thing is like San Francisco is like halfway up California.

425 00:34:02.580 00:34:04.419 Amber Lin: Yeah. It’s in the middle.

426 00:34:04.420 00:34:09.280 Scott_Harmon: It’s called Northern California, right? Like it’s like not. It’s not really Northern California. It’s Central California.

427 00:34:09.280 00:34:12.480 Uttam Kumaran: Fort Bragg is very, very nice, like.

428 00:34:12.800 00:34:14.610 Uttam Kumaran: And then, yeah, of course, Redwood.

429 00:34:14.949 00:34:17.590 Uttam Kumaran: my dad goes to the Redwoods like all the time.

430 00:34:18.340 00:34:19.290 Scott_Harmon: It’s crazy up there.

431 00:34:19.290 00:34:19.650 Uttam Kumaran: Can.

432 00:34:20.000 00:34:24.889 Scott_Harmon: Yeah, it’s just like, I don’t even can you fly in like, I know? Obviously you can. But like.

433 00:34:25.610 00:34:31.740 Uttam Kumaran: I don’t know. I think like you could. I think you have to go to Sacramento like I don’t know what else is higher.

434 00:34:32.219 00:34:36.989 Scott_Harmon: Yeah, it’s almost like better to fly into Portland and drive down, or something.

435 00:34:36.989 00:34:38.869 Uttam Kumaran: Yeah, yeah, yeah.

436 00:34:39.679 00:34:40.559 Scott_Harmon: Well, anyway.

437 00:34:41.051 00:34:47.839 Amber Lin: Anyways, I got so sidetracked such. It’s an interesting conversation always.

438 00:34:48.783 00:34:53.589 Scott_Harmon: But I guess we wanted to talk a little bit about the design approach.

439 00:34:53.590 00:34:55.829 Amber Lin: Yes, about the trainer. Bot!

440 00:34:55.830 00:34:57.679 Scott_Harmon: Trainer bot right right right.

441 00:34:57.680 00:35:07.980 Amber Lin: Yes, so we already have some ideas for that, essentially from all our conversations and what we discussed in the team. So essentially it has

442 00:35:08.370 00:35:33.390 Amber Lin: 2 aspects, right? One is just a simple editor to help make sure that everything is formatted correctly. But the second part, the more important part, is making sure that all the content is there. So that’s asking the required questions and having the right user process when the bot interfaces with a trainer, to make sure that all the necessary information is collected.

443 00:35:33.440 00:35:51.009 Amber Lin: So that’s our idea of how this trainer bot should be structured. And then in the future, maybe it can just directly make the updates into the documents, which we’re trying to figure out now. But immediately we want to have those 2 features that we mentioned.
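
A minimal sketch, in Python, of the two aspects Amber describes: a light formatting pass, plus a required-information check that generates the follow-up questions the bot would ask a trainer. The field names and rules are invented for illustration and are not the team's actual spec.

```python
# Hypothetical sketch of the trainer bot's two aspects:
# (1) a simple formatting/editor pass, (2) a check that all required content
# is present, asking follow-up questions until it is. Field names are made up.

REQUIRED_FIELDS = ["title", "audience", "steps", "owner"]  # assumed, not from the doc

def format_pass(text: str) -> str:
    """Aspect 1: light formatting cleanup (trim trailing spaces, drop blank lines)."""
    lines = [line.rstrip() for line in text.splitlines()]
    return "\n".join(line for line in lines if line) + "\n"

def missing_questions(entry: dict) -> list[str]:
    """Aspect 2: one follow-up question per required field that is still empty."""
    missing = [f for f in REQUIRED_FIELDS if not entry.get(f)]
    return [f"Can you provide the '{name}' for this entry?" for name in missing]

draft = {"title": "Refund escalation", "steps": format_pass("1. Open the billing tab...  \n\n")}
print(missing_questions(draft))  # asks for 'audience' and 'owner' before accepting
```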

444 00:35:51.300 00:35:57.809 Scott_Harmon: Yeah, I think that makes a lot of sense. Part of my question or observation, maybe, Uttam, it’s a little bit of a

445 00:35:58.120 00:36:04.810 Scott_Harmon: databasey kind of question that you might be able to help me with. So you know

446 00:36:05.050 00:36:09.530 Scott_Harmon: databases, there’s all this idea of a schema or a data structure.

447 00:36:09.830 00:36:16.100 Scott_Harmon: And one of the one of the differences between working with, you know, structured data

448 00:36:16.490 00:36:25.550 Scott_Harmon: in a database is that you? You know you have a schema versus unstructured data in a document where you where you don’t. And

449 00:36:26.130 00:36:29.790 Scott_Harmon: my thought had been that you know the

450 00:36:30.170 00:36:32.449 Scott_Harmon: and a lot of times, you know.

451 00:36:32.670 00:36:36.359 Scott_Harmon: you can discover the structure and unstructured data. So

452 00:36:36.950 00:36:39.940 Scott_Harmon: one of the things I wanted to do I just haven’t had time.

453 00:36:40.630 00:36:48.700 Scott_Harmon: is, and I’ve seen some experiments like this, where you’ll give it a big, unstructured document,

454 00:36:48.890 00:36:52.350 Scott_Harmon: and you’ll tell an AI. You’ll give it the right prompt, and you say.

455 00:36:52.660 00:36:55.670 Scott_Harmon: discover the structure in this document for me.

456 00:36:55.670 00:36:56.070 Amber Lin: Oh!

457 00:36:56.070 00:37:02.300 Scott_Harmon: And create like it, create for me a Json file, or or, you know, a schema.

458 00:37:02.710 00:37:03.720 Scott_Harmon: And

459 00:37:03.880 00:37:13.599 Scott_Harmon: there, there’s a bunch of techniques that I’ve been reading about, like, you can just read a whole bunch of unstructured documents, or tell the AI to do it, and it’ll create... They’ll say, well, like, look,

460 00:37:13.930 00:37:18.960 Scott_Harmon: kind of implied in this document are the following 7 primary types. And those

461 00:37:19.390 00:37:25.479 Scott_Harmon: those types have the following attributes. And so it’s like it’ll like, basically discover the schema. So.
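
A rough sketch of the schema-discovery step Scott describes: hand a model the unstructured master document plus a prompt asking it to infer the implied entry types and their attributes as JSON. The prompt wording and the call_llm stub are assumptions for illustration, not the prompt from the article he mentions later.

```python
# Sketch of schema discovery over an unstructured document, as described above.
# call_llm is a placeholder; a real implementation would call whatever model the
# team uses and parse its JSON reply.

import json

SCHEMA_DISCOVERY_PROMPT = """\
You will be given an unstructured knowledge document.
Identify the fundamental types of entries it contains (for example, procedures
or service descriptions) and, for each type, list its attributes.
Respond with JSON only: {"types": [{"name": "...", "attributes": ["..."]}]}.
"""

def call_llm(prompt: str, document: str) -> str:
    # Placeholder: returns a canned response so the sketch runs on its own.
    return json.dumps({
        "types": [
            {"name": "how_to", "attributes": ["title", "steps", "applies_to"]},
            {"name": "service_description", "attributes": ["service", "summary", "owner"]},
        ]
    })

master_doc = "...the master document text would go here..."
discovered = json.loads(call_llm(SCHEMA_DISCOVERY_PROMPT, master_doc))
for t in discovered["types"]:
    print(t["name"], "->", t["attributes"])
```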

462 00:37:25.760 00:37:26.400 Amber Lin: Oh!

463 00:37:26.400 00:37:28.870 Scott_Harmon: The thinking my thinking had been.

464 00:37:29.360 00:37:35.280 Scott_Harmon: we’ve got this document. I forgot the document. The master document, whatever it’s called, and

465 00:37:36.050 00:37:38.159 Scott_Harmon: my hunch is. But I haven’t

466 00:37:39.000 00:37:44.319 Scott_Harmon: proved it, and is that there’s probably 4, 5, or 6 different fundamental types of

467 00:37:44.540 00:37:54.960 Scott_Harmon: of knowledge in in that document. There’s probably a how to one type might be a how to like a procedure. To do something, another would be.

468 00:37:55.120 00:37:59.460 Scott_Harmon: would be a service description. Okay, this is the description of a

469 00:37:59.590 00:38:02.750 Scott_Harmon: of a past, you know, of a particular type of service like

470 00:38:03.310 00:38:11.809 Scott_Harmon: my hunch is in the document, and it might even be the different sections actually like I cause. I know the document has sections, and that each type of

471 00:38:12.940 00:38:16.029 Scott_Harmon: you know. So so, for example, it could be the case

472 00:38:16.180 00:38:21.360 Scott_Harmon: that what we really are dealing with in this, in this master knowledge document are,

473 00:38:21.810 00:38:32.270 Scott_Harmon: let’s say, 6 or 7 different fundamental types of knowledge. Right? And so if that’s the case.

474 00:38:32.500 00:38:35.479 Scott_Harmon: And I think it actually probably is.

475 00:38:35.820 00:38:46.509 Scott_Harmon: What you can then do is you make the trainer agent aware of those types? Right? So so instead of just saying, I want to go edit and put some information into this file.

476 00:38:47.340 00:38:51.920 Scott_Harmon: If if the trainer bot is aware of the

477 00:38:52.240 00:38:57.129 Scott_Harmon: again, I’m making this up. But let’s say you’ve got 6 or 7 different types of knowledge

478 00:38:57.570 00:39:01.430 Scott_Harmon: that you can create. Well, if the trainer Bot, knows those types.

479 00:39:03.120 00:39:10.389 Scott_Harmon: Then you say, Hey, Trainerbot, I want to create some new knowledge. And the 1st question the trainerbot would ask you is, well, okay, what type? Because we have 6 or 7 different types of

480 00:39:10.780 00:39:20.100 Scott_Harmon: of, of help content or knowledge that CSRs need. And you could even imagine it, we might even list them out for you, like, do you want to create this or that or the other thing?

481 00:39:20.210 00:39:22.609 Scott_Harmon: And then you go? Oh, yeah, no. I want to create a

482 00:39:23.180 00:39:30.830 Scott_Harmon: a procedure. Or maybe, maybe it’s a new how-to. We have these “oh, by the ways,” right? So “oh, by the ways” are a type of content.

483 00:39:31.490 00:39:33.860 Scott_Harmon: I want to create a new “oh, by the way.” Okay, great.

484 00:39:34.040 00:39:42.119 Scott_Harmon: Now, the knowledge Bot knows because it understands the different fields and attributes

485 00:39:42.490 00:39:47.880 Scott_Harmon: that that type of knowledge involves it, can it can say great, you know.

486 00:39:48.170 00:39:50.809 Scott_Harmon: And again, just, I’m just

487 00:39:51.340 00:39:56.019 Scott_Harmon: imagining this right? You could say, Look, you know, just write a blurb, and I’ll

488 00:39:56.430 00:40:06.930 Scott_Harmon: I’ll take it and help you refine it into the not like I will. You could just start writing, and then it would come back and go. That’s great. But I need to know 3 more things, you know, before I have

489 00:40:08.690 00:40:13.539 Scott_Harmon: before I have everything I need to, where where I can go, publish it. And

490 00:40:13.720 00:40:16.050 Scott_Harmon: have you ever have you ever been to like a.

491 00:40:16.530 00:40:21.930 Scott_Harmon: you know, like you go to a government office and you want to apply for something.

492 00:40:22.320 00:40:27.709 Scott_Harmon: and the Government office goes. That’s great. But I need, I mean, 2 more pieces of information.

493 00:40:29.090 00:40:33.129 Scott_Harmon: They have some checklist, and you know I need your social security number and your

494 00:40:33.450 00:40:35.140 Scott_Harmon: you know you’re like you’re like.

495 00:40:35.270 00:40:38.589 Scott_Harmon: I don’t have that, you know, and they’re like, well, we need it, you know. We we need it.

496 00:40:38.590 00:40:39.710 Amber Lin: I see.

497 00:40:39.710 00:40:43.149 Scott_Harmon: And the reason they need it is because they have a database schema, and they

498 00:40:43.450 00:40:46.830 Scott_Harmon: they they don’t want to publish incomplete information.

499 00:40:46.830 00:40:51.790 Amber Lin: Yeah. And honestly having that schema also helps the trainer bot know what to ask.

500 00:40:52.520 00:41:00.010 Scott_Harmon: Yeah, exactly. So now the trainer bot’s job, the prompting that you build for it, Uttam, is to

501 00:41:00.350 00:41:04.110 Scott_Harmon: make sure you have a complete. The schema is complete

502 00:41:04.790 00:41:12.179 Scott_Harmon: for this type of knowledge, and again, you can do it conversationally like that becomes sort of, I think, easy, but

503 00:41:13.100 00:41:17.300 Scott_Harmon: it won’t publish an incomplete

504 00:41:17.640 00:41:31.439 Scott_Harmon: bit of knowledge. And the reason I think that’s really important is that if you look at knowledge systems for customer support organizations just historically, they tend to have incomplete knowledge like one of the big problems with knowledge bases is they’re incomplete.
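
A minimal sketch of the completeness gate being described here, the “government office checklist”: the bot knows which attributes each knowledge type requires, keeps asking until they are all filled in, and refuses to publish an incomplete entry. The types and attribute names below are invented for illustration.

```python
# Sketch of the completeness gate: never publish a knowledge entry whose
# required attributes are missing. Types and attributes are illustrative only.

KNOWLEDGE_TYPES = {
    "how_to": ["title", "steps", "applies_to"],
    "service_description": ["service", "summary", "owner"],
    "oh_by_the_way": ["title", "note", "expires"],
}

def next_question(entry_type: str, collected: dict) -> str | None:
    """Return the next follow-up question, or None when the entry is complete."""
    for attr in KNOWLEDGE_TYPES[entry_type]:
        if not collected.get(attr):
            return f"I still need the '{attr}' before I can publish this."
    return None

def publish(entry_type: str, collected: dict) -> bool:
    """The gate: refuse to publish while anything required is still missing."""
    if next_question(entry_type, collected) is not None:
        return False
    print(f"Published {entry_type}: {collected}")
    return True

draft = {"title": "Billing promo code changed", "note": "Use code SPRING25 now"}
print(next_question("oh_by_the_way", draft))  # asks for 'expires'
print(publish("oh_by_the_way", draft))        # False until the entry is complete
```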

505 00:41:31.980 00:41:32.310 Amber Lin: Oh no!

506 00:41:32.310 00:41:36.889 Scott_Harmon: And so someone there’ll be an article or a bit of knowledge about this thing.

507 00:41:37.040 00:41:42.659 Scott_Harmon: and because you don’t have all the necessary attributes, the knowledge isn’t usable.

508 00:41:44.280 00:41:49.159 Scott_Harmon: And so that’s a that’s a big, inherent flaw in the design of knowledge systems. So.

509 00:41:49.500 00:41:54.299 Uttam Kumaran: But we do have, Amber, that doc we put together about the

510 00:41:54.610 00:41:57.410 Uttam Kumaran: what is it like? A style guide for the.

511 00:41:57.410 00:42:00.400 Amber Lin: Yes, formatting the guidelines. Yeah. And we have.

512 00:42:00.400 00:42:05.100 Uttam Kumaran: Can we pull that up? Because that is probably where this should get included?

513 00:42:05.100 00:42:12.590 Scott_Harmon: I bet if you just, like, I read this article the other day. I don’t know if I sent it to you, Uttam. Have you worked with knowledge graphs yet?

514 00:42:14.050 00:42:29.090 Uttam Kumaran: no, but I I you know I read. There’s like this framework for sort of like documentation that we try to use internally here. Maybe I’ll send it to you. You’ll find it interesting. I just sent it in the Zoom chat. It’s called Diataxis.

515 00:42:30.100 00:42:34.480 Uttam Kumaran: Like we’re not like super strict on this mainly just cause. It’s like

516 00:42:34.980 00:42:37.789 Uttam Kumaran: we don’t need to right now, but like this is sort of.

517 00:42:38.080 00:42:42.060 Uttam Kumaran: I got sent this a while ago. And I’ve it’s sort of helped me think about

518 00:42:42.170 00:42:50.820 Uttam Kumaran: how we do documentation at scale and how we basically organize documentation in different categories.

519 00:42:51.210 00:42:54.449 Uttam Kumaran: This may help us in this use case.
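
For reference, Diataxis organizes documentation into four categories: tutorials, how-to guides, reference, and explanation. A trivial Python sketch of tagging entries against those quadrants; the keyword heuristic is purely illustrative, not how the team actually classifies anything.

    DIATAXIS = ["tutorial", "how-to guide", "reference", "explanation"]

    def rough_category(title: str) -> str:
        """Very naive guess at which Diataxis quadrant a doc title belongs to."""
        t = title.lower()
        if t.startswith("how to"):
            return "how-to guide"
        if "getting started" in t or "walkthrough" in t:
            return "tutorial"
        if "why" in t or "overview" in t:
            return "explanation"
        return "reference"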

520 00:42:54.450 00:42:55.190 Amber Lin: Hmm.

521 00:42:58.230 00:43:04.090 Scott_Harmon: Yeah, let me just let me just pull something up real quick. I was reading it. I should have sent it to you. So where’s my

522 00:43:05.350 00:43:07.560 Scott_Harmon: oh, God! I got the wrong browser!

523 00:43:09.220 00:43:11.630 Scott_Harmon: I gotta have the janky apple browser.

524 00:43:12.120 00:43:12.770 Amber Lin: Hmm.

525 00:43:12.970 00:43:15.459 Scott_Harmon: Never even used the apple browser. Talk on it.

526 00:43:18.700 00:43:23.280 Scott_Harmon: This is just it was just a long, prompt. Let’s see if I can find it here.

527 00:43:26.000 00:43:32.190 Scott_Harmon: That they wanted to build a not here. Here it is. Okay. So I’m gonna

528 00:43:32.510 00:43:35.990 Scott_Harmon: send this to you. I’m gonna put it in the chat. Actually.

529 00:43:36.430 00:43:37.200 Amber Lin: Okay.

530 00:43:37.200 00:43:45.889 Scott_Harmon: And if you pull it up, Uttam, I’m sure you could probably, you know, read this and parse it in, like, a minute. This is sort of down your alley. But

531 00:43:47.210 00:43:51.330 Scott_Harmon: let’s see, where’s my chat? Come on. Okay, here we go.

532 00:43:51.740 00:44:00.360 Scott_Harmon: So if you click on that, you will.

533 00:44:01.460 00:44:08.080 Scott_Harmon: and if you just scroll down, you’ll actually see the prompt

534 00:44:08.340 00:44:12.870 Scott_Harmon: text they use. It’s like in. It’s about halfway down the post.

535 00:44:13.880 00:44:20.140 Scott_Harmon: and it just says you’re an expert knowledge graph schema designer with deep experience. And this is this one’s for healthcare.

536 00:44:20.870 00:44:22.170 Uttam Kumaran: And nice.

537 00:44:22.170 00:44:27.280 Scott_Harmon: And it’s just. It’s a very long prompt, and it says, you’re a schema designer. I want you to read this.

538 00:44:27.580 00:44:30.359 Scott_Harmon: These documents that have a bunch of healthcare data

539 00:44:30.940 00:44:37.720 Scott_Harmon: again. And I want you to create these data. I want you to document the inherent data structure.

540 00:44:37.920 00:44:44.790 Scott_Harmon: And then, as you can see, if you scroll down a little farther, it actually creates the knowledge graph, and the knowledge graph shows all the relationships

541 00:44:45.100 00:44:50.610 Scott_Harmon: that are in that in those documents, and then you throw like 5 documents at it, and it generates the

542 00:44:51.180 00:44:53.900 Scott_Harmon: JSON. And then you have to write a little PyTorch and poof.

543 00:44:53.900 00:44:54.410 Uttam Kumaran: Yeah.

544 00:44:54.410 00:45:01.620 Scott_Harmon: Like this. I thought it was like amazing, like the prompts are so cool like it just said, figure it out like.
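
A rough paraphrase of the schema-designer pattern Scott is describing, not the blog post’s actual prompt: hand the model a few source documents and ask for the inherent structure back as JSON (entity types, attributes, relationships). The exact JSON shape and wording here are assumptions.

    import json

    SCHEMA_DESIGNER_PROMPT = """\
    You are an expert knowledge graph schema designer.
    Read the documents below and document the inherent data structure you find.
    Return JSON with two keys:
      "entities": a list of entity types, each with its attributes,
      "relationships": a list of [source_type, relation, target_type] triples.

    Documents:
    {documents}
    """

    def build_schema_prompt(documents: list) -> str:
        """Join a handful of source documents into one schema-discovery prompt."""
        return SCHEMA_DESIGNER_PROMPT.format(documents="\n\n---\n\n".join(documents))

    def parse_schema(model_output: str) -> dict:
        """Parse the model's JSON reply for downstream use (graph viz, validation, etc.)."""
        return json.loads(model_output)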

545 00:45:01.620 00:45:02.300 Uttam Kumaran: Yeah.

546 00:45:02.440 00:45:10.960 Scott_Harmon: And so I guess I’m not a technical expert, obviously. But my hunch is

547 00:45:11.110 00:45:16.219 Scott_Harmon: if we can design the trainer bot

548 00:45:16.990 00:45:21.230 Scott_Harmon: if we can give it awareness of the

549 00:45:21.960 00:45:25.489 Scott_Harmon: data structure of the knowledge we want to collect,

550 00:45:26.470 00:45:29.350 Scott_Harmon: it’ll be much more valuable.

551 00:45:30.242 00:45:35.789 Scott_Harmon: Both for ABC and then any other clients you deploy it with. I think you’ll have just, like, a

552 00:45:36.670 00:45:40.020 Scott_Harmon: like. I think that’ll be a pattern you’ll want to repeat over and over again.

553 00:45:41.390 00:45:44.799 Scott_Harmon: And and now at ABC you’ll have a closed loop

554 00:45:45.150 00:45:47.839 Scott_Harmon: because the experts like Janice, and

555 00:45:48.090 00:45:52.390 Scott_Harmon: when they publish something they’ll publish complete data because the bot will make them

556 00:45:53.510 00:46:01.030 Scott_Harmon: The thing we don’t want, and I don’t mean to be impolite, but Janice is not, like, a knowledge engineer.

557 00:46:01.030 00:46:01.920 Uttam Kumaran: Yeah, yeah.

558 00:46:02.070 00:46:05.079 Scott_Harmon: She’s really smart about like pest control.

559 00:46:05.780 00:46:11.239 Scott_Harmon: But her job is not to sort of be a data scientist or understand

560 00:46:11.850 00:46:16.759 Scott_Harmon: knowledge schema. So whatever she types in is going to be incomplete. And now

561 00:46:17.250 00:46:25.250 Scott_Harmon: you’re introducing a problem. If you don’t let the trainer bot make it complete,

562 00:46:25.620 00:46:26.170 Amber Lin: Yeah.

563 00:46:26.170 00:46:29.880 Scott_Harmon: The knowledge gets dirty over time and becomes less and less useful.

564 00:46:30.360 00:46:31.380 Scott_Harmon: And

565 00:46:31.630 00:46:38.809 Scott_Harmon: the next thing you know, you’ve got a big knowledge base with a bunch of crap in it, and the efficacy of the knowledge base goes

566 00:46:39.110 00:46:41.729 Scott_Harmon: down over time, Uttam. It doesn’t go.

567 00:46:42.920 00:46:48.690 Uttam Kumaran: So I think there’s 2 things. One is just the formatting. The second, you’re right, is like

568 00:46:48.900 00:46:52.419 Uttam Kumaran: actually just forcing them to answer all the questions.

569 00:46:52.620 00:46:56.600 Uttam Kumaran: So you almost have to like, categorize what they’re trying to add. And then

570 00:46:56.850 00:47:04.450 Uttam Kumaran: basically, we just have a prompt that, like, verifies. Okay, this is an

571 00:47:05.010 00:47:12.220 Uttam Kumaran: explanation, or this is adding some context, so you need to go through these 4 or 5 additional steps.

572 00:47:12.330 00:47:17.490 Uttam Kumaran: So one thing, Amber, I think to even start is we should take the current document

573 00:47:20.510 00:47:22.609 Amber Lin: Run it through AI and get a structure.

574 00:47:22.820 00:47:34.439 Uttam Kumaran: Yeah, but almost do a couple of things. So you should run the document and and have an AI figure out if there are things that exist there already that are not fully formed.

575 00:47:34.630 00:47:39.739 Uttam Kumaran: Cause then those should be the 1st to tag.

576 00:47:40.360 00:47:46.600 Scott_Harmon: That’s exactly right. Start with what’s in the document. So it’s almost like an audit where you take the current document.

577 00:47:46.600 00:47:47.390 Amber Lin: Oh!

578 00:47:47.540 00:47:58.749 Scott_Harmon: Right. You’re gonna have to iterate with the prompting a little bit. You know, you’re a data analyst, you know, and the 1st thing you’ll do is identify places in this document with incomplete knowledge.

579 00:47:59.430 00:48:06.209 Scott_Harmon: You know, where we’re missing data or attributes. And then the 1st goal would be to fix that.

580 00:48:06.880 00:48:15.769 Scott_Harmon: And you could just do that manually and go back to Denise or Yvette and go, hey, our AI thinks some of our existing stuff is incomplete, you know.
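
A sketch of the audit step being proposed here: split the existing document into sections and ask a model which required attributes each section is missing. The blank-line sectioning and the prompt wording are assumptions, not the team’s actual setup.

    AUDIT_PROMPT = """\
    You are auditing an internal knowledge document for a customer support team.
    For the section below, list which of these required attributes are missing or
    unclear: {required}. Reply with a JSON list of missing attribute names; an
    empty list means the section is complete.

    Section:
    {section}
    """

    def split_sections(doc_text: str) -> list:
        """Very rough sectioning: treat blank-line-separated blocks as sections."""
        return [block.strip() for block in doc_text.split("\n\n") if block.strip()]

    def build_audit_prompts(doc_text: str, required: list) -> list:
        """One audit prompt per section; send each to whatever model you're using."""
        return [
            AUDIT_PROMPT.format(required=", ".join(required), section=section)
            for section in split_sections(doc_text)
        ]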

581 00:48:16.210 00:48:17.210 Amber Lin: Yeah. Totally.

582 00:48:17.510 00:48:23.079 Scott_Harmon: And then to your point, Uttam, now going forward,

583 00:48:24.140 00:48:28.809 Scott_Harmon: you could just let anybody add new. It’s almost like a new instance of the type.

584 00:48:30.900 00:48:34.649 Uttam Kumaran: And I would ask AI to help you generate the following

585 00:48:34.790 00:48:41.109 Uttam Kumaran: thing, which is, I mean, give it all the context, like what our objective is. But then we need it to basically

586 00:48:41.530 00:48:42.680 Uttam Kumaran: generate.

587 00:48:44.630 00:48:46.290 Uttam Kumaran: The types of

588 00:48:47.410 00:48:57.000 Uttam Kumaran: the types of changes that can get added and what constitutes like a full change, or like a change with all the necessary information.

589 00:48:57.330 00:49:03.749 Uttam Kumaran: And then you can basically have it create the prompt:

590 00:49:03.870 00:49:11.079 Uttam Kumaran: you’ve just received this change, verify that it hits XYZ. If it misses one of those points, please

591 00:49:11.190 00:49:15.320 Uttam Kumaran: re, like, basically ask the user again to

592 00:49:15.970 00:49:21.430 Uttam Kumaran: fill out whatever’s been missed. So the 1st goal is to just make sure that everything in the doc

593 00:49:21.630 00:49:23.410 Uttam Kumaran: is accurate.

594 00:49:23.620 00:49:26.429 Uttam Kumaran: Use that filled out Doc with

595 00:49:26.610 00:49:32.179 Uttam Kumaran: all the things we know are full to then have AI create the

596 00:49:33.080 00:49:42.060 Uttam Kumaran: that back and forth, which is like: I wanna update a mosquito policy. Great, it seems like your change falls into here, I need these pieces of information.
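
A sketch of the back-and-forth Uttam describes: classify the change, check it against the fields that change type requires, and keep asking until nothing is missing. The change types and field names are placeholders, and a real trainer bot would ask conversationally rather than via input().

    CHANGE_TYPES = {
        "policy_update": ["policy_name", "what_changed", "effective_date", "approver"],
        "new_how_to": ["title", "steps", "applies_to"],
    }

    def collect_change(change_type: str, answers: dict) -> dict:
        """Keep asking the author follow-up questions until nothing required is missing."""
        required = CHANGE_TYPES[change_type]
        while True:
            missing = [name for name in required if not answers.get(name)]
            if not missing:
                return answers  # complete; ready to hand off for approval and publishing
            for name in missing:
                # In the real bot this would be a conversational follow-up question.
                answers[name] = input(f"I still need '{name}': ").strip()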

597 00:49:42.540 00:49:46.980 Uttam Kumaran: and that’s the blocker. And then maybe even yeah.

598 00:49:46.980 00:49:52.820 Scott_Harmon: Can you send me, Amber, just a PDF of the current, what do we call the thing, the knowledge document?

599 00:49:52.820 00:49:53.960 Amber Lin: Google, Doc, yeah.

600 00:49:53.960 00:49:54.820 Uttam Kumaran: The central doc.

601 00:49:54.980 00:50:02.890 Scott_Harmon: And, by the way, I don’t have the same tooling expertise you guys do. But I’ll play around with it a little myself, like

602 00:50:03.190 00:50:05.859 Scott_Harmon: like, I think, just that very 1st sort of

603 00:50:06.510 00:50:09.110 Scott_Harmon: discover the structure in this document, and.

604 00:50:09.110 00:50:11.090 Uttam Kumaran: Yeah, have it do the discovery first.

605 00:50:11.090 00:50:20.899 Scott_Harmon: Yeah. And again, you’re more of a data scientist, obviously. And, like, I don’t know whether you ask it for JSON or, like, I think you could be really explicit with

606 00:50:20.900 00:50:21.220 Uttam Kumaran: Yeah.

607 00:50:21.450 00:50:27.699 Scott_Harmon: what it produces. And then, you know, maybe you, as a data scientist, could go, yeah, that looks pretty thorough, I like that.

608 00:50:27.910 00:50:33.940 Scott_Harmon: You know, if I was gonna write a program with a database, I could do it, and

609 00:50:34.540 00:50:39.089 Scott_Harmon: that should be the 1st goal. And then, if to your point if there’s gaps in it like

610 00:50:39.760 00:50:50.000 Scott_Harmon: not all the policies have all the information, you know. Like, there’s a mosquito policy here, but then there’s a termite policy, and it doesn’t have all the right stuff.

611 00:50:50.500 00:50:51.520 Scott_Harmon: And

612 00:50:52.290 00:50:57.849 Scott_Harmon: so step one is great, we can just tell Janice, or whatever,

613 00:50:57.850 00:50:58.670 Amber Lin: Yeah.

614 00:50:58.670 00:50:59.100 Scott_Harmon: Hey!

615 00:50:59.100 00:51:03.150 Amber Lin: Step one’s gonna take quite a while. All of the updates take

616 00:51:03.544 00:51:11.200 Amber Lin: a lot of the stuff they need to verify internally. They need to escalate to make sure, hey, is this a singular policy that we want across the company.

617 00:51:11.200 00:51:11.740 Scott_Harmon: Well, you.

618 00:51:11.830 00:51:13.039 Amber Lin: Bit of time. There.

619 00:51:13.040 00:51:18.230 Scott_Harmon: You just hit on a you just hit on a key issue which

620 00:51:18.830 00:51:25.400 Scott_Harmon: we’re gonna have to phase this, right? Like, as you figure out how to phase this in, Uttam,

621 00:51:25.400 00:51:25.960 Uttam Kumaran: Yeah.

622 00:51:25.960 00:51:33.569 Scott_Harmon: It gets. This gets into the whole middle of the workflow around approving new knowledge. And

623 00:51:34.350 00:51:36.879 Scott_Harmon: in reality, that’s kind of complicated, like.

624 00:51:37.160 00:51:45.610 Scott_Harmon: okay, I think I’ve got it right. Who else needs to sign off on that? You know. That needs to go to Steve because he’s a division director like

625 00:51:45.970 00:51:53.079 Scott_Harmon: Janice probably can’t just publish new knowledge without someone’s approval.

626 00:51:54.660 00:51:58.499 Scott_Harmon: Like, 1st the AI has to tell her, yeah, this looks good to me, like

627 00:51:59.160 00:52:03.669 Scott_Harmon: this looks like a complete pest policy, you know, or whatever the thing is.

628 00:52:04.270 00:52:08.619 Scott_Harmon: But you’re probably eventually going to have to say.

629 00:52:09.110 00:52:14.610 Scott_Harmon: But someone else needs to sign off on that before I can publish it back to the master data set.

630 00:52:17.360 00:52:17.900 Amber Lin: Yeah.

631 00:52:18.580 00:52:25.039 Uttam Kumaran: So another thing, Amber, is, even if they like it, basically have the AI flag what’s right or what’s wrong.

632 00:52:25.590 00:52:30.930 Uttam Kumaran: And then my suggestion would be, of course, tell them like, Hey, these things need more info.

633 00:52:31.170 00:52:39.829 Uttam Kumaran: Go fill it out. But just ditch those, keep what’s right, and then have the AI process that for creating the next set of prompts.

634 00:52:41.980 00:52:43.310 Uttam Kumaran: Do you see what I’m saying?

635 00:52:44.606 00:53:06.799 Amber Lin: That is very dependent on evals for the AI to know what’s right and wrong, because you’re saying that we let AI flag what’s right and wrong, then use what AI thought was right to create more things. But the problem is that if the AI wasn’t right in the 1st place, it ruins the whole entire process afterwards.

636 00:53:07.860 00:53:12.390 Uttam Kumaran: No, I think very simply. You can just take that document. You throw it into

637 00:53:12.790 00:53:14.469 Uttam Kumaran: chat. And you just say

638 00:53:15.580 00:53:19.800 Uttam Kumaran: this, you just basically just walk through our entire goal of like this part of the project.

639 00:53:20.260 00:53:25.069 Uttam Kumaran: And then just say, the 1st goal is to make sure our existing data set is full.

640 00:53:25.340 00:53:27.170 Uttam Kumaran: is full. And and

641 00:53:27.460 00:53:33.770 Uttam Kumaran: and then you could basically ask it to tell you which pieces are good enough versus bad enough.

642 00:53:34.060 00:53:43.600 Uttam Kumaran: like good versus bad. Take the bad stuff out, and then you can say, now that I have a rich, full document, please break down how I would,

643 00:53:43.760 00:53:50.960 Uttam Kumaran: if if our goal is to create a prompt that discusses with the user and verifies that their input information is full.

644 00:53:51.170 00:53:55.339 Uttam Kumaran: create me that prompt. So it doesn’t create it based on, like, the bad stuff.

645 00:53:55.830 00:54:08.970 Amber Lin: Yeah, I get your. I get your process and I understand it. I think my only question is that I just wanna make sure that when we take the bad stuff out, we’re actually taking out like the right parts.

646 00:54:09.530 00:54:10.389 Uttam Kumaran: Yeah, so that’s.

647 00:54:10.390 00:54:24.920 Amber Lin: I was wondering if we should involve Janice and Yvette, or at least someone that’s very knowledgeable, to tell us, hey, is this complete? But the problem is that they are humans, and they might not think of everything in a single instance.

648 00:54:24.920 00:54:26.389 Uttam Kumaran: I see what you mean. I see what you mean.

649 00:54:26.390 00:54:28.360 Amber Lin: How do we make sure it is right?

650 00:54:29.080 00:54:29.790 Uttam Kumaran: I mean.

651 00:54:30.000 00:54:39.829 Uttam Kumaran: yeah, whatever response AI gives you on, like, what’s complete, you can send to them. And you can say, hey, we’re creating a rubric on what is complete.

652 00:54:39.830 00:54:41.590 Amber Lin: Okay, that that’s.

653 00:54:41.590 00:54:44.799 Uttam Kumaran: We flagged these as complete, and these as not complete.

654 00:54:45.400 00:54:55.849 Uttam Kumaran: Does that sound right? Can you get approval? If there’s anything else you would add, what would it be? So then, you’re right, we basically have a golden data set of, like, what a complete

655 00:54:56.430 00:55:03.329 Uttam Kumaran: piece of documentation is. Consider the knowledge graph, consider this, like, Diataxis. There’s a lot of these.
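
A tiny sketch of the rubric and golden-set idea: the AI’s completeness judgments go to the subject-matter experts for sign-off, and only confirmed entries land in the golden set used to build the next prompts. The entry shape and flag names are hypothetical.

    def build_golden_set(entries: list) -> list:
        """Keep only entries the AI flagged complete AND an SME has signed off on."""
        return [e for e in entries if e.get("ai_complete") and e.get("sme_approved")]

    # Hypothetical usage: entries come back from the audit step with ai_complete set,
    # then Janice/Yvette toggle sme_approved during review.
    golden = build_golden_set([
        {"title": "Mosquito policy", "ai_complete": True, "sme_approved": True},
        {"title": "Termite policy", "ai_complete": False, "sme_approved": False},
    ])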

656 00:55:03.716 00:55:16.169 Uttam Kumaran: This is not, like, what I spend my time thinking about, but, like, go with one of these frameworks, one of these knowledge engineering frameworks, or ask AI to provide you with a couple. I would rather root this in

657 00:55:16.280 00:55:18.288 Uttam Kumaran: some framework that exists versus.

658 00:55:18.690 00:55:19.850 Amber Lin: I would like that.

659 00:55:19.850 00:55:23.849 Uttam Kumaran: Invent one. I feel like there’s gotta be like, oh, there’s there’s a bunch.

660 00:55:23.850 00:55:31.570 Uttam Kumaran: I’m glad you said that. I think there’s a lot of people doing this. I’m sure there’s open source frameworks and approaches. This is just some random blog post I found.

661 00:55:31.570 00:55:33.740 Scott_Harmon: Yeah, but I bet you there are.

662 00:55:34.480 00:55:39.029 Uttam Kumaran: There’s gotta be something, like an old Microsoft thing on, like, how to do technical docs or.

663 00:55:39.030 00:55:51.990 Scott_Harmon: Once you get into knowledge graphs, Microsoft seems to be a real leader, and they publish all kinds of papers, and they’ve got all kinds of open source projects and stuff. I don’t know if knowledge graphs are the right

664 00:55:53.100 00:56:02.779 Scott_Harmon: thing here, right? Because, again, this is just not my field. I I think they’re really cool because they’re visual. And I I’m a visual learner so like, Oh, my gosh! Here’s a graph of all these

665 00:56:02.940 00:56:06.459 Scott_Harmon: relationships, of how everything relates together

666 00:56:06.660 00:56:10.410 Scott_Harmon: that that may be overkill for what we’re doing like.

667 00:56:10.890 00:56:12.890 Scott_Harmon: There may be something simpler.

668 00:56:13.160 00:56:18.280 Amber Lin: Yeah, maybe the 1st step would just be to have, like, table columns, that’s it.

669 00:56:18.280 00:56:24.960 Scott_Harmon: Yeah, I I just, I bet you, once you dig around a little, you’ll find projects, or or

670 00:56:25.320 00:56:29.780 Scott_Harmon: you know, mostly just open source activities that that do exactly this.

671 00:56:30.000 00:56:32.540 Amber Lin: Yeah, yeah, that are extracted.

672 00:56:33.630 00:56:37.390 Scott_Harmon: Extracting and imposing structure on unstructured data.

673 00:56:38.520 00:56:39.180 Scott_Harmon: Gotcha.

674 00:56:39.906 00:56:50.309 Amber Lin: Awesome. Do you think we should get one of our data engineers on this project? Because for Miguel and Casey, this might take them a long time to figure out.

675 00:56:50.310 00:57:02.850 Uttam Kumaran: No, I don’t think so. I think we have what we need to do next, like, I don’t think we need to involve more people. I think the biggest thing is, like, I don’t want the AI to come up with some novel structure. I just think there’s some

676 00:57:03.040 00:57:09.099 Uttam Kumaran: principled framework that exists already about how to organize these types of like

677 00:57:09.390 00:57:14.900 Uttam Kumaran: SOPs or whatever. Maybe it’s even going on Perplexity or something, being like, give me the top 5 of these.

678 00:57:14.900 00:57:15.560 Scott_Harmon: That’s awesome.

679 00:57:15.560 00:57:16.350 Scott_Harmon: Maybe we should pick.

680 00:57:16.350 00:57:23.069 Uttam Kumaran: One, because some other company has probably made this their whole business like inventing these. And I’d rather

681 00:57:23.190 00:57:26.629 Uttam Kumaran: I’d rather just use AI to apply that versus.

682 00:57:26.630 00:57:27.230 Amber Lin: Yes.

683 00:57:27.230 00:57:36.290 Uttam Kumaran: We’re gonna end up trying to figure out what they figured out. Diataxis is a version that works. So if we don’t find anything

684 00:57:36.600 00:57:39.320 Uttam Kumaran: like I would vote for just using that.

685 00:57:39.820 00:57:46.469 Amber Lin: Can we find one that’s specific to this customer service framework that we use?

686 00:57:46.820 00:57:47.420 Scott_Harmon: It?

687 00:57:48.210 00:58:01.210 Scott_Harmon: Good question. I think I think this problem is probably very common in customer service, because customer service uses lots of unstructured data knowledge, you know, for answering questions. So I know it’s common.

688 00:58:01.450 00:58:05.970 Scott_Harmon: But you may actually find it in other areas as well. Like, I think it’s.

689 00:58:06.220 00:58:10.659 Scott_Harmon: there’s lots of areas where there’s lots of just written documents that are unstructured.

690 00:58:10.660 00:58:18.799 Amber Lin: Yeah, especially now that we think of expanding, like, even maybe to HR, but then we have a different structure of.

691 00:58:18.800 00:58:19.189 Scott_Harmon: Yeah. The.

692 00:58:19.190 00:58:19.640 Uttam Kumaran: Yeah, I don’t.

693 00:58:19.720 00:58:23.970 Scott_Harmon: The reason I want to double click on this is, I think if you can figure out this

694 00:58:24.670 00:58:29.719 Scott_Harmon: tool or framework, or whatever you like you’ll find yourself using it over and over again in other domains.

695 00:58:29.720 00:58:30.260 Uttam Kumaran: Yeah.

696 00:58:32.650 00:58:39.019 Scott_Harmon: You know, let me just give you a random example. In the mental behavioral health

697 00:58:39.200 00:58:42.180 Scott_Harmon: thing we were doing with Televiro a few weeks ago, Uttam,

698 00:58:42.680 00:58:47.420 Scott_Harmon: you’ll remember that one of the applications is documenting the

699 00:58:48.190 00:58:52.009 Scott_Harmon: the physicians sessions like when they have these sessions.

700 00:58:52.220 00:58:58.110 Scott_Harmon: therapy sessions. They got to make all these notes. Well, they’re just making random notes, you know. They’re they’re not following a schema.

701 00:58:58.110 00:58:58.880 Uttam Kumaran: Yeah.

702 00:58:58.880 00:59:03.509 Scott_Harmon: They’re just going, 1st, you know, Tom said he got bit by a dog when he was a kid, and then,

703 00:59:03.670 00:59:06.840 Scott_Harmon: you know, then he was mad about, you know, whatever they’re just making notes.

704 00:59:07.110 00:59:18.460 Scott_Harmon: and they’re not really following a structure. But what you want is to really do interesting things with that data, to understand the structure and then ultimately impose some structure. So

705 00:59:18.900 00:59:23.570 Scott_Harmon: what you ultimately want when that therapist writes their notes

706 00:59:23.870 00:59:27.310 Scott_Harmon: is the AI assistant to help them create a structured

707 00:59:28.881 00:59:33.630 Scott_Harmon: set of notes that that that capture all of the therapeutic

708 00:59:34.460 00:59:37.709 Scott_Harmon: parameters that are important to this, that, or the other

709 00:59:38.050 00:59:40.409 Scott_Harmon: sort of diagnostic thing they’re working on.

710 00:59:40.560 00:59:45.839 Scott_Harmon: So it’s the same. But my point is, it’s the same pattern. You might want to read 5,000

711 00:59:46.390 00:59:51.750 Scott_Harmon: notes from 5,000 earlier sessions, and say, Find me the common patterns

712 00:59:51.860 00:59:57.820 Scott_Harmon: and the structure in good notes, you know, in the really good notes.

713 00:59:57.820 01:00:00.990 Uttam Kumaran: Well, that’s what a human would do, like those Microsoft guys.

714 01:00:01.110 01:00:06.409 Uttam Kumaran: They went and probably studied a bunch, and then they were like it boils down to like these kind of things.

715 01:00:06.410 01:00:08.800 Scott_Harmon: But real good ones have the following, you know.

716 01:00:08.800 01:00:09.360 Uttam Kumaran: Yeah.

717 01:00:09.360 01:00:18.330 Scott_Harmon: 7 things. They’d tell you this and that and the other thing. Well, now you’ve got your structure. And then, once you’ve trained the AI to be fluent in the structure,

718 01:00:18.720 01:00:26.289 Scott_Harmon: Now it can become a note taking assistant and say, Oh, Amber, that’s great. Did the patient mention this? Oh, yeah, I forgot. They mentioned that, you know.

719 01:00:26.400 01:00:28.549 Scott_Harmon: And and now you’re

720 01:00:29.350 01:00:36.450 Scott_Harmon: because the AI understands the structure, it can add more value in the whole journey.
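
A sketch of the “read 5,000 earlier notes and find the common structure” step Scott describes, assuming each note has already been tagged with the topics it covers and given a quality rating; both of those are stand-ins, not a real data set.

    from collections import Counter

    def common_structure(notes: list, top_n: int = 7) -> list:
        """Return the topics that show up most often in the notes rated 'good'."""
        counts = Counter()
        for note in notes:
            if note.get("rating") == "good":
                counts.update(note.get("topics", []))
        return [topic for topic, _ in counts.most_common(top_n)]

    # The resulting topic list becomes the checklist the note-taking assistant uses
    # to ask "did the patient mention X?" for anything a new note leaves out.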

721 01:00:39.180 01:00:44.740 Scott_Harmon: I will, for my part, I’ll just play around with different prompts to see what I can extract from it, but.

722 01:00:44.740 01:00:45.510 Amber Lin: Hey!

723 01:00:45.510 01:00:48.939 Scott_Harmon: Amber, I’m sure you could find a lot more interesting things than I can.

724 01:00:49.470 01:01:05.109 Amber Lin: This is very interesting, and this is because we also have a lot of internal documentation. How to do this, how to do that, how do you use this tool, how do you structure the meetings? They’re so random, and we have so many documents flying around in Notion. And once we figure this out, like

725 01:01:05.450 01:01:05.990 Amber Lin: this.

726 01:01:05.990 01:01:12.170 Scott_Harmon: You’re right, could be super useful internally, boy, this document’s gotten really big. 200 and.

727 01:01:12.170 01:01:14.700 Amber Lin: Oh, it is! It is massive.

728 01:01:15.070 01:01:17.090 Scott_Harmon: Wow, that’s so cool. Okay.

729 01:01:17.090 01:01:22.550 Scott_Harmon: yeah. So that’s also something that I think we should use. We should do one pass with AI to even just like.

730 01:01:22.890 01:01:32.820 Uttam Kumaran: Basically redo that or like remove, you know, just remove fluff, or whatever

731 01:01:33.627 01:01:45.240 Uttam Kumaran: You know, basically what you can do is, once we have the guidance prompt, you can then run the document back through it and say, like, make sure every section fits our new guidelines.

732 01:01:45.630 01:01:50.459 Scott_Harmon: Right. You could just reformat that, almost condense and

733 01:01:50.460 01:01:50.930 Uttam Kumaran: Yes.

734 01:01:50.930 01:01:59.819 Scott_Harmon: optimize the current document to conform to the structure that you’ve discovered.
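
A sketch of the clean-up pass being discussed: once the guidance/style prompt exists, run every section of the current document back through the model and ask it to conform. The prompt wording is illustrative only, and `split_sections` from the audit sketch above is reused.

    REWRITE_PROMPT = """\
    Here are our formatting guidelines for knowledge sections:
    {guidelines}

    Rewrite the section below so it fits those guidelines. Keep every fact,
    condense the wording, and do not invent new content.

    Section:
    {section}
    """

    def build_rewrite_prompts(sections: list, guidelines: str) -> list:
        """One rewrite prompt per section of the current document."""
        return [REWRITE_PROMPT.format(guidelines=guidelines, section=s) for s in sections]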

735 01:02:00.470 01:02:05.769 Scott_Harmon: And yeah, I think you’ll end up with a more readable document as well, because I’m just paging through it.

736 01:02:05.990 01:02:08.570 Scott_Harmon: It’s interesting, you know. It’s it’s it’s a.

737 01:02:09.460 01:02:12.120 Scott_Harmon: It still looks a little stream of consciousness.

738 01:02:12.340 01:02:20.090 Scott_Harmon: you know, which is obviously how it came out like that’s how it was. It’s just, wow. Then we talk about this for a while, and we talk about that for a while.

739 01:02:26.390 01:02:27.220 Scott_Harmon: huh!

740 01:02:27.670 01:02:31.830 Scott_Harmon: Like even the section titles are sort of.

741 01:02:32.810 01:02:38.750 Amber Lin: There’s not really a structure to this. I think that will really benefit them too.

742 01:02:38.950 01:02:43.369 Scott_Harmon: Right. The structure certainly doesn’t jump out. To me it looks kind of like random, but.

743 01:02:43.550 01:02:44.480 Amber Lin: Yeah.

744 01:02:44.740 01:02:46.999 Scott_Harmon: It’s like they just started talking.

745 01:02:47.000 01:03:00.420 Amber Lin: I know people mostly just Ctrl+F, like, they just find a key phrase or keyword. There’s not really a directory, like, okay, this goes under that, and you go there. So there’s no real steps. People mostly just search.

746 01:03:00.880 01:03:01.980 Scott_Harmon: Right, right.

747 01:03:01.980 01:03:02.510 Amber Lin: Yeah.

748 01:03:03.190 01:03:15.530 Amber Lin: Okay, this is very interesting. I’ll look into it and let our engineers know. I’ll take a few 1st passes on that as well, and then I’ll update you, or we can talk about it in the meeting, too.

749 01:03:15.860 01:03:16.660 Scott_Harmon: Okay.

750 01:03:16.660 01:03:17.370 Amber Lin: Yeah.

751 01:03:17.740 01:03:19.679 Scott_Harmon: Well, great thanks for getting us together and

752 01:03:19.810 01:03:21.869 Scott_Harmon: digging into it a little bit. I think it’s a

753 01:03:21.990 01:03:28.179 Scott_Harmon: like I said, I think it’ll help with ABC. But then, hopefully, maybe it’ll be an area of expertise for the firm that.

754 01:03:28.900 01:03:34.189 Uttam Kumaran: Yeah, like, whenever we’re gonna do big context docs like this, like, how do we structure them.

755 01:03:34.790 01:03:40.230 Amber Lin: Yeah, I mean, Amber, we should do one for us. We should do one for us next.

756 01:03:40.230 01:03:41.240 Amber Lin: we were saying, Yeah.

757 01:03:41.240 01:03:46.330 Uttam Kumaran: You know, like we, we have all of our procedure. I mean, the nice thing is, we have

758 01:03:46.490 01:03:51.480 Uttam Kumaran: everything written, at least in Notion, and it’s all, like, in some format.

759 01:03:51.830 01:03:57.459 Uttam Kumaran: So you could basically have AI, probably, like, o1, take the 1st pass

760 01:03:57.810 01:04:03.510 Uttam Kumaran: at one large doc for Brainforge, given, like, everything that’s in Notion.

761 01:04:03.850 01:04:09.330 Uttam Kumaran: perhaps. Or you can send it slack, could send it everything. Actually, you could just throw everything in basically.

762 01:04:09.330 01:04:14.900 Scott_Harmon: It’s stunning how good it can be if you write a great prompt, right? Like, if you run.

763 01:04:14.900 01:04:19.150 Uttam Kumaran: Also, that’s sort of the that’s kind of like. The tough part, too, is.

764 01:04:19.150 01:04:19.620 Scott_Harmon: Yeah.

765 01:04:20.211 01:04:26.130 Uttam Kumaran: It’s really hard to, like, engineer outcomes. It’s not deterministic.

766 01:04:26.280 01:04:36.259 Uttam Kumaran: So you’re like, oh, if I put in, like, “please,” do I get 5% better? I don’t like that I can’t really compute that sort of stuff.

767 01:04:36.790 01:04:45.890 Uttam Kumaran: The nice thing is, we’re just figuring out even basic stuff, but eventually, like, ChatGPT, they should figure out how to boost every single prompt

768 01:04:46.530 01:04:57.339 Uttam Kumaran: instead of you having to remember to say please, or, like, XML format. Like, you know, we have a prompt library internally where we saved, like, top prompts for copywriting, for

769 01:04:57.790 01:04:59.829 Uttam Kumaran: fails for but I’m like

770 01:04:59.980 01:05:06.390 Uttam Kumaran: it’s just so fickle like. Is this really the magic? Right now? It just seems like I don’t know.

771 01:05:06.390 01:05:08.630 Scott_Harmon: Prompt engineering is the new coding right?

772 01:05:08.630 01:05:09.250 Uttam Kumaran: Yeah.

773 01:05:10.050 01:05:16.415 Scott_Harmon: Okay, well, this has been super productive, Amber. Thanks for getting us together. I hope it.

774 01:05:17.480 01:05:26.629 Amber Lin: Yeah, I didn’t even think about it before you brought it up. So this is saving us a lot of, like, missed.

775 01:05:26.630 01:05:32.859 Scott_Harmon: This is scar tissue from years ago, when I did this call center re-engineering project. And

776 01:05:33.460 01:05:39.000 Scott_Harmon: we did a knowledge base, and it took, like, a year to do.

777 01:05:39.130 01:05:41.069 Scott_Harmon: But at the end it kind of

778 01:05:41.530 01:05:46.830 Scott_Harmon: didn’t go well because of this issue like we weren’t disciplined enough about the structure of the knowledge. And

779 01:05:47.200 01:05:49.339 Scott_Harmon: people could just start typing stuff in.

780 01:05:50.037 01:05:55.919 Scott_Harmon: Oh, we’ll just use search to find it. But the problem was that the

781 01:05:56.670 01:06:01.900 Scott_Harmon: the, the document you’d get back when you search the knowledge base

782 01:06:02.560 01:06:08.320 Scott_Harmon: wouldn’t be complete. In other words, it wouldn’t really be able to solve my problem because it didn’t have all the right

783 01:06:08.640 01:06:10.450 Scott_Harmon: attributes like.

784 01:06:11.216 01:06:18.349 Scott_Harmon: I’ll just give you a tiny example. One of our customers was Dell, like, you know, the computer company.

785 01:06:18.670 01:06:23.990 Scott_Harmon: And we built their whole knowledge base for their CSRs to be able to answer questions about,

786 01:06:24.560 01:06:28.479 Scott_Harmon: You know, when people would call the call centers and their computer, their laptop didn’t work.

787 01:06:29.350 01:06:33.819 Scott_Harmon: And and someone had, you know, they call. And they say, Oh, my.

788 01:06:35.670 01:06:49.650 Scott_Harmon: my video screen looks weird or something. Just think of some random, you know, problem that someone calls about, like only half my video screen works or something like that. And the CSR would search the knowledge base, and it would find an article about

789 01:06:50.340 01:07:08.680 Scott_Harmon: something, you know, a common problem with video screens. But because it didn’t include the right additional data, like, this applies to people with this laptop and this version of Microsoft software, they didn’t know if this applied to that client or not, because the knowledge.

790 01:07:08.680 01:07:10.560 Amber Lin: It was not helpful at all.

791 01:07:10.560 01:07:17.350 Scott_Harmon: Yeah. And and so they’re like, does this apply? Then they just said, It’s not helpful, like, I don’t know. It’s just not complete, like.

792 01:07:17.350 01:07:18.040 Amber Lin: Huh!

793 01:07:18.040 01:07:20.880 Scott_Harmon: There’s not enough knowledge, you know. It’s not thorough enough

794 01:07:21.080 01:07:27.300 Scott_Harmon: for me to know if this applies to this client or not, and so pretty soon all the knowledge articles were worthless, and so.

795 01:07:27.300 01:07:27.980 Amber Lin: Wow!

796 01:07:27.980 01:07:36.499 Scott_Harmon: Like I said, it’s just scar tissue. So I think we have a chance to mitigate all that with structure.

797 01:07:36.650 01:07:39.570 Amber Lin: Yes, and to heal old wounds.

798 01:07:39.570 01:07:40.580 Scott_Harmon: Yeah, thank, you, yeah.

799 01:07:40.580 01:07:41.969 Amber Lin: Well, it is time.

800 01:07:41.970 01:07:44.380 Amber Lin: Yeah, yeah. Heal my old wounds. Okay?

801 01:07:44.380 01:07:47.664 Amber Lin: And that’s why people have expectations for their children.

802 01:07:48.030 01:07:51.329 Scott_Harmon: That’s right. Our children will be better than we are.

803 01:07:51.632 01:07:56.769 Amber Lin: Thank you so much, Scott. Don’t wanna take a lot more of your time, I already talked to you a lot.

804 01:07:56.990 01:07:58.420 Scott_Harmon: Thank you. Amber cheers.

805 01:07:58.420 01:07:59.290 Amber Lin: Hey? You!

806 01:07:59.330 01:08:00.060 Scott_Harmon: Bye-bye.

807 01:08:00.340 01:08:01.180 Amber Lin: Bye-bye.