Meeting Title: Medical-Industry-Prospects Date: 2024-09-26 Meeting participants: Abigail Zhao, Chang Ho, Joshuadeveyra, Uttam Kumaran


WEBVTT

1 00:03:59.600 00:04:00.480 Uttam Kumaran: Hey, Miguel.

2 00:04:02.790 00:04:03.463 joshuadeveyra: Hello! Hello!

3 00:04:04.320 00:04:04.900 Uttam Kumaran: 8,

4 00:04:06.260 00:04:07.610 Uttam Kumaran: I think.

5 00:04:08.400 00:04:09.519 joshuadeveyra: Oh, yeah. Sure. Go ahead.

6 00:04:09.960 00:04:10.852 Uttam Kumaran: No, I think

7 00:04:12.010 00:04:13.016 Uttam Kumaran: we might.

8 00:04:13.890 00:04:18.860 Uttam Kumaran: I’m gonna stay video off for a sec. I’m just finishing lunch because I’ve been on meetings since 9 am. But

9 00:04:19.545 00:04:20.110 Uttam Kumaran: okay.

10 00:04:24.990 00:04:27.330 joshuadeveyra: 70 stars. Okay, sure. Sure.

11 00:04:30.652 00:04:33.309 joshuadeveyra: This one that Patrick did

12 00:04:34.680 00:04:38.219 joshuadeveyra: about scraping, that was like my plan

13 00:04:38.330 00:04:48.640 joshuadeveyra: originally, but more of like a scaled down version. So I was like planning to once we start building like the Snowflake agent, because I think that’s also gonna be part of the

14 00:04:49.020 00:04:50.430 joshuadeveyra: GitHub tool, right?

15 00:04:50.790 00:04:51.260 Uttam Kumaran: Exactly.

16 00:04:51.260 00:04:59.120 joshuadeveyra: So I was thinking, you know, we only need to scrape the documentation for the details that we need. But good thing. He already scraped everything.

17 00:05:00.470 00:05:05.829 Uttam Kumaran: Yeah, I mean, 3 gig, 5 gig is not bad. I mean, probably

18 00:05:06.010 00:05:09.280 Uttam Kumaran: probably the stuff that’s relevant to us is like a portion of that.

19 00:05:09.820 00:05:10.670 joshuadeveyra: Yeah, yeah.

20 00:05:12.810 00:05:13.950 joshuadeveyra: yeah.

21 00:05:30.830 00:05:33.210 joshuadeveyra: Is Patrick, like, our CTO?

22 00:05:35.392 00:05:36.617 Uttam Kumaran: No, but

23 00:05:37.370 00:05:41.100 Uttam Kumaran: I don’t know. Patrick is just another engineer on the team.

24 00:05:41.160 00:05:46.670 Uttam Kumaran: But Patrick, right now, is basically gonna start owning, like all of like internal engineering, mostly.

25 00:05:46.670 00:05:47.669 joshuadeveyra: Oh, I see! I see!

26 00:05:48.670 00:05:54.859 Uttam Kumaran: like a lot of stuff around the developer experience and things. And he also helps with hiring. Just one of the early employees.

27 00:05:56.000 00:05:57.399 joshuadeveyra: Oh, okay, okay.

28 00:05:59.940 00:06:02.789 joshuadeveyra: We added a lot of people now, hey, Abigail?

29 00:06:03.320 00:06:03.900 Uttam Kumaran: Hey!

30 00:06:03.900 00:06:04.600 Abigail Zhao: Hi.

31 00:06:07.590 00:06:09.689 Uttam Kumaran: Let me see

32 00:06:11.280 00:06:12.730 Uttam Kumaran: checking. It’s trying

33 00:06:46.490 00:06:50.219 Uttam Kumaran: Nice. See, Miguel? Everybody, once they start playing around with it,

34 00:06:50.450 00:06:51.860 Uttam Kumaran: they get addicted.

35 00:06:52.870 00:06:57.240 joshuadeveyra: Kind of surprising, though. Like, what is it? What was his name? Brian? Was it Brian Pay?

36 00:06:57.700 00:06:58.380 Uttam Kumaran: Yeah.

37 00:06:58.960 00:07:00.549 joshuadeveyra: That he doesn’t use AI.

38 00:07:00.750 00:07:02.419 joshuadeveyra: I couldn’t imagine life without AI.

39 00:07:02.420 00:07:04.170 Uttam Kumaran: Because he’s that’s cause he’s lazy.

40 00:07:06.600 00:07:11.159 Uttam Kumaran: although I told him it would, like, help your laziness.

41 00:07:11.160 00:07:11.535 joshuadeveyra: Yeah.

42 00:07:11.910 00:07:14.739 Uttam Kumaran: He’s so lazy he doesn’t even want to learn how to fix it.

43 00:07:16.350 00:07:19.319 joshuadeveyra: Like all you have to do is tell them your problem, and that fixes it.

44 00:07:20.500 00:07:21.580 Uttam Kumaran: He’s already out.

45 00:07:22.760 00:07:26.800 Uttam Kumaran: but I also think he’s a perfect candidate to like help. Figure this out.

46 00:07:27.060 00:07:28.799 Uttam Kumaran: Abigail. For context.

47 00:07:28.950 00:07:31.819 Uttam Kumaran: We wrote a little AI, like, PR reviewer

48 00:07:31.830 00:07:33.480 Uttam Kumaran: bot for GitHub.

49 00:07:33.780 00:07:36.330 Uttam Kumaran: Yeah. And Brian was on the call, and he was like.

50 00:07:36.520 00:07:39.682 Uttam Kumaran: he was like, Yeah, I don’t. I never used AI before.

51 00:07:44.570 00:07:45.210 Chang Ho: Hello!

52 00:07:46.400 00:07:47.320 Uttam Kumaran: Hey!

53 00:07:47.970 00:07:49.010 Chang Ho: Can you hear me?

54 00:07:49.070 00:07:50.099 Uttam Kumaran: I can hear you.

55 00:07:50.480 00:07:51.849 Chang Ho: Oh, perfect, perfect!

56 00:07:52.420 00:07:56.579 Chang Ho: Right. Joshua and Abigail are on as well. I’m gonna turn my video on.

57 00:07:57.430 00:08:07.155 Uttam Kumaran: Yeah, I’m eating lunch, but everybody else feel free to go video on. Or I will go video on in like 10 min. Just been on meetings for like 6, 7 hours.

58 00:08:07.450 00:08:09.370 Chang Ho: How dare you take a break, Uttam?

59 00:08:09.909 00:08:11.079 Uttam Kumaran: I know. Well.

60 00:08:11.530 00:08:15.430 Uttam Kumaran: I guess I had. I had 15 min to heat some stuff up

61 00:08:15.650 00:08:18.170 Uttam Kumaran: and get back on. But

62 00:08:18.880 00:08:22.836 Uttam Kumaran: yeah, I really appreciate getting this group together.

63 00:08:24.580 00:08:29.229 Uttam Kumaran: You know. Everyone knows me, of course, but I want everyone else to know everybody. So

64 00:08:29.520 00:08:30.200 Chang Ho: Yeah.

65 00:08:30.200 00:08:38.879 Uttam Kumaran: From the Brainforge side, Abigail and Miguel, you could go ahead and just, like, give a little bit of context about what you do

66 00:08:39.511 00:08:47.108 Uttam Kumaran: at Brainforge, and then maybe, Chang, you could go also. Go ahead and just give a little 2 cents. I want us to kind of set the meeting up.

67 00:08:47.820 00:09:01.209 Uttam Kumaran: I’ve I’ve told, you know everybody kind of like what the goal is. But really we’re trying to understand how Brainforge can deliver solutions in the medical and clinical industry. And so this conversation is really gonna be focused on

68 00:09:01.370 00:09:08.289 Uttam Kumaran: kind of strategies there. You know, you have me who I can kind of give a sense of how we can actually

69 00:09:08.726 00:09:24.843 Uttam Kumaran: execute on the sales side, you know. Outbound marketing. How do we actually go after folks? You have Chang, who actually gives all the context, basically, on who we should go after and why, and what are possible solutions that we can deliver. Miguel, on the call,

70 00:09:25.320 00:09:32.587 Uttam Kumaran: has been building everything internally. AI related for us. And has a huge experience kind of doing that in the last few years.

71 00:09:32.950 00:09:37.730 Uttam Kumaran: So I wanted him to be here as well. And then Abigail has been helping a lot on the

72 00:09:38.160 00:09:41.250 Uttam Kumaran: sort of the sales side and kind of organizing our thoughts on.

73 00:09:41.610 00:09:52.220 Uttam Kumaran: you know, how do we build a campaign and who to go after? So yeah, maybe I’ll go if you want to kick off. I don’t know if you’ve met Miguel before, so maybe if you just want to give a little intro. And then, Miguel, you can go.

74 00:09:52.900 00:10:22.800 Abigail Zhao: Yeah, I mean, I’m Abigail. I am interning here. So basically, I’m just kind of taking everything as a learning experience right now. But yeah, like Uttam said, I am helping out where I can on the sales side. Right now that focuses a lot on Apollo, which is the sales platform we’re using. It focuses a lot on outreach and gaining leads, filtering through all that stuff. But yeah, I think that’s a very brief overview of what I am currently doing right now.

75 00:10:23.330 00:10:24.440 Chang Ho: And for how long?

76 00:10:26.810 00:10:29.259 Abigail Zhao: like 2 months now, I think.

77 00:10:29.508 00:10:32.989 Uttam Kumaran: 3 months. I’m a bad person to ask about time. I have no idea.

78 00:10:33.790 00:10:38.349 Abigail Zhao: I think it was like I like started end of August ish

79 00:10:38.690 00:10:39.260 Abigail Zhao: or.

80 00:10:39.260 00:10:40.790 Chang Ho: How long are you planning to.

81 00:10:41.290 00:10:54.680 Abigail Zhao: I don’t even remember anymore. But yeah, I mean, well, I’m also currently in school. So I don’t know just kind of like taking it as I go. I guess I’ll stop whenever I feel like.

82 00:10:55.060 00:10:59.949 Uttam Kumaran: Yeah. Abigail’s a mathematics major. I don’t know, do you have a minor? But math major, at...

83 00:10:59.950 00:11:05.029 Abigail Zhao: Yeah, I’m majoring in applied math and minoring in computer programming at USC here, yeah.

84 00:11:05.030 00:11:05.810 Chang Ho: Hmm.

85 00:11:06.940 00:11:08.090 Chang Ho: Splendid.

86 00:11:08.530 00:11:09.710 Chang Ho: Which city?

87 00:11:10.520 00:11:11.520 Abigail Zhao: Los Angeles.

88 00:11:12.620 00:11:18.240 Chang Ho: Oh, wow! Apparently it’s not polluted enough now. You’ve lost the orange hue in the sunshine.

89 00:11:18.240 00:11:18.670 Abigail Zhao: Yeah.

90 00:11:20.790 00:11:21.320 Abigail Zhao: yeah.

91 00:11:21.320 00:11:22.090 Chang Ho: Feel free.

92 00:11:23.620 00:11:28.584 Chang Ho: I know, because you’ve been breathing all those free radicals and reducing your life expectancy. But never mind,

93 00:11:29.230 00:11:31.970 Chang Ho: things are improving. Yeah, okay, that’s cool.

94 00:11:32.340 00:11:36.030 Chang Ho: That’s really cool. Math major, CompSci

95 00:11:36.360 00:11:39.619 Chang Ho: minor. And you’re doing this just

96 00:11:39.630 00:11:44.630 Chang Ho: to keep yourself out of trouble or to get yourself in trouble for a little while. See what it’s like.

97 00:11:44.810 00:11:46.389 Chang Ho: in industry.

98 00:11:47.120 00:11:47.390 Uttam Kumaran: How does that?

99 00:11:47.390 00:11:47.740 Chang Ho: I mean, who.

100 00:11:48.670 00:11:51.817 Uttam Kumaran: See what it’s like to really work and try to get some money.

101 00:11:53.200 00:11:59.820 Chang Ho: Oh, yeah. Yeah. Good point. How did you meet Uttam? Did you scout Abigail out, Uttam? I can’t remember what the story is.

102 00:12:00.080 00:12:00.660 Uttam Kumaran: No

103 00:12:01.340 00:12:09.749 Uttam Kumaran: I don’t think you’ve met him, but we have another person on the team, a good friend of mine, Brian Pay, who’s actually Abigail’s cousin.

104 00:12:09.850 00:12:11.020 Uttam Kumaran: Yeah.

105 00:12:11.520 00:12:16.450 Uttam Kumaran: who was like, hey, I have a very smart cousin who’s at USC,

106 00:12:16.823 00:12:24.399 Uttam Kumaran: who’s interested. And I’m like, I’ll talk to anybody about what we do, for whatever reason. I’m like, put me in touch.

107 00:12:24.480 00:12:31.950 Uttam Kumaran: I think the trouble with me is that I’m very low on time, so it took me a while to kind of get Abigail into the fray and stuff, but

108 00:12:32.110 00:12:34.060 Uttam Kumaran: you know I really wanted

109 00:12:35.020 00:12:42.770 Uttam Kumaran: especially the folks who are on the intern side. I wanted them to understand really what we do. But also, taking stuff on for the clients

110 00:12:43.020 00:12:46.750 Uttam Kumaran: is difficult, because it’s a lot higher stress and.

111 00:12:46.750 00:12:47.220 Chang Ho: Yeah.

112 00:12:47.220 00:12:54.159 Uttam Kumaran: Like, I just don’t want to put that. That’s like, yeah, that’s like tougher stuff. So instead, I was like, what are the toughest problems we’re dealing with.

113 00:12:54.280 00:13:07.119 Uttam Kumaran: And then part of that was really sales. So we figured out a lot of like, how we want to go after clients and like, what are the different factors? And then, now, Abi was working with Miguel on some AI stuff. So basically, just like

114 00:13:07.370 00:13:12.940 Uttam Kumaran: I was like, Hey use Brainforge as a conduit to learn as much as possible. And

115 00:13:13.100 00:13:18.889 Uttam Kumaran: I wanna actually put you guys not on projects that are just like nice to have like things that actually move the needle for us.

116 00:13:19.288 00:13:23.350 Uttam Kumaran: And that have like deliverables and stuff like that. So this is one of them. So.

117 00:13:24.020 00:13:26.540 Chang Ho: Yeah, potentially. Fingers crossed.

118 00:13:26.640 00:13:34.239 Chang Ho: I mean, the balls just started rolling. I mean, I say, fingers crossed. I’m talking up about myself. Really. I’m sure, Abigail, you do great work

119 00:13:35.010 00:13:35.530 Abigail Zhao: Thank you.

120 00:13:35.530 00:13:43.059 Chang Ho: Anyway. So yeah, I met Uttam through a mutual friend who used to work with Uttam at Data Culture. Is that right, Uttam?

121 00:13:44.150 00:13:45.409 Uttam Kumaran: Yes, Data Culture.

122 00:13:45.650 00:13:46.390 Chang Ho: Yeah.

123 00:13:47.170 00:13:49.210 Chang Ho: sort of bespoke

124 00:13:50.040 00:13:54.869 Chang Ho: data analytics firm in New York. You may have heard of it. I don’t know. I don’t know how

125 00:13:55.420 00:14:03.000 Chang Ho: well known it is across the US. But anyway, in New York I did this scholarship exchange program

126 00:14:03.420 00:14:07.669 Chang Ho: with Neil, and it was Neil who introduced me to Uttam.

127 00:14:08.010 00:14:12.820 Chang Ho: And so I was based in Boston for a year, trying to become a little bit more mathematically competent.

128 00:14:12.830 00:14:18.600 Chang Ho: So it’s kind of relevant for you, Abigail, I mean, you have to excuse me if I can’t remember my partial differentials, and how I do them, but

129 00:14:19.230 00:14:29.230 Chang Ho: I sort of scratched up on it and brushed up majorly during that year at MIT, which I had to in order to survive and not be given a U,

130 00:14:29.270 00:14:40.810 Chang Ho: which I think stood for unclassified, or maybe just unintelligent. But anyway, there was a whole range, and I love the fact that MIT had that, whereas Harvard, I think, was very much, oh,

131 00:14:41.110 00:14:42.420 Chang Ho: you tried

132 00:14:42.590 00:14:43.980 Chang Ho: a minus?

133 00:14:44.850 00:14:47.549 Chang Ho: Yeah, have a GPA of 4,

134 00:14:47.610 00:14:59.240 Chang Ho: and if you’re good I’ll give you a GPA of 4, and if you’re excellent I’ll give you a GPA of 4, and that’s kind of how it kept going with Harvard. And MIT was very much: you’re going to be in the full spectrum of assessment. So it was

135 00:14:59.350 00:15:07.019 Chang Ho: nerve-wracking as hell. Anyway, that was my most recent foray into mathematics again, and

136 00:15:07.160 00:15:12.479 Chang Ho: scaled up. And there was a bit of data engineering and applying machine learning models

137 00:15:12.710 00:15:13.869 Chang Ho: in a way that was

138 00:15:13.900 00:15:17.127 Chang Ho: a little bit more reasonable than just plugging and playing.

139 00:15:17.580 00:15:28.139 Chang Ho: or rather, a little bit more reasoned than plug and play. And I’m also a medic. I don’t know what that means in the US, I can’t remember now. Is a medic a paramedic?

140 00:15:28.280 00:15:29.160 Chang Ho: Probably.

141 00:15:29.270 00:15:30.880 Chang Ho: I didn’t mean that. I’m a doc.

142 00:15:30.880 00:15:33.470 Uttam Kumaran: Wait, you gotta describe what you do first, I think.

143 00:15:33.470 00:15:37.520 Chang Ho: I’m an internal med and infectious diseases doc.

144 00:15:39.160 00:15:40.540 Uttam Kumaran: He’s a doctor, he’s a doctor.

145 00:15:40.540 00:15:40.980 Abigail Zhao: Okay.

146 00:15:41.420 00:15:45.370 Chang Ho: Yeah, I think, medic God, I’ve forgotten all the cross.

147 00:15:45.715 00:15:48.820 Uttam Kumaran: Medic here means, like, you’re like a first responder.

148 00:15:49.290 00:15:51.510 Chang Ho: Well, I do sometimes respond first,

149 00:15:54.570 00:16:09.949 Chang Ho: okay. But yeah, okay, that’s bad. So yes, I’m not a medic. I’m an infectious diseases doc. There we go, clarified. And I also work as an internist, I think that’s the term,

150 00:16:10.230 00:16:24.010 Chang Ho: in one of the hospitals here in Oxford. I do a bit of research, and Uttam and I have been talking about the potential to do some data-oriented projects. My entire PhD over the last 3, 4 years has been around

151 00:16:24.090 00:16:31.660 Chang Ho: applied AI modeling and applied causal modeling, which has been the most interesting, really more than AI to be honest

152 00:16:31.810 00:16:36.259 Chang Ho: using real world data. So using observational data sets to try and get a better sense for

153 00:16:36.606 00:16:41.509 Chang Ho: what makes a difference in the real world? Not, you know, outside of the framework of

154 00:16:41.620 00:16:47.390 Chang Ho: clinical trials, which you may have heard of. You know, when you test, it’s like A/B testing with Google, right,

155 00:16:48.023 00:16:54.750 Chang Ho: sort of trial, going forward prospectively. Unfortunately, that’s not feasible for the vast majority of medical questions.

156 00:16:55.630 00:16:57.399 Chang Ho: anyway. Got interested in all this.

157 00:16:57.600 00:17:01.279 Chang Ho: started talking to Uttam about whether we could find some

158 00:17:02.355 00:17:03.340 Chang Ho: potential

159 00:17:04.099 00:17:05.349 Chang Ho: clients

160 00:17:05.369 00:17:08.969 Chang Ho: who are essentially lacking the human

161 00:17:09.470 00:17:11.300 Chang Ho: firepower right now

162 00:17:12.079 00:17:18.600 Chang Ho: to do stuff internally, because the growing movement, and this has been validated or verified by

163 00:17:18.980 00:17:22.240 Chang Ho: the people I spoke to last week. I spoke to

164 00:17:23.619 00:17:27.380 Chang Ho: basically, I think, the chief of

165 00:17:28.010 00:17:36.180 Chang Ho: data, the clinical chief of data, Nicole Benson. I think one of you looked her up, or both of you looked her up,

166 00:17:36.240 00:17:38.120 Chang Ho: at Brigham and Mass Gen.

167 00:17:39.030 00:17:53.729 Chang Ho: And her first comment was about the general movement that she’s seen, not only in Boston, but also with Geisinger Health in Pennsylvania. Obviously Stanford’s gonna be pretty similar, I’m sure. I’m sure it’ll be pretty similar in LA as well.

168 00:17:54.190 00:17:57.039 Chang Ho: The hospitals are trying to develop their own internal

169 00:17:58.450 00:17:59.510 Chang Ho: essentially

170 00:18:00.100 00:18:03.899 Chang Ho: digital infrastructure, health digital infrastructure,

171 00:18:03.980 00:18:21.280 Chang Ho: to a greater extent. Why? A, because they want to minimize the number of third-party vendors that they have to trawl through to understand who they should invest in; B, to reduce the amount of overheads for the third-party vendors that they currently have in the ecosystem; third, at least they have greater control; fourthly, so that they can

172 00:18:21.540 00:18:23.159 Chang Ho: make it a lot more bespoke

173 00:18:23.300 00:18:26.380 Chang Ho: and not worry about some 3rd party company collapsing

174 00:18:26.440 00:18:40.739 Chang Ho: for some reason or another and leaving them with a gaping hole. So for multiple reasons, they’re trying to develop their own internal human workforce to work on this. You can imagine that’s like a 10-, 20-year vision,

175 00:18:40.780 00:18:45.730 Chang Ho: and they have nowhere near the firepower to do that. Why? Because they don’t pay enough.

176 00:18:46.250 00:18:50.079 Chang Ho: So all the software engineers want to be working for you know where?

177 00:18:51.680 00:18:57.933 Chang Ho: for Uttam at Brainforge, obviously. And so the idea is,

178 00:18:58.770 00:19:07.710 Chang Ho: I’ve raised it to the Chief Information Officer, Lin Shen. So I’ll send you the links now, actually. Maybe

179 00:19:07.720 00:19:11.860 Chang Ho: so you can have a look at who these people are, because they should be essentially

180 00:19:12.725 00:19:18.430 Chang Ho: I esteem them to be epitomes of the sorts of people we should be chasing after. So Lin Shen at

181 00:19:18.900 00:19:19.670 Chang Ho: like

182 00:19:21.540 00:19:22.920 Chang Ho: Mass Gen.

183 00:19:28.900 00:19:30.590 Chang Ho: So this guy

184 00:19:32.370 00:19:34.450 Chang Ho: where’s the chat? There we go

185 00:19:36.220 00:19:43.169 Chang Ho: is, I think, Chief Information Officer at Mass Gen Brigham.

186 00:19:43.230 00:19:45.719 Chang Ho: Chief Medical Information Officer. That’s him.

187 00:19:46.700 00:19:48.320 Chang Ho: And then we have

188 00:19:52.160 00:19:53.310 Chang Ho: Adam.

189 00:19:54.070 00:19:59.120 Chang Ho: and one at Mass Gen, Griggle, who’s the Chief Information Officer

190 00:20:00.230 00:20:01.929 Chang Ho: at Mass Gen Brigham?

191 00:20:03.420 00:20:04.540 Chang Ho: This guy?

192 00:20:05.300 00:20:09.899 Chang Ho: And I think, generally speaking, we should be looking out for people who either.

193 00:20:11.100 00:20:20.850 Chang Ho: who are either this high up in the management scheme, or probably a couple of tiers below, because, frankly, people right up top are probably one of those people who have,

194 00:20:20.950 00:20:24.030 Chang Ho: you know, 15,000 unread emails.

195 00:20:24.340 00:20:27.079 Chang Ho: You know those sorts of people. You must have seen them around.

196 00:20:27.850 00:20:28.230 Uttam Kumaran: That’s me!

197 00:20:28.230 00:20:30.578 Chang Ho: Yeah, well, maybe that’s you.

198 00:20:31.290 00:20:33.939 Chang Ho: Every time I look in the mirror.

199 00:20:33.960 00:20:40.720 Chang Ho: Yeah. So the 15,000 unread email kind of people are these 2, but thankfully I managed to

200 00:20:40.850 00:20:42.989 Chang Ho: buy some time from Lin Shen,

201 00:20:43.300 00:20:44.340 Chang Ho: and

202 00:20:45.000 00:20:49.309 Chang Ho: just through the fact that I’ve done the same bioinformatics course

203 00:20:49.370 00:20:52.270 Chang Ho: with him. So we were in the same year group

204 00:20:53.370 00:21:00.480 Chang Ho: at Harvard and MIT, 2019, 2020. So I’m due to talk to him next Wednesday. And Nicole’s first comment to me was,

205 00:21:00.920 00:21:03.839 Chang Ho: you know, this is the general trend that we’re seeing

206 00:21:04.300 00:21:07.850 Chang Ho: is that of internal capability investment.

207 00:21:07.870 00:21:12.630 Chang Ho: They’re nowhere near that at this stage. So the way I pitched it to Lin Shen,

208 00:21:12.720 00:21:16.040 Chang Ho: I can send you the email, it’s very basic. The way I pitched it was,

209 00:21:16.640 00:21:18.060 Chang Ho: with that in mind.

210 00:21:18.630 00:21:20.109 Chang Ho: I essentially just said.

211 00:21:20.720 00:21:21.600 Chang Ho: Look.

212 00:21:21.810 00:21:25.729 Chang Ho: we’re basically a bunch of quite

213 00:21:26.050 00:21:31.862 Chang Ho: talented data engineers. And you’ve also got a medical data analyst on board,

214 00:21:32.510 00:21:34.130 Chang Ho: with me.

215 00:21:34.850 00:21:38.589 Chang Ho: I want to know if there are any very data centric

216 00:21:39.340 00:21:48.369 Chang Ho: projects that you can think of that are more process-driven rather than clinical, because data access becomes a humongous issue

217 00:21:48.730 00:21:53.999 Chang Ho: going to a third party as soon as there’s any confidential data. But there might be quite a few process-driven

218 00:21:54.320 00:22:04.399 Chang Ho: things behind the scenes like, how do you automate some of the follow up stuff? How do you automate some of the emails that human resources will have to send to various patients to make sure that they.

219 00:22:04.520 00:22:14.129 Chang Ho: you know, are rebooking the test that they’ve missed, or they missed a CT or they missed an MRI. They want to make sure that’s semi-automated. So everything is, sort of, you know,

220 00:22:14.990 00:22:20.029 Chang Ho: as clear as possible for one person or 2 people just to keep on top of the entire caseload

221 00:22:20.150 00:22:24.009 Chang Ho: of patients for a particular department. How do you do that?

222 00:22:24.370 00:22:39.979 Chang Ho: Often those sorts of... it sounds fairly simple to you guys, probably, but it doesn’t exist in many, many different hospitals. You will not find that there is this wonderful dashboard from which you can control the entirety of the hospital, like you might do in a video game. It’d be wonderful.

223 00:22:40.060 00:22:48.580 Chang Ho: But there isn’t. And so that’s an example of an opportunity that I sort of said, look, you know, I don’t know if that’s something that you’re looking for at the moment at Brigham.

224 00:22:49.020 00:22:52.839 Chang Ho: But it’s the sort of thing that I’m looking for

225 00:22:52.920 00:22:57.980 Chang Ho: That’s how I’ve framed it, because we’re going to come in as a third-party entity

226 00:22:57.990 00:23:02.760 Chang Ho: and said: I know you’re looking, essentially, to hire more people

227 00:23:02.790 00:23:05.009 Chang Ho: to develop your internal capability.

228 00:23:05.220 00:23:14.309 Chang Ho: But you’re gonna be a little way away from that. I know that. So is there any chance, you know some of these projects that you’re currently trying to get off the ground

229 00:23:14.650 00:23:15.869 Chang Ho: could do with

230 00:23:15.920 00:23:17.830 Chang Ho: some input from people like us.

231 00:23:18.170 00:23:18.750 Uttam Kumaran: Yeah.

232 00:23:18.750 00:23:22.940 Chang Ho: And I framed that obviously far more concisely in the email. And I’ll happily send that on to you.

233 00:23:22.940 00:23:23.540 Uttam Kumaran: Yeah, please.

234 00:23:23.540 00:23:34.330 Chang Ho: But that’s sort of the way I approached it, and I’m due to talk to him on Wednesday. I don’t know if he’s giving me the light of day because we were in the same course together quite a few years ago. Quite probably.

235 00:23:34.330 00:23:34.930 Uttam Kumaran: Just said that.

236 00:23:34.930 00:23:35.310 Chang Ho: And.

237 00:23:35.725 00:23:36.969 Uttam Kumaran: That’s just sales.

238 00:23:37.420 00:23:45.239 Chang Ho: Yeah. But I’m hoping to then utilize him to hopscotch onto the next person in different

239 00:23:46.400 00:23:56.080 Chang Ho: analogous sites, whether in San Fran, or in LA, or at Geisinger, down in Penn, or wherever. Essentially the larger centers first.

240 00:23:56.080 00:23:56.680 Uttam Kumaran: Yeah.

241 00:23:56.680 00:24:01.029 Chang Ho: Because I know they’re probably the ones with more, you know, more monetary firepower.

242 00:24:01.280 00:24:02.050 Uttam Kumaran: Yeah.

243 00:24:02.050 00:24:05.580 Chang Ho: And they’re probably also the ones who are investing more actively in trying to

244 00:24:05.940 00:24:09.619 Chang Ho: bridge the gap between relying too heavily on 3rd party vendors

245 00:24:09.640 00:24:20.660 Chang Ho: who already have entire software packages created, to having internal development. And I kind of think we should exist in that slight niche between the 2,

246 00:24:21.050 00:24:23.429 Chang Ho: when hospitals are basically trying to transition

247 00:24:23.806 00:24:29.150 Chang Ho: because they often need extra mercenary firepower, like you’ve discovered, Uttam, with Baylor Health.

248 00:24:33.760 00:24:34.500 Uttam Kumaran: Yeah.

249 00:24:34.860 00:24:37.030 Chang Ho: Yeah, he frozen a bit, but.

250 00:24:38.000 00:24:39.309 Uttam Kumaran: Oh, sorry! Can you hear me now?

251 00:24:39.470 00:24:41.319 Chang Ho: Yeah, yeah, yeah, definitely, definitely.

252 00:24:41.760 00:24:44.639 Uttam Kumaran: Yeah. So I mean, I I think that

253 00:24:45.160 00:24:47.810 Uttam Kumaran: for us. That’s all like

254 00:24:47.890 00:25:00.689 Uttam Kumaran: exactly the information we need, and I’ll show you a little bit about what activating all the information you’re getting from these different personas and stakeholders actually means.

255 00:25:00.780 00:25:13.670 Uttam Kumaran: So on our side, what we do is we set up outbound and, like, warm marketing campaigns, right? So I’m working with... we have another person on the team that’s coming on and owning that,

256 00:25:14.009 00:25:25.450 Uttam Kumaran: but basically, the things that we look at right now are these sort of like components. We look at the current tech stack. Right? So one thing that could be helpful is like, if you know that

257 00:25:25.550 00:25:28.509 Uttam Kumaran: a company has X technology which makes

258 00:25:28.680 00:25:43.839 Uttam Kumaran: Y job, like, a necessity or easier, that’s something that we can start to filter for. For example, we currently do that when we go after certain e-commerce companies. We look at their website to see what they’re using to track their users.

259 00:25:43.880 00:25:49.499 Uttam Kumaran: And then it tells me, oh, like, they’re gonna be more conducive towards messaging that’s like,

260 00:25:49.540 00:26:02.240 Uttam Kumaran: consolidate this type of data. Then the titles, right? So you already provided some of that. So what we’re gonna do is basically take the titles that you’ve given us and say, these are the lookalikes that we kind of want to find.

261 00:26:02.340 00:26:06.150 Uttam Kumaran: revenue. Geo. I know you mentioned like looking for

262 00:26:06.850 00:26:21.039 Uttam Kumaran: learning from the larger ones, but then finding the demand in the smaller ones, like folks that are not associated with a huge college or something like that. So that’s great. That may pop up in revenue, that may pop up in employee size,

263 00:26:21.490 00:26:26.669 Uttam Kumaran: that may pop up in Geo, right? Like going after like non major cities.

264 00:26:27.026 00:26:33.090 Uttam Kumaran: So we have these like abilities to do these sorts of filters. The other thing is lookalike companies.

265 00:26:33.360 00:26:44.429 Uttam Kumaran: The other thing that’s beneficial is if you’re like, hey, these 4 or 5 clinics or hospitals would be great, I generally want more like this.

266 00:26:44.500 00:26:51.740 Uttam Kumaran: That’s really helpful, too, because you may not be able to put your finger on like. What are all the attributes. But you may have a couple of examples.

267 00:26:51.760 00:26:57.861 Uttam Kumaran: So that’s really helpful. The next thing is like fit score. So basically, what we do is

268 00:26:58.490 00:26:59.220 Uttam Kumaran: we

269 00:26:59.360 00:27:03.279 Uttam Kumaran: have a little bit of like lead scoring. That we do

270 00:27:04.000 00:27:06.209 Uttam Kumaran: on our end. And I’ll just show you

271 00:27:06.310 00:27:08.099 Uttam Kumaran: like what that looks like.

272 00:27:13.220 00:27:17.019 Uttam Kumaran: Basically, in addition to having those filters,

273 00:27:17.630 00:27:25.429 Uttam Kumaran: Those are all pretty wide ranges. So within each of those ranges we have other

274 00:27:25.745 00:27:28.799 Uttam Kumaran: ranking. So let me show you like how this is

275 00:27:29.050 00:27:35.443 Uttam Kumaran: based on what our current setup is, and we can customize this for this industry.

276 00:27:37.060 00:27:38.837 Uttam Kumaran: so we have.

277 00:27:45.970 00:27:54.380 Uttam Kumaran: So we have the category, the attribute, the score, the definition, and where it’s coming from, basically. So we have 2 things: we have a fit score and an engagement score.

278 00:27:54.600 00:27:57.190 Uttam Kumaran: so fit score. We have, like size industry.

279 00:27:57.460 00:28:04.390 Uttam Kumaran: their data infrastructure that currently exists, the role, and the growth. So for medical, maybe growth

280 00:28:04.680 00:28:12.779 Uttam Kumaran: may or may not be a factor. This may be something else. This may be like again, is it attached to something or not? But basically.

281 00:28:13.350 00:28:38.720 Uttam Kumaran: we wanna have multiple levels of filtering, because that gives us the best understanding, the feedback loop: when we do get meetings with people, what category did they fit in? And the second thing is, with some of these filters we’re gonna get thousands and thousands of companies. So having a multi-layered approach of filters allows us to continue to separate, and then have personalized messaging for each of these, right? And so this is basically

282 00:28:39.020 00:28:50.133 Uttam Kumaran: given to the team, and they’re executing now on the fit score. The next level to that is going to be engagement. So we’re starting to push out content,

283 00:28:50.520 00:29:01.370 Uttam Kumaran: and we’re starting to understand, do we have a referral or not? All of that will go into another layer, which is engagement: have they already checked us out? Have they read our content? Right? We’re gonna end up,

284 00:29:01.530 00:29:13.019 Uttam Kumaran: ideally, as we start to publish more content in different sectors, people from that sector will come and read it. That’s a great intent-based signal that we’ll end up taking advantage of.

285 00:29:13.140 00:29:17.240 Uttam Kumaran: I have no idea when we’re gonna get to that, but this we’re able to do today.

286 00:29:18.550 00:29:19.360 Uttam Kumaran: so.
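
As a rough illustration of the multi-layer scoring described above, here is a minimal sketch. The attribute names (size, industry, data infrastructure, role, growth) come from the call; the weights, the 0–10 attribute scores, and the example accounts are purely hypothetical assumptions, not the actual Brainforge model.

```python
# Hypothetical sketch of the fit-score idea: each account gets a per-attribute
# score, and a weighted sum ranks accounts within the broad filter ranges.
# Weights and example scores below are made up for illustration.
FIT_WEIGHTS = {
    "size": 0.30,
    "industry": 0.25,
    "data_infrastructure": 0.20,
    "role": 0.15,
    "growth": 0.10,  # for medical, this slot might instead be e.g. "attached to a network"
}

def fit_score(account: dict) -> float:
    """Weighted sum of per-attribute scores, each assumed to be on a 0-10 scale."""
    return sum(weight * account.get(attr, 0) for attr, weight in FIT_WEIGHTS.items())

accounts = [
    {"name": "Clinic A", "size": 8, "industry": 9, "data_infrastructure": 4, "role": 7, "growth": 5},
    {"name": "Clinic B", "size": 6, "industry": 7, "data_infrastructure": 9, "role": 8, "growth": 2},
]

# Rank accounts by fit score, highest first. The engagement score mentioned on
# the call (content reads, referrals) would be a second layer on top of this.
ranked = sorted(accounts, key=fit_score, reverse=True)
```

The point of the second, engagement layer is that it can reorder accounts with similar fit scores based on intent signals, which is what feeds the personalized messaging.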

287 00:29:19.360 00:29:21.500 Chang Ho: I mean, I think also, just hold on.

288 00:29:21.540 00:29:30.539 Chang Ho: Yeah, in some ways. I mean, I’m gonna have to update Abigail once, and you as well, obviously, Tam. And Josh, I don’t know how much you’re gonna be involved in this

289 00:29:30.840 00:29:34.799 Chang Ho: scouting system that you’ve got, which is quite impressive, I would say,

290 00:29:35.030 00:29:48.149 Chang Ho: from the outset. You know, once I’ve spoken to Lin Chen, I think this will be one case study of how to approach this, and I’m sure Lin will be able to give me a little bit more insight next Wednesday, when I’m due to speak to him

291 00:29:48.240 00:29:51.300 Chang Ho: about the sorts of people we should be approaching

292 00:29:51.630 00:29:54.670 Chang Ho: with this. He might not be the right guy, I mean, he might say,

293 00:29:55.040 00:29:55.370 Uttam Kumaran: Close to.

294 00:29:55.370 00:30:00.439 Chang Ho: someone who’s, I don’t know, the chief quality officer or the deputy chief quality officer

295 00:30:00.590 00:30:05.869 Chang Ho: who’s gonna be a lot more appropriate for you, because they will deal with the slightly

296 00:30:07.455 00:30:09.629 Chang Ho: paraclinical side, i.e.

297 00:30:10.200 00:30:14.279 Chang Ho: the infrastructural aspects of running an institution like a clinic

298 00:30:14.470 00:30:28.169 Chang Ho: that isn’t so patient-facing, that isn’t so clinical-data-driven, where we’ll have a lot more speed, I think, speed of getting going with any project, partly from data governance issues, as I mentioned,

299 00:30:28.190 00:30:30.300 Chang Ho: as a 3rd party entity.

300 00:30:30.940 00:30:35.750 Chang Ho: and secondly, being able to have a very concrete defined endpoint

301 00:30:36.130 00:30:38.369 Chang Ho: in a short period of time.

302 00:30:38.440 00:30:44.310 Chang Ho: I know all of this needs to be defined. But I really do fervently believe, especially for a first, you know,

303 00:30:44.510 00:30:51.160 Chang Ho: first 2 projects, first 2 or 3 projects, which, like we discussed before, should be super discrete, so we can

304 00:30:51.160 00:30:51.790 Uttam Kumaran: For sure.

305 00:30:51.790 00:30:57.390 Chang Ho: get a sense for how, you know, we all work, and deliver something

306 00:30:57.570 00:31:00.079 Chang Ho: like, you know, very tangible

307 00:31:00.210 00:31:00.690 Chang Ho: thing,

308 00:31:01.800 00:31:06.949 Chang Ho: and use it as a case study for moving forward and maybe going on to some more complex projects. But

309 00:31:07.640 00:31:13.239 Chang Ho: you know, complexity, we just try and minimize from the outset, really, just for the 1st project, if we can.

310 00:31:13.470 00:31:14.370 Chang Ho: So

311 00:31:14.890 00:31:24.339 Chang Ho: that’s why I thought actually talking to Lin Chen would be useful, because it sounds like they’re a little bit desperado, if I’m being honest. They want the human capital; they don’t have it

312 00:31:25.580 00:31:26.180 Chang Ho: to do various.

313 00:31:26.180 00:31:34.750 Uttam Kumaran: And I think they also, even if they were to go hire for it, the amount of time it would take, and the quality of the people they would get, would be really tough, especially.

314 00:31:35.220 00:31:37.100 Chang Ho: Oh, my God, yeah.

315 00:31:37.100 00:32:03.200 Uttam Kumaran: Even if we target people that are in more rural areas, you’re not gonna get the folks, right? So that’s why I think having some scoring helps. There may be parts of this where we can’t stack rank, but again, thinking of it like scoring helps to understand it. The biggest thing we’re gonna need is everything we need for the outreach, right? So what are the pain points? What’s the feeling?

316 00:32:03.370 00:32:04.830 Uttam Kumaran: What is? And then what is like.

317 00:32:04.830 00:32:07.540 Chang Ho: Yeah, I think I’ll suss this out on Wednesday. Tuesday.

318 00:32:07.540 00:32:07.970 Uttam Kumaran: Yeah, yeah.

319 00:32:07.970 00:32:14.289 Chang Ho: For an institution like Mass General, it’ll be pretty representative, I think, of many other different clinical ventures.

320 00:32:14.627 00:32:16.569 Chang Ho: And we’ll be able to sort of

321 00:32:16.720 00:32:23.459 Chang Ho: categorize the problems as ones that really are going to only affect multi-institutional

322 00:32:23.460 00:32:23.830 Uttam Kumaran: Yeah.

323 00:32:23.830 00:32:30.489 Chang Ho: umbrella sites like Mass General Brigham, where that term actually encapsulates many different hospitals in one,

324 00:32:30.490 00:32:30.940 Uttam Kumaran: Awesome.

325 00:32:30.940 00:32:42.752 Chang Ho: a network of hospitals, versus a much smaller clinic setting, like a family doctor’s practice, or maybe a small clutch of them in a rural setting, which might have completely different needs.

326 00:32:43.570 00:32:48.920 Chang Ho: As an aside, sorry, Joshua, I know I haven’t really had a chance to actually hear what you do, but

327 00:32:49.040 00:32:53.880 Chang Ho: I don’t think I’ve ever seen so many different background changes in 10 min

328 00:32:53.920 00:32:56.409 Chang Ho: as I have with you on the trot.

329 00:32:56.410 00:32:59.669 Uttam Kumaran: Ask him what time it is where he’s at.

330 00:32:59.920 00:33:03.449 Chang Ho: I know, you see. Honestly, you look like you just entered the gamer’s chair,

331 00:33:03.970 00:33:09.609 Chang Ho: and you’re about to enter the World of Warcraft, with us just in the background.

332 00:33:10.590 00:33:13.899 joshuadeveyra: Yeah, yeah, it’s 4:30 am right now.

333 00:33:15.010 00:33:15.740 Chang Ho: Where.

334 00:33:18.210 00:33:18.890 joshuadeveyra: So, yeah.

335 00:33:19.390 00:33:19.890 Uttam Kumaran: He’s in the Philippines.

336 00:33:19.890 00:33:20.580 joshuadeveyra: Tomorrow.

337 00:33:20.880 00:33:21.670 joshuadeveyra: Yeah. Manila.

338 00:33:21.670 00:33:22.960 Chang Ho: You’re in the Philippines.

339 00:33:22.960 00:33:23.480 joshuadeveyra: Yep. Yep.

340 00:33:24.840 00:33:27.950 Chang Ho: Oh, man, yeah, it looks smoky, humid.

341 00:33:29.932 00:33:31.540 joshuadeveyra: Yeah, it’s very odd

342 00:33:34.470 00:33:39.649 Uttam Kumaran: But Miguel, maybe you want to give your 2 cents about what you do, and I can kind of talk about,

343 00:33:41.120 00:33:44.099 Uttam Kumaran: I mean, that’ll kind of self-explain why you’re on the call. But yeah.

344 00:33:44.430 00:33:49.840 joshuadeveyra: Oh, yeah, yeah. So basically, what I did before I was here was

345 00:33:49.960 00:33:54.890 joshuadeveyra: more on the AI side. I don’t really develop AI; it’s more like I fine-tune it.

346 00:33:55.020 00:33:58.820 joshuadeveyra: I use, you know, OpenAI models just to

347 00:33:59.387 00:34:04.160 joshuadeveyra: generate content. I started there, and then I started with

348 00:34:04.330 00:34:20.070 joshuadeveyra: AI agents late last year, and then conversational agents, and then all the way through voice AI. So over the past couple of years I’ve helped a couple of businesses, in Australia mainly, and Europe, raise funds,

349 00:34:20.550 00:34:21.739 joshuadeveyra: for you know.

350 00:34:21.980 00:34:25.490 joshuadeveyra: conversational AI, voice AI stuff. So yeah.

351 00:34:25.800 00:34:27.870 joshuadeveyra: I’m sorry my brain is not clear.

352 00:34:29.100 00:34:36.080 Chang Ho: No, not at all. Can you make me a Chang voice AI, so I can just respond automatically to all my on-call shifts,

353 00:34:36.989 00:34:39.280 Chang Ho: and still be medically, legally.

354 00:34:40.550 00:34:48.189 joshuadeveyra: That’s something, actually, I talked about with them earlier, because one of the clients I worked with before, I think they signed up.

355 00:34:48.560 00:34:58.220 joshuadeveyra: It’s something like a healthcare insurance company, and they signed, like, a couple of hundred thousand dollar deal with them.

356 00:34:58.690 00:35:01.019 joshuadeveyra: But it’s mostly on voice AI.

357 00:35:01.620 00:35:02.980 Chang Ho: Yeah, okay.

358 00:35:02.980 00:35:07.500 Uttam Kumaran: Yeah, he was working for a company that’s trying to recruit people for clinical trials,

359 00:35:07.870 00:35:11.090 Uttam Kumaran: and they use an AI agent to make those phone calls.

360 00:35:11.530 00:35:14.370 Chang Ho: Wow! With a super sexy voice.

361 00:35:14.790 00:35:19.439 Chang Ho: so that the patients are far more likely to sign up.

362 00:35:21.540 00:35:23.460 Uttam Kumaran: Just take random drugs. Yeah.

363 00:35:24.835 00:35:25.569 Chang Ho: We’ll pay you.

364 00:35:25.570 00:35:26.710 joshuadeveyra: 100 bucks.

365 00:35:27.470 00:35:28.160 Chang Ho: Dude.

366 00:35:29.260 00:35:41.694 Chang Ho: I’m not sure, Miguel. I’m not sure you’re a really ethical guy. I’m just getting vibes here that you’re willing to do anything to get to that. I’m joking. But

367 00:35:42.110 00:35:43.599 Chang Ho: anyway, that sounds

368 00:35:43.900 00:35:47.910 Chang Ho: yeah, I mean, that sounds cool. As an initial introduction today, I thought

369 00:35:48.070 00:35:50.719 Chang Ho: I’d just relay what I took from the call,

370 00:35:50.740 00:35:52.250 Chang Ho: and then

371 00:35:52.566 00:36:01.769 Chang Ho: follow up, I think, next week, once I’ve spoken to Lin. I think I’m starting to get a little bit more of an appreciation for what’s going to be needed. But I think

372 00:36:01.910 00:36:04.329 Chang Ho: just a couple of conversations more will.

373 00:36:04.330 00:36:04.760 Uttam Kumaran: For sure.

374 00:36:04.760 00:36:12.170 Chang Ho: give me just a little bit more substance for you to work with. At the moment, I feel like it’s too open to me. Like, for now, Abigail and Miguel,

375 00:36:12.170 00:36:12.580 Uttam Kumaran: Sure.

376 00:36:12.580 00:36:13.779 Chang Ho: Yeah, you could work for hours on

377 00:36:14.100 00:36:15.530 Chang Ho: this. But if the budget.

378 00:36:15.530 00:36:17.100 Uttam Kumaran: The thing is that for you.

379 00:36:17.160 00:36:19.689 Uttam Kumaran: it’s always a distillation.

380 00:36:19.760 00:36:28.689 Uttam Kumaran: And there’s loss in the problem from you, through me, to the email, right? So the best thing we can have is clarity.

381 00:36:29.407 00:36:52.650 Uttam Kumaran: But also, I want to start testing something, right? And the easiest test we can do is shoot off emails and see what happens, right? And so whenever we get to a point of confidence, we can decide on just one type of campaign, and then try it out and see whether the messaging is right, see whether we can get on the phone with somebody. The reason why I want

382 00:36:53.121 00:37:10.959 Uttam Kumaran: Miguel on this call is because he’s basically full time doing AI stuff at Brainforge, and we’re gonna continue to recruit folks with his background, and also just train up folks into basically building agents and using conversational AI, basically

383 00:37:10.960 00:37:24.880 Uttam Kumaran: taking any of the most modern tools and leveraging them for clients. The nice thing: we saw the data part of the business, which is great, and I don’t think this is like a different part. I think, actually, for them, probably what’s gonna happen is we come in

384 00:37:25.080 00:37:33.580 Uttam Kumaran: to do AI or data. But the data part is really necessary to do any of the AI stuff anyways. But I will say building agents is a brand new

385 00:37:33.610 00:37:45.709 Uttam Kumaran: art. So it’s been great having him on the team; we’ve worked on some amazing stuff. And so I want him looped in on every aspect where we’re thinking about leveraging AI,

386 00:37:47.090 00:37:57.819 Uttam Kumaran: for proof of concepts, to answer questions for clients. Like, he’s seen it. I would say, out of all the people I’ve met in the past 2 years doing this,

387 00:37:57.830 00:38:01.230 Uttam Kumaran: he’s probably the most knowledgeable person that’s actually done

388 00:38:01.540 00:38:03.370 Uttam Kumaran: stuff in AI, for sure,

389 00:38:04.940 00:38:06.523 Uttam Kumaran: not in, like, a research setting.

390 00:38:08.950 00:38:10.899 Chang Ho: Cool. I’m looking forward to it.

391 00:38:11.170 00:38:14.929 Chang Ho: Yeah, so should we touch base next week, then, about all this?

392 00:38:14.930 00:38:15.580 Uttam Kumaran: Yeah, I think.

393 00:38:15.580 00:38:21.449 Chang Ho: Now, Abigail, don’t worry too much about developing this until I give you a little bit more.

394 00:38:22.160 00:38:24.310 Chang Ho: Yeah, it’s a little bit more to chew on.

395 00:38:25.040 00:38:25.820 Chang Ho: Yeah.

396 00:38:26.130 00:38:30.759 Uttam Kumaran: Why don’t I grab time at this same time next week?

397 00:38:31.170 00:38:32.010 Uttam Kumaran: Same time.

398 00:38:32.010 00:38:35.910 Chang Ho: Next week. Let me take a look. I am,

399 00:38:36.590 00:38:40.610 Chang Ho: actually, that would work quite well. It’s the day after I’m due to speak to Lin.

400 00:38:40.610 00:38:46.609 Uttam Kumaran: Actually, if we could do earlier in the day. Sorry, if we could do it earlier,

401 00:38:48.200 00:38:49.910 Uttam Kumaran: any day earlier.

402 00:38:50.510 00:38:51.879 Uttam Kumaran: Has this happened.

403 00:38:52.820 00:38:54.979 Chang Ho: Oh, because of John.

404 00:38:54.980 00:39:00.170 Uttam Kumaran: It’s like 5 am there, yeah. And we’re going to one more meeting after this also,

405 00:39:01.820 00:39:04.500 Uttam Kumaran: they’re both AI related. And I was like.

406 00:39:05.180 00:39:10.689 Uttam Kumaran: well, I told him I’d get him a coffee, and he’s like, I already got one. I was like, okay, well.

407 00:39:11.010 00:39:14.500 Chang Ho: Is that all you’re offering me? I need more than just coffee.

408 00:39:15.230 00:39:15.889 Chang Ho: No, I see.

409 00:39:15.890 00:39:22.262 Uttam Kumaran: Like, look, these are interesting things here. I can record them and send it to you later. But it’s gonna be boring now.

410 00:39:24.570 00:39:28.270 Chang Ho: Oh, Josh, you need to AI-clone yourself, mate. Oh, my.

411 00:39:28.270 00:39:28.600 Uttam Kumaran: Go off.

412 00:39:29.970 00:39:32.159 joshuadeveyra: But yeah should be fine around same time.

413 00:39:32.740 00:39:33.890 joshuadeveyra: I just need to be a bit more.

414 00:39:33.890 00:39:37.970 Chang Ho: Slightly earlier. We can do a slightly earlier option, if you’ll still be in the Philippines.

415 00:39:38.030 00:39:40.429 Chang Ho: Why don’t we? When do you tend to go to sleep?

416 00:39:40.620 00:39:41.560 Chang Ho: Never.

417 00:39:43.690 00:39:44.243 joshuadeveyra: Yeah, never.

418 00:39:44.530 00:39:47.411 Uttam Kumaran: Probably like 2 am, usually, I feel like.

419 00:39:47.700 00:39:48.869 joshuadeveyra: Yeah, 2 or 3. Yeah.

420 00:39:48.870 00:39:53.099 Chang Ho: So 2 am. So you’d rather have the meeting, what, like 3 h before?

421 00:39:54.190 00:39:57.419 joshuadeveyra: Yeah, yeah, hopefully, if that’s possible, that would be great.

422 00:39:57.420 00:40:01.820 Chang Ho: 6 pm for me is absolutely fine. I mean, I think I can just about wake up for that.

423 00:40:02.770 00:40:04.080 Uttam Kumaran: Yeah, okay.

424 00:40:04.410 00:40:05.069 Chang Ho: That stuff.

425 00:40:06.680 00:40:09.430 Chang Ho: That’s the amount of dedication I’m willing to

426 00:40:10.860 00:40:11.280 Chang Ho: accommodate.

427 00:40:11.280 00:40:11.950 Uttam Kumaran: Okay.

428 00:40:11.950 00:40:12.690 Chang Ho: So, yeah, so.

429 00:40:12.690 00:40:13.069 Uttam Kumaran: Would you say?

430 00:40:13.070 00:40:14.356 Chang Ho: 6 pm.

431 00:40:15.330 00:40:21.210 Chang Ho: Yeah, let’s do 6 pm next week, or at least 3 h before whatever this was, so that’s all the time. Yeah.

432 00:40:21.560 00:40:23.209 Chang Ho: One pm for you, Tam.

433 00:40:23.250 00:40:27.060 Chang Ho: Abigail, will that be like 9 am?

434 00:40:27.570 00:40:28.430 Chang Ho: No.

435 00:40:29.255 00:40:29.720 Abigail Zhao: Myself.

436 00:40:29.720 00:40:37.690 Uttam Kumaran: So for you it’s 1 pm, 12 pm for me, and 10 am East, 10 am Pacific.

437 00:40:37.920 00:40:41.970 Uttam Kumaran: Wait, Chang, I thought you were 7 h ahead of East. Are you 6 h ahead of East Coast?

438 00:40:42.870 00:40:45.179 Chang Ho: I’m 5 h ahead of East Coast.

439 00:40:45.830 00:40:48.340 Uttam Kumaran: Oh, okay, so it’ll be noon for me.

440 00:40:49.430 00:40:54.629 Chang Ho: Okay. So 6 pm for me, noon for you. Then Abigail would be what, 9 am?
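
For reference, the scheduling arithmetic being worked out above can be double-checked with Python’s standard zoneinfo module. The zones here are assumptions: Chang says he is 5 hours ahead of the US East Coast, which matches Europe/London during daylight saving, and Josh is in Manila; the date is the proposed follow-up, a week after this call.

```python
# Checking the time-zone conversions discussed on the call with zoneinfo.
# Assumed zones: Chang in Europe/London, Josh in Asia/Manila, plus both US coasts.
from datetime import datetime
from zoneinfo import ZoneInfo

# 6 pm for Chang on the proposed follow-up date (a week after 2024-09-26).
meeting = datetime(2024, 10, 3, 18, 0, tzinfo=ZoneInfo("Europe/London"))

for zone in ("America/New_York", "America/Los_Angeles", "Asia/Manila"):
    print(zone, meeting.astimezone(ZoneInfo(zone)).strftime("%a %H:%M"))
# 6 pm London works out to 1 pm Eastern, 10 am Pacific, and 1 am the next day
# in Manila, i.e. roughly 3 hours earlier than this 4:30 am call for Josh.
```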

441 00:40:56.500 00:41:00.999 Abigail Zhao: And I also low-key have class during that time. So

442 00:41:04.180 00:41:05.940 Uttam Kumaran: Class is more important. Class is.

443 00:41:07.800 00:41:09.319 Uttam Kumaran: we’ll fill you in on what happened.

444 00:41:09.320 00:41:11.060 Abigail Zhao: Okay, yeah, that sounds good.

445 00:41:12.240 00:41:15.139 Chang Ho: Yeah, we can send you an executive summary.

446 00:41:17.210 00:41:18.410 Uttam Kumaran: Right. I can write.

447 00:41:18.410 00:41:20.130 Chang Ho: Class more important. I don’t know.

448 00:41:20.130 00:41:26.079 Uttam Kumaran: No, I mean, I don’t know if I should say the truth or just be supportive,

449 00:41:26.470 00:41:27.160 Uttam Kumaran: because

450 00:41:27.890 00:41:28.480 Uttam Kumaran: I had you.

451 00:41:28.480 00:41:29.080 Chang Ho: You’re like.

452 00:41:29.799 00:41:32.679 Uttam Kumaran: I had no interest

453 00:41:32.900 00:41:34.590 Uttam Kumaran: in school at all.

454 00:41:34.970 00:41:38.119 Uttam Kumaran: I didn’t even know. Yeah, I mean, like, yeah.

455 00:41:38.120 00:41:38.540 Abigail Zhao: And finish.

456 00:41:38.540 00:41:39.390 joshuadeveyra: College.

457 00:41:39.670 00:41:40.330 joshuadeveyra: cool.

458 00:41:40.330 00:41:44.950 Abigail Zhao: Attendance is mandatory for that class, so I do kind of need to be there. Unfortunately.

459 00:41:44.950 00:41:51.720 Uttam Kumaran: You’d be surprised what that means, because I’m telling you, I’m here, and I skipped some attendance-mandatory classes, too.

460 00:41:53.400 00:41:58.116 Uttam Kumaran: But like, no pressure. You can make your own decisions.

461 00:41:58.860 00:42:02.480 Abigail Zhao: She’s, like, pretty tedious about who’s there. Like, she’s one of

462 00:42:02.480 00:42:02.950 Uttam Kumaran: Those teachers.

463 00:42:02.950 00:42:03.879 Abigail Zhao: who will, like,

464 00:42:04.030 00:42:06.389 Abigail Zhao: call out names to like.

465 00:42:07.430 00:42:10.289 Chang Ho: And shame! Shame on you! Shame!

466 00:42:10.540 00:42:13.829 Abigail Zhao: Yeah, sort of thing. Yeah, bad influence.

467 00:42:14.150 00:42:19.399 Chang Ho: Okay, fine, if that’s what we have. That’s the sort of demon we’re dealing with.

468 00:42:19.660 00:42:20.170 Abigail Zhao: Yeah.

469 00:42:20.700 00:42:26.070 Chang Ho: Then you should probably go. Yeah, it sounds fatal.

470 00:42:26.740 00:42:29.504 Chang Ho: So yeah, anyway. Nice chatting to you guys

471 00:42:30.310 00:42:36.269 Chang Ho: first, yeah, first meet with Josh and Abigail. Sorry, is it Joshua or John?

472 00:42:36.738 00:42:39.419 joshuadeveyra: Yeah, my full name is Joshua. So either way works.

473 00:42:39.420 00:42:42.310 Uttam Kumaran: We call him Miguel, but I would say anything works.

474 00:42:42.850 00:42:45.510 Chang Ho: I think Miguel fits your moustache better.

475 00:42:45.520 00:42:46.889 Chang Ho: I’ll go with Miguel.

476 00:42:47.090 00:42:48.210 joshuadeveyra: Yeah, thank, you.

477 00:42:48.210 00:42:49.420 Chang Ho: Okay, cool.

478 00:42:49.890 00:42:50.830 Chang Ho: anyway.

479 00:42:51.050 00:42:52.609 Chang Ho: Nice to see you guys.

480 00:42:52.610 00:42:53.930 Abigail Zhao: Yeah, it’s nice meeting you.

481 00:42:53.930 00:42:55.030 joshuadeveyra: Thanks, guys. Bye-bye.