Meeting Title: Uttam-Vincent-Chang Date: 2024-11-07 Meeting participants: Vincent Jeanselme, Chang Ho, Uttam Kumaran, Miguel


WEBVTT

1 00:04:17.860 00:04:18.610 Vincent Jeanselme: Hi.

2 00:04:20.010 00:04:21.019 Miguel: Hey, Vincent?

3 00:04:33.190 00:04:34.120 Miguel: Hey! We test.

4 00:04:34.870 00:04:36.489 Vincent Jeanselme: Hi! How are you?

5 00:04:37.600 00:04:38.270 Uttam Kumaran: Hey!

6 00:04:38.270 00:04:39.030 Miguel: Yes.

7 00:04:39.340 00:04:40.800 Uttam Kumaran: Hey? Good! How are you?

8 00:04:41.310 00:04:42.140 Vincent Jeanselme: Good.

9 00:04:43.820 00:04:45.579 Uttam Kumaran: Let me just get this.

10 00:04:47.400 00:04:49.070 Uttam Kumaran: How’s everything really nice to meet you

11 00:04:49.180 00:04:50.719 Uttam Kumaran: for taking the time.

12 00:04:51.685 00:04:54.870 Vincent Jeanselme: Thank you. Yeah, very much the same. Very excited to meet you.

13 00:04:57.190 00:05:00.779 Uttam Kumaran: Great. How do you? How do you know, Chang like, how did you guys get connected.

14 00:05:01.904 00:05:14.299 Vincent Jeanselme: We work together at Turing. Last year, or like 2 years ago. Yeah. So it’s when we met and we worked on some project together. Intersection of health care and machine learning. That’s how we.

15 00:05:14.300 00:05:14.760 Uttam Kumaran: Okay.

16 00:05:14.760 00:05:15.400 Vincent Jeanselme: We didn’t.

17 00:05:15.910 00:05:20.650 Vincent Jeanselme: I think he will join. He might be a bit late, he told me, like he might be just a bit late today.

18 00:05:21.070 00:05:29.549 Uttam Kumaran: Okay. Okay, yeah. I know he mentioned he might not be able to make it. But that’s okay. I know. I I don’t know like how much he’s told you. Maybe I’ll give you the

19 00:05:29.570 00:05:55.689 Uttam Kumaran: the 90 second about Brainforge and me and us, and then like. Oh, so Brain forge is a company that I started about a year ago. We started the company primarily focused on doing data, engineering, and data modeling related like, consult projects. So that’s my background. I’m a data engineer. I studied computer engineering. I worked as a data engineer for a bunch of companies like lead data teams.

20 00:05:56.125 00:06:01.489 Uttam Kumaran: And then also did a little bit of stint leading product at a startup. Quit. My job

21 00:06:01.550 00:06:03.880 Uttam Kumaran: was like, what do I do next

22 00:06:04.070 00:06:22.589 Uttam Kumaran: decided that, like, I wanted to go work for multiple clients and like, try to see if I can sort of like drum up some business on my own got, you know, kind of lucky, and and started to get some great clients and develop some, you know. Great data projects for them slowly built up the team and then also through the whole process. I’m like.

23 00:06:22.650 00:06:47.632 Uttam Kumaran: basically addicted to doing stuff with AI and the nice thing is now that we have Brainforge, and we have folks like Miguel and a few other folks that are really, really experienced. And AI implementation and automation, we’re we’re able to kind of turn that around as a business. So for me, it’s like I’m an engineer. So it’s amazing that I can get paid to do some of these stuff that are like super super fun. But also we

24 00:06:48.150 00:07:06.764 Uttam Kumaran: you know, we have these 2 parts of the business that we’re starting to find ways that they intersect. So most of the team is is engineering. And and you know, we are. We had some people recently kind of on the sales and marketing side. But we’re growing, and you know we I got connected through Chang through another friend of mine, Neil, who is like

25 00:07:07.190 00:07:31.650 Uttam Kumaran: he kind of. It’s a project manager, product manager, friend of mine in New York. And then we’ve been talking about, you know. How do we kind of get into the to the healthcare space. More specifically, I think, where you know, we’ve we have a lot of opportunity is healthcare, I think, has a lot. They’re still super behind. You know, a lot of like B, 2 B Sas, and a lot of like the high tech fields. However, I do think that

26 00:07:31.810 00:07:36.149 Uttam Kumaran: compare like. It’s hard for someone like me to just enter, because it takes a lot of

27 00:07:36.250 00:07:55.440 Uttam Kumaran: not only jargon, but like, know how and understand the pain points and things like that. And that’s really where me and Chang started to talk about like, is there a way that we can operate together to kind of use the brain for infrastructure. Right? Like, we’re an established company. We have all these case studies. How can we use that? And then his subject matter expertise to kind of

28 00:07:55.450 00:08:06.606 Uttam Kumaran: break into, you know, medical and clinical and healthcare and that that’s primarily at the moment like looking at AI and and automation related projects. But

29 00:08:07.190 00:08:28.889 Uttam Kumaran: of course, like I, when I go to new clients. So we kind of, I just asked about what their problems are. And then we kind of come up with ways to solve them. So I’m based here in the States. Miguel runs sort of our AI practice in terms of like, actually, the implementations and and engineering he’s based in in the Philippines. And then we have folks kind of around

30 00:08:29.040 00:08:34.450 Uttam Kumaran: the world like we have folks in the Us. In Argentina, in Asia.

31 00:08:35.308 00:08:45.120 Uttam Kumaran: and so I yeah, just wherever the smart people are is is where they are. So. Yeah, that’s a little bit about us, all right. And then it’s a long, winded explanation. But.

32 00:08:45.740 00:08:48.460 Vincent Jeanselme: I know sounds super interesting, like, definitely.

33 00:08:49.900 00:08:50.822 Uttam Kumaran: Yeah. And then,

34 00:08:52.260 00:08:53.629 Uttam Kumaran: sorry. Yeah, go ahead.

35 00:08:56.930 00:08:59.209 Vincent Jeanselme: Sorry I didn’t hear you. What did you say?

36 00:08:59.620 00:09:01.838 Uttam Kumaran: No, I said. I said, sorry, I interrupted. You. Go ahead.

37 00:09:02.495 00:09:21.879 Vincent Jeanselme: No, no worries. So yeah, no very excited, like, I’m very interested also, like in the application of machine learning in healthcare. That’s where I did my Ph, like I worked during my Phd. On that topic particularly with like the aspect of fairness, which is like more and more important, I think, particularly in the States, like, there is a lot of AI deals, and like

38 00:09:21.970 00:09:33.090 Vincent Jeanselme: some kind of requirement for companies to comply with some AI regulations. And so that’s what I’ve been working more like on the technical side. But it’s very much what I’m interested in

39 00:09:33.100 00:09:59.790 Vincent Jeanselme: so definitely resonate with me. And I also have, like some experience with different healthcare system in the Us. Particularly I work like at Mayo Clinic. I worked with Upmc with our medical school, like with a few schools around here. So yeah, definitely interested in those topics. Very challenging, but very like the time to do that, I think, is like they definitely need to improve some of the automatizations.

40 00:10:00.070 00:10:01.660 Vincent Jeanselme: Yeah.

41 00:10:01.760 00:10:03.510 Vincent Jeanselme: sounds sounds very cool.

42 00:10:06.460 00:10:31.069 Uttam Kumaran: Yeah. So I mean, I guess I’m interested to hear what Chang has told you so far and like, kind of like, get your perspective on, like what you guys talked about. You know, we’ve been talking for a little bit about how to break into medical you know, I’ve worked with healthcare data and stuff like that before. But I think Chang is a lot of understanding on the inner workings of, you know, the actual day to day. And so we’re trying to find ways.

43 00:10:31.190 00:10:50.970 Uttam Kumaran: Basically, the the 1st step for us is, can we figure out what the used cases are? So I can start marketing right? And what does marketing mean from our side? Like we start to send emails to like operators and decision makers. We come up with Demos and proof of concepts of things that we can do and basically have enough serials to kind of think about a pitch

44 00:10:51.254 00:10:57.300 Uttam Kumaran: and of course, like we don’t have any healthcare clients at the moment, but the kind of the goal is like.

45 00:10:57.400 00:11:17.139 Uttam Kumaran: Everybody who is involved in the project like gets a piece of it. And we’re kind of thinking about how to form this like part of the company which is like this medical industry. And how do we serve? How do? How do we take our expertise across a bunch of other industries and like, basically serve, you know, medical, so interested in like what you guys discussed. And maybe we can even just start there.

46 00:11:17.880 00:11:39.159 Vincent Jeanselme: Yeah, we discussed about like a few, maybe low hanging fruit in the healthcare system that needs like automaticization, like we were speaking mainly about scheduling. That’s still taking a lot of time for doctors, for instance, just organize a meeting, and like once one is canceled like their whole planning is

47 00:11:39.640 00:12:06.109 Vincent Jeanselme: like lost, so like simply that, I think would be a very good like Entry Point, where there is low risk as well to deploy such models, because I think that the challenge also in healthcare is like also like liabilities that will come with developing a more treatment. For instance, like decision making, or this kind of models that might be impacting care directly. That might be a bit more challenging versus like

48 00:12:06.200 00:12:21.750 Vincent Jeanselme: there’s so much in the healthcare infrastructure that could benefit from automation, and that is not done right now. I think that would be a place, maybe to start. I think what we do also with 10 a lot. It was like

49 00:12:22.070 00:12:42.909 Vincent Jeanselme: big, big healthcare system might already have some of that, and might already have like people doing that. But that would be more like the small to medium hospitals in the Us. In the Uk. Maybe less, but like in the Us. I think it’s more like the place where where those smaller hospitals would benefit more. I think.

50 00:12:44.280 00:13:05.823 Uttam Kumaran: Yeah, I mean, you know, we were just. We’ve just been talking about this today and this week about a lot of these voice models, but also like the phone calling models like, there’s a couple of tools like Bland AI, and where basically, you can set up like voice based agents that can call. And so I’m we’re actually speaking with

51 00:13:06.250 00:13:23.771 Uttam Kumaran: someone. Later today, I I believe there, we we got an inbound request. About setting this up for a certain clinic. I believe it’s it’s a healthcare use case. Although we we still need some more information. But basically they want to set up something which is like appointment, scheduling appointment, rescheduling

52 00:13:24.090 00:13:24.480 Vincent Jeanselme: Yeah.

53 00:13:24.480 00:13:26.350 Uttam Kumaran: Basically the the

54 00:13:26.400 00:13:35.900 Uttam Kumaran: if I had to form like the the pitch, and like kind of like, what’s the roi for me? It would be like. Look, it’d be exactly what you said is, there is a huge issue.

55 00:13:35.980 00:13:49.419 Uttam Kumaran: probably, both on the Telehealth side and the in person side that’s related to appointment. Scheduling appointment follow ups and appointment like rescheduling basically and confirmation right? So it’s like

56 00:13:50.050 00:14:18.680 Uttam Kumaran: day before. Like as, for example, my girlfriend works in kind of like telehealth for like neurodivergent kids. So folks with like autism you know, like children’s health around like that sort of therapy. But she has to call people all day, basically like, understand whether they’re coming to the meeting. If they miss the meeting that they follow up reschedule all of that you could do with AI right now. And so I think there’s a lot of value in like

57 00:14:18.680 00:14:26.159 Uttam Kumaran: kind of those categories which is like calling someone before a meeting, making sure they have all their materials calling someone if they miss a meeting

58 00:14:26.270 00:14:47.230 Uttam Kumaran: automatically doing the follow up process and rebooking or rescheduling. And then the the last thing is I think that’s a really great place to measure. Roi. For example, if we can get from a clinic, hey? Tell me how much money you guys typically make per meeting? Tell me the law, how much, how many people actually attend the meetings? Or don’t attend and you get lost.

59 00:14:47.230 00:14:59.349 Uttam Kumaran: Then we have a clear kpi that we can measure towards to show our why. And the other thing is is like, if, for example, if they I this is where I don’t know what those numbers are, and I don’t know what like.

60 00:14:59.520 00:15:19.319 Uttam Kumaran: I don’t know what the industry kpis for those are. But like, for example, in sales, we do a lot of sales data. It’s really easy to explain. Like, Hey, you, we take a hundred calls out of that. There’s usually like a 2% conversion rate to a deal. If you if half, if 30 of those calls, you know, people don’t show up.

61 00:15:19.320 00:15:41.550 Uttam Kumaran: And and imagine if we were to cut that in half, how does that impact the amount of deals you end up closing? Then you kind of back into like, what’s the revenue lost in this similar example? One thing we would do is like, okay, let’s say, you guys have a 25%, no show rate. What if? And you, in order to close those you have like 10 people working, calling, imagine we had an AI that works.

62 00:15:41.630 00:15:56.420 Uttam Kumaran: And it’s like 9 cents a minute. So it’s automatically super less. And that’s our costs right? Imagine we could, we could make sure that we cut that in half. What is the opportunity for you like? Do you guys recoup like 20 KA month? Then it’s.

63 00:15:56.420 00:15:57.080 Vincent Jeanselme: Yeah.

64 00:15:57.080 00:16:00.240 Uttam Kumaran: We then understand the value, and then we can price towards that right?

65 00:16:00.563 00:16:23.029 Uttam Kumaran: That’s those are the situations I like like, because there’s a lot of things we’ve done where it’s hard to know the Roi and I want to be closer to things where it’s like, it’s really easy to measure our impact. Right? Like, if we if they use our system. And they immediately say, like, Wow, we’ve we went from a a loss rate of 25% to to 10%.

66 00:16:23.160 00:16:37.504 Uttam Kumaran: Then it’s like they’re like, we just saved 30 grand. Of course, here’s 10 grand, right? Like, it’s so easy to to do that. And then, lastly, those are also great case studies that are results driven for us to go. Resell, you know.

67 00:16:38.040 00:16:38.750 Vincent Jeanselme: Yeah.

68 00:16:39.135 00:16:44.529 Uttam Kumaran: Yeah, my brain dove right now on, like what I think is the opportunity.

69 00:16:44.950 00:17:02.020 Vincent Jeanselme: Yeah, no. I definitely like, I think, whatever can reduce the burden on doctors, because I think it’s like quite scarce resource in like in the in this like healthcare system, is very much like the doctors, and right now they spend so much time doing that kind of organization, and they.

70 00:17:02.020 00:17:03.070 Uttam Kumaran: Really, okay.

71 00:17:03.070 00:17:14.119 Vincent Jeanselme: So like spending, like reducing that too much is great as well as like all the emails they have to do, they have so much emails in like every day, just to reschedule like or

72 00:17:14.170 00:17:32.330 Vincent Jeanselme: send a reminder. All of. That is so much time they spend, not taking care of patients. And I think it’s really much quantifiable as well like you can know how much hours they spend on emails compared to how much do they spend on like patient care, I think, yeah, it’s a yeah. It’s a great approach, I think, to

73 00:17:32.340 00:17:35.810 Vincent Jeanselme: to reduce to improve the healthcare system in general.

74 00:17:36.540 00:18:02.159 Uttam Kumaran: Do you think there’s like any other sort of like low hanging fruit ideas that you know. Cause one thing I would do is we would just basically try to run, maybe test for like the emails. Like, if if there’s any other really low hanging fruit where you’re totally right where we’re not interacting with a lot of patient like health data, right? So there’s not. There’s low regulation. It’s easy to measure it hits. And the other thing is is like.

75 00:18:02.510 00:18:32.300 Uttam Kumaran: if because I again I don’t, I mean, I would say in my network, I don’t know a lot of doctors, but I may try to find someone to ask, because basically, if I if I go to them and you’re like, do you guys have a problem with patient rescheduling? We want to be like, yes, like, Oh, my God, that’s our biggest problem, right? Like, I don’t want to go after ideas where there’s not that reaction, because people aren’t going to be excited, and we’re going to be learning a little bit alongside them in this process. But if there’s anything else in your mind that comes about like

76 00:18:33.400 00:18:39.600 Uttam Kumaran: things that where it’s like, it’s like a jarring like, Oh, my God! That’s like a such an annoying problem that we have.

77 00:18:40.400 00:18:44.050 Uttam Kumaran: yeah, we don’t need a solution. But just to think about it.

78 00:18:44.830 00:18:52.280 Vincent Jeanselme: Yeah, I think one of the other aspect in the Us. Particularly like the billing they spend. They spend so much time. Then, after

79 00:18:52.520 00:19:02.149 Vincent Jeanselme: like trying to be, because there is so much that to to take into account, like the insurance, what will be reimbursed for like that particular patient? What does the hospital wants like?

80 00:19:02.580 00:19:17.879 Vincent Jeanselme: That’s a problem from the data we receive, because, like the data is very biased, it’s not representative of the patient condition, but like of what is refundable for that particular patient. But they need to do that. And that’s like an optimization problem. They spend hours doing every single day.

81 00:19:18.130 00:19:27.500 Vincent Jeanselme: I think it might be harder, because it’s closer to patient care. But it’s definitely something that would benefit from automate optimization. Yeah, for sure.

82 00:19:28.900 00:19:35.030 Uttam Kumaran: Okay? So I like that, too. Yeah, billing, yeah, I mean, especially working with insurance providers. All that.

83 00:19:35.030 00:19:47.078 Vincent Jeanselme: Yeah. Yeah. Oh, that’s a nightmare. Yeah. Then there is like in the care itself, there is a lot that can be automatic. Yeah.

84 00:19:48.110 00:19:53.799 Vincent Jeanselme: what? I was surprised, but like it’s much harder, because again, like that’s closer to the patient.

85 00:19:54.020 00:20:14.680 Vincent Jeanselme: You need like FDA approval and all that. But there is a lot of things that are bottleneck in current care that takes hours to do, and that could definitely benefit from AI like there is more and more. For instance, in oncology you need to look at a lot of different slides of like a cancer cell and identify to

86 00:20:14.700 00:20:20.250 Vincent Jeanselme: some characteristic to sorry to give the stage of the cancer.

87 00:20:20.520 00:20:43.780 Vincent Jeanselme: They need to do that by hand, one by one. And like, do you have 20 slides for one patient? You need to look at each single one, and then like fill a form of like 20 pages. I think there is opportunity for AI. That’s like definitely where I see like, Oh, you can directly identify which slides you should look at and even like, maybe pre fill like the report, instead of doing everything

88 00:20:44.870 00:20:45.490 Vincent Jeanselme: like.

89 00:20:45.970 00:20:46.840 Vincent Jeanselme: Yeah, I think.

90 00:20:46.840 00:20:48.264 Uttam Kumaran: Click on the so

91 00:20:48.990 00:21:09.510 Uttam Kumaran: like that’s a great no, that’s a great example of like. For example, there may be portions of that where you still want a doctor, but then, taking their information and filling out the forms is a solved problem, right? Like we have AI now that can Ocr use. Use the vision. Llm. Scan those forms, understand all the fields.

92 00:21:09.570 00:21:27.310 Uttam Kumaran: Take, maybe a transcript of the doctor, or even like use a voice to ask the doctor the questions. They just talk, and then it fills out the form. Right so. And it’s also thinking, like, where where should the AI be involved and where, where, maybe it shouldn’t. And that’s the core. That’s those are the sort of projects where I think

93 00:21:27.480 00:21:36.339 Uttam Kumaran: this is kind of my, the long game, and even where, where we’re having some luck is like, if you start in one of these organizations, hey, Chang.

94 00:21:36.340 00:21:37.040 Chang Ho: Oh, yeah.

95 00:21:38.221 00:21:39.868 Uttam Kumaran: We’re just talking about

96 00:21:40.490 00:21:59.709 Uttam Kumaran: a particular idea. But it’s like, if you start in one of these organizations you solve like the patient. Let’s say the patient, rescheduling problem. They now view you as just a team that could solve problems. They’re much more willing to be like, okay, why don’t we work with you, or give you some ability to do this like the cell scanning idea right?

97 00:22:00.490 00:22:07.499 Uttam Kumaran: It’s hard to come in and pitch that it’s easier to come in and pitch something where it’s like very easy to measure. Roi. It may even take us.

98 00:22:07.500 00:22:08.529 Vincent Jeanselme: Yeah. On the release.

99 00:22:08.530 00:22:12.690 Uttam Kumaran: To develop and deploy the appointment solution. And then they’re like.

100 00:22:12.790 00:22:22.521 Uttam Kumaran: so these are our, these are our guys. Right? That’s the sort of like progression. But it doesn’t mean we can. We can market to all of those ideas and start putting stuff out there.

101 00:22:22.800 00:22:28.259 Vincent Jeanselme: Yeah. But yeah, I think I agree with you, like, like, further away, you’re from the patient care

102 00:22:28.360 00:22:55.679 Vincent Jeanselme: easier. It is like to to get into that market like of like scheduling. I think you were speaking about Transcript, and I remember, like doctors in the Us. Spend a lot of time as well summarizing what happened like in the interaction. And I’ve heard about projects that try to listen to the conversation they have, and, like just transcript that and create a summary, or like a draft of the summary of the interaction, which I think is also an easy kind of

103 00:22:55.800 00:22:58.730 Vincent Jeanselme: use of AI for video.

104 00:22:58.730 00:23:04.159 Uttam Kumaran: We do that every meeting this meeting will get summarized and sent to slack for me. So I remember

105 00:23:04.260 00:23:27.310 Uttam Kumaran: what I promise to do like, because I have 10 meetings every day, so it’s like impossible right? And if I was more, if I was better at like starting my day with notes taking notes. But now I don’t have to do that because the AI does it. And the other thing is, it’s not just like summarize it. It has context about what we do, what this meeting is about, and then it’ll give us like, these are the action items.

106 00:23:27.820 00:23:28.330 Vincent Jeanselme: Yeah.

107 00:23:28.370 00:23:55.330 Uttam Kumaran: I I think I think like doctors. Of course I don’t know why we can’t scale that pro. I think. Of course, there’s some security concerns. But the nice thing is, there’s also open source technology that you can build this stuff on. And you could have this running on a device in their network and like it doesn’t even leave the device or doesn’t leave their their network. So that’s the nice things like you don’t. There’s a lot of open source providers that do this like voice to

108 00:23:55.650 00:23:57.080 Uttam Kumaran: voice to text.

109 00:23:57.396 00:24:05.779 Uttam Kumaran: And then you have, like you could run llama and a machine to do the summarization, like, I do think there’s ways of getting around the security concerns where none of this gets sent to the cloud.

110 00:24:08.930 00:24:14.490 Vincent Jeanselme: Yeah, like Cheng, we were speaking mainly about what aspect of

111 00:24:14.830 00:24:31.370 Vincent Jeanselme: care could be automaticized to reduce the cost, like, particularly of like doctors spending hours on emails or scheduling, or like all those aspects. But maybe you might have more insight of what like. Take a lot of your time in practice that could be

112 00:24:31.590 00:24:32.880 Vincent Jeanselme: like maybe

113 00:24:33.000 00:24:34.070 Vincent Jeanselme: automatic.

114 00:24:34.650 00:24:49.630 Chang Ho: I think the 1st thing to say is that there are a lot of players already in this market, right? Because they realize it’s the lowest hanging fruit. So there’s obviously a bridge. There’s Heidi, there is, and there are a few much smaller players. I think there’s also Nabla

115 00:24:50.478 00:25:09.290 Chang Ho: so active in Boston and France came out of France and a few others that are very local to. You know Australia, New Zealand. It’s probably company that in the Uk. Now, I think a lot of people have figured out that that’s 1 of the things that doctors want and need the most is a quick transcription software problem is that

116 00:25:09.550 00:25:23.789 Chang Ho: none of them are going to become successful unless they can integrate into epic or Cerna. And that’s why you’re heading more towards something that’s a little bit more regulation heavy. That requires a lot more time, partly because you’re trying to get epic and Cerner on board to make it interact.

117 00:25:24.260 00:25:24.780 Chang Ho: So

118 00:25:25.590 00:25:26.989 Chang Ho: I always feel like.

119 00:25:27.230 00:25:31.769 Chang Ho: even though that seems like an obvious place to go. I I feel like the the more

120 00:25:32.570 00:25:43.870 Chang Ho: the place where we can, we’ll be able to implement a model over a short time, or implement some product over a short period that so many regulatory, bound, regulatory obstacles will be in things like scheduling

121 00:25:44.685 00:25:48.860 Chang Ho: things like optimizing theater usage.

122 00:25:49.010 00:25:50.620 Chang Ho: ensuring that.

123 00:25:50.770 00:25:51.679 Chang Ho: Hear it

124 00:25:52.320 00:26:04.270 Chang Ho: like people’s vacation calendars are all aligned. For example, with theater. These are very simple things that Palantir and others have been focusing on for many, many years, and then done so successfully.

125 00:26:05.090 00:26:07.960 Chang Ho: But there’ll be smaller hospitals that will not be able to afford

126 00:26:08.230 00:26:17.990 Chang Ho: the palantir checks equally with Mckinsey and Bcg. Bain, etc. How they made their money out of big clinics and operations. A lot of it is to do with

127 00:26:19.060 00:26:21.899 Chang Ho: yeah, like, with operational efficiency

128 00:26:22.930 00:26:26.664 Chang Ho: where the little admin cuts that you can make

129 00:26:27.600 00:26:33.730 Chang Ho: And I feel like the the scheduling part is one thing. I was talking to Vanson the other day, Utam and Miguel

130 00:26:33.960 00:26:37.059 Chang Ho: like how we can be offering everything from something as

131 00:26:37.270 00:26:43.580 Chang Ho: so very structural, very mechanistic as that where it doesn’t require much medical domain knowledge. But a little bit would help

132 00:26:43.650 00:26:50.260 Chang Ho: all the way to being able to offer something a little bit more forward thinking in terms of

133 00:26:50.340 00:26:53.779 Chang Ho: that’s like a roadmap for us, like for us to create for them

134 00:26:53.970 00:26:58.939 Chang Ho: a roadmap for how to think about creating AI, a robust AI platform.

135 00:26:59.750 00:27:06.960 Chang Ho: What the regulate, what the regulation is currently in the Us. And how it might look going to the future, and that’s something that we will have to work on.

136 00:27:07.722 00:27:12.239 Chang Ho: How to register models, how to detect, model, drift.

137 00:27:12.270 00:27:15.310 Chang Ho: how to detect biases.

138 00:27:15.390 00:27:27.600 Chang Ho: and and obviously unfair on inequitable care, something that bounce on super expert in and you know these are buzzwords. Not but not just buzzwords. I think there might be

139 00:27:27.620 00:27:30.300 Chang Ho: actual financial incentives, which is why

140 00:27:31.030 00:27:36.199 Chang Ho: the fairness and bias aspect of AI has become such an Mba special

141 00:27:36.810 00:27:43.859 Chang Ho: like business schools are lapping it like lapping it up like, no like. There’s no tomorrow, because they all want some sort of

142 00:27:43.920 00:27:58.140 Chang Ho: legal checkbox thing to have happened, or for some for some consultancy to come in and said, Yeah, fine. You know you’re doing the right thing, or you’ve been seen to make the do the you know you’ve been seen to at least consult someone for expert advice on that.

143 00:27:59.410 00:28:06.900 Chang Ho: So I feel like that. That was the conversation that came from Lin Chen, the the global informatics director at Brigham women.

144 00:28:07.490 00:28:10.090 Chang Ho: Math. Gen. Last time was

145 00:28:10.470 00:28:18.320 Chang Ho: the likes of Duke and Stanford, and these large medical centers will probably already have internal talent that they’re trying to procure.

146 00:28:18.340 00:28:29.869 Chang Ho: but where we might have a little bit more. Success in his view is where is where there is a like a medium sized hospital that is academically linked, but just doesn’t have the same

147 00:28:29.980 00:28:32.660 Chang Ho: human power or financial

148 00:28:33.845 00:28:34.690 Chang Ho: oomph!

149 00:28:34.830 00:28:39.069 Chang Ho: As places like Brigham, mass, Gen. etc, to be able to

150 00:28:40.230 00:28:43.040 Chang Ho: to hire internal talent, but might want to just

151 00:28:43.410 00:28:45.500 Chang Ho: get into the AI space a bit

152 00:28:45.560 00:28:48.109 Chang Ho: with as much with that where you know.

153 00:28:48.400 00:28:50.369 Chang Ho: with players like ourselves.

154 00:28:51.170 00:28:57.420 Chang Ho: See if it’s something for them. Kind of explore a little bit that way. So I kind of wanted to sort of angle for that

155 00:28:57.780 00:29:01.339 Chang Ho: as a possible way to sort of pitch what we could offer.

156 00:29:01.360 00:29:14.481 Chang Ho: Yeah, like providing an AI blueprint. Basically, like, you know, are you interested in creating a robust AI platform for yourselves, like having AI models that are being developed by your researchers and other researchers from other hospitals and other

157 00:29:16.560 00:29:18.577 Chang Ho: from from Academia

158 00:29:19.620 00:29:27.909 Chang Ho: and not exactly from an industry as well, and being able to test that internally on your own data sets, and being able to validate that, and having some control over that. Do you want that kind of

159 00:29:28.120 00:29:30.340 Chang Ho: capability? I like.

160 00:29:30.940 00:29:32.650 Chang Ho: you know, that could be one thing we could offer.

161 00:29:32.650 00:29:39.559 Uttam Kumaran: It’s like, yeah, it’s almost like 2 things. I think it’s 1 is I think the really low hanging fruit

162 00:29:39.580 00:30:06.490 Uttam Kumaran: super clear, ro measurable Roi is appointment, scheduling, rescheduling calendar management folk. And again, there’s a lot of tools that are that are that Miguel has even experience in, and we’ve been playing around with that are around phone calling. And you could handle all of this. Now, via AI rescheduling. And it’s very, very good, which means in 3 months it’ll be great. So

163 00:30:06.530 00:30:35.799 Uttam Kumaran: there’s a lot of great stuff there. I think. Also, what you just brought up is something yeah, I totally didn’t think about, and I think maybe the pitch is either again for me. Now I think about like, what’s the feeling you want to grab is like, how do you? How do you almost tell them like, how do you guys keep up with these larger hospitals that have these teams like, how do you guys have a idea about AI governance. AI fairness, right? And that’s also ways where even for their team to say, talk to us. Or maybe we do a session on, like what that even is

164 00:30:36.480 00:30:47.249 Uttam Kumaran: that can start as that. And then we basically find ways and solutions there. So there’s almost like 2 things. One, I think the way in is either is that which is like

165 00:30:47.260 00:31:11.000 Uttam Kumaran: we do these, we we can help you solve this appointment problem. In addition, we offer like consultatory sessions on these sorts of topics that allow you to keep up with these large firms. I think that’s start. I think all these the more conversations we have. Clearly, we’re basically sprinting towards like what those

166 00:31:11.290 00:31:16.379 Uttam Kumaran: pitches are, I think, in terms of like, what I want to take. This is, we’re gonna craft.

167 00:31:16.922 00:31:21.900 Uttam Kumaran: Basically, we’re gonna we’re gonna start to create messaging. And

168 00:31:22.382 00:31:24.070 Uttam Kumaran: a lead list around

169 00:31:25.047 00:31:41.709 Uttam Kumaran: this like campaign, which will basically start as like, do you guys have this appointment scheduling problem? We will create a list of this, the companies that you mentioned Chang. And maybe I’ll just. I’ll I’ll just make all of us are in slack and we can look at that and be like. These are the.

170 00:31:41.710 00:32:00.200 Uttam Kumaran: These are the hospitals, and also, these are the people in the hospitals. We should be targeting the second. And we’ll just start that. I think the other thing is, if I could get your guys help and maybe helping our content team put out like one or 2 articles about healthcare and AI, we can throw together some topics.

171 00:32:00.533 00:32:08.139 Uttam Kumaran: And it’s very. It’s honestly can be as low touch as you want. Like we use. We’re using AI to generate a lot of content. And it’s kind of like.

172 00:32:08.160 00:32:30.359 Uttam Kumaran: we have some source material. We then use AI to generate, then I review, but I think that would also, and we can publish that on our Linkedin on the blog, on the Newsletter. And and also in our email series, we will when we email like these clinics, we’ll say, check out this case study. We wrote or check out this this post. We just did and then maybe start there.

173 00:32:30.360 00:32:40.879 Uttam Kumaran: I mean, I think, for me, or what a huge win would be is like if if our group can like, hop on the phone with one of these clinics, explain like.

174 00:32:40.970 00:32:55.209 Uttam Kumaran: start with the appointment scheduling problem, and then, as like a short term, solve long term, talk about opportunities in AI and building models, and also keeping up with these larger firms, and then just try to get someone to say.

175 00:32:55.230 00:33:01.160 Uttam Kumaran: Okay, I want it right? Like that’s it. I think from there we can think about.

176 00:33:01.170 00:33:13.515 Uttam Kumaran: I mean, of course, implementing the solution for them, but but kind of seeing where it goes on. Our team. I know, Shane, you’re familiar, but we have Miguel. We have another AI automation person, Casey.

177 00:33:13.990 00:33:24.346 Uttam Kumaran: so. And then, of course, I think we all can collaborate and work on it. I think we do have enough muscle to kind of execute which I’m really really excited for

178 00:33:25.150 00:33:27.790 Uttam Kumaran: So let me let me see, I think I have enough

179 00:33:28.200 00:33:34.699 Uttam Kumaran: to kind of get going on a campaign. I’ll get your everything organized. Get your guys thumbs up.

180 00:33:34.720 00:33:38.720 Uttam Kumaran: We’ll connect you with like we’ll connect you guys with our content person.

181 00:33:39.760 00:33:40.534 Uttam Kumaran: See?

182 00:33:41.960 00:33:52.379 Chang Ho: I think, just just to just to elaborate on that mate. I think there is that like, can we help you with your scheduling? Can we help you with operational efficiency in the theatre? Can we help you? I think you sort of say.

183 00:33:52.440 00:34:12.499 Chang Ho: can we help you with scheduling? Eg. You know, operating room efficiency or operating room like staff allocation, alignment, vacations, staff rostering that sort of thing. There’s that aspect, bam bam, bam bullet point classic problems that hospital admin have to deal with all the time. Waste a lot of money on.

184 00:34:15.550 00:34:30.130 Chang Ho: that’s all like very palantir like stuff. We’ll just call it that for the sake of argument, because it makes it easier. That’s super palantir because it’s easy. It’s scalable. They know how to do it. They’ve created a playbook internally. They do it time and time again, you know.

185 00:34:30.260 00:34:32.550 Chang Ho: Lather rinse repeat.

186 00:34:34.073 00:34:43.536 Chang Ho: I think on the more abstract end there is the. Can we create for you a capability. This is number. The biggest second heading will be

187 00:34:43.980 00:34:45.690 Chang Ho: Can we create for you?

188 00:34:46.000 00:34:50.669 Chang Ho: So the the blueprint and regulatory governance roadmap

189 00:34:50.969 00:34:52.460 Chang Ho: for dealing with

190 00:34:53.139 00:35:06.719 Chang Ho: implementing AI or getting an AI model going? What should you be looking out for as an institution? What are the red flags? How do you test for bias. What do you need in terms of like basically more strategy and regulation that’d be like.

191 00:35:07.030 00:35:12.650 Chang Ho: can we create for your blueprint? For how to get an AI ecosystem going in your hospital.

192 00:35:13.030 00:35:15.230 Chang Ho: We’re testing out these new AI models

193 00:35:15.270 00:35:30.999 Chang Ho: for seeing if they’re flawed, you know, etc. Are there any like security issues. Blah, blah blah can you just stay within the law on all this stuff? So that’s 1 other aspect that we can. That’ll be a lot more white paper, oriented rather than actual coding or product.

194 00:35:31.380 00:35:40.530 Chang Ho: And then finally, the thing where I see there’ll be a bigger, you know again, like an roi. Everything has to be tethered to an roi right? They want. They want to see if it’d be useful for them.

195 00:35:40.670 00:35:43.679 Chang Ho: and if it’s been worth their money hiring us.

196 00:35:44.490 00:35:54.620 Chang Ho: and the final and it’s probably the most exciting part of the puzzle can just say is, can we find for you through large language models and other things.

197 00:35:55.247 00:36:01.469 Chang Ho: And our internal like domain expertise, can we find more patients who are eligible for certain treatments for you?

198 00:36:01.600 00:36:03.389 Chang Ho: Because you’re losing money right? There?

199 00:36:04.030 00:36:04.800 Chang Ho: Yeah.

200 00:36:05.860 00:36:07.750 Uttam Kumaran: Yeah, that’s amazing. Yeah, I didn’t.

201 00:36:07.750 00:36:21.850 Chang Ho: So. So if you think about that as a sort of okay, this is where you could be making money. But you’re just not because you’ve missed out these patients who are currently eligible for certain treatments. It will take hours and hours for people to sort of scour through these notes.

202 00:36:22.190 00:36:23.310 Chang Ho: Can we

203 00:36:23.330 00:36:28.121 Chang Ho: do that free semi automatically? There are players that there there are startup

204 00:36:28.680 00:36:33.219 Chang Ho: startups that are doing this already, but the amount that they’re charging, I think, is.

205 00:36:33.730 00:36:47.319 Uttam Kumaran: But also they’re not. They don’t come and implement it for you. This is the thing. There’s a lot of people that are selling software. But there’s no people who come in actually with like dig the holes, basically is how I describe it to folks.

206 00:36:47.700 00:36:58.689 Chang Ho: Yeah, oh, there is, there is return there is. So there’s there’s there’s a company that I admire a lot that’s come out of mit called

207 00:36:59.070 00:37:00.220 Chang Ho: layer health.

208 00:37:00.390 00:37:02.020 Chang Ho: And they’re doing exactly that.

209 00:37:02.320 00:37:05.529 Chang Ho: They’re they’re bouncing from one academic medical center to another

210 00:37:05.920 00:37:09.370 Chang Ho: and using all of their expertise to

211 00:37:09.900 00:37:11.409 Chang Ho: to basically extract.

212 00:37:12.140 00:37:13.640 Chang Ho: create dashboards

213 00:37:13.870 00:37:14.960 Chang Ho: so that

214 00:37:15.000 00:37:19.279 Chang Ho: people feel like people within the hospital feel like they have control. But ultimately it’s

215 00:37:19.310 00:37:21.990 Chang Ho: the Mit layer health guys that are

216 00:37:22.200 00:37:24.049 Chang Ho: doing all the hard work

217 00:37:24.080 00:37:25.420 Chang Ho: and filtering out.

218 00:37:25.420 00:37:27.160 Uttam Kumaran: But I also, I’m also

219 00:37:27.380 00:37:33.890 Uttam Kumaran: yeah. I also think it’s just like these software tools and methods and frameworks

220 00:37:33.960 00:37:36.720 Uttam Kumaran: are changing at such a pace.

221 00:37:37.330 00:37:40.830 Uttam Kumaran: There’s no even. It’s hard for us

222 00:37:40.860 00:37:50.349 Uttam Kumaran: to keep up, and we’re in the business of keeping up with it that there’s no way that, and there’s just a. So there’s just incentive from the software companies are going to.

223 00:37:50.400 00:38:06.339 Uttam Kumaran: They’re gonna argue that they have the best solution. The people in the actual operators. They’re gonna be like, I, I have a full time job. There’s no way I can pick which one’s the best and so there’s somebody who’s almost like playing for playing.

224 00:38:06.390 00:38:29.449 Uttam Kumaran: not on either playing just for the the consultant. Right? The client, right? It’s like, we say, whether it’s Bland AI, whether it’s this software, this software. It doesn’t for us, it doesn’t really matter. It’s actually we only care about the solution. We are much more well researched in a field that is changing so often. That that goes to 2 things. One is like

225 00:38:29.450 00:38:49.790 Uttam Kumaran: for us to develop them of the phone, the appointment scheduling solution. Let’s say, a new back end software comes out this way. Better. They don’t have to manage like maintaining that and updating that we automatically sort of like handle that. The second thing. So that’s on the vendor selection side. The 3rd thing is, you’re right on the ecosystem building. Right? Why don’t you lead on us to help you nurture that

226 00:38:49.790 00:38:59.429 Uttam Kumaran: ecosystem within your organization. I really liked how you mentioned that. And then the 3, rd there’s gonna be things where it’s so bespoke to your organization.

227 00:39:00.910 00:39:11.260 Uttam Kumaran: that like, it’s a a software that’s not gonna exist for a few years, especially not gonna exist outside of epic for a few years that handles your specific.

228 00:39:11.700 00:39:38.000 Uttam Kumaran: It’s gonna take 5 or 6 years for some of these, because all this stuff right now is so generic. And right. So there’s a lot of providers for Apis Sdks. But the the people that are building vertical AI software, I look at it. I’m like this is gonna be outdated because your requirements are gonna change in a few years. So in in just like 6 months. So I think there is something like in data and some other in like software. It doesn’t change this rapidly.

229 00:39:38.939 00:39:39.740 Uttam Kumaran: The.

230 00:39:40.075 00:39:48.810 Chang Ho: You’re asking one of the questions that’s going to end up being the devil’s advocate question from some clients who are a little bit more astute.

231 00:39:48.840 00:39:59.100 Chang Ho: You’ve hit. You’ve hit the money in the mark, they’ll say. Well, let’s say fine. Let’s just pay you for 3 months. 6 months worth of work. You give us a model, and it will become to function 7 or 8. What do we do then?

232 00:39:59.240 00:40:00.909 Chang Ho: Why do we even pay you for it?

233 00:40:02.370 00:40:05.070 Uttam Kumaran: Yeah, I mean, I think the biggest thing is you. You

234 00:40:05.080 00:40:27.920 Uttam Kumaran: you have confidence, and we have transparency in the architecture way we build that it’s easy to add new foundation models. It’s easy to implement like different flows. But again, this is software you have to maintain. There’s some level of maintenance, right? This isn’t like nothing. No, no vendor or software you use right now is a set it and thing.

235 00:40:27.940 00:40:47.850 Uttam Kumaran: So just like every other tool in their organization. There has to be a stakeholder, and there has to be someone who’s who works with us to maintain that. But that’s where you trust on us building scalable solutions right? Like actual software that can evolve. That’s like, we’re just you gotta just trust that we’re not like shitty engineers.

236 00:40:48.103 00:41:04.809 Uttam Kumaran: But at the same time. This is where I think a lot of other consultants will try to lock people in and say, this is our proprietary software, and like. If you turn, turn us off, you lose it. I don’t play. I don’t want to play games like that, because I think if we’re the best at what we do. They’ll keep us on. Anyways.

237 00:41:04.830 00:41:31.789 Uttam Kumaran: it’s not worth like gatekeeping right? Like things like that. For me. It’s it’s actually like I prefer them to have a full transparency in the way we architect things and the flows, and for them to understand and want to build that with us, doing it in a way where it’s like Palantir, where it’s like, oh, we have like Palantir software, and we implement it for you. But you can’t turn us off. That’s that’s just I don’t. I don’t agree with. I don’t think we need to play play like that. I think if our results really shine, they’re gonna want to keep us

238 00:41:32.190 00:41:34.790 Uttam Kumaran: you know. I don’t have fear. I don’t fear of

239 00:41:35.630 00:41:36.329 Uttam Kumaran: I don’t know.

240 00:41:36.330 00:41:50.010 Chang Ho: Yeah, I think that’s a fair response. I think with the patient identification problem. That is a very hard problem. By the way, because you’d be dealing with a lot of unstructured information. And also, you know.

241 00:41:50.140 00:41:53.749 Chang Ho: the healthcare provider will have to be willing to give us some data on that

242 00:41:53.920 00:41:57.959 Chang Ho: for us to play with right like our models will have to be able to interact with

243 00:41:58.405 00:42:00.750 Chang Ho: data that’s quite confidential, even if it.

244 00:42:00.750 00:42:01.559 Uttam Kumaran: Yeah, structures.

245 00:42:01.820 00:42:10.629 Chang Ho: So I can see that there might be quite a lot of obstacles there in terms like liability. And whatever else one of the things that van. So I briefly discussed was how

246 00:42:10.730 00:42:13.920 Chang Ho: something that just sits a little bit more close to the bone.

247 00:42:14.520 00:42:15.900 Chang Ho: Yeah, like that

248 00:42:16.334 00:42:20.300 Chang Ho: is going to need a little bit more thinking behind it, and as well as

249 00:42:20.410 00:42:32.530 Chang Ho: ensuring that, legally speaking, we’re in the right place. The hospital needs to feel like their legal team has also, like assessed this properly. And so that fine like risks are okay to to have in terms of the clinical risks

250 00:42:32.580 00:42:38.160 Chang Ho: as well. I think the way in one way we can abrogate that as as readily as possible would be just to say, Look

251 00:42:38.588 00:42:41.980 Chang Ho: just do everything you’re doing at the moment as you are.

252 00:42:42.180 00:42:45.680 Chang Ho: we’re going to try and identify more patients for you. Who

253 00:42:45.800 00:42:51.309 Chang Ho: could have a procedure, who hasn’t, who, according to current guidelines, blah blah blah

254 00:42:53.170 00:42:56.479 Chang Ho: and you can always have a human in the loop, and you should have a human in the loop

255 00:42:56.590 00:42:57.390 Chang Ho: to

256 00:42:57.550 00:42:59.989 Chang Ho: double check that those patients are eligible.

257 00:43:00.560 00:43:01.350 Uttam Kumaran: For sure.

258 00:43:01.500 00:43:09.239 Chang Ho: And then that way, you can sort of say, well, look, we can, we can, we can. This is going to be slightly experimental, because we can’t guarantee for you. This is going to happen.

259 00:43:09.509 00:43:15.739 Chang Ho: But it’s something like I mean, maybe don’t phrase it, phrase it like that. But you can say, we will help develop your model

260 00:43:15.830 00:43:17.139 Chang Ho: to identify

261 00:43:17.200 00:43:18.270 Chang Ho: patience.

262 00:43:18.844 00:43:21.329 Chang Ho: who who are currently eligible for

263 00:43:21.400 00:43:23.240 Chang Ho: and could benefit from.

264 00:43:23.620 00:43:38.189 Chang Ho: latest policy or guideline based treatments that aren’t receiving them for one reason or another. At this juncture we can identify them for you that we pass through a human being at your end. If your clinicians are happy to do that.

265 00:43:38.240 00:43:39.380 Chang Ho: and then that’s

266 00:43:40.120 00:43:47.170 Chang Ho: that’s where the Roi would be. For for example, you know, a model like that is very easily roiable as opposed to.

267 00:43:47.450 00:43:57.130 Chang Ho: For example, for example, dictation software is harder to give you an roi on. Why? Because it’s sort of like. Well, we’ll save you a few more hours, you might see one more patient.

268 00:43:57.950 00:44:06.789 Uttam Kumaran: It would. Also, I think, is gonna do that, and they’ll sell it like I. Those aren’t those aren’t things that that’s gonna get taken up by like just basic software.

269 00:44:07.310 00:44:13.820 Uttam Kumaran: It’s gonna be more of the stuff that’s bespoke. And you’re right, like, I want to be on the revenue generating side of the business.

270 00:44:14.329 00:44:27.450 Uttam Kumaran: You can be time saving somewhat, but I always want to go back to like, how many dollars are we bringing back? Because we’re not in in AI, we’re not. Gonna I don’t want to charge hourly. We’re gonna charge based on the value to the client.

271 00:44:27.520 00:44:54.289 Uttam Kumaran: If that’s a hundred k, or if that’s 20 k, like, that’s what I want to find out it back into cause. There there are no there’s no set standards, which means we have a good opportunity. If we can show that we’re gonna move this needle then and we do it. Then there’s a lot. There’s a lot there. Okay, so let me take let me take a couple of these back. I’ll add everybody here. To slack. So we’re all kind of chatting. And then,

272 00:44:54.880 00:44:55.974 Uttam Kumaran: yeah, I think,

273 00:44:57.680 00:45:06.100 Uttam Kumaran: I think we have a couple of action. So I think the biggest thing is we’re gonna form the leads list and then form at least one or 2 of like the sample emails that we’ll send out

274 00:45:06.436 00:45:08.399 Uttam Kumaran: and then the 3rd thing is, if I could get

275 00:45:08.680 00:45:11.190 Uttam Kumaran: both of you guys help on some content stuff.

276 00:45:11.370 00:45:12.939 Uttam Kumaran: and then we’ll just like.

277 00:45:13.620 00:45:20.369 Uttam Kumaran: kind of see what happens as we get as we get calls in. We’ll just add everybody here to them. And like, we just pitch basically.

278 00:45:20.730 00:45:21.450 Chang Ho: Yes.

279 00:45:21.610 00:45:22.250 Uttam Kumaran: No.

280 00:45:22.510 00:45:23.530 Uttam Kumaran: and then we’ll

281 00:45:23.880 00:45:30.626 Uttam Kumaran: I mean, the one thing is like, I know, this is like a this is a money making opportunity. So I don’t wanna

282 00:45:31.080 00:45:37.259 Uttam Kumaran: brush over that like I know, we don’t have like contracts and stuff in place. But that was also my goal is like, I,

283 00:45:37.720 00:45:50.210 Uttam Kumaran: part of this is like, I just wanna make sure that we have something that’s like about to close, or that we’re we’re warm on. And then I think we should definitely have that discussion on how we want to arrange everything. But I will say, like.

284 00:45:50.210 00:45:50.810 Chang Ho: Yeah.

285 00:45:51.340 00:46:04.450 Uttam Kumaran: You have my like a good faith. Agreement on like we’ll work something out for sure. There! What? The biggest. And then we’ve changed. We’ve been talking for a long time, so it’s like, Well, I just want to get one win, you know, kind of going, and then.

286 00:46:04.450 00:46:06.400 Chang Ho: Get a project going. Yeah, I agree.

287 00:46:06.570 00:46:07.580 Chang Ho: I agree.

288 00:46:07.848 00:46:11.729 Chang Ho: I also think that you probably already have a system in place, don’t you? For

289 00:46:11.740 00:46:13.820 Chang Ho: how you cut the pie when

290 00:46:13.920 00:46:32.549 Chang Ho: you know client finally pays you. Frankly, I’m not that interested in in talking about this yet until we actually have a potential viable client. Let’s just let’s just get the ball rolling first.st See what what actually sticks, and then we can and what they’re willing to offer, and then we can end up chatting more about the dollar then. But I’m not going to jump the gun

291 00:46:32.720 00:46:34.030 Chang Ho: on that.

292 00:46:34.030 00:46:37.820 Uttam Kumaran: I’m the same way, yeah. And we could do whatever we need. So okay.

293 00:46:37.820 00:46:38.480 Vincent Jeanselme: It’s good.

294 00:46:38.950 00:46:39.430 Uttam Kumaran: Okay.

295 00:46:39.430 00:46:45.950 Chang Ho: As you know, clinicians are notoriously expensive in the Us. Thankfully. As a British doctor, I’m significantly less expensive. But yeah, doesn’t.

296 00:46:46.451 00:47:00.988 Uttam Kumaran: You work for a Us. Company now. So that’s it. We’re charging us prices. We’re not charging European prices. And then so that’s where I am. And yeah.

297 00:47:01.770 00:47:04.369 Vincent Jeanselme: I don’t interview in 2 weeks in Austin.

298 00:47:04.740 00:47:05.940 Uttam Kumaran: Okay. Hell, yeah.

299 00:47:06.200 00:47:16.150 Uttam Kumaran: If you end up coming down here, I’m happy to show you, or if you need a place to stay, I may be out for Thanksgiving, but if you need a place to stay, please let me know. So whatever you need.

300 00:47:16.400 00:47:17.270 Vincent Jeanselme: Thank you.

301 00:47:17.270 00:47:19.086 Chang Ho: Also, I’m really sorry about

302 00:47:19.780 00:47:21.749 Chang Ho: yeah, about your president.

303 00:47:23.265 00:47:25.694 Uttam Kumaran: We’ll see how it goes.

304 00:47:26.180 00:47:27.040 Chang Ho: I mean.

305 00:47:27.260 00:47:32.400 Chang Ho: yeah, I mean, I know you have. I know the American people have elected a monster, but really

306 00:47:32.680 00:47:45.280 Chang Ho: they’re just electing for an anti-establishment face, aren’t they? It’s not because they necessarily like trump, but more because there’s just no one standing up for what they believe is a perversion of

307 00:47:45.560 00:47:46.270 Chang Ho: their own.

308 00:47:46.270 00:48:02.489 Uttam Kumaran: 100. That is like the majority opinion right now, basically. And you can see that in the volume, like a lot of people who didn’t vote ever voted, and they voted Republican. But also, it’s like

309 00:48:02.720 00:48:08.930 Uttam Kumaran: you. I don’t know. It’s weird, like history happens every day, and it’s really crazy to be in the middle of

310 00:48:09.440 00:48:12.009 Uttam Kumaran: history is that’s happening. So it’s.

311 00:48:12.560 00:48:13.549 Chang Ho: Yeah, it is

312 00:48:13.670 00:48:17.740 Chang Ho: the number of single issue issue voters is absolutely striking as well.

313 00:48:18.130 00:48:19.540 Uttam Kumaran: Yeah, and.

314 00:48:19.540 00:48:21.290 Chang Ho: I agree. It’s like I’m voting with trump.

315 00:48:21.290 00:48:24.170 Uttam Kumaran: Don’t even affect. And there’s issues that don’t even affect

316 00:48:24.250 00:48:28.170 Uttam Kumaran: certain people that they vote on. But it’s it’s their vote, you know. So

317 00:48:28.440 00:48:29.180 Uttam Kumaran: yeah.

318 00:48:29.180 00:48:29.770 Chang Ho: Yeah.

319 00:48:30.280 00:48:31.400 Uttam Kumaran: Okay, guys.

320 00:48:31.400 00:48:37.080 Chang Ho: Saddest I have to say of all is who are immigrants.

321 00:48:37.260 00:48:43.680 Chang Ho: 1st generation in the Us. It really earns, you know, their Us. Citizenships.

322 00:48:44.040 00:48:45.390 Chang Ho: and then they’re claiming

323 00:48:45.720 00:48:50.539 Chang Ho: that it’s this immigrant invasion that is impacting their lives.

324 00:48:51.200 00:48:54.369 Uttam Kumaran: Yeah, but it’s also they. They shut down high, skilled labor.

325 00:48:54.550 00:48:57.870 Uttam Kumaran: but then they let they let in a lot of illegal immigrants.

326 00:48:57.980 00:48:59.100 Uttam Kumaran: Yeah, I mean.

327 00:48:59.200 00:49:02.809 Uttam Kumaran: why? Why did we cut the h 1 b program so heavy.

328 00:49:02.840 00:49:04.080 Uttam Kumaran: But then, like.

329 00:49:04.530 00:49:12.339 Uttam Kumaran: it’s just no, if there’s just no answers, because there’s no there’s no plan, because government is the worst run company in the world.

330 00:49:12.450 00:49:17.377 Uttam Kumaran: So and then they just want to get more money to do more.

331 00:49:17.730 00:49:19.379 Chang Ho: Do well with it. Yeah.

332 00:49:19.510 00:49:20.880 Chang Ho: yeah. Yeah.

333 00:49:21.730 00:49:28.780 Chang Ho: Anyway. Let’s yeah. Let’s let’s put something else in the let’s schedule schedule a follow up with the meantime.

334 00:49:28.780 00:49:33.729 Uttam Kumaran: I’ll grab. I’ll grab something next week, and then we’ll work towards it. Give me a little bit of time.

335 00:49:35.390 00:49:37.159 Uttam Kumaran: but I want, but I think this.

336 00:49:37.160 00:49:42.220 Chang Ho: But by next week let’s have a let’s have a draft email that we want us to send rounds by marketing.

337 00:49:42.220 00:49:45.510 Uttam Kumaran: Oh, yeah, we’ll have. We’ll have that by like Monday or Tuesday, for sure.

338 00:49:45.510 00:49:48.520 Chang Ho: And a list of places we want to send this off to. I think

339 00:49:48.650 00:49:50.985 Chang Ho: they should ideally be.

340 00:49:52.860 00:49:56.110 Chang Ho: yeah, like, university affiliated

341 00:49:56.569 00:50:10.840 Chang Ho: medical institutions, maybe just go with go below the top. 25 top 30 you. You know us medical centers or us medical universities and just work your way down that list, so I’m sure many of them will have

342 00:50:10.880 00:50:13.509 Chang Ho: multitudes of associatives.

343 00:50:13.970 00:50:16.359 Chang Ho: medical clinics, and hospitals.

344 00:50:16.770 00:50:17.520 Uttam Kumaran: Okay. Okay.

345 00:50:17.520 00:50:20.300 Chang Ho: Where they’ll have less data, analytical capacity.

346 00:50:20.670 00:50:21.390 Uttam Kumaran: Yeah.

347 00:50:21.390 00:50:23.809 Chang Ho: But might be interested in incorporating. AI, yeah.

348 00:50:23.810 00:50:29.710 Uttam Kumaran: Even look if, like certain titles don’t exist in the in the company, like, if they don’t have a certain title.

349 00:50:29.890 00:50:32.670 Uttam Kumaran: then it’s good. But okay.

350 00:50:32.910 00:50:45.910 Chang Ho: Exactly you, I mean, I’m sure Valence knows this already, Miguel. You might know this, too, you know. A chief data information officer, or something like that, or chief information officer simply doesn’t exist in the majority of

351 00:50:46.680 00:50:49.720 Chang Ho: hospital and healthcare providers. They just don’t exist.

352 00:50:50.010 00:50:50.740 Uttam Kumaran: Yeah.

353 00:50:50.740 00:50:52.439 Chang Ho: They’re rarefied beasts.

354 00:50:52.490 00:51:07.680 Chang Ho: and you, you know, you’ll encounter them like a you know, like a double horned black rhino in a safari in Boston, of course, in Stanford, of course, and they think it’s normal because it’s become normative for them. But it’s just simply not the case

355 00:51:09.470 00:51:11.089 Chang Ho: across the Us.

356 00:51:11.150 00:51:12.750 Chang Ho: - you’ll find it.

357 00:51:13.070 00:51:14.350 Uttam Kumaran: That’s where I want.

358 00:51:14.350 00:51:18.670 Chang Ho: Chief officer. Yeah, sure. Chief Safety Officer. Yeah. Chief information officer.

359 00:51:18.820 00:51:19.810 Chang Ho: Rare

360 00:51:20.100 00:51:23.360 Chang Ho: chief like digital information officer. Oh, my God, I mean.

361 00:51:23.410 00:51:24.830 Chang Ho: you know, you’re getting niche.

362 00:51:25.340 00:51:25.830 Chang Ho: Yeah.

363 00:51:25.830 00:51:26.530 Uttam Kumaran: Yeah.

364 00:51:28.750 00:51:30.849 Uttam Kumaran: okay, guys, we’ll chat in slack.

365 00:51:32.010 00:51:33.609 Uttam Kumaran: Really nice to meet you as well

366 00:51:33.680 00:51:35.580 Uttam Kumaran: talk to you soon. Yeah.

367 00:51:36.436 00:51:38.149 Chang Ho: Guys. Bye-bye.