Meeting Title: Brainforge x BrainCheck AI Infrastructure Discussion
Date: 2025-08-12
Meeting participants: Uttam Kumaran, Samuel Roberts, Bassel Samman, Ken


WEBVTT

1 00:00:23.170 00:00:24.240 Uttam Kumaran: Hey!

2 00:00:24.490 00:00:25.509 Uttam Kumaran: How’s it going?

3 00:00:25.850 00:00:28.309 Samuel Roberts: Alright, it’s a little warm here today.

4 00:00:28.660 00:00:29.520 Uttam Kumaran: Okay.

5 00:00:29.520 00:00:31.699 Samuel Roberts: My office is in the attic, and it gets…

6 00:00:31.900 00:00:36.380 Samuel Roberts: Pretty hot when the sun’s beating down, so the AC is just going, and it can’t keep up, but….

7 00:00:36.570 00:00:40.710 Uttam Kumaran: We did, we did, like, setup tours one week.

8 00:00:41.050 00:00:41.660 Samuel Roberts: Oh, yeah.

9 00:00:41.660 00:00:46.550 Uttam Kumaran: We want to do that again. I’m sure you have, like… do you have a mechanical keyboard?

10 00:00:46.580 00:00:50.579 Samuel Roberts: Of course, I have a mechanical keyboard, am I not predictable? Yeah, there you go.

11 00:00:50.580 00:00:52.269 Uttam Kumaran: That’s mine.

12 00:00:52.270 00:00:57.430 Samuel Roberts: I got a few of them, this is my current one. Oh, nice, yeah, that I don’t have, I should.

13 00:00:57.770 00:00:59.160 Uttam Kumaran: Razer mouse….

14 00:00:59.160 00:01:00.350 Samuel Roberts: Oh, nice, yeah.

15 00:01:01.000 00:01:02.879 Uttam Kumaran: And another thing was around.

16 00:01:02.880 00:01:04.120 Samuel Roberts: I just said, huh.

17 00:01:04.129 00:01:06.079 Uttam Kumaran: My desk is a mess, though, like….

18 00:01:06.080 00:01:06.690 Samuel Roberts: Yeah.

19 00:01:06.880 00:01:10.739 Uttam Kumaran: Just gets messy, so… Hey, how are you?

20 00:01:11.460 00:01:12.989 Bassel Samman: I’m doing alright, how are you?

21 00:01:12.990 00:01:15.679 Uttam Kumaran: Good, good to see you. How do you pronounce your name, by the way?

22 00:01:15.950 00:01:16.800 Bassel Samman: Basile.

23 00:01:16.800 00:01:18.730 Uttam Kumaran: Bassel, I’m Uttam. It’s nice to meet you.

24 00:01:18.730 00:01:20.559 Bassel Samman: Uttam, nice to meet you as well.

25 00:01:21.180 00:01:21.620 Samuel Roberts: Sounds.

26 00:01:21.620 00:01:23.119 Bassel Samman: Thanks, Samuel, how’s it going?

27 00:01:23.120 00:01:23.650 Samuel Roberts: Shoot.

28 00:01:23.900 00:01:24.970 Bassel Samman: Here’s Ken.

29 00:01:25.260 00:01:26.610 Uttam Kumaran: Yeah, it’s good.

30 00:01:26.610 00:01:27.390 Ken: Hello?

31 00:01:28.140 00:01:28.640 Samuel Roberts: Boom.

32 00:01:28.640 00:01:29.480 Uttam Kumaran: Hey!

33 00:01:30.370 00:01:33.409 Bassel Samman: Do a quick round of intros, and then we can dig in?

34 00:01:33.410 00:01:38.119 Uttam Kumaran: Yeah, sure, sure, sure. So, again, my name’s Uttam, I’m CEO of Brainforge,

35 00:01:38.420 00:01:41.130 Uttam Kumaran: We’re a data analytics and AI consultancy.

36 00:01:41.440 00:01:55.409 Uttam Kumaran: I’m here in Austin, but our team is sort of everywhere, across the U.S. and global. Sam is on our AI team. Maybe, Sam, I’ll have you give a brief introduction, and then, yeah, excited to sort of jump in and see where we can be helpful.

37 00:01:56.110 00:02:04.940 Samuel Roberts: Yeah, so good to mention, my name’s Sam, I’m on the AI team. I’ve come on recently, so I’m sort of getting, to know,

38 00:02:05.170 00:02:10.290 Samuel Roberts: our client work and stuff, but I’m excited to learn more about what we can do for you guys.

39 00:02:11.860 00:02:26.679 Bassel Samman: Alright, I’m Bassel, Bassel Samman, I’m VP of Engineering, been with BrainCheck for… since the early days. We’re now digging into AI. We did a bunch of stuff on the engineering side, we’ll dig in later.

40 00:02:26.730 00:02:32.030 Bassel Samman: And then now we’re starting to go into the production side. I’ll let Ken introduce himself.

41 00:02:33.430 00:02:39.909 Ken: I’m Ken, staff engineer, engineering manager, basically everything engineering that’s not executive.

42 00:02:40.060 00:02:43.140 Ken: and a baby to AI. So, I…

43 00:02:43.460 00:02:47.099 Ken: big on the application side, and not so much on the AI, but hope to learn.

44 00:02:48.150 00:02:48.940 Uttam Kumaran: Awesome.

45 00:02:48.940 00:02:52.620 Bassel Samman: He says that, but he’s being modest.

46 00:02:53.940 00:02:56.140 Bassel Samman: He’s definitely showed me a lot of tricks.

47 00:02:57.120 00:02:58.070 Bassel Samman: Yeah.

48 00:02:58.070 00:02:58.400 Ken: I learned.

49 00:02:58.400 00:03:03.329 Uttam Kumaran: So, I know, I know we went back and forth on email, but tell me about where you want to start, …

50 00:03:03.530 00:03:05.030 Uttam Kumaran: Happy to talk through anything.

51 00:03:05.580 00:03:16.069 Bassel Samman: So, I think, like, it’s been… there’s been a lot of noise, obviously. Everybody wants AI something, AI something, and a lot of times they don’t know what they’re asking for.

52 00:03:16.180 00:03:17.729 Bassel Samman: And they’re just like, you know…

53 00:03:17.940 00:03:24.900 Bassel Samman: Let’s, let’s, you know, create a conversational AI, and I’m like, great. They’re like, we need to bring in a company.

54 00:03:25.020 00:03:32.520 Bassel Samman: So they can figure out the voice stuff. I’m like, the voice stuff is not the hard part. It’s everything else.

55 00:03:32.520 00:03:37.370 Uttam Kumaran: It’s whatever gets put into the voice API, it’s not the voice itself.

56 00:03:37.370 00:03:37.840 Samuel Roberts: And….

57 00:03:37.970 00:03:44.310 Bassel Samman: Yeah, it’s all the things that the voice needs to say that usually ends up slowing it down.

58 00:03:44.670 00:03:59.059 Bassel Samman: So yeah, I mean, we’ve been kind of experimenting, and I’ve been doing a lot of experimentation, so usually kind of the way we do things at BrainCheck is we’ll do experiments, and then later on we’ll expand, so I’ve been experimenting a lot with N8N,

59 00:03:59.240 00:04:15.590 Bassel Samman: I started out with experimenting with different, you know, IDEs. We ended up, you know, between Cursor, Windsurf, did a bunch of projects. We ended up in Windsurf before Windsurf started losing ground. It was winning when we chose it.

60 00:04:15.720 00:04:32.680 Bassel Samman: And, you know, I passed it on to the engineering team. The engineering team, took that and kind of started doing some really cool stuff with MCP servers and really connecting the IDEs and the IDE with the MCP servers, and started to do some cool stuff there.

61 00:04:32.710 00:04:38.379 Bassel Samman: And now I kind of moved on to N8N workflow management.

62 00:04:38.600 00:04:47.620 Bassel Samman: I’ve… I’ve dug pretty deep in N8N, like, I don’t think I’ve used every node, but I’ve used pretty close to every single…

63 00:04:47.740 00:04:58.319 Bassel Samman: one at this point, just playing around, and now moving into, okay, where do we take it next? So we are in the healthcare industry, which is the difficult part.

64 00:04:58.580 00:05:15.699 Bassel Samman: So we were… you were… you were suggesting Bedrock, for example, and that is our, like, right now, lead contender. Not as an N8N replacement, it would be an N8N partner, so they have a Bedrock node, and so we would use a Bedrock agent.

65 00:05:16.240 00:05:23.730 Bassel Samman: Instead, and that that way we have the safety of running the models in our own environments.

66 00:05:24.460 00:05:29.490 Bassel Samman: And, and the, the, you know, flexibility of using N8N.

67 00:05:30.170 00:05:42.000 Bassel Samman: workflows. One of the big things that we need to consider with N8N is, definitely scaling. I understand how it scales, it makes sense, that’s how we did it 10 years ago.

68 00:05:42.050 00:05:55.400 Bassel Samman: Right? Which is, you put queues, and you just grab from the queue, and you execute. Like, it’s not, not a big, not a huge complication there. There are issues with that model, but it’s not, you know…

69 00:05:55.530 00:06:10.300 Bassel Samman: I think doing, like, event-based, is probably better, but doing queue-based is not… is not horrible. It’s… it’s worked in the past, just to get… you gotta be careful when things dump off, and how you’re clearing the queues, and all that stuff, like…

70 00:06:10.480 00:06:19.579 Bassel Samman: you need a management layer with queue systems. So I wanted to see, like, you know, it sounded like you guys have gone through, and I usually put…
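A rough sketch of the queue-based worker pattern being described here (workers pulling jobs off a queue, plus a minimal management layer for retries and stuck jobs); the class, retry policy, and dead-letter handling are illustrative assumptions, not anything specified on the call:

```python
import queue
import threading

class QueueManager:
    """Minimal 'management layer': retries failed jobs, parks poison jobs."""

    def __init__(self, max_retries=3):
        self.jobs = queue.Queue()
        self.dead_letter = []          # jobs that kept failing
        self.max_retries = max_retries

    def submit(self, payload):
        self.jobs.put({"payload": payload, "attempts": 0})

    def worker(self, handler):
        while True:
            job = self.jobs.get()
            if job is None:                        # sentinel: shut down
                self.jobs.task_done()
                return
            try:
                handler(job["payload"])
            except Exception:
                job["attempts"] += 1
                if job["attempts"] < self.max_retries:
                    self.jobs.put(job)             # retry later
                else:
                    self.dead_letter.append(job)   # stop re-queueing
            self.jobs.task_done()

# One worker draining the queue; a real deployment would run many.
results = []
mgr = QueueManager()
t = threading.Thread(target=mgr.worker, args=(results.append,))
t.start()
for i in range(3):
    mgr.submit(i)
mgr.jobs.join()        # wait until everything submitted so far is processed
mgr.jobs.put(None)     # stop the worker
t.join()
```

The caveats raised here ("when things dump off, how you're clearing the queues") map to the dead-letter list: without it, a failing job would be re-queued forever.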

71 00:06:19.920 00:06:28.270 Bassel Samman: I use, like… I judge companies, usually, and individuals, on a four-level system for AI, so level one is

72 00:06:28.730 00:06:30.889 Bassel Samman: Everybody using ChatGPT?

73 00:06:31.000 00:06:48.940 Bassel Samman: So, you know how to… you know how to write a prompt, you probably get pretty advanced, but you don’t know what an assistant is, you don’t know what an agent is. Then level 2 is, like, you know what an assistant is, you’ve built your own agents, you’ve gotten a little bit more flexible, you have an idea of how to train an agent, you’re not really sure of how, you know…

74 00:06:49.050 00:06:59.029 Bassel Samman: And then level 3 would be when you start going into workflows, and how do you minimize when to use AI, because if you use AI everywhere, you end up just messing up everything.

75 00:06:59.440 00:07:16.100 Bassel Samman: Usually in Level 2, you learn how to get to level 3. Like, you start having problems, you start to solve them, you go into workflows, you minimize the amount of data going into the agent, you start learning how to build automated prompts so you’re not building one gigantic prompt, like, all those things are…

76 00:07:16.560 00:07:25.540 Bassel Samman: And N8N does a great job with that, by the way, Ken. Like, you can pass automated prompts pretty easily with N8N. It’s pretty cool.
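The "automated prompts" idea from the last few exchanges — assembling a prompt from small reusable fragments per workflow step instead of one gigantic prompt — can be sketched in plain Python; the fragment wording and parameter names are made up for illustration:

```python
def build_prompt(task: str, context: str, output_format: str) -> str:
    """Compose a prompt from small reusable fragments instead of one monolith."""
    parts = [
        "You are a careful assistant for internal workflows.",   # shared system fragment
        f"Task: {task}",
        f"Relevant context (pre-filtered upstream):\n{context}",
        f"Respond strictly as: {output_format}",
    ]
    return "\n\n".join(parts)

prompt = build_prompt(
    task="Summarize the call",
    context="transcript excerpt goes here",
    output_format="JSON",
)
```

Each fragment can then be generated or swapped by an upstream node, which is what keeps the amount of data going into the agent small.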

77 00:07:25.540 00:07:26.080 Uttam Kumaran: a lot of times.

78 00:07:26.080 00:07:26.440 Bassel Samman: Imagine.

79 00:07:26.440 00:07:27.350 Uttam Kumaran: No, yeah.

80 00:07:27.350 00:07:35.060 Bassel Samman: Right, right. Which is what I like about it. There’s a lot of… there’s a lot of really… you learn… it forces you to learn.

81 00:07:35.320 00:07:54.469 Bassel Samman: And then, you know, level 4 is Google. Like, Tesla, or X, or those companies that are building their own models, and really training them, like, from scratch, and training them. I don’t think we’ll ever want to be… well, not for another couple of years. We, you know, never say never.

82 00:07:54.720 00:08:01.580 Bassel Samman: start building our own models. We are doing that, actually, just not… not at a production level scale.

83 00:08:01.930 00:08:04.860 Bassel Samman: On, like, for research purposes only.

84 00:08:04.860 00:08:05.460 Uttam Kumaran: Sure.

85 00:08:05.760 00:08:19.660 Bassel Samman: So I kind of wanted, like… where I would say we are is very much level 2, learning to get to level 3. The only reason I did the whole level system is just to make it easier to say where we are.

86 00:08:19.660 00:08:20.010 Uttam Kumaran: Fair.

87 00:08:20.010 00:08:26.689 Bassel Samman: And so Level 3 is where we’re trying to break into. We’ve built agents, we’ve built, like.

88 00:08:27.010 00:08:33.619 Bassel Samman: simple, calls with AI and having it, you know, build scripts and all that stuff.

89 00:08:33.830 00:08:36.069 Bassel Samman: And now we’re trying to figure out, okay.

90 00:08:36.220 00:08:38.739 Bassel Samman: Clearly, we’re gonna produce crappy stuff.

91 00:08:38.840 00:08:49.260 Bassel Samman: if we have one agent that does everything, that’s not what, you know, so we’re… we’re getting it up to the next level of how do we protect the data? How do we put guardrails? How do we…

92 00:08:49.260 00:08:59.780 Bassel Samman: Minimize when to talk to agents, build smarter workflows instead of dumber… or smarter agents. There’s no such thing as a smarter agent. You just get dumber or slower. You have two choices.

93 00:08:59.790 00:09:01.719 Uttam Kumaran: Yeah. So….

94 00:09:02.210 00:09:14.260 Bassel Samman: So yeah, we’re trying to forge that path forward, and I was talking to Michael about that, and it was funny, because we were drifting on the same note, and he’s like, you gotta, you know…

95 00:09:14.400 00:09:18.790 Bassel Samman: You gotta talk to these guys, and, and, and, and, you know.

96 00:09:19.730 00:09:30.169 Bassel Samman: check, you know, cross notes, and see what… well, how’s it going? Are you thinking about it correctly? What are your options? Because he’s… I reached, like, the limit of what… what he feels comfortable.

97 00:09:30.170 00:09:30.959 Uttam Kumaran: We’re talking about.

98 00:09:30.960 00:09:31.400 Bassel Samman: about….

99 00:09:31.470 00:09:32.490 Uttam Kumaran: And so….

100 00:09:32.490 00:09:34.200 Bassel Samman: They passed me on to you guys, which is.

101 00:09:34.200 00:09:51.769 Uttam Kumaran: No, and it’s a sort of, like, you start to build abstractions over what you know, but you’re right, I mean, like, I think about our company, in fact, I think, Sam, I probably explained the same thing to you, where I can… I think of it like, you know when they talk about autonomous cars, they have these, like, level one, level 2 autonomy? That’s how I speak about our

102 00:09:51.770 00:10:02.189 Uttam Kumaran: even our business, right? And it’s like, Level 1 autonomy, or level zero is whatever, baseline. Level 1 is, like, everybody’s just using ChatGPT, but they’re kind of just, like, copying and pasting stuff.

103 00:10:02.370 00:10:14.490 Uttam Kumaran: Level 2 is, like, okay, maybe there are just smarter workflows that, okay, maybe we’ve developed great system prompts in Slack or in a UI that you can now use so you don’t have to, like.

104 00:10:14.560 00:10:33.279 Uttam Kumaran: maybe write the prompt over and over again. Maybe it’s almost a replica of just projects, Claude projects or ChatGPT projects. The next thing is actually building the input, so being able to have systems that pull, you know, in our world, this is pulling from Slack, it’s pulling from GitHub, it’s pulling from Zoom meetings.

105 00:10:33.370 00:10:40.170 Uttam Kumaran: It’s pulling from the internet, several other data stores, structured and unstructured data.

106 00:10:40.170 00:10:53.789 Uttam Kumaran: made available to an agent dynamically when you’re answering a question. Then you start getting to multi-agent systems, right? Where it’s like, okay, the goal, you’re right in that you’ll find that the more

107 00:10:53.790 00:11:03.460 Uttam Kumaran: constrained your use case can be and the objective, the better your agent will get to some deterministic output, meaning it will be more accurate over time.

108 00:11:03.460 00:11:11.469 Uttam Kumaran: But even in that is, like, you need to define accuracy, you need to start being able to run evals, you need to start having golden data sets.

109 00:11:11.470 00:11:13.989 Uttam Kumaran: Start getting… having a feedback loop.

110 00:11:13.990 00:11:31.829 Uttam Kumaran: not only, like, just thumbs up, thumbs down, but also something more, qualitative, and then starting to measure, okay, what are… what are all of our agents actually responding with? How are scores going up? Can we triage the low scores? And then, ultimately, you get to systems that I would say that the kind of the peak of that is

111 00:11:31.830 00:11:34.479 Uttam Kumaran: More like ambient systems, is what they’re calling it.

112 00:11:34.480 00:11:49.389 Uttam Kumaran: things that are working on the background, consuming logs, consuming events, and dynamically taking actions. An example of this in our business is we do a lot of client work. Commonly, in client work, things can get missed in Slack, for example, if we’re talking to a client.

113 00:11:49.440 00:11:58.199 Uttam Kumaran: So we… we’re trying to build agents that look through on a periodic basis all the messages that have been sent by a client, and says, hey, I think we may have missed something.

114 00:11:58.440 00:12:09.090 Uttam Kumaran: Right? That’s a… that’s almost an always-on thing that happens, right? But so that’s, I feel like, kind of getting towards the level 5, but again, we’re sort of talking about both levels of…

115 00:12:09.280 00:12:10.310 Uttam Kumaran: …

116 00:12:10.480 00:12:28.759 Uttam Kumaran: types of systems, but also starting to push the boundaries of, like, what’s… what’s possible. But it… one, it starts with just having, like, a great N8N setup. So for us, again, we have several AI engineers that are working in our platform, and the way we even got into this is, again, we were running a data analytics

117 00:12:28.820 00:12:45.899 Uttam Kumaran: agency before, but I was using AI to build a whole company, and so for 2 years, I’ve been using every tool in the market to try to get an edge, and we’re completely bootstrapped, so we’ve sort of, by constraint, we’ve needed to invest in AI, and through that process, we…

118 00:12:45.900 00:12:56.749 Uttam Kumaran: Tried a lot of things, messed up a lot of things, and then started now to go to market, because with some of the things that we’ve built. But not in, like, a product… not in, like, a product manner, but more in, like, hey, we’ve…

119 00:12:56.750 00:13:13.019 Uttam Kumaran: used N8N at the limit. We’ve used Slack at the limit on how those things interact, so you can sub in Teams or any sort of chat-based workspace. We’re also starting to do great stuff with email, but we’re also vibecoding our own simple UIs to take on tasks.

120 00:13:13.020 00:13:23.209 Uttam Kumaran: a chat-based interface is not the best interface for every task, right? And that’s what’s, you know, Sam on the team has a great background in front-end UX, and

121 00:13:23.210 00:13:31.340 Uttam Kumaran: The reason being is that, like, just chatting back and forth and something coming up in text is not… is not the best for everything, right? For example.

122 00:13:31.340 00:13:38.750 Uttam Kumaran: One of our common use cases is we need to go from a transcript with a client to tasks in our linear board, right, for project management.

123 00:13:38.750 00:13:53.999 Uttam Kumaran: Well, we developed a little UI that allows you to see all those as blocks. A project manager can quickly edit them, and so it’s more like a workflow-specific UI. So there’s even, like, one step further where maybe chat isn’t the best interface for the thing you accomplish.

124 00:13:54.070 00:14:07.019 Uttam Kumaran: So, part of this is, like, you know, the system prompt, and the prompt is actually, I would say, you could get there pretty fast, but do you have the right data available, whether it’s through a RAG pipeline, some retrieval system.

125 00:14:07.020 00:14:23.779 Uttam Kumaran: Right? So that becomes more of a data engineering task, which is something that’s all we do for, you know, that’s our business, primarily. And then it becomes an output, like, is it going to the right integration? Is a webhook getting triggered? Is another function getting triggered with a structured output? Or is there, like.

126 00:14:23.880 00:14:27.230 Uttam Kumaran: you know, workflow-specific UI that has to get built.

127 00:14:27.320 00:14:34.309 Uttam Kumaran: So that’s kind of, like, how we think about it. And for us, you know, the reason we chose N8N, as I mentioned in the email, one, it’s…

128 00:14:34.380 00:14:44.800 Uttam Kumaran: it’s, … you know, I’m not really a big fan of, like, low code, no code. I really did not want to have to admit that we were going to use N8N for stuff, but one, it’s very flexible.

129 00:14:44.800 00:15:07.989 Uttam Kumaran: So you can call external functions, you can run a lot of stuff in there. You know, they have a lot of rich integration set and a huge community, so it’s really good. Second is, like, we want to enable anybody in the company to build on N8N, whether the least technical person to the most technical person, and then you can graduate things out of there, right? So everything ultimately is function calls, and you can move it into

130 00:15:08.110 00:15:26.649 Uttam Kumaran: wherever you want to go. So, the graduation path is not something that ultimately we’d be locked into, because what is it to prompt? It’s some functions that wrap around it. We’re calling some database to pull something. So I was more comfortable with the decision to kind of invest there, and now most of our stuff for our clients runs there.

131 00:15:26.790 00:15:37.090 Uttam Kumaran: You know, and we’re… we’re in a probably… we’re pretty similar setups, but where our business and the kind of the things we’ve done on the edge is more multi-agent setups, human in the loop.

132 00:15:37.190 00:15:56.139 Uttam Kumaran: And where we try to really prioritize is evals. Like, making sure you have a great set of evaluation metrics to understand adoption and to understand scoring. Like, are you actually getting the proper outputs? Who’s using it on the team? If they’re not using it, why not? If they are using it, what can we learn from that person?
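The eval loop being prioritized here — a golden dataset, a score per response, an overall accuracy number, and a triage list of low scorers — could look something like this minimal sketch; the exact-match metric and all names are stand-in assumptions, since a real harness would use richer scoring:

```python
def run_evals(agent, golden_set):
    """Score an agent against a golden dataset; flag low scorers for triage."""
    results = []
    for case in golden_set:
        answer = agent(case["input"])
        score = 1.0 if answer == case["expected"] else 0.0   # stand-in metric
        results.append({"input": case["input"], "score": score})
    accuracy = sum(r["score"] for r in results) / len(results)
    triage = [r for r in results if r["score"] < 0.5]        # low scores to review
    return accuracy, triage

golden = [
    {"input": "2+2", "expected": "4"},
    {"input": "capital of France", "expected": "Paris"},
]
toy_agent = lambda q: "4" if q == "2+2" else "unknown"
acc, to_review = run_evals(toy_agent, golden)
```

The triage list is what closes the feedback loop mentioned earlier: low-scoring cases get reviewed and fed back into prompts or the golden set.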

133 00:15:56.230 00:16:01.350 Uttam Kumaran: Things like that. We work… we work with a wide variety of clients, and people who are at this level zero.

134 00:16:01.530 00:16:10.060 Uttam Kumaran: Right? And we also work with people kind of at your level, where they want to bring us in, sort of like, what are we missing? Or, like, how do we get this to the next level?

135 00:16:10.340 00:16:15.229 Uttam Kumaran: Where you have great engineers, but you almost need someone just solely dedicated to that, so…

136 00:16:15.470 00:16:19.379 Uttam Kumaran: Just talk for a while, but that’s, like, sort of all the stuff we’re thinking about, you know?

137 00:16:20.160 00:16:30.090 Bassel Samman: Yeah, I mean, I could definitely think of scenarios where, so we… our business model, we can… we kind of have 3 audiences.

138 00:16:30.150 00:16:38.640 Bassel Samman: We definitely have the doctors, you have the patients, and for elderly folks, you have the caregivers. And, each of those…

139 00:16:38.690 00:16:50.400 Bassel Samman: You have to speak a different language. The doctor doesn’t… you don’t have to be so guarded. You don’t have to, you know, be careful not to say things like end-of-life checklist or anything like that. They would….

140 00:16:50.400 00:16:50.860 Uttam Kumaran: Hell yeah.

141 00:16:50.860 00:17:05.830 Bassel Samman: So, with patients you have. You do… you don’t want to, you know, if they hear end-of-life checklist, they would probably have a heart attack. So, you don’t want to tell them the real name, you want to be able to be, you know, cognizant, and same with caregivers, you want to give them

142 00:17:05.980 00:17:16.310 Bassel Samman: more information, like, watch out they don’t fall, watch out that they’re taking their prescriptions that you wouldn’t want to say directly to the patient. So… so, yeah, there’s a lot of…

143 00:17:16.410 00:17:18.150 Bassel Samman: …

144 00:17:18.319 00:17:26.499 Bassel Samman: It’s… it’s the same data, but 3 audiences, which is interesting, because we gotta build… it’s… it’s more about the guardrails.

145 00:17:26.619 00:17:33.160 Bassel Samman: When to use, when not to use, and also, like, never prescribe something as a doctor, you know, those kind of things that…

146 00:17:33.280 00:17:36.970 Bassel Samman: There’s rule sets, guardrails, and audiences.
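The "same data, three audiences" guardrail idea — rule sets per audience plus global rules like never prescribing — might be sketched like this; the substitution table, the refusal message, and the prescription pattern are all illustrative assumptions, not BrainCheck's actual rules:

```python
import re

# Per-audience rewrites: same underlying data, softer phrasing for patients.
AUDIENCE_SUBSTITUTIONS = {
    "doctor":    {},   # clinicians see clinical terms as-is
    "patient":   {"end-of-life checklist": "care planning guide"},
    "caregiver": {},
}
# Global rule: never let output read like a prescription, for any audience.
PRESCRIPTION_PATTERN = re.compile(r"\b(take|prescribe)\b.*\bmg\b", re.IGNORECASE)

def apply_guardrails(text: str, audience: str) -> str:
    """Rewrite audience-sensitive phrasing; refuse prescription-like output."""
    if PRESCRIPTION_PATTERN.search(text):
        return "I can't advise on medication; please ask your doctor."
    for phrase, softer in AUDIENCE_SUBSTITUTIONS[audience].items():
        text = text.replace(phrase, softer)
    return text
```

In practice this layer would sit between the model and each audience-facing channel, so one workflow serves all three audiences with different output rules.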

147 00:17:37.320 00:17:38.590 Bassel Samman: …

148 00:17:39.450 00:17:51.449 Bassel Samman: So, yeah, I mean, I’d love to… that would be very helpful if we could… if you guys are on that… I’ll have to talk to the team also to see about the… we have a data team, but they’re really, you know.

149 00:17:51.820 00:17:58.680 Bassel Samman: they’re not… I wouldn’t say they’re AI-specific, they’re really, you know, they’re the research arm of BrainCheck.

150 00:17:58.800 00:18:05.720 Bassel Samman: And so they do a lot of, data analysis and digging into data and stuff like that, so…

151 00:18:06.010 00:18:12.700 Bassel Samman: Yeah, at some point, it’d be great, once we are getting into… once we get past the infrastructure side, like, right now.

152 00:18:12.950 00:18:19.470 Bassel Samman: What I want to build is the infrastructure. I want it to be really easy for somebody to come to me and say.

153 00:18:19.590 00:18:23.469 Bassel Samman: you know, I want a conversational AI model.

154 00:18:23.630 00:18:33.199 Bassel Samman: And then I could say, okay, great, there’s… these are the 3-4 components that you need. Here, we spin, you know, here’s an NAMM workflow we’ve set up for you.

155 00:18:33.200 00:18:44.619 Bassel Samman: here’s your, you know, put something together for conversational, like, you know, UI, like, where you can just make a phone call or something like that, and then hook it up, and then they’re ready to go.

156 00:18:44.800 00:18:47.820 Bassel Samman: It’s… it’s honestly, like, not hard work.

157 00:18:48.200 00:18:54.370 Bassel Samman: But I don’t want to do it myself every time. I want to build the infrastructure so it’s easier to spin up, like, a simple app.

158 00:18:54.370 00:19:09.190 Uttam Kumaran: even just templates, right? Like, templates on… so that, again, within N8N, you can load in your API keys, and so they can use these common nodes that are just particular to… to your instance. But then, again, yeah, having very simple things like

159 00:19:09.280 00:19:21.970 Uttam Kumaran: you know, one of the things that we’re exploring, Sam, maybe you can talk about, like, Copilot kit, like, I think it could be… it’s… you could probably explain, like, what the problem was, and maybe why, like, kind of that, because that’s… that’s sort of going into, like, infrastructure on…

160 00:19:22.980 00:19:31.360 Uttam Kumaran: you know, whenever we develop a chat experience, what is that gonna look like? What are the components? Maybe you can talk a little bit about, like, what we… what we found there.

161 00:19:32.050 00:19:49.149 Samuel Roberts: Yeah, so, CopilotKit is a library, basically, for, you know, giving you these chat UIs, but it hooks into lots of other things. Like, it can call an N8N webhook, but it also plugs into some of the other frameworks better. But really, it’s about that kind of consistency.

162 00:19:49.850 00:19:52.470 Samuel Roberts: Across, for us, especially, because if we’re building different

163 00:19:52.580 00:19:54.510 Samuel Roberts: you know, chat UIs for different

164 00:19:54.590 00:19:59.809 Samuel Roberts: Apps internally, for client work, you know, that’s not something we need to necessarily

165 00:19:59.870 00:20:05.849 Samuel Roberts: build from the ground up ourself. It’s kind of a, you know, it’s theoretically a solved problem out there, but

166 00:20:05.890 00:20:22.269 Samuel Roberts: a lot of them are kind of… still want you to do all the styling and formatting and getting all the little nitty-gritty stuff right, all the edge cases. And so we were looking at that recently, because we have, you know, we spun up and vibe-coded our own little chat box for our own tools, and we’re trying to see what else is out there that we could

167 00:20:23.280 00:20:31.519 Samuel Roberts: could utilize. And, you know, what we were finding was that CopilotKit was really good at doing a lot of these things that were, you know, kind of…

168 00:20:31.930 00:20:34.280 Samuel Roberts: You know, standard stuff that…

169 00:20:34.410 00:20:44.539 Samuel Roberts: you expect, but it’s really easy for us to, or for anyone, really, to just kind of almost drop in, plug their N8N into it, and then get a really solid chat experience.

170 00:20:44.920 00:20:47.940 Uttam Kumaran: Yeah, so, like, things like suggested responses, summaries….

171 00:20:47.940 00:20:48.690 Samuel Roberts: Exactly, yeah.

172 00:20:48.690 00:20:57.620 Uttam Kumaran: facts, right? So everything you come to expect out of ChatGPT, one of the things you’ll find is that it’s hard to mimic that experience, because all you have is the API.

173 00:20:57.750 00:21:04.609 Uttam Kumaran: Right? But then I told Mikey, I was like, I don’t want to be building chat front ends, right? So let’s go find what the best

174 00:21:04.620 00:21:22.560 Uttam Kumaran: you know, generative AI chat front end is, and, like, and also the one with the longest roadmap, with most… and we found CopilotKit to be a great use case for that, and they have a lot of out-of-the-box, you know, features that we’re going to start using, and now that’s our default for chat experiences, and so for us, that’s a platform decision.

175 00:21:22.560 00:21:40.200 Uttam Kumaran: That’s… any of our internal chat experiences are going to be built on that. When we go to clients, that’s our recommendation. And so we look… we’re looking at things a very similar way, where I want our team focused on great prompt engineering, great RAG, like, how do we get the right data, how do we do pre-filtering?

176 00:21:40.200 00:21:53.800 Uttam Kumaran: summarization, and then the actual, like, integration. Is this… if it’s… if it’s going to a chat interface, that’s one, but if it has to get sent to Slack, or if it’s getting sent to an email, or another system, like, that’s where the brain should go.

177 00:21:53.800 00:22:01.219 Uttam Kumaran: not on, like, Claude versus this, or, you know, on setting up the… those are all platform things that I want set up, not only for.

178 00:22:01.220 00:22:01.750 Bassel Samman: Right.

179 00:22:01.750 00:22:07.809 Uttam Kumaran: anybody in our company, but when we go to clients, I want to sort of have… say, hey, we’ve done a… we’ve done the research on.

180 00:22:08.670 00:22:09.140 Bassel Samman: There you go.

181 00:22:09.140 00:22:11.510 Uttam Kumaran: Market, here’s what the leading edge is, you know.

182 00:22:12.000 00:22:16.190 Bassel Samman: Yeah. Yeah, that was, like, some of the things that I want to set up, like, …

183 00:22:16.920 00:22:22.099 Bassel Samman: we have a lot of data, for example, in Google Drive. We have a lot of data

184 00:22:22.100 00:22:42.829 Bassel Samman: And yeah, like, our source code, it gives you context on what our actual platform does, right? So I would want AI to be trained on that and understand it, you know, I want to add, like, there’s all these things, so if we want to do BI, I would want it to know our database, our, you know, have access to our data, our source code.

185 00:22:42.850 00:22:54.840 Bassel Samman: as well as our Google Drive, right? Because it’s really breaching across. So, building those datasets and training on them, I think, is a smart idea, and then… and so we should have that set up.

186 00:22:54.950 00:23:00.150 Bassel Samman: And so when we want a RAG model, we could literally say, tap into, you know.

187 00:23:00.420 00:23:03.169 Bassel Samman: Tap into the engineering

188 00:23:03.400 00:23:11.790 Bassel Samman: business data… data training set. Tap into the clinical data set. Tap into, and you’re ready to go.

189 00:23:11.790 00:23:15.810 Uttam Kumaran: Right? Instead of rebuilding these for every single audience, or

190 00:23:15.810 00:23:33.689 Uttam Kumaran: 100%, yeah. It’s very similar to how we’re thinking about it in our business, because we… we have several clients we’re working with, right? And for me, a common problem in consulting is the waiting game of, like, where is this document, or who was in that meeting? So those are all problems that we’re, like, tackling head-on, because

191 00:23:34.030 00:23:50.179 Uttam Kumaran: you know, we don’t want our client to have a bad experience where we’re waiting for something just because someone answered a question late or didn’t get to something, when easily they could have… if they would have asked it over our code base, asked it over all of our meetings, it would have been an easy answer, right? But the problem there is actually not

192 00:23:50.300 00:24:06.330 Uttam Kumaran: getting that all into chat. It's like, okay, what's the best way to get all of our meetings from Zoom into a Supabase thing and put something like Elasticsearch on top of it to do keyword search? Like, that is the problem. And then similarly, like, codebase, okay, how do we…

193 00:24:06.330 00:24:22.419 Uttam Kumaran: ETL all of our codebase into S3, and then programmatically pick the codebase to bring into context, or the files to bring, right? So you need some tagging; you maybe need an LLM to go summarize when to use certain things. That's… that's actually, like, the meat of the work.
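
The tagging step described above, deciding which files to pull into context, can be sketched as a toy selector. The file paths, tags, and scoring rule here are illustrative assumptions, not the actual pipeline:

```python
# Toy sketch of tag-based context selection over an ETL'd codebase.
# File paths, tags, and summaries are invented for illustration; in
# practice the tags and summaries would come from an LLM pass over S3.

FILE_INDEX = [
    {"path": "api/billing.py",   "tags": {"billing", "stripe"},   "summary": "Invoice generation"},
    {"path": "api/auth.py",      "tags": {"auth", "google"},      "summary": "Google OAuth flow"},
    {"path": "etl/slack_load.py","tags": {"slack", "etl", "s3"},  "summary": "Slack -> S3 loader"},
]

def pick_context(query_tags, index, limit=2):
    """Rank files by tag overlap with the query and return the best matches."""
    scored = [(len(f["tags"] & query_tags), f) for f in index]
    scored = [(s, f) for s, f in scored if s > 0]       # drop files with no overlap
    scored.sort(key=lambda sf: sf[0], reverse=True)     # most overlapping first
    return [f["path"] for _, f in scored[:limit]]

print(pick_context({"slack", "etl"}, FILE_INDEX))  # → ['etl/slack_load.py']
```

A real version would combine this with embeddings rather than exact tag overlap, but the shape is the same: score, rank, truncate to a context budget.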

194 00:24:22.610 00:24:42.089 Uttam Kumaran: And then maybe, okay, eventually those will all get put up as MCP servers so that you can pick and choose. And then, of course, you need auth, right? So, who’s calling the agent? What permissions do they have? How does it get passed to the actual MCP server, or get used? You know, that’s a lot of the same things that we’re thinking about. How do we build that as a…

195 00:24:42.300 00:24:58.829 Uttam Kumaran: as a platform. So, for example, for any new client that comes on, we're trying to auto-generate several pipelines so that our project managers immediately know that, hey, all the Slack messages, all the emails with that client are all getting loaded into one area, and they can quickly chat over everything.

196 00:24:58.860 00:25:04.680 Uttam Kumaran: I need to generate an SOW, I need to generate a project plan, I need to create linear tickets.

197 00:25:04.810 00:25:12.739 Uttam Kumaran: But again, I don’t want… it’s… we developed it individually for a while, and then now we’re starting to build a platform, so any new client

198 00:25:12.980 00:25:17.710 Uttam Kumaran: It’s… we kick off several pipelines that get built, stuff that gets auto-routed.

199 00:25:17.930 00:25:36.030 Uttam Kumaran: And so it’s very similar, but I think this is where you’ll find that a lot of the folks that we talk to, I would say you guys are actually a step ahead, in that a lot of the folks that we talk to are just at the point where they’re trying to decide whether to use ChatGPT or Copilot, or they’re, like.

200 00:25:36.170 00:25:52.989 Uttam Kumaran: they're starting to use it, but they don't even know what agents to build. So I do agree that investing in an n8n infrastructure is the best move, and I think your approach is right in that you focus on the infrastructure. Once you have it, then people will want to come, because you'll teach them to solve their own problems.

201 00:25:53.010 00:26:06.819 Uttam Kumaran: And your team can start to build templates for things, like a simple chat interface. A person will try it and be like, hey, can you actually add this? And you say, well, you can go clone the template, you go at it, and, like, we've built you the playground to go do that.

202 00:26:07.240 00:26:11.860 Uttam Kumaran: And that’s a great… that’s a great way. It’s like, where our business is heading as well, you know?

203 00:26:12.410 00:26:20.729 Uttam Kumaran: But also, n8n isn't the place for everything. Like, if there's fine-tuning and things that need to happen in Bedrock, then that needs to happen there.

204 00:26:21.380 00:26:22.050 Uttam Kumaran: …

205 00:26:22.150 00:26:31.209 Uttam Kumaran: And we're gonna have… we're gonna have parts of our stuff that we don't feel are best to run in n8n, so we run those externally as well, or it's part of other services, so…

206 00:26:31.470 00:26:33.169 Uttam Kumaran: It is a challenge.

207 00:26:33.740 00:26:37.199 Bassel Samman: Yeah, the way I’m thinking about it is all the training

208 00:26:37.460 00:26:51.080 Bassel Samman: The RAG model itself should probably live in Bedrock, so you're building that, so the assistant is ready to go. You're not… you're not… I… I don't know what your experience is with

209 00:26:51.220 00:26:56.150 Bassel Samman: with trying to build a RAG model within n8n. I feel like it was a lot…

210 00:26:56.380 00:27:00.760 Bassel Samman: more performant, and a lot… maybe it’s me imagining it.

211 00:27:00.910 00:27:09.330 Bassel Samman: To build it inside of, like, you know, OpenAI, or inside of Bedrock or something like that, where you’re really building that data set there.

212 00:27:09.330 00:27:09.670 Uttam Kumaran: will be.

213 00:27:09.670 00:27:10.730 Bassel Samman: And then you’re a system.

214 00:27:10.730 00:27:12.940 Uttam Kumaran: Like, kind of the embedding model?

215 00:27:13.320 00:27:17.450 Uttam Kumaran: We're using OpenAI embeddings, but then we have all of our stuff in Supabase.

216 00:27:17.740 00:27:19.060 Uttam Kumaran: And again, we’re pulling, like.

217 00:27:19.060 00:27:19.700 Bassel Samman: Oh, okay.

218 00:27:19.700 00:27:22.219 Uttam Kumaran: Mostly textual data.

219 00:27:23.130 00:27:27.849 Uttam Kumaran: Right? Like, it’s mostly text data, and then…

220 00:27:27.960 00:27:36.489 Uttam Kumaran: It’s easy if we are able to just hook up a database to give it access to run SQL, but that’s a lot of the stuff we’re putting in. Yeah, go ahead, Ken.
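
The retrieval described here, OpenAI embeddings with the data sitting in Supabase, reduces to a nearest-neighbor lookup over embedded text. A minimal in-memory sketch, with a crude bag-of-words stand-in where a real embedding model would go (the vocabulary and documents are invented):

```python
import math

def embed(text):
    """Stand-in for a real embedding model: a crude bag-of-words vector.
    In practice this would be an OpenAI embedding call, with vectors
    stored in a pgvector column."""
    vocab = ["meeting", "invoice", "slack", "deploy", "client"]
    words = text.lower().split()
    return [words.count(w) for w in vocab]

def cosine(a, b):
    """Cosine similarity between two vectors (0.0 for zero vectors)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

DOCS = [
    "client invoice was sent after the meeting",
    "deploy failed on friday",
    "slack thread about the client meeting",
]

def retrieve(query, docs, k=1):
    """Return the top-k docs by cosine similarity to the query embedding."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

print(retrieve("slack messages about the client meeting", DOCS))
```

With pgvector, the `sorted(...)` step becomes an `ORDER BY embedding <=> query` in SQL; the retrieval logic is otherwise identical.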

221 00:27:36.840 00:27:38.370 Ken: Yeah, I was just wondering the…

222 00:27:38.760 00:27:43.760 Ken: Mentioning Slack, have you had… have you run into issues with access to Slack messages yet?

223 00:27:44.350 00:27:55.289 Uttam Kumaran: Yeah, so that’s a… that’s another challenge. So we haven’t… because we’re kind of a smaller team, I don’t think it’s been that high of a priority, but we will start to basically build in…

224 00:27:55.470 00:28:04.939 Uttam Kumaran: access controls via Google Auth. So, depending on the client that you’re assigned to, and you inherit permissions from Google Auth, we will actually bifurcate

225 00:28:04.940 00:28:15.509 Uttam Kumaran: what data you have… you have access to. So we use… we use an ETL tool to bring in all of our Slack messages into S3, and then that gets put into Supabase

226 00:28:15.510 00:28:21.390 Uttam Kumaran: And we run, you know, several steps on top of that, to sort of make sure it's ready for RAG.

227 00:28:21.630 00:28:29.969 Uttam Kumaran: Eventually, basically, we’ll start to build, sort of, databases per client, and sort of have that separation.

228 00:28:30.270 00:28:39.199 Uttam Kumaran: Currently, I'm not sure what the process is, but I don't think it'll be that tough for us to migrate to that type of setup.

229 00:28:40.580 00:28:47.779 Uttam Kumaran: Because, again, we… it’ll just be, like, understanding the user properties, and then making sure that we… we query the right thing.
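
Inheriting permissions from Google Auth and then "querying the right thing" can be sketched as a filter over client assignments. The users, clients, and records below are invented for illustration; a production version would enforce this in the database (for example with row-level security) rather than in application code:

```python
# Hypothetical access-control sketch: users inherit client assignments
# from the identity provider, and every retrieval is scoped by them.

USER_CLIENTS = {
    "pm_alice": {"acme", "globex"},
    "pm_bob":   {"initech"},
}

RECORDS = [
    {"client": "acme",    "text": "Acme sprint notes"},
    {"client": "initech", "text": "Initech kickoff"},
    {"client": "globex",  "text": "Globex SOW draft"},
]

def visible_records(user, records, assignments):
    """Return only the records for clients this user is assigned to."""
    allowed = assignments.get(user, set())
    return [r["text"] for r in records if r["client"] in allowed]

print(visible_records("pm_alice", RECORDS, USER_CLIENTS))
# → ['Acme sprint notes', 'Globex SOW draft']
```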

230 00:28:48.150 00:28:55.800 Bassel Samman: Yeah. Honestly, you bring up a good point that I haven't thought about, because I'm basically only doing very simple setups.

231 00:28:55.850 00:29:08.450 Bassel Samman: What it is, you could just, I don't want to say summarize, but you're basically understanding the data in Supabase, and then you're hitting Supabase directly from your agent, and then…

232 00:29:08.710 00:29:13.040 Bassel Samman: So you’re not directly talking to your data, essentially, you’re like a….

233 00:29:13.040 00:29:18.579 Uttam Kumaran: We’re just trying to retrieve. We’re literally trying to just get the best records to pull out the context.

234 00:29:18.890 00:29:19.390 Bassel Samman: Yeah.

235 00:29:19.430 00:29:20.130 Uttam Kumaran: …

236 00:29:20.690 00:29:35.070 Uttam Kumaran: And again, this is on, like, this could be on thousands of Slack messages, right? So, how do you choose the right ones to bring in, and then to summarize? And so, there's definitely… this is where it's totally use-case dependent.

237 00:29:35.070 00:29:35.929 Ken: Well, thank you.

238 00:29:36.150 00:29:37.209 Uttam Kumaran: Yeah, go ahead.

239 00:29:37.800 00:29:46.660 Ken: I was thinking two pieces of that. One was what you mentioned, which is the access. So, like, I’ve run into that with the MCPs as well. Now that they do… a lot of them do SSE, you can do…

240 00:29:46.770 00:29:53.580 Ken: Basically, you inherit the permissions, like you were saying, of the user who logged in, you can see what you see, but that’s a problem in Slack, because not everybody’s in every channel.

241 00:29:53.580 00:30:11.939 Uttam Kumaran: Yes. You need to have naming conventions, you need, like, we have naming conventions for our Slack, and then, most likely, we will… again, I will try to probably push Google… because we use Google for our workspace and stuff, I’ll probably push Google’s stuff as far as we can to make sure that

242 00:30:11.990 00:30:15.129 Uttam Kumaran: Based on a naming convention, you have some dynamic access, but

243 00:30:15.560 00:30:29.580 Uttam Kumaran: Yeah, again, but, like, we didn't have naming conventions because of this problem, right? Because this is only a recent thing. I'm just opinionated about our Slack being organized, so we have different naming conventions for external versus internal, client, non-client.

244 00:30:29.920 00:30:33.249 Uttam Kumaran: And so we will basically be able to

245 00:30:33.490 00:30:35.820 Uttam Kumaran: Kind of, like, assign access that way.

246 00:30:36.360 00:30:42.379 Ken: The second thing with Slack specifically is I believe that they are limiting third-party access to their APIs, even with user permission.

247 00:30:42.630 00:30:54.109 Ken: And I anticipate that that’s going to be a problem going forward with others, like Google. Google will allow you to connect Gemini, but they may not always allow you to connect, you know, in the way that we’ve been accustomed to their.

248 00:30:54.110 00:30:54.710 Uttam Kumaran: Yeah.

249 00:30:54.710 00:31:01.109 Ken: So, like, having… having retrieved everything with something that is authorized to do that, and collecting it in a central place…

250 00:31:01.820 00:31:02.730 Ken: That’s….

251 00:31:02.730 00:31:11.519 Uttam Kumaran: That’s our position. Yeah. Because they’re… they’re not gonna be able to guard against that, but they’re gonna… they’re gonna guard against, like, point access.

252 00:31:11.520 00:31:12.060 Ken: directly.

253 00:31:12.520 00:31:27.760 Uttam Kumaran: Right? And so that’s what you’re seeing on Twitter, people talking about that, but I’m like, okay, but who… that’s like the… that’s why the MCPs kind of fail at that, because they’re doing point reads when you need it. For us, we take everything, and then we make a decision on what we need.

254 00:31:27.960 00:31:35.120 Uttam Kumaran: You know, after the fact. But, again, for me, that was as easy as, like, hey, let’s just dump everything in S3, and then we’ll go from there.

255 00:31:36.150 00:31:41.609 Ken: I like that idea as well. And you can, like you said, you can further gate access through some kind of convention that you have.

256 00:31:41.610 00:31:42.300 Uttam Kumaran: Yes.

257 00:31:42.790 00:31:47.199 Ken: you have your stuff. You have your stuff in a place where you can… where you can query it.

258 00:31:47.450 00:31:52.159 Uttam Kumaran: Yeah, and again, for us, it's like, okay, like, another use case is take all the Slack

259 00:31:52.350 00:32:06.979 Uttam Kumaran: messages for a client this week and write a sprint summary, right? So that we need, like, historical, you know, we need a list of messages that need to come in. And then we also have cleaning steps, right? Like, Slack messages, you’ll find, are very noisy, there’s all this crap.

260 00:32:06.980 00:32:11.080 Ken: Sure. So, we need to be able to understand threaded messages.

261 00:32:11.080 00:32:13.170 Uttam Kumaran: Like, if there are files attached…
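
A cleaning pass over noisy Slack exports, dropping join notices, bot posts, and empty messages before any summarization, might look like the filter below. The `subtype` and `bot_id` fields follow Slack's message payload shape; the rules themselves are assumptions, not the actual pipeline:

```python
def clean_messages(messages):
    """Drop channel-join noise, bot posts, and empty messages; keep the rest.
    Input dicts mirror Slack API message payloads (text, user, subtype, bot_id)."""
    kept = []
    for m in messages:
        if m.get("subtype") == "channel_join":   # "X has joined the channel"
            continue
        if m.get("bot_id"):                      # CI bots, integrations, etc.
            continue
        text = (m.get("text") or "").strip()
        if not text:                             # empty / whitespace-only
            continue
        kept.append({"user": m.get("user"), "text": text})
    return kept

raw = [
    {"user": "U1", "text": "Deployed the fix to staging"},
    {"user": "U2", "subtype": "channel_join", "text": "<@U2> has joined"},
    {"bot_id": "B1", "text": "Build passed"},
    {"user": "U3", "text": "   "},
]
print(clean_messages(raw))
```

Thread handling (grouping replies under their parent via `thread_ts`) and file attachments would be additional passes on top of this.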

262 00:32:13.620 00:32:22.479 Uttam Kumaran: And then the other… I would say the other side to Slack is we actually try to use it as, like, calling… talking to the agents. So, being able to use Slack’s

263 00:32:22.490 00:32:34.699 Uttam Kumaran: like, Slack bot framework to send files, to then trigger n8n webhooks, because for us, we do have a little bit of a platform, like, kind of a vibe-coded platform now.

264 00:32:34.760 00:32:39.969 Uttam Kumaran: But, I don’t know, I don’t want our team to be leaving where work happens, and Slack…

265 00:32:40.200 00:32:54.490 Uttam Kumaran: is where a lot of our work happens, so as much as we can use agents there, it's a great experience for folks. And then we do have some stuff that is not conducive to a Slack-based interface, so maybe that happens elsewhere.

266 00:32:57.190 00:32:59.130 Bassel Samman: So, for Supabase…

267 00:32:59.390 00:33:09.519 Bassel Samman: I'm curious, do you guys use AWS at all? Or is it because, like, there's not much on AWS that you use Supabase?

268 00:33:09.770 00:33:12.619 Ken: Supabase is an Elixir wrapper around Postgres, so it's just Postgres.

269 00:33:12.620 00:33:15.300 Bassel Samman: Yeah, that, that, that’s… I understand.

270 00:33:15.540 00:33:20.849 Uttam Kumaran: Yeah, we don’t have… so we don’t have a constraint on what to use, so we just… we just use it, yeah.

271 00:33:21.290 00:33:27.649 Bassel Samman: Yeah, because, I mean, as long as you put in, like, pgvector or whatever, you can essentially use anything.

272 00:33:27.650 00:33:29.500 Uttam Kumaran: It’s the same thing, yeah, it’s….

273 00:33:30.540 00:33:37.179 Bassel Samman: I was just curious if there was any advantage that you found that was… that would make it worth it to use.

274 00:33:37.680 00:33:38.430 Uttam Kumaran: Not really.

275 00:33:38.430 00:33:38.910 Bassel Samman: 100%.

276 00:33:38.910 00:33:41.570 Uttam Kumaran: You guys, I would say for us, like, we don’t, like.

277 00:33:41.850 00:33:49.530 Uttam Kumaran: The stuff we're investing in for our team is stuff we're doing for ourselves, so the investment is not, like…

278 00:33:49.690 00:34:04.290 Uttam Kumaran: a lot of our work that we’re doing in AI is actually now more for our clients than for ourselves. This is, like, this is stuff that we develop, but a lot of times, our clients don’t ask us for stuff as challenging as, like, what I ask our team to do, you know, so we tend to go find

279 00:34:04.410 00:34:14.980 Uttam Kumaran: ways that are more cost-constrained of doing it, but… and our data isn’t so large, right? It’s just, we’re able to run a lot of these things, but it’s all Postgres.

280 00:34:15.330 00:34:20.340 Bassel Samman: … Yeah. Yeah, I mean, I use it with some of my N8N stuff, but I…

281 00:34:20.650 00:34:30.549 Bassel Samman: I was just curious if there was a… I wasn’t planning to use it for our production stuff, so I was wondering if there’s any advantage versus, like, some tools they offer that… that…

282 00:34:30.980 00:34:33.419 Bassel Samman: But okay, cool, yeah.

283 00:34:33.429 00:34:39.439 Uttam Kumaran: Yeah, I would say there's just no, like, n8n competitor from the big clouds that is worth trying.

284 00:34:40.060 00:34:40.690 Bassel Samman: Yeah.

285 00:34:40.699 00:34:57.459 Uttam Kumaran: I don't… I think, by the time you get all the access set up, they're very limited. n8n has been there for quite a long time. I think at some point, they'll catch up to have some similar things, or someone will buy these guys, but until then, it's pretty good. And again, you can self-host n8n,

286 00:34:57.689 00:34:58.199 Uttam Kumaran: ….

287 00:34:58.200 00:34:58.610 Bassel Samman: Yeah.

288 00:34:58.610 00:35:03.019 Uttam Kumaran: You know? The other thing is, I think you guys should think about investing into some sort of eval.

289 00:35:03.260 00:35:05.519 Uttam Kumaran: Or, like, some sort of eval platform where you could…

290 00:35:05.520 00:35:07.130 Ken: I was so happy when you said that.

291 00:35:07.460 00:35:14.409 Ken: That is the piece that intrigues me most. Basil knows that’s… that’s my… that’s the thing I listen for. Like, he has his little keywords, he listens to see if somebody is…

292 00:35:15.130 00:35:16.349 Ken: knows AI or not.

293 00:35:16.500 00:35:23.449 Ken: All of the people that I know that actually do this stuff, like, that's where they spend the bulk of their time. How do you actually know that you're getting what you want when it's…

294 00:35:23.450 00:35:24.950 Uttam Kumaran: Yeah, cause I, I, I need…

295 00:35:25.290 00:35:36.889 Uttam Kumaran: guarantee the outcome. And anecdotally, AI can seem like magic, but I need it to work, like, the 100,000th time. I need to make sure. And we're never gonna get it right on the first try.

296 00:35:37.240 00:35:54.010 Uttam Kumaran: Like, so I need to set client expectations well that we will start track… we have a score, and you should see the score climb over time, and… and for our clients, right, if we’re building agents, we have… part of our project management process is the triage. So we look at the lowest scoring stuff.

297 00:35:54.010 00:36:04.529 Uttam Kumaran: okay, maybe there was context missing, maybe we didn't account for this type of request, or maybe something lagged. So, like, we need to build those, and so we've been using Braintrust.

298 00:36:04.700 00:36:10.589 Uttam Kumaran: Pretty great tool. Arize, A-R-I-Z-E, is the other, like, kind of killer tool in this space.

299 00:36:10.790 00:36:19.170 Uttam Kumaran: I think you can't go wrong with either. The other thing is, like, having human-in-the-loop, and sort of, like, human-in-the-loop feedback too, is really good.

300 00:36:19.280 00:36:28.230 Uttam Kumaran: But yeah, I would also suggest the same thing if you're talking to other folks: they're not talking about evals, and there's no way to get… like, there's no measurement in this world. There's just anecdotally, like,

301 00:36:28.230 00:36:29.810 Ken: Exactly. But first, like….

302 00:36:29.870 00:36:38.100 Uttam Kumaran: Let’s keep going, and that’s not like the… I don’t really care. I care about it working every time, and I care about proving that.

303 00:36:38.550 00:36:39.140 Bassel Samman: Yeah.

304 00:36:39.550 00:36:40.419 Ken: You know, that was…

305 00:36:40.680 00:36:51.220 Ken: Specifically, like Basil mentioned, we're a healthcare company, and so, like, one of the early things that people were suggesting for AI is, like, we should generate patient care plans with our AI, and I'm like, that seems like the very worst possible use of AI, right?

306 00:36:51.220 00:36:56.730 Uttam Kumaran: Yeah, but it’s a very sexy thing to do, and someone could have just developed a simple version of that.

307 00:36:56.890 00:37:02.529 Uttam Kumaran: But the edge cases, and I don’t know, I just think, like, once you get into having these in production.

308 00:37:02.780 00:37:07.799 Uttam Kumaran: you just have to measure. And so that’s what, for us as a company, like, for all of our AI clients.

309 00:37:07.850 00:37:17.939 Uttam Kumaran: As part of the outcome, we develop dashboards where we measure not only things about, you know, how long, like, response times, but we do leaderboards, so, like, who’s talking to it the most.

310 00:37:17.940 00:37:32.769 Uttam Kumaran: And we do scoring. And then Braintrust and these guys all have out-of-the-box scoring. You could also do things like LLM-as-a-judge, where it judges the response. And then, of course, we'll start to use LLMs to help triage, right? There's probably 20% of those requests that are, like,

311 00:37:32.800 00:37:36.550 Uttam Kumaran: For example, for some of our clients, we have just huge document stores where

312 00:37:36.670 00:37:41.799 Uttam Kumaran: They would have blamed us and said our agent was wrong, but in fact, the information just isn't there.

313 00:37:41.990 00:37:46.899 Uttam Kumaran: Right? And so, how do I prove that? How do I know? How do I isolate a request-response pair,

314 00:37:47.230 00:38:04.779 Uttam Kumaran: find the problem, then triage it. This is, I think, what everybody's learning now: how do you maintain these systems? A lot of the AI consultancies that you're seeing… this is where I just think, like, because we're an engineering firm first, like, a data engineering firm, and so I know
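
The score-then-triage loop described here, using an LLM as a judge and reviewing the lowest scorers, can be sketched with the judge stubbed out. In practice the stub would be a model call via a platform like Braintrust or Arize; the scoring rule below is invented purely for illustration:

```python
def judge(question, answer):
    """Stub for an LLM judge: score 1.0 if the answer cites a source, else 0.2.
    A real judge would be a model call scoring against a rubric."""
    return 1.0 if "[source:" in answer else 0.2

def triage(interactions, threshold=0.5):
    """Score every logged interaction and surface the lowest scorers for review."""
    scored = [(judge(q, a), q, a) for q, a in interactions]
    return sorted((s, q) for s, q, _ in scored if s < threshold)

LOG = [
    ("Where is the Q3 SOW?", "In Drive, see [source: drive/sow-q3.pdf]"),
    ("Who attended the kickoff?", "Probably everyone."),
]

print(triage(LOG))
# → [(0.2, 'Who attended the kickoff?')]
```

The triage output is exactly the review queue described above: the low scorers are where missing context, unhandled request types, or lag show up first.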

315 00:38:04.910 00:38:09.050 Uttam Kumaran: we have to measure these things, and so a lot of companies, they start off as, like, n8n

316 00:38:09.180 00:38:17.089 Uttam Kumaran: AI consultancies, and they're not building production systems, so they're building, like, proof of concepts, things that are kind of a flash in the pan.

317 00:38:17.090 00:38:17.510 Bassel Samman: Doctor.

318 00:38:18.150 00:38:18.720 Uttam Kumaran: Yeah.

319 00:38:18.720 00:38:21.920 Bassel Samman: The problem is it sounds so real.

320 00:38:22.190 00:38:28.569 Bassel Samman: So you might get a result from AI, and like, the worst is with executives, because they’re looking.

321 00:38:28.570 00:38:30.140 Uttam Kumaran: I know, I know.

322 00:38:30.140 00:38:32.740 Bassel Samman: I work with executives all the time, like, I just….

323 00:38:32.740 00:38:33.330 Ken: Look!

324 00:38:33.330 00:38:44.960 Bassel Samman: you just put it in ChatGPT, and it gave you a result. Look, I'm like, it's gonna give you a result every single time. The problem is, is it right? And, you know, sometimes you'll, like, look, just put in the spreadsheet, it's, you know…

325 00:38:44.960 00:38:54.179 Bassel Samman: a thousand-row spreadsheet, and ChatGPT, no, it'll give you the first 10 rows, but look at the rest of the rows before you… do you think it worked?

326 00:38:54.180 00:39:06.510 Uttam Kumaran: that's… it's good that they're interested, and then you're like, we'll take it from here. Like, thank you for, like… because… and this is where I think our business is unique, because I'm here, so I play both… I play both roles.

327 00:39:06.510 00:39:20.500 Uttam Kumaran: But I have the ideas on, like, what is causing us so much pain? Like, where is the problem in the business? But then it quickly moves from me testing out prototyping in GPT, and then it's like, it needs to move into some sort of broader…

328 00:39:20.500 00:39:21.160 Bassel Samman: Yeah.

329 00:39:21.350 00:39:26.019 Uttam Kumaran: It also needs to move past me, like, other people in the company need to get enabled to use it.

330 00:39:26.340 00:39:26.880 Ken: I’m mute.

331 00:39:26.880 00:39:29.350 Uttam Kumaran: It can't just be a ChatGPT thing, yeah.

332 00:39:30.020 00:39:37.239 Ken: I guess it’s easier in your place, because it’s kind of like in our team. So, like, Basil and I have not had a problem with getting AI uptake. We…

333 00:39:37.350 00:39:40.940 Ken: you know, we… we're… you know, we're techies, right? We like this kind of stuff.

334 00:39:40.940 00:39:41.459 Uttam Kumaran: Yeah, yeah, yeah.

335 00:39:42.450 00:39:46.209 Uttam Kumaran: If I was living in a cave, I would be trying to get internet, you know, so… Yeah, yeah.

336 00:39:46.210 00:39:57.039 Ken: But how do you get uptake from, like… you work with a lot of client companies where maybe not everybody's even using ChatGPT yet, right? Yeah. How do you sell it to the rest of the team?

337 00:39:58.400 00:40:04.119 Uttam Kumaran: Yeah, this is where it’s… it’s more consulting than it is technology work, like, ….

338 00:40:04.120 00:40:06.130 Ken: You’re human very much in the loop.

339 00:40:06.130 00:40:08.099 Uttam Kumaran: Yes, like, this is where, again, it’s….

340 00:40:08.100 00:40:10.620 Bassel Samman: With a stick, is how you sounded.

341 00:40:10.620 00:40:27.690 Uttam Kumaran: This is where you have to have a good client stakeholder, and this, you know, the reason why we do things like the leaderboard is it’s kind of a psychology thing, where if I can go to the company and say, here are the people that are using it, here are the people that aren’t using it, it’s sort of a nudge to be like, go talk to the people that aren’t using it.

342 00:40:27.690 00:40:31.789 Uttam Kumaran: Right? And that’s a really great mechanism for us to produce for them.

343 00:40:31.840 00:40:45.319 Uttam Kumaran: But also, it is… it's a human thing. Even in… even in my business, we have project managers, operations people, marketing people. For one or many reasons, they're not using AI. They're afraid. Maybe they're not trained.

344 00:40:45.420 00:40:54.160 Uttam Kumaran: Also, maybe they used it once and it got it wrong, and then they're like, it's wrong, right? So there's all these things where, for us, using it for so long,

345 00:40:54.500 00:41:07.119 Uttam Kumaran: I'm sort of, like, we're at a different level, but everybody else is just starting. And so, one, there has to be some training. Second, there has to be some sort of feedback-loop process. Like, again, psychology is, like, people want to be listened to, they want to know

346 00:41:07.120 00:41:16.480 Uttam Kumaran: that their feedback is coming back into the system somehow. But it’s a lot of consulting. It’s a lot of just sitting and making sure we go person by person to understand

347 00:41:16.480 00:41:26.209 Uttam Kumaran: the reasons behind adoption. So in that sense, like, I sort of think back to my product manager days where, okay, I’m understanding, like, what is the user, really, what are the issues they’re facing?

348 00:41:26.300 00:41:30.749 Uttam Kumaran: But it’s not… it’s not been a technology problem, for the most part.

349 00:41:32.000 00:41:39.880 Uttam Kumaran: it's been a, like, why aren't they adopting it? Who in the chain of command needs to remind people to do so?

350 00:41:39.880 00:41:52.459 Uttam Kumaran: And how can you ultimately hook the success of an org or, like, them hitting a KPI to AI adoption, right? And that needs to come from the management level. And so whenever we come into a client, we have to agree on a KPI.

351 00:41:52.710 00:41:55.869 Uttam Kumaran: like, what is the metric that we’re trying to achieve? Is it, like.

352 00:41:56.090 00:42:11.679 Uttam Kumaran: everybody’s using the agent once a day? Is it another KPI that goes down or goes up? Like, we have to measure something, because I’m confident we’ll build you whatever the agent it is. I’m not worried about that. Like, I’m worried about affecting the right metric and getting adoption.

353 00:42:11.980 00:42:15.630 Uttam Kumaran: And so, like, that’s what we’re… we’re trying to achieve the outcome.

354 00:42:16.090 00:42:23.249 Bassel Samman: Yeah, that's what I did early on with our team. I was checking everybody's progress, like, how many hours are they putting in,

355 00:42:23.540 00:42:32.149 Bassel Samman: using, you know, Windsurf when we first started. Obviously, I didn't keep up with that. Once I saw that people were using it nearly every day, I backed off a little bit.

356 00:42:32.150 00:42:38.180 Uttam Kumaran: several times. In the analytics, it suggests that you not track it by individual, just by, like, team use.

357 00:42:38.180 00:42:42.180 Ken: Every time I would go back, it had been turned back to individuals, because you wanted to see these.

358 00:42:42.180 00:42:46.380 Uttam Kumaran: I don’t know, I look… I want to see individuals, like….

359 00:42:46.380 00:42:47.650 Ken: Oh, actually, I see you’re already prepared.

360 00:42:47.650 00:42:57.269 Bassel Samman: It’s not… yeah, it’s not… it’s not about accountability and, like, yelling at people. It’s really to see where they’re stuck. Like, remember when Castriot, at the beginning, was like.

361 00:42:57.510 00:43:16.320 Bassel Samman: You know, I told him to use it. He’s like, yeah, yeah, yeah, I got it, and it’s okay. And then when, you know, a week later, I came by, I’m like, hey, have you used it? Well, I tried. It didn’t give me the right results, it kept messing up, and I’m like, did you try giving it a picture instead of words? He’s like, no, I haven’t tried that. And then he tried that, and then it worked, and then all of a sudden, he’s hooked again.

362 00:43:16.420 00:43:29.809 Bassel Samman: It’s not about, you know, it’s really finding what, like you said, they just didn’t think of a different way to talk to it. It’s us learning how to talk to it as much as it learning how to understand us.

363 00:43:30.430 00:43:45.679 Ken: When he said that earlier, linking individual uptake and a leaderboard: I'm usually against things like leaderboards, but in this case, I think it's valuable, because it's feedback for both sides. If you're not using it, there might be a very valid reason.

364 00:43:45.680 00:43:46.090 Uttam Kumaran: Yes.

365 00:43:46.090 00:43:46.690 Bassel Samman: Yep.

366 00:43:46.690 00:43:47.260 Ken: the… the….

367 00:43:47.260 00:43:58.280 Uttam Kumaran: But I also want to champion the people that are using it the most. We have clients where that person, I’m like, have that person join our stand-ups? There’s something they found, there’s something… they’re like the early adopter on the curve.

368 00:43:58.280 00:44:09.859 Uttam Kumaran: So we want to… we want to nourish them and make sure that we're listening to them. And similarly, it's not actually like a, hey, go fire this person. It's not that at all. It's actually… they probably have some valid concerns where maybe we made a mistake.

369 00:44:09.860 00:44:15.169 Uttam Kumaran: Or maybe we continue to make a mistake, or maybe it’s not serving their purpose. And again, these are not just…

370 00:44:15.290 00:44:21.410 Uttam Kumaran: This is not just engineers, like, for us, we have customer service use cases, marketing use cases, so…

371 00:44:21.520 00:44:32.279 Uttam Kumaran: it could be a wide variety of things where… and again, like, we're evolving. Like, maybe we should have AI office hours, where people just come with their problems; maybe we're starting to do more Looms.

372 00:44:32.410 00:44:47.699 Uttam Kumaran: Right? Because it's easy enough for people to just install Cursor, but for the folks in your… in your businesses that are actually having a really tough time with efficiency, which tends to be project management, marketing, operations, there's a lot of alpha for them.

373 00:44:47.710 00:44:54.199 Uttam Kumaran: Right? And it’s like, finding out what they’re constrained by often can be a huge ROI gain.

374 00:44:54.250 00:44:57.959 Uttam Kumaran: And you may end up, like, it may just be getting them to use

375 00:44:58.020 00:45:07.260 Uttam Kumaran: like, ChatGPT in a certain way, but also just maybe a very simple n8n workflow that you can build for them, or you can coach them to build, that would enable them, you know?

376 00:45:08.060 00:45:09.450 Bassel Samman: Yeah, yeah.

377 00:45:09.930 00:45:27.790 Bassel Samman: I'm honestly always blown away when people say, like, you can have a non-technical person use n8n. It's probably because of the way I use it, but if you can simplify it, yeah. Like, if you could do sub-workflows, you could probably just say, use this sub-workflow for the technical stuff, but I'm like,

378 00:45:27.790 00:45:28.789 Uttam Kumaran: That’s where we’re starting to get….

379 00:45:28.790 00:45:30.590 Bassel Samman: Pretty technical.

380 00:45:30.590 00:45:40.389 Uttam Kumaran: It’s pretty technical, it is pretty technical, but I’ve seen motivated people get pretty far. But again, there’s also some people who want to clock in and clock out, and you’re never going to get them to open any of them.

381 00:45:40.390 00:45:47.909 Bassel Samman: Yeah, and honestly, they’ll find a no-code way to do it. I’m probably just being lazy and dropping into, like, a code node, you know?

382 00:45:48.110 00:46:05.239 Uttam Kumaran: Yeah, and again, even if they could just do that, then trying to hit it themselves, it could be better. But, for example, one of the common workflows in my business is copying transcripts: logging into Zoom, getting a transcript, and then going into, like, your ChatGPT and asking… it's, like, such a long process.

383 00:46:05.240 00:46:09.840 Uttam Kumaran: So I said, look, we need to just make a cleaner way for our team to own clean transcripts.

384 00:46:09.880 00:46:19.989 Uttam Kumaran: and make it so it’s easy to go from transcript to one or several actions. Create tickets, create a summary email, right? And that’s what we own. So we didn’t go the distance and, like.

385 00:46:20.170 00:46:27.199 Uttam Kumaran: Building something grand, we’re like, cool, there’s one place to get all the transcripts, and then immediately from there, you can take the five common actions.

386 00:46:27.200 00:46:42.870 Uttam Kumaran: But it takes sort of this, like, product manager approach to kind of see that and say, okay, what’s the 80-20 here on what’s worth automating? I could have been like, here are the project managers, I’m going to train you all on n8n, and then 10 years from now, maybe we would have gotten that, but instead, I was like, guys.

387 00:46:43.020 00:46:55.090 Uttam Kumaran: Everybody just does one of five things when they take a transcript, so let’s just, like, build buttons for each of those, and then all those are system prompts that pull in the right context based on the screen we’re in, there’s some state.

388 00:46:55.310 00:47:05.619 Uttam Kumaran: Perfect. That’s it. But, like, again, that took… we needed to look at the entire use case, you know, and understand the user. So it wasn’t….
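The button-per-action pattern Uttam describes could be sketched roughly as below; the action names, prompt text, and screen-state fields are hypothetical illustrations, not the actual Brainforge implementation.

```python
# Each common action on a transcript maps to a system prompt template
# that pulls in context from the current screen state.
# All action names, prompt text, and field names here are hypothetical.

ACTION_PROMPTS = {
    "create_tickets": "Extract the action items from this transcript and draft one ticket per item.",
    "summary_email": "Write a concise summary email of this meeting for the client.",
    "log_decisions": "List every decision made and who owns the follow-up.",
}

def build_request(action: str, transcript: str, screen_state: dict) -> dict:
    """Assemble one chat request for a single button press."""
    if action not in ACTION_PROMPTS:
        raise ValueError(f"unknown action: {action}")
    # Pull in the right context based on the screen the user is on.
    context = f"Client: {screen_state.get('client', 'unknown')}"
    return {"system": f"{ACTION_PROMPTS[action]}\n{context}", "user": transcript}
```

The point of the design is that users never write prompts: each button is a fixed system prompt plus whatever state the screen already holds.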

389 00:47:05.620 00:47:06.410 Bassel Samman: Have you guys….

390 00:47:06.410 00:47:06.970 Uttam Kumaran: Nope.

391 00:47:07.350 00:47:11.530 Bassel Samman: Have you guys dealt with a lot of healthcare stuff, or…?

392 00:47:12.500 00:47:24.679 Uttam Kumaran: Yeah, we do a lot of healthcare on the… on the data side. So we have… one of our clients is a GLP-1 retailer. We work with a lot of different, like, CPG, e-comm, and the…

393 00:47:24.790 00:47:28.920 Uttam Kumaran: and then, on the healthcare side, telehealth as well.

394 00:47:29.180 00:47:32.560 Uttam Kumaran: That’s been most of, like, where we’ve…

395 00:47:32.850 00:47:49.000 Uttam Kumaran: we’ve worked. And again, a lot of that has been in a data capacity, and a lot of those clients now are starting to use us more in an AI and automation capacity, just because we’re… we have now modeled all their data, and then it’s pretty easy for us to also see… we’ve interacted with most of the business in that function, and so it’s easy.

396 00:47:49.000 00:47:49.400 Bassel Samman: Yeah.

397 00:47:49.400 00:47:51.210 Uttam Kumaran: See where the opportunities are.

398 00:47:52.140 00:48:06.880 Bassel Samman: Yeah, I think that might be helpful. We might actually reach out to you guys when we’re starting to build our datasets, and like, how we’re gonna… because I… that’s my biggest concern, is there’s gonna be a lot of thinking there. How do we structure the data? How do we…

399 00:48:07.090 00:48:15.209 Bassel Samman: You know, and putting it blindly in is not an option, really, so… okay, clean.

400 00:48:15.210 00:48:17.069 Ken: screw it up. But, you know, if we can….

401 00:48:17.070 00:48:19.899 Bassel Samman: Well, we are gonna screw it up, whichever way we go.

402 00:48:19.900 00:48:20.989 Uttam Kumaran: screw it up, screw it up.

403 00:48:20.990 00:48:24.129 Ken: There’s, like, the first three screw-ups, that’s, you know, three….

404 00:48:24.130 00:48:34.599 Uttam Kumaran: We can help you skip a couple steps. I would say land everything in a data lake first, and then do something after that. You’ll save yourself a lot of, like, integration work.

405 00:48:35.190 00:48:38.530 Uttam Kumaran: Yeah, because also that data, by the way, will be helpful for other teams.

406 00:48:38.530 00:48:39.250 Ken: Right.

407 00:48:39.250 00:48:53.090 Uttam Kumaran: So, if you isolate it just to the AI use case, you’re gonna find that some other teams are like, oh, I didn’t know we could actually, like, get access to that. I could use it elsewhere. So, we have everything in a data lake in S3, and then we move it to stuff for analytics.

408 00:48:53.090 00:48:53.610 Bassel Samman: Yeah.

409 00:48:53.610 00:48:55.279 Uttam Kumaran: Stuff for AI use cases.
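The landing pattern Uttam describes, everything raw in one S3 data lake first, with analytics and AI pipelines reading from there, could be sketched like this; the bucket name and key convention are illustrative assumptions, not Brainforge's actual layout.

```python
from datetime import date

BUCKET = "company-data-lake"  # assumed bucket name, for illustration only

def landing_key(source: str, table: str, batch_date: date, filename: str) -> str:
    """Build the raw-zone object key for one landed file.

    Every source system lands under raw/ first, partitioned by date, so
    downstream teams integrate against the lake instead of each source API.
    """
    return f"raw/{source}/{table}/dt={batch_date.isoformat()}/{filename}"

# Downstream zones (e.g. analytics/, ai/) read from raw/ and write their
# own transformed copies, which is how the same landed data serves both
# the analytics team and the AI use cases.
```

The payoff is the one Uttam names: other teams discover and reuse data that was landed once, rather than each use case doing its own integration work.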

410 00:48:55.900 00:48:59.529 Bassel Samman: Yeah, I’d like to get to a point where I could, like, …

411 00:48:59.570 00:49:09.859 Bassel Samman: Where the data is accessible and just protected at a higher level, so the raw data is there and accessible.

412 00:49:09.860 00:49:21.430 Bassel Samman: But the agents are limited in what they can do, and it’s filtered and, like, you know, anonymized and protected and all that at a higher level than the data, so that way, at least.

413 00:49:21.590 00:49:30.720 Bassel Samman: You could still ask the agent, you know, how many customers took this test, how many patients failed this, so without actually getting to patient data.
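The access layer Bassel sketches, agents answering aggregate questions like "how many patients failed this test" without ever touching patient rows, could look roughly like this; the allowed fields, record shape, and suppression threshold are hypothetical assumptions, not BrainCheck's actual policy.

```python
# Agents never see raw patient rows: a layer above the data answers only
# aggregate questions, and only over an allow-listed set of fields.
# Field names and the small-group threshold here are illustrative.

ALLOWED_FIELDS = {"test_name", "result"}  # columns agents may filter on

def aggregate_count(rows: list, field: str, value, min_group_size: int = 5):
    """Answer an aggregate question without exposing individual records.

    Suppressing counts below min_group_size (returning None) is a simple
    guard against re-identifying individual patients via narrow queries.
    """
    if field not in ALLOWED_FIELDS:
        raise PermissionError(f"agents may not filter on {field!r}")
    n = sum(1 for row in rows if row.get(field) == value)
    return n if n >= min_group_size else None
```

A real deployment would enforce this server-side, between the agent and the store, so the protection sits at a higher level than the data itself, as Bassel puts it.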

414 00:49:30.720 00:49:31.760 Uttam Kumaran: Yes. ….

415 00:49:32.470 00:49:43.119 Bassel Samman: So, alright, cool. I actually… I know we’re way over time, and I actually… I almost did two time slots, because I knew a half hour is not… I knew you guys were the right audience.

416 00:49:43.120 00:49:47.919 Uttam Kumaran: End of the day, so I knew… usually, if I’m talking to technical people, it goes… it goes long.

417 00:49:48.300 00:49:57.770 Uttam Kumaran: I spend most of my day, like, not talking… no one ever asks me about, like, how we’re gonna build it. They only ask, like, what the outcome is, so I’m happy.

418 00:49:58.040 00:49:59.350 Ken: We’re gonna have it, right?

419 00:49:59.350 00:50:04.149 Uttam Kumaran: I’m happy to have it the other way for once, that’s, like, that takes a lot less energy for me, so….

420 00:50:04.390 00:50:12.139 Bassel Samman: Yeah, I mean, it’s the most fun conversation for me as well. I’m glad I made it at the end of the day, so it’s a little bit less hectic.

421 00:50:12.140 00:50:16.379 Ken: I’m glad that you invited me. For once, Basil, I will say thank you for inviting me to a call.

422 00:50:18.230 00:50:23.580 Uttam Kumaran: Yeah, and I told Sam also, because Sam’s new, and I said, just come to some of these, because you’ll get a sense of, like, what….

423 00:50:23.580 00:50:28.680 Bassel Samman: what we’re talking to clients about, but this one was a lot easier than they usually are, I feel like.

424 00:50:28.680 00:50:29.170 Samuel Roberts: Yeah, I like.

425 00:50:29.170 00:50:34.929 Uttam Kumaran: We went to one yesterday where it’s like, we were in a long… it’s like a long… we were on Zoom in a long board room, they were like.

426 00:50:35.610 00:50:37.490 Uttam Kumaran: And I’m like.

427 00:50:37.490 00:50:38.100 Samuel Roberts: Yeah.

428 00:50:38.100 00:50:41.930 Uttam Kumaran: Okay, we’ll just… we’ll talk at a high level about everything, and we’ll get you some thoughts.

429 00:50:41.930 00:51:00.629 Bassel Samman: Yeah, yeah, I mean, that’s… I mean, honestly, most of the times, like, when I was talking to Michael, he kind of explained where you guys are, and I’m like, yep, those are the people I want to talk to, because most times, I’ll get, you know, everybody now knows somebody who does AI, and they think they have a secret weapon, and they’re like, here, talk to this guy, and I’m like.

430 00:51:00.680 00:51:03.119 Bassel Samman: Great, he built an agent, like, you know.

431 00:51:03.120 00:51:05.709 Uttam Kumaran: Yeah, yeah, yeah, so that’s cute, that was cute last year, yeah.

432 00:51:05.710 00:51:09.179 Bassel Samman: I’m like, that’s not….

433 00:51:09.180 00:51:13.490 Uttam Kumaran: Yeah, the same thing, too. I mean, you should see my LinkedIn DMs right now, it’s like…

434 00:51:13.950 00:51:28.679 Uttam Kumaran: And you know what’s even more funny? My text messages are filled with people, like, like, friends or, like, old friends are like, hey, I’ve, like, designed this AI thing, like, can you, like, check it out, or can I get time? I’m like, dude, I don’t know, like…

435 00:51:29.250 00:51:30.469 Uttam Kumaran: I’m out of stealth.

436 00:51:30.650 00:51:33.109 Samuel Roberts: Don’t be in stealth forever, like….

437 00:51:33.110 00:51:39.119 Uttam Kumaran: get something going, and then I’m happy, or send me a Loom, or send me some code to read, like, but otherwise….

438 00:51:39.120 00:51:39.470 Bassel Samman: God.

439 00:51:39.470 00:51:40.699 Uttam Kumaran: I think so, yeah.

440 00:51:40.700 00:51:42.450 Bassel Samman: Or maybe not, since….

441 00:51:42.450 00:51:49.979 Ken: I literally am looking at a Slack… I just came from a meeting where AI, in intercom in this case, was telling our Salesforce person.

442 00:51:50.170 00:51:59.370 Ken: But I could easily get their logs for her failed download by hitting the endpoint API intercom download content data XYZ1234.

443 00:51:59.370 00:52:01.060 Uttam Kumaran: Yes.

444 00:52:01.290 00:52:02.490 Uttam Kumaran: Perfect.

445 00:52:03.420 00:52:07.320 Bassel Samman: What could be wrong with that?

446 00:52:07.320 00:52:23.550 Ken: When I told her it needs a token, I said, well, what we actually need is the token and the real ID of the place that your logs live. So she asked it to write some code, and she sent me back, like, the same thing, but now with, like, some Python code: import requests, your access token here.

447 00:52:24.010 00:52:26.140 Ken: Yeah, it’s good to have a nice conversation.

448 00:52:27.070 00:52:27.390 Samuel Roberts: Yeah.

449 00:52:27.390 00:52:37.070 Bassel Samman: Yeah, yeah. Sometimes it’s good… it’s really your luck sometimes with AI. Sometimes you get really good answers, sometimes you get weird

450 00:52:37.660 00:52:45.679 Bassel Samman: imaginations. Yeah. Alright, well, I appreciate it. I know, and I apologize, we’re, we’re way over the allotted time.

451 00:52:45.830 00:52:56.859 Bassel Samman: It was a fun conversation. We’re definitely gonna have more conversations. I’m gonna talk to my team. You’ve expanded my horizons on what’s possible and how to go about it, so I appreciate that.

452 00:52:56.860 00:52:57.280 Uttam Kumaran: Yeah.

453 00:52:57.280 00:52:57.710 Bassel Samman: ….

454 00:52:57.710 00:53:08.970 Uttam Kumaran: I’m gonna send some of the links of stuff that we’ve tried. I have a great RAG vendor that you should check out, and then I’ll send you some of those eval companies. You guys just check them out, and then, yeah.

455 00:53:08.970 00:53:09.970 Bassel Samman: I wrote them down.

456 00:53:10.420 00:53:13.609 Uttam Kumaran: just email me back, or whatever, happy to….

457 00:53:13.610 00:53:16.990 Bassel Samman: Yeah, I mean, I can… I can definitely use some help.

458 00:53:17.230 00:53:27.379 Bassel Samman: Once we decide, like, what… where the data… like, right now, I’m asking people for two things. Every time somebody’s like, I want an agent, I say, fine, give me a workflow.

459 00:53:27.380 00:53:31.430 Uttam Kumaran: Then we’ll talk. And the second step is going to be, okay.

460 00:53:31.690 00:53:34.399 Bassel Samman: You guys have a lot of knowledge everywhere.

461 00:53:34.810 00:53:54.389 Bassel Samman: We need to figure out how to put it together, so I’m looking into bringing Snowflake in, for example, to just start dumping the data into one place, Ava. And so once I have that data, and we know… I know where the code is, that’s about the only thing I know at this point that I’m confident of. And then once we go.

462 00:53:54.390 00:53:55.840 Ken: I haven’t touched the code in a while.

463 00:53:56.370 00:53:59.990 Bassel Samman: I haven’t touched it, but I know where it is. I’m watching.

464 00:54:01.590 00:54:06.589 Bassel Samman: So once, once we kind of… we have the data, like, data…

465 00:54:06.780 00:54:09.500 Bassel Samman: You know, the layout of the data land.

466 00:54:09.660 00:54:14.889 Bassel Samman: I think it would be great to bring you guys in and say, hey, this is the mess that we have.

467 00:54:15.070 00:54:21.310 Bassel Samman: How do we make this mess into something that we can actually use, and how do we…

468 00:54:21.700 00:54:32.500 Bassel Samman: compartmentalize, you know, make different separations, segregate different data. This is… you know, how do we think about it? And probably bring you guys in with our data team.

469 00:54:32.530 00:54:47.799 Bassel Samman: And then kind of start talking about how do we make sense of this data, and how do we feed it in the right way. I really liked how you described contextualizing it, so we’re only looking at a subset of the data for a specific agent or a specific question.

470 00:54:48.140 00:55:00.549 Bassel Samman: I like the idea, but I have no idea how to implement it, to be honest. Like, in the back of my mind, I’m trying to calculate it, I’m like, how do you… so, I think, having you guys help clarify that would be great.

471 00:55:00.750 00:55:02.099 Uttam Kumaran: And being like.

472 00:55:02.100 00:55:07.950 Bassel Samman: Well, let’s try it one more time. Let’s just delete all that, try it one more time. Just delete everything.

473 00:55:07.950 00:55:15.180 Uttam Kumaran: Yeah, or it’s me, like, on Twitter, I’m like, someone… I read this paper, someone just figured out this neat thing, like, we should try this one day. Yeah.

474 00:55:15.180 00:55:34.769 Bassel Samman: Yeah, yeah, I mean, I’d love to tap into your experience about, like, how do we… you know, people think caches work automagically. No, there’s a lot of science behind it, and there’s, like, how do we… how do we figure out what to cache and what to, you know, text and all that. Like, I think it would be great to bring you in and not learn those lessons ourselves.

475 00:55:34.770 00:55:38.309 Bassel Samman: Because that’s a lot of learning that I, you know, we would just.

476 00:55:38.310 00:55:39.340 Ken: bypass.

477 00:55:39.610 00:55:39.930 Uttam Kumaran: Yeah.

478 00:55:39.930 00:55:52.179 Bassel Samman: Yeah, yeah. So I think that would be great, but now I know I have an action plan, so I can start asking the right questions, and I appreciate that. Very eye-opening. I loved the conversation, it was great, I appreciate it.

479 00:55:52.460 00:55:54.750 Uttam Kumaran: Perfect. Okay. Well, thank you guys. Appreciate it.

480 00:55:54.750 00:55:56.949 Bassel Samman: Thank you so much, appreciate it. Have a good one.

481 00:55:56.950 00:55:58.010 Samuel Roberts: You too, thanks.

482 00:55:58.010 00:55:58.560 Ken: Right.