Meeting Title: Eden - Brainforge: AI Command Center Weekly Check-in Date: 2026-04-17 Meeting participants: Pranav Narahari, Adam P, Daniel


WEBVTT

1 00:03:55.180 00:03:56.250 Pranav Narahari: Hey, Adam.

2 00:03:56.830 00:03:57.680 Adam P: Fair enough.

3 00:03:57.810 00:03:58.630 Adam P: How are you?

4 00:03:59.220 00:04:01.749 Pranav Narahari: Pretty good, pretty good. How’d your week go?

5 00:04:02.650 00:04:08.239 Adam P: pretty… chill. Like, yeah, nothing crazy happened. It was steady and…

6 00:04:08.470 00:04:12.430 Adam P: You know, had stuff going on, but nothing… no fires.

7 00:04:12.430 00:04:14.669 Pranav Narahari: That’s good. That’s… that’s the main thing.

8 00:04:18.149 00:04:19.250 Pranav Narahari: Hey, Danny.

9 00:04:19.630 00:04:21.320 Daniel: What’s up, gang gang?

10 00:04:22.029 00:04:23.819 Pranav Narahari: Nothing much, nothing much.

11 00:04:24.500 00:04:25.770 Daniel: How’s life, Pranav?

12 00:04:26.940 00:04:28.949 Pranav Narahari: Things are good, things are good.

13 00:04:28.950 00:04:29.470 Adam P: Patty.

14 00:04:29.470 00:04:39.810 Pranav Narahari: It was… it was funny when I saw… I was about to send y’all a message about, hey, the Google Doc is, ready, and I was like, who are these two bubbles on top? I didn’t even send it to y’all yet.

15 00:04:39.810 00:04:40.520 Daniel: We got through it.

16 00:04:40.520 00:04:41.830 Pranav Narahari: No, that’s cool.

17 00:04:42.720 00:04:46.680 Daniel: This is… so, I want to start the call, yeah.

18 00:04:46.980 00:04:52.180 Daniel: Give me some context on what’s happened over, like, the last week and a half here. I’m just trying to see, like.

19 00:04:52.320 00:04:54.270 Daniel: Like, where are we?

20 00:04:54.650 00:05:01.879 Pranav Narahari: Sure, yeah, yeah. So, we met on Monday this week, right? And that’s where we kind of showed a little bit of…

21 00:05:02.100 00:05:20.750 Pranav Narahari: Or I think at that point, you guys had already started using a little bit of the Master Studio UI, and you were seeing the integration of just your own personal accounts, so for your own Google, your own Slack, and then asking questions on that. You probably noticed, and we talked about it a little bit, like, the context issues coming about.

22 00:05:20.750 00:05:33.489 Pranav Narahari: And so what we talked about, like, at the end of that call was… I mean, there was a lot that we talked about in that call, too. You talked about how, like, with Claude Co-work, that’s kind of narrowing the scope of what we need to work on here, and so…

23 00:05:33.680 00:05:51.500 Pranav Narahari: the main things that are still not being executed on with Claude Co-work are that domain-wide delegation, right? Like, being able to alias into other people’s names, and to get data from… from different people, not just your own authenticated account. And so, that’s what we did for this week.

24 00:05:51.810 00:05:58.789 Pranav Narahari: So this whole week, we set up a new UI as well. We also enabled memory, so you’re gonna have better…

25 00:05:59.330 00:06:10.219 Pranav Narahari: context on your conversation, so if you ask a follow-up, it’s gonna perform better than it would have before. There is, like I just said, like, the domain-wide delegation, so if…

26 00:06:10.250 00:06:24.369 Pranav Narahari: A quick check for this is, if you guys ask a question on… let’s say, Adam, you ask a question, hey, what’s my schedule look like today? It’ll say, okay, what’s on your schedule? You ask the same question, what’s on Pranav’s schedule today? You’ll see…

27 00:06:24.540 00:06:37.289 Pranav Narahari: an answer on that, too. So you’re seeing that data’s getting pulled in from these various people’s accounts. Also what we added in is the backbone, which is the org chart that you provided either yesterday or a couple of days ago.

28 00:06:37.290 00:06:37.760 Daniel: Yeah.

29 00:06:38.450 00:06:42.300 Daniel: It took me a second, sorry for that, but we needed to, like, redo the full org chart, so…

30 00:06:42.300 00:06:48.860 Pranav Narahari: All good, all good. I mean, accuracy is most important there, and so we were working on other things in parallel, so it’s totally fine.

31 00:06:50.300 00:06:57.889 Pranav Narahari: And then, yeah, I think now we’re at a point where I feel like, with the features that we have, we’re kind of…

32 00:06:58.010 00:07:05.650 Pranav Narahari: we built the things, now it’s about refining, and that’s why I wanted to create that Google Doc for you guys to look at. Let us know…

33 00:07:05.790 00:07:21.770 Pranav Narahari: your comments on the output. Are we hitting the mark on certain areas? Are we going too deep on certain areas? Are we going too shallow? And then just overall, like, formatting… output formatting comments as well. Like, is the tone good? Is…

34 00:07:21.870 00:07:24.400 Pranav Narahari: The format good, things of that nature.

35 00:07:24.780 00:07:25.580 Daniel: Perfect.

36 00:07:26.020 00:07:45.220 Daniel: So the TLDR on this is it nailed some things, and then completely whiffed and missed on other things, and so I was doing some thought process on, like, what are the similarities? Like, why is the LIDA project not picked up, and nothing in New Mexico? Why is… why is it, like, perfectly… it still is…

37 00:07:45.220 00:07:48.140 Daniel: like, on-the-nose assessing health OS work.

38 00:07:48.500 00:08:06.029 Daniel: So, like, I know there’s this project that’s picking up all the signals, it’s developing a full purview into this thing, and I was like, well, is it because I have more access into Leda or something? And I think I just had to come to terms with the fact that I think, actually, HealthOS is unique because almost all of the work is being done in Slack.

39 00:08:06.950 00:08:08.710 Pranav Narahari: Mmm. Async.

40 00:08:09.330 00:08:13.579 Daniel: like, we have some meetings and stuff, but I’m not even seeing it really pick up meeting notes.

41 00:08:13.710 00:08:16.459 Daniel: It’s really heavily bent towards Slack.

42 00:08:16.670 00:08:20.950 Daniel: And the common thread I’m seeing is, if work is being done.

43 00:08:21.340 00:08:35.710 Daniel: especially being done publicly on Slack, it’s pulling it, assessing it remarkably well, and, you know, bringing the high-quality outputs we would suggest. But it’s completely blind, or it still feels completely blind.

44 00:08:35.760 00:08:41.400 Daniel: to the other forms of working, up to and including Google Workspace.

45 00:08:41.620 00:08:43.900 Daniel: Like, meeting notes and things like that?

46 00:08:43.909 00:08:44.309 Pranav Narahari: Yeah.

47 00:08:44.310 00:09:03.760 Daniel: In these answers, like, a lot of these prompts have recent recorded Google Meets on it that are in our workspace, and it’s meetings where, like, I have the notes, Adam has the notes, Rebecca has the notes, Pete has the notes, and it’s not picking it up, right? So I wanted to ask a couple general questions. Number one is.

48 00:09:04.820 00:09:09.050 Daniel: How do we, like, weight or balance?

49 00:09:09.730 00:09:14.029 Daniel: Those meeting documents in a way that it becomes, like.

50 00:09:15.210 00:09:18.140 Daniel: as valuable as a Slack note on that.

51 00:09:18.140 00:09:18.860 Pranav Narahari: Yes.

52 00:09:18.950 00:09:32.919 Pranav Narahari: Yeah. So, there’s two ways of doing that, right? We can set hard limits. Now, but like you said, different projects are going to have context in different areas, so setting hard limits, like.

53 00:09:32.920 00:09:42.470 Pranav Narahari: only get 25% of your context, or have, like, a certain token allocation for context from Slack versus Google. I don’t think that’s the best way to do it.

54 00:09:42.700 00:10:02.349 Pranav Narahari: a better way to… so there’s two ways. If you want to have a little bit more hands-on, you can toggle context on and off. So you can say, enable meeting notes to be pulled in as context, enable Slack to be pulled in as context. These are… you can even ask maybe follow-ups to, like, enable certain contexts as well.

55 00:10:04.130 00:10:24.370 Pranav Narahari: what I kind of want to do here is get more of, like, a baseline. So, like, this is good feedback to know that, okay, we’re missing the mark specifically here on meeting notes. First, what I want to do is make sure, okay, what is all the data we’re getting for meetings? I know we’re getting, like, the calendar, like, metadata, like, who’s joining calendar events, when they’re happening, like.

56 00:10:24.370 00:10:26.229 Pranav Narahari: Is there, like, clustering around that?

57 00:10:26.230 00:10:27.050 Pranav Narahari: But…

58 00:10:27.240 00:10:37.800 Pranav Narahari: in terms of, like, the AI summary, or, like, the transcripts even, from these meetings, that is really good context as well. I wanna… I wanna see if there’s any issues with that.

59 00:10:37.800 00:10:45.129 Daniel: Right? And so, like, I almost… because you almost imagine a way of working, right, where it should be reflective of…

60 00:10:45.380 00:10:48.410 Daniel: Like, how my mind thinks about workflow.

61 00:10:48.620 00:10:51.109 Daniel: Right? So, if you sit in a meeting.

62 00:10:51.210 00:10:54.620 Daniel: You synthesize really quickly a lot of…

63 00:10:55.190 00:11:01.350 Daniel: Process stuff, like, this is done, that’s not done, oh, this is a big issue, and then you talk about it for, like, 30 minutes, you know?

64 00:11:01.630 00:11:02.060 Pranav Narahari: That’s like…

65 00:11:02.060 00:11:12.830 Daniel: actually how a meeting runs. You really quickly synthesize everything and then have a deliverable that is probably the real sticking point in any given project, right?

66 00:11:13.150 00:11:13.630 Pranav Narahari: Yeah.

67 00:11:13.630 00:11:23.730 Daniel: Meetings are also, for me, some of the highest-value context for cross-function inter-team work, right?

68 00:11:24.810 00:11:34.540 Daniel: yes, it’s good if you message so-and-so on Slack over at Eden Pharmacy and you’re a telehealth member, but, like, if you go and have a Google Meet with Rebecca, like.

69 00:11:34.650 00:11:42.240 Daniel: That… that is… that’s, like, even more valuable context of I’m directly working with Rebecca, I’m sitting in a meeting with her, right?

70 00:11:42.370 00:11:49.079 Daniel: Yeah. So my very broad assessment on this is, I don’t know, what to…

71 00:11:49.860 00:11:55.239 Daniel: how to do that, but I bet there’s a strategy we can start to consider for meeting context.

72 00:11:55.460 00:12:09.200 Pranav Narahari: Yes, and I’m already getting some ideas based on what you’re saying, like… like you said, there are… these meetings usually follow a structure. Maybe… I mean, structure, maybe people don’t see it, but there’s an underlying structure of how meetings usually go, right? Like.

73 00:12:09.200 00:12:16.309 Pranav Narahari: you talk about, like, the context in the beginning, synthesize everything together, and then at the end, there’s certain takeaways.

74 00:12:16.670 00:12:27.029 Pranav Narahari: those are probably going to be really good context to prioritize from each meeting. So, let’s just assume it’s very limited context, right? Like.

75 00:12:27.110 00:12:40.649 Pranav Narahari: context windows are massive now, so if we even need to… if we even wanted to throw in the entire transcript for a lot of meetings, maybe we can. But at the very least, what we should be getting from each of the relevant meetings is the takeaways. So…

76 00:12:40.720 00:12:55.820 Pranav Narahari: I think that is gonna be, like, a huge… like, for these ones where you feel like the questions are missing the mark, and I kind of want to get into those as well, like, the specifics of which ones, like, hit the mark, which ones didn’t. For the ones that didn’t, I think this is gonna be a huge, like.

77 00:12:56.510 00:13:03.740 Pranav Narahari: Huge value add, huge benefit, and maybe, like, better response quality, basically.
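
The idea of prioritizing each meeting’s takeaways as context could be sketched like this (a rough illustration; it assumes notes carry a “Takeaways”-style heading, which is an assumption about the notes format, not a known fact):

```python
# Hypothetical sketch: pull only the takeaways section out of meeting notes,
# assuming the notes use a "Takeaways"-style heading. The heading names are
# guesses, not the actual Gemini notes format.

def extract_takeaways(notes: str) -> str:
    out, capturing = [], False
    for line in notes.splitlines():
        if line.strip().lower().rstrip(":") in ("takeaways", "action items", "next steps"):
            capturing = True   # found the section header; start collecting
            continue
        if capturing and line.strip() == "":
            break              # stop at the first blank line after the section
        if capturing:
            out.append(line.strip())
    return "\n".join(out)

notes = """Context
Discussed Q2 rollout.

Takeaways:
- Ship domain-wide delegation
- Weight meeting notes higher
"""
print(extract_takeaways(notes))
```

Even if full transcripts fit in the context window, a pass like this gives the takeaways a guaranteed slot regardless of how much else gets pulled in.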

78 00:13:04.080 00:13:11.250 Daniel: Yeah, and look, I’m happy to do, like, a general assessment on this, right? So, how do I share? Yes.

79 00:13:13.940 00:13:18.469 Daniel: So, like, on this, like, I’m telling you.

80 00:13:18.660 00:13:21.190 Daniel: If we ask it about HealthOS stuff.

81 00:13:21.560 00:13:22.130 Pranav Narahari: Yes.

82 00:13:22.130 00:13:26.660 Daniel: pretty damn clear on the responses. Nice. This is… this is great.

83 00:13:26.820 00:13:30.679 Daniel: Do I have any commentary on it? Yeah, I will.

84 00:13:30.810 00:13:33.759 Daniel: I mean, sorta, but I’m not expecting this to be, like.

85 00:13:33.870 00:13:48.340 Daniel: Danny’s brain, I’m expecting this to be, like, an overview without my knowledge on what’s going on with the project, and honestly, if I read this, I’d be like, you know, and I’m Adam, or somebody who’s not, like, involved in the day-to-day workflows like I’m… I am right now.

86 00:13:48.510 00:13:52.210 Daniel: That’s… Pretty good assessment on status.

87 00:13:52.580 00:13:59.250 Daniel: This is, quite honestly, a great assessment on current stat, like, project status and outcomes, right?

88 00:13:59.250 00:13:59.810 Pranav Narahari: Yeah.

89 00:14:00.130 00:14:08.220 Daniel: Next milestones? Yeah, I mean, it’s like, hit the nail on the head on this, right? And then I go down here, and this is what made me think about it. I started looking, I’m like.

90 00:14:09.040 00:14:10.460 Daniel: Really, the only.

91 00:14:10.460 00:14:11.260 Pranav Narahari: One tool call.

92 00:14:11.260 00:14:12.420 Daniel: used was Slack.

93 00:14:12.590 00:14:21.289 Daniel: Exactly, yep. Because all of this project, I’ve really diligently monitored Slack and Jira, with connections between the two.

94 00:14:21.570 00:14:22.710 Pranav Narahari: Gotcha, yeah.

95 00:14:22.710 00:14:24.760 Daniel: Then we ask it this, right?

96 00:14:25.330 00:14:28.679 Daniel: This is an interesting assessment.

97 00:14:29.230 00:14:35.539 Daniel: First of all, we’re gonna have misspellings, right? So this is L-E-D-A, not L-I-D-A.

98 00:14:35.770 00:14:36.300 Pranav Narahari: Okay.

99 00:14:36.300 00:14:42.650 Daniel: It actually made it a good use case, right? Because, like, sometimes I’m gonna type something in, maybe I spelled, I don’t know.

100 00:14:42.780 00:14:52.440 Daniel: You know, I spelled, prescription wrong, or something. Like, that’s gonna happen. So it tried to grab… and then it forced itself in this context bucket.

101 00:14:53.080 00:15:00.380 Daniel: Trying to say, like, okay, Eden Pharmacy is dispensing in New Mexico, well, I knew that, like, that’s where it’s located, right?

102 00:15:01.320 00:15:04.680 Daniel: On February 26th, there was an announcement…

103 00:15:05.620 00:15:13.530 Daniel: in the general channel by Rebecca Edge, without like… That’s not.

104 00:15:13.730 00:15:14.720 Daniel: Correct.

105 00:15:15.040 00:15:19.839 Daniel: I’m not sure what that even means, right? So we bought this thing, it’s been dispensing since August.

106 00:15:20.480 00:15:24.749 Daniel: And then it’s looking through Slack to try… forcing itself into context.

107 00:15:25.400 00:15:41.870 Daniel: said, like, okay, well, there was this post in the general channel, maybe that has something to do with it. By the way, all of these agreements live in Workspace. Like, I easily could have seen, like, you know, acquisition was closed on X day, and it was actively dispensing the same day of purchase point, right?

108 00:15:41.950 00:15:49.669 Daniel: Like, it’s just an interesting context for me. New Mexico is listed as a state that can be routed to Eden Pharmacy as of March 9th, okay?

109 00:15:50.160 00:15:56.360 Daniel: But… the way it got there was Diego posted that in the Health OS dev team channel.

110 00:15:56.630 00:16:00.320 Daniel: I mean, when we bought this, it’s been licensed in New Mexico since…

111 00:16:00.920 00:16:03.780 Daniel: Like, for, like, 3 years before we bought it, right?

112 00:16:04.040 00:16:04.510 Pranav Narahari: Yeah.

113 00:16:04.510 00:16:11.139 Daniel: So, so I was looking through, you know, comparing and contrasting literally just these two. This one, project mostly managed on Slack.

114 00:16:11.270 00:16:18.860 Daniel: recent… I’m not sure how this matters, but technology-related, right?

115 00:16:19.510 00:16:32.609 Daniel: clear current status, as described in JIRA, etc. Then we’ve got this one, which is a more ambiguous project. What’s the status of the, you know, economic development project in New Mexico? And it forced itself into a context bucket

116 00:16:32.940 00:16:39.879 Daniel: trying to describe how, like, we are selling some stuff in New Mexico with, like, general channel pings and mentions.

117 00:16:40.090 00:16:43.999 Daniel: Didn’t pull anything about economic development out, which I thought was really interesting.

118 00:16:44.370 00:16:44.750 Pranav Narahari: Oh my god.

119 00:16:44.750 00:16:51.380 Daniel: Somehow, the search weighting was, like, looking for these exact keywords, and it separated them, right?

120 00:16:51.380 00:16:59.940 Pranav Narahari: So, do you think that information just doesn’t exist in, like, in Slack? Because if I look at the tool call here, too, it looks like it only looked at Slack, so…

121 00:16:59.940 00:17:04.179 Daniel: Well, that’s what I’m trying to say, it’s all workspace documents, phone calls and meetings.

122 00:17:04.180 00:17:05.380 Pranav Narahari: Exactly, yeah.

123 00:17:05.380 00:17:05.730 Daniel: So…

124 00:17:05.730 00:17:07.049 Pranav Narahari: So it’s a fairly trendy.

125 00:17:07.050 00:17:07.839 Daniel: leave this.

126 00:17:07.849 00:17:09.849 Pranav Narahari: You’re trying to put it back together, yeah.

127 00:17:09.849 00:17:16.909 Daniel: There’s a weighting issue, and/or it’s only looking in Slack, and then I go to the tool name, I’m like, wait, it only looked in Slack. No wonder it couldn’t find anything.

128 00:17:16.910 00:17:18.160 Pranav Narahari: Exactly, yeah.

129 00:17:19.240 00:17:20.069 Daniel: So…

130 00:17:21.280 00:17:35.679 Daniel: absolutely would love to go through… we can go through all of these, right? But the most common thread I found when scrolling through all of these was anything that’s heavily involved in Slack, the other one was the, overall work project, so let me go to the project…

131 00:17:38.300 00:17:45.760 Daniel: This one was one where I think it actually did try and… is it this one or this one? It’s the OKR one. Here we go.

132 00:17:45.960 00:17:54.740 Daniel: Update me on Q2 OKR progress by initiative, right? So, interesting that it dove in, looked at… wait, no, shit, it’s the next one.

133 00:17:55.210 00:17:56.000 Daniel: Here we go.

134 00:17:57.540 00:18:04.730 Daniel: Yes. Based on the information from your Google Calendar and Drive, for the record, the first time in this entire doc it’s mentioned, right?

135 00:18:04.990 00:18:05.920 Pranav Narahari: Interesting, yeah.

136 00:18:05.920 00:18:08.430 Daniel: information from Google Calendar and Drive this week.

137 00:18:08.670 00:18:14.219 Daniel: it knows that I had a meeting about an OKR, but didn’t pull the notes.

138 00:18:14.520 00:18:23.149 Daniel: It just cites that I had a meeting, right? No, that was it. It didn’t pull the notes from it, it just said I had a meeting.

139 00:18:24.060 00:18:26.509 Daniel: And then it cited the planning document.

140 00:18:27.870 00:18:35.379 Pranav Narahari: Did it have any commentary? Because I saw something that said associated notes. Gemini associated notes, like, right there, second bullet point.

141 00:18:36.460 00:18:41.580 Pranav Narahari: ETL stand up, but it doesn’t… I’m not sure if that really says it looked at the notes.

142 00:18:41.580 00:18:45.770 Daniel: See, see, and then the conclusions are, like, this suggests active discussion and documentation.

143 00:18:45.770 00:18:48.659 Pranav Narahari: Yeah, so that’s basically saying it didn’t look…

144 00:18:48.660 00:18:55.070 Daniel: It didn’t look or pull in the context, and then this is the one where it said it wasn’t able to review

145 00:18:55.950 00:18:59.990 Daniel: I should have marked this when I was looking at it. It said it wasn’t able to review…

146 00:19:09.550 00:19:12.119 Daniel: It said it wasn’t able to review a document.

147 00:19:12.270 00:19:15.389 Daniel: But this was the first mention of Workspace Drive Files list.

148 00:19:15.600 00:19:16.470 Daniel: Okay.

149 00:19:16.860 00:19:25.919 Daniel: Maybe it’s the question… one of these questions, it cites, I found a relevant document, but I couldn’t open it for you, and that was the trigger that started me down this…

150 00:19:26.850 00:19:31.660 Pranav Narahari: Yeah, sometimes it asks for, like, consent to, like.

151 00:19:31.790 00:19:39.120 Pranav Narahari: actually open into things, and we… we’re noticing that… we noticed that for Slack a couple times. I think that is…

152 00:19:39.490 00:19:49.719 Pranav Narahari: that is no longer an issue. But I will look for, like, certain keywords like that in this report, and fully read into it as well.

153 00:19:50.230 00:19:56.349 Pranav Narahari: To find… find those things, yeah, because that’s… that’s clear indication, right, of things not being properly accessed.

154 00:19:57.110 00:19:58.200 Pranav Narahari: And also.

155 00:19:58.220 00:19:59.660 Daniel: And then after that, it starts, like.

156 00:19:59.660 00:20:00.020 Pranav Narahari: Yeah.

157 00:20:00.020 00:20:02.220 Daniel: workspace for all these questions, which I found interesting.

158 00:20:02.620 00:20:17.419 Pranav Narahari: And I think what would also be super useful for all these questions is, okay, try to use that tool, and see what context you get, and assess if it’s relevant or not, right? Like, it will be able to do an assessment of the…

159 00:20:18.860 00:20:28.759 Pranav Narahari: The similarity of the semantics of the data that it’s accessing versus the question being asked in the command center.
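
That relevance assessment might look roughly like the following (a toy sketch: a real system would compare embeddings, but plain token overlap stands in here so the example runs without any model):

```python
# Hypothetical sketch of the relevance check described above: score retrieved
# context against the question and drop chunks below a threshold. Token-set
# Jaccard similarity stands in for a real semantic (embedding) comparison.

def overlap_score(query: str, chunk: str) -> float:
    q = set(query.lower().split())
    c = set(chunk.lower().split())
    if not q or not c:
        return 0.0
    return len(q & c) / len(q | c)  # Jaccard similarity over tokens

def filter_relevant(query: str, chunks: list[str], threshold: float = 0.1) -> list[str]:
    # Keep only chunks whose similarity to the question clears the threshold.
    return [c for c in chunks if overlap_score(query, c) >= threshold]

chunks = [
    "economic development project new mexico status update",
    "general channel birthday announcement",
]
print(filter_relevant("what is the status of the new mexico economic development project", chunks))
```

With a check like this, the general-channel ping that got force-fit into the New Mexico answer would score near zero and be discarded instead of shaping the response.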

160 00:20:28.760 00:20:36.690 Daniel: So, for dumb people terms, if I know a document exists, ask the bot to go pull that document.

161 00:20:37.830 00:20:38.449 Pranav Narahari: Well, actually.

162 00:20:38.450 00:20:44.469 Daniel: Like, hey, look for our, you know, strategic planning 2026 priorities, or whatever, and like…

163 00:20:44.470 00:20:53.869 Pranav Narahari: What I’m saying, actually, is that we’ll hard code on our end that there are going to be certain tools that always get triggered based on whatever question is asked.

164 00:20:53.970 00:21:05.179 Pranav Narahari: So always check, like, the Google, files list to see if there’s anything relevant. Because sometimes that tool’s not even being triggered here, and it’s just never hitting that context.

165 00:21:06.240 00:21:07.340 Pranav Narahari: Does that make sense?

166 00:21:07.570 00:21:09.100 Daniel: No. So…

167 00:21:09.100 00:21:09.730 Pranav Narahari: Yeah.

168 00:21:10.060 00:21:17.490 Daniel: So basically, you’re saying we have to… we’ll train it to know, like, hey, if you ask about OKR.

169 00:21:19.540 00:21:22.879 Daniel: We’re… we’re gonna wanna set context by…

170 00:21:23.400 00:21:29.239 Daniel: looking at that other document I looked at the other day that you liked, about OKR priorities, or whatever.

171 00:21:30.880 00:21:35.500 Daniel: Like, keywords just become associated with these planning documents we have.

172 00:21:36.130 00:21:51.200 Pranav Narahari: we won’t need to specify the documents themselves, like, we won’t need to hardcode that. So, like, so for that first example, right, it only looked at the Slack context, and that was fine for that example, but then the second example, I think, yeah, the OKR one.

173 00:21:51.200 00:21:56.660 Pranav Narahari: It still only looked at Slack, and it was missing a lot of the context. Right, it did.

174 00:21:56.660 00:21:58.200 Daniel: And it pulled in workspace drive files.

175 00:21:58.200 00:22:04.219 Pranav Narahari: Oh, sorry, the one before this that you showed, I think it’s for a different project, the New Mexico one.

176 00:22:04.260 00:22:06.450 Daniel: Oh, yeah, the New Mexico one, yeah.

177 00:22:06.450 00:22:14.790 Pranav Narahari: Yeah, so that one only pulled in Slack, right? And if we had just hard-coded to say, also look at the Google files.

178 00:22:15.920 00:22:30.429 Pranav Narahari: then it would pull in more context, and that should be a default. Now, the amount of context that it pulls in will still be dynamic, which I think is good. However, we’re still forcing the system to look in all these different paths. We don’t… we don’t give it the…

179 00:22:30.590 00:22:34.889 Pranav Narahari: We don’t give it the option to not look. It has to at least look, and then assess.
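
The “it has to at least look, and then assess” behavior Pranav describes can be sketched as a small planning step (the tool names are hypothetical, invented for illustration):

```python
# Hypothetical sketch of "always run these tools" routing: whatever the model
# decides, a default set of lookups (e.g. the Drive files list) is executed
# and its results offered as candidate context. Tool names are invented.

ALWAYS_RUN = ["google_drive.files_list", "slack.search"]

def plan_tool_calls(model_selected: list[str]) -> list[str]:
    # Union of the mandatory defaults and the model's picks, defaults first,
    # without duplicates. The model may add tools but can never skip defaults.
    plan = list(ALWAYS_RUN)
    for tool in model_selected:
        if tool not in plan:
            plan.append(tool)
    return plan

print(plan_tool_calls(["slack.search"]))
```

The point is exactly the failure mode from the New Mexico question: the defaults guarantee Drive gets at least looked at, even when the model would have stopped at Slack, while how much of each result is actually used stays dynamic.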

180 00:22:34.890 00:22:43.009 Daniel: Now I’m a little confused, though, because you said use the tool, so… Are we training that?

181 00:22:44.180 00:22:50.350 Daniel: Through, like, our search data, or are you saying that’s something you guys are gonna go and try and set, like.

182 00:22:50.770 00:22:52.320 Daniel: context rules on.

183 00:22:52.320 00:22:57.320 Pranav Narahari: Yeah, it’s super simple for us to do, it’s just, we just enable tools to always get run by default.

184 00:22:57.660 00:23:11.100 Pranav Narahari: So, it’s no difference in the behavior for you guys, and for any user of the command center. There’s… there has to be no difference in, like, the documents that you guys write either. It’s just on our end, right now, it’s…

185 00:23:11.420 00:23:16.459 Pranav Narahari: It’s… it’s making the decision whether or not to look in certain areas, like…

186 00:23:16.460 00:23:25.989 Daniel: it decides I have enough information here on Slack to make a call on this. Therefore, I don’t have to go look in Workspace because I have enough context to answer this question.

187 00:23:26.890 00:23:30.839 Daniel: with a certain… character count from Slack.

188 00:23:31.480 00:23:32.960 Pranav Narahari: Yeah. Yep.

189 00:23:33.560 00:23:38.729 Pranav Narahari: It’s, it’s, it’s like a little bit… It’s also looking at the…

190 00:23:39.000 00:23:48.899 Pranav Narahari: the prompt itself that was asked to the command center, and it’ll assess, okay, does this sound like information that would exist in Slack? Does this sound like information that may exist in the

191 00:23:48.900 00:24:04.609 Pranav Narahari: Google Files, so if there’s, like, specific question about, like, meetings, right? And I think with Q2 objectives, right? That probably triggered the AI to think, okay, this is likely a document. Q2 project plans are usually documents, let me go to Google.
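
That prompt-based routing could be sketched as a simple keyword heuristic (the keyword lists below are invented for illustration; in the actual system this judgment is presumably made by the model itself):

```python
# Hypothetical sketch of the routing heuristic described above: guess from the
# wording of the prompt which sources likely hold the answer. Keyword lists
# are invented; a real router would rely on the model's own judgment.

SOURCE_HINTS = {
    "google_drive": ["okr", "objectives", "plan", "document", "doc", "agreement"],
    "calendar": ["meeting", "schedule", "calendar"],
    "slack": ["channel", "thread", "message", "announcement"],
}

def likely_sources(prompt: str) -> list[str]:
    words = prompt.lower().split()
    hits = [src for src, kws in SOURCE_HINTS.items() if any(k in words for k in kws)]
    return hits or ["slack"]  # fall back to Slack when nothing matches

print(likely_sources("Update me on Q2 OKR progress by initiative"))
```

This mirrors the example in the call: “Q2 OKR” reads like a planning document, so the router sends the query toward Google rather than stopping at Slack.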

192 00:24:09.040 00:24:13.780 Adam P: How does it… so, I mean, dumb question, but, like, how does it learn? Like, what…

193 00:24:14.690 00:24:24.810 Adam P: I mean, everything, every AI agent does something different. So, for example, I mean, the thing in my head right now is, say, like, Danny and I have a meeting about, I don’t know, we call it, you know.

194 00:24:24.970 00:24:32.119 Adam P: command center, whatever. We have a meeting, but in that meeting, we talk about a tangent that has nothing to do with command center.

195 00:24:32.240 00:24:37.519 Adam P: Meeting notes capture that tangent, put it in the file, but the file’s called Command Center.

196 00:24:37.850 00:24:50.980 Adam P: Does the AI ever learn or constantly catalog and realize and understand that information for HealthOS might be in a command center meeting note?

197 00:24:51.750 00:25:01.840 Pranav Narahari: Yeah, that would be… that would be pretty difficult. Okay. Because, yeah, the metadata that I would first look into to assess, is this relevant information or not, is gonna be the title, is gonna be…

198 00:25:01.840 00:25:02.650 Adam P: Yeah.

199 00:25:02.990 00:25:08.710 Pranav Narahari: The name of the calendar event. But there is a lot of context that could still make it

200 00:25:08.890 00:25:23.430 Pranav Narahari: you know, get the right answer. Let’s say the people involved with one project are very similar to the people involved in another project. It’s going to make the correct analysis that there may be other topics of discussion in that calendar event.

201 00:25:24.210 00:25:28.060 Adam P: Gotcha. Yeah, that was kind of where I was going at, is, you know.

202 00:25:28.320 00:25:41.920 Adam P: why, like, why did it stop at Slack? Is there any way you would ever know or learn that, hey, actually, I should maybe look here? Rather than explicitly being told to look there, is it going to, over time, you know.

203 00:25:42.070 00:25:45.829 Adam P: Learn itself that that’s a process it should take.

204 00:25:46.380 00:25:57.199 Pranav Narahari: Yeah, so on that over-time learning: we’re not training, right? What we’re doing is just using a model, and then building tools to then access different parts of your data.

205 00:25:57.300 00:26:06.740 Pranav Narahari: So, there’s no actual training happening here, which is… that’s a whole… totally different type of effort, which I don’t think is…

206 00:26:06.910 00:26:11.320 Pranav Narahari: the correct… solution for this type of problem.

207 00:26:11.940 00:26:18.609 Pranav Narahari: Yeah, but yeah, in terms of learning, there’s certain contexts that we can also end up saving.

208 00:26:19.440 00:26:30.989 Pranav Narahari: there… and that’s the whole concept of, like, a knowledge base that I brought up on Monday, is… if we start finding themes of things, and we can… we can probably set up a system as well to…

209 00:26:31.310 00:26:38.620 Pranav Narahari: Detect themes, but then also save that document to then use as additional context for the command center.

210 00:26:40.530 00:26:49.940 Pranav Narahari: So, in that way, like, what you’re talking about learning, it would learn. Not in just, like, the traditional AI sense, but in its effect, yeah, it would be.
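
The theme-detection-plus-knowledge-base idea might be sketched like this (storage, thresholds, and names are all invented; it only illustrates the “learning in effect” Pranav describes):

```python
# Hypothetical sketch of the knowledge-base idea: when a theme keeps showing
# up, save a note about it and inject those notes as extra context on later
# questions. The promotion threshold and storage are invented.

from collections import Counter

class KnowledgeBase:
    def __init__(self, promote_after: int = 3):
        self.theme_counts = Counter()
        self.notes: dict[str, str] = {}
        self.promote_after = promote_after

    def observe(self, theme: str, note: str) -> None:
        # Count sightings of a theme; once it recurs enough, keep the note.
        self.theme_counts[theme] += 1
        if self.theme_counts[theme] >= self.promote_after:
            self.notes[theme] = note  # promoted: now reusable context

    def extra_context(self) -> list[str]:
        return list(self.notes.values())

kb = KnowledgeBase()
for _ in range(3):
    kb.observe("healthos-in-slack", "HealthOS work happens mostly in Slack")
print(kb.extra_context())
```

No model weights change here, which is the distinction Pranav draws: the system “learns” by accumulating saved context, not by training.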

211 00:26:49.940 00:26:53.049 Adam P: Yeah, and it’s its own hard-coded context sort of thing.

212 00:26:53.420 00:26:53.990 Pranav Narahari: Yep.

213 00:26:54.860 00:26:57.899 Adam P: Yeah, because I can see that definitely happening,

214 00:26:58.080 00:27:13.960 Adam P: especially now that there’s kind of, like, a not-everything-is-in-Slack realization sort of deal, you know, retroactive perspective, yeah, that happens more than I would think it should, that we have a meeting for A, but we talk about B the whole time.

215 00:27:17.520 00:27:25.969 Pranav Narahari: Yeah, and these are certain themes that, like, you’re already noticing, right? Like, say if you’re like, I know this is happening, that’s additional context that we give to the system.

216 00:27:26.140 00:27:33.320 Pranav Narahari: So, we put less weight on using just the meeting title.

217 00:27:33.470 00:27:36.230 Pranav Narahari: As indication of is this relevant or not.

218 00:27:36.470 00:27:44.090 Pranav Narahari: I think in this case, it would be more relevant to see, okay, who are the participants of the meeting, right? Because you can’t talk about a topic if someone isn’t present.

219 00:27:44.290 00:28:01.929 Pranav Narahari: Sure. Right? So if all the people aren’t present, then you’re not gonna be… or if one… if none of the people are present, then you’re not gonna be able to talk about the topic. If everybody that’s involved in a certain project are present, then there’s still high likelihood that they’re talking about it, even if it’s not… the calendar wouldn’t…

220 00:28:02.110 00:28:06.440 Pranav Narahari: Hint at that, the calendar wouldn’t hint at that. Yeah.
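
The participant-overlap heuristic described here fits in a few lines (the team rosters below are invented examples):

```python
# Hypothetical sketch of the participant-overlap heuristic: a meeting is a
# candidate source for a project if the project's people were in the room,
# even when the meeting title says nothing about the project.

def participant_overlap(project_members: set[str], attendees: set[str]) -> float:
    if not project_members:
        return 0.0
    # Fraction of the project's members who attended this meeting.
    return len(project_members & attendees) / len(project_members)

healthos_team = {"danny", "adam", "diego"}       # invented roster
meeting_attendees = {"danny", "adam", "rebecca"} # invented attendee list

score = participant_overlap(healthos_team, meeting_attendees)
print(round(score, 2))
```

A score of zero means nobody on the project was present, so the topic can’t have been discussed; a high score keeps the meeting in scope regardless of its title, which covers Adam’s meeting-for-A-but-talked-about-B case.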

221 00:28:06.440 00:28:07.230 Adam P: Makes sense.

222 00:28:08.980 00:28:15.749 Pranav Narahari: And so I think this is the type of discussion that’s super important right now, because this is really gonna, like, refine the…

223 00:28:16.150 00:28:24.190 Pranav Narahari: refine the output. We’ve done a lot of… honestly, I think probably 90-95% of the actual

224 00:28:24.320 00:28:32.740 Pranav Narahari: real difficult, like, software, like, writing, code building, things like that. Now it’s just about refining the output.

225 00:28:35.850 00:28:48.529 Daniel: So, I’m highlighting a couple of these. I’m recognizing a little bit of a theme here, and again, my goal is to try and… I know you want broad scope, everything at once, so you can put sprints in flight and that kind of stuff. I get it.

226 00:28:48.640 00:28:51.720 Daniel: But also, I want to try and solve, like.

227 00:28:51.950 00:28:56.179 Daniel: a business objective through this. Like, a clear, clear business objective, right?

228 00:28:56.510 00:28:57.080 Pranav Narahari: Yeah.

229 00:28:57.080 00:28:57.680 Daniel: So…

230 00:28:57.980 00:29:05.149 Daniel: what I… what I’m realizing is I can… I can go pull the HealthOS Slack notes from Claude. I have that existing, right?

231 00:29:06.300 00:29:10.069 Daniel: So, that doesn’t really solve a business problem for me.

232 00:29:10.650 00:29:14.370 Daniel: Which is… I’m blind to some things.

233 00:29:15.090 00:29:20.210 Daniel: How can we harken back to the original concept here, which was sort of that executive pulse, right?

234 00:29:20.390 00:29:24.660 Daniel: That goes across the company. So I highlighted a few questions in here.

235 00:29:24.790 00:29:31.580 Daniel: I think the goal should be to focus more on Velocity, like this…

236 00:29:31.700 00:29:37.069 Daniel: the bird’s-eye view kind of concept, like, like, how can we set context and pick out

237 00:29:37.510 00:29:43.969 Daniel: broad questions, like, what are the three largest initiatives by Slack activity and people involved right now? That is… that is…

238 00:29:44.480 00:29:50.739 Daniel: That is useful for me, because I can’t assess that context.

239 00:29:51.810 00:30:01.660 Daniel: through my own Slack AI or Claude system, right? Like, I’m blind to a lot of this, which is… and it already caught a few things I find really interesting, like this catalyst mention.

240 00:30:01.770 00:30:04.840 Daniel: That’s a big deal for me, like, that’s actionable.

241 00:30:05.040 00:30:05.930 Pranav Narahari: Right?

242 00:30:06.000 00:30:11.760 Daniel: Yep. So, I’m highlighting a couple of these, and I think the theme, hopefully you see in here is, like.

243 00:30:12.170 00:30:13.000 Daniel: what…

244 00:30:13.760 00:30:30.410 Daniel: from the order chart, what groups are working together, right? Like, is somebody on the dev team talking to somebody on the telehealth, talking… like, that context shows me, like, integration of the team. Like, okay, it may not have the context to know, like, where every single one of those projects are at, but it can at least tell me

245 00:30:30.410 00:30:49.410 Daniel: hey, over the last 6 months, you’ve had a lot more communication between Eden Pharmacy and Eden Telehealth. Well, that’s a hugely helpful signal for me, right? Or, hey, dev team is never, ever talking to, I don’t know, the retail team. And then I’d be like, okay, that makes sense, they don’t have any retail projects, but I could track that over time, right?

246 00:30:49.410 00:30:54.109 Daniel: And if it instead found, like, hey, dev team is working every day with Eden Health Clubs, I’d be like, why?

247 00:30:55.290 00:30:55.929 Pranav Narahari: You know what I mean?

248 00:30:56.320 00:31:07.149 Daniel: So… and then that would tell me, like, hey, we’re misallocating resources. So I’ve highlighted a few of these. I think if we can get the broad picture, and I can start to learn, like, hey, what’s going on at Eden?

249 00:31:07.370 00:31:08.310 Daniel: Yeah.

250 00:31:08.750 00:31:13.090 Daniel: Hey, where is this particular project out, and what was the last outcome?

251 00:31:15.030 00:31:22.079 Daniel: I don’t know how to magically make that context happen, but that’s the objective that I can’t accomplish

252 00:31:22.180 00:31:27.580 Daniel: without the full MCP context you’ve set up in, like, my Claude.

253 00:31:28.510 00:31:38.259 Pranav Narahari: Yes. Yeah. That makes sense. That’s the whole point of the command center, right? Right. Like, we don’t need it to answer… yeah, to your point, like, some of these other questions…

254 00:31:38.260 00:31:44.629 Daniel: detailed in some of the initial questions I was asking. I was like, where’s HealthOS Project? Where’s the LIDA project? Maybe that’s not the point. Maybe the point is…

255 00:31:44.770 00:31:45.300 Pranav Narahari: Yeah.

256 00:31:45.670 00:31:46.340 Daniel: like.

257 00:31:46.820 00:31:47.670 Daniel: you know.

258 00:31:48.940 00:31:54.169 Daniel: figuring out if the pharmacy team is spending 30% of their day working with telehealth. Look, that’s a signal…

259 00:31:54.280 00:32:00.240 Daniel: For me, that, like, something’s wrong in terms of, like, completing those objectives, and they’re just pontificating for hours, right?

260 00:32:00.240 00:32:01.090 Pranav Narahari: Totally, yeah.

261 00:32:01.260 00:32:05.430 Daniel: So I highlighted a few of these that I think maybe we can hone in on, on, like, Like.

262 00:32:05.620 00:32:06.800 Daniel: This gave me nothing.

263 00:32:07.280 00:32:08.290 Pranav Narahari: You get nothing, yeah, yeah.

264 00:32:08.290 00:32:15.809 Daniel: This was also kind of, like, before… you guys had to have done this before I sent over the org chart, because I sent it over, like, what, yesterday evening?

265 00:32:16.140 00:32:16.750 Pranav Narahari: Yeah.

266 00:32:16.940 00:32:31.459 Daniel: So, like, that makes sense. Maybe the org chart gives us some context there. What teams are most engaged with that? Well, that’s org chart related. What are the biggest cross-team coordination gaps? Well, that’s org chart related. So maybe as I’m thinking through that, I had that, conversation with that group called Worklytics.

267 00:32:31.920 00:32:32.670 Pranav Narahari: Right.

268 00:32:32.670 00:32:34.760 Daniel: I know we can solve it,

269 00:32:35.050 00:32:39.089 Daniel: Let me just share a little bit. Actually, where are you?

270 00:32:39.410 00:32:40.409 Daniel: Where’s my Zoom?

271 00:32:41.390 00:32:51.360 Daniel: There we go. In this one,

272 00:32:52.410 00:32:55.450 Daniel: How can I… how can I start here?

273 00:32:59.830 00:33:01.520 Daniel: Thought this was interesting.

274 00:33:01.640 00:33:05.679 Daniel: Now, obviously, our tool’s not designed expressly for this.

275 00:33:07.370 00:33:14.130 Daniel: But… The concept they were pitching is like, hey, we can kind of see your organization.

276 00:33:14.400 00:33:17.699 Daniel: How much do you have cross-org collaboration time?

277 00:33:17.930 00:33:23.180 Daniel: like… We don’t need to do benchmarking, but, like, bottlenecks, like…

278 00:33:23.860 00:33:30.599 Daniel: where are people spending… these are really broad insights, I get it, but what it started to hone in on is, like.

279 00:33:31.760 00:33:33.329 Daniel: Who’s doing a lot?

280 00:33:33.960 00:33:36.090 Daniel: Who’s not talking to each other, ever?

281 00:33:36.350 00:33:41.119 Daniel: Like, really high-level signals on, like, ways of working.

282 00:33:41.750 00:33:42.590 Pranav Narahari: Yeah.

283 00:33:43.010 00:33:46.860 Pranav Narahari: It’s very structured. It’s very structured based on what it looks like here, yeah.

284 00:33:46.860 00:33:55.329 Daniel: And I don’t necessarily need it structured. I don’t need a… I don’t need a network output like this. Like, I could imagine a chatbot being able to assess it and be like.

285 00:33:55.730 00:33:57.350 Daniel: Okay,

286 00:33:57.890 00:34:10.870 Daniel: Eden Pharmacy has 3 times as many Slack messages with Eden Telehealth as they did in January, or something, right? Well, that’s enough for me, like, that’s directionality and signal. This is, like…
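
[Editor’s note: the signal Daniel describes here, month-over-month cross-team Slack volume, reduces to a simple count once each message is tagged with a team. A minimal Python sketch; the team names, months, and record shape are made up for illustration, not real Eden data.]

```python
# Hypothetical message records: sender team, counterpart team, and month.
# Real data would come from a Slack export or connector.
messages = [
    {"from_team": "Eden Pharmacy", "to_team": "Eden Telehealth", "month": "2026-01"},
    {"from_team": "Eden Telehealth", "to_team": "Eden Pharmacy", "month": "2026-03"},
    {"from_team": "Eden Pharmacy", "to_team": "Eden Telehealth", "month": "2026-03"},
    {"from_team": "Eden Pharmacy", "to_team": "Eden Telehealth", "month": "2026-03"},
    {"from_team": "Dev", "to_team": "Retail", "month": "2026-03"},
]

def cross_team_volume(msgs, team_a, team_b, month):
    """Count messages exchanged between two teams in a month, direction ignored."""
    return sum(
        1 for m in msgs
        if m["month"] == month and {m["from_team"], m["to_team"]} == {team_a, team_b}
    )

jan = cross_team_volume(messages, "Eden Pharmacy", "Eden Telehealth", "2026-01")
mar = cross_team_volume(messages, "Eden Pharmacy", "Eden Telehealth", "2026-03")
print(f"Pharmacy/Telehealth volume: {jan} in January, {mar} in March")
```

[A chatbot answer like “3 times as many messages as January” is just this ratio phrased in prose; the directionality signal needs only counts, not the full network visualization the Worklytics-style tool renders.]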

287 00:34:11.679 00:34:13.290 Daniel: HRIS.

288 00:34:15.620 00:34:23.050 Daniel: adoption data and shit. I don’t need this, but the concept was kind of interesting of being able to see, like, who’s working together when and on what.

289 00:34:25.199 00:34:31.239 Pranav Narahari: Yeah, that makes sense. And I think this is taking us to another level, which I think is also super interesting.

290 00:34:31.479 00:34:32.729 Pranav Narahari: having…

291 00:34:33.319 00:34:47.269 Pranav Narahari: the integrations part we’re building, we’re… I’m going to look into specifically this, meeting transcript polls. If that’s not happening, we’re gonna add that part. But other than that, I think that’s the only missing link here in terms of

292 00:34:47.839 00:35:02.099 Pranav Narahari: having access to all the data, right? And then, so, for getting those specific types of outputs, it’s just about creating more direction. So, we want to have this type of output, right?

293 00:35:02.569 00:35:03.789 Pranav Narahari: And then…

294 00:35:03.849 00:35:12.229 Pranav Narahari: the chat interface can be thought of as, like, okay, you can ask any type of question. It has access to all these different tools and all these different data sources.

295 00:35:12.269 00:35:24.589 Pranav Narahari: ask a question, and then it will… it’ll find the information for you. However, the output is always going to be less structured than, let’s say, a dedicated view for…

296 00:35:24.619 00:35:36.259 Pranav Narahari: cross-team communication. If that view is dedicated just for cross-team communication, to see, hey, how many people on the engineering team are talking to other people on the marketing team, just an example,

297 00:35:36.319 00:35:41.759 Pranav Narahari: it’s always gonna have better data and be more accurate, because we’re giving it a lot more direction.

298 00:35:42.249 00:35:43.499 Pranav Narahari: Does that make sense?

299 00:35:44.310 00:35:46.539 Daniel: Well, and I’m trying to tie this back to…

300 00:35:47.020 00:35:52.600 Daniel: like, look, this is great, it’s fun to explore possibilities, all that kind of stuff. I also want to…

301 00:35:53.260 00:36:02.319 Daniel: Be sure that, I also want to be sure that, like, we’re solving a business problem here.

302 00:36:02.320 00:36:03.370 Pranav Narahari: Yes, yeah, yeah.

303 00:36:03.370 00:36:12.770 Daniel: The real, like… the real first step to solving a business problem is, can we quickly assess

304 00:36:13.000 00:36:19.069 Daniel: Things that we don’t have access to at all, because it takes full, full organizational context.

305 00:36:19.410 00:36:28.899 Daniel: And that is kind of, like, the baseline, right? Like, what can’t I do in my Claude? And the answer is, I can’t search everything, I can only search what I can search.

306 00:36:29.170 00:36:30.219 Pranav Narahari: Exactly, yep.

307 00:36:30.220 00:36:33.739 Daniel: So we need to… we need to… we need to draw in, like…

308 00:36:34.200 00:36:45.639 Daniel: it’s less important that I get the exact details on a particular project or whatever. If I’m involved in it, I can go pull that. It’s more important that we set context to the organization.

309 00:36:45.860 00:36:50.720 Daniel: the… Yeah, and stuff I’m, like, blind to, I guess.

310 00:36:51.930 00:36:52.500 Pranav Narahari: Yeah.

311 00:36:52.500 00:37:07.729 Daniel: That’s a soliloquy to say I’ve highlighted some questions that I think I can’t answer, right? Like, I can’t… I can go into Claude and it’ll give me an answer, but without full organizational context, I’m only ever going to assess what projects I’m involved with versus what’s actually happening.

312 00:37:07.880 00:37:10.220 Daniel: Across the org. And I think this did.

313 00:37:10.220 00:37:10.600 Pranav Narahari: Good.

314 00:37:10.600 00:37:12.290 Daniel: Pretty decent job.

315 00:37:12.680 00:37:16.570 Daniel: Trying to, like, Draw through that.

316 00:37:18.020 00:37:21.900 Daniel: maybe the context, like… like this Gravie deal on the Hillstone opportunity?

317 00:37:22.600 00:37:26.949 Daniel: I really haven’t been involved in that. Like, that’s kind of news to me, right?

318 00:37:26.950 00:37:27.520 Pranav Narahari: Yeah.

319 00:37:27.520 00:37:30.330 Daniel: I know it’s actually a pretty small project.

320 00:37:30.960 00:37:32.640 Daniel: I just think it pulled that out, right?

321 00:37:32.860 00:37:33.470 Pranav Narahari: Yup.

322 00:37:34.890 00:37:45.559 Daniel: Anyway, that was a long, long way to say I think I want to hone in on some of these that are, like, broader, semantic kind of questions, because that’s the root business

323 00:37:45.680 00:37:47.880 Daniel: problem I can’t solve in Claude.

324 00:37:48.300 00:37:52.829 Pranav Narahari: Right, yeah, I’m glad that you defined these specific things where you can’t…

325 00:37:52.980 00:38:07.260 Pranav Narahari: I wonder if you threw this into Claude, it would probably just give you not great information back, but it’s a good baseline for us to be, like… for you to know that the command center is giving you the right output, and good output, I should just say.

326 00:38:07.410 00:38:26.170 Pranav Narahari: is that, like, just… we can compare the two, right? Like, next week, let’s do that. We can go over the report that I’ll generate for you of what we’re… what we’re creating versus your Claude instance, and what we should see is significantly better and more enriched answers in the command center.

327 00:38:27.480 00:38:28.115 Daniel: Yeah…

328 00:38:28.750 00:38:29.320 Pranav Narahari: Yeah.

329 00:38:32.800 00:38:35.319 Pranav Narahari: Yeah, it’d be interesting right now to see what it said.

330 00:38:44.120 00:38:46.239 Daniel: Because it does a pretty damn good job.

331 00:38:47.760 00:38:50.810 Daniel: Although, I have no connected Asana, so that’s interesting.

332 00:38:53.950 00:38:56.679 Daniel: Pulling heavy towards Jira and Atlassian.

333 00:38:57.620 00:38:58.130 Daniel: Awesome.

334 00:38:58.130 00:38:58.910 Pranav Narahari: Oh, okay.

335 00:39:00.140 00:39:02.190 Daniel: Didn’t cite Google Workspace. Little workspace.

336 00:39:02.190 00:39:08.809 Pranav Narahari: It’s not a clear, I guess, one-to-one, comparison, because it has way more integrations.

337 00:39:10.030 00:39:12.559 Daniel: Well, but that’s a… I mean, that’s what we’re up against here, right?

338 00:39:12.720 00:39:14.580 Pranav Narahari: Yeah, that’s true. Yeah.

339 00:39:17.070 00:39:19.960 Daniel: I don’t know what this is gonna spit out, probably a bunch of garbage, but…

340 00:39:21.020 00:39:27.219 Daniel: Everything on here is gonna tell me I’m working hard on HealthOS, which is like, great, what are other people working on? I know what I’m working on.

341 00:39:28.360 00:39:29.180 Pranav Narahari: Right, right.

342 00:39:34.500 00:39:35.370 Daniel: Okay, cool.

343 00:39:35.370 00:39:36.580 Pranav Narahari: I’m gonna go one thing more.

344 00:39:37.000 00:39:41.160 Daniel: Interesting. It’s gonna go through every tool, especially the tools I don’t use.

345 00:39:41.490 00:39:43.210 Daniel: Jira, Asana, and Linear.

346 00:39:43.900 00:39:54.300 Pranav Narahari: Yeah, so actually what’s interesting here is they’re kind of doing what we just thought of as a solution, is I’m just gonna go through every single one of my data connectors, and just see if there’s anything relevant.

347 00:39:54.600 00:40:13.140 Daniel: And that’s why, like, on the MCP, the original concept was, like, well, we just dumped all this shit into one underlying tool, not using connectors, but literally had, like… I was imagining in my head, like, this database where, like, okay, a new doc was created in Google Workspace, export into…

348 00:40:13.290 00:40:26.080 Daniel: this big data lake, and use that to assess, right? And all of a sudden, all of the different tools become one-to-one weightings, because they all just live in one place. It’s not going connector by connector like Claude does.

349 00:40:26.670 00:40:34.279 Daniel: And then it’s just, like, a big data lake to assess, like, alright, based on all of the shit we’re pulling in here, what’s happening? And where’s it happening?
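
[Editor’s note: Daniel’s data-lake idea, every connector exporting into one store with a shared schema so no source outweighs another, can be sketched in a few lines. The source names, fields, and normalizers below are hypothetical, not an actual Eden pipeline.]

```python
from dataclasses import dataclass

@dataclass
class LakeRecord:
    source: str   # which connector the record came from, e.g. "slack" or "gdocs"
    author: str
    created: str  # ISO date string
    text: str

# One normalizer per connector turns tool-specific payloads into the shared schema.
def normalize_slack(msg: dict) -> LakeRecord:
    return LakeRecord("slack", msg["user"], msg["date"], msg["text"])

def normalize_gdoc(doc: dict) -> LakeRecord:
    return LakeRecord("gdocs", doc["owner"], doc["created_at"], doc["body"])

lake = [
    normalize_slack({"user": "adam", "date": "2026-04-16", "text": "Catalyst kickoff notes"}),
    normalize_gdoc({"owner": "daniel", "created_at": "2026-04-15", "body": "HealthOS status summary"}),
]

# A query now runs over one pool instead of going connector by connector.
hits = [r for r in lake if "HealthOS" in r.text]
```

[The trade-off Pranav raises in response, ETL overhead, documents going stale, migration cost on schema changes, is exactly what this sketch hides; querying live connectors through MCP avoids maintaining the lake at all.]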

350 00:40:35.300 00:40:39.379 Pranav Narahari: Yeah, I can see… I can see what you’re saying there, too.

351 00:40:39.840 00:40:48.240 Pranav Narahari: However, it’s also still gonna be based off of maybe, like, the density of information that’s getting pulled from each of these connectors, right? Is the data lake gonna…

352 00:40:48.910 00:40:59.939 Pranav Narahari: it’s not gonna be an even one-to-one weight, just because Slack information may be less… less memory than what you’re pulling from Google.

353 00:41:00.160 00:41:09.830 Pranav Narahari: And at this… and the bigger concern for me is, actually, is that these documents, specifically, are gonna be changing.

354 00:41:10.650 00:41:12.050 Pranav Narahari: And so…

355 00:41:13.510 00:41:21.890 Pranav Narahari: It’s just… I think the solution that we actually have outlined right now is a little bit better. First off, it’s a lot less,

356 00:41:22.070 00:41:37.520 Pranav Narahari: overhead that we need to, like, worry about. There’s less things that are gonna go wrong. Data’s not gonna get, like, malformatted. If we want to do a different format for our data, it takes a lot to then, you know, migrate our current system to that new system.

357 00:41:37.640 00:41:47.899 Pranav Narahari: This is something that we actually did talk about a lot internally. Do we want to stand up a data warehouse to source in all the information, like, run an ETL job once a week, or…

358 00:41:47.900 00:42:00.720 Pranav Narahari: once a day. But we decided, like, these… the CLI tools, these MCPs, are actually much better than people give them credit for, and for this specific context, I think it does the job exactly the same, if not better.

359 00:42:00.720 00:42:04.440 Daniel: Great, context issue. I mean, this just pulled from my Jira board. Thanks.

360 00:42:04.640 00:42:05.660 Daniel: You know what I mean?

361 00:42:05.660 00:42:06.340 Pranav Narahari: Oh, that’s okay.

362 00:42:06.340 00:42:10.250 Adam P: It’s so much output, too, that is… there’s nothing summarized.

363 00:42:10.250 00:42:12.909 Daniel: I mean, okay, but literally.

364 00:42:16.880 00:42:21.250 Daniel: Like, the path of least resistance to answer my question, and it’s like, okay.

365 00:42:21.480 00:42:22.410 Pranav Narahari: Hmm.

366 00:42:22.410 00:42:34.169 Daniel: Jira’s pretty well structured. I’m just gonna look at Jira and give an answer, because it sounds like he’s using Jira a lot lately, and also, it’s just easy, because it’s already organized. Literally didn’t pull in Slack at all.

367 00:42:34.990 00:42:36.370 Pranav Narahari: It’s saving time.

368 00:42:36.370 00:42:39.779 Daniel: a moment to look through Workspace, ignored it at the top.

369 00:42:40.790 00:42:45.980 Daniel: never looked through Slack. I don’t have a Notion, Asana, or Linear connected, which is bizarre.

370 00:42:48.230 00:42:48.799 Daniel: So this is the.

371 00:42:48.800 00:42:49.330 Pranav Narahari: I’m curious.

372 00:42:49.340 00:42:50.020 Daniel: I feel like I’m missing.

373 00:42:50.020 00:42:53.370 Pranav Narahari: Yeah. How does,

374 00:42:53.730 00:42:58.759 Pranav Narahari: How does pricing work with this? Do you pay per token, or do you just pay per seat, like, per month?

375 00:42:58.940 00:43:02.579 Daniel: I just have, like, some… $200 a month.

376 00:43:02.890 00:43:08.619 Adam P: Didn’t they… they literally just changed it to where it’s gonna be tokens regardless now? I just read that, I think.

377 00:43:09.080 00:43:09.800 Pranav Narahari: Okay.

378 00:43:09.800 00:43:17.990 Adam P: like, this morning, they… Anthropic was like, no more seats, everyone uses tokens. I don’t know if it was, like, fear-mongering title or whatever, but…

379 00:43:20.670 00:43:28.999 Pranav Narahari: I mean, that’s how they get you with the seats thing, too, sometimes. I mean, it’s usually cheaper, right? And it usually, like, fits the use case for most people, but…

380 00:43:29.820 00:43:32.110 Pranav Narahari: They’re gonna throttle your token usage.

381 00:43:32.450 00:43:42.949 Pranav Narahari: Right, they’re gonna… because for them, that’s where they have their cost savings. If they use less tokens, so if they’re like, okay, already organized data in this place, we think this is gonna be enough

382 00:43:43.030 00:43:54.229 Pranav Narahari: context for their answer. 9 out of 10 times, like, people are using AI for very basic things, not for what we’re doing here. And so it’s gonna give them a fine output, but…

383 00:43:54.590 00:44:00.390 Pranav Narahari: it works against us here. Potentially. Potentially. I don’t… we won’t know what’s happening behind the scenes.

384 00:44:00.860 00:44:10.090 Daniel: So, coming up to… I gotta jump in a second, but the overall context I generally have is…

385 00:44:10.660 00:44:18.089 Daniel: If we can solve the business problem of giving me context that I do not have access to because I am one person.

386 00:44:18.360 00:44:19.220 Pranav Narahari: Yes, yep.

387 00:44:19.220 00:44:27.949 Daniel: And we have an ability for me to take something like… let’s say Adam sent over a summary doc from his Claude. Hey, looks like the team’s spending a lot of time on this.

388 00:44:28.540 00:44:28.910 Pranav Narahari: You can take…

389 00:44:28.910 00:44:32.770 Daniel: Copy it, go into our actual bird’s eye view.

390 00:44:32.980 00:44:42.350 Daniel: And be like, no, Adam, you’re just not clued into those calls, they’re already actively happening, it’s documented in Workspace, it’s in the Slack notes, you just aren’t…

391 00:44:43.730 00:44:45.900 Daniel: In that channel, or something.

392 00:44:46.270 00:44:47.120 Pranav Narahari: Yeah, yeah.

393 00:44:47.120 00:44:50.590 Daniel: Like, that would be an interesting test to run, right?

394 00:44:50.780 00:44:55.089 Daniel: Yeah. If I run something on this, maybe… maybe I’ll test a question, right? Like,

395 00:44:55.340 00:45:02.120 Daniel: one of these, like, we tried one of the cross-functional ones, it’s just giving me a terrible output. But if we tried… if we tried one, maybe…

396 00:45:03.520 00:45:05.030 Daniel: Maybe this is better.

397 00:45:06.950 00:45:11.599 Daniel: Just to see…

398 00:45:12.300 00:45:25.390 Daniel: If we try one of these, I get a Claude answer, and then we have the MCP layer that we’re building out double-check it, and see what I missed. That might be, like, a direct one-to-one way to test, like, are we drawing in additional context?
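
[Editor’s note: the one-to-one test Daniel proposes, ask the same question in both systems and see what the narrower one missed, reduces to a set difference over cited sources. The source identifiers below are placeholders, not real outputs from either system.]

```python
def missing_context(narrow: set, broad: set) -> list:
    """Sources the broader system cited that the narrower one never touched."""
    return sorted(broad - narrow)

# Hypothetical citations for the same question asked in both systems.
claude_sources = {"jira:HEALTH-42", "jira:HEALTH-51"}
command_center_sources = {"jira:HEALTH-42", "slack:#telehealth", "gdocs:catalyst-brief"}

# Anything in `gaps` is context Daniel was blind to in his own Claude instance.
gaps = missing_context(claude_sources, command_center_sources)
```

[This is the same “gap spotting” shape Pranav later agrees to structure the weekly report around: Claude output in, command-center pass over it, and the what-is-missing list surfaced.]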

399 00:45:27.890 00:45:29.159 Pranav Narahari: Yes, yeah, yeah.

400 00:45:29.640 00:45:35.430 Pranav Narahari: And is it answering your question versus giving you incorrect information? Yeah.

401 00:45:37.490 00:45:40.220 Pranav Narahari: Because, like you said, like, for Adam, you may give…

402 00:45:40.730 00:45:46.239 Pranav Narahari: it may show a certain answer, and then you’re just like, no, Adam, you’re just not clued into…

403 00:45:46.370 00:45:54.010 Pranav Narahari: The other discussions, conversations that are happening outside of just the ones that you’ve been invited to, or the documents you’ve been a part of.

404 00:45:59.430 00:46:00.800 Daniel: Yeah, okay.

405 00:46:03.120 00:46:24.750 Pranav Narahari: Okay, yeah, I know, yeah, we’re closing in on the 45 minutes. I feel pretty good about some direction here, is we need to make sure that we’re pulling in all the context to start off with, right? Like, like you said, that’s the foundation to solving all these business problems. I mean, that is a business problem in itself, right? Getting that context. But then second off.

406 00:46:24.750 00:46:31.179 Pranav Narahari: And kind of a way for us to make sure that we’re properly utilizing that context is answering those highlighted questions.

407 00:46:31.180 00:46:34.879 Pranav Narahari: I think the ones that you highlighted best capture

408 00:46:34.880 00:46:43.000 Pranav Narahari: the gap between your individual Claude instance versus what the command center should be answering.

409 00:46:43.350 00:46:47.560 Daniel: And maybe what I’ll do is I’ll take a moment here just to ask it a couple of these questions.

410 00:46:47.560 00:46:48.180 Pranav Narahari: Yes.

411 00:46:48.180 00:46:53.680 Daniel: Give you my Claude context, and then… I’m starting a new chat for each of these, too.

412 00:46:54.000 00:47:00.139 Daniel: And then… and then we can dump this question straight into MCP and ask it, what… what… what is it missing?

413 00:47:00.740 00:47:02.890 Pranav Narahari: Yeah, that would be… that would be great.

414 00:47:05.180 00:47:07.380 Daniel: Okay, so I’ll go through a couple of these.

415 00:47:07.560 00:47:13.309 Daniel: give my Claude answers, and then why don’t we just test it against that? And then really, like.

416 00:47:13.620 00:47:23.940 Daniel: I don’t know how to highlight this other than, like, I’m not doing this for fun, necessarily, so we’ll really need to reassess, like, can we accomplish this business objective in a reasonable format?

417 00:47:24.910 00:47:28.389 Daniel: And… What’s the value against that, right?

418 00:47:28.820 00:47:29.160 Pranav Narahari: Yeah.

419 00:47:29.160 00:47:31.380 Daniel: I guess what I’m trying to say there is, like.

420 00:47:31.880 00:47:37.519 Daniel: If the answer is, like, hey, what you’re looking for really, like.

421 00:47:37.910 00:47:41.549 Daniel: we need, like, a different approach to it or something. I’m okay with that answer, right?

422 00:47:42.280 00:47:51.729 Daniel: I think… I think what I’m trying to do is keep this laser-focused in on, like, the business objective at hand, which is something other than just building out fun MCP layers, right?

423 00:47:52.040 00:47:54.990 Pranav Narahari: Yes, yeah. And I think this…

424 00:47:55.070 00:48:09.529 Pranav Narahari: this, direction more about, like, what type of questions, too, is really good. All of these questions kind of do benefit from having that global access, but I think the ones that you kind of highlighted, which are in that bird’s eye view.

425 00:48:09.530 00:48:15.339 Pranav Narahari: category are where you should see the most amount of benefit with a command center, so…

426 00:48:15.340 00:48:19.649 Pranav Narahari: Yeah, just focusing on those. I’ll probably generate a few more questions surrounding that as well.

427 00:48:20.860 00:48:34.909 Pranav Narahari: And then I do feel confident that with our solution, with what we’re working on right now, this is the right approach. And then we’ll be able to confirm that next week with just the difference in answers that you’re getting with Claude versus with the Command Center.

428 00:48:38.680 00:48:39.690 Daniel: Interesting.

429 00:48:40.500 00:48:42.649 Adam P: Context on specific… huh.

430 00:48:50.930 00:48:53.130 Pranav Narahari: Okay, guys, how are we feeling about,

431 00:48:53.980 00:48:58.600 Pranav Narahari: how does that sound for, like, next week, what we’re gonna be working on? I think we’re…

432 00:48:58.600 00:49:07.580 Daniel: I’m probably gonna, in the background, play with this and just try and get a few more context questions in, and really, I think that’s… that’s… that’s the charter. If we can take…

433 00:49:07.790 00:49:21.240 Daniel: I don’t need 90 questions, right? If one of these questions, we can clearly tell, like, hey, you missed something here, and it pulls in something that I know I haven’t seen, then I’ll be like, okay, now I’m trusting we’re getting a fuller context

434 00:49:21.240 00:49:21.580 Adam P: Oh.

435 00:49:21.580 00:49:22.569 Daniel: in the review.

436 00:49:22.930 00:49:39.130 Pranav Narahari: 100%, yeah. Okay, and then that’s how I’ll… that’s how I’ll structure my report for you guys next week: first, with these Claude outputs, throw that into the command center, what is missing, and then that what-is-missing part, I’ll show in the report.

437 00:49:39.130 00:49:53.639 Daniel: Yeah, I think that’d be a pretty cool summary, because then it’s almost just, like, gap spotting there, right? Yes. And if we know it has additional context, at the very least we know, okay, the business objective could be achieved here, now it’s just fine-tuning the bot to recognize that and draw these, like, larger…

438 00:49:53.850 00:49:54.740 Daniel: Things.

439 00:49:55.170 00:49:56.110 Pranav Narahari: Exactly.

440 00:49:56.110 00:49:59.040 Daniel: Excellent! This is exactly what it should say.

441 00:50:01.520 00:50:04.929 Daniel: Because that’s literally what he calls it in the notes, in the meetings, on Slack.

442 00:50:05.190 00:50:09.680 Daniel: And it pulled that from my question of… Dude.

443 00:50:09.680 00:50:11.529 Adam P: Whatever it’s called internally, yeah.

444 00:50:11.530 00:50:13.800 Daniel: Yeah, that’s cool, okay.

445 00:50:13.990 00:50:30.009 Daniel: So this… this will be some really good context. Okay, Pranav, I’ll start dumping some of these things in here so you can get some of that gap spotting, and then… and let’s see. I mean, that’s… that’ll definitely be a go-no-go for me on, like, investing a ton of, you know, infrastructure behind this, is like, is it actually going to capture context?

446 00:50:30.230 00:50:33.090 Pranav Narahari: Yes. Yeah, yeah, that makes sense. That makes sense.

447 00:50:33.320 00:50:34.540 Pranav Narahari: Okay, guys.

448 00:50:34.810 00:50:35.420 Daniel: Thank you.

449 00:50:36.050 00:50:38.370 Pranav Narahari: Good convo. We’ll talk again next week.

450 00:50:38.620 00:50:40.020 Adam P: Awesome. Toodles, guys.

451 00:50:40.270 00:50:41.699 Pranav Narahari: Have a good weekend. See ya.