Meeting Title: Brainstorming on topic, semantic view design + dbt model design for AI traversal Date: 2026-05-08 Meeting participants: Jasmin Multani, Advait Nandakumar Menon, Uttam Kumaran, Amber Lin


WEBVTT

1 00:00:20.120 00:00:21.859 Advait Nandakumar Menon: Hey, how’s it going?

2 00:00:22.120 00:00:23.770 Jasmin Multani: Hey, how’s it going?

3 00:00:25.470 00:00:27.650 Advait Nandakumar Menon: Going good, how about you?

4 00:00:28.290 00:00:34.440 Jasmin Multani: Yeah, it’s finally Friday, I clocked in my hours for Clockify.

5 00:00:34.640 00:00:38.480 Jasmin Multani: That’s what I’m happy about.

6 00:00:39.590 00:00:40.530 Advait Nandakumar Menon: That’s nice.

7 00:00:44.690 00:00:46.669 Jasmin Multani: How were your travels last weekend?

8 00:00:47.780 00:00:53.530 Advait Nandakumar Menon: Yeah, I went… to Dayton, Ohio, it was nearby. I went for my friend’s graduation, so…

9 00:00:53.840 00:00:55.209 Jasmin Multani: Oh, that’s nice!

10 00:00:55.580 00:00:56.420 Advait Nandakumar Menon: Yeah.

11 00:00:56.910 00:00:57.580 Jasmin Multani: That’s so funny!

12 00:00:57.580 00:01:02.120 Advait Nandakumar Menon: Yeah, we spent the day, like.

13 00:01:02.410 00:01:11.669 Advait Nandakumar Menon: We were taking the pictures here and there, and then just attending the graduation. Came back home on Saturday, so it was… it was short, but it was fun.

14 00:01:11.670 00:01:13.679 Jasmin Multani: Yeah, I think it’s so worth it.

15 00:01:14.410 00:01:15.640 Advait Nandakumar Menon: Yeah, yeah.

16 00:01:15.640 00:01:21.559 Jasmin Multani: Chaotic, but, like, you rarely get to see people in your life throughout your 20s after a point.

17 00:01:22.010 00:01:25.580 Advait Nandakumar Menon: Yeah, that’s true. Very true.

18 00:01:25.580 00:01:26.100 Jasmin Multani: Yeah.

19 00:01:26.500 00:01:27.250 Advait Nandakumar Menon: Yep.

20 00:01:33.270 00:01:35.910 Advait Nandakumar Menon: Do you have any fun plans for the weekend?

21 00:01:36.470 00:01:45.400 Jasmin Multani: Do I? It’s my first weekend out of the yoga certification.

22 00:01:45.400 00:01:47.570 Advait Nandakumar Menon: Oh, yeah, how did that go?

23 00:01:47.730 00:01:53.850 Jasmin Multani: So, I’m certified, and I got to teach a class last week, and…

24 00:01:54.230 00:01:57.889 Jasmin Multani: I’m gonna be teaching 4 classes in September.

25 00:01:58.010 00:02:13.130 Jasmin Multani: And there was a point where they went over, like, the taxes and everything, and basically they were like, yeah, you can… when you’re a yoga instructor and you have a 1099, you can write off your manicures and your outfits. I’m like, oh, okay.

26 00:02:13.320 00:02:16.810 Jasmin Multani: I’m suddenly very interested in teaching now.

27 00:02:19.460 00:02:20.030 Jasmin Multani: So…

28 00:02:20.030 00:02:24.879 Advait Nandakumar Menon: like… How many people ended up attending last week?

29 00:02:24.880 00:02:30.270 Jasmin Multani: Oh my god, I think 30. It was 30 people, and then.

30 00:02:30.360 00:02:31.760 Advait Nandakumar Menon: You raised…

31 00:02:31.760 00:02:36.359 Jasmin Multani: Yeah, and all of the funds went towards a non-profit that’s.

32 00:02:36.360 00:02:38.220 Advait Nandakumar Menon: That’s… that’s good, that’s nice. Yeah.

33 00:02:38.220 00:02:43.849 Jasmin Multani: Yeah, and that nonprofit is to help immigrants fight against ICE.

34 00:02:43.960 00:02:44.730 Jasmin Multani: like, they’re, like.

35 00:02:45.140 00:02:51.990 Jasmin Multani: for their legal proceedings. We raised $600, which I think is pretty good for…

36 00:02:52.270 00:02:58.159 Jasmin Multani: Something that’s, you know, for beginner yoga teachers, I think it’s pretty good.

37 00:02:58.820 00:03:02.810 Advait Nandakumar Menon: That’s… that’s really… Who, like… that was a first class, right?

38 00:03:03.510 00:03:13.510 Jasmin Multani: That was the first class I ever taught in, like, where someone’s paying me. I’ve taught my mom and my brother, and it’s very different.

39 00:03:13.800 00:03:17.090 Jasmin Multani: And I only did, like, a 5-minute sequence. We all each…

40 00:03:17.400 00:03:21.390 Jasmin Multani: did a portion, but I haven’t taught a full class yet.

41 00:03:21.920 00:03:31.119 Jasmin Multani: So, I’ll do that in September, and then even then, like, all proceeds go towards a non-profit that we want to pick.

42 00:03:31.470 00:03:32.310 Jasmin Multani: So…

43 00:03:33.970 00:03:36.040 Jasmin Multani: That’s good, like, each person in the…

44 00:03:36.820 00:03:40.249 Jasmin Multani: Each person who’s graduated takes, like, one month to teach.

45 00:03:41.160 00:03:47.969 Jasmin Multani: In that studio, so… I think it’s a really good… Studio setup.

46 00:03:48.160 00:03:48.890 Jasmin Multani: And…

47 00:03:49.570 00:03:53.149 Jasmin Multani: They gave us each malas, which I thought was so sweet.

48 00:03:53.150 00:03:56.130 Advait Nandakumar Menon: Yeah, yeah.

49 00:03:56.970 00:04:10.600 Jasmin Multani: And I really appreciate that, like, I told my mom, they were like, my mom and my nani, they’re like, what? Like, what’s… what yoga studio is this? That’s nice. But then they’re like, okay, cool, now you’ll actually, like.

50 00:04:10.800 00:04:13.880 Jasmin Multani: be praying more often, and I’m like, yeah, yeah, we’ll see.

51 00:04:14.550 00:04:21.859 Advait Nandakumar Menon: So, are there actually any Indians who are running it, like, anyone… the studio itself, or is it…

52 00:04:22.220 00:04:31.299 Jasmin Multani: So there are two instructors… so there are two instructors… yeah, so the yoga studio owner is white, but she’s very…

53 00:04:31.770 00:04:35.360 Jasmin Multani: Like, steeped in the culture, like.

54 00:04:35.560 00:04:42.740 Jasmin Multani: Her mom, even though she was also white, like, her mom… like, this woman’s 50, so her mom…

55 00:04:43.540 00:04:50.299 Jasmin Multani: also was into yoga way back when. So they were introduced to it pretty early on, before these waves.

56 00:04:50.390 00:05:04.660 Jasmin Multani: And, you know, even when Kate, the yoga instructor, like, she lost her mom when she was 30, so then she went even deeper into yoga to feel connected to her, and she would, like, she’s just…

57 00:05:05.030 00:05:11.409 Jasmin Multani: you can tell, like, she actually has studied the roots, and she forces everyone to say the Sanskrit names.

58 00:05:11.900 00:05:18.099 Jasmin Multani: And, she, like, a lot… we read the Bhagavad Gita, and then…

59 00:05:18.600 00:05:28.840 Jasmin Multani: she, so, like, even though there… yeah, there are two Indian yoga instructors at the studio, and they taught us modules. Like, one woman…

60 00:05:28.950 00:05:35.429 Jasmin Multani: She’s really cool, she’s… She is a doctor, and she’s a yoga instructor.

61 00:05:36.360 00:05:37.419 Advait Nandakumar Menon: Oh, well.

62 00:05:37.420 00:05:38.479 Jasmin Multani: Yeah, so I’m like, wow.

63 00:05:38.820 00:05:41.690 Jasmin Multani: It’s like the ultimate… the ultimate career.

64 00:05:41.960 00:05:43.010 Advait Nandakumar Menon: Yeah, yeah.

65 00:05:43.010 00:05:49.760 Jasmin Multani: And she’s hella smart, like, she went to Columbia and Harvard, and that’s, like, very established.

66 00:05:49.760 00:05:50.630 Advait Nandakumar Menon: Yeah, yeah.

67 00:05:50.630 00:05:56.570 Jasmin Multani: And so, now that she’s in her mid-30s, she’s like, okay, I’m not gonna… I think she does,

68 00:05:56.950 00:05:58.769 Jasmin Multani: She does per diem.

69 00:05:58.970 00:05:59.630 Jasmin Multani: So…

70 00:05:59.630 00:06:00.340 Advait Nandakumar Menon: Oh, God.

71 00:06:00.800 00:06:03.980 Jasmin Multani: That’s how she has time, but she also teaches and everything.

72 00:06:04.510 00:06:06.460 Jasmin Multani: Sorry, sorry that derailed.

73 00:06:07.130 00:06:09.670 Advait Nandakumar Menon: No, it was fun to hear that. Hey, Uttam.

74 00:06:09.670 00:06:13.529 Jasmin Multani: Yeah, but there are, like, They do honor the roots.

75 00:06:13.870 00:06:14.939 Jasmin Multani: Hi, Uttam.

76 00:06:15.690 00:06:16.880 Uttam Kumaran: Hello!

77 00:06:19.540 00:06:24.060 Uttam Kumaran: Yeah, I just wanted to talk about, like, this, because I feel like a couple of us are…

78 00:06:24.330 00:06:27.779 Uttam Kumaran: Poking around topics and semantic views, and…

79 00:06:28.330 00:06:33.540 Uttam Kumaran: I don’t know, I feel like probably now a few of us have some opinionated thoughts on them, and…

80 00:06:33.730 00:06:38.139 Uttam Kumaran: I’ve had some opinionated thoughts for a long time that I feel like we can have a fun discussion on, like.

81 00:06:38.510 00:06:41.040 Uttam Kumaran: How do we construct a semantic layer?

82 00:06:41.590 00:06:46.409 Uttam Kumaran: Maybe I’ll just, like… Go first, and even, like, start with, like.

83 00:06:46.550 00:06:54.629 Uttam Kumaran: what I think the semantic layer, like, is now, because I think it is actually, like, very, very different.

84 00:06:54.830 00:06:59.400 Uttam Kumaran: It is very, very different than…

85 00:06:59.770 00:07:05.440 Uttam Kumaran: like, what it was before, so maybe I’ll even give some, like, lore on, like,

86 00:07:06.020 00:07:09.050 Uttam Kumaran: on the semantic layer, so one is,

87 00:07:09.460 00:07:16.910 Uttam Kumaran: Yeah, I’ve been… so the company I was at before Brainforge, where I was leading product, was actually, like, a semantic layer startup.

88 00:07:16.970 00:07:28.230 Uttam Kumaran: Basically, part of the goal was to build a no-code data transformation tool. So you could basically create metrics, dimensions, measures,

89 00:07:28.230 00:07:40.330 Uttam Kumaran: and tables, all through a UI. So you could basically say, I want this column, this column, and then I want to transform this column into this. Of course, you can add descriptions and things, but more of the semantic layer, like.

90 00:07:40.330 00:07:48.199 Uttam Kumaran: three, four years ago was actually just definitions, like, definitions and expressions, right? So, like, the name.

91 00:07:48.200 00:08:03.739 Uttam Kumaran: the expression of, like, this dimension is, like, a cast of this column, right? And then this measure is a sum of this, and then this metric is a sum of this measure and this measure divided by this measure, right?

92 00:08:03.740 00:08:15.690 Uttam Kumaran: So you can kind of layer it: like, you have columns as, like, the ultimate primitive, right? Tables of columns. Those ladder up into dimensions, which can have a function on top of them, but are, of course, disc… I always say:

93 00:08:15.770 00:08:31.060 Uttam Kumaran: dimensions describe measures, and then you have measures, which are aggregations, and then you have metrics, which can be, like, a combination. For example, you can have case when, the state is California, then this measure, right?

94 00:08:31.290 00:08:49.959 Uttam Kumaran: So that’s what semantic layer used to be. Description of the expression. Now, because of AI, and because AI is… was primarily, like, text-based, right? Column descriptions, table descriptions, and broader, like, semantic

95 00:08:50.000 00:08:54.149 Uttam Kumaran: understanding matters a lot more than it used to.

96 00:08:54.300 00:09:03.409 Uttam Kumaran: And so, one of the challenges I think we’re gonna see across the board is… now, previously, for a dashboard, it didn’t really matter if you had all your table descriptions.

97 00:09:03.460 00:09:22.400 Uttam Kumaran: column descriptions, like, whatever. It’s actually more important for you to have expressions right. Now, we need to have both correct, which is a different challenge. And in fact, now we have two products that this team has to support. You have to support the dashboard, which is business as usual. You have to now support text-based

98 00:09:22.440 00:09:37.680 Uttam Kumaran: AI querying of our semantic system, right? Whether they’re asking, what does this mean? Whether they’re asking, how do I find this? Whether they’re asking, generate me an answer, right? All of that

99 00:09:37.750 00:09:39.519 Uttam Kumaran: Basically, at the root of it.

100 00:09:39.920 00:09:55.869 Uttam Kumaran: you’re typing into AI, it generates, like, it uses a set of tools to either run, like, a SQL query, it pulls from a certain semantic area, it combines all that, and then gives you, basically, the next best response.

101 00:09:55.930 00:10:02.669 Uttam Kumaran: part of the reason why I’m, like, bringing this up is it kind of also impacts how we’re structuring data models. Like.

102 00:10:02.800 00:10:08.569 Uttam Kumaran: There may be actually a value to having fewer, wider tables now, because

103 00:10:08.650 00:10:26.029 Uttam Kumaran: having the AI do the joins may be more fickle, right? Like, what if we have 15 tables, and then the AI has to join all 15 to answer a question? Maybe we should have just created one fat table, right? For it to just say, like, here’s the table, you figure it out, run whatever query you need now.

104 00:10:26.280 00:10:34.390 Uttam Kumaran: Similarly, like, table descriptions really matter. Column descriptions really matter. Metric descriptions really matter.

105 00:10:34.680 00:10:52.160 Uttam Kumaran: even, like, more, like, business definitions. For example, I’m almost like, let’s throw our discovery docs in there, right? Because, like, who… like, someone… you can expect someone to be like, who runs this business domain? Why was this decision made?

106 00:10:52.250 00:11:06.370 Uttam Kumaran: Ultimately, are we on the hook for, like, the AI answering that? I would probably say yes. Should we be? I don’t think so, but we are. And this is where, ultimately, I think two things are gonna happen, and then I’ll pause, and I’m gonna get some reaction.

107 00:11:06.980 00:11:13.400 Uttam Kumaran: I think our positioning as a company is that we are the context engineering experts.

108 00:11:13.560 00:11:15.700 Uttam Kumaran: And the ownership of…

109 00:11:15.740 00:11:32.680 Uttam Kumaran: the dashboard, in addition to the ownership of the chatbot, is gonna flow to us, because we are the people in between that make the tool successful. You may be the best ChatGPT user, but if you have no access to structured context, like, you’re kind of screwed.

110 00:11:32.680 00:11:35.870 Uttam Kumaran: So, our business positioning is shifting towards that.

111 00:11:35.870 00:11:38.589 Uttam Kumaran: Now, however, a lot of

112 00:11:38.590 00:11:52.419 Uttam Kumaran: like, actually having to construct a really amazing chat experience is not very common for a classic data team. So, we need to transition, right? So, the ask for everybody here is, like.

113 00:11:53.000 00:12:07.380 Uttam Kumaran: it’s sort of twofold. One is, like, we… Robert and I need to find ways to bring people on to these projects that are actually almost like AI product managers. Because managing a chatbot experience and the UX of the chatbot is actually like a product, right?

114 00:12:07.780 00:12:23.649 Uttam Kumaran: similar to super large companies, where you almost have these, like, dashboard product managers that, like, run this, like, one dashboard, right? It’s kind of similar. Second is, like, we as analysts, AEs, DEs need to understand that we have another

115 00:12:23.770 00:12:28.710 Uttam Kumaran: surface that people are going to be using, and I think we kind of need to decide, like.

116 00:12:29.570 00:12:40.490 Uttam Kumaran: how much of that we can accomplish with our normal day-to-day style, which is, like, land stuff in Snowflake, model in dbt, make it available, or do we need, like, a completely different

117 00:12:40.680 00:12:46.169 Uttam Kumaran: parallel system that’s, like, all fat tables that, like, powers the AI, like…

118 00:12:47.020 00:13:06.900 Uttam Kumaran: So that’s kind of, like, what I wanted to open up. I haven’t been able to talk about this with anybody for a long time; now I feel like you guys understand the pain of this problem. So, hopefully that, like, wrapped this up, but if you have any things that I missed, or, like, maybe I didn’t totally grasp it, let me know, but sort of wanted to open the floor with that.

119 00:13:09.980 00:13:11.550 Advait Nandakumar Menon: Yeah, that is…

120 00:13:11.970 00:13:22.199 Advait Nandakumar Menon: really good context, Uttam, and I have to agree with you, like, how it was before and how it is now, like, definitions and context in general are really important for the AI to…

121 00:13:22.200 00:13:35.710 Advait Nandakumar Menon: give accurate results and answers, but I do want to ask, like, how much context? Like, you said maybe it’s useful to throw in the discovery docs or whatever, like, how much context is too much context for the AI? Because…

122 00:13:35.810 00:13:41.349 Advait Nandakumar Menon: after a certain limit, I shared this example with Shivani as well yesterday, like.

123 00:13:41.560 00:13:52.260 Advait Nandakumar Menon: There is the character limit as well within Blobby and Omni, like, it gets truncated after a certain limit, like, so how will we balance that out?

124 00:13:52.450 00:13:54.880 Advait Nandakumar Menon: So, do you have any ideas on that?

125 00:13:56.660 00:14:06.630 Uttam Kumaran: It’s gonna… I would say we need to do two things. We have to test, and Amber is learning this the hard way, because the documentation from these product companies sucks.

126 00:14:06.830 00:14:13.570 Uttam Kumaran: Like, they don’t… they don’t… they just, like, we released a chatbot, and we released these places where you can put free text.

127 00:14:13.950 00:14:26.709 Uttam Kumaran: It’s like, dude, like, you didn’t release any tooling for the developer to measure, to run… like, Omni doesn’t have evals, which means we can’t put in 50 questions into Omni and say, Omni, help me make sure that these get answered.

128 00:14:26.820 00:14:28.060 Uttam Kumaran: At the same time.

129 00:14:28.290 00:14:29.589 Uttam Kumaran: We have to build that.

130 00:14:29.700 00:14:30.380 Uttam Kumaran: Okay.

131 00:14:31.110 00:14:34.749 Uttam Kumaran: Like, the second point is, like, we probably have to test.

132 00:14:35.010 00:14:48.909 Uttam Kumaran: Right? We probably have to test, and, like, maybe we test our own instance, maybe we test with Element. But the third point is, ultimately, we need to tell the client, like, we need to really be forward with the client that this is, like, a super, super brand new technology, and this is where I don’t think…

133 00:14:49.130 00:14:54.849 Uttam Kumaran: I’ve done a good job with Element doing that. I think we have, but, like, we almost need to keep doing it, which is, like.

134 00:14:55.340 00:14:57.510 Uttam Kumaran: It’s not gonna work day one.

135 00:14:57.560 00:15:01.179 Uttam Kumaran: It may not even work perfectly, like, day 90, like.

136 00:15:01.210 00:15:18.810 Uttam Kumaran: this is, like, a new technology. There’s gonna be a top… for all the benefits, there’s also, like, no team on planet Earth, us included, that’s like, oh yeah, we’re 100% experts in how to construct the best Omni. There’s, like, nobody that… nobody… not even the Omni folks are able to do that. So…

137 00:15:19.650 00:15:25.830 Uttam Kumaran: that’s, like, kind of the playing ground, you know? So there is a science to it, there is a science to, like.

138 00:15:26.010 00:15:40.199 Uttam Kumaran: you know, context stuffing and how much stuff we can have. There is a science to… maybe we need to build some tooling out around Omni that, like, ultimately Omni should just build into the product, but we’re not gonna wait for them, we just need it, right?

139 00:15:40.200 00:15:51.179 Uttam Kumaran: a couple things that come to mind are, like, as part of our CI/CD, when you push a change to dbt, or you push a change to Omni, I want it to run through 50 questions.

140 00:15:51.220 00:15:56.599 Uttam Kumaran: and use, like, what’s called LLM as a judge to basically look at, like, did it answer it?

141 00:15:56.750 00:16:00.870 Uttam Kumaran: properly. Like, we have a set of standard answers for all those.

142 00:16:01.170 00:16:10.219 Uttam Kumaran: run through those answers using the Omni MCP, and then tell us whether these are off, and like, so that we know we’re not pushing changes that impact this, because

143 00:16:10.360 00:16:20.960 Uttam Kumaran: So those are some of the things, but that’s tooling that, like, this group has to kind of be like, we need this, so that I can go to, like, the platform team, and me and Advait can, like, go build that, you know?

144 00:16:22.490 00:16:26.339 Uttam Kumaran: Amber’s facing the same whack-a-mole, like, situation.

145 00:16:27.390 00:16:35.139 Amber Lin: I think from my experience, I think our team needs to now have a bit more

146 00:16:35.220 00:16:54.630 Amber Lin: knowledge around AI observability, how those things work. We’ve been very much a data team, but now we are stepping more into the AI and context engineering role, and we need to know, okay, we’re not just doing dashboards anymore, and I think we should

147 00:16:54.710 00:17:12.699 Amber Lin: as the three of us, we should educate ourselves on how those things work, and maybe work with the AI team to know how that observability is set up, because I think that will… that will be the new measurement of… of our work and how things are done. So,

148 00:17:12.819 00:17:20.120 Amber Lin: I would like us to do that. Maybe we can have sessions where we learn that together, but I think that would be really helpful.

149 00:17:20.460 00:17:25.019 Amber Lin: and going back, Utam, to your point about

150 00:17:25.410 00:17:36.229 Amber Lin: So, going back upstream to… towards how the data should be structured. Right now, I am experimenting with flat tables and…

151 00:17:36.360 00:17:40.040 Amber Lin: Having to join tables.

152 00:17:40.510 00:17:51.699 Amber Lin: I find that when it is working inside a semantic view, where we define the relationships, the joins usually work, and it doesn’t cause too much

153 00:17:51.830 00:18:01.569 Amber Lin: trouble, though I… I don’t know how that would behave with, say, 100 tables versus 20, but within a well-defined semantic view,

154 00:18:01.940 00:18:10.409 Amber Lin: it’s okay joining across different tables instead of a big, flat table. I think…

155 00:18:10.720 00:18:13.959 Amber Lin: Using a flat table might cause…

156 00:18:14.340 00:18:19.870 Amber Lin: Has the benefit of fewer joins, and you need to define fewer

157 00:18:20.520 00:18:37.569 Amber Lin: joins that don’t exist yet, it’s just already there. But it could come with, oh, the compute might be higher, it might have trouble finding columns now that it doesn’t have the table name to tell it where to look.

158 00:18:37.670 00:18:48.319 Amber Lin: So, I think on both sides, there is a problem of looking for the right thing, which is what I’m running into:

159 00:18:48.490 00:18:59.890 Amber Lin: The reason why queries fail is that it does not know what table, does not know what column, and then it joins the wrong way, or it filters for the wrong thing.

160 00:19:00.150 00:19:05.369 Amber Lin: So, I know it doesn’t answer the question how we want to do…

161 00:19:06.100 00:19:11.150 Amber Lin: dbt, but maybe a suggestion is, since we have those layers.

162 00:19:11.400 00:19:18.949 Amber Lin: And for dashboards, we use the broken-down models, the dim and fact tables.

163 00:19:19.040 00:19:32.580 Amber Lin: But it’s sourced from somewhere, so ideally, in that last transformation, maybe we don’t change what the column names are, we only change if we break it down into smaller tables, and with

164 00:19:32.790 00:19:38.400 Amber Lin: AI generation to look at the models, we can just use the level above that

165 00:19:39.270 00:19:51.020 Amber Lin: for Blobby or for questions; then we can still use essentially the same content, but just more human-friendly, for dashboarding.

166 00:19:53.020 00:19:54.410 Amber Lin: What do you think, Uttam?

167 00:19:56.410 00:20:04.299 Uttam Kumaran: Yeah, one of my things that I wish, and I don’t know, maybe, Advait, you have to tell me, is, like, I wish Omni could just push back, like,

168 00:20:04.440 00:20:24.139 Uttam Kumaran: and ask more follow-up questions, because frankly, like, if I can tell you guys what I’m seeing broadly across all of AI, it’s that this is, like, an input problem. It’s actually not as much of a, like, text-to-SQL problem. Like, text-to-SQL is solved, the models are getting better, they’ll infer their way to the answer.

169 00:20:24.260 00:20:28.239 Uttam Kumaran: But most of the time, the user isn’t giving enough information.

170 00:20:28.420 00:20:40.220 Uttam Kumaran: And I’m seeing this across the board, and so the way we’re affecting this at Brainforge is, one is we have a ton of documentation internally, so if you were to say, like.

171 00:20:40.690 00:20:45.549 Uttam Kumaran: like, what does this client do? There’s a bunch of skills and things that will help you figure that out.

172 00:20:45.740 00:20:53.189 Uttam Kumaran: But also, like, I’m gonna try to enforce more across the board that if the AI isn’t confident it can find the answer, it’s gonna push back.

173 00:20:53.680 00:20:58.110 Uttam Kumaran: You know, and so I wonder, like, if there’s ways in Omni to do that.

174 00:20:58.280 00:21:02.800 Uttam Kumaran: Or if it’s sort of like, it just goes… Yeah, go ahead.

175 00:21:03.360 00:21:09.820 Amber Lin: I’m not speaking directly in Omni, but I can speak to how I was working with the Snowflake agent.

176 00:21:10.010 00:21:21.760 Amber Lin: With agent instructions, I was able to tell it, hey, if you find two things confounding, ask these questions, and the harness worked decently well, so,

177 00:21:21.910 00:21:26.780 Amber Lin: I assume we can edit AI instructions for Blobby, right?

178 00:21:26.780 00:21:31.029 Advait Nandakumar Menon: Yeah, that’s something… similar thing I’ve done in Blobby as well, like, if it’s…

179 00:21:31.130 00:21:39.660 Advait Nandakumar Menon: confused between wholesale or retail or point of sale and sale. Like, if you set up an instruction within the AI context, then Blobby does it, but…

180 00:21:39.930 00:21:42.190 Advait Nandakumar Menon: Right out of the box, it doesn’t, yeah.

181 00:21:42.340 00:21:46.230 Jasmin Multani: Yeah, and do… I want to also ask, how much…

182 00:21:46.430 00:21:50.159 Jasmin Multani: Of this, you know, under-the-hood curation should we do?

183 00:21:50.830 00:22:01.199 Jasmin Multani: that’s invisible to the user, versus how much should we force the user to say, hey, you need to input your own priorities? Because that’s my other worry,

184 00:22:01.530 00:22:06.110 Jasmin Multani: that users are just opting into AI, but they’re not considering

185 00:22:06.700 00:22:15.469 Jasmin Multani: the under-the-hood priorities that are being made, and the decisions that are being made. Yesterday, Advait made a really good point.

186 00:22:15.650 00:22:19.469 Jasmin Multani: with Shivani saying that, hey, some of these questions end up

187 00:22:20.030 00:22:26.740 Jasmin Multani: have a… the way the AI answers can be limited based on what topic is being used.

188 00:22:26.910 00:22:35.490 Jasmin Multani: So, instead of… but, like, at the same time, instead of us configuring Omni to have a default topic.

189 00:22:35.990 00:22:47.479 Jasmin Multani: We should still be forcing the users to be mindful and opt into the topic, so that they also are self-aware of their own troubleshooting steps that they need to take.

190 00:22:48.720 00:23:01.239 Jasmin Multani: And alongside, we should have, like, a troubleshooting FAQ for each dashboard and topic that… or, yeah, each dashboard that’s used, like, each business domain, so people know how to navigate and

191 00:23:01.600 00:23:10.210 Jasmin Multani: troubleshoot on their own, but I… I think there is… a risk in making decisions

192 00:23:10.490 00:23:12.890 Jasmin Multani: That’s invisible to the user’s eye.

193 00:23:12.890 00:23:13.830 Uttam Kumaran: Yeah…

194 00:23:18.650 00:23:19.270 Jasmin Multani: That’s true.

195 00:23:19.270 00:23:21.900 Uttam Kumaran: Yeah, but I think what I’ve learned…

196 00:23:23.230 00:23:33.659 Uttam Kumaran: what I’ve learned in trying to do it at this company is, sort of, you have to assume at any moment you have two classes of users. You have someone that’s, like, using AI for the first time.

197 00:23:34.050 00:23:34.670 Jasmin Multani: And you’re.

198 00:23:34.670 00:23:36.179 Uttam Kumaran: And you also have someone that’s, like,

199 00:23:36.790 00:23:40.079 Uttam Kumaran: Like, they’re, like, the god, and they’re using the same chat window.

200 00:23:40.470 00:23:50.040 Uttam Kumaran: You know, and it’s very tough because those are completely different types of users. And, I mean,

201 00:23:50.300 00:23:55.060 Uttam Kumaran: training is always, like, a great answer, but it’s also kind of, like,

202 00:23:55.210 00:23:58.910 Uttam Kumaran: sort of always been the case. It’s like, we gotta train people, right? But…

203 00:23:59.450 00:24:01.329 Uttam Kumaran: I don’t know if we even get, like…

204 00:24:01.500 00:24:10.809 Uttam Kumaran: I don’t know if that’s gonna be possible on Element. I feel like even Shivani isn’t, like, as patient to, like, sit and even understand, like, what a topic is, like…

205 00:24:11.330 00:24:25.939 Uttam Kumaran: So my hesitancy is, like, I don’t think that future… for this client is right. At Brainforge, though, you can tell we took the opposite approach. We are going and training everybody methodically. Part of the reason is because, like,

206 00:24:26.090 00:24:35.039 Uttam Kumaran: I… there’s no one… I don’t want people to just… I don’t want people to get left behind. Part of it is, we’re gonna go sell this stuff, so we all need to know how this works.

207 00:24:35.200 00:24:39.320 Uttam Kumaran: But… this is a client where I don’t know,

208 00:24:39.610 00:24:49.320 Uttam Kumaran: If that’s gonna be possible. So, we… I would… and the directive we have from the client is to make sure that things are accurate.

209 00:24:49.670 00:24:51.700 Uttam Kumaran: Not that things are fast.

210 00:24:52.230 00:24:54.049 Uttam Kumaran: And not that things are, like…

211 00:24:54.240 00:25:00.640 Uttam Kumaran: You just get whatever answer. Like, accuracy and robustness is the directive.

212 00:25:00.820 00:25:14.660 Uttam Kumaran: Which helps us be, like, we’re going towards that, and then if Shivani pulls us other ways, we’re like, well, you made it explicit that, like, that’s our direction, to make sure that it’s always right. And so, if we know it has to always be right, these are the boundaries.

213 00:25:15.030 00:25:18.189 Uttam Kumaran: These are the boundary conditions, right?

214 00:25:18.760 00:25:31.419 Uttam Kumaran: And that’s a much better conversation that you can have with some grounding, instead of, like, I want everything, I want it to answer my, like, off-the-wall question and this really nuanced thing. Like, you’re gonna get in such a jam.

215 00:25:31.550 00:25:33.459 Uttam Kumaran: Right? And, like.

216 00:25:33.790 00:25:43.119 Uttam Kumaran: part… like, and I like comparing and contrasting our company, but we have other companies where it will be a slow, training-based thing, which is wonderful.

217 00:25:43.150 00:25:57.500 Uttam Kumaran: At our company, we’re kind of getting to the mode where, like, some people are using it for development work, some people are using it for, like, sales work, some people are using… like, so, what we’re gonna do is we’re gonna have, like, just different product services. Like.

218 00:25:57.620 00:26:07.039 Uttam Kumaran: even Omni is one chat window, how can that support all these types of questions? Some of those are visualization, some of those are, like, answer me this question, some of those are research.

219 00:26:07.510 00:26:16.260 Uttam Kumaran: this is where, frankly, what I feel like is that the chat experience across each product is, like, lazy product work.

220 00:26:16.650 00:26:19.020 Uttam Kumaran: That’s just, like, put a… throw a chat in there.

221 00:26:19.160 00:26:24.589 Uttam Kumaran: And then leave it for the developers to, like, deal with the nonsense that comes out of that. And…

222 00:26:24.770 00:26:30.709 Uttam Kumaran: like, I don’t… I don’t particularly think that this is the final form of that product, but…

223 00:26:31.380 00:26:43.239 Uttam Kumaran: this is what it is. And I think we innovated in pitching this product, and, like, I was the one that pushed to have Omni there, because I want us to always be on the tip of the spear, but…

224 00:26:43.890 00:26:49.959 Uttam Kumaran: We are also so… this is just, like, this world is changing, and the product patterns aren’t set yet.

225 00:26:50.220 00:26:55.920 Uttam Kumaran: So… we have a choice. I think for a comp… for a client like this, where

226 00:26:56.050 00:26:58.749 Uttam Kumaran: accuracy really matters. My…

227 00:26:58.990 00:27:04.619 Uttam Kumaran: My vote, and I would love to hear other people’s thoughts, is to just, like, clamp it down.

228 00:27:04.850 00:27:06.689 Uttam Kumaran: It’s to really be, like.

229 00:27:07.570 00:27:14.479 Uttam Kumaran: This is what you could do, because if we make it too open, like, we… we can’t… we can’t do.

230 00:27:14.480 00:27:14.960 Jasmin Multani: hey.

231 00:27:14.960 00:27:16.000 Uttam Kumaran: all the edge cases.

232 00:27:16.490 00:27:23.460 Uttam Kumaran: Because a SQL query for a model is a deterministic system. I can tell you that the query is going to respond in a certain way.

233 00:27:25.220 00:27:32.140 Uttam Kumaran: So, what we’re being asked to do is take an inherently, like, non-deterministic system

234 00:27:32.250 00:27:42.100 Uttam Kumaran: and, like, make it more deterministic, which is the… is really the million dollar problem in the whole industry right now. So it’s not… I don’t want to say, like, we’re gonna solve it any, like…

235 00:27:43.160 00:27:57.659 Uttam Kumaran: the couple, like, levers that Omni gives us are not really, like… yeah, I don’t know, I don’t think it’s there, and I guarantee you, most of their clients are in the same boat. Most of the data teams at those clients are in the same boat.

236 00:27:58.370 00:28:03.239 Uttam Kumaran: And so part of this is, like, maybe we should call Omni as a group and be like, yo, how do we deal with this?

237 00:28:03.950 00:28:05.459 Uttam Kumaran: Can you connect us with other…

238 00:28:05.740 00:28:12.310 Uttam Kumaran: customer teams that are dealing with this, because, yeah, I feel like… Steering it has been difficult.

239 00:28:14.030 00:28:22.430 Amber Lin: Does Omni have verified queries, or at least a way you can put in queries that will trigger again if you get similar questions?

240 00:28:23.720 00:28:28.480 Advait Nandakumar Menon: You can use sample queries within the topic to do that.

241 00:28:28.480 00:28:28.900 Amber Lin: So.

242 00:28:28.900 00:28:31.100 Advait Nandakumar Menon: That’s something I’ve said before.

243 00:28:32.170 00:28:37.270 Amber Lin: Let’s see… But then that will bulk up the context as well.

244 00:28:37.620 00:28:43.389 Advait Nandakumar Menon: Yeah, that’s… It’s… a tough thing to balance, so that’s…

245 00:28:43.520 00:28:48.260 Advait Nandakumar Menon: Like, giving all the context, giving all the information we have, versus just…

246 00:28:48.510 00:28:50.569 Advait Nandakumar Menon: Trying to not overload it, it’s…

247 00:28:51.490 00:28:55.520 Advait Nandakumar Menon: It’s something I’m having a hard time balancing, so…

248 00:29:00.840 00:29:12.390 Jasmin Multani: So the core problem I want to ask is, like, what we’re trying to solve for is what happens when we hand off this AI to Element, and they ask a bunch of questions.

249 00:29:12.530 00:29:18.460 Jasmin Multani: And they come back to us, and they say, hey, this is wrong. That’s what we’re trying to protect ourselves against, right?

250 00:29:23.810 00:29:31.570 Uttam Kumaran: Yeah, I… I, it’s almost like, how do we release a product that we’re confident in?

251 00:29:32.850 00:29:35.919 Uttam Kumaran: And, like, that’s more of my thing, cause if you’re… yeah.

252 00:29:36.510 00:29:37.520 Uttam Kumaran: Yeah.

253 00:29:39.240 00:29:42.520 Jasmin Multani: I think it’s, yeah, it’s definitely trickier, because.

254 00:29:42.520 00:29:48.379 Uttam Kumaran: So, like, maybe, Amber, can you describe how we… what we’re trying on CTA now? Because it’s basically this kind of…

255 00:29:48.680 00:29:58.699 Uttam Kumaran: same, same, but different. Like, maybe you can tell folks, like, how we’re… how we’re thinking about, okay, let’s say things are just gonna keep being wrong, but, like, how we’re trying to triage it.

256 00:29:59.030 00:30:07.620 Amber Lin: Yeah, so on CTA, so far we’ve been building the content, kind of like what we’ve been doing for Omni, but our… our…

257 00:30:07.650 00:30:14.620 Amber Lin: Next steps, the things that we’re delivering, is not the accurate content, but the tooling for the…

258 00:30:14.650 00:30:33.620 Amber Lin: for them and for us to iterate, because so far, we… the only thing that we know for sure is that things will be wrong and things will change in the way we work. So, what we’re proposing is, one, giving the clients the tools they need to

259 00:30:33.680 00:30:53.440 Amber Lin: create stuff, create semantic views, create dashboards, and two, for us to iterate on any problems that they present to us. So, making sure, one, we see their usage, two, we take their feedback and action on those tickets in a timely manner.

260 00:30:53.440 00:31:00.129 Amber Lin: So that means weekly reviews, creating tickets from them, letting them know that we’re receiving these

261 00:31:00.130 00:31:07.490 Amber Lin: these requests, and we’re working on these requests. Like, those types of flows, or…

262 00:31:08.110 00:31:24.210 Amber Lin: methods are the only thing we can guarantee, and not the actual product itself, because even if we guarantee something now, it will be different in a month, or in 3 months. So…

263 00:31:24.800 00:31:27.140 Amber Lin: We’re more building towards the…

264 00:31:28.530 00:31:37.840 Amber Lin: tooling and the infrastructure around delivering work. And I think if we can align with Element on that…

265 00:31:38.480 00:31:48.570 Amber Lin: That would be helpful, but I feel like, from my impression of Element, that might require a different philosophy of how they… how they view things. They’re very different clients.

266 00:31:48.570 00:31:49.870 Jasmin Multani: Yeah, they don’t…

267 00:31:50.220 00:31:59.050 Jasmin Multani: they don’t even want to offer us questions. Like, she… we asked her, like, because in my opinion, the more questions we

268 00:31:59.180 00:32:12.060 Jasmin Multani: throw at the AI in different voices, the better. So, like, in my best world, I get each one of us to ask questions to the AI according to different domains, like finance,

269 00:32:12.080 00:32:22.049 Jasmin Multani: sales and so forth. And then I invite, in a perfect world, Element stakeholders to also ask their top 5 questions. Because truthfully, e-commerce…

270 00:32:22.300 00:32:28.539 Jasmin Multani: it’s the same thing over and over again, same metrics over and over again. They really mainly care about.

271 00:32:28.540 00:32:31.869 Uttam Kumaran: So what again did she say she didn’t want us to do?

272 00:32:32.660 00:32:40.479 Jasmin Multani: increase the number of questions. So, like, sample questions. So, when we asked her, like, hey, so there are so many different ways to cut.

273 00:32:40.480 00:32:45.430 Uttam Kumaran: I would totally… okay, so let me give you an answer. We have to. So, whether she… like…

274 00:32:45.600 00:32:54.990 Uttam Kumaran: this is not… this is actually more of a fact… this is now a fact-based conversation. Like, the more questions we have, there’s a complete linear relationship to the accuracy.

275 00:32:54.990 00:32:55.350 Jasmin Multani: Yeah.

276 00:32:55.350 00:32:58.940 Uttam Kumaran: That’s not, like, a… that’s not even, like, a philosophy question for me.

277 00:32:58.940 00:32:59.350 Jasmin Multani: Yeah.

278 00:32:59.350 00:33:05.580 Uttam Kumaran: Like, this is, like, how, like, reinforcement learning in this system… this is how the AI even got to where it is.

279 00:33:05.580 00:33:05.940 Jasmin Multani: Yes.

280 00:33:05.940 00:33:09.110 Uttam Kumaran: So my point to you is, if she says no, let’s just do it in the background.

281 00:33:09.180 00:33:27.399 Uttam Kumaran: And so, you can have me, Robert… I will take on finance, I’ll ask 100 questions, you guys can go look through it. I will go do that for you guys. Whether she wants to do it or not is, like, I can’t… but this is also where my, like, I don’t want to go and have an argument about it. Let’s just do it.

282 00:33:27.400 00:33:34.620 Jasmin Multani: No, there’s no argument. Like, the second she said that, I was like, okay, we’re gonna… we’re gonna crowdsource it internally. She was fine with it.

283 00:33:35.020 00:33:36.869 Jasmin Multani: Like, in a perfect world.

284 00:33:37.060 00:33:39.179 Jasmin Multani: The way we ask questions.

285 00:33:39.180 00:33:40.240 Uttam Kumaran: No, you’re gonna find it’s

286 00:33:40.560 00:33:53.019 Uttam Kumaran: a complete linear relationship to the number of questions. So one takeaway here, maybe for this group, is, like, how can we get ahead of that, maybe across all clients? Like, Amber, I didn’t bring you in with a super concise, but…

287 00:33:53.320 00:34:09.369 Uttam Kumaran: I have, like, 100 transcripts, and I’m sure you could be, like, generate me, like, 100 questions. Like, same with Element. We’ve done discovery meetings with every single stakeholder. Maybe you can say, like, give it all the context of Element, generate me those questions.

288 00:34:09.370 00:34:09.850 Jasmin Multani: Okay.

289 00:34:09.850 00:34:17.519 Uttam Kumaran: I can… I will… I’m also more than happy to go in and be like, I am head of sales here, here’s, like, easy, medium, hard questions.

290 00:34:17.540 00:34:33.989 Uttam Kumaran: And I think broadly, what this group will learn a little bit about is, like, what’s called… in AI right now, it’s called, like, just building, like, evals, which is, like, evaluation data sets. And I think that’s something that I can get this group… this is just gonna be… have to be a new…

291 00:34:34.820 00:34:51.370 Uttam Kumaran: QA step, like, we would do a VQA on a dashboard, or we have tests, like, this is just maybe something we have to do to support this product as a strategy team, is, hey, anytime we come in to support a chat-based query product, we have to generate a minimum of, like.

292 00:34:51.550 00:34:55.739 Uttam Kumaran: 100 questions, minimum of 25 questions per domain.

293 00:34:56.150 00:35:04.140 Uttam Kumaran: categorize it to easy, medium, hard, all with the actual answers next to them. And then I’ll show you guys how we can build a little

294 00:35:04.200 00:35:13.189 Uttam Kumaran: like, eval engine, evaluation engine, and I can send some reading after this, because we do this on the AI team a lot. Basically, what happens is, like,

295 00:35:13.190 00:35:25.539 Uttam Kumaran: It’s fairly simple. You ask AI a question, it gives you an answer. There’s a couple of ways of doing evaluation. You may have known… you may have heard of, like, there’s, like, distance functions and semantic distance functions.

296 00:35:25.930 00:35:41.050 Uttam Kumaran: So there’s a lot of those, but there’s also, like, LLM as a judge, which is where another AI looks at the question and answer and is like, how far off is this? So we will actually run several evaluators, both, like, if it’s a number, if the number has to be right, let’s just ask for a number. But, like,

297 00:35:41.330 00:35:56.029 Uttam Kumaran: if we’re asking something that’s more of a sentence, okay, there’s different things, so we can use several different evaluators. We can even… and this is something that maybe we should… the platform team should work on just building an integration for you guys into Omni that allows you to do this.

298 00:35:56.270 00:36:04.820 Uttam Kumaran: I don’t know, it could be something, like, duct-taped together with a spreadsheet, and then we move to something that lives in GitHub. Like, you know, you could envision, like,

299 00:36:04.960 00:36:13.920 Uttam Kumaran: there’s a YAML file with all of these questions, the difficulty, the domain, and, like, the right answer, and, like, what evaluators.

300 00:36:13.970 00:36:32.630 Uttam Kumaran: And maybe, like, that job just runs, like, every day, and then runs on PR, and says, like, we’re, like, 50% there, 70% there, and then that gives you something you can report on in terms of accuracy, not just, like, oh, it was inaccurate today, it sucks. Like, that’s horrible, you know?

301 00:36:32.690 00:36:44.329 Uttam Kumaran: And so maybe it gives you, actually, like, an artifact to say, well, we have a structured approach towards measuring accuracy. It’s an eval-based approach. If you Google eval-based approach, it is the number one approach.

302 00:36:44.540 00:36:57.390 Uttam Kumaran: So that solves that. And you literally can say, we’re gonna make sure that every week we report on, now we’re hitting 50% accuracy, 60%, 70%, right? You almost set the benchmark for yourselves.

303 00:36:57.800 00:36:59.570 Uttam Kumaran: Maybe that is, like, a good…

304 00:37:00.450 00:37:04.540 Uttam Kumaran: way to do this, and I think Amber, probably also a good way to do this for CTA.

305 00:37:06.910 00:37:07.819 Uttam Kumaran: You know…

306 00:37:08.450 00:37:21.869 Amber Lin: Yeah, I would really like that to happen. I would… if the platform’s gonna… platform team’s gonna build something for us, I would like us to be in a working session with them and learn how that’s done. I think this is gonna be…

307 00:37:22.410 00:37:30.210 Amber Lin: a huge part of our… our work moving forward. Like, I think we… our team needs to learn this, and I would love to learn this.

308 00:37:30.210 00:37:43.749 Amber Lin: I’ve done some sort of ad hoc thing for CTA, but I would like to learn this on a more production scale, and how to integrate this with systems. So, Advait, Jasmin, if you guys are interested, I would love to do this together.

309 00:37:44.440 00:37:45.040 Advait Nandakumar Menon: Yep.

310 00:37:45.510 00:38:01.249 Amber Lin: And this will be a tool that our team will use down the road. I’m more so thinking of, okay, we have a new client or a new product that we’re building. Any product we build on top of our data set that relates to AI will need

311 00:38:01.490 00:38:10.580 Amber Lin: We’ll need an eval dataset, or an eval workflow. This should be our standard

312 00:38:10.580 00:38:24.529 Amber Lin: process, for us to measure our work, and for us to know how things… how things should be done. So, ideally, we have some sort of playbooks around that, some sort of reusable workflows.

313 00:38:25.620 00:38:43.079 Amber Lin: or just basic stuff that we can run using a CLI, even before integrating it into anything; at least a script to say, hey, here are the questions, run this at the CLI, give me… give me a… give me a result. So, we can even start there, and then we can build that into a tool later.

314 00:38:45.560 00:38:54.030 Jasmin Multani: Yes, that sounds good. I like the brainstorming, and previously, in flat hierarchies, what I’ve seen is

315 00:38:54.310 00:38:57.969 Jasmin Multani: Different players, who are all excellent players.

316 00:38:58.110 00:39:00.310 Jasmin Multani: Are kind of doing the same thing.

317 00:39:00.360 00:39:16.939 Jasmin Multani: But the intent is not to recreate and dominate each other. The idea is saying, like, hey, Amber is doing something for CTA, given their context. She’s creating this eval. I’m doing something for Element. It’s a similar product, but I’m adding my context in.

318 00:39:16.940 00:39:28.430 Jasmin Multani: So we both come to the table and we compare, which is very different from, like, how work cultures have existed in the past so many years. It’s always about, like, dominating and taking each.

319 00:39:28.430 00:39:32.869 Uttam Kumaran: Especially in consulting, it’s like, it’s really bad. It’s never like this.

320 00:39:33.090 00:39:38.979 Uttam Kumaran: It’s like, always, like, that team needs to, like, die so that I can… it’s like, what?

321 00:39:38.980 00:39:54.779 Jasmin Multani: Yeah, it’s… it’s effing weird, but the reality is, like, in my favorite teams that I’ve worked on, we each would have different experiences, different voices, different lenses to view the same thing, and on top of that, we’re working with different clients.

322 00:39:54.780 00:40:04.709 Jasmin Multani: e-commerce versus CTA, so I’d say we set up a workshop, and we say, hey, this is what my evaluation

323 00:40:05.130 00:40:18.570 Jasmin Multani: service is looking like, this is what I built, these are the questions I’ve asked it, and then we each, like, stress test it, and brainstorm and say, hey, have you thought about this? Have you thought about this context? Like, given what I know about the industry.

324 00:40:18.570 00:40:33.969 Jasmin Multani: this is an issue, this is the benchmark. So that’s what I imagine. I’m just worried… I want to take it… I think so far, we’re really good about structuring out certifications and having a high bar there. Now it’s a matter of hitting play.

325 00:40:33.970 00:40:45.059 Jasmin Multani: And how do we resolve things when things are being built? And how do we brainstorm when things are being built? That’s where I’d like to see our workshopping go towards.

326 00:40:45.480 00:40:46.180 Uttam Kumaran: Yeah.

327 00:40:46.350 00:40:53.479 Uttam Kumaran: Yeah, so this is where, like, I kind of want you guys, for this, like, let’s say this tool, you guys are, like, the product managers for this.

328 00:40:53.520 00:41:07.629 Uttam Kumaran: So, what I’ll do is I’m gonna… I’ll try to just craft, like, a little bit of a story of, like, what we’re trying to get built, but ultimately, like, the platform group is gonna build this for y’all. This is something that we’re not gonna… it first may just be for Omni, but…

329 00:41:07.850 00:41:15.839 Uttam Kumaran: as, like, for me, leading platform, I’m like, I want this to support any text-to-SQL situation that we end up in, right?

330 00:41:15.840 00:41:27.979 Uttam Kumaran: And so what I can do is, like, maybe just do a first draft of the plan, like, show you guys even how I, like, got to that plan, and then have you guys basically say, cool. And then on the platform team.

331 00:41:28.260 00:41:34.880 Uttam Kumaran: Yeah, we’re taking on as much as we can every week, so whenever we can slot it in and do it, we will, but I think…

332 00:41:35.030 00:41:40.180 Uttam Kumaran: What would be helpful, even in the short term, is, like, we gather those… start to gather those questions.

333 00:41:40.310 00:41:43.330 Uttam Kumaran: You know, anyways.

334 00:41:43.860 00:41:45.690 Amber Lin: Yeah.

335 00:41:46.380 00:42:02.209 Amber Lin: I vote for a workshop next Friday, and then before next Friday, if we can test our questions on our clients. So, I can test my own on CTA, you guys can test yours on Element. We can come with those questions, or…

336 00:42:02.210 00:42:07.079 Amber Lin: at least what we’ve learned about evals, so Uttam, if you can share some eval stuff of…

337 00:42:07.080 00:42:13.459 Amber Lin: metrics we can look at, we can experiment with those and come together next Friday, and then we can

338 00:42:13.550 00:42:19.880 Amber Lin: combine our requirements, send them to the platform team for a quick MVP.

339 00:42:21.100 00:42:24.169 Jasmin Multani: Great, yeah. Should this be, like, a running Friday cadence?

340 00:42:24.170 00:42:33.290 Amber Lin: I would love that, because there’s a lot of tools I want us to build, or that I think will be helpful for our work, because we run into the same problems.

341 00:42:34.920 00:42:39.370 Uttam Kumaran: Yeah, this is where I’m… I don’t know my… don’t count on me being on anything.

342 00:42:39.480 00:42:41.120 Uttam Kumaran: But I will be on Slack.

343 00:42:41.290 00:42:41.820 Uttam Kumaran: And I will.

344 00:42:41.820 00:42:42.240 Amber Lin: Yeah.

345 00:42:42.260 00:42:45.459 Uttam Kumaran: invite me. I would love to be there. If I can be there, I’ll be there.

346 00:42:45.460 00:42:45.870 Amber Lin: Yeah.

347 00:42:45.870 00:42:52.239 Uttam Kumaran: But ultimately, I want to show you guys one pass of this, like, come up with a product, have platform build it, loop.

348 00:42:52.490 00:42:59.399 Uttam Kumaran: And then you’ll see, kind of, how it works. Please do. Yeah, every team needs to start. Right, currently, I’ve been the primary product

349 00:42:59.670 00:43:02.019 Uttam Kumaran: Person asking that team for work.

350 00:43:02.180 00:43:15.090 Uttam Kumaran: And I want that to sort of grow. Like, different teams have different needs. That could be solved with stuff off the shelf, or we should build stuff. It’s getting very easy for us to build things. We build everything already, so… yeah.

351 00:43:15.090 00:43:15.520 Jasmin Multani: I don’t…

352 00:43:15.520 00:43:16.689 Uttam Kumaran: We could keep doing that.

353 00:43:16.910 00:43:22.690 Uttam Kumaran: Cool. And then you guys should also call Omni and give them… ask them about this, like, how they think we should solve this, too.

354 00:43:27.460 00:43:27.980 Amber Lin: Cool.

355 00:43:28.420 00:43:30.289 Uttam Kumaran: Great chat. Thanks, everyone.

356 00:43:30.290 00:43:31.850 Amber Lin: See ya, we’ll set up a recurring…

357 00:43:31.850 00:43:32.290 Advait Nandakumar Menon: Who knows?

358 00:43:32.290 00:43:34.070 Amber Lin: Friday, right? Bye, guys.

359 00:43:34.070 00:43:35.380 Advait Nandakumar Menon: Bye.