Meeting Title: AI and Snowflake Progress Check-in Date: 2026-05-06 Meeting participants: Jay Heavner, Uttam Kumaran
WEBVTT
1 00:02:35.270 ⇒ 00:02:36.240 Uttam Kumaran: Hey, Jay.
2 00:02:36.550 ⇒ 00:02:37.450 Jay Heavner: Hey, what’s going on, man?
3 00:02:37.960 ⇒ 00:02:39.689 Uttam Kumaran: Hey, how’s everything?
4 00:02:40.290 ⇒ 00:02:41.690 Jay Heavner: Good.
5 00:02:41.850 ⇒ 00:02:42.809 Jay Heavner: How about you?
6 00:02:43.780 ⇒ 00:02:49.060 Uttam Kumaran: Good, I feel like we’re making good progress on the Snowflake side of things.
7 00:02:49.280 ⇒ 00:02:53.510 Uttam Kumaran: It’s raining here in Austin.
8 00:02:53.510 ⇒ 00:02:54.260 Jay Heavner: Sure, sign in.
9 00:02:54.370 ⇒ 00:03:00.239 Uttam Kumaran: I’m gonna be hanging out with y’all next week. Amber and I will be visiting, so…
10 00:03:00.780 ⇒ 00:03:06.010 Uttam Kumaran: Excited for that, excited to do some in-person trainings and,
11 00:03:06.680 ⇒ 00:03:08.990 Uttam Kumaran: Just, like, see the crew, so, yeah.
12 00:03:12.110 ⇒ 00:03:19.810 Uttam Kumaran: How’s, how’s, like, stuff been since, like, our last conversation, on, like, the AI front and in general?
13 00:03:20.080 ⇒ 00:03:26.270 Jay Heavner: So… good question. We have a new CEO, starting May 1st.
14 00:03:26.620 ⇒ 00:03:31.510 Jay Heavner: She immediately did a strategic retreat with her direct reports.
15 00:03:31.730 ⇒ 00:03:36.509 Jay Heavner: yesterday and today, I know AI is one of their big topics of conversation.
16 00:03:36.840 ⇒ 00:03:42.580 Jay Heavner: I haven’t gotten any kind of readout or notes from that yet, so I don’t know.
17 00:03:43.010 ⇒ 00:03:43.780 Uttam Kumaran: Okay.
18 00:03:43.950 ⇒ 00:03:54.490 Jay Heavner: One potentially big win is that Catherine convinced one of our internal customers to let us…
19 00:03:55.610 ⇒ 00:04:02.839 Jay Heavner: build a system for them using AI that they were about to go to market for and spend a bunch of money
20 00:04:03.040 ⇒ 00:04:09.969 Jay Heavner: to buy a system that did this, and they were going to use this part of it, and I’m like, that…
21 00:04:09.970 ⇒ 00:04:10.320 Uttam Kumaran: Yeah.
22 00:04:10.320 ⇒ 00:04:18.969 Jay Heavner: doesn’t make any sense, and there’s a lot of manual data movement between it, and Catherine’s like, let us just build this thing for you. So…
23 00:04:19.810 ⇒ 00:04:21.769 Jay Heavner: you know, that VP is like, okay.
24 00:04:22.010 ⇒ 00:04:25.890 Jay Heavner: So, we’re kind of sprinting at that right now.
25 00:04:26.630 ⇒ 00:04:30.540 Jay Heavner: Yeah, that’s, you know… Lots, lots of interest.
26 00:04:32.830 ⇒ 00:04:38.890 Jay Heavner: and work… It’s still a little weird, like, someone reached out to me yesterday, they’re doing,
27 00:04:39.650 ⇒ 00:04:47.479 Jay Heavner: we have some indexes on the NASDAQ, and NASDAQ requires us to review those companies periodically.
28 00:04:47.880 ⇒ 00:04:48.980 Jay Heavner: And…
29 00:04:49.300 ⇒ 00:04:58.840 Jay Heavner: so they were using Claude to review those companies, but they were doing it with a prompt that’s this big and a one-shot, and I’m like, I just…
30 00:04:58.960 ⇒ 00:05:08.250 Jay Heavner: And they were burning through their session limits in no time at all, because they were trying to use the API, they had an output token size of
31 00:05:08.560 ⇒ 00:05:11.670 Jay Heavner: something massive, and they’re just burning through tokens, and I’m like.
32 00:05:11.670 ⇒ 00:05:12.070 Uttam Kumaran: Yeah.
33 00:05:12.400 ⇒ 00:05:15.630 Jay Heavner: And I’m like, I see what you guys are trying to do.
34 00:05:15.990 ⇒ 00:05:18.299 Jay Heavner: But it’s not a one-shot process.
35 00:05:18.410 ⇒ 00:05:21.819 Jay Heavner: Your token limits are wrong, like, you need to…
36 00:05:22.610 ⇒ 00:05:26.690 Jay Heavner: I think what they’re doing is they’re not… I’m like.
37 00:05:27.310 ⇒ 00:05:31.200 Jay Heavner: they should have a conversation with the AI to plan the thing, not just…
38 00:05:31.200 ⇒ 00:05:34.750 Uttam Kumaran: Yeah, they should, like, relinquish the control of how to get there.
39 00:05:34.750 ⇒ 00:05:46.210 Jay Heavner: Yes. And what they did was, someone used the artifact thing in Claude and dropped in some Node code that I think they hand-wrote, and I’m like.
40 00:05:46.350 ⇒ 00:05:53.900 Jay Heavner: You’re doing the part you should let AI do, right? Like, that’s where I think people are struggling, is…
41 00:05:54.970 ⇒ 00:06:01.229 Jay Heavner: they are still trying to bring what they perceive their historic value to the table, and I’m like.
42 00:06:01.700 ⇒ 00:06:02.860 Jay Heavner: That’s not…
43 00:06:03.120 ⇒ 00:06:09.099 Jay Heavner: your value anymore. Like, I was talking… I was talking to my team, and they’re doing some stuff in AWS, and I’m like.
44 00:06:09.440 ⇒ 00:06:18.940 Jay Heavner: let Claude do that for you. If you’ve got the AWS CLI installed, and you’ve got the right scopes, it’s gonna do that better than you can, right? Yeah.
45 00:06:19.070 ⇒ 00:06:21.149 Jay Heavner: Ride the wave, don’t fight the wave.
46 00:06:21.490 ⇒ 00:06:25.579 Jay Heavner: You know, Catherine and I were just… she was demoing,
47 00:06:26.510 ⇒ 00:06:39.589 Jay Heavner: she built some repo on top of linked intent design to do building blocks, and everyone’s asking… they have questions about building blocks. I’m like, you don’t have to understand how it works.
48 00:06:39.590 ⇒ 00:06:40.060 Uttam Kumaran: Yeah.
49 00:06:40.060 ⇒ 00:06:49.710 Jay Heavner: you just have to know what your intent is, and use it, it will understand how it works. And I think people… this is where people are really struggling.
50 00:06:50.230 ⇒ 00:06:54.819 Jay Heavner: And I’m like, you don’t need to be a master of technology, you need to be a master of intent.
51 00:06:54.960 ⇒ 00:07:01.800 Jay Heavner: And you need to know, like, what your race conditions are, and your boundary conditions are, so when it does something, you know, like.
52 00:07:02.030 ⇒ 00:07:07.479 Jay Heavner: When it doesn’t follow proper hygiene or discipline, you can call it out.
53 00:07:07.930 ⇒ 00:07:08.350 Uttam Kumaran: Yeah.
54 00:07:08.350 ⇒ 00:07:11.050 Jay Heavner: But that’s where your lane is now, right? Like…
55 00:07:11.520 ⇒ 00:07:25.669 Uttam Kumaran: And I think we all went through that journey, right? So part of it is, like, I wonder how much of one of the proposals to the, like, executive group is actually just, like, they need to do some training that’s less about, like, using Claude.
56 00:07:25.800 ⇒ 00:07:27.450 Uttam Kumaran: It’s more about, like.
57 00:07:27.640 ⇒ 00:07:38.180 Uttam Kumaran: you are moving to, like, I have a problem, and then review the plan, and then review the outputs, and really, truly, like, let it fill in the gap.
58 00:07:38.610 ⇒ 00:07:40.669 Uttam Kumaran: Right? Yep. You know?
59 00:07:40.840 ⇒ 00:07:42.710 Uttam Kumaran: Yeah.
60 00:07:43.150 ⇒ 00:07:46.950 Jay Heavner: You have to know more than it, but, like, you know, if you’re trying to do…
61 00:07:46.950 ⇒ 00:07:51.639 Uttam Kumaran: But it’s gonna make these abstract, it may make these connections, and it may also, like.
62 00:07:51.950 ⇒ 00:08:09.990 Uttam Kumaran: Yeah, if you’re gonna say, go just do this thing, it will do that, but instead you should have just started with, like, I have this problem, let’s brainstorm, let’s, like, research. You may arrive at what you thought, but guess what? It will also have, like, now somewhat of the same understanding you have about the surface of a problem.
63 00:08:09.990 ⇒ 00:08:10.410 Jay Heavner: Yep.
64 00:08:10.410 ⇒ 00:08:14.809 Uttam Kumaran: And then, it may also ideally, come up with things that you didn’t consider.
65 00:08:14.910 ⇒ 00:08:21.830 Uttam Kumaran: You know? And then ultimately, I think the freeing thing is now we can go… a lot of people can go do things where you didn’t have that depth.
66 00:08:22.000 ⇒ 00:08:31.740 Uttam Kumaran: Right? Like, you can go into domains where you had no depth, but previously, if you had to just come in with a plan, there’s no way. But now, if you come in with a problem, you say, it’s kind of in this area.
67 00:08:31.960 ⇒ 00:08:41.799 Uttam Kumaran: it’s gonna use best practices to get there, and then you turn into people that are, like, become multidisciplinary, sort of can 10X the things that they can work on, you know, so…
68 00:08:42.000 ⇒ 00:08:45.930 Jay Heavner: this is a conversation I had with my team. I mean, you know, as a developer.
69 00:08:46.270 ⇒ 00:08:50.240 Jay Heavner: You’re going to start moving more into domain-specific knowledge.
70 00:08:50.510 ⇒ 00:08:51.080 Uttam Kumaran: Yeah.
71 00:08:52.300 ⇒ 00:08:59.539 Jay Heavner: You know, the more domain-specific knowledge you have across a wider variety of things, the more effective you’ll be, because…
72 00:08:59.750 ⇒ 00:09:04.339 Jay Heavner: the specificity… we can offload a lot of that.
73 00:09:04.450 ⇒ 00:09:10.760 Jay Heavner: You have to know enough to be… the bullshit detector.
74 00:09:10.760 ⇒ 00:09:11.270 Uttam Kumaran: Yes.
75 00:09:11.270 ⇒ 00:09:18.219 Jay Heavner: Something like that, right? Like, that’s where the value is going to be, in competency with human-in-the-loop kind of judgment.
76 00:09:18.570 ⇒ 00:09:26.970 Jay Heavner: You know, there’s this weird thing here where no one really… Understands what an agent is.
77 00:09:27.250 ⇒ 00:09:27.580 Uttam Kumaran: Yeah.
78 00:09:27.580 ⇒ 00:09:32.029 Jay Heavner: Everyone talks about agentic, and people think that, you know, when they’re in
79 00:09:33.080 ⇒ 00:09:44.219 Jay Heavner: the web version of ChatGPT, or… Yes. I’m running an agent. I’m like, no, no, and really, at the definition level, an agent’s a pretty simple thing.
80 00:09:44.220 ⇒ 00:09:44.850 Uttam Kumaran: Yes.
81 00:09:45.100 ⇒ 00:09:50.210 Jay Heavner: And you don’t need to be creating agents, probably. You should probably let, you know.
82 00:09:51.000 ⇒ 00:09:58.679 Jay Heavner: let us create those agents, and then you use those agents, but honestly, those… that agent use will probably be transparent to you.
83 00:09:58.680 ⇒ 00:09:59.200 Uttam Kumaran: Yes.
84 00:09:59.200 ⇒ 00:10:00.740 Jay Heavner: unless we’ve not done it well, right?
85 00:10:00.740 ⇒ 00:10:02.959 Uttam Kumaran: Yeah, yeah. But it’s even, again, like…
86 00:10:03.300 ⇒ 00:10:08.099 Uttam Kumaran: Don’t worry, just come with the prompt, like, don’t even worry about whether it’s an agent or…
87 00:10:08.400 ⇒ 00:10:20.899 Uttam Kumaran: a CLI, or… you know, ultimately, you want the system to invoke the right integration to solve your problem in the most secure way, configured by the CTA, sort of.
88 00:10:21.090 ⇒ 00:10:34.270 Uttam Kumaran: tech team, right? And more of actually what it is, is, like, just come with, like, an open mind and a problem, and here’s how to frame the plan, you know? Versus, like, do this thing.
89 00:10:34.480 ⇒ 00:10:47.079 Uttam Kumaran: It’s like, dude, that’s just, like… and so I don’t know, I wonder… what do you think is, like, the solve there? Because I think in our company, it was, like, first phase is just get people to try everything.
90 00:10:47.080 ⇒ 00:10:47.870 Jay Heavner: Yep, yep, yep.
91 00:10:47.870 ⇒ 00:10:57.039 Uttam Kumaran: Now, then it’s like, okay, wrangle everyone back to, like, here’s the fundamentals… now that you’ve seen that water is wet, which I couldn’t have sort of shown you.
92 00:10:57.190 ⇒ 00:11:09.819 Uttam Kumaran: Like, without you touching it. Now, I can come back and be like, great, here’s, like, how to be effective, and then you have some comparison point, right? Like, maybe that’s… that’s more of, like, what…
93 00:11:10.010 ⇒ 00:11:14.039 Uttam Kumaran: That’s… and that’s a… that’s, like, a commitment to, like, quite a bit of work to just, like.
94 00:11:14.330 ⇒ 00:11:17.100 Uttam Kumaran: almost do that development, but I think that…
95 00:11:17.650 ⇒ 00:11:21.680 Uttam Kumaran: It’s gonna scale, because otherwise people are just gonna build these bad habits, you know?
96 00:11:21.680 ⇒ 00:11:28.399 Jay Heavner: I totally agree. And so, one of my questions with Catherine yesterday is, you know, the market research team that built their artifact.
97 00:11:28.960 ⇒ 00:11:31.220 Jay Heavner: That isn’t going to be effective for them.
98 00:11:31.450 ⇒ 00:11:38.440 Jay Heavner: And I’m like, you know, I agree. They’ve learned that water is wet, and now I need to level them up. I need to upskill them.
99 00:11:38.790 ⇒ 00:11:48.779 Jay Heavner: Yeah. How do I do that in a way that doesn’t, like, doesn’t feel like I’m killing puppies, right? Like, they’ve worked hard, they’ve done this work, they’re very proud of it, and now I’m like, alright, that’s great.
100 00:11:48.970 ⇒ 00:11:50.160 Jay Heavner: Now, let’s…
101 00:11:51.290 ⇒ 00:12:00.999 Jay Heavner: let’s tear it down, and let’s level up on the next iteration, because it, you know, I think the thing that people need to realize, and this is a new kind of paradigm, is
102 00:12:01.380 ⇒ 00:12:09.730 Jay Heavner: Execution is much cheaper than it needs to be, so there’s no harm in reiterating, there’s no harm in running…
103 00:12:09.730 ⇒ 00:12:10.260 Uttam Kumaran: Five parallel…
104 00:12:10.260 ⇒ 00:12:12.440 Jay Heavner: parallel branches and seeing which one works the best.
105 00:12:12.440 ⇒ 00:12:12.910 Uttam Kumaran: Yeah.
106 00:12:12.910 ⇒ 00:12:17.729 Jay Heavner: Like, don’t treat the first one as the finished product.
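The "parallel branches" idea Jay raises, running several attempts at once and keeping the best rather than polishing the first, can be sketched in a few lines. `generate` and `score` below are invented stand-ins: a real version would call a model for each variant and score candidates with tests, an eval set, or a judge model.

```python
# Sketch: when execution is cheap, run N variants concurrently and
# select the best-scoring one instead of treating the first attempt
# as the finished product.

from concurrent.futures import ThreadPoolExecutor

def generate(variant: int) -> str:
    # Stand-in for a model call trying approach number `variant`.
    return f"solution from approach {variant}"

def score(solution: str) -> float:
    # Stand-in scorer; real scoring might run tests or an eval set.
    return float(len(solution))

def best_of(n: int) -> str:
    # Fan out the variants in parallel, then keep the top scorer.
    with ThreadPoolExecutor(max_workers=n) as pool:
        candidates = list(pool.map(generate, range(n)))
    return max(candidates, key=score)
```

The useful property is that each branch is disposable: what you learn from the losing candidates feeds the next iteration.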
107 00:12:18.080 ⇒ 00:12:26.769 Jay Heavner: treat it as, alright, what did I learn from that one? What worked? What didn’t? And then reiterate. So, I’m gonna meet with them and talk about, alright, so you did this.
108 00:12:27.370 ⇒ 00:12:32.059 Jay Heavner: now let’s take what you know, and let’s just go have a conversation. Like, let’s open up…
109 00:12:32.790 ⇒ 00:12:41.930 Jay Heavner: And for web research, I might point them back to ChatGPT and not Claude. I think it’s a little stronger natively. But, like, let’s go talk.
110 00:12:42.290 ⇒ 00:12:45.159 Jay Heavner: whatever one you want to use, we can do codex, we can do…
111 00:12:45.160 ⇒ 00:12:45.980 Uttam Kumaran: Exactly.
112 00:12:45.980 ⇒ 00:12:59.120 Jay Heavner: Claude, I mean, and frankly, if you wait a week, it might pivot again, and you should be prepared to pivot as it pivots. But let’s go have a conversation with the problem you’re trying to solve, and let’s talk to it about the…
113 00:12:59.250 ⇒ 00:13:02.489 Jay Heavner: The problems with a one-shot versus a pipeline, and, you know…
114 00:13:02.490 ⇒ 00:13:03.030 Uttam Kumaran: Yes.
115 00:13:03.030 ⇒ 00:13:07.180 Jay Heavner: And let them… Kind of guide them through that journey.
116 00:13:07.350 ⇒ 00:13:12.400 Jay Heavner: And my hope is at some point, people become… their own guides, right?
117 00:13:12.400 ⇒ 00:13:13.130 Uttam Kumaran: Yes.
118 00:13:13.130 ⇒ 00:13:15.849 Jay Heavner: That’s the part that… That is the part…
119 00:13:15.850 ⇒ 00:13:19.260 Uttam Kumaran: I found… oh, I found this skill, I think we should adopt it, or like…
120 00:13:19.260 ⇒ 00:13:19.800 Jay Heavner: Yeah.
121 00:13:19.800 ⇒ 00:13:32.290 Uttam Kumaran: You know, I realized that, like, okay, if you work on a great plan, and then you hand it off, like, yeah, you want them to sort of start to self-optimize their workflow, but if they continue down the path of, like, one-shotting, there’s, like, no…
122 00:13:32.290 ⇒ 00:13:40.679 Uttam Kumaran: there’s, like, no way to get… like, I don’t… I don’t know… I don’t think there’s a way to get there, and you almost have to be comfortable, sort of, like, relinquishing
123 00:13:41.110 ⇒ 00:13:56.119 Uttam Kumaran: the every single detail, but being actually very opinionated about the input, and checking, like, that the thing delivered, and then, like, yeah, like, reviewing the plan, like, that’s actually where you need to shift your time, and I don’t think that…
124 00:13:56.120 ⇒ 00:14:02.060 Uttam Kumaran: Like, that’s not how the product sort of trains you. It doesn’t… doesn’t have any input validation to be like.
125 00:14:02.300 ⇒ 00:14:04.980 Uttam Kumaran: Actually, you need to give me more information, right?
126 00:14:04.980 ⇒ 00:14:16.150 Jay Heavner: Well, that’s the fundamental problem with the models, right? They want to please you. Yeah. And I was telling… just in the meeting we just had, I’m like, you know, these models want to please you, they’re not highly opinionated. If you don’t tell them.
127 00:14:16.180 ⇒ 00:14:27.269 Jay Heavner: they’re not going to say, hey, there’s a gap here. They’re gonna fill that gap in for you. Yeah. And they might not fill it the way you want, and it might not even be transparent the way they filled it in, so…
128 00:14:27.390 ⇒ 00:14:35.709 Jay Heavner: Be opinionated in the places where you need to be highly opinionated, and let it know that, and then understand where you’re okay letting the model be like.
129 00:14:35.950 ⇒ 00:14:41.309 Jay Heavner: This is a decision you can make. I don’t care. This isn’t core to my business.
130 00:14:41.500 ⇒ 00:14:50.470 Jay Heavner: But over here, you know, I do care about this. I need this data read specifically from this API endpoint. You need to know that, otherwise…
131 00:14:50.470 ⇒ 00:14:51.080 Uttam Kumaran: Yeah.
132 00:14:51.080 ⇒ 00:14:52.779 Jay Heavner: What is it fabricating, right?
133 00:14:52.780 ⇒ 00:14:54.190 Uttam Kumaran: Yes, yes.
134 00:14:54.190 ⇒ 00:15:00.890 Jay Heavner: So, I think that, like, a lot of our work internally is going to be…
135 00:15:01.690 ⇒ 00:15:11.670 Jay Heavner: How do we build the necessary architecture and infrastructure, in a just-in-time kind of fashion, that lets people
136 00:15:12.520 ⇒ 00:15:17.250 Jay Heavner: like, work back to intent, right? Like, they don’t need to build agents and skills.
137 00:15:17.340 ⇒ 00:15:33.020 Jay Heavner: I mean, they may have to build skills for specific business lines. Like, I’m working with finance, I don’t have financial skills. You know, I can obviously ask Claude to build me a finance skill, but it won’t know our internal secret sauce, so that’s where… that’s where I need the…
138 00:15:33.440 ⇒ 00:15:37.550 Jay Heavner: the SMEs to step up and say, alright, SMEs, your job.
139 00:15:37.550 ⇒ 00:15:38.080 Uttam Kumaran: Yes.
140 00:15:38.080 ⇒ 00:15:41.880 Jay Heavner: Is to articulate and build these skills really well.
141 00:15:42.150 ⇒ 00:15:45.480 Jay Heavner: And then we’ll build agents that use those skills.
142 00:15:45.690 ⇒ 00:15:50.779 Jay Heavner: And then we’ll build, you know, invocation patterns that know when to use those things, and now…
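The layering Jay sketches, SMEs authoring skills that capture the internal secret sauce, with an invocation layer deciding when to use them, could look something like this. Every name here is hypothetical (the `Skill` fields, the registry entry, the trigger matching); it illustrates the division of labor, not any real system.

```python
# Sketch: SMEs own named skills (domain instructions), and a crude
# invocation pattern routes an incoming request to the right one.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Skill:
    name: str
    owner: str                  # the SME responsible for keeping it accurate
    instructions: str           # domain guidance the model follows when invoked
    triggers: tuple[str, ...]   # phrases that should route to this skill

REGISTRY = [
    Skill("quarterly-close", "finance-sme",
          "Use the internal close checklist; do not restate policy from memory.",
          ("close", "reconcile", "quarter")),
]

def route(request: str) -> Optional[Skill]:
    """Invocation pattern: return the first skill whose trigger appears."""
    text = request.lower()
    return next((s for s in REGISTRY
                 if any(t in text for t in s.triggers)), None)
```

In this shape the SME's job is exactly what Jay says: articulate the `instructions` well; the routing and agent plumbing sit on top and stay transparent to the end user.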
143 00:15:51.440 ⇒ 00:15:53.929 Jay Heavner: I still think that you…
144 00:15:54.970 ⇒ 00:15:59.379 Jay Heavner: I still think the big missing part is it will never say…
145 00:16:00.130 ⇒ 00:16:01.949 Jay Heavner: the assumptions it made. It’ll never say.
146 00:16:01.950 ⇒ 00:16:10.719 Uttam Kumaran: Unless you’re telling it, like, explain those assumptions… you almost have to… you have to really baby it and treat it like an intern, is often the thing, yeah.
147 00:16:10.720 ⇒ 00:16:25.689 Jay Heavner: And 4.7… I’ve noticed that 4.7, if you ask it, will do that better than other ones, but you really have to prompt it, like, alright, you know, do this, but then also, what’s your gap analysis on this? What’s, you know.
148 00:16:25.880 ⇒ 00:16:31.740 Jay Heavner: give it a very specific prompt to force it to say, alright, these are holes I just… blew through.
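One way to operationalize the "force it to state its holes" prompting Jay describes is a canned follow-up turn appended after every task. The wording below is illustrative, not a prompt from the meeting.

```python
# Sketch: pair any task prompt with a forced self-audit turn, since
# models won't volunteer their assumptions unprompted.

GAP_ANALYSIS_PROMPT = """\
Before we continue, audit your previous answer:
1. List every assumption you made that I did not state.
2. List every gap you filled in on your own, and how you filled it.
3. Flag anything where a different assumption would change the result.
Do not defend the answer; just enumerate the holes."""

def with_gap_analysis(task_prompt: str) -> list[str]:
    """Return the task plus the self-audit as two successive user turns."""
    return [task_prompt, GAP_ANALYSIS_PROMPT]
```

The audit turn costs little and turns silent gap-filling into something a reviewer can actually check.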
149 00:16:32.160 ⇒ 00:16:38.659 Jay Heavner: But yeah, I mean… My biggest concern is people will get,
150 00:16:38.820 ⇒ 00:16:44.330 Jay Heavner: you know, you start here, you get to here, and I’m like, this is good enough. I’m like, no, we need to be to here, right?
151 00:16:45.070 ⇒ 00:16:46.250 Jay Heavner: the old…
152 00:16:46.510 ⇒ 00:17:03.349 Jay Heavner: how many times have you gone to a meeting, and you’ve shown them a prototype or MVP? They’re like, we can launch tomorrow! I’m like, no, we can’t. You know what I mean? That’s always been the problem: you show someone a proof of concept, and they’re like, launch it! And I’m like.
153 00:17:03.650 ⇒ 00:17:04.369 Jay Heavner: No.
154 00:17:04.619 ⇒ 00:17:08.610 Jay Heavner: it’s a proof of concept. It is, you know, it looks.
155 00:17:08.619 ⇒ 00:17:12.689 Uttam Kumaran: No, that’s exactly… it’s a proof of a concept.
156 00:17:12.690 ⇒ 00:17:19.539 Jay Heavner: It’s got some happy path in there, we’ve accounted for no edge or error cases, it does not work. It is not a robust thing.
157 00:17:19.760 ⇒ 00:17:23.360 Jay Heavner: And I’m nervous that people will get to
158 00:17:23.900 ⇒ 00:17:30.400 Jay Heavner: 80% of a happy path, like, this is good enough, let’s stop here. And we have to have the discipline to say, no.
159 00:17:32.450 ⇒ 00:17:38.789 Jay Heavner: Catherine had this… I don’t know if this is her idea or not, but she’s like, you know, give everybody a sandbox.
160 00:17:39.150 ⇒ 00:17:43.349 Jay Heavner: play. Like, what if our project pipeline becomes
161 00:17:43.590 ⇒ 00:17:50.149 Jay Heavner: Reviewing things in a sandbox rather than a traditional project pipeline where someone’s coming in with their idea.
162 00:17:50.540 ⇒ 00:17:56.210 Jay Heavner: That might be interesting. I don’t know if we’ll get to that point or not, but
163 00:17:56.870 ⇒ 00:17:59.909 Jay Heavner: You know, and there was the question of, well, who’s building apps?
164 00:18:00.100 ⇒ 00:18:07.900 Jay Heavner: And I’m like, that’s… that’s a complicated question, because for low-value… Not low value, but low…
165 00:18:08.310 ⇒ 00:18:11.530 Jay Heavner: Low-impact things that don’t touch
166 00:18:12.130 ⇒ 00:18:18.980 Jay Heavner: you know, external things, or regulated data, you know. Yeah. I’m fine with people building their own apps for that. That’s great.
167 00:18:19.190 ⇒ 00:18:23.810 Jay Heavner: The moment you hit Something important, something public-facing.
168 00:18:23.810 ⇒ 00:18:24.240 Uttam Kumaran: Yeah.
169 00:18:24.240 ⇒ 00:18:29.699 Jay Heavner: touching regulated or protected data, then no, you can’t… I’m sorry, you can’t vibecode your app for that.
170 00:18:29.700 ⇒ 00:18:31.860 Uttam Kumaran: Yes, yes. Yeah.
171 00:18:32.210 ⇒ 00:18:35.839 Uttam Kumaran: Yeah, maybe one thing I’ll even share, because we’re starting to put together, like.
172 00:18:35.990 ⇒ 00:18:40.939 Uttam Kumaran: just some slides on explaining this. I don’t know if this will even help give some visuals.
173 00:18:41.180 ⇒ 00:18:47.219 Uttam Kumaran: And… and even we could… maybe we should put together a CTA-style version of this for next week, but, like.
174 00:18:47.310 ⇒ 00:19:01.259 Uttam Kumaran: we sort of… I tried to codify a lot of this, like, okay, how do you build, like, consensus and organization around, like, what this future is gonna look like? I think, to your point, there’s gonna be a small amount of, like.
175 00:19:01.440 ⇒ 00:19:10.400 Uttam Kumaran: builders, and a lot of people that are consuming and using it to execute work. Yeah. And I think part of that is just gonna be because of qualifications.
176 00:19:10.530 ⇒ 00:19:20.769 Uttam Kumaran: I think part of that is just gonna be, like, those people are building artifacts, but they’re not necessarily, like, contributing back to, like, the global system.
177 00:19:20.770 ⇒ 00:19:22.820 Jay Heavner: I totally agree with that, yeah.
178 00:19:22.820 ⇒ 00:19:27.720 Uttam Kumaran: Yeah, and yeah, that’s actually fine, and I think to your point, we don’t actually need everybody to, like.
179 00:19:27.960 ⇒ 00:19:43.060 Uttam Kumaran: contribute to, like, a global… like, not everybody’s gonna be great at creating skills to connect to, like, SharePoint. And why should there be 100 skills that connect to SharePoint? It should be the few people, but also the problem is, like.
180 00:19:43.300 ⇒ 00:20:02.290 Uttam Kumaran: this requires, like, kind of a team, like, we need people that… we need some people on the, like, what we call, like, either the platform team, or, like, whatever, the builders, and then you need people that are able to, like, strategize and meet with those SMEs and, like, get the adoption, and then almost, like, do the POCs, and then funnel it to, like.
181 00:20:02.410 ⇒ 00:20:20.000 Uttam Kumaran: a small group of people that sort of productionalize it, right? And so this is sort of, like, one thing we talk about, which is, like, you have all these sort of things that surround, like, an agent, and really, it’s, like, all of the context management that is sort of, you know, really the issue here. And one thing that
182 00:20:20.320 ⇒ 00:20:24.699 Uttam Kumaran: What I kind of… what I wanted to share is, like,
183 00:20:24.820 ⇒ 00:20:30.100 Uttam Kumaran: we have, like… let’s see… like, this is really, I think, a good,
184 00:20:30.360 ⇒ 00:20:42.540 Uttam Kumaran: slide, which is, like, how do you… what is… how does it work with, like, sort of the stuff we’re doing in Snowflake? And I think when we kind of came in, I think our core piece is, like, okay, we’re working on landing all the data in Snowflake.
185 00:20:42.630 ⇒ 00:20:54.470 Uttam Kumaran: creating the data models, packaging it in terms of, dashboards. Now we’ve added one more layer, which is like, okay, we can enable Snowflake Cortex AI,
186 00:20:54.540 ⇒ 00:21:06.890 Uttam Kumaran: Which is helping people actually reason into, like, a decision. But ultimately, beyond just, like, the chat interface, there’s a series of apps, exactly what you said, that are partly agentic.
187 00:21:06.960 ⇒ 00:21:18.080 Uttam Kumaran: partly UI, but actually, they may allow us to churn or prevent the purchasing of other applications. Also, they may unlock new capabilities of, like, applications that
188 00:21:18.080 ⇒ 00:21:27.310 Uttam Kumaran: didn’t exist in the market, and are, like, literally CTA software, right? Like, that’s… there’s no software that exists to do certain things. And even in your industry, as you know, there’s…
189 00:21:27.310 ⇒ 00:21:31.939 Uttam Kumaran: Commonly only a few providers that do a lot of the things that you guys buy, and so you’re kind of like.
190 00:21:32.280 ⇒ 00:21:40.330 Uttam Kumaran: you’re kind of, like, shackled, right? And so, this is sort of, like, kind of like what we’re talking about, which is, like, it builds on a lot of the data foundation.
191 00:21:40.330 ⇒ 00:21:59.409 Uttam Kumaran: It actually, I think, expands it because for AI, you not only need reporting data, you may need, like, transactional data, log data, like, all sorts of stuff landed, so even, like, for example, previously, we may not have landed certain things in the warehouse because we didn’t need it on an OKR,
192 00:21:59.520 ⇒ 00:22:10.880 Uttam Kumaran: dashboard. Now, in order to support the AI systems, we actually need all that information organized, whether that’s in a relational database, whether that’s somewhere in some sort of, like.
193 00:22:10.940 ⇒ 00:22:23.109 Uttam Kumaran: telemetry, like, all these logs sitting in, like, a Datadog or something, whether that’s research, like, on-the-fly data that, like, goes and gets pulled, whether that’s through a CLI, whether that’s through an MCP, but it’s all…
194 00:22:23.180 ⇒ 00:22:37.599 Uttam Kumaran: data, so all that needs security. Like, how are you doing pass-through governance of, like, auth? Like, what are the endpoints that you’re able to grab? How is that forming a context? But ultimately, how is that driving to, like.
195 00:22:37.860 ⇒ 00:22:39.090 Uttam Kumaran: a decision.
196 00:22:41.190 ⇒ 00:22:47.350 Uttam Kumaran: So yeah, I mean, we… this deck kind of says the same thing, like, kind of 10 times that we’re sort of working on, but…
197 00:22:47.980 ⇒ 00:22:58.460 Uttam Kumaran: Yeah, we kind of talk a little bit about, like, okay, part of this is the semantic layer of all this data, part of this is now, okay, we have to do evaluations, because previously a query is, like, deterministic, right? So…
198 00:22:58.460 ⇒ 00:22:58.780 Jay Heavner: Yes.
199 00:22:59.370 ⇒ 00:23:13.509 Uttam Kumaran: But right now, like, AI decisioning over context, it’s actually, like, you have temperature, right? You don’t want… it’s not necessarily, like, this is always the answer, and so you initially need a lot more systems to govern that.
200 00:23:13.510 ⇒ 00:23:19.590 Uttam Kumaran: Like, you need, like, an eval data set. Like, you need, like, usage kind of telemetry to see
201 00:23:19.620 ⇒ 00:23:23.339 Uttam Kumaran: all of the, like, chat logs, basically, right?
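The eval-data-set-plus-telemetry idea Uttam describes amounts to scoring a prompt over repeated runs rather than trusting one sample, since temperature makes any single answer weak evidence. A toy sketch, with `model` as an invented stub that sometimes drifts off the expected answer:

```python
# Sketch: measure pass rate over repeated runs against an eval case,
# because a non-deterministic model can't be validated from one sample.

import random

def model(prompt: str) -> str:
    # Stand-in for a sampled model call; occasionally phrases the
    # answer differently, the way temperature does.
    return random.choice(["42", "42", "42", "forty-two"])

def pass_rate(prompt: str, expected: set[str], runs: int = 50) -> float:
    """Fraction of runs whose output lands in the accepted answer set."""
    hits = sum(model(prompt) in expected for _ in range(runs))
    return hits / runs

rate = pass_rate("What is 6 * 7? Answer with digits.", {"42"})
```

In practice the eval set grows from exactly the telemetry mentioned above: logged chats become regression cases, and the governing question shifts from "is this answer right?" to "how often is it right, and is that rate moving?"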
202 00:23:23.340 ⇒ 00:23:28.689 Jay Heavner: I think that’s where people struggle, is people think… people are used to determinism, right?
203 00:23:28.690 ⇒ 00:23:29.330 Uttam Kumaran: Yes.
204 00:23:29.330 ⇒ 00:23:30.779 Jay Heavner: That pattern, we all know.
205 00:23:30.930 ⇒ 00:23:39.670 Jay Heavner: And they don’t understand the non-deterministic nature. They don’t understand that asking on a Tuesday might give you a different answer than.
206 00:23:39.670 ⇒ 00:23:40.110 Uttam Kumaran: Yeah.
207 00:23:40.110 ⇒ 00:23:41.870 Jay Heavner: on a Friday, right?
208 00:23:41.870 ⇒ 00:23:50.839 Uttam Kumaran: And that you can’t… we would spend years going and isolating why that happened. You just have to build… you have to build around this, like.
209 00:23:50.840 ⇒ 00:23:51.300 Jay Heavner: Yes.
210 00:23:51.300 ⇒ 00:23:52.980 Uttam Kumaran: …
211 00:23:52.980 ⇒ 00:23:56.559 Jay Heavner: You have to account for a certain layer of
212 00:23:57.130 ⇒ 00:24:00.870 Jay Heavner: Not necessarily ambiguity, but, you know, close enough.
213 00:24:01.010 ⇒ 00:24:04.649 Jay Heavner: And, again, you have to be…
214 00:24:05.390 ⇒ 00:24:08.969 Jay Heavner: You have to understand what you’re asking for, what you’re expecting.
215 00:24:09.140 ⇒ 00:24:14.510 Jay Heavner: You have to be able to coax and coach sometimes that out,
216 00:24:14.750 ⇒ 00:24:18.639 Jay Heavner: Yeah. Now, I think, are we getting closer to…
217 00:24:20.090 ⇒ 00:24:23.130 Jay Heavner: more deterministic things. I think we are, especially…
218 00:24:23.130 ⇒ 00:24:30.599 Uttam Kumaran: I think so. Yeah, I think you have use… you may have use case-specific models. The skill… like, every step of this way is, like, actually, like.
219 00:24:30.910 ⇒ 00:24:44.810 Uttam Kumaran: narrowing its output to something predictable, right? Like, every layer of… from skills to the MC… like, everything is making it so the difference between a one-shot versus
220 00:24:45.100 ⇒ 00:24:48.459 Uttam Kumaran: Like, this is our gap, but…
221 00:24:48.650 ⇒ 00:24:58.589 Uttam Kumaran: Also, the opportunity’s kind of now. It’s like, would you wait 5 years for it all to get figured out? You know? What’s the risk-reward of, like.
222 00:24:58.910 ⇒ 00:25:07.289 Uttam Kumaran: You know, so… and you guys are very early, like, I… I think it’s amazing, especially in your industry, like, I, you know, I can’t even imagine.
223 00:25:07.290 ⇒ 00:25:18.400 Jay Heavner: Well, it’s funny because everyone feels that they’re falling behind, and I was on Reddit last night, and I saw a guy post something where a friend of his wanted to use Claude Code.
224 00:25:18.720 ⇒ 00:25:20.699 Jay Heavner: And used it for…
225 00:25:20.970 ⇒ 00:25:29.780 Jay Heavner: 30 minutes and gave up on it, because they just couldn’t wrap their head around it, and the guy was like, oh, I just assumed that everyone was using Claude Code, everyone would understand Claude Code.
226 00:25:30.070 ⇒ 00:25:34.949 Jay Heavner: But he’s like, I don’t think there are as many people out there actually using this as we want to say, because.
227 00:25:34.950 ⇒ 00:25:35.710 Uttam Kumaran: Yeah.
228 00:25:35.710 ⇒ 00:25:43.290 Jay Heavner: you know, the news cycle is the news cycle, and all that news is coming directly out of the Valley or New York City. Yes. It’s being told by journalists.
229 00:25:43.290 ⇒ 00:25:48.670 Uttam Kumaran: And on Reddit, imagine, like, you have to go to a Reddit forum on Claude Code, you’re already in the game, right?
230 00:25:48.670 ⇒ 00:25:50.849 Jay Heavner: So, you’re already… you’ve communicated.
231 00:25:50.850 ⇒ 00:25:57.899 Uttam Kumaran: Yeah, yeah. I feel the same way, but it’s weird. We’re sort of in these, like, one foot in the hot tub, one foot in the pool, kind of like…
232 00:25:59.540 ⇒ 00:26:04.259 Jay Heavner: Then you see the people who are like, man, that guy’s great. And then you stop and think.
233 00:26:04.490 ⇒ 00:26:12.630 Jay Heavner: how many of those people are out there? You know what I mean? It’s not as many as you… like, everybody who thinks they’re falling behind
234 00:26:13.210 ⇒ 00:26:15.900 Jay Heavner: Just by asking yourself the question means you’re probably ahead.
235 00:26:15.900 ⇒ 00:26:19.609 Uttam Kumaran: Yeah, you’re right, that’s a great way of putting it. Yeah, yeah, yeah, yeah.
236 00:26:19.890 ⇒ 00:26:22.300 Jay Heavner: So, what days are you in next week?
237 00:26:22.300 ⇒ 00:26:24.430 Uttam Kumaran: I’m in here… I’m in there Tuesday and Wednesday.
238 00:26:24.430 ⇒ 00:26:29.689 Jay Heavner: Tuesday and Wednesday. Cool, excellent. Well, let’s… I gotta hop to another call, but let’s… let’s pick this up again, man.
239 00:26:29.880 ⇒ 00:26:30.400 Uttam Kumaran: Okay, okay.
240 00:26:30.400 ⇒ 00:26:31.889 Jay Heavner: I’m gonna go through your deck, too, more.
241 00:26:31.890 ⇒ 00:26:34.250 Uttam Kumaran: Okay, okay, great, perfect. Awesome.
242 00:26:34.250 ⇒ 00:26:34.740 Jay Heavner: Right.