Meeting Title: Snowflake AI Integration Planning Date: 2026-04-22 Meeting participants: Jay Heavner, Uttam Kumaran
WEBVTT
1 00:00:53.330 ⇒ 00:00:54.290 Uttam Kumaran: Hey, Jay.
2 00:00:55.170 ⇒ 00:00:55.800 Jay Heavner: Hey there.
3 00:00:55.800 ⇒ 00:00:56.959 Uttam Kumaran: Hey, how are ya?
4 00:00:57.120 ⇒ 00:00:57.819 Jay Heavner: Good, how are you?
5 00:00:57.820 ⇒ 00:01:00.100 Uttam Kumaran: Good. How’s the week gone?
6 00:01:00.690 ⇒ 00:01:02.800 Jay Heavner: So far, so good. What about you?
7 00:01:03.250 ⇒ 00:01:16.200 Uttam Kumaran: Good. We’re, we’re starting to plan out, you know, using Snowflake agents, and, like, kind of exploring that. We’ve been shipping a lot of stuff around Snowflake semantic views, and, like, sort of Cortex Analyst, so…
8 00:01:16.420 ⇒ 00:01:29.519 Uttam Kumaran: It’s been good, it’s been, like, sort of, like, shining a flashlight into, like, the darkness on Snowflake, and trying to figure out, like, what’s best, but… they have a lot of cool stuff that they ship, and, like, we’re trying to see how far we can take advantage of it for y’all, so…
9 00:01:29.840 ⇒ 00:01:30.600 Jay Heavner: Very cool.
10 00:01:30.800 ⇒ 00:01:31.460 Uttam Kumaran: Yeah.
11 00:01:32.450 ⇒ 00:01:34.149 Uttam Kumaran: Yeah, go ahead.
12 00:01:34.150 ⇒ 00:01:35.069 Jay Heavner: No, go ahead.
13 00:01:35.370 ⇒ 00:01:49.679 Uttam Kumaran: Yeah, it was… I kind of wanted to probably start with just… I think when we started talking about the Snowflake agent work, Catherine mentioned that, like, you’re also considering, you know, how do we potentially use some of our data work
14 00:01:49.820 ⇒ 00:02:03.250 Uttam Kumaran: you know, potentially around, like, a shared context layer for the company, and I don’t know, would just love to hear your… she also mentioned that, there’s a May 12th meeting with the board on, you know, just, like.
15 00:02:03.490 ⇒ 00:02:06.020 Uttam Kumaran: the AI sort of initiative, and…
16 00:02:06.400 ⇒ 00:02:12.400 Uttam Kumaran: Yeah, just wanted to get your thoughts. I feel like there’s probably some overlap with our thinking, and just wanted to understand, like, yeah, if we can…
17 00:02:12.520 ⇒ 00:02:14.870 Uttam Kumaran: We can be helpful, and how we can be helpful.
18 00:02:15.470 ⇒ 00:02:20.820 Jay Heavner: So, I think the answer is yes, but I don’t know any more than
19 00:02:21.050 ⇒ 00:02:23.480 Jay Heavner: you do at this point. Okay. Okay.
20 00:02:24.340 ⇒ 00:02:36.230 Jay Heavner: There is a… something shy of a mandate, but a push from leadership to use more AI,
21 00:02:36.430 ⇒ 00:02:40.080 Jay Heavner: But what that is is not exactly clear, so we.
22 00:02:40.080 ⇒ 00:02:40.420 Uttam Kumaran: Okay.
23 00:02:40.420 ⇒ 00:02:43.340 Jay Heavner: We are in the process of sort of doing…
24 00:02:43.840 ⇒ 00:02:48.689 Jay Heavner: Like, intake on ideas and projects that people want to do.
25 00:02:50.090 ⇒ 00:02:56.629 Jay Heavner: Okay. Just to try to get them into some kind of pipeline, so we can then analyze, prioritize, figure out what we’re gonna work on.
26 00:02:56.830 ⇒ 00:02:57.200 Uttam Kumaran: Okay.
27 00:02:57.200 ⇒ 00:03:04.040 Jay Heavner: You know, when I talked to my boss about maybe we would use you guys,
28 00:03:04.250 ⇒ 00:03:08.290 Jay Heavner: We got into some of that to help out with some of the lift.
29 00:03:08.800 ⇒ 00:03:15.149 Jay Heavner: But yeah, right now, we’re still just trying to figure out what it is that they want to do, because it’s like, use AI. I’m like.
30 00:03:15.390 ⇒ 00:03:18.929 Jay Heavner: To do what? Cool. Like, we can do that. Tell me what you want to do.
31 00:03:19.040 ⇒ 00:03:28.410 Jay Heavner: they want us to use AI. I’m like, well, that’s not how it works, you know? You just don’t swing it around like a hammer, you know, you gotta have a thing for it. So…
32 00:03:30.170 ⇒ 00:03:38.130 Jay Heavner: I do think there’ll be more work with Snowflake. Like, I met with a group today, and they are doing… this pattern has come up a lot.
33 00:03:38.510 ⇒ 00:03:42.860 Jay Heavner: Where people are doing… Company research.
34 00:03:43.160 ⇒ 00:03:47.540 Jay Heavner: So, for this case, we have some NASDAQ indexes we manage.
35 00:03:47.830 ⇒ 00:03:53.080 Jay Heavner: And there are, like, 1500 companies across all these indexes. And they have to…
36 00:03:53.740 ⇒ 00:04:04.970 Jay Heavner: rebalance, or they gotta do something quarterly, every 6 months, every year, that requires them doing some analysis of these companies, which I think right now is…
37 00:04:05.940 ⇒ 00:04:07.710 Jay Heavner: Hold on a second.
38 00:04:10.930 ⇒ 00:04:12.930 Jay Heavner: Sure, but you can…
39 00:04:20.089 ⇒ 00:04:23.560 Jay Heavner: Hold on. Catherine might join us, let me send her the thing.
40 00:04:23.560 ⇒ 00:04:24.569 Uttam Kumaran: Oh, okay, great.
41 00:04:25.390 ⇒ 00:04:31.890 Jay Heavner: Come here, you… Forward, forward, forward.
42 00:04:43.000 ⇒ 00:04:45.460 Jay Heavner: Okay, just sent her.
43 00:04:46.390 ⇒ 00:04:51.570 Jay Heavner: So they do, I think, largely Google searches, or they go out and hit certain news things.
44 00:04:51.670 ⇒ 00:04:58.560 Jay Heavner: And they gather this information, and then they use it for whatever the purpose. This is like a marketing… kind of a marketing.
45 00:04:58.560 ⇒ 00:05:02.070 Uttam Kumaran: And they’re doing, they’re doing, like, the gathering pretty manually, or, like.
46 00:05:02.070 ⇒ 00:05:09.500 Jay Heavner: Yeah, and they’re trying to use AI for this, and, you know, they’re trying to use Claude to do some of this, and I’m like, I’d be…
47 00:05:10.440 ⇒ 00:05:17.529 Jay Heavner: I’d be careful with… you know, I didn’t say this directly, but like, you know, I’d be careful with vanilla Claude trying to do market research for me. It’s…
48 00:05:17.530 ⇒ 00:05:21.749 Uttam Kumaran: Yeah, and it’s gonna just take whatever’s GEO, like, high, and…
49 00:05:22.600 ⇒ 00:05:25.840 Jay Heavner: Yeah, yeah, like, you know, there are…
50 00:05:27.200 ⇒ 00:05:35.099 Jay Heavner: ways to do that, but that’s not… you know, that’s fine. You’ll get something from that. I’m like, but the more interesting thing is, I’m like, so I break it up into a couple parts where…
51 00:05:35.270 ⇒ 00:05:37.259 Jay Heavner: We are doing some type of…
52 00:05:37.820 ⇒ 00:05:47.200 Jay Heavner: ingestion of information. You know, we pick data sources we trust, we’re ingesting data from those, maybe we’re vectorizing those, we’re doing sentiment analysis, we’re doing all that kind of stuff.
53 00:05:47.610 ⇒ 00:05:51.949 Jay Heavner: And then you can use that downstream in interesting ways, but…
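[Editor's note: the ingestion flow Jay describes (trusted sources in, then vectorize, then sentiment, then downstream use) could be sketched roughly as below. This is a minimal illustration, not anyone's actual pipeline: the source names are invented, and the embedding and sentiment functions are stubs standing in for whatever a real model or a service like Snowflake Cortex would provide.]

```python
# Hypothetical sketch of a "trusted sources only" ingestion pipeline.
# Source names are illustrative; embed() and sentiment_score() are stubs.
from dataclasses import dataclass, field

TRUSTED_SOURCES = {"ft", "wsj", "pr_newswire"}  # invented allow-list

@dataclass
class Document:
    source: str
    text: str
    embedding: list = field(default_factory=list)
    sentiment: float = 0.0

def embed(text: str) -> list:
    # Stub: a real pipeline would call an embedding model here.
    return [float(len(w)) for w in text.split()]

def sentiment_score(text: str) -> float:
    # Stub: crude keyword counting standing in for a real sentiment model.
    words = text.lower().split()
    pos = sum(w in {"growth", "record", "strong"} for w in words)
    neg = sum(w in {"lawsuit", "decline", "breach"} for w in words)
    return float(pos - neg)

def ingest(raw_items: list) -> list:
    """Keep only trusted sources, then vectorize and score each document."""
    docs = []
    for source, text in raw_items:
        if source not in TRUSTED_SOURCES:
            continue  # untrusted feeds never enter the corpus
        docs.append(Document(source, text, embed(text), sentiment_score(text)))
    return docs

items = [
    ("wsj", "Record growth reported this quarter"),
    ("random_blog", "Totally unbiased listicle"),
    ("ft", "Lawsuit filed after data breach"),
]
corpus = ingest(items)
```

The point of the allow-list check coming first is exactly Jay's: untrusted material is filtered before any downstream analysis ever sees it.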
54 00:05:52.240 ⇒ 00:05:58.189 Jay Heavner: We’ve had a couple requests kind of like this, where people are like, well, we want to know what companies are doing.
55 00:05:59.040 ⇒ 00:06:03.609 Jay Heavner: you know, out in the wild, so we can pivot and do X. And I’m like…
56 00:06:03.720 ⇒ 00:06:06.650 Jay Heavner: That’s… that’s good. It’s not a…
57 00:06:08.010 ⇒ 00:06:13.950 Jay Heavner: maybe it’s becoming a better case… use case for AI, but I know a year ago it wasn’t a great AI use case.
58 00:06:14.190 ⇒ 00:06:21.510 Jay Heavner: Maybe there are better things, but, like, you know, Catherine was showing me some of the data we can buy in Snowflake. I’m like, we’d be better off to just buy data.
59 00:06:21.800 ⇒ 00:06:22.230 Uttam Kumaran: Totally.
60 00:06:22.230 ⇒ 00:06:23.490 Jay Heavner: than mine it ourselves.
61 00:06:23.490 ⇒ 00:06:30.370 Uttam Kumaran: Yeah, yeah, yeah, I agree. Or even just saying, like, look, only use PR Newswire, only use Wall Street Journal, like.
62 00:06:30.370 ⇒ 00:06:32.200 Jay Heavner: Yeah, use, use, use the FT.
63 00:06:32.200 ⇒ 00:06:33.420 Uttam Kumaran: Use these approved websites.
64 00:06:33.420 ⇒ 00:06:33.850 Jay Heavner: Yeah.
65 00:06:33.850 ⇒ 00:06:39.080 Uttam Kumaran: sources and use these approved Snowflake sources to then mine, right?
66 00:06:39.350 ⇒ 00:06:42.290 Jay Heavner: Go, go, go buy Crunchbase, go buy…
67 00:06:42.290 ⇒ 00:06:43.220 Uttam Kumaran: Yeah, exactly.
68 00:06:43.220 ⇒ 00:06:49.240 Jay Heavner: go suck it all down from the journal, from FT, from whatever.
69 00:06:49.240 ⇒ 00:06:49.830 Uttam Kumaran: Yeah.
70 00:06:49.830 ⇒ 00:06:52.560 Jay Heavner: You know, if you want to go do…
71 00:06:52.950 ⇒ 00:06:59.900 Jay Heavner: Twitter or Reddit. Okay, that’s cool, like, maybe you see what people are talking about, but you weight that very differently.
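[Editor's note: weighting Twitter or Reddit chatter differently from licensed editorial sources, as Jay suggests, amounts to a weighted aggregate over per-source signals. A minimal sketch follows; all of the weights and source names are invented for illustration.]

```python
# Illustrative source weighting: social chatter counts, but far less than
# licensed or editorial sources. Every weight here is made up.
SOURCE_WEIGHTS = {
    "ft": 1.0,
    "wsj": 1.0,
    "crunchbase": 0.8,
    "reddit": 0.2,
    "twitter": 0.1,
}

def weighted_consensus(signals: list) -> float:
    """Combine (source, score) pairs, e.g. sentiment in [-1, 1], into one
    number, discounting low-trust sources. Unknown sources get weight 0."""
    total = sum(SOURCE_WEIGHTS.get(src, 0.0) * score for src, score in signals)
    norm = sum(SOURCE_WEIGHTS.get(src, 0.0) for src, _ in signals)
    return total / norm if norm else 0.0

# Strong positive chatter on Reddit barely moves a negative editorial signal:
# (1.0 * -0.5 + 0.2 * 1.0) / (1.0 + 0.2) = -0.25
score = weighted_consensus([("ft", -0.5), ("reddit", 1.0)])
```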
72 00:06:59.900 ⇒ 00:07:00.430 Uttam Kumaran: Yes.
73 00:07:01.660 ⇒ 00:07:06.619 Jay Heavner: you know, I think there’s some interesting things we could do there, and I think, like, that feels more like a…
74 00:07:07.090 ⇒ 00:07:09.280 Jay Heavner: Probably in the world of Catherine.
75 00:07:09.590 ⇒ 00:07:13.099 Jay Heavner: I’ll speak first since she’s not here, but
76 00:07:13.650 ⇒ 00:07:23.290 Jay Heavner: And then, once we have that, then we can really do interesting things with it, right? Like, really, it’s just getting that breadth of data in to do anything with it.
77 00:07:25.750 ⇒ 00:07:36.299 Jay Heavner: you know, you tell me, what are you guys seeing in the world of, like, market research and AI? Is AI approaching a level of market research penetration that surpasses traditional market research?
78 00:07:36.840 ⇒ 00:07:42.450 Uttam Kumaran: Yeah, maybe I can actually even just, like, show you an example of,
79 00:07:42.820 ⇒ 00:07:52.590 Uttam Kumaran: something that we’re… like, I’m using this, you know, fairly often, and I think you may find it pretty interesting. So I… I’m using this skill called Last 30 Days.
80 00:07:53.990 ⇒ 00:07:54.979 Jay Heavner: Yeah, I mentioned it’s like.
81 00:07:54.980 ⇒ 00:08:06.600 Uttam Kumaran: Catherine, and we don’t have to use it directly, but I think it’s a great example of a skill that helps for, like, community consensus on a topic. A common use case for me is, like.
82 00:08:07.090 ⇒ 00:08:20.459 Uttam Kumaran: hey, I want to implement, like, an auth service for an application. Well, like, you know, you have your classic, like, we could use Firebase, what, Okta, but, like, what’s the community… what’s, like, the last 30 days? And I don’t think… no, that’s exactly…
83 00:08:21.270 ⇒ 00:08:22.470 Uttam Kumaran: What is that like?
84 00:08:22.590 ⇒ 00:08:25.319 Uttam Kumaran: What’s a conversation around Hawk and Dave?
85 00:08:25.550 ⇒ 00:08:27.400 Uttam Kumaran: Hey, I just lost your audio.
86 00:08:27.860 ⇒ 00:08:34.399 Uttam Kumaran: And it runs a series of sub-agents, it has a great scaffolding. It’ll output a report. Yeah.
87 00:08:35.900 ⇒ 00:08:38.370 Jay Heavner: Just lost your audio.
88 00:08:41.409 ⇒ 00:08:43.229 Jay Heavner: I can’t hear you anymore. Can you hear me?
89 00:08:44.720 ⇒ 00:08:45.540 Uttam Kumaran: How about now.
90 00:08:45.540 ⇒ 00:08:46.250 Jay Heavner: Yeah, there you go.
91 00:08:46.770 ⇒ 00:08:55.210 Uttam Kumaran: Yeah, no, basically, it, like, triggers a series of sub-agents that, like, all kind of attack the problem from different angles. It uses, like.
92 00:08:55.340 ⇒ 00:09:13.079 Uttam Kumaran: X, Reddit, right? It uses some of the approved sources, for which you can add your API key, or it’ll use publicly. It combines all of that and produces, like… it basically does two things. One, it injects a prompt into your current session that says, I am the expert in, like, auth.
93 00:09:13.190 ⇒ 00:09:22.790 Uttam Kumaran: And you can ask me any ques… I have access to this report, you can ask me questions about the report, or it’s basically… it just helps me be like, cool, I’m just gonna go with the community consensus the best
94 00:09:23.290 ⇒ 00:09:35.929 Uttam Kumaran: idea, you know? And I think that is a really good option for the research portion, to not only use stuff from Snowflake, use approved sources, and honestly, you could just create a
95 00:09:36.120 ⇒ 00:09:41.250 Uttam Kumaran: version of that. It doesn’t have to be… it’s not… maybe it doesn’t have 30 days, but it could just be, like, a research or a.
96 00:09:41.250 ⇒ 00:09:41.730 Jay Heavner: Yeah.
97 00:09:41.730 ⇒ 00:09:42.760 Uttam Kumaran: a researcher skill.
98 00:09:42.980 ⇒ 00:09:46.369 Uttam Kumaran: That way, those folks are not even thinking about, like.
99 00:09:46.660 ⇒ 00:09:53.519 Uttam Kumaran: whatever, they just run the market research skill. And then I think the second part is actually the asset creation.
100 00:09:53.690 ⇒ 00:09:59.740 Uttam Kumaran: Right? Is that, like… is that, like, a one-pager? Is that a deck? Like, I… I think…
101 00:09:59.930 ⇒ 00:10:02.640 Uttam Kumaran: That comes immediately next.
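[Editor's note: the "Last 30 Days"-style flow Uttam describes (fan out sub-agents per source, merge their findings into a report, then inject an "expert" prompt into the session) could be sketched as below. The sub-agent body is a stub; a real skill would query each source's API or run a model sub-agent, and every name here is hypothetical.]

```python
# Sketch of a research skill: fan out one stub sub-agent per source in
# parallel, merge the findings, and build the injected "expert" prompt.
from concurrent.futures import ThreadPoolExecutor

def sub_agent(source: str, topic: str) -> str:
    # Stub: a real sub-agent would query its source for recent discussion.
    return f"[{source}] recent consensus on {topic}"

def run_research_skill(topic: str, sources: list) -> dict:
    with ThreadPoolExecutor() as pool:
        findings = list(pool.map(lambda s: sub_agent(s, topic), sources))
    report = "\n".join(findings)
    system_prompt = (
        f"You are now an expert on {topic}. You have access to the report "
        "below and can answer questions about it.\n" + report
    )
    return {"report": report, "prompt": system_prompt}

result = run_research_skill("auth providers", ["x", "reddit", "hn"])
```

The two outputs mirror the two behaviors described in the conversation: a merged report artifact, and a prompt that turns the current session into a question-answering expert over that report.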
102 00:10:02.640 ⇒ 00:10:03.250 Jay Heavner: Yeah.
103 00:10:03.620 ⇒ 00:10:12.089 Uttam Kumaran: And it’s not all in one skill, but, like, that is… that is sort of how… so in terms of research, that’s what I’m seeing in the last, like, 60 days, is, like.
104 00:10:12.200 ⇒ 00:10:13.910 Uttam Kumaran: Game changer for us.
105 00:10:14.400 ⇒ 00:10:21.659 Jay Heavner: Yeah, I like the idea, too, of somewhat… not canned, but templated outputs from that, right? Like, I feel like we’re.
106 00:10:21.660 ⇒ 00:10:22.360 Uttam Kumaran: Yeah, exactly right.
107 00:10:22.360 ⇒ 00:10:23.340 Jay Heavner: where
108 00:10:23.440 ⇒ 00:10:31.939 Jay Heavner: what do you want? What is the artifact you want from… alright, you’ve done… you’ve done your research, your ingestion, what is the output you want from that that is going.
109 00:10:31.940 ⇒ 00:10:32.640 Uttam Kumaran: Yes.
110 00:10:32.640 ⇒ 00:10:44.080 Jay Heavner: downstream digestible for your audience and all that, but I also like the idea of… I’d much rather be using APIs or structured data to consume data than trying to use
111 00:10:44.570 ⇒ 00:10:45.730 Jay Heavner: web fetch.
112 00:10:46.120 ⇒ 00:10:46.750 Uttam Kumaran: Yes.
113 00:10:46.750 ⇒ 00:10:53.969 Jay Heavner: using WebFetch, you’re gonna run up against formatting issues, you’re gonna run up against robots.txt, you’re going to hit all.
114 00:10:53.970 ⇒ 00:10:54.510 Uttam Kumaran: Exactly.
115 00:10:54.510 ⇒ 00:11:01.889 Jay Heavner: We did a… we did a Gleanathon last year, and someone was trying to use the Fortune 500 website.
116 00:11:02.030 ⇒ 00:11:07.839 Jay Heavner: to mine it, and I’m like, you realize it can’t see that. That’s all JavaScript. It has no idea. It’s just manual.
117 00:11:07.840 ⇒ 00:11:09.870 Uttam Kumaran: It’s gonna block you immediately. Yeah.
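[Editor's note: Jay's robots.txt point is checkable up front with the standard library's `urllib.robotparser` before ever attempting a fetch. The rule set below is inlined for illustration and is not any real site's policy; the user-agent name is invented.]

```python
# Check robots.txt rules before scraping, using only the standard library.
# The rules here are an inline example, not fetched from a real site.
from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: *",
    "Disallow: /rankings/",
]
rp = RobotFileParser()
rp.parse(rules)
rp.modified()  # mark rules as freshly loaded so can_fetch() consults them

allowed = rp.can_fetch("my-research-bot", "https://example.com/rankings/fortune500")
open_page = rp.can_fetch("my-research-bot", "https://example.com/about")
```

A pipeline that consults this before each fetch fails fast and politely, instead of discovering the block mid-run the way the Fortune 500 attempt did.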
118 00:11:09.870 ⇒ 00:11:10.490 Jay Heavner: Yeah.
119 00:11:10.590 ⇒ 00:11:11.840 Jay Heavner: So…
120 00:11:11.840 ⇒ 00:11:21.889 Uttam Kumaran: That’s why, and that’s also my, kind of, like, our thinking even broadly, and, you know, our company uses a lot of skills, and ultimately, I think we’re starting to treat skills as products.
121 00:11:22.110 ⇒ 00:11:22.780 Uttam Kumaran: And…
122 00:11:23.220 ⇒ 00:11:38.469 Uttam Kumaran: we’re teaching people, like, use this skill where it may ask you for feedback, it may execute a certain thing, but I guess, like, don’t worry too much, like, assume that that is the best version of this. For example, like, we have a lot of people creating decks.
123 00:11:38.740 ⇒ 00:11:53.729 Uttam Kumaran: Originally, people, like, 4 months ago, people were like, there’s, like, 10 deck creator skills. Everybody, like, started to make one, and then I’m like, look, they’re all kind of good, but I’m just gonna take a crack at a really good one. Yeah. It does everything from PowerPoint to this, it asks you good questions, and now, just…
124 00:11:53.790 ⇒ 00:12:06.700 Uttam Kumaran: don’t work… just use this one. It has the flexibility, and then let me know if it’s, like, not working, and we’ll maintain the skill, you know? And so you just use it, get it as far as you can, and then ship the thing, you know?
125 00:12:06.970 ⇒ 00:12:07.580 Jay Heavner: Yeah.
126 00:12:07.770 ⇒ 00:12:15.130 Jay Heavner: It’s interesting you say that, because we were talking about that as sort of what internal value proposition of…
127 00:12:15.280 ⇒ 00:12:32.500 Jay Heavner: standardizing libraries around skills, agents, MCP, just like, hey, this is… these are known things that we trust, you can go off and use these. Because everyone here is talking about, oh, I need to build an agent. I’m like, I’m not sure you even know what an agent is, much less need to build one.
128 00:12:32.500 ⇒ 00:12:32.900 Uttam Kumaran: Yes.
129 00:12:33.140 ⇒ 00:12:40.069 Jay Heavner: you’re thinking about… what’s that? You’re thinking about a GPT in ChatGPT, that’s a different kind of thing.
130 00:12:40.200 ⇒ 00:12:43.619 Jay Heavner: But… hold on a second.
131 00:12:46.710 ⇒ 00:12:53.479 Jay Heavner: But, yeah, you know, and so we’ve got some now, but most of ours are just things we’ve downloaded from…
132 00:12:54.560 ⇒ 00:12:57.280 Jay Heavner: various GitHub repos on the internet, right? Like, you look at.
133 00:12:57.280 ⇒ 00:12:57.710 Uttam Kumaran: Yes.
134 00:12:57.710 ⇒ 00:13:01.180 Jay Heavner: as is. Maybe you do some tweaking,
135 00:13:01.420 ⇒ 00:13:10.129 Jay Heavner: Yeah, I’m in the basement now. I’ve got my laundry hanging behind me, I have a floor in my background, like a floor.
136 00:13:10.600 ⇒ 00:13:21.579 Jay Heavner: Here, pull a chair, I’ll just twist this a little bit. You do look more like the appropriate, gray-beard, IT director at that point. Yeah, yeah, yeah.
137 00:13:21.580 ⇒ 00:13:31.059 Jay Heavner: I just moved in and dumped everything. Oh, yeah, that tracks. I’ve put 3 things up on my wall, right? That’s what I’m doing now. I mean, can we also talk about how…
138 00:13:31.060 ⇒ 00:13:49.939 Jay Heavner: should be reported to HR immediately for what I’m about to say. Like, no woman has ever brought tools to an office, right? Like… Oh, I don’t know where those even came from, but they were in my desk upstairs. The network tester is mine, and the acid reducer is mine. The other stuff, I think I’ve just stolen at different places along the years.
139 00:13:50.020 ⇒ 00:13:55.489 Jay Heavner: Yeah. Oh, let me… let me unblur it now, so you aren’t sitting in the background like a…
140 00:13:55.610 ⇒ 00:14:03.139 Jay Heavner: A little demon. Yeah. More appropriate. I had to wear a suit this morning, and I refused to wear it longer than I had to, so I stood behind that and changed.
141 00:14:04.640 ⇒ 00:14:05.940 Jay Heavner: Just hanging in there.
142 00:14:06.170 ⇒ 00:14:14.519 Jay Heavner: Support this. We were talking about… so I’ve had two conversations this week with the market research team, and in both cases.
143 00:14:14.780 ⇒ 00:14:16.590 Jay Heavner: They’re talking about…
144 00:14:17.410 ⇒ 00:14:35.390 Jay Heavner: mining data and doing things with it. Like, today was the NASDAQ, they were a little more structured than the Tor thing. They have 1,500 companies, they are… periodically, they have to do something, like, there’s a bird. And they go out and they do some research, they find…
145 00:14:35.890 ⇒ 00:14:42.759 Jay Heavner: articles that say that they’re still in the space they’re supposed to be in, blah blah blah. But what we were kind of talking about is, like.
146 00:14:44.350 ⇒ 00:14:53.740 Jay Heavner: as an organization, we want all this company data, right? We want to be drinking from a fire hose of our creation, I think.
147 00:14:54.080 ⇒ 00:14:57.110 Jay Heavner: And… Probably better to…
148 00:14:58.070 ⇒ 00:15:05.990 Jay Heavner: buy data from Snowflake, and mine very specific data using APIs and agreements we have with those people, rather than just
149 00:15:06.560 ⇒ 00:15:12.060 Jay Heavner: Hey, Claude, why don’t you go find everything you find about CrowdStrike? Oh my god, yeah, absolutely, yes. Right, that’s just…
150 00:15:12.060 ⇒ 00:15:17.500 Uttam Kumaran: You’re gonna get listicles, you’re gonna get CrowdStrike pro stuff from CrowdStrike.
151 00:15:17.500 ⇒ 00:15:23.209 Jay Heavner: And, you know, we’re going to spend money to do this. That’s the right thing to do, right?
152 00:15:24.120 ⇒ 00:15:26.220 Jay Heavner: You know, and then we can…
153 00:15:26.630 ⇒ 00:15:30.740 Jay Heavner: Build that inventory of data, and then we can just go ask questions of that thing.
154 00:15:31.240 ⇒ 00:15:35.459 Jay Heavner: And, you know, when I’m talking to them, like, it was very clear that they weren’t
155 00:15:36.310 ⇒ 00:15:51.290 Jay Heavner: they’re looking at it as a one-shot operation, like, research it and do the thing, and I’m like, no, but first you find it, and then you do your classification, your sentiment analysis, and you do all these other things, and then you… then you pivot here, and like, you have to think about this, like.
156 00:15:51.820 ⇒ 00:15:56.770 Jay Heavner: You’re baking bread, lots of ingredients, blah blah blah, not one-shot it, but…
157 00:15:57.140 ⇒ 00:16:12.709 Jay Heavner: I get why you want a one-shot. We all want a one-shot. It’s like, I don’t think I look so much… I mean, I think you’re not wrong, but I also… it is this, like, interesting… it’s a… to your point, it’s like this failure to, like, see the system, because it’s like… I’ve even had this, like, Kyle, I’ve been, like.
158 00:16:12.710 ⇒ 00:16:17.180 Jay Heavner: So that’s really cool what you put together. Have you thought about what you’re gonna do when they say, add this?
159 00:16:17.180 ⇒ 00:16:19.199 Jay Heavner: Oh, yeah, yeah. Right?
160 00:16:19.200 ⇒ 00:16:29.310 Uttam Kumaran: And it’s, like, people think about skills as, like, this, like, or even doing this, like, end-to-end, it’s like… it’s actually just, like, if you can make the pieces faster, and you just have the checks.
161 00:16:29.310 ⇒ 00:16:45.059 Uttam Kumaran: I’ll take 50% faster, you know? So, like, you don’t have to go 90%, 100% faster. And so, part of this is also, I think, Jay, is, like, if there’s someone on that team who can be, like, the owner of the skill, so I think, Catherine, basically kind of, like, when I was…
162 00:16:45.060 ⇒ 00:16:53.279 Uttam Kumaran: talking about the last 30 days skill, and I was like, that concept is really interesting for market research, but ultimately, it’s like, I think part of if we
163 00:16:53.650 ⇒ 00:17:09.099 Uttam Kumaran: the pitch at our company that we’re doing is we’re treating skills kind of like products, more in the way of, like, a lot of people are going to be using them, and then they may say, like, oh, this skill lacked this integration, or, like, it… the output wasn’t exactly right. Okay, then, like.
164 00:17:09.160 ⇒ 00:17:15.689 Uttam Kumaran: there’s a small team of us that are maintaining. Is there someone on the market research side that can be that, like, skill owner?
165 00:17:16.060 ⇒ 00:17:16.609 Jay Heavner: Hi.
166 00:17:17.390 ⇒ 00:17:23.809 Uttam Kumaran: Because then otherwise, I think, Jay, like, on the side, you should just own, like, the fact that people can create
167 00:17:23.920 ⇒ 00:17:30.759 Uttam Kumaran: skills at all, and execute skills at all, and, like, a governed system, but ultimately, like, I’m not the market research CTA.
168 00:17:31.180 ⇒ 00:17:34.200 Uttam Kumaran: expert, right? And so, like, formatting…
169 00:17:34.200 ⇒ 00:17:46.279 Jay Heavner: Reconcile our bank statement, right? I’m working with finance right now, they want to do reconciliation, and they sent me this adorable set of instructions. Add column K, copy column D to column K.
170 00:17:46.280 ⇒ 00:17:47.080 Uttam Kumaran: Yes.
171 00:17:47.080 ⇒ 00:17:52.220 Jay Heavner: And I’m like, oh, well, bless your heart, that’s adorable, but those aren’t instructions.
172 00:17:52.460 ⇒ 00:18:08.719 Jay Heavner: But I can sit with them, and we can get instructions, but I think it’s going to take some effort to extract instructions from people, that’s fine. And I told you, Tom, that, like, you and I had talked about building libraries around skills and agents and MCP and all the blocks.
173 00:18:09.240 ⇒ 00:18:20.169 Jay Heavner: And then… Having some kind of operator, central clearinghouse to push the right skill at the right time.
174 00:18:21.940 ⇒ 00:18:26.819 Jay Heavner: Did Catherine tell you I’ve been tilting at a dark factory? Like, trying to build my own dark factory?
175 00:18:26.820 ⇒ 00:18:28.530 Uttam Kumaran: No, wait, what does that mean?
176 00:18:28.530 ⇒ 00:18:37.410 Jay Heavner: Like, a fully, spec, like, business spec, to deploy finished product with no human in the middle?
177 00:18:37.410 ⇒ 00:18:39.739 Uttam Kumaran: I am also trying to do something like that.
178 00:18:39.740 ⇒ 00:18:40.190 Jay Heavner: And…
179 00:18:40.440 ⇒ 00:18:44.829 Uttam Kumaran: Because we’re looking at a lot of open source, like, versions of software, and I’m like.
180 00:18:44.830 ⇒ 00:18:45.430 Jay Heavner: Yay!
181 00:18:45.430 ⇒ 00:18:55.040 Uttam Kumaran: okay, like, how can I go from this to, like, my config understanding of our, like, we deploy some stuff on Railway, our Railway setup, our backend setup, our auth.
182 00:18:55.260 ⇒ 00:18:56.919 Uttam Kumaran: But, like, yeah, kind of a factory.
183 00:18:56.920 ⇒ 00:18:57.710 Jay Heavner: Yep.
184 00:18:57.710 ⇒ 00:18:58.220 Uttam Kumaran: Yeah.
185 00:18:58.220 ⇒ 00:19:03.359 Jay Heavner: I tried to do, like, the mission to Mars before I even started to, like, throw a model.
186 00:19:03.360 ⇒ 00:19:04.860 Uttam Kumaran: Yeah, it’s like…
187 00:19:05.250 ⇒ 00:19:05.780 Jay Heavner: Like, I’m not.
188 00:19:05.780 ⇒ 00:19:07.809 Uttam Kumaran: Not nearly, I’m not even close, by the way.
189 00:19:07.810 ⇒ 00:19:11.510 Jay Heavner: Oh, I’m not, I’m not either. Like, I’ve got it where it will run…
190 00:19:11.770 ⇒ 00:19:15.789 Jay Heavner: But what it produces doesn’t run,
191 00:19:16.710 ⇒ 00:19:33.269 Jay Heavner: Yeah, it’s… Isn’t the Dark Factory also handling the, like, the bug fixer and Steve’s stuff? Yeah, but that’s a much simpler thing. So, I didn’t put… it’s not going through mine, it’s a really simple… which, I built a thing for a grant proposal tool.
192 00:19:33.580 ⇒ 00:19:35.030 Jay Heavner: And the… the…
193 00:19:35.160 ⇒ 00:19:44.610 Jay Heavner: the product sponsor, he’s like, oh, I have these changes. And I’m like, you know what? I’m gonna put you in a little text box. You just type your… you change in the text box, and…
194 00:19:44.610 ⇒ 00:20:01.520 Jay Heavner: it opens a ticket, it pops a ticket, it sends me Slack notifications, it does the build at the end. It’s pretty good. Yeah, yeah, yeah. I mean, and technically that is a very lightweight dark factory, but that’s where, when I was looking at this, people were like, oh, start with an existing project and do bug fixes on it.
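[Editor's note: the "very lightweight dark factory" Jay describes, a change request typed into a text box that opens a ticket, sends a Slack notification, and runs the build with no human in the middle, is essentially a small event pipeline. The sketch below stubs every step; the log list stands in for the real ticket system, Slack, and build infrastructure.]

```python
# Stub sketch of the lightweight dark-factory loop: change request in,
# ticket opened, notification sent, build run. Every step is a stand-in.
log = []

def open_ticket(request: str) -> int:
    log.append(f"ticket opened: {request}")
    return len(log)  # pretend ticket id

def notify_slack(ticket_id: int) -> None:
    log.append(f"slack notified for ticket {ticket_id}")

def run_build(ticket_id: int) -> bool:
    log.append(f"build ran for ticket {ticket_id}")
    return True

def handle_change_request(request: str) -> bool:
    """One pass through the pipeline, no human in the middle."""
    ticket_id = open_ticket(request)
    notify_slack(ticket_id)
    return run_build(ticket_id)

ok = handle_change_request("widen the grant-amount column")
```

What makes even this toy version a "factory" is that the sponsor's free-text request is the only human input; everything after the text box is mechanical.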
195 00:20:01.600 ⇒ 00:20:13.839 Jay Heavner: Okay. That makes more sense. Yeah, that’s your model name. I wanted something that I could run an iOS project through, or a Python project through, or a Node project through, like, any architecture I want, and it’s like…
196 00:20:14.070 ⇒ 00:20:14.880 Jay Heavner: No.
197 00:20:15.350 ⇒ 00:20:16.300 Jay Heavner: No.
198 00:20:16.810 ⇒ 00:20:18.810 Jay Heavner: But I’m like, that’s what I want. So…
199 00:20:19.310 ⇒ 00:20:22.990 Jay Heavner: Yeah, the other thing… Yeah, I mean.
200 00:20:22.990 ⇒ 00:20:32.050 Uttam Kumaran: to tell you even one further thing, Jay, on our team, I try to treat the ticket as, like, the ultimate unit of work, and, like.
201 00:20:32.150 ⇒ 00:20:43.959 Uttam Kumaran: Everything on our side runs… we use Linear a lot. But without that, then there’s no, like, unit of work, and there’s no, like, thing I can even begin to try to hand off to an agent to do, so…
202 00:20:43.960 ⇒ 00:20:44.450 Jay Heavner: Yup.
203 00:20:44.450 ⇒ 00:21:00.350 Uttam Kumaran: I’ve tried our best to say, like, everything has to start with a ticket and ladders up to, like, okay, further abstractions, further abstractions. That way, we can start to, like, go from the ticket to, like, the PR, or what was done, and then hand that to the AI and be like.
204 00:21:00.410 ⇒ 00:21:10.900 Uttam Kumaran: okay, what… what would need… what were the steps in between that you need? Like, what are the integrations you would need? Where would you run this? Like, what is the output? And, like.
205 00:21:11.350 ⇒ 00:21:18.510 Uttam Kumaran: okay, are we, like, 20% close to, like, being able to literally assign this to the CTA agent, or are we, like, 90%, right?
206 00:21:18.920 ⇒ 00:21:28.199 Jay Heavner: I will tell you, I’ve done the same thing, and the problem is, like, it just… it starts to degrade. It won’t open the ticket. Or it’ll open the ticket, it’ll work the ticket, but it won’t finish the ticket.
207 00:21:28.200 ⇒ 00:21:38.449 Uttam Kumaran: Yeah, no, all the integrations have to work really, really well, and then also, it’s like, it may work on your computer, and then the moment someone else assigns it, like, the auth is off, or…
208 00:21:38.450 ⇒ 00:21:38.850 Jay Heavner: Mmm.
209 00:21:38.850 ⇒ 00:21:49.240 Uttam Kumaran: Yeah, it’s… but so we’re heading… trying to head in that direction, ultimately because I think there are portions… there’s some tickets that can be done by AI, but I’m honestly not sure if…
210 00:21:49.410 ⇒ 00:22:03.430 Uttam Kumaran: folks are going to realize that, so it’s almost like maybe you create the ticket, and then… I’m thinking almost like there’s an AI triage that’s like, okay, I believe I can take this, I’ll take it. Or, actually, this is human…
211 00:22:03.970 ⇒ 00:22:05.900 Uttam Kumaran: You should continue to go on.
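[Editor's note: the triage idea Uttam floats, an AI pass that claims tickets it believes it can finish and routes the rest to humans, might look like the sketch below. The ticket fields and heuristics are entirely invented for illustration; a real triage step would likely ask a model rather than match keywords, and, as Jay notes next, would need to be far more conservative than the model's own self-assessment.]

```python
# Hypothetical ticket triage: decide whether a ticket looks agent-doable
# or needs a human. Fields and heuristics are invented for the sketch.
from dataclasses import dataclass

@dataclass
class Ticket:
    title: str
    description: str
    has_repro_steps: bool
    touches_auth: bool

AGENT_FRIENDLY_HINTS = ("typo", "bump dependency", "fix test", "rename")

def triage(ticket: Ticket) -> str:
    """Return 'agent' or 'human'. Conservative by design: auth changes and
    tickets without reproduction steps always go to a human."""
    if ticket.touches_auth or not ticket.has_repro_steps:
        return "human"
    text = (ticket.title + " " + ticket.description).lower()
    if any(hint in text for hint in AGENT_FRIENDLY_HINTS):
        return "agent"
    return "human"

t1 = Ticket("Fix typo in README", "s/teh/the/", True, False)
t2 = Ticket("Rework SSO flow", "Redesign login", True, True)
```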
212 00:22:05.900 ⇒ 00:22:08.049 Jay Heavner: It… it believes it can do everything, that’s the.
213 00:22:08.050 ⇒ 00:22:08.530 Uttam Kumaran: True, true.
214 00:22:08.530 ⇒ 00:22:12.080 Jay Heavner: And it won’t. And then I’m like, did you finish it? No.
215 00:22:12.490 ⇒ 00:22:14.570 Jay Heavner: Write more tickets to finish the work.
216 00:22:14.570 ⇒ 00:22:14.970 Uttam Kumaran: Yeah.
217 00:22:14.970 ⇒ 00:22:17.209 Jay Heavner: Or go back and reopen the ticket, and…
218 00:22:17.970 ⇒ 00:22:21.630 Jay Heavner: Actually, okay, I have this harebrained idea that I…
219 00:22:21.870 ⇒ 00:22:26.450 Jay Heavner: I was contemplating trying, because, yeah, right, same experience,
220 00:22:28.270 ⇒ 00:22:32.990 Jay Heavner: I was contemplating trying, essentially, like,
221 00:22:33.000 ⇒ 00:22:51.789 Jay Heavner: two-repository-type thing, like, and so if I had this, like, you know, tickets and stuff get opened, and then the intelligent thing can decide, like, oh, I could do that, versus I need a human, like, almost it’s like a read replica kind of a concept, right? And so, like, let all of that happen.
222 00:22:52.370 ⇒ 00:23:03.759 Jay Heavner: work and try to solve the ticket, and put in the PR, and then on my, like, real repo, right, that’s where I can sort of evaluate, like, okay, well, was that a good AI, like, you know, solve? Like, should.
223 00:23:03.760 ⇒ 00:23:04.380 Uttam Kumaran: Yes.
224 00:23:04.380 ⇒ 00:23:14.010 Jay Heavner: But, like, I think… I don’t know, I just, like, my brain went to, like, it might be interesting to have two repositories feeding each other back and forth.
225 00:23:14.370 ⇒ 00:23:17.440 Jay Heavner: Then I started to wonder, like.
226 00:23:19.370 ⇒ 00:23:25.680 Jay Heavner: Okay, this is where I start to sound a little crazy, but I’m like, if we were to package
227 00:23:25.820 ⇒ 00:23:27.939 Jay Heavner: Somehow. Some magical how.
228 00:23:28.050 ⇒ 00:23:28.820 Jay Heavner: like…
229 00:23:28.830 ⇒ 00:23:38.639 Jay Heavner: Cowork, anything a person’s staff does in Cowork is secretly being essentially GitHubbed on the back end, right? Yeah, yeah, yeah. And then there’s the ability for a similarly agentic process
230 00:23:38.640 ⇒ 00:23:54.160 Jay Heavner: to go through that repo of probably mostly junk, but the occasional good idea, right? And then that AI is like, hey, this thing that so-and-so on your market research team built is actually kind of dope, you might want to consider putting it into, like, you know, the organization’s official code, kind of a thing.
231 00:23:54.160 ⇒ 00:23:54.740 Uttam Kumaran: Yes.
232 00:23:54.740 ⇒ 00:24:02.479 Jay Heavner: or Person A and Person B are talking about the same thing, bringing them together. Yeah. Yeah, no, I’ve had similar thoughts on…
233 00:24:03.040 ⇒ 00:24:06.660 Jay Heavner: People are going to freak out about the observability part of this.
234 00:24:08.080 ⇒ 00:24:22.010 Jay Heavner: Like, the crazy thing is… Yeah, like, they forget about it, but I think when you lean into it, you realize, like, observability is actually… The power. Yeah, yeah, yeah. I tend to agree. I told him to…
235 00:24:22.040 ⇒ 00:24:37.599 Jay Heavner: that we were going to start working on… because you like AI projects, I’m like, yeah, we’re still looking for good candidates, but we’re going to build up an ingestion pipeline to bring all these things in, because, you know, I’m still not…
236 00:24:38.090 ⇒ 00:24:45.000 Jay Heavner: hearing… wonderful ideas. The NASDAQ thing seems fine. It’s gonna be…
237 00:24:46.450 ⇒ 00:24:52.099 Jay Heavner: it’s not a traditionally great AI thing, because they’re looking for market research. Yeah. And…
238 00:24:52.300 ⇒ 00:24:59.419 Jay Heavner: they’re trying to do it with Claude, and I’m like, I’m just… Claude’s whatever at this. You know, you can force it into certain things, but…
239 00:24:59.630 ⇒ 00:25:08.250 Jay Heavner: And really, what they’re pulling now is pretty low-stakes stuff. They’re, like, pulling press releases, and… You know, whatever.
240 00:25:08.360 ⇒ 00:25:16.340 Jay Heavner: Subsistence farming? Subsistence farming, yeah. What, I mean…
241 00:25:16.760 ⇒ 00:25:23.270 Jay Heavner: Have you heard any really good ones? I mean, it’s just like, no, right? I mean, and I, like…
242 00:25:23.390 ⇒ 00:25:40.350 Jay Heavner: I haven’t, however, the, you know, me and my Pollyanna, right? Like, I have at least heard people coming up with, like, things that are really valid, like, AI can help you automate that thing that has been automatable this whole time, right? You know, like, so I think at least people are starting to realize…
243 00:25:40.350 ⇒ 00:25:50.320 Jay Heavner: what can be automated, and that AI can help them do that. Yeah. But yes, I have not yet heard something where it’s like, oh, that is actually a killer use case for AI itself.
244 00:25:50.320 ⇒ 00:25:50.690 Uttam Kumaran: Yeah, no.
245 00:25:50.690 ⇒ 00:25:56.849 Jay Heavner: We’re not doing AI projects, we’re using AI to automate traditional automation pipelines.
246 00:25:56.850 ⇒ 00:26:03.289 Uttam Kumaran: Another way to think about it is, like, on the market research side, you have both the research component and the production of an asset.
247 00:26:03.450 ⇒ 00:26:09.440 Uttam Kumaran: Right? And so, you can think about those, like, a little bit differently, like.
248 00:26:09.700 ⇒ 00:26:20.330 Uttam Kumaran: hey, come with whatever your research report is, and we’ll help you create the report. Or it’s a, we’ll help you get the research first. So I think it sort of depends on, like.
249 00:26:20.620 ⇒ 00:26:29.130 Uttam Kumaran: there’s kind of an interesting choice there. I think the other thing is we could just choose the team that we feel like we may have the highest odds of success with.
250 00:26:29.400 ⇒ 00:26:33.200 Uttam Kumaran: As, like, a good candidate to, like, just do a user interview.
251 00:26:33.200 ⇒ 00:26:43.040 Jay Heavner: Or… yeah. I do wonder… Tom had said this earlier, I’ve discounted this, I’m guessing you’ve discounted this too, but making pretty flashy things…
252 00:26:44.350 ⇒ 00:26:48.610 Jay Heavner: We aren’t impressed by those, but other people would be impressed by.
253 00:26:48.610 ⇒ 00:26:49.170 Uttam Kumaran: Yes.
254 00:26:49.490 ⇒ 00:26:50.890 Uttam Kumaran: So that’s kind of it, like…
255 00:26:50.890 ⇒ 00:26:51.320 Jay Heavner: Yes.
256 00:26:51.320 ⇒ 00:26:57.689 Uttam Kumaran: the research component made me kind of like, alright, yeah, this goes and gets all my links, but oh my gosh, it, like, did the PDF.
257 00:26:58.130 ⇒ 00:27:10.560 Jay Heavner: Yeah, it looks like… well, you know, this… the CEO who’s coming in… what was her thing? Babel, or… Oh, Gamma. Gamma. Yeah, she’s, you know, using slide things, and I’m like.
258 00:27:11.110 ⇒ 00:27:14.120 Jay Heavner: Yeah. That’s… we’re frozen.
259 00:27:14.660 ⇒ 00:27:17.060 Jay Heavner: That’s okay.
260 00:27:17.060 ⇒ 00:27:18.429 Uttam Kumaran: That’s fine.
261 00:27:18.430 ⇒ 00:27:26.020 Jay Heavner: Let me kick it off and back on and see if that fixes it. I did notice it was doing a lot of, like, strangeness for a minute there. It probably…
262 00:27:26.370 ⇒ 00:27:29.189 Jay Heavner: You know, we only buy the best equipment, Catherine.
263 00:27:29.410 ⇒ 00:27:30.590 Jay Heavner: Absolutely.
264 00:27:30.810 ⇒ 00:27:34.419 Jay Heavner: I dare you to put a ticket into IT and ask them to come. Yeah.
265 00:27:34.630 ⇒ 00:27:35.650 Jay Heavner: Ian.
266 00:27:36.040 ⇒ 00:27:38.070 Jay Heavner: Alright, he’s gone.
267 00:27:38.070 ⇒ 00:27:38.929 Uttam Kumaran: I can notice.
268 00:27:38.930 ⇒ 00:27:56.409 Jay Heavner: No, I mean, I agree, like, I think, like, you and I chase, like, power and knowledge out of these things. I think everybody else is just like, green. Well… It’s fine, I want it somewhere. My fear is that it becomes… you have the same problem you have with the MVP, where it looks dumb.
269 00:27:56.410 ⇒ 00:28:04.709 Jay Heavner: Yeah. And they assume that flash and substance are comparable. That’s a good point. And…
270 00:28:05.820 ⇒ 00:28:18.219 Jay Heavner: So, we gotta be careful with that message. I’m like, so this looks… this looks really good, and it’s fine, it’s okay, you know, I wouldn’t… I wouldn’t put my child’s life on it, but it’s, you know, it’s…
271 00:28:18.390 ⇒ 00:28:19.569 Jay Heavner: It’s a thing.
272 00:28:19.860 ⇒ 00:28:26.070 Jay Heavner: But it does… I don’t know, like, I’m just trying to get the ball down the field.
273 00:28:26.610 ⇒ 00:28:36.090 Uttam Kumaran: Well, that’s why I think, like, if you produce… like, one thing that… I feel the same way, like, we have a lot of things around, like, decks and document production,
274 00:28:36.270 ⇒ 00:28:49.180 Uttam Kumaran: But ultimately, like, people weren’t using the skill until I literally showed, like, wow, we could do this in, like, 10 minutes now, and it could pull from all the right sources, and it does the end-to-end thing, that’s when they were really impressed.
275 00:28:49.250 ⇒ 00:29:05.849 Uttam Kumaran: So that’s why I wonder if, like, we can even take a past example of a research report, almost, like, recreate it using the new mode, and then find… I think who we need to convince, ultimately, is, like, the champion within the market research team that’s, like.
276 00:29:05.900 ⇒ 00:29:12.570 Uttam Kumaran: I have a couple tweaks, but, like, I love this. I’m gonna become a power user, and I’m gonna, like, propose modifications to the skill.
277 00:29:13.310 ⇒ 00:29:22.330 Uttam Kumaran: Because that person then becomes the person that gets adoption within the team. I think we’re… the distance is going to be too far for us to… otherwise, we’re going to turn to product managers for, like.
278 00:29:22.330 ⇒ 00:29:28.840 Jay Heavner: So I don’t… I think… I don’t disagree, so, like, yes, but I think now I see what you’re talking about more so.
279 00:29:30.260 ⇒ 00:29:36.240 Jay Heavner: One of the wrinkles with market research is that I’m not 100%.
280 00:29:36.490 ⇒ 00:29:39.220 Jay Heavner: Sure, the research is very…
281 00:29:39.490 ⇒ 00:29:52.239 Jay Heavner: very useful, and so, to Jay’s point, if they, like, they might be so impressed with the magic trick of the skill did the thing that they move past it, like, yeah, but is that research report remotely valuable to anybody?
282 00:29:52.440 ⇒ 00:30:00.300 Jay Heavner: That’s fair. And so, what’s interesting about that department, too, is they have some people who are very engaged.
283 00:30:00.770 ⇒ 00:30:15.260 Jay Heavner: I feel like they’re trying to go this way, and they’re pointed just slightly this way a little bit, and we can close that. That’s a closable gap. You know, I might spend some more time with Chris and, like, just kind of point him, because
284 00:30:15.730 ⇒ 00:30:20.860 Jay Heavner: He’s done things in Claude. He’s writing… I didn’t realize you could put Node in Claude.
285 00:30:20.860 ⇒ 00:30:45.199 Jay Heavner: Yeah, yeah. I’m like, oh wow, you’re really… you’re… He’s pushing. He’s pushing, I think, interesting ways, and I’m like, oh, you’ve got a little web interface inside of here, look at you. Yeah. And also, he has perhaps the most, like, aggressively antagonistic opinion of the research, so he would be a good person to, like, say, like, okay, yeah, we can pipeline this, but this is a bullshit step. Yeah.
286 00:30:45.200 ⇒ 00:30:46.380 Uttam Kumaran: Yeah, yeah, yeah.
287 00:30:46.380 ⇒ 00:30:49.560 Jay Heavner: Yeah, so…
288 00:30:50.260 ⇒ 00:30:58.460 Uttam Kumaran: Well, that’s why I think it’s, like, what’s helpful for, like, that May meeting? Is it, like, more of the ideation? Is it, like, a proof of concept?
289 00:30:58.890 ⇒ 00:31:13.050 Jay Heavner: Yeah, it used to be, too. It’s the simple back with the CES leadership team. Oh, I thought you were talking about the Kinsey strategic thing. Oh, no, I don’t… Yeah, so there’s also going to be… so the new… the new CEO starts May 1st. She’s the president now. She becomes CEO.
290 00:31:13.200 ⇒ 00:31:17.390 Jay Heavner: She is doing a… strategic retreat.
291 00:31:18.100 ⇒ 00:31:27.599 Jay Heavner: early May for her direct reports, and AI is going to be a big topic. I love that they’re all just gonna come together and talk about things they don’t know. That’s great.
292 00:31:27.700 ⇒ 00:31:29.489 Jay Heavner: Let’s lean into that more.
293 00:31:29.780 ⇒ 00:31:34.620 Jay Heavner: But…
294 00:31:35.030 ⇒ 00:31:40.309 Jay Heavner: I’m still, like… so we had a conversation yesterday, Catherine and I, with our boss, and it’s like.
295 00:31:40.650 ⇒ 00:31:52.120 Jay Heavner: what do you want us to do? Do you want us to be player coaches and rise the tide of AI across the organization? Do you want these people to fend for themselves and, like, work on the big…
296 00:31:52.880 ⇒ 00:31:56.239 Jay Heavner: moving forward projects, what do you want? She’s like, I don’t know.
297 00:31:56.400 ⇒ 00:31:59.489 Jay Heavner: I don’t know if we’re gonna get an answer out of that. Yeah.
298 00:31:59.960 ⇒ 00:32:05.250 Jay Heavner: I mean, honestly, I don’t know, you know? I don’t know what the big projects are. If we had identified projects.
299 00:32:05.250 ⇒ 00:32:15.889 Uttam Kumaran: I guess if you… I would say, if you don’t feel strongly either way, the best thing to do is, like, use it within the teams that you guys have, like, immediate control.
300 00:32:16.610 ⇒ 00:32:21.440 Uttam Kumaran: And then try to demonstrate that it’s, like, It’s crushing it.
301 00:32:22.530 ⇒ 00:32:23.060 Jay Heavner: Yeah.
302 00:32:23.180 ⇒ 00:32:23.780 Jay Heavner: Because…
303 00:32:23.780 ⇒ 00:32:31.909 Uttam Kumaran: Otherwise, like, getting other people to create skills and learn how to use… it’s just, I mean, it’s just… it’s really, really tough. Yeah.
304 00:32:31.910 ⇒ 00:32:36.260 Jay Heavner: What’s interesting is, like… so, the CFO saw something.
305 00:32:36.510 ⇒ 00:32:38.530 Jay Heavner: And he’s like, that’s cool.
306 00:32:38.680 ⇒ 00:32:54.319 Jay Heavner: And he wants his team to use it, but it’s all… it’s all mandating down, right? Like, nobody wants to do it, but they want someone else on their team to start doing it. And I’m like, I think we’re going to be more successful leading this from the top.
307 00:32:54.800 ⇒ 00:32:56.450 Jay Heavner: I just don’t want to get those people there.
308 00:32:57.230 ⇒ 00:33:02.960 Jay Heavner: Right. I mean, I think, like, what I learned from… like…
309 00:33:03.490 ⇒ 00:33:09.980 Jay Heavner: If, you know, in an ideal state, like, the people that are at the, you know, top of the food chain in these departments, like.
310 00:33:10.120 ⇒ 00:33:12.359 Jay Heavner: what I want them to do is…
311 00:33:12.400 ⇒ 00:33:19.879 Jay Heavner: you know, be on board, right? Empower their staff to do it, but also to have, like, a thesis, right? Like, what…
312 00:33:19.880 ⇒ 00:33:33.689 Jay Heavner: what do I think my team can achieve with AI, right? Like, so that it’s not, like you, you know, described it the other day, like, you know, 10 puppies going 15 directions, right? It’s like, what problem does Siri want to solve
313 00:33:33.690 ⇒ 00:33:57.540 Jay Heavner: this year with AI for his team. Is it eliminating manual reporting? Is it interconnecting the systems? Is it predictive modeling? Whatever it is, but then that way, his team, when ideas come at them, they can say, like, hmm, the thing we’re supposed to drive towards is this, so these things are less important. What if we actually took that idea and we just workshopped it, like, one group at a time? Yeah. We pulled them into a room.
314 00:33:58.070 ⇒ 00:34:03.149 Jay Heavner: And we workshop these ideas, and, you know.
315 00:34:03.150 ⇒ 00:34:05.369 Uttam Kumaran: Do we actually, like, deliver one… do you deliver one of them?
316 00:34:05.580 ⇒ 00:34:07.899 Jay Heavner: Yeah. You know? Yeah.
317 00:34:07.900 ⇒ 00:34:10.419 Uttam Kumaran: You try to do it, like, the actual robust way, like…
318 00:34:10.429 ⇒ 00:34:12.059 Jay Heavner: Yeah, you can… Okay, can we…
319 00:34:12.059 ⇒ 00:34:21.079 Uttam Kumaran: Can we do every step? Can we have each step be a skill? And then can we teach them how to use the skills to get the output? And it’s like, cool. What was, like.
320 00:34:21.559 ⇒ 00:34:30.019 Uttam Kumaran: good and sucky about doing that with just one team, and then maybe extrapolate from there. Because otherwise, yeah, I think…
321 00:34:30.219 ⇒ 00:34:33.709 Uttam Kumaran: This group will have to come up with the idea, and it may not be right.
322 00:34:34.059 ⇒ 00:34:41.729 Uttam Kumaran: it’s also… it’s gonna be risky to say, like, cool, we can get enterprise adoption. It’s… I don’t know, I think it’s really, really hard.
323 00:34:41.730 ⇒ 00:34:42.380 Jay Heavner: Whoa.
324 00:34:42.920 ⇒ 00:34:47.910 Jay Heavner: But I do like the idea, like, Part workshop, part…
325 00:34:48.219 ⇒ 00:34:50.230 Jay Heavner: And now we build the thing.
326 00:34:50.719 ⇒ 00:34:54.080 Jay Heavner: You know, we can get something done in.
327 00:34:54.080 ⇒ 00:34:54.510 Uttam Kumaran: Yeah.
328 00:34:54.710 ⇒ 00:35:00.790 Jay Heavner: that is… Value. Yeah. Whatever that may be.
329 00:35:01.240 ⇒ 00:35:05.960 Jay Heavner: I think that’s definitely achievable. I think we have to be careful to not get stuck in just, like.
330 00:35:06.090 ⇒ 00:35:07.880 Jay Heavner: Automation, hell. Yeah.
331 00:35:07.880 ⇒ 00:35:08.520 Uttam Kumaran: Yeah.
332 00:35:08.520 ⇒ 00:35:13.950 Jay Heavner: Because, you know, using AI to write automations, that’s fine. That’s not really…
333 00:35:14.730 ⇒ 00:35:18.750 Jay Heavner: Right. This is writing code. Right. Or writing automations, yeah. Right.
334 00:35:18.760 ⇒ 00:35:36.979 Jay Heavner: Right. I mean, really, honestly, yeah, like, the biggest magic trick we’re all going to be able to pull off this year at CTA is just, like, look how much tech debt we got rid of. Well, yeah. We’ve gone from 1995 to 2005, and that’s a big leap for us. Yeah, yeah, true. Right? I mean…
335 00:35:37.000 ⇒ 00:35:40.269 Jay Heavner: But I think I… I don’t want to lose sight, too, of what you’re…
336 00:35:40.340 ⇒ 00:35:48.410 Jay Heavner: talking about, like, the thesis of the, like, where we try to get to piece, because, like, yeah, before we show them, like, how to write the magic trick, it’s like, yeah…
337 00:35:48.860 ⇒ 00:36:04.610 Jay Heavner: Let’s put together a roadmap. What capability do you think you can give your team with AI that they don’t have currently? Yeah. Right? Let’s put together a roadmap, let’s write it in pencil, and let’s call it, like, a 3-6 month roadmap. We’re not building a long roadmap here.
338 00:36:04.770 ⇒ 00:36:11.820 Jay Heavner: And… you know, we’ll build you a little… little Gantt chart, and we’ll actually start…
339 00:36:11.820 ⇒ 00:36:20.259 Uttam Kumaran: Yeah, and I think… I think on this team’s side it’s also: what’s the, like, engineering infrastructure needed to support just one team?
340 00:36:20.380 ⇒ 00:36:25.549 Uttam Kumaran: Versus, like, supporting a bunch of teams, right? So, for example, some of that stuff is, like.
341 00:36:25.730 ⇒ 00:36:44.319 Uttam Kumaran: okay, everybody needs access to, like, all the integrations. Is that… are those MCPs? Are those CLIs? Right? Yeah. So, are they accessing it through their… on a local machine? Is it, like, something in the cloud? So… but that’s all the stuff that we also figure out alongside that project.
342 00:36:44.980 ⇒ 00:36:49.689 Uttam Kumaran: And don’t just rush to be like, cool, I shipped a skill that does it. It’s like, okay, we’re, like, treating this like…
343 00:36:50.060 ⇒ 00:37:05.690 Uttam Kumaran: okay, this, like, has to work, it’s like an enterprise customer, just one, and then you have, like, the shared context, the integrations, the skills, and then you just make sure that, like, we achieve… that’s kind of, like, our objective, in addition to getting to their output, you know?
344 00:37:05.690 ⇒ 00:37:11.420 Jay Heavner: What’s kind of interesting about that, too, is you run into…
345 00:37:11.920 ⇒ 00:37:28.179 Jay Heavner: like, the difference between MCP and CLI. CLI tends to be a lot cheaper in terms of tokens, and, you know, as I’m talking to the market research team right now, they’re doing their research with Claude Code, they’re hitting limits every day, they wanted me to just increase their threshold. I’m like, yeah, that’s fine.
346 00:37:28.290 ⇒ 00:37:29.080 Jay Heavner: But…
347 00:37:29.190 ⇒ 00:37:39.890 Jay Heavner: at some point, we also have to talk about cost. I think in year zero, everyone’s like, just go burn the tokens to the ground, but at some point, we do need to…
348 00:37:40.000 ⇒ 00:37:48.200 Jay Heavner: think about costs and token usage, and how do we do this affordably, effectively.
349 00:37:48.200 ⇒ 00:37:53.909 Uttam Kumaran: Yeah. But again, without that, people are gonna burn Opus themselves, right? So…
350 00:37:53.910 ⇒ 00:37:54.930 Jay Heavner: Yeah.
351 00:37:54.930 ⇒ 00:38:08.819 Uttam Kumaran: I can think of a system that just automatically puts people in, like, the lower tier, because you have skills, and you have the context, so it performs. That checks that box off, versus… we have some other clients that are like, oh, we just turned on Cowork for everybody.
352 00:38:09.070 ⇒ 00:38:15.200 Uttam Kumaran: I’m like, you’re gonna get rinsed this month, because people are gonna use Opus to write, like, their emails.
353 00:38:15.200 ⇒ 00:38:16.090 Jay Heavner: Yeah.
354 00:38:16.090 ⇒ 00:38:17.040 Uttam Kumaran: And…
355 00:38:17.040 ⇒ 00:38:19.250 Jay Heavner: Going to the grocery store, let’s prep the Ferrari, right?
356 00:38:19.250 ⇒ 00:38:23.770 Uttam Kumaran: Yeah, exactly, and so that’s a great way to avoid
357 00:38:23.880 ⇒ 00:38:32.329 Uttam Kumaran: That problem, because they’re increasing prices, and it’s not like people… once it’s out there, it’s gonna be very hard to, like, pull back.
358 00:38:32.330 ⇒ 00:38:32.690 Jay Heavner: Yeah.
359 00:38:32.690 ⇒ 00:38:33.879 Uttam Kumaran: You know…
360 00:38:34.040 ⇒ 00:38:35.790 Jay Heavner: Well… government.
361 00:38:35.790 ⇒ 00:38:38.980 Uttam Kumaran: governance-wise, too, there’s, like, a huge case for this to be, like.
362 00:38:39.230 ⇒ 00:38:43.010 Uttam Kumaran: someone has ownership over, like, what are the AI services.
363 00:38:43.350 ⇒ 00:38:46.869 Uttam Kumaran: we’re using, right? And I know that’s been a big thing, Catherine, you mentioned, just like.
364 00:38:46.980 ⇒ 00:38:53.740 Uttam Kumaran: access to the right data, like, you can imagine someone’s Claude Code just finds a way through to access something.
365 00:38:53.740 ⇒ 00:38:54.860 Jay Heavner: Oh, it will, yeah.
366 00:38:54.860 ⇒ 00:39:00.230 Uttam Kumaran: You know, and it’s not like… it may not be malicious, it’s just, like, resourceful.
367 00:39:00.230 ⇒ 00:39:02.199 Jay Heavner: Yeah, yeah, yeah, exactly, exactly.
368 00:39:02.200 ⇒ 00:39:02.840 Uttam Kumaran: Yeah.
369 00:39:02.840 ⇒ 00:39:06.199 Jay Heavner: Well, that was one of the stories about Mythos, is it escaped containment, right?
370 00:39:06.200 ⇒ 00:39:06.670 Uttam Kumaran: Yes.
371 00:39:06.670 ⇒ 00:39:07.060 Jay Heavner: all the time.
372 00:39:07.060 ⇒ 00:39:07.390 Uttam Kumaran: Yeah.
373 00:39:07.390 ⇒ 00:39:10.840 Jay Heavner: And I’m like, yeah, that doesn’t surprise me at all. Right. I mean, that’s…
374 00:39:10.840 ⇒ 00:39:18.139 Uttam Kumaran: Yeah, it’s trying every door. It’s trying every door, like, more doors than you could have tried in, like, a lifetime in, like, an hour.
375 00:39:18.300 ⇒ 00:39:41.439 Jay Heavner: I mean, honestly, the place that I’ve seen this behavior, like, the most in my, you know, little sort of narrow lane already is, like, ever since we started using more of the, like, sort of CLI-type stuff to push out those Streamlit apps, like, when Claude Code, like, for whatever reason, half the time, it can’t find where it is installed, even though it’s in my path, it’s all set up correctly. Claude Code is the moron.
376 00:39:41.440 ⇒ 00:39:51.610 Jay Heavner: So, like, sometimes I’ll… it’ll say, like, you know, can I search here for it? Can I search here for it? And, like, I like… I like saying yes, and then seeing where it goes next, because I’m like, you are searching in places, you are…
377 00:39:51.610 ⇒ 00:40:03.280 Jay Heavner: it makes no sense to search, right? And, like, I’m doing it because I’m curious, and I’m wondering what this is gonna be like for other people, but, like, everybody else is just gonna go like, yep, yep, yep, yep. I mean, how many people really know what a bash command’s gonna do?
378 00:40:03.530 ⇒ 00:40:04.400 Jay Heavner: Oh, that’s just…
379 00:40:04.400 ⇒ 00:40:06.920 Uttam Kumaran: Yeah, they’re just gonna hit allow access, and it’s just gonna go.
380 00:40:06.920 ⇒ 00:40:08.260 Jay Heavner: Yeah, and…
381 00:40:08.260 ⇒ 00:40:27.789 Jay Heavner: I’ve seen that where you don’t have access to code. Cool, I’m just gonna pipe it to Bash and run it that way. You don’t have access to that, or I won’t let you do that. Alright, then I’m gonna do an LS ampersand, ampersand, and bash to code, and I’m gonna concatenate things together, because it’s very… it will find a way. Terrifying.
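The pattern Jay describes here, where a chained or piped command rides along inside an approved one, can be sketched roughly like this. This is illustrative only; the allowlist behavior and the specific commands are hypothetical, not CTA's actual tooling.

```shell
# Hypothetical sketch: a per-command allowlist that approves "ls" sees one
# Bash invocation, but everything chained after && executes as well.
ls /tmp && echo "chained command ran anyway"

# Same idea with piping: emit code as plain text, then feed it to an
# interpreter the policy assumed was blocked.
echo 'echo "piped code ran"' | bash
```

An allow-prompt that approves the whole string approves every command in it, which is why clicking "yep, yep, yep" without reading the Bash is risky.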
382 00:40:27.790 ⇒ 00:40:28.130 Uttam Kumaran: Yeah.
383 00:40:28.130 ⇒ 00:40:38.279 Jay Heavner: So, like, yesterday, setting up Chris’s laptop, we were trying to get some of this stuff downloaded, and I was trying to show him, right, like, just make the little robot do it for you! But it was…
384 00:40:38.660 ⇒ 00:40:50.720 Jay Heavner: trying to get more and more clever, and then finally SentinelOne yanked it off his machine, and then we tried more things, because this is how my brain works, and then, SentinelOne yanked it off his machine again and blocked it from the internet.
385 00:40:51.400 ⇒ 00:40:52.300 Uttam Kumaran: Wow.
386 00:40:52.300 ⇒ 00:40:55.090 Jay Heavner: Which is cool. Yeah. That’s kind of nice.
387 00:40:55.090 ⇒ 00:40:56.740 Uttam Kumaran: That’s kind of smart, but…
388 00:40:56.810 ⇒ 00:40:57.660 Jay Heavner: Right?
389 00:40:58.650 ⇒ 00:41:16.510 Jay Heavner: I was like, that’s kind of fun to see that work in real life. But yeah, so I don’t know, I just… I think… anyway, circling back, the infrastructure and the ecosystem piece, I think, is salient, needed, what we’ve kind of been trying to tee up with some of the AWS stuff, like, I also think, you know.
390 00:41:16.690 ⇒ 00:41:19.380 Jay Heavner: It is a chance for us to…
391 00:41:19.440 ⇒ 00:41:28.169 Jay Heavner: Not rewrite history, but, like, have a strong opinion on what the future’s gonna look like, because this organization has not historically understood what infrastructure is.
392 00:41:28.170 ⇒ 00:41:51.969 Jay Heavner: let alone invested in it. And so, like, I feel like this is our chance to be like, no, no, if we’re gonna do any of this, we’re doing it this way, we need these things, these are not negotiable, these are not, like, toys, these are not options, these are requirements before we let everybody loose with these tools, right? Which I know you’ve done a lot already with observability stuff, I think it’s the last… There’s still… well, the problem with observability is it…
393 00:41:51.970 ⇒ 00:41:53.280 Jay Heavner: it changes…
394 00:41:54.230 ⇒ 00:42:05.959 Jay Heavner: Right. All the time. Right. Well, and it doesn’t get at the, like, tiering, the spend limits and stuff like that, right? And so, like, what does a walled garden, you know, kind of thing look like?
395 00:42:06.600 ⇒ 00:42:07.370 Jay Heavner: Yeah.
396 00:42:07.890 ⇒ 00:42:14.330 Jay Heavner: Well… Yeah, that’s…
397 00:42:16.450 ⇒ 00:42:18.149 Jay Heavner: I’m still going back to, like.
398 00:42:18.260 ⇒ 00:42:24.060 Jay Heavner: The right model at the right time, the right spend at the right time, because.
399 00:42:24.060 ⇒ 00:42:24.540 Uttam Kumaran: Yeah.
400 00:42:24.540 ⇒ 00:42:31.610 Jay Heavner: with the infrastructure in, you can set all that. Like, if you have an agent, you can say, you’re using Haiku.
401 00:42:31.610 ⇒ 00:42:31.980 Uttam Kumaran: Well, that’s why.
402 00:42:31.980 ⇒ 00:42:32.539 Jay Heavner: I think usually…
403 00:42:32.540 ⇒ 00:42:39.339 Uttam Kumaran: I think we… there should be a central group that’s deciding that, because that is too many decisions by folks who, like, aren’t even, like.
404 00:42:40.070 ⇒ 00:42:49.719 Uttam Kumaran: understanding the basics, so part of what we’re doing at our company, I’m like, I don’t want our folks to have to think about Kimi versus Claude versus whatever, it’s like…
405 00:42:49.980 ⇒ 00:42:54.819 Uttam Kumaran: It’s working, and you have skills, maybe there’s, like, a normal mode and, like, a…
406 00:42:55.010 ⇒ 00:43:06.450 Uttam Kumaran: this is not working mode. Like, there’s just two modes? Or not, like, it should work with the… you shouldn’t have to tweak the model. Instead, focus on the right skill, maybe, you know, and the…
407 00:43:06.450 ⇒ 00:43:22.759 Jay Heavner: You say that, but then look what… look what Anthropic did to Claude over the last 3 weeks, like… Yeah. You know, they’ve nerfed Sonnet to, I think, get more infrastructure space for Mythos and other things, so they have clearly detuned some of these models.
408 00:43:23.240 ⇒ 00:43:28.599 Jay Heavner: Which… That’s my thing, like… Just leave it alone.
409 00:43:28.850 ⇒ 00:43:35.950 Jay Heavner: If you want to deprecate it, deprecate it, but leave it alone. Give me something that I know exactly what it’s going to do every time I do it. Yeah.
410 00:43:35.950 ⇒ 00:43:49.030 Jay Heavner: I mean, honestly, Opus 4.7, it should have been a different model family or something. I mean, it’s too… it’s not a point release on 4.6, it’s a completely different monster. I think today, I’ve started to understand it a little better.
411 00:43:49.030 ⇒ 00:43:58.060 Jay Heavner: It’s GPT-5. Remember when it came out last August, and everyone was so excited, and it was so vastly different than whatever the… True, true, true.
412 00:43:59.390 ⇒ 00:44:06.220 Jay Heavner: But, like, also these nuances, I mean, to Duchenne’s point, like, these are nuances that, like, make a lot of sense in our world, but not to anybody else.
413 00:44:06.220 ⇒ 00:44:06.710 Uttam Kumaran: Yeah.
414 00:44:06.710 ⇒ 00:44:09.630 Jay Heavner: Oh, well, and I think that’s what we have to define. Well, then they’re gonna…
415 00:44:09.630 ⇒ 00:44:16.320 Uttam Kumaran: But it’s also, like, another thing to blame for potentially not having the output right. It’s like, oh, you choose the wrong model, or, like.
416 00:44:16.320 ⇒ 00:44:17.100 Jay Heavner: Mmm.
417 00:44:17.100 ⇒ 00:44:21.260 Uttam Kumaran: I think you kind of want to remove that, too, because otherwise, that’s going to come back to this group, too.
418 00:44:21.260 ⇒ 00:44:33.299 Jay Heavner: Well, and that’s a good point, because we will have staff who, like, oh, it doesn’t work, I, yeah, I tried, it failed. Opus was down. Yeah, yeah.
419 00:44:33.530 ⇒ 00:44:45.629 Jay Heavner: Fun fact on that note, several jobs ago, the person they hired to replace me one time said he couldn’t run a certain report on time because, I shit you not, VLOOKUP is down today.
420 00:44:46.100 ⇒ 00:44:46.989 Jay Heavner: Oh, well, that’s good.
421 00:44:46.990 ⇒ 00:44:47.610 Uttam Kumaran: Nice.
422 00:44:47.610 ⇒ 00:44:48.120 Jay Heavner: Hello?
423 00:44:48.120 ⇒ 00:44:50.079 Uttam Kumaran: That’s a great, great excuse.
424 00:44:50.080 ⇒ 00:44:58.619 Jay Heavner: Okay, okay, that’s next level. The best part was, I only found out about this because they believed him, and they told me, like, did you know VLOOKUP goes down sometimes?
425 00:44:58.650 ⇒ 00:45:15.420 Jay Heavner: It doesn’t. It doesn’t. It doesn’t. But I guess Eric had a nice day off, didn’t he? Right? Anyway, sorry. But no, that’s actually a good point, and I had not really thought about that, but like, yeah, the people that are gonna be like, I don’t know, I just couldn’t do it, needed a better model. Yeah. Yeah.
426 00:45:15.720 ⇒ 00:45:16.790 Jay Heavner: Interesting.
427 00:45:17.350 ⇒ 00:45:25.329 Jay Heavner: Interesting. Needed more memory, my laptop couldn’t handle the context. Yeah, yeah, my context window is full, because, you know, I,
428 00:45:25.810 ⇒ 00:45:28.559 Jay Heavner: Hey, before I forget, did you want to talk about that.
429 00:45:28.560 ⇒ 00:45:30.859 Uttam Kumaran: Oh, we did want to talk about Snowflake, yes.
430 00:45:30.990 ⇒ 00:45:31.570 Jay Heavner: Oh.
431 00:45:31.570 ⇒ 00:45:35.799 Uttam Kumaran: Snowflake San… the Snowflake Sandbox, we are…
432 00:45:36.300 ⇒ 00:45:48.589 Uttam Kumaran: we wanted to propose deprecating it from the Okta screen, and ultimately shutting it down if it’s not being used, just so folks aren’t confused as we start to provision access.
433 00:45:48.590 ⇒ 00:45:49.210 Jay Heavner: like…
434 00:45:49.740 ⇒ 00:45:57.510 Jay Heavner: I started to do something with it, but I didn’t. And I moved over to the production one because I needed the data that was there, so… I think it’s fine, I mean.
435 00:45:57.510 ⇒ 00:46:02.740 Uttam Kumaran: I’ll just look… I’ll look through it, and then I’m gonna tell Ian to move it off of Okta, and then…
436 00:46:02.920 ⇒ 00:46:06.000 Uttam Kumaran: I’ll… if there… if I don’t find anything, then I’ll.
437 00:46:06.000 ⇒ 00:46:06.340 Jay Heavner: I will.
438 00:46:06.340 ⇒ 00:46:08.449 Uttam Kumaran: I’m gonna reach out to support, basically, to, like.
439 00:46:09.490 ⇒ 00:46:12.579 Uttam Kumaran: bring it down, or just… I could leave it, but I don’t know.
440 00:46:12.580 ⇒ 00:46:31.720 Jay Heavner: I was gonna say, well, so, we’re actually… we’re… we’re doing the AWS migration anyway, and so we’re moving all the stuff out of the two accounts into the new structure, so, like, I do have it open with them to figure out, like, okay, how do we… for the one that we want to keep, how do we detach and reattach that Snowflake instance? But for the sandbox one, I mean, we purchased it through the marketplace, can’t we just go.
441 00:46:32.310 ⇒ 00:46:32.970 Uttam Kumaran: Oh, okay.
442 00:46:33.150 ⇒ 00:46:35.999 Jay Heavner: I think we can. I mean…
443 00:46:36.170 ⇒ 00:46:45.579 Jay Heavner: Like, I don’t think there’s anything in there. I think it was just, like, your SCIM and SAML, like, setup stuff. Yeah, I think that’s right.
444 00:46:46.270 ⇒ 00:46:49.770 Jay Heavner: I mean, you tell me, is there any reason to keep it around?
445 00:46:49.900 ⇒ 00:46:55.360 Jay Heavner: We can always stand up a new one if we need to. Yeah, exactly, that’s my thing, yeah, yeah, yeah. Yeah.
446 00:46:56.180 ⇒ 00:47:05.450 Uttam Kumaran: I’ll just scan what’s in there, and if I find anything, I’ll… I’ll let you know, and we can move it, and then… it’s just, like, one… it’s like a small housekeeping item, just so, like, so people aren’t confused what to click on.
447 00:47:05.650 ⇒ 00:47:12.000 Jay Heavner: Yeah. So, look, going back to the AI, are you finding that GenPop
448 00:47:12.590 ⇒ 00:47:20.089 Jay Heavner: is using AI tools effectively outside of just ChatGPT in a browser?
449 00:47:20.090 ⇒ 00:47:21.810 Uttam Kumaran: For… for business, or for, like…
450 00:47:21.810 ⇒ 00:47:23.689 Jay Heavner: Yeah, I mean…
451 00:47:24.290 ⇒ 00:47:25.250 Uttam Kumaran: No.
452 00:47:25.450 ⇒ 00:47:29.470 Uttam Kumaran: I think it is… it’s almost so weird, I think…
453 00:47:29.660 ⇒ 00:47:40.309 Uttam Kumaran: time has stopped for a lot of people. Four years on, we are still meeting people who are not on ChatGPT at their organization.
454 00:47:40.510 ⇒ 00:47:45.420 Uttam Kumaran: And then I’m… then I’m also having conversations with folks like y’all.
455 00:47:45.660 ⇒ 00:47:53.869 Uttam Kumaran: And it is very, very weird. I’m not seeing any indication that it’s, like, extremely widely adopted.
456 00:47:53.870 ⇒ 00:47:54.280 Jay Heavner: agree.
457 00:47:54.280 ⇒ 00:48:03.529 Uttam Kumaran: What we are seeing is that folks are… I think everybody’s talking about it, but whether folks have done anything since I’ve talked to them a year ago, or 2 years ago, or 3 years ago.
458 00:48:04.050 ⇒ 00:48:08.329 Uttam Kumaran: I don’t… I don’t know, it’s… we’re… we’re catching people… it’s always tough, it’s almost like…
459 00:48:08.610 ⇒ 00:48:16.450 Uttam Kumaran: we’re in the Skills world and Claude Code world, and then I’m able to go back and say, okay, maybe we can skip, like.
460 00:48:16.560 ⇒ 00:48:20.049 Uttam Kumaran: custom GPTs and, like, prompt engineering now.
461 00:48:20.570 ⇒ 00:48:24.929 Uttam Kumaran: So, like, what is the present… I almost told my team, like, what is a presentation we would have wanted to have
462 00:48:25.300 ⇒ 00:48:28.369 Uttam Kumaran: If we would have backed up 2 years ago, 3 years ago.
463 00:48:28.490 ⇒ 00:48:30.300 Uttam Kumaran: like, that’s where I’m kind of seeing.
464 00:48:30.300 ⇒ 00:48:40.209 Jay Heavner: But here’s the thing, we were told in Q1 of 2025, oh, prompt engineering’s dead. We will solve that problem this year. Guess what never happened, right? It never happened, so…
465 00:48:40.560 ⇒ 00:48:45.070 Jay Heavner: The promises keep being made that aren’t being fulfilled.
466 00:48:45.070 ⇒ 00:48:54.599 Uttam Kumaran: Yeah, it’s just the models, I think, are getting… I think what’s happened is, like, people are continuing to one-shot, and the model is, like, getting better at dealing with that.
467 00:48:55.160 ⇒ 00:49:00.220 Uttam Kumaran: You actually, like, that is… I think with great context and good skills.
468 00:49:00.730 ⇒ 00:49:03.899 Uttam Kumaran: You… and good integrations, you can use, like.
469 00:49:04.260 ⇒ 00:49:07.920 Uttam Kumaran: way dumber models. Like, that’s what we’re finding. And…
470 00:49:07.920 ⇒ 00:49:08.800 Jay Heavner: That’s actually…
471 00:49:08.800 ⇒ 00:49:26.500 Uttam Kumaran: the part that, like, is much more important, because then you slap Opus on, like, a great system, it’s, like, gonna… it’s gonna… it’s gonna annihilate, right? And so, you can assume also in, like, a year or two years, the pricing on Opus is gonna be a certain area versus other things. Everybody may have access to, like.
472 00:49:26.840 ⇒ 00:49:33.349 Uttam Kumaran: most open source. You assume the average token cost is going down, but it’s not like people are getting better at the…
473 00:49:33.530 ⇒ 00:49:35.450 Uttam Kumaran: inference, is what I’m seeing.
474 00:49:35.450 ⇒ 00:49:44.699 Jay Heavner: Well, no, I think that’s right. I like the point, though, with the right diligence and hygiene, you can use crappier models.
475 00:49:44.700 ⇒ 00:49:45.260 Uttam Kumaran: Yes.
476 00:49:45.260 ⇒ 00:49:46.880 Jay Heavner: the same results out of them.
477 00:49:47.110 ⇒ 00:49:57.400 Uttam Kumaran: Like, I told my team, I think even if… I think I said even at the beginning of the year, if model development stopped, we’re… we’re in the money, because we have, like, a really good context system, we have skills.
478 00:49:57.480 ⇒ 00:49:58.150 Jay Heavner: I don’t care.
479 00:49:58.150 ⇒ 00:50:09.150 Uttam Kumaran: even if it goes slow, and in fact, I just care that it’s accurate, or at least stops when it doesn’t know. That’s what I… that’s more of interest to me than get this done in, like, 9 seconds, like…
480 00:50:09.150 ⇒ 00:50:16.249 Jay Heavner: Yeah. The thing that I really want is, I want to be able to run Opus locally, and not have to fuck with every.
481 00:50:16.250 ⇒ 00:50:16.770 Uttam Kumaran: Yeah.
482 00:50:16.770 ⇒ 00:50:19.089 Jay Heavner: That day, you know, like…
483 00:50:19.090 ⇒ 00:50:27.479 Uttam Kumaran: Well, that’s the thing, that’s why, like, you see Qwen, the new Kimi model, like, those… I think you’re finding that those are… those are hitting 4.6…
484 00:50:27.700 ⇒ 00:50:35.509 Uttam Kumaran: Level, performance, and you can run smaller versions of those, you know, locally, and so…
485 00:50:35.760 ⇒ 00:50:41.770 Uttam Kumaran: I think, like, what we’re gonna find is, like, even if using some of those models open source hosted, you’re gonna have
486 00:50:42.290 ⇒ 00:50:44.359 Uttam Kumaran: success, so it’s… it’s…
487 00:50:44.740 ⇒ 00:50:59.299 Uttam Kumaran: Well, I think those reasoning models are just getting better at doing a lot with very little, and, like, that’s more of the optimization, like, build me, like, this big, big thing. It’s like, we’re not… that’s not the type of work that we’re doing, is like, build me a website. Who’s doing that?
488 00:50:59.300 ⇒ 00:50:59.840 Jay Heavner: Right.
489 00:50:59.840 ⇒ 00:51:10.490 Uttam Kumaran: You know, it’s like, I’m not optimizing for one-shotting, like, an entire app, I’m optimizing for, there’s a specific asset, here are the inputs, walk me through, have human-in-the-loop steps.
490 00:51:10.490 ⇒ 00:51:11.160 Jay Heavner: Whoa.
491 00:51:11.160 ⇒ 00:51:17.920 Uttam Kumaran: You know? Just, like, it’s just taking normal work, you know, and just, like, having AI fill in the pieces is more of, like, what we’re…
492 00:51:17.920 ⇒ 00:51:24.359 Jay Heavner: this with, like, small language models of, just give me an SLM that is trained on this very bespoke…
493 00:51:24.360 ⇒ 00:51:25.080 Uttam Kumaran: Yes.
494 00:51:25.080 ⇒ 00:51:25.540 Jay Heavner: You know?
495 00:51:25.540 ⇒ 00:51:25.900 Uttam Kumaran: Yeah.
496 00:51:25.900 ⇒ 00:51:27.990 Jay Heavner: to understand English.
497 00:51:27.990 ⇒ 00:51:28.510 Uttam Kumaran: Yes.
498 00:51:28.510 ⇒ 00:51:44.640 Jay Heavner: understand Python and a few other things, and I don’t need 14 billion training points. Right, right, right. Or let me go to the open source model and be like, this is the training I need, you build a model for me, and I’ll run it.
499 00:51:44.770 ⇒ 00:51:53.120 Jay Heavner: Which I think is, like, it gets back to what you were talking about, too, around, like, you know, where does the organization need our brains to achieve maximum benefit, and, like.
500 00:51:53.120 ⇒ 00:52:06.939 Jay Heavner: nobody else here is gonna have that sort of… right? Like, nobody else is gonna do that work, and so, like, if you’re bogged down hand-holding player coaching, right, you’re like, you’re not getting to do that piece of it, and so, I mean…
501 00:52:07.130 ⇒ 00:52:25.390 Jay Heavner: And again, it’s also, it’s the stuff that the, you know, the GenPop, the organization, like, tend to not understand the value of, right? Like, that sort of R&D and infrastructure stuff is just not well understood here, and so, like, I think making sure to be mindful of protecting your time and my time for some of that kind of stuff, like.
502 00:52:25.390 ⇒ 00:52:33.630 Jay Heavner: Because I do… I do think that is probably the direction to drive in. And we’re going to spend, I mean, we’re not that big, and we’re still going to spend a couple hundred grand in AI this year.
503 00:52:33.630 ⇒ 00:52:34.099 Uttam Kumaran: For sure.
504 00:52:34.100 ⇒ 00:52:34.900 Jay Heavner: Yeah, yeah.
505 00:52:34.900 ⇒ 00:52:35.430 Uttam Kumaran: For sure.
506 00:52:35.430 ⇒ 00:52:44.620 Jay Heavner: I mean, you know, my budget for Claude was, I think it was, like, $15,000 or whatever, because it was the number of users that we had approved at the time. I think we’ve already spent 30 plus, you know? Yeah.
507 00:52:44.620 ⇒ 00:52:45.060 Uttam Kumaran: Yeah.
508 00:52:45.060 ⇒ 00:52:45.990 Jay Heavner: people, I’ll start.
509 00:52:45.990 ⇒ 00:52:49.199 Uttam Kumaran: And that’s the thing, they’re not gonna release cheaper models.
510 00:52:49.200 ⇒ 00:52:50.429 Jay Heavner: Right. No. Right.
511 00:52:50.430 ⇒ 00:52:53.889 Uttam Kumaran: So, it’s like, and they’re gonna default to the…
512 00:52:53.890 ⇒ 00:52:54.250 Jay Heavner: That’s right.
513 00:52:54.250 ⇒ 00:53:00.470 Uttam Kumaran: That’s the… it’s… yeah, we’re seeing the same problem. So I… I try to fix that, like.
514 00:53:00.520 ⇒ 00:53:19.560 Uttam Kumaran: We don’t have people using more than, like, 5.2, because it’s just, like, it’s just, like, great, what do we… who’s, like, building, like, CAD modeling software at Brainforge? Like, we’re not doing that, you know? Okay, maybe Jay is doing that. Our stuff is, like, write a Google Doc, or, like, help me with this, prepare for a meeting. It’s still like that, you know?
515 00:53:19.560 ⇒ 00:53:24.660 Jay Heavner: I mean, honestly, I think, too, like, it becomes interesting to think about, like.
516 00:53:24.670 ⇒ 00:53:42.430 Jay Heavner: the same way that I’ve been sort of taking the advice I’ve seen going around of, like, well, not with 4.7 so much yet, but, like, with Opus 4.6, like, use Opus to, like, do all the brainstorming, the ideating, and the working through, and the, like, researching, and, like, figure out what you want, and then tell it to give you, you know, prompts for Sonnet.
517 00:53:42.440 ⇒ 00:53:58.209 Jay Heavner: And then go from there, and, like, I’ve seen a lot of, like, you know, benefit out of that, and so, like, if you think about that same approach scaling to the rest of the organization, not the Opus Sonnet thing, but the idea that, like, some work is done with bigger frontier models that are more expensive and newer, but then it’s handed off.
518 00:53:58.240 ⇒ 00:54:01.359 Jay Heavner: to the lighter models to execute, right? Like…
519 00:54:01.550 ⇒ 00:54:11.639 Jay Heavner: Oh, this is on my list. I’m going to build, like, a control structure where you talk to it in Haiku, but then as it identifies need, it moves you into agents, models.
520 00:54:11.640 ⇒ 00:54:13.309 Uttam Kumaran: Yeah, some more routing, more routing.
521 00:54:13.310 ⇒ 00:54:14.180 Jay Heavner: Right.
522 00:54:14.180 ⇒ 00:54:18.999 Uttam Kumaran: Or at the skill level, the skill itself, you can use, right, your orchestrator maybe is one model.
523 00:54:19.320 ⇒ 00:54:21.030 Uttam Kumaran: The skill itself is, like.
524 00:54:21.030 ⇒ 00:54:21.450 Jay Heavner: Yup.
525 00:54:21.450 ⇒ 00:54:22.970 Uttam Kumaran: The cheapest thing, you know?
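The routing pattern being discussed here, a cheap orchestrator model that classifies the request and hands each skill to the cheapest model that can handle it, could be sketched roughly as below. All model names and the keyword-based classifier are illustrative stand-ins, not anyone's actual setup:

```python
# Hypothetical sketch of tiered model routing: a cheap "orchestrator"
# classifies the request, and each skill is pinned to the cheapest
# model that can handle it. Names and the heuristic are illustrative.

SKILL_MODELS = {
    "chitchat": "haiku",        # cheap conversational tier
    "analysis": "sonnet",       # mid-tier reasoning
    "deep_refactor": "opus",    # frontier tier, used sparingly
}

def classify(request: str) -> str:
    """Stand-in for the orchestrator's intent classification."""
    text = request.lower()
    if "refactor" in text or "architecture" in text:
        return "deep_refactor"
    if "analyze" in text or "report" in text:
        return "analysis"
    return "chitchat"

def route(request: str) -> str:
    """Return the model a request should run on."""
    return SKILL_MODELS[classify(request)]
```

In practice the `classify` step would itself be a call to the small orchestrator model, with the expensive models reserved only for the skills that need them.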
526 00:54:22.970 ⇒ 00:54:36.290 Jay Heavner: Exactly, exactly. And I think… I’m surpr… I haven’t seen this yet, and I’m surprised that Claude Code hasn’t already done this. I understand what they’re doing. They’re chasing frontier models, they’re leaving that up to the community, but, like, it’s…
527 00:54:36.860 ⇒ 00:54:45.820 Jay Heavner: there’s so much they could do there with this. I mean… Well, and interestingly, so, like, my robot brain, like, that is… I haven’t yet had it start to do model selection, but, like…
528 00:54:46.130 ⇒ 00:54:56.580 Jay Heavner: it is kind of that approach of, like, an orchestrator takes in the initial sort of, like, okay, what is this chat going to be about? And then it decides, like, where should I route this in terms of, like, skill set and agent?
529 00:54:56.580 ⇒ 00:55:07.989 Jay Heavner: I will say one thing I have learned is that, when you’re working with the Bedrock models, there’s, like, whatever is Anthropic’s sort of, like, you know, baseline system prompt of, like, you know, some behavior, but…
530 00:55:07.990 ⇒ 00:55:18.919 Jay Heavner: the rest of that is, like, there’s a big difference between these models in Bedrock and these models in Claude.ai, that I’m, like… it’s not, like, limiting, but it is…
531 00:55:18.980 ⇒ 00:55:24.730 Jay Heavner: curious and frustrating. You’re saying the Bedrock ones are just too vanilla? Well, it’s like, they just have the…
532 00:55:24.730 ⇒ 00:55:28.230 Uttam Kumaran: No, we’re using Azure for some of them, too, and yeah, they’re like…
533 00:55:28.700 ⇒ 00:55:32.649 Uttam Kumaran: I don’t know, I think the setup is kind of tough, and yeah, they just seem, like, weaker.
534 00:55:32.650 ⇒ 00:55:33.060 Jay Heavner: I’m like…
535 00:55:33.060 ⇒ 00:55:33.880 Uttam Kumaran: The harness…
536 00:55:33.880 ⇒ 00:55:34.410 Jay Heavner: It’s like…
537 00:55:34.410 ⇒ 00:55:39.829 Uttam Kumaran: The harness, wherever they’re deployed, is different than if you were to get it directly from OpenAI.
538 00:55:39.830 ⇒ 00:55:40.800 Jay Heavner: Yeah.
539 00:55:41.170 ⇒ 00:55:47.839 Uttam Kumaran: So, I don’t know, and I’m noticing that too, but I don’t… like, again, these are, like, small… just, like, how do I prove this?
540 00:55:48.310 ⇒ 00:55:49.130 Uttam Kumaran: One night.
541 00:55:49.130 ⇒ 00:55:49.870 Jay Heavner: Anyway, anyway.
542 00:55:50.200 ⇒ 00:56:07.990 Jay Heavner: in the direction of is, like, what is then, therefore, my approach to writing system prompts, right? So, like, if I’m going to be using models out of Bedrock, then, like, what does Catherine want a system prompt to be like? And I think CTA might have a similar, like, what is the baseline system prompt CTA always wants in place?
543 00:56:07.990 ⇒ 00:56:08.760 Uttam Kumaran: Yes.
544 00:56:09.980 ⇒ 00:56:10.790 Uttam Kumaran: Yeah.
545 00:56:11.130 ⇒ 00:56:13.780 Jay Heavner: Which is kind of fun. That’s, yeah.
546 00:56:15.150 ⇒ 00:56:17.759 Jay Heavner: I mean, you kind of like the idea that you…
547 00:56:17.910 ⇒ 00:56:22.189 Jay Heavner: Have the power to beat the system off versus whatever it’s gonna throw at you.
548 00:56:22.190 ⇒ 00:56:22.960 Uttam Kumaran: Yes.
549 00:56:23.470 ⇒ 00:56:39.399 Jay Heavner: with great power comes great frustration. Exactly. Yeah. Exactly. And then I think, too, the other piece is, you know, what you’ve been kind of driving at lately, which is the context and memory management, right? Like… Yeah, that’s… but again, I keep waiting for this to be a solved problem. I know.
550 00:56:39.400 ⇒ 00:56:50.359 Uttam Kumaran: Yeah, like, well, I guess one thing, Jay, I don’t know if you’ve seen, like, compaction. Like, I’m using OpenCode for some stuff, and it auto-compacts the context kind of nicely, every so often, but…
551 00:56:50.360 ⇒ 00:56:50.750 Jay Heavner: Claude can…
552 00:56:50.750 ⇒ 00:56:51.069 Uttam Kumaran: I’m not…
553 00:56:51.070 ⇒ 00:56:51.720 Jay Heavner: …compaction.
554 00:56:51.720 ⇒ 00:56:59.680 Uttam Kumaran: Yeah, again, it’s probably something someone just made up, but what it does is, like, as you get to, like, 80-90% context, it’s sort of, like.
555 00:57:00.530 ⇒ 00:57:06.550 Uttam Kumaran: Dumps a bunch, creates a… creates, like, a more concise, like, system prompt, and then you can just keep going in a single session.
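The auto-compaction behavior just described (collapse older turns into a summary once context hits roughly 80–90% and keep going) might look something like this sketch; the token counter and `summarize` are crude stand-ins for a real tokenizer and a model call:

```python
# Sketch of auto-compaction: when a conversation approaches the
# context limit, older turns are collapsed into one summary turn and
# the session continues. `summarize` stands in for an LLM call.

CONTEXT_LIMIT = 1000          # tokens, illustrative
COMPACT_AT = 0.8              # compact at 80% of the limit

def token_count(turns):
    # crude stand-in: whitespace-separated tokens
    return sum(len(t.split()) for t in turns)

def summarize(turns):
    # stand-in for a model-generated summary of the old turns
    return "SUMMARY(%d turns)" % len(turns)

def maybe_compact(turns, keep_recent=2):
    """Collapse older turns into one summary once usage crosses the threshold."""
    if token_count(turns) < CONTEXT_LIMIT * COMPACT_AT:
        return turns
    old, recent = turns[:-keep_recent], turns[-keep_recent:]
    return [summarize(old)] + recent
```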
556 00:57:06.550 ⇒ 00:57:09.960 Jay Heavner: like, my current model is I’m using Serena.
557 00:57:10.200 ⇒ 00:57:18.119 Jay Heavner: And I try to keep my context so tight, and I start every conversation with a load, I end every conversation with a save.
558 00:57:18.120 ⇒ 00:57:18.630 Uttam Kumaran: Yeah.
559 00:57:18.630 ⇒ 00:57:27.350 Jay Heavner: But then, it’s still, like… you get 3 or 4 turns, and you’re off the rails, right? And that’s where I dis…
560 00:57:28.630 ⇒ 00:57:29.820 Jay Heavner: Yeah…
561 00:57:30.770 ⇒ 00:57:39.699 Jay Heavner: I know. Actually, interestingly, so one of the things… I don’t remember if I sent you guys the link to the report or not, but there was that linked intent design approach.
562 00:57:39.700 ⇒ 00:57:40.510 Uttam Kumaran: Yes.
563 00:57:40.510 ⇒ 00:57:57.529 Jay Heavner: Yeah, so yet another spec-driven, kind of, like, vibe coding approach. I put it in on Robot Brain, and, like, it’s a smaller repo, it’s, you know, more narrowly scoped, it really has improved the ability, like, to get more turns out of conversations and stuff, like.
564 00:57:57.890 ⇒ 00:58:06.719 Jay Heavner: On the other hand, it’s so fucking bloated now, that repo, that, like, when it gets lost, it is unrecoverably lost, which is kind of interesting.
565 00:58:06.720 ⇒ 00:58:07.110 Uttam Kumaran: Fair.
566 00:58:07.110 ⇒ 00:58:20.579 Jay Heavner: It starts to just, like, hallucinate connections, but as long as I can keep it, like, running nicely, it’s beneficial, but it deteriorates really fast. Yeah. And also, I think my CI pipeline takes, like, 5 minutes now.
567 00:58:21.220 ⇒ 00:58:23.069 Jay Heavner: Are you using Caveman yet?
568 00:58:23.240 ⇒ 00:58:32.019 Jay Heavner: Yeah, no, that’s right, you mentioned it. I do need… actually, I need to look at cases. It is funny, because it’s gotten super popular, and there are now variations of it, where it’s like.
569 00:58:32.020 ⇒ 00:58:34.070 Uttam Kumaran: Yeah, that’s what I… I haven’t been using it either, but…
570 00:58:34.070 ⇒ 00:58:39.879 Jay Heavner: Nouns, verbs, and direct objects is all you’re allowed, and it’s… yeah. Very German. I don’t know.
571 00:58:40.640 ⇒ 00:58:48.900 Jay Heavner: I’m like… Little words aren’t the biggest source of your token thing. That’s… cool. Yeah.
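As a rough illustration of the "caveman" style being joked about, stripping prompts down to nouns, verbs, and direct objects to save tokens, one hypothetical version (the stop-word list is purely illustrative):

```python
# Hedged sketch of "caveman" prompt compression: drop articles and
# other filler so prompts keep mostly nouns, verbs, and direct
# objects. The filler set is illustrative, not from any real tool.

FILLER = {"a", "an", "the", "please", "very", "really", "just",
          "that", "of", "to", "is", "are"}

def caveman(prompt: str) -> str:
    """Drop filler words to cut token count while keeping intent."""
    kept = [w for w in prompt.split() if w.lower() not in FILLER]
    return " ".join(kept)
```

As Jay notes, little words are rarely the biggest token cost, so this buys less than trimming bloated context does; it is mostly a style discipline.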
572 00:58:50.030 ⇒ 00:58:50.980 Jay Heavner: Alright.
573 00:58:50.980 ⇒ 00:58:59.319 Uttam Kumaran: Okay, cool, so I guess, like, I’ll… on Snowflake’s side, I’ll execute that. But yeah, I don’t know, like, do we want to try to do another…
574 00:58:59.320 ⇒ 00:59:00.569 Jay Heavner: Yeah. Session here.
575 00:59:00.570 ⇒ 00:59:03.659 Uttam Kumaran: on, like, that 3-6 month roadmap kind of plan?
576 00:59:03.660 ⇒ 00:59:12.060 Jay Heavner: I think so, too. Like, let’s, let me ask you this. If we did a workshop, would it be useful to bring these guys in? Because… Yeah, yeah.
577 00:59:12.060 ⇒ 00:59:17.990 Uttam Kumaran: Well, I was also kind of thinking about coming anyways May 12th, 13th. Okay.
578 00:59:17.990 ⇒ 00:59:19.890 Jay Heavner: yeah,
579 00:59:19.890 ⇒ 00:59:21.889 Uttam Kumaran: So, we can aim for that week.
580 00:59:22.500 ⇒ 00:59:35.669 Jay Heavner: Similarly, so actually, I also, I wanted to take advantage of the way this organization misfunctions sometimes, and I put in a goal for Kyle to do a workshop with at least 15 senior leaders at the organization.
581 00:59:35.670 ⇒ 00:59:36.010 Uttam Kumaran: Yeah.
582 00:59:36.010 ⇒ 00:59:47.659 Jay Heavner: Oh, I’m really good. I think, yeah, I picked the number. But yeah, so, like, I think workshops, absolutely, like, I think what I would like the three of us to maybe next time we…
583 00:59:47.660 ⇒ 00:59:57.970 Jay Heavner: check-in is, like, the infrastructure-type piece, like, are there things… because we’re, like, we have AWS ProServe working with us right now, so, like, if there are things we want to lean on them to put into.
584 00:59:57.970 ⇒ 00:59:58.360 Uttam Kumaran: Yeah.
585 00:59:58.360 ⇒ 01:00:00.890 Jay Heavner: new accounts, or at least, or explain to us how.
586 01:00:00.890 ⇒ 01:00:05.650 Uttam Kumaran: Yeah, I think the AI, like, just making sure what our models are, and like…
587 01:00:05.650 ⇒ 01:00:06.010 Jay Heavner: Yeah.
588 01:00:06.010 ⇒ 01:00:10.679 Uttam Kumaran: what… even for them to tell us, like, yeah, how Bedrock is handling, like.
589 01:00:10.780 ⇒ 01:00:18.010 Uttam Kumaran: system prompts, rate limiting, skill repository, like, I would love to… hammer them with those questions.
590 01:00:18.010 ⇒ 01:00:37.610 Jay Heavner: anything we want, they’ll drag in the expert for it, right, and put them in front of us, and let us talk to them, and like, I mean, it’s, you know, it’s free in the scope of the contract, so… The thing that I want more than anything is just an up-to-date view, within the last 72 hours, of not the model scoring, but the best model for each use case.
591 01:00:37.610 ⇒ 01:00:38.360 Uttam Kumaran: Yeah, yeah.
592 01:00:38.360 ⇒ 01:00:42.830 Jay Heavner: Like, or rank them for me ordinally, because…
593 01:00:42.970 ⇒ 01:00:46.560 Jay Heavner: you know, you’re talking to someone about, I’m trying to do this.
594 01:00:46.730 ⇒ 01:00:49.869 Jay Heavner: What’s the best model for that? You know?
595 01:00:50.220 ⇒ 01:00:54.080 Jay Heavner: I’m writing code, great. I know I’m going to use Anthropic to write code.
596 01:00:54.080 ⇒ 01:00:55.090 Uttam Kumaran: Yeah, yeah.
597 01:00:55.450 ⇒ 01:01:11.159 Jay Heavner: doing… if I’m doing document extraction, what’s the best model for that, right? Right, right, right. Like, I’ve learned that Cohere version 4 is great for embeddings, right? See, and that’s… yeah, that’s the kind of thing I’m looking for.
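The "best model for each use case" lookup Jay is asking for could start as nothing fancier than a small curated registry, re-ranked as models change. Every entry below is a placeholder example for illustration, not a recommendation (only the Cohere embeddings note comes from the conversation itself):

```python
# Illustrative sketch of a curated best-model-per-use-case registry.
# Entries are ranked lists so "rank them for me ordinally" is just the
# list order; all names here are example placeholders.

REGISTRY = {
    "code":       ["anthropic-coding-model", "fallback-coder"],
    "embeddings": ["cohere-embed-v4", "open-source-embedder"],
    "extraction": ["doc-extraction-model"],
}

def best_model(use_case: str) -> str:
    """Return the top-ranked model for a use case, or raise if unknown."""
    try:
        return REGISTRY[use_case][0]
    except KeyError:
        raise ValueError(f"no ranking curated for {use_case!r}")
```

The point of the sketch is the curation process, not the data structure: someone has to refresh the rankings every few days for the lookup to stay within that 72-hour window.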
598 01:01:11.290 ⇒ 01:01:13.439 Jay Heavner: And as things change, like.
599 01:01:13.650 ⇒ 01:01:25.650 Jay Heavner: you know, that thing you sent me on 4.7, I found incredibly helpful, because, like, I hadn’t read anything about it, I didn’t know what the differences between that and 4.6 were, so understanding the approach to it
600 01:01:25.960 ⇒ 01:01:40.640 Jay Heavner: is wildly beneficial, because you take what you know about Opus 4.6, and you apply it, you’re frustrated. The same thing going from ChatGPT 4 to 5 is, like, it doesn’t work the same way anymore. Well, they’ve changed the fucking model, I mean, yeah.
601 01:01:40.640 ⇒ 01:01:41.290 Uttam Kumaran: Yeah.
602 01:01:41.600 ⇒ 01:02:00.759 Jay Heavner: Right, like, that guidance. And then I think, too, once we get a handle on how we curate that information, and it becomes a skill, and it becomes a thing that then other people can, yeah, tap into… But I kind of agree with you. I think we’re very close to being able to say, you always have people playing with frontier models.
603 01:02:01.260 ⇒ 01:02:01.950 Jay Heavner: But…
604 01:02:02.940 ⇒ 01:02:23.739 Jay Heavner: We’re rapidly approaching a point where you can stick with a model for quite a while, I think. Yeah, like, to your point, I could hang out with Opus 4.6 for another year, and I don’t think I’d be disappointed. As long as they don’t nerf it, yeah, absolutely. Right, right. Yeah, yeah. Like, if it’s running well, and it’s not overloaded, and they’re not messing with whatever experiment of the day on pricing.
605 01:02:24.610 ⇒ 01:02:25.350 Uttam Kumaran: Yes.
606 01:02:25.490 ⇒ 01:02:31.210 Jay Heavner: It’s… we are there. We’re very close to being there. And for most of these people.
607 01:02:31.320 ⇒ 01:02:48.269 Jay Heavner: we were there a year ago. Right. I mean, they could… most of our use cases could run GPT-4o, I think, and be perfectly happy with that. Which I think actually is another sort of lens I’ve been increasingly thinking through for our stuff, is like, as we build things.
608 01:02:48.280 ⇒ 01:03:13.220 Jay Heavner: can we go back and then, yeah, like, rebuild, or reassign, or, you know, whatever, like, slot in a cheaper, like, or older model, or whatever, like, build it with the nice stuff, and then maintain it with the cheaper stuff. Yeah. The same, like, even with the, like, Fivetran connections I’ve set up, like, that’s a great way to, like, buy speed, but Fivetran’s expensive, and I can totally use AI to recreate those integrations on a longer timeline, while Fivetran handles it for
609 01:03:13.220 ⇒ 01:03:14.249 Jay Heavner: now, you know?
610 01:03:14.250 ⇒ 01:03:26.900 Jay Heavner: So it’s like, how can we kind of use it as the, like, you know, rapid groundwork laying, but then also have processes that are cost-effective to come behind and put in, you know, more long-term or cheaper solutions? I don’t know.
611 01:03:28.190 ⇒ 01:03:30.829 Jay Heavner: Or even time of day, because apparently if you run things…
612 01:03:31.110 ⇒ 01:03:35.729 Jay Heavner: when the sun’s over the Pacific, it gets really cheap. Yeah, yeah, yeah.
613 01:03:36.470 ⇒ 01:03:37.270 Jay Heavner: Yeah.
614 01:03:37.430 ⇒ 01:03:43.579 Jay Heavner: I mean, yeah, or you get, like, spring break week, and you’re like, oh, nobody seems to be working, it’s great, I can actually get things done. Yeah.
615 01:03:44.530 ⇒ 01:03:54.830 Jay Heavner: I mean, Sunday mornings. Oh, okay, sorry, this is entirely random, but parking it as another thing I think the three of us should work on together. The classification…
616 01:03:54.920 ⇒ 01:04:05.749 Jay Heavner: stuff, right? So whether it’s documents or use cases or whatever, but, like, we keep circling around this idea of, like, agentic classification of things, and I think that would be another cool one for us to build.
617 01:04:05.950 ⇒ 01:04:11.630 Uttam Kumaran: Yeah, like, on our side, for Linear tickets, we have a script that goes and classifies the ticket to, like.
618 01:04:11.920 ⇒ 01:04:13.759 Uttam Kumaran: That type of work output.
619 01:04:13.870 ⇒ 01:04:19.029 Uttam Kumaran: Is it a dbt thing? Is it a snow… is it a data engineering task? Is it an analysis task?
620 01:04:19.220 ⇒ 01:04:24.410 Uttam Kumaran: So that’s just on the Linear side, but we use, like, 4o Mini.
621 01:04:24.590 ⇒ 01:04:24.979 Jay Heavner: I’m like.
622 01:04:24.980 ⇒ 01:04:34.770 Uttam Kumaran: do that, and it’s, like, super easy. It’s, like, pennies to just, like… or, like, even, like, fractions of fractions of pennies to just be like, this is this, this is this, this is this, you know?
623 01:04:34.770 ⇒ 01:04:35.630 Jay Heavner: Yeah.
624 01:04:35.630 ⇒ 01:04:39.060 Uttam Kumaran: Not really an embedding pipeline, per se, but it is, like, a
625 01:04:39.380 ⇒ 01:04:43.710 Uttam Kumaran: Classification for, like, based on metadata or whatever.
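A minimal sketch of the cheap-model ticket classifier Uttam describes: metadata in, one label from a fixed set out. The model call is stubbed so the sketch runs standalone, and the label set, prompt, and keyword fallback are all illustrative, not the real script:

```python
# Sketch of cheap-model ticket classification: hit a small/cheap model
# with the ticket metadata and a constrained label set. The `llm`
# argument is a stand-in for a real model client; a keyword fallback
# keeps the sketch runnable without one.

LABELS = ["dbt", "data-engineering", "analysis"]

def build_prompt(ticket_title: str) -> str:
    """Constrain the model to exactly one known label."""
    return (
        "Classify this ticket into exactly one of "
        f"{LABELS}.\nTicket: {ticket_title}\nLabel:"
    )

def classify_ticket(ticket_title: str, llm=None) -> str:
    """Classify a ticket by title; `llm` stands in for a cheap-model call."""
    if llm is None:
        # illustrative keyword fallback, not the production path
        t = ticket_title.lower()
        if "dbt" in t:
            return "dbt"
        if "pipeline" in t or "ingest" in t:
            return "data-engineering"
        return "analysis"
    return llm(build_prompt(ticket_title)).strip()
```

With a constrained label set and short metadata, each call is a handful of tokens, which is why the per-ticket cost lands at fractions of a penny.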
626 01:04:43.890 ⇒ 01:04:55.009 Jay Heavner: Yeah, because I think if we could get something like that working in decent order, then when we get to the, like, Box migrations and stuff like that, like, you know, now we’ve got patterns we can scale, and yeah.
627 01:04:55.820 ⇒ 01:04:56.780 Jay Heavner: Yeah.
628 01:04:57.630 ⇒ 01:04:59.070 Jay Heavner: We’re vectorizing everything.
629 01:04:59.260 ⇒ 01:05:00.940 Jay Heavner: Bye.
630 01:05:02.160 ⇒ 01:05:13.740 Jay Heavner: market research is, like, because I’m… I told them, like, I have access to AWS, and, like, you know, I can give you Bedrock, and S3, and, Chris, like, maybe, like, a vector database, and, like.
631 01:05:13.740 ⇒ 01:05:31.170 Jay Heavner: I did tell him, I was like, I just, like, you need… I was like, just use the S3 vector buckets. It’s so much cheaper, you get all the stuff, like, it does everything that you need. Like, that’s how I use it with Robot Brain, like, all that OpenSearch stuff that turns out to be… Oh, yeah. That shit.
632 01:05:31.170 ⇒ 01:05:31.550 Uttam Kumaran: Yeah.
633 01:05:31.870 ⇒ 01:05:48.329 Jay Heavner: Yeah, so, like, S3 vectors, and then now we’ve got S3 files, which I still have yet to really do any exploring around. But yeah, I mean, I think, yeah, like, Chris is… he’s… he’s great for, like, pushing, pushing, pushing, pushing. I wonder if…
634 01:05:48.880 ⇒ 01:05:50.520 Jay Heavner: Why is he in market research?
635 01:05:51.120 ⇒ 01:06:09.449 Jay Heavner: Steve Koenig got him as headcount to be a floater amongst the four teams in MRD, generally speaking, and none of the teams want him because he has big, loud ideas, and they have small understandings. And so now he’s just a very frustrated person who
636 01:06:09.450 ⇒ 01:06:23.240 Jay Heavner: basically blew up at HR the other day, and, like, basically, like, the whole, like, nailed the 95 theses to the door about, like, this whole department bullshit. He’s also one of the people who saw his promotion in Glean and then didn’t get it.
637 01:06:23.950 ⇒ 01:06:24.740 Jay Heavner: Oh.
638 01:06:25.270 ⇒ 01:06:25.880 Uttam Kumaran: Oh.
639 01:06:25.880 ⇒ 01:06:27.019 Jay Heavner: Oh.
640 01:06:27.020 ⇒ 01:06:27.600 Uttam Kumaran: Nice.
641 01:06:27.600 ⇒ 01:06:32.609 Jay Heavner: Yeah, I’m kind of like, I’m gonna feed that fire. Go grind that axe, Chris.
642 01:06:36.120 ⇒ 01:06:36.890 Jay Heavner: Yeah.
643 01:06:37.240 ⇒ 01:06:37.970 Jay Heavner: Yeah.
644 01:06:38.620 ⇒ 01:06:47.430 Jay Heavner: So, you know, he did ask if he could apply to be the integrations engineer for the CES stuff.
645 01:06:47.430 ⇒ 01:06:50.420 Uttam Kumaran: Motivation’s a big component, but, you know, so…
646 01:06:50.420 ⇒ 01:06:51.020 Jay Heavner: Amen?
647 01:06:51.020 ⇒ 01:06:52.680 Uttam Kumaran: Love motivated folks.
648 01:06:52.680 ⇒ 01:07:01.039 Jay Heavner: and he’s nice, he’s a nice guy. He’s just, he’s very passionate at the moment, and I don’t mind it. I don’t mind it one bit.
649 01:07:01.290 ⇒ 01:07:05.020 Jay Heavner: No, I need to… I need to talk with him more, cause, like… Yeah.
650 01:07:05.440 ⇒ 01:07:06.540 Jay Heavner: He clearly…
651 01:07:06.730 ⇒ 01:07:19.040 Jay Heavner: has skills. He’s a C#, like, .NET guy. Yeah, that’s what you said. Why is he in market research, then? He got laid off from a market research firm. He was, like, a developer, but he was a…
652 01:07:19.060 ⇒ 01:07:36.859 Jay Heavner: Yeah, yeah, yeah, yeah. Like, the first encounter I had with him was I, like, bumbled into a meeting where he had wired up something with, like, C# and SharePoint and some godforsaken, like, set of things, and I was like, there has to be a better way to do this, but I’m very impressed that you did this. I just…
653 01:07:36.860 ⇒ 01:07:46.479 Jay Heavner: It’s gotta be a better way. But yeah, no, he’s… he’s smart. He’s smart. He is smart, and he is underutilized, and he will tell anybody who stands still that he is underutilized.
654 01:07:50.050 ⇒ 01:07:50.780 Jay Heavner: Yeah.
655 01:07:51.200 ⇒ 01:07:54.240 Jay Heavner: okay, cool.
656 01:07:54.390 ⇒ 01:08:09.680 Jay Heavner: Alright. And another one who wants to, I don’t think he’ll be quite the same dynamite that Chris is, but apparently Eamon is champing at the bit to start automating some stuff, because he’s had some heavy, like, Power Automate usage in the past, and I’m like, hey, they’re very different things, but…
657 01:08:09.680 ⇒ 01:08:14.920 Jay Heavner: If he at least has the ability to think in systems, we can give him better tools. You know what?
658 01:08:14.920 ⇒ 01:08:18.010 Jay Heavner: Use that Zoom thing I gave you, and then you can do more.
659 01:08:18.170 ⇒ 01:08:23.120 Jay Heavner: For their meeting minutes, he’s the one I’ll give the license to, and I’m like, just…
660 01:08:23.310 ⇒ 01:08:26.249 Jay Heavner: Although that department’s… they don’t wanna…
661 01:08:26.910 ⇒ 01:08:30.410 Jay Heavner: Yeah. They, they are what they are. It’s gonna change fast.
662 01:08:30.520 ⇒ 01:08:34.219 Jay Heavner: I hope things are going to change fast.
663 01:08:35.460 ⇒ 01:08:44.339 Jay Heavner: Anyway, I know we’re over time a little bit, so… Yeah. Alright, well, Uttam, thank you for the call, man. But let’s do this, like, maybe with some cadence to it. The three of us just periodically meet.
664 01:08:44.830 ⇒ 01:08:51.489 Uttam Kumaran: Yeah, and I’m gonna plan to be there that week, so maybe we, like, kind of work backwards from something.
665 01:08:51.490 ⇒ 01:08:52.240 Jay Heavner: Yeah. Yeah.
666 01:08:52.240 ⇒ 01:08:53.929 Uttam Kumaran: May 12th, or is that, like, too early?
667 01:08:53.930 ⇒ 01:09:06.909 Jay Heavner: What is that meeting? The May 12th is the next meeting with the CES leadership, yeah. CES leadership. Yeah. And then, just coincidentally, he’s in town. Okay. Yeah, so yeah, I mean, that… that date might be…
668 01:09:07.080 ⇒ 01:09:13.480 Jay Heavner: Maybe that’s… It might be a little too soon for, like, a full-on workshop to get planned, but… I don’t know.
669 01:09:13.930 ⇒ 01:09:16.889 Jay Heavner: I mean, it’s, like, 3 weeks away. You can do it.
670 01:09:17.760 ⇒ 01:09:19.129 Jay Heavner: Yeah. Maybe.
671 01:09:19.180 ⇒ 01:09:31.149 Jay Heavner: I just don’t know if, like, people’s calendars and… Oh, I don’t care about that, we’re gonna make them good, yeah. Well, they’re fine. I like it. But if that’s also the week that all the adults are off in a strategic brainstorm, then maybe that’s the perfect time. No, it’s after they get back.
672 01:09:31.149 ⇒ 01:09:46.670 Jay Heavner: Right, yeah, I mean, that’s the thing, I’m actually like, oh god, yeah, that brainstorm is the week before. I feel like we’re gonna come, you know, ready to do one thing at that CES leadership meeting, and they’re gonna be like, oh, here is an entire deluge of new ideas. Great.
673 01:09:46.790 ⇒ 01:09:48.250 Jay Heavner: Ideas. Yeah.
674 01:09:48.670 ⇒ 01:09:56.350 Jay Heavner: I’m excited about our ideas. They’re gonna be… Yeah.
675 01:09:58.230 ⇒ 01:09:59.020 Jay Heavner: Okay.
676 01:09:59.130 ⇒ 01:09:59.739 Jay Heavner: We’re getting that.
677 01:09:59.740 ⇒ 01:10:04.339 Uttam Kumaran: So should we maybe do, like, a weekly until that week? Yeah. I mean, we don’t have much time, but…
678 01:10:04.340 ⇒ 01:10:05.250 Jay Heavner: Sounds good, yeah.
679 01:10:05.250 ⇒ 01:10:09.400 Uttam Kumaran: Yeah, let’s do that, and then… I’m all ears for…
680 01:10:09.520 ⇒ 01:10:13.259 Uttam Kumaran: Any ideas, I’m doing… thinking about this 24-7, so…
681 01:10:13.260 ⇒ 01:10:13.760 Jay Heavner: Yeah.
682 01:10:13.760 ⇒ 01:10:14.830 Uttam Kumaran: Yeah. Same.
683 01:10:14.830 ⇒ 01:10:16.020 Jay Heavner: That’s what it feels like, yeah.
684 01:10:16.020 ⇒ 01:10:20.979 Uttam Kumaran: Yeah, that’s… it feels like not enough waking… not enough waking hours.
685 01:10:22.200 ⇒ 01:10:25.810 Uttam Kumaran: Not enough, not enough terminal screens right now.
686 01:10:27.440 ⇒ 01:10:31.480 Jay Heavner: I am still a huge asker, that is my, that’s my style, but
687 01:10:31.480 ⇒ 01:10:49.869 Jay Heavner: No, it’s like, I think I… like, I don’t have enough cognitive ability to keep up with my own curiosity, right? Like, I get too tired to learn new things by the end of the day. I’m like, no, I just… I drool for a minute, actually. My problem is, I just… I just get to the point where I’m just swearing at things. I’ve lost all ability to be rational.
688 01:10:50.180 ⇒ 01:10:58.160 Jay Heavner: Different, different approaches at the same end, you know? Right, right, right, right, right. He didn’t drool loudly. All the words.
689 01:10:59.020 ⇒ 01:11:03.930 Jay Heavner: Cool. Alright, man. Yeah, let’s… let’s do that. I like it.
690 01:11:04.090 ⇒ 01:11:04.820 Uttam Kumaran: Okay, perfect.
691 01:11:04.820 ⇒ 01:11:05.820 Jay Heavner: Right? Alright.
692 01:11:06.070 ⇒ 01:11:07.610 Jay Heavner: Talk to you soon.