Meeting Title: Brainforge x CTA: Weekly! Date: 2026-04-17 Meeting participants: Ashwini Sharma, Amber Lin, Katherine Bayless, Uttam Kumaran, Chi Quinn
WEBVTT
1 00:01:42.040 ⇒ 00:01:43.950 Katherine Bayless: Hello! Hello, hello.
2 00:01:44.530 ⇒ 00:01:45.310 Katherine Bayless: Oop.
3 00:01:46.520 ⇒ 00:01:47.040 Katherine Bayless: Man.
4 00:01:47.040 ⇒ 00:01:47.920 Chi Quinn: Hello!
5 00:01:51.630 ⇒ 00:01:55.459 Katherine Bayless: Totally went to put my desk down and forgot my chair was under it. Hang on.
6 00:02:04.610 ⇒ 00:02:07.560 Katherine Bayless: Okay… There we go.
7 00:02:08.650 ⇒ 00:02:10.189 Katherine Bayless: Happy Friday, folks!
8 00:02:12.000 ⇒ 00:02:13.200 Chi Quinn: Happy Friday!
9 00:02:19.930 ⇒ 00:02:20.870 Katherine Bayless: Okay.
10 00:02:21.590 ⇒ 00:02:32.560 Katherine Bayless: I’m gonna be totally honest, I am quite fried, so, we’ll have to be patient with my brain today. It’s been a… it’s been a good week, it’s been busy. It’s been very busy.
11 00:02:37.150 ⇒ 00:02:44.800 Chi Quinn: Yes. Oh, yeah, it’s the end of the week. Yeah, so I feel like the end of the week, it’s usually… that’s the state of our minds at the…
12 00:02:45.450 ⇒ 00:02:46.690 Katherine Bayless: Exactly. Yeah.
13 00:02:46.780 ⇒ 00:02:52.160 Uttam Kumaran: I think most of it, I just wanted to hand to Amber today, really, I think just to do some show and tell. I feel like that’ll.
14 00:02:52.160 ⇒ 00:02:52.790 Amber Lin: Huh.
15 00:02:53.070 ⇒ 00:02:55.309 Uttam Kumaran: Great discussion. I think, Amber, you and
16 00:02:55.420 ⇒ 00:02:58.539 Uttam Kumaran: Chi, I’ve been working closely on stuff in Cortex.
17 00:02:58.820 ⇒ 00:03:03.490 Uttam Kumaran: So, I think that could be a good use of time today, just to talk about progress there.
18 00:03:05.790 ⇒ 00:03:07.290 Katherine Bayless: I’d love that, honestly.
19 00:03:07.730 ⇒ 00:03:25.979 Amber Lin: Cool. We did meet on Wednesday to go through Cortex, and just to go through the semant- how the semantic view is created. I think since then, I’ve been able to set up some new stuff, so I think,
20 00:03:26.060 ⇒ 00:03:36.699 Amber Lin: Since, Katherine, you’re here, I’m gonna walk through the semantic view just really briefly, and then I’ll go to what… what has changed and what I’m able to do after that.
21 00:03:37.350 ⇒ 00:03:43.390 Amber Lin: So… Let me share screen.
22 00:03:44.210 ⇒ 00:04:02.350 Amber Lin: So, in here, I was able to create the semantic view under… so we… if we go to AI, and then we go to Analyst, I’ve already selected the folder that I’m in, so I would… this is the one that…
23 00:04:02.480 ⇒ 00:04:05.240 Amber Lin: I’m using right now.
24 00:04:05.820 ⇒ 00:04:25.300 Amber Lin: So this is what the semantic view will look like. These are just some AI contexts that I have been adding, and it includes the tables. So this is mostly for attendance and registration. So this has the dim tables and that one fact
25 00:04:25.730 ⇒ 00:04:34.700 Amber Lin: CES table. And then from that, we have some relationships between tables, and then…
26 00:04:34.760 ⇒ 00:04:45.890 Amber Lin: We are going to add some verified queries, which means, the SQL that we are sure will produce the right results, and we can add these over here.
27 00:04:46.450 ⇒ 00:05:00.329 Amber Lin: What I showed on Wednesday is, one, I created this with Cursor, so I was able to write the YAML in Cursor using context with access to our models.
28 00:05:00.330 ⇒ 00:05:10.340 Amber Lin: Access to, past calls, the audit doc, but another way we can create this is directly within
29 00:05:10.640 ⇒ 00:05:18.509 Amber Lin: within the UI here. And it’s really, really easy, and I think it’s very straightforward, so if you want to experiment.
30 00:05:18.630 ⇒ 00:05:30.859 Amber Lin: That’s a great way to figure out how, the semantic view… what it is. So, once you get… let’s… let’s start from the beginning here. Once you get to
31 00:05:30.980 ⇒ 00:05:32.110 Amber Lin: analyst.
32 00:05:32.550 ⇒ 00:05:41.190 Amber Lin: you can, let’s… you’ll see this view. So, we can just go directly to Create with Autopilot.
33 00:05:41.370 ⇒ 00:05:48.749 Amber Lin: You can give it SQL queries, but if you don’t have any, you can just skip this, and then we can go
34 00:05:49.200 ⇒ 00:06:00.899 Amber Lin: Enter name, I’ll call it Test3, and then you can select the database that you want it to be in. So in this case, I’m putting it in my folder.
35 00:06:01.010 ⇒ 00:06:08.310 Amber Lin: I think I need to be in the right… Permissions, and then…
36 00:06:08.840 ⇒ 00:06:16.840 Amber Lin: you can select the right tables, and I’ll be in prod marts. I’m using this star
37 00:06:17.000 ⇒ 00:06:23.030 Amber Lin: folder, and I’ll just select these two tables for now.
38 00:06:23.290 ⇒ 00:06:29.310 Amber Lin: And then, it will prompt me to select the fields. I would just select everything, and…
39 00:06:29.390 ⇒ 00:06:43.809 Amber Lin: This autopilot, it will add some sample values, it will add some descriptions that will help the AI. So I’m gonna select these, tap Create, and then it’s just gonna run through and create these
40 00:06:44.130 ⇒ 00:06:46.319 Amber Lin: And you see we have these tables.
41 00:06:46.490 ⇒ 00:06:50.960 Amber Lin: And then there’s a relationship that joins them together.
42 00:06:51.610 ⇒ 00:06:59.449 Amber Lin: And then on the right-hand side is where it’s really nice to have these suggestions of, okay, given these…
43 00:06:59.620 ⇒ 00:07:16.509 Amber Lin: dimensions, what are some likely metrics we calculate from these dimensions? What are some, maybe, filters that we can use? And if you want to go deeper, you can click on each of these, and
44 00:07:16.560 ⇒ 00:07:27.130 Amber Lin: test, clicking the plus button to see what Snowflake defines as a metric, what they define as a fact or as a dimension.
45 00:07:27.960 ⇒ 00:07:36.099 Amber Lin: So, that’s a super quick walkthrough of, what… how to create a semantic view if you’re interested in testing.
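[Editor's note: for readers following along, the semantic view Amber walks through boils down to a YAML file. The sketch below is a hypothetical orientation aid only — every database, table, column, and query name is made up, and exact keys vary by Snowflake release.]

```yaml
# Illustrative semantic-view YAML shape: tables, relationships,
# and verified queries, per the walkthrough above. All names hypothetical.
name: attendance_registration
tables:
  - name: fact_ces_attendance
    base_table:
      database: DEV_MARTS
      schema: CES
      table: FACT_CES_ATTENDANCE
    dimensions:
      - name: attendee_country
        expr: ATTENDEE_COUNTRY
        data_type: varchar
        description: Country from the attendee's registration address.
    facts:
      - name: confirmed_flag
        expr: CONFIRMED_FLAG
        data_type: number
relationships:
  - name: attendance_to_registration
    left_table: fact_ces_attendance
    right_table: dim_registration
    relationship_columns:
      - left_column: REGISTRATION_ID
        right_column: REGISTRATION_ID
verified_queries:
  - name: total_confirmed_attendance
    question: What was total confirmed attendance for CES 2026?
    sql: SELECT COUNT(*) FROM fact_ces_attendance WHERE confirmed_flag = 1
```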
46 00:07:36.210 ⇒ 00:07:40.169 Amber Lin: Other than that, I wanted… what I wanted to show you was…
47 00:07:40.300 ⇒ 00:07:45.530 Amber Lin: One, the golden dataset that I was able to create
48 00:07:45.980 ⇒ 00:07:58.090 Amber Lin: using the audit, the pre-audit doc, which I know is our golden standard, I think we’re… we had a very good start because we already had verified, answers to these questions.
49 00:07:58.090 ⇒ 00:08:07.720 Amber Lin: Which we usually don’t get on other clients, because they don’t even know what the right answer is. So, we’re off to a very good start. I created that dataset.
50 00:08:08.180 ⇒ 00:08:11.170 Amber Lin: And what I did is to…
51 00:08:11.620 ⇒ 00:08:17.430 Amber Lin: Take that, run those questions through Cortex.
52 00:08:18.010 ⇒ 00:08:26.180 Amber Lin: And run another script to compare what Cortex got and what the ideal answer is.
53 00:08:26.630 ⇒ 00:08:42.209 Amber Lin: And of course, Cortex gives the SQL queries, so I asked AI to give analysis of, hey, what is the likely change that we need to make in the semantic view? So, taking that result, I go back
54 00:08:42.230 ⇒ 00:08:51.330 Amber Lin: paste that and loop that through AI again to say, hey, improve this semantic view based on what you had, and then once I do.
55 00:08:51.490 ⇒ 00:08:59.070 Amber Lin: I retest that question, so that is the feedback loop that I’m creating based on the golden question set.
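[Editor's note: the feedback loop Amber describes — golden questions in, Cortex answers out, compare, revise, retest — can be sketched as below. The Cortex call is stubbed out and every name is illustrative, not the actual CTA setup.]

```python
# Sketch of the golden-question feedback loop described above.
# ask_cortex() is a stub standing in for a real Cortex Analyst call.

def ask_cortex(question: str) -> float:
    """Stub: would send `question` to Cortex Analyst and parse a scalar answer."""
    raise NotImplementedError

def run_loop(golden_set, ask=ask_cortex):
    """Run each golden question and record pass/fail vs. the expected answer."""
    results = []
    for q in golden_set:
        try:
            got = ask(q["question"])
        except NotImplementedError:
            got = None  # no live connection in this sketch
        results.append({
            "id": q["id"],
            "expected": q["expected"],
            "got": got,
            "passed": got == q["expected"],
        })
    return results
```

Failed entries would then be fed back to the AI with the semantic-view YAML ("improve this view given this miss"), and the same questions rerun.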
56 00:08:59.940 ⇒ 00:09:06.360 Amber Lin: And so… Let me pull up my cursor.
57 00:09:13.310 ⇒ 00:09:14.530 Amber Lin: Alright.
58 00:09:19.820 ⇒ 00:09:37.829 Amber Lin: So, right now, I am in my cursor. Please let me know if things change a little bit slow, because I think cursor takes a bit… a bit to load, so I’ll… I’ll try not to click too fast. I think that was what happened on Wednesday. So, in Cursor.
59 00:09:38.380 ⇒ 00:09:44.260 Amber Lin: For this workspace, you can see here I have our
60 00:09:44.580 ⇒ 00:09:58.469 Amber Lin: GitHub repo. I have some of our internal, repos open, because we have some skills and documentation that was really helpful. And I have my personal folder where I store,
61 00:09:58.760 ⇒ 00:10:03.329 Amber Lin: store, essentially, this chunk of work that I have been doing.
62 00:10:03.790 ⇒ 00:10:05.779 Amber Lin: So let’s…
63 00:10:06.790 ⇒ 00:10:16.369 Amber Lin: I’m gonna pause for any questions. I feel like I’ve been… I’ve been talking for a while, wanna make sure that we’re all on the same page, or if I need to go back to anything to explain.
64 00:10:17.810 ⇒ 00:10:26.289 Katherine Bayless: This is perfect. I wasn’t necessarily gonna interrupt with the question, but since you paused, question I was wondering with, like.
65 00:10:26.520 ⇒ 00:10:51.469 Katherine Bayless: So, for example, one of the sort of things that goes into that audit report, and many other sort of reports that we put together, is, like, you know, count of people by country. But, like, even us, the humans, usually have to clarify, like, okay, do you want the country that the person’s address was in, or where their company was? Because depending on the context, we might want to know, like, how many people came from China, or India, or Brazil, or we might want to know how many companies came from those countries.
66 00:10:51.470 ⇒ 00:11:03.069 Katherine Bayless: Right? And I have noticed that Coco does get a little confused between country and country HQ, just like I, the human, do. And so I wasn’t sure, like, is the semantic view the place to kind of, like.
67 00:11:03.070 ⇒ 00:11:13.669 Katherine Bayless: Call out the ambiguity and sort of, like, let the model know that, like, if somebody asks for a report by country, you probably want to ask them, like, which country they’re talking about.
68 00:11:14.120 ⇒ 00:11:17.979 Katherine Bayless: not which country specifically, but which version of the country data point, I guess.
69 00:11:18.270 ⇒ 00:11:39.100 Amber Lin: Yeah, I… I think I ran into that question similarly to, like, the people’s primary business, because that’s also a little bit hard to clarify. What I did now is to give it a default value. I tried to have it just as a clarifying question outright, but…
70 00:11:39.100 ⇒ 00:11:42.060 Amber Lin: what I have now is to…
71 00:11:42.070 ⇒ 00:11:46.750 Amber Lin: Give a default answer, and then also at the end of, hey,
72 00:11:46.790 ⇒ 00:12:03.609 Amber Lin: this is just for country by person. If you want country or company, here’s an… you can ask this question again. I haven’t read into that, so that’s a really good point. I’ll… I’ll take note of that, and I’ll experiment with that in a bit.
73 00:12:04.010 ⇒ 00:12:16.009 Katherine Bayless: Okay, yeah, yeah, yeah. I mean, it’s both a, like, known wrinkle for us to kind of be aware of, but also I was thinking, like, a good example of, like, where does this go in a semantic model? Like, where does that information best belong?
74 00:12:16.010 ⇒ 00:12:20.650 Amber Lin: Let’s… let me pull that up.
75 00:12:21.090 ⇒ 00:12:23.839 Amber Lin: Let me pull up the YAML file.
76 00:12:24.030 ⇒ 00:12:37.770 Amber Lin: So… Let’s go… Actually, I think I… I created a,
77 00:12:39.020 ⇒ 00:12:51.400 Amber Lin: a guide in Google Docs. I think all of you should have access. Just a quick overview of what I went through on how to create a semantic view, and I have the…
78 00:12:51.410 ⇒ 00:12:59.029 Amber Lin: I have the YAML… yesterday’s version over here. So, there’s tables, and then…
79 00:12:59.140 ⇒ 00:13:05.820 Amber Lin: relationships, as we’ve seen before, and there’s dimensions. I think what will be helpful here is…
80 00:13:06.040 ⇒ 00:13:09.829 Amber Lin: These are, for example, right here, these are…
81 00:13:10.340 ⇒ 00:13:18.670 Amber Lin: comments we will add. So, I think the difference, say, person’s country versus, companies
82 00:13:19.330 ⇒ 00:13:22.290 Amber Lin: country. We will have different
83 00:13:23.030 ⇒ 00:13:40.389 Amber Lin: comments, so we would say, country’s registration address for company, or for attendees specifically, do not use this for country. Please clarify when asked about countries.
84 00:13:40.570 ⇒ 00:13:42.080 Katherine Bayless: Okay, okay, cool.
85 00:13:42.190 ⇒ 00:13:42.950 Katherine Bayless: Cool.
86 00:13:43.170 ⇒ 00:13:48.819 Amber Lin: Yeah, and there’s different synonyms we can add in, so…
87 00:13:49.270 ⇒ 00:13:59.630 Amber Lin: That would… that would, perhaps, depending on how people ask, because different people type or word things differently, we can have those in as well.
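[Editor's note: the country-ambiguity fix Amber describes — distinct descriptions plus synonyms on the two country fields — would look roughly like the fragment below. Field names and wording are hypothetical, not the actual CTA schema.]

```yaml
# Illustrative dimension entries distinguishing the two "country" fields.
dimensions:
  - name: attendee_country
    expr: ATTENDEE_COUNTRY
    description: >
      Country of the attendee's personal registration address.
      If a user asks for counts "by country" without specifying,
      clarify whether they mean attendee country or company HQ country.
    synonyms: ["person country", "country of residence"]
  - name: company_hq_country
    expr: COMPANY_HQ_COUNTRY
    description: Country where the attendee's company is headquartered.
    synonyms: ["company country", "HQ country"]
```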
88 00:14:00.490 ⇒ 00:14:01.410 Katherine Bayless: Okay.
89 00:14:02.200 ⇒ 00:14:06.199 Katherine Bayless: Actually, we’ve got a related question.
90 00:14:06.440 ⇒ 00:14:15.910 Katherine Bayless: And maybe this one’s a little bit more crunchy, but it’s sort of, like, a place to tell it, with the session scans being a good example, like.
91 00:14:16.270 ⇒ 00:14:31.439 Katherine Bayless: the data, in terms of it being the data we received that is now cleaned and modeled, is correct, but the scanners were used incorrectly. So it’s like, there’s nothing wrong with the data, it’s just the capture was incorrect, and I’ve been wondering, like.
92 00:14:31.560 ⇒ 00:14:55.349 Katherine Bayless: I mean, it’s just, like, we happen to know that, but I’ve been wondering if, like, that also makes sense to include in, like, the semantic view, like, somewhere, or somewhere in this sort of, like, context for AI swirl, where it’s, like, we could tell it that, like, yes, this is the data from the scanners at CES, however, we should know that there were issues with capture and collection, you know, blah blah blah, so depending on the type of question you’ve been asked.
93 00:14:55.350 ⇒ 00:15:03.710 Katherine Bayless: you know, maybe remember to take it with a grain of salt, anything that’s in the data, because it MIGHT not be as accurate as you think it is, like, that kind of thing.
94 00:15:04.730 ⇒ 00:15:16.350 Amber Lin: Yeah, I think those are totally possible, like, to give another bit of context after every response to say, hey, this is what I calculated, but please bear in mind that it might be wrong.
95 00:15:16.350 ⇒ 00:15:29.619 Katherine Bayless: Yeah, yeah, that would be good, yeah. Because, like, even Chi and I have noticed, like, it’ll come back with, like, a really confident insight, and I mean, in its defense, it has no reason to know that, like, that data is kind of janky and might not actually support that insight.
96 00:15:29.620 ⇒ 00:15:30.270 Amber Lin: Hmm.
97 00:15:30.270 ⇒ 00:15:36.869 Katherine Bayless: So yeah, I like that, of the kind of, like, there are some issues with this data’s quality, like trust but verify, that kind of thing, yeah.
98 00:15:38.000 ⇒ 00:15:38.520 Amber Lin: Awesome.
99 00:15:38.520 ⇒ 00:15:42.820 Katherine Bayless: Cool. Okay. Chi, did you have any questions? Sorry, I know I kind of took over all the time.
100 00:15:43.100 ⇒ 00:15:49.749 Chi Quinn: Oh, well, I mean, it’s funny, because I’ve been looking around and playing around, with it for a little bit, but,
101 00:15:49.890 ⇒ 00:15:55.700 Chi Quinn: I guess for me, I guess to play around, because I know when you select the databases.
102 00:15:56.610 ⇒ 00:16:07.210 Chi Quinn: I guess, what is the recommended area I should play around in? Because I don’t obviously want to go somewhere that’s in production, but I guess it sounds like the dev marts might be the recommended,
103 00:16:07.350 ⇒ 00:16:25.350 Chi Quinn: database to, play around with semantic views, because I just want to get that, experience of, you know, adding in and playing it with myself. So yeah, that was just, like, a… where’s the recommended place to… the location to play around?
104 00:16:25.510 ⇒ 00:16:29.420 Amber Lin: Yeah, I added it in dev marts. Uttam, do you think that’s… that’s a good.
105 00:16:29.420 ⇒ 00:16:38.059 Uttam Kumaran: That’s… yeah, I think that’s fine, and then I think at some point we can just move them to staging as they, like, harden, and then we’ll decide on production, I guess, like…
106 00:16:38.940 ⇒ 00:16:43.700 Uttam Kumaran: Yeah. Either way, I think it’s, like, for our testing, I just don’t want it to have it in production.
107 00:16:43.880 ⇒ 00:16:47.409 Uttam Kumaran: And then as soon as this group feels comfortable, we can move it, you know?
108 00:16:48.710 ⇒ 00:17:03.399 Amber Lin: Awesome. And as we get more semantic views, you may have to specify which semantic view, so just copy your view’s name when you ask Cortex, because it might go to the default, which I have
109 00:17:03.580 ⇒ 00:17:09.040 Amber Lin: I don’t know if I’ve set it correctly, but just make sure that you’re in the right semantic view.
110 00:17:09.180 ⇒ 00:17:15.430 Chi Quinn: So there’s a default. So when you, create the semantic views, you have one set as the default.
111 00:17:15.960 ⇒ 00:17:34.890 Amber Lin: I commented it in the semantic views AI context. I believe there should be a way, for example, in the overall Snowflake AI system prompt to say, hey, default to this view, but I have not worked
112 00:17:34.890 ⇒ 00:17:42.660 Amber Lin: outside of the semantic view. I have not worked in, like, the AI system prompt, so that’s something down the road that will be interesting.
113 00:17:43.100 ⇒ 00:17:44.490 Chi Quinn: Got it, okay.
114 00:17:44.660 ⇒ 00:17:46.200 Katherine Bayless: Yeah, yeah.
115 00:17:46.430 ⇒ 00:17:55.570 Katherine Bayless: I mean, I… yeah, I find it fascinating, generally, with all these AI tools, like, you know, how… how close to the metal can you really get in terms of trying to direct them? Yeah.
116 00:17:55.570 ⇒ 00:18:03.880 Uttam Kumaran: Yeah, and that’s what we’re finding, too, is, like, even model changes, like moving from Opus to something, they have various, like, system prompts you don’t have access to.
117 00:18:04.230 ⇒ 00:18:12.149 Uttam Kumaran: And… yeah, it’s, like, really hard to steer. So, I think that’s what we’re kind of figuring out, like, how much steering can we do in Snowflake?
118 00:18:12.660 ⇒ 00:18:24.590 Uttam Kumaran: And create just an experience that people don’t have to futz around with, like, oh, maybe I need to, like, use the max model here. It’s like, just ask the question, and it’s gonna answer, and there’s no options.
119 00:18:24.640 ⇒ 00:18:25.910 Katherine Bayless: Right, right.
120 00:18:25.910 ⇒ 00:18:37.319 Uttam Kumaran: Yeah, but I think, Amber, like, what I liked about your process is you’ve done this, like, loop now of, like, editing questions, seeing the responses, fixing, and sort of looping.
121 00:18:37.480 ⇒ 00:18:39.589 Uttam Kumaran: I could see the AI, like.
122 00:18:39.830 ⇒ 00:18:45.269 Uttam Kumaran: trying to fit the objective, probably, without more context, so I’m interested in, like.
123 00:18:45.450 ⇒ 00:18:47.830 Uttam Kumaran: What it did to, like, get the score up.
124 00:18:48.100 ⇒ 00:18:53.669 Uttam Kumaran: Was it just, like, Snowflake adjustments, or was it actually, like…
125 00:18:53.790 ⇒ 00:18:57.010 Uttam Kumaran: improving the context? Or was it, like.
126 00:18:57.230 ⇒ 00:19:04.130 Uttam Kumaran: just filling in stuff so that the answer is right, you know what I mean? Like, I’m interested in, like, how did it actually achieve the objective on the loop?
127 00:19:04.130 ⇒ 00:19:14.579 Amber Lin: Yeah, so let me pull up… actually, before I share screen. So, when I have these questions,
128 00:19:14.700 ⇒ 00:19:28.519 Amber Lin: a direct way, or a shortcut way to improve it, is to add it to the verified queries, that you guys saw earlier. That would make sure, hey, we will just use this query no matter what. If we detect this question, we’ll use that.
129 00:19:28.550 ⇒ 00:19:46.459 Amber Lin: I know that will solve the issue, but I want to push that until later, because I want to see what can be improved out of the actual fields in the… and the context for those fields. So, that’s what I’ve been doing, is trying to go to
130 00:19:46.840 ⇒ 00:19:49.129 Amber Lin: The fields,
131 00:19:49.340 ⇒ 00:19:55.369 Amber Lin: and give it good context so that it generates the right query. And then once that’s done.
132 00:19:55.380 ⇒ 00:20:09.830 Amber Lin: We’ll probably put these questions into the verified query section, so that it will probably always come out correctly, but that defeats the point of iterating. So right now, let me show you guys what…
133 00:20:10.100 ⇒ 00:20:25.850 Katherine Bayless: I just want to, like, say, like, totally endorse. Yeah, I think that’s, I think that’s the… that’s the right instinct. I have totally seen it do what I was talking about, like, when I was working on the Fortune 500 stuff, it kept being like, well, but you told me the answer, so why would I go calculate? I was like.
134 00:20:26.570 ⇒ 00:20:27.529 Katherine Bayless: Come on, revise.
135 00:20:28.650 ⇒ 00:20:29.489 Katherine Bayless: So yeah, no, I think that.
136 00:20:29.490 ⇒ 00:20:30.110 Amber Lin: Yeah.
137 00:20:30.110 ⇒ 00:20:35.150 Katherine Bayless: Just, like, get it to behave, and then we can let it see the answer key in the back of the book.
138 00:20:36.140 ⇒ 00:20:37.040 Uttam Kumaran: Exactly.
139 00:20:37.200 ⇒ 00:20:37.650 Katherine Bayless: I like it.
140 00:20:38.060 ⇒ 00:20:57.300 Amber Lin: And something that will be really helpful is, you can see, this was asked 6 times in the last 7 days, so I think this will be really helpful to iterate in the future of, hey, our folks were asking this question, let’s go find the right query so that it doesn’t have to generate it in the future anymore.
141 00:20:57.610 ⇒ 00:20:58.360 Katherine Bayless: Nice.
142 00:20:58.520 ⇒ 00:20:59.030 Katherine Bayless: Nice.
143 00:20:59.460 ⇒ 00:21:11.210 Amber Lin: There’s that. Let me show you what this looks like. It’s… it’s in my draft, so it’s not perfectly organized. So, I have the audit report.
144 00:21:11.440 ⇒ 00:21:14.819 Amber Lin: And from that, I was able to generate
145 00:21:15.420 ⇒ 00:21:30.470 Amber Lin: this golden question set. It is in JSON, so the computer accesses it a little bit better, but I have this readable version that’s just for my eyes to view. So, essentially, I have…
146 00:21:30.790 ⇒ 00:21:37.510 Amber Lin: Taken these questions, and… I had a scalar output.
147 00:21:37.670 ⇒ 00:21:43.390 Amber Lin: So it’s easier to compare than a table. So we have…
148 00:21:43.720 ⇒ 00:21:50.100 Amber Lin: the question ID, the difficulty based on… this is just for how difficult it is for
149 00:21:50.220 ⇒ 00:21:54.410 Amber Lin: for Cortex. And then we have the…
150 00:21:54.570 ⇒ 00:22:01.299 Amber Lin: answer that we’re trying to compare to. So this pretty much maps directly to the pre-audit report.
151 00:22:01.600 ⇒ 00:22:06.380 Amber Lin: We have these questions… The…
152 00:22:07.480 ⇒ 00:22:20.010 Amber Lin: the natural language version of a question right here, and I think in the future we can add different versions of people, how they might approach that, and…
153 00:22:20.450 ⇒ 00:22:21.410 Amber Lin: Let’s see…
154 00:22:22.860 ⇒ 00:22:30.370 Katherine Bayless: I mean, it seems like we could naturally, like, if that verified queries thing is flagging that it’s similar questions have been asked 7 times, then it.
155 00:22:30.920 ⇒ 00:22:34.030 Katherine Bayless: Like, that data should be flowable to that.
156 00:22:34.030 ⇒ 00:22:34.780 Amber Lin: Yeah.
157 00:22:35.560 ⇒ 00:22:50.369 Amber Lin: I agree. And recently, I also added this question category, because right now, the only analy… the Cortex analyst we have is on attendance and registration, so this is just helpful for… for…
158 00:22:50.420 ⇒ 00:22:56.039 Amber Lin: AI, or for me to know, like, hey, we weren’t able to answer those questions, but,
159 00:22:56.150 ⇒ 00:22:59.340 Amber Lin: But they were likely out of scope.
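[Editor's note: one entry in the golden question set, per the fields Amber describes — ID, difficulty for Cortex, expected scalar answer, natural-language phrasings, and a category for scoping. All values below are made up for illustration.]

```python
import json

# One illustrative golden-set entry. The fields mirror the walkthrough;
# the values are invented, not real CTA data.
entry = {
    "id": "ATT-001",
    "difficulty": "easy",       # how hard this is for Cortex, not for humans
    "category": "attendance",   # lets out-of-scope areas (membership, exhibitors) be flagged
    "question": "What was total confirmed attendance for CES 2026?",
    "alt_phrasings": ["How many people attended CES 2026?"],
    "expected": 141000,         # scalar answers compare more easily than tables
}
print(json.dumps(entry, indent=2))
```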
160 00:22:59.620 ⇒ 00:23:02.169 Amber Lin: So, that’s what the golden question set.
161 00:23:02.770 ⇒ 00:23:13.469 Amber Lin: looks like. Let me… pull this… let’s go here. So, this is a…
162 00:23:14.160 ⇒ 00:23:26.050 Amber Lin: before I did the AI loop and put it in a skill, and before we can chat with it, this is the script that I can trigger and run manually, so this is a good,
163 00:23:26.210 ⇒ 00:23:32.630 Amber Lin: concrete view of what the steps look like. So, this is my workflow for
164 00:23:33.310 ⇒ 00:23:43.070 Amber Lin: working on… working through questions, and then applying improvements. So before we… I connect everything, and then I…
165 00:23:43.570 ⇒ 00:23:47.810 Amber Lin: run this, and then I will…
166 00:23:47.910 ⇒ 00:24:01.820 Amber Lin: do this batch run of the golden question set. I’ll probably run 5 questions with some pauses in between, and it gives me the, the result of, hey, this question.
167 00:24:02.090 ⇒ 00:24:04.369 Amber Lin: What was the issue?
168 00:24:04.730 ⇒ 00:24:06.410 Amber Lin: Or what was the result?
169 00:24:06.600 ⇒ 00:24:10.330 Amber Lin: And what’s the actual, and…
170 00:24:10.350 ⇒ 00:24:19.340 Amber Lin: how do we… how did we match? Right now, I gave it a little bit of range of if you’re within, say, like, 5 or 10,
171 00:24:19.340 ⇒ 00:24:33.979 Amber Lin: I call it, like, acceptable, or, like, 5%. I also say, hey, that’s probably acceptable for now. But I have this score, so we can always go back to non-hundred percent matches and improve them.
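[Editor's note: the grading rule Amber describes — exact match, or "acceptable" within a small absolute delta or roughly 5% relative error — can be sketched as below. The thresholds are illustrative defaults, not her actual values.]

```python
def match_score(expected: float, got: float,
                abs_tol: float = 10, rel_tol: float = 0.05) -> str:
    """Grade a scalar answer against the golden value.

    'exact' on a perfect match; 'acceptable' within a small absolute
    delta or ~5% relative error; otherwise 'fail'. Thresholds illustrative.
    """
    if got == expected:
        return "exact"
    delta = abs(got - expected)
    if delta <= abs_tol or (expected and delta / abs(expected) <= rel_tol):
        return "acceptable"
    return "fail"
```

Under these defaults, the delta-of-9 case mentioned below would grade as acceptable, while non-exact matches stay visible for later improvement.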
172 00:24:34.640 ⇒ 00:24:43.650 Amber Lin: So, once I run this, I store this result, I store it over… Over here, So this is…
173 00:24:44.620 ⇒ 00:24:53.710 Amber Lin: As you can see, let’s go to a question. So, this is the question ID. This is the question that was asked.
174 00:24:53.930 ⇒ 00:24:57.439 Amber Lin: And this is the expected result.
175 00:24:57.860 ⇒ 00:25:07.820 Amber Lin: And this is what we got. We say, okay, it has a delta of 9, and that was a very small relative percent of error.
176 00:25:07.820 ⇒ 00:25:25.350 Amber Lin: So we say, okay, so these are the type of results that I am generating, and we can always, like, add new fields, so that we can compare in the future. So we can maybe add, oh, this run was done at April 17th.
177 00:25:25.560 ⇒ 00:25:28.340 Amber Lin: So, those are extra fields we can add.
178 00:25:28.810 ⇒ 00:25:36.890 Amber Lin: Once I have those reports logged, what I pull here is…
179 00:25:36.890 ⇒ 00:25:38.320 Katherine Bayless: Please, sorry, can I ask one question real quick?
180 00:25:38.320 ⇒ 00:25:39.359 Amber Lin: Yeah, go ahead.
181 00:25:39.630 ⇒ 00:25:44.239 Katherine Bayless: So this is… well, I guess it’s in the form of a question, but it’s really more of a slot, so…
182 00:25:44.550 ⇒ 00:25:50.520 Katherine Bayless: like… I’m just, I’m thinking, I’m like, the CES attendance number is, like.
183 00:25:51.630 ⇒ 00:25:59.250 Katherine Bayless: the most, I don’t know, high-profile data point at the organization, right? And so, like, I’m almost wondering, like.
184 00:25:59.540 ⇒ 00:26:19.179 Katherine Bayless: Like, I know one of the other things we’re going to try to get people to do is, like, also not, like, always give the exact number, right? Like, because when we put out a press release, we don’t say 148,392, right? We say, like, over 148,000, right? And it’s like, I wonder if there’s also a way to say, like.
185 00:26:19.680 ⇒ 00:26:38.929 Katherine Bayless: if you don’t have a 100% match on the correct answer, like, rather than give a different CES attendance number that’s, like, off by 2, just say over 148,000. So, like, either exact match or, like, softer language, I don’t know. But, like, I think…
186 00:26:38.930 ⇒ 00:26:45.330 Katherine Bayless: Weirdly, people would be more distrustful if it’s giving numbers that are, like, Off by one.
187 00:26:45.330 ⇒ 00:26:45.780 Amber Lin: that.
188 00:26:45.780 ⇒ 00:26:47.250 Katherine Bayless: doozies versus, like…
189 00:26:47.250 ⇒ 00:26:48.030 Amber Lin: And a given…
190 00:26:48.030 ⇒ 00:26:54.399 Katherine Bayless: softer, like, you know, over 148,000, and then they’re like, okay, good enough, right? Just a thought.
191 00:26:54.400 ⇒ 00:27:10.950 Amber Lin: Let’s see. Yeah, I think especially for bigger numbers, that’s how we can answer. I don’t think we’ll be able to make a conditional, as in, oh, this is within tolerance, because we won’t have the right answer unless we humanly verify it.
192 00:27:10.950 ⇒ 00:27:24.450 Amber Lin: But that’s more… I think that’s more of a, how do we format our answers, is, hey, this is 140K. If you really want the exact number, here’s what it is, but please verify with caution.
193 00:27:24.810 ⇒ 00:27:29.500 Katherine Bayless: Yeah, yeah, yeah, yeah, yeah, yeah, report exact numbers with caution. Yeah, yeah, yeah.
194 00:27:30.420 ⇒ 00:27:31.300 Amber Lin: Yeah.
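[Editor's note: the "over 148,000" formatting idea is easy to make mechanical — round the headline number down so the softened claim stays true even if the exact figure is slightly off. A minimal sketch, not CTA's actual reporting rule:]

```python
def soften(n: int, sig: int = 1000) -> str:
    """Format a headline number as 'over X,000' instead of an exact figure.

    Rounds down to the nearest `sig` so the statement holds even when the
    underlying number carries small errors. Illustrative helper only.
    """
    return f"over {(n // sig) * sig:,}"
```

For example, soften(148392) yields "over 148,000".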
195 00:27:31.450 ⇒ 00:27:39.330 Amber Lin: So… I’m gonna go back here. I… I’ll load that question, and this is… And…
196 00:27:39.850 ⇒ 00:27:44.830 Amber Lin: Once it’s loaded, I’ll take, for example, one question here.
197 00:27:45.150 ⇒ 00:27:50.140 Amber Lin: and I’ll load it, sorry.
198 00:27:50.940 ⇒ 00:27:52.500 Amber Lin: Let’s see…
199 00:27:55.620 ⇒ 00:27:59.680 Amber Lin: for example, I’ll load the expected…
200 00:28:00.150 ⇒ 00:28:12.609 Amber Lin: So that’s the result from our comparison, and this is SQL, and here’s a suggestion of… a generated suggestion of, hey, what is the…
201 00:28:12.940 ⇒ 00:28:18.599 Amber Lin: issue right here. And here, we can see the issue was why it generated
202 00:28:18.740 ⇒ 00:28:25.830 Amber Lin: such a low number was because we asked, what was the total confirmed attendance in Las Vegas?
203 00:28:26.520 ⇒ 00:28:31.339 Amber Lin: for 26… 2026. And it said… and… and it filtered
204 00:28:31.550 ⇒ 00:28:43.510 Amber Lin: by city equals Las Vegas. So that’s why it got to such a low number. So that means we need to clarify in the city view, in the city…
205 00:28:44.410 ⇒ 00:28:55.400 Amber Lin: comment of, hey, this is what is this address for? Do not use this for certain things. This is only, say, the address for people there.
206 00:28:56.910 ⇒ 00:29:03.259 Amber Lin: So, in this non-AI-enabled view, I have to copy this.
207 00:29:03.500 ⇒ 00:29:08.309 Amber Lin: And I usually copy this, I go here.
208 00:29:08.630 ⇒ 00:29:18.559 Amber Lin: I call my skill I created… This is a… I call that edit.
209 00:29:19.480 ⇒ 00:29:21.800 Amber Lin: Sorry, you can call it like this.
210 00:29:24.990 ⇒ 00:29:31.829 Amber Lin: So that will go iterate my semantic view. I call that, it suggests edits in there, and then I…
211 00:29:32.190 ⇒ 00:29:36.530 Amber Lin: I deploy this, and I run this, and I run this again.
212 00:29:36.870 ⇒ 00:29:40.970 Amber Lin: So that is… and then it generated a…
213 00:29:41.430 ⇒ 00:29:51.510 Amber Lin: slightly better, slightly better answer. So that’s one example of the loop, and essentially, I just put this all in a skill.
214 00:29:52.150 ⇒ 00:30:01.779 Amber Lin: right here, which allows me to do several things just all in the chat. I’m able to batch run
215 00:30:02.570 ⇒ 00:30:10.160 Amber Lin: So I can call that skill, and I can say, run…
216 00:30:10.670 ⇒ 00:30:14.659 Amber Lin: 5… well, let’s say we’re on three questions.
217 00:30:15.160 ⇒ 00:30:25.200 Amber Lin: And then it can do that. I can also call to tell it to… Show. Me.
218 00:30:25.850 ⇒ 00:30:29.169 Amber Lin: What failed questions?
219 00:30:30.050 ⇒ 00:30:31.380 Amber Lin: we have…
220 00:30:35.500 ⇒ 00:30:38.540 Amber Lin: So, I also asked it to show all of…
221 00:30:38.640 ⇒ 00:30:44.090 Amber Lin: its steps, so you can read, you can read what…
222 00:30:45.060 ⇒ 00:30:49.250 Amber Lin: it’s executed, you can read what it… it wants to do.
223 00:30:51.770 ⇒ 00:30:59.509 Amber Lin: Alright, let’s see… It prompts me to… Log in… And then…
224 00:30:59.850 ⇒ 00:31:18.789 Amber Lin: it will be able to just ask Cortex directly within the chat, so that way you don’t have to jump between files, you can review everything here, and it will also have context to all the files that you’re working with. Okay, so this one has loaded. Let me move this a little bit bigger.
225 00:31:19.250 ⇒ 00:31:25.579 Amber Lin: So I asked it, show me all the failed questions, show me all the failed questions we have.
226 00:31:27.480 ⇒ 00:31:33.989 Amber Lin: Going back… And it’ll show me… among the…
227 00:31:34.300 ⇒ 00:31:40.049 Amber Lin: 42 questions, which ones are not answering great. What was the issue?
228 00:31:40.950 ⇒ 00:31:43.939 Amber Lin: And… how many are there?
229 00:31:44.290 ⇒ 00:31:53.300 Amber Lin: And it can break down by the exact question type, so I can decide, hey, I want to work on… maybe work on this question. Why is there a drift?
230 00:31:53.440 ⇒ 00:32:02.180 Amber Lin: Most of these declines are out of scope, because it’s membership, exhibitors, conferences, so I’m going to focus on the other ones, and…
231 00:32:02.790 ⇒ 00:32:08.380 Amber Lin: For example, if we want to work… on… Want this.
232 00:32:08.860 ⇒ 00:32:12.740 Amber Lin: Let’s say…
233 00:32:15.990 ⇒ 00:32:18.760 Amber Lin: Let’s… Let’s call it…
234 00:32:25.370 ⇒ 00:32:29.640 Amber Lin: What should I change in the semantic view?
235 00:32:32.970 ⇒ 00:32:33.890 Amber Lin: So…
236 00:32:34.000 ⇒ 00:32:43.279 Amber Lin: Then it will call the other skill to look at the execution results and look at our semantic view to see how we can change it.
237 00:32:44.490 ⇒ 00:32:45.600 Amber Lin: Mmm.
238 00:32:46.770 ⇒ 00:32:53.219 Amber Lin: Anyways, oh, this one finished running. So, it ran each… each question.
239 00:32:53.630 ⇒ 00:32:56.170 Amber Lin: Like what we saw in the notebook.
240 00:32:56.290 ⇒ 00:33:05.430 Amber Lin: And… And was able to… Give us… the… Results.
241 00:33:06.060 ⇒ 00:33:08.469 Amber Lin: And… we have…
242 00:33:08.880 ⇒ 00:33:26.289 Amber Lin: a few varying results. The first one, it did not use the semantic view, which is an issue I’m running into quite frequently. It just queried the fact and dim tables directly, which is not what we want, because the fact and dim tables don’t have the context that’s in the semantic view.
243 00:33:26.320 ⇒ 00:33:29.279 Amber Lin: So, there’s something that we can edit.
244 00:33:31.110 ⇒ 00:33:36.319 Katherine Bayless: I think it’s kind of interesting to think about, like… I mean, because I admit, like, I…
245 00:33:37.130 ⇒ 00:33:45.600 Katherine Bayless: I was blown away at how good Cortex code would answer stuff, like, you know, before we’d even started doing any of these things, and like…
246 00:33:46.130 ⇒ 00:34:05.100 Katherine Bayless: you know, my brain goes to, like, looking at what you’ve built here, right, which is awesome, and, like, my wheels turn so fast, right? I’m like, eventually, like, as we refine and whatnot, like, we can totally get this to be, like, a skill that anybody could be using, right? If we’re working with a team to onboard their data, like, they can refine their own semantic view, right? And so I’m like.
247 00:34:05.100 ⇒ 00:34:11.230 Katherine Bayless: I love this, and I see a future where more people are contributing. It will be interesting, like, I…
248 00:34:11.760 ⇒ 00:34:23.590 Katherine Bayless: I will put a bet out there that there will be times that Cortex code is probably more correct at how the data should be used than the semantic view, because this is where the humans have tried to, like, explain things and maybe.
249 00:34:23.590 ⇒ 00:34:24.050 Amber Lin: Not yet.
250 00:34:24.050 ⇒ 00:34:39.669 Katherine Bayless: quite right. And so, like, it’s interesting to me that you’re seeing it disregard the semantic view, which I’m sure at this stage is more just, like, you know, manners and that kind of behavior. But, like, eventually, I wouldn’t mind if it had the authority to, like, veto it, if it was like, I just don’t think that’s right.
251 00:34:40.100 ⇒ 00:34:48.769 Amber Lin: Yeah, and honestly, the more we work on it, and our models are really great, we can add context to, I think, to these
252 00:34:48.770 ⇒ 00:35:01.809 Amber Lin: fact tables as well. As long as we specify the right joins, the right formulas for certain things, I think it won’t have a problem outside of that, but.
253 00:35:01.810 ⇒ 00:35:02.310 Katherine Bayless: Yeah.
254 00:35:02.310 ⇒ 00:35:16.679 Amber Lin: we should be able to include everything in the semantic view. And, right now, the non-semantic view’s answers are usually just pretty wrong, so I just told it to say, hey, this is not using the semantic view.
255 00:35:17.330 ⇒ 00:35:21.000 Katherine Bayless: Yeah, no, no, it totally makes sense right now, because I’m, like, thinking ahead to, like, what is this.
256 00:35:21.460 ⇒ 00:35:22.239 Katherine Bayless: In the future?
257 00:35:22.240 ⇒ 00:35:22.820 Amber Lin: Yeah.
258 00:35:22.820 ⇒ 00:35:24.779 Katherine Bayless: Yeah, it’s gonna be interesting.
259 00:35:25.560 ⇒ 00:35:34.329 Amber Lin: Cool, and I think that’s… I think that’s why I created this skill, so that they don’t have to work in a workbook, they can just ask AI questions.
260 00:35:34.560 ⇒ 00:35:37.879 Amber Lin: Okay, so this one also ran.
261 00:35:39.100 ⇒ 00:35:45.790 Amber Lin: Sorry, we’re switching tracks quite a lot, because sometimes the run takes a while. So we asked it.
262 00:35:46.540 ⇒ 00:35:53.120 Amber Lin: This question, why did it fail? Because it… I don’t think… Let’s see…
263 00:35:53.750 ⇒ 00:36:02.689 Amber Lin: I asked, hey, what can I change in the semantic view? So, this is the brief of what happened, there’s a drift, we expected
264 00:36:03.130 ⇒ 00:36:05.910 Amber Lin: 62… percent.
265 00:36:06.380 ⇒ 00:36:15.699 Amber Lin: And then it gave us a count. So, that was not a percentage, this is just a count of the domestic attendance.
266 00:36:16.200 ⇒ 00:36:21.650 Amber Lin: And then we can look at…
267 00:36:25.640 ⇒ 00:36:31.100 Amber Lin: Then we can look at… here’s its suggestions of what can we change.
268 00:36:32.140 ⇒ 00:36:45.740 Amber Lin: So, it’s suggesting that, hey, we should add a metric that is a calculated percentage. So, it knows that, hey, we will use the percentage instead of the count.
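[Editor’s sketch] The suggested fix, a derived percentage metric instead of the raw count, boils down to a calculation like the one below. The function name and the 62-percent figures are illustrative only, echoing the demo’s numbers, not real data or the actual metric definition:

```python
def pct_domestic_attendees(domestic_count: int, total_count: int) -> float:
    """Derived metric: domestic attendees as a percent of total.

    This is the calculation the suggested metric would encode, so the
    semantic view returns a percentage rather than the raw count that
    caused the drift. Guards against a zero total to avoid division
    by zero.
    """
    if total_count == 0:
        return 0.0
    return round(100.0 * domestic_count / total_count, 1)
```

With illustrative counts of 6,200 domestic out of 10,000 total, the metric returns 62.0 rather than the count 6,200 the view had been giving back.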
269 00:36:47.880 ⇒ 00:36:49.000 Amber Lin: Let’s see…
270 00:36:53.410 ⇒ 00:36:55.990 Amber Lin: This is the… metric.
271 00:36:56.340 ⇒ 00:37:01.550 Amber Lin: I think they’re suggesting to add… And then we have…
272 00:37:01.900 ⇒ 00:37:07.699 Amber Lin: Some synonyms, so this is, like, extra context for… for this field.
273 00:37:12.340 ⇒ 00:37:24.500 Amber Lin: I’m gonna let it run these suggestions. I… I don’t know if I will merge it yet, because I don’t think I had enough time to read through what it’s actually suggesting.
274 00:37:29.130 ⇒ 00:37:44.769 Chi Quinn: And that’s something that you would eventually… because of the suggested questions and what they’re suggesting you should add to the semantic view, that’s something that you would then push eventually to the semantic views in Snowflake to update the
275 00:37:44.890 ⇒ 00:37:58.820 Chi Quinn: the… the text, because I saw that example with the attendance in Las Vegas for CES. I did see that comment that was added later to say, do not include Las Vegas as… or don’t filter it to Las Vegas.
276 00:37:59.910 ⇒ 00:38:11.910 Amber Lin: Yeah, I… let’s… let me show you what… let’s have it edit. I might not deploy it until I review it, but here is… let me find the script…
277 00:38:12.290 ⇒ 00:38:16.529 Amber Lin: And then we can see what changes it would apply.
278 00:38:16.750 ⇒ 00:38:19.920 Amber Lin: So, we are in here.
279 00:38:20.200 ⇒ 00:38:26.679 Amber Lin: Some changes from… I think from yesterday.
280 00:38:29.130 ⇒ 00:38:30.519 Amber Lin: Let’s see…
281 00:38:39.560 ⇒ 00:38:40.300 Amber Lin: Okay.
282 00:38:41.470 ⇒ 00:38:52.990 Amber Lin: I’m gonna save this first, and I’m gonna ask it… to apply… Apply… With… Oh.
283 00:38:54.220 ⇒ 00:39:01.640 Amber Lin: Okay. I also gave it these quick instructions, so I’m just gonna type one right now.
284 00:39:02.180 ⇒ 00:39:04.529 Amber Lin: And let’s see…
285 00:39:10.510 ⇒ 00:39:12.940 Amber Lin: Let’s see what changes it will make.
286 00:39:22.400 ⇒ 00:39:33.379 Amber Lin: So, now it’s making these changes in this doc. Usually, you don’t have to have this whole thing open, you can just read, the changes that it’s adding here.
287 00:39:34.250 ⇒ 00:39:39.429 Amber Lin: And see, hey, it… Like, for example, I don’t… I don’t…
288 00:39:39.840 ⇒ 00:39:45.449 Amber Lin: think that’s necessary… I don’t know if that’s necessarily correct.
289 00:39:48.140 ⇒ 00:39:53.969 Katherine Bayless: Actually, I mean, I… I don’t know either, but I wouldn’t be surprised if somebody asks for it that way.
290 00:39:54.470 ⇒ 00:39:58.880 Amber Lin: Yeah, is it really based on the mailing country?
291 00:39:58.990 ⇒ 00:40:00.020 Amber Lin: Or…
292 00:40:00.600 ⇒ 00:40:03.590 Katherine Bayless: Yeah, for the attendee, like, they’re mailing it. Okay.
293 00:40:03.590 ⇒ 00:40:15.030 Amber Lin: Okay, okay, okay, so that is correct. And then… attended count…
294 00:40:18.720 ⇒ 00:40:24.010 Amber Lin: Here, I think it gave, like, a sample query in the context.
295 00:40:24.270 ⇒ 00:40:28.000 Amber Lin: Here are some example questions they would ask.
296 00:40:28.820 ⇒ 00:40:33.580 Amber Lin: So this is how they would… apply changes there.
297 00:40:33.840 ⇒ 00:40:36.670 Amber Lin: I’m gonna ask it to say…
298 00:40:44.120 ⇒ 00:40:54.230 Amber Lin: And… And, metric… Or derived metric.
299 00:40:54.660 ⇒ 00:40:58.350 Amber Lin: for… Percent of total attendees.
300 00:41:00.900 ⇒ 00:41:06.250 Amber Lin: I think that will be helpful. And once I do that, as you can see, we can… we can…
301 00:41:06.530 ⇒ 00:41:11.879 Amber Lin: deploy this view, and then we’ll just run the SQL again.
302 00:41:36.020 ⇒ 00:41:37.840 Amber Lin: Holder attended…
303 00:41:42.440 ⇒ 00:41:44.059 Amber Lin: Then you can read.
304 00:41:44.390 ⇒ 00:41:55.320 Amber Lin: what query they generated. They should probably be named more accurately, if… depending on if it’s just domestic.
305 00:41:55.960 ⇒ 00:41:57.560 Amber Lin: And…
306 00:42:03.140 ⇒ 00:42:09.590 Amber Lin: I’ll say… Is that… Call it…
307 00:42:13.750 ⇒ 00:42:22.400 Amber Lin: I’ll say, if this is still a metric, it… should be named accordingly…
308 00:42:22.760 ⇒ 00:42:27.340 Amber Lin: Okay? So my conversation with AI is, like, really…
309 00:42:27.480 ⇒ 00:42:31.120 Amber Lin: It’s not always, like, very instructional with
310 00:42:31.320 ⇒ 00:42:50.760 Amber Lin: like, exact coding it should do. I think this is representative of maybe a less technical audience of how they might communicate, because that’s… that’s kind of what I’ve seen through my less technical clients of how they ask AI. I’m trying to replicate what they… what they would say.
311 00:42:50.760 ⇒ 00:42:55.890 Katherine Bayless: I do the same thing, actually. Like, I try to ask it questions almost, like, in the dumbest way possible on purpose.
312 00:42:56.570 ⇒ 00:42:59.299 Katherine Bayless: Just because I’m like, honestly, I could see someone asking this way.
313 00:42:59.300 ⇒ 00:42:59.870 Chi Quinn: That’s good.
314 00:42:59.870 ⇒ 00:43:00.790 Amber Lin: Yeah.
315 00:43:01.100 ⇒ 00:43:01.760 Katherine Bayless: Yeah.
316 00:43:02.080 ⇒ 00:43:06.440 Amber Lin: Yeah, awesome. So, it has this. I… I’m not gonna…
317 00:43:06.960 ⇒ 00:43:16.290 Amber Lin: like, click keep file, but… so I can revert it back, but let’s… let’s deploy and see if it changes.
318 00:43:16.460 ⇒ 00:43:20.859 Amber Lin: And then we can revert it back if we need. So I’m gonna…
319 00:43:21.170 ⇒ 00:43:25.950 Amber Lin: Oh, I’ll just ask it to deploy. Deploy, and rerun.
320 00:43:26.290 ⇒ 00:43:27.530 Amber Lin: Question.
321 00:43:27.980 ⇒ 00:43:29.250 Amber Lin: After that.
322 00:43:33.360 ⇒ 00:43:46.190 Amber Lin: So, I… my goal is that no one has to open a terminal, that I don’t have to open the files. If they wanted to, they can just stay here, because terminals might be really scary for some users.
323 00:43:46.930 ⇒ 00:43:55.709 Katherine Bayless: Well, yeah, I mean, the other thing, too, is, like, we’ll have to figure this out, but, like, we’re not gonna have Claude Code for everybody, exactly.
324 00:43:56.050 ⇒ 00:43:59.849 Katherine Bayless: And also kind of silly, because most of them wouldn’t need Claude Code, but like…
325 00:44:00.280 ⇒ 00:44:15.829 Katherine Bayless: if we can expose the Snow CLI in a way that’s kind of clever, or maybe the MCP, or maybe it’s, like, we give them the MCP, and then they can do the defining, and then, you know, we would take the actual… like, it could write Snow code that they hand off to us, like, you know, details to figure out, but, like.
326 00:44:15.830 ⇒ 00:44:21.029 Katherine Bayless: It’ll be interesting to figure out how we provide this experience also to somebody without the coding piece.
327 00:44:21.030 ⇒ 00:44:33.600 Amber Lin: Yeah, I think Uttam has something to speak to that, because that’s what we’re trying to do for our internal folks, but here, look, this is… this has improved. And we reran; it deployed it.
328 00:44:33.970 ⇒ 00:44:36.459 Amber Lin: It reran this question set.
329 00:44:36.640 ⇒ 00:44:40.540 Amber Lin: And now, we have… We have this.
330 00:44:41.430 ⇒ 00:44:55.560 Amber Lin: It’s almost the exact match. So that’s an example of, how we can create new metrics, edit the context to guide it to the right place, of how it’s answering.
331 00:44:55.690 ⇒ 00:45:17.940 Amber Lin: There could be risks of that we’re just overfitting for this specific question, but my hope is that as we do more and more questions, there will be enough context that it kind of knows, like, oh, we did that type of question that way, maybe we should do this. So, hopefully we don’t have a bulging context sprawl,
332 00:45:17.940 ⇒ 00:45:22.609 Amber Lin: But for the initial phase, I think this is the way that I plan to approach it.
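[Editor’s sketch] Hedging against that overfitting risk mostly comes down to rerunning the whole question set after every edit and watching the pass/fail split. A toy scorer for that batch step might look like this; the tuple shape, tolerance, and example questions are assumptions, not the actual skill:

```python
def score_question_set(results, rel_tol=0.005):
    """Split (question, expected, actual) tuples into passed and failed.

    Numeric answers get a small relative tolerance, so an "almost
    exact match" like 62.1 vs 62.0 still counts as a pass; everything
    else must match exactly.
    """
    passed, failed = [], []
    for question, expected, actual in results:
        numeric = (isinstance(expected, (int, float))
                   and isinstance(actual, (int, float)))
        if numeric and expected != 0:
            ok = abs(actual - expected) / abs(expected) <= rel_tol
        else:
            ok = expected == actual
        (passed if ok else failed).append(question)
    return passed, failed
```

Tracking that split across edits is what would surface overfitting: a change that fixes one question but flips others from passed to failed shows up immediately.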
333 00:45:23.900 ⇒ 00:45:35.960 Katherine Bayless: Yeah, I mean, Amber, this is all really cool. I appreciate the, the deep dive into it. It was funny, on Wednesday Uttam and I were talking about the, like, the Q2 scope of work, and it just… we could not get ourselves to, like, end and show up in the other
334 00:45:36.720 ⇒ 00:45:44.320 Katherine Bayless: But this is really cool, and I think, honestly, Kai, I mean, this is so exciting for, like, how we can try to scale this stuff out.
335 00:45:44.710 ⇒ 00:45:50.439 Chi Quinn: Yeah, yeah. I want to get my hands on Cursor. I, I, if I recall,
336 00:45:50.900 ⇒ 00:45:52.739 Chi Quinn: I feel like I, I, I…
337 00:45:53.110 ⇒ 00:45:58.619 Chi Quinn: I don’t know if I had… I know from Visual Studio, I could… could I add that, or… or…
338 00:45:58.760 ⇒ 00:46:00.699 Chi Quinn: Or is that something…
339 00:46:02.080 ⇒ 00:46:12.900 Katherine Bayless: Okay, so you could, you could, in theory, do the same as, like, Amber’s doing in Visual Studio Code, because, actually, I think Uttam had told us, like, Cursor’s just another flavor of VS Code under the hood, basically.
340 00:46:12.900 ⇒ 00:46:13.570 Chi Quinn: Right.
341 00:46:13.570 ⇒ 00:46:18.799 Katherine Bayless: But we can also get you set up with Cursor. I think the reason, when we did that demo a while back, and then we all were like.
342 00:46:18.800 ⇒ 00:46:19.330 Chi Quinn: Yeah.
343 00:46:19.330 ⇒ 00:46:19.980 Katherine Bayless: Gonna stick with.
344 00:46:19.980 ⇒ 00:46:20.820 Chi Quinn: Right.
345 00:46:20.820 ⇒ 00:46:22.889 Katherine Bayless: Because it was the,
346 00:46:23.110 ⇒ 00:46:32.650 Katherine Bayless: there was some wrinkle at the time around, like, the API and getting it all connected, but I think we’ve solved that, so if you wanted to get set up with Cursor, like, we can totally do that.
347 00:46:32.870 ⇒ 00:46:36.430 Amber Lin: And I can walk you through, like, it’s working perfectly fine.
348 00:46:36.760 ⇒ 00:46:40.840 Amber Lin: I’m also outside the organization, so it probably would be even easier for you.
349 00:46:41.250 ⇒ 00:46:56.789 Chi Quinn: Yeah, no, that’d be great. That would be really awesome. Yeah, because just… I was reading your notes, just what you had on the Slack channel, the other day, and I was like, oh my gosh, that’s really cool, like, I want to try that, because I just saw, like, the process from beginning of just gathering the questions.
350 00:46:56.790 ⇒ 00:47:03.110 Chi Quinn: And then just kind of the… there’s a whole process that was built and created. So yeah, I would be… I would be interested.
351 00:47:03.250 ⇒ 00:47:04.090 Amber Lin: Awesome.
352 00:47:04.780 ⇒ 00:47:06.609 Katherine Bayless: I empower you to make it happen.
353 00:47:07.990 ⇒ 00:47:28.620 Katherine Bayless: The other thing that might be interesting, too, to look at is, like, on Glean, like, I mean, even just maybe asking it, like, you know, what other sort of, like, published reports do we have running around out here that we could use for similar sort of, like, initial establishment of questions and stuff like that? Because there is probably a lot of stuff that we have published and parked on SharePoint somewhere that would just be…
354 00:47:28.620 ⇒ 00:47:29.020 Amber Lin: Yeah.
355 00:47:29.020 ⇒ 00:47:30.320 Chi Quinn: That would be…
356 00:47:30.320 ⇒ 00:47:36.759 Amber Lin: That would be really helpful if you guys can send me that. I… I can make more question sets.
357 00:47:36.990 ⇒ 00:47:40.630 Katherine Bayless: Yeah. We should also just get you access to Glean, potentially.
358 00:47:40.810 ⇒ 00:47:47.150 Amber Lin: Yeah, that would be helpful. Or just downloads would work if access is a big… big pain.
359 00:47:47.390 ⇒ 00:47:48.230 Amber Lin: At least.
360 00:47:48.230 ⇒ 00:47:49.669 Katherine Bayless: It’s a pain, but it also might be
361 00:47:50.020 ⇒ 00:47:53.149 Katherine Bayless: more than you need, and it just might be, like, overwhelming, I guess, potentially, too, so yeah.
362 00:47:53.150 ⇒ 00:47:54.609 Amber Lin: Yeah, that could… that could be true.
363 00:47:54.610 ⇒ 00:47:58.009 Katherine Bayless: Yeah, maybe the intermediary is not a bad idea.
364 00:47:58.010 ⇒ 00:48:12.760 Amber Lin: Okay, I kind of want to also, if there’s no questions I can answer here, I wanted to ask about what the… what our next steps would be. Like, right now, my plan is just to refine this registration.
365 00:48:12.910 ⇒ 00:48:16.220 Amber Lin: analyst, and then we can start
366 00:48:16.370 ⇒ 00:48:30.390 Amber Lin: on maybe, conferences, or maybe exhibitors, or maybe memberships, depending on where we want to go. And I also know we want to work on Streamlit.
367 00:48:30.390 ⇒ 00:48:38.510 Amber Lin: So, wanted to ask both of you, what timelines look like? Should that be in parallel? What should go first?
368 00:48:39.520 ⇒ 00:48:47.220 Katherine Bayless: Yeah, so I think I’ll give the… the kind of… the context for the humans, like, the background on where I think the, like.
369 00:48:47.300 ⇒ 00:48:57.000 Katherine Bayless: where I think the work will be distributed in the next, like, kind of time period, maybe, you know, a couple months. And then I’ll let you guys kind of take the specifics, out from there, but…
370 00:48:57.040 ⇒ 00:48:58.420 Katherine Bayless: So, we’re…
371 00:48:58.420 ⇒ 00:49:19.579 Katherine Bayless: we’re onboarding as much data as we can right now, right? Like, once we get to July 1st, that’s when the Innovation Awards program will launch, and, like, it kind of, at least in my experience last year, it sort of seemed like that’s, like, the first thing that’s really part of the CES, like, cycle that has that external engagement component, right? Like, it’s the first time we start talking to people about CES.
372 00:49:19.580 ⇒ 00:49:20.110 Amber Lin: That’s right.
373 00:49:20.110 ⇒ 00:49:43.350 Katherine Bayless: That kind of thing. And so, like, once we hit July 1st, people are going to switch into CES mode, and they’re going to switch into telemetry mode instead of, like, reporting mode. Like, right now, they’re very interested in, like, you know, looking backwards. How many of these, how many of those? Because they’re using that to inform their planning for the coming cycle. So, conferences is analyzing the session attendance data really heavily so that they can start figuring out
374 00:49:43.350 ⇒ 00:50:08.319 Katherine Bayless: what they want to put into the session catalog for this coming year. But as soon as that switch flips, it flips and flips hard from reporting to telemetry, and so, like, I think that’s where I’m like, okay, I want to land as much data and get it modeled and into Snowflake as possible, so that by the time the, like, what is, and how is this, and why, and, like, all those questions start coming in, we’ll have a lot of different data that we can pull together for
375 00:50:08.320 ⇒ 00:50:30.079 Katherine Bayless: people, because the magic trick that works every time we show people Snowflake is, yes, their data being in there, but also the other stuff, right? People are used to getting reports that come out of one data source, maybe two, if they’re lucky, and wait a couple weeks, right? And so, like, the big picture view is really exciting to folks, and it does help them ask, like, bigger, better questions.
376 00:50:30.080 ⇒ 00:50:42.229 Katherine Bayless: So, long way of saying, yes, all of this kind of data is streaming in, which means that this becomes, right, like, a bit of a backlog on the semantic view side, so in terms of prioritizing and picking, I think
377 00:50:42.610 ⇒ 00:50:55.209 Katherine Bayless: Membership and exhibit stuff is good, because a lot of the questions are gonna come from those two teams who do not share databases, or keys or anything. They’re…
378 00:50:55.230 ⇒ 00:51:05.010 Katherine Bayless: both going to want, like, well, how many members are exhibiting? How many exhibitors are members? What, like, this, that? Like, this interplay between their data sets is one of the most, like.
379 00:51:05.620 ⇒ 00:51:13.130 Katherine Bayless: clunky and tricky things that we deal with at the organization. One example, actually, that we were talking about yesterday was the membership team.
380 00:51:13.220 ⇒ 00:51:14.380 Katherine Bayless: They do.
381 00:51:14.410 ⇒ 00:51:39.399 Katherine Bayless: These poor guys. They do an audit of the exhibit sales, so they get, like, an infrequent, probably not terribly helpful report out of the sales team that they then use to manually cross-reference in an Excel sheet and figure out which members have an exhibit space, right? And they do this on a, like, monthly basis, and they talk about how long it takes and how intensive it is, and I do think they will still want to do an audit QA-type thing on the matching, but, like.
382 00:51:39.400 ⇒ 00:51:55.799 Katherine Bayless: we can get it to the point where it’s like, check these three records that the AI wasn’t sure about, you know? And so I think putting some thought into the semantic models for membership and for exhibit space will probably pay dividends as that sort of questioning type… as that sort of questioning ramps up, I guess.
383 00:51:56.280 ⇒ 00:52:06.009 Amber Lin: I see. And then, so we’ll do attendance registration, we’ll do membership and exhibits next, and then we’ll do, like, conferences last?
384 00:52:06.490 ⇒ 00:52:11.310 Katherine Bayless: Yeah, I mean, confr- well… Okay, yeah, good, good.
385 00:52:11.310 ⇒ 00:52:13.329 Amber Lin: Or should they be in parallel?
386 00:52:13.590 ⇒ 00:52:16.619 Katherine Bayless: I would actually almost…
387 00:52:17.120 ⇒ 00:52:31.219 Katherine Bayless: you might be able to knock conferences out pretty quickly, because we have scarce data on that program. I think there’s a lot of power to do more with the little bit of data we have, but, like, I don’t think we have a ton, so you might be able to, like, get that one kind of knocked out.
388 00:52:31.220 ⇒ 00:52:37.020 Amber Lin: That’s good. Yeah, yeah, basic questions should be okay, and then, like, more complex stuff we can do later.
389 00:52:37.020 ⇒ 00:52:44.410 Katherine Bayless: Yeah, yeah, exactly, exactly. So I would say conferences, membership, and exhibits, like, those three make sense. And then on the streamlit side.
390 00:52:44.970 ⇒ 00:52:47.350 Katherine Bayless: It’s interesting, I think…
391 00:52:47.350 ⇒ 00:53:12.320 Katherine Bayless: Personally, the, like, most frustrating thing about the Streamlit apps right now is just that, like, Claude Code is really good at building them for me, and, like, putting, like, you know, good data and code into them, but, like, just the visuals are terrible. Like, it keeps giving me, like, black text on black background. I’m like, I’m not a robot, and I don’t think you can read this either. And so, like, I think some governance of the, like, visual vibes of Streamlit are kind of urgently needed, at least in my
392 00:53:12.320 ⇒ 00:53:28.449 Katherine Bayless: opinion. I’m a dataset of one, though, so consider it an anecdote. But I also think people are going to want to see a lot more reports as their data lands, even if eventually they transition to more conversation, like that initial sort of, like, you know.
393 00:53:28.450 ⇒ 00:53:38.099 Katherine Bayless: emotional support, dashboard seems to really, like, pay dividends, where they’re like, okay, I’m in this big, new, scary tool called Snowflake, and I can do all kinds of things, but if I just come in and look at this dashboard.
394 00:53:38.200 ⇒ 00:53:57.859 Katherine Bayless: I can get the thing I need, and eventually, I build the confidence to start, you know, looking a little deeper. So I… I guess it’s a long way of saying, Streamlit’s worth focusing on, but I think it really is more just a, like, can we get them to look a little bit nicer right out of the gate? Because I don’t know that we’re really struggling to get them to be, like, useful, it’s just…
395 00:53:57.860 ⇒ 00:53:58.750 Amber Lin: Getting them.
396 00:53:58.750 ⇒ 00:54:05.400 Katherine Bayless: visible. Cool, okay. You might have a totally different take, though, because I know you’ve been in the weeds on them a lot more than I have.
397 00:54:05.700 ⇒ 00:54:30.500 Chi Quinn: Yeah, yeah, no, I’m kind of, like, in the same mindset, with you, because, of course, it’s… it’s one of those things, like, how can we make the… the visual, look a little bit more, easy on the eyes for the end user, and just what I’ve been working on, on the end, because I know you have the… the Power BI Streamlit, reports, so I’ve been working on just kind of creating, like, a mock-up, just kind of see, like, the design.
398 00:54:30.500 ⇒ 00:54:33.130 Chi Quinn: Of how can this look a little bit more
399 00:54:33.130 ⇒ 00:54:40.980 Chi Quinn: pleasing to the eye, and what I was also thinking, because I know if people are going to want to make reports in the future, that I guess
400 00:54:41.650 ⇒ 00:54:56.810 Chi Quinn: there… because that’s what I’m thinking ahead of, because I could see cases where, you know, people would just, right, say anything, and I guess, should there be, like, some type of structure of, here is a… the reports should at least have these basic
401 00:54:57.550 ⇒ 00:55:13.779 Chi Quinn: That, and then also, like, the process of when someone creates it, like, how can we, I guess, verify that this is, yes, it’s returning all the correct information, or whatnot. So I am kind of thinking more of, okay, I can see this, moving forward if people make
402 00:55:13.790 ⇒ 00:55:31.160 Chi Quinn: reports, and it’s going to, you know, it’s going to build up, and so it’s just kind of, like, thinking about, from the visual standpoint, how would it look? What’s the process? So, I’m… I’m there with you. I’m kind of looking at it from, you know, I guess jumping way ahead, and kind of working back to
403 00:55:31.160 ⇒ 00:55:34.970 Chi Quinn: Finding what is that sweet spot, basically.
404 00:55:35.760 ⇒ 00:55:39.110 Katherine Bayless: Yeah, I think some skills might help, too, with the streamlit stuff.
405 00:55:39.110 ⇒ 00:55:40.630 Chi Quinn: Like… Yeah.
406 00:55:40.630 ⇒ 00:55:41.080 Katherine Bayless: Yeah.
407 00:55:41.080 ⇒ 00:55:56.689 Amber Lin: Yeah, so I just searched it. I don’t think we’re able to use, like, the traditional themes, but we can have a standard, CSS or Markdown file on, hey, these are the standard things, we can put that into
408 00:55:56.890 ⇒ 00:56:15.140 Amber Lin: Maybe a rule, and then when we apply any skills to create new Streamlit apps, it will always reference this rule of, hey, you cannot have black on black, you need to have sections in this way. So, like, those are… will be interesting things, but
409 00:56:15.450 ⇒ 00:56:24.720 Amber Lin: who, like, who will be developing the Streamlit apps? Because then we’ll have to make sure that they have access to these skills and these rules.
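[Editor’s sketch] One way to encode the “no black on black” rule Amber describes: a small helper that builds the standard CSS block a skill or rule would drop into every Streamlit app, rendered with Streamlit’s real `st.markdown(css, unsafe_allow_html=True)`. The selector, default colors, and guard below are illustrative assumptions, not an agreed theme:

```python
def theme_css(text="#1a1a1a", background="#ffffff", accent="#0b5fff"):
    """Build the shared CSS block a Streamlit styling rule would inject.

    An app would render it with st.markdown(theme_css(),
    unsafe_allow_html=True). The guard is deliberately simple:
    identical text and background colors (black on black) are
    rejected outright rather than scored for contrast.
    """
    if text.strip().lower() == background.strip().lower():
        raise ValueError("text and background colors must differ")
    return (
        "<style>\n"
        f".stApp {{ color: {text}; background-color: {background}; }}\n"
        f"h1, h2, h3 {{ color: {accent}; }}\n"
        "</style>"
    )
```

A fuller version would check real contrast ratios rather than exact equality, but even this guard would have caught the black-text-on-black-background output Katherine mentioned.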
410 00:56:25.370 ⇒ 00:56:50.000 Katherine Bayless: So the sort of official point person on them is Kai, conveniently. Then we are hoping to democratize this to Teams, so we have a couple folks right now that have the ability to build them, mostly just as, like, they’re good early adopter beta testers. I’m kind of just curious to see if they, like, love it, hate it, use it, don’t ever log in, whatever. But the people who would build Streamlit apps beyond Kai, and I mean, obviously me and Kai might too sometimes.
411 00:56:50.120 ⇒ 00:57:04.909 Katherine Bayless: Like, they would only be people who are either already fairly tech-savvy and we’re letting them in to just play with it, or people we’re working with really closely to teach them how to, like, get comfortable doing this, so it won’t necessarily just be a free-for-all, I guess, if that makes sense.
412 00:57:05.250 ⇒ 00:57:06.600 Katherine Bayless: At least not initially.
413 00:57:07.510 ⇒ 00:57:23.819 Amber Lin: That’s good. Then that shouldn’t be an issue. Maybe, Kai, we can work together to develop a visual package. I know you guys have a theme in Power BI. I don’t know how we can export that, so if there’s, like, a CSS file…
414 00:57:23.820 ⇒ 00:57:28.840 Chi Quinn: Power BI.
415 00:57:29.420 ⇒ 00:57:30.280 Katherine Bayless: That’s like.
416 00:57:30.280 ⇒ 00:57:41.200 Chi Quinn: It’s a little better, but yeah, I mean, just the… yeah, because the blue on blue, it’s not the prettiest, or at least for my eyes, at least my eyes. I don’t know about everybody else’s eyes, but my eyes is…
417 00:57:41.430 ⇒ 00:57:46.139 Amber Lin: Yeah, then we can go download some templates, or we can go.
418 00:57:46.140 ⇒ 00:57:46.500 Chi Quinn: Yeah.
419 00:57:46.500 ⇒ 00:57:51.750 Amber Lin: Generate some stuff, we can experiment, and once we like it, we’ll say, hey, this is the theme that we’ll use.
420 00:57:52.010 ⇒ 00:58:06.880 Katherine Bayless: Yeah. It might be kind of fun, too, to, like, you know, take a, you know, AI on a field trip and, like, send it out to, like, do a bunch of looking at our website, or our press releases, or our research downloads, or something like that, like, because that Power BI style does not look like anything this organization…
421 00:58:07.530 ⇒ 00:58:08.570 Chi Quinn: Right? That’s funny.
422 00:58:08.570 ⇒ 00:58:18.249 Katherine Bayless: It’s like, it’s so high design, and yet it matches nothing. And so, like, honestly, figuring out, like, what is our style, like, I don’t know, right?
423 00:58:18.680 ⇒ 00:58:19.430 Katherine Bayless: Sure, then come back.
424 00:58:19.430 ⇒ 00:58:30.369 Amber Lin: Does, like, the website or branding team have the fonts and, like, that type of stuff they’re using? That’s something we can just take from them.
425 00:58:31.000 ⇒ 00:58:44.230 Katherine Bayless: They might, they probably do, whether it’s marketing that has it, or Jackie, our web dev, I don’t know, but maybe we could ask on Monday, when we have the meeting with them. Okay. Do you have a brand asset design library kind of thing.
426 00:58:44.230 ⇒ 00:58:55.560 Amber Lin: Yeah, yeah, that will… that will be the basis of what we use, because then they will have the colors, they have the fonts, they have the spacing, and how they do different tiles. So, yeah.
427 00:58:55.730 ⇒ 00:58:56.400 Amber Lin: They might.
428 00:58:58.840 ⇒ 00:58:59.930 Amber Lin: Awesome!
429 00:59:00.080 ⇒ 00:59:09.420 Amber Lin: Alright, we’re on time, but I want to make sure, like, we can have everything answered, and we can note down anything we want to talk about for Monday.
430 00:59:10.220 ⇒ 00:59:11.500 Katherine Bayless: Yeah, I think we’re in good shape.
431 00:59:11.900 ⇒ 00:59:13.960 Amber Lin: Okay. Exciting.
432 00:59:15.340 ⇒ 00:59:26.900 Katherine Bayless: I am very excited. I am very ready for the weekend, but I’m also very excited. Like, I think… I think… yeah. I think the next few months are gonna be fast, but productive. Like, it’s good.
433 00:59:27.390 ⇒ 00:59:27.780 Amber Lin: Awesome!
434 00:59:27.980 ⇒ 00:59:28.720 Katherine Bayless: traction.
435 00:59:29.600 ⇒ 00:59:33.749 Amber Lin: Alright, lovely to see you both. I’ll see you guys on Monday.
436 00:59:33.750 ⇒ 00:59:34.800 Katherine Bayless: Alright, have a great weekend.
437 00:59:34.800 ⇒ 00:59:36.210 Chi Quinn: Alright, alright.
438 00:59:36.260 ⇒ 00:59:36.880 Amber Lin: Bye!