Meeting Title: Brainforge x CTA: Weekly! Date: 2026-05-08 Meeting participants: Chris Terry, Kyle Wandel, Katherine Bayless, Amber Lin, Chi Quinn, Awaish Kumar, Ashwini Sharma, Uttam Kumaran


WEBVTT

1 00:00:30.690 00:00:31.680 Kyle Wandel: Hey, Chris.

2 00:00:31.680 00:00:32.719 Chris Terry: Hey, how’s it going?

3 00:00:32.720 00:00:37.319 Kyle Wandel: Good. Probably off… off television. I’m gonna eat my lunch right now, I hope so.

4 00:00:37.500 00:00:40.809 Chris Terry: Yeah, I just… I had a quick bite to eat myself, so you’re all good.

5 00:00:43.880 00:00:45.850 Kyle Wandel: We'll see how long it takes everybody to get in.

6 00:00:46.030 00:00:50.050 Chris Terry: Yeah, that’s always… It always does take a minute.

7 00:00:53.300 00:00:55.759 Kyle Wandel: Who did you, interview at 11?

8 00:00:55.920 00:00:59.280 Chris Terry: It was… Adrian.

9 00:00:59.410 00:01:00.000 Chris Terry: So…

10 00:01:00.000 00:01:04.480 Kyle Wandel: Oh, nice, cool. Yeah, I’ve worked with Adrian a few times, actually, before I even joined DataOps.

11 00:01:06.480 00:01:13.530 Chris Terry: Yeah, she gave me a few, few things to go off of, and some pretty good ideas. It was actually very productive, I think it’ll go well with everybody else, so…

12 00:01:14.570 00:01:25.429 Kyle Wandel: Good, yeah, she's pretty… hey, Katherine. Adrian's pretty, pretty knowledgeable on pretty much everything CES, like, she's really, really knowledgeable on pretty much everything, so she's an excellent person to interview first.

13 00:01:25.780 00:01:26.660 Chris Terry: Absolutely.

14 00:01:28.220 00:01:32.350 Katherine Bayless: Yeah, very much Team Adrian. She's delightful.

15 00:01:32.520 00:01:40.539 Katherine Bayless: And also, like, super excited about some of the stuff that we’re gonna be able to build, I think, this year, now that we’ll have so much better access to that data.

16 00:01:41.870 00:01:49.660 Chris Terry: Yeah, she was talking about the, RFID stuff, and I guess getting a quote on the cost on all that. Apparently, that’s a lot more expensive than what y’all have been doing.

17 00:01:50.100 00:02:12.679 Katherine Bayless: Yeah, I mean, like, genuinely, probably will be, like, 10x or something insane. Right now, we’re just, literally, we’re printing QR codes on people’s, essentially, name tags, right? Like, there’s a big difference between paper with a picture on it that does things, and, like, you know, something that can actually be scanned. So, like, you know, I get it that it’s an increase, but I’m like, Kinsey wants this, we need it, and I would be…

18 00:02:12.680 00:02:19.609 Katherine Bayless: disappointed and surprised if the organization wasn’t willing to, like, you know, just a leap, but we’ll see. We’ll see.

19 00:02:20.330 00:02:21.320 Chris Terry: Absolutely.

20 00:02:21.320 00:02:36.730 Katherine Bayless: Yeah. But we had the conversation, with the Merits team this morning around the, Snowflake data share, and so May 25th, or week of May 25th, since it's the holiday, but week of May 25th, we should have the Merits data share live in Snowflake.

21 00:02:36.940 00:02:37.270 Chris Terry: That’s.

22 00:02:37.270 00:02:42.140 Kyle Wandel: Okay, perfect. I was gonna ask about that in terms of, the… that conversation.

23 00:02:42.140 00:02:46.010 Katherine Bayless: Yeah, I’m, like, I’m so excited, and it’ll be…

24 00:02:46.010 00:03:08.730 Katherine Bayless: The same as, like, the one from Remembers. Sorry, Claude is asking me for all the permissions. It'll just be the same, like, the one for Remembers, in that it'll just be, like, our data, but otherwise their structure. It'll refresh… initially, I was like, I mean, we're not doing reg, so it doesn't seem like we need constant refreshes of the data, so I think it'll do, like, once per day, and it'll switch to once an hour as we get closer to reg opening.

25 00:03:08.730 00:03:33.669 Katherine Bayless: And then they can have it actually, refresh up to, like, every 15 minutes, although I appreciated that the guy on their side was like, just to be clear, though, like, we should never use a snowflake data share for, like, live integrations, even if it is every 15 minutes. And I was like, yes, this is correct, this is the way. So, like, we would still use their APIs for anything we were actually trying to, like, live integrate, but we could at least be getting the data from a reporting perspective every

26 00:03:33.670 00:03:38.520 Katherine Bayless: 15 minutes, which would be kind of interesting. I don’t know that we need it that fast, but, you know.

27 00:03:38.520 00:03:38.900 Chris Terry: No.

28 00:03:38.900 00:03:40.570 Katherine Bayless: an option.

29 00:03:41.750 00:03:48.410 Chris Terry: You mentioned that, there was a bunch of stuff that we weren’t tracking, but apparently Merits is gonna start tracking it now, so…

30 00:03:49.470 00:04:02.910 Chris Terry: Yeah, so it seems pretty good. They, you know, I was kind of asking, like, you know, what kind of stuff do you guys need? And she’s like, yeah, a lot of… a lot of merits, like, because they’re into the AI stuff now, too, and I guess they’re getting a lot more, information back to them, so they seem pretty happy with that.

31 00:04:03.320 00:04:09.750 Katherine Bayless: Yeah, yeah, and I think, part of her, sort of angle, too, is, like, you know, if we can automate, you know.

32 00:04:10.220 00:04:25.269 Katherine Bayless: Because, in fairness to Merits, they have put up with our lack of technical sophistication for a while, and so it's like, we can make everybody's lives easier by finally putting some of this stuff in place, and then, yeah, if they're using AI on their side, and then if we can help with the customer service piece.

33 00:04:25.270 00:04:33.740 Katherine Bayless: I think Adrian’s angle is like, okay, also, maybe this contract shouldn’t be quite so expensive. And so, yeah, I’m like, totally team, totally team that argument.

34 00:04:33.740 00:04:34.930 Chris Terry: For sure.

35 00:04:35.590 00:04:37.250 Katherine Bayless: But yeah.

36 00:04:37.850 00:04:41.030 Katherine Bayless: Anyway… How’s everybody doing? Friday.

37 00:04:41.600 00:04:42.429 Chris Terry: Doing great.

38 00:04:42.880 00:04:43.530 Katherine Bayless: Yeah.

39 00:04:44.670 00:04:47.049 Kyle Wandel: Yeah, little band screaming her head off next to us, so…

40 00:04:49.680 00:04:52.280 Katherine Bayless: I mean, that’s not distracting at all, is it?

41 00:04:52.280 00:04:52.900 Kyle Wandel: Okay.

42 00:04:53.210 00:04:56.960 Kyle Wandel: Wife's… wife's on the call right now, so that's good.

43 00:04:57.350 00:04:57.700 Awaish Kumar: Hello!

44 00:04:58.700 00:04:59.540 Katherine Bayless: Very fast.

45 00:05:00.350 00:05:00.850 Katherine Bayless: Right.

46 00:05:00.850 00:05:01.550 Awaish Kumar: Okay.

47 00:05:02.330 00:05:03.420 Katherine Bayless: Ed, how’s it going?

48 00:05:04.210 00:05:05.140 Awaish Kumar: I’m good.

49 00:05:06.760 00:05:07.890 Awaish Kumar: Oh, it’s been…

50 00:05:09.460 00:05:20.529 Katherine Bayless: It's been… it's been a good week. So, the team on this side has already heard the information. I think, Amber, we ended up talking about this when we had our working session the other day, but,

51 00:05:20.530 00:05:40.490 Katherine Bayless: just for the benefit of this full thread, so, our conferences team did choose to let us in-house build the speaker selection, software, instead of going with the vendor, Sessionboard. So we will indeed be launching our first, you know, disposable software into the wild.

52 00:05:40.490 00:05:44.510 Katherine Bayless: The speaker selection process opens June 30th.

53 00:05:44.510 00:05:56.219 Katherine Bayless: And so, between now and then, my team, this team, Jay's team, like, we'll all be kind of working together to support the conferences team, getting the application into a place where, it's, you know, ready for

54 00:05:56.220 00:06:21.030 Katherine Bayless: live. The nice thing is that the piece that will be external-facing is really… it’s just a web form at the end of the day, so it’s not anything super crazy. And so, like, in that way, I think it’s a good light testing of, you know, what this looks like. And it’s a smaller portion of the audience. Not all 150,000 CES attendees are involved in the conference content. It’s probably a universe of about 3,000 or so people.

55 00:06:21.240 00:06:37.219 Katherine Bayless: Typically. And so it’s a nice, like, perfect sort of, you know, dip our toes in, the lava, I mean water, right? Then on the back end, the admin side, I think it’ll be really interesting to see where we wind up going with it. I…

56 00:06:37.860 00:06:39.619 Katherine Bayless: I anticipate us

57 00:06:39.620 00:06:58.900 Katherine Bayless: like, building out some things that we end up being able to use for a bunch of different sort of use cases. So, like, it’s essentially an admin interface with a bunch of data enrichment coming out of our data lake, and then, you know, the ability to change the state on an entity. And so I’m like, that pattern repeats really nicely across some of the other use cases, so…

58 00:06:58.900 00:07:11.290 Katherine Bayless: I told Adrian, we can kind of, in parallel, we can build the… using the same building blocks, we can do the pre-credentialing stuff as well, because it’s the same idea. Web form on the front end, and then admin interface, managing states.

59 00:07:11.290 00:07:20.320 Katherine Bayless: on the back end. And that gives us two sort of similar but different use cases to get telemetry out of, with the exercise.
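
The repeating pattern Katherine describes here — a web form on the front end and an admin interface that changes the state on an entity on the back end — can be sketched roughly as follows. This is purely illustrative; the entity, state names, and class are hypothetical, not the actual application.

```python
# Hypothetical sketch of the "admin interface managing states" pattern:
# an application entity with an explicit set of allowed state transitions.
# All names here are made up for illustration.

ALLOWED_TRANSITIONS = {
    "submitted": {"under_review", "withdrawn"},
    "under_review": {"accepted", "rejected"},
    "accepted": set(),
    "rejected": set(),
    "withdrawn": set(),
}

class Application:
    """A speaker (or pre-credentialing) application with a managed state."""

    def __init__(self, applicant: str):
        self.applicant = applicant
        self.state = "submitted"      # every application starts here
        self.history = ["submitted"]  # audit trail of state changes

    def transition(self, new_state: str) -> None:
        # Reject transitions the workflow doesn't allow.
        if new_state not in ALLOWED_TRANSITIONS[self.state]:
            raise ValueError(f"cannot go from {self.state!r} to {new_state!r}")
        self.state = new_state
        self.history.append(new_state)

app = Application("Jane Speaker")
app.transition("under_review")
app.transition("accepted")
print(app.history)  # ['submitted', 'under_review', 'accepted']
```

The same skeleton covers both use cases mentioned (speaker selection and pre-credentialing), since only the transition table would differ.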

60 00:07:20.320 00:07:33.169 Katherine Bayless: Which’ll be cool. I think for this team, the thing to kind of, like, have in mind is, like, okay, this is a game changer in the sense of, instead of us just getting, you know, an export or, you know, even an API connection out of a vendor, like.

61 00:07:33.170 00:07:52.860 Katherine Bayless: what telemetry, what user behaviors, what things do we want to collect out of an application that we actually control? And that includes on the admin side, too, right? Like, not necessarily in a creepy way, but, like, are there things we can learn about internal behaviors by paying attention to how the application gets used, so that we can refine that UX

62 00:07:52.860 00:08:07.309 Katherine Bayless: you know, initially, nicely, and then eventually, programmatically. We used to use a product called FullStory, I don't know if anybody's ever worked with that before, but it's, I mean, it's cool, it was expensive, but it basically is, like.

63 00:08:07.310 00:08:13.410 Katherine Bayless: Google Analytics inside of a SaaS platform, and so it’s, similar in that you can, like.

64 00:08:13.410 00:08:27.840 Katherine Bayless: kind of tag and identify all the things that you’re tracking and look for, like, specific conversions. So, like, if we added a new button to a page and we wanted to know if people were, like, using that one versus kind of gravitating towards the older path, like, we could actually track it that way.

65 00:08:27.840 00:08:39.470 Katherine Bayless: Also helpful for debugging and that kind of stuff, when somebody would be like, hey, I got a weird error, right? So yeah, so just kind of thinking around, like, you know, what are some of the things we might want to collect out of the application?
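
The tagging-and-conversion idea described here (tracking whether users take a new button versus the older path) could be sketched as a tiny in-process event tracker. This is a hypothetical illustration in the spirit of FullStory-style tagging, not a real analytics API; event names and users are invented.

```python
from collections import Counter

# Minimal hypothetical event tracker: tag UI events, then compare
# adoption of a new path against the older one.
events: list[dict] = []

def track(user: str, event: str) -> None:
    """Record one tagged UI event for later analysis."""
    events.append({"user": user, "event": event})

# Simulated usage: two users try the new button, one sticks to the old menu.
track("u1", "click_new_export_button")
track("u2", "click_new_export_button")
track("u3", "click_old_export_menu")

counts = Counter(e["event"] for e in events)
print(counts["click_new_export_button"], counts["click_old_export_menu"])  # 2 1
```

In practice the same event log doubles as a debugging trail ("I got a weird error") because each user's actions can be replayed in order.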

66 00:08:39.580 00:08:58.170 Katherine Bayless: As we start to build these things. I think it's gonna be fun. But other than that, this week, two other things to kind of share out. One is, I did do the demo of, Snowflake and Streamlit to both Kaylee and Caitlin, Gary and Kinsey's executive assistants, and they were…

67 00:08:58.230 00:09:12.499 Katherine Bayless: Definitely, rather in love, they liked it. Yesterday, I got a Slack message from, Kaylee saying, hey, I’m having trouble with the sidebar, it won’t let me edit the Streamlit app. And I was like, oh…

68 00:09:12.730 00:09:22.719 Katherine Bayless: that’s not the sidebar’s fault. You just… we don’t… you don’t have that role. I was like, but there is a role for Streamlit app editing, if you would like. And so,

69 00:09:22.770 00:09:41.930 Katherine Bayless: Yeah, operating under the, the paradigm that, you know, everybody should have the tools that they're asking for, yeah, I'm gonna meet with Kaylee this afternoon, and I'll give her the Streamlit creator role, and we'll see what chaos ensues. I'm kind of excited. But yeah, I was just laughing. I saw her Slack message, and she's like, I can't get it to do it, and I'm like, well, that's because we've locked it down.

70 00:09:43.110 00:10:02.990 Katherine Bayless: But yeah, so we’ll see. It’ll be interesting. She’s, the use case that she’s got at the moment is the, board nominations, like, tracker that kind of says, like, okay, these are all the people that applied for board seats, and then these are all the things in the universe of data about them. And so, you know, I don’t mind if she wants to tinker with it.

71 00:10:03.030 00:10:03.720 Katherine Bayless: Good learning.

72 00:10:03.720 00:10:04.319 Amber Lin: I see.

73 00:10:04.610 00:10:05.800 Katherine Bayless: Cool.

74 00:10:05.990 00:10:24.240 Katherine Bayless: Yeah, and then the other thing, I’m hoping, to actually merge the PR… well, I really wanted to get it done before I jumped on this, but we’ll see. At least at some point today, for the ExpoCAD pipeline stuff. So this has been quite the, journey of, like, well.

75 00:10:24.280 00:10:38.100 Katherine Bayless: fits and starts. So this was taking the old ExpoCAD pipeline that made the queries out to the live system, brought the data in, shook it all up together, and then put out the Slack notification with the report.

76 00:10:38.540 00:11:01.760 Katherine Bayless: Sorry again to the team, for spamming you guys at, like, 7 o’clock. The notification was wired up against each Lambda function completing, and so, there’s one report, but you got, like, 7 notifications, and I’m sorry. Fixed the bug. But yeah, so what I’ve done is, instead of it being a standalone application that handled, like, everything from the ingest to the notification, is I’ve just brought it into the DataOps repo, and so…
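
The duplicate-notification bug described here — one report, but a Slack message wired to each Lambda function's completion, so seven messages — and its fix can be sketched as a simple fan-in: aggregate completions and notify once when the last function reports in. Function names are illustrative, not the actual pipeline's.

```python
# Hypothetical sketch of the bug fix: instead of notifying on *each*
# function's completion (N duplicate messages for one report), track
# completions and send a single notification when all are done.

EXPECTED_FUNCTIONS = {"ingest_invoices", "ingest_booths", "ingest_contacts"}

completed: set[str] = set()
notifications: list[str] = []

def on_function_complete(name: str) -> None:
    completed.add(name)
    # Notify only once, when the last expected function has finished.
    if completed == EXPECTED_FUNCTIONS:
        notifications.append("ExpoCAD report ready")

for fn in ["ingest_invoices", "ingest_booths", "ingest_contacts"]:
    on_function_complete(fn)

print(len(notifications))  # 1, not one message per function
```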

77 00:11:02.370 00:11:09.370 Katherine Bayless: I took, I took a look at the API, because there were some endpoints we weren’t calling previously, so I’ve added those.

78 00:11:09.370 00:11:28.020 Katherine Bayless: I tried to go through and do all of the, like, cleanup around some of the previous attempts, because we had already had some ExpoCAD data that we’d migrated from the other S3 bucket, and then we had all the, like, map your show kind of artifacts and stuff, and so I think I’ve got it all cleaned up, so we’ll now have just the one pipeline with…

79 00:11:28.020 00:11:28.970 Katherine Bayless: I think it’s…

80 00:11:30.110 00:11:45.239 Katherine Bayless: might be… it is either 7 or 11 endpoints that we're hitting now, but we'll get the invoices data, that's the big piece, because I know there's a lot of teams that really want to drive things off of those invoices getting paid, so we'll be able to figure out those use cases.

81 00:11:45.240 00:11:50.850 Amber Lin: Is there anything I should change in the semantic views now that the ExpoCAD data has changed?

82 00:11:51.340 00:12:16.169 Katherine Bayless: Yeah, it’s a good question, and I think I might have noticed one semantic view that was on top of the ExpoCAD data, but I wasn’t super sure. I just left it aside for the moment. Functionally, the data is essentially the same. Most of what I’ve changed is the way the pipelines are handling it, but it probably would be good for you and I to pair and just make sure that I don’t introduce more chaos than I realize.

83 00:12:16.170 00:12:21.969 Katherine Bayless: Because the data in terms of, like, field names and field types and structures hasn’t really changed, other than.

84 00:12:21.970 00:12:22.300 Amber Lin: Okay.

85 00:12:22.300 00:12:27.710 Katherine Bayless: having the new endpoints, but still worth, yeah, I think just double checking.

86 00:12:28.170 00:12:29.990 Amber Lin: Cool, sounds good.

87 00:12:29.990 00:12:30.370 Katherine Bayless: Yeah.

88 00:12:30.370 00:12:38.999 Amber Lin: We have a few things on our side we wanted to discuss, but I want to make sure you can go through everything on your mind first.

89 00:12:39.430 00:12:48.270 Katherine Bayless: No, no, I’m actually… it’s a good segue. The other piece I’ll talk about, but we can parking lot it for towards the end, is just the AWS migration stuff, but yeah, let’s dig in on what you got.

90 00:12:48.660 00:12:49.260 Amber Lin: Okay.

91 00:12:49.300 00:13:08.860 Amber Lin: Our physical in-person rollout is coming soon. It’s gonna be Tuesday or Wednesday, and then we have Friday, we have Monday, and a little bit Tuesday, to make sure that we’re ready for at least the initial rollout. And…

92 00:13:09.350 00:13:13.510 Amber Lin: I’m looking at our task list.

93 00:13:13.610 00:13:17.219 Amber Lin: Let me quickly share this table.

94 00:13:20.810 00:13:26.880 Amber Lin: So, I am looking at… rollout.

95 00:13:27.000 00:13:35.499 Amber Lin: So… we have a… few things we want to make sure that we can

96 00:13:35.900 00:13:40.000 Amber Lin: do. So, I know Sandbox has been removed.

97 00:13:40.640 00:13:48.250 Amber Lin: we still need to, make sure we have Trelica, confirm that we…

98 00:13:48.950 00:14:01.849 Amber Lin: clean up some accesses for the chat role. I think still, right now, it can access DevMars and perhaps RAW, so we might need to look into that and clean that up before Tuesday.

99 00:14:02.230 00:14:09.080 Amber Lin: And then, we are also working on the user guide, which not only includes how do we use

100 00:14:09.140 00:14:23.190 Amber Lin: semantic views, the Cortex Analyst sidebar, and also Streamlit. We also want to make sure they know, how do you get support, how do you submit tickets, how do you, get access with Trelica? Like, there's those types of things.

101 00:14:23.790 00:14:27.009 Amber Lin: And then we want to make sure that we have

102 00:14:27.180 00:14:34.110 Amber Lin: The support channel, we have a, launch email ready, and now we have a…

103 00:14:34.260 00:14:44.670 Amber Lin: adoption dashboard. So I think I want to talk about the more admin stuff, and then once we’re done with that, I want to talk about

104 00:14:44.670 00:15:04.050 Amber Lin: what is the state of done that we're looking for at rollout? I have more of an idea of… that's more related to creating a feedback loop than being absolutely done, because I don't think anyone knows what done looks like right now.

105 00:15:04.370 00:15:13.060 Amber Lin: But would love to just talk about the… some admin stuff related to rollout, and then we can talk about the actual content that I’m creating.

106 00:15:14.690 00:15:15.760 Katherine Bayless: Yeah, sounds great.

107 00:15:17.150 00:15:21.439 Amber Lin: I think Uttam said he will be working with Ian on…

108 00:15:21.580 00:15:30.160 Amber Lin: Trelica, I think another thing we wanted to do is for the support channels. I know we discussed having Slack

109 00:15:30.270 00:15:41.640 Amber Lin: And Asana. Any ideas there? Are we using Asana or Slack, or are we good with setting up Slack?

110 00:15:41.810 00:15:43.339 Amber Lin: That’s a.

111 00:15:43.340 00:15:52.019 Katherine Bayless: Yes, I think I’ll set up the Slack channel… maybe, actually, I can… I can knock that out today. I’ll set up the Slack channel. It won’t…

112 00:15:52.180 00:15:57.990 Katherine Bayless: be necessarily, like, just Snowflake stuff. I think…

113 00:15:58.990 00:16:21.940 Katherine Bayless: I mean, I think, truthfully, the lines are all kind of, you know, smushing together, and so it'll be mostly, like, an AI people, Slack channel kind of a thing, and so it'll cover, like, Snowflake and Claude Code and other tools as we're rolling them out, and that way, it kind of helps reinforce what we've been talking about, which is, like, we don't want Slack to be, like, the primary place you actually request support, you know, officially, but rather just.

114 00:16:21.940 00:16:22.310 Amber Lin: Just like me.

115 00:16:22.310 00:16:23.139 Katherine Bayless: Community of practice.

116 00:16:23.140 00:16:24.030 Amber Lin: questions.

117 00:16:24.030 00:16:26.110 Katherine Bayless: Yeah, exactly.

118 00:16:26.110 00:16:34.009 Amber Lin: Do you mind doing a basic version and just adding me so that I can… I can send reminders and I know what I'm talking about.

119 00:16:34.300 00:16:35.260 Katherine Bayless: Yeah, yeah, sure, sure.

120 00:16:35.260 00:16:40.879 Kyle Wandel: Do we want to break it down by roles, too, or parse it up by roles, so, like,

121 00:16:41.100 00:16:46.829 Kyle Wandel: if we want to think about, like, QAing people, or people doing QA stuff, or is that mostly limited to Asana, basically?

122 00:16:47.860 00:16:49.340 Katherine Bayless: I mean, I think…

123 00:16:49.690 00:16:59.179 Kyle Wandel: I'm envisioning the questions from an explorer will be different than the questions from a builder, different questions than whatever, so I wasn't sure if we want to do that, or just put it all under one.

124 00:16:59.830 00:17:19.779 Katherine Bayless: I think all-in-one is where my head goes. I think if we wanted… like, if we noticed that maybe there was, like, friction because the two streams are very different, we could always create an additional channel and let people, you know, self-migrate. But I think keeping it all in one place is gonna be better from the, like, community of practice perspective.

125 00:17:21.020 00:17:23.160 Kyle Wandel: Okay, sounds good.

126 00:17:24.470 00:17:30.140 Amber Lin: So we’ll do that, and any setups we need for Asana?

127 00:17:30.980 00:17:38.949 Katherine Bayless: I’ll tag Kai in on that one. I don’t know… I can’t remember off the top of my head if we decided we wanted to keep the existing request board or start up a new one.

128 00:17:38.950 00:17:52.110 Chi Quinn: We could keep it. I think just the… the issue that I had was just a few cosmetic changes, but I guess I would say let’s keep it in the current Asana that we have right now.

129 00:17:52.350 00:18:09.920 Chi Quinn: the only thing I know for sure is that when a request comes in, if we're still doing it by, like, the form request, it's still going to go to the right, section of the Not Started, which is totally fine. So it's still good as is, so I would say keep it there.

130 00:18:10.100 00:18:10.530 Katherine Bayless: Okay.

131 00:18:10.530 00:18:18.159 Amber Lin: Cool. Can I be added to the board? I want to do some testing and make sure I can create a user guide.

132 00:18:18.430 00:18:19.290 Chi Quinn: Sure.

133 00:18:19.540 00:18:20.190 Amber Lin: Cool.

134 00:18:20.760 00:18:24.750 Amber Lin: okay.

135 00:18:25.030 00:18:32.909 Amber Lin: So… If we can set that up, and then for the announcement.

136 00:18:33.180 00:18:40.130 Amber Lin: Do we have an email draft that we want to send out, or how do we want to do the announcement?

137 00:18:40.850 00:18:58.980 Katherine Bayless: Yeah, so I think, as I talked to Christina yesterday, I think what we’re gonna do is, like, less of a, like, email announcement around, like, Snowflake specifically, but more… similar to, like, the Slack channel. It’s more of just, like, a, you know, we’re basically gonna take most of the…

138 00:18:58.980 00:19:02.459 Katherine Bayless: Most of the training wheels off a lot of these things, and so, you know…

139 00:19:02.460 00:19:23.130 Katherine Bayless: expanding access to capability for everybody, and so it'll probably be an email that she will send, that'll talk about other things. What I am gonna do, though, is post, like, in the current all-staff Slack, an invitation to make the, kind of, the Wednesday open house available to anybody who wants to come, so, like.

140 00:19:23.520 00:19:42.469 Katherine Bayless: will note that, like, we're going to be doing primarily a deep dive around, like, Snowflake things, but anybody who's interested is welcome to attend. Yeah, it seems… like, my understanding coming out of the AI retreat, or retreat that ended up being very much about AI, that leadership went on this week, is that the directive going forward is, like.

141 00:19:42.470 00:19:57.819 Katherine Bayless: everybody tilted everything. Like, everybody try AI first, everybody get fancy tools, all the things. I mean, we’ll see exactly what some of the specifics wind up looking like, but yeah, they really want to roll out more capability.

142 00:19:57.890 00:20:00.619 Katherine Bayless: To anybody who’s willing to give it a shot.

143 00:20:00.870 00:20:16.930 Amber Lin: Okay. Would you need anything from me or from us on drafting that announcement, or do you feel good on, speaking on what we have now, or what we plan to do next, and what’s available?

144 00:20:17.740 00:20:40.639 Katherine Bayless: Yeah, I mean, I think in terms of the announcement, I’ve got what I need. I think what I would encourage this team to think about is, like, going forward, like, what does it look like for us to not necessarily, like, make announcements per se, but, like, you know, what does it look like for us to, like, have release notes? And, like, how are we gonna, like, let people know that XYZ data source is now, you know, end-to-end, kind of QA’d, and has a semantic view, and that kind of thing?

145 00:20:40.640 00:20:48.720 Katherine Bayless: I assume the logical, initial pass at this is, like, a post to the Slack channel on a, you know, weekly or so basis.

146 00:20:48.720 00:20:56.729 Katherine Bayless: But open to kind of whatever makes sense. But I think we do want to think about how we’re going to keep people informed as things roll out.

147 00:20:57.320 00:21:06.449 Awaish Kumar: I… I think the… the communities, like DBT, Airflow, they do… they do the announce… they have announcements, Slack channel, and they…

148 00:21:06.780 00:21:17.209 Awaish Kumar: post only the announcements there. There’s no communication in there. It’s just every… and there… it will be, like, a cut of announcements every week, so people can see, but…

149 00:21:17.740 00:21:30.410 Katherine Bayless: Yeah, that’s true, we could do that too. I mean, I think there’ll be probably release notes that are around, like, you know, data and snowflake-y things, but then there will also be, like, the skills and some of the stuff, and yeah, so, like…

150 00:21:30.630 00:21:37.410 Katherine Bayless: Yeah, we could do it as a standalone Slack channel with, yeah, like a digest, I guess, kind of thing.

151 00:21:38.270 00:21:38.990 Katherine Bayless: Yeah?

152 00:21:41.130 00:21:41.780 Amber Lin: Cool.

153 00:21:41.940 00:21:52.080 Amber Lin: Let me see… Great. And then, with this app and stuff in mind, I want to talk about

154 00:21:52.730 00:21:59.440 Amber Lin: what state our work should be in when we present to

155 00:21:59.590 00:22:08.909 Amber Lin: folks, because I know we don't really… we can't pinpoint what is the definition of done, and folks

156 00:22:09.250 00:22:19.179 Amber Lin: don’t know that as well. So, I think what we’re trying to show to them is, hey, instead of saying this is perfect, we’re saying, hey, we have

157 00:22:19.200 00:22:38.619 Amber Lin: the adoption dashboard in place, we have these feedback loops in place. We can take your feedback, look at your usage, and improve on these things, and as you said, on the weekly or bi-weekly basis, say, hey, we got feedback on this area, this is what has changed.

158 00:22:38.650 00:22:40.840 Amber Lin: This is what is to come.

159 00:22:40.930 00:22:49.050 Amber Lin: And I think it’s more important in the short period before the rollout to set up that feedback loop.

160 00:22:49.340 00:22:57.819 Amber Lin: To demonstrate, like, hey, this is a work in progress, and… and in the future, to show in the consistent cadence that we’re improving.

161 00:22:57.920 00:23:02.389 Amber Lin: So, wanted to talk about

162 00:23:02.520 00:23:12.279 Amber Lin: What that would look like, and also just how do we present our current product, which is not perfect yet.

163 00:23:13.110 00:23:19.449 Katherine Bayless: Yeah, yeah, great questions. So, a couple thoughts. One is, I think…

164 00:23:20.060 00:23:43.059 Katherine Bayless: I think it would be worth… Kyle had kind of taken a first sort of start at some of this, I think intended more for maybe, like, our use case, but it lends itself nicely to this, like, inventorying all of the different data sources that we have, and, like, who is the point of contact for them? Because, you know, yes, our team is the point of contact for the pipeline, but we might not really know much about the data that’s flowing through it, and so if it’s…

165 00:23:43.140 00:24:07.929 Katherine Bayless: you know, Shopify data, there’s multiple points of contact, right? Are you looking for research downloads? You should go talk to that team. If you’re looking for sponsorship stuff, you should go talk to that team. If you’re dealing with the integration and it’s not playing nice, like, that goes to, you know, maybe Kyle, the other Kyle on the IT team, or to the membership team, but, like, I think we want to start visualizing the data sources with the sort of lineage, like, from a

166 00:24:07.930 00:24:24.000 Katherine Bayless: human custody standpoint, like, as well as the readiness level, and so if something is, you know, newly being ingested by us, and we’re not, you know, haven’t even really gotten started at all on the semantic views or QA stuff, then, you know, maybe that’s one sort of color or label.

167 00:24:24.000 00:24:45.450 Katherine Bayless: Things that are actively in QA and, you know, reasonably, we expect them to behave well in the wild, that can be a different color label, and then things that are like, you know, no, this is actually, this is my best work, this is my Michelangelo, I expect it to perform flawlessly, right? So, like, I think we need a way to show our users what data sources we have and what state they’re in.
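
The readiness ladder Katherine describes — newly ingested, actively in QA, fully polished — lends itself to a small catalog structure. The sketch below is hypothetical: the source names, contacts, and label wording are invented to illustrate the "what data sources we have and what state they're in" inventory, not an actual schema.

```python
from enum import Enum

# Hypothetical readiness labels for the data-source inventory:
# each source carries a human point of contact and a QA state.
class Readiness(Enum):
    INGESTED = "newly ingested, no semantic view or QA yet"
    IN_QA = "actively in QA, expected to behave reasonably in the wild"
    GOLD = "fully QA'd, semantic view in place"

# Illustrative catalog entries (names and owners are made up).
catalog = {
    "expocad": {"contact": "DataOps", "readiness": Readiness.INGESTED},
    "shopify": {"contact": "membership team", "readiness": Readiness.IN_QA},
}

# A consumer can check a source's state before trusting it.
print(catalog["shopify"]["readiness"].name)  # IN_QA
```

The same record could later grow lineage fields (who owns the data flowing through the pipeline, per domain), which is the "human custody" dimension mentioned above.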

168 00:24:45.720 00:24:51.620 Katherine Bayless: With the reinforcement of, like, you know, we own the pipeline, but not necessarily the…

169 00:24:51.750 00:25:03.289 Katherine Bayless: use, or the usability of the information, which then leads into the other piece, is, for these, like, QA tools and building out the semantic views and stuff, like, I think…

170 00:25:03.840 00:25:28.439 Katherine Bayless: not that they all have to be done by the time we roll out, but I think we’re gonna be talking about these as things that end users will be able to interact with, so that, like, they can, you know, if they’re going in and working on something and they see something that’s not correct, like, you know, we’ll have a way for them to use a skill in Claude, like regular Claude, to, like, workshop it and figure out, like, what needs to change, that kind of stuff. So, like.

171 00:25:28.830 00:25:29.940 Katherine Bayless: We’re…

172 00:25:30.610 00:25:45.519 Katherine Bayless: Taking the posture of, like, the actual people that know the data best will be the ones to help get it into the shape that would get it the, you know, gold star designation, but we will be the team that builds the tools so that they can do that, if that makes sense.

173 00:25:46.940 00:25:54.109 Awaish Kumar: Okay, so one way of getting the feedback is just… we can encourage them to use the…

174 00:25:54.410 00:25:56.870 Awaish Kumar: comment, on the answer.

175 00:25:57.350 00:26:09.929 Awaish Kumar: And basically, we are already now capturing it in our mart, so we can see their prompts, response… responses from the AI, and also the feedback.

176 00:26:10.470 00:26:13.180 Katherine Bayless: Yeah. I mean, that’s a great first one, yeah.

177 00:26:14.640 00:26:21.950 Kyle Wandel: And how… and sorry, Awaish, how would you… you said something about how do you get that feedback? Is it, like, do we do Slack messages or something else?

178 00:26:21.950 00:26:29.659 Awaish Kumar: I’m not… On the interface, itself, you have this, button where you say thumbs up or down.

179 00:26:29.660 00:26:31.049 Kyle Wandel: Oh, yeah, yeah, yeah.

180 00:26:31.050 00:26:34.550 Awaish Kumar: It opens up a text box, you can write a message there.

181 00:26:34.550 00:26:35.230 Kyle Wandel: Okay.

182 00:26:35.970 00:26:36.650 Katherine Bayless: Yeah.

183 00:26:38.890 00:26:58.649 Katherine Bayless: Yeah, I mean, I think if you think about it at, like, the highest level, what we’re really trying to do is move up one layer so that, really, we’re empowering with, you know, awesome tools and support and all the things, anybody in the building to do what our team used to do, now we’re gonna just build the tools so that any end user can do it. I mean…

184 00:26:58.940 00:26:59.910 Katherine Bayless: It’s kind of cool.

185 00:26:59.990 00:27:03.420 Katherine Bayless: Kind of scary, kind of wild, but…

186 00:27:03.440 00:27:25.750 Katherine Bayless: I really… I do think this is kind of the funny brilliance of CTA, is that, like, because so many things have been so manual, the people really do know their data and their process really well. Like, I actually think having us become the tool provider instead of the, like, reasoning team is gonna be beneficial on both sides, like…

187 00:27:25.750 00:27:37.969 Katherine Bayless: I think they will know their data better than we do, and we can build them tools that understand data structures. I think we will get to, you know, really strong data faster this way than if our team was the central source of all of it.

188 00:27:40.150 00:27:40.700 Amber Lin: Cool.

189 00:27:40.700 00:27:49.509 Awaish Kumar: We will need a process after that, like, once we have feedback, like, how are we going to work on that, or…

190 00:27:50.130 00:28:05.749 Awaish Kumar: We have the feedback, it automatically just creates the tickets, then we can work on that, or, like, what that process will look like, or we use agents to look at the feedback and the AI answer, and review it and try to fix it.

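[Editor's note: a sketch of the triage step Awaish raises, routing each piece of feedback either to an auto-created ticket or to an agent for review. The severity heuristic and the two destinations are placeholder assumptions for illustration only.]

```python
# Route feedback to a human ticket or an agent, per the process discussed.
# Trigger words and destination labels are made-up placeholders.

def route_feedback(item):
    """Decide where a feedback item goes in a hypothetical triage process."""
    text = item.get("comment", "").lower()
    if any(word in text for word in ("wrong", "incorrect", "should be")):
        return "ticket"        # likely data/definition error: human looks first
    return "agent_review"      # smaller issues an agent could try to fix

route_feedback({"comment": "This number is wrong"})  # → "ticket"
```
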
191 00:28:06.730 00:28:22.420 Katherine Bayless: Yeah, I mean, I think, up to you guys and how you want to handle it, right? I do think initially it’ll probably be a human looking at it, but yeah, I’d love to drive towards, eventually, like, an agent that can, like, identify, like, oh, these are tiny problems, like, and go fix them. But yeah, I…

192 00:28:22.970 00:28:47.880 Kyle Wandel: So this brings us to a good point. I am building an initial plan, at least, a little scope document that basically re-engineers or reorganizes our GitHub. So the idea was when we kind of re-update and kind of redo the whole AWS accounts, this also gives us an opportunity to kind of move or kind of restructure our GitHub to be better how we want to. So I’m taking in a lot of different approaches, and I’m using Agentic AI

193 00:28:47.880 00:28:56.640 Kyle Wandel: as agents, basically, and scaling from that standpoint, using best practices for GitHub, kind of doing everything I possibly can, and give it as much information as I can.

194 00:28:56.660 00:29:14.720 Kyle Wandel: that way you can create all the READMEs, create all the structure it needs to be, so this… this feedback loop is really, really cool, because we’ll basically just tell it, how to set that up. In terms of just anything, little comments you have on that, like the release notes is a really cool idea, like, how to incorporate that into the GitHub repo as well.

195 00:29:14.720 00:29:19.830 Kyle Wandel: Anything we can make, like Katherine said, as automated as possible using Agentic

196 00:29:20.850 00:29:31.039 Kyle Wandel: possibilities, capabilities is what we’re trying to go for, and, so I’ll probably shoot that plan over to you guys, just to have your initial thoughts on it, because again, I mean, you could…

197 00:29:31.120 00:29:48.389 Kyle Wandel: throw everything at it, but I literally just look through, like, mediums, look at how different people are using AI to do different things. So stuff like that. If you have any ideas, like, please, like, throw it at it, and we can kind of fine-tune it from here. But similar with the RBAC plan, we can just fine-tune it, and to be much more geared towards CTA.

198 00:29:50.400 00:29:56.999 Katherine Bayless: Yeah, and I think… I think, you know, one… one kind of call-out in that sort of vision, too, is, like, automated, but…

199 00:29:57.690 00:30:07.790 Katherine Bayless: automated around keeping the human in the loop, like… because I know, I mean, obviously, like, because I think what we’ll run into, too, is, like.

200 00:30:08.570 00:30:27.109 Katherine Bayless: people like us, developer types, have… at this point, we’ve done so much work and put so much tooling in place so that we can sit down and open a Claude Code session, and the other day, I just dropped in an email from somebody who needed some numbers for media stuff for different countries, right? And I literally just dropped in the bullets from her email.

201 00:30:27.110 00:30:30.659 Katherine Bayless: and let it go do its thing, and it threw together a Streamlit app that was…

202 00:30:30.660 00:30:38.209 Katherine Bayless: shockingly pretty good, right out of the gate. I was not actually expecting it to be, Kai was kind enough to kind of QA it, and like, I think…

203 00:30:38.390 00:30:49.450 Katherine Bayless: it’s invisible to… I mean, this is what our work has always been like, right? Like, it’s invisible to that person in that moment when I show Kat how it works, that, like, there’s a lot of plumbing and engineering and understanding behind this.

204 00:30:50.290 00:31:04.479 Katherine Bayless: We can also put plumbing and engineering behind other things that they interact with, but, like, the goal is really to get them to engage and provide feedback, not for us to, like, automate the analytics or automate the, like, you know.

205 00:31:04.680 00:31:20.509 Katherine Bayless: data definitions and stuff like that, but to automate the extraction of it from their brains. Because I do think we want the human in the loop to yield the documentation and the context and the scaffolding that will become part of the, you know.

206 00:31:21.000 00:31:25.429 Katherine Bayless: the repo, the, I mean, knowledge base, right, is really what it’s gonna kind of turn into, so…

207 00:31:25.870 00:31:34.590 Katherine Bayless: Less, like, AI, go generate a bunch of definitions for these fields, and more, AI, go ask this human what the definitions are for those fields, if that makes sense.

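[Editor's note: a minimal sketch of the "ask the human" posture Katherine describes just above — instead of having the AI generate definitions, generate questions for the data owner about every field that lacks one. Field names and the owner handle are made up for illustration.]

```python
# Turn undefined fields into questions for the data owner, rather than
# auto-generating definitions. All names here are hypothetical examples.

def questions_for_owner(fields, owner):
    """Given {field_name: definition_or_empty}, produce one question per
    field that is missing a definition, addressed to the owner."""
    return [
        f"@{owner}: what does `{name}` mean, and what are its valid values?"
        for name, definition in fields.items()
        if not definition
    ]

qs = questions_for_owner(
    {"rfid_zone": "", "badge_id": "Unique badge scan ID"},
    "data_owner",
)
```
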
208 00:31:36.450 00:31:37.090 Awaish Kumar: Yep.

209 00:31:41.260 00:31:57.659 Uttam Kumaran: I guess one question I had, I think, Amber, we were discussing this yesterday, is, like, sort of, like, what done looks like, or, like, what are the phases? One thing I was talking to Amber is, like, look, this is sort of a never-ending thing, so what are some characteristics of, like.

210 00:31:57.840 00:32:03.550 Uttam Kumaran: we’re comfortable at every stage. And one thing I mentioned here is, like, a couple things that I would be looking for is

211 00:32:03.610 00:32:08.060 Uttam Kumaran: Are we able to see all of the usage in one place?

212 00:32:08.070 00:32:22.990 Uttam Kumaran: Do we have, like, a triage process by which, like, issues come, and maybe, like, they don’t just come through Catherine? Like, basically, we… they’re coming from whatever angle. We’re able to put them into something and then iterate on them. And then third, I think we…

213 00:32:22.990 00:32:38.710 Uttam Kumaran: you know, I don’t know if y’all covered this before, but we wanted to, you know, see if we can execute on, like, you know, that Slack channel where people can just give feedback on things that they’re seeing, because now a lot of us here can just jump in and support, especially during this first rollout. I think

214 00:32:38.710 00:32:53.879 Uttam Kumaran: we may not have, like, a… I don’t know if we’re gonna ever have a full picture of, like, it can answer every question perfectly, but I do think that I… what I… what I… what we can control is, like, speed to triage and, like, open communication lane, and so that, I feel like, is a great…

215 00:32:54.000 00:32:59.259 Uttam Kumaran: milestone that I’d like to… that I’d like to propose is, like, what is this first, like.

216 00:32:59.540 00:33:11.860 Uttam Kumaran: push, like, what does done look like in terms of this first step, you know? Before we consider, like, improving speed, like, adding more data, everything. It’s like, all that’s gonna cascade on… on this.

217 00:33:12.050 00:33:13.240 Uttam Kumaran: You know.

218 00:33:14.890 00:33:19.849 Katherine Bayless: Yeah, so I think… Yeah. What’s interesting is…

219 00:33:21.040 00:33:44.870 Katherine Bayless: I think our definition of done is gonna apply more so to the tools that we build than the data we are serving. So, like, we will be saying that we are confident this tool will help you get your data to whatever your definition of done is. Now, I think there’s a lot of initial baseline done stuff that we can define, right, like the audit report and some of these things where we do know that there are

220 00:33:44.870 00:34:04.370 Katherine Bayless: like, established use cases and, you know, relatively kind of canonical answers. But yeah, I mean, to your point, right, like, there are so many things we don’t know that people might be wanting to do or are doing with our data. We also know that because we’ve not really been able to dig in on these analytics historically, there’s some…

221 00:34:04.410 00:34:12.859 Katherine Bayless: Like, there’s a… we… there are probably a lot more sophisticated questions that will start to come through, hopefully, that, like, we can’t imagine yet.

222 00:34:12.860 00:34:13.480 Uttam Kumaran: Yes.

223 00:34:13.480 00:34:19.319 Katherine Bayless: I think it’s gonna be, like, looking across data sources in ways that haven’t been able to be done before.

224 00:34:19.320 00:34:21.130 Uttam Kumaran: Totally, it’s exactly right, yeah.

225 00:34:21.139 00:34:31.519 Katherine Bayless: Yeah, and so, like, what I want is for us to figure out, and I mean, it doesn’t have to happen by Wednesday, but, like, this is the direction we’re kind of driving in, is what Kyle sort of described, right?

226 00:34:31.699 00:34:45.739 Katherine Bayless: When a new data source is coming in, we’re providing the tools for that team to help define it, govern it, establish it as, you know, vetted and safe for, you know, downstream consumption, rather than being the team that does those things.

227 00:34:52.570 00:34:54.400 Uttam Kumaran: Amber, any thoughts on that?

228 00:34:55.260 00:34:58.610 Amber Lin: I’m… let’s see…

229 00:34:59.410 00:35:08.769 Amber Lin: I’m more thinking of when the data is there. I’m thinking, like, one step more downstream of how that is going to be used.

230 00:35:08.950 00:35:11.760 Amber Lin: in Snowflake,

231 00:35:11.870 00:35:24.559 Amber Lin: And right now, it’s… what I can think of is more related to, okay, how… how do they look at the data? Do we… do they have a tool or a skill to explore what data they have?

232 00:35:24.880 00:35:43.139 Amber Lin: Do they know how this data connects to other sources? And then, if they want to create tools from that data, one, do they know what the tools are? So, Streamlit apps, semantic views, adding it to agents, or even adding it to

233 00:35:43.320 00:35:50.110 Amber Lin: tools outside of Snowflake, do they know how to do that? And then lastly, how would they iterate

234 00:35:50.170 00:36:06.560 Amber Lin: on their tools. So, I don’t think we have created any roadmap around that. I think our focus so far has been us doing the work, so I probably want to have a little bit of a

235 00:36:06.740 00:36:17.960 Amber Lin: Concrete plan about what tools we initially want to provide, or we want to make sure we give them, and then we can scope those out and then work on those.

236 00:36:18.770 00:36:21.909 Katherine Bayless: Yeah, totally. And like I said, I think, like.

237 00:36:22.140 00:36:38.590 Katherine Bayless: we’re very much delivering not necessarily, like, a final product or a polished anything. We are delivering, like, service, support, encouragement, and, you know, a lot of handholding. And then the tools that’ll make it possible for people to do these things.

238 00:36:38.610 00:37:01.800 Katherine Bayless: I also had one other thing I was gonna say. So, like, a perfect example of a thing I think we should work towards releasing into the wild, would be the, the skill that you had showed us, where somebody could take, like, a question that the agent got wrong, and then it would sort of do the, like, well, why’d you get it wrong? Go figure this out. And then, okay, iterate over it, and then figure out what you needed to have known in order to get it right.

239 00:37:01.800 00:37:26.770 Katherine Bayless: like, if we can package that in a way that anybody could interact with when we’re trying to kind of, like, bring their data into Snowflake, then that gives them power to go through and say, okay, this is a question I know the answer to, and that I would, you know, reasonably expect to ask the agent, and I’ll iterate until we figure out what the, you know, what details need to be behind the scenes so that the agent gets it right. Like, that’s a great first tool to be able to

240 00:37:26.770 00:37:27.989 Katherine Bayless: give people, I think.

241 00:37:29.980 00:37:30.960 Katherine Bayless: Truly good.

242 00:37:30.960 00:37:44.270 Kyle Wandel: We can probably add that to, like, the QA or, like, the builder type of, like… So, the role context and the RBAC plan by roles does give a breakdown of, like, what tools each person should have access to.

243 00:37:44.410 00:37:57.990 Kyle Wandel: So we should have at least an initial draft slash plan for that, basically. I think it’s not a bad idea to force, we were talking about, Katherine, like, more questions for certain roles and, like, certain people.

244 00:37:57.990 00:38:15.439 Kyle Wandel: That way they really get down to, like, the nitty-gritty, and they understand the data. I think that does make sense. So… but we do have at least an initial draft of, like, what tools people should have available to them based on the role that they have, and what access it is, and all that stuff. But it could probably be fine-tuned, like, I don’t…

245 00:38:15.740 00:38:22.209 Kyle Wandel: that’s, again, part of, like, the whole GitHub, repo reorganization that I’m working on, too, so…

246 00:38:23.530 00:38:24.250 Katherine Bayless: Yeah.

247 00:38:24.700 00:38:49.450 Katherine Bayless: Yeah, I mean, it’s like, it’s good to, yeah, like, not every single person in the building is going to be doing all of this day one. Like, yeah, reality is we will… I told Christina my goal is really to kind of get one person from every team, or, you know, one person from every data source, you know, like, that kind of thing, right? Like, we want to get, you know, broad coverage across the organization, but realistically, we’re looking at probably one to two kind of, you know, power user types.

248 00:38:49.610 00:39:00.550 Katherine Bayless: on a team who would be the people using these things. I mean, you know, although, on the other hand, Kaylee’s over there trying to make Streamlit apps, despite not having permission, so, like, I think the appetite is here, like.

249 00:39:01.010 00:39:04.360 Katherine Bayless: The best thing for us to do is figure out how to channel the energy.

250 00:39:05.510 00:39:12.439 Kyle Wandel: I was going to ask you that question, so for develop… for, like, role purposes, I think we talked about it initially, but is there any…

251 00:39:13.660 00:39:29.170 Kyle Wandel: like, there… we probably should not really put… I don’t… I know you hate this word, but we probably need to put some type of guardrails in terms of just, like, who can be what. And if we don’t want that, that’s fine, but then we need to be more structured to who has access to, like, the GitHub, who has access to certain things, because…

252 00:39:29.380 00:39:37.469 Kyle Wandel: I don’t know about you, but I would prefer not to have every single person in the organization be at developer level and have access to the GitHub when they don’t need to be.

253 00:39:37.830 00:39:54.330 Kyle Wandel: I understand that it’s where we want to go, because we want to give people more power, but at the same time, I think we do need to limit it in some way. And so, I don’t know if… do you have any initial thoughts on that? Like, maybe the max role that people get to outside of our team, unless we select it, is developer?

254 00:39:54.680 00:40:01.619 Kyle Wandel: Or not developer, builder, and then we select developers if we want to. And those are, like, the super users we talked about.

255 00:40:01.870 00:40:03.929 Kyle Wandel: So, I don’t know, what are your thoughts?

256 00:40:04.890 00:40:08.459 Katherine Bayless: I mean, I think… I mean, I really…

257 00:40:09.470 00:40:25.480 Katherine Bayless: I wish I had better ability to, like, you know, my crystal ball’s in the shop, right? Like, predicting the future at this point is getting harder by the minute, but I mean, it sounds like the mandate is, if somebody wants a tool, give them the tool, it’s gonna be kind of the paradigm going forward. Now, to your point.

258 00:40:25.480 00:40:35.359 Katherine Bayless: there are totally things that we don’t necessarily, like, need to add chaos to that will not yield return on, you know, capability. So I think that’s…

259 00:40:35.430 00:40:58.300 Katherine Bayless: maybe a way for us to think about it, too, is, like, what are the things that we really feel like, no, no, that actually needs to stay really tightly gated and guarded? Which could be anything from, you know, the GitHub repo, to the semantic views that we do finalize and say, like, these are indeed, you know, all the way done, we don’t expect changes unless somebody reports an issue with them, like, we don’t want them to be modifying those, like.

260 00:40:58.300 00:41:03.050 Katherine Bayless: what does it look like for us to take something and… I mean, it’s almost like…

261 00:41:03.340 00:41:20.330 Katherine Bayless: we will have many more people engaging with us in a staging sort of layer, and then there will be a smaller set of things that are in kind of a production, you know, like, signed-off, stamped, like, we’re not gonna touch this unless there’s a reason to go into it. Like, I think…

262 00:41:20.330 00:41:30.239 Katherine Bayless: I think that’s probably a way to think about it, is, like, more people are going to be able to come closer to writing code that would go into production, but they’re not going to necessarily be

263 00:41:30.510 00:41:34.539 Katherine Bayless: Altering the things that we have sort of finalized and moved on from.

264 00:41:36.680 00:41:42.800 Katherine Bayless: But, it’s also possible that we will start to see a lot more folks with a lot more tooling. So, like…

265 00:41:42.800 00:41:52.169 Kyle Wandel: Yeah, I mean, that’s like, it goes… I mean, I know that’s where we are, but I know what we talked about with the IT app building and stuff like that, like, the way my head is at is any official public

266 00:41:53.650 00:41:57.609 Kyle Wandel: App or data source that is used for anything other than just

267 00:41:57.850 00:42:03.110 Kyle Wandel: Personal use should be verified at probably developer level.

268 00:42:03.500 00:42:06.480 Kyle Wandel: And then anything else below that’s fine for Builder. So, like, if…

269 00:42:06.720 00:42:15.029 Kyle Wandel: Chris wants to do something with… not Chris, sorry, sorry, Christy wants to do something in MRD, like do industry research boards, or do, like, market…

270 00:42:15.030 00:42:29.269 Kyle Wandel: market research stuff, or market intelligence stuff, like, that’s perfectly fine for him to do in-house, for him to analyze and stuff like that, but the moment he wants to make that accessible to, like, Kinsey or somebody else, then we should step in and be like, hey, let’s make sure this is correct, let’s make sure this is public, and then we could post it.

271 00:42:29.270 00:42:33.990 Kyle Wandel: to the public sources. Is that something that your head’s at? That’s where my head’s at.

272 00:42:34.530 00:42:35.420 Katherine Bayless: I think…

273 00:42:35.420 00:42:55.589 Katherine Bayless: So, yeah, I guess, yeah, one good note, yeah, anything that’s going to go outside the walls of the organization has to go through Jay, frankly, right? From a security perspective, if you’re building an application that is going to have external people interacting with it, then, yeah, I mean, we need to have the full, sort of, like, security cohort involved.

274 00:42:55.630 00:43:07.529 Katherine Bayless: if you’re building data that could get delivered externally, well, that, you know, that’s more so our team in terms of just, like, you know, can you share it, should you share it, are you sure it’s in good shape to share it?

275 00:43:07.650 00:43:09.070 Katherine Bayless: I think it is…

276 00:43:09.240 00:43:28.209 Katherine Bayless: something we will start to run into, like, you know, for a small example, Anake asked me the other day, PwC was interested in knowing the, like, percentage of attendees for session content that were senior level, right? And she’s like, do we have that data, and can I give it to them? And I’m like, have it? Yes, give it to them.

277 00:43:28.700 00:43:52.300 Katherine Bayless: I mean, there’s no reason you can’t give them a, you know, 25% of attendees were senior level, other than that you are potentially hurting our ability to sell that data, right? And so that’s an interesting conversation that’ll have to happen, you know, somewhere at the levels of, like, leadership and legal and us, and, like, which things are, you know, we are now able to provide, and they don’t violate privacy to share, but they would undermine future business

278 00:43:52.310 00:44:01.900 Katherine Bayless: models, perhaps, around them, and so that’s… that’s a conversation that’ll have to get figured out. But yeah, anything that’s internal use, though, I think…

279 00:44:02.170 00:44:13.219 Katherine Bayless: where I want us to be the, like, accountable party is not necessarily, like, is the data correct? That should be the responsibility of the data owner.

280 00:44:13.220 00:44:38.180 Katherine Bayless: Where we are accountable is: is the data secured? Is it governed? Is it, you know, well-structured? Is it, you know, going to be performant at scale? Like, we’re… we’re going to be looking at all of the quality of the infrastructure and architecture that’s behind their analysis. Like, they need to be the accountable owner of the data quality and interpretation. We need to be the people that make sure they’re sitting on

281 00:44:38.180 00:44:40.870 Katherine Bayless: On top of a really good foundation when they do that.

282 00:44:41.050 00:44:58.750 Kyle Wandel: Okay, so that makes sense, and I guess to use this example, going back to the market research, I know, I think, that we’ll actually probably do this ourselves now, but, like, the entity list, whatever. So, like, they would build that… that pipeline, that… that visual, to see whether or not there’s a connection, or a match, or whatever, and then when that’s ready to go, because, again, like, at the very…

283 00:44:58.980 00:45:06.010 Kyle Wandel: CTA-oriented document, like, everybody likes to know about that, and so I would suggest that that only be…

284 00:45:06.290 00:45:23.759 Kyle Wandel: available to MRD until they have a full build-out of it, and it can be then checked by us to make sure that they are using, like you said, the correct identity stitching tables, the correct tables between the consolidated report, everything that is correct, I think that makes sense. But yeah, you are right.

285 00:45:23.910 00:45:33.029 Kyle Wandel: And I agree with you that they are the data owners. We can’t tell them that their data is correct or not, but we can just tell them, did you use the correct identity stitching table to match the stuff, basically?

286 00:45:33.470 00:45:38.190 Katherine Bayless: Yeah, exactly, exactly. And so, if they were to, you know.

287 00:45:38.290 00:45:58.720 Katherine Bayless: interact with GitHub at all, and, you know, kind of following this example, it’d probably be something like scoping their role to be able to put things in so far as, like, a draft PR. And then at that point, somebody from our team is responsible for reviewing, probably, you know, fixing, tweaking, whatever, and then, like, only our team would be able to actually merge it into main, right? Like, that’s…

288 00:45:58.720 00:46:03.669 Katherine Bayless: probably kind of what we’re looking at here. So we might see a lot more people in GitHub, but

289 00:46:03.910 00:46:21.710 Katherine Bayless: tighter controls around what gets merged into main. I do think, you know, you’re, just from a, like, brass tacks technical standpoint, I do think we need to figure out how to make this match the RBAC in Snowflake so that that QA layer is indeed functioning the way Kyle’s talking about, right? Like.

290 00:46:21.710 00:46:35.719 Katherine Bayless: if you’re the team that’s associated with that data, you can see it in that QA stage, you can use these tools we’re gonna build against it in that QA stage to get it to the point where you’re then going to ask us to approve, you know, moving it to prod, merging it to main kind of thing.

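[Editor's note: a sketch of the access model Katherine outlines here — contributors outside the data team can open draft PRs and work against the QA stage, but only the data team can merge to main or promote to prod. The role names ("builder", "developer", "data_team") echo the roles mentioned in the discussion, and the action names are illustrative assumptions.]

```python
# Role → allowed actions, per the draft-PR / merge-gating model discussed.
# Role and action names are placeholders, not an actual GitHub config.

PERMISSIONS = {
    "builder":   {"open_draft_pr", "read_qa"},
    "developer": {"open_draft_pr", "read_qa", "review_pr"},
    "data_team": {"open_draft_pr", "read_qa", "review_pr",
                  "merge_main", "promote_prod"},
}

def can(role, action):
    """True if the role is allowed to perform the action; unknown roles get nothing."""
    return action in PERMISSIONS.get(role, set())
```

In GitHub itself this would likely be enforced with protected-branch rules and required reviews rather than application code; the sketch just makes the intended matrix explicit.
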
291 00:46:35.720 00:46:42.929 Katherine Bayless: Because I think right now, Amber, to your point, actually, from the checklist, like, I think that role chat sees, like, dev.

292 00:46:42.930 00:46:59.180 Katherine Bayless: we probably want to make it so that they see QA. I think initially, we still don’t have any use cases where there’s, like, data in there that we, like, really don’t want people to see, although I think that’ll change soon, too. So some thinking around how we do the QA

293 00:46:59.500 00:47:09.140 Katherine Bayless: visibility that might need to be scoped at the, like, data source level, so people from market research can see this data in QA, but not if we start working with the concur invoice data.

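[Editor's note: a sketch of the data-source-scoped QA visibility Katherine describes, where each team's role sees only its own source's schema in the QA layer. The database, schema, and role names are placeholders; the statements use standard Snowflake GRANT syntax, not CTA's actual setup.]

```python
# Generate per-data-source QA read grants. Names are illustrative only.

def qa_grants(data_source, team_role, database="QA"):
    """Build Snowflake-style GRANT statements giving one team's role
    read access to one data source's schema in the QA database."""
    schema = f"{database}.{data_source.upper()}"
    return [
        f"GRANT USAGE ON DATABASE {database} TO ROLE {team_role};",
        f"GRANT USAGE ON SCHEMA {schema} TO ROLE {team_role};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {schema} TO ROLE {team_role};",
    ]

stmts = qa_grants("market_research", "MRD_QA_READER")
```

With this shape, market research would see `QA.MARKET_RESEARCH` but get no grants on, say, a future invoice schema, matching the "can see this data in QA, but not that" scoping in the conversation.
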
294 00:47:16.150 00:47:22.259 Amber Lin: I know we have, like 10, 15 minutes left.

295 00:47:22.630 00:47:31.500 Amber Lin: Uttam, do you want to talk a little bit about what we want to do when we’re…

296 00:47:32.210 00:47:32.960 Uttam Kumaran: Yes.

297 00:47:32.960 00:47:36.920 Amber Lin: I know Uttam wants some… we all want some face time with folks.

298 00:47:36.920 00:47:44.939 Uttam Kumaran: Yeah, so I had a great conversation with Jay, Katherine, so I just… maybe we can pull up calendars, and I can just start to block

299 00:47:45.190 00:47:52.299 Uttam Kumaran: things. I think both Amber and I fly out around 5 on Wednesday.

300 00:47:52.610 00:47:57.910 Uttam Kumaran: I think it could be nice to do a dinner on Tuesday, if people are in.

301 00:47:58.250 00:48:02.000 Uttam Kumaran: And then… Yeah, I’m kind of interested in, like.

302 00:48:02.360 00:48:05.479 Uttam Kumaran: how we should, I know there’s a…

303 00:48:05.740 00:48:14.100 Uttam Kumaran: CTA, like, kind of that executive meeting, and then if we wanted to block out time for working sessions and lunch, we could do that. So…

304 00:48:14.340 00:48:17.819 Uttam Kumaran: I don’t know, just any thoughts from anyone in the group?

305 00:48:20.320 00:48:31.979 Katherine Bayless: I’ll just throw out, Christina did like the idea of doing the Wednesday 11 to 2 as the kind of Snowflake open house, office hours, you know, I’ll bring some snacks.

306 00:48:31.980 00:48:32.450 Uttam Kumaran: Okay.

307 00:48:32.450 00:48:46.390 Katherine Bayless: folks can kind of, like, drop in and either, you know, ask what it is, or ask a specific question, or, you know, whatever. So, like, the 11 to 2 on Wednesday as a just anybody-come-one-come-all type, spot, I think is good to hold.

308 00:48:46.390 00:48:47.050 Uttam Kumaran: Okay.

309 00:48:47.050 00:48:52.159 Katherine Bayless: then the leadership meeting on the Tuesday with the CES folks,

310 00:48:52.380 00:49:09.330 Katherine Bayless: that would be 12 to 1 on Tuesday, which would be more so of, like, kind of fly on the wall, like, I’m doing a Snowflake demo to them, but I think, honestly, this whole disposable software thing is gonna wind up dominating that conversation. Snowflake’s like, a teeny tiny piece, this is my hypothesis.

311 00:49:09.330 00:49:14.740 Katherine Bayless: But I love the idea of dinner, and yeah, what else do other folks think?

312 00:49:15.360 00:49:31.789 Amber Lin: I would like a list of people to meet one-on-one with, because I have the time, and I think I can walk them through the more specifics of, an intro to if you want to set up a semantic view, what does it look like, how I look at the data and put them together, so if you have

313 00:49:31.790 00:49:38.190 Amber Lin: folks in mind, I think I can start scheduling those, or make sure that they have time.

314 00:49:38.320 00:49:40.029 Amber Lin: In those two days.

315 00:49:40.550 00:49:42.500 Katherine Bayless: Okay, okay,

316 00:49:42.910 00:49:54.710 Katherine Bayless: I might, might tag in Kai to help kind of think through who some of the folks might be as well, but definitely, Anna Rutter, I think, would be an interesting person to talk to, yeah. Go ahead.

317 00:49:55.590 00:50:03.630 Chi Quinn: Oh, I was on mute. No, I was gonna say, Anna, definitely, Anna Rush, she is definitely the person, at least on the membership side, for sure.

318 00:50:04.130 00:50:07.949 Katherine Bayless: Yeah, and then Chris Detloff, we should get him in too.

319 00:50:08.750 00:50:20.220 Uttam Kumaran: Yeah, so these would be, Katherine, like, kind of, like, Tuesday, Wednesday, before, like, our broader office hours. So I basically said to Amber, I was like, we should get in front of some of the folks that are, like, the power users, one-on-one.

320 00:50:20.400 00:50:29.060 Uttam Kumaran: And then also, like, for us to just build a relationship so that if they have questions, there’s more people to sort of help… help them, and so I think that would be a great use of time.

321 00:50:31.030 00:50:33.249 Katherine Bayless: Yeah, yeah, yeah.

322 00:50:33.620 00:50:39.490 Katherine Bayless: Okay, yeah, we can get, I guess, yeah, we can offline get you a list, but okay. But definitely, we can give you some names.

323 00:50:40.620 00:50:43.359 Uttam Kumaran: And then, did we want to try to block some time for, like.

324 00:50:43.520 00:50:53.999 Uttam Kumaran: sort of, like, rest of your planning, or do, like, a group meeting on that? I don’t know, is that off the… is that subject too sensitive?

325 00:50:54.000 00:50:59.240 Katherine Bayless: Right. No, no, I mean, I think we probably should. I mean, I think, you know.

326 00:50:59.350 00:51:03.600 Katherine Bayless: I don’t know if anybody else in this group feels this way, but I feel like I’m, like…

327 00:51:03.670 00:51:21.370 Katherine Bayless: like, stumbling across finish line after finish line after finish line, like, everything’s just, like, changing so constantly around me that, you know, like, just kind of clawing my way forward, and I… I don’t know how much we can predict will occur in the next few months, but I do think we are seeing, I feel like.

328 00:51:21.370 00:51:28.980 Katherine Bayless: our work is changing quickly from, you know, what used to be data and analytics to what is now AI enablement?

329 00:51:28.980 00:51:29.470 Uttam Kumaran: Yeah, engineer.

330 00:51:29.470 00:51:48.310 Katherine Bayless: Right? And that might merit us taking, you know, a day to really, like, you know, step back and think about, alright, maybe we can’t know what features will be out in 3 weeks, but we can figure out what things we want to be able to deliver to our colleagues in service of this goal of scaling laterally.

331 00:51:48.310 00:51:51.090 Katherine Bayless: Like, I think, yeah, I think that’s a good idea.

332 00:51:52.610 00:52:06.049 Uttam Kumaran: Yeah, I think, and I’d be interested, Kyle, in, like, your thoughts. I mean, I think, yeah, we should… let’s just… instead of now saying, like, okay, what would we have done in the old world? Like, let’s assume things are gonna change. I think my conversation with Amber was helpful, because I said, look.

333 00:52:06.250 00:52:18.649 Uttam Kumaran: we may not know what the finish line looks like, but we do need observability, we do need a triage process, we do need lines of communication, and so maybe it’s more of that, and I think, like, yeah, if we can spend, like.

334 00:52:19.690 00:52:30.879 Uttam Kumaran: or so, maybe that’s, like, Tuesday before dinner or something, and just talk about, okay, like, what is some stuff previous… from previously, like, we can’t really control?

335 00:52:31.060 00:52:42.039 Uttam Kumaran: And then now, like, what are some things that, like, regardless of what happens, we need these things in place? Whether it’s, like, regular trainings, or, yeah, observability, or…

336 00:52:42.170 00:52:54.219 Uttam Kumaran: maybe what we do is we come on Monday, and we sort of share, like, here are all the questions that came in through the Slack Assistant, like, that dictates the roadmap. Kind of like that level of, like, how does this team now operate?

337 00:52:54.480 00:53:05.100 Uttam Kumaran: Because I think we’re all probably feeling the same thing, in that, like, we could set a goal, but then, like, we see something, and it’s not like shiny object syndrome, it just, like, actually is better.

338 00:53:05.310 00:53:05.930 Katherine Bayless: Yeah.

339 00:53:05.930 00:53:07.469 Uttam Kumaran: But, like, how do we still, like.

340 00:53:07.920 00:53:13.089 Uttam Kumaran: How do we still feel like, okay, we’re not just waking up and, like, whatever is on our plate, we, like, take on, you know?

341 00:53:13.800 00:53:22.729 Katherine Bayless: Yeah, exactly. How do we stay agile and not lose, like, discipline, but also, you know, keep moving at the speed that we’re gonna have to? Yeah.

342 00:53:24.240 00:53:42.939 Kyle Wandel: No, I think that’s a good point. You tell my… so part of the GitHub thing, I literally said features to add back to the plan is, like, how do we say continued research? So I was like, do we… how do we do continued, like, weekly research, or force me, myself, basically, or Catherine, or somebody else, on the team to do weekly research in terms of just, like, new best practices for AI, and like…

343 00:53:42.940 00:53:43.430 Uttam Kumaran: Yes.

344 00:53:43.430 00:53:46.449 Kyle Wandel: and stuff like that. So, it’s, like, one of those things where I’m like.

345 00:53:46.480 00:53:59.719 Kyle Wandel: I’m trying to think about this. I think Catherine’s point is the best way to talk about it, is, like, how do we build something that’s scalable? And, like, not necessarily that is tool-oriented or technical-oriented, but how is it process-oriented? That way, that when a new…

346 00:53:59.720 00:54:06.960 Kyle Wandel: thing comes out, we can just scale it, like, insanely quickly. And so that’s how I’m trying to build a GitHub out, and so…

347 00:54:07.210 00:54:08.870 Kyle Wandel: Yeah, that’s where my head’s at.

348 00:54:10.930 00:54:33.150 Katherine Bayless: I mean, I think it’s the, the interesting, like, flip, or not flip, but, like, the other end of that spectrum, too, is, like, I don’t know what it’s gonna look like in, like, a year when we’ve, like, baked all these things into these repos, like, so the building blocks one where I put one of those, like, spec-driven, you know, things in it, and I’m like, it’s cool now, but, like, a year from now, when nobody’s doing it that way anymore, how hard is it gonna be to tear this thing out? I don’t know.

349 00:54:33.520 00:54:36.189 Katherine Bayless: I don’t know. I don’t know. It’s gonna be fun.

350 00:54:37.180 00:54:51.029 Uttam Kumaran: Yeah, so also, like, even, like, yeah, like, a week… you know, some things we do internally, we’ll do, like, different, like, brown bags, or some people will share, like, I learned how to do this new thing, like, let’s share with people. So even, like, maybe we should do that once a week?

351 00:54:51.320 00:55:10.850 Uttam Kumaran: And that way, it’s not, like, an update meeting, it’s like, okay, did someone learn something interesting and, like, want to share? Or, like, someone just, like, had an unlock with Cloud Code, or saw something cool, or, like, had a thought about, like, an architecture, and even getting us to just do that as a group could be really, really nice.

352 00:55:11.020 00:55:14.930 Uttam Kumaran: Because we’re… we do that internally all the time, it’s, like, amazing.

353 00:55:15.620 00:55:22.329 Katherine Bayless: Yeah, I would actually… I would love that. I forget… what’s the, like, Agile buzzword for…

354 00:55:22.880 00:55:24.289 Katherine Bayless: It’s not demos.

355 00:55:26.490 00:55:35.709 Uttam Kumaran: I feel like I always said brown bags, but then someone asked me, like, what does that mean? And I’m like, well, in the old days, you used to bring your lunch to the conference room, and…

356 00:55:35.710 00:55:36.610 Katherine Bayless: Yeah, yeah, yeah.

357 00:55:36.610 00:55:43.239 Uttam Kumaran: be in a bag, and so I was like, but I guess I’ve been doing virtual now for so long.

358 00:55:43.240 00:55:43.950 Katherine Bayless: Yeah.

359 00:55:44.010 00:55:48.639 Kyle Wandel: This is where my head’s at. It’s a brown bag, because it’s not a complete process, it’s just initial thought.

360 00:55:49.200 00:55:51.240 Uttam Kumaran: Yeah, yeah.

361 00:55:52.210 00:55:55.210 Katherine Bayless: Yeah, no, I’ve always enjoyed that kind of conversation.

362 00:55:55.210 00:56:02.269 Uttam Kumaran: I love it, yeah. I’ve done it my whole career, it’s been amazing. And I always have something I want to be like, I’ll just put 3 slides together and just…

363 00:56:02.980 00:56:04.529 Uttam Kumaran: See what people think.

364 00:56:05.520 00:56:08.760 Katherine Bayless: I mean, do we want to, like, maybe…

365 00:56:09.020 00:56:28.190 Katherine Bayless: I mean, could we use these meetings for that? I mean, I, you know, if they become our Friday afternoon, well, you know, Friday noon-ish, sort of just, like, yeah, demo and share and brown bag, and then, you know, if we wanted to throw a, you know, in the Slack channel at some point and say, like, hey, anybody else want to join? Come on in, kind of thing.

366 00:56:28.190 00:56:29.010 Uttam Kumaran: Yeah.

367 00:56:29.010 00:56:30.389 Katherine Bayless: Yeah, yeah, yeah.

368 00:56:30.760 00:56:34.430 Uttam Kumaran: Yeah, I think, Amber, maybe I’ll give you the thought. I think, like.

369 00:56:34.800 00:56:42.710 Uttam Kumaran: I would say, like, out of the three of us from the Brainforge side, Amber’s the most organized, and so I kind of… she was the one pushing, like, hey, I think we’re just, like.

370 00:56:43.140 00:56:51.609 Uttam Kumaran: moving towards a moving target, like, what do we… what are we doing? I’m like, yeah, it’s part of the fun, but also, this is, like, yeah, it’s definitely a risk.

371 00:56:51.700 00:57:03.540 Uttam Kumaran: So, I’m interested, Amber, if you think, like, yeah, we can do enough of our planning on Monday, maybe we spend the first half of this call on, like, for the good of the order type stuff?

372 00:57:03.930 00:57:07.369 Uttam Kumaran: And then we move to more, like.

373 00:57:07.770 00:57:10.520 Uttam Kumaran: someone has a demo prepared, or just Lightning, like.

374 00:57:10.670 00:57:12.510 Uttam Kumaran: I have a thing I want to share.

375 00:57:12.670 00:57:14.909 Uttam Kumaran: That could be a really fun way to close out the week.

376 00:57:15.060 00:57:22.500 Amber Lin: Yeah, I agree. I mean, Kyle today has something on the repo that he shared that was really cool. We can… we can do that.

377 00:57:22.510 00:57:36.910 Amber Lin: My… my fear was mostly because the deadline is approaching, I was like, I don’t feel like I’m ready to show all these people and tell them I’m done, but I am glad that we can align. So, mostly wanted alignment. I think we got that.

378 00:57:38.000 00:57:56.060 Katherine Bayless: Yeah, and I should say, too, I am… I am, like, chaos made manifest, and I… I try to, like, you know, be chaotic good versus chaotic evil, but, like, I really, like, I welcome the structure. I do not need to be the person who talks first and most on these things, like.

379 00:57:56.060 00:58:08.539 Katherine Bayless: you guys are help, like, whatever processes you want to put in place, like, and shove me into them, like, I can behave. I just am not going to be the person who designs, because it’s just not the way my brain works. But yeah, like, I would love to, you know.

380 00:58:08.890 00:58:11.980 Katherine Bayless: Fit more nicely into, orderly meetings.

381 00:58:12.590 00:58:27.129 Uttam Kumaran: Yeah, I don’t know, Kyle, maybe from, like, the… even just, like, historical CTA perspective, is there anything, like, I think about some of these as, like, okay, do we get an artifact we can report out, like, weekly? Like, how do you think about managing up, or, like, is there anything that…

382 00:58:27.250 00:58:43.869 Uttam Kumaran: I always think about, yeah, this process for us, like, what are we working on this week? But maybe even turning that into, like, a one-pager, you know, you could just… we quickly design a one-pager, send it to people, like, I’m wondering if there’s any things like that that we can use to help, you know, get visibility on the team as well.

383 00:58:44.770 00:58:58.700 Kyle Wandel: I mean, people are pretty receptive. In my opinion, to the data team in general, I think everybody’s really, really excited. If we want to do a quick, like, one-pager, I think it wouldn’t be… wouldn’t be a terrible thing. Make it fun, though. Okay. Is how Catherine would kind of express it, rather than…

384 00:58:59.130 00:59:01.549 Kyle Wandel: structured and oriented, even though… No, make it…

385 00:59:01.550 00:59:07.899 Uttam Kumaran: a fun, like, AI one-pager thing, like, something that’s, like, designed by Claude Code or whatever, you know?

386 00:59:07.900 00:59:16.050 Kyle Wandel: Yeah, basically, I think that would work as well, and yeah, Amber, you and I have the biggest challenge of keeping Catherine’s brain structured, so that’s…

387 00:59:17.390 00:59:42.380 Katherine Bayless: I’m terrible. I mean, honestly, like, it’s so funny you say that, like, moving, moving towards a moving target. It’s how I feel, like, the, like, okay, so we just greenlit the first disposable software project. Meanwhile, our AI task force proposal is still grinding through some review cycle, right? I’m like, well, what? And then, like, I’m hiring the CES platforms person, but, like, when I posted the job, it was, you’re gonna manage these 3 vendors, and now it’s gonna be, we build the

388 00:59:42.380 00:59:44.040 Katherine Bayless: software. Hope that’s okay.

389 00:59:44.360 00:59:47.309 Katherine Bayless: Right? Like, it’s just changing so fast.

390 00:59:47.310 00:59:54.749 Uttam Kumaran: Oh, and that’s even on our side. We interview now… I interview for, like, just aptitude and, like, experience with change.

391 00:59:54.750 00:59:55.320 Katherine Bayless: Like.

392 00:59:55.320 01:00:00.199 Uttam Kumaran: And I kind of want to see, like, okay, have you dealt with things like technology changes?

393 01:00:00.390 01:00:14.729 Uttam Kumaran: Versus, like, I can’t tell you what’s next. Something… change is the only thing I can guarantee, you know? It’s like they say change is the only thing that’s constant, so… yeah, I… but I also feel like, Catherine, I think, like.

394 01:00:15.150 01:00:32.639 Uttam Kumaran: your vision is actually, I think, the better option versus the conservative vision. So, you can only… you can’t… but this is what they say, you can’t have, like, the vision person be, like, sort of the integration person. It sort of puts constraints, right? So, I think we can take that on, and we’re also not, like, super heavy on process, either.

395 01:00:32.640 01:00:37.289 Uttam Kumaran: So, like, we find a nice balance, I think, always, on, like.

396 01:00:37.360 01:00:46.029 Uttam Kumaran: where to land, as you guys have seen, like, so… Yeah, let’s plan on that, and maybe, Amber, I think let’s plan on that for Monday and, like, kind of Fridays.

397 01:00:46.030 01:01:03.929 Uttam Kumaran: And then, yeah, I think I’ll go ahead and maybe Amber, me, you, we can send out invites to this group for a couple of those, like, for that Tuesday session, for the Wednesday piece, and then I’m just gonna try to find… I mean, we’ll be around, so I’m sure we’ll just be…

398 01:01:03.950 01:01:10.919 Uttam Kumaran: there, hanging out, so… and then I’ll… whoever has the best opinion on dinner, like, that would be amazing, too.

399 01:01:12.220 01:01:14.140 Katherine Bayless: That’s true. I’ll…

400 01:01:14.140 01:01:20.509 Kyle Wandel: Do you want to go in the city, or versus not in the city. Do you know where you guys are staying yet?

401 01:01:20.510 01:01:22.830 Uttam Kumaran: We’re near you guys in, like, Arlington?

402 01:01:23.310 01:01:23.960 Kyle Wandel: Okay.

403 01:01:24.830 01:01:26.299 Uttam Kumaran: I don’t mind either way.

404 01:01:26.680 01:01:34.230 Kyle Wandel: Yeah, I would suggest somewhere in Alexandria. In my opinion, that’s much better than the city. I mean, if you like… if you’re a city person, that makes sense to go into DC, but,

405 01:01:34.350 01:01:37.070 Kyle Wandel: It’s kind of… it’s kind of crappy, in my opinion, so…

406 01:01:37.310 01:01:39.550 Amber Lin: And I feel our traffic’s gonna be bad.

407 01:01:39.610 01:01:40.230 Katherine Bayless: Yeah.

408 01:01:40.230 01:01:40.790 Kyle Wandel: Oh, yeah.

409 01:01:40.790 01:01:41.500 Amber Lin: In the afternoon.

410 01:01:41.890 01:01:48.579 Kyle Wandel: It’s… nationally, it’s the worst-ranked traffic in the U.S. for commutes, so…

411 01:01:50.340 01:01:51.300 Uttam Kumaran: It’s funny.

412 01:01:51.500 01:01:54.500 Kyle Wandel: That just overtook LA this past year, I think.

413 01:01:55.660 01:02:06.349 Uttam Kumaran: My… I’m… for me, I’m just allergic to bad food, so literally anything, anywhere, I would love… I love food, so whatever, I’m so down.

414 01:02:06.470 01:02:12.910 Kyle Wandel: Yeah, I mean, if you want to come out, like, 20 minutes out west, you go to Korean Barbecue, I’d always recommend that, so…

415 01:02:13.950 01:02:15.550 Uttam Kumaran: Oh, that could be good.

416 01:02:15.550 01:02:19.930 Katherine Bayless: I’m agnostic, vegetarian, but agnostic.

417 01:02:19.930 01:02:21.500 Kyle Wandel: But you’re a vegetarian, yeah, yeah.

418 01:02:22.070 01:02:37.830 Katherine Bayless: But yeah, no, and I think, I mean, honestly, I think the Alexandria suggestion is probably the right play, because it’s only, like, about a mile-ish from kind of where the office is, so it’s easy to get to, whether, like, Metro, Uber, or… you could actually walk if it’s nice. Yeah. Like, there’s a lot of really good restaurants down there. Yeah.

419 01:02:38.330 01:02:54.650 Katherine Bayless: Actually, I finally got a chance to try a pizza place that I’d been meaning to called Stracci, S-T-R-A-C-C-I. I guess they used to be a, like, a food truck, and then they were very successful, I think, during COVID, and so now they have an actual, like, restaurant. And it was as good as I was promised. Just, you know, pizza.

420 01:02:54.650 01:02:56.239 Uttam Kumaran: Wow, this looks really good.

421 01:02:56.240 01:02:58.750 Katherine Bayless: Yeah, it was really good, really good food.

422 01:02:59.370 01:03:04.599 Uttam Kumaran: I’ll show some people some… Yeah, now I’m hungry. It’s lunchtime here.

423 01:03:08.060 01:03:15.930 Katherine Bayless: Yeah, yeah, I had the, the seasonal pizza that looks like that one in the middle there, actually, like, with, like, ramps and, like, like, the big globs of mozzarella cheese, and I was like.

424 01:03:15.930 01:03:16.740 Amber Lin: Yeah.

425 01:03:17.520 01:03:18.630 Katherine Bayless: It’s my happy place.

426 01:03:20.400 01:03:27.689 Uttam Kumaran: I’m game for this. Maybe I’ll just put this as, like, our choice for now, and I’ll just send a hold, and then we can have a vote. I,

427 01:03:28.630 01:03:31.830 Uttam Kumaran: Yeah, wherever. Outside would be great, yeah.

428 01:03:32.370 01:03:43.980 Katherine Bayless: Yeah, they actually… I think they had… I didn’t even realize, they had a few tables up front, I had noticed, but then as I was leaving, it looked like they might actually have, like, almost like, like, cabanas, like outdoor booths, kind of a thing?

429 01:03:43.980 01:03:45.160 Uttam Kumaran: Yeah, like this thing.

430 01:03:45.160 01:03:47.760 Katherine Bayless: Yeah, and like, down kind of to the left of the building.

431 01:03:47.760 01:03:48.700 Uttam Kumaran: Oh, okay.

432 01:03:48.700 01:03:52.960 Katherine Bayless: like, an alleyway that they’ve got built out with seating and stuff, so…

433 01:03:53.280 01:03:57.809 Katherine Bayless: Yeah, I don’t know, I mean, I’m down for this, I mean, if there’s any, any opposition.

434 01:03:57.950 01:03:59.789 Katherine Bayless: I like pizza.

435 01:04:02.050 01:04:05.810 Katherine Bayless: We’d make Jeff Bezos proud by being a two-pizza team.

436 01:04:06.160 01:04:11.550 Uttam Kumaran: Yes. Although, I don’t know, I feel like we gotta get an appetizer.

437 01:04:11.550 01:04:12.640 Katherine Bayless: Yeah, I mean, actually.

438 01:04:12.640 01:04:14.900 Uttam Kumaran: Pizza’s an appetizer.

439 01:04:14.900 01:04:17.980 Katherine Bayless: Yeah, I think the two-pizza team is based on much larger pizzas.

440 01:04:17.980 01:04:19.280 Uttam Kumaran: Yeah.

441 01:04:21.880 01:04:22.670 Katherine Bayless: But yeah.

442 01:04:23.810 01:04:34.029 Uttam Kumaran: Okay, amazing. Well, great. I think, probably the… I think the big thing, Catherine, is to think about is the Slack channel. I don’t know if you feel like that’s still the best choice.

443 01:04:34.400 01:04:53.210 Katherine Bayless: Yeah, I think… so, I think, actually, we talked about this before you joined, but yeah, I’m gonna set up the Slack channel, that’ll be sort of the community of practice, so it won’t even necessarily just be Snowflake, it’ll be, like, Snowflake plus Claude Code, plus whatever other tooling we’re kind of rolling out, plus these disposable software things that we might end up letting people, like.

444 01:04:53.210 01:04:59.870 Katherine Bayless: you know, spin up their own for, to Kyle’s point, internal things. But, like, yeah, so it’s gonna be kind of a, like.

445 01:05:00.320 01:05:05.039 Katherine Bayless: you know, the future of work is kind of the theme of the Slack channel, but then still very.

446 01:05:05.040 01:05:05.540 Uttam Kumaran: CoP.

447 01:05:05.540 01:05:29.410 Katherine Bayless: you know, saying that, like, if you need something from us, like, if you are actively trying to, like, request work, then put in an Asana ticket. Okay. And then Awaish actually had a good suggestion, which was maybe a partner channel that is kind of like a digest or a feed of, like, the announcements, like, if we do start generating release notes and stuff like that. So yeah, I’ll get the Slack channel set up, this afternoon, and then,

448 01:05:29.410 01:05:33.990 Katherine Bayless: maybe on Monday, we can start letting people kind of come in and chit-chat in them.

449 01:05:35.590 01:05:36.280 Uttam Kumaran: Okay.

450 01:05:36.770 01:05:37.640 Uttam Kumaran: Cool. Right.

451 01:05:37.830 01:05:43.179 Amber Lin: Yeah, and I’m working on the sidebar, the metadata, and then adoption dashboard.

452 01:05:43.370 01:05:52.970 Amber Lin: Quick note, I also ran into the very silly issue of… I asked what CES is, and it gave me a consumer-something score, and I…

453 01:05:53.590 01:05:54.580 Amber Lin: Ugh.

454 01:05:54.930 01:06:07.319 Amber Lin: And I’ve been trying to fix that. That’s the thing I’ve been trying to fix. It’s so annoying that I can’t give the sidebar AI instructions, so I’m trying to work my way around it.

455 01:06:07.540 01:06:09.299 Katherine Bayless: I like videos.

456 01:06:09.300 01:06:16.159 Uttam Kumaran: Maybe we should stuff all the context in, like, a table or something.

457 01:06:16.470 01:06:18.759 Uttam Kumaran: Like, as rows.

458 01:06:18.760 01:06:25.119 Amber Lin: I’ve been trying to… the issue is it doesn’t read the tables when I ask it those textual questions.

459 01:06:25.120 01:06:25.490 Katherine Bayless: It’s all.

460 01:06:25.490 01:06:28.529 Amber Lin: It only gets the context when it’s asked something…

461 01:06:28.830 01:06:42.400 Amber Lin: I asked it to compare CES and Cvent, and it suddenly knew, oh, I gotta look at something to compare, and it read the context and the metadata. But if I only ask something like, just, what is this type…

462 01:06:42.400 01:06:45.270 Uttam Kumaran: It’s just gonna ask the model… it’s just gonna ask straight model, like…

463 01:06:47.240 01:06:48.100 Uttam Kumaran: Weird.

464 01:06:48.430 01:06:49.300 Amber Lin: Mm.

465 01:06:49.300 01:06:51.450 Katherine Bayless: So the verbs really matter, it seems.

466 01:06:52.770 01:06:53.330 Katherine Bayless: Yeah.

467 01:06:53.330 01:06:56.779 Amber Lin: Yeah, it’s a… it’s a little silly, and we’ll run into…

468 01:06:56.780 01:06:57.739 Uttam Kumaran: Questions that people ask?

469 01:06:57.740 01:07:03.459 Amber Lin: asking us, hey, why does it not know who we are? So…

470 01:07:05.460 01:07:19.250 Katherine Bayless: I think, at some point, it’ll probably think that we’re the Chicago Transit Authority, and it’ll return some sort of answer about, like, you know, public transit usage or something, and they’ll be like, you had your chance, robot.

471 01:07:20.760 01:07:24.180 Amber Lin: Oh, I’ll see if we can change anything there.

472 01:07:24.180 01:07:24.810 Katherine Bayless: Okay.

473 01:07:25.130 01:07:27.080 Amber Lin: Yeah. Okay. Cool.

474 01:07:27.500 01:07:28.140 Katherine Bayless: Uttam?

475 01:07:28.310 01:07:31.000 Uttam Kumaran: Alright, thank you everyone, looking forward to next week.

476 01:07:31.000 01:07:31.929 Katherine Bayless: Yeah, yeah, it’s gonna be great.

477 01:07:32.870 01:07:33.900 Katherine Bayless: See you in person.

478 01:07:35.120 01:07:37.900 Uttam Kumaran: Okay, talk to y’all soon.

479 01:07:38.240 01:07:40.129 Amber Lin: Bye. Bye, and have a good weekend.