Meeting Title: Brainforge x CTA: Security Concerns Discussion
Date: 2025-11-20
Meeting participants: Uttam Kumaran, Katherine Bayless
WEBVTT
1 00:00:19.050 ⇒ 00:00:24.069 Uttam Kumaran: Hi, sorry if you don’t mind, I’m just trying… I’m just eating a late lunch, so I will be off video.
2 00:00:24.070 ⇒ 00:00:25.329 Katherine Bayless: Oh, no, you’re fine.
3 00:00:25.490 ⇒ 00:00:26.650 Uttam Kumaran: Fair enough.
4 00:00:26.650 ⇒ 00:00:30.100 Katherine Bayless: I’m gonna sit in my chair and have a glass of water.
5 00:00:30.460 ⇒ 00:00:31.800 Uttam Kumaran: Okay, great.
6 00:00:34.720 ⇒ 00:00:35.859 Katherine Bayless: How was your day?
7 00:00:36.370 ⇒ 00:00:52.550 Uttam Kumaran: Good! It’s been busy, but, yeah, our business is growing, and it’s sort of fun to see also, like, what breaks, and we’re having a surprisingly busy November, I feel like, but it’s really good, and bringing on a couple more
8 00:00:52.770 ⇒ 00:00:59.420 Uttam Kumaran: engineers, and yeah, it’s… it’s fun. Sam and I worked this morning
9 00:00:59.560 ⇒ 00:01:07.420 Uttam Kumaran: on putting together an architecture diagram that I have to review, and we’re working on a Gantt chart for the whole project.
10 00:01:07.600 ⇒ 00:01:12.340 Uttam Kumaran: sort of getting an audit of everything. So yeah, positive day, and I think…
11 00:01:12.500 ⇒ 00:01:18.899 Uttam Kumaran: Everyone’s out, sort of, like, a lot of next week, so just trying to, like, scramble to get as much done as we can.
12 00:01:18.900 ⇒ 00:01:30.950 Katherine Bayless: Yeah, I know, it’s funny, like, I’m totally in that spot where, like, I’m still so new, and, like, don’t have a ton of leave, and stuff like that, and so, like, I’ll probably be working more next week than I would, like, normally.
13 00:01:31.340 ⇒ 00:01:35.490 Katherine Bayless: I’m actually kind of looking forward to it being quiet, maybe getting some shit done.
14 00:01:35.490 ⇒ 00:01:52.560 Uttam Kumaran: Yes, no, that’s actually, like, most of my schedule is client meetings, and I unfortunately don’t get to do a lot of the fun work. I still do some fun work, but usually I come in as, like, the first person to enter the burning building, and then I sort of figure it out, and then sort of loop in my team.
15 00:01:52.690 ⇒ 00:02:00.179 Uttam Kumaran: But I like, you know, I still set up stuff, like, I like doing all the things, I just don’t get to do as much of it, I’m more on the reviewing end, but…
16 00:02:01.450 ⇒ 00:02:02.760 Uttam Kumaran: Yeah, that’s…
17 00:02:02.760 ⇒ 00:02:11.650 Katherine Bayless: It’s really hard to, like, switch out of the, like, super technical hands-on keyboard to the, like, supervising the people who have the hands on the keyboards who are.
18 00:02:11.650 ⇒ 00:02:12.440 Uttam Kumaran: Yes.
19 00:02:12.440 ⇒ 00:02:14.070 Katherine Bayless: AI, right?
20 00:02:14.070 ⇒ 00:02:18.470 Uttam Kumaran: Yes, yes. And sometimes you’re like, hey, this is all AI.
21 00:02:18.470 ⇒ 00:02:19.250 Katherine Bayless: You don’t…
22 00:02:19.450 ⇒ 00:02:28.240 Uttam Kumaran: you didn’t even try, and then… and I’m like, and I know it’s not because you don’t know, it’s because… and they’re like, yeah, sorry, I don’t know, just like…
23 00:02:28.570 ⇒ 00:02:38.659 Uttam Kumaran: And I’m like, yeah, don’t do that. Like, if we put AI stuff in front of clients, you have to be able to explain all of it. Like, I don’t care if you do it with AI.
24 00:02:38.660 ⇒ 00:02:39.479 Katherine Bayless: It has to be.
25 00:02:39.480 ⇒ 00:02:44.230 Uttam Kumaran: to work, and you have to be able to explain every line. Can you do that? Okay, if not, then it’s not…
26 00:02:44.690 ⇒ 00:02:52.910 Uttam Kumaran: It’s not ready, like, you have to account for all the stuff that goes out. And so, that’s kind of a funny conversation to be having with some people.
27 00:02:53.410 ⇒ 00:03:10.600 Katherine Bayless: Yeah, I mean, actually, that is, like, obviously, like, on my end, you know, it’s the same thing, right? Like, I won’t push something unless I can review it and explain it, but I hadn’t really thought about it in the case of, like, consultants. Like, yeah, it probably is another layer of, like, heads up, pay attention, yeah.
28 00:03:10.600 ⇒ 00:03:14.779 Uttam Kumaran: Well, it’s just easy to, like, in… I feel like in consulting.
29 00:03:15.220 ⇒ 00:03:23.779 Uttam Kumaran: it’s… it’s sort of… a lot of the things that people were doing were actually really automatable, like, people are doing decks and things.
30 00:03:23.970 ⇒ 00:03:29.490 Uttam Kumaran: And now they’re finding that, oh my gosh, I could do it so fast. The problem with us, and we’re not…
31 00:03:29.680 ⇒ 00:03:39.409 Uttam Kumaran: I don’t feel like we’re… we’re great consultants, because we work really fast, and so we just, like, push, and I don’t… if the AI just helps us move faster, like, I don’t really.
32 00:03:40.100 ⇒ 00:03:43.290 Uttam Kumaran: I don’t know, we weren’t, like, sandbagging before, so…
33 00:03:44.180 ⇒ 00:03:56.530 Katherine Bayless: It’s actually, it’s really interesting. So, like, small sidebar, but this friend of mine who, she runs a consulting business that’s in the, like, AMS sort of selection and implementation world.
34 00:03:56.680 ⇒ 00:04:02.940 Katherine Bayless: I’ve done multiple projects with her in the past, implementing, actually implementing Impexium, aka Remembers.
35 00:04:02.940 ⇒ 00:04:04.190 Uttam Kumaran: Oh, okay. Yeah, yeah, yeah.
36 00:04:04.190 ⇒ 00:04:13.390 Katherine Bayless: Yeah. And she, she, I was talking to her, at a conference a couple weeks ago, and she was saying, like, she’s really trying to have this, like.
37 00:04:13.390 ⇒ 00:04:30.949 Katherine Bayless: existential crisis moment. Like, she… a lot of her business is the selection piece, and she said the average cost of a selection contract is $55,000 for her to tell them which system to pick. Yes. It’s more complicated than that, but at the
38 00:04:30.950 ⇒ 00:04:44.609 Katherine Bayless: Yes, yes, yes, yes. And she’s like, that’s literally, like… I mean, you don’t even need Gen AI for that. I mean, it could have been an algorithm all along, but now with GenAI, there’s really no excuse, and she’s like, am I the better person if I, like…
39 00:04:44.710 ⇒ 00:04:49.689 Katherine Bayless: cannibalize my business by solving that problem? And I’m like, yes, be the first one to solve it.
40 00:04:49.690 ⇒ 00:05:00.580 Uttam Kumaran: You should be this… oh my god, you took the words out of my mouth. I’m like, yeah, there’s a few years where you will be able to compete on speed in a way others won’t, or depth in a way others won’t, and…
41 00:05:00.630 ⇒ 00:05:16.679 Uttam Kumaran: that’s your moat for a while. But also, in my business, there is a lot of things you can do with AI, but most of where… that’s why I want to align my business as much to, like, outcomes, and speed to outcome, and the quality, because there is still a lot of things that you… even… and…
42 00:05:16.930 ⇒ 00:05:24.550 Uttam Kumaran: like, there’s just a lot of things you can’t basically just toss into AI, like, being able to show up to, like, a vendor meeting, like the one we had with Remember.
43 00:05:25.370 ⇒ 00:05:27.150 Uttam Kumaran: them to do something, follow up.
44 00:05:27.490 ⇒ 00:05:31.030 Uttam Kumaran: Not something like… those are the challenges that…
45 00:05:31.150 ⇒ 00:05:35.579 Uttam Kumaran: you know, a lot of our customers are facing. It’s not necessarily, like.
46 00:05:35.790 ⇒ 00:05:44.980 Uttam Kumaran: Can you spin up Snowflake? Yes, there’s, like, there is an aspect of that, but there’s a lot of other things that really great consultants, I feel like that’s what they bring to the table.
47 00:05:45.100 ⇒ 00:05:49.739 Uttam Kumaran: And also, the other piece is, like, I’m actually able to leverage, like, my team, there are people who
48 00:05:49.910 ⇒ 00:05:54.059 Uttam Kumaran: who only had AI experience and now are able to do, like, great data analysis work.
49 00:05:54.100 ⇒ 00:05:57.890 Katherine Bayless: And so it’s actually, like, blurring the lines. If you’re just a smart engineer.
50 00:05:57.890 ⇒ 00:06:02.959 Uttam Kumaran: Like, I shipped some, like, front-end code the other day, and I don’t have any business, like, doing that.
51 00:06:02.960 ⇒ 00:06:04.320 Katherine Bayless: And…
52 00:06:04.320 ⇒ 00:06:06.609 Uttam Kumaran: That’s, like, what I think is fun about…
53 00:06:06.850 ⇒ 00:06:13.750 Uttam Kumaran: the leveling of AI, like, I don’t… yes, we can get work done faster, but my expectations just go up, like, I don’t…
54 00:06:14.320 ⇒ 00:06:16.160 Uttam Kumaran: Yeah, right. So…
55 00:06:16.270 ⇒ 00:06:25.200 Katherine Bayless: Right, I mean, that… yeah, definitely, research bears that out. Like, the faster you can work, the more things you can do, and you will wind up doing more things.
56 00:06:25.200 ⇒ 00:06:33.280 Uttam Kumaran: Yeah, and then I want to compete on the fact that we do great things fast, and I think there’s a… the ceiling to that is way higher than the floor of, like.
57 00:06:33.690 ⇒ 00:06:36.129 Uttam Kumaran: you know, the other way, right? Yeah.
58 00:06:36.610 ⇒ 00:06:47.290 Katherine Bayless: I mean, just to validate your hypothesis and approach, I am delinquent in following up to say, like, you know, obviously, thank you for jumping on the call.
59 00:06:47.290 ⇒ 00:06:59.989 Katherine Bayless: Like, dude, you fucking… you blew everybody’s mind. Like, Melissa Madeline, who’s our VP of Membership, who was on the call, she Slacked me afterwards, and she just goes, Katherine, your consultant is amazing. And I was like.
60 00:06:59.990 ⇒ 00:07:01.760 Uttam Kumaran: Wait, I said, like, two words.
61 00:07:01.760 ⇒ 00:07:10.329 Katherine Bayless: I know! I know, I know! But, like, we are so starved for, like, actual tech people in this space, and so.
62 00:07:10.330 ⇒ 00:07:13.860 Uttam Kumaran: Well, you see exactly what I told them to do, they ended up doing.
63 00:07:13.860 ⇒ 00:07:15.619 Katherine Bayless: So I was like, just…
64 00:07:15.620 ⇒ 00:07:19.460 Uttam Kumaran: serialize it further upstream for us, I don’t care, and send it to us.
65 00:07:19.500 ⇒ 00:07:20.479 Katherine Bayless: Like… I know!
66 00:07:20.710 ⇒ 00:07:31.200 Uttam Kumaran: I was totally on that call, like, do I say it? No. No, no, no. No, we’re not here to… we’re not here to… yeah. We just want the data. You can take the data.
67 00:07:31.200 ⇒ 00:07:41.169 Katherine Bayless: credit, they’ll advertise it as, like, we have this brand new feature, whatever. Right. Not to mention, it’s just, like, the world’s saddest Lucidchart, but…
68 00:07:41.170 ⇒ 00:08:00.980 Uttam Kumaran: I know, I’m like, dude, I don’t know. I was even telling Sam, like, I can’t wait. This is just… but I… this is what, like, that’s, like, what is unique in consulting that I think we do differently, is, like, we… we negotiate a lot of vendors for a lot of clients, and we come into these, like, weird environments where you’re like.
69 00:08:01.050 ⇒ 00:08:04.580 Uttam Kumaran: yeah, like, a CTO or a head of data would do that, but, like.
70 00:08:04.880 ⇒ 00:08:13.630 Uttam Kumaran: would they fight that hard, or like… and we don’t really play politics, like, I’ll play whatever your politics are, and that’s it, like, that’s… that’s the extent.
71 00:08:13.670 ⇒ 00:08:26.240 Uttam Kumaran: Meaning, like, whatever strategically you need, that’s how we support. So there’s just a lot of advantages, but I get that consultants also get a bad rap, and I don’t… I feel like for good reason, like, I think a lot of the times.
72 00:08:26.420 ⇒ 00:08:33.260 Uttam Kumaran: you win a contract, you totally sell something, then you deliver something else, or the quality’s so bad, and yeah, I don’t know, it’s…
73 00:08:33.770 ⇒ 00:08:34.789 Uttam Kumaran: industry.
74 00:08:35.020 ⇒ 00:08:58.060 Katherine Bayless: I mean, it’s also been interesting to, like… I mean, I kind of did it knowingly, if not on purpose, for this reason, but, like, you know, you guys and SDG working at the same time, and, like, I’m like, I think I’ve had, like, you know, two and a half conversations with your team, and you’re already off and running and building things. I feel like all I do is get emails from them about, like, can we talk to people? I need access to this!
75 00:08:58.060 ⇒ 00:09:02.199 Katherine Bayless: how do I get blah blah blah? And I’m like, oh my god, can you guys please figure it out?
76 00:09:02.200 ⇒ 00:09:02.730 Uttam Kumaran: Yeah.
77 00:09:03.460 ⇒ 00:09:13.969 Uttam Kumaran: Yeah, I just, like, I feel like we just try… I tell my team, we have to communicate like their best communicating employee. Like, that’s our bar. And so…
78 00:09:13.970 ⇒ 00:09:24.459 Uttam Kumaran: I don’t know, it’s weird. I feel like we get that from a lot of clients. I just don’t have a background in consulting, I’ve hired a lot of consultants, so I know what bad consulting is, and so we just try to do the opposite.
79 00:09:24.630 ⇒ 00:09:25.240 Uttam Kumaran: Yeah. But…
80 00:09:25.840 ⇒ 00:09:27.149 Katherine Bayless: Yeah, I mean…
81 00:09:27.150 ⇒ 00:09:33.730 Uttam Kumaran: Yeah, no, I appreciate it. Yeah, this is just hopefully the first of many vendors that we push to get what we need, so…
82 00:09:33.730 ⇒ 00:09:40.290 Katherine Bayless: Believe me, believe me, I have a feeling that will be the case. And actually, that’s a great segue into.
83 00:09:40.290 ⇒ 00:09:41.550 Uttam Kumaran: Okay, yes.
84 00:09:41.550 ⇒ 00:09:46.640 Katherine Bayless: Yeah, let me… let me share with you my fears. So…
85 00:09:46.960 ⇒ 00:09:51.240 Katherine Bayless: And I’ll apologize in advance, because it’s probably going to be kind of a rambling monologue,
86 00:09:51.240 ⇒ 00:09:51.879 Uttam Kumaran: That’s fine.
87 00:09:51.880 ⇒ 00:09:53.890 Katherine Bayless: I’ll do an info dump, and then we’ll go.
88 00:09:54.020 ⇒ 00:10:07.380 Katherine Bayless: So, when I started in April, there was a data breach of some sort actively occurring. I never really got the full story on it, and to be fair, it is more Jay’s domain than mine.
89 00:10:07.390 ⇒ 00:10:19.749 Katherine Bayless: But as far as I understand, there was some activity on the network that was suspicious, and data that was moved off of our accounting and finance drive, you know, shocker. Why we have mapped network drives in 2025, I cannot
90 00:10:20.170 ⇒ 00:10:21.870 Katherine Bayless: possibly understand.
91 00:10:22.260 ⇒ 00:10:27.489 Katherine Bayless: So there was that. And there have been other, sort of, smaller incidents.
92 00:10:27.620 ⇒ 00:10:32.510 Katherine Bayless: Most recently, though, what’s kind of like…
93 00:10:33.030 ⇒ 00:10:37.929 Katherine Bayless: There’s this, like, miasma of security… question marks?
94 00:10:39.420 ⇒ 00:10:55.239 Katherine Bayless: So, obviously, CES is, like, a big event, right? And so, like, there are plenty and plenty and plenty of scammers out there who say that they are selling the CES attendee list or exhibitor list, right? And most of them are doing a pretty solid job reverse-engineering what it would be if they had it, right?
95 00:10:55.810 ⇒ 00:10:56.460 Katherine Bayless: And this…
96 00:10:56.460 ⇒ 00:11:00.599 Uttam Kumaran: This isn’t primarily in order to sell, like, ancillary services, or, like, the…
97 00:11:00.890 ⇒ 00:11:03.839 Katherine Bayless: It depends. I mean, some of them are scams, right? Like, they’ll say.
98 00:11:03.840 ⇒ 00:11:04.320 Uttam Kumaran: Right, yeah.
99 00:11:04.320 ⇒ 00:11:19.190 Katherine Bayless: CES attendee list, and then they’ll hold you ransom, you know, hold your credit card hostage, right? Some of them will say, buy the CES attendee list just to make a quick buck, and it’s bad data. And then, yeah, I think there are some that are trying to sell ancillary services. Like, we do get housing pirates and, like, people that are, like.
100 00:11:19.190 ⇒ 00:11:24.549 Katherine Bayless: you know, we’ll do transportation for you. And it’s like, you’re actually not associated with the event at all, though.
101 00:11:25.660 ⇒ 00:11:33.129 Katherine Bayless: So, like, we do know, and I have encountered this in other jobs too, right? Like, that these faux lists that get sold are a thing.
102 00:11:34.410 ⇒ 00:11:42.480 Katherine Bayless: And so, Jay keeps saying that, like, when our data appears in the wild, that it’s surely, it must just be that.
103 00:11:42.620 ⇒ 00:11:50.959 Katherine Bayless: However, We’ve had some really oddly specific and correct data appear in the wild.
104 00:11:51.120 ⇒ 00:12:08.279 Katherine Bayless: And I was on a call earlier today with the woman who is sort of the de facto owner of the CES tech stack at the moment. It’s a whole nother saga. She’s lovely, she’s based in Brazil. I think she’s probably the only reason CES will actually happen this year, to be honest.
105 00:12:08.280 ⇒ 00:12:09.509 Uttam Kumaran: Oh, wow, okay.
106 00:12:09.660 ⇒ 00:12:16.550 Katherine Bayless: And so, like, we have this Thursday call every other… or every week with her, and all of the different vendors are on it.
107 00:12:16.850 ⇒ 00:12:31.670 Katherine Bayless: And today, she was saying that there’s a specific email address that she used in her registration, not intending to be a honeypot, but it wound up because she was like, this is literally an email I’ve used nowhere else, and now I’m getting, like, scammers and.
108 00:12:31.670 ⇒ 00:12:32.990 Uttam Kumaran: Oh, yeah, yeah, yeah, okay.
109 00:12:33.370 ⇒ 00:12:44.230 Katherine Bayless: Right. And then, we’ve also had a few other exhibitors who have contacted us, having heard from, you know, scammer, spammer types, but, like, they have the correct booth number, and things like that.
110 00:12:44.570 ⇒ 00:12:52.690 Katherine Bayless: And so, at present, Jen is asking, you know, the vendors to kind of… oh, sorry, and there’s one other piece that she mentioned.
111 00:12:53.710 ⇒ 00:13:01.750 Katherine Bayless: that I don’t have all the details on, admittedly, but it sounds like last year, there was some activity
112 00:13:01.930 ⇒ 00:13:05.600 Katherine Bayless: Pushing messages to our mobile app.
113 00:13:05.700 ⇒ 00:13:12.730 Katherine Bayless: That was not originating from within the app, and that should not be possible.
114 00:13:13.070 ⇒ 00:13:19.979 Katherine Bayless: And so, she was sort of asking the vendors, like, can you guys all check your little daisy chain of integrations, and, you know…
115 00:13:19.980 ⇒ 00:13:20.430 Uttam Kumaran: Sure.
116 00:13:20.430 ⇒ 00:13:22.590 Katherine Bayless: Make sure, key rotation, all the things.
117 00:13:23.160 ⇒ 00:13:30.030 Katherine Bayless: I was like, Jen, can we chat? Because I’m increasingly… and this is why I was like, let’s just… you and I have a tete-a-tete.
118 00:13:30.030 ⇒ 00:13:30.660 Uttam Kumaran: Okay.
119 00:13:30.660 ⇒ 00:13:33.010 Katherine Bayless: I’m afraid the call is coming from inside the house.
120 00:13:33.650 ⇒ 00:13:34.640 Uttam Kumaran: Mmm.
121 00:13:34.640 ⇒ 00:13:47.129 Katherine Bayless: I think Jay is very smart, but I think he is the smart-enough-to-be-incredibly-dangerous type. Like, he has an MCP server wired up that can edit our DNS records. I don’t think that’s.
122 00:13:47.130 ⇒ 00:13:48.120 Uttam Kumaran: Yeah, yeah, yeah.
123 00:13:48.120 ⇒ 00:13:53.720 Katherine Bayless: Right? Like, and he’s very cowboy. He, he, he, like, the Okta stuff is a mess.
124 00:13:53.720 ⇒ 00:13:54.440 Uttam Kumaran: Yes, yes.
125 00:13:54.440 ⇒ 00:13:55.119 Katherine Bayless: on Slack?
126 00:13:55.120 ⇒ 00:14:00.429 Uttam Kumaran: No, and I can tell… I can tell that everybody’s sort of just, like, tiptoeing around to figure it out, yeah.
127 00:14:00.430 ⇒ 00:14:09.760 Katherine Bayless: Yeah. Like, he literally, he posted a message, he’s like, oh, next time you go to log in, you’re gonna have to, like, create a password, but don’t worry, it’s just a backup password. I was like, that doesn’t make sense.
128 00:14:11.050 ⇒ 00:14:15.020 Katherine Bayless: Like, I know his house is, like, a house of cards in a hurricane, or, you know.
129 00:14:15.020 ⇒ 00:14:15.470 Uttam Kumaran: Yeah.
130 00:14:15.470 ⇒ 00:14:18.339 Katherine Bayless: The junk you’ve shoved into the closet before the guests come over.
131 00:14:18.590 ⇒ 00:14:27.979 Katherine Bayless: And I think he really is a victim of, you know, like, chronic under-resourcing for his team and his role, but…
132 00:14:28.320 ⇒ 00:14:31.870 Katherine Bayless: Our security is just fucking shit, and I…
133 00:14:31.870 ⇒ 00:14:32.520 Uttam Kumaran: Yeah.
134 00:14:32.520 ⇒ 00:14:34.540 Katherine Bayless: It keeps me up at night being like.
135 00:14:34.790 ⇒ 00:14:41.430 Katherine Bayless: what am I doing building this, like, lovely dataset for someone to just come in and grab, kind of thing, right? And so.
136 00:14:41.430 ⇒ 00:14:41.840 Uttam Kumaran: Yeah.
137 00:14:41.840 ⇒ 00:14:51.610 Katherine Bayless: I guess at the end of the rambling monologue, it’s like, I just can’t decide. How loud should I panic? Because nobody else seems to be. And to me.
138 00:14:51.730 ⇒ 00:15:00.289 Katherine Bayless: it feels like the kind of stuff that Jay shouldn’t be able to just be like, I don’t know, it’s probably just a scam. And I’m like, I think we could look into it a little deeper?
139 00:15:00.780 ⇒ 00:15:02.729 Uttam Kumaran: Yeah, I mean…
140 00:15:02.800 ⇒ 00:15:04.980 Katherine Bayless: I wonder if there’s a…
141 00:15:05.440 ⇒ 00:15:09.669 Uttam Kumaran: I mean, do they have any sort of access for audit logs on any…
142 00:15:09.800 ⇒ 00:15:11.450 Katherine Bayless: Nope. On any data.
143 00:15:11.750 ⇒ 00:15:16.800 Katherine Bayless: I mean, the vendors, hopefully, in some cases they do, but…
144 00:15:16.910 ⇒ 00:15:20.470 Katherine Bayless: I mean, some of them are gonna be those sort of, you know, small.
145 00:15:20.470 ⇒ 00:15:21.050 Uttam Kumaran: Yeah, yeah.
146 00:15:21.050 ⇒ 00:15:35.450 Katherine Bayless: shops, but we don’t. I literally, I asked Jay earlier, because he was talking about some other strange thing he fixed that I don’t think he should have fixed, and I was like, well, surely we have logs? And he’s like, yeah, I looked into that a few years ago, but I just don’t have the time.
147 00:15:35.560 ⇒ 00:15:36.280 Katherine Bayless: I’m like…
148 00:15:36.280 ⇒ 00:15:38.720 Uttam Kumaran: I don’t understand. But who… I guess, like, who…
149 00:15:39.000 ⇒ 00:15:45.709 Uttam Kumaran: Well, I mean, the one way to do this is to, like, now that I think there’s, like, real monetary value at risk.
150 00:15:46.270 ⇒ 00:15:50.740 Uttam Kumaran: Right? So, that could be something where it’s like, hey, if there is a breach.
151 00:15:51.130 ⇒ 00:15:54.250 Uttam Kumaran: And we don’t notify people, like, there’s some risk.
152 00:15:54.530 ⇒ 00:15:54.890 Katherine Bayless: Right.
153 00:15:55.690 ⇒ 00:15:56.320 Katherine Bayless: Right.
154 00:15:57.110 ⇒ 00:16:04.750 Uttam Kumaran: I don’t know, like, what… have you talked to anybody else in the, kind of, org, more senior, and has anyone, like, raised any concern? I mean.
155 00:16:04.940 ⇒ 00:16:09.050 Uttam Kumaran: The other way… the other thing is to really just make… try to get
156 00:16:09.320 ⇒ 00:16:13.869 Uttam Kumaran: The budget for something more on the security side, because that’ll force
157 00:16:14.570 ⇒ 00:16:19.619 Uttam Kumaran: More initiative there, but yeah, that’s kind of, like, the two things that come to my mind.
158 00:16:20.000 ⇒ 00:16:22.110 Katherine Bayless: I know. I mean, so, like, I’ve…
159 00:16:22.370 ⇒ 00:16:32.779 Katherine Bayless: I have raised the concern internally with several other, you know, VP and above people. The challenge is, I mean, kind of going back to the dearth of technical talent.
160 00:16:32.780 ⇒ 00:16:33.790 Uttam Kumaran: Yeah, okay.
161 00:16:33.790 ⇒ 00:16:37.309 Katherine Bayless: like, Jay has… I mean, you know, he’s kind of…
162 00:16:37.780 ⇒ 00:16:39.020 Uttam Kumaran: No, it’s also like a…
163 00:16:39.020 ⇒ 00:16:40.609 Katherine Bayless: He’s shutting down those conversations?
164 00:16:40.610 ⇒ 00:16:41.110 Uttam Kumaran: Yeah, yeah.
165 00:16:41.110 ⇒ 00:16:57.879 Katherine Bayless: And they’re, you know, I mean, ostriches love sand, right? Like, nobody wants to think our data is leaking into the wild. And truthfully, I mean, it might be as simple as, like, yeah, we have 50 interns, and they have Jensen Huang’s cell phone number on SharePoint. Yeah, I mean, I’d exfiltrate our data, too, if I was on a college budget.
166 00:16:57.880 ⇒ 00:16:58.440 Uttam Kumaran: Yeah.
167 00:17:00.920 ⇒ 00:17:06.109 Uttam Kumaran: But still, it’s like, you can still prove that it’s not happening. That should be such a reasonable ask, right?
168 00:17:06.609 ⇒ 00:17:09.739 Katherine Bayless: In his mind, it is impossible to prove that it’s not happening.
169 00:17:12.430 ⇒ 00:17:13.349 Uttam Kumaran: Hmm…
170 00:17:13.699 ⇒ 00:17:15.499 Katherine Bayless: Right, I know, I know.
171 00:17:15.500 ⇒ 00:17:16.659 Uttam Kumaran: I mean, I don’t…
172 00:17:17.619 ⇒ 00:17:18.420 Katherine Bayless: Yeah.
173 00:17:18.700 ⇒ 00:17:20.530 Uttam Kumaran: Yeah, that’s tough, because…
174 00:17:22.260 ⇒ 00:17:38.690 Uttam Kumaran: I mean, I don’t… again, this is, I think, is just, like, typically the way you sort of budget, like, when you get insurance, you budget because of the potential damage, right? And so one thing is, like, one way to bring it up, and how we talk about it is just, like, we have insurance on cyber and stuff, but…
175 00:17:39.350 ⇒ 00:17:46.170 Uttam Kumaran: that is something that not everyone has, but that’s the risk we run if we… if there is an issue, if there is a breach, and so…
176 00:17:46.290 ⇒ 00:17:50.009 Uttam Kumaran: Maybe the way is to indicate that there is something happening, but…
177 00:17:51.090 ⇒ 00:17:56.379 Uttam Kumaran: you’re right, like, unless someone truly cares, I don’t… I just don’t know…
178 00:17:57.330 ⇒ 00:18:02.090 Uttam Kumaran: Yeah, it’s… it’s hard, because there is gonna be a technical… there’s gonna be, like.
179 00:18:02.830 ⇒ 00:18:05.449 Uttam Kumaran: I can tell him, like, how it gets done.
180 00:18:05.730 ⇒ 00:18:19.010 Uttam Kumaran: On the data side, I mean, truly, you should be able to… we’ll be able to understand, like, we have role-based access control, and you’ll be able to understand all the stuff that’s getting queried against the reporting database.
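A minimal sketch of the reporting-database audit Uttam is describing, assuming query-history records can be exported as plain rows (the field names, roles, and table policy below are all made up for illustration, not anything CTA’s warehouse actually exposes):

```python
from datetime import datetime

# Hypothetical query-history rows, shaped like what a warehouse's
# query-history view typically exposes (user, role, object, timestamp).
QUERY_HISTORY = [
    {"user": "jay", "role": "SYSADMIN", "table": "members", "ts": datetime(2025, 11, 19, 3, 12)},
    {"user": "dbt_svc", "role": "TRANSFORMER", "table": "members", "ts": datetime(2025, 11, 19, 6, 0)},
    {"user": "intern01", "role": "REPORTER", "table": "attendees", "ts": datetime(2025, 11, 18, 23, 55)},
]

# Roles we expect to touch each sensitive table (hypothetical policy,
# the kind of thing RBAC lets you write down and then check against).
ALLOWED = {
    "members": {"TRANSFORMER", "REPORTER"},
    "attendees": {"TRANSFORMER"},
}

def flag_suspicious(history, allowed):
    """Return queries against sensitive tables made by roles outside policy."""
    return [
        q for q in history
        if q["table"] in allowed and q["role"] not in allowed[q["table"]]
    ]

if __name__ == "__main__":
    for q in flag_suspicious(QUERY_HISTORY, ALLOWED):
        print(f'{q["ts"]:%Y-%m-%d %H:%M} {q["user"]} ({q["role"]}) read {q["table"]}')
```

The point is less the code than the shape of the check: once access is role-based, “who queried what” reduces to comparing history against a written-down policy.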
181 00:18:19.010 ⇒ 00:18:19.620 Katherine Bayless: Right.
182 00:18:19.620 ⇒ 00:18:23.379 Uttam Kumaran: But… Because there’s also so many…
183 00:18:24.140 ⇒ 00:18:31.530 Uttam Kumaran: places of data, it’s very hard. The other question is, like, is this something that you want to own? Right.
184 00:18:31.530 ⇒ 00:18:31.950 Katherine Bayless: Right.
185 00:18:31.950 ⇒ 00:18:34.240 Uttam Kumaran: You know? And see it through.
186 00:18:34.240 ⇒ 00:18:35.340 Katherine Bayless: Right, right.
187 00:18:35.340 ⇒ 00:18:38.960 Uttam Kumaran: But also, maybe it’s a nothing burger, and… Yeah.
188 00:18:39.100 ⇒ 00:18:49.169 Katherine Bayless: It’s true. I mean, it really… it really could be true, like Jay says, that it’s just very clever, you know, social engineers out there, but I’m like, I just… I don’t trust…
189 00:18:49.360 ⇒ 00:18:51.680 Uttam Kumaran: Well, I mean, like, maybe it’s, like.
190 00:18:51.820 ⇒ 00:19:04.779 Uttam Kumaran: how do we… I think maybe, alternatively, it’s thinking about how to get him to care about security. Like, one of the things, even, like, in our process is, like, we have all these tools, like, we should export the data and, like, deprecate
191 00:19:04.900 ⇒ 00:19:06.230 Uttam Kumaran: Like, the tool.
192 00:19:06.530 ⇒ 00:19:11.979 Uttam Kumaran: Right? Like, sunset, whatever it is, if we have all the data, or restrict all the access.
193 00:19:12.470 ⇒ 00:19:15.620 Uttam Kumaran: Like, as we’re going tool by tool, maybe that is something that…
194 00:19:16.270 ⇒ 00:19:20.990 Uttam Kumaran: You could own that’s actually pretty, like, in parallel with all the stuff we’re already doing.
195 00:19:21.240 ⇒ 00:19:36.399 Uttam Kumaran: Which is, like, go in, maybe for each tool, we go and we request access, like, logs, we do a user audit and removal. Already, like, that should probably cut some spend, so there is some benefit. And then if there’s tools that are just completely dead.
196 00:19:36.570 ⇒ 00:19:39.690 Uttam Kumaran: Then we should turn them off, and there’s, like, at least some overlap.
197 00:19:40.310 ⇒ 00:19:41.939 Uttam Kumaran: With security.
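The per-tool user audit Uttam proposes could start as simply as this, assuming each tool’s user list can be exported and compared against the HR roster (the usernames and per-seat price here are invented):

```python
# Hypothetical exports: the tool's provisioned users vs. the current HR roster.
TOOL_USERS = {"kbayless", "jay", "intern01", "exemployee", "old_contractor"}
ACTIVE_STAFF = {"kbayless", "jay", "intern01"}
SEAT_COST_PER_MONTH = 35  # made-up per-seat price

def audit_seats(tool_users, active_staff, seat_cost):
    """Users to deprovision, plus the monthly spend those stale seats represent."""
    stale = sorted(tool_users - active_staff)
    return stale, len(stale) * seat_cost

if __name__ == "__main__":
    stale, savings = audit_seats(TOOL_USERS, ACTIVE_STAFF, SEAT_COST_PER_MONTH)
    print("deprovision:", ", ".join(stale))
    print(f"monthly savings: ${savings}")
```

This is the overlap he’s pointing at: the same pass that tightens access also surfaces seats (and whole dead tools) that can be cut from spend.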
198 00:19:42.110 ⇒ 00:19:49.500 Uttam Kumaran: The toughest part is that, like, if they’re accessing things within the Microsoft, like.
199 00:19:50.220 ⇒ 00:19:56.290 Uttam Kumaran: if nobody else can see, like, Okta logs… I mean, at least it’s good that you have, like, SSO and…
200 00:19:56.550 ⇒ 00:20:00.649 Uttam Kumaran: there is Okta, but, again, it’s like, you…
201 00:20:01.230 ⇒ 00:20:07.039 Uttam Kumaran: if someone does have, like, backdoor access to everything, or has a master password, or whatever, then it’s… it’s like…
202 00:20:07.160 ⇒ 00:20:08.999 Uttam Kumaran: sort of moot, right? So…
203 00:20:09.400 ⇒ 00:20:16.990 Katherine Bayless: Right. I mean, I think, truthfully, I think the biggest risk surface is, like, probably the API keys. I mean, there’s.
204 00:20:16.990 ⇒ 00:20:17.630 Uttam Kumaran: Yes.
205 00:20:17.850 ⇒ 00:20:36.379 Katherine Bayless: there’s just so many that he has created, provisioned, turned on, right? And, like, you know, they don’t have an expiration or a token rotation requirement, and they’re just, like, floating around. I mean, I know Impexium actually has, like, 7 or 8, I think, like, quote-unquote live integrations that go nowhere, and I’m like, okay, well, that’s not great.
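For the API keys “just floating around,” a first-pass rotation check could be sketched like this, assuming the key inventory can be dumped into a list (the 90-day window, key names, and fields are assumptions for illustration, not anything Impexium actually exposes):

```python
from datetime import datetime, timedelta

MAX_AGE = timedelta(days=90)  # assumed rotation policy
NOW = datetime(2025, 11, 20)

# Hypothetical inventory of issued keys; `expires` is None when the vendor
# lets keys live forever, which is the case being described here.
KEYS = [
    {"name": "impexium-webhook", "created": datetime(2023, 2, 1), "expires": None},
    {"name": "mobile-app-push", "created": datetime(2025, 10, 1), "expires": datetime(2026, 10, 1)},
    {"name": "legacy-integration-3", "created": datetime(2022, 7, 14), "expires": None},
]

def needs_rotation(keys, now, max_age):
    """Keys that never expire, or are older than the rotation window."""
    return [
        k["name"] for k in keys
        if k["expires"] is None or now - k["created"] > max_age
    ]

if __name__ == "__main__":
    for name in needs_rotation(KEYS, NOW, MAX_AGE):
        print("rotate:", name)
```

Even without vendor cooperation on expiry policies, keeping an inventory like this makes the “live integrations that go nowhere” visible and rotatable on a schedule.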
206 00:20:36.490 ⇒ 00:20:40.099 Katherine Bayless: So yeah, I mean, I think it’s a good point, like, I think…
207 00:20:40.100 ⇒ 00:20:43.250 Uttam Kumaran: Why don’t we, when we go tool by tool, we can try to pull some things.
208 00:20:43.290 ⇒ 00:20:45.350 Katherine Bayless: Yeah. Yeah.
209 00:20:45.350 ⇒ 00:20:52.469 Uttam Kumaran: And, I mean, at least for something, you can sleep well that you’re trying. Right, right.
210 00:20:52.470 ⇒ 00:20:55.199 Katherine Bayless: I mean, you know, not to make it all about me, because it’s obviously.
211 00:20:55.200 ⇒ 00:20:55.610 Uttam Kumaran: No.
212 00:20:55.610 ⇒ 00:21:03.210 Katherine Bayless: Not like that, but I am like, Jesus Christ, if CTA has a big data breach, like, I mean, who’s gonna be the one in the hot seat but data girl? You know what I mean?
213 00:21:03.210 ⇒ 00:21:06.509 Uttam Kumaran: Yes, yes. But this is where it’s like, it’s… I think…
214 00:21:07.440 ⇒ 00:21:18.420 Uttam Kumaran: But I think you’re… I think it’s good, because when we work with other data leaders, it’s not a concern of theirs, but we… we use key pair auth, we 1Password everything, and yeah, we have, like, key rotation.
215 00:21:18.650 ⇒ 00:21:25.260 Uttam Kumaran: And then we also… again, we… we also… I’ve been in situations where, yeah, you have to have…
216 00:21:25.460 ⇒ 00:21:32.250 Uttam Kumaran: kind of controls where if someone requests access, there is a committee that reviews. Like, we can start to put some of that in place.
217 00:21:32.530 ⇒ 00:21:34.980 Uttam Kumaran: And maybe it is the fact that, like.
218 00:21:35.190 ⇒ 00:21:41.589 Uttam Kumaran: you can use that as a positive, and then you can say, hey, but we don’t… I don’t have, sort of.
219 00:21:41.820 ⇒ 00:21:48.160 Uttam Kumaran: Security, you know, confidence in this other area of the company, but we have it in data, right?
220 00:21:48.380 ⇒ 00:21:52.149 Uttam Kumaran: I don’t know, could be something. And we’re already gonna go tool by tool.
221 00:21:52.150 ⇒ 00:21:55.000 Katherine Bayless: Yeah. And get access to things, so we might as well…
222 00:21:55.000 ⇒ 00:21:56.470 Uttam Kumaran: When we’re in the tool.
223 00:21:57.110 ⇒ 00:22:02.550 Uttam Kumaran: I’ll go… we can go through the users and check what has open endpoints.
224 00:22:03.020 ⇒ 00:22:06.419 Uttam Kumaran: I mean, and yeah, I might as well do that.
225 00:22:06.440 ⇒ 00:22:11.080 Katherine Bayless: Because some of them, at least you’re probably spending money on seat costs that you could reduce, you know?
226 00:22:11.290 ⇒ 00:22:29.659 Katherine Bayless: Right, right, oh yeah, for sure, for sure. I mean, it’s funny, like, we do have the, like, some of the apps behind Okta, but, like, the vast majority of our systems are just individually provisioned by the team, like, deciding who gets access, and so, like, I know there are people with access to stuff that, like.
227 00:22:29.870 ⇒ 00:22:32.860 Katherine Bayless: I mean, you know, you hope they aren’t using, but it probably.
228 00:22:32.860 ⇒ 00:22:39.389 Uttam Kumaran: Definitely, but I guess in that question, like, do you think Jay would be open to finding that tool and putting it behind SSO at that point?
229 00:22:40.130 ⇒ 00:22:51.779 Katherine Bayless: So… he’s… okay, so we’ve had this conversation a few times, like, he’s… He wants control, control, like…
230 00:22:52.020 ⇒ 00:22:54.690 Katherine Bayless: Stick, not carrot type control, but he.
231 00:22:54.690 ⇒ 00:22:55.140 Uttam Kumaran: Okay.
232 00:22:55.140 ⇒ 00:23:08.700 Katherine Bayless: wants to get involved in what he decides… what he calls business decisions. And so, like, he tends to not be interested in managing systems that aren’t, sort of, like, you know.
233 00:23:08.950 ⇒ 00:23:09.690 Uttam Kumaran: Yes.
234 00:23:09.690 ⇒ 00:23:12.620 Katherine Bayless: Yeah. Which I, I think, to me, is…
235 00:23:12.620 ⇒ 00:23:15.359 Uttam Kumaran: How much management is SSO? It’s just like…
236 00:23:15.580 ⇒ 00:23:20.420 Uttam Kumaran: get the thing behind Okta, and like… I guess he has to manage groups and stuff, but, like.
237 00:23:20.950 ⇒ 00:23:23.730 Uttam Kumaran: Who else is gonna do that? Like, he’s the IT guy.
238 00:23:24.080 ⇒ 00:23:32.079 Katherine Bayless: Thank you, right? Yeah, exactly. Yeah, I mean, his position on his role is interesting. I mean, we’ve had an.
239 00:23:32.080 ⇒ 00:23:44.010 Uttam Kumaran: A lot of companies have this, by the way. You have the one IT guy, a lot… I mean, a lot of our legacy clients, they say it’s like, oh, go ask this person, or he has… he knows it all, and then that person’s sort of like, I’ll get to it when I get to it. It’s kind of sad.
240 00:23:44.980 ⇒ 00:23:57.580 Katherine Bayless: And I think also, like, specifically, like, he thinks that, you know, you know, 20 years ago, when I started, I was just operational IT, like, I just had to give people a mailbox, and then my job was done, and now it’s all this other shit, and I’m like, yeah, correct.
241 00:23:58.040 ⇒ 00:24:01.449 Uttam Kumaran: Yeah, but fight for budget and get more people, like, you didn’t…
242 00:24:01.580 ⇒ 00:24:05.569 Uttam Kumaran: Instead of getting a… doing it right, you’re not doing it well.
243 00:24:05.880 ⇒ 00:24:06.430 Katherine Bayless: Right.
244 00:24:06.750 ⇒ 00:24:10.809 Uttam Kumaran: This is what happened, but someone has to come in and… and sort of…
245 00:24:11.440 ⇒ 00:24:13.499 Uttam Kumaran: Say, this is the way it’s gonna work.
246 00:24:14.110 ⇒ 00:24:14.750 Katherine Bayless: I mean…
247 00:24:14.750 ⇒ 00:24:19.439 Uttam Kumaran: maybe if it’s truly a concern, we could just build a case up over time. I think…
248 00:24:19.760 ⇒ 00:24:23.910 Uttam Kumaran: If you can show a little bit of, like, how we’re thinking about security on your team.
249 00:24:24.380 ⇒ 00:24:27.460 Uttam Kumaran: Maybe it continues just to be a conversation that…
250 00:24:27.590 ⇒ 00:24:30.190 Uttam Kumaran: sort of kind of chip away at it.
251 00:24:30.470 ⇒ 00:24:36.440 Katherine Bayless: No, no, I actually… I think that’s the right answer. I like that. Like, we will build in a way that is…
252 00:24:36.440 ⇒ 00:24:55.650 Katherine Bayless: correct, and then puts him in the position of the… and I really… I do like him, so I’m not trying to be, like, mean to him, but I am, like, this issue has to change, right? And so I’m like, yeah, if we build something the right way, and then we say to him, like, join us, and he says, no, well, now that’s a different problem, right? Because…
253 00:24:55.650 ⇒ 00:24:58.620 Katherine Bayless: Why turn down the better way of doing things, right?
254 00:24:58.620 ⇒ 00:24:59.290 Uttam Kumaran: Yes.
255 00:24:59.640 ⇒ 00:25:00.340 Katherine Bayless: Yeah.
256 00:25:00.520 ⇒ 00:25:11.899 Uttam Kumaran: For me, I just learned, like, with folks like this sometimes, it’s like, how do I make it… the idea seem like it was their idea? But I’m always like, I don’t care about it being my idea, I care about it working.
257 00:25:12.150 ⇒ 00:25:17.370 Uttam Kumaran: I mean, that didn’t give me more money when I worked in companies, or more titles, so definitely, like.
258 00:25:17.920 ⇒ 00:25:21.080 Uttam Kumaran: there was a strategic mistake I made somewhere.
259 00:25:21.900 ⇒ 00:25:23.629 Katherine Bayless: No, like…
260 00:25:24.060 ⇒ 00:25:33.729 Katherine Bayless: I’m the same way. I’m the same way. I would ra- it’s just like the Impexium thing, right? You know, like, oh, that’s so cute, you presenting Tom’s idea back to us and saying it’s yours. Good job, guys, right?
261 00:25:33.730 ⇒ 00:25:39.120 Uttam Kumaran: So be it. Let’s go. Let’s get on with it. Exactly.
262 00:25:39.120 ⇒ 00:25:54.859 Katherine Bayless: I’m right there with you. And, I mean, truthfully, I mean, there’s a part of me that’s afraid, in some way, shape, or form, that I am gonna wind up in a stumble-to-the-top sort of problem here, because… I don’t know if I’ve mentioned this to you or not yet, but, like.
263 00:25:55.000 ⇒ 00:26:01.209 Katherine Bayless: I’m pretty sure that I’m gonna be the person in charge of the CES tech stack starting January, and I don’t…
264 00:26:01.210 ⇒ 00:26:01.840 Uttam Kumaran: Okay?
265 00:26:03.210 ⇒ 00:26:12.130 Katherine Bayless: I don’t know, I just don’t know. But it does give you a lot of room to do what you’re talking about, and say, like, we’re gonna secure this a different way, join us, or, you know, get out of the way.
266 00:26:12.710 ⇒ 00:26:15.510 Uttam Kumaran: But I… but I also think this is where it’s like,
267 00:26:15.650 ⇒ 00:26:25.519 Uttam Kumaran: I don’t know, sometimes in an org like CES, an org like CTA, like, there can be so much to do that you end up doing nothing. Yes. And so…
268 00:26:25.690 ⇒ 00:26:33.960 Uttam Kumaran: I’ve been in that position before, like, it tends to happen, like, if you’re able to solve problems in an order like this, the problems sort of find you.
269 00:26:36.840 ⇒ 00:26:42.339 Uttam Kumaran: And the nice thing about my company now, I found a way to monetize that.
270 00:26:44.510 ⇒ 00:26:52.680 Uttam Kumaran: Where I’m like, yeah, sure, okay, we could keep doing more, but I get where you’re coming from, but it’s… it is a strategic decision, I mean, it’s not easy.
271 00:26:53.100 ⇒ 00:27:00.430 Uttam Kumaran: But it’s also, like, I think this data piece, there’s clearly so much alpha that I, even if you were to accomplish
272 00:27:00.900 ⇒ 00:27:09.650 Uttam Kumaran: this data platform setup and landing things and reporting, it’s a huge accomplishment outside of anything else, you know? And…
273 00:27:09.650 ⇒ 00:27:10.430 Katherine Bayless: Yeah.
274 00:27:10.900 ⇒ 00:27:14.410 Uttam Kumaran: And that is, like, a whole job and initiative, like.
275 00:27:14.940 ⇒ 00:27:18.270 Uttam Kumaran: Beyond even the entirety of… of…
276 00:27:18.580 ⇒ 00:27:20.819 Uttam Kumaran: the rest of the tech stack, you know, so…
277 00:27:20.930 ⇒ 00:27:26.559 Uttam Kumaran: part of it is, like, okay, and if we can… but again, I think we will slowly chip away at this, and…
278 00:27:26.880 ⇒ 00:27:31.229 Uttam Kumaran: It’ll sort of give you enough ammunition to think about what you want to do internally.
279 00:27:31.480 ⇒ 00:27:33.000 Katherine Bayless: Right, right.
280 00:27:33.960 ⇒ 00:27:50.760 Katherine Bayless: Yeah, I mean, actually, I mean, yeah, it’s, like, small side note from a logistical perspective. I should make a, I should probably put a reminder, like, on our calendars for maybe early December, because I do want to turn in a 2026, like, scope of work as
281 00:27:51.100 ⇒ 00:28:00.449 Katherine Bayless: reasonably soon as possible, so that, like… Okay. Because probably the month of January will be lost to the annals of time between the show and getting back from the show, but…
282 00:28:00.450 ⇒ 00:28:01.160 Uttam Kumaran: Yes.
283 00:28:01.530 ⇒ 00:28:10.779 Katherine Bayless: I do think I really would like to continue working with you and the team in the new year, and so the sooner we can put that work into the draconian contracting queue…
284 00:28:10.780 ⇒ 00:28:12.140 Uttam Kumaran: Yes.
285 00:28:12.140 ⇒ 00:28:12.740 Katherine Bayless: the better.
286 00:28:12.740 ⇒ 00:28:15.670 Uttam Kumaran: That was probably the longest process we’ve been through.
287 00:28:16.110 ⇒ 00:28:26.579 Katherine Bayless: I mean, dude, the craziest part is you guys got through so much faster than, like… like, it took 4 months to get Anthropic signed.
288 00:28:27.070 ⇒ 00:28:28.060 Uttam Kumaran: Wow.
289 00:28:28.060 ⇒ 00:28:29.540 Katherine Bayless: That’s crazy.
290 00:28:29.540 ⇒ 00:28:30.250 Uttam Kumaran: Crazy.
291 00:28:30.250 ⇒ 00:28:36.909 Katherine Bayless: Yeah. And I mean, that team is just inundated with contracts, and then…
292 00:28:37.130 ⇒ 00:28:47.250 Katherine Bayless: they have thousands and thousands that they kinda, sorta have to have the cog on for CES. I mean, I think they use mostly a… Actually, sorry, random question.
293 00:28:47.360 ⇒ 00:28:55.219 Katherine Bayless: Do you guys have, or you, but or anybody on the team, like, expertise in integrating DocuSign and Salesforce?
294 00:28:56.430 ⇒ 00:29:00.659 Uttam Kumaran: I can ask internally. I feel like we may have done it for one client.
295 00:29:00.660 ⇒ 00:29:17.249 Katherine Bayless: Okay. We have it integrated, and pretty… pretty well, as far as I understand, but there’s, like, room for improvement, and I think the internal folks have reached the upper limit on what they feel like they can do with it. And so they were like, we need a DocuSign expert, and I was like, okay.
296 00:29:17.250 ⇒ 00:29:23.119 Uttam Kumaran: No. I feel like you… you don’t want DocuSign anymore.
297 00:29:23.120 ⇒ 00:29:24.680 Katherine Bayless: Fair point, very fair point.
298 00:29:24.680 ⇒ 00:29:37.609 Uttam Kumaran: Yeah, yeah, I could ask. I mean, so my background is, like, I used to build products, and then I have a couple people on the team that are… everybody internally is mostly engineers, whether, like, full-stack or more data folks, so…
299 00:29:37.840 ⇒ 00:29:38.170 Katherine Bayless: Yeah.
300 00:29:38.170 ⇒ 00:29:40.509 Uttam Kumaran: I will ask, for sure.
301 00:29:41.010 ⇒ 00:29:42.350 Katherine Bayless: Never hurts.
302 00:29:42.350 ⇒ 00:29:49.409 Uttam Kumaran: And the other thing is, like, if you guys are… you know, one thing that I was gonna probably share later in December, once we just get through, like.
303 00:29:49.520 ⇒ 00:29:57.929 Uttam Kumaran: bringing in all that data, is, like, we’re starting to, for some other clients, do a lot of… I mentioned, sort of, some of the text-to-SQL AI stuff, but also, like, document intelligence.
304 00:29:57.930 ⇒ 00:29:58.620 Katherine Bayless: Yeah…
305 00:29:58.620 ⇒ 00:30:04.659 Uttam Kumaran: And it’s something that I would like to just show you a couple of, like, demos of… we’re working…
306 00:30:05.460 ⇒ 00:30:10.020 Uttam Kumaran: again, it’s just like, I… we built some RAG systems internally, and I was like.
307 00:30:10.480 ⇒ 00:30:19.239 Uttam Kumaran: God, someone just has to build, like, this as a service, because I can’t… I’m learning, like, re-ranking and all this, like, context.
308 00:30:19.790 ⇒ 00:30:33.329 Uttam Kumaran: chunking, and I’m like, yeah, I’d love to learn all that, but there’s too many priorities, and I’m like, someone wrapped this up into, like, an API, and I found a company on Twitter, this is, like, more than a year and a half ago, that’s doing that, this company, Contextual.
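[Editor's note: the chunking and re-ranking steps mentioned here can be sketched in a few lines. This is a toy illustration that uses plain word overlap in place of a real embedding model or cross-encoder, and it does not reflect Contextual's actual API; all names below are made up.]

```python
# Toy sketch of two RAG steps mentioned above: chunking and first-pass ranking.
# Word overlap stands in for a real embedding/re-ranking model.

def chunk(text: str, size: int = 40, overlap: int = 10) -> list[str]:
    """Split text into overlapping word windows so facts don't straddle a boundary."""
    words = text.split()
    step = size - overlap
    return [" ".join(words[i:i + size])
            for i in range(0, max(len(words) - overlap, 1), step)]

def rank(query: str, chunks: list[str], top_k: int = 3) -> list[str]:
    """Score each chunk by word overlap with the query; keep the best top_k."""
    q = set(query.lower().split())
    scored = sorted(chunks,
                    key=lambda c: len(q & set(c.lower().split())),
                    reverse=True)
    return scored[:top_k]
```

In a production system the overlap score would be replaced by vector similarity for retrieval and a cross-encoder for re-ranking, which is exactly the plumbing a RAG-as-a-service vendor abstracts away.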
309 00:30:33.330 ⇒ 00:30:35.670 Katherine Bayless: Oh, yeah, yeah, I’ve seen them.
310 00:30:35.670 ⇒ 00:30:41.280 Uttam Kumaran: Yeah, they’re used a lot by biotech and in semis.
311 00:30:41.280 ⇒ 00:30:42.770 Katherine Bayless: Okay, okay, okay.
312 00:30:42.770 ⇒ 00:30:43.340 Uttam Kumaran: So…
313 00:30:43.340 ⇒ 00:30:55.860 Katherine Bayless: I’ve looked into this stuff, because I… I am right there with you. I’m like, the sooner we can figure out, like, ontologies and, like, context maps, and like, it’s like, it’s GraphRAG plus…
314 00:30:55.860 ⇒ 00:30:56.410 Uttam Kumaran: Yes.
315 00:30:56.410 ⇒ 00:30:57.490 Katherine Bayless: You know?
316 00:30:57.490 ⇒ 00:31:12.940 Uttam Kumaran: So I’ll show you… I can show you their product, and I can even give you access to our demo instance, because it’s really, really good. And so the guy that started this company was on the original, like, RAG paper. And that’s what I was like, okay, like, good. And then I met the team, and they’re really, really good. I think that they’re, like, a…
317 00:31:13.090 ⇒ 00:31:20.130 Uttam Kumaran: they’re just pro- it’s like an SDK or an API for RAG, and so it’s sort of almost, like.
318 00:31:20.180 ⇒ 00:31:33.519 Uttam Kumaran: it’s, like, too ahead of its time. Like, a lot of their use cases, they’re selling to companies that have the expertise that they need to build RAG and they need this, or they’re selling to product companies that, like, white-label them under the hood.
319 00:31:33.540 ⇒ 00:31:38.399 Uttam Kumaran: I went to them, and I’m like, guys, there’s all these data companies we’re working for that.
320 00:31:38.410 ⇒ 00:31:42.520 Uttam Kumaran: We end up getting exposed to all this unstructured data.
321 00:31:42.660 ⇒ 00:31:49.000 Uttam Kumaran: But, like, they’re not really using it for, like, analytical capabilities, they’re using it for extraction, structured summaries.
322 00:31:49.710 ⇒ 00:31:54.340 Uttam Kumaran: But the problem with Contextual is, like, they need people to build on their stuff, they’re not, like, a…
323 00:31:54.580 ⇒ 00:32:03.730 Uttam Kumaran: they’re not, like, a services or application company. So we started doing some work with them, but in that use case that you mentioned, for example, if there’s a lot of, like.
324 00:32:03.860 ⇒ 00:32:12.120 Uttam Kumaran: What are our standard terms, or what are other documents that are, like, looking at things that are out of compliance, or find, like, extracting
325 00:32:12.220 ⇒ 00:32:14.769 Uttam Kumaran: Structured data from a host of documents.
326 00:32:15.300 ⇒ 00:32:16.560 Uttam Kumaran: That’s, like, what…
327 00:32:17.320 ⇒ 00:32:18.410 Katherine Bayless: Yeah, it’s actually…
328 00:32:18.410 ⇒ 00:32:19.420 Uttam Kumaran: can do.
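[Editor's note: the extraction use case described above — pulling standard terms out of a pile of documents — looks roughly like this in its simplest form. This toy version uses regexes; a real pipeline would run documents through a document-intelligence model, and the field names here are invented for illustration.]

```python
# Toy "structured extraction": pull a couple of standard contract fields
# out of free text. Field names and patterns are illustrative only.
import re

def extract_terms(doc: str) -> dict:
    patterns = {
        "payment_days": r"net\s+(\d+)",
        "termination_notice_days": r"(\d+)\s+days['\u2019]? (?:written )?notice",
    }
    out = {}
    for field, pat in patterns.items():
        m = re.search(pat, doc, re.IGNORECASE)
        out[field] = int(m.group(1)) if m else None  # None when the clause is absent
    return out
```

Run over a host of documents, the resulting rows are exactly the kind of structured output (standard terms, out-of-compliance flags) the conversation describes.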
329 00:32:19.420 ⇒ 00:32:27.370 Katherine Bayless: You know, so here’s a potential… genuinely practical application, so…
330 00:32:29.010 ⇒ 00:32:37.069 Katherine Bayless: So, so one of the, one of the, one of the pieces on my chessboard is we have two sales teams selling to the same audience.
331 00:32:37.190 ⇒ 00:32:49.700 Katherine Bayless: in an entirely disconnected way. So we have the sales team, as we call them, that sells the, like, exhibit show, or exhibit space, sponsorships, right? Everything for CES.
332 00:32:50.190 ⇒ 00:33:11.629 Katherine Bayless: they’re the team that uses Salesforce, that when you asked what go-to-market automations were, I was like, what are those? Because, I mean, they’re just, frankly, they’re used to picking up the phone and answering, not calling, right? And so the muscle of selling is pretty weak, I think, on the sales side, and increasingly showing the cracks of that. But then there’s the membership team.
333 00:33:11.630 ⇒ 00:33:22.879 Katherine Bayless: Which are definitely not salespeople, but they are doing sales work, and they’re the ones that are scrappy and in the trenches and picking up phones and bothering people and all the things, and so, like…
334 00:33:23.380 ⇒ 00:33:35.820 Katherine Bayless: integrating their processes, if not their teams, is a piece that I need to move, because I just can’t stand to watch all of that redundancy and lost knowledge, right? So…
335 00:33:36.090 ⇒ 00:33:55.709 Katherine Bayless: both teams have expressed a desire for, like, it’d be cool if we could track interactions with our customers, and I’m like, what do you mean by that? Because how is that, like, an aspirational goal still, right? So I’m like, if we could bring all of that unstructured, probably it’s a lot of emails, and.
336 00:33:55.710 ⇒ 00:33:56.250 Uttam Kumaran: Yeah.
337 00:33:56.250 ⇒ 00:34:03.390 Katherine Bayless: Notes fields and contracts and whatever else into a place that actually gave us some real pipeline intelligence.
338 00:34:03.690 ⇒ 00:34:13.620 Katherine Bayless: I think… I think that would be, like, mind-blowing for the organization, sadly. Or even, like, the sales calls, right? Like, I sit near the.
339 00:34:13.620 ⇒ 00:34:14.170 Uttam Kumaran: Yes.
340 00:34:14.179 ⇒ 00:34:22.759 Katherine Bayless: team, and I listen to them on Zoom all day long, explaining membership, and what it is, and how it works, and blah blah blah blah blah blah blah. Like.
341 00:34:23.089 ⇒ 00:34:32.609 Katherine Bayless: Yeah, I would love to dump in all of that audio, video, text, and just be like, what is really the health of this pipeline? Because…
342 00:34:33.449 ⇒ 00:34:47.979 Katherine Bayless: I don’t think it’s as healthy as we think it is, and I think it’s because we are resting on the, you know, laurels of having gotten all the big fish a while back, and we haven’t really focused on the incredible churn happening kind of in the lower middle of the pipeline.
343 00:34:48.510 ⇒ 00:34:54.520 Uttam Kumaran: Yeah, I agree. I mean, it’s funny, because we do a lot of… internally at our company, we do some Zoom
344 00:34:54.620 ⇒ 00:35:09.210 Uttam Kumaran: related data, because we use it to automate, like, note-taking and things like that, and the Zoom SDK is actually really great. And you can just, for free, you don’t need anything, you can stream all those videos and transcripts to S3 and start doing things with it.
345 00:35:09.370 ⇒ 00:35:11.609 Katherine Bayless: They have great webhooks, but it’s just, like, they…
346 00:35:11.820 ⇒ 00:35:18.270 Uttam Kumaran: Zoom is a great platform that just nobody… instead, people go for, like, Fireflies or Otter and all these things.
347 00:35:18.500 ⇒ 00:35:25.299 Uttam Kumaran: you can actually get a lot of it out of Zoom. So even for sales training, or for asking questions over…
348 00:35:25.520 ⇒ 00:35:28.099 Uttam Kumaran: Transcripts, or yeah, even just associating
349 00:35:28.280 ⇒ 00:35:31.929 Uttam Kumaran: unstructured data with the Salesforce record, you know.
350 00:35:32.980 ⇒ 00:35:36.189 Uttam Kumaran: Here’s the last time you met, here’s what you talked about, things like that.
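[Editor's note: the Zoom-to-S3 idea sketched above — using the platform's webhooks to stage recordings and transcripts in S3 — would look roughly like this. The payload shape follows Zoom's `recording.completed` webhook event, but treat the exact field names, bucket, and key prefix as assumptions, not a verified integration.]

```python
# Sketch: turn a Zoom `recording.completed` webhook event into S3 object keys.
# Payload shape is a pared-down version of Zoom's recording webhook; the
# "zoom/" prefix and bucket name below are made up for illustration.

def s3_keys_for_recording(event: dict, prefix: str = "zoom") -> list[tuple[str, str]]:
    """Return (s3_key, download_url) pairs for every file in the event."""
    meeting = event["payload"]["object"]
    out = []
    for f in meeting.get("recording_files", []):
        ext = f["file_type"].lower()  # e.g. "mp4", "transcript"
        key = f"{prefix}/{meeting['id']}/{f['id']}.{ext}"
        out.append((key, f["download_url"]))
    return out

# The actual copy step would then stream each URL into S3, roughly:
#   body = requests.get(url, headers={"Authorization": f"Bearer {token}"}).content
#   boto3.client("s3").put_object(Bucket="my-bucket", Key=key, Body=body)
```

Once transcripts land in S3 this way, associating them with the matching Salesforce record is a join on meeting participants, which is the "here's the last time you met" experience described above.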
351 00:35:36.800 ⇒ 00:35:41.659 Katherine Bayless: Oh, my friend. Right now, they have to ask us. They asked…
352 00:35:41.660 ⇒ 00:35:42.350 Uttam Kumaran: Yeah.
353 00:35:42.350 ⇒ 00:35:56.030 Katherine Bayless: team, when was the last time this person reached out, went to CES, etc. I’m like, this is insane. Like, to me, it’s like, I don’t mind answering the question. I mean, I’d rather do better things with my time, but I’m like, how has this business made it so far?
354 00:35:58.180 ⇒ 00:36:00.380 Uttam Kumaran: Yeah. Right? It’s… yeah.
355 00:36:00.380 ⇒ 00:36:01.949 Katherine Bayless: What? Like, when your CEO.
356 00:36:01.950 ⇒ 00:36:05.630 Uttam Kumaran: But it’s a huge brand, I feel like it’s just because…
357 00:36:05.820 ⇒ 00:36:07.869 Uttam Kumaran: You guys are the number one, like…
358 00:36:07.870 ⇒ 00:36:08.450 Katherine Bayless: Right.
359 00:36:08.830 ⇒ 00:36:12.490 Uttam Kumaran: You know, everybody who’s loved tech
360 00:36:12.800 ⇒ 00:36:16.140 Uttam Kumaran: for even a small part of their life, pays attention, so…
361 00:36:16.830 ⇒ 00:36:20.569 Uttam Kumaran: when you’re the best, I feel like, you know, you can kind of get away, but it’s…
362 00:36:20.910 ⇒ 00:36:26.900 Uttam Kumaran: There’s a lot of competition now, and it’s also, like, you could do a lot more with your budgets, for sure.
363 00:36:27.290 ⇒ 00:36:28.849 Katherine Bayless: Yes. Yes.
364 00:36:28.850 ⇒ 00:36:29.580 Uttam Kumaran: Yeah.
365 00:36:29.580 ⇒ 00:36:30.190 Katherine Bayless: Yes.
366 00:36:30.440 ⇒ 00:36:31.210 Katherine Bayless: Yes.
367 00:36:31.620 ⇒ 00:36:46.880 Katherine Bayless: Yeah. Yeah, interesting, like, scuttlebutt around the office the last couple weeks is, I think Samsung’s kind of pulling out, they apparently chose to retain their membership because they have a board seat, and they didn’t want the bad PR of losing a board seat.
368 00:36:46.930 ⇒ 00:37:04.589 Katherine Bayless: But they’ve already moved their booth out of, like, the main show floor, alleging that they think TCL is spying on them, which is probably true, but also, I mean, you’re at a convention, I don’t know why you think spying isn’t occurring, right? So yeah, like…
369 00:37:05.700 ⇒ 00:37:22.510 Katherine Bayless: Yeah, I think we have an incredible brand. I think we are at a point where it is fragile, in a way that there’s not enough recognition of. And I’m trying to, like, surface it in a, like… not saying panic, but I’m saying we might have to panic if we don’t start paying attention now.
370 00:37:22.720 ⇒ 00:37:27.650 Uttam Kumaran: I mean, at least what you’ll be able to do once you start seeing the data is to tell the story, and then…
371 00:37:27.650 ⇒ 00:37:28.410 Katherine Bayless: Yeah.
372 00:37:28.890 ⇒ 00:37:39.559 Uttam Kumaran: You know, and then hopefully build some internal motion around getting around some of the KPIs and starting to move them, you know, in the right direction. But without, like, really consistent reporting, it’s like, what can you do
373 00:37:39.700 ⇒ 00:37:41.220 Uttam Kumaran: Right. It’s so hard.
374 00:37:41.570 ⇒ 00:37:45.029 Katherine Bayless: Right, right. I should send you our goals.
375 00:37:45.150 ⇒ 00:37:46.380 Katherine Bayless: I…
376 00:37:46.380 ⇒ 00:37:47.779 Uttam Kumaran: I would love to see that, man.
377 00:37:47.780 ⇒ 00:37:48.690 Katherine Bayless: I mean…
378 00:37:48.690 ⇒ 00:37:52.220 Uttam Kumaran: That’s what we’re driving towards measuring, and that’s, like,
379 00:37:52.380 ⇒ 00:37:57.029 Uttam Kumaran: I, you know, that would be great, if it’s a… yeah.
380 00:37:57.030 ⇒ 00:38:01.260 Katherine Bayless: Yeah, you might, you might be underwhelmed. You might be underwhelmed
381 00:38:02.780 ⇒ 00:38:05.710 Katherine Bayless: They’re 2 pages long, and they include things like
382 00:38:06.630 ⇒ 00:38:10.319 Katherine Bayless: Host webinars for 17 policy areas.
383 00:38:12.600 ⇒ 00:38:13.430 Uttam Kumaran: Yeah.
384 00:38:14.270 ⇒ 00:38:18.930 Katherine Bayless: Yeah. Yeah. Yeah. Yeah. Yeah. Yeah.
385 00:38:19.210 ⇒ 00:38:23.819 Katherine Bayless: I keep… that’s another one that I… it’s like a drum that I beat slowly. I’m like, these are not goals.
386 00:38:24.040 ⇒ 00:38:26.050 Katherine Bayless: Those are just tasks you want someone…
387 00:38:26.050 ⇒ 00:38:28.359 Uttam Kumaran: These are just tasks to do, these are not KRs.
388 00:38:28.360 ⇒ 00:38:47.840 Katherine Bayless: Right, exactly, exactly. I mean, yeah, OKRs are, yeah. I’d love to get there. I’ll settle for a metric that matters. I told our CEO the other day, because she was asking me, did we move up our paid media too early and lose a bunch of, like, you know, what would have been paid registrants by doing more advertising in the range where it’s free to go to CES?
389 00:38:47.840 ⇒ 00:38:52.860 Katherine Bayless: And I was like, I can’t believe you don’t have that model already, but sure, we’ll build it.
390 00:38:52.860 ⇒ 00:38:53.420 Uttam Kumaran: Yes.
391 00:38:53.420 ⇒ 00:39:02.189 Katherine Bayless: and she was like, I just don’t understand. Everybody’s really, like, panicked about, like, the number of attendees, and I’m like, because it’s a goal. We don’t get our bonuses if we don’t hit one.
392 00:39:02.190 ⇒ 00:39:02.790 Uttam Kumaran: Yes.
393 00:39:02.790 ⇒ 00:39:13.609 Katherine Bayless: 2,712, you know, attendees, or whatever it is. And I was like, Christina, we should be looking at profit per attendee, or something along those lines, and she’s like.
394 00:39:14.280 ⇒ 00:39:16.840 Uttam Kumaran: Yeah, because people are gonna juice the attendee list.
395 00:39:16.840 ⇒ 00:39:22.340 Katherine Bayless: Right, right, exactly. I was like, oh, I should not be the smartest person in the room.
396 00:39:23.760 ⇒ 00:39:28.709 Uttam Kumaran: Hey, but I don’t know, when… when you’re in there, and you see it, like, you have opportunity, and…
397 00:39:29.090 ⇒ 00:39:29.620 Katherine Bayless: I know.
398 00:39:29.620 ⇒ 00:39:36.120 Uttam Kumaran: you just go, and I think, like, you… it’s… would you rather be the smartest person? For me, it’s like, would you rather be the smartest person
399 00:39:36.510 ⇒ 00:39:46.849 Uttam Kumaran: at, like, Facebook? Or, like, the dumbest person at Facebook as far as… I don’t know. But you have… I think you… for… at least this isn’t, like, inventing rocket science, right? There are other…
400 00:39:47.010 ⇒ 00:39:51.760 Uttam Kumaran: There are other physics challenges here to go through, but…
401 00:39:51.910 ⇒ 00:39:53.080 Katherine Bayless: Right. Yeah.
402 00:39:53.280 ⇒ 00:39:55.100 Katherine Bayless: Right. Yeah, yeah.
403 00:39:55.870 ⇒ 00:40:04.800 Katherine Bayless: Yeah. I know, there’s tons of possibility. That’s what I, I remind myself. I’m like, I know it’s a lot right now, but if I can fucking crack this.
404 00:40:04.800 ⇒ 00:40:05.380 Uttam Kumaran: Yes.
405 00:40:05.380 ⇒ 00:40:07.149 Katherine Bayless: I will have a story to tell.
406 00:40:07.150 ⇒ 00:40:07.580 Uttam Kumaran: Yes.
407 00:40:07.580 ⇒ 00:40:09.300 Katherine Bayless: Like, it just happens again.
408 00:40:12.870 ⇒ 00:40:30.619 Katherine Bayless: But I know even though you’re an hour or so behind, it is late, and I don’t want to totally abuse the privilege of your time, so I really do appreciate the conversation, though. I think I like your… your recommendation of, let’s just start building it the better way, that has a lot of observability and all the things, and…
409 00:40:31.160 ⇒ 00:40:32.879 Katherine Bayless: The smart ones will join the fold.
410 00:40:33.140 ⇒ 00:40:45.029 Uttam Kumaran: Yeah, some people may have never seen it that way, or they don’t get the benefit, and we’ll kind of show them, and it’ll be step-by-step. I think we’ll only have, like, one or two passes at going through every single source.
411 00:40:45.160 ⇒ 00:40:47.569 Uttam Kumaran: So while we’re in there, we should try…
412 00:40:48.270 ⇒ 00:40:50.110 Uttam Kumaran: You know, to just line things up.
413 00:40:50.490 ⇒ 00:40:54.710 Katherine Bayless: Yeah, yeah, yeah, totally. Totally. Yes.
414 00:40:56.870 ⇒ 00:41:01.330 Uttam Kumaran: And then tomorrow, yeah, maybe we can… I’ll grab time sometime tomorrow.
415 00:41:03.110 ⇒ 00:41:19.640 Uttam Kumaran: Yeah, I think if you do have some time tomorrow to meet, we can look at the stuff, like you said, you were kind of putting together for the architecture and the Gantt chart, at least, but… and maybe we can pull the thread a little bit on the sales versus membership thing, because even though I would like to get to the panacea of, you know, combined intelligence.
416 00:41:19.640 ⇒ 00:41:32.610 Katherine Bayless: At the moment, we are not charging correctly, because there is no integration between membership and sales, and so when sales sells a booth, they don’t know if that person’s a member, and then they charge them the wrong price, and then we have to issue a refund later, and it’s a pain in the ass.
417 00:41:33.230 ⇒ 00:41:34.015 Uttam Kumaran: Yeah…
418 00:41:34.980 ⇒ 00:41:35.370 Katherine Bayless: So, yeah.
419 00:41:35.370 ⇒ 00:41:36.470 Uttam Kumaran: Okay, let’s do it.
420 00:41:36.470 ⇒ 00:41:40.209 Katherine Bayless: There’s everything from brass tacks to bright ideas to solve.
421 00:41:42.190 ⇒ 00:41:48.269 Uttam Kumaran: Okay, so I’ll send a note, maybe later today, just to block time, or I’ll find a time tomorrow and then send an invite.
422 00:41:48.440 ⇒ 00:41:56.089 Katherine Bayless: Yeah, I mean, don’t stress, honestly. I get the sense that most people have checked out already for the long weekend slash holiday week kind of thing.
423 00:41:56.090 ⇒ 00:42:05.680 Uttam Kumaran: I just want to kind of get your view on these two things, and then we can also… I’ll poke at it a little bit more this week, so even if we can chat for, like, 30 minutes, that’d be fine.
424 00:42:05.860 ⇒ 00:42:09.879 Katherine Bayless: Yeah, no, I just mean, like, don’t stress about, like, sending the invite tonight, like, I’m pretty sure…
425 00:42:09.880 ⇒ 00:42:10.880 Uttam Kumaran: Oh yeah, okay, okay.
426 00:42:10.950 ⇒ 00:42:14.239 Katherine Bayless: I’m pretty sure I’ll be free whenever you’re looking for me.
427 00:42:14.240 ⇒ 00:42:15.050 Uttam Kumaran: Okay, okay.
428 00:42:15.220 ⇒ 00:42:15.570 Katherine Bayless: Yeah, yeah.
429 00:42:15.570 ⇒ 00:42:16.500 Uttam Kumaran: Okay, perfect.
430 00:42:16.740 ⇒ 00:42:17.330 Katherine Bayless: But…
431 00:42:17.330 ⇒ 00:42:17.910 Uttam Kumaran: Alright.
432 00:42:17.910 ⇒ 00:42:18.650 Katherine Bayless: Really.
433 00:42:18.650 ⇒ 00:42:21.760 Uttam Kumaran: Of course, of course. No, we have a long way to go.
434 00:42:21.760 ⇒ 00:42:27.380 Katherine Bayless: Yeah, exactly. But I appreciate the immediate reduction in intellectual loneliness.
435 00:42:27.380 ⇒ 00:42:29.270 Uttam Kumaran: Yes, of course, of course.
436 00:42:29.270 ⇒ 00:42:35.039 Katherine Bayless: I hope you have a good night. I’m gonna…
437 00:42:35.040 ⇒ 00:42:35.460 Uttam Kumaran: Thank you.
438 00:42:35.460 ⇒ 00:42:38.340 Katherine Bayless: Go try and get some fresh air and stretch my legs.
439 00:42:38.810 ⇒ 00:42:40.800 Uttam Kumaran: Awesome. Okay, I’ll talk to you soon.
440 00:42:40.800 ⇒ 00:42:41.840 Katherine Bayless: Alright, see ya.
441 00:42:41.840 ⇒ 00:42:42.770 Uttam Kumaran: Okay, bye.