Meeting Title: Brainforge Interview w/ Greg Date: 2026-04-03 Meeting participants: Abdullah Ahmad, Greg Stoutenburg
WEBVTT
1 00:03:17.190 ⇒ 00:03:19.710 Greg Stoutenburg: Hey, Abdullah! Sorry I’m a couple minutes late here.
2 00:03:19.710 ⇒ 00:03:21.769 Abdullah Ahmad: Oh, all good. How’s it going, Gregory?
3 00:03:21.770 ⇒ 00:03:23.360 Greg Stoutenburg: Doing alright, how are you?
4 00:03:23.360 ⇒ 00:03:28.960 Abdullah Ahmad: Doing good, doing good. Especially better now, we finally have a nice sunny day in DC.
5 00:03:29.190 ⇒ 00:03:34.309 Greg Stoutenburg: I was gonna say, you must be near where I am, because my experience in the last 5 minutes has been like, hey, the sun’s finally up!
6 00:03:34.310 ⇒ 00:03:35.420 Abdullah Ahmad: I know.
7 00:03:35.420 ⇒ 00:03:36.530 Greg Stoutenburg: I’m in York, Pennsylvania.
8 00:03:36.530 ⇒ 00:03:38.430 Abdullah Ahmad: Are you in New York, did you say?
9 00:03:38.430 ⇒ 00:03:39.730 Greg Stoutenburg: No, just York.
10 00:03:39.730 ⇒ 00:03:40.950 Abdullah Ahmad: York, oh, I see.
11 00:03:40.950 ⇒ 00:03:42.310 Greg Stoutenburg: Not new.
12 00:03:42.790 ⇒ 00:03:43.310 Greg Stoutenburg: Yeah.
13 00:03:43.310 ⇒ 00:03:43.770 Abdullah Ahmad: I was…
14 00:03:43.770 ⇒ 00:03:44.120 Greg Stoutenburg: Jason.
15 00:03:44.120 ⇒ 00:03:46.959 Abdullah Ahmad: out of Pittsburgh for 2 years for my grad school.
16 00:03:46.960 ⇒ 00:03:49.000 Greg Stoutenburg: Oh, nice! Cool. Where in Pittsburgh?
17 00:03:49.000 ⇒ 00:03:55.120 Abdullah Ahmad: Shadyside. I was doing my grad school at Carnegie Mellon, so, like, somewhere close to that, essentially.
18 00:03:55.120 ⇒ 00:03:58.230 Greg Stoutenburg: Nice. I lived in, I lived in McKees Rocks for a year.
19 00:03:58.230 ⇒ 00:03:59.400 Abdullah Ahmad: Okay, okay.
20 00:03:59.400 ⇒ 00:04:01.480 Greg Stoutenburg: Two years, something like that? I don’t know, it’s a while ago.
21 00:04:01.480 ⇒ 00:04:10.509 Abdullah Ahmad: I see, yeah. While we were there, it was, like, too hectic for that grad school. We hardly even left, like, the area we were in.
22 00:04:10.510 ⇒ 00:04:17.860 Greg Stoutenburg: Yeah, yeah, yeah. Yeah, I mean, that’s… if I was… if I was near CMU, that’s what I would have done during graduate school as well.
23 00:04:18.510 ⇒ 00:04:23.760 Greg Stoutenburg: area. However, having been in McKees Rocks, I was eager to leave the area.
24 00:04:26.040 ⇒ 00:04:26.860 Abdullah Ahmad: Fair enough.
25 00:04:26.860 ⇒ 00:04:32.210 Greg Stoutenburg: Completely hear that. Yeah, cool, so you were at CMU, and then,
26 00:04:32.320 ⇒ 00:04:40.490 Greg Stoutenburg: And then, let’s see, I don’t see LinkedIn there. So, what brought you here? How did you end up, talking to Brainforge?
27 00:04:40.880 ⇒ 00:04:49.829 Abdullah Ahmad: Yeah, so TikTok was my most recent employment, working there as an Investigations and Insights data analyst for the…
28 00:04:49.960 ⇒ 00:04:55.229 Abdullah Ahmad: tech and product pillar, essentially. I really loved my work, really enjoyed it.
29 00:04:55.690 ⇒ 00:05:05.129 Abdullah Ahmad: but I’m also, like, looking for newer opportunities, just because we’re seeing the role of AI, like, coming into the workflows, essentially. And I think we’re sort of, like…
30 00:05:05.130 ⇒ 00:05:16.509 Abdullah Ahmad: At an unfortunate, but also a fortunate time, like, right at the beginning of it, there’s this opportunity to really ride this wave, like, something that’s really gonna, like, change the status of, like, how we know work to be.
31 00:05:16.510 ⇒ 00:05:19.790 Abdullah Ahmad: So I want to be, like, more focused on roles.
32 00:05:19.800 ⇒ 00:05:29.449 Abdullah Ahmad: which were essentially helping clients and organizations, like, adopt these AI solutions, which is essentially going to be the future after that. Brainforge seemed like,
33 00:05:29.490 ⇒ 00:05:35.910 Abdullah Ahmad: A new and interesting spot, like, coming about, and very embedded in the practical areas of things.
34 00:05:36.130 ⇒ 00:05:46.120 Greg Stoutenburg: Yeah, no, I agree. So we just finished our team… every other week, we do a team call. This is, like, the whole company, you know, partly social, partly, you know, here’s what we’re up to.
35 00:05:46.120 ⇒ 00:05:46.750 Abdullah Ahmad: It’s… Hmm.
36 00:05:46.750 ⇒ 00:05:50.669 Greg Stoutenburg: And, one thing they did, like, to encourage
37 00:05:50.820 ⇒ 00:05:57.779 Greg Stoutenburg: continued and deepening AI adoption internally, was a competition to create a new skill using Cursor.
38 00:05:58.020 ⇒ 00:05:58.400 Abdullah Ahmad: Amazing.
39 00:05:58.400 ⇒ 00:06:03.669 Greg Stoutenburg: in 5 minutes, like, start now. Hit the timer. Start now.
40 00:06:03.780 ⇒ 00:06:21.949 Greg Stoutenburg: and then pitch your skill in 5 minutes. And, like, I can’t believe, like, I only started in January, I can’t believe how much I’ve learned, so, like, I made a skill that, would create a complete deck using HTML in one prompt, so you can just…
41 00:06:22.030 ⇒ 00:06:31.709 Greg Stoutenburg: type whatever you want, boom. Get a deck out of it. Another person created a skill that, like, looked at a website, and then.
42 00:06:31.710 ⇒ 00:06:32.220 Abdullah Ahmad: came up.
43 00:06:32.220 ⇒ 00:06:35.100 Greg Stoutenburg: With a whole, a whole pitch.
44 00:06:35.610 ⇒ 00:06:36.100 Abdullah Ahmad: pricing.
45 00:06:36.100 ⇒ 00:06:55.400 Greg Stoutenburg: for that company. And another person came up with a skill that looked at their entire Slack context, everything in our GitHub repos related to a client, and, any other notes or emails or calendar items, and created an employee to-do list for the day. That was exhaustive.
46 00:06:55.400 ⇒ 00:06:55.970 Abdullah Ahmad: Cool!
47 00:06:55.970 ⇒ 00:06:58.030 Greg Stoutenburg: Inside of 5 minutes. I mean, it’s like…
48 00:06:58.030 ⇒ 00:06:58.660 Abdullah Ahmad: Yeah.
49 00:06:58.760 ⇒ 00:06:59.579 Greg Stoutenburg: That was…
50 00:06:59.580 ⇒ 00:07:01.959 Abdullah Ahmad: It’s very exciting. It’s exciting and scary at the same time.
51 00:07:01.960 ⇒ 00:07:02.410 Greg Stoutenburg: Yeah, exactly.
52 00:07:02.410 ⇒ 00:07:04.809 Abdullah Ahmad: It’s difficult to, like, keep up with this at this point.
53 00:07:04.810 ⇒ 00:07:11.139 Greg Stoutenburg: It’s… it’s crazy. It’s crazy, and it’s… it just feels like it’s… it doesn’t feel like it’s slowing down, it feels like it’s speeding up.
54 00:07:11.140 ⇒ 00:07:11.590 Abdullah Ahmad: Correct.
55 00:07:11.590 ⇒ 00:07:12.060 Greg Stoutenburg: Yeah.
56 00:07:12.060 ⇒ 00:07:12.540 Abdullah Ahmad: good.
57 00:07:12.780 ⇒ 00:07:16.459 Greg Stoutenburg: So, yeah, yeah, well…
58 00:07:17.270 ⇒ 00:07:17.810 Abdullah Ahmad: Yeah.
59 00:07:19.070 ⇒ 00:07:38.809 Abdullah Ahmad: No, it definitely seems like the leadership, like, around, like, the organizations, as well as just, like, where the financial incentives are, it’s, like, the top of every leadership’s to-do list, essentially, to integrate it. I think results are yet to be seen, but so far, like, I think, like, as, like, adoption continues, like, I’m not, gonna be surprised, like.
60 00:07:38.810 ⇒ 00:07:42.329 Abdullah Ahmad: At the capabilities of it, like, in the changes, like, at the workforce level.
61 00:07:42.620 ⇒ 00:07:49.090 Greg Stoutenburg: Yeah, yeah, yeah, no, totally. Yeah, well, I mean, we could probably talk about that for a long
62 00:07:49.090 ⇒ 00:07:49.650 Abdullah Ahmad: True.
63 00:07:49.650 ⇒ 00:07:52.309 Greg Stoutenburg: time, we’re at 2:07, so,
64 00:07:52.630 ⇒ 00:07:55.840 Greg Stoutenburg: My understanding is you know Jasmine, who will.
65 00:07:55.840 ⇒ 00:07:56.510 Abdullah Ahmad: Correct.
66 00:07:56.510 ⇒ 00:08:04.940 Greg Stoutenburg: Soon, yep. Been, you know, part-time and done a few things, but she’ll ramp up full-time shortly. And, how do you know Jasmine?
67 00:08:05.200 ⇒ 00:08:07.490 Abdullah Ahmad: I know Jasmine from my time at TikTok.
68 00:08:07.730 ⇒ 00:08:08.130 Greg Stoutenburg: Okay.
69 00:08:08.130 ⇒ 00:08:10.069 Abdullah Ahmad: Essentially on the same team, yeah.
70 00:08:10.070 ⇒ 00:08:18.570 Greg Stoutenburg: Okay, cool. I don’t think I realized that. Yeah, sweet. So, tell me, I mean, tell me, like, what role you’ve applied for, and…
71 00:08:18.570 ⇒ 00:08:18.960 Abdullah Ahmad: No.
72 00:08:18.960 ⇒ 00:08:34.849 Greg Stoutenburg: Okay. And then we’ll just, like, jump in with some questions. So this is intended to be, like, the Level 2 interview, so you would have, you know, talked to… looks like you talked to Amber, yep. And that’s more, you know, sort of behavioral kind of stuff.
73 00:08:34.850 ⇒ 00:08:35.220 Abdullah Ahmad: Correct.
74 00:08:35.220 ⇒ 00:08:40.620 Greg Stoutenburg: We’ll talk about, like, role specifics, in the next, 20 minutes or so.
75 00:08:40.620 ⇒ 00:08:53.500 Abdullah Ahmad: Yeah, sounds good. I mean, so the role that I’m applying for is the customer success owner, and essentially, like, from what I understand, like, on the front lines of the contract itself, primary focus, like, not really leading…
76 00:08:53.500 ⇒ 00:09:01.999 Abdullah Ahmad: The scope itself, but more so, like, as an embedded individual from the organization who’s scoping for what new opportunities exist.
77 00:09:02.000 ⇒ 00:09:13.640 Abdullah Ahmad: Essentially looking at, what the customer requirements are and needs are, and sometimes even, like, identifying them when the customer has not really verbalized that, just, like, by your understanding of the space itself.
78 00:09:13.710 ⇒ 00:09:28.880 Abdullah Ahmad: And it’s essentially, like, finding those requirements and pitching them as a way of, like, increasing business with the customer. Me particularly, why I’m interested in this role is primarily because of, like, my academic and career background around the space itself.
79 00:09:28.880 ⇒ 00:09:42.940 Abdullah Ahmad: So most recently, I was an investigations data analyst at TikTok within their detection engineering team, and my tasks primarily revolved around, like, three areas, one being creating, like, LLM AI solutions.
80 00:09:42.970 ⇒ 00:10:02.670 Abdullah Ahmad: For the internal clients in trust and safety, creating a metrics framework in order to understand the impact that it’s had, and thirdly, to lead and maintain relationships with internal clients for our services. Prior to that, I have four years of experience in consultancy within the MENA region specifically.
81 00:10:02.740 ⇒ 00:10:09.619 Abdullah Ahmad: Where we were doing government consultancy, specifically for the Amiri Divan, which is the administrative office of the King.
82 00:10:09.660 ⇒ 00:10:23.480 Abdullah Ahmad: Ministry of Interior, Ministry of Foreign Affairs, and Ministry of Defense. So even me moving to the tech and data side was specifically as a result of, like, my experience on the consultancy side, that how important it was to really build up those skills.
83 00:10:23.520 ⇒ 00:10:41.569 Abdullah Ahmad: Particularly when you’re trying to, like, pitch, like, novel solutions. So all in all, as a combination of these two things, I think the customer success owner work fits perfectly between these two areas, technical competencies, as well as that ability to communicate and bring business for the organization.
84 00:10:41.570 ⇒ 00:10:57.600 Greg Stoutenburg: Yeah. Yeah, that all sounds accurate, and that’s a good pitch. So as… as a CSO, you are the… you sort of, like, own… you’re the client account owner, you’ll work with them, you will… you’ll devise and propose
85 00:10:57.600 ⇒ 00:11:07.660 Greg Stoutenburg: work streams. You’ll do things like negotiate how much work you’re going to do, you’ll assign work to individual engineers.
86 00:11:07.660 ⇒ 00:11:22.659 Greg Stoutenburg: There’s technical competence required, and so… but, like, you don’t have to know how to do all of the work that you’re assigning to others. So, for me, my background is as a product manager, so I would… like, a product manager, but, like, with a product analytics background.
87 00:11:22.660 ⇒ 00:11:23.260 Abdullah Ahmad: Okay.
88 00:11:23.260 ⇒ 00:11:24.510 Greg Stoutenburg: So,
89 00:11:24.750 ⇒ 00:11:31.710 Greg Stoutenburg: So there’s, you know, there’s places where I could step in and do the work that I’ve assigned to an individual contributor, but, like, not
90 00:11:32.040 ⇒ 00:11:43.729 Greg Stoutenburg: Right. So, not even close to the case. So, yeah. So, I think what I would like to hear a little bit more about is, like, your technical background. So, like.
91 00:11:44.080 ⇒ 00:11:50.869 Greg Stoutenburg: Give me an example, like, tell me about…
92 00:11:55.890 ⇒ 00:11:56.700 Greg Stoutenburg: Whoops.
93 00:11:57.770 ⇒ 00:12:01.610 Greg Stoutenburg: Broke something in Notion. Notion’s broken now.
94 00:12:02.140 ⇒ 00:12:09.850 Greg Stoutenburg: Tell me about how you would… well, actually, let’s just… let’s just go back to an example from you. I would like to hear about a project where you had to do some analytics work.
95 00:12:10.560 ⇒ 00:12:14.009 Greg Stoutenburg: And, because if I’m hearing it correctly, that’s…
96 00:12:14.220 ⇒ 00:12:19.660 Greg Stoutenburg: that’s mostly the background, right? I mean, I’ve got you on strategy, data or AI.
97 00:12:21.250 ⇒ 00:12:26.170 Greg Stoutenburg: Actually, no. How would you… how would you categorize your experience?
98 00:12:26.170 ⇒ 00:12:28.839 Abdullah Ahmad: Yeah, that’s a great question.
99 00:12:28.840 ⇒ 00:12:35.960 Greg Stoutenburg: I’ll confess, I don’t really know which direction to take off in. I’ve got all these different templates of questions that I can ask, but I don’t know which one to apply to you.
100 00:12:35.960 ⇒ 00:12:52.869 Abdullah Ahmad: Yeah, yeah, yeah, yeah, correct. I mean, makes sense. And, yeah, and I think, like, I sort of, like, answered, like, touched a bit upon it, like, when I was, like, talking about, like, the role and my background in this space. Yeah. But more specifically, it started off, like, after, like, my undergrad at Georgetown, in economics.
101 00:12:52.870 ⇒ 00:13:05.849 Abdullah Ahmad: And I joined in, like, with the consultancy, like, which were helping, essentially, the government make, like, data-driven decisions. Yeah. At that point, like, primarily, it was, like, more around, like, project management, delivery of the scope itself.
102 00:13:06.150 ⇒ 00:13:19.830 Abdullah Ahmad: communicating and business development aspects integrated within that as well. So the idea wasn’t just, like, to deliver the project itself, but also essentially to scope out the requirements of the customer, so the business can continue to pitch more itself.
103 00:13:19.920 ⇒ 00:13:34.619 Abdullah Ahmad: A lot of, like, my projects that I was, like, working on initially, like, was about, like, technology transfer, or building technical capabilities within it. That’s where I realized, like, sort of, like, a gap, like, within my own knowledge, which was, like, more specifically within the data aspect.
104 00:13:35.080 ⇒ 00:13:53.910 Abdullah Ahmad: This is, like, where the big data and data science was really picking up, and I felt like in order to really be an expert in this space, like, that’s where I need to be in order to understand better how to work with data, like, find solutions and find things, like, which are useful for the client itself. That’s why I decided to go to Carnegie Mellon.
105 00:13:53.910 ⇒ 00:14:09.430 Abdullah Ahmad: essentially, like, for their program in data analytics for public policy. There, I was researching at the Center for Informed Democracy and Social Cybersecurity. It’s a research think tank, essentially, which focuses on AI and ML methods in order to create solutions for online harms.
106 00:14:09.510 ⇒ 00:14:19.170 Abdullah Ahmad: For social media tech platforms. I did a few papers there. The most recent one used ML methodologies, like random forest regressions,
107 00:14:19.170 ⇒ 00:14:30.360 Abdullah Ahmad: as a methodology for identifying harassment campaigns against, like, online users, specifically journalists. That was the paper that got me excited, like, got me recruited by TikTok.
108 00:14:30.420 ⇒ 00:14:41.359 Abdullah Ahmad: For their election integrity effort, essentially. Cool. So while I was there, my focus has been, like, building LLM solutions, particularly identifying content classifications.
109 00:14:42.720 ⇒ 00:14:52.530 Abdullah Ahmad: Building golden data sets, like, conducting precision-recall testing in order to see, like, that the benchmarks have been met, like, before the solution can get the go-ahead to be launched.
110 00:14:52.700 ⇒ 00:15:02.769 Abdullah Ahmad: But the other part was, like, hands-on coding, essentially also creating automated detections, in order to alleviate the pains and constraints from the moderators.
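[Editor's note: the golden-set benchmarking described above reduces to comparing detection output against human-reviewed labels. A minimal sketch, with invented labels and illustrative thresholds — the real pipeline and benchmarks were internal to TikTok:]

```python
# Rough sketch of precision/recall testing against a human-labeled "golden set".
# Labels, predictions, and thresholds here are illustrative, not actual values.

def precision_recall(golden, preds):
    """Compare detection predictions against golden-set labels (1 = violating)."""
    tp = sum(1 for g, p in zip(golden, preds) if g and p)
    fp = sum(1 for g, p in zip(golden, preds) if not g and p)
    fn = sum(1 for g, p in zip(golden, preds) if g and not p)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

golden = [1, 1, 0, 0, 1, 0]   # human-reviewed ground truth
preds  = [1, 0, 0, 1, 1, 0]   # detection output on the same items
p, r = precision_recall(golden, preds)
# A solution might only get the go-ahead to launch once both clear an
# agreed benchmark (e.g. precision >= 0.9 and recall >= 0.8 -- illustrative).
```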
111 00:15:02.770 ⇒ 00:15:19.780 Greg Stoutenburg: Okay, that… okay, that… that helps me. Okay, so data… data analytics is sort of, like, the category, but you’re leading projects on this, and you’ve been using it to, you know, help with cybersecurity and… and… and so forth. Okay, great. That helps me. So…
112 00:15:19.780 ⇒ 00:15:23.729 Greg Stoutenburg: Tell me about, an analysis that did not lead to action.
113 00:15:26.010 ⇒ 00:15:28.190 Abdullah Ahmad: Yeah, yeah,
114 00:15:28.620 ⇒ 00:15:38.589 Abdullah Ahmad: That’s a good question. So there were a few projects that we’ve been, like, working with, and essentially, like, one of our clients, which is the ERT team, Escalation Response Team.
115 00:15:38.590 ⇒ 00:15:49.769 Abdullah Ahmad: Which conducts, like, relationships with law enforcement, essentially, for high-harm users on platform with risks around, like, school shootings
116 00:15:49.770 ⇒ 00:15:57.830 Abdullah Ahmad: And mass casualty incidents. So I was working with them. We wanted… they wanted to understand particularly
117 00:15:59.210 ⇒ 00:16:10.389 Abdullah Ahmad: Already a set of users who have been referred by law enforcement, and also, like, individuals that we’ve identified within the platform itself, that show signals, particularly of,
118 00:16:10.430 ⇒ 00:16:25.460 Abdullah Ahmad: potential harm to society as a whole, with intention, like, clearly shared online. So we wanted to understand, essentially, like, who these users are, and what can we actually do in order to, like, basically take them off
119 00:16:25.690 ⇒ 00:16:45.149 Abdullah Ahmad: platform, like, if their actions were sort of, like, in that direction, like, where they were, no longer aligning with the community guidelines on the platform itself. So my task was, like, to investigate this particular topic, because the platform was concerned about the risk, that it was hosting on the platform itself.
120 00:16:45.410 ⇒ 00:16:55.609 Abdullah Ahmad: So I did, like, a few analyses, like, number one, starting off with, like, working with the intelligence team in order to identify the key signals through which we can identify these individuals.
121 00:16:55.760 ⇒ 00:17:04.320 Abdullah Ahmad: And sorry, just to go back and to say that ERT came to the detection engineering team as an internal client, with this concern, per se.
122 00:17:04.319 ⇒ 00:17:19.559 Abdullah Ahmad: So we wanted to identify specifically those individuals and the, violations that they were making on platform. So working with the Intel team, identifying, like, what the signals look like, then running automated queries, which were in Hive SQL
123 00:17:19.560 ⇒ 00:17:31.659 Abdullah Ahmad: on the Hadoop framework, which essentially run as filters using regular expressions in order to identify particular language around incitement.
124 00:17:31.830 ⇒ 00:17:49.139 Abdullah Ahmad: I worked with the team in order to create an LLM solution for this, because they were having a problem, where, moderators did not have enough time for a lot of, like, these new use cases that were coming about. So this worked as, like, training,
125 00:17:49.200 ⇒ 00:18:06.490 Abdullah Ahmad: essentially, the AI model that we were using for classification purposes, collecting, like, daily users from that, and sharing and identifying a gradual trend, when it came to what this violation looked like on the platform itself.
126 00:18:06.520 ⇒ 00:18:12.860 Abdullah Ahmad: Second thing was, like, finding prevalence as a whole in platform, just by catching a small… sorry.
127 00:18:12.860 ⇒ 00:18:17.550 Greg Stoutenburg: let’s actually just stay with the first one, if that’s okay? I wanna… I wanna dig in to understand, like.
128 00:18:17.660 ⇒ 00:18:25.780 Greg Stoutenburg: what the… what your approach was here as an analyst, and what your role on the team was for it. So, like, what were the signals?
129 00:18:26.080 ⇒ 00:18:26.590 Abdullah Ahmad: What?
130 00:18:26.590 ⇒ 00:18:30.999 Greg Stoutenburg: did you… what data set did you start with? Who said it was with the intelligence team? You, you know.
131 00:18:31.520 ⇒ 00:18:37.810 Greg Stoutenburg: They had some information for you. So… so, what did they give you, and how did you work with it?
132 00:18:38.450 ⇒ 00:18:49.229 Greg Stoutenburg: what were the signals, and how did you know? And then, what were the takeaways that you pushed for, and what were the outcomes there? And I think, actually, if we can just… we could just spend the next 12 minutes on that, and…
133 00:18:49.230 ⇒ 00:18:50.330 Abdullah Ahmad: Okay, yeah, sounds good.
134 00:18:50.330 ⇒ 00:18:58.579 Greg Stoutenburg: And just, like, being, like, yeah, being, like, very direct about it, like, I wanna… I just wanna get a good sense of, like, your technical background and expertise. I mean.
135 00:18:58.580 ⇒ 00:18:59.100 Abdullah Ahmad: in the meantime.
136 00:18:59.100 ⇒ 00:19:18.879 Greg Stoutenburg: you know, you tell me about your degrees, like, so I’m convinced, that’s fine. But, like, I want to hear about it. And then understand, like, talk a little bit about… and this can be, you know, part of you explaining this, or it can just be follow-up questions after. Get a sense of how you’d feel about and your background in doing things like managing the client relationship, where…
137 00:19:19.450 ⇒ 00:19:32.099 Greg Stoutenburg: less involved in the technical, in the weeds. Like, you still are somewhat, but less so, and more like, you know, nudging, nudging individual contributors, and scoping out projects, and leading the relationship, and things like that.
138 00:19:32.100 ⇒ 00:19:50.049 Abdullah Ahmad: Okay, yeah, yeah, no, that’s a great point, and let me reframe it, then, to answer this specifically: this project was in the middle of the election integrity effort, primarily before the 2024 elections. So this came directly from the head of Trust and Safety, Suzy Loftus.
139 00:19:50.050 ⇒ 00:19:55.280 Abdullah Ahmad: And their concern about incitement to violence and cases of, like, credible threat of violence on the platform.
140 00:19:55.280 ⇒ 00:20:07.599 Abdullah Ahmad: The idea was, like, the first step to do was, like, to have a sit-down meeting directly with the project, the POC for that particular request, in order to scope out what exactly the requirements and needs are.
141 00:20:07.600 ⇒ 00:20:25.950 Abdullah Ahmad: Within the election integrity, there are a lot of concerns, misinformation, impersonation, credible threat of violence, but after talking to them and sitting down, like, I was able to identify that the main concern right now is, like, coming from credible threat of violence, because there does not really exist an AI model specifically identifying that topic.
142 00:20:25.950 ⇒ 00:20:32.850 Abdullah Ahmad: So once, like, that scope was agreed between, myself, like, and the product POC,
143 00:20:33.120 ⇒ 00:20:43.989 Abdullah Ahmad: The next thing, because I was the lead for the detection engineering team, was to reel in, like, the larger team. So we thought up two solutions. Number one was
144 00:20:43.990 ⇒ 00:20:53.779 Abdullah Ahmad: creating keyword-based detection, and this is where we partnered with the intelligence team, essentially. They gave us, like, high-credibility keywords
145 00:20:53.780 ⇒ 00:21:08.949 Abdullah Ahmad: around, like, incitement to violence. If you’re familiar with some of this language, they use a lot of, like, leetspeak and AlgoSpeak, in order to get away from, like, getting detected by normal means. So, for example, “Supreme Gentleman” is a very common term that’s used
146 00:21:08.950 ⇒ 00:21:14.939 Abdullah Ahmad: in order to represent a school shooter, or particularly someone, like, with those motivations in mind.
147 00:21:15.180 ⇒ 00:21:16.239 Abdullah Ahmad: So this…
148 00:21:16.400 ⇒ 00:21:24.579 Abdullah Ahmad: Yeah, so there were signals like this, particularly keywords, behaviors, how often specific keywords were used, along with the other ones.
149 00:21:24.580 ⇒ 00:21:41.409 Abdullah Ahmad: So for this, I scoped the task specifically to our data analyst on the team, because the idea was using these keywords, using HiveSQL, and essentially searching user accounts, like their bio, nicknames, and handles, for prevalence and presence of these signals within that.
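[Editor's note: the keyword pass over bios, nicknames, and handles described above ran as Hive SQL over profile tables; the core logic is a regex filter. A sketch in Python, with invented stand-in keywords — the intelligence team's actual list was internal:]

```python
import re

# Illustrative stand-ins for the intelligence team's high-credibility
# keyword list; real lists include leetspeak/AlgoSpeak variants.
KEYWORD_PATTERNS = [
    re.compile(r"supreme\s+gentleman", re.IGNORECASE),
]

def flag_profile(profile):
    """Return True if any keyword appears in the bio, nickname, or handle."""
    text = " ".join(profile.get(f, "") for f in ("bio", "nickname", "handle"))
    return any(p.search(text) for p in KEYWORD_PATTERNS)

profiles = [
    {"handle": "hiker99", "bio": "love hiking and dogs"},
    {"handle": "sg_user", "bio": "the supreme   gentleman rises", "nickname": "sg"},
]
flagged = [p["handle"] for p in profiles if flag_profile(p)]
```

In production this kind of filter would also join against the enrichment tables mentioned next, since profile fields change over time.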
150 00:21:41.450 ⇒ 00:22:01.040 Abdullah Ahmad: The second solution that we looked towards was focusing primarily on LLM-based solutions, because this was focusing more on the content. So for both of these solutions, we used very different data sets. One of them was very specific to user profile data, which gets generated when the user creates an account with the platform itself.
151 00:22:01.040 ⇒ 00:22:16.389 Abdullah Ahmad: But that was not enough by itself, because we need enrichment data as the users continue to change their signals on their profile as well. So other relevant data sets which collect user behavior over a period of time, and the changes they’ve made to those profiles.
152 00:22:16.390 ⇒ 00:22:21.939 Abdullah Ahmad: So joining these two table datasets together enabled that first, detection-focused analysis.
153 00:22:21.940 ⇒ 00:22:27.660 Abdullah Ahmad: Secondly, the second was focusing more on the content itself, where people were making direct threats.
154 00:22:27.660 ⇒ 00:22:44.350 Abdullah Ahmad: And for this, running a detection on millions of pieces of content isn’t really the most optimal solution, particularly for time, as well as, like, the confidence in that flag itself. So for this, we leveraged LLMs. We used Dolphin Mistral as a base model.
155 00:22:44.350 ⇒ 00:22:56.960 Abdullah Ahmad: My job on top of that was to specifically design the in-context policy, as well as the in-context learning examples. But this specific task, I knew, did not fall within my capability.
156 00:22:56.960 ⇒ 00:23:19.010 Abdullah Ahmad: So I pulled in the engineering folks from the team as well, cutting them tickets in two areas. Number one, like, designing, like, the backend database management system, specifically that’s gonna work for this LLM detection. And then secondly, their task of, like, designing, calling in the API, as well as, like, creating a test harness, which helps us decide.
157 00:23:19.010 ⇒ 00:23:24.320 Abdullah Ahmad: What level of precision and recall this specific detection is, like, working towards.
158 00:23:24.320 ⇒ 00:23:29.080 Abdullah Ahmad: So, essentially, delegating these two sets of work to both of our team members.
159 00:23:29.230 ⇒ 00:23:39.159 Abdullah Ahmad: And then work… me working directly with the product POC in order to make sure that we are in alignment, like, with what the idea of the product itself was.
160 00:23:39.160 ⇒ 00:23:49.380 Abdullah Ahmad: At the end of this, we completed the project, and the main output was a dashboard that fed data directly from these detections that we’ve created.
161 00:23:49.380 ⇒ 00:24:00.070 Abdullah Ahmad: And the real output for the organization was that we had a working data dashboard ready for the Civic Integrity team before the election period started, essentially.
162 00:24:00.070 ⇒ 00:24:19.720 Abdullah Ahmad: And this metric, like, allowed us to not only show how much was actioning, because the actioning part, like, the raw number, it can be, like, 10,000 or 5, doesn’t really tell much in terms of the value it’s offering to the organization. So the idea was what sort of, like, metric can show impact of these detections as a whole.
163 00:24:19.720 ⇒ 00:24:27.650 Abdullah Ahmad: So the main metric that we agreed on, or, which I thought, like, was the best tool for the leadership to give a holistic vision of the project itself,
164 00:24:27.650 ⇒ 00:24:39.329 Abdullah Ahmad: Was proactive detection rate, essentially showing how many of these users we were able to get to an action before some other user on platform was able to flag them.
165 00:24:39.330 ⇒ 00:24:57.480 Abdullah Ahmad: So essentially, since these are high-harm topics, every false negative is essentially a news headline, particularly for TikTok, especially during the election time. So this showed us, like, our ability to catch a lot of it beforehand, before the harm perpetuated on the platform itself.
166 00:24:57.480 ⇒ 00:25:08.089 Abdullah Ahmad: And the second metric was more focused on, attribution, detection attribution share, to show, for this specific product that we’ve created for trust and safety,
167 00:25:08.090 ⇒ 00:25:20.440 Abdullah Ahmad: what proportion of the violations were caught by us, as opposed to every other product that’s currently operating on it? So, like, drilling down deeper into, essentially, like, the impact of these detections.
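[Editor's note: the two metrics described above reduce to simple ratios over actioned violations. A sketch with invented records — field names and values are hypothetical, not TikTok's actual schema:]

```python
# Hypothetical actioned-violation records; fields invented for illustration.
actions = [
    {"source": "our_detection", "caught_before_user_report": True},
    {"source": "our_detection", "caught_before_user_report": False},
    {"source": "user_report",   "caught_before_user_report": False},
    {"source": "other_product", "caught_before_user_report": True},
]

total = len(actions)

# Proactive detection rate: share of actioned users reached before any
# other user on the platform flagged them.
proactive_rate = sum(a["caught_before_user_report"] for a in actions) / total

# Detection attribution share: proportion of violations caught by this
# product versus every other detection product operating on the platform.
attribution_share = sum(a["source"] == "our_detection" for a in actions) / total
```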
168 00:25:20.440 ⇒ 00:25:24.909 Greg Stoutenburg: Yeah, very good. Now, in this project, where…
169 00:25:25.230 ⇒ 00:25:43.739 Greg Stoutenburg: So, I got the… so you sort of led the project, sort of organized the project, I can see that. Where was your… sort of, like, where was the technical muscle flexed? Like, did you do some of the data analysis yourself? I know you mentioned that you delegated some work to the data analyst on your team, but tell me about some analysis that you did yourself as part of this project.
170 00:25:43.740 ⇒ 00:25:57.560 Abdullah Ahmad: Yeah, so my main work, like, fell in two areas. Number one was, like, designing those in-context learning examples for the AI model to understand what it is, like, trying to flag. This might seem straightforward, but actually.
171 00:25:57.560 ⇒ 00:26:06.280 Abdullah Ahmad: ends up, like, being the most difficult part of the project itself, because the internal policies and the product policies are so nuanced.
172 00:26:06.380 ⇒ 00:26:26.239 Abdullah Ahmad: differentiating between one thing that can be acceptable when, like, it falls within the EDSA category, which is education, documentary, satire, where we would not mark it as violating. So the task was to craft those examples in such a way that aligns exactly with what the Civic Integrity team considers as violating versus not.
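[Editor's note: the in-context policy plus few-shot example design described above might look roughly like this. Policy text and examples are invented for illustration; the production prompt and model wiring were internal:]

```python
# Hypothetical assembly of an in-context policy with few-shot examples for an
# LLM content classifier; text is illustrative, not the actual policy.
POLICY = (
    "Label the content VIOLATING if it contains a credible threat of "
    "violence. Label it NON-VIOLATING if it falls under an exception "
    "category such as educational, documentary, or satirical framing."
)

# Examples crafted to mirror the nuanced policy line, including the
# exception-category carve-out (examples invented for illustration).
FEW_SHOT = [
    ("I am going to hurt people at the event tomorrow.", "VIOLATING"),
    ("This documentary examines how past attackers announced plans online.",
     "NON-VIOLATING"),
]

def build_prompt(content):
    """Concatenate policy, labeled examples, and the item to classify."""
    shots = "\n".join(f"Content: {c}\nLabel: {l}" for c, l in FEW_SHOT)
    return f"{POLICY}\n\n{shots}\n\nContent: {content}\nLabel:"

prompt = build_prompt("example post text")
```

The hard part, as described, is not the assembly but choosing examples that sit exactly on the policy boundary.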
173 00:26:26.240 ⇒ 00:26:32.790 Abdullah Ahmad: And the second thing was actually the dashboard creation part, for the metrics, because that was supposed to be the biggest
174 00:26:32.790 ⇒ 00:26:37.930 Abdullah Ahmad: impact on the client itself and leadership, so that’s what I took ownership of.
175 00:26:37.940 ⇒ 00:26:53.329 Abdullah Ahmad: And essentially what that work looks like is, like, going into the data pipelines, which run in Hadoop, and these collect violations and policy associations from all the different detections and methodologies.
176 00:26:53.330 ⇒ 00:27:01.209 Abdullah Ahmad: So, within that, enriching that data, particularly to make sure that it’s fully encompassing the violations around credible threat of violence.
177 00:27:01.390 ⇒ 00:27:06.830 Abdullah Ahmad: And then, like, writing the code, essentially, to measure those metrics itself as well.
178 00:27:07.120 ⇒ 00:27:10.050 Greg Stoutenburg: When you say write the code, like, what do you write it in?
179 00:27:10.050 ⇒ 00:27:10.750 Abdullah Ahmad: Yeah, so…
180 00:27:10.750 ⇒ 00:27:13.440 Greg Stoutenburg: Did you actually… were you, like, manually coding, like, coding, coding?
181 00:27:13.440 ⇒ 00:27:20.709 Abdullah Ahmad: Correct, correct, correct. So, like, we rely primarily on HiveQL, because that’s where our database system is managed, essentially.
182 00:27:20.720 ⇒ 00:27:38.569 Abdullah Ahmad: And we use, like, internal tools for dashboarding itself. The good thing is, like, you’re able to stitch them both together, so we’re not really switching between platforms, but I’m writing, like, SQL queries in order to aggregate these violations, compare them to other methodologies, and then displaying them, visualizing them in the dashboard.
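The kind of aggregation query described here can be sketched as below, using SQLite as a lightweight stand-in for the Hadoop/HiveQL pipeline; the table and column names are hypothetical, not the actual schema:

```python
import sqlite3

# Toy violations table standing in for the Hadoop/HiveQL pipeline described
# above; table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE violations (
        policy TEXT,          -- which policy was violated
        detection_method TEXT -- e.g. 'model_flag' or 'human_review'
    )
""")
conn.executemany(
    "INSERT INTO violations VALUES (?, ?)",
    [("credible_threat", "model_flag"),
     ("credible_threat", "human_review"),
     ("spam", "model_flag")],
)

# Aggregate violation counts per policy and detection method -- the kind of
# query whose results would be visualized in the dashboard.
rows = conn.execute("""
    SELECT policy, detection_method, COUNT(*) AS n
    FROM violations
    GROUP BY policy, detection_method
    ORDER BY policy, detection_method
""").fetchall()
print(rows)
```

The same `GROUP BY` shape lets one methodology's counts be compared against another's, which is the comparison Abdullah mentions feeding into the metrics.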
183 00:27:38.750 ⇒ 00:27:51.810 Greg Stoutenburg: Okay, cool. Alright, in the last couple minutes, then, in thinking about, you know, some of, like, the account manager and project manager responsibilities of CSO, how would you, I mean, like.
184 00:27:52.310 ⇒ 00:27:54.300 Greg Stoutenburg: It sounds to me less like…
185 00:27:54.490 ⇒ 00:27:59.999 Greg Stoutenburg: could you do it? And more like, would you want to? Understanding that there’s, you know.
186 00:28:00.050 ⇒ 00:28:18.780 Greg Stoutenburg: it’s a… it’s a relationship-building role. Your technical capabilities, as someone who’s, you know, been involved in the project you just described, yeah, that, you know, check. But yeah, I mean, what’s your… what are your thoughts on that? You know, having some people’s responsibility, having some relationship management responsibility?
187 00:28:19.030 ⇒ 00:28:41.959 Abdullah Ahmad: Yeah, yeah, and I think, like, the answer to this question particularly falls, like, within my personal preferences of the sort of work that I want to do. My stepping in this space, like, started from consultancy, and the whole data technicality skill set was built on top of it, like, to facilitate, being an expert in that, what I provide, essentially. Yeah. The idea is, like, I think, like, what’s most fulfilling for me
188 00:28:41.960 ⇒ 00:28:47.189 Abdullah Ahmad: is essentially, like, doing the work, like, for which the output is, like, a tangible improvement,
189 00:28:47.190 ⇒ 00:28:56.369 Abdullah Ahmad: and specifically to be able to see the real-life, like, impact, like, a particular solution is, like, trying to make. And I think, like, the best place for something like that is, like, a…
190 00:28:56.370 ⇒ 00:29:08.569 Abdullah Ahmad: like, a front-end role, like, directly with the client itself, because I feel like even if I have, like, that technical background, I think, like, my skill set, like, primarily still revolves around, like, communication.
191 00:29:08.570 ⇒ 00:29:20.309 Abdullah Ahmad: And that ability to synthesize what the problem is, think critically, understand that what the solution that the client might be asking might not be the best solution that’s actually needed for the organization at the moment.
192 00:29:20.310 ⇒ 00:29:33.770 Abdullah Ahmad: So thinking critically through those questions itself, and then, like, identifying exactly how this work can be delegated and broken into smaller tasks, so that we can bring back the greatest possible value to the organization, essentially.
193 00:29:33.990 ⇒ 00:29:50.810 Greg Stoutenburg: Yeah, very good, yeah. Yeah, at Brainforge, like, we move really quickly, and we’re… we’re always in a position where we’re having to, you know, I mean, you have the consulting background you pointed to, so I’m sure you’re familiar with this, but always in a position of having to, like, point to and justify the.
194 00:29:50.810 ⇒ 00:29:51.170 Abdullah Ahmad: Come on.
195 00:29:51.490 ⇒ 00:29:54.190 Greg Stoutenburg: Continuing to engage with us, so…
196 00:29:54.520 ⇒ 00:29:57.339 Greg Stoutenburg: Something that’s really important is being able to have
197 00:29:57.530 ⇒ 00:30:00.960 Greg Stoutenburg: You know, have an ear to the ground and look for those signals from the client about what.
198 00:30:00.960 ⇒ 00:30:01.280 Abdullah Ahmad: days.
199 00:30:01.280 ⇒ 00:30:02.380 Greg Stoutenburg: see as valuable.
200 00:30:02.630 ⇒ 00:30:03.100 Abdullah Ahmad: Yeah.
201 00:30:03.100 ⇒ 00:30:06.069 Greg Stoutenburg: So we’re… and so we’re talking ROI, like, all the time, you know?
202 00:30:06.470 ⇒ 00:30:16.189 Greg Stoutenburg: How do you feel about, I mean, compared to your previous consulting role, how do you feel about the idea of, like, you know, you’re regularly, kind of, you’re regularly kind of pitching.
203 00:30:17.070 ⇒ 00:30:19.739 Greg Stoutenburg: Kind of on a weekly basis in certain respects.
204 00:30:19.960 ⇒ 00:30:34.839 Abdullah Ahmad: Yeah, and I think this is, like, honestly, like, one of, like, the most important roles, like, when it comes to, like, engaging with a client on a consultancy, even if you are just, like, delivering a specific project for which you’ve been hired. So for example, like, when I was, like, consulting for the government.
205 00:30:35.040 ⇒ 00:30:47.860 Abdullah Ahmad: the Ministry of Interior, like, would request us for these specific research reports, particularly around online information operations, in the lead-up to the 2024 Football World Cup, and we were providing this service to them.
206 00:30:47.860 ⇒ 00:30:58.449 Abdullah Ahmad: When I was working, I quickly realized, like, a lot of, like, this work, they have the internal capability to be able to do that, just, like, maybe, like, not the motivation to do that itself.
207 00:30:58.450 ⇒ 00:31:12.710 Abdullah Ahmad: So I talked to my leadership, and I definitely saw a specific gap there, and my idea was, like, we can pitch them a project which is not just, like, providing research reports, but helping them develop that in-house capability for that research.
208 00:31:12.710 ⇒ 00:31:27.960 Abdullah Ahmad: We created a project worth $4 million, which was, like, to upskill OSINT capabilities within the organization itself, because that’s where the research direction was going. And it allowed our client more flexibility to research on topics
209 00:31:27.960 ⇒ 00:31:33.160 Abdullah Ahmad: Which were rather more confidential, that they could not really contract someone externally.
210 00:31:33.160 ⇒ 00:31:52.280 Abdullah Ahmad: Someone would think that maybe that’s sort of, like, impacting our workstream of doing research for them by building the in-house capability, but actually it was quite the opposite. It further solidified the relationship between us and the client, because we’ve helped them build up that infrastructure on which they are doing research.
211 00:31:52.280 ⇒ 00:32:10.269 Abdullah Ahmad: So we sort of, like, tied ourselves as a permanent partner to an in-house, work stream that they are essentially now incorporating into their organization, and allows us, like, to stick closely with them as well. So for me, like, these sort of, like, wins, like, really mean a lot, because we were able to offer value
212 00:32:10.360 ⇒ 00:32:15.460 Abdullah Ahmad: much more than what the client initially had in mind, for example, and this is, like, the idea where I want to be.
213 00:32:15.730 ⇒ 00:32:31.360 Greg Stoutenburg: Yeah, great. We should wrap up, but, and thank you for, you know, taking the time to break down the examples for me that I was sort of insisting on. If, you know, if you have a question or two, I can try to answer it just in a minute here, before hopping, since I didn’t really
214 00:32:31.400 ⇒ 00:32:37.890 Greg Stoutenburg: give you a chance to ask. I’m just, you know, I’m a CSO, I’m not, you know, a recruiter or anything like that.
215 00:32:37.890 ⇒ 00:32:38.340 Abdullah Ahmad: Going.
216 00:32:38.340 ⇒ 00:32:41.840 Greg Stoutenburg: I can’t answer, like, ops questions, but I could answer, you know, some other questions.
217 00:32:41.850 ⇒ 00:32:55.870 Abdullah Ahmad: Yeah, I mean, I’ve read the job description, I’ve had, like, conversations with a few folks as well, but I think it’s my first time talking to an actual CSO, like, within the organization itself as well. So maybe, like, the best, useful question for me would be, like, understanding.
218 00:32:55.870 ⇒ 00:33:04.379 Abdullah Ahmad: what does a day-to-day look like for you, like, when you are engaging with these clients? And I know, like, it’s a relatively new company, things might be changing, so what that looks like.
219 00:33:04.550 ⇒ 00:33:22.719 Greg Stoutenburg: Yeah, I mean, I think on a day-to-day basis, it can vary by the client, like, what kind of cadence they want, so part of it’s just, like, figuring out how they want to work with you, and meeting them where they’re at. It involves, you know, and we have a variety of work streams, so we work for different clients as well, so,
220 00:33:22.720 ⇒ 00:33:42.909 Greg Stoutenburg: the… I mean, on a day-to-day basis, I’m making sure that I’ve checked in on all of my clients, I’m making sure that any projects that we have are moving according to expected timelines. As well, I’m unblocking the team so that, like, you know, engineers can keep moving on the things that they need to be moving on.
221 00:33:42.910 ⇒ 00:33:56.739 Greg Stoutenburg: So, you know, a lot of that is that, like, sort of product manager, project manager responsibility. As well, I’m doing any kind of, any kind of analysis that’s expected of me specifically, or even if not analysis, but, like.
222 00:33:57.200 ⇒ 00:34:14.180 Greg Stoutenburg: work that is… work that’s sort of fundamental, like, for example, we have a client where I’m leading a build-up of a BI tool, and they’ve just had spreadsheet reporting, and now we’re putting them on a BI tool. And, that involves things like making sure that we’ve documented what
223 00:34:14.179 ⇒ 00:34:17.600 Greg Stoutenburg: Certain words mean that are going to appear in charts.
224 00:34:17.600 ⇒ 00:34:26.229 Greg Stoutenburg: Because it doesn’t exist yet, or it exists but in a lousy form, right? And so I’m the one who has to keep an eye on those things, and anticipate those things, work on those things.
225 00:34:26.230 ⇒ 00:34:39.400 Greg Stoutenburg: So, yeah, there can be a lot of variety. I’ll tell you, things move very quickly here. And so, you know, we were talking about AI a moment ago, like, you’re… it’s cool to get to use those things to go fast, but it’s also expected that you use those things to go fast, so…
226 00:34:39.820 ⇒ 00:34:47.280 Greg Stoutenburg: You know, if it’s supposed to take… if they know it takes 10 minutes to create a deck with AI plus editing.
227 00:34:47.280 ⇒ 00:34:47.690 Abdullah Ahmad: You can.
228 00:34:47.690 ⇒ 00:34:50.479 Greg Stoutenburg: and it takes you a day, then you took too long.
229 00:34:50.489 ⇒ 00:34:51.469 Abdullah Ahmad: You know.
230 00:34:51.469 ⇒ 00:34:53.549 Greg Stoutenburg: So there’s that kind of thing as well.
231 00:34:53.750 ⇒ 00:35:13.479 Abdullah Ahmad: Okay, amazing, amazing. And maybe, like, a final one, like, just, like, trying to get a sense of, like, what the current bottlenecks are, because I know it’s, like, a growing team right now. What is, like, the main, like, particular area where we would want, like, this CSO to come in with the high capability within that space, like, just so I can gauge exactly the sort of skill sets that I would be needing within this particular role?
232 00:35:13.480 ⇒ 00:35:26.719 Greg Stoutenburg: Yeah, no, that’s a good question. So I think, you’d be coming in, and they’d be intending to expand, because… because you’re here? Like, so currently, there are only… I mean, besides the company founders, there’s me and one other CSO in…
233 00:35:26.720 ⇒ 00:35:27.350 Abdullah Ahmad: Okay.
234 00:35:27.350 ⇒ 00:35:32.529 Greg Stoutenburg: And so, but, you know, we’re getting… we’re getting knocks on the door regularly.
235 00:35:33.210 ⇒ 00:35:41.380 Greg Stoutenburg: business, but, you know, people… you can only handle so many clients inside of a work week. And so, yeah, so it would be a question of expansion.
236 00:35:41.380 ⇒ 00:35:42.170 Abdullah Ahmad: I see.
237 00:35:42.500 ⇒ 00:35:48.789 Greg Stoutenburg: Yeah, it’s not about team capability or anything like that, though they are still kind of ironing out which role owns what.
238 00:35:48.790 ⇒ 00:35:49.420 Abdullah Ahmad: expect.
239 00:35:49.420 ⇒ 00:35:53.039 Greg Stoutenburg: You know, expect things to change. Things could change by the end of your first week, you know?
240 00:35:53.040 ⇒ 00:35:53.640 Abdullah Ahmad: Oh, good.
241 00:35:53.640 ⇒ 00:35:58.560 Greg Stoutenburg: end up here. So, yeah, but that said, it would be about expansion.
242 00:35:58.700 ⇒ 00:36:13.360 Abdullah Ahmad: Okay, awesome. Essentially, basically focusing on things like Net Promoter Score, essentially creating that positive sentiment with the client itself as a way of, like, leading into, like, future projects with them and, hopefully, getting recommended in the industry as well.
243 00:36:13.360 ⇒ 00:36:20.160 Greg Stoutenburg: Yeah, as the basic… yeah, I mean, as the basic CSO responsibility, yeah, it’s that the… it’s that the client likes you.
244 00:36:20.160 ⇒ 00:36:20.540 Abdullah Ahmad: in one.
245 00:36:20.540 ⇒ 00:36:22.729 Greg Stoutenburg: to continue giving Brainforge money.
246 00:36:23.450 ⇒ 00:36:28.130 Greg Stoutenburg: like, that’s kind of the… that’s ultimately… at the end of the day, that’s the only KPI.
247 00:36:28.130 ⇒ 00:36:28.929 Abdullah Ahmad: Fair enough.
248 00:36:28.930 ⇒ 00:36:31.439 Greg Stoutenburg: So, yeah, yeah, yeah, yep.
249 00:36:31.690 ⇒ 00:36:34.469 Abdullah Ahmad: Awesome, awesome. That sounds great. Thank you.
250 00:36:34.470 ⇒ 00:36:44.979 Greg Stoutenburg: All right. All right, well, thanks for chatting, better get going. Sorry for being a little late, tried to make up for it on the tail end. But yeah, I’ll, you know, I’ll give my feedback to the team, and then someone will reach out. So yeah, have a good weekend.
251 00:36:44.980 ⇒ 00:36:46.520 Abdullah Ahmad: Nice talking to you, Greg. Have a nice one.
252 00:36:46.520 ⇒ 00:36:48.020 Greg Stoutenburg: You too, thanks, Mike.