Meeting Title: Brainforge Interview w/ Greg Date: 2026-03-23 Meeting participants: Greg Stoutenburg, Fanu Sisay
WEBVTT
1 00:00:11.330 ⇒ 00:00:12.200 Greg Stoutenburg: Over there.
2 00:00:16.300 ⇒ 00:00:17.180 Fanu Sisay: Hello.
3 00:00:17.510 ⇒ 00:00:18.940 Greg Stoutenburg: Hey, Fanu, nice to meet you.
4 00:00:18.940 ⇒ 00:00:20.139 Fanu Sisay: Hey Greg, how are you?
5 00:00:20.500 ⇒ 00:00:22.060 Greg Stoutenburg: Doing alright, how are you?
6 00:00:22.060 ⇒ 00:00:26.679 Fanu Sisay: Good, good. Yeah, I’m doing good. Kind of a cloudy Monday, but…
7 00:00:26.680 ⇒ 00:00:27.879 Greg Stoutenburg: Yeah, where are you located?
8 00:00:28.020 ⇒ 00:00:29.450 Fanu Sisay: I’m out in Brooklyn. How about you?
9 00:00:29.450 ⇒ 00:00:32.219 Greg Stoutenburg: Yep. York, Pennsylvania. Similarly cloudy.
10 00:00:32.220 ⇒ 00:00:38.800 Fanu Sisay: Oh, sweet. Yeah. I’d met with only West Coast people, so I was like.
11 00:00:38.800 ⇒ 00:00:43.830 Greg Stoutenburg: Okay, yeah. Well, they’re just living in the sunshine all the time. They don’t even know what we’re talking about.
12 00:00:43.830 ⇒ 00:00:47.790 Fanu Sisay: Yeah, any kind of seasonal changes, they’re not aware of.
13 00:00:47.790 ⇒ 00:00:49.120 Greg Stoutenburg: Can’t process. No.
14 00:00:49.120 ⇒ 00:00:55.800 Fanu Sisay: But I picked this over, I picked this over unlimited sunshine. I like the changing ups and downs.
15 00:00:55.820 ⇒ 00:01:03.960 Greg Stoutenburg: Yeah, and I do enjoy, sort of, like, the moral superiority of talking to Californians who are cold when it’s 60 degrees out, and it’s like, come on…
16 00:01:04.000 ⇒ 00:01:12.660 Fanu Sisay: Yeah, they’re… they can’t, you know, anything outside of that, like, one standard deviation is, like, the end of the world to them, and I feel, you know, more adjusted.
17 00:01:13.040 ⇒ 00:01:28.699 Greg Stoutenburg: Yeah, can’t do it, can’t do it, yeah. Yeah, well, thanks for, thanks for talking. So this is the… this is the second round interview, so basically the, the first one is sort of like a more general, you know, what’s your experience, that kind of thing.
18 00:01:28.930 ⇒ 00:01:43.379 Greg Stoutenburg: And then this one is more like the role-specific interview. I’m just a peer, I’m just someone who’s, you know, doing similar work, and so, you know, they want everyone who comes through the pipeline to talk to someone, sort of in my sort of role, so…
19 00:01:43.380 ⇒ 00:01:43.810 Fanu Sisay: Sweet.
20 00:01:43.810 ⇒ 00:01:54.270 Greg Stoutenburg: That’s what this is for, yeah. So, cool. So, like, I guess before we dig in, tell me, like, how did you end up learning about Brainforge and talking to people here?
21 00:01:54.460 ⇒ 00:02:12.150 Fanu Sisay: Yeah, so I had actually scrolled past… I want to say on LinkedIn. I, you know, have been vaguely looking around for jobs, you know, nothing too specific, but I saw the job listing, and I really like the verbiage in the job description specifically, but things like,
22 00:02:12.300 ⇒ 00:02:21.859 Fanu Sisay: low ego contributor, you know, actual adoption over, you know, just building stuff. So yeah, that caught my eye.
23 00:02:21.860 ⇒ 00:02:35.509 Fanu Sisay: went on the application, and I saw that, you know, they asked for video inserts, answering questions, which I thought was great, but, they didn’t have the ability to send in an MP4.
24 00:02:35.540 ⇒ 00:02:48.480 Fanu Sisay: Which I was, like, a bit concerned about. Reached out to Demi Lade, I want to say, on LinkedIn, and he was like, oh, thank you for letting us know, blah blah, and then he was like, oh, you know, I can just put you in with our head of HR, and I was like.
25 00:02:48.620 ⇒ 00:02:52.529 Fanu Sisay: Yeah, I guess I’m down, but, you know, yeah, that would happen.
26 00:02:52.530 ⇒ 00:02:53.640 Greg Stoutenburg: this along.
27 00:02:53.640 ⇒ 00:02:57.449 Fanu Sisay: Yeah, that’s how I got in, so I was excited with that.
28 00:02:57.450 ⇒ 00:03:03.489 Greg Stoutenburg: Cool. Yeah, yeah, cool. Yeah, and you went to Virginia Tech, graduated a few years ago?
29 00:03:03.490 ⇒ 00:03:08.690 Fanu Sisay: Yep, studied math there, really enjoyed my time there. I, you know.
30 00:03:08.840 ⇒ 00:03:23.600 Fanu Sisay: kind of got to nerd out on math for 4 years, and towards the back end, I kind of was like, okay, I have to be serious about my career and what I want to do moving forward. Caught a nice data gig up in New York, and it was really, like, all-encompassing.
31 00:03:23.770 ⇒ 00:03:39.579 Fanu Sisay: front-end, back-end, data governance. I was able to touch up on everything there, which was really great for me. And yeah, now I’m at a sports marketing firm, kind of running a whole research and insights department with me and my boss, and yeah, it’s been… it’s been nice, fun.
32 00:03:39.580 ⇒ 00:03:46.649 Greg Stoutenburg: Nice, nice. What do you like about that role that you’d wanna, you know, like, if you could, you know, keep it?
33 00:03:47.150 ⇒ 00:03:47.780 Fanu Sisay: My current role?
34 00:03:48.220 ⇒ 00:03:52.379 Greg Stoutenburg: Yeah, your current role at the, at Genesco Sports Enterprises?
35 00:03:52.380 ⇒ 00:03:55.459 Fanu Sisay: Yeah, I really like the one…
36 00:03:55.500 ⇒ 00:04:15.239 Fanu Sisay: part that I mentioned, it’s me and my boss running our, research and insights department, at least within New York. We also have a Dallas office with a team over there, but everything in New York we kind of handle to ourselves. I enjoy, the casualness of the, conversations between me and my boss around working environments,
37 00:04:15.240 ⇒ 00:04:16.019 Fanu Sisay: You know, it’s…
38 00:04:16.019 ⇒ 00:04:22.709 Fanu Sisay: We’re in person every day, so I think that’s pretty easy. We talk quite constantly, but, you know.
39 00:04:22.790 ⇒ 00:04:26.390 Fanu Sisay: The kind of casualness of, like, hey, we want to deploy this.
40 00:04:26.760 ⇒ 00:04:33.389 Fanu Sisay: We… it’s… we do check in, but it’s a lot of, like, you know, things come to us, and we’re able to do it in a…
41 00:04:33.530 ⇒ 00:04:40.939 Fanu Sisay: much rapider, like, we’re not checking in with multiple different account teams and whatnot, we’re able to run things on our own, which I really do enjoy.
42 00:04:41.210 ⇒ 00:04:43.190 Greg Stoutenburg: Yeah, you just kind of go and do it.
43 00:04:43.190 ⇒ 00:04:45.689 Fanu Sisay: Yeah, exactly. Not too many check-offs.
44 00:04:45.940 ⇒ 00:04:54.510 Greg Stoutenburg: Yeah. Yeah. Yeah, okay, cool. So, like, small, agile, moving quickly.
45 00:04:54.510 ⇒ 00:05:07.429 Fanu Sisay: Yeah, and obviously this is based, depending on the client. We have some bigger clients who, you know, things might run a bit slower because there are those check-offs and different siloed.
46 00:05:07.840 ⇒ 00:05:12.419 Fanu Sisay: You know, teams that we have to reach out to, but especially with, like, some of our
47 00:05:12.780 ⇒ 00:05:20.959 Fanu Sisay: I don’t want to call them personal projects, but some of our work streams were able to move really fast, and those are things I love to do, and…
48 00:05:21.120 ⇒ 00:05:30.929 Fanu Sisay: you know, definitely hope to find it, in my next gig. I don’t know if Brainforge is exactly like that, but, that would definitely be something I’d welcome…
49 00:05:31.270 ⇒ 00:05:32.500 Fanu Sisay: Yeah. Yeah.
50 00:05:32.500 ⇒ 00:05:48.999 Greg Stoutenburg: Yeah, I mean, so one of the things here is we, you know, we have lots of different clients, and you’re working with lots of different people on generally small teams on projects, and so, you know, a lot of that is… a lot of that is here. You know, it is…
51 00:05:49.000 ⇒ 00:06:00.199 Greg Stoutenburg: I guess you probably do develop a sort of rhythm when it’s just you and your boss, you know, managing the whole thing. There’s gonna be more… there’ll be more voices in the room here, but you’re still very much, you know, in a position where you can own
52 00:06:00.240 ⇒ 00:06:06.729 Greg Stoutenburg: your project, and make the decisions that you think are the right ones. So, yeah, very good. Awesome.
53 00:06:07.030 ⇒ 00:06:19.049 Greg Stoutenburg: Yeah, cool. Alright, well, let’s, let’s keep it rolling. So let’s kind of pull up some of the… oh, actually, okay, yeah, I have to ask you this question, since you put it on the resume. When did people start talking about coronavirus?
54 00:06:19.240 ⇒ 00:06:30.479 Fanu Sisay: Yeah, that, that is my… one of my favorite projects I’ve ever worked on. It was me and a team of 3 people. March 17th, 2020, is when you’ll see
55 00:06:30.690 ⇒ 00:06:50.660 Fanu Sisay: or, no, March 13th, 2020, is when you’ll see NBA fans specifically talking about coronavirus, because there was a big incident with one player who ended up testing positive, touching microphones, as he was… Rudy Gobert, if you might remember the name. He touched a lot of microphones after joking about,
56 00:06:50.700 ⇒ 00:07:01.320 Fanu Sisay: the virus going around that no one knew. And then I think March 17th, 2020, was the largest spike, due to schools nationwide closing down.
57 00:07:01.390 ⇒ 00:07:05.139 Fanu Sisay: Yeah, I mean, obviously, this was, like, 6 years ago.
58 00:07:05.140 ⇒ 00:07:08.999 Greg Stoutenburg: Yeah, I know, I know, no, it’s okay. You could have just said, March. Yeah.
59 00:07:09.000 ⇒ 00:07:27.590 Fanu Sisay: Yeah, but, pulling from, like, Reddit APIs, it was super funny seeing, like, I think Childish Gambino’s personal subreddit was the most negative, in conversations around COVID, and then, like, some were, like, kind of positive. We did a lot of, like, yeah, subreddit by subreddit analysis. It was so much fun.
60 00:07:27.590 ⇒ 00:07:39.129 Greg Stoutenburg: Cool. Yeah, cool, that’s interesting. Yeah, I remember, I remember around that time, I had a… well, I think even before that, I had a friend who was a consultant who traveled overseas pretty regularly.
61 00:07:39.130 ⇒ 00:07:56.429 Greg Stoutenburg: And he came home, and he had this disease. He just had this problem where he was, like, so short of breath, and so tired, and doctors didn’t know what was up, and he was going to this hospital, and that hospital, and doing all these tests, and then, yeah, I mean, later hypothesized, like, wow, I think he was one of the first COVID patients
62 00:07:56.430 ⇒ 00:08:01.590 Greg Stoutenburg: before, you know, before in the US, we were like, oh, there’s this thing called COVID. Yeah, coronavirus.
63 00:08:01.590 ⇒ 00:08:04.930 Fanu Sisay: Anyway, I mean, yeah. Such an interesting time, yeah.
64 00:08:04.930 ⇒ 00:08:16.369 Greg Stoutenburg: It is interesting, yeah. Okay, alright, let’s… let’s dive into it. So, looking at your background, you’re, like, you’ve been leading projects, you’ve been doing data engineering work…
65 00:08:16.540 ⇒ 00:08:25.030 Greg Stoutenburg: You’ve been doing analytics, kind of like that whole body of things. Yeah. And so…
66 00:08:25.620 ⇒ 00:08:31.230 Greg Stoutenburg: thinking about… let’s see, which of these question sets should I dig into?
67 00:08:37.400 ⇒ 00:08:43.879 Greg Stoutenburg: when you think about… yeah, okay, here you go. Here’s one. What’s your philosophy on data quality checks versus performance?
68 00:08:46.330 ⇒ 00:09:00.339 Fanu Sisay: And when you say data quality checks, that’s just making sure that the process is running, or, you know, the final data set is what we’re looking for, and then performance, you’re more so saying, like, running quickly,
69 00:09:01.030 ⇒ 00:09:01.830 Fanu Sisay: Yeah.
70 00:09:02.130 ⇒ 00:09:06.469 Fanu Sisay: I guess… Do you mind if I think about that, per se?
71 00:09:06.470 ⇒ 00:09:07.930 Greg Stoutenburg: Yeah, yeah, go ahead, yeah.
72 00:09:08.140 ⇒ 00:09:15.279 Greg Stoutenburg: Yeah, I’m just trying to pull from, like, what looks like the most relevant set of questions here, because as I look in the record,
73 00:09:15.690 ⇒ 00:09:18.249 Greg Stoutenburg: I actually don’t see which title
74 00:09:18.590 ⇒ 00:09:21.300 Greg Stoutenburg: You may have applied under, if any of them?
75 00:09:21.300 ⇒ 00:09:40.149 Fanu Sisay: Yeah, so I believe I’m under the Senior Associate, Data and Insights. I, you know, was talking with Kayla, and she said a senior strategist role, which, you know, I’m definitely interested in, but I didn’t find that in the job board, but yeah, I think that’s the Senior Associate title. Yeah.
76 00:09:40.800 ⇒ 00:09:44.999 Fanu Sisay: But yeah, I… I’m definitely… now that I’ve thought about it, I’m definitely…
77 00:09:45.670 ⇒ 00:09:49.940 Fanu Sisay: I prioritize data quality checks, mainly because,
78 00:09:50.180 ⇒ 00:09:56.590 Fanu Sisay: You know, one of our main clients that we’re running a model for right now, you know, the…
79 00:09:56.650 ⇒ 00:10:02.460 Fanu Sisay: The final data dashboards that are being sent out and used throughout the, company.
80 00:10:02.460 ⇒ 00:10:18.319 Fanu Sisay: are, you know, being used to make very important decisions around sports sponsorships. And, you know, obviously I want things to run fast and be able to report stuff to them as soon as possible with the most up-to-date, you know, data, but that data to be true.
81 00:10:18.320 ⇒ 00:10:32.549 Fanu Sisay: is what’s most important to me. And, you know, I understand, like, presenting things as, like, a beta, and, you know, there may be, accuracy issues, so during that portion, I’m okay with, you know.
82 00:10:32.950 ⇒ 00:10:37.289 Fanu Sisay: performance, faster-running models, but I’m… I’m definitely a…
83 00:10:37.520 ⇒ 00:10:49.679 Fanu Sisay: accuracy before everything kind of person, and my biggest fear is, you know, sending through numbers that aren’t real, or aren’t true. So yeah. I don’t know if that was exactly what you were asking.
84 00:10:49.680 ⇒ 00:11:01.260 Greg Stoutenburg: Yeah, no, that’s good. And when you think about, like, the trade-offs between accuracy and, like, speed to deliver something for a stakeholder, I mean, well, how do you think about that kind of thing?
85 00:11:01.260 ⇒ 00:11:02.170 Fanu Sisay: Like, on the one hand…
86 00:11:02.170 ⇒ 00:11:07.480 Greg Stoutenburg: you know, the stakeholder wants it yesterday, right? Yeah. And on the other hand, you want it to be right, you know?
87 00:11:08.130 ⇒ 00:11:14.620 Fanu Sisay: Yeah, I think this is, like, a really great conversation with AI, you know, being that, you know.
88 00:11:15.100 ⇒ 00:11:29.299 Fanu Sisay: impetus, making everything run faster. But, you know, there are the questions of, like, are we checking at every stop, that the data was true? And I think both can be true. You know, I think you can be, using AI to
89 00:11:29.580 ⇒ 00:11:45.350 Fanu Sisay: you know, be a catalyst to your models, performance, everything, but you can also use AI to be performing data quality checks at every stop of the way to ensure that what you’re sending through is accurate. Obviously, when it…
90 00:11:45.350 ⇒ 00:11:57.500 Fanu Sisay: client demands are in, and, you know, you work as hard as you can to keep them, you know, on pace with when you plan to report everything out. But I think both can be true. And I’m someone who
91 00:11:57.500 ⇒ 00:12:06.920 Fanu Sisay: is, you know, super meticulous, on time about everything, so I… I definitely try and get everything to a client as soon as possible.
92 00:12:06.920 ⇒ 00:12:09.270 Fanu Sisay: But, you know, I also believe that
93 00:12:09.330 ⇒ 00:12:13.799 Fanu Sisay: the numbers I present are always gonna be correct. Yeah. But yeah.
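The gating idea described here, running quality checks "at every stop" and only shipping numbers that pass, can be sketched in Python. This is purely illustrative: the check names, the `publish_dashboard` function, and the row fields are all hypothetical, not anything from Brainforge or Genesco.

```python
# Illustrative sketch: block a dashboard refresh unless every data
# quality check passes. All names and fields here are made up.

def no_nulls(rows):
    """Every row must carry a non-null value for each expected field."""
    return all(r.get("spend") is not None and r.get("value") is not None
               for r in rows)

def non_negative_spend(rows):
    """Spend figures should never be negative."""
    return all(r["spend"] >= 0 for r in rows)

CHECKS = [no_nulls, non_negative_spend]

def publish_dashboard(rows):
    """Run every check; refuse to publish if any fails."""
    failed = [check.__name__ for check in CHECKS if not check(rows)]
    if failed:
        # Speed is secondary: never ship numbers that "aren't true".
        raise ValueError(f"data quality checks failed: {failed}")
    return {"rows": len(rows), "status": "published"}
```

The design choice mirrors the trade-off in the conversation: the pipeline can run as fast as it likes, but the final publish step is gated on accuracy.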
94 00:12:15.330 ⇒ 00:12:20.650 Greg Stoutenburg: Handle that, so just, like, in terms of managing a project and talking to stakeholders… oh, still there?
95 00:12:20.650 ⇒ 00:12:29.000 Fanu Sisay: Yeah, I see you, the image is… er, the image is kind of blurry, but I hear you. I hear you perfectly fine.
96 00:12:29.000 ⇒ 00:12:29.780 Greg Stoutenburg: Hey.
97 00:12:30.150 ⇒ 00:12:31.759 Fanu Sisay: Yep, hello? Do you hear me?
98 00:12:33.220 ⇒ 00:12:34.420 Greg Stoutenburg: Fanu, still there? Oh, there we go.
99 00:12:34.420 ⇒ 00:12:34.800 Fanu Sisay: The airman?
100 00:12:34.800 ⇒ 00:12:40.260 Greg Stoutenburg: Alright, cool. Yeah, we froze for a little bit, so I was just, like, waiting for it to unfreeze.
101 00:12:40.610 ⇒ 00:12:57.979 Greg Stoutenburg: As far as, so kind of, like, continuing on the same question, as far as, like, managing a stakeholder, you know, supposing that the stakeholder, you know, they want it done now, they’re frustrated maybe that a project isn’t going according to the initial timeline, how do you, like.
102 00:12:57.990 ⇒ 00:13:02.019 Greg Stoutenburg: how do you work with that person and, like, manage expectations if something isn’t
103 00:13:02.220 ⇒ 00:13:08.990 Greg Stoutenburg: On track to hit some deadline that you’d intended, because you’re working out, like, data discrepancies.
104 00:13:09.400 ⇒ 00:13:26.700 Fanu Sisay: Yeah, I’m a constant stream of communication person, so I… you know, with the clients I have currently, I have their cell phone numbers, we talk outside of, you know, Teams or whatever Slack platform we’re using. So I… I am always keeping them in line with, hey.
105 00:13:26.700 ⇒ 00:13:34.940 Fanu Sisay: we’re doing well on our timeline. And that’s every stand-up, every, you know, over-communicating our status point.
106 00:13:34.940 ⇒ 00:13:50.890 Fanu Sisay: And, you know, when that status point needs to… or, you know, that final point needs to be pushed, it’s definitely something I bring up. But I also, you know, am confident in my ability to work around, things that come up, and, you know, make sure that things are developing and…
107 00:13:51.380 ⇒ 00:14:00.279 Fanu Sisay: will be there at the due date. But yeah, constant stream of communication and, you know, confidence in my ability to work around any issues that come up.
108 00:14:00.790 ⇒ 00:14:09.119 Greg Stoutenburg: Yeah. Yeah, yeah. Yeah, very good. I think probably you can’t really beat that, right? I mean, sometimes things take… they take as long as they take, right? Yeah.
109 00:14:09.120 ⇒ 00:14:10.950 Fanu Sisay: Yeah, 100%. But, you know.
110 00:14:11.100 ⇒ 00:14:19.709 Fanu Sisay: I definitely do understand that, like, sometimes you gotta be, you know, hands on the ground, or boots on the ground, just, you know, working around…
111 00:14:20.140 ⇒ 00:14:27.619 Fanu Sisay: things that come up. So, like, oh, we need this tomorrow. I… I find my best way to get this tomorrow, because that is what the client is looking for.
112 00:14:27.620 ⇒ 00:14:36.609 Greg Stoutenburg: Yeah, yeah, yeah, yeah, very good. Okay, great. Describe to me how you explain technical findings to a non-technical executive.
113 00:14:37.350 ⇒ 00:14:58.230 Fanu Sisay: Yeah, I think this is something I do quite often in, you know, my current job. We are a sports marketing company, and me and my boss are the only people on the research and insights team. Everyone else is coming from a marketing background, sports business background, and really aren’t comfortable with numbers. So, I think comparisons is my number one thing to do.
114 00:14:58.360 ⇒ 00:14:58.960 Greg Stoutenburg: Yeah.
115 00:14:58.960 ⇒ 00:15:07.119 Fanu Sisay: Knowing that they have some understanding of, you know, a comparable metric, so when talking about,
116 00:15:07.330 ⇒ 00:15:24.100 Fanu Sisay: I don’t know, a rate, a specific rate, something divided by another, compared to ROI, because everyone talks about ROI. No matter how high up you go, they’re concerned about ROI, so usually comparing something to ROI. You know, I was also a tutor in college, so…
117 00:15:24.100 ⇒ 00:15:24.470 Greg Stoutenburg: Cool.
118 00:15:24.470 ⇒ 00:15:25.089 Fanu Sisay: you know.
119 00:15:25.430 ⇒ 00:15:37.219 Fanu Sisay: Explaining complex ideas to people who, you know, have some understanding of something else, and attaching it to that, what they do understand, and pulling them into this
120 00:15:37.380 ⇒ 00:15:41.949 Fanu Sisay: you know, question mark area of, they don’t understand this, how can I get them there?
121 00:15:42.240 ⇒ 00:15:46.630 Greg Stoutenburg: Yeah, okay. Can you give a specific example of a time when you’ve done that?
122 00:15:46.630 ⇒ 00:15:48.380 Fanu Sisay: Yeah.
123 00:15:49.610 ⇒ 00:15:59.039 Greg Stoutenburg: My background’s in education as well. I was a philosophy professor, until 2021, and then I got a job at Stack Overflow, and, just kind of been in tech since, yeah.
124 00:15:59.040 ⇒ 00:16:05.549 Fanu Sisay: Yeah, well, I have questions around that, but let me, let me think of the answer of an example.
125 00:16:05.990 ⇒ 00:16:07.040 Fanu Sisay: Mmm…
126 00:16:08.200 ⇒ 00:16:20.550 Fanu Sisay: Yeah, I feel like our ROI… so, this morning, I was, presenting a Tableau dashboard to a team of new members on a client, and I kind of had to explain to them, like, hey,
127 00:16:21.850 ⇒ 00:16:29.739 Fanu Sisay: you know, Breaking it down to simple chunks that… so, they use a net return as their overall,
128 00:16:29.770 ⇒ 00:16:31.850 Fanu Sisay: You know, return from a partnership.
129 00:16:31.850 ⇒ 00:16:51.729 Fanu Sisay: And showing them each of the chunks that go into that net return. So, starting from the very basics, and all of those basic things, they understand. So, their POS, point of sale, displays, and how much value those are bringing into the company. All the media value that they post on social and broadcast, showing them those specific chunks, and
130 00:16:51.730 ⇒ 00:16:55.689 Fanu Sisay: It’s easy for them to comprehend that, oh, these are all the ways we’re making money.
131 00:16:55.690 ⇒ 00:17:15.879 Fanu Sisay: And then, going over to their cost, showing that they pay X amount for an agency, this amount towards this specific sponsorship, and then this amount towards activating those, point-of-sales displays. So, like, showing them the specific chunks that they understand, and then showing them that final ROI number, it all makes sense to them.
132 00:17:15.940 ⇒ 00:17:18.950 Fanu Sisay: Then, but yeah, I… I think…
133 00:17:19.369 ⇒ 00:17:36.359 Fanu Sisay: the, you know, direct point to what I was saying earlier is what they understand is the stuff that they do. So working on social, working on broadcast, different assets, and then showing them the numbers that I’m pulling from, that ROI number, and yeah, presenting it to them.
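The chunked walk-through Fanu describes, building up from value and cost components the audience already understands to the final net-return and ROI figures, can be shown with a toy calculation. Every number below is invented purely for illustration; none comes from an actual client.

```python
# Hypothetical figures mirroring the chunk-by-chunk ROI explanation.

# Value chunks the stakeholders recognize from their own work:
pos_display_value = 400_000   # point-of-sale displays
media_value = 750_000         # social + broadcast media value

# Cost chunks:
agency_fee = 150_000
sponsorship_fee = 600_000
activation_cost = 100_000     # activating the POS displays

total_value = pos_display_value + media_value
total_cost = agency_fee + sponsorship_fee + activation_cost

net_return = total_value - total_cost   # the headline number
roi = net_return / total_cost           # net return per dollar spent
```

Presenting each chunk first means the final `net_return` and `roi` lines are just arithmetic over quantities the audience already trusts.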
134 00:17:36.910 ⇒ 00:17:41.210 Greg Stoutenburg: Yeah. I like that, and I see the…
135 00:17:41.530 ⇒ 00:17:47.339 Fanu Sisay: I… Yeah, I don’t want to cut it off, so I’ll save this question for the end.
136 00:17:47.340 ⇒ 00:18:05.039 Greg Stoutenburg: Oh, okay, sure, yeah. All I was gonna say is, yeah, I mean, I think that that… there’s something really good, I think, about the teacher’s instinct in working with clients and explaining things in ways that other people understand. And, you know, it’s the… you know, sometimes they say, oh, this or that skill can’t be taught. I think that’s usually not true, but, like.
137 00:18:05.830 ⇒ 00:18:24.810 Greg Stoutenburg: this is as close as one of those as you can get, right? Where it’s, like, the non-technical stakeholder, they need help, they need help on something technical, they don’t really understand it, so the closest you can find are those, like, little connection points that you’re talking about, and I think that that’s, you know, sort of having a knack for that is really critical.
138 00:18:25.330 ⇒ 00:18:30.940 Greg Stoutenburg: Yeah, great. So… Okay.
139 00:18:37.540 ⇒ 00:18:43.799 Greg Stoutenburg: I think I’ll… yeah, okay. When would you choose spreadsheets over using a BI tool?
140 00:18:45.040 ⇒ 00:18:45.680 Fanu Sisay: Hmm.
141 00:18:49.120 ⇒ 00:18:59.039 Fanu Sisay: I think, personally, I use spreadsheets for when I know it’s not going into… A, you know.
142 00:19:00.800 ⇒ 00:19:04.529 Fanu Sisay: a work stream that needs verification, if that makes sense? Like…
143 00:19:04.530 ⇒ 00:19:04.950 Greg Stoutenburg: Yeah.
144 00:19:06.780 ⇒ 00:19:15.310 Fanu Sisay: personal use. Obviously, I’m typing into spreadsheets, just, like, doing specific designs. But, you know, things that I know
145 00:19:16.090 ⇒ 00:19:29.240 Fanu Sisay: With a BI tool, I have the safety in my head to know that, okay, this is coming from a data source that is verified, has different checks, a spreadsheet just seems too casual, and not,
146 00:19:29.860 ⇒ 00:19:42.400 Fanu Sisay: you know, no real verification process of how this was inserted into the spreadsheet. Obviously, you know, there’s all new kind of inputs to Excel and what you can pull from, but,
147 00:19:43.020 ⇒ 00:19:43.870 Fanu Sisay: you know.
148 00:19:44.310 ⇒ 00:19:52.279 Fanu Sisay: a spreadsheet just doesn’t feel like the same source of truth that a BI tool pulling from a data source is.
149 00:19:52.790 ⇒ 00:19:54.770 Fanu Sisay: So yeah, I think that’s what I would say.
150 00:19:55.220 ⇒ 00:19:58.120 Greg Stoutenburg: Are there any advantages, ever, to using one rather than another?
151 00:19:58.550 ⇒ 00:20:04.580 Fanu Sisay: Well, yeah, the BI tool comes… like, and when I think of a BI tool, I consider…
152 00:20:04.700 ⇒ 00:20:05.540 Fanu Sisay: you know.
153 00:20:05.650 ⇒ 00:20:19.499 Fanu Sisay: pulling from a vendor, that has data on Snowflake. It has that verification that this vendor is verifying all the data that’s coming in through it. So, I think the benefit
154 00:20:19.500 ⇒ 00:20:28.729 Fanu Sisay: there is obvious. In a spreadsheet, it’s definitely much more customizable. You have the ability to, play with it, and… oh…
155 00:20:28.730 ⇒ 00:20:37.419 Fanu Sisay: I have to make a quick change. That’s so much simpler in a spreadsheet than maybe a non-editable
156 00:20:37.420 ⇒ 00:20:37.859 Greg Stoutenburg: Dude.
157 00:20:37.860 ⇒ 00:20:42.629 Fanu Sisay: source that you would use a BI tool for. But yeah, I…
158 00:20:42.940 ⇒ 00:20:51.979 Fanu Sisay: I’m not anti-spreadsheet, especially working at, like, an antiquated company that works with clients who just, you know.
159 00:20:52.210 ⇒ 00:20:55.780 Fanu Sisay: Much more old style. It’s definitely…
160 00:20:56.560 ⇒ 00:21:01.679 Fanu Sisay: It’s definitely valuable in some senses, but, you know, when you have the money to…
161 00:21:01.820 ⇒ 00:21:05.270 Fanu Sisay: you know, book the BI tool, I think that’s the way to go.
162 00:21:05.530 ⇒ 00:21:13.849 Greg Stoutenburg: Yeah, yeah, yeah. Yeah, okay, cool, alright, I, I mean, yeah, I…
163 00:21:13.880 ⇒ 00:21:28.809 Greg Stoutenburg: I think this is really helpful, understanding where you’re coming from in terms of, like, you know, data engineering, business intelligence management, working with clients and stakeholders, and understanding what business needs are, and translating those into technical requirements.
164 00:21:28.810 ⇒ 00:21:45.100 Greg Stoutenburg: So yeah, that’s helpful for me. In the last little bit here, do you have any questions that you wanted to ask me? Now, to clarify, I’m not, like, I’m not a hirer or anything like that. Like I said, I’m just sort of, like, up here doing this kind of work, so, but if there’s anything I can answer or help with, I’m happy to.
165 00:21:45.100 ⇒ 00:22:00.160 Fanu Sisay: Yeah, sweet. Well, yeah, you mentioned earlier, that you had started in philosophy. What was it like switching over to tech? And was it a… because I assume there are philosophical roles within the tech space. I mean, I…
166 00:22:00.410 ⇒ 00:22:05.349 Fanu Sisay: just to speak a little bit about my time in math, I loved, logic, and…
167 00:22:05.350 ⇒ 00:22:05.850 Greg Stoutenburg: Yeah.
168 00:22:05.850 ⇒ 00:22:09.300 Fanu Sisay: math, any kind of truth-based, that, that was…
169 00:22:09.480 ⇒ 00:22:16.679 Fanu Sisay: honestly, my favorite classes, but, you know, I knew I had to go into the workspace, so I, you know, didn’t believe that it was possible to…
170 00:22:16.810 ⇒ 00:22:22.370 Fanu Sisay: you know, go applied after, working in such a logical space, so I was curious about that.
171 00:22:22.370 ⇒ 00:22:27.670 Greg Stoutenburg: Yeah, yeah. Yeah, I mean, you know, I mean, I guess what I would say is, like, I,
172 00:22:28.280 ⇒ 00:22:39.030 Greg Stoutenburg: if it was possible to earn a tech salary as a philosophy professor, I’d still be doing that. You know, I just… I love… I love ideas, I love the discussion.
173 00:22:39.030 ⇒ 00:22:54.389 Greg Stoutenburg: I love the, you know, it’s, you know, abstract or whatever, right? But, like, I do love the abstract, and yeah, I, I’m happy to, like, grind away at a problem and try to think of creative solutions for a long time.
174 00:22:54.390 ⇒ 00:23:02.289 Greg Stoutenburg: And it’s not really… it’s… that sort of thing is just not really possible in most of industry. So,
175 00:23:02.410 ⇒ 00:23:21.659 Greg Stoutenburg: Yeah, I mean, I think the transition’s been fine. I think the single biggest thing that sort of, like, hit me when I made the change was just that the pace of work is so much faster, coming out of higher education. And, I mean, I’ve still got friends in higher education, and, like.
176 00:23:21.840 ⇒ 00:23:37.120 Greg Stoutenburg: I’m able to just see from the outside, like, man, these people don’t even know how good they have it, in terms of, like, when something is due. But yeah, I mean, like, if part of the question is, like, how does this, like, does this apply? Yeah, I really think so, like, I think…
177 00:23:37.120 ⇒ 00:23:37.780 Fanu Sisay: 100%.
178 00:23:37.780 ⇒ 00:23:48.039 Greg Stoutenburg: I think as far as, like, what you mentioned before, trying to communicate with non-technical executives, I think having a background that’s philosophical has been really helpful, because
179 00:23:48.040 ⇒ 00:24:03.580 Greg Stoutenburg: I can quickly understand a problem that’s described and figure out what the range of solutions might be, and communicate about them in sort of a variety of ways. And I think that that’s really… I think that’s helpful for problem solving and critical thinking and communication.
180 00:24:03.580 ⇒ 00:24:04.409 Fanu Sisay: Yeah, I don’t know.
181 00:24:04.410 ⇒ 00:24:05.200 Greg Stoutenburg: So I do…
182 00:24:05.200 ⇒ 00:24:13.340 Fanu Sisay: There’s also, like, so many applications from logic and, like, something being true. It’s almost like coding
183 00:24:13.500 ⇒ 00:24:15.050 Fanu Sisay: before it was coding.
184 00:24:15.050 ⇒ 00:24:16.030 Greg Stoutenburg: Yeah.
185 00:24:16.030 ⇒ 00:24:23.850 Fanu Sisay: specific where clauses, this being true, so this may not be true. All kinds of things like that. Yeah. Yeah. Okay.
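The parallel drawn here, a WHERE clause as a propositional formula evaluated row by row, can be made concrete. This Python sketch stands in for SQL; the table, columns, and values are all invented for illustration.

```python
# Illustrative: a SQL WHERE clause is a propositional formula applied
# to each row. Rows and column names below are hypothetical.
rows = [
    {"league": "NBA", "year": 2020, "mentions_covid": True},
    {"league": "NFL", "year": 2019, "mentions_covid": False},
    {"league": "NBA", "year": 2018, "mentions_covid": False},
]

# Equivalent of:
#   WHERE league = 'NBA' AND (year >= 2020 OR mentions_covid)
def where(r):
    return r["league"] == "NBA" and (r["year"] >= 2020 or r["mentions_covid"])

matches = [r for r in rows if where(r)]
```

Each row either satisfies the formula or it doesn't, which is exactly the "this being true, so this may not be true" structure of propositional logic.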
186 00:24:23.850 ⇒ 00:24:42.509 Greg Stoutenburg: Yeah, no, totally, and I still, I still teach logic online here and there. So, just to kind of, like, keep a foot in, you know? I mean, not nearly as advanced, surely, as what you did, but we do, you know, we do, we do categorical logic with Venn diagrams, we do propositional logic with the whole
187 00:24:42.510 ⇒ 00:24:47.609 Greg Stoutenburg: array of, you know, quantifiers and rules of translation
188 00:24:47.640 ⇒ 00:24:50.209 Greg Stoutenburg: and inference. So, yeah.
189 00:24:50.290 ⇒ 00:24:52.720 Fanu Sisay: But those are all, like, bringing back to…
190 00:24:52.720 ⇒ 00:24:54.499 Greg Stoutenburg: It’s coming back, yep.
191 00:24:54.500 ⇒ 00:24:56.850 Fanu Sisay: Should I get back into school?
192 00:24:57.840 ⇒ 00:25:00.260 Fanu Sisay: Okay, yeah, I had a list of questions.
193 00:25:00.260 ⇒ 00:25:01.180 Greg Stoutenburg: Yeah, sure.
194 00:25:01.500 ⇒ 00:25:05.280 Fanu Sisay: Okay, so, obviously,
195 00:25:05.840 ⇒ 00:25:14.120 Fanu Sisay: I know that AI’s being used at Brainforge, in order to optimize. I was wondering, are there any places where you guys are…
196 00:25:14.730 ⇒ 00:25:21.830 Fanu Sisay: And maybe this speaks to your first question about performance accuracy and things of that nature. Are there any…
197 00:25:22.500 ⇒ 00:25:30.580 Fanu Sisay: areas where you guys are hesitant to use AI, internally, externally, where you feel like, oh, you know.
198 00:25:31.380 ⇒ 00:25:35.650 Fanu Sisay: Performance isn’t running up to, you know, standard.
199 00:25:36.030 ⇒ 00:25:39.929 Greg Stoutenburg: Yeah, I mean, no, that’s a good question. So this is definitely a very, like.
200 00:25:40.030 ⇒ 00:25:51.730 Greg Stoutenburg: pro-AI organization. Like, they’ll say, use AI for kind of as much as you can. Somewhere that it’s possible, though, to get jammed up is that…
201 00:25:51.730 ⇒ 00:26:04.740 Greg Stoutenburg: If you don’t already have… like, I’ll just give an example. If you don’t already understand what an engagement with a client is supposed to do, and who the people are on that project, and what their concerns are, and things like that,
202 00:26:05.030 ⇒ 00:26:24.660 Greg Stoutenburg: you are taking a big risk if you’re gonna rely on AI to, like, help you move forward, because it will produce things for you that you don’t know if they’re true. So, and now this is, in a way, this is a very common thing, right, with AI. This is like, you know, students cheating on papers and things like that. Just getting AI to write it, but you don’t even know what it’s… if you don’t know… if you couldn’t have done it yourself.
203 00:26:24.660 ⇒ 00:26:28.630 Greg Stoutenburg: This is my opinion. If you could not have done it yourself based on your own
204 00:26:28.630 ⇒ 00:26:46.679 Greg Stoutenburg: work, and, like, without relying on AI, then if you rely on AI to do it, you are taking a risk. And so, something we have to be careful about is things like, you know, communications, or just, going the wrong direction on some… on some work, because we’re, you know, because we’re relying on the tool. So…
205 00:26:46.680 ⇒ 00:26:50.540 Fanu Sisay: I think that’s the same manner of, like, Googling when I was.
206 00:26:50.540 ⇒ 00:26:50.940 Greg Stoutenburg: Yeah.
207 00:26:50.940 ⇒ 00:26:52.180 Fanu Sisay: College, like.
208 00:26:52.180 ⇒ 00:26:52.580 Greg Stoutenburg: Yeah.
209 00:26:52.580 ⇒ 00:26:56.510 Fanu Sisay: high school. You know, there were the people who
210 00:26:56.790 ⇒ 00:27:11.280 Fanu Sisay: you know, like, oh, I can just Google everything for this homework assignment, and not really aware of what they were Googling, and just taking answers that, like, seemed that they were true, or correct. And yeah, I think that’s where, like, your prompts really matter.
211 00:27:11.530 ⇒ 00:27:16.339 Fanu Sisay: you know, how detailed you are in what you’re requesting of AI, the level of… Yeah.
212 00:27:16.460 ⇒ 00:27:23.640 Fanu Sisay: Concern you have over, like, edge cases, specific, gaps, you know, that you’re aware of.
213 00:27:23.640 ⇒ 00:27:24.050 Greg Stoutenburg: Yeah.
214 00:27:24.050 ⇒ 00:27:29.020 Fanu Sisay: Yeah, great to hear. Yeah, yeah.
215 00:27:29.070 ⇒ 00:27:48.300 Fanu Sisay: Okay, and yeah, this, I guess, is a little bit more about just, like, company culture. And, you know, it’s great that you’re also on the East Coast, because that was one of my, like, not concerns, but I know that a lot of the companies on the West Coast, what is that like for you, being in a different time zone? Do you, like, have any concerns about, like.
216 00:27:48.810 ⇒ 00:27:55.509 Fanu Sisay: just the gap, in time. I know there’s also some people, internationally, yeah. I was just curious about that.
217 00:27:55.630 ⇒ 00:28:13.100 Greg Stoutenburg: Yeah, I mean, we’re… we’re sort of rooted in Central Time to Eastern Time, but yeah, I’ve… I mean, I work with people on the West Coast, and I work with people in, like, Philippines, and, you know, and Malta, and India, and…
218 00:28:13.230 ⇒ 00:28:24.739 Greg Stoutenburg: Pakistan. Like, so we really are global. I’ve not found time zones to be an issue, but I think, and I mean, I don’t know if maybe you’ve encountered this in previous roles before, but
219 00:28:24.740 ⇒ 00:28:41.739 Greg Stoutenburg: Something that I have noticed being a remote worker in tech is that if you’ve got a global company, it’s kind of on you to mind your calendar, otherwise you can end up in a position where you’re kind of always on, and you have to kind of guard against that. But it’s not… it’s really not been much of an issue.
220 00:28:41.740 ⇒ 00:28:43.850 Fanu Sisay: Okay, yeah, that’s great to hear.
221 00:28:43.850 ⇒ 00:28:55.869 Greg Stoutenburg: I don’t know what my colleagues would say who are, you know, in Philippines, and they log on at probably, like, 8 PM, and then work until 4, but yeah.
222 00:28:55.870 ⇒ 00:28:57.989 Fanu Sisay: Yeah, to each their own.
223 00:28:57.990 ⇒ 00:28:58.959 Greg Stoutenburg: Yeah, yeah, yeah.
224 00:28:58.960 ⇒ 00:29:08.080 Fanu Sisay: I’m definitely a flexible guy, especially… I’m in the office now, but in the times where I have worked remote, I’m not, you know…
225 00:29:08.400 ⇒ 00:29:12.830 Fanu Sisay: I’m not opposed to… You know, responding at any time.
226 00:29:12.830 ⇒ 00:29:13.180 Greg Stoutenburg: Yeah.
227 00:29:13.180 ⇒ 00:29:26.099 Fanu Sisay: Definitely, like, getting work done and stuff, I try and keep in reasonable hours, but yep. Yeah, great to hear. Okay, I know it’s time, but I just also wanted to ask, where is your hoodie from?
228 00:29:26.100 ⇒ 00:29:30.449 Greg Stoutenburg: Oh yeah, this is Round Top Mountain Resort. This is a… this is a ski hill.
229 00:29:30.450 ⇒ 00:29:34.209 Fanu Sisay: Round time? Okay, where’s that?
230 00:29:34.790 ⇒ 00:29:38.430 Fanu Sisay: Sweet. I’ve went skiing…
231 00:29:38.550 ⇒ 00:29:44.650 Fanu Sisay: at the intersection of, like, West Virginia, Maryland, and Pennsylvania. I don’t know what the name of the mountain is.
232 00:29:44.650 ⇒ 00:29:46.620 Greg Stoutenburg: Whitetail, maybe? Does that sound right?
233 00:29:47.030 ⇒ 00:29:48.770 Fanu Sisay: It rings a bell? I don’t know.
234 00:29:48.770 ⇒ 00:29:52.360 Greg Stoutenburg: I’m from Michigan originally, and I’ve only done a little bit of skiing here.
235 00:29:52.360 ⇒ 00:29:53.020 Fanu Sisay: Yeah.
236 00:29:53.370 ⇒ 00:29:53.980 Fanu Sisay: Hi.
237 00:29:53.980 ⇒ 00:30:06.709 Greg Stoutenburg: I’m sorry, we actually just saw we hit 12:30, and I do have another call, so I better hop. But great to talk with you. I’ll just, I’ll just sort of share my feedback with the team, and then whoever’s been in touch will be in touch. Awesome. And, good luck, and thanks for your time.
238 00:30:06.710 ⇒ 00:30:07.940 Fanu Sisay: Thank you, I appreciate it. You have a good day.
239 00:30:07.940 ⇒ 00:30:09.489 Greg Stoutenburg: Alright, see you too. Bye.