Meeting Title: Brainforge AI Engineer Interview Date: 2026-05-04 Meeting participants: Brylle Girang, Nathan Zeke Gandawa, Samuel Roberts
WEBVTT
1 00:00:22.050 ⇒ 00:00:22.910 Brylle Girang: Hello.
2 00:00:24.270 ⇒ 00:00:26.080 Nathan Zeke Gandawa: Hello, hi!
3 00:00:27.700 ⇒ 00:00:28.400 Brylle Girang: Hi.
4 00:00:32.240 ⇒ 00:00:33.440 Nathan Zeke Gandawa: Hello, can you hear me?
5 00:00:34.130 ⇒ 00:00:35.429 Brylle Girang: Yep, I can hear you.
6 00:00:36.340 ⇒ 00:00:37.729 Nathan Zeke Gandawa: Oh, yeah, okay, I can hear you.
7 00:00:37.730 ⇒ 00:00:38.350 Samuel Roberts: Hello?
8 00:00:38.940 ⇒ 00:00:39.900 Brylle Girang: Hey, Sam!
9 00:00:40.890 ⇒ 00:00:41.500 Nathan Zeke Gandawa: M?
10 00:00:51.420 ⇒ 00:00:53.639 Samuel Roberts: Sorry, I’m gonna fix my video.
11 00:00:54.270 ⇒ 00:00:55.070 Samuel Roberts: Messed up.
12 00:00:56.990 ⇒ 00:00:58.030 Brylle Girang: Okay.
13 00:01:03.140 ⇒ 00:01:04.300 Samuel Roberts: Okay, we’re good.
14 00:01:04.569 ⇒ 00:01:05.450 Samuel Roberts: Hello?
15 00:01:06.740 ⇒ 00:01:07.889 Brylle Girang: Can you hear us?
16 00:01:08.360 ⇒ 00:01:09.619 Samuel Roberts: I can now, yeah.
17 00:01:10.260 ⇒ 00:01:10.590 Brylle Girang: Great.
18 00:01:10.590 ⇒ 00:01:16.400 Samuel Roberts: Thank goodness. Alright different headphones every day, and it’s a… it’s a pain. Alright, how are you guys?
19 00:01:18.560 ⇒ 00:01:19.430 Nathan Zeke Gandawa: I’m well.
20 00:01:20.210 ⇒ 00:01:21.140 Nathan Zeke Gandawa: Great. How are you?
21 00:01:21.140 ⇒ 00:01:28.460 Samuel Roberts: Great to meet you. I don’t know if you guys were chatting before I jumped in and didn’t have any audio, so I wasn’t sure if I missed anything, but,
22 00:01:29.680 ⇒ 00:01:37.560 Samuel Roberts: Okay, so, the way… Oh, is there lag, too? I don’t know. Okay. So,
23 00:01:37.560 ⇒ 00:01:57.569 Samuel Roberts: Welcome. Quick intro. The way I tend to do this is I give myself a brief intro, I ask you to give a brief intro, and then, B, I’ll ask you to give a brief intro, too. Then, I’ve got some questions. I like to get about halfway, and then switch and let you ask some questions so we don’t get to the end and not have any time for that.
24 00:01:57.570 ⇒ 00:01:59.510 Samuel Roberts: And then just kind of chat from there.
25 00:01:59.570 ⇒ 00:02:06.940 Samuel Roberts: So, brief background on me, I’m Sam Roberts, AI engineer here at Brainforge.
26 00:02:07.320 ⇒ 00:02:17.170 Samuel Roberts: I’ve been here since July. My background is a lot of startups, web tech. I studied mechanical engineering in the past, and got pulled into startups right after graduation, and been doing
27 00:02:17.340 ⇒ 00:02:25.490 Samuel Roberts: A lot of developments since then, obviously a lot of AI in the past few years. Yeah, so, Nathan, if you could give yourself an intro?
28 00:02:30.070 ⇒ 00:02:30.940 Nathan Zeke Gandawa: Me?
29 00:02:31.530 ⇒ 00:02:32.330 Samuel Roberts: Yeah, sure.
30 00:02:33.560 ⇒ 00:02:34.450 Nathan Zeke Gandawa: Oh, okay.
31 00:02:35.210 ⇒ 00:02:38.329 Nathan Zeke Gandawa: Yeah, so my name is Nathan Z. Gandawa.
32 00:02:38.620 ⇒ 00:02:43.189 Nathan Zeke Gandawa: I’m here in Austin, Texas. I’m a full-stack AI engineer.
33 00:02:44.050 ⇒ 00:02:50.720 Nathan Zeke Gandawa: Most of my work has been really just about taking data and, ingesting it.
34 00:02:51.510 ⇒ 00:02:58.110 Nathan Zeke Gandawa: Reorganizing it, or organizing it, for different stakeholders.
35 00:02:58.580 ⇒ 00:03:08.360 Nathan Zeke Gandawa: My experience has also spanned several industries, including healthcare, enterprise consultancy,
36 00:03:08.770 ⇒ 00:03:17.509 Nathan Zeke Gandawa: marketing and sales technologies, different, different areas. I’m excited about, automation, AI,
37 00:03:18.130 ⇒ 00:03:24.870 Nathan Zeke Gandawa: Yeah, so that’s, that’s really, that’s really, mostly about me.
38 00:03:25.630 ⇒ 00:03:29.230 Samuel Roberts: Great, thank you. Yeah, Bea, if you want to introduce yourself real quick?
39 00:03:29.500 ⇒ 00:03:33.930 Brylle Girang: Yeah, what’s your preferred nickname? Is that Nathan, or is that Zeke?
40 00:03:36.060 ⇒ 00:03:41.640 Nathan Zeke Gandawa: My, my, my name? Okay, you can call me, you can call me, Nathan. Nathan’s fine.
41 00:03:41.640 ⇒ 00:03:54.150 Brylle Girang: Hi, Nathan. You can call me B. I’m leading the learning and development team here at Brainforge. For this round, I’m going to be mostly sitting in, just going to watch how this unfolds.
42 00:03:54.290 ⇒ 00:04:02.769 Brylle Girang: But I have been with Brainforge for a total of 4 months now, and I’m mostly leading the team that is
43 00:04:03.170 ⇒ 00:04:08.369 Brylle Girang: Trying to make sure that everyone in Brainforge performs at the same baseline and above.
44 00:04:09.810 ⇒ 00:04:12.259 Brylle Girang: Okay, thank you. I’ll give it to you, Sam.
45 00:04:12.880 ⇒ 00:04:13.510 Samuel Roberts: Great.
46 00:04:13.870 ⇒ 00:04:19.550 Samuel Roberts: Excuse me. Thank you both. Yeah, so I think we’ll just jump in, some, some questions here.
47 00:04:20.630 ⇒ 00:04:29.800 Samuel Roberts: So what I usually like to start with is, can you talk about an LLM-based feature that you’ve shipped to production, and the problem that it solved?
48 00:04:32.240 ⇒ 00:04:33.220 Nathan Zeke Gandawa: Okay.
49 00:04:33.440 ⇒ 00:04:38.970 Nathan Zeke Gandawa: That’s, that’s a good one. As an AI engineer, there’s, like, a thousand projects that come forward.
50 00:04:38.970 ⇒ 00:04:41.090 Samuel Roberts: Exactly, yeah, yeah.
51 00:04:41.540 ⇒ 00:04:46.350 Nathan Zeke Gandawa: Okay, so… I’ll talk about the one that I’m most proud of.
52 00:04:46.880 ⇒ 00:04:54.140 Nathan Zeke Gandawa: And that is, DocTalk. So the cop… the core problem there was that,
53 00:04:55.190 ⇒ 00:05:03.830 Nathan Zeke Gandawa: there was a cohort of students that went from Notre Dame to, Mexico, so they had a problem with communicating with locals there.
54 00:05:04.140 ⇒ 00:05:09.400 Nathan Zeke Gandawa: Because of the language barrier. So, they needed a translation tool.
55 00:05:09.820 ⇒ 00:05:12.209 Nathan Zeke Gandawa: So that is when I started working on
56 00:05:12.360 ⇒ 00:05:17.279 Nathan Zeke Gandawa: the first version of DocTalk, which was basically a translation tool
57 00:05:17.450 ⇒ 00:05:22.069 Nathan Zeke Gandawa: We, did, basic translation between,
58 00:05:22.100 ⇒ 00:05:37.730 Nathan Zeke Gandawa: between, doctors and patients, clinicians and patients. And then, we started to get a little more serious. The models we were using, I fine-tuned BioBERT, which is, which is a BERT model.
59 00:05:38.120 ⇒ 00:05:41.940 Nathan Zeke Gandawa: To be able to do proper translation for, like.
60 00:05:42.210 ⇒ 00:05:58.610 Nathan Zeke Gandawa: medical terminology. I took it further, yeah, I realized, wait, we’re collecting transcripts from both languages, why not just create notes, right? So, for those guys, we converted the transcripts
61 00:05:58.610 ⇒ 00:06:08.339 Nathan Zeke Gandawa: into doctor notes, perfect doctor notes, and then we thought, wait, why not just create, patient notes, right? So, after your consultation, you could get a link with
62 00:06:08.860 ⇒ 00:06:13.420 Nathan Zeke Gandawa: A summary of your consultation and everything the doctor said and advised you to do.
63 00:06:14.290 ⇒ 00:06:29.939 Nathan Zeke Gandawa: voila, it was done. So, yeah, that’s a… that’s a project, I’ll say, that was, LLM heavy. I built it, I ended up making it HIPAA compliant, and made it, like, a proper healthcare conversational system.
64 00:06:30.050 ⇒ 00:06:33.720 Nathan Zeke Gandawa: And, for that, I used OpenAI’s,
65 00:06:33.880 ⇒ 00:06:39.740 Nathan Zeke Gandawa: I used, OpenAI… I used the OpenAI, model at that time.
66 00:06:41.100 ⇒ 00:06:46.780 Nathan Zeke Gandawa: But, I started with OpenAI, and then I… I transitioned to the BioBERT model.
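[Editor's note: the transcript-to-notes flow described above — a bilingual consultation transcript turned into structured turns, then into a patient-facing summary via an LLM prompt — could be sketched roughly as below. This is an illustrative reconstruction, not DocTalk's actual code; the function names and prompt wording are assumptions.]

```python
# Hypothetical sketch of a transcript-to-patient-notes step. Names and
# prompt text are illustrative, not taken from the real DocTalk system.

def parse_transcript(raw: str) -> list[tuple[str, str]]:
    """Split 'Speaker: text' lines into (speaker, text) turns."""
    turns = []
    for line in raw.strip().splitlines():
        speaker, _, text = line.partition(":")
        if text:
            turns.append((speaker.strip(), text.strip()))
    return turns

def build_patient_summary_prompt(turns: list[tuple[str, str]]) -> str:
    """Assemble the prompt an LLM would receive for the patient note."""
    dialogue = "\n".join(f"{s}: {t}" for s, t in turns)
    return (
        "Summarize this consultation for the patient in plain language.\n"
        "Include every instruction the doctor gave.\n\n" + dialogue
    )

raw = "Doctor: Take ibuprofen twice daily.\nPatient: Understood."
turns = parse_transcript(raw)
prompt = build_patient_summary_prompt(turns)
```

In a production, HIPAA-compliant setting the prompt would go to a vetted model endpoint with PHI safeguards; the parsing and prompt-assembly shape stays the same.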
67 00:06:47.700 ⇒ 00:06:48.600 Samuel Roberts: Cool, cool.
68 00:06:48.830 ⇒ 00:06:54.510 Samuel Roberts: So, full stack, where do you tend to feel,
69 00:06:54.800 ⇒ 00:07:02.370 Samuel Roberts: most comfortable building in versus, like, experimenting, I guess? Like, are you happy all the way through, or are you, you know… or is there some area that you feel
70 00:07:03.020 ⇒ 00:07:04.729 Samuel Roberts: That you’ve spent the most time in.
71 00:07:08.830 ⇒ 00:07:12.540 Nathan Zeke Gandawa: I am… that’s a… that’s a very unfair question.
72 00:07:13.570 ⇒ 00:07:16.359 Samuel Roberts: I feel the same way, because I’m a full-stack guy, and I like to think I’m, you know.
73 00:07:16.360 ⇒ 00:07:16.770 Nathan Zeke Gandawa: Yeah.
74 00:07:16.770 ⇒ 00:07:17.720 Samuel Roberts: Yeah, but I… yeah.
75 00:07:17.720 ⇒ 00:07:23.660 Nathan Zeke Gandawa: Yeah, all the way, yeah. I consider myself a creative engineer, so…
76 00:07:23.660 ⇒ 00:07:24.280 Samuel Roberts: Okay.
77 00:07:24.870 ⇒ 00:07:28.679 Nathan Zeke Gandawa: I love to first design products and then do the…
78 00:07:29.220 ⇒ 00:07:36.959 Nathan Zeke Gandawa: logical part. I’m excited about interacting with, the end user, trying to understand their problem, and how I can solve it, how to bring it up to them.
79 00:07:37.670 ⇒ 00:07:46.499 Nathan Zeke Gandawa: But also, I’m also a very logical person, I love to solve problems, I love to break down a challenge. But, ultimately.
80 00:07:46.910 ⇒ 00:07:53.039 Nathan Zeke Gandawa: I will say, if it’s a full stack, I’ll say I’m 52% front-end.
81 00:07:53.520 ⇒ 00:07:54.170 Nathan Zeke Gandawa: Cool.
82 00:07:54.170 ⇒ 00:08:00.110 Samuel Roberts: Cool. Alright, yeah, that’s great, that’s great. Alright, so yeah, talking to users, I like that.
83 00:08:00.340 ⇒ 00:08:12.410 Samuel Roberts: So, we interact with clients a lot. They’re not always technical, but a lot of times, there’s so much news out there about AI and LLMs, they’re hearing all kinds of things.
84 00:08:12.660 ⇒ 00:08:16.070 Samuel Roberts: How do you think about, or go about… excuse me, sorry.
85 00:08:16.390 ⇒ 00:08:22.200 Samuel Roberts: Explaining the limitations of some of the technology that non-technical stakeholders may not
86 00:08:22.630 ⇒ 00:08:29.789 Samuel Roberts: fully understand where the line is for LLMs. You know, sometimes I think it’s a magic box, but how do you go about
87 00:08:30.560 ⇒ 00:08:34.010 Samuel Roberts: Explaining to them, you know, where the limitations are and how it works and stuff.
88 00:08:35.669 ⇒ 00:08:40.019 Nathan Zeke Gandawa: Yeah, so… Usually, when I…
89 00:08:40.619 ⇒ 00:08:47.739 Nathan Zeke Gandawa: when I’m in such a situation, what I like to do is… I’d like to highlight…
90 00:08:47.909 ⇒ 00:08:51.399 Nathan Zeke Gandawa: when I’m… when I’m confronted with a situation where I have to
91 00:08:51.729 ⇒ 00:08:54.959 Nathan Zeke Gandawa: have to highlight that. What I have to do,
92 00:08:55.159 ⇒ 00:08:58.489 Nathan Zeke Gandawa: I usually like to highlight what could go wrong.
93 00:08:59.290 ⇒ 00:08:59.690 Samuel Roberts: Sure.
94 00:08:59.690 ⇒ 00:09:19.210 Nathan Zeke Gandawa: Not just tell them the good things. I like to highlight what could go wrong, what are the risks of implementing AI? So, I usually explain it in terms of where it is very reliable and where it’s not, right? LLMs are very good at pattern recognition, summarization, extraction, and helping others de misinformation.
95 00:09:19.210 ⇒ 00:09:24.940 Nathan Zeke Gandawa: In fact, I like to say… I like to include them in the solution.
96 00:09:24.940 ⇒ 00:09:40.110 Nathan Zeke Gandawa: I like to highlight where they’ll be active in the solution, not just how AI is going to complete the whole thing. I love to include them in the solution and tell them that there’s parts of this problem that AI will not touch and will never touch.
97 00:09:42.530 ⇒ 00:09:58.910 Samuel Roberts: That’s good. Yeah, we had a project where we were automating a flow that a client had that they were using in Claude, and pasting back and forth, and so we helped automate that, but it was very important for us to keep them in the loop, because, you know, their taste, theirs was a design firm, so they, you know, we wanted to make sure that they were
98 00:09:58.910 ⇒ 00:10:08.509 Samuel Roberts: It wasn’t just like, oh, yeah, here’s the prompt, go spit out a, you know, proposal kind of thing, because that’s really, you know, they can automate some of the simple stuff, not simple, but the, you know, more consistently
99 00:10:08.510 ⇒ 00:10:09.620 Samuel Roberts: You know.
100 00:10:10.490 ⇒ 00:10:20.330 Samuel Roberts: I think the idea was that, like, exactly what you’re saying. Like, they’re gonna be an important part of the process. Yeah, that’s great. Great. Let’s jump down,
101 00:10:21.350 ⇒ 00:10:40.160 Samuel Roberts: Have there been any, like, a feature that you’ve owned end-to-end that there were, issues with, or failures, or misunderstandings, from a user that you either had to revise how it worked, or, you know, explain to the user how it could work, and where the limitations were there in terms of the actual feature?
102 00:10:42.760 ⇒ 00:10:48.249 Nathan Zeke Gandawa: Sorry, you’re asking if there’s a feature that I shipped and it failed?
103 00:10:49.160 ⇒ 00:10:59.169 Samuel Roberts: Yeah, or, you know, failed, or there was misunderstanding, maybe, about what it was doing. So, yeah, either of those, you know, and how that worked.
104 00:11:00.170 ⇒ 00:11:00.970 Nathan Zeke Gandawa: Okay.
105 00:11:02.360 ⇒ 00:11:03.060 Nathan Zeke Gandawa: Hmm.
106 00:11:03.240 ⇒ 00:11:09.139 Nathan Zeke Gandawa: Let me think about this one. So, one example I can think of is,
107 00:11:12.010 ⇒ 00:11:23.530 Nathan Zeke Gandawa: So, an example I like to think of… I think this one is going to connect with your previous question about highlighting the limitations of AI. So, currently in my role,
108 00:11:23.590 ⇒ 00:11:35.880 Nathan Zeke Gandawa: I worked with the sales department. So they have a call center, right? And, they needed an AI system that could interact with customers, right?
109 00:11:37.130 ⇒ 00:11:47.770 Nathan Zeke Gandawa: I don’t, I don’t fully champion for the whole process to be taken by AI, but they did ask for it. So what I built was an AI system that could,
110 00:11:47.890 ⇒ 00:11:50.530 Nathan Zeke Gandawa: that could, I used,
111 00:11:51.480 ⇒ 00:11:57.980 Nathan Zeke Gandawa: I used a voice API and, a transcript, like, we transcribed the call,
112 00:11:58.230 ⇒ 00:12:05.950 Nathan Zeke Gandawa: like, from the time you transcribe the call, convert it with text-to-speech and speech-to-text. So what… what would happen is if someone called in,
113 00:12:06.620 ⇒ 00:12:08.549 Nathan Zeke Gandawa: This AI system would pick up.
114 00:12:08.990 ⇒ 00:12:20.900 Nathan Zeke Gandawa: and, it could, answer the phone, and then it would hand off, the call to the, to the call center, associate. Okay. So, the person I was working with
115 00:12:21.800 ⇒ 00:12:25.809 Nathan Zeke Gandawa: He wanted the whole thing to be handled by AI,
116 00:12:25.920 ⇒ 00:12:38.409 Nathan Zeke Gandawa: I believe he was in a rush to have the whole team just be just AI, just replace it. But, I had to break down how risky it was
117 00:12:38.490 ⇒ 00:12:46.990 Nathan Zeke Gandawa: for AI to be handling patient data, right, sensitive patient data. Because the way I built it was, the AI would pick up the phone,
118 00:12:47.140 ⇒ 00:12:55.580 Nathan Zeke Gandawa: and then it would try and help the person solve the issue. If the issue was too complicated, it would hand off to the human being.
119 00:12:55.740 ⇒ 00:12:57.690 Nathan Zeke Gandawa: But he didn’t want a human being.
120 00:12:58.200 ⇒ 00:13:04.800 Nathan Zeke Gandawa: So, what I did was, I designed the system so it handed off to a call center associate.
121 00:13:04.800 ⇒ 00:13:19.430 Nathan Zeke Gandawa: But he didn’t want that, so I had to explain to him that this is, this is PHI, this is, this is health data for patients, so if you mess up even just once, the lawsuit there, you…
122 00:13:19.640 ⇒ 00:13:26.739 Nathan Zeke Gandawa: You’ll never recover from that, and the potential damage you can have to people’s lives is insane.
123 00:13:27.340 ⇒ 00:13:39.000 Nathan Zeke Gandawa: Ultimately, he ended up understanding, and we managed to converge on a system that, still took off the burden of, like, a thousand calls a day on his small team.
124 00:13:39.000 ⇒ 00:13:39.700 Samuel Roberts: No, nice.
125 00:13:39.700 ⇒ 00:13:48.760 Nathan Zeke Gandawa: But still, managed to help customers get safe help from actually human beings where it was necessary.
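[Editor's note: the hand-off rule described above — the agent answers routine calls but escalates anything sensitive or uncertain to a human associate — could be sketched like this. The keyword list, confidence threshold, and function name are illustrative assumptions, not the production system.]

```python
# Illustrative sketch of an AI-to-human call routing rule. The PHI
# keyword list and the 0.8 confidence threshold are made-up stand-ins.

PHI_KEYWORDS = {"diagnosis", "prescription", "medical record", "insurance"}

def route_call(transcript: str, confidence: float) -> str:
    """Return 'ai' to keep the bot on the line, or 'human' to hand off."""
    text = transcript.lower()
    if any(kw in text for kw in PHI_KEYWORDS):
        return "human"  # anything touching patient data goes to a person
    if confidence < 0.8:
        return "human"  # the model is too unsure of its own answer
    return "ai"

route_call("What are your opening hours?", 0.95)     # stays with the bot
route_call("I need my prescription refilled", 0.99)  # hands off to a human
```

The design point is the same one made in the conversation: the bot absorbs call volume, but PHI and ambiguity are hard triggers for human escalation.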
126 00:13:49.590 ⇒ 00:13:50.170 Samuel Roberts: Great.
127 00:13:50.460 ⇒ 00:14:02.330 Samuel Roberts: I like hearing that. We have a project where we’re doing something similar, customer service for the customer service representatives, so an AI chatbot that they can use to help find all this information that was scattered over documents and databases, but
128 00:14:02.330 ⇒ 00:14:12.060 Samuel Roberts: when someone calls this company, they’re still getting a person, and that’s important to that business, and so I really appreciate that from them, working with them, but also hearing you say that, that’s wonderful.
129 00:14:12.300 ⇒ 00:14:23.459 Samuel Roberts: Great. Let’s… let’s jump around a little bit here. So we talked a little… I mentioned a little bit, you know, industry changing a lot, lots of things happening. Is there a trend, or…
130 00:14:23.460 ⇒ 00:14:33.240 Samuel Roberts: and trend in the broadest term, I guess, here, that you were excited about within LLMs and AI, that, again, you were excited about, but decided maybe not to adopt
131 00:14:33.280 ⇒ 00:14:34.960 Samuel Roberts: For… for various reasons.
132 00:14:37.390 ⇒ 00:14:40.419 Nathan Zeke Gandawa: Yes. Okay, so…
133 00:14:41.090 ⇒ 00:14:46.050 Nathan Zeke Gandawa: This was a little earlier on. I’m warming up to it right now. I didn’t jump on that train.
134 00:14:46.430 ⇒ 00:14:53.330 Nathan Zeke Gandawa: But the whole, AI agents, situation.
135 00:14:53.600 ⇒ 00:15:02.249 Nathan Zeke Gandawa: I didn’t fully adopt OpenClaw, because I knew that, this was,
136 00:15:02.840 ⇒ 00:15:11.000 Nathan Zeke Gandawa: there’s a lot of, like, an AI agent that has full access to a computer, that is a serious security risk.
137 00:15:12.000 ⇒ 00:15:15.349 Nathan Zeke Gandawa: But when Claude.
138 00:15:15.510 ⇒ 00:15:21.260 Nathan Zeke Gandawa: started… because OpenClaw was coded by one guy in his room, like, on his computer, and it’s
139 00:15:22.090 ⇒ 00:15:34.719 Nathan Zeke Gandawa: open source. So, I knew that, it didn’t have that stress testing that, like, a professional version of it would go through with, like, a team like Claude and, OpenAI.
140 00:15:34.930 ⇒ 00:15:38.749 Nathan Zeke Gandawa: So, I waited for a while,
141 00:15:39.080 ⇒ 00:15:54.940 Nathan Zeke Gandawa: for, like, people to, like, for bigger companies to adopt it and start rolling out their versions, and then that’s when I started warming up to it, and, I actually developed an MCP server from… for one of my projects called UT AI.
142 00:15:55.170 ⇒ 00:16:00.969 Nathan Zeke Gandawa: So, yeah, that is an example of, of a trend I didn’t hop on too early.
143 00:16:02.090 ⇒ 00:16:06.070 Samuel Roberts: Yeah, no, good example. I… my limited experience with… with,
144 00:16:06.260 ⇒ 00:16:12.299 Samuel Roberts: OpenClaw was… the hype was going around, and I was like, I have to know what this is about, but I was exactly what you said, worried about the.
145 00:16:12.300 ⇒ 00:16:12.750 Nathan Zeke Gandawa: I’m curious.
146 00:16:12.750 ⇒ 00:16:20.460 Samuel Roberts: So I put it in a VM on my server and gave it limited access to everything, and turned on little things occasionally to test it out, and then would turn them off, and…
147 00:16:20.460 ⇒ 00:16:21.080 Nathan Zeke Gandawa: Test it out.
148 00:16:21.080 ⇒ 00:16:26.599 Samuel Roberts: It was interesting what it could do, but it was still pretty scary at the time, so I appreciate that, yeah.
149 00:16:26.770 ⇒ 00:16:30.400 Samuel Roberts: Great, okay, so we’re… we’re getting, about halfway, but,
150 00:16:30.730 ⇒ 00:16:35.689 Samuel Roberts: I want to ask at least one last question, then we can switch, and we can keep going back and forth after that, but .
151 00:16:35.690 ⇒ 00:16:36.270 Nathan Zeke Gandawa: get it.
152 00:16:36.590 ⇒ 00:16:40.549 Samuel Roberts: If you had 6 months, with no obligations.
153 00:16:41.040 ⇒ 00:16:43.050 Samuel Roberts: What would you spend that time working on?
154 00:16:45.750 ⇒ 00:16:52.129 Nathan Zeke Gandawa: 6 months. Okay, what’s, is there… are there any constraints to that? Is there any goals?
155 00:16:53.810 ⇒ 00:17:01.960 Samuel Roberts: No, I mean, it’s, you know, imagine you were, you know, independently wealthy and could do whatever you want, I guess, for 6 months, what would you spend the time doing, I guess?
156 00:17:04.880 ⇒ 00:17:08.009 Nathan Zeke Gandawa: I would finish working on my…
157 00:17:09.109 ⇒ 00:17:14.280 Nathan Zeke Gandawa: time series, forecasting models. I’ve been working on,
158 00:17:14.640 ⇒ 00:17:21.330 Nathan Zeke Gandawa: some time series forecasting models. I’m a very avid tracker of the markets. I love
159 00:17:21.660 ⇒ 00:17:30.199 Nathan Zeke Gandawa: watching the markets. I have trading agents that I run, like, that do paper trading. I just like to test out
160 00:17:30.340 ⇒ 00:17:46.939 Nathan Zeke Gandawa: how accurately a model can predict, the markets. So, so yeah, I’ll spend… I’ll spend the time building an AI system around time series forecasting and operational planning.
161 00:17:47.720 ⇒ 00:17:54.960 Nathan Zeke Gandawa: It’s… what interests me is not just projecting numbers, but connecting messy business data. If we’re talking about messy data.
162 00:17:55.400 ⇒ 00:18:05.049 Nathan Zeke Gandawa: markets… financial market data is probably the most messy data. It’s… it’s… it’s literally just random-walk data.
163 00:18:05.200 ⇒ 00:18:21.289 Nathan Zeke Gandawa: And I feel like if I can pick up signals from that data, if you can pick up signals, organize, signals within that data, then you’ve, like, solved, like, a huge challenge. And maybe at the end of the 6 months, I have something that can print money for me. So, yeah.
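[Editor's note: the "is there signal in a random walk?" check described above can be illustrated with a toy experiment: generate a seeded random walk, run a naive momentum rule over it, and see what it earns. Everything here — the window size, the momentum rule, the Gaussian steps — is an illustrative assumption, not a real trading strategy.]

```python
# Toy experiment: a momentum rule applied to a pure random walk. On a
# true random walk the rule's expected PnL is zero; any backtest profit
# is noise. Picking up real signal means beating this null model.
import random

def random_walk(n: int, seed: int = 42) -> list[float]:
    """Price path with independent Gaussian steps (the null model)."""
    rng = random.Random(seed)
    price, path = 100.0, []
    for _ in range(n):
        price += rng.gauss(0, 1)
        path.append(price)
    return path

def momentum_pnl(prices: list[float], window: int = 5) -> float:
    """Go long for one step whenever price is above its trailing mean."""
    pnl = 0.0
    for i in range(window, len(prices) - 1):
        trailing = sum(prices[i - window:i]) / window
        if prices[i] > trailing:
            pnl += prices[i + 1] - prices[i]
    return pnl

prices = random_walk(500)
pnl = momentum_pnl(prices)
```

The point of the exercise is the baseline: a forecasting system only "has something" if it consistently outperforms this kind of null on out-of-sample data.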
164 00:18:21.460 ⇒ 00:18:41.269 Samuel Roberts: It’s the dream, yeah. I like that. That’s really cool, really cool. Okay. Like I said, so I have plenty more questions I can dig into, but I want to make sure we have time to answer your questions. So, we’ll switch over now, so any questions you have about the company, the role, you know, anything you’re curious about, go ahead.
165 00:18:42.120 ⇒ 00:18:47.609 Nathan Zeke Gandawa: But yeah, so I understand, Brainforge’s,
166 00:18:47.860 ⇒ 00:18:56.070 Nathan Zeke Gandawa: you… you help companies implement AI. So I want to understand, do you have, like, an in-house product, or do you do it as a consulting team?
167 00:18:57.300 ⇒ 00:19:09.140 Samuel Roberts: We… we’re a consultancy. There’s, you know, different projects and things. Bea can probably even speak to some of that, better than I can at this point, but, you know, the AI team does client work.
168 00:19:09.140 ⇒ 00:19:18.510 Samuel Roberts: And so that’s, you know, our bread and butter there. So, a little background. Brainforge was sort of started as a data consultancy, doing data work for.
169 00:19:18.530 ⇒ 00:19:22.129 Samuel Roberts: Clients, that’s sort of where Uten, the CEO, comes from.
170 00:19:22.130 ⇒ 00:19:46.459 Samuel Roberts: And then, you know, AI is getting bigger and stronger and can do more things, and so, not only was that happening internally, and, you know, we think of ourselves as a very, like, AI-first company, like, trying to figure out where we can be more efficient and do things in better, faster ways, but then clients were also interested in that, because obviously, like, everyone’s talking about everything AI-related. So, the AI team, AI automation team kind of spun out
171 00:19:46.920 ⇒ 00:20:05.030 Samuel Roberts: that way, doing internal projects, and then starting to do some more client work, and so now it’s kind of a full service line that we do for clients. Sometimes they come from data, and they have been doing, you know, pipelines and things like that already, and then know they want to do something on top of that, or in addition to that. Sometimes they just come in looking for
172 00:20:05.030 ⇒ 00:20:11.269 Samuel Roberts: AI services, whether that’s, like, a pipeline, a RAG pipeline, or some of those automations I mentioned,
173 00:20:11.420 ⇒ 00:20:18.609 Samuel Roberts: Really, a few different things that we, you know, we’ve started kind of doing a lot of different things and slowly nailing down, like, what we find is our,
174 00:20:18.970 ⇒ 00:20:30.660 Samuel Roberts: our best, you know, services that we can offer. And, you know, as you know, things are changing all the time in the AI/LLM space, so, you know, we’re internally using lots of tools, and we have,
175 00:20:30.660 ⇒ 00:20:43.819 Samuel Roberts: internal tooling that’s been built, and a platform that ingests meetings, and so we have a bunch of that stuff AI-related, but then we’re also doing client work, so that’s really the main thing is client work. We don’t necessarily have a product, that we offer at
176 00:20:43.910 ⇒ 00:20:46.840 Samuel Roberts: At this point, maybe, I’ll say, but, yeah.
177 00:20:46.840 ⇒ 00:20:55.190 Nathan Zeke Gandawa: Yeah, at this stage. Okay. Oh, that makes sense. Okay, so is there anything currently, right now…
178 00:20:55.900 ⇒ 00:20:58.250 Nathan Zeke Gandawa: That has prompted you to hire this role?
179 00:20:58.540 ⇒ 00:21:07.090 Nathan Zeke Gandawa: A project or something important that needs… that’s on fire right now, that… that you’d say a person who’s filling this role is going to be solving on day one.
180 00:21:08.330 ⇒ 00:21:20.499 Samuel Roberts: Yeah, I think, you know, we’re growing overall, so, you know, we’ve, since I’ve joined in, in July, and since, you know, B’s been around here the last, 4 months, like, we’ve,
181 00:21:21.330 ⇒ 00:21:32.929 Samuel Roberts: gone through, like, that kind of startup growing phase, where, you know, it’s been a handful of people, then tens, and it’s, you know, growing more now. And so, that means we’re looking at
182 00:21:33.240 ⇒ 00:21:37.059 Samuel Roberts: Bigger clients, bigger projects, you know, more…
183 00:21:37.250 ⇒ 00:21:46.200 Samuel Roberts: established companies, rather than maybe working with someone who needs, like, startup work kind of thing. And so, really, we’re… we’re looking for this role because we’re… we need more
184 00:21:46.390 ⇒ 00:22:03.169 Samuel Roberts: manpower, for lack of a better word. You know, we lean on AI a lot, and we try to, you know, using… we’re using OpenCode now, we were on Cursor for a while with different things, so we use these tools to try to get as much done as we can with the bodies that we have, but, you know, that’s… there’s no…
185 00:22:03.170 ⇒ 00:22:09.480 Samuel Roberts: you’re not going to completely replace, you know, devs at this point. At this point at least, you know, who knows where things are going, but, you know.
186 00:22:09.480 ⇒ 00:22:09.970 Nathan Zeke Gandawa: Yeah.
187 00:22:09.970 ⇒ 00:22:28.040 Samuel Roberts: people to solve problems and kind of take that step from, you know, maybe the AI can do some coding for you, but you’ve got to put the plan together, you gotta work with the team, you know, all that kind of… the creative engineering, exactly what you, you know, talked about. And so I think, you know, at this point, we have a few client projects that we’re working on. There’s more in the pipeline. There’s not necessarily, like, a…
188 00:22:28.920 ⇒ 00:22:45.850 Samuel Roberts: thing on fire that we need, you know, someone to come and solve this exact problem, but I think, you know, we’re kind of planning forward, knowing that there’s things down the road that are coming, and we want to make sure that we’re set up well to handle those, and to be able to service
189 00:22:46.180 ⇒ 00:22:50.760 Samuel Roberts: Multiple clients… multiple… more clients, bigger clients, bigger projects, things like that.
190 00:22:51.750 ⇒ 00:22:59.079 Nathan Zeke Gandawa: Yeah, yeah, yeah. I mean, I’m excited by AI, but I’m not… I’m not too scared about AI, about AI replacing something here.
191 00:22:59.270 ⇒ 00:23:00.130 Nathan Zeke Gandawa: In fact.
192 00:23:00.420 ⇒ 00:23:07.900 Nathan Zeke Gandawa: I think over the next couple of years, demand for software engineers might actually increase, because the amount of, like, vibe-coded
193 00:23:08.190 ⇒ 00:23:16.909 Nathan Zeke Gandawa: stuff that’s been pushed into production. A lot of companies are gonna break, and they’re gonna be like, okay, we need to get back to real engineers now.
194 00:23:16.910 ⇒ 00:23:17.450 Samuel Roberts: Yes!
195 00:23:17.450 ⇒ 00:23:19.120 Nathan Zeke Gandawa: That’s vibe-coded, yeah, yeah.
196 00:23:19.120 ⇒ 00:23:20.439 Samuel Roberts: No, I, I, I…
197 00:23:20.550 ⇒ 00:23:38.760 Samuel Roberts: I definitely agree. I think there’s… there’s some things that you may not need to look at the code if it’s, you know… some things can be, you know, oh, that’s fine, but there’s definitely human eyes on something, understanding the code. I think I’ve seen recently, even in the past few weeks, a bit of a push for
198 00:23:38.850 ⇒ 00:23:51.980 Samuel Roberts: you know, software engineering fundamentals, and, you know, if you’re working with the AI, make sure the AI understands, like, how the process goes, and it’s not just a black box, you say, build me this, it’s, you know, understanding the whole problem, the whole…
199 00:23:52.810 ⇒ 00:24:05.359 Samuel Roberts: you know, what are we trying to solve? How are we doing it, where are we cutting the right corners? Where are we not cutting the corners? I think… I think you’re absolutely right. I think there’s a bit of a tide turn, even, from the… the Vibe code,
200 00:24:06.020 ⇒ 00:24:09.380 Samuel Roberts: mantra, maybe? I don’t know what you want to call it, but yeah, I think, you know, there’s…
201 00:24:10.110 ⇒ 00:24:21.869 Samuel Roberts: It’s another tool that we have, and you gotta know where to focus. It’s a tool. Yeah. I’ve seen another phrase called, ugh, I’m blanking on it, comprehension debt instead of technical debt, where
202 00:24:21.970 ⇒ 00:24:38.429 Samuel Roberts: The idea of technical debt is that, you know, things… you know when things are not in the right state, but it’s good enough for now, and you’ll get to it later, and something will break, and you’ll realize that. But comprehension debt being the idea that you don’t know what’s going on in your codebase. Like, you know the inputs, you know the outputs, but it’s just a box, and that’s scary to me.
203 00:24:38.430 ⇒ 00:24:38.910 Nathan Zeke Gandawa: So…
204 00:24:38.910 ⇒ 00:24:41.650 Samuel Roberts: I feel that, I feel that a lot, yeah.
205 00:24:41.650 ⇒ 00:24:45.670 Nathan Zeke Gandawa: Exactly. Great. You actually want to understand each process.
206 00:24:45.790 ⇒ 00:24:48.790 Nathan Zeke Gandawa: Okay, my last question now would be…
207 00:24:49.230 ⇒ 00:24:53.490 Nathan Zeke Gandawa: What does success for someone in this role look like, maybe for the next month?
208 00:24:53.990 ⇒ 00:24:56.970 Nathan Zeke Gandawa: the next 2 months, 4 hours.
209 00:24:57.550 ⇒ 00:25:00.979 Samuel Roberts: Yeah, yeah, so, you know, we’re looking for someone…
210 00:25:01.090 ⇒ 00:25:04.989 Samuel Roberts: that can really hit the ground running. You know, we…
211 00:25:05.190 ⇒ 00:25:09.900 Samuel Roberts: We have a lot of stuff to do, and there’s gonna be more to do, so it’s not someone that we necessarily want to, you know.
212 00:25:10.650 ⇒ 00:25:29.250 Samuel Roberts: build up as much, but at the same time, you know, the initial month is still onboarding, they’re still learning the system, learning the processes, and I will say that, you know, since July when I joined, that process is a little more improved, we’ll say. You know, we’re working on that, that’s an important part of getting people in. I would say success…
213 00:25:30.250 ⇒ 00:25:41.439 Samuel Roberts: comes in kind of two things that I like to think about from an engineering side. One is, you know, the work is good, you know, the tickets are satisfied, you know, things are moving along,
214 00:25:41.550 ⇒ 00:25:49.570 Samuel Roberts: you know, there’s always hiccups here and there, but, like, you know, making sure that whoever is in this role is succeeding by, you know, able to do the work. But at the same time.
215 00:25:49.970 ⇒ 00:26:05.930 Samuel Roberts: you are asking the right questions, you’re learning the right way, you’re moving in the right direction as well is important. So, like, the… that, like, velocity, you know, being, like, a vector kind of pointing in the right direction, rather than just, like, how fast you can go, how much you can get done. You know, I think…
216 00:26:06.400 ⇒ 00:26:21.889 Samuel Roberts: some of the stuff we talked about a little bit, like the planning, the putting together things, the working with the team, the understanding, the spiking on certain things to figure out, like, prototyping a few things and understanding that. You know, you don’t necessarily want someone that can… that will, you know, be given a task.
217 00:26:21.900 ⇒ 00:26:31.410 Samuel Roberts: have maybe questions, but not articulate them, and then go in the wrong direction. I think someone successful understands that, you know.
218 00:26:31.410 ⇒ 00:26:37.570 Samuel Roberts: It’s even like working with an AI agent, you know, they have their training data, they make all kinds of assumptions if you don’t tell them what to do.
219 00:26:37.570 ⇒ 00:26:53.159 Samuel Roberts: And so I think, you know, the reason humans are in the loop still is because we can realize when we don’t know these things, and figure that out, and work with the team, or, you know, bring something to the client and understand their needs better, or, you know, recognize the assumptions we’re making, and…
220 00:26:53.460 ⇒ 00:27:06.590 Samuel Roberts: at least document them, if not really dig into why. I think someone that’s successful is not only getting the work done, but growing in that way. I think Bea would probably agree with the, you know, learning the L&D stuff we have going on. Like, that’s really important. So I think, you know.
221 00:27:06.700 ⇒ 00:27:11.610 Samuel Roberts: That first month is, you know, you’re getting the work done, it’s going well, but, you know, by the next few months, you’re…
222 00:27:11.870 ⇒ 00:27:23.289 Samuel Roberts: asking the right questions, you’re surfacing when things aren’t clear enough, or whether that’s clear enough for you because you don’t have the context, or that there’s not enough context, you know, whatever that is, I think that’s really important.
223 00:27:23.870 ⇒ 00:27:25.720 Samuel Roberts: Yeah, I think that’s… that’s kind of…
224 00:27:26.680 ⇒ 00:27:31.049 Nathan Zeke Gandawa: Okay. Yeah, yeah, that, that does, that does make a lot of sense.
225 00:27:32.680 ⇒ 00:27:41.610 Samuel Roberts: Great, yeah, I think it’s an interesting time to be a software engineer, especially, like, working in AI with AI. Things are changing all the time, and…
226 00:27:41.610 ⇒ 00:27:41.960 Nathan Zeke Gandawa: psyched.
227 00:27:41.960 ⇒ 00:27:44.790 Samuel Roberts: You know, there’s so much… it’s very exciting, but there’s so much information out there.
228 00:27:44.790 ⇒ 00:27:45.120 Nathan Zeke Gandawa: out there.
229 00:27:45.120 ⇒ 00:27:56.230 Samuel Roberts: I agree with you that I don’t think we’re necessarily going anywhere as engineers, and we’re probably going to be more valuable if you understand it, but things will be different, and I think it’s an exciting time to…
230 00:27:56.230 ⇒ 00:27:56.690 Nathan Zeke Gandawa: Yeah.
231 00:27:56.690 ⇒ 00:28:02.019 Samuel Roberts: to be… to be doing this stuff. And Brainforge is a fun place to get to do that, because we’re, you know.
232 00:28:02.390 ⇒ 00:28:22.260 Samuel Roberts: working with these newer tools, you know, we’re trying out new things, we’re testing the new models, where, you know, we had a project where we switched over from GPT-4o to Gemini, and that meant, you know, you can’t just flip a switch, you gotta really dig in, and now we understand that model better, and deploying to GCP, and all these things that, you know, we were focused on AWS for one thing, and we were able to switch, and we learned all that, and so I think there’s a lot of…
233 00:28:22.510 ⇒ 00:28:29.099 Samuel Roberts: you know, I’ve been doing this for 10 years, and AI for the last however many, but, you know, I’m still learning stuff, so it’s great.
234 00:28:29.410 ⇒ 00:28:36.440 Nathan Zeke Gandawa: there’s a lot to learn. There’s a lot to learn, yeah. Things are advancing, things are moving so fast right now, so yeah.
235 00:28:36.440 ⇒ 00:28:43.719 Samuel Roberts: it’s… it can be overwhelming at times, and I feel like there’s so much information out there that it’s hard to know what to trust and what to go with, but yeah.
236 00:28:44.560 ⇒ 00:28:45.250 Nathan Zeke Gandawa: Yeah.
237 00:28:45.250 ⇒ 00:28:50.470 Samuel Roberts: Any… we’re getting sort of toward time here, so I just want to make sure if there’s any other things you have… you’re wondering about.
238 00:28:50.830 ⇒ 00:28:53.390 Nathan Zeke Gandawa: No, no, then…
239 00:28:53.860 ⇒ 00:29:12.469 Samuel Roberts: Okay, great. Then just to sort of wrap things up, so this is, you know, you submitted the video and kind of made it past that first, like, pre-screen kind of thing. This is the first full interview. There’ll be… if everything goes well throughout all of them, there’ll be two more interviews. The next one is a bit more role-focused, a bit more…
240 00:29:12.510 ⇒ 00:29:19.410 Samuel Roberts: I hesitate to say technical interview, because you’re not… it’s not LeetCode, it’s not a coding-in-front-of-people thing, but it’ll be,
241 00:29:20.000 ⇒ 00:29:33.560 Samuel Roberts: how would you go about this? How would you think about that? That kind of, you know, interview. After that, if that goes well, you would get a kind of take-home assessment, just sort of a prompt for a project.
242 00:29:33.620 ⇒ 00:29:48.699 Samuel Roberts: You’re free to go about solving that in, sort of, you know, various… whatever ways you like. But then you bring that back and have a presentation in front of a panel of us, and we would, you know, ask questions and dig in and really try to understand, like, the problem solving there.
243 00:29:49.650 ⇒ 00:30:02.849 Samuel Roberts: And then after that would be an offer or not. So that’s sort of the process. We like to move pretty quick, we don’t want to drag things out. I think the biggest thing is just scheduling this kind of synchronous time for these interviews.
244 00:30:03.020 ⇒ 00:30:17.689 Samuel Roberts: But, you know, we don’t want to… we don’t want it to take a super long time either way, so we try to make sure that that happens, pretty quick. So, you should hear back, you know, this week, one way or another, about either scheduling the next thing or not, and then, yeah, from there, you know, that’s the rest of the process.
245 00:30:19.100 ⇒ 00:30:21.809 Nathan Zeke Gandawa: That’s… that’s… that sounds exciting. That sounds exciting.
246 00:30:21.810 ⇒ 00:30:22.590 Samuel Roberts: Great.
247 00:30:22.860 ⇒ 00:30:26.139 Nathan Zeke Gandawa: So yeah, I’ll, I’ll keep my fingers crossed.
248 00:30:26.810 ⇒ 00:30:29.030 Nathan Zeke Gandawa: So, yeah, I look forward to hearing from you.
249 00:30:29.790 ⇒ 00:30:37.259 Samuel Roberts: Alright, great, yeah, you should, like I said, you should hear from the recruitment team, you know, relatively soon. You know, feel free to reach out, and if you have any other questions,
250 00:30:37.460 ⇒ 00:30:39.670 Samuel Roberts: I’m happy to answer whatever I can… whenever I can.
251 00:30:40.710 ⇒ 00:30:41.630 Nathan Zeke Gandawa: Okay.
252 00:30:42.060 ⇒ 00:30:43.060 Nathan Zeke Gandawa: Exciting.
253 00:30:43.060 ⇒ 00:30:44.949 Samuel Roberts: Thanks so much. Yeah, thanks, Peter.
254 00:30:44.950 ⇒ 00:30:45.450 Nathan Zeke Gandawa: Oh, yeah.
255 00:30:45.450 ⇒ 00:30:46.430 Samuel Roberts: And…
256 00:30:46.430 ⇒ 00:30:47.430 Nathan Zeke Gandawa: Thank you so much.
257 00:30:48.480 ⇒ 00:30:49.449 Samuel Roberts: Hope to see you again!
258 00:30:50.240 ⇒ 00:30:52.350 Nathan Zeke Gandawa: Alright, cheers, bye.
259 00:30:52.350 ⇒ 00:30:53.609 Samuel Roberts: Yeah, you as well, cheers.