Meeting Title: Brainforge Interview w/ Sam Date: 2026-05-05 Meeting participants: Brylle Girang, austinW
WEBVTT
1 00:09:02.350 ⇒ 00:09:03.230 austinW: Hey there.
2 00:09:03.370 ⇒ 00:09:04.330 austinW: Bro?
3 00:09:05.690 ⇒ 00:09:06.280 Brylle Girang: Nope.
4 00:09:06.420 ⇒ 00:09:07.159 Brylle Girang: Can you hear me?
5 00:09:07.700 ⇒ 00:09:09.000 austinW: Hi, Ken, how you doing?
6 00:09:09.330 ⇒ 00:09:11.610 Brylle Girang: Good, I’m doing good. Amazing.
7 00:09:12.300 ⇒ 00:09:13.410 austinW: Yeah.
8 00:09:13.410 ⇒ 00:09:14.269 Brylle Girang: going to Austin.
9 00:09:15.080 ⇒ 00:09:18.830 austinW: Yes, Austin. Austin Whitaker, nice to meet you. I realize I’m a little bit early, but…
10 00:09:19.310 ⇒ 00:09:22.740 Brylle Girang: Nice to meet you. No, I’m from the Philippines, so it’s 4 AM.
11 00:09:22.740 ⇒ 00:09:26.839 austinW: Okay. Okay. Well, thank you for making the time, man.
12 00:09:26.840 ⇒ 00:09:28.760 Brylle Girang: Of course, of course.
13 00:09:29.020 ⇒ 00:09:44.370 Brylle Girang: Yeah, so I just want to set expectations about how this interview will go, but before that, I’m going to introduce myself, so… You can call me B. Everyone in Brainforge calls me B. I am leading the learning and development team here in Brainforge, so I’m the one
14 00:09:44.370 ⇒ 00:09:51.189 Brylle Girang: in charge of making sure that everyone at Brainforge performs at the same baseline and above.
15 00:09:51.590 ⇒ 00:09:52.390 Brylle Girang: Awesome.
16 00:09:52.540 ⇒ 00:10:08.460 Brylle Girang: Yeah, so I will… this will mostly be about me getting to know you. I want to make sure that we have this conversation, and I’m also going to give you a chance to ask me questions about Brainforge and about the role that you’re… that you’re seeking, okay? Sounds good?
17 00:10:08.460 ⇒ 00:10:11.339 austinW: That sounds great. Yeah, is it just gonna be us, or is Sam gonna be joining?
18 00:10:11.820 ⇒ 00:10:13.909 Brylle Girang: So Sam's not able to be joining, so I'm taking over.
19 00:10:14.450 ⇒ 00:10:15.419 Brylle Girang: It's our interview.
20 00:10:15.610 ⇒ 00:10:16.990 austinW: Sounds good. Alright, cool.
21 00:10:16.990 ⇒ 00:10:24.219 Brylle Girang: Why don’t we start with you, letting me know about why you want to join Brainforge?
22 00:10:24.660 ⇒ 00:10:38.420 austinW: Yeah. No, so I’ve had kind of a good run of things, you know, I’ve been in tech for about 14 years, and then, you know, the way that the job market’s been, I’ve had to adapt as far as what FTE roles used to be, and then what freelancing is and has been.
23 00:10:38.510 ⇒ 00:10:47.150 austinW: So it was about 2 years ago I got in contact with a mutual peer between, well, my current con- freelance partner, and then,
24 00:10:47.280 ⇒ 00:10:59.070 austinW: Why am I blanking? Utam. So I met Utam, actually, about a year ago, and we’ve had a kind of passive connection, so I’ve been finishing out a contract I’ve been on for the last 6 months.
25 00:10:59.190 ⇒ 00:11:04.840 austinW: Utam and I have been kind of missing each other, but my, you know, recent contract work has been in the field of
26 00:11:04.940 ⇒ 00:11:11.450 austinW: really, I think, UI kind of services development with Cursor, but also breaking into the AI space overall.
27 00:11:11.690 ⇒ 00:11:16.489 austinW: After, you know, I think it was probably about 2023,
28 00:11:16.670 ⇒ 00:11:27.520 austinW: I was interested to, you know, get out from the work that I was doing, and really invest in exploring more tool sets in the data engineering space, but also to catch the wave of the AI stuff going on.
29 00:11:27.730 ⇒ 00:11:42.330 austinW: So more recently, you know, I just see what y’all are doing, you know, coming from a basis of business intelligence, big data, data engineering, but then how we parlay that into AI applications. I just see it as, like, a continuity of my own progression in that.
30 00:11:42.480 ⇒ 00:11:59.050 austinW: And then, working as a freelancer, it’s kind of going in as an individual, and I’ve been working through one other freelancer through his LLC, but meeting Utam, hearing about Brainforge, and kind of coming into a team of… y’all are probably towards 15, 20 nowadays? So I see the growth, and you know, just that small group of…
31 00:11:59.210 ⇒ 00:12:05.910 austinW: You know, it’s not a large consultancy, it’s a small consultancy, but being small makes it so that y’all are very, you know.
32 00:12:06.040 ⇒ 00:12:13.630 austinW: up on, certainly, I wanna say, knowledge sharing and working asynchronously. So just being effective with the tools that we have.
33 00:12:13.730 ⇒ 00:12:21.660 austinW: And then going in as a team and sharing, you know, our strengths to deliver, you know, it feels like a good experience in the freelance, you know, arena, in a sense.
34 00:12:22.220 ⇒ 00:12:24.890 Brylle Girang: Okay, have you worked with a consultancy before?
35 00:12:25.300 ⇒ 00:12:30.010 austinW: I’ve not worked for one. I’ve been working, you know, my, everything on the resume is really…
36 00:12:30.130 ⇒ 00:12:33.329 austinW: Was full-time, you know, on-site and then remote.
37 00:12:33.750 ⇒ 00:12:40.120 austinW: And then recently, this freelance work, I’m kind of going in as an individual. I’ve worked with consultants before, but not for a consultancy.
38 00:12:40.470 ⇒ 00:12:44.069 Brylle Girang: Okay, and this is also going to be, like, your first asynchronous role?
39 00:12:44.420 ⇒ 00:12:45.150 Brylle Girang: [unclear]
40 00:12:45.150 ⇒ 00:12:52.379 austinW: I mean, I’ve had a lot of, I’ll say, you know, independence as far as my project work.
41 00:12:52.490 ⇒ 00:13:09.270 austinW: an asynchronous, you know, the asynchronous stuff, it doesn’t worry me by any means. I just find that companies are either very well attuned to asynchronous work, or not very well attuned to it. So, going into a remote first and asynchronous feels like a modern step, you know, with the type of team culture that I… that would really work with me.
42 00:13:09.720 ⇒ 00:13:16.770 Brylle Girang: Okay, now, going blindly into, like, an asynchronous setup, what do you think would be the most important for you to be successful?
43 00:13:17.570 ⇒ 00:13:22.040 austinW: Yeah, I mean, I think that, like, communication in general, just keeping up, like, with…
44 00:13:22.280 ⇒ 00:13:33.879 austinW: expectations that you set, both, you know, making sure that we’re level setting and that we’re meeting, you know, the promises that we’re giving to each other, internal just as well as external. But otherwise, yeah, I just… I just find that, I think.
45 00:13:34.270 ⇒ 00:13:53.119 austinW: it’s important to not hoard information. It’s important to receive insights about what we’re building here, but make sure that other people can become aware of it too, because there’s so much loss when, you know, you have to have been part of the conversation. But if you’re good about taking notes and planning, then, you know, everyone kind of comes to the board prepared and abreast of what’s going on.
46 00:13:53.640 ⇒ 00:14:07.179 Brylle Girang: Good, good, yeah, we’re on the same page, like, everyone here is working remotely, and we… I totally agree with you that communication and setting expectations are the key, especially… we’re not going to be seeing face-to-face, we need to make sure that
47 00:14:07.320 ⇒ 00:14:13.419 Brylle Girang: We are visible, and we are doing work without… without eyes all around us, right?
48 00:14:13.420 ⇒ 00:14:14.640 austinW: Absolutely, yep.
49 00:14:14.640 ⇒ 00:14:20.880 Brylle Girang: Okay, yeah, yeah, tell me more about your contract work. What are you doing right now?
50 00:14:21.230 ⇒ 00:14:29.439 austinW: Yeah, sure. So throughout the last 9 months, I’ve been working, in the healthcare space, but they actually run and manage a learning management system.
51 00:14:29.610 ⇒ 00:14:40.160 austinW: Their content of specialty is kind of, it’s anything under the sun that a nurse would need to know, and it’s about continued education and evaluation of competencies and skill sets.
52 00:14:40.280 ⇒ 00:14:52.099 austinW: I came in as part of a small… it was a small offshoot product manager’s team who was building with, kind of, internal, I want to say, prototyping and deployment tools.
53 00:14:52.300 ⇒ 00:15:05.700 austinW: such that we could pretty rapidly, you know, spin up, anything you could run necessarily across lambdas, but really it was pointed towards creating a React app, which could help us to create front-ends towards API backends.
54 00:15:05.810 ⇒ 00:15:09.809 austinW: And my original, kind of, go-get, it’s… it was interesting in that…
55 00:15:09.940 ⇒ 00:15:23.679 austinW: this company runs a learning management system. Most of their question material is multiple-choice questions, and there is a comp… it’s an editor… editor team, where there’s thousands of questions, but you take highly, you know, specialized nurses by background.
56 00:15:23.810 ⇒ 00:15:30.530 austinW: And these nurses are the ones that are kind of curating the question sets that they have, so they have to update the existing stuff, but also come up with new stuff.
57 00:15:30.840 ⇒ 00:15:47.640 austinW: And so, they wanted to explore AI applications for that generative AI, to say, well, hey, can we, A, make an AI that can enforce editorial standards to say that, hey, the sentence structure and the format of the data needs to meet these specs?
58 00:15:47.820 ⇒ 00:15:57.059 austinW: But additionally, we wanted to explore things like, and we did, was things like RAG to say, hey, can we take our course transcripts and create question sets off of that content?
59 00:15:57.160 ⇒ 00:16:03.519 austinW: Separately, I got onto a more recent leg of it where we thought there was more meat on that bone in terms of taking deep research,
60 00:16:03.740 ⇒ 00:16:09.460 austinW: creating kind of a corpus of knowledge, and then pointing the same AI generative stuff to that.
61 00:16:09.730 ⇒ 00:16:28.739 austinW: But at most, really, it was kind of, you know, for any kind of real topic in nursing, we could generate questions with validity in the question and answer, as well as making kind of, like, highly plausible, highly relevant distractor options. So, you know, just conceivable wrong answers.
62 00:16:28.770 ⇒ 00:16:45.250 austinW: And in this case, we went as far as creating a full front-end application where there was generative AI features embedded in that, so you could, you know, have an editor’s experience to search whatever questions existed, but also edit and, you know, get generative suggestions throughout. It’s pretty cool.
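The question-generation flow Austin describes could be sketched roughly like this. All names and the JSON shape below are illustrative, not from the actual client codebase; the point is the pairing of a generation prompt with an editorial-standards validation gate:

```python
import json

def build_mcq_prompt(topic: str, transcript_excerpt: str, n_distractors: int = 3) -> str:
    """Ask an LLM for one multiple-choice question with plausible
    'distractor' wrong answers, returned as JSON."""
    return (
        f"From this course transcript on {topic}:\n{transcript_excerpt}\n\n"
        "Write ONE multiple-choice question as JSON with keys "
        f"'question', 'answer', and 'distractors' (a list of {n_distractors} "
        "conceivable but wrong options)."
    )

def validate_mcq(raw: str, n_distractors: int = 3) -> dict:
    """Editorial-standards gate: reject malformed or self-contradicting output."""
    mcq = json.loads(raw)
    missing = {"question", "answer", "distractors"} - mcq.keys()
    if missing:
        raise ValueError(f"missing keys: {missing}")
    if len(mcq["distractors"]) != n_distractors:
        raise ValueError("wrong number of distractors")
    if mcq["answer"] in mcq["distractors"]:
        raise ValueError("correct answer leaked into distractors")
    return mcq
```

A real pipeline would send `build_mcq_prompt(...)` to the model and run every response through `validate_mcq` before a question ever reaches the editor UI.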
63 00:16:45.870 ⇒ 00:16:52.119 Brylle Girang: How involved were you in this project? Like, were you more on the building side? Were you more on the strategy side?
64 00:16:52.700 ⇒ 00:17:06.850 austinW: I was definitely following, like, hey, go-gets, there’s a vision put out by the product manager, but I was on the hook to deliver, you know, technically and everything like that. So, I was managing both generating, like, running the code, testing the code, deploying the code.
65 00:17:06.849 ⇒ 00:17:15.100 austinW: managing changes. I had one partner who… her name was Giselle, and she was a nurse by practice, and was coming into an analytics kind of capacity.
66 00:17:15.140 ⇒ 00:17:17.350 austinW: So she was kind of also…
67 00:17:17.569 ⇒ 00:17:32.719 austinW: it was kind of an evaluation, I think, for his sake. He wanted to make the argument that you could take a non-technical person and equip them with something like Cursor, so I was really the technical back-end SME, and then she was kind of assisting, and together we rolled out this great, you know, app. It was pretty cool.
68 00:17:33.030 ⇒ 00:17:41.900 austinW: Coming from, like, a data engineering background myself, like, early warehousing stuff, to go towards React development without, you know, the bona fides, it was pretty fulfilling, yeah.
69 00:17:42.600 ⇒ 00:17:45.699 Brylle Girang: Can you tell me more about, like, your AI stack?
70 00:17:46.120 ⇒ 00:17:46.750 austinW: Yeah.
71 00:17:46.880 ⇒ 00:17:52.710 austinW: So in this case, you know, we were taking… the shop had both dependencies on Azure as well as AWS.
72 00:17:52.970 ⇒ 00:18:00.320 austinW: As I was doing my initial builds, it was all within AWS, so I was using Bedrock and Knowledge Bases for RAG.
73 00:18:00.740 ⇒ 00:18:19.339 austinW: I did evaluate CrewAI. I like to look for open-source tools and that kind of stuff, because I wanted to break from just prompting and getting an answer to getting more multi-agent experience. So, CrewAI was kind of an evaluation. It felt like a corner cut. It was pretty cool how you could define the agents and the tasks pretty succinctly.
74 00:18:19.430 ⇒ 00:18:23.110 austinW: But the later builds that were more lasting, it was more so about…
75 00:18:23.250 ⇒ 00:18:32.120 austinW: chaining together calls to Bedrock LLMs, and then passing the prompt context to the subsequent, you know, steps and calls.
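The chaining pattern being described, stripped to its core: each call's answer is folded into the next call's prompt. The `call_model` hook is where a real Bedrock client would plug in (e.g. a wrapper around boto3's `bedrock-runtime` client); everything here is a sketch of the flow, not the project's code:

```python
from typing import Callable, List

def chain_prompts(steps: List[str], call_model: Callable[[str], str]) -> List[str]:
    """Run prompt steps in sequence, passing each answer forward as
    context for the next step (the chained-LLM-calls pattern)."""
    context = ""
    outputs = []
    for step in steps:
        prompt = f"{context}\n\n{step}" if context else step
        answer = call_model(prompt)
        outputs.append(answer)
        context = answer  # the next step sees this step's result
    return outputs
```

Injecting `call_model` keeps the orchestration testable without an AWS account: a stub function can stand in for the model while the control flow stays identical.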
76 00:18:32.280 ⇒ 00:18:40.630 austinW: On the deep research side, there was actually… we were provisioning compute from Azure, so we had an Azure,
77 00:18:40.970 ⇒ 00:18:54.500 austinW: it was a deep, deep research model in general that they did have an account access to. So I did build a dependency into what was… I found a proof of concept that Microsoft supported, kind of took their repository, but pointed it to our capacity.
78 00:18:54.520 ⇒ 00:19:03.260 austinW: But expanded the, I took it from… it was really a headed app, and I turned it into a headless deep research service that we could call.
79 00:19:03.440 ⇒ 00:19:11.329 austinW: And that was still orchestrating across lambdas, because it could be some long-running processes, but the compute was coming from Azure, so…
80 00:19:11.470 ⇒ 00:19:15.069 austinW: Yeah, kind of mostly… most of it was running on Lambdas.
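A hypothetical shape for the headless wrapper Austin describes: the Lambda accepts a topic, hands the long-running job off, and returns immediately with a job id to poll. The handler name, payload fields, and response shape are all invented for illustration; in the real setup the work was dispatched against Azure-provisioned deep-research compute:

```python
import json
import uuid

def start_research_handler(event, context):
    """Hypothetical Lambda entry point for a headless deep-research service.
    Long-running work is handed off elsewhere; the caller gets a 202 and a
    job id to poll, rather than a connection held open for the duration."""
    topic = json.loads(event["body"])["topic"]
    job_id = str(uuid.uuid4())
    # Here the real service would enqueue the job against the
    # externally provisioned deep-research model.
    return {
        "statusCode": 202,
        "body": json.dumps({"job_id": job_id, "topic": topic}),
    }
```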
81 00:19:15.550 ⇒ 00:19:20.759 Brylle Girang: Okay, okay, gotcha. How do you get up to date with the AI craze right now?
82 00:19:21.210 ⇒ 00:19:25.960 austinW: Yeah, so what’s funny is that, you know… let’s see.
83 00:19:26.300 ⇒ 00:19:35.660 austinW: Through all this work, right, you get, you arrive at work, you have a stack that you have to learn sometimes, other times you have a problem space that you need to break into, and you have to think about the new tools, and…
84 00:19:35.790 ⇒ 00:19:51.909 austinW: I think that the data engineering side is the piece where that’s been the biggest chase, because it’s been this, you know, it used to be a couple stacks, a couple tools, Informatica, SSIS, whatever. Now it’s very composable. You can have 4 different products by a different name, but they more or less do the same thing.
85 00:19:52.280 ⇒ 00:20:06.079 austinW: So, in both data engineering and in AI, I’ve looked for ways to make, you know, small enough prototypes to really get a grip of what the thing is doing. Docker helped me out prior to getting into, really, corporate cloud accounts, necessarily.
86 00:20:06.210 ⇒ 00:20:23.860 austinW: So I would, you know, maybe stand up orchestration things in Docker. Separately, with AI specifically, I was doing more local LLM stuff. I was using Ollama and LM Studio, mostly because I was interested to know the models by name, necessarily, but I didn’t want to run a bill either.
87 00:20:25.690 ⇒ 00:20:34.020 austinW: I hit the ground with this current contract. I picked up the Data Engineering O’Reilly book and was kind of reading about, the different,
88 00:20:34.290 ⇒ 00:20:42.199 austinW: really, the… data preparation side, the embedding algorithms you can manage, and then prompting, so the separation of, like.
89 00:20:42.560 ⇒ 00:20:48.420 austinW: you know, what place do you go to to refine a certain edge end of the experience? But…
90 00:20:48.590 ⇒ 00:21:02.100 austinW: how do I keep up with it? I think that when I was hands-on with Ollama and that kind of stuff, I was a little bit more in tune with the names of the models, because I was then picking up DeepSeek, or picking up Qwen Coder, separate of Llama, or something like that.
91 00:21:02.890 ⇒ 00:21:18.330 austinW: Now that we have things like Cursor or Claude Code, granted, Claude Code’s gonna lean towards Anthropic models, but Cursor being a little bit more agnostic, I’m less, you know, opinionated about which model name, necessarily, but I do look for those that are gonna be multi,
92 00:21:18.780 ⇒ 00:21:30.809 austinW: like, text-based versus multimodal. So something like a Sonnet versus Haiku. I think Haiku is more text-oriented, whereas Sonnet, it can manage both non-text and text. So there’s some edge cases.
93 00:21:30.910 ⇒ 00:21:36.570 austinW: But I think that, you know, we hear about, and Medium’s been the best way, I think, to…
94 00:21:36.680 ⇒ 00:21:56.640 austinW: be really on top of what’s cutting edge, and then certainly through that, you pick up on certain, you know, personalities, and there’s one called Gastown that I’ve been following. So there’s some people, and there’s companies you follow, but in general, you know, I’d say what’s being exposed by the cloud providers is what’s most realistic in the, in the, you know, client setting.
95 00:21:57.380 ⇒ 00:22:02.710 Brylle Girang: Gastown. That’s… it’s my first time hearing it. I’m looking at it right now, so it’s…
96 00:22:02.710 ⇒ 00:22:03.300 austinW: Yeah.
97 00:22:03.300 ⇒ 00:22:07.089 Brylle Girang: so it’s… a Medium community.
98 00:22:08.020 ⇒ 00:22:18.550 austinW: Yeah, it’s… it’s the author of that thing, he’s been managing it for a while, but it’s a big piece of, it’s like OpenClaw, very open source, very…
99 00:22:19.130 ⇒ 00:22:24.940 austinW: Open source leaning, but yeah, the author himself is definitely,
100 00:22:25.410 ⇒ 00:22:31.680 austinW: I guess, opinionated in the space, and that he takes an interesting angle that’s separate from the giants, like, say, Altman or whatever, yeah.
101 00:22:33.160 ⇒ 00:22:33.750 Brylle Girang: Okay.
102 00:22:34.040 ⇒ 00:22:36.260 Brylle Girang: I’m just adding notes here, if you don’t mind.
103 00:22:36.260 ⇒ 00:22:37.709 austinW: Yeah, you got it, you got it, no worries.
104 00:22:40.810 ⇒ 00:22:51.440 Brylle Girang: Okay, the main reason why I’m asking is, I think we’re… we’re aligned here. We don’t use Claude Code here in the business, because we don’t want to be tied to, like, one model, one…
105 00:22:51.440 ⇒ 00:22:52.010 austinW: Exactly.
106 00:22:52.430 ⇒ 00:22:55.899 Brylle Girang: We’re trying to explore the AI, the AI.
107 00:22:56.060 ⇒ 00:23:12.209 Brylle Girang: scene is really crazy right now, you know, we have DeepSeek coming up, which is really amazing, and we’re trying to make sure that we’re not going to be tied into one specific tool, where we’re not going to be able to explore other stuff, right? Right now, what we’re mainly using is OpenCode.
108 00:23:12.380 ⇒ 00:23:25.059 Brylle Girang: Which gives us lots of options for models. It has Claude, it has GPT, it has, you know, Kimi, Minimax, etc., which have been really amazing. But we’re also trying to, like…
109 00:23:25.200 ⇒ 00:23:29.610 Brylle Girang: Our main goal here is that we’re going to provide our people
110 00:23:29.710 ⇒ 00:23:33.099 Brylle Girang: All harnesses that we think would be beneficial.
111 00:23:33.470 ⇒ 00:23:33.880 austinW: It’s.
112 00:23:33.880 ⇒ 00:23:37.140 Brylle Girang: To them to choose whichever fits their… fits their needs.
113 00:23:37.140 ⇒ 00:23:37.460 austinW: Yep.
114 00:23:37.460 ⇒ 00:23:45.269 Brylle Girang: You can use Cursor if you want to have an IDE experience. You can use Claude Desktop if you want something more…
115 00:23:45.620 ⇒ 00:23:48.810 Brylle Girang: more easy to understand. You can use…
116 00:23:48.920 ⇒ 00:23:51.989 Brylle Girang: If you want to take advantage of the… [unclear].
117 00:23:52.260 ⇒ 00:24:04.710 Brylle Girang: Okay, so I think, I think it’s reassuring to know that you’re also well aware of, you know, the local models, and you’re not just talking about Claude, Opus, how amazing Opus is, etc.
118 00:24:05.140 ⇒ 00:24:16.340 austinW: Yeah, I’ve found this is important, you know. For me, I’ve come into tech actually from non-tech in school, so I’ve been learning the technical stuff and applying it since DBA work, even before the data engineering title existed.
119 00:24:16.410 ⇒ 00:24:30.289 austinW: So, yeah, I think just, you know, appreciating that open source is there, and then, yeah, avoiding that vendor lock-in, because there’s folks that might say, like, well, I only work in Snowflake. Well, why don’t you just work in any DB, you know, necessarily, because so much of that basis carries over. So, yeah.
120 00:24:30.680 ⇒ 00:24:37.880 Brylle Girang: Yeah, okay. So, what’s one… what’s one recent AI news that you felt really excited about?
121 00:24:38.920 ⇒ 00:24:42.260 austinW: Recent, yeah, news, really excited about. Gosh,
122 00:24:42.940 ⇒ 00:24:48.580 austinW: I mean, I think that… and I’ve been a little bit out of it since I’ve been, you know, focused with the work, but I think that it was…
123 00:24:49.580 ⇒ 00:25:05.359 austinW: maybe the, the Claude, where it’s kind of taking the desktop, you can give it control, necessarily, but these are also things that, like, I wouldn’t necessarily risk my own machine at the time, but I just think that, in general, seeing the extent of the inference, like, what it can deduce is just impressive, so…
124 00:25:05.530 ⇒ 00:25:11.179 austinW: there are leaps and bound things. I guess, you know, there’s a lot of hype right now just about, the…
125 00:25:11.320 ⇒ 00:25:13.869 austinW: I forget the name of the model,
126 00:25:14.980 ⇒ 00:25:23.199 austinW: But the one that’s, like, basically exploit, like, discovering exploits in security layers. There’s disruptions, I think, and it’s just these cases where
127 00:25:23.640 ⇒ 00:25:24.700 austinW: you know.
128 00:25:24.830 ⇒ 00:25:44.509 austinW: each couple quarters, you know, things pop up where you’d say that, hey, it takes a very specialized, you know, tech person to do this thing, and here’s a model that’s coming in doing it very effectively. And I guess I watch that, and I say, how big is that disruption gonna be? But then also, how are the companies gonna react to it? Because it’s really important to still keep an expert and a specialist in the loop.
129 00:25:44.790 ⇒ 00:25:51.199 austinW: So I think I watched, like, excited, but skeptical, too. It’s to say that, you know, these,
130 00:25:51.750 ⇒ 00:25:52.720 austinW: the…
131 00:25:54.160 ⇒ 00:26:01.840 austinW: The depth at which it can actually get the thing right with proper prompting, especially from an expert, it’s just impressive that it is managing to do what it does.
132 00:26:03.150 ⇒ 00:26:18.809 austinW: And the applications are there. You know, when you’re looking at network and infrastructure capacity, scaling, that kind of stuff, like, it’s great for it. So I think I just, generically, I just look at where it’s really breaking in and causing disruptions. I think I’ve had a little bit more…
133 00:26:19.320 ⇒ 00:26:29.759 austinW: my recent stuff, I’ve actually had to look across the entire SDLC of this current company to understand, like, where does AI fit? And in that case, I actually had a better appreciation, frankly, for…
134 00:26:29.890 ⇒ 00:26:35.640 austinW: cases where things like Figma, like, more so the… you see…
135 00:26:35.810 ⇒ 00:26:50.900 austinW: a product, like a licensed product, a SaaS product, and where they’re applying AI, and to see just what MCP is doing, because I think MCP actually blew the doors open on most things, and that’s… that’s, I think, where I caught it very, very early on, and there was the hype of, like, everyone needs to be either
136 00:26:51.010 ⇒ 00:26:53.060 austinW: Vendors need to be making MCP.
137 00:26:53.070 ⇒ 00:26:57.020 austinW: Or the opportunity is there for anybody to kind of make an MCP for certain things.
138 00:26:57.030 ⇒ 00:27:11.920 austinW: And when I have encountered different tool sets that I wasn’t typically using, I didn’t necessarily have a use for a design tool like Figma, but here, when I come into this company, they start their software builds from design, necessarily. And understanding that, well, if you use Figma well.
139 00:27:11.920 ⇒ 00:27:19.410 austinW: then the MCP is going to work very well for you. So, I just think that from QA, to design, to development, to build release, to infrastructure.
140 00:27:19.450 ⇒ 00:27:22.939 austinW: I think that’s most exciting, is that there’s a really great
141 00:27:23.400 ⇒ 00:27:30.360 austinW: offering or product, and it’s bleeding edge, and it’s on every end of development. So that’s, you know, a lot of big space to learn.
142 00:27:30.650 ⇒ 00:27:35.129 austinW: And I was just… Further expand, you know, appreciation for where people are specializing.
143 00:27:35.940 ⇒ 00:27:43.870 austinW: But bridge what they’ve been doing with what this tool can do. So you give a tool to a good specialist, and we’re seeing it across the whole arc of development.
144 00:27:44.390 ⇒ 00:27:48.479 Brylle Girang: Have you tried, like, jumping on the OpenClaw train?
145 00:27:49.390 ⇒ 00:28:04.769 austinW: OpenClaw? No, I’ve heard about it, and I think that it would definitely be in my appetite. You know, when I was looking for work, and I was really focusing on just skilling up, when I was hands-on with Ollama more actively, I was definitely looking for open source. But I’ve been, you know, a little bit…
146 00:28:04.770 ⇒ 00:28:09.939 austinW: I’d say in the last year, like, adapting into… I did an East… a West Coast to East Coast move.
147 00:28:10.000 ⇒ 00:28:12.899 austinW: I picked up work with this contract, and I’ve kind of…
148 00:28:13.120 ⇒ 00:28:29.260 austinW: not cruise, but I was kind of finding a level for myself as a person where I could keep up with personal things that were fulfilling, and also do work that was still learning and, like, new and expansive, so… But in general, it’s like, it’s about being, I think, in a mode of…
149 00:28:29.910 ⇒ 00:28:45.749 austinW: you know, chasing new learnings and being in a culture that really accommodates it, too. So, you know, I’m one that… I want to know about the new stuff, but right now, I haven’t budgeted that time as much, more recently. But I’ve heard about OpenClaw, I’ve seen it, and I’m interested to check it out for sure.
150 00:28:46.290 ⇒ 00:29:03.649 Brylle Girang: Okay, okay. Tell me more about, like, how you’re actually using AI for your work. Do you still write code? Do you use AI as a sort of a co-pilot, or do you, like, let AI do all the heavy lifting, and then you just do the thinking? How does it… how does it go?
151 00:29:03.950 ⇒ 00:29:22.010 austinW: Yeah, I would call it still, like, maybe a 70-30. You know, it’s kind of gotten to that point where I’d say, actually, I’m leaning towards letting AI do a lot more coding. I’m definitely reviewing the changes. I think I got most routine in the React development; it was very quick to get that satisfaction, and it was doing the right thing. Go ahead.
152 00:29:22.170 ⇒ 00:29:23.740 Brylle Girang: So, so 70-30, you mean?
153 00:29:23.740 ⇒ 00:29:28.499 austinW: 70-30, the code is being generated by AI. 30 me getting in there.
154 00:29:28.500 ⇒ 00:29:29.359 Brylle Girang: Okay, gotcha.
155 00:29:29.360 ⇒ 00:29:29.900 austinW: Yeah.
156 00:29:30.200 ⇒ 00:29:42.669 austinW: I wanna… I wanna lean in and trust, and I think that I found a place with the React code, certainly, where it was the expert, and I just needed to validate. I knew enough about what I was asking it to do, and I could speak coherently to what I was asking for, and see that.
157 00:29:42.840 ⇒ 00:30:01.120 austinW: I think that in… specifically in areas of, actually, database interpretation and data in databases, I’ve got a little bit more reluctance. There’s cases where if I don’t hold the keys to that database, then I want to make sure that any AI I’m connecting, A, I can… I’m allowed to do it, but B, that, like, the role-based access is there where it’s not going to do anything crazy.
158 00:30:01.280 ⇒ 00:30:05.160 austinW: So, I think the SQL work is one where…
159 00:30:05.630 ⇒ 00:30:13.769 austinW: you know, I realize that the… you can give it the context of the schemas, but the content of the data itself is not visible until queried, so that’s one where…
160 00:30:14.280 ⇒ 00:30:26.420 austinW: granted, I’ve had such bad database experience, I don’t think it’s impossible. I know that these vendors are doing great things, but I myself, I’m still very hands-on with SQL, necessarily, as I’m… because I need to see the data for myself sometimes, especially if it’s a new system.
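The caution here about letting AI near databases can be made concrete with a coarse guard in front of any generated SQL. This is a belt-and-braces sketch layered on top of real role-based access control, not a substitute for it, and the keyword screen is deliberately crude (a string literal containing "update" would be falsely rejected, which is the safe direction to fail in):

```python
import re

# Coarse screen for write/DDL verbs in AI-generated SQL.
WRITE_KEYWORDS = re.compile(
    r"\b(insert|update|delete|drop|alter|truncate|grant|create)\b",
    re.IGNORECASE,
)

def guard_generated_sql(sql: str) -> str:
    """Allow a generated statement through only if it looks like a plain
    read (a single SELECT, or a WITH/CTE). Raises ValueError otherwise."""
    stmt = sql.strip().rstrip(";").strip()
    if ";" in stmt:
        raise ValueError("multiple statements are not allowed")
    if not stmt.lower().startswith(("select", "with")):
        raise ValueError("only read queries are allowed")
    if WRITE_KEYWORDS.search(stmt):
        raise ValueError("write/DDL keyword detected")
    return stmt
```

The database role the connection uses should still be read-only; this check just fails fast, before a bad statement ever reaches the server.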
161 00:30:26.860 ⇒ 00:30:41.829 austinW: But yeah, Cursor’s been super helpful, and I certainly use it to advise. It’s been helpful with cloud configurations, certainly IAM policies, which can always be kind of tricky in the, you know, “why is it not working?” types of situations.
162 00:30:42.030 ⇒ 00:30:52.540 austinW: And I am a planner, you know? So, you know, I do tend to like to make my own, you know, I want to say, conceptual documentation for myself.
163 00:30:52.670 ⇒ 00:31:03.199 austinW: But I find that, you know, given enough context, or even just files that the businesses have, it can definitely, you know, shortcut, you know, some of the things we need to know about. But I do experience the case where
164 00:31:03.200 ⇒ 00:31:20.510 austinW: you know, if you didn’t write it yourself, then you’re not as fresh with it in your own head. So, if it’s stuff that I want to present, or if it’s, like, a plan that I’m gonna march to, that is one where, like, I might have it generated, but I really need to make sure I do spend adequate time with what it came up with, so that I’m not promising something that I’m not sure of what’s inside of it, right?
165 00:31:20.950 ⇒ 00:31:28.380 Brylle Girang: Yeah, I think the next round of interview will be more of a technical piece, and I’m sure that you’re going to hear this
166 00:31:28.820 ⇒ 00:31:41.680 Brylle Girang: term, comprehension gap, or comprehension, something like that. So I think this is, like, the… the most common fear of developers right now, especially those who are fully delegating their… their.
167 00:31:41.680 ⇒ 00:31:42.220 austinW: Very cool.
168 00:31:42.220 ⇒ 00:31:47.250 Brylle Girang: to AI. Like, how well do you know the actual code?
169 00:31:47.250 ⇒ 00:31:47.760 austinW: Right.
170 00:31:47.760 ⇒ 00:31:56.520 Brylle Girang: you have the output, how well do you know the actual code? I just thought of that when you mentioned about, you know, trying to make sure that you understand the code.
171 00:31:56.960 ⇒ 00:32:10.069 austinW: Yeah, it is important, I think, you know, because not everyone is an expert, even if they know a little bit about code. So, you know, I do look at it like, you know, you could have been really, really specialized in Vue versus React, and it does benefit you to know some of those syntaxes.
172 00:32:10.390 ⇒ 00:32:22.630 austinW: But as a generalist, actually, like, I guess I feel super, super strong in SQL, so I know that it’s not gonna get anything behind me. But then there’s… there are, you know, code smells that pop up when you don’t have, you know, other things in place, yeah.
173 00:32:22.630 ⇒ 00:32:24.579 Brylle Girang: Yeah. Okay. Gotcha.
174 00:32:24.690 ⇒ 00:32:29.570 Brylle Girang: Was there a time where AI fucked you up, and how did you go about that?
175 00:32:29.890 ⇒ 00:32:33.409 austinW: Yeah, I mean, certainly, you know, I’d say that
176 00:32:33.580 ⇒ 00:32:37.040 austinW: I’m also one where… when I talk about the 70-30,
177 00:32:37.440 ⇒ 00:32:44.709 austinW: As I was doing these UI things, and especially as I was integrating service calls to those UIs, you know, there’s just cases where
178 00:32:44.920 ⇒ 00:32:58.890 austinW: you might be, you know, necessarily vibe coding, and you’re making so many calls, and you’re committing code over. And I, you know, again, it was a similar piece where I didn’t want the agent’s hand entirely, you know, foot entirely on the pedal. So…
179 00:32:58.900 ⇒ 00:33:10.980 austinW: In this case, I actually physically separated: I would do Claude, or Cursor, rather, in one Cursor IDE instance with very narrow context, and then I would actually have the production repository in VS Code.
180 00:33:11.120 ⇒ 00:33:20.969 austinW: And what I was doing was giving it only the code I needed to affect, and as it made suggestions, I would, you know, port the code over. And I was managing my commits. And I think that, you know.
181 00:33:21.200 ⇒ 00:33:40.149 austinW: cases where you do or you don’t have good Git cleanliness, it’s hard to work backwards, you know, through what was the state of the code that was working, because certainly there’s interdependencies. So there’s been times where, you know, let’s say a context just goes off the rails, and it’s like, no, I need to scrap this conversation.
182 00:33:40.420 ⇒ 00:33:43.219 austinW: But cases, you know, if I was…
183 00:33:43.410 ⇒ 00:33:45.710 austinW: You know, call it 5, 6, 7…
184 00:33:45.830 ⇒ 00:33:51.779 austinW: prompts through with code changes, and I wasn’t committing well, and I wasn’t, you know, testing closely.
185 00:33:51.870 ⇒ 00:34:00.669 austinW: you’d have a sneaking bug that came in, you know, 6 prompts ago, and you needed to catch it. So, I would just say that that’s very common, it was intermittent.
186 00:34:00.670 ⇒ 00:34:13.880 austinW: And to those cases, I’m just careful about stepping through, like… I don’t ask for big bang anything in my prompts. I actually ask for kind of small. And for that sake, I think maybe I’m taking extra steps, but I did find that I would, you know, have…
187 00:34:14.210 ⇒ 00:34:20.280 austinW: less backtracking that way, because I think big bang is just too much change, too much room for variables.
188 00:34:20.920 ⇒ 00:34:21.790 Brylle Girang: Okay, okay.
189 00:34:21.790 ⇒ 00:34:22.310 austinW: Yeah.
190 00:34:22.310 ⇒ 00:34:30.229 Brylle Girang: Let’s say… let’s say you get into Brainforge, how… how many weeks do you need to… to ramp up and hit the ground running?
191 00:34:31.260 ⇒ 00:34:42.039 austinW: Yeah, no, I don’t typically ask for too much time myself. I’m one where, with every corporate gig, and even with this contract gig, like, I’m trying to ship code on the first week type of deal. I’m trying to…
192 00:34:42.440 ⇒ 00:34:47.950 austinW: you know, be a sponge as much as I can, you know, it’s definitely cases where I’ve arrived, especially at…
193 00:34:48.159 ⇒ 00:34:50.670 austinW: Corporate settings, where you need to…
194 00:34:50.870 ⇒ 00:35:04.430 austinW: find out what’s going on for yourself, so I don’t sit there and wait for someone to tell me what to do. If there’s access to wikis, if there’s customer accounts or project context, I’m definitely one to go find it, even. And, in the past, too, with Geared specifically.
195 00:35:04.430 ⇒ 00:35:13.810 austinW: I would generate so much of the documentation, so I kind of hope to hold or find that people have a high standard for how they document what’s going on.
196 00:35:13.880 ⇒ 00:35:28.980 austinW: And then in organizations that just don’t do that, I definitely try to kind of evangelize that, just for my sake and for other people, because, you know, I appreciate that, again, we’re all here, you know, we started knowing nothing at one point. We learned it somehow, you know, so I think paying it forward…
197 00:35:29.170 ⇒ 00:35:38.410 austinW: There’s a self-sufficiency, so… I don’t ask for too much accommodation, typically, unless there’s a real gap. I think in the cases where if we’re going into, like.
198 00:35:38.710 ⇒ 00:35:55.660 austinW: very specialized implementations. I’ve never really had the opportunity in corporate settings to go chase a certification, for example. So, like, I’ll use the tools, but to take the time to go get certified, it’s a… it’s a budget and allowance that you hear about maybe the, you know, the Deloittes, those consultants where you’re spending a lot of time on the bench.
199 00:35:55.750 ⇒ 00:35:58.970 austinW: But for me, I like to have a project, I like to, you know.
200 00:35:59.170 ⇒ 00:36:08.130 austinW: have the team awareness around me, and then, you know, see where we can all grab on and pull. So, not too much of an ask as far as getting started.
201 00:36:08.680 ⇒ 00:36:13.100 Brylle Girang: Yeah, okay. Things move pretty, pretty fast here in Brainforge, and I’m not…
202 00:36:13.480 ⇒ 00:36:13.870 austinW: Yeah.
203 00:36:13.870 ⇒ 00:36:16.459 Brylle Girang: I’m not… I’m not exaggerating that, so…
204 00:36:16.460 ⇒ 00:36:17.629 austinW: Oh yeah, I get it.
205 00:36:17.630 ⇒ 00:36:23.420 Brylle Girang: Once you get in, please expect that, you know, at week one, you will receive feedback if we
206 00:36:23.930 ⇒ 00:36:28.730 Brylle Girang: feel that you’re not moving at the pace that we want it to be. But, yeah.
207 00:36:29.070 ⇒ 00:36:36.539 Brylle Girang: What else? I think… yeah, I think we have 4 minutes left. I think I had all my questions. Do you have any questions for me?
208 00:36:37.710 ⇒ 00:36:40.329 austinW: I think I was asking… well…
209 00:36:40.440 ⇒ 00:36:43.969 austinW: You know, as far as the typical engagement size, is there, like, a…
210 00:36:44.600 ⇒ 00:36:51.430 austinW: Is there a size of project, or a length of engagement you guys try to shoot for, where it’s a typical 3-month engagement, or like…
211 00:36:51.590 ⇒ 00:37:00.729 austinW: What’s the typical size and length, as far as team that’s getting involved in the contract, or as far as, how do you guys size the client opportunities?
212 00:37:01.040 ⇒ 00:37:02.700 Brylle Girang: Gotcha.
213 00:37:03.450 ⇒ 00:37:16.340 Brylle Girang: I think there have been really big changes over the past few weeks. Previously, our clients were more… were more for startups, and we’re, like, assigning 2 to 3 people.
214 00:37:17.340 ⇒ 00:37:31.900 Brylle Girang: Right now, I think we’re in a really strong trajectory. We just had our first Fortune 500 company, and we’re still… we’re on the way of securing bigger clients and moving away from startups.
215 00:37:32.090 ⇒ 00:37:37.779 Brylle Girang: Commonly, we’re assigning, like, one CSO that’s going to be, like, the account manager.
216 00:37:37.780 ⇒ 00:37:38.700 austinW: Start off.
217 00:37:38.700 ⇒ 00:37:42.759 Brylle Girang: One service lead that’s going to be, like, the back… the technical leader.
218 00:37:42.760 ⇒ 00:37:43.270 austinW: ending.
219 00:37:43.270 ⇒ 00:37:44.530 Brylle Girang: And two
220 00:37:45.000 ⇒ 00:37:50.739 Brylle Girang: ICs, who will help with the technical stuff. But does that answer your question?
221 00:37:51.020 ⇒ 00:38:07.310 austinW: It does, it does. Yeah, and actually, I was hearing, too, granted, you know, you guys are talking about having a pretty composable stack, being kind of agnostic. I was hearing a little bit about, maybe implementation partners with the likes of Snowflake or others. Are there leanings where you guys are partners with some vendors?
222 00:38:07.630 ⇒ 00:38:18.790 Brylle Girang: Yes, yes, we do have partner vendors, and we’re trying to tap into their customers as well, but I can share more about that. I’m pretty sure that you’ll get to see them on your first day once you get in.
223 00:38:19.190 ⇒ 00:38:38.920 austinW: There you go. Awesome. No, well, that’s about it. Again, I had a great talk with Utam, and I guess I appreciate what y’all are doing, and I think that I, I’ve worked the very, you know, typical corporate setting, pre- and post-COVID, necessarily. And right now, just the way that the market is, is that I feel like, you know, we’re all kind of funny animals, in a way.
224 00:38:39.080 ⇒ 00:38:52.329 austinW: But I’m just looking for, you know, a good community in it, you know, so I’ve appreciated the clients and the management that I’ve found in my client customers. But again, it’s just… there’s a way to be very…
225 00:38:52.830 ⇒ 00:39:01.379 austinW: on the forefront with development, so I like that you guys are embracing open source, and that you guys are small but lean, you know, with a big appetite for client opportunity, so…
226 00:39:01.580 ⇒ 00:39:09.300 Brylle Girang: Definitely, definitely. I mean, I’m also saying that the AI world is crazy right now. You’re going to be at the…
227 00:39:09.300 ⇒ 00:39:10.890 austinW: The very front, yeah.
228 00:39:10.890 ⇒ 00:39:12.270 Brylle Girang: Yeah.
229 00:39:12.270 ⇒ 00:39:15.499 austinW: Yeah. I’ve seen what Utam does in Austin, too. Go ahead.
230 00:39:15.740 ⇒ 00:39:27.009 Brylle Girang: It was really nice meeting you, Austin. I’m going to be totally honest, I think you’re a go. I’m going to be… I’m going to be endorsing you over to the next round, but you’ll hear from my recruitment team, okay?
231 00:39:27.010 ⇒ 00:39:28.829 austinW: Appreciate you, thank you. It was great talking with you, bro.
232 00:39:28.830 ⇒ 00:39:29.880 Brylle Girang: Bye-bye.
233 00:39:29.880 ⇒ 00:39:31.280 austinW: Thanks, B. See you. Bye.