Meeting Title: Brainforge Interview w/ Pranav Date: 2026-05-07 Meeting participants: nolanrobbins, Pranav
WEBVTT
1 00:04:46.040 ⇒ 00:04:47.409 Pranav: Hey, Leno. Hey, Nolan.
2 00:04:48.520 ⇒ 00:04:49.759 nolanrobbins: Hey, Pranav, how we doing?
3 00:04:49.760 ⇒ 00:04:50.939 Pranav: Pretty good, how you doing?
4 00:04:51.190 ⇒ 00:04:56.530 nolanrobbins: Hey, doing well, my man. Just trying to enjoy this Austin weather a little bit. I know it’s a little rainy out there, but…
5 00:04:57.100 ⇒ 00:04:58.530 Pranav: Oh, nice, you’re in Austin?
6 00:04:58.700 ⇒ 00:05:02.870 nolanrobbins: Yes, yeah, yeah, just met with, Udom, I think it was last week, or the week before.
7 00:05:03.210 ⇒ 00:05:08.709 Pranav: Oh, cool, cool. Give me… give me just one minute to just, get set up here. Sure.
8 00:05:08.890 ⇒ 00:05:10.740 Pranav: Something else
9 00:05:34.450 ⇒ 00:05:36.359 Pranav: Okay, cool. Yeah, sorry about that.
10 00:05:36.540 ⇒ 00:05:37.340 nolanrobbins: Yeah, no worries.
11 00:05:37.460 ⇒ 00:05:38.409 Pranav: How you doing?
12 00:05:39.130 ⇒ 00:05:42.270 Pranav: Yeah, how did that conversation with Udom go? Like, what did you guys do?
13 00:05:43.160 ⇒ 00:06:01.940 nolanrobbins: Good, yeah, I mean, he just kind of talked about his vision for Brainforge. Obviously, we got into each other’s experiences rather well. You know, he’s an AI guy at heart, it’s clear. I mean, even with the internal integration, too, within Brainforge, not even including, you know, the work y’all do for your clients as well. It seems like a very AI-native company.
14 00:06:01.940 ⇒ 00:06:13.020 nolanrobbins: I like what he said in regards to, like, bonus structures, and how he really wants to have his engineers, overperform in terms of efficiency within the company as well, which I thought was an interesting take.
15 00:06:13.140 ⇒ 00:06:22.029 nolanrobbins: Intelligent guy, he wants very smart-minded people in terms of communication as well, and yeah, I’m just continuing the interview forward, and it’s looking like a good fit thus far.
16 00:06:22.920 ⇒ 00:06:34.799 Pranav: Cool, yeah. I’m interested to hear, you know, from you, just a little bit about your background, you know, just kind of… just, like, maybe, like, a minute to a minute and a half, before we just hop into the rest of the interview.
17 00:06:34.970 ⇒ 00:06:35.550 nolanrobbins: Sure.
18 00:06:35.610 ⇒ 00:06:45.470 nolanrobbins: Yeah, so formally, my name’s Nolan Robbins. I’m a machine learning AI engineer based here in Austin, Texas. That passion really started back in late 2022 with the advent of ChatGPT.
19 00:06:45.500 ⇒ 00:07:03.849 nolanrobbins: That kind of sparked my perpetual passion for learning all things applied artificial intelligence and any, you know, webs that have kind of sparked from that. Started with data analysis, then machine learning, then AI engineering, and now, you know, agentic workflows, and, you know, anything I can really get my hands on. I like to get into systems, see how they tick.
20 00:07:04.040 ⇒ 00:07:10.590 nolanrobbins: Push the limits of the boundaries of my knowledge in any particular instance, and continue to grow as a learner and an engineer.
21 00:07:11.250 ⇒ 00:07:24.809 Pranav: Cool, yeah, and then, you know, you mentioned, you know, what kind of got you into, like, this AI engineering space was ChatGPT. So since then, what has been kind of, like, your experience, like, building the type of things you’d be building here at Brainforge?
22 00:07:25.250 ⇒ 00:07:42.890 nolanrobbins: Sure. So I guess, you know, in my current role, I basically am an AI consultant as is. You know, I’m really good at taking, problem constraints and then developing an AI-forward, approach to solving those as well, in terms of my use and possibly for the use case of the customer as well. I’ve done hackathons on the side in regards to,
23 00:07:43.150 ⇒ 00:07:57.780 nolanrobbins: you know, optimizations, engines, the machine learning aspect behind the scenes, and then for, clients and, specific industry projects, you know, RAG, basic RAG templates, multi-agentic system, computer vision models as well.
24 00:07:57.850 ⇒ 00:08:03.779 nolanrobbins: Yeah, just basically the bread and butter of all things AI, and looking to continue, applying that technology.
25 00:08:04.840 ⇒ 00:08:07.109 Pranav: Cool, yeah.
26 00:08:07.780 ⇒ 00:08:15.930 Pranav: Yeah, I noticed from, like, your LinkedIn, like, this, competition winning for, like, the NVIDIA, was that from, like, a hackathon in Austin?
27 00:08:15.930 ⇒ 00:08:16.980 nolanrobbins: Very much so.
28 00:08:16.980 ⇒ 00:08:22.959 Pranav: Okay. Yeah, I think I remember hearing about that hackathon. Was that through AITX?
29 00:08:23.380 ⇒ 00:08:25.329 nolanrobbins: Yes, and the Antler Boys.
30 00:08:25.330 ⇒ 00:08:29.259 Pranav: Okay, cool, that’s very interesting. Yeah, I know those guys pretty well.
31 00:08:29.640 ⇒ 00:08:31.929 Pranav: Cool, yeah, tell me a little bit about what you built there.
32 00:08:32.580 ⇒ 00:08:45.650 nolanrobbins: Sure. So basically the project was, for a startup called Glide, G-L-I-D. They have an autonomous dual-use vehicle that can, you know, basically transform itself from a railway to a road, if you will.
33 00:08:45.920 ⇒ 00:08:53.099 nolanrobbins: So having that ability opens up a lot of logistics opportunities in regards to, getting
34 00:08:53.100 ⇒ 00:09:08.919 nolanrobbins: this vessel and its cargo to the destination as quickly, as efficiently as possible. So, part of the problem was actually surge optimization at the ports. So, when the ship containers come into the port, sometimes if those containers can’t leave, necessarily, there becomes a backlog
35 00:09:08.970 ⇒ 00:09:14.310 nolanrobbins: And, higher costs in regards to drayage, they’re called, basically storage costs, and then…
36 00:09:14.360 ⇒ 00:09:28.440 nolanrobbins: You know, it was on the Glide vehicle then to pick up the ship container and go to its first mile, destination, as efficiently as possible. So, what me and my team built, was an engine behind the scenes of optimizing that.
37 00:09:28.450 ⇒ 00:09:44.109 nolanrobbins: for me as the AI engineer on the project, I developed a graph neural network model that basically trained the edges of the entire U.S., and in terms of the nodes, those are just key ports and key train stations, warehouses, that we were given data for.
38 00:09:44.270 ⇒ 00:10:00.110 nolanrobbins: So we trained the whole graph neural network on the US in regards to those nodes and edges, and then once those were trained, just did a basic Dijkstra’s algorithm, shortest path algorithm to, you know, get the vehicle to its destination. So I was in charge of building the graph neural network behind the scenes at that.
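[Editor's note: the routing step described here — learned edge costs plus Dijkstra's shortest path — can be sketched minimally. This is a hedged illustration only: the node names and weights below are invented, and a fixed cost dict stands in for the graph neural network's predicted edge costs.]

```python
import heapq

def dijkstra(graph, start, goal):
    """Shortest path over a weighted adjacency dict.

    graph: {node: {neighbor: cost, ...}}, where each cost stands in
    for a learned travel-cost estimate (in the project, a GNN output).
    Returns (total_cost, path), or (inf, []) if goal is unreachable.
    """
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    visited = set()
    while pq:
        d, node = heapq.heappop(pq)
        if node in visited:
            continue
        visited.add(node)
        if node == goal:
            # Walk the predecessor chain back to the start.
            path = [goal]
            while path[-1] != start:
                path.append(prev[path[-1]])
            return d, path[::-1]
        for nbr, cost in graph.get(node, {}).items():
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    return float("inf"), []

# Hypothetical ports / rail hubs / warehouses as graph nodes.
graph = {
    "port_la": {"warehouse_az": 6.0, "rail_tx": 9.0},
    "warehouse_az": {"rail_tx": 2.5},
    "rail_tx": {"dest_atl": 4.0},
}
cost, path = dijkstra(graph, "port_la", "dest_atl")
# cost → 12.5, path → ["port_la", "warehouse_az", "rail_tx", "dest_atl"]
```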
39 00:10:01.510 ⇒ 00:10:08.140 Pranav: Okay, interesting. Yes, that’s pretty technically complex, like, building those, neural networks.
40 00:10:08.390 ⇒ 00:10:18.990 Pranav: what we’re building for the most part here at Brainforge is, building, you know, RAG applications, building, automations regarding AI.
41 00:10:18.990 ⇒ 00:10:33.259 Pranav: We haven’t built models here, we haven’t built, fine-tuned models yet. Fine-tuned models are definitely, like, interesting for going forward, but something that, you know, is our bread and butter with the things that we
42 00:10:33.480 ⇒ 00:10:40.469 Pranav: build with… for a bunch of different clients is, like, MCP servers, data warehousing,
43 00:10:40.670 ⇒ 00:10:58.040 Pranav: data warehouses, basically, for querying context for our applications, building embeddings, of course, for the RAG applications. So, yeah, kind of, that segues me into our, like, what I would like to talk about with you first, which is, tell me about a…
44 00:10:58.410 ⇒ 00:11:17.190 Pranav: a RAG application, or just, like, an AI application, not necessarily neural nets, but something that would be more, applicable to, like, what I just described of the work that we do here at Brainforge, and one specifically that you had, like, a lot of ownership of, because I want to go into, like, the technical as well.
45 00:11:17.650 ⇒ 00:11:18.250 nolanrobbins: Sure.
46 00:11:18.400 ⇒ 00:11:30.690 nolanrobbins: So, basic example, that hits all these points here was my, project for the University of San Diego. I was chosen out of the cohort of students. It was me and one other individual to work with the Office of Sponsored Projects.
47 00:11:30.690 ⇒ 00:11:40.990 nolanrobbins: Basically, they were non-AI people at all, just, behind the scenes, almost human resource level, in terms of, distributing salaries to
48 00:11:41.330 ⇒ 00:11:48.389 nolanrobbins: professors, researchers, what have you. So they came to us, and they said, hey, we have this Excel file.
49 00:11:48.490 ⇒ 00:12:02.980 nolanrobbins: We have a budget document, a justification, based off that Excel file that we have to handwrite, and it takes us about 3 hours, and, you know, we’re looking to cut back on this cost. Let’s use AI, right? That’s basically the extent of the problem constraint they gave us, but…
50 00:12:03.070 ⇒ 00:12:08.719 nolanrobbins: Knowing what I know about the field and the applications of it, if you have the input and you have the output.
51 00:12:08.740 ⇒ 00:12:20.399 nolanrobbins: that’s one of the easiest steps, right? Then, from there, you can just kind of fill in the rest. So, from there, me and my, colleague, we just developed a basic parsing format of these Excel files. Lucky for us, the,
52 00:12:20.400 ⇒ 00:12:30.690 nolanrobbins: The formats were very specific, meaning that we really didn’t need to rely on OCR to really, dig deep into tables or complex diagrams. There was none of that, it was just Excel files, so…
53 00:12:30.730 ⇒ 00:12:40.950 nolanrobbins: Setting up that parser was rather easy. From there, basically the main constraint of the project was that this information is proprietary to the university.
54 00:12:40.950 ⇒ 00:12:57.250 nolanrobbins: So we were not allowed to use closed-source models at all. We had to rely on open-source models for these, in terms of building the code, then also applying the data for it. So this was all done on device. We developed a local embedding model with a ChromaDB setup in terms of having those embeddings.
55 00:12:57.310 ⇒ 00:13:04.800 nolanrobbins: You know, in terms of the demo, we just had a basic Llama model as well to then pull those embeddings, and then what was really important to the,
56 00:13:05.180 ⇒ 00:13:18.879 nolanrobbins: the manager of the project, too, that had asked us this problem was that it was in a DocX format, and it was really cool, you know, to really just implement that for her, so all she had to do was push a button, and it downloaded into the format that she’s most familiar with, so…
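[Editor's note: as a rough, framework-free illustration of the retrieval side described above — local embeddings queried to ground the generated narrative — here a bag-of-words vector and cosine similarity stand in for the actual local embedding model and ChromaDB store, and the chunk strings are invented.]

```python
import math
from collections import Counter

def embed(text):
    """Toy 'embedding': a bag-of-words count vector. A stand-in for
    the local embedding model that backed the ChromaDB setup."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, chunks, k=2):
    """Return the k chunks most similar to the query."""
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

# Hypothetical chunks parsed from a budget spreadsheet.
chunks = [
    "professor salary 2024 psychology department 120000",
    "researcher salary 2024 psychology department 65000",
    "travel expenses 2024 psychology department 8000",
]
top = retrieve("professor salary", chunks, k=1)
# top[0] → the professor-salary chunk
```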
57 00:13:19.550 ⇒ 00:13:27.329 Pranav: Okay, and then, I may have missed it. What was, how were people, like, at, UC San Diego using this application?
58 00:13:28.500 ⇒ 00:13:37.460 nolanrobbins: So it’s all internal at this point. There was no real need to… for scale, so it was easy to kind of run on device. They gave us, like, a little server to set up that was…
59 00:13:37.460 ⇒ 00:13:40.150 Pranav: I mean, like, what were they using the application for?
60 00:13:41.130 ⇒ 00:13:42.380 nolanrobbins: I wasn’t.
61 00:13:42.380 ⇒ 00:13:53.499 Pranav: You mentioned, you know, how it’s, like, certain, like, salary information across, like, the college itself, so are they just using it to, like, query salary information, or what else are they using it for?
62 00:13:55.300 ⇒ 00:14:10.460 nolanrobbins: So there was no app previously, this was all done by humans. They just take a look at the Excel file, and these are people that are trained on this, and they know the salary base and what to say for those justification narratives at the end. So basically how they’re using it now is all they have to do is put the Excel file in.
63 00:14:10.460 ⇒ 00:14:17.299 nolanrobbins: out comes the budget justification, the narrative, and all they have to do is just read through. They wanted observability in that mindset.
64 00:14:17.300 ⇒ 00:14:31.490 nolanrobbins: They didn’t want it fully autonomous, but, you know, just checking, that everything was parsed okay, maybe making a tweak here or there, saves them a boatload of time. I think the manager said it was about… it takes about 10 minutes for the total. Put the Excel file in, read through.
65 00:14:31.540 ⇒ 00:14:32.680 nolanrobbins: And push send.
66 00:14:33.080 ⇒ 00:14:51.569 Pranav: Okay, and so, yeah, just want to understand, like, the business problem a little bit more, because I’m just, like, a little bit unclear on that. So this Excel file, right, this Excel file is built on a per-candidate basis, or a per-employee basis. What is this Excel file that they’re inputting into the application?
67 00:14:52.720 ⇒ 00:14:59.290 nolanrobbins: So, it’s a little bit of both. It actually has a lot to do more with department styles, with things.
68 00:14:59.670 ⇒ 00:15:14.310 nolanrobbins: So, I forget the individual departments, but basically you can think of it, like, from a degree level. Like, the… I remember there was, like, a psychology department, so it had, like, X teacher, X researcher, X student, travel expenses,
69 00:15:15.030 ⇒ 00:15:24.230 nolanrobbins: like, book cost or anything, like, on hand, kind of, as well, so it was kind of layered in that sense. It was, like, A, B, C, D, the whole way down. But yeah, for the most part, it was very…
70 00:15:24.330 ⇒ 00:15:28.279 nolanrobbins: transferable across departments. So, like, the… the…
71 00:15:28.810 ⇒ 00:15:35.330 nolanrobbins: professors were always at the top, let’s say. They were Group A, and then, like, expenses were, like, B, and then researchers, Group C, etc.
72 00:15:36.220 ⇒ 00:15:49.440 Pranav: Okay, gotcha. Yeah, you mentioned how, you know, you’re doing everything locally, you’re building embeddings, and then you’re using an open source model as well, so that it, and I think…
73 00:15:50.480 ⇒ 00:16:07.729 Pranav: What you were doing is, like, running the model locally, right? So that you weren’t sending the proprietary information to the cloud. Yeah, what, I guess to, like, start off with, like, what model did you guys end up using for this specifically, and, like, why did you choose that model?
74 00:16:08.230 ⇒ 00:16:22.439 nolanrobbins: Sure. So it was an older model, it was the Llama 3.1, I think it was an 8 billion parameter off the top of my head. But I remember very vividly in doing the research behind it that it’s a very, instruction-based model. It doesn’t tend to get too…
75 00:16:22.540 ⇒ 00:16:31.920 nolanrobbins: non-deterministic with its results. It tends to stay in its lane very well, so setting up the system prompt and everything behind the scenes, just worked well for our use case and for our experimentation.
76 00:16:32.600 ⇒ 00:16:40.459 Pranav: Gotcha. And so, embeddings is interesting, because I still… I don’t think I’m understanding why you had to build embeddings for this. Can you kind of explain that to me?
77 00:16:41.140 ⇒ 00:16:56.889 nolanrobbins: Sure. So, in terms of the embeddings, we still had to keep the, cells appropriate, with how things were kind of run, from department, from, departments across. Basically what we… we kind of saw was that it was very…
78 00:16:57.000 ⇒ 00:17:00.319 nolanrobbins: What’s the word I’d like to get across to you?
79 00:17:00.600 ⇒ 00:17:06.369 nolanrobbins: It was easiest for the LLM to pull the necessary data to then link to the narrative itself.
80 00:17:06.440 ⇒ 00:17:23.919 nolanrobbins: That’s what we needed, more of, like, the natural language progression, because if we just parse the numbers, you know, if we didn’t have an LLM, it would just repeat those numbers in a narrative, and that’s not what they wanted. This narrative acted almost as, like, a paragraph, right? It had to have natural language in regards to where the embedding kind of came from in regards to that salary.
81 00:17:24.050 ⇒ 00:17:40.569 nolanrobbins: So, like, say, like, professor salary from 2023, 2024, 2025, 2026, that was past data, but then there’d also be fields where it was future salary, so, like, 2026, 2027, 28, 29, right? And we had to make sure that those linked up, like, you know.
82 00:17:40.570 ⇒ 00:17:59.169 nolanrobbins: it obviously goes up with inflation and such, and higher pay, but if there were anomalies, the LLM also was kind of looking into that as well. That was baked into our system prompt, and then ultimately the embeddings, like I said, is more for the narrative base at the end. It had to justify why the salary was in regards to expenses or past years.
83 00:18:00.170 ⇒ 00:18:06.489 Pranav: Okay, so it seems like for an Excel sheet, right,
84 00:18:07.090 ⇒ 00:18:17.110 Pranav: there’s not gonna be that many tokens to fill up the entire token window, so, like, what prevented you guys from just, you know, attaching a system prompt, and then attaching the Excel sheet, and then…
85 00:18:17.240 ⇒ 00:18:36.230 Pranav: just having an output for the… for the user. Since, from my understanding of, like, what you just described, this isn’t something that you’re consistently adding a ton of different documents to, you’re not querying it multiple times either, so I think I’m still kind of a little bit confused on why you needed the embeddings here.
86 00:18:37.430 ⇒ 00:18:47.650 nolanrobbins: Semantic meaning was important in terms of the narrative base as such. There were new documents coming up fairly recently in regards to when they brought teachers in. I want to say it was, like.
87 00:18:47.840 ⇒ 00:18:55.260 nolanrobbins: Off the top of my head, like, 3 to 5 times a year, almost, like, the semesters of things, and planning for future.
88 00:18:55.600 ⇒ 00:19:01.759 nolanrobbins: You know, in terms of just going straight the whole way through, like, we don’t want this thing to be,
89 00:19:02.100 ⇒ 00:19:17.060 nolanrobbins: like a copycat. We don’t want it just to recite the numbers, because that’s pretty easy in a sense, too. It had to actually have NLP knowledge to justify and compare across expenses of things. So, like, why are we paying this researcher this much, right? Because…
90 00:19:17.060 ⇒ 00:19:31.609 nolanrobbins: They have this many… or why are we paying this professor this much? It’s because they need this many researchers, this much supplies, this is their task, etc. That’s where the embeddings kind of come in, of just kind of link everything together, so that the output is then a paragraph of, like, hey, this is why, right?
91 00:19:31.790 ⇒ 00:19:43.540 nolanrobbins: So that “why” nature is more, you know, non-deterministic in a sense, in terms of semantics, I guess. Again, not having that copycat the whole way through. So that’s why we ended up choosing the RAG-based implementation with the LLM creation.
92 00:19:44.620 ⇒ 00:19:55.160 Pranav: Okay, interesting. Yeah, so, what I’ve found is, it’s, it’s common, kind of, like, if you’re, you know, for, like, one Excel sheet, right?
93 00:19:55.280 ⇒ 00:19:59.109 Pranav: Do you have, like, an estimate of, like, how many tokens that would be?
94 00:19:59.110 ⇒ 00:20:01.100 nolanrobbins: It fit in our 8K
95 00:20:01.200 ⇒ 00:20:03.649 nolanrobbins: token window, so it was not much, yeah.
96 00:20:05.080 ⇒ 00:20:08.899 Pranav: Okay, so your context window is only 8K, yeah.
97 00:20:08.900 ⇒ 00:20:17.440 nolanrobbins: We tended to keep it that way, because it was all local, like, we could have went up with it, but obviously, tokens per second goes down when you do that, KV caching, memory aside, like, yeah.
98 00:20:17.700 ⇒ 00:20:26.519 Pranav: Okay. Yeah, yeah, no, that’s a good point, yeah. So, since you’re running it locally, your context window’s gonna be much shorter. Okay, interesting.
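[Editor's note: the 8K-fit claim above is easy to sanity-check with the common rule of thumb of roughly 4 characters per token. A hedged sketch: real counts depend on the model's tokenizer, and the sheet contents below are invented.]

```python
def rough_token_count(text, chars_per_token=4):
    """Very rough token estimate via the ~4-chars-per-token heuristic;
    actual counts depend on the specific tokenizer."""
    return len(text) // chars_per_token

# Hypothetical parsed sheet: 80 rows of short budget lines.
sheet_text = "\n".join(
    "row %02d: professor salary 2024 120000 usd" % i for i in range(80)
)
tokens = rough_token_count(sheet_text)
fits_in_8k = tokens < 8000  # comfortably under an 8K context window
```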
99 00:20:29.080 ⇒ 00:20:39.190 Pranav: Cool. So yeah, let’s, let’s, like, switch gears. I want to also give you kind of, like, scenarios that we’ve gone through here at, Brainforge.
100 00:20:39.280 ⇒ 00:20:39.980 nolanrobbins: Sure.
101 00:20:40.270 ⇒ 00:20:47.330 Pranav: Yeah, actually, one other question that I just have on what you’re working on currently, or what you’re working on for that project, is,
102 00:20:47.630 ⇒ 00:20:55.350 Pranav: you know, you put it into production, there’s people actually at UC San Diego using this. How…
103 00:20:55.770 ⇒ 00:21:02.510 Pranav: How did, like, certain, like, production bugs or certain production, like,
104 00:21:04.250 ⇒ 00:21:09.650 Pranav: faults get patched. Like, what is your process usually for, like, patching those type of things?
105 00:21:10.560 ⇒ 00:21:27.789 nolanrobbins: So we had some basic tests that we came up with for this. Obviously, logging and tracing was super important for a system like this as well, so metadata flows throughout the entirety of the system. I remember having some problems with the retrievals of the embeddings as well, since they were local.
106 00:21:27.840 ⇒ 00:21:38.549 nolanrobbins: Trying to remember exactly how we patched that in regards to just… oh, it had to do with the chunking of it as well. We experimented with different chunking strategies.
107 00:21:38.550 ⇒ 00:21:57.989 nolanrobbins: So we went from, like, those blocks of A, B, C, right? Because there weren’t that many tokens in there specifically. And ultimately, what we ended up doing was actually going on a row level instead of the column-based. So that was something where we had given them, like, an original prototype, been like, hey, how is this working for us? Or how’s this working for you? It was working good for us.
108 00:21:57.990 ⇒ 00:22:03.159 nolanrobbins: And they were kind of noticing more hallucinization… hallucin… it was hallucinating more.
109 00:22:03.160 ⇒ 00:22:21.429 nolanrobbins: in regards to the paragraph style of how we were pulling the Excel, I thought that was going to be more useful information, but ultimately, we needed, you know, more embeddings on the row level. So that’s something we kind of worked through, and then, you know, every time they query something, we had, like, a little database, that my partner came up with. I believe it’s called MinIO.
110 00:22:21.640 ⇒ 00:22:23.990 nolanrobbins: Off top of my head that, you know.
111 00:22:24.130 ⇒ 00:22:30.040 nolanrobbins: Gave all the queries and everything, and kind of held, like, a logging database as well, simply because they weren’t querying it that much either.
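[Editor's note: the row-level chunking fix described above — one natural-language chunk per spreadsheet row instead of per column block — can be sketched as below. The column names and row values are made up for illustration.]

```python
def row_chunks(rows):
    """Render each spreadsheet row as one self-contained text chunk,
    so a retrieval hit carries the whole row's context at once."""
    return [
        f"{r['role']} in {r['dept']}: salary {r['salary']} for {r['year']}"
        for r in rows
    ]

rows = [
    {"role": "Professor", "dept": "Psychology", "salary": 120000, "year": 2024},
    {"role": "Researcher", "dept": "Psychology", "salary": 65000, "year": 2024},
]
chunks = row_chunks(rows)
# chunks[0] → "Professor in Psychology: salary 120000 for 2024"
```

The per-row form keeps each fact (role, department, amount, year) in a single chunk, which is one plausible reason the column-block chunks hallucinated more: a column chunk separates amounts from the rows they belong to.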
112 00:22:30.600 ⇒ 00:22:35.850 Pranav: Gotcha, okay. And, what was… so it was, like, you just and one other partner working on this, right?
113 00:22:36.460 ⇒ 00:22:40.079 Pranav: Okay, cool. And, like, what was, like, your individual roles on this project?
114 00:22:40.560 ⇒ 00:22:53.130 nolanrobbins: Sure. So he actually came from more of a database, file system, more backend development, and then I was obviously more leaning on the AI engineering side of things. So he kind of gave me free rein on the RAG system of everything.
115 00:22:53.130 ⇒ 00:23:00.920 nolanrobbins: He did very well with Docker containerizing everything like that, making sure that it was appropriate to actually place on the server for them.
116 00:23:01.210 ⇒ 00:23:07.780 nolanrobbins: So, yeah, that’s generally how we kinda… kinda broke it up. He was more back-end, I was more AI engineering, getting the model working.
117 00:23:08.000 ⇒ 00:23:11.550 nolanrobbins: Latency in terms of token creation, embeddings, that kind of stuff.
118 00:23:12.920 ⇒ 00:23:20.530 Pranav: Sounds good. Yeah, let me maybe go into the scenarios a little bit later in the interview, and I’m happy to stay on a little bit longer, too, if you don’t mind.
119 00:23:20.530 ⇒ 00:23:21.000 nolanrobbins: Okay.
120 00:23:21.000 ⇒ 00:23:22.249 Pranav: Yeah, do you have those.
121 00:23:22.250 ⇒ 00:23:23.449 nolanrobbins: 2:30, yeah.
122 00:23:23.450 ⇒ 00:23:38.619 Pranav: We’ll be done way before then. Cool. Yeah, is there another project, too, that, comes to mind when I talk about, like, you know, RAG applications, or building MCP servers, building chat interfaces? Does anything else come to mind?
123 00:23:39.600 ⇒ 00:23:45.489 nolanrobbins: In terms of, like, industry, or maybe, like, professional side projects, or self-studying kind of mindset?
124 00:23:45.700 ⇒ 00:23:58.469 Pranav: Yeah, it can be anything, you know, what would probably be the most interesting to talk about is, like, whatever you’re the most proud of, most technically complex ones, that is most relevant to Brainforge. Sure. Like the work that we do at Brainforge. Sure.
125 00:23:59.320 ⇒ 00:24:14.999 nolanrobbins: You know, I could definitely dive into the NVIDIA Hackathon, that’s probably the one I’m most proud of and technical, but again, that’s more on the, deep learning side of things, so… I think another project that I’ll kind of touch base on is something I did at Ferocitor. We built a multi-agentic system, for a wellness app.
126 00:24:15.040 ⇒ 00:24:33.030 nolanrobbins: I have to be a little bit careful with the NDAs, kind of how we get into it, but basically, this app was… the differentiator was the onboarding process as well. Very doctor-based, very medical-based in terms of everything, so they want the onboarding process to last, like, 15-20 minutes, so it was a lot of data.
127 00:24:33.140 ⇒ 00:24:48.310 nolanrobbins: And at the end, there was, like, a generated protocol that took all of these into account, right? So, with all of these sets of data as well, we made it a multi-agentic system where we had, like, a sleep agent, and we had a nutrition agent.
128 00:24:48.480 ⇒ 00:25:01.139 nolanrobbins: social agent, like, just different, pillars of health, let’s say. And we were able to keep those agents rather small, right? Because we wanted to make sure that it ran within about 60 seconds was our absolute, like.
129 00:25:01.520 ⇒ 00:25:11.720 nolanrobbins: like, we can’t go past 60 seconds, no matter what kind of thing. So then from there, we could just kind of… if I can maybe diagram on the top, right, we have the input, and then all the little…
130 00:25:12.090 ⇒ 00:25:30.110 nolanrobbins: bits to the different agents, and then they all came together in what we call the SIFT agent, basically. And the SIFT agent was a little bit larger in nature, but basically its role was to make sure that we didn’t have any contradictions between multiple agents. And if we did, send it back to the agent: hey, we have this constraint, bring me another output, kind of thing, so…
131 00:25:30.190 ⇒ 00:25:37.619 nolanrobbins: A lot of prompt chaining in regards to that, a lot of orchestration went into that as well. We used LangGraph and,
132 00:25:37.960 ⇒ 00:25:51.700 nolanrobbins: LangSmith for the traceability. We chose LangGraph simply because if this individual that owns the app wants to add another health agent, right, it’s not sequential in nature, and they can simply just add it like a node on a graph and have it still run the system that way.
133 00:25:51.860 ⇒ 00:25:53.919 nolanrobbins: And we were really pleased with the results.
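[Editor's note: the fan-out-then-SIFT shape described here can be sketched without any framework. In the real system this was LangGraph with LLM-backed agents; below, plain functions stand in for the agents, the contradiction rule is a toy, and all names are invented.]

```python
def sleep_agent(profile):
    hours = profile["sleep_hours"]
    return {"sleep": "target 8h" if hours < 8 else "maintain schedule"}

def nutrition_agent(profile):
    return {"nutrition": "increase protein" if profile["protein_low"] else "balanced diet"}

AGENTS = [sleep_agent, nutrition_agent]  # one agent per "pillar of health"

def sift(outputs):
    """Merge agent outputs into one protocol and flag contradictions
    (toy rule: two agents must not write the same key)."""
    merged = {}
    for out in outputs:
        for key, rec in out.items():
            if key in merged:
                raise ValueError(f"contradiction on {key!r}")
            merged[key] = rec
    return merged

def run_protocol(profile):
    # Fan out to every pillar agent, then sift into one protocol;
    # adding a new pillar is just appending to AGENTS, much like
    # adding a node to a LangGraph graph.
    return sift(agent(profile) for agent in AGENTS)

protocol = run_protocol({"sleep_hours": 6, "protein_low": True})
# protocol → {"sleep": "target 8h", "nutrition": "increase protein"}
```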
134 00:25:54.290 ⇒ 00:25:59.909 Pranav: That’s awesome. And, did you use any, like, agentic frameworks to… to build out the agents?
135 00:26:00.620 ⇒ 00:26:05.090 nolanrobbins: Yeah, LangGraph was the main one, and then LangSmith was our traceability.
136 00:26:05.400 ⇒ 00:26:10.290 Pranav: Okay, cool. Yeah, that sounds, that sounds super interesting.
137 00:26:10.440 ⇒ 00:26:17.689 Pranav: Yeah, let’s hop into, like, some of the scenarios that I’ve, like, I’ve run into. Just wondering how, like, you think them through.
138 00:26:18.150 ⇒ 00:26:34.570 Pranav: So, yeah, I’ll just give you a little bit of context on the client, not gonna kind of give you a whole spiel on it, because it’s not super relevant. Sure. But one of them was, like, an e-commerce company. We built out, like, 5 MCP servers for them, and we built out this, standalone chat interface,
139 00:26:34.640 ⇒ 00:26:49.980 Pranav: some other stuff as well, which is why we made it, like, a standalone application. But basically, one issue that we had was, when the client would ask a question specifically about Shopify orders,
140 00:26:50.170 ⇒ 00:27:08.369 Pranav: they would be… get, like, say, like, they asked the question of, like, what was my last 7 days of revenue? And generate a table for me, kind of showing me the last 7 days. It would bring back data, but it would bring back completely incorrect data. It’s not even just, like, obviously sample data, it’s, like.
141 00:27:08.870 ⇒ 00:27:14.980 Pranav: data that, you know, looks like it could be real, but then when you fact-check it on Shopify itself, it’s completely off.
142 00:27:16.960 ⇒ 00:27:24.940 Pranav: What’s kind of, like, the first thing that comes to mind, for you in regards to this? And then also, how would you, like, diagnose this issue?
143 00:27:25.530 ⇒ 00:27:26.100 nolanrobbins: Sure.
144 00:27:26.170 ⇒ 00:27:42.360 nolanrobbins: You know, not hearing… not being able to get into the code and see it myself, the first thing that I’m kind of noticing is, the actual data that it’s pulling and where it’s from. Maybe these docs just aren’t up to date, or if this was more a RAG-based embedding kind of format that it’s pulling from the chat model itself.
145 00:27:42.400 ⇒ 00:27:55.760 nolanrobbins: Maybe the embedding’s off, maybe the chunking is off in regards to what it’s kind of bringing back that way, if there’s any metadata in regards to that chunking as well, if there are, you know, let’s say, past years, future years, newer documents, etc.
146 00:27:56.100 ⇒ 00:28:00.360 Pranav: Oh, basically. So, this was not using, like,
147 00:28:00.730 ⇒ 00:28:06.609 Pranav: like, static data, I guess. It was pulling from… it was using the Shopify MCP that we built.
148 00:28:06.610 ⇒ 00:28:07.900 nolanrobbins: dynamic data, okay.
149 00:28:07.900 ⇒ 00:28:14.259 Pranav: Yeah, yeah, yeah. So, there’s no embeddings involved here. Yeah, so I just wanted to give you that context as well.
150 00:28:14.790 ⇒ 00:28:15.370 nolanrobbins: Sure.
151 00:28:15.520 ⇒ 00:28:18.460 nolanrobbins: So I guess on that point, I would… I would…
152 00:28:19.500 ⇒ 00:28:34.859 nolanrobbins: you know, obviously metrics are kind of, you know, front of mind in regards to this. An important one that I use is, like, faithfulness, is a really important one in regards to make sure that we’re pulling the right fact from them. If it’s dynamic data, that’s a little bit different. I would…
153 00:28:35.160 ⇒ 00:28:42.019 nolanrobbins: probably ask what the chatbot really is, what’s really running that chatbot, maybe some more insights there, so I can kinda…
154 00:28:42.160 ⇒ 00:28:43.810 nolanrobbins: Lean my direction.
155 00:28:43.810 ⇒ 00:28:47.579 Pranav: Yeah, yeah, no, keep asking questions too, like, I can provide information.
156 00:28:47.840 ⇒ 00:28:57.669 Pranav: Okay, sure. Yeah, so what we were using was, I think, like, the Claude model we were using was, like, Claude Sonnet something.
157 00:28:57.790 ⇒ 00:29:03.720 Pranav: we were using, like I said, we, like, we built them this Shopify MCP server,
158 00:29:04.180 ⇒ 00:29:17.629 Pranav: And for Claude Sonnet, we had, like, reasoning enabled, so temperature was set to 1.0 by default. Yeah, those are just kind of, like, some of, like, the additional tools that we use.
159 00:29:17.960 ⇒ 00:29:24.439 nolanrobbins: Sure. Was there a reason why you guys decided to go with a reasoning model relative to just a non-reasoning model in regards to this?
160 00:29:24.440 ⇒ 00:29:30.589 Pranav: Great question. We just had that enabled, and so we realized, too,
161 00:29:31.340 ⇒ 00:29:41.700 Pranav: you know, maybe… so that was just, like, one parameter that we found out. Is there something that comes to mind, like, when I say that that was a reasoning model versus, you know, not being a reasoning model that leads you to believe a certain thing?
162 00:29:42.300 ⇒ 00:29:52.859 nolanrobbins: Sure. So anytime you use a reasoning model, your context window usage is gonna increase. And anytime the context window grows, there’s an obvious trade-off: you’re opening the door
163 00:29:52.890 ⇒ 00:30:11.789 nolanrobbins: to a wide array of hallucinations. So that’s kind of the first thing that comes to mind. Anytime you can keep a context window small, it’s generally a good practice. You know, I just watched a recent video from, I don’t know if you know him, Matt Pocock, I believe his name is.
164 00:30:11.790 ⇒ 00:30:25.430 nolanrobbins: where it’s like, up to about 40% of the context window, the model’s good; at 60%, it’s dumb. So, by keeping context usage within around the 40% mark as much as possible, you’re gonna minimize your chances of hallucinations.
165 00:30:25.620 ⇒ 00:30:31.730 nolanrobbins: So that’s what immediately comes to mind with the reasoning format, because the way it reasons is by generating a lot of tokens, right?
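Nolan’s 40% heuristic could be sketched as a simple budget guard. The threshold and the window size used in the example are illustrative values for this rule of thumb, not figures from any model’s documentation:

```python
def context_usage(used_tokens: int, window_tokens: int) -> float:
    """Fraction of the model's context window currently in use."""
    return used_tokens / window_tokens

def within_safe_budget(used_tokens: int, window_tokens: int,
                       threshold: float = 0.40) -> bool:
    """Heuristic from the discussion: staying under roughly 40% of the
    context window tends to reduce degraded, hallucination-prone output."""
    return context_usage(used_tokens, window_tokens) <= threshold
```

For instance, with a hypothetical 200k-token window, 50k tokens of context passes the guard while 150k does not, which is when you would start trimming history or tool output.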
166 00:30:32.300 ⇒ 00:30:40.509 Pranav: Yeah, okay. Interesting. Yeah, that’s a good way to put it. You’re totally right, like, we could have…
167 00:30:41.450 ⇒ 00:30:46.979 Pranav: we could have made it so that, you know, reasoning wasn’t enabled. Not that that alone
168 00:30:47.780 ⇒ 00:31:07.339 Pranav: should have generated incorrect data. However, it definitely contributed to it, right? You know, these are… Sure. These are models, they’re going to act in a certain way; it’s not always going to be deterministic. You should never assume it’s going to be deterministic, but there are ways to make the system more deterministic, which is what we’re trying to do here.
169 00:31:08.190 ⇒ 00:31:17.789 Pranav: Yeah, so then what we noticed, actually, was that the model was failing to pull the information via the MCP from Shopify.
170 00:31:17.930 ⇒ 00:31:33.030 Pranav: So what it did, once it failed to get the data, is that it just generated its own, thinking that’s what the user wanted. And of course, you know, me and you, we’re human beings, we know that’s not exactly what we want. Right. So…
171 00:31:33.180 ⇒ 00:31:37.089 Pranav: What solution would you design to fix this problem?
172 00:31:38.670 ⇒ 00:31:54.179 nolanrobbins: So, basically what I’m kind of understanding is the connection itself was a little bit off. So, maybe redesigning the MCP server, maybe, you know, different API calls in regards to being able to pull that data effectively. Is there, like, a paywall behind this as well? Is there…
173 00:31:54.180 ⇒ 00:32:01.040 nolanrobbins: OAuth, you know, behind the scenes, maybe that aren’t being integrated correctly, would probably be my rationale.
174 00:32:01.090 ⇒ 00:32:02.290 nolanrobbins: To kind of move through that.
175 00:32:02.860 ⇒ 00:32:05.020 Pranav: Okay, yeah.
176 00:32:05.570 ⇒ 00:32:19.950 Pranav: Well, so, yeah, that’s definitely, like, we can revisit the MCP server, we can definitely make it more robust, that’s definitely a good option. But, you know, APIs fail. Even the best APIs are gonna fail. If you use Gemini long enough, it’s gonna fail every once in a while.
177 00:32:20.460 ⇒ 00:32:32.260 Pranav: So, under the assumption that APIs are going to fail, no matter how robust you make your MCP server around it, what else can you design into the system so that it at least fails gracefully?
178 00:32:33.800 ⇒ 00:32:36.109 nolanrobbins: Definitely, you know.
179 00:32:36.190 ⇒ 00:32:47.259 nolanrobbins: you can kind of start… like, the simplest example is probably just prompt engineering, right? You can just say, you know, if you can’t find the data, let me know, do not hallucinate, kind of thing, but…
180 00:32:47.260 ⇒ 00:32:57.930 nolanrobbins: that can also be non-deterministic in that sense, so I think it kind of comes down to context engineering, and making sure the context around that software is actually pulling from the data source that
181 00:32:58.030 ⇒ 00:33:02.130 nolanrobbins: we, as humans who have worked with the data, know it should be pulling from.
182 00:33:02.360 ⇒ 00:33:18.580 nolanrobbins: Beyond that, you could even go as far as harness engineering, in regards to how much data it’s really touching. Is it the whole Shopify MCP server? Does it know to go into the individual channel that the client is actually a part of? Something I would take a look at.
183 00:33:18.770 ⇒ 00:33:26.239 nolanrobbins: But failing gracefully has to do with logs, has to do with tracing, being able to really understand the input to the output.
184 00:33:26.830 ⇒ 00:33:33.160 nolanrobbins: Those are the things going through my mind. So I would start small, test, then go bigger, and bigger, and bigger.
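The graceful-failure design discussed here, retrying a flaky data source and then surfacing an explicit failure instead of letting the model improvise an answer, could be sketched like this. The function name, return shape, and retry parameters are hypothetical, not part of any Shopify or MCP API:

```python
import time

def fetch_with_fallback(fetch, retries=3, backoff_s=1.0):
    """Call an unreliable data source (e.g. an MCP tool backed by an
    external API) with retries and exponential backoff. On exhaustion,
    return an explicit failure marker rather than letting the model
    invent data downstream."""
    last_err = None
    for attempt in range(retries):
        try:
            return {"ok": True, "data": fetch()}
        except Exception as err:  # real code would catch narrower errors
            last_err = err
            time.sleep(backoff_s * (2 ** attempt))
    # Surface the failure explicitly so the chatbot can tell the user
    # "I couldn't retrieve your order data" instead of hallucinating.
    return {"ok": False, "error": str(last_err)}
```

The key design choice is that the caller always gets a structured result it can branch on, which is what makes the later log/trace inspection Nolan mentions possible.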
185 00:33:33.820 ⇒ 00:33:34.400 Pranav: Cool.
186 00:33:34.680 ⇒ 00:33:36.159 Pranav: Yeah, that makes sense.
187 00:33:36.360 ⇒ 00:33:41.440 Pranav: Yeah, I think that was most of, like, the questions that I had.
188 00:33:42.250 ⇒ 00:33:47.470 Pranav: happy to, like, you know, answer any questions that you may have, too, about just Brainforge, or just in general. Sure.
189 00:33:48.380 ⇒ 00:33:55.039 nolanrobbins: So, just quickly, I guess the question is: how long have you been with the company? Kind of since the beginning, or did you join a little bit after, or…
190 00:33:55.210 ⇒ 00:34:09.370 Pranav: Yeah, I joined… I think Utam started Brainforge around two, two and a half years ago. I had first met Utam last August. I was living in Austin at the time. I’m actually moving back to Austin in the next…
191 00:34:09.370 ⇒ 00:34:12.730 nolanrobbins: Oh, nice, cool. There you go. Alright. What part of town? What part?
192 00:34:13.040 ⇒ 00:34:16.690 Pranav: I will be… oh, you live in Austin, so you probably know.
193 00:34:16.690 ⇒ 00:34:17.630 nolanrobbins: Yeah, yeah.
194 00:34:17.630 ⇒ 00:34:21.130 Pranav: Right near Franklin’s, actually. There’s, like, an apartment building over there.
195 00:34:21.530 ⇒ 00:34:24.860 nolanrobbins: 100%. Hopefully you have one of those nice views of the city, man, that’d be nice.
196 00:34:24.860 ⇒ 00:34:25.770 Pranav: Yeah, yeah, no.
197 00:34:26.540 ⇒ 00:34:28.189 Pranav: I’m excited to live over there, it’s gonna be fun.
198 00:34:28.190 ⇒ 00:34:33.700 nolanrobbins: East is nice. Yeah, I’m more in South Austin, an area called St. Elmo, that’s kind of the region, but beside the point.
199 00:34:33.909 ⇒ 00:34:52.829 Pranav: Okay, cool. That’s awesome, that’s awesome. But yeah, so, I met him then, and we talked a little bit. At that point, like, the company was pretty small. Then they had a really big, kind of, growing period last December, and so that’s when I joined, I think at the beginning of December.
200 00:34:53.230 ⇒ 00:35:05.659 nolanrobbins: Okay, because basically, you know, how Utam was kind of talking to me about it, like, this company started as more of, like, a data analysis, data validation kind of company, and now it’s really grown into the AI solution space, and what have you, so…
201 00:35:05.660 ⇒ 00:35:17.229 nolanrobbins: I guess more of my question is, you’ve been hands-on with this for a good bit of time, with the clients that y’all have. How do you really see that transitioning within the company? Is that something that’s gonna continue to broaden in terms of…
202 00:35:17.240 ⇒ 00:35:25.439 nolanrobbins: you know, more maybe agentic frameworks, different AI integrations that are more on the cutting edge of things, or do these solutions really tend to be
203 00:35:25.940 ⇒ 00:35:32.850 nolanrobbins: You know, simple to start, with the expectation that we retain the customer and, you know, kind of expand from there, you know.
204 00:35:33.210 ⇒ 00:35:33.970 nolanrobbins: Moving forward.
205 00:35:33.970 ⇒ 00:35:35.280 Pranav: Yeah, I think,
206 00:35:35.990 ⇒ 00:35:46.299 Pranav: Probably how most consulting firms are gonna operate, or how they should probably operate, is: what is the biggest problem the client is having, and how can we build them a solution that is…
207 00:35:46.310 ⇒ 00:36:05.339 Pranav: great, but then also we can build it quickly for them, right? Those are, like, the best things. Like, what’s the biggest problem? What’s the, like, thing that we can fix the fastest for them? If we can maximize both those things, like, the client’s gonna be the most happy, and they’re gonna be more open to renewing, and then also expanding scope for future projects.
208 00:36:06.540 ⇒ 00:36:09.609 Pranav: And so, that’s kind of, like, how, you know, just…
209 00:36:10.040 ⇒ 00:36:15.130 Pranav: probably, like, we’re structuring deals as well. In terms of, like.
210 00:36:15.310 ⇒ 00:36:19.849 Pranav: how we’re operating internally, which I think is super unique, is,
211 00:36:19.980 ⇒ 00:36:29.769 Pranav: We are trying to find repeatable workflows, so that whenever we get a new client, we can say, hey, this is very similar to what we built for, you know, XYZ.
212 00:36:29.770 ⇒ 00:36:30.310 nolanrobbins: Yeah.
213 00:36:30.310 ⇒ 00:36:38.180 Pranav: And so, we build a playbook, essentially, and then we can just throw it to a Claude agent, and it can build out that entire thing again.
214 00:36:38.350 ⇒ 00:36:42.150 Pranav: And so I think that’s where we’re really seeing, like.
215 00:36:42.600 ⇒ 00:37:01.049 Pranav: AI helping as well, internally, in terms of just, like, building things out. Like, being very comfortable with letting Claude agents fill in gaps where they can fill in gaps, and then knowing where you should be a human assessing the output, or a human actually building the code or infrastructure.
216 00:37:01.080 ⇒ 00:37:05.810 Pranav: Yeah, so I think that’s what’s actually a little bit more unique about Brainforge.
217 00:37:06.220 ⇒ 00:37:06.910 nolanrobbins: Cool.
218 00:37:07.240 ⇒ 00:37:25.840 nolanrobbins: So, how would you say, as an engineer for Brainforge as well, you know, AI is evolving rapidly, right? New things are coming out every week, new models, etc. How do you best cut through the noise and kind of see more of, like, you know, what is actually the future of AI and being able to work that into your workflow?
219 00:37:26.490 ⇒ 00:37:28.000 Pranav: Yeah, totally.
220 00:37:28.580 ⇒ 00:37:34.809 Pranav: I think, yeah, like you said, there’s probably, like, a lot of noise out there, but also there’s a lot of just,
221 00:37:35.140 ⇒ 00:37:44.060 Pranav: real, actual innovation that’s happening at a pace that is just way faster than, honestly, anything that I’ve ever experienced before.
222 00:37:45.080 ⇒ 00:37:47.370 Pranav: And so, yeah, like…
223 00:37:47.790 ⇒ 00:38:00.440 Pranav: Claude Cowork is, like, an example of something that’s actually providing people a lot of value, you know? Sure. Also, tokens as well, like, people are talking about how it’s very expensive, which is definitely the case.
224 00:38:00.590 ⇒ 00:38:06.329 Pranav: But let’s put that aside, because that’s more of, like, a financial thing. Sure. In terms of, the actual…
225 00:38:06.540 ⇒ 00:38:20.349 Pranav: outcomes that it’s providing to people, some of our clients are like, yeah, what are you doing that Claude Cowork can do for me? That’s an actual question people ask us, right? And so it requires us to be more innovative in thinking, okay, what is the gap that we’re filling?
226 00:38:20.370 ⇒ 00:38:37.289 Pranav: at one point, it was really just, how can we build a… maybe, like, a couple years ago, like, what is a ChatGPT wrapper that you can build that is just maybe automated to run, like, 100 times in a row, or just, like, doing one agentic thing? Now, it’s much more complex. How can we…
227 00:38:37.500 ⇒ 00:38:44.189 Pranav: How can we leverage embeddings? How can we, really think about that context layer?
228 00:38:45.220 ⇒ 00:39:00.980 Pranav: things of that nature. How can we think about, okay, not just building a system for you, but one that has visibility across an entire organization? You know, these companies, and then products like Claude Cowork, they do a good job with integrating certain tools. Sure.
229 00:39:01.200 ⇒ 00:39:09.569 Pranav: And, like, for your specific use, but there’s still gaps there, too. Like, maybe it can’t integrate every single application, right?
230 00:39:10.730 ⇒ 00:39:21.340 Pranav: Maybe it’s not gonna dive deep into every small file, right? Because it doesn’t know exactly what you’re trying to do. So, it really requires you to put on your product mind.
231 00:39:21.660 ⇒ 00:39:22.370 nolanrobbins: I understand.
232 00:39:22.370 ⇒ 00:39:24.820 Pranav: What is the technical solution that you can build, and then…
233 00:39:25.180 ⇒ 00:39:30.940 Pranav: it’s also… it’s very quick… you can… you can fall behind pretty quickly, as you probably know, right?
234 00:39:30.940 ⇒ 00:39:32.460 nolanrobbins: Yes, absolutely.
235 00:39:32.990 ⇒ 00:39:44.419 Pranav: moving week to week, month to month, and so if you’re not on top of, like, what has dropped this month, you might be building kind of, like, old technology.
236 00:39:44.590 ⇒ 00:39:51.930 Pranav: So, I think, yeah, it’s just… we’re full of, like, AI engineers here that,
237 00:39:52.140 ⇒ 00:40:00.900 Pranav: think about things in this way. We’re constantly challenging timelines, and we’re also, like, constantly
238 00:40:01.140 ⇒ 00:40:07.389 Pranav: questioning how heavy, like, an application needs to be. Do we need
239 00:40:07.910 ⇒ 00:40:11.870 Pranav: data warehousing, or can we use a CLI tool instead?
240 00:40:12.080 ⇒ 00:40:13.140 Pranav: So
241 00:40:13.400 ⇒ 00:40:30.069 Pranav: things of that nature. What they’re always kind of doing is driving down the time it takes to get to production, really. So it’s like, we’re trying to make decisions that are gonna create a product that still performs exactly how the client wants it, but we’re not…
242 00:40:30.070 ⇒ 00:40:37.359 Pranav: creating so much technical complexity that our roadmap ends up being months and months instead of, you know, weeks or even days.
243 00:40:37.500 ⇒ 00:40:38.210 nolanrobbins: Right.
244 00:40:38.450 ⇒ 00:40:55.200 nolanrobbins: I think you kind of hit on a couple of those points. It is that trade-off of, like, yeah, this new thing came out, but it might be over-engineering for the actual solution, right? So, like, having that domain expertise in AI, in software engineering, in computer fundamentals, of just, like, hey, you know, we could make an agent for this, or we could…
245 00:40:55.200 ⇒ 00:41:00.179 nolanrobbins: just not, and have, like, a little system prompt or something that kind of runs in the background, you know?
246 00:41:00.180 ⇒ 00:41:08.840 nolanrobbins: That’s kind of what I struggle with, and sometimes with my clients, it’s like, they have this vision, oh, they saw the Claude thing, whatever, like, they want an agent now, and it’s just like…
247 00:41:08.870 ⇒ 00:41:16.739 nolanrobbins: Do you know how non-deterministic the agent is? Do you know how, you know, autonomous this is? It could go off the rails if we’re not careful. And the…
248 00:41:16.930 ⇒ 00:41:33.450 nolanrobbins: you know, the complications of having a multi-agentic system as well, that’s its own separate thing. So, all that aside, I want to be respectful of your time. I just… I’ll ask one simple last question, more about the culture of things, how you kind of work in a team. You know, I talked to…
249 00:41:33.730 ⇒ 00:41:38.409 nolanrobbins: I forget his name, he’s from Cleveland, Ohio.
250 00:41:38.670 ⇒ 00:41:41.270 nolanrobbins: Sam, Sam, that’s right, yeah, Sam. Yeah.
251 00:41:41.830 ⇒ 00:41:49.550 nolanrobbins: Is he, like, your boss, kind of thing? How big is your team, and how do you guys kind of work together to really, you know, address client issues on the day-to-day?
252 00:41:50.410 ⇒ 00:42:06.300 Pranav: Yeah, so, kind of how the AI service is currently structured is… yeah, so Sam, he is kind of, I guess we would say, like, our senior individual contributor. So, technical, has a lot of software engineering experience,
253 00:42:06.500 ⇒ 00:42:20.249 Pranav: and, yeah, he’s an AI engineer. So, we have a couple other AI engineers as well. What I do is… I’m also an AI engineer, I guess, but what I also do is work directly with the client.
254 00:42:20.350 ⇒ 00:42:29.440 Pranav: And so, I’m kind of taking in the client requirements, or the client problems, and then building, like, a technical…
255 00:42:29.570 ⇒ 00:42:48.479 Pranav: a technical approach, a technical design for an application, and then I’m kind of going to Sam and to our other two engineers to say, hey, can we execute this? What are… what are certain blockers here? Yeah, so…
256 00:42:48.600 ⇒ 00:42:54.909 Pranav: basically what we all do, essentially… like, no one reports to me, and I don’t report to anybody besides just, like, kind of Utam.
257 00:42:54.910 ⇒ 00:42:55.460 nolanrobbins: Sure.
258 00:42:55.460 ⇒ 00:42:57.990 Pranav: So, yeah, if you were to join, it would be pretty similar.
259 00:42:58.580 ⇒ 00:43:16.399 nolanrobbins: Okay, very cool. Yeah, Utam seems really nice. He’s a driven young man, honestly. You can kind of tell he comes from that engineering background. He saw those problems happen in the real day-to-day with the companies he was a part of, and, yeah, he’s looking to change that culture and move forward, so…
260 00:43:16.400 ⇒ 00:43:26.650 nolanrobbins: You know, I want to thank you for your time today, Mr. Pranav. Yeah. I’ll be respectful of your time as well. And I guess, any next steps… I forget, Kayla will be in contact with me, maybe, or…
261 00:43:26.650 ⇒ 00:43:29.060 Pranav: Yeah, yeah, Kayla will definitely give you the next steps.
262 00:43:29.330 ⇒ 00:43:29.950 nolanrobbins: Cool.
263 00:43:30.190 ⇒ 00:43:37.269 nolanrobbins: Well, nice. I enjoyed my time with you today, and good luck moving back to Austin and everything, up on the east side there.
264 00:43:37.450 ⇒ 00:43:40.350 Pranav: Yeah, yeah, I appreciate it. Nice talking to you, Nolan. Have a good one.
265 00:43:40.350 ⇒ 00:43:42.550 nolanrobbins: Alright, take care, bro.