Meeting Title: Brainforge Project Workflow Discussion Date: 2026-04-30 Meeting participants: Brylle Girang, Michele Altomare
WEBVTT
1 00:02:01.280 ⇒ 00:02:03.879 Michele Altomare: Okay… let’s see…
2 00:02:05.110 ⇒ 00:02:05.670 Brylle Girang: Hello.
3 00:02:06.660 ⇒ 00:02:07.860 Michele Altomare: Hello?
4 00:02:08.400 ⇒ 00:02:09.490 Brylle Girang: How are you doing?
5 00:02:09.889 ⇒ 00:02:11.359 Michele Altomare: Good, how you doing?
6 00:02:11.640 ⇒ 00:02:13.320 Brylle Girang: Great! Amazing!
7 00:02:13.790 ⇒ 00:02:16.470 Michele Altomare: How has the week been so far?
8 00:02:17.520 ⇒ 00:02:19.999 Brylle Girang: How would I say it?
9 00:02:20.300 ⇒ 00:02:27.009 Brylle Girang: I think most of my week has been dedicated to making sure that people who need help
10 00:02:27.190 ⇒ 00:02:29.370 Brylle Girang: actually get the help that they need.
11 00:02:29.540 ⇒ 00:02:31.810 Brylle Girang: So, it has been challenging.
12 00:02:32.040 ⇒ 00:02:33.699 Brylle Girang: Well, we’re working,
13 00:02:33.920 ⇒ 00:02:47.749 Brylle Girang: I’m going to share this with you. We are working with data engineers and then analytics engineers, you know, software developers, and I’m not one of those, so I’m trying to make sure that, you know, I get to keep up with their bandwidth.
14 00:02:48.350 ⇒ 00:02:50.239 Michele Altomare: Yeah, that’s funny.
15 00:02:50.650 ⇒ 00:02:52.880 Michele Altomare: I imagine. It sounded…
16 00:02:53.210 ⇒ 00:03:00.300 Michele Altomare: like, this week for you might have been a lot of that. I met Lisa today, we talked for almost 2 hours, and she’s like, yeah, like…
17 00:03:00.450 ⇒ 00:03:04.420 Michele Altomare: Working with bees, it’s like… your world sounds busy, dude, so…
18 00:03:04.570 ⇒ 00:03:07.849 Michele Altomare: I guess this call, in a way, is adding to that list of challenges.
19 00:03:07.850 ⇒ 00:03:18.879 Brylle Girang: No, no, no. I mean, you know, everyone’s busy, we just have different priorities, and my priority is to make sure that I get busy with people, so that’s a good thing.
20 00:03:19.370 ⇒ 00:03:25.110 Michele Altomare: By the way, kudos on the… The L&D…
21 00:03:26.220 ⇒ 00:03:26.790 Brylle Girang: The courses?
22 00:03:26.790 ⇒ 00:03:29.040 Michele Altomare: It’s very cool. The course is very solid.
23 00:03:29.560 ⇒ 00:03:32.739 Brylle Girang: Oh, thank you, thank you. Was it, was it really helpful?
24 00:03:33.340 ⇒ 00:03:48.519 Michele Altomare: It was super… it was super helpful. The only… the only thing that I couldn’t… not even figure out, and I don’t know if I messaged you about this, one of the last modules, I think in the Accelerator, Module 4, the pages were split into a bunch of subpages.
25 00:03:48.790 ⇒ 00:03:50.269 Michele Altomare: There’s, like, a formatting thing.
26 00:03:50.480 ⇒ 00:03:57.870 Michele Altomare: I don’t think it was… it was something even you did. It just looked like the chapters got cut into a bunch of small chapters.
27 00:03:58.290 ⇒ 00:03:59.360 Michele Altomare: I can find it.
28 00:03:59.980 ⇒ 00:04:02.000 Brylle Girang: That is weird. Okay, let me check.
29 00:04:02.000 ⇒ 00:04:07.600 Michele Altomare: But that was… but that’s, like, a tiny thing. Like, that’s not…
30 00:04:08.300 ⇒ 00:04:10.239 Michele Altomare: But I thought it was a pretty…
31 00:04:10.890 ⇒ 00:04:17.090 Michele Altomare: helpful tool. I can imagine for people that are just diving into it for the first time, it’s like…
32 00:04:20.079 ⇒ 00:04:21.990 Michele Altomare: Are people going through it?
33 00:04:28.140 ⇒ 00:04:28.740 Michele Altomare: That’s.
34 00:04:28.740 ⇒ 00:04:31.770 Brylle Girang: Dude, I don’t… I don’t see it on my end.
35 00:04:32.530 ⇒ 00:04:36.160 Michele Altomare: Let me see… It was… dude, it was very small, it was, like, 2 pages.
36 00:04:36.160 ⇒ 00:04:36.720 Brylle Girang: Yeah.
37 00:04:37.120 ⇒ 00:04:40.519 Michele Altomare: Unless it might have also just been something on the formatting on my side.
38 00:04:40.630 ⇒ 00:04:41.470 Michele Altomare: Like…
39 00:04:48.470 ⇒ 00:04:49.240 Brylle Girang: I think…
40 00:04:49.240 ⇒ 00:04:52.840 Michele Altomare: And it was probably just a local thing then, which is good.
41 00:04:53.590 ⇒ 00:04:54.170 Brylle Girang: Yeah.
42 00:04:58.920 ⇒ 00:04:59.750 Michele Altomare: Let’s see…
43 00:05:00.610 ⇒ 00:05:07.020 Michele Altomare: I’m clicking through all of it. Okay, dude, I can’t find it. I don’t know where it was. Maybe… I don’t know if it was the Claude agents?
44 00:05:07.950 ⇒ 00:05:10.029 Michele Altomare: It was somewhere, but it was, like, a very…
45 00:05:10.030 ⇒ 00:05:10.430 Brylle Girang: Okay.
46 00:05:10.430 ⇒ 00:05:15.310 Michele Altomare: It just looked like it might have been, like, an HTTP thing, or just something in the browser, like…
47 00:05:15.870 ⇒ 00:05:18.969 Michele Altomare: paragraphs were broken up into, like, sentences, and I was like, oh.
48 00:05:18.970 ⇒ 00:05:20.210 Brylle Girang: Oh, okay.
49 00:05:20.710 ⇒ 00:05:27.309 Brylle Girang: That’s interesting. Thank you for letting me know. Oh, okay. I think that’s it, in the Quick Start course, I think.
50 00:05:28.360 ⇒ 00:05:31.240 Brylle Girang: Quick Start Course Module 4.
51 00:05:31.710 ⇒ 00:05:33.539 Brylle Girang: the GitHub? The GitHub?
52 00:05:33.650 ⇒ 00:05:34.430 Brylle Girang: chapters.
53 00:05:34.430 ⇒ 00:05:37.329 Michele Altomare: Maybe. Yes, exactly.
54 00:05:37.590 ⇒ 00:05:38.130 Brylle Girang: Oh, wow.
55 00:05:38.180 ⇒ 00:05:41.630 Michele Altomare: Everything after subsection 4.4?
56 00:05:42.040 ⇒ 00:05:43.199 Michele Altomare: You see what I mean?
57 00:05:43.520 ⇒ 00:05:45.490 Brylle Girang: Yeah, yeah, yeah, that’s weird.
58 00:05:46.860 ⇒ 00:05:49.090 Brylle Girang: Mmm, okay, I think I see it.
59 00:05:49.850 ⇒ 00:05:58.949 Brylle Girang: Okay, so, yeah. Thank you for pointing this out. I think this is because the summary changes impact our headings, and then the platform
60 00:05:59.330 ⇒ 00:06:02.990 Brylle Girang: Separates the headings from each other, but…
61 00:06:03.500 ⇒ 00:06:15.929 Brylle Girang: Okay, the… wow, this is… this is funny. But thank you, thank you for the feedback. I’m really glad that you were able to go through that course. I’m really glad that you dedicated time into actually looking into that.
62 00:06:16.260 ⇒ 00:06:19.309 Brylle Girang: But, yeah, that’s, that’s cool.
63 00:06:19.860 ⇒ 00:06:20.330 Michele Altomare: It was very similar.
64 00:06:20.330 ⇒ 00:06:20.790 Brylle Girang: I…
65 00:06:23.150 ⇒ 00:06:29.100 Brylle Girang: I was looking at this thing that you have shared, the job with Ray.
66 00:06:29.830 ⇒ 00:06:35.140 Michele Altomare: Yeah, sorry I didn’t send it earlier, I was running around and I thought I had sent this. That’s why I’m…
67 00:06:35.420 ⇒ 00:06:40.690 Brylle Girang: No worries. Yeah, would you mind, like, let’s go through it together, and would you mind, like, walking me through?
68 00:06:40.690 ⇒ 00:06:42.219 Michele Altomare: Yeah, we can run through it. It’s…
69 00:06:43.100 ⇒ 00:06:48.580 Michele Altomare: I also, as you can probably already tell, this was, like, an output from Cursor and Claude, but basically.
70 00:06:52.380 ⇒ 00:06:56.589 Michele Altomare: I had my knowledge before Brainforge, and then knowledge now going into it.
71 00:06:59.690 ⇒ 00:07:02.889 Michele Altomare: Before Brainforge, a lot of what I’ve done, again, is in the
72 00:07:03.370 ⇒ 00:07:09.310 Michele Altomare: production side of things, generating content, I can create videos and static images
73 00:07:09.680 ⇒ 00:07:16.250 Michele Altomare: one by one, right? Almost like the analogy that I think of is, like, your personal home kitchen.
74 00:07:16.500 ⇒ 00:07:20.040 Michele Altomare: I have my pots and pans. I can generate something on Higgsfield.
75 00:07:20.150 ⇒ 00:07:23.760 Michele Altomare: something in ChatGPT images, whatever.
76 00:07:23.860 ⇒ 00:07:35.700 Michele Altomare: what I’m trying to think through now, and this is, like, after Brainforge, which is what I was talking about with Utam, is how can we systematize, or just…
77 00:07:35.960 ⇒ 00:07:40.150 Michele Altomare: What I’ve never figured out how to do, at least not yet, is the handoffs.
78 00:07:40.440 ⇒ 00:07:44.399 Michele Altomare: of… the tasks in the production process, so, like…
79 00:07:44.610 ⇒ 00:07:50.350 Michele Altomare: the writing, to generation, to checking it with a smaller sub-agent, I presume.
80 00:07:50.660 ⇒ 00:07:52.979 Michele Altomare: And then handing off again.
81 00:07:52.980 ⇒ 00:07:53.750 Brylle Girang: Yeah.
82 00:07:53.750 ⇒ 00:07:57.550 Michele Altomare: Utam and I started vibe-coding something with OpenCode, and he ran…
83 00:07:57.790 ⇒ 00:08:03.710 Michele Altomare: this tool called ComfyUI to generate some images with local inference on his Mac Mini.
84 00:08:04.410 ⇒ 00:08:05.110 Brylle Girang: Okay.
85 00:08:05.110 ⇒ 00:08:07.189 Michele Altomare: That was a very specific thing, but…
86 00:08:08.260 ⇒ 00:08:11.559 Michele Altomare: that’s kind of what I’m trying to think through now.
87 00:08:11.910 ⇒ 00:08:16.520 Michele Altomare: is how to… Chain together some of the…
88 00:08:16.990 ⇒ 00:08:22.940 Michele Altomare: some of those processes. Maybe it is with, like, Claude agents, or with skills, like what was in L&D.
89 00:08:22.940 ⇒ 00:08:23.470 Brylle Girang: Yeah.
90 00:08:23.470 ⇒ 00:08:29.930 Michele Altomare: The second piece, and then I’ll bring it back, is that I’ve done… dude, I don’t know.
91 00:08:30.400 ⇒ 00:08:40.270 Michele Altomare: Wow, like, thousands of, like, these sort of prompts and things from, like, the last 4 years, because I’ve also just been tinkering with it, but it lives inside of, like.
92 00:08:40.870 ⇒ 00:08:50.890 Michele Altomare: ChatGPT projects, Claude projects, I’ve exported all of them, put them into my Obsidian, but still, like, there’s context, it’s in a bunch of different silos.
93 00:08:51.150 ⇒ 00:08:58.050 Michele Altomare: And I’m just wondering how to have, to whatever degree is possible, like, Synchronization.
94 00:08:58.290 ⇒ 00:09:02.619 Michele Altomare: between those ideas, so I can try to port them into Brainforge.
95 00:09:02.770 ⇒ 00:09:10.560 Michele Altomare: And then… start to, like, change some of those things together. And I… there’s some examples in the… Notion.
96 00:09:10.670 ⇒ 00:09:12.899 Michele Altomare: If that makes sense. Yeah. I’ll lock that.
97 00:09:13.600 ⇒ 00:09:22.030 Brylle Girang: Yeah, I think, what I’m understanding here is that there are two parts. The first part is how can you move from
98 00:09:22.180 ⇒ 00:09:29.460 Brylle Girang: Like, individually prompting and individually creating the outputs into more of a…
99 00:09:29.800 ⇒ 00:09:33.749 Brylle Girang: consecutive, as you mentioned, orchestrated movement.
100 00:09:33.750 ⇒ 00:09:34.420 Michele Altomare: Yep.
101 00:09:34.640 ⇒ 00:09:40.980 Brylle Girang: Right? And then the second part is, how can you synthesize your…
102 00:09:41.970 ⇒ 00:09:44.649 Brylle Girang: current knowledge that is in your Obsidian.
103 00:09:44.860 ⇒ 00:09:48.230 Brylle Girang: And convert it to something that we can use in the platform.
104 00:09:48.590 ⇒ 00:09:49.179 Brylle Girang: Is that right?
106 00:09:49.920 ⇒ 00:09:50.590 Michele Altomare: Correct.
107 00:09:50.590 ⇒ 00:09:51.400 Brylle Girang: Okay.
108 00:09:51.910 ⇒ 00:10:01.449 Brylle Girang: Okay, and in the first part, I guess I wanted to know, like, what our guidelines and limitations here are. Are we expecting that we’re going to be
109 00:10:01.610 ⇒ 00:10:07.670 Brylle Girang: Like, build our own orchestration? Or are we open to, like, using
110 00:10:08.350 ⇒ 00:10:11.570 Brylle Girang: other orchestration tools, let’s say n8n, Make?
111 00:10:14.540 ⇒ 00:10:18.540 Michele Altomare: I don’t know what I don’t know. I believe that you know.
112 00:10:18.540 ⇒ 00:10:21.250 Brylle Girang: So… Gotcha.
113 00:10:21.250 ⇒ 00:10:21.600 Michele Altomare: I will say.
114 00:10:21.600 ⇒ 00:10:22.010 Brylle Girang: So.
115 00:10:22.010 ⇒ 00:10:25.129 Michele Altomare: When we were messing with,
116 00:10:25.280 ⇒ 00:10:29.940 Michele Altomare: again, the tool was called ComfyUI, and Utam was calling it from…
117 00:10:29.940 ⇒ 00:10:30.389 Brylle Girang: Yeah, yeah.
118 00:10:30.390 ⇒ 00:10:33.729 Michele Altomare: the code sitting on top of the Brainforge platform.
119 00:10:35.950 ⇒ 00:10:43.740 Michele Altomare: like… It was the whole process, and I think to what you just said, that was creating the orchestration.
120 00:10:44.080 ⇒ 00:10:48.060 Michele Altomare: Like, we explicitly said, like, don’t use N8N, don’t use outside tools.
121 00:10:48.480 ⇒ 00:10:57.699 Michele Altomare: I’m wondering if, for the sake of practicality, even to try something, what would be the shortest path to an MVP, if that makes sense?
122 00:10:57.700 ⇒ 00:11:06.820 Brylle Girang: Yeah, yeah. I think I have seen so many people using N8N to actually do this. Like, what happens is.
123 00:11:07.040 ⇒ 00:11:14.220 Brylle Girang: There’s a trigger, it creates… there’s, like, a content generation agent, which creates, like, the story.
124 00:11:15.170 ⇒ 00:11:29.000 Brylle Girang: on to, like, a text-to-speech agent, which creates the voice, then it gets passed on to, like, a video creator agent, which creates, well, the visuals, and then that’s the final output, but that’s using N8N.
125 00:11:29.170 ⇒ 00:11:36.950 Brylle Girang: If we’re, like, going to do it in-house without those tools, I think that’s something that I need to
126 00:11:37.530 ⇒ 00:11:39.360 Brylle Girang: research better.
127 00:11:39.960 ⇒ 00:11:46.769 Brylle Girang: But when it comes to, like, the MVP, I think we… I think it is worth trying, like, n8n or Make.
128 00:11:46.990 ⇒ 00:11:49.829 Brylle Girang: Just to see… just for us to get a sense of
129 00:11:49.940 ⇒ 00:11:52.030 Brylle Girang: What our final product should be.
130 00:11:52.230 ⇒ 00:12:00.250 Brylle Girang: And then let’s slowly transition to using our in-house tools. Okay, this is possible using n8n, and this is possible using Make.
131 00:12:00.500 ⇒ 00:12:07.260 Brylle Girang: how can we slowly convert the steps into something that’s going to run locally within Brainforge?
132 00:12:07.600 ⇒ 00:12:09.219 Brylle Girang: I think that’s how I’m envisioning it.
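The n8n-style flow described here (trigger, then a content-generation agent, then text-to-speech, then video) can be sketched as a plain sequential pipeline. This is an illustrative sketch only: the stage functions are placeholders, not real agent or tool APIs, and an in-house version would swap each placeholder for a Brainforge step.

```python
# Minimal sketch of a sequential content pipeline: script -> voice -> video.
# Every stage implementation below is a placeholder, not a real API call.
from typing import Callable

Stage = Callable[[dict], dict]

def write_script(job: dict) -> dict:
    # Placeholder for a content-generation agent.
    job["script"] = f"Story about: {job['topic']}"
    return job

def synthesize_voice(job: dict) -> dict:
    # Placeholder for a text-to-speech agent.
    job["audio"] = f"audio({job['script']})"
    return job

def render_video(job: dict) -> dict:
    # Placeholder for a video-creation agent.
    job["video"] = f"video({job['audio']})"
    return job

def run_pipeline(job: dict, stages: list[Stage]) -> dict:
    # Each stage receives the accumulated job dict and hands it on;
    # this handoff is exactly what the orchestrator has to manage.
    for stage in stages:
        job = stage(job)
    return job

result = run_pipeline({"topic": "AI news"},
                      [write_script, synthesize_voice, render_video])
```

Because each stage has the same signature, a hosted tool (n8n, Make) and a local Brainforge step are interchangeable, which is the "slowly convert the steps" idea.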
133 00:12:09.760 ⇒ 00:12:12.029 Michele Altomare: Exactly. Because I could see…
134 00:12:12.360 ⇒ 00:12:17.250 Michele Altomare: I can see us getting to the point, because it seems like it would be more economical, and just…
135 00:12:17.760 ⇒ 00:12:21.799 Michele Altomare: All vertically integrated of, like, rebuilding this content wheel.
136 00:12:22.600 ⇒ 00:12:22.990 Brylle Girang: Yeah.
137 00:12:22.990 ⇒ 00:12:28.479 Michele Altomare: no analogy is perfect, but I would also think that just borrowing, like you said, whether it’s Make or N8N,
138 00:12:28.590 ⇒ 00:12:33.210 Michele Altomare: From what already exists and stitching it, just to see… Does that make sense?
139 00:12:33.600 ⇒ 00:12:38.630 Brylle Girang: Yeah, yeah, exactly. What’s the end goal here? Like, is this just meant for us to…
140 00:12:38.750 ⇒ 00:12:44.250 Brylle Girang: Like, improve our efficiency, or is this going to be, like, a revenue-generating service in the future?
141 00:12:44.250 ⇒ 00:12:52.290 Michele Altomare: So, I was talking with the… I was talking about this with Lisa today, who is leading partnerships, and she wants to…
142 00:12:52.530 ⇒ 00:13:00.649 Michele Altomare: just… with stakeholders at companies, or in how we present, in our outbound content.
143 00:13:01.700 ⇒ 00:13:03.530 Michele Altomare: She, like, in general.
144 00:13:03.550 ⇒ 00:13:19.400 Michele Altomare: me, her, you, Utam. We’re just trying to think of ways to level up the content. I could make anything with a physical camera and all this other stuff by myself, but I’m trying to think of how to systematize it. So, like, this example, I don’t know exactly how we would use this specific idea.
145 00:13:19.400 ⇒ 00:13:27.390 Michele Altomare: But the principles would be evergreen. On the notion that I sent you, there’s this, like, tweet from Silicon Mania. Do I look like a fucking…
146 00:13:27.910 ⇒ 00:13:31.689 Michele Altomare: with Jensen Huang in the beginning, he’s like, do I look like a fucking… I don’t know what he’s
147 00:13:32.730 ⇒ 00:13:33.919 Michele Altomare: But see, all that is, it’s like.
148 00:13:33.920 ⇒ 00:13:34.420 Brylle Girang: Yeah.
149 00:13:34.800 ⇒ 00:13:36.210 Michele Altomare: 8 news stories.
150 00:13:36.780 ⇒ 00:13:42.429 Michele Altomare: They made a scene, there’s characters, they have a dialogue, and they talk, and then it puts them together.
151 00:13:43.040 ⇒ 00:13:44.730 Michele Altomare: I don’t know how difficult that is.
152 00:13:44.880 ⇒ 00:13:47.789 Michele Altomare: If I did it by hand, I wouldn’t know how to do it. It’s like, okay.
153 00:13:48.050 ⇒ 00:13:53.570 Michele Altomare: Take the stories, make them into sentences, give it to a character, generate that sound.
154 00:13:54.500 ⇒ 00:13:55.150 Michele Altomare: So…
155 00:13:55.150 ⇒ 00:13:55.940 Brylle Girang: Yep, yep.
156 00:13:56.470 ⇒ 00:14:00.969 Michele Altomare: I’m talking a lot, but I’m just trying to think through… Yep. …if this was, like, a test, right?
157 00:14:01.300 ⇒ 00:14:05.869 Michele Altomare: To see what the motion looks like, specifically for, like.
158 00:14:06.010 ⇒ 00:14:08.700 Michele Altomare: generative AI, as far as, like.
159 00:14:09.040 ⇒ 00:14:14.240 Michele Altomare: Image and video, which seems different from a lot of what Brainforge’s done up to now.
160 00:14:15.040 ⇒ 00:14:16.860 Brylle Girang: Yeah, that makes sense. Okay.
161 00:14:17.280 ⇒ 00:14:23.760 Brylle Girang: Okay, I think my next question here is, before we build, like,
162 00:14:23.870 ⇒ 00:14:28.569 Brylle Girang: the entire car. Have we tried building the individual parts first?
163 00:14:29.950 ⇒ 00:14:38.700 Brylle Girang: Like, have you tried, like, creating, you know, just pulling off a really solid, like, skill that creates scripts?
164 00:14:38.930 ⇒ 00:14:42.349 Brylle Girang: A really solid skill that helps us create videos, etc.
165 00:14:42.620 ⇒ 00:14:51.890 Michele Altomare: Not yet, but I assume, and it sounds like that would be the first step. It’s like, parse the news, create characters, give them a script, and then…
166 00:14:52.510 ⇒ 00:14:56.729 Michele Altomare: So is that how you would see it being done? Make the skills individually, like, 10 of them?
167 00:14:57.660 ⇒ 00:14:58.479 Brylle Girang: Yeah, I think…
168 00:15:00.120 ⇒ 00:15:10.559 Brylle Girang: well, we can argue about this, but I think that’s the best way that… I think that… that’s how I would approach it. Like, before I try to build, like, the connections.
169 00:15:10.670 ⇒ 00:15:13.459 Brylle Girang: I’m going to try and build individually first.
171 00:15:14.710 ⇒ 00:15:16.340 Brylle Girang: I’m going to focus on the script.
172 00:15:16.440 ⇒ 00:15:20.690 Brylle Girang: How can I create a skill that creates a script in one go?
173 00:15:20.860 ⇒ 00:15:22.670 Brylle Girang: That… that’s actually solid.
174 00:15:22.870 ⇒ 00:15:24.140 Brylle Girang: I have that script.
175 00:15:24.590 ⇒ 00:15:32.140 Brylle Girang: What tool can I use that helps me create, like, a solid video? HeyGen, it might be HeyGen, it might be Higgsfield.
176 00:15:32.280 ⇒ 00:15:32.910 Brylle Girang: Chatter up?
177 00:15:33.570 ⇒ 00:15:43.900 Brylle Girang: So sort of going through it manually, step by step, and then once it’s solid, like, hey, these tools are working, these skills are perfect, how can we connect those?
178 00:15:46.060 ⇒ 00:15:47.070 Michele Altomare: That makes sense.
179 00:15:48.000 ⇒ 00:15:57.719 Michele Altomare: those skills, we would make them similarly to how it was in L&D. They can just be made inside of a Cursor agent, and say our end goal is to make a skill.md that does this thing.
180 00:15:58.310 ⇒ 00:15:59.140 Brylle Girang: Exactly.
181 00:15:59.200 ⇒ 00:16:11.620 Brylle Girang: Exactly, and as much as possible, I think it’s going to revolve around, you know, the prompts, the formats, like, how would we define excellence when it comes to the… when it comes to our outputs?
182 00:16:11.670 ⇒ 00:16:22.579 Brylle Girang: And then we need to make sure that those are transferable across tools. I’m pretty sure we’re not going to be forever using HeyGen. How can we make sure that we create a prompt, or we create something that’s
183 00:16:22.870 ⇒ 00:16:26.440 Brylle Girang: That we can use, you know, across Higgsfield, Sora, HeyGen, etc.
184 00:16:26.440 ⇒ 00:16:27.160 Michele Altomare: Yeah.
185 00:16:27.980 ⇒ 00:16:39.720 Brylle Girang: So I think that’s going to be the important first step. Because if, you know, if we have the orchestrator, and then the individual steps are not solid, then it’s… it’s not going to be successful.
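One way to read "transferable across tools" is to keep a single tool-agnostic quality spec and render per-tool prompts from it. Everything below is hypothetical — the spec fields, wrapper format, and the way each tool is addressed are illustrative, not any tool's real prompt format.

```python
# Hedged sketch: one shared definition of "excellence" for an output,
# rendered into prompts for different video tools. The constraint text
# travels unchanged; only the per-tool wrapper differs.
SPEC = {
    "format": "30-second vertical video",
    "tone": "punchy, news-desk style",
    "quality_bar": "no filler sentences; every line advances the story",
}

def render_prompt(tool: str, spec: dict) -> str:
    # Flatten the spec into a constraint list appended to a tool header.
    body = "; ".join(f"{k}: {v}" for k, v in spec.items())
    return f"[{tool}] Generate a video. Constraints -> {body}"

heygen_prompt = render_prompt("HeyGen", SPEC)
sora_prompt = render_prompt("Sora", SPEC)
```

If HeyGen is swapped out later, only the wrapper changes; the definition of a good output stays put.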
186 00:16:39.720 ⇒ 00:16:41.410 Michele Altomare: Oh, jam everywhere, yeah.
187 00:16:41.410 ⇒ 00:16:42.760 Brylle Girang: Exactly. Exactly.
188 00:16:43.870 ⇒ 00:16:44.730 Michele Altomare: Okay.
189 00:16:44.930 ⇒ 00:16:46.789 Michele Altomare: Okay, that makes sense then.
190 00:16:47.130 ⇒ 00:16:51.930 Michele Altomare: That’ll be the first thing I, I try to tackle.
191 00:16:53.000 ⇒ 00:16:57.380 Brylle Girang: Yes, and then, the second piece of this, like, how do you synthesize
192 00:16:57.670 ⇒ 00:17:01.389 Brylle Girang: I think this is… this is going to be…
193 00:17:01.570 ⇒ 00:17:09.019 Brylle Girang: really basic, but have you tried asking Cursor, like, hey, synthesize my Obsidian, and then see how it does first?
194 00:17:09.349 ⇒ 00:17:18.789 Michele Altomare: I haven’t… I haven’t tried it, I haven’t tried it. Just because I have so much… I haven’t… some of the things are, like, personal things, but I’ve wondered, like…
195 00:17:20.019 ⇒ 00:17:22.399 Michele Altomare: personally, I still run into the issue where
196 00:17:22.979 ⇒ 00:17:26.189 Michele Altomare: Two weeks ago, I exported all my ChatGPT
197 00:17:26.599 ⇒ 00:17:30.539 Michele Altomare: Data from the last 4 years, which is, like, 3 gigabytes of stuff.
198 00:17:30.869 ⇒ 00:17:35.599 Michele Altomare: And then my Claude is, like, a gigabyte. It’s a lot of shit, bro. So much… some of this stuff is
199 00:17:35.920 ⇒ 00:17:37.109 Michele Altomare: This one looks like.
200 00:17:37.220 ⇒ 00:17:41.880 Michele Altomare: I don’t know what’s in there. They gave me images, I was like, God, I don’t know why these are in here.
201 00:17:42.370 ⇒ 00:17:43.250 Michele Altomare: But…
202 00:17:46.940 ⇒ 00:17:50.910 Michele Altomare: Two weeks ago, I pulled stuff from the last four years, but there’s stuff that’s a week…
203 00:17:51.370 ⇒ 00:17:56.469 Michele Altomare: old, but it’s not in that anymore, because I did it inside, like, claude.ai.
204 00:17:57.410 ⇒ 00:17:57.869 Brylle Girang: with the chat.
205 00:17:58.040 ⇒ 00:18:00.619 Michele Altomare: Which it does not write to Obsidian.
206 00:18:01.570 ⇒ 00:18:05.589 Michele Altomare: But those were still things that I could see elements of it being…
207 00:18:05.870 ⇒ 00:18:13.319 Michele Altomare: relevant to Brainforge. Now, obviously, like, you can’t perfectly automate every piece of a workflow, but I’m just wondering, like.
208 00:18:13.660 ⇒ 00:18:20.799 Michele Altomare: if there’s a tool like Cursor or something that I can just do all my work inside of, that talks to
209 00:18:21.960 ⇒ 00:18:25.579 Michele Altomare: ChatGPT chat, Claude chat, Claude Code.
210 00:18:25.580 ⇒ 00:18:26.260 Brylle Girang: Yeah, yeah.
211 00:18:26.260 ⇒ 00:18:26.970 Michele Altomare: Et cetera.
212 00:18:27.900 ⇒ 00:18:33.939 Michele Altomare: And then through every pass, it’s writing to Obsidian, and then that Obsidian can then talk to Brainforge. Does that make sense?
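The layer being described — every prompt/response pass also written to Obsidian — could be as simple as appending markdown to a vault note after each exchange. This is a sketch under assumed conventions: the vault path, note naming, and headings are all made up, and a real setup would hook this into whatever tool fires the prompt.

```python
# Hedged sketch: append every prompt/response pair to a dated markdown
# note in an Obsidian vault, so output from any chat tool lands in one place.
import tempfile
from datetime import date
from pathlib import Path

def log_exchange(vault: Path, prompt: str, response: str) -> Path:
    # Hypothetical naming scheme: one note per day, sections per exchange.
    note = vault / f"ai-log-{date.today().isoformat()}.md"
    entry = f"\n## Prompt\n{prompt}\n\n## Response\n{response}\n"
    vault.mkdir(parents=True, exist_ok=True)
    with note.open("a", encoding="utf-8") as f:
        f.write(entry)  # append-only: earlier exchanges are never touched
    return note

# Demo against a throwaway directory standing in for the vault.
vault = Path(tempfile.mkdtemp())
note = log_exchange(vault, "Summarize the hooks doc", "Here are the hooks...")
```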
213 00:18:34.240 ⇒ 00:18:39.289 Brylle Girang: Yeah, yeah, I think… I think I understand it better now. Let me rephrase that.
214 00:18:39.450 ⇒ 00:18:48.320 Brylle Girang: The goal is not for us to synthesize your current knowledge base, it’s to make sure that whatever tool you use, it gets thrown into one.
215 00:18:48.610 ⇒ 00:18:50.070 Brylle Girang: specific place.
216 00:18:50.350 ⇒ 00:18:53.829 Michele Altomare: Correct, correct. So then once it’s all somewhere.
217 00:18:54.180 ⇒ 00:18:59.849 Michele Altomare: that then… like, the biggest thing is… let me think of a very specific example, because I think it’ll…
218 00:19:00.770 ⇒ 00:19:01.630 Michele Altomare: ground it.
219 00:19:11.100 ⇒ 00:19:14.429 Michele Altomare: Utam showed me the “last 30 days” skill, which I thought was super cool.
220 00:19:14.950 ⇒ 00:19:15.820 Brylle Girang: Right.
221 00:19:15.820 ⇒ 00:19:18.129 Michele Altomare: Which, like, scraped Reddit and all this stuff.
222 00:19:19.450 ⇒ 00:19:29.839 Michele Altomare: the last company that I was at, and I did it for him, and I did it for some other people, but we wrote, and this is not crazy sophisticated, but a series of, like, hooks
223 00:19:29.950 ⇒ 00:19:34.360 Michele Altomare: For content that had reference to later in the video,
224 00:19:36.280 ⇒ 00:19:39.829 Michele Altomare: There was just, like, a bunch of architecture, for lack of a better word.
225 00:19:40.170 ⇒ 00:19:44.090 Michele Altomare: I would love to be able to Type into something.
226 00:19:44.360 ⇒ 00:19:47.240 Michele Altomare: that reads my Obsidian, and then I tell…
227 00:19:47.620 ⇒ 00:20:01.890 Michele Altomare: whether… I don’t know where I’m interfacing, but I tell it, I’m like, go read all these, like, 20 pages that I wrote for this, and then rewrite it in context of Brainforge. Like, all those little stories. Replace Bhutan, put B in there, put these people in there, like, just try.
228 00:20:02.300 ⇒ 00:20:07.089 Michele Altomare: And we can do stuff with it, because there’s, like, Bro, hundreds of documents.
229 00:20:07.090 ⇒ 00:20:08.430 Brylle Girang: Yeah. Yeah.
230 00:20:08.780 ⇒ 00:20:18.680 Michele Altomare: But they’re all, like, they’re kind of scattered. Like, I’m sure they got exported when I did the mass export of, like, 4 gigs, and now it’s, like, in Obsidian, but that’s mixed with other details.
231 00:20:18.850 ⇒ 00:20:21.900 Michele Altomare: So… If that makes sense.
232 00:20:22.160 ⇒ 00:20:24.340 Brylle Girang: Yeah, yeah, that makes sense, that makes sense.
233 00:20:24.440 ⇒ 00:20:26.970 Brylle Girang: I’m just trying to marinate on this.
234 00:20:28.000 ⇒ 00:20:32.709 Brylle Girang: I think, for the first one, the main reason why we are not using Claude
235 00:20:32.920 ⇒ 00:20:42.210 Brylle Girang: is because of that specific, like, blocker. It doesn’t allow you at all to, like, do anything with your own data.
236 00:20:42.210 ⇒ 00:20:43.219 Michele Altomare: Yeah, very siloed.
237 00:20:43.680 ⇒ 00:20:52.900 Brylle Girang: Exactly, exactly. I think Anthropic really wants you to be contained in the cloud only, so I think that’s going to be a problem.
238 00:20:53.040 ⇒ 00:20:57.859 Brylle Girang: The second piece of that, I think this is very, very possible.
239 00:20:58.990 ⇒ 00:21:03.080 Brylle Girang: Because our knowledge base is basically in GitHub.
240 00:21:03.980 ⇒ 00:21:10.139 Brylle Girang: I think one system that we need to build here is sort of a long-term memory for whatever you’re doing.
241 00:21:10.400 ⇒ 00:21:21.519 Brylle Girang: So how I’m imagining it is, like, if you use, you know, if you use Claude, if you use Cursor, it does something, it logs whatever it did into the Brainforge platform folder.
242 00:21:21.710 ⇒ 00:21:26.859 Brylle Girang: Since it’s in your local drive, whatever tool you use, it needs to read that.
243 00:21:27.310 ⇒ 00:21:31.639 Brylle Girang: sort of a handoff between the agents, right? Yeah. So…
244 00:21:32.390 ⇒ 00:21:38.650 Brylle Girang: I think what… how I’m… how I’m envisioning the end goal here is that,
245 00:21:38.970 ⇒ 00:21:44.400 Brylle Girang: once a tool completes something, it writes into a singular document.
246 00:21:44.900 ⇒ 00:21:52.050 Brylle Girang: And tells the next agent, hey, this is what you need to do, and then it compounds into that one document only.
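The handoff envisioned here — each tool appends what it did plus an instruction for the next agent into one compounding document — can be sketched as an append-only log that the next agent reads in full before acting. Agent names and the message format are illustrative, not a real protocol.

```python
# Hedged sketch of the "singular document" handoff: the log only ever
# compounds, and each new agent is handed the whole history as context.
HANDOFF: list[str] = []

def complete_step(agent: str, did: str, next_instruction: str) -> None:
    # Append-only: nothing earlier is overwritten or summarized away.
    HANDOFF.append(f"{agent}: {did} | NEXT: {next_instruction}")

def read_context() -> str:
    # What the next agent would receive as shared memory.
    return "\n".join(HANDOFF)

complete_step("scriptwriter", "drafted 8 story beats",
              "record voiceover for beats 1-8")
complete_step("tts", "rendered voiceover",
              "assemble visuals against the audio")
context = read_context()
```

In practice the list would be a file in the shared local folder, so any tool that can read the drive can pick up the baton.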
247 00:21:52.550 ⇒ 00:21:53.200 Michele Altomare: Right.
248 00:21:53.530 ⇒ 00:21:54.200 Brylle Girang: Oh…
249 00:21:54.200 ⇒ 00:21:56.590 Michele Altomare: Every interaction between when I press
250 00:21:56.920 ⇒ 00:22:00.820 Michele Altomare: I’m sorry, I promise I don’t mean to make you into, like.
251 00:22:02.630 ⇒ 00:22:05.460 Michele Altomare: an AI guru to solve all my problems, right? I know you have a.
252 00:22:05.460 ⇒ 00:22:06.430 Brylle Girang: No, no, no, no.
253 00:22:06.700 ⇒ 00:22:10.770 Michele Altomare: But I’m just… I’m trying to conceptualize, like, literally, if there was, like, a layer.
254 00:22:11.710 ⇒ 00:22:18.439 Michele Altomare: a layer between every time that I press enter on some prompt, and then it goes through… you know what I mean? Whatever this is?
255 00:22:18.440 ⇒ 00:22:18.770 Brylle Girang: Yeah.
256 00:22:18.770 ⇒ 00:22:22.290 Michele Altomare: And then everything that gets written and read.
257 00:22:22.960 ⇒ 00:22:25.479 Michele Altomare: gets logged somewhere. I don’t care if it takes.
258 00:22:25.480 ⇒ 00:22:26.400 Brylle Girang: Exactly.
259 00:22:26.400 ⇒ 00:22:29.490 Michele Altomare: You know, but, like, something there. Exactly.
260 00:22:30.100 ⇒ 00:22:38.639 Brylle Girang: Exactly. I sent over, like, a GitHub plugin to you. This is Claude Memory. This isn’t exactly…
261 00:22:38.830 ⇒ 00:22:43.599 Brylle Girang: the solution to our problem, but I am envisioning that this is something like it.
262 00:22:43.930 ⇒ 00:22:52.370 Brylle Girang: So what it does is, every session in Claude, it compresses it, and then it synthesizes it, like, hey, this is what happened.
263 00:22:52.770 ⇒ 00:22:55.560 Brylle Girang: And then it injects it into future sessions.
264 00:22:55.680 ⇒ 00:22:58.079 Brylle Girang: So, similar to how we want the handoff.
265 00:22:58.180 ⇒ 00:23:01.469 Brylle Girang: I think with Claude right now, if you use chat,
266 00:23:01.640 ⇒ 00:23:05.980 Brylle Girang: And then you have another chat session, it doesn’t even know what’s going on there.
267 00:23:06.140 ⇒ 00:23:07.140 Michele Altomare: No, it has no idea.
268 00:23:07.140 ⇒ 00:23:07.780 Brylle Girang: Right.
269 00:23:07.990 ⇒ 00:23:22.529 Brylle Girang: This is the solution for that. Like, this ensures that your sessions are connected, and it just dumps everything into one place, and then the agent just tries to get information from that one place every time it runs.
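What the memory plugin is described as doing — compress each finished session into a short synthesis, store it in one place, and inject the stored syntheses into the next session's context — looks roughly like the sketch below. The one-line "compression" is a trivial stand-in for a real model call, and none of this is how the actual Claude Memory plugin is implemented.

```python
# Hedged sketch of compress-then-inject session memory.
MEMORY: list[str] = []

def end_session(transcript: list[str]) -> None:
    # Stand-in "compression": keep a one-line synthesis of the session.
    synthesis = f"Session covered: {transcript[0]} ... ({len(transcript)} turns)"
    MEMORY.append(synthesis)

def start_session(new_prompt: str) -> str:
    # Injection step: prior syntheses are prepended to the new prompt,
    # so a fresh session starts knowing what earlier sessions did.
    preamble = "\n".join(MEMORY)
    return f"{preamble}\n---\n{new_prompt}" if MEMORY else new_prompt

end_session(["picked n8n for the MVP", "agreed to build skills first"])
injected = start_session("continue the pipeline work")
```

The dump-everything-into-one-place detail is the same pattern as the handoff document: one store, many readers.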
270 00:23:23.060 ⇒ 00:23:27.569 Michele Altomare: And this is beyond… this is not just Claude Code, but also Claude Chat.
271 00:23:28.440 ⇒ 00:23:28.990 Brylle Girang: I…
273 00:23:29.820 ⇒ 00:23:37.070 Brylle Girang: I imagine that, yes, this is specifically built for Claude Code, but I don’t think that it’s…
274 00:23:37.390 ⇒ 00:23:39.380 Brylle Girang: I don’t think that it’s impossible.
275 00:23:39.550 ⇒ 00:23:41.759 Brylle Girang: For other tools to also use this.
276 00:23:42.210 ⇒ 00:23:43.060 Michele Altomare: Yeah.
277 00:23:43.780 ⇒ 00:23:47.340 Brylle Girang: Even OpenClaw can use this, so there’s an OpenClaw gateway.
278 00:23:47.690 ⇒ 00:23:48.300 Brylle Girang: Here.
279 00:23:48.300 ⇒ 00:23:49.010 Michele Altomare: Interesting.
280 00:23:49.120 ⇒ 00:23:50.120 Michele Altomare: Interesting.
281 00:23:51.030 ⇒ 00:23:51.790 Michele Altomare: Cool.
282 00:23:51.980 ⇒ 00:24:03.430 Michele Altomare: Because what I’ve started doing is, instead of using Claude chat anymore, I just have threads inside of my Claude Code that I just chat with now. And I just tell them every single time to, like, read-write to Obsidian.
283 00:24:03.550 ⇒ 00:24:05.400 Michele Altomare: It sounds so, like, barbaric.
284 00:24:05.890 ⇒ 00:24:13.350 Michele Altomare: It just sounds like such, like, a caveman way to do it, but I’m like, I’ve stopped using Claude.ai because I just have, like, pinned chats.
285 00:24:13.730 ⇒ 00:24:14.520 Brylle Girang: Yeah, yeah.
286 00:24:14.520 ⇒ 00:24:18.220 Michele Altomare: The context just keeps getting compressed, but I’m like, okay, at least this is something.
287 00:24:18.550 ⇒ 00:24:20.320 Brylle Girang: I mean, when I heard that…
288 00:24:21.040 ⇒ 00:24:22.490 Michele Altomare: You can’t use it from your phone, though, that’s the challenge.
289 00:24:22.490 ⇒ 00:24:23.380 Brylle Girang: Yeah, yeah.
290 00:24:23.380 ⇒ 00:24:26.040 Michele Altomare: You’re limited to Claude Code, so I can’t chat with it.
291 00:24:26.680 ⇒ 00:24:27.470 Brylle Girang: Exactly.
292 00:24:28.100 ⇒ 00:24:36.220 Brylle Girang: I think when I heard that, like, I can’t imagine how your tokens are burning, because it goes through that every time.
293 00:24:36.410 ⇒ 00:24:37.650 Brylle Girang: Hi, I disagree.
294 00:24:37.650 ⇒ 00:24:40.999 Michele Altomare: I don’t know where to… I mean, I could figure out, but yeah, it’s…
295 00:24:41.400 ⇒ 00:24:47.190 Michele Altomare: For sure, it’s like building, like, a Formula 1 car every time anyone wants to go to, like, the grocery store. It’s, like, overkill.
296 00:24:47.190 ⇒ 00:24:49.820 Brylle Girang: Exactly, exactly.
297 00:24:49.820 ⇒ 00:24:50.610 Michele Altomare: So…
298 00:24:50.610 ⇒ 00:24:55.849 Brylle Girang: Let me see, I’m going to… I have been interested in this Claude memory.
299 00:24:56.480 ⇒ 00:24:59.369 Brylle Girang: Ever since… do you know, Bill?
300 00:24:59.510 ⇒ 00:25:05.410 Brylle Girang: This is funny. Do you know Milla Jovovich? Like, have you watched Resident Evil, the movie?
301 00:25:06.070 ⇒ 00:25:10.430 Michele Altomare: No, I know what Resident Evil is. Is it not also a video game?
302 00:25:11.540 ⇒ 00:25:16.729 Brylle Girang: Yeah, it is also a video game. So, the main actress of the movie was Milla Jovovich.
303 00:25:17.520 ⇒ 00:25:18.330 Michele Altomare: Okay.
304 00:25:20.180 ⇒ 00:25:21.790 Brylle Girang: And the funny thing is that
305 00:25:22.090 ⇒ 00:25:25.990 Brylle Girang: This specific actress just also created
306 00:25:26.210 ⇒ 00:25:29.079 Brylle Girang: like a memory plugin for GitHub.
307 00:25:29.340 ⇒ 00:25:30.320 Brylle Girang: Which is…
308 00:25:31.260 ⇒ 00:25:31.700 Michele Altomare: Crazy.
309 00:25:31.700 ⇒ 00:25:33.680 Brylle Girang: Fucking crazy, man.
310 00:25:37.610 ⇒ 00:25:40.990 Brylle Girang: This is another plugin that I’m going to take a look at and see if it
311 00:25:41.430 ⇒ 00:25:43.419 Brylle Girang: It’s called Memory Palace.
312 00:25:43.670 ⇒ 00:25:50.980 Brylle Girang: Yes, you can see that the author there is Milla Jovovich, which is… really, really crazy.
313 00:25:51.770 ⇒ 00:25:52.589 Michele Altomare: Oh, jeez.
314 00:25:53.750 ⇒ 00:25:54.740 Michele Altomare: Nuts.
315 00:25:57.690 ⇒ 00:25:59.250 Michele Altomare: Memory Palace.
316 00:25:59.950 ⇒ 00:26:02.800 Michele Altomare: Best benchmarked open source AI memory system.
317 00:26:03.870 ⇒ 00:26:06.759 Brylle Girang: Yeah, so bottom line, I think your problem is, like.
318 00:26:07.430 ⇒ 00:26:12.349 Brylle Girang: a common problem. I think it also raises the problem of, like, how can we make sure that
319 00:26:12.770 ⇒ 00:26:16.560 Brylle Girang: we’re not… dependent on chat logs?
320 00:26:16.940 ⇒ 00:26:21.879 Brylle Girang: and sessions are being saved in one place. So, whatever we use.
321 00:26:22.000 ⇒ 00:26:24.239 Brylle Girang: It can be used by any other AI tool.
322 00:26:26.610 ⇒ 00:26:31.840 Brylle Girang: Let me marinate on this. I think this is something that’s going to be more challenging than the first part.
323 00:26:33.030 ⇒ 00:26:41.109 Michele Altomare: For sure. But your advice on the first part makes a lot of sense. So, if the orchestrator is making… is working on half-built
324 00:26:41.400 ⇒ 00:26:44.539 Michele Altomare: Stacks, then it’s gonna crash on the first or second one.
325 00:26:45.160 ⇒ 00:26:53.800 Brylle Girang: Yeah, imagine, like, imagine we’re making a pizza, and then we need to make sure that each of the toppings are perfectly cooked first, before we…
326 00:26:53.800 ⇒ 00:26:54.640 Michele Altomare: Yeah.
327 00:26:54.640 ⇒ 00:26:59.329 Brylle Girang: Well, that’s a bad analogy, because in pizzas, you cook it at the same time, but…
328 00:26:59.330 ⇒ 00:27:00.910 Michele Altomare: You cook at the same time, but I got you, I got you.
329 00:27:00.910 ⇒ 00:27:02.090 Brylle Girang: Yeah, yeah.
330 00:27:02.090 ⇒ 00:27:05.390 Michele Altomare: I got you. Okay.
331 00:27:05.860 ⇒ 00:27:10.369 Michele Altomare: Okay, I’ll do that next, then. I’ll break the final thing into…
332 00:27:11.220 ⇒ 00:27:14.090 Michele Altomare: Verticals, and then figure out how to write skills for all of them.
333 00:27:14.430 ⇒ 00:27:18.939 Brylle Girang: Yes, exactly. And then, once it’s done, let’s figure out how to orchestrate those.
334 00:27:19.360 ⇒ 00:27:27.219 Michele Altomare: Yeah. Does… do I have to interact with GitHub to make those skills? I’m still getting familiar with, like.
335 00:27:27.600 ⇒ 00:27:29.710 Michele Altomare: Pushing and pulling a little bit.
336 00:27:30.190 ⇒ 00:27:30.990 Michele Altomare: It’s been a while.
337 00:27:32.100 ⇒ 00:27:50.690 Brylle Girang: Yes, yes, I think it’s going to be really helpful if, whatever you do, you log into the platform via GitHub. I would say don’t even think… you don’t need to worry about, you know, how to push, how to pull. Just tell Cursor, just tell OpenCode, hey, I did this save in the platform, create a PR, and it will.
338 00:27:50.690 ⇒ 00:27:52.130 Michele Altomare: Every time for you.
339 00:27:52.130 ⇒ 00:27:54.790 Brylle Girang: Yeah, we have skills, we have skills specifically.
340 00:27:57.180 ⇒ 00:27:59.909 Michele Altomare: Think like a caveman. Caveman Max.
341 00:28:00.220 ⇒ 00:28:07.499 Brylle Girang: Exactly. Well, what am I going to tell you? Man, don’t think about the complicated things. Just tell AI to do it.
342 00:28:09.310 ⇒ 00:28:14.059 Michele Altomare: Cool, bro. Cool. Anything I can do on my end, I don’t know, to support you somehow?
343 00:28:14.550 ⇒ 00:28:23.510 Brylle Girang: Not yet. I’m still trying to build effective courses so that we can launch our L&D service, but once it’s
344 00:28:23.930 ⇒ 00:28:25.909 Brylle Girang: ready, we’re definitely going to be working together.
345 00:28:26.450 ⇒ 00:28:31.749 Michele Altomare: For sure, because you think the L&D service could be, like, a line item for something that gets sold to people?
346 00:28:32.010 ⇒ 00:28:32.640 Michele Altomare: Perhaps.
347 00:28:32.640 ⇒ 00:28:35.020 Brylle Girang: Exactly. Definitely, definitely.
348 00:28:35.470 ⇒ 00:28:39.769 Michele Altomare: cool, bro. I’ll be a big cheerleader once it’s ready, just let me know.
349 00:28:40.220 ⇒ 00:28:42.790 Brylle Girang: Yes, yes, I’m excited. Thank you, man!
350 00:28:43.180 ⇒ 00:28:44.650 Michele Altomare: Alright, talk soon, V.
351 00:28:45.010 ⇒ 00:28:45.710 Brylle Girang: Bye-bye.