Meeting Title: Friday Brainforge Demos & Retro Date: 2026-03-20 Meeting participants: Brylle Girang, Casie Aviles, Holly Condos, Kaela Gallagher, Greg Stoutenburg, Rico Rejoso, Jasmin Multani, Luke Scorziell, Pranav, Hannah Wang, Robert Tseng, Amber Lin, Uttam Kumaran, Advait Nandakumar Menon, Demilade Agboola
WEBVTT
1 00:10:00.350 ⇒ 00:10:01.370 Greg Stoutenburg: Hey, everyone!
2 00:10:02.620 ⇒ 00:10:04.180 Kaela Gallagher: Hey, how’s it going?
3 00:10:04.890 ⇒ 00:10:07.670 Greg Stoutenburg: Doing phenomenally. How are you?
4 00:10:07.670 ⇒ 00:10:09.949 Kaela Gallagher: You’re home, finally!
5 00:10:09.950 ⇒ 00:10:11.410 Greg Stoutenburg: See?
6 00:10:11.680 ⇒ 00:10:13.180 Kaela Gallagher: I should’ve… I shouldn’t.
7 00:10:13.180 ⇒ 00:10:15.860 Greg Stoutenburg: There’s a background out of this… out of this place?
8 00:10:17.240 ⇒ 00:10:19.120 Greg Stoutenburg: Wherever I am. Looks like a mom.
9 00:10:19.790 ⇒ 00:10:23.760 Kaela Gallagher: Mmm… Where did you get stuck?
10 00:10:24.440 ⇒ 00:10:30.560 Greg Stoutenburg: So, I was in New Port Richey, Florida, which is about an hour north of Tampa.
11 00:10:30.670 ⇒ 00:10:31.880 Kaela Gallagher: Okay.
12 00:10:32.020 ⇒ 00:10:41.870 Greg Stoutenburg: Yeah, and yeah, I mean, I think what made it such a… so, there were a lot of canceled flights, but what made it especially bad that we had to drive is that,
13 00:10:42.190 ⇒ 00:10:55.139 Greg Stoutenburg: we flew on Allegiant, which only flies, like, twice a week, round trip from Harrisburg, Pennsylvania, down to Clearwater. And so, when that flight was canceled, the next available flight was Friday afternoon.
14 00:10:55.300 ⇒ 00:11:06.810 Greg Stoutenburg: And it’s like, we went from a Monday evening planned arrival to a Friday, you know, night planned arrival, and you know, people, I mean, people just had things they needed to get to, so…
15 00:11:06.870 ⇒ 00:11:20.069 Greg Stoutenburg: We’re like, alright, well, let’s get the next available car. First, we couldn’t find a car at all, and then we found one company that would let us rent a car and take it across state lines, because they don’t want to… the rental companies don’t want to do that either.
16 00:11:20.070 ⇒ 00:11:23.020 Kaela Gallagher: Oh… oh my gosh, wow, that’s…
17 00:11:23.020 ⇒ 00:11:30.010 Greg Stoutenburg: I won’t say how much the car cost to rent, but I will tell you that it was triple the price of the plane tickets. So…
18 00:11:30.350 ⇒ 00:11:31.320 Greg Stoutenburg: It was a…
19 00:11:31.320 ⇒ 00:11:31.880 Kaela Gallagher: Wow.
20 00:11:31.880 ⇒ 00:11:33.739 Greg Stoutenburg: We were lucky, in a way.
21 00:11:34.030 ⇒ 00:11:34.710 Kaela Gallagher: Yeah.
22 00:11:37.680 ⇒ 00:11:43.130 Greg Stoutenburg: So… Yeah, good week to… to not fly.
23 00:11:52.220 ⇒ 00:11:55.469 Kaela Gallagher: Casie, how’s it going? Are you the host today?
24 00:11:56.070 ⇒ 00:11:59.179 Casie Aviles: Oh, yeah, yeah, I’ll be hosting today, so…
25 00:12:00.220 ⇒ 00:12:04.819 Casie Aviles: Yeah, doing good today, so I’ll just share… I’ll just be sharing my screen.
26 00:12:05.840 ⇒ 00:12:09.340 Kaela Gallagher: Oh, no, no rush. I think people are still coming in.
27 00:12:10.200 ⇒ 00:12:10.890 Casie Aviles: Okay.
28 00:12:20.720 ⇒ 00:12:24.500 Rico Rejoso: Yeah, let’s wait around… 2 minutes,
29 00:12:24.830 ⇒ 00:12:27.110 Rico Rejoso: My mom said that he’s running late as well.
30 00:15:04.870 ⇒ 00:15:06.440 Robert Tseng: Hello, everyone!
31 00:15:07.340 ⇒ 00:15:08.530 Greg Stoutenburg: Hey, how’s it going?
32 00:15:09.250 ⇒ 00:15:10.480 Robert Tseng: Happy Friday!
33 00:15:11.710 ⇒ 00:15:13.349 Greg Stoutenburg: Very, very happy Friday.
34 00:15:14.790 ⇒ 00:15:16.160 Greg Stoutenburg: Where are you now, Robert?
35 00:15:17.110 ⇒ 00:15:18.350 Robert Tseng: I’m back in New York.
36 00:15:20.550 ⇒ 00:15:22.890 Greg Stoutenburg: Is it nice there right now? It’s super nice here.
37 00:15:23.520 ⇒ 00:15:31.490 Robert Tseng: It is. It’s like… I guess I haven’t stepped outside yet, but it looks sunny, so…
38 00:15:31.490 ⇒ 00:15:31.919 Greg Stoutenburg: That is…
39 00:15:31.920 ⇒ 00:15:33.829 Robert Tseng: to go outside.
40 00:15:34.820 ⇒ 00:15:39.340 Robert Tseng: Apparently, I don’t know if it got… it got to, like, the 80s last week.
41 00:15:41.520 ⇒ 00:15:41.980 Greg Stoutenburg: Okay.
42 00:15:41.980 ⇒ 00:15:51.870 Robert Tseng: I wasn’t here for that, but I heard it was… it felt like summer. People were walking around in shorts. And then it went back to below freezing as of yesterday, and then I think it’s…
43 00:15:52.150 ⇒ 00:15:53.889 Robert Tseng: Somewhere in the 50s today.
44 00:15:54.280 ⇒ 00:15:55.870 Greg Stoutenburg: Yeah, yeah, yeah, nice.
45 00:15:57.860 ⇒ 00:15:59.629 Greg Stoutenburg: Yeah, it’s like 65 out here.
46 00:16:00.090 ⇒ 00:16:00.780 Robert Tseng: Nice.
47 00:16:01.850 ⇒ 00:16:05.729 Robert Tseng: Hey, Greg, do you want to come to Omnicon on Thursday next week?
48 00:16:06.030 ⇒ 00:16:07.620 Robert Tseng: I don’t know if you saw that message.
49 00:16:07.990 ⇒ 00:16:11.740 Greg Stoutenburg: Yeah, I saw… Hannah DM’d me,
50 00:16:12.060 ⇒ 00:16:15.939 Greg Stoutenburg: Let me, let me look into that and see if it’s possible for me to get there.
51 00:16:16.150 ⇒ 00:16:19.870 Robert Tseng: Okay, yeah, not to put you on the spot, just thought, you know, but, you know, yeah.
52 00:16:19.870 ⇒ 00:16:21.909 Greg Stoutenburg: Want to? Definitely.
53 00:16:22.170 ⇒ 00:16:24.420 Greg Stoutenburg: Can I? Let me see.
54 00:16:24.420 ⇒ 00:16:25.420 Robert Tseng: Okay, okay.
55 00:16:25.420 ⇒ 00:16:26.819 Greg Stoutenburg: Yeah, that’d be cool, though.
56 00:16:27.180 ⇒ 00:16:31.049 Uttam Kumaran: Make a good impression on Shivani when you guys meet her. I told her to go say hi to you guys.
57 00:16:31.390 ⇒ 00:16:32.150 Robert Tseng: Oh, yeah.
58 00:16:32.960 ⇒ 00:16:37.200 Uttam Kumaran: Yeah, so I said you’ll meet the team that you’ll get if you pay us way more.
59 00:16:37.430 ⇒ 00:16:39.270 Greg Stoutenburg: Yeah. Thanks.
60 00:16:39.730 ⇒ 00:16:41.260 Uttam Kumaran: That got a slight chuckle.
61 00:16:48.050 ⇒ 00:16:48.720 Uttam Kumaran: Cool.
62 00:16:49.140 ⇒ 00:16:52.420 Uttam Kumaran: Is Casie hosting? Yeah.
63 00:16:52.650 ⇒ 00:16:53.450 Casie Aviles: Yeah.
64 00:16:54.080 ⇒ 00:16:59.699 Casie Aviles: Hey everyone, so… Hope it’s been a great week for you all.
65 00:17:00.090 ⇒ 00:17:06.609 Casie Aviles: So yeah, I’ll be hosting today’s Friday meeting, so… I’ll just start with,
66 00:17:06.770 ⇒ 00:17:09.159 Casie Aviles: So this is our usual agenda.
67 00:17:10.480 ⇒ 00:17:13.160 Casie Aviles: And I’ll just start with a quick icebreaker.
68 00:17:13.819 ⇒ 00:17:19.069 Casie Aviles: So what’s your favorite way to relax or decompress after a busy day?
69 00:17:19.300 ⇒ 00:17:27.090 Casie Aviles: I’ll start. So, lately, my mom bought some… lotus seeds, and…
70 00:17:27.200 ⇒ 00:17:30.780 Casie Aviles: I don’t know, like, I’ve just been invested in seeing them grow.
71 00:17:31.090 ⇒ 00:17:34.879 Casie Aviles: So I’ve been… actually, I’m not much of a plant person, but…
72 00:17:35.580 ⇒ 00:17:40.370 Casie Aviles: when, you know, I would just go and see it every day, and I would, you know, water them.
73 00:17:41.120 ⇒ 00:17:48.570 Casie Aviles: And… Yeah, so… I think it’s pretty cool, like, they’re quite different from usual plants, you know, so…
74 00:17:49.250 ⇒ 00:17:53.680 Uttam Kumaran: And do you just put them in the water, or, like, how do you, like… germinate them?
75 00:17:54.510 ⇒ 00:18:00.929 Casie Aviles: Yeah, so we put them in the water first, and then later on when they grow roots.
76 00:18:01.380 ⇒ 00:18:09.460 Casie Aviles: And, like, the leaves start to grow as well. You have to put them in soil, like, I think it has to be mostly clay.
77 00:18:09.990 ⇒ 00:18:13.829 Casie Aviles: like a mix of clay and regular garden soil, so I think…
78 00:18:14.160 ⇒ 00:18:20.400 Casie Aviles: I haven’t gotten to that part yet, so… I’ve just bought some… soil.
79 00:18:21.280 ⇒ 00:18:22.140 Casie Aviles: Yeah.
80 00:18:23.980 ⇒ 00:18:26.859 Casie Aviles: Yep, so that… yeah, that’s… for me, that’s…
81 00:18:27.870 ⇒ 00:18:34.349 Uttam Kumaran: Do you lie on the floor, like, just, like, face down?
82 00:18:34.590 ⇒ 00:18:39.019 Pranav: No, no, no, no. That’s probably a bad day if it’s face down.
83 00:18:39.880 ⇒ 00:18:44.159 Pranav: It was just busy, but it’s all I bet.
84 00:18:45.960 ⇒ 00:18:51.750 Uttam Kumaran: Me, it’s face down. Face down. Face down on the Zoom calls, still.
85 00:18:54.370 ⇒ 00:18:56.949 Casie Aviles: Yeah, Greg was raising his hand earlier.
86 00:18:57.740 ⇒ 00:19:01.499 Greg Stoutenburg: Yeah, I don’t know what I did. I didn’t mean to do that. Oh, okay.
87 00:19:01.500 ⇒ 00:19:04.009 Uttam Kumaran: Okay, but what is it? Well, what is your… yeah, you have to answer, though.
88 00:19:04.010 ⇒ 00:19:12.900 Greg Stoutenburg: Well, I guess I’ll have to answer now. If… if possible, just get outside. Just… just go. Kind of anything. Kind of anything outside.
89 00:19:13.140 ⇒ 00:19:24.649 Greg Stoutenburg: whatever helps me switch gears. So, for me, I’m someone who really struggles to switch gears. I get outside and drive for 20 hours. That’s it. You know, nothing relaxes me like driving for an entire day and a half.
90 00:19:24.650 ⇒ 00:19:45.179 Greg Stoutenburg: Yeah, it’s… being outside, for me, is the easiest way to… to hit that mental reset. So, yeah, that’s a good way to decompress. Yeah, nothing I do to, like, decompress is relaxing. I don’t think I know how to relax. So, like, for me, I feel really great after a hard workout, or, like, a hockey game, or, you know, yesterday I was throwing the ball around with my kid.
91 00:19:47.420 ⇒ 00:19:49.070 Casie Aviles: Yeah, that’s nice. Yeah, I mean…
92 00:19:49.070 ⇒ 00:19:49.455 Uttam Kumaran: Nice.
93 00:19:50.140 ⇒ 00:19:54.450 Casie Aviles: I think, yeah, going outside, definitely, even for me, like.
94 00:19:54.710 ⇒ 00:20:02.500 Casie Aviles: I barely see sunlight, so it’s nice, like, to see, you know, to get some sun as well, and vitamin D, so…
95 00:20:05.990 ⇒ 00:20:06.780 Uttam Kumaran: Nice.
96 00:20:08.570 ⇒ 00:20:13.669 Casie Aviles: Yeah, everyone has really nice ways, so I might even try some of these,
97 00:20:16.080 ⇒ 00:20:19.320 Casie Aviles: You know, ways to relax, so… Yeah.
98 00:20:19.750 ⇒ 00:20:24.750 Casie Aviles: Thank you, everyone, for sharing. So yeah, I’ll just move on to our lab share.
99 00:20:27.130 ⇒ 00:20:30.299 Casie Aviles: Cool, so… My lab share will be…
100 00:20:30.970 ⇒ 00:20:35.110 Casie Aviles: mainly about AI, and just, you know, for me, like, I thought.
101 00:20:35.580 ⇒ 00:20:40.710 Casie Aviles: AI right now is quite advanced, and we use it on a daily basis now.
102 00:20:40.850 ⇒ 00:20:41.590 Casie Aviles: But…
103 00:20:42.080 ⇒ 00:20:47.490 Casie Aviles: I just, you know, I wanted to go back and see, like, where it… how it… how it was back then, and…
104 00:20:48.220 ⇒ 00:20:53.400 Casie Aviles: Going through that was, like, you know, it made me more appreciative of, like, of the developments that
105 00:20:53.720 ⇒ 00:20:57.949 Casie Aviles: Had happened, and, you know, like, all of the tech behind it, so…
106 00:20:58.310 ⇒ 00:21:03.080 Casie Aviles: Yeah, there’s definitely a lot of interesting things that happen.
107 00:21:05.180 ⇒ 00:21:11.080 Casie Aviles: to get to where we are now. Like, back then, like, the simple chatbots were really… were not very good.
108 00:21:11.290 ⇒ 00:21:14.529 Casie Aviles: So it’s crazy to see, like, you know, how good they are now.
109 00:21:14.950 ⇒ 00:21:19.029 Casie Aviles: So, yeah, this is just a very condensed lab share, but…
110 00:21:19.380 ⇒ 00:21:23.249 Casie Aviles: So, I’ll start with, you know, just the early chatbots,
111 00:21:23.550 ⇒ 00:21:33.569 Casie Aviles: So, basically, back then, how they worked is they just used, you know, simple rules and pattern matching. So, like, an example here is ELIZA. So, this is from the 1960s.
112 00:21:33.840 ⇒ 00:21:40.300 Casie Aviles: So I’ve also included, like, a link if you want to play around with the chatbot, but here on the right, like, there’s
113 00:21:40.550 ⇒ 00:21:43.900 Casie Aviles: A screenshot of my conversation with it, so…
114 00:21:44.310 ⇒ 00:21:48.060 Casie Aviles: Basically, how it works is it just picks up, you know, like.
115 00:21:48.250 ⇒ 00:21:51.360 Casie Aviles: Words in my… in my message, so…
116 00:21:51.740 ⇒ 00:21:57.809 Casie Aviles: for example, in the first few lines, I said, hello, my PC won’t start, but…
117 00:21:58.160 ⇒ 00:22:05.900 Casie Aviles: It picks up only hello, so… and then it’s just going to respond with, how do you do? Please state your problem, you know, and…
118 00:22:06.560 ⇒ 00:22:10.079 Casie Aviles: It’s just a bunch of, you know, if-else cases.
119 00:22:11.280 ⇒ 00:22:15.960 Casie Aviles: In the backend, basically, so that’s… that’s how… Simple it was before.
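To make the “bunch of if-else cases” concrete: a minimal sketch of ELIZA-style keyword matching, with made-up rules (ELIZA’s actual script was larger and did sentence reassembly).

```python
# Toy rule-based chatbot in the spirit of ELIZA: scan the input for a
# known keyword and return a canned response. The rules are invented
# for illustration; nothing here is learned from data.

RULES = [
    ("hello", "How do you do? Please state your problem."),
    ("mother", "Tell me more about your family."),
    ("because", "Is that the real reason?"),
]

def eliza_reply(message: str) -> str:
    text = message.lower()
    for keyword, response in RULES:
        if keyword in text:        # first matching keyword wins
            return response
    return "Please go on."         # fallback when nothing matches

print(eliza_reply("Hello, my PC won't start"))
# -> "How do you do? Please state your problem."
# Note: "PC won't start" is ignored entirely, just as in the demo above.
```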
120 00:22:16.600 ⇒ 00:22:19.710 Casie Aviles: And now, if we drop to now, like.
121 00:22:19.880 ⇒ 00:22:26.259 Casie Aviles: AI right now has a lot fancier, like, tech behind it. We have machine learning and neural networks.
122 00:22:26.670 ⇒ 00:22:31.009 Casie Aviles: Basically, how those work is…
123 00:22:31.220 ⇒ 00:22:35.559 Casie Aviles: They learn patterns from huge amounts of text, you know.
124 00:22:36.030 ⇒ 00:22:42.320 Casie Aviles: That’s why we have a lot of… we… we hear about, you know, training data, and we have to feed a lot of data to, like.
125 00:22:42.740 ⇒ 00:22:50.420 Casie Aviles: When building these models, so… Now we have these foundational models that we can simply just…
126 00:22:50.740 ⇒ 00:22:52.170 Casie Aviles: Interface with.
127 00:22:52.580 ⇒ 00:23:00.090 Casie Aviles: So, behind all that, it’s just a bunch of, you know, really clever algorithms and statistics and probability, you know.
128 00:23:00.430 ⇒ 00:23:05.450 Casie Aviles: So… Yeah, that’s, that’s,
129 00:23:05.630 ⇒ 00:23:08.540 Casie Aviles: Kind of how it was then, and how it is now.
130 00:23:09.090 ⇒ 00:23:15.090 Casie Aviles: And my second slide here is just, you know, some words that we use today.
131 00:23:15.380 ⇒ 00:23:17.080 Casie Aviles: in AI that have, you know.
132 00:23:17.500 ⇒ 00:23:24.320 Casie Aviles: that have roots in early NLP, and I’ll just explain just some of these. So, for example, tokens.
133 00:23:25.580 ⇒ 00:23:32.449 Casie Aviles: these are… we hear… I think we hear a lot about tokens, now, but basically what they are
134 00:23:32.700 ⇒ 00:23:35.400 Casie Aviles: Is splitting text into words or symbols.
135 00:23:35.670 ⇒ 00:23:41.320 Casie Aviles: So they can be assigned meanings. So, if we look here to the right, like, there’s this screenshot of
136 00:23:41.790 ⇒ 00:23:49.669 Casie Aviles: OpenAI’s tokenizer, and… For example, the text here is, AI can help 100 students learn faster.
137 00:23:50.600 ⇒ 00:23:53.559 Casie Aviles: So AI… so computers would naturally just
138 00:23:53.900 ⇒ 00:24:02.989 Casie Aviles: start with… look, it would just look at this as just a bunch of, you know, characters, so it’s not really gonna be as helpful. So the question would be, like.
139 00:24:03.440 ⇒ 00:24:08.160 Casie Aviles: how would we process this? How would we make computers understand this? So…
140 00:24:08.330 ⇒ 00:24:11.479 Casie Aviles: We need to split them into more meaningful chunks.
141 00:24:13.080 ⇒ 00:24:17.279 Casie Aviles: that we can assign some meaning later on. So that’s a different…
142 00:24:17.580 ⇒ 00:24:21.110 Casie Aviles: Tokenizers have different ways of doing that as well, so…
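As a rough picture of what a tokenizer does, here is a crude word-level split of the example sentence above; real tokenizers (like the OpenAI one in the screenshot) split into subword units learned from data, so the boundaries differ.

```python
# Crude whitespace/punctuation tokenizer: split text into chunks that can
# later be mapped to integer IDs. Illustrative only; production tokenizers
# use byte-pair encoding or similar subword schemes.

import re

def tokenize(text: str) -> list[str]:
    # Keep runs of word characters together; punctuation becomes its own token.
    return re.findall(r"\w+|[^\w\s]", text)

tokens = tokenize("AI can help 100 students learn faster.")
print(tokens)
# -> ['AI', 'can', 'help', '100', 'students', 'learn', 'faster', '.']

# Models then work on integer IDs from a fixed vocabulary:
vocab = {tok: i for i, tok in enumerate(sorted(set(tokens)))}
print([vocab[t] for t in tokens])
```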
143 00:24:22.020 ⇒ 00:24:26.240 Casie Aviles: And then also embeddings is also something that I’m…
144 00:24:26.640 ⇒ 00:24:29.020 Casie Aviles: quite fond of when I first learned of it.
145 00:24:29.350 ⇒ 00:24:33.029 Casie Aviles: Because I, I, I just… for me, like, it’s…
146 00:24:33.380 ⇒ 00:24:36.800 Casie Aviles: Really interesting how they were able to come up with, like, a way to
147 00:24:37.780 ⇒ 00:24:40.469 Casie Aviles: Kind of, you know, create, or…
148 00:24:40.610 ⇒ 00:24:46.599 Casie Aviles: Turn… like, assign meaning to these, such that a computer would understand it, so it…
149 00:24:46.870 ⇒ 00:24:51.600 Casie Aviles: Basically, these are ways of representing words as numbers.
150 00:24:52.810 ⇒ 00:24:55.769 Casie Aviles: And that way we can, like, assign
151 00:24:56.110 ⇒ 00:24:58.810 Casie Aviles: Or understand relationships between them.
152 00:24:58.990 ⇒ 00:25:02.750 Casie Aviles: So… Basically, we can do math on these words.
154 00:25:04.020 ⇒ 00:25:09.670 Casie Aviles: For example, if we were to… we were to… if we want to query for, like.
155 00:25:10.070 ⇒ 00:25:10.910 Casie Aviles: a fruit.
156 00:25:12.710 ⇒ 00:25:19.149 Casie Aviles: We would have to embed that as well. We would turn that into numbers, and then we’re going to, like,
157 00:25:19.610 ⇒ 00:25:23.880 Casie Aviles: understand, like, mathematically the distance between, so…
158 00:25:24.280 ⇒ 00:25:28.239 Casie Aviles: The closest distance would be the… would be apples and oranges.
159 00:25:28.440 ⇒ 00:25:34.230 Casie Aviles: For example. So, I just think that it’s a really clever
160 00:25:34.390 ⇒ 00:25:37.050 Casie Aviles: thing, and although right now,
161 00:25:37.350 ⇒ 00:25:43.070 Casie Aviles: that kind of… that process is probably abstracted away from us now that we have, you know,
162 00:25:43.660 ⇒ 00:25:46.809 Casie Aviles: Tools that are already built to handle that, but…
163 00:25:46.950 ⇒ 00:25:49.230 Casie Aviles: Yeah, under the hood, that’s how it’s working.
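A minimal sketch of the “math on words” idea with invented 3-dimensional vectors (real embeddings have hundreds or thousands of learned dimensions): the query’s nearest neighbors by cosine similarity are the fruit words, as in the apple/orange example above.

```python
# Toy embeddings: words as vectors, relatedness as cosine similarity.
# All numbers are made up for illustration.

import math

EMBEDDINGS = {
    "apple":  [0.9, 0.1, 0.0],
    "orange": [0.8, 0.2, 0.1],
    "car":    [0.0, 0.9, 0.4],
}

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

query = [0.85, 0.15, 0.05]  # pretend this is the embedding of "fruit"
ranked = sorted(EMBEDDINGS,
                key=lambda w: cosine_similarity(EMBEDDINGS[w], query),
                reverse=True)
print(ranked)
# -> ['apple', 'orange', 'car']: the fruit words sit closest to the query.
```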
164 00:25:49.370 ⇒ 00:25:54.950 Casie Aviles: And that’s also something we’re actually applying for one of our clients, ABC, so…
165 00:25:55.270 ⇒ 00:26:01.440 Casie Aviles: We’re basically taking a bunch of their documents, and then we…
166 00:26:01.660 ⇒ 00:26:06.440 Casie Aviles: Turn them into embeddings so that the AI assistant can…
167 00:26:06.760 ⇒ 00:26:10.940 Casie Aviles: Query for, like, the relevant section in those documents.
168 00:26:12.710 ⇒ 00:26:18.890 Casie Aviles: Yeah, basically, that’s, that’s my lab share for today.
169 00:26:19.860 ⇒ 00:26:22.040 Uttam Kumaran: I guess, like, Casie, question on this, like.
170 00:26:22.420 ⇒ 00:26:25.559 Uttam Kumaran: Can you ex- can you explain, like.
171 00:26:26.020 ⇒ 00:26:33.639 Uttam Kumaran: Is it a fair example to say, like, that almost the next word is getting predicted, the next token is getting predicted?
172 00:26:33.890 ⇒ 00:26:37.099 Uttam Kumaran: Like, how do you… how can you explain to people a little bit about, like.
173 00:26:37.720 ⇒ 00:26:42.809 Uttam Kumaran: you’re not putting an answer… you’re not asking a question to AI, and it’s like, okay, I’m gonna…
174 00:26:43.200 ⇒ 00:26:45.670 Uttam Kumaran: I have, like, this whole paragraph ready to pull.
175 00:26:45.790 ⇒ 00:26:49.280 Uttam Kumaran: Right? If you can talk a little bit about how it’s predicting the next token.
176 00:26:50.980 ⇒ 00:26:56.720 Casie Aviles: You mean, like, in terms of, like, embeddings, or, like, just in how they work right now?
177 00:26:57.930 ⇒ 00:27:05.550 Uttam Kumaran: Just how, yeah, how it works right now, and, like, how when you… when a response is coming through, it’s actually, like, predicting the next
178 00:27:06.000 ⇒ 00:27:06.850 Uttam Kumaran: piece.
179 00:27:08.530 ⇒ 00:27:14.930 Casie Aviles: Yeah, like, I’m not, like, super sure, like, how exactly it works either, but… I just know that…
180 00:27:15.170 ⇒ 00:27:17.050 Casie Aviles: You know, there’s, like.
181 00:27:18.630 ⇒ 00:27:26.800 Casie Aviles: what… what I do know is… what I know is, like, based on, like, the training data that they have, they would assign weights or something,
182 00:27:27.230 ⇒ 00:27:31.660 Casie Aviles: And based on, like, what’s the likeliest response…
183 00:27:33.160 ⇒ 00:27:39.349 Casie Aviles: you know, that would be what the AI would output… at least, that’s how I understand it right now.
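Casie’s description (weights learned from training data, then output the likeliest response) matches next-token sampling. A toy illustration with invented scores, assuming nothing about any particular model:

```python
# Next-token prediction in miniature: score each vocabulary token given
# the context, convert scores to probabilities with softmax, sample one.
# The logits below are invented; in a real LLM they come from the
# network's learned weights.

import math
import random

def softmax(scores: list[float]) -> list[float]:
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

vocab  = ["Paris", "London", "banana", "the"]
logits = [6.0, 2.5, -1.0, 0.5]  # hypothetical scores after "The capital of France is"

probs = softmax(logits)
for tok, p in zip(vocab, probs):
    print(f"{tok:>7}: {p:.3f}")  # "Paris" dominates

# Generation repeats this loop: sample a token, append it to the
# context, and rescore the whole vocabulary.
next_token = random.choices(vocab, weights=probs)[0]
```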
184 00:27:44.740 ⇒ 00:27:47.010 Jasmin Multani: Yeah, I wonder how this is.
185 00:27:47.310 ⇒ 00:27:49.429 Jasmin Multani: For our, like, which clients?
186 00:27:49.550 ⇒ 00:27:53.109 Jasmin Multani: Which is mainly for one of their settings.
187 00:27:53.220 ⇒ 00:28:01.030 Jasmin Multani: I know these embeddings are really, really helpful for recommendations, being able to float…
188 00:28:01.790 ⇒ 00:28:06.969 Jasmin Multani: You know, let’s say, like, which Element product this person would like.
189 00:28:07.190 ⇒ 00:28:12.840 Jasmin Multani: And I also wonder how, like, what’s the minimum data?
190 00:28:13.090 ⇒ 00:28:17.209 Jasmin Multani: Requirements, like amount of data, or,
191 00:28:17.910 ⇒ 00:28:27.479 Jasmin Multani: Data values or data features that describe the customer and their habits in order for this to be effective.
192 00:28:30.160 ⇒ 00:28:31.090 Jasmin Multani: Yeah, so forth.
193 00:28:31.090 ⇒ 00:28:38.680 Casie Aviles: Or… So, sorry, what was the question again? Like, how many…
194 00:28:39.350 ⇒ 00:28:52.210 Jasmin Multani: Yeah, let’s… as a first base, like, out of all of our clients, who do you think would benefit from these embeddings? And, what services could we introduce these embeddings into?
195 00:28:52.670 ⇒ 00:28:57.489 Casie Aviles: Oh, okay. I think… Anything that has to do with, like.
196 00:28:58.020 ⇒ 00:29:04.830 Casie Aviles: Getting, like, spec… you know, specific, information from a huge…
197 00:29:05.300 ⇒ 00:29:11.750 Casie Aviles: con- document. I think that’s, like, for example, like, what I did mention with ABC,
198 00:29:12.290 ⇒ 00:29:19.289 Casie Aviles: Like, the problem that… that we were facing in the beginning was they have a lot of documents that were…
199 00:29:19.590 ⇒ 00:29:21.899 Casie Aviles: Disorganized, you know, and…
200 00:29:24.100 ⇒ 00:29:31.039 Casie Aviles: And we, we, we did, like, we had to clean that up and make sure that it was…
201 00:29:31.460 ⇒ 00:29:35.500 Casie Aviles: You know, that we could more easily process that.
202 00:29:36.330 ⇒ 00:29:42.860 Casie Aviles: So it’s easy… so more… easier for the AI to, you know, do the… perform the vector embeddings.
203 00:29:43.300 ⇒ 00:29:44.670 Casie Aviles: querying.
204 00:29:44.920 ⇒ 00:29:47.309 Casie Aviles: And so I think, yeah, anything that has…
205 00:29:50.530 ⇒ 00:29:58.030 Casie Aviles: Yeah, it’s getting, like, specific information from, from, basically a pool of texts.
206 00:29:58.570 ⇒ 00:30:04.329 Casie Aviles: They would benefit from using embeddings, because it’s not like, you know,
207 00:30:04.880 ⇒ 00:30:06.659 Casie Aviles: It’s not going to be, like…
208 00:30:07.070 ⇒ 00:30:11.759 Casie Aviles: The exact matching or string matching, you know, we can actually use, like.
209 00:30:12.140 ⇒ 00:30:15.460 Casie Aviles: The numbers there that represent the meaning.
210 00:30:15.700 ⇒ 00:30:23.470 Casie Aviles: Of the, you know, of words, or… Phrases, for example, so… Mmm.
211 00:30:23.690 ⇒ 00:30:38.679 Jasmin Multani: Yeah, yeah, from, like, the little I’ve been exposed to, I know the numbers can go up really, really high. It’s, like, .222222, that could represent queen, or .22223, that could represent woman.
212 00:30:38.680 ⇒ 00:30:39.260 Casie Aviles: Yeah.
213 00:30:39.260 ⇒ 00:30:42.179 Jasmin Multani: Is that how other people understand it, too, in the room?
214 00:30:43.780 ⇒ 00:30:47.880 Jasmin Multani: Maybe Pranav can chime in with them, if you guys know.
215 00:30:48.120 ⇒ 00:30:56.089 Uttam Kumaran: Yeah, my… my underst… my understanding is that, like, when you do… when you… When you compare, like, vectors.
216 00:30:56.230 ⇒ 00:31:13.649 Uttam Kumaran: it’s, like, you’re comparing, like, distance in, like, a 3D space. So, basically, what it’s doing is it’s, like, predicting, like, what is the closest vector, and it’s ultimately, like, a little bit of a math problem. Yes, like, each vector, the way Casie has it, is represented like this.
217 00:31:13.650 ⇒ 00:31:22.290 Uttam Kumaran: And so, what it does is it just predicts, like, what is the next closest vector. The embedding process is, like, going from cat to these three things.
218 00:31:22.290 ⇒ 00:31:34.860 Uttam Kumaran: These three, like, vectors are just describing cat in a space. And so ultimately, yes, they’ve not only done things where they’re expanding the number of vectors, but also the precision of the vectors.
219 00:31:34.860 ⇒ 00:31:50.079 Uttam Kumaran: But ultimately, embedding is a huge deal, and there are several models to do embedding, but what’s… what’s actually happening now is, like, the token throughput per second, which is, like, how fast can the language model actually process
220 00:31:50.080 ⇒ 00:32:02.320 Uttam Kumaran: go through the embedding and process the next token is what’s getting optimized. So that’s why you’ll see, like, ChatGPT, the latest ChatGPT, is, like, super, super fast, and they publish metrics around TPS, like tokens per second.
221 00:32:02.390 ⇒ 00:32:16.870 Uttam Kumaran: And that is, like, really where I think there’s a lot of innovation on throughput. They are also starting to innovate on every part of this, like, LLM development structure, from training to the data corpus.
222 00:32:17.220 ⇒ 00:32:25.750 Uttam Kumaran: like, fine-tuning, kind of all the direction, but I think it is helpful to really understand that it is just predicting.
223 00:32:25.870 ⇒ 00:32:28.929 Uttam Kumaran: And the predictions are getting accurate, they’re getting deeper.
224 00:32:30.120 ⇒ 00:32:49.939 Uttam Kumaran: And you’ll start to see that now, I think what the innovation is is, like, multi-agent systems. So you have several agents, you have several LLMs that are issuing a query, getting results back, and then comparing them. And so, one thing that’s been helpful for me is to understand these fundamentals and sort of start to abstract them up.
225 00:32:50.070 ⇒ 00:33:09.400 Uttam Kumaran: Like, imagine you had several ChatGPT sessions up. Imagine you had another ChatGPT session that just runs those and monitors those. So, like, I feel like in our business, it’s helpful to start to see those abstractions. I don’t know, that’s at least, like, how I think about it. I don’t know, Pranav, if, like, you would describe it
226 00:33:10.100 ⇒ 00:33:11.639 Uttam Kumaran: Like, any differently.
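A bare-bones sketch of the multi-agent pattern Uttam describes (several sessions answer, one session monitors and reconciles). `ask_llm` is a hypothetical stand-in for whatever chat-completion API is in use, not a real library call:

```python
# Fan-out/fan-in over LLM calls: workers draft answers independently,
# then a supervisor call compares and merges them.

def ask_llm(prompt: str) -> str:
    raise NotImplementedError("wire this to your chat-completion API")

def multi_agent_answer(question: str, n_workers: int = 3) -> str:
    # Fan out: independent sessions answer the same question.
    drafts = [ask_llm(f"Answer concisely: {question}") for _ in range(n_workers)]
    # Fan in: a supervisor session compares the drafts.
    numbered = "\n".join(f"{i + 1}. {d}" for i, d in enumerate(drafts))
    return ask_llm(
        f"Here are {n_workers} independent answers to {question!r}:\n"
        f"{numbered}\nCompare them and return the single best, corrected answer."
    )
```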
227 00:33:12.390 ⇒ 00:33:13.920 Pranav: Yeah, on the…
228 00:33:14.100 ⇒ 00:33:26.709 Pranav: I was just, while you were saying that too, just kind of just, like, re-remembering certain things, because, you know, sometimes we just get lost in, like, the things that we’re building that I kind of just need to, like, retouch on some of the stuff, but…
229 00:33:26.830 ⇒ 00:33:33.180 Pranav: One thing that is… Kind of the basic understanding of, like, how you can think about embeddings is that
230 00:33:33.270 ⇒ 00:33:52.279 Pranav: Like Casie was saying, like, if you have this huge document store, the best way to query against that and to get the relevant information is by embedding the query that you’re giving to, let’s say, the chatbot, and then finding what is similar, and then that’s, like, what Uttam was saying is, like, that’s where the math comes in.
231 00:33:52.280 ⇒ 00:33:57.749 Pranav: Such as, like, what is the embedding that is most similar to the query? Now, like.
232 00:33:58.370 ⇒ 00:34:02.759 Pranav: sometimes that’s not exactly what you need to do, you don’t need to just find the exact
233 00:34:03.000 ⇒ 00:34:07.240 Pranav: Information in the… in the…
234 00:34:07.310 ⇒ 00:34:19.570 Pranav: in the embeddings that is, like, similar to the query that you’re asking. And so, yeah, that’s what I was just, like, looking into, like, semantic search a little bit, too, which is how you can just, like, boost your vector search.
235 00:34:19.610 ⇒ 00:34:31.600 Pranav: And so, I think high level, though, like, kind of what Casie was saying is, like, right. Like, where I see the application, like, when I’m in sales calls and I’m talking about, like.
236 00:34:31.670 ⇒ 00:34:42.649 Pranav: knowledge bases. It’s like, where do you have just, like, this vat of information, where it’s just documents that people are just, every single day, just, like, searching through?
237 00:34:42.880 ⇒ 00:34:50.129 Pranav: And that’s, like, the best application for what Casie’s, like, presenting right now, like, creating embeddings.
238 00:34:53.420 ⇒ 00:35:05.840 Uttam Kumaran: So now what you’re gonna have… you have, like, multimodal search, so, like, for example, it’s… you’re not only embedding text, you can embed images, videos, audio, and so you’re seeing models come out with native multimodal
239 00:35:05.950 ⇒ 00:35:16.209 Uttam Kumaran: like, search. And then, yes, for several clients, we actually will embed, like, hundreds of documents, and so what you’re embedding is, like.
240 00:35:16.350 ⇒ 00:35:35.480 Uttam Kumaran: what’s in the document, metadata, the date it was opened, who created it, all this information. So you build an embedding model so that when you ask a question, you basically are positioning all those documents in a space so that the AI is like, okay, this document is more relevant to this query I just got versus another doc.
241 00:35:35.620 ⇒ 00:35:51.300 Uttam Kumaran: Right? And so that’s where, in math, right, vectors, you have a distance between a point and a vector, and so it’s using that, like, distance algorithm to sort of, like, measure, okay, like, what is the relevant document, or what is the relevant vector to sort of traverse?
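A sketch of the document-retrieval flow described here: fold metadata into the text that gets embedded, embed the question the same way, and rank by distance. `embed` stands in for a real embedding model and is a hypothetical placeholder, not a specific API:

```python
# Index documents (content + metadata) as vectors, then answer
# "which doc is most relevant?" with cosine similarity.

def embed(text: str) -> list[float]:
    raise NotImplementedError("call your embedding model here")

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = lambda v: sum(x * x for x in v) ** 0.5
    return dot / (norm(a) * norm(b))

documents = [
    {"title": "Q3 churn analysis", "author": "ops", "body": "..."},
    {"title": "Onboarding guide",  "author": "cs",  "body": "..."},
]

# Build the index: embed content plus metadata, as described above.
index = [
    (doc, embed(f"{doc['title']} by {doc['author']}\n{doc['body']}"))
    for doc in documents
]

def most_relevant(question: str) -> dict:
    q = embed(question)
    return max(index, key=lambda pair: cosine(pair[1], q))[0]
```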
242 00:35:52.310 ⇒ 00:36:00.840 Jasmin Multani: Okay, so what a cool application could be, like, hey, given a customer’s,
243 00:36:02.060 ⇒ 00:36:10.050 Jasmin Multani: like, experience through the website, do we think that they are at risk, or do they… do we think that,
244 00:36:10.330 ⇒ 00:36:14.190 Jasmin Multani: This customer really likes their relationship with
245 00:36:14.820 ⇒ 00:36:19.310 Jasmin Multani: Element, or Global VetLink, or something.
246 00:36:20.080 ⇒ 00:36:31.690 Uttam Kumaran: Yeah, like, a good example there is, you may actually find now… so what happened now is because the context sizes are so big for AI, you may actually just be able to, like.
247 00:36:32.020 ⇒ 00:36:37.929 Uttam Kumaran: Run a query that pulls all their events, and then… Just literally pass, like,
248 00:36:38.090 ⇒ 00:36:46.110 Uttam Kumaran: a thousand-line CSV to ChatGPT, and it will actually execute that entire thing. You may not need an external embedding model.
249 00:36:46.260 ⇒ 00:36:54.280 Uttam Kumaran: But yes, like, there is… as an example, you could create embeddings on potentially, like, every customer and their behavior.
250 00:36:55.830 ⇒ 00:37:01.179 Uttam Kumaran: It’s like, another… another way to think about that problem, though, and this is where, like, I sort of sometimes challenge
251 00:37:01.230 ⇒ 00:37:13.250 Uttam Kumaran: Pranav and the team on, like, the architecture, is wouldn’t it be just better for your AI to be able to write the best query to then find the answer? Right? And so another way of thinking about it is, like, how would Jasmin approach that problem? Well.
252 00:37:13.250 ⇒ 00:37:26.610 Uttam Kumaran: you probably have a huge table with all of the events. You’re gonna think about, like, what is the combination of events that leads to the best outcome. And so, what are pieces? And then you’re gonna run that, and then you’re gonna get, like.
253 00:37:26.610 ⇒ 00:37:34.950 Uttam Kumaran: these are the at-churn, these are the at-risk, versus these are, like, the people that are using the product the best. But wouldn’t it be better for actually AI to just
254 00:37:35.130 ⇒ 00:37:46.350 Uttam Kumaran: figure out what combination of queries or… and just… that actually proves that, and the AI can actually write SQL really well. So, in that sense, you don’t have to embed, like.
255 00:37:46.430 ⇒ 00:37:57.280 Uttam Kumaran: 5 million events, you just have to get the AI to write the best query. And so then it’s like, okay, how does the AI know what is the optimization for the best query to run?
256 00:37:57.320 ⇒ 00:38:10.589 Uttam Kumaran: How can it maybe run, like, several queries, get a corpus of a dataset, and then give you, actually, hey, I ran these, like, 100 queries in parallel, and I identified that these are the… or these are the golden events.
257 00:38:10.700 ⇒ 00:38:18.480 Uttam Kumaran: So, in that sense, like, you could go the distance and embed all of those events, but actually, the easier solve could be just, like.
258 00:38:19.010 ⇒ 00:38:24.000 Uttam Kumaran: have 10 LLMs run queries, compile the results, and give you the best
259 00:38:24.230 ⇒ 00:38:27.920 Uttam Kumaran: set of golden events. Very similar to how, like, we would just do this
260 00:38:28.630 ⇒ 00:38:31.490 Uttam Kumaran: You know, just, like, a couple years ago, you know?
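A sketch of the alternative Uttam is arguing for: instead of embedding millions of events, let LLMs draft candidate SQL against the events table, run them, and have a final call compile the “golden events.” `ask_llm` and `run_sql` are hypothetical stand-ins for a chat API and a warehouse client (e.g. a Snowflake or MotherDuck connector):

```python
# LLM-written queries instead of mass embedding: propose, execute, compare.

def ask_llm(prompt: str) -> str:
    raise NotImplementedError("chat-completion API goes here")

def run_sql(query: str) -> list[dict]:
    raise NotImplementedError("warehouse client goes here")

SCHEMA = "events(user_id, event_name, occurred_at, properties)"

def find_golden_events(n_candidates: int = 5) -> str:
    results = []
    for _ in range(n_candidates):
        sql = ask_llm(
            f"Given the table {SCHEMA}, write one SQL query that helps "
            "identify which event combinations predict retention. Return only SQL."
        )
        # Keep only a sample of rows so the compile step stays small.
        results.append({"sql": sql, "rows": run_sql(sql)[:50]})
    # Compile: have the LLM compare the candidate analyses.
    return ask_llm(
        f"Candidate queries and sample results: {results}\n"
        "Which events look 'golden' for retention, and why?"
    )
```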
261 00:38:34.580 ⇒ 00:38:43.369 Uttam Kumaran: So I find it almost, like, helpful to literally just model our typical process, and then find ways where AI… we could pass off, like.
262 00:38:43.650 ⇒ 00:38:47.050 Uttam Kumaran: some of the execution to AI,
263 00:38:47.880 ⇒ 00:38:59.899 Uttam Kumaran: But, yeah, I think there’s some other examples where we can actually embed, like, customer profiles, like, find customers that are similar to each other. Like, there’s a lot… there’s a lot in AI with graph DBs as well, like.
264 00:39:00.080 ⇒ 00:39:03.349 Uttam Kumaran: Graphical relationships between entities, things like that.
265 00:39:04.540 ⇒ 00:39:08.260 Jasmin Multani: So, when you say you hand things off to AI, do you mean, like,
266 00:39:08.450 ⇒ 00:39:16.819 Jasmin Multani: AI meaning an external software that we use, or something internal that Brainforge develops?
267 00:39:17.460 ⇒ 00:39:24.960 Uttam Kumaran: Yeah, I guess, like, I’m being very agnostic right now. When I say hand off to AI, I’m like, I consider, like, you give an instruction to an LLM.
268 00:39:25.710 ⇒ 00:39:31.269 Uttam Kumaran: like, I think agnostic of anything that we’ve built. Meaning, hey, LLM,
269 00:39:31.460 ⇒ 00:39:33.599 Uttam Kumaran: like, I want you to help me.
270 00:39:34.150 ⇒ 00:39:36.190 Uttam Kumaran: Find a set of golden events.
271 00:39:36.570 ⇒ 00:39:40.459 Uttam Kumaran: Given this… that events table that I have.
272 00:39:41.110 ⇒ 00:39:43.520 Uttam Kumaran: Help me think through the methodology.
273 00:39:43.720 ⇒ 00:39:46.690 Uttam Kumaran: I’ll approve the methodology, and then go ahead and, like.
274 00:39:46.860 ⇒ 00:40:01.840 Uttam Kumaran: run your queries. And so part of that is actually there’s a lot of setup involved, right? Like, you have to prompt it in sort of that way. The LLM, through Cursor, or Codex, or ChatGPT, has to have access to Snowflake or MotherDuck.
275 00:40:03.050 ⇒ 00:40:11.080 Uttam Kumaran: And it has to be able to access and actually go run those queries. But I would challenge you to try to do that. Like, I think you’ll find that the results are…
276 00:40:11.230 ⇒ 00:40:18.400 Uttam Kumaran: really, really good. And actually, the way that we typically do it is actually the right way. The optimization is just, like.
277 00:40:19.040 ⇒ 00:40:30.290 Uttam Kumaran: letting AI do even more research than we would have done, right? Or go even further than we would have done, given time constraints. That’s what I found in data, primarily, is like, I don’t think anything in our job
278 00:40:30.560 ⇒ 00:40:36.080 Uttam Kumaran: Has changed, it’s just, like, the speed at which we can do that and the accuracy is what’s changed, you know?
279 00:40:43.790 ⇒ 00:40:47.169 Casie Aviles: Yeah, yeah, that’s great. Also, like.
280 00:40:47.370 ⇒ 00:40:55.450 Casie Aviles: we don’t necessarily need to do, like, embeddings as much anymore, like, as Uttam also mentioned, like.
281 00:40:55.690 ⇒ 00:41:01.219 Casie Aviles: This was some… something that was, one of the things that…
282 00:41:01.500 ⇒ 00:41:06.129 Casie Aviles: We had when we were dealing with the problem for the context size as well, like…
283 00:41:06.600 ⇒ 00:41:10.780 Casie Aviles: Embeddings were also, like, a good way to…
284 00:41:11.080 ⇒ 00:41:14.820 Casie Aviles: You know, tackle that problem when we couldn’t feed, like, more…
285 00:41:15.380 ⇒ 00:41:19.680 Casie Aviles: in, like, context into the AI. But now we can see, like,
286 00:41:20.080 ⇒ 00:41:25.200 Casie Aviles: their context window has also been… increasing, so…
287 00:41:25.610 ⇒ 00:41:30.190 Casie Aviles: Yeah, there are… sometimes this is still a viable option, but…
288 00:41:31.400 ⇒ 00:41:40.330 Casie Aviles: Yeah, oftentimes there’s also, you know, we can do a lot with just, you know, getting, like, for example, with Cursor and how it’s able to, like, get
289 00:41:40.760 ⇒ 00:41:45.389 Casie Aviles: read from… our documentation, so…
290 00:41:49.610 ⇒ 00:41:57.400 Casie Aviles: I don’t think we build LLMs in-house yet, right? Because we would have to, like, do the training
291 00:41:58.860 ⇒ 00:42:04.300 Casie Aviles: Or we would probably take… A model, and we would…
292 00:42:04.770 ⇒ 00:42:07.490 Casie Aviles: fine-tune it with more data, I believe.
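For what Casie describes (take a model, fine-tune it with more data), a minimal sketch using the OpenAI Python SDK’s fine-tuning endpoints; the file name and base model are placeholders, and the training file would be a JSONL of example exchanges:

```python
# Kick off a hosted fine-tuning job on top of an existing model.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Upload the training examples (JSONL of chat-format examples).
training_file = client.files.create(
    file=open("consulting_examples.jsonl", "rb"),  # placeholder path
    purpose="fine-tune",
)

# Start the job against a fine-tunable base model.
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-4o-mini-2024-07-18",  # example base model
)
print(job.id, job.status)
```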
293 00:42:08.790 ⇒ 00:42:19.350 Uttam Kumaran: Yeah, I would say we’re yet to even, like, do real fine-tuning exercises, but I think the alpha you’re gonna find is in just getting your prompts right and stringing LLMs together.
294 00:42:20.330 ⇒ 00:42:26.979 Uttam Kumaran: you’re gonna find, like, it’s really, really good. Like, I’m thinking there’s less and less use cases for training LLMs, because
295 00:42:27.570 ⇒ 00:42:32.189 Uttam Kumaran: The free, like, the open source models are, like, better than,
296 00:42:32.900 ⇒ 00:42:36.570 Uttam Kumaran: are, like, 100x better than, like, ChatGPT 4.0 right now.
297 00:42:36.810 ⇒ 00:42:48.180 Uttam Kumaran: So the need for us to, like, train… I think fine-tuning could be interesting, like, I’ve talked to Clarence about, hey, can we fine-tune, like, a consulting model based on, like, a bunch of our corpus of private data?
298 00:42:48.450 ⇒ 00:42:51.349 Uttam Kumaran: So I think there’s still some, like, interesting things for us to try.
299 00:42:52.030 ⇒ 00:42:53.280 Uttam Kumaran: Side projects.
300 00:42:56.500 ⇒ 00:43:03.169 Casie Aviles: Yeah, I haven’t personally done, like, fine-tuning, so that’s also something I’m… I’m now curious about…
301 00:43:04.380 ⇒ 00:43:09.020 Jasmin Multani: I feel like in cybersecurity,
302 00:43:10.320 ⇒ 00:43:13.430 Jasmin Multani: It’s just the possibilities are endless.
303 00:43:13.840 ⇒ 00:43:21.369 Jasmin Multani: Because people are always trying to do bad things, so the option of, like, fine-tuning is always there. So I always feel like.
304 00:43:21.700 ⇒ 00:43:25.660 Jasmin Multani: That’s gonna be a service, a forever service, as long as humans are alive.
305 00:43:25.850 ⇒ 00:43:26.659 Jasmin Multani: That we can build.
306 00:43:26.660 ⇒ 00:43:34.729 Uttam Kumaran: I would like… I would like us to get into fine-tuning and training LLMs. I think there’s… it’s like… but it’s… it’s, like, incredibly expensive.
307 00:43:34.930 ⇒ 00:43:35.430 Jasmin Multani: Yeah.
308 00:43:35.430 ⇒ 00:43:42.130 Uttam Kumaran: And, like, again, I think the cheap… the worst open source model right now is, like, extremely good.
309 00:43:43.570 ⇒ 00:43:43.990 Jasmin Multani: Hmm.
310 00:43:43.990 ⇒ 00:43:47.610 Uttam Kumaran: And it… what I’m more… if you look at the trajectory.
311 00:43:48.430 ⇒ 00:43:55.919 Uttam Kumaran: I don’t know if there’s really gonna be a need for a company like ours to do that. I think more of what we’re gonna do is actually just get better at assigning work out.
312 00:43:56.180 ⇒ 00:43:58.170 Uttam Kumaran: you know, to AI to do.
313 00:43:58.630 ⇒ 00:44:01.429 Uttam Kumaran: And I think I’ll have some… probably some more to share.
314 00:44:01.870 ⇒ 00:44:04.619 Uttam Kumaran: next week, probably on this, so…
315 00:44:09.490 ⇒ 00:44:11.480 Casie Aviles: That was a nice discussion.
316 00:44:12.380 ⇒ 00:44:17.719 Casie Aviles: So yeah, I’ll move over to… Let’s see, yeah, executive…
317 00:44:17.720 ⇒ 00:44:34.029 Uttam Kumaran: Casie, can I, can I share something? I just wanted to, like, walk through one thing I prepared today. One of… one of the, I think, challenges that we’re facing across two of our biggest teams, which is sales and delivery, is sort of the handoff from, like.
318 00:44:34.140 ⇒ 00:44:41.439 Uttam Kumaran: sales to service, and, like, I think that’s, like, a good moniker for this, is, like, how do we sell something that someone wants to buy? It’s like.
319 00:44:42.150 ⇒ 00:44:58.140 Uttam Kumaran: the, like, meta problem here. I think we’ve solved that. How do we then scope it that is something we can actually deliver on time? And then how do we, like, quickly move into delivering it, and then move it back up to say, like, hey, we delivered it, what else can we do for you? I think we’ve, like.
320 00:44:58.560 ⇒ 00:45:04.230 Uttam Kumaran: I think our company has gone through waves of, like, figuring this… this, like, challenge out, and I think we’re…
321 00:45:04.540 ⇒ 00:45:11.649 Uttam Kumaran: now that, given, like, we have a CSO SL model, I feel like we’re really close to
322 00:45:11.650 ⇒ 00:45:26.139 Uttam Kumaran: enforcing, like, setting forth, like, another set of standards, just to make it easy for everyone on how a project moves through the manufacturing firm that is Brainforge. And so, a couple things I just wanted to highlight, and then I’ll pass it off.
323 00:45:28.370 ⇒ 00:45:35.059 Uttam Kumaran: I kind of worked on, like, a little bit of, like, a one-pager on, like, how I’m thinking about service to plan, like.
324 00:45:35.250 ⇒ 00:45:42.939 Uttam Kumaran: like, SOW and, like, delivery. And so really, like, if you think about the smallest unit of work at Brainforge is a ticket.
325 00:45:43.000 ⇒ 00:46:01.929 Uttam Kumaran: And I think, like, we’re all… a lot of us are engineers, you’re used to working on tickets. I think tickets are a great unit of work, like, a good, like, representation of a unit of work. And everything is an abstraction on top of that, right? A ticket, there’s a deliverable or a project, there’s a phase. A phase is part of something we offered.
326 00:46:01.930 ⇒ 00:46:12.989 Uttam Kumaran: as part of the offering, there’s a service. And so let’s take a… let’s take an example of, like, Default, which is one of our clients. For Default, we are developing
327 00:46:13.120 ⇒ 00:46:15.659 Uttam Kumaran: We’re landing data for them in MotherDuck.
328 00:46:15.700 ⇒ 00:46:26.679 Uttam Kumaran: We are building data models for them in MotherDuck using dbt, and then we are also building dashboards for them. And then finally, we’re looking at the data and saying, hey, you guys can make more money here.
329 00:46:26.680 ⇒ 00:46:37.149 Uttam Kumaran: what services are we using? Well, down here, you’re gonna see sort of, like, what are our three core service lines? We have AI, we have data, and we have strategy and analytics. Within each, you’re gonna see some of the subservices.
330 00:46:37.150 ⇒ 00:46:52.080 Uttam Kumaran: And so on a company like Default, right, they are buying data and strategy and analytics from, like, I didn’t say anything that was AI, but in particular, they’re buying our data infrastructure work, they’re buying some of our analytics and BI work, but they’re also trying to buy some of our data strategy work
331 00:46:52.350 ⇒ 00:46:58.559 Uttam Kumaran: And so, as you can see, one client has several services they’re buying from us.
332 00:46:58.590 ⇒ 00:47:14.179 Uttam Kumaran: Very similarly, like, if you go to McDonald’s and you buy soda, and you buy a burger, and, you know, something else, right? And so, there’s actually… the way to think about it in our business is we can have multiple teams with purview over each service.
333 00:47:14.180 ⇒ 00:47:19.280 Uttam Kumaran: But ultimately, for the client, we’re a one-stop shop, meaning the client doesn’t care
334 00:47:19.310 ⇒ 00:47:21.960 Uttam Kumaran: about any of this. The client is like.
335 00:47:22.460 ⇒ 00:47:24.279 Uttam Kumaran: I want to make more money.
336 00:47:24.590 ⇒ 00:47:38.389 Uttam Kumaran: I think data, AI, strategy analytics, or one or more, are the way to do that, and I want you guys to do that for me, right? All those things are true. Internally, though, for us to build a business that isn’t just, like.
337 00:47:39.110 ⇒ 00:47:46.110 Uttam Kumaran: everything is for one client, everything is for the next client. We start to build these concepts. And so…
338 00:47:46.150 ⇒ 00:48:00.590 Uttam Kumaran: What we’re trying to look forward to is, like, when a client buys a set of work from us, they’re buying a set of services, they’re buying, like, an offer, they’re also buying the fact that we are going to deliver something on some timeline.
339 00:48:00.630 ⇒ 00:48:08.599 Uttam Kumaran: To give ourselves some feedback, like, I think this is… I think we’ve always struggled to do two things. We’ve struggled to…
340 00:48:08.850 ⇒ 00:48:14.250 Uttam Kumaran: accurately set up SOWs in a way that reflects what we are going to do.
341 00:48:14.300 ⇒ 00:48:27.569 Uttam Kumaran: And we’ve all… we’ve also struggled to say exactly what we’re going to do, and then accomplish it. And that’s not saying that we’re not the best at what we do, we’re not the fastest, those are all true, but both…
342 00:48:27.570 ⇒ 00:48:39.429 Uttam Kumaran: of those sides are struggling because we’re not on the same page. And so one thing that we’re going to be trying to do next quarter is really have, like, a clear through-line from what we sold to a client.
343 00:48:39.540 ⇒ 00:48:55.400 Uttam Kumaran: down to the CSO who owns the fact that we’re delivering an end, like, solution, down to the service leaders that actually own the service that we’re giving, and then back down to the initiative, the milestone, the project, and the linear ticket.
344 00:48:55.950 ⇒ 00:49:09.600 Uttam Kumaran: And all the way back up, right? As tickets get done, as projects get done, as we check off milestones, we then go back to the SOW, say we delivered this, now faster and cheaper, and then sales can say, what else can we do for you?
345 00:49:10.020 ⇒ 00:49:20.150 Uttam Kumaran: Well, that’s, like, really what I’m trying to, like, get nailed down by next week. And it’s not that, like, we didn’t… we’re not already doing all this, I just think it’s not been…
346 00:49:20.270 ⇒ 00:49:31.379 Uttam Kumaran: codified, and it’s not been made super, super crisp in this way. And so, as part of a lot of this, we’re gonna have, like, templates for certain things, and some process, but ultimately.
347 00:49:31.730 ⇒ 00:49:48.500 Uttam Kumaran: the most important thing that we’re gonna try to deliver on is standards. Like, what is the standard for the way that we show up for each other on our teams? What is the standard for the way that we show up for clients? What is the standard expectation for work across each of these services?
348 00:49:48.500 ⇒ 00:49:58.660 Uttam Kumaran: And that’s something that we’re all gonna just agree on. And that way, when in doubt, we’re not looking for the next template, or we’re not saying, like, oh, there’s no template for this, or there’s no…
349 00:49:58.660 ⇒ 00:50:09.980 Uttam Kumaran: I don’t know exactly what the process is, because process and template is endless. Ultimately, you’re going to develop an intuition for what is the right decision to make, given limited information.
350 00:50:09.990 ⇒ 00:50:26.479 Uttam Kumaran: And I had a great conversation with Pranav today about how do we develop that intuition? Well, part of developing it is actually removing some of this process. It’s like, hey, say they’re going to, like, an Eden kickoff meeting. I’m like, don’t go to the meeting and say, hey, great to meet you.
351 00:50:26.530 ⇒ 00:50:35.329 Uttam Kumaran: I would love to hear, like, you reiterate what your toughest problems are. Like, no. Also, don’t go to the meeting and say, cool, today I’m gonna outline
352 00:50:35.470 ⇒ 00:50:50.689 Uttam Kumaran: this thing, this thing, this thing. No, that’s also wrong. Come to the meeting as a great Brainforge consultant. Like, come to the meeting and say, hey, we know how your business makes money, we know Danny, the COO, like, what you’re kind of having difficulty with.
353 00:50:50.830 ⇒ 00:50:59.490 Uttam Kumaran: we think we can accomplish this unlock for you. I’m happy to go through the projects, tickets, blah blah blah, but ultimately, that’s what we’re here to do, is deliver you an outcome.
354 00:50:59.900 ⇒ 00:51:06.790 Uttam Kumaran: If the client wants to go through tickets, so be it. But ultimately, we want to come across as an outcomes-focused business.
355 00:51:06.850 ⇒ 00:51:18.250 Uttam Kumaran: And as much as possible, we are going to be very, very detail-oriented, but ultimately, the CSO is going to be in charge of delivering the outcomes on the timeline that we stated we would deliver it to.
356 00:51:18.250 ⇒ 00:51:33.029 Uttam Kumaran: And so, like, there will be process and templates and things like that. We’ll use a lot of AI to keep this organized, but I want to stress that the service leaders, your job is to make sure that anything that comes out of your service that goes to a client that’s buying it.
357 00:51:33.100 ⇒ 00:51:47.749 Uttam Kumaran: is on time, is great, and the people that are delivering it are able to deliver it. The CSOs have a… of a different job. Their job is to make sure that they take the entire pie that’s baked from multiple services, and they deliver it with a bow to the client.
358 00:51:47.800 ⇒ 00:52:03.480 Uttam Kumaran: the client may or may not need to care what service, who did it, whatever. They’re the guy, like, they’re the Domino’s delivery driver, like, at your door, being like, I have the pizza for you, right? And, I think we’re gonna arrive at those two roles being, like, really, really crisp.
359 00:52:03.710 ⇒ 00:52:05.860 Uttam Kumaran: Going into, next quarter or so.
360 00:52:07.730 ⇒ 00:52:15.480 Uttam Kumaran: That’s my update on how I’m thinking about this. Any… Questions or thoughts?
361 00:52:19.990 ⇒ 00:52:22.629 Amber Lin: We’ve come a long way, it’s really cool.
362 00:52:23.720 ⇒ 00:52:25.880 Uttam Kumaran: Thanks, Amber. I appreciate it.
363 00:52:26.400 ⇒ 00:52:27.770 Uttam Kumaran: I’m trying.
364 00:52:28.580 ⇒ 00:52:32.200 Uttam Kumaran: yeah, Jasmine or Casey, yeah, anyone?
365 00:52:39.470 ⇒ 00:52:46.789 Uttam Kumaran: This really makes a lot of… this is making, like, perfect sense to everybody here, like, I feel like this took me a lot of thinking to arrive at this point.
366 00:52:47.200 ⇒ 00:52:47.840 Uttam Kumaran: I flew…
367 00:52:47.840 ⇒ 00:52:57.379 Jasmin Multani: Yeah. Yeah. I agree on the intuition thing. Amber and I met up yesterday, and we were so tired.
368 00:52:57.540 ⇒ 00:53:07.590 Jasmin Multani: I think we just, like, chatted about life and career. But there was, like, a point where I think Amber was, like… or, like, my…
369 00:53:08.840 ⇒ 00:53:16.360 Jasmin Multani: I guess, career arc tidbit was, like, at some point, I just got really good at reading the room.
370 00:53:16.580 ⇒ 00:53:28.790 Jasmin Multani: It was just, like, a huge point shift somewhere in my late 20s, and that made it easier for me to show up crisp and informed. Like, even…
371 00:53:29.130 ⇒ 00:53:47.489 Jasmin Multani: in, like, a very professional room, the head of department would be like, oh, Jasmin, you’re really professional. And I’m like, no, this is just how I talk. Like, it’s just permeating every part of my life at this point. Being able to read the room, being able to understand the person in front of me, it’s like.
372 00:53:47.490 ⇒ 00:53:54.220 Jasmin Multani: what do they want to say, but what are they actually saying, and why aren’t they saying the actual thing? Is it because
373 00:53:54.270 ⇒ 00:54:05.639 Jasmin Multani: A, they don’t want a, you know, company liability if they document things, or if they say things out loud, or B, are they just really nervous or unsure of what’s happening?
374 00:54:05.910 ⇒ 00:54:15.199 Jasmin Multani: And there have been times where, like, I’ve had stakeholders where I’m like, I need to get something done, like, can we just write this down and you literally sign off on this?
375 00:54:15.490 ⇒ 00:54:28.130 Jasmin Multani: that person would refuse to document, refuse to, like, say, yes, I agree to do this, and after the second try, I was like, I’m not working with this person anymore, and I’ve told
376 00:54:28.390 ⇒ 00:54:37.130 Jasmin Multani: my manager, I told their manager, and I was like, this isn’t gonna work if this person can’t even agree with what he’s doing, because he’s…
377 00:54:37.350 ⇒ 00:54:41.630 Jasmin Multani: misquoting things. So I think there’s that instinct where…
378 00:54:42.010 ⇒ 00:54:47.449 Jasmin Multani: Over time, like, the type of people you will work with, and you refuse to work with until…
379 00:54:47.590 ⇒ 00:54:48.870 Jasmin Multani: the get-togethers.
380 00:54:50.140 ⇒ 00:54:55.669 Uttam Kumaran: Yeah, I would… I would describe that as intuition. I think maybe I’ll come up with some more words for that, but, like.
381 00:54:55.880 ⇒ 00:55:00.410 Uttam Kumaran: I… I think what I’m trying to define is sort of, like, the undefinable part.
382 00:55:00.540 ⇒ 00:55:05.720 Uttam Kumaran: And that’s why even… what I don’t want to happen is that we are, like.
383 00:55:05.910 ⇒ 00:55:11.160 Uttam Kumaran: Okay, yeah, the next piece is the project plan, the next piece is this thing, and then we go to ticket, like…
384 00:55:11.500 ⇒ 00:55:15.730 Uttam Kumaran: I feel like, ultimately, in our business, that will get taken care of.
385 00:55:15.830 ⇒ 00:55:29.020 Uttam Kumaran: But fundamentally, if you’re not able to go to the meeting and get confidence from this other human being on the line, who we’re there to serve, and we’re there to make their job better, help them get promoted, help their company win.
386 00:55:29.070 ⇒ 00:55:42.180 Uttam Kumaran: That’s it. And, like, that is the intuition building that I’m trying to impart. I think you’ll find over the next quarter, it’s gonna get easier to build slides. This whole process is gonna get automated. Like, that’s not…
387 00:55:42.340 ⇒ 00:55:54.109 Uttam Kumaran: our job. Our job is to be the best partners, the best consultants, you know, for these people that are paying us a ton of money to do so, and that’s gonna require
388 00:55:54.110 ⇒ 00:56:06.810 Uttam Kumaran: like, pulling the legs out of this… these processes, so everybody here develops that, like, intuition. And so I think you’re gonna hear us talk a lot about that, and you’re gonna see me challenge, like.
389 00:56:07.660 ⇒ 00:56:13.260 Uttam Kumaran: a lot of the ways of typically doing things, because ultimately, and what I told Pranav is, like.
390 00:56:13.610 ⇒ 00:56:18.669 Uttam Kumaran: If you deliver for the client, and the client is happy, and the client wants to do more work.
391 00:56:19.260 ⇒ 00:56:21.530 Uttam Kumaran: what else can I ask for, right?
392 00:56:21.690 ⇒ 00:56:29.570 Uttam Kumaran: maybe instead of, like, the original pizza that you promised you baked a pie, it doesn’t matter to me. What matters is ultimately that
393 00:56:29.870 ⇒ 00:56:34.449 Uttam Kumaran: The client is happy that we delivered what they wanted, and that we have opportunity to do more for them.
394 00:56:34.960 ⇒ 00:56:53.089 Uttam Kumaran: So I will… we will start to let people run their services and their clients the way they want to. Of course, I… we… me and Robert, have done this for the last 3 years, day in and day out. I have a lot of solutions for problems or ways of doing things, but
395 00:56:53.430 ⇒ 00:57:04.239 Uttam Kumaran: we are… we’re gonna be a lot… lot, lighter-handed on defining, like, this thing to this thing to this thing. Instead, it’s gonna be, hey, do you know how Eden makes money?
396 00:57:04.600 ⇒ 00:57:07.420 Uttam Kumaran: Do you know what Eden’s priorities are this quarter?
397 00:57:07.800 ⇒ 00:57:12.280 Uttam Kumaran: Do you know why they’re paying us to do the things that we’re doing, and how they ladder up?
398 00:57:13.520 ⇒ 00:57:20.810 Uttam Kumaran: until we, like, are able to answer that per client, we will not move to, like, okay, show me the projects, show me the milestones. So…
399 00:57:20.930 ⇒ 00:57:36.040 Uttam Kumaran: I think that’s a great point, Jasmine. Yeah. It’s tough, though. I think this… this industry can get a lot… you can add a lot of process in order to define things, but ultimately, you’re dodging the part that you have to develop this intuition to be a great consultant, you know?
400 00:57:36.360 ⇒ 00:57:40.950 Jasmin Multani: Yeah, and the crappy part is that, like, you build that intuition
401 00:57:41.110 ⇒ 00:57:55.990 Jasmin Multani: through trial by fire. Yeah. By getting yelled at, and then being like, actually, I don't like getting yelled at. Am I getting yelled at because I dropped the ball, or is it because this person's, like, really intense? And how do I manage out of that
402 00:57:55.990 ⇒ 00:58:01.519 Jasmin Multani: in a way that they still feel respected and they still feel calm? I think that's, like…
403 00:58:01.550 ⇒ 00:58:09.290 Jasmin Multani: The only reason I was able to build this intuition is that I was able to hop on different teams
404 00:58:09.310 ⇒ 00:58:25.790 Jasmin Multani: and see how calm, like, my most recent team has been. Like, if someone drops the ball, they're like, okay, let's help you pick back up, how do we lower the barrier? If it didn't work, if you didn't launch this, then it's on you. Like, no one's gonna yell at you, no one's gonna belittle you, but you're just…
405 00:58:26.120 ⇒ 00:58:29.300 Jasmin Multani: your thing is not gonna go forward.
406 00:58:29.660 ⇒ 00:58:38.299 Jasmin Multani: And it was very, very calm. I’d say, like, the response was indifference, and I feel like when I’m met with indifference, that’s when I’m, like.
407 00:58:38.520 ⇒ 00:58:54.869 Jasmin Multani: more reflective, and I'm like, oh, I could have updated this. But in other cases, if I'm met with, like, an 8 p.m. Slack from a director, I'm like, you're insane. You must be insane if you think…
408 00:58:55.830 ⇒ 00:59:05.880 Jasmin Multani: I should be responding to this, yeah. So I think that's also the culture we set for co-workers to show up with. It's like, I like to carry myself
409 00:59:07.160 ⇒ 00:59:17.750 Jasmin Multani: making sure that I log off by 6 p.m., and I told this to Amber, too. I think she was like, how much money… how much would you have to be paid for this stress? And I'm like, zero dollars, like, fine.
410 00:59:19.090 ⇒ 00:59:24.080 Jasmin Multani: If it's 7 or 8 p.m., and I'm stressed out, and I'm yelling at my…
411 00:59:24.530 ⇒ 00:59:29.500 Jasmin Multani: family members, because I’m still thinking about work, like, that’s time for me to re-evaluate.
412 00:59:29.620 ⇒ 00:59:30.370 Jasmin Multani: what I’m doing.
413 00:59:30.370 ⇒ 00:59:30.800 Uttam Kumaran: Yeah.
414 00:59:31.120 ⇒ 00:59:37.070 Jasmin Multani: Previously, like, we sold groceries, and I was an employee. It's not like that…
415 00:59:37.440 ⇒ 00:59:45.760 Jasmin Multani: with my parents, who run their own liquor store; they were end-to-end owners, and they had employees responding to them. And even they had a hard…
416 00:59:46.030 ⇒ 00:59:53.620 Jasmin Multani: cutoff of, like, okay, son, yep, we're going home. There's no point in stressing out about this. So that's also the co-worker I am, like,
417 00:59:53.880 ⇒ 01:00:04.969 Jasmin Multani: if it's 6 p.m., I really don't care about work. Sorry, sorry, that's not, like, kosher, but if it's 7 p.m. and I'm still thinking about work, that means I didn't do enough;
418 01:00:05.360 ⇒ 01:00:07.760 Jasmin Multani: 8 a.m. to 6 p.m. is my…
419 01:00:07.760 ⇒ 01:00:27.390 Uttam Kumaran: Yeah, I think the last part you mentioned is right, right? Like, this is one of the things we'll talk about: if you own the thing that you promised to do, then yeah, it doesn't matter when you log in or log off. Like, frankly, I couldn't care less. We care about the ultimate thing, which is our clients getting a great service from us.
420 01:00:27.520 ⇒ 01:00:28.340 Uttam Kumaran: You know.
421 01:00:28.510 ⇒ 01:00:39.420 Uttam Kumaran: And that's why it's gonna be freeing, but also, if you're used to having a lot of structure, that will be the challenge: how do you elevate, you know, through that, so…
422 01:00:40.590 ⇒ 01:00:46.159 Uttam Kumaran: Cool. I think maybe we can continue. Casie, if you want to just share, we can go to ops.
423 01:00:46.750 ⇒ 01:00:47.290 Casie Aviles: Fair.
424 01:00:50.440 ⇒ 01:00:51.830 Casie Aviles: Oops.
425 01:00:56.940 ⇒ 01:00:57.690 Casie Aviles: Okay.
426 01:01:01.150 ⇒ 01:01:01.760 Rico Rejoso: Alright.
427 01:01:02.270 ⇒ 01:01:06.949 Rico Rejoso: Thank you for sharing the screen. Hopefully everyone can…
428 01:01:07.200 ⇒ 01:01:14.460 Rico Rejoso: hear and understand me; I'm having some internet issues. So… let me know if you have any questions, or feel free to call me up, guys, okay?
429 01:01:15.380 ⇒ 01:01:24.150 Rico Rejoso: So, today is just a walkthrough of how our escalation model works, to set clear expectations for everyone. So, as you know from the previous retro that we had,
430 01:01:24.300 ⇒ 01:01:34.799 Rico Rejoso: we've been enforcing quite a few policies already and want to make sure that, you know, everyone's been compliant with them. So, we use one standard sequence across operations, people operations, and leadership.
431 01:01:34.800 ⇒ 01:01:47.900 Rico Rejoso: We don't improvise different consequences, and all of these are logged in one place, so later decisions, such as renewals, allocations, and maybe bonuses, are based on facts and not just, you know, memory.
432 01:01:48.070 ⇒ 01:01:51.739 Rico Rejoso: Now, let's discuss the three steps for escalations.
433 01:01:53.710 ⇒ 01:02:02.600 Rico Rejoso: First occurrence: the informal reminder. The first time for that policy with that person, we just send a direct message, restate the rule,
434 01:02:02.600 ⇒ 01:02:16.399 Rico Rejoso: point to the policy, and offer help if something's unclear, then log it in our tracker. Second occurrence: a written acknowledgement. If the same policy is missed again, we step up: we provide a written summary of what the policy requires and note that it wasn't met, and we collect
435 01:02:16.440 ⇒ 01:02:24.940 Rico Rejoso: the acknowledgement from the team member and a commitment to follow the policy. Then, again, same process: log it and attach the signed written acknowledgement as documentation.
436 01:02:25.330 ⇒ 01:02:28.129 Rico Rejoso: Now, for the third occurrence, the compliance review:
437 01:02:28.180 ⇒ 01:02:47.160 Rico Rejoso: the third time for that same policy, operations or people ops conducts a live conversation to discuss challenges, mitigations, and what we'll do differently. We make it clear that ongoing noncompliance can affect continued engagement with the company. Same as before, we log it in our tracker and keep internal documentation of what was discussed during that compliance review.
438 01:02:47.250 ⇒ 01:02:59.820 Rico Rejoso: Now, some policies are labeled critical. For those, the escalation model may start at the written acknowledgement or even the compliance review. The tracker and documentation rules still apply. Now,
439 01:03:00.430 ⇒ 01:03:13.909 Rico Rejoso: if you're confused, or if you're unsure whether a policy is critical, you can check the team-facing documents that we shared in our Brainforge team channel, or feel free to send us a direct message, and we'll help you out or explain things as well.
440 01:03:14.010 ⇒ 01:03:25.720 Rico Rejoso: Now, we're not stacking unrelated or different policies into one scorecard. Take note: the occurrence levels for first, second, and third are tracked separately for each policy and for each
441 01:03:25.840 ⇒ 01:03:26.790 Rico Rejoso: team member.
442 01:03:27.070 ⇒ 01:03:27.770 Rico Rejoso: Right.
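For readers who prefer code to prose, Rico's walkthrough boils down to an occurrence counter kept per (policy, member) pair, three escalating steps, and a severity floor for critical policies. Below is a minimal sketch of that bookkeeping in Python; the class, method, and policy names are hypothetical illustrations, not Brainforge's actual tracker:

```python
from dataclasses import dataclass, field
from enum import IntEnum


class Step(IntEnum):
    INFORMAL_REMINDER = 1   # 1st occurrence: DM, restate the rule, offer help
    WRITTEN_ACK = 2         # 2nd occurrence: written summary + signed acknowledgement
    COMPLIANCE_REVIEW = 3   # 3rd occurrence: live conversation with ops/people ops


@dataclass
class EscalationTracker:
    # Occurrences are counted separately per (policy, member) pair,
    # so unrelated policies never stack into one scorecard.
    counts: dict = field(default_factory=dict)
    # Hypothetical: critical policies get a higher starting step (severity floor).
    critical_start: dict = field(default_factory=dict)

    def record(self, policy: str, member: str) -> Step:
        """Log one missed-policy occurrence and return the step to apply."""
        key = (policy, member)
        self.counts[key] = self.counts.get(key, 0) + 1
        occurrence = min(self.counts[key], int(Step.COMPLIANCE_REVIEW))
        floor = self.critical_start.get(policy, Step.INFORMAL_REMINDER)
        # Apply whichever is higher: the occurrence-based step or the
        # critical-policy floor. In the real process, every step is also
        # logged in the tracker with its documentation attached.
        return max(Step(occurrence), floor)


# Example usage (policy and member names are made up):
tracker = EscalationTracker(critical_start={"client-data-handling": Step.WRITTEN_ACK})
tracker.record("clockify", "alex")               # Step.INFORMAL_REMINDER
tracker.record("clockify", "alex")               # Step.WRITTEN_ACK
tracker.record("client-data-handling", "sam")    # starts at Step.WRITTEN_ACK
```

The max() of the occurrence step and the per-policy floor mirrors the rule that critical policies may start at a later step, while ordinary ones walk through all three in order.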
443 01:03:28.230 ⇒ 01:03:31.919 Rico Rejoso: Questions so far on our escalation model?
444 01:03:34.500 ⇒ 01:03:38.620 Greg Stoutenburg: I… can you clarify what is… what is meant by escalation? Like, what’s…
445 01:03:38.750 ⇒ 01:03:42.250 Greg Stoutenburg: I'm not sure what's occurring that's being escalated.
446 01:03:42.950 ⇒ 01:03:56.200 Rico Rejoso: Yeah, this applies to all the policies that we enforce here at the company, so, like, out-of-office, Clockify, and stuff. We also record the number of occurrences where a
447 01:03:56.220 ⇒ 01:04:10.849 Rico Rejoso: team member didn't comply with a policy that we sent out, right? One thing is that it affects decisions for our executives and stakeholders; again, if a policy that was not followed affected some
448 01:04:11.200 ⇒ 01:04:27.239 Rico Rejoso: workflows, or, let's say, deliverables for our clients, we want to make sure that we record that too. Now, here we're just being transparent and explaining how the escalation model works, so everyone is aware. We have a team-facing doc that you guys might want to check out, and we'll send it out after this meeting.
449 01:04:32.750 ⇒ 01:04:47.420 Rico Rejoso: Hopefully I answered your question, Greg. Guys, yeah, if you have any questions, feel free to raise your hand now, or, if you want, you can also send me a direct message. We can hop on a call to further discuss any policy or the escalation model that we're presenting now.
450 01:04:51.380 ⇒ 01:04:57.030 Rico Rejoso: Okay, that's it for operations; maybe we can move to the next presentation.
451 01:04:57.990 ⇒ 01:04:59.179 Casie Aviles: Thank you, Rico.
452 01:04:59.970 ⇒ 01:05:00.470 Casie Aviles: Thanks, guys.
453 01:05:03.050 ⇒ 01:05:09.920 Kaela Gallagher: Okay, yeah, just a couple quick recruitment and people updates. Sorry, guys, I know we are well over time here, but…
454 01:05:10.010 ⇒ 01:05:33.730 Kaela Gallagher: Quick updates; I'll send more in the chat shortly. We are rolling out a couple of programs to really invest in the team. So, the first one is our Brainforge buddy program. Big shout-out to Amber for being our first buddy and partnering with Advait here. This is to support our new team members coming in, and after two successful buddy partnerships, we will pay out a bonus on that.
455 01:05:34.120 ⇒ 01:05:58.289 Kaela Gallagher: And then we are rolling out a certification bonus as well. This is only for pre-approved certifications; a good example would be the Omni cert, which I know Uttam has been encouraging. The process for this is in Notion, but I do want to highlight that not only are we paying for the time to complete a certification like this, but then
456 01:05:58.290 ⇒ 01:06:05.189 Kaela Gallagher: Upon successful completion, we’re also paying out the bonus, so this is super exciting, and I’ll… I’ll send the…
457 01:06:05.190 ⇒ 01:06:06.570 Kaela Gallagher: Link for this, too.
458 01:06:06.850 ⇒ 01:06:08.110 Kaela Gallagher: That’s it for me.
459 01:06:09.540 ⇒ 01:06:10.509 Casie Aviles: Thank you, Kaela.
460 01:06:11.730 ⇒ 01:06:25.200 Uttam Kumaran: Yeah, I just want to highlight, like, we're really serious about getting certified, and I think Demi gave us some great feedback on this. Like, we're big proponents of continuous learning, and, like, if I wasn't
461 01:06:25.410 ⇒ 01:06:39.670 Uttam Kumaran: doing a lot of meetings, I would go for some of these certs if I could. I have a lot in my backlog, so whether it's Snowflake, dbt, or Omni… I'm also gonna explore a bit around some of the Cloud Architect-related certs, like,
462 01:06:39.690 ⇒ 01:06:57.039 Uttam Kumaran: please, please, please see if you just want to take one this next month, and, like, if you need help carving out time to do it, or, like, you’re struggling, like, just reach out. We want you to do this, and I want to pay you to do this. I want to pay you twice to do this, so please, give it a go.
463 01:07:04.410 ⇒ 01:07:07.189 Casie Aviles: Okay, let's move on to GTM updates.
464 01:07:08.580 ⇒ 01:07:10.960 Luke Scorziell: Yeah, I can go on this.
465 01:07:11.330 ⇒ 01:07:20.230 Luke Scorziell: So, nothing too crazy. I guess we're shipping some content, but yeah, just, I think, reiterating the three different ways that we're looking at
466 01:07:20.600 ⇒ 01:07:26.629 Luke Scorziell: leads coming in: MQL generation, MQL-to-SQL conversion, and then partner-sourced leads.
467 01:07:27.200 ⇒ 01:07:34.309 Luke Scorziell: So we're already doing some partner office hours. Next week it's MotherDuck, kind of an experiment that we're running.
468 01:07:34.850 ⇒ 01:07:42.890 Luke Scorziell: We already sent out invites, they’re actually sending out invites from their sales team, and then Hannah and I are working on some diagrams, too, that we’re…
469 01:07:43.390 ⇒ 01:07:49.510 Luke Scorziell: mapping the go-to-market process. So, yeah, that's about it for this week.
470 01:07:51.500 ⇒ 01:07:52.460 Casie Aviles: Thank you, Luke.
471 01:07:55.760 ⇒ 01:08:02.019 Casie Aviles: Alright, yeah, so… Yeah, just shout-outs to everyone. I can start here,
472 01:08:02.430 ⇒ 01:08:05.320 Casie Aviles: Oh yeah, I'd like to shout out Pranav.
473 01:08:05.510 ⇒ 01:08:11.800 Casie Aviles: So he’s been… I’ve been working more closely with him on ABC, and he’s been… really helpful, and…
474 01:08:12.350 ⇒ 01:08:15.939 Casie Aviles: Making sure that, you know, the work we do there matters, so…
475 01:08:16.850 ⇒ 01:08:19.199 Casie Aviles: Yep, thanks, Pranav.
476 01:08:26.600 ⇒ 01:08:29.660 Casie Aviles: Does anyone else have any shoutouts to give?
477 01:08:30.470 ⇒ 01:08:37.760 Uttam Kumaran: Yeah, I was gonna shout out Advait. Great first week. We said hi on Wednesday, but, like, everybody's been giving positive feedback, so…
478 01:08:38.029 ⇒ 01:08:40.740 Uttam Kumaran: Pumped to have you, and… Excited.
479 01:08:43.220 ⇒ 01:08:55.469 Advait Nandakumar Menon: Yeah, I'm equally excited as well. I want to give a shout-out to Amber, Kaela, and really the whole team, who have been really helpful with the onboarding, and all the guides and the support
480 01:08:56.229 ⇒ 01:09:00.990 Advait Nandakumar Menon: documents you guys have set up here. So… yeah, slowly getting settled in, and…
481 01:09:01.399 ⇒ 01:09:05.650 Advait Nandakumar Menon: Yeah, hope I can contribute more in the upcoming weeks.
482 01:09:13.540 ⇒ 01:09:14.319 Casie Aviles: Okay.
483 01:09:18.460 ⇒ 01:09:22.520 Casie Aviles: Yeah, if that’s all, I guess…
484 01:09:22.670 ⇒ 01:09:24.749 Casie Aviles: Yeah, we can end here,
485 01:09:25.180 ⇒ 01:09:31.849 Casie Aviles: So, yeah, have a great… have a great weekend ahead. Guys, thank you everyone for joining the meeting.
486 01:09:33.529 ⇒ 01:09:34.309 Uttam Kumaran: Thank you.
487 01:09:34.310 ⇒ 01:09:35.259 Demilade Agboola: Thank you, have a good one.
488 01:09:35.260 ⇒ 01:09:37.790 Brylle Girang: Bye-bye. It’s a Friday. Bye, guys.
489 01:09:37.790 ⇒ 01:09:39.460 Kaela Gallagher: Thanks, bye.