Meeting Title: Brainforge Data Engineer Interview Date: 2026-04-14 Meeting participants: Kristoffer Kero, Awaish Kumar
WEBVTT
1 00:00:12.190 ⇒ 00:00:12.980 Awaish Kumar: Hi.
2 00:00:13.170 ⇒ 00:00:14.629 Kristoffer Kero: Hello, Awaish.
3 00:00:15.590 ⇒ 00:00:19.479 Awaish Kumar: Hello, Kristoffer. I’m pronouncing it… am I pronouncing it correctly?
4 00:00:19.480 ⇒ 00:00:21.440 Kristoffer Kero: Yeah, that is, pretty good.
5 00:00:21.890 ⇒ 00:00:22.560 Awaish Kumar: Okay.
6 00:00:22.670 ⇒ 00:00:24.990 Awaish Kumar: Yeah, thank you for your time today.
7 00:00:25.200 ⇒ 00:00:27.509 Awaish Kumar: For this interview,
8 00:00:27.910 ⇒ 00:00:41.590 Awaish Kumar: So in this interview, we are just going to talk a little bit more about yourself, your background, what you have been doing, and yeah, I’m here to answer if you have any questions regarding Brainforge or the role.
9 00:00:42.220 ⇒ 00:00:49.559 Kristoffer Kero: Okay, sounds good. I don’t know, did you watch the video that I sent in, or are you fairly clean?
10 00:00:49.810 ⇒ 00:00:53.090 Awaish Kumar: Let’s start with, yeah, brief introduction.
11 00:00:54.720 ⇒ 00:00:55.470 Kristoffer Kero: Yeah, sure.
12 00:00:55.700 ⇒ 00:00:56.929 Awaish Kumar: Yes, sir. Yep.
13 00:00:57.660 ⇒ 00:01:13.259 Kristoffer Kero: Yeah, so, like I was saying in the interviews, or the videos, that I sent in, I have a pretty solid software engineering background, coming from a pretty data-heavy, full-stack developer background.
14 00:01:13.400 ⇒ 00:01:17.989 Kristoffer Kero: Where I did a lot of, like, classic data work: cleaning, and
15 00:01:18.460 ⇒ 00:01:33.899 Kristoffer Kero: some data modeling, and normalizing tables and stuff, for an information site in Sweden. They had public vehicle information that they were presenting through a web API. So,
16 00:01:34.470 ⇒ 00:01:46.759 Kristoffer Kero: I moved on from there to work as a software engineer in test when Ericsson was rolling out their 5G network. That’s where I got my first contracting experience as well, so I was working for a, separate,
17 00:01:46.870 ⇒ 00:01:58.749 Kristoffer Kero: Actually, two separate companies that were contracting for them, so we had this client-customer relationship the whole time.
18 00:01:59.270 ⇒ 00:02:06.179 Kristoffer Kero: I’ve moved on since then to work as an individual contractor, so I worked a little bit for Ericsson.
19 00:02:06.180 ⇒ 00:02:23.309 Kristoffer Kero: In the same project there as well, and I’ve been working on data engineering projects as well since then. So I think my experience reflects really well the listed requirements that you have. I worked a lot in AWS,
20 00:02:23.610 ⇒ 00:02:28.610 Kristoffer Kero: have, yeah, like, the basics: SQL, Python, and Spark.
21 00:02:28.830 ⇒ 00:02:31.619 Kristoffer Kero: Worked a little bit with,
22 00:02:32.280 ⇒ 00:02:42.729 Kristoffer Kero: other orchestration tools, Step Functions in AWS; I’ve worked a little bit in Databricks and Snowflake, dbt, a little bit of Airflow, not too much.
23 00:02:43.300 ⇒ 00:02:48.649 Kristoffer Kero: Yeah, I have experience in designing data warehouses, from
24 00:02:48.870 ⇒ 00:02:57.809 Kristoffer Kero: like, from a problem statement all the way to, like, serving the data. So I have very good experience in
25 00:02:58.250 ⇒ 00:03:06.499 Kristoffer Kero: continuous delivery loops; that’s pretty much what I was doing the first time at Ericsson:
26 00:03:07.570 ⇒ 00:03:11.740 Kristoffer Kero: Having, like, control over coding standards and,
27 00:03:12.020 ⇒ 00:03:16.019 Kristoffer Kero: test requirements. So…
28 00:03:16.130 ⇒ 00:03:18.350 Kristoffer Kero: If you want me to…
29 00:03:18.810 ⇒ 00:03:21.959 Kristoffer Kero: Go deep into something, we can just, go through it and…
30 00:03:21.960 ⇒ 00:03:26.449 Awaish Kumar: Yeah, let’s talk about one of the data engineering projects that you did.
31 00:03:26.610 ⇒ 00:03:30.680 Awaish Kumar: Oh, that was… one that you consider, like, the most challenging project?
32 00:03:30.790 ⇒ 00:03:32.080 Awaish Kumar: In your career?
33 00:03:32.770 ⇒ 00:03:33.630 Awaish Kumar: And,
34 00:03:34.120 ⇒ 00:03:44.839 Awaish Kumar: And yeah, just… yeah, and just, like, brief me about your contribution and tools and technologies that you worked on during that project, and the final outcome.
35 00:03:45.010 ⇒ 00:03:52.280 Kristoffer Kero: Yeah, sure. So, I think the biggest one probably was, I was involved in a healthcare, project they had.
36 00:03:52.410 ⇒ 00:03:59.509 Kristoffer Kero: A lot of different facilities reporting in, and they were having very, I would say,
37 00:03:59.580 ⇒ 00:04:15.679 Kristoffer Kero: unorganized inputs, where they had CSV files that were reported, so they had a really hard time consolidating the information and actually doing cross-facility analysis on it. So I was actually part of
38 00:04:16.019 ⇒ 00:04:30.810 Kristoffer Kero: designing the data pipeline that would go into it, so I built a proof of concept for that, where we set up a medallion architecture with raw immutable data, so that we had a source of truth.
39 00:04:30.930 ⇒ 00:04:36.310 Kristoffer Kero: And also, after that, a curated layer where we had
40 00:04:36.920 ⇒ 00:04:41.390 Kristoffer Kero: a strict correctness boundary, so that we didn’t
41 00:04:41.860 ⇒ 00:04:57.490 Kristoffer Kero: let bad data propagate downstream. And then, from there on, we had a dimensional layer where we targeted the specific metrics that they were after, so we set up our,
42 00:05:00.920 ⇒ 00:05:09.240 Kristoffer Kero: We set it up so that we did our partitioning by the different facilities that they were after.
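The “strict correctness boundary” between the raw and curated layers can be sketched in a few lines. This is a hypothetical illustration of the idea only, not the project’s actual code, and every field name is invented:

```python
# Sketch of a correctness boundary: rows from the raw layer are validated
# before promotion to the curated layer; failing rows are quarantined with a
# reason rather than propagating downstream. All field names are invented.
import csv
import io

REQUIRED = ["facility_id", "state", "report_date", "metric", "value"]

def validate(row):
    """Return None if the row passes all checks, else a reason string."""
    for field in REQUIRED:
        if not row.get(field):
            return f"missing {field}"
    try:
        float(row["value"])  # values must be numeric past the boundary
    except ValueError:
        return "non-numeric value"
    return None

def promote(raw_csv_text):
    """Split raw CSV rows into curated rows and (row, reason) quarantine pairs."""
    curated, quarantined = [], []
    for row in csv.DictReader(io.StringIO(raw_csv_text)):
        reason = validate(row)
        if reason:
            quarantined.append((row, reason))
        else:
            curated.append(row)
    return curated, quarantined
```

In a real pipeline the quarantine list would land in its own prefix for manual review, so the immutable raw copy is never lost and the curated layer stays trustworthy.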
43 00:05:09.240 ⇒ 00:05:11.090 Awaish Kumar: You’ve got the source systems.
44 00:05:12.200 ⇒ 00:05:13.760 Awaish Kumar: What were the source systems?
45 00:05:14.810 ⇒ 00:05:16.260 Kristoffer Kero: So,
46 00:05:16.650 ⇒ 00:05:23.190 Kristoffer Kero: I only received, like, raw CSV files, so I wasn’t in contact with that. I got pretty much.
47 00:05:23.190 ⇒ 00:05:26.839 Awaish Kumar: All CSV files, and they were coming in where?
48 00:05:27.200 ⇒ 00:05:35.569 Kristoffer Kero: Yeah, so they were uploaded to a… we inserted them into S3, so we had S3 as our
49 00:05:35.840 ⇒ 00:05:37.030 Awaish Kumar: That’s true.
50 00:05:37.030 ⇒ 00:05:38.270 Kristoffer Kero: ingestion, yeah.
51 00:05:38.580 ⇒ 00:05:42.350 Awaish Kumar: Yeah, and then from S3, they got ingested to where?
52 00:05:42.710 ⇒ 00:05:58.600 Kristoffer Kero: Yeah, so, from there… so this was a proof of concept, so, to get, like, really quick iteration and feedback on it, I wanted to have it as open as possible, so I was actually using a Glue
53 00:05:58.920 ⇒ 00:06:17.260 Kristoffer Kero: data catalog and just using Athena as the top layer for that. So I was writing my transforms there, but it was orchestrated by Step Functions in AWS, so I could move it over to
54 00:06:17.260 ⇒ 00:06:25.889 Kristoffer Kero: do more Spark-based and Parquet-based processing when the scale was going up, so.
55 00:06:26.160 ⇒ 00:06:28.640 Awaish Kumar: So, that was not deployed in production?
56 00:06:29.420 ⇒ 00:06:41.550 Kristoffer Kero: It was… no, it was a proof of concept where we went through and proved that we could actually do the data cleaning, and the,
57 00:06:41.860 ⇒ 00:06:44.809 Kristoffer Kero: the correctness boundary that they were after, so that we could…
58 00:06:44.810 ⇒ 00:06:48.130 Awaish Kumar: You mentioned medallion architecture. What is that?
59 00:06:48.560 ⇒ 00:07:01.210 Kristoffer Kero: Yeah, so we had a raw immutable layer where we stored all the historic data that we ingested, and then we had a curated layer where we scrubbed personal data and,
60 00:07:01.330 ⇒ 00:07:15.290 Kristoffer Kero: we could open up, I guess, the access layers for that, so we didn’t have to have it as strict as the bronze. But that was pretty much the data that we relied on as the…
61 00:07:16.290 ⇒ 00:07:20.429 Kristoffer Kero: golden information for the dimensional layers that came later, that we aggregated.
62 00:07:20.430 ⇒ 00:07:23.429 Awaish Kumar: It was, like, you had raw data in S3,
63 00:07:23.590 ⇒ 00:07:26.490 Awaish Kumar: Yes. And then you had curated data in S3.
64 00:07:26.710 ⇒ 00:07:31.230 Kristoffer Kero: Yes, exactly. So it was, yep.
65 00:07:31.410 ⇒ 00:07:33.150 Awaish Kumar: Medallion…
66 00:07:34.600 ⇒ 00:07:37.489 Awaish Kumar: like, medallion architecture. It has 3 layers.
67 00:07:37.490 ⇒ 00:07:46.720 Kristoffer Kero: Yeah, so, and then we had the gold layer after that. So we had the raw layer, the curated layer, and then we had the gold layer, where we did the aggregations that were,
68 00:07:46.950 ⇒ 00:07:50.350 Awaish Kumar: Yeah, but the gold layer was… in S3?
69 00:07:50.880 ⇒ 00:08:00.049 Kristoffer Kero: Yeah, this was also in S3. I kept everything in the same place because of the proof of concept, so we could obviously switch those out as the requirements would change, so…
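The three layers being described, all kept in one S3 bucket for the proof of concept, amount to a key-prefix convention. A hypothetical sketch; the bucket layout, dataset names, and file name are invented, not taken from the project:

```python
# Hypothetical S3 key layout for a single-bucket medallion architecture:
# raw (bronze), curated (silver), and gold layers, with the gold layer
# partitioned Hive-style. All names here are invented for illustration.
def s3_key(layer, dataset, state=None, facility=None,
           filename="part-0000.parquet"):
    if layer == "raw":
        return f"raw/{dataset}/{filename}"
    if layer == "curated":
        return f"curated/{dataset}/{filename}"
    if layer == "gold":
        # gold is partitioned by state, then facility, as described later
        return f"gold/{dataset}/state={state}/facility={facility}/{filename}"
    raise ValueError(f"unknown layer: {layer}")
```

Keeping the layers as prefixes in one bucket makes a proof of concept cheap to stand up, while leaving the option to move each layer to its own bucket (or another store) as requirements change.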
70 00:08:00.750 ⇒ 00:08:05.450 Awaish Kumar: Okay, so how did you perform dimensional modeling in S3?
71 00:08:06.380 ⇒ 00:08:09.049 Kristoffer Kero: Yeah, so, since they wanted facility.
72 00:08:09.500 ⇒ 00:08:20.290 Kristoffer Kero: data, I had a facility directory where we partitioned by… this was actually… they had multiple states, so I partitioned it by state, and then
73 00:08:20.400 ⇒ 00:08:31.669 Kristoffer Kero: by facility. So I had a dimensional model for all the separate facilities, and that would contain all the data that they had there, so,
74 00:08:32.090 ⇒ 00:08:36.370 Kristoffer Kero: in a more production model, I would probably go to more of a.
75 00:08:36.570 ⇒ 00:08:38.660 Awaish Kumar: But that is just one dimension, right?
76 00:08:38.970 ⇒ 00:08:47.109 Kristoffer Kero: Yeah, that was, that was the main dimension that they were interested in. We had other dimensions as well, with, for data handling and stuff.
77 00:08:47.990 ⇒ 00:08:49.460 Awaish Kumar: What is dimensional modeling?
78 00:08:50.310 ⇒ 00:08:53.439 Kristoffer Kero: Dimensional modeling, so that’s,
79 00:08:53.980 ⇒ 00:09:09.529 Kristoffer Kero: collecting, I guess, in a more normalized sense. So you have the… I guess, from a software perspective, I would call it, like, a decorator layer, where we pull in the information that doesn’t change very often. And then we have the…
80 00:09:09.530 ⇒ 00:09:14.329 Kristoffer Kero: events, that are more in a fact, table, so we had a fact table with…
81 00:09:14.580 ⇒ 00:09:30.779 Kristoffer Kero: both quarterly reports and also daily… hourly, or not hourly, daily hours that were reported in. So we had two fact tables, and then we had a couple of dimensional tables.
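A toy version of the dimensional model he describes: a facility dimension holding the slowly changing attributes, and a daily-hours fact table holding the events. All names and numbers here are invented for illustration, not from the project:

```python
# Toy star schema: one dimension table (facilities) and one fact table
# (daily hours). The dimension holds attributes that rarely change; the
# fact table holds the reported events keyed to the dimension.
dim_facility = {
    "F1": {"state": "TX", "name": "Facility One"},
    "F2": {"state": "CA", "name": "Facility Two"},
}

fact_daily_hours = [
    {"facility_id": "F1", "date": "2024-01-01", "hours": 120},
    {"facility_id": "F1", "date": "2024-01-02", "hours": 110},
    {"facility_id": "F2", "date": "2024-01-01", "hours": 90},
]

def hours_by_state(facts, dim):
    """Roll a fact table up to a dimension attribute (here: state)."""
    totals = {}
    for row in facts:
        state = dim[row["facility_id"]]["state"]
        totals[state] = totals.get(state, 0) + row["hours"]
    return totals
```

The same split is what makes cross-facility analysis cheap: the fact table only carries keys and measures, and any grouping attribute (state, facility name) comes from the dimension.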
82 00:09:31.900 ⇒ 00:09:35.530 Awaish Kumar: Okay, you also mentioned that you have worked with Airflow?
83 00:09:36.420 ⇒ 00:09:46.650 Kristoffer Kero: Yeah, a little bit, not much. I have more experience in other orchestration tools, but I’m aware of the DAG models and what usually goes into Airflow, so…
84 00:09:46.840 ⇒ 00:09:51.730 Awaish Kumar: What are the other tools, I guess? Step Functions, or any other tool as well?
85 00:09:51.890 ⇒ 00:10:03.440 Kristoffer Kero: Yeah, step functions, and then I’ve… I guess, sort of the same in dbt, where I’ve had different models, not exactly the same. And then I’ve used Databricks a little bit as well.
86 00:10:03.440 ⇒ 00:10:05.159 Awaish Kumar: That is not an orchestration tool.
87 00:10:05.620 ⇒ 00:10:06.360 Kristoffer Kero: Excuse me?
88 00:10:06.870 ⇒ 00:10:09.570 Awaish Kumar: DBT is not an orchestration tool.
89 00:10:09.580 ⇒ 00:10:18.629 Kristoffer Kero: No, no, I know. I’m just saying that I use that instead of a pure orchestration tool for getting the DAGs and following the lineage.
90 00:10:19.010 ⇒ 00:10:19.760 Kristoffer Kero: In the same way.
91 00:10:21.260 ⇒ 00:10:25.860 Awaish Kumar: like, how did you… how did you use dbt for… orchestration?
92 00:10:27.170 ⇒ 00:10:44.800 Kristoffer Kero: No, I’m just saying, I used that as a… when I was using Snowflake. So I had one project where we used dbt for the transformations, and producing the DAG, so that we have lineage for the data that we
93 00:10:44.950 ⇒ 00:10:45.970 Kristoffer Kero: Processed.
94 00:10:45.970 ⇒ 00:10:50.230 Awaish Kumar: So it’s only Step Functions that you are familiar with?
95 00:10:50.230 ⇒ 00:10:57.310 Kristoffer Kero: I would say Step Functions mainly; I’ve dabbled a little bit with Databricks and tried a little bit of Airflow, so yeah.
96 00:10:59.050 ⇒ 00:11:02.750 Awaish Kumar: Okay, but, like… Mmm.
97 00:11:05.560 ⇒ 00:11:11.829 Awaish Kumar: But Step Functions are… like, serverless deployments, right?
98 00:11:12.420 ⇒ 00:11:29.679 Kristoffer Kero: Yeah, so, it’s on AWS; you have a virtual machine, I guess, that runs the different steps, and you can configure there if you want to do retries, and if you have backoff on the retries, and how to handle errors, and…
99 00:11:30.790 ⇒ 00:11:43.179 Awaish Kumar: So, for example, in Airflow, you have a user interface, where you can actually see how your tasks are running, and if something fails, whatever, and then it has logs and everything.
100 00:11:43.180 ⇒ 00:11:43.800 Kristoffer Kero: Yep.
101 00:11:43.800 ⇒ 00:11:46.499 Awaish Kumar: How do you get a similar feature in Step Functions?
102 00:11:47.020 ⇒ 00:11:50.219 Awaish Kumar: If I have to… if I have to run 100 jobs.
103 00:11:50.530 ⇒ 00:11:51.080 Kristoffer Kero: Yep.
104 00:11:51.910 ⇒ 00:11:54.480 Awaish Kumar: So what… what will be the ideal…
105 00:11:55.890 ⇒ 00:12:02.509 Awaish Kumar: Like, the… what will be the best practices for deploying 100 jobs as Step Functions?
106 00:12:03.340 ⇒ 00:12:06.349 Kristoffer Kero: That’s a very good question.
107 00:12:06.500 ⇒ 00:12:20.550 Kristoffer Kero: I think, probably, if I had more scale like that, I probably would go over to more of an Airflow solution. I was working mainly on smaller-scale projects, but,
108 00:12:20.850 ⇒ 00:12:29.159 Kristoffer Kero: I would have, like, a dashboard that would get the insights from reporting,
109 00:12:29.550 ⇒ 00:12:37.289 Kristoffer Kero: so, like, metadata metrics for the different runs. I would have a dashboard that would keep track of that.
110 00:12:37.930 ⇒ 00:12:38.660 Awaish Kumar: Okay.
111 00:12:39.590 ⇒ 00:12:43.489 Awaish Kumar: How does your development flow work with Step Functions?
112 00:12:45.560 ⇒ 00:13:04.430 Kristoffer Kero: Yeah, so, you set up different states, and then you have an input to the state, and then you have an output, and then you have success or failure, depending on what happens. So you can have branches, you can have parallel executions, and,
113 00:13:04.970 ⇒ 00:13:19.849 Kristoffer Kero: Yeah, so it would be, depending on how the execution went, we would go to the next step. So you define all the states that you can go through, and then all the results that are possible, and you have failure, and…
114 00:13:20.230 ⇒ 00:13:21.369 Kristoffer Kero: Acceptable, so…
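The states, inputs/outputs, retries with backoff, and failure branches being described are what an AWS Step Functions definition (Amazon States Language, a JSON document) expresses. A minimal hypothetical sketch, built here as a plain Python dict; the state names and Lambda ARN are made up:

```python
# Minimal Amazon States Language sketch: one task state with retry/backoff
# and a catch branch, then a terminal success state. All names and the
# resource ARN are invented; a real definition would reference real Lambdas.
import json

definition = {
    "StartAt": "IngestBatch",
    "States": {
        "IngestBatch": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:eu-north-1:123456789012:function:ingest",
            "Retry": [{
                "ErrorEquals": ["States.TaskFailed"],
                "IntervalSeconds": 10,
                "MaxAttempts": 3,
                "BackoffRate": 2.0,  # exponential backoff between retries
            }],
            # unrecoverable errors branch to a failure state
            "Catch": [{"ErrorEquals": ["States.ALL"], "Next": "NotifyFailure"}],
            "Next": "Done",
        },
        "NotifyFailure": {"Type": "Fail", "Error": "IngestFailed"},
        "Done": {"Type": "Succeed"},
    },
}

asl_json = json.dumps(definition, indent=2)  # what you would deploy to AWS
```

On the monitoring question raised above: the Step Functions console does show per-execution state graphs and event history, but at the scale of many state machines, teams often add their own reporting dashboard on top, as he suggests.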
115 00:13:21.540 ⇒ 00:13:27.889 Awaish Kumar: Are you fe- are you familiar with the… Any ingestion tool?
116 00:13:29.330 ⇒ 00:13:30.520 Kristoffer Kero: ingestion tools.
117 00:13:30.900 ⇒ 00:13:31.550 Awaish Kumar: Yes.
118 00:13:32.070 ⇒ 00:13:37.050 Kristoffer Kero: Yeah, so I’ve used different ingestion
119 00:13:37.510 ⇒ 00:13:41.289 Kristoffer Kero: features, depending on if we had
120 00:13:41.990 ⇒ 00:13:48.099 Kristoffer Kero: streaming data; obviously, the healthcare project that I talked about was more batch.
121 00:13:48.100 ⇒ 00:13:51.499 Awaish Kumar: Okay, let’s talk about, talk about streaming. So…
122 00:13:51.960 ⇒ 00:13:54.640 Awaish Kumar: You mentioned that you have worked with Snowflake.
123 00:13:55.190 ⇒ 00:13:55.720 Kristoffer Kero: Yep.
124 00:13:56.570 ⇒ 00:14:01.249 Awaish Kumar: So, how would you, stream data from S3 into Snowflake?
125 00:14:01.970 ⇒ 00:14:15.699 Kristoffer Kero: So I’d have to set up an external stage in Snowflake that would keep track of that, and then I would have to have a Snowpipe that would take care of the streaming data. It’s not something that I’ve worked a lot with in Snowflake, but…
126 00:14:15.860 ⇒ 00:14:18.060 Kristoffer Kero: Yeah, I, I know, sort of.
127 00:14:18.060 ⇒ 00:14:19.790 Awaish Kumar: You’ve used Snowpipe before?
128 00:14:20.460 ⇒ 00:14:28.480 Kristoffer Kero: I’ve… I’ve used it for, like, a learning type of project, just to see how I connect the different pieces. I can’t say that I’ve even worked in depth with it.
129 00:14:28.480 ⇒ 00:14:29.120 Awaish Kumar: Totally.
130 00:14:29.580 ⇒ 00:14:34.059 Awaish Kumar: Okay, let’s… walk me through exact… what exact steps you
131 00:14:34.590 ⇒ 00:14:38.750 Awaish Kumar: did, to actually… build a complete Snowpipe.
132 00:14:40.790 ⇒ 00:14:46.989 Kristoffer Kero: Yeah, so I would have… I would set up an S3 bucket, or… or have that before.
133 00:14:47.130 ⇒ 00:14:47.820 Kristoffer Kero: Yep.
134 00:14:48.040 ⇒ 00:14:51.889 Awaish Kumar: Yeah, let’s say there is a bucket in S3 that has a name test.
135 00:14:52.070 ⇒ 00:14:57.339 Awaish Kumar: Inside of it, we have a folder called A, and it has all Parquet files.
136 00:14:57.710 ⇒ 00:15:00.680 Awaish Kumar: So now, just let me know, what would you do
137 00:15:01.460 ⇒ 00:15:06.379 Awaish Kumar: after that, to set up that everything, all files in A, just
138 00:15:06.800 ⇒ 00:15:14.960 Awaish Kumar: gets, streamed into Snowflake, and also, whenever there’s a new file, they also just get loaded.
139 00:15:15.680 ⇒ 00:15:31.620 Kristoffer Kero: Yeah, so, like I said, I would need an external stage set up in Snowflake that would reflect the S3 bucket that I have, and then I’d need to set up the proper
140 00:15:32.220 ⇒ 00:15:41.760 Kristoffer Kero: IAM roles for that as well, in AWS. And,
141 00:15:41.960 ⇒ 00:15:46.089 Kristoffer Kero: then I would have to set up a… a target
142 00:15:46.520 ⇒ 00:15:54.740 Kristoffer Kero: target database schema in Snowflake that would receive the
143 00:15:54.880 ⇒ 00:16:07.570 Kristoffer Kero: transformed data, and then I would set up the transform after that, and I would use merge semantics, so that I can
144 00:16:07.970 ⇒ 00:16:26.019 Kristoffer Kero: rewrite data. When it comes specifically to streaming, I haven’t used it a lot. I know that there are differences depending on if you have change data capture or if you have batch inputs, so I haven’t worked specifically with Snowpipe and those,
145 00:16:26.370 ⇒ 00:16:27.110 Kristoffer Kero: So…
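The pieces he lists (external stage over the bucket, IAM trust on the AWS side, target table, auto-ingesting pipe) map roughly onto Snowflake DDL like the following. This is a generic sketch using the `s3://test` bucket and folder `A` from the question; the storage integration, database, schema, table, and pipe names are all invented, and the IAM role/event-notification setup on the AWS side is omitted:

```python
# Hedged sketch of the Snowflake side of an S3 -> Snowpipe setup for Parquet
# files in s3://test/A/. Object names (my_s3_integration, my_stage, my_pipe,
# my_db.my_schema.target_table) are invented for illustration.
stage_ddl = """
CREATE OR REPLACE STAGE my_stage
  URL = 's3://test/A/'
  STORAGE_INTEGRATION = my_s3_integration  -- carries the IAM trust to AWS
  FILE_FORMAT = (TYPE = PARQUET);
"""

pipe_ddl = """
CREATE OR REPLACE PIPE my_pipe
  AUTO_INGEST = TRUE  -- new files trigger loads via S3 event notifications
AS
  COPY INTO my_db.my_schema.target_table
  FROM @my_stage
  FILE_FORMAT = (TYPE = PARQUET)
  MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
"""
```

With `AUTO_INGEST = TRUE`, S3 event notifications (wired to the pipe’s SQS queue on the AWS side) make Snowpipe load both the backfill and every new file dropped into `A`, which is what the question asked for.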
146 00:16:28.690 ⇒ 00:16:29.380 Awaish Kumar: Okay.
147 00:16:31.180 ⇒ 00:16:37.780 Awaish Kumar: So… Yeah, I think… Thanks.
148 00:16:38.940 ⇒ 00:16:44.700 Awaish Kumar: Okay, last… my last question is more about… Sure.
149 00:16:47.310 ⇒ 00:16:51.259 Awaish Kumar: How would you communicate with your stakeholders if there are
150 00:16:51.680 ⇒ 00:16:55.220 Awaish Kumar: If there is any disagreement over…
151 00:16:56.390 ⇒ 00:17:01.990 Awaish Kumar: Not… okay, let’s say, like, you created a data pipeline, And…
152 00:17:03.410 ⇒ 00:17:07.069 Awaish Kumar: And that has, obviously, some latency.
153 00:17:07.250 ⇒ 00:17:08.000 Awaish Kumar: Wow.
154 00:17:08.160 ⇒ 00:17:10.419 Awaish Kumar: Because of the infrastructure you have.
155 00:17:11.250 ⇒ 00:17:14.889 Awaish Kumar: So, and… and stakeholder might disagree.
156 00:17:15.000 ⇒ 00:17:24.639 Awaish Kumar: with the data freshness or things like that, he might be expecting real-time data and things like that. So how would you communicate your…
157 00:17:26.290 ⇒ 00:17:28.580 Awaish Kumar: Communicate and defend yourself.
158 00:17:28.980 ⇒ 00:17:30.649 Awaish Kumar: In that situation.
159 00:17:31.460 ⇒ 00:17:46.379 Kristoffer Kero: Yeah, I mean, obviously, I would try to make a data-informed decision when I come to a disagreement, so that I actually have backing for why I might have a disagreeing conclusion.
160 00:17:46.690 ⇒ 00:17:52.970 Kristoffer Kero: I would first specify, like, what the actual disagreement is. If it’s ingestion
161 00:17:52.970 ⇒ 00:18:08.449 Kristoffer Kero: latency, then it would have to be presented as: what are the trade-offs? So, is the stakeholder willing to take on the extra cost? Or have they thought about that? So, I would come in with
162 00:18:08.510 ⇒ 00:18:25.520 Kristoffer Kero: different scenarios and what that would actually mean for the business. So, that would go into a lot of, like, what the trade-offs are, and obviously the stakeholder is the one who’s going to take the final decision. It’s just my job to let them know this is,
163 00:18:25.550 ⇒ 00:18:38.000 Kristoffer Kero: the informed decision so that they understand the technical requirements behind that. So, it would be an ongoing conversation to figure out what is the best for the business, and not just,
164 00:18:38.570 ⇒ 00:18:41.250 Kristoffer Kero: Trying to win an argument, so…
165 00:18:44.230 ⇒ 00:18:45.000 Awaish Kumar: Okay.
166 00:18:47.860 ⇒ 00:18:53.610 Awaish Kumar: Okay, and how… Okay, yeah, one more question. It’s more about…
167 00:18:54.090 ⇒ 00:18:57.630 Awaish Kumar: What are you looking for in the… new role?
168 00:18:58.050 ⇒ 00:19:02.520 Kristoffer Kero: Yeah, so I’m mainly looking into getting more,
169 00:19:03.410 ⇒ 00:19:14.939 Kristoffer Kero: I guess working with the same stuff that I have been. I want to design data pipelines for correctness and trust, and so that’s something that I want to get into.
170 00:19:15.090 ⇒ 00:19:31.899 Kristoffer Kero: just getting more miles, I guess, in doing it. I’ve been working as an individual contractor, and I guess I haven’t been… like, I enjoy talking to clients, figuring out their problems, and trying to come up with the best solutions, and conversations like that. I haven’t…
171 00:19:31.910 ⇒ 00:19:45.480 Kristoffer Kero: been as comfortable, maybe, with the sales and marketing type of thing, so that’s… I think that would be specifically what is attracting me to a position like this, to get more solid with the actual,
172 00:19:45.690 ⇒ 00:19:50.400 Kristoffer Kero: tech and services that I’m working with, so, yeah.
173 00:19:51.150 ⇒ 00:19:51.840 Awaish Kumar: Okay.
174 00:19:52.790 ⇒ 00:19:58.340 Awaish Kumar: Okay, I think I’m good with my questions. I will leave some time for you to ask any questions.
175 00:19:59.000 ⇒ 00:20:06.690 Kristoffer Kero: Yeah, I was just wondering, like, is it one project that you’re hiring for, or do you have, like, multiple ongoing, projects for this position?
176 00:20:09.150 ⇒ 00:20:15.860 Awaish Kumar: So, this is a position that… where you will be working directly with Brainforge,
177 00:20:16.430 ⇒ 00:20:22.099 Awaish Kumar: Obviously, like, it’s not like you will be kind of directly connected somehow
178 00:20:22.600 ⇒ 00:20:25.149 Awaish Kumar: with the client, and… Okay.
179 00:20:25.790 ⇒ 00:20:31.240 Awaish Kumar: Yeah, your contract would say that you are… for this project, obviously, you will be hired
180 00:20:31.360 ⇒ 00:20:33.040 Awaish Kumar: to work with Brainforge.
181 00:20:33.160 ⇒ 00:20:37.039 Awaish Kumar: And we have a lot of clients right now, and we need people.
182 00:20:37.200 ⇒ 00:20:44.660 Awaish Kumar: some… That means there’s a lot of work, so yeah, it’s not just for one project.
183 00:20:45.570 ⇒ 00:20:53.050 Awaish Kumar: Since we are, like, growing rapidly, we are in need of
184 00:20:53.180 ⇒ 00:20:58.899 Awaish Kumar: more resources, we are hiring in almost every department at Brainforge.
185 00:20:59.660 ⇒ 00:21:02.410 Awaish Kumar: Data engineering is one of them, and
186 00:21:02.850 ⇒ 00:21:06.660 Awaish Kumar: Certainly, we are growing in more clients and more projects.
187 00:21:07.750 ⇒ 00:21:16.280 Kristoffer Kero: Yeah, so, you have multiple projects then, so they’re varying maturity, or do you have mostly greenfield or brownfield, projects, or…
188 00:21:19.220 ⇒ 00:21:23.690 Awaish Kumar: Yeah, like, the projects are of different… Maturity, obviously.
189 00:21:23.800 ⇒ 00:21:29.249 Awaish Kumar: Yep. Levels. It depends on the client, like, you might be working on one client that’s really matured.
190 00:21:29.530 ⇒ 00:21:30.460 Awaish Kumar: Come on.
191 00:21:31.980 ⇒ 00:21:34.730 Awaish Kumar: And then a new client comes in
192 00:21:35.150 ⇒ 00:21:39.810 Awaish Kumar: that has no data infrastructure, and then you start working with them.
193 00:21:40.630 ⇒ 00:21:46.639 Awaish Kumar: And basically help them build the data foundations, Yup. And,
194 00:21:47.070 ⇒ 00:21:50.029 Awaish Kumar: And build the engagement with the client, and…
195 00:21:50.230 ⇒ 00:21:55.630 Awaish Kumar: Until that client reaches the point where it’s kind of… we can call it a…
196 00:21:56.050 ⇒ 00:22:10.399 Awaish Kumar: matured client and a matured project, where there’s not a lot of work, but then it’s just always… it’s always like that, right? We have clients where we build the data foundation, we build the data
197 00:22:10.510 ⇒ 00:22:12.300 Awaish Kumar: analytics…
198 00:22:12.750 ⇒ 00:22:21.179 Awaish Kumar: engineering models that are being used; then it’s the job of data analysts. They spend a lot of time to keep that engagement,
199 00:22:21.340 ⇒ 00:22:40.469 Awaish Kumar: share new insights, and on our end, we are there for, kind of, maintenance. But then, basically, your time is now being less spent on client A, then you might be, yeah, you might be assigned to another client as well, in parallel, so you can
200 00:22:40.590 ⇒ 00:22:42.559 Awaish Kumar: Basically, you have to work on both of them.
201 00:22:43.150 ⇒ 00:22:57.969 Kristoffer Kero: Okay, gotcha. So, just to make clear: Brainforge is pretty much selling the service of building up the data pipeline. It’s not like you’re selling a development team or individual contractors, right?
202 00:22:57.970 ⇒ 00:23:00.610 Awaish Kumar: Oh, yeah, we’re not selling contractors and…
203 00:23:01.440 ⇒ 00:23:06.410 Awaish Kumar: individuals; it is more like… we are selling services. Data engineering,
204 00:23:06.610 ⇒ 00:23:08.829 Awaish Kumar: And inside of data engineering, we might have
205 00:23:09.170 ⇒ 00:23:15.640 Awaish Kumar: 3, 4, 5 different kinds of services that we sell. Similarly, we have analytics engineering.
206 00:23:15.800 ⇒ 00:23:18.589 Awaish Kumar: There, you can, have, like, okay.
207 00:23:18.690 ⇒ 00:23:27.180 Awaish Kumar: building the data warehouse models, and building a dashboard, and things like that. Okay. Then we also have an AI service, and then our strategy
208 00:23:27.820 ⇒ 00:23:33.919 Awaish Kumar: service, where the strategy team has its own services for providing
209 00:23:34.560 ⇒ 00:23:37.869 Awaish Kumar: Like, they have some set of services that they provide.
210 00:23:38.160 ⇒ 00:23:39.890 Awaish Kumar: Yep. In this enterprise, yep.
211 00:23:40.290 ⇒ 00:23:55.649 Kristoffer Kero: Okay, cool. Obviously, like, you seem to have a growth phase now, but, what about, like, downtime between projects? It’s obviously not a problem now, but, do you have, like, continuous learning, or is it more like, if you have work, you have work?
212 00:23:57.090 ⇒ 00:24:00.999 Awaish Kumar: Yeah, I’ve been here for a year, and I have never seen a…
213 00:24:01.190 ⇒ 00:24:04.880 Awaish Kumar: time where I didn’t have to work. So…
214 00:24:05.540 ⇒ 00:24:09.040 Awaish Kumar: But yes, there… if there is a…
215 00:24:10.120 ⇒ 00:24:22.900 Awaish Kumar: as I… as I mentioned, as the clients mature, you might have some time, so it goes either to a new client, or it will go to helping internal teams.
216 00:24:23.110 ⇒ 00:24:27.090 Awaish Kumar: Like, for example, you might,
217 00:24:27.200 ⇒ 00:24:30.600 Awaish Kumar: help create playbooks for it, right? Like…
218 00:24:30.750 ⇒ 00:24:32.389 Awaish Kumar: How to create a Snowpipe.
219 00:24:32.560 ⇒ 00:24:38.239 Awaish Kumar: Yep. And then standardizing those, like, codifying those.
220 00:24:38.520 ⇒ 00:24:43.880 Awaish Kumar: things, building assets for the company, so the time will go there.
221 00:24:44.180 ⇒ 00:24:47.149 Kristoffer Kero: Okay, yeah, that’s awesome. So…
222 00:24:47.390 ⇒ 00:24:52.959 Kristoffer Kero: I don’t know that I could see it specifically mentioned, but I’m assuming it’s a remote position.
223 00:24:53.590 ⇒ 00:24:54.220 Awaish Kumar: Yes.
224 00:24:54.400 ⇒ 00:24:58.770 Kristoffer Kero: So, how… How much communication is there between,
225 00:24:58.960 ⇒ 00:25:07.020 Kristoffer Kero: like, do you work in Teams and have communication within the teams, or do you have, I mean, Slack or Teams, or, or.
226 00:25:09.310 ⇒ 00:25:22.519 Awaish Kumar: I would say, like, at least there will be a close collaboration between data engineers, analytics engineers, and the strategy team. It’s only the AI team that might not be
227 00:25:23.430 ⇒ 00:25:26.180 Awaish Kumar: We might not be interacting with them,
228 00:25:26.590 ⇒ 00:25:33.489 Awaish Kumar: as… as much as we will be with other teams, but for these three teams, it is close collaboration. Although, as I said,
229 00:25:33.600 ⇒ 00:25:42.670 Awaish Kumar: These are different teams with a different set of services, but these are… these are all, like, interrelated, so based on the data we just…
230 00:25:42.820 ⇒ 00:25:46.159 Awaish Kumar: An analytics engineer has to build a model, and
231 00:25:46.270 ⇒ 00:25:49.369 Awaish Kumar: someone from the strategy team has to create a dashboard.
232 00:25:49.640 ⇒ 00:25:53.499 Awaish Kumar: It all depends, like, it’s a kind of a lineage.
233 00:25:53.960 ⇒ 00:25:55.930 Awaish Kumar: So…
234 00:25:57.180 ⇒ 00:26:01.950 Kristoffer Kero: Yeah, so you’re selling the whole package, pretty much, to different clients, so…
235 00:26:03.110 ⇒ 00:26:08.469 Kristoffer Kero: Okay, cool. I think that was, all the questions I really had.
236 00:26:11.930 ⇒ 00:26:26.350 Kristoffer Kero: I guess it would just be, like, hiring… like, starting points. I don’t know if you had anything on compensation either; I don’t know, more like asking where in the process those things are discussed.
237 00:26:26.600 ⇒ 00:26:28.200 Kristoffer Kero: If I move forward.
238 00:26:28.200 ⇒ 00:26:30.849 Awaish Kumar: Those are things that you can discuss with Kayla,
239 00:26:30.990 ⇒ 00:26:33.280 Awaish Kumar: and then ask her
240 00:26:33.480 ⇒ 00:26:36.060 Awaish Kumar: regarding all these questions, and she will let you know.
241 00:26:36.240 ⇒ 00:26:37.210 Kristoffer Kero: Okay, yeah.
242 00:26:37.500 ⇒ 00:26:38.470 Kristoffer Kero: Sounds good.
243 00:26:39.300 ⇒ 00:26:42.789 Awaish Kumar: Okay, I think… okay, yeah, we’re good.
244 00:26:43.190 ⇒ 00:26:50.180 Awaish Kumar: Thank you for your time, and then, as I mentioned, Kayla will get back to you with the feedback and the next steps.
245 00:26:50.400 ⇒ 00:26:52.010 Kristoffer Kero: Yep, sounds good. Thank you for your time.
246 00:26:52.460 ⇒ 00:26:53.840 Awaish Kumar: Thank you. Bye.