Meeting Title: Uttam_Anton Date: 2025-02-05 Meeting participants: Uttam Kumaran, Anton Gefvert
WEBVTT
1 00:00:28.346 ⇒ 00:00:30.660 Anton Gefvert: That’s enough.
2 00:00:38.530 ⇒ 00:00:42.210 Uttam Kumaran: Hey, Anton? Sorry. Apologies.
3 00:00:42.360 ⇒ 00:00:43.177 Anton Gefvert: Yeah. No worries.
4 00:00:43.860 ⇒ 00:00:45.099 Uttam Kumaran: Nice to meet you. How are you?
5 00:00:45.100 ⇒ 00:00:48.839 Anton Gefvert: I think. I’m pretty good. I’m a bit tired other than that.
6 00:00:48.840 ⇒ 00:00:49.450 Uttam Kumaran: Me, too.
7 00:00:49.720 ⇒ 00:00:54.139 Uttam Kumaran: Yeah, I was up late doing work. And still this query I’m working on is not done.
8 00:00:54.688 ⇒ 00:00:57.641 Uttam Kumaran: I don’t know what to do. There’s some duplication.
9 00:00:58.690 ⇒ 00:01:00.415 Uttam Kumaran: I don’t know.
10 00:01:01.440 ⇒ 00:01:05.560 Uttam Kumaran: but yeah, I think I’m really happy. Mike put us in touch, you know. Me and Mike have been.
11 00:01:05.810 ⇒ 00:01:13.408 Uttam Kumaran: We worked together way back when, and we've always kept in touch. I really respect a lot of the work that he's done. And
12 00:01:13.690 ⇒ 00:01:14.130 Anton Gefvert: Yeah.
13 00:01:14.130 ⇒ 00:01:21.229 Uttam Kumaran: Always trying to find a way to, like, work together. I recently started this company. I guess not recently, maybe; like, now it's been 2 years.
14 00:01:21.530 ⇒ 00:01:22.035 Anton Gefvert: Yeah.
15 00:01:22.540 ⇒ 00:01:24.400 Uttam Kumaran: And a half. But
16 00:01:25.021 ⇒ 00:01:39.019 Uttam Kumaran: yeah, we're like a data engineering and data analytics consultancy. In addition, over the last 6 months, we've also been doing AI deployment. So LLMs, agents, building RAG systems, things like that.
17 00:01:39.630 ⇒ 00:01:44.190 Uttam Kumaran: And yeah, we're about like 12 or 13 people now.
18 00:01:45.680 ⇒ 00:01:53.080 Uttam Kumaran: It's just kind of crazy. And so we're sort of growing. So with everybody I talk to, I'm sort of like,
19 00:01:53.270 ⇒ 00:01:55.399 Uttam Kumaran: have you talked to anyone smart recently?
20 00:01:56.340 ⇒ 00:02:05.097 Uttam Kumaran: So let me know. I don’t care what they do. But of course, like, you know, also very curious to learn about your background and hear what you’re doing. And
21 00:02:05.650 ⇒ 00:02:08.279 Uttam Kumaran: yeah, just have a casual conversation. So.
22 00:02:08.280 ⇒ 00:02:08.800 Anton Gefvert: Yeah.
23 00:02:08.999 ⇒ 00:02:11.189 Uttam Kumaran: I'm here in Austin, by the way, in the States.
24 00:02:11.190 ⇒ 00:02:15.120 Anton Gefvert: Yeah, yeah, yeah. I’m in Sweden.
25 00:02:15.370 ⇒ 00:02:17.649 Anton Gefvert: Darkest country. Weather's shit right now.
26 00:02:20.193 ⇒ 00:02:22.390 Anton Gefvert: Yeah, I guess you can just like
27 00:02:22.570 ⇒ 00:02:25.349 Anton Gefvert: go through some background. I guess I kinda
28 00:02:25.540 ⇒ 00:02:31.789 Anton Gefvert: like most, at least at my age, I guess most software engineers were like: I like games, I wanna make games.
29 00:02:32.267 ⇒ 00:02:38.889 Anton Gefvert: Which kind of led me into the path of programming. I started programming in Garry's Mod, if you know what that is.
30 00:02:39.340 ⇒ 00:02:44.439 Anton Gefvert: It has, like, Wiremod, which has, like, a programming module in it. So I did
31 00:02:44.440 ⇒ 00:02:45.030 Uttam Kumaran: Yeah.
32 00:02:45.540 ⇒ 00:02:48.788 Anton Gefvert: stuff there. It was very fun.
33 00:02:49.500 ⇒ 00:03:03.450 Anton Gefvert: And then, like, at university, I was like: do I do, like, computer science engineering, or do I do, like, specifically game programming? I'm really glad I chose, like, general software engineering. The one thing I realized is I don't like game programming.
34 00:03:03.450 ⇒ 00:03:04.680 Uttam Kumaran: Yeah, okay.
35 00:03:05.900 ⇒ 00:03:11.249 Anton Gefvert: I realized I like, like, most programming, except for games.
36 00:03:11.250 ⇒ 00:03:11.785 Uttam Kumaran: Okay.
37 00:03:13.670 ⇒ 00:03:26.250 Anton Gefvert: yeah, we had like a course or 2 courses in, like, machine learning, AI. I guess it wasn't a lot of machine learning, because they were, like, the starting courses, but that got me hooked, and I started doing that for my master's.
38 00:03:26.510 ⇒ 00:03:27.180 Uttam Kumaran: Nice.
39 00:03:27.953 ⇒ 00:03:36.606 Anton Gefvert: Then a bunch of shit happened. I I took on too many side projects. I like overworked myself, got into a depression.
40 00:03:39.520 ⇒ 00:03:40.460 Uttam Kumaran: That happens?
41 00:03:40.690 ⇒ 00:03:41.270 Uttam Kumaran: Yeah.
42 00:03:41.630 ⇒ 00:03:46.099 Anton Gefvert: Which is like I usually say it’s like.
43 00:03:46.590 ⇒ 00:03:51.739 Anton Gefvert: I wouldn’t wish it on anyone. I wouldn’t do it again. But I’m so glad it happened, cause it’s like
44 00:03:51.850 ⇒ 00:03:53.500 Anton Gefvert: it taught me so much about myself.
45 00:03:53.800 ⇒ 00:03:54.230 Uttam Kumaran: Yeah.
46 00:03:56.780 ⇒ 00:04:06.823 Anton Gefvert: But yeah, I got back decently on my feet. And then, like I, I was planning on finishing my studies before starting to work, but like
47 00:04:07.600 ⇒ 00:04:11.729 Anton Gefvert: that whole term, I was doing my master’s thesis. I like got sick like
48 00:04:11.850 ⇒ 00:04:14.939 Anton Gefvert: a total of 6 weeks, and broke my arm. So it's like,
49 00:04:15.360 ⇒ 00:04:21.442 Anton Gefvert: yeah, so I started working before finishing all that. But now I’m like, really close to graduating
50 00:04:22.029 ⇒ 00:04:26.340 Anton Gefvert: And before that, I guess I had like a bunch of just internships.
51 00:04:27.320 ⇒ 00:04:30.619 Anton Gefvert: I just told the story of my 1st internship
52 00:04:31.150 ⇒ 00:04:33.809 Anton Gefvert: to Michael. It was actually, like, the
53 00:04:34.730 ⇒ 00:04:39.149 Anton Gefvert: most fun series of like random occurrences.
54 00:04:39.730 ⇒ 00:04:43.450 Anton Gefvert: So it starts with, like, I was going back from
55 00:04:44.020 ⇒ 00:04:48.040 Anton Gefvert: Stockholm to Linköping, where I live, and I sat on a train. The
56 00:04:48.340 ⇒ 00:04:58.129 Anton Gefvert: battery on my computer was dead, so I couldn't do anything; the power socket was broken. But the guy next to me, he starts talking to me. I'm like, cool. And he studied, like,
57 00:04:58.320 ⇒ 00:05:12.007 Anton Gefvert: at the same university; like, he's an alumnus. And we start talking, and, like, one of the things that comes up is this AI company, a startup in our city, that does, like, football-based image recognition and data collection.
58 00:05:12.790 ⇒ 00:05:15.939 Anton Gefvert: so so I talked to him. It was cool, and then, like
59 00:05:17.760 ⇒ 00:05:23.235 Anton Gefvert: I go to like a pub. It’s like an alumni pub where you can like meet alumni and
60 00:05:25.990 ⇒ 00:05:28.130 Anton Gefvert: another guy I knew was there
61 00:05:28.430 ⇒ 00:05:39.989 Anton Gefvert: who's also, like, an alumnus. So I sat down at this table, and into one of the spots that weren't taken comes a guy, and he was the opponent for my brother's master's thesis.
62 00:05:40.370 ⇒ 00:05:41.230 Uttam Kumaran: No way.
63 00:05:41.230 ⇒ 00:05:42.120 Anton Gefvert: And he works at the football company.
64 00:05:42.570 ⇒ 00:05:43.390 Uttam Kumaran: No way.
65 00:05:43.390 ⇒ 00:05:45.319 Anton Gefvert: So I talked to him about this like oh.
66 00:05:45.320 ⇒ 00:05:45.830 Uttam Kumaran: Weird.
67 00:05:47.830 ⇒ 00:05:57.054 Anton Gefvert: And he's like, oh shit, this is really interesting, but I have to go catch a train. And then I meet him at, like, a fair at our university, and we're like, let's get in touch. And
68 00:05:57.972 ⇒ 00:06:03.059 Anton Gefvert: he emails me, and then I just email him, and I get, like, an interview, and I got the job. It was really fun.
69 00:06:03.060 ⇒ 00:06:04.999 Uttam Kumaran: Sick. Let’s go.
70 00:06:05.000 ⇒ 00:06:12.382 Anton Gefvert: Yeah, yeah, I guess. I don’t know.
71 00:06:13.490 ⇒ 00:06:15.609 Anton Gefvert: it’s it’s so weird because I’m like.
72 00:06:16.880 ⇒ 00:06:21.140 Anton Gefvert: I look at these things. It’s like, this is pretty cool, but I feel like just like
73 00:06:21.430 ⇒ 00:06:28.390 Anton Gefvert: a couple of years of not doing proper machine learning or like not doing that much. I’m like, I feel I feel like I’m a fucking dinosaur man.
74 00:06:29.920 ⇒ 00:06:33.860 Uttam Kumaran: None of the AI stuff is, like, as hard as machine learning, dude. Like, all this stuff is
75 00:06:33.860 ⇒ 00:06:35.070 Uttam Kumaran: basically drag and drop.
76 00:06:35.558 ⇒ 00:06:39.520 Anton Gefvert: For the most part it’s just all application, like all the work we do in AI is all like.
77 00:06:39.970 ⇒ 00:06:42.480 Uttam Kumaran: Basically developing like vertical apps.
78 00:06:43.272 ⇒ 00:06:49.840 Uttam Kumaran: Chat bots, different sorts of, like, RAG, things like that. I think over time I want to get more into, like,
79 00:06:50.670 ⇒ 00:06:52.200 Uttam Kumaran: fine tuning training.
80 00:06:52.200 ⇒ 00:06:53.040 Anton Gefvert: Yeah.
81 00:06:53.455 ⇒ 00:06:54.669 Uttam Kumaran: Things like that.
82 00:06:55.080 ⇒ 00:06:58.010 Uttam Kumaran: That's, like, beyond my pay grade right now. So
83 00:06:58.400 ⇒ 00:07:11.609 Uttam Kumaran: it's, like, one: figuring out, okay, what do customers need? And of course, like, finding really great people. We've kind of started right now by just... we have a few clients that we're sort of building AI agents for,
84 00:07:12.291 ⇒ 00:07:15.270 Uttam Kumaran: which allows us to sort of work with the foundational LLMs,
85 00:07:15.729 ⇒ 00:07:21.999 Uttam Kumaran: Build some workflows. We maybe build a custom like app on Heroku, or something like that, and deploy it.
86 00:07:22.000 ⇒ 00:07:22.620 Anton Gefvert: Yeah.
87 00:07:23.038 ⇒ 00:07:24.830 Uttam Kumaran: But I think over time.
88 00:07:25.160 ⇒ 00:07:28.979 Uttam Kumaran: We will probably try to get more sophisticated.
89 00:07:30.420 ⇒ 00:07:37.890 Uttam Kumaran: you know, because some of that will get taken up by, you know, just different sort of workflow builders. I think ideally, we want to get more sophisticated and
90 00:07:38.000 ⇒ 00:07:43.160 Uttam Kumaran: do things that are yeah, like fine tuning training things like that.
91 00:07:44.490 ⇒ 00:07:47.200 Uttam Kumaran: And then, on the data side, you know, we do a lot of like.
92 00:07:47.360 ⇒ 00:07:53.300 Uttam Kumaran: you know, everything we do is mostly SQL. But I do think some clients over time are interested in sort of machine learning,
93 00:07:53.991 ⇒ 00:08:00.820 Uttam Kumaran: or, like, some data science related activities, because they have, for their transactions, churn prediction, or LTV, something like that.
94 00:08:03.550 ⇒ 00:08:04.440 Anton Gefvert: Cool.
95 00:08:05.000 ⇒ 00:08:08.910 Anton Gefvert: Yeah, it sounds interesting. I’m like.
96 00:08:09.210 ⇒ 00:08:11.559 Anton Gefvert: I don’t know in my my head then.
97 00:08:11.840 ⇒ 00:08:16.120 Anton Gefvert: Well, I do machine learning, but I don’t know how well I’m doing it, but it feels like it’s
98 00:08:17.000 ⇒ 00:08:18.200 Anton Gefvert: good enough.
99 00:08:18.520 ⇒ 00:08:22.209 Uttam Kumaran: What are, like... what are the problems that you're solving in ML?
100 00:08:24.739 ⇒ 00:08:29.329 Anton Gefvert: I mean, on my current job. We have like.
101 00:08:30.100 ⇒ 00:08:34.080 Anton Gefvert: it’s kind of weird, because our products are like, very good.
102 00:08:34.320 ⇒ 00:08:37.429 Anton Gefvert: So like, we’re kind of shoehorning in machine learning
103 00:08:37.942 ⇒ 00:08:54.017 Anton Gefvert: but, I guess, the main thing I've worked on: so we do, like, level measurement devices. So you put, like, this device on top of a ginormous tank, and we can, like, measure how much fluid is in the tank down to, like, millimeter precision, using radar.
104 00:08:55.547 ⇒ 00:09:01.589 Anton Gefvert: And sometimes you can have, like 2 different mediums. You can have like oil on top of water, for example.
105 00:09:01.590 ⇒ 00:09:02.320 Uttam Kumaran: I see.
106 00:09:02.460 ⇒ 00:09:07.113 Anton Gefvert: And what I’ve been working on is like trying to use machine learning to
107 00:09:07.970 ⇒ 00:09:14.940 Anton Gefvert: get a higher resolution, basically, when you have a very thin layer of oil on top of water. Because what happens is, like:
108 00:09:15.780 ⇒ 00:09:26.399 Anton Gefvert: if you do some Fourier transforms and stuff, you get, like, a curve, and where there's something the radar detects there will be, like, a spike in the curve, right? But when those 2 spikes
109 00:09:26.610 ⇒ 00:09:29.039 Anton Gefvert: intersect, you'll get, like, one peak.
110 00:09:29.410 ⇒ 00:09:30.210 Uttam Kumaran: Hmm, okay.
111 00:09:30.210 ⇒ 00:09:31.949 Anton Gefvert: And you should like try to.
112 00:09:31.950 ⇒ 00:09:32.550 Uttam Kumaran: I see.
113 00:09:32.550 ⇒ 00:09:35.360 Anton Gefvert: Extrapolate the two peaks from that one peak.
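The merged-peak problem Anton describes can be sketched in a few lines: fit a two-component model to the one-dimensional echo curve and read off the two positions. This is a toy illustration with made-up numbers and a brute-force fit, not the actual device firmware or signal chain.

```python
import numpy as np

def gaussian(x, mu, sigma=0.08):
    """Idealized radar echo centered at mu along the 1-D range axis."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Synthetic echo curve: two reflections (oil surface, oil/water interface)
# so close together that they merge into a single visible peak.
x = np.linspace(0.0, 2.0, 401)
y = 1.0 * gaussian(x, 0.95) + 0.7 * gaussian(x, 1.10)

# Brute-force model fit: for each candidate pair of peak positions, solve
# the two amplitudes by linear least squares, keep the lowest-error pair.
candidates = np.arange(0.80, 1.25, 0.01)
best_pair, best_err = None, np.inf
for i, mu1 in enumerate(candidates):
    for mu2 in candidates[i + 1:]:
        A = np.column_stack([gaussian(x, mu1), gaussian(x, mu2)])
        amps = np.linalg.lstsq(A, y, rcond=None)[0]
        err = float(np.sum((A @ amps - y) ** 2))
        if err < best_err:
            best_pair, best_err = (float(mu1), float(mu2)), err
print(best_pair)  # two recovered interface positions, near 0.95 and 1.10
```

The point is only that a simple, cheap fit (the kind an embedded target could afford) can pull two positions back out of one merged blob when the echo shape is known.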
114 00:09:35.630 ⇒ 00:09:38.660 Uttam Kumaran: Where are you? Where are you running like this sort of analysis?
115 00:09:39.730 ⇒ 00:09:47.730 Anton Gefvert: I mean, this is just, like, one-dimensional signals. I'm just doing it on the computer I'm working on.
116 00:09:47.730 ⇒ 00:09:48.700 Uttam Kumaran: Okay. Okay.
117 00:09:48.700 ⇒ 00:09:57.440 Anton Gefvert: Yeah. But it’s it’s also like, since we’re doing embedded software, we have to use simple models if we want to introduce it. So it’s like the models are not.
118 00:09:57.890 ⇒ 00:09:58.570 Uttam Kumaran: I see.
119 00:09:58.720 ⇒ 00:10:01.030 Anton Gefvert: Very demanding, or anything.
120 00:10:01.400 ⇒ 00:10:04.549 Uttam Kumaran: So what do you think about the current place you’re working with like, what do you like?
121 00:10:04.770 ⇒ 00:10:06.520 Uttam Kumaran: Are you gonna stay there like? What do you.
122 00:10:06.520 ⇒ 00:10:09.310 Anton Gefvert: No. Like, I'm looking outside.
123 00:10:09.410 ⇒ 00:10:12.220 Anton Gefvert: What I’m currently focusing on is like.
124 00:10:12.380 ⇒ 00:10:13.969 Anton Gefvert: I want to move to Japan.
125 00:10:15.360 ⇒ 00:10:17.485 Uttam Kumaran: Nice. With Mike? Go with Mike.
126 00:10:17.840 ⇒ 00:10:22.899 Anton Gefvert: Yeah, he’s he’s yeah, he’s getting there.
127 00:10:23.760 ⇒ 00:10:29.740 Anton Gefvert: Yeah, I I legit like, we talked about it when we were skiing. I’m like, Bro.
128 00:10:29.960 ⇒ 00:10:33.139 Anton Gefvert: I feel like we have such similar values. And like
129 00:10:33.350 ⇒ 00:10:36.870 Anton Gefvert: the way we think about things, we should just start a fucking company here.
130 00:10:38.490 ⇒ 00:10:41.206 Uttam Kumaran: Good luck, dude! It’s brutal.
131 00:10:41.750 ⇒ 00:10:50.250 Anton Gefvert: Yeah, we're probably not doing that right now. But I think in the future it would be nice to have, like, especially in Japan, where, like,
132 00:10:51.210 ⇒ 00:10:58.199 Anton Gefvert: there’s so many expectations from society, and like your bosses and whatever, if you’re working there, it would be nice to like control that yourself.
133 00:10:58.410 ⇒ 00:10:59.609 Anton Gefvert: Yeah,
134 00:11:02.600 ⇒ 00:11:05.399 Anton Gefvert: So I like my job.
135 00:11:06.300 ⇒ 00:11:12.390 Anton Gefvert: I want to do more machine learning. The problem is like, we don’t have data. So it’s all simulations.
136 00:11:14.430 ⇒ 00:11:20.250 Anton Gefvert: Cause. It’s like our clients. They’re like not connected to fucking anything.
137 00:11:20.610 ⇒ 00:11:24.830 Anton Gefvert: because if they get hacked, an entire power plant would go down right.
138 00:11:24.830 ⇒ 00:11:25.380 Uttam Kumaran: Oh, yeah.
139 00:11:26.030 ⇒ 00:11:26.640 Anton Gefvert: And
140 00:11:29.270 ⇒ 00:11:37.359 Anton Gefvert: the protocols we’re using for communication. They’re like 30 years old, like maximum throughput is a thousand bytes per second.
141 00:11:37.360 ⇒ 00:11:38.020 Uttam Kumaran: Yeah.
142 00:11:38.680 ⇒ 00:11:39.200 Anton Gefvert: Yes.
143 00:11:39.200 ⇒ 00:11:40.180 Uttam Kumaran: Horrible!
144 00:11:40.906 ⇒ 00:11:41.633 Anton Gefvert: So
145 00:11:43.760 ⇒ 00:11:51.530 Anton Gefvert: and like. In this example, for example, like the thin interface thing, getting ground truth from real applications is literally impossible.
146 00:11:52.190 ⇒ 00:11:58.800 Anton Gefvert: because, like. You can’t see the interface. You can’t like. Measure it. You can’t. You’d need something else
147 00:11:59.190 ⇒ 00:11:59.880 Anton Gefvert: that like.
148 00:11:59.880 ⇒ 00:12:00.930 Uttam Kumaran: Oh, I see!
149 00:12:01.150 ⇒ 00:12:04.989 Anton Gefvert: reliably. So the way we get data: we have, like, a plastic pipe.
150 00:12:05.290 ⇒ 00:12:10.820 Anton Gefvert: So it's, like, actually transparent. And then you go there with, like, a ruler: this is the interface size.
151 00:12:10.820 ⇒ 00:12:11.849 Uttam Kumaran: So you have a...
152 00:12:15.160 ⇒ 00:12:16.390 Uttam Kumaran: Awesome.
153 00:12:17.390 ⇒ 00:12:23.530 Anton Gefvert: So I just wish I could like work more with machine learning, and also, like the company, is just like
154 00:12:25.030 ⇒ 00:12:28.910 Anton Gefvert: the direction it's going just doesn't align with me. Yeah.
155 00:12:29.010 ⇒ 00:12:32.120 Anton Gefvert: Okay, yeah. Don’t.
156 00:12:32.960 ⇒ 00:12:36.429 Uttam Kumaran: Yeah, at the moment, all of our stuff is like
157 00:12:36.540 ⇒ 00:12:40.060 Uttam Kumaran: way easier. It’s all like SQL data modeling.
158 00:12:40.180 ⇒ 00:12:42.890 Anton Gefvert: I think we have a couple of clients that you know.
159 00:12:43.270 ⇒ 00:12:45.299 Uttam Kumaran: May be interested in doing
160 00:12:45.520 ⇒ 00:12:50.339 Uttam Kumaran: some machine learning, but the volumes that we’re working with for some clients aren’t like super high.
161 00:12:50.500 ⇒ 00:12:50.940 Anton Gefvert: Yeah.
162 00:12:50.940 ⇒ 00:12:55.606 Uttam Kumaran: Everybody, like, asks. But I usually talk them out of it, because I'm like, you're not gonna see anything
163 00:12:56.050 ⇒ 00:12:56.720 Anton Gefvert: You can’t just.
164 00:12:56.720 ⇒ 00:12:59.430 Uttam Kumaran: that you can't solve, probably, with, like, people or process.
165 00:12:59.640 ⇒ 00:13:00.100 Anton Gefvert: Yeah.
166 00:13:00.593 ⇒ 00:13:07.380 Uttam Kumaran: But there are some clients, I think, as we go where, if they’re interested in machine learning, maybe there’s some opportunities.
167 00:13:09.220 ⇒ 00:13:17.668 Anton Gefvert: It's so funny, because you're like, oh, machine learning, that's, like, next level. And I'm like, you're just saying words; I'm like, what the fuck does this mean?
168 00:13:17.960 ⇒ 00:13:24.759 Uttam Kumaran: I mean, dude, we're still doing, like, basic SQL: like, a users table, doing joins, doing ranks.
169 00:13:25.410 ⇒ 00:13:28.789 Uttam Kumaran: And like, we can get like 90% of the way there. Right? I think
170 00:13:29.230 ⇒ 00:13:37.559 Uttam Kumaran: there's some clients where they really have, like, an embedded product, where they want to build, like, a benchmarking product, or they want to do something where they maybe need
171 00:13:37.770 ⇒ 00:13:42.479 Uttam Kumaran: machine learning. I feel like we’ve yet to see like a really great use case.
172 00:13:42.760 ⇒ 00:13:43.480 Uttam Kumaran: Yeah.
173 00:13:44.170 ⇒ 00:13:50.809 Uttam Kumaran: A lot of, like, what we probably need more help with short term is just sort of, like, on the AI side, just, like,
174 00:13:50.980 ⇒ 00:13:59.449 Uttam Kumaran: deploying, infra, learning, because there's not many, like, engineers that have actually, like, worked on building LLM applications.
175 00:14:00.040 ⇒ 00:14:00.509 Uttam Kumaran: I don’t know.
176 00:14:00.510 ⇒ 00:14:07.380 Uttam Kumaran: We found, like, a couple of really good people that were just sort of doing it on the side. I'm like, come do it, like, not on the side,
177 00:14:07.550 ⇒ 00:14:07.995 Uttam Kumaran: but
178 00:14:08.440 ⇒ 00:14:13.679 Anton Gefvert: So what does, like, building an LLM application mean? Because to me it's like, what is this?
179 00:14:13.680 ⇒ 00:14:26.509 Uttam Kumaran: So basically, like, yeah. It's like, you know when you, like... let's say you go on ChatGPT, and let's say you have all the perfect information you need and the perfect prompt. And of course you're like, I'm gonna get the best answer. It's basically, like,
180 00:14:26.700 ⇒ 00:14:27.740 Uttam Kumaran: building that.
181 00:14:28.615 ⇒ 00:14:28.840 Anton Gefvert: So.
182 00:14:28.840 ⇒ 00:14:43.200 Uttam Kumaran: It's like: can you get the right information in alongside a prompt to generate an answer, whether that's, like, data from a DB, whether that's, like, from an API. So part of it is, like, data pipelining. Part of it is, like,
183 00:14:43.200 ⇒ 00:14:43.530 Anton Gefvert: Yup!
184 00:14:43.880 ⇒ 00:14:50.081 Uttam Kumaran: just orchestrating LLMs, you know, to give sentiment,
185 00:14:50.790 ⇒ 00:14:55.699 Uttam Kumaran: you know, give analysis, to coach someone through creating a document, things like that. So it's like,
186 00:14:55.700 ⇒ 00:14:56.480 Anton Gefvert: It’s just like.
187 00:14:56.740 ⇒ 00:15:01.463 Uttam Kumaran: basically abstracting one layer above LLMs and, like, sort of stringing them together.
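The "right information in alongside a prompt" idea Uttam describes is roughly this, as a toy sketch: the keyword retriever and every name here are invented stand-ins, and a real system would use embeddings and an actual LLM call rather than a print.

```python
# Toy retriever: score documents by naive keyword overlap with the question.
def retrieve(question, documents, k=2):
    q_words = set(question.lower().split())
    scored = sorted(documents,
                    key=lambda d: -len(q_words & set(d.lower().split())))
    return scored[:k]

# Assemble the retrieved context alongside the user's question: this string
# is what would be sent to the model in place of a bare question.
def build_prompt(question, documents):
    context = "\n".join(f"- {d}" for d in retrieve(question, documents))
    return f"Use only this context to answer:\n{context}\n\nQuestion: {question}"

docs = [
    "Refunds are processed within 5 business days.",
    "Our office is closed on public holidays.",
    "Refund requests require the original order number.",
]
prompt = build_prompt("How long do refunds take?", docs)
print(prompt)  # the refund policy line is ranked into the context
```

The data-pipelining part of the job is everything that keeps `docs` fresh and searchable; the orchestration part is deciding which retrieval and which prompt to run at each step.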
188 00:15:02.390 ⇒ 00:15:07.730 Uttam Kumaran: it’s just a lot of new like technologies that I’m not like so confident in.
189 00:15:07.850 ⇒ 00:15:09.700 Uttam Kumaran: But we’re iterating a lot like.
190 00:15:09.700 ⇒ 00:15:10.200 Anton Gefvert: It’s such.
191 00:15:10.200 ⇒ 00:15:12.150 Uttam Kumaran: Hard to know, like what’s actually like
192 00:15:12.270 ⇒ 00:15:20.430 Uttam Kumaran: the foundational LLM... sure, it doesn't really matter anymore. But, like, for building agents, and, like, the structure, and, like, the tools to do that.
193 00:15:20.981 ⇒ 00:15:25.319 Uttam Kumaran: And again, agents: it's nothing more than just, like, these prompt abstractions,
194 00:15:25.835 ⇒ 00:15:33.339 Uttam Kumaran: so, like, dynamically putting information into a prompt, like, dynamically having those strung together, and building basically, like,
195 00:15:33.840 ⇒ 00:15:36.529 Uttam Kumaran: agentic workflows. That's mostly it.
196 00:15:36.976 ⇒ 00:15:48.089 Uttam Kumaran: But the LLMs are really good: like, you can pass them a lot of stuff and get really robust answers out of it. And I think a lot of current use cases that are UI-based will get replaced with, like, a chatbot. You know,
197 00:15:48.200 ⇒ 00:15:49.170 Uttam Kumaran: they used to be dumb,
198 00:15:49.170 ⇒ 00:15:55.359 Uttam Kumaran: and that's why they didn't work. But now they're really good: you can pass it images, video, text,
199 00:15:55.630 ⇒ 00:16:08.929 Uttam Kumaran: and you'll get answers; you don't need to interact. And you also can ask it to go do things. So if you give the AI the schema of, like, an API, it can say: cool, I need this information, I'll go make a POST request.
200 00:16:10.180 ⇒ 00:16:15.680 Uttam Kumaran: At some level, what do you need an interface for? Because you're either retrieving info or you're submitting info, right?
201 00:16:15.680 ⇒ 00:16:16.460 Anton Gefvert: Yeah.
202 00:16:16.460 ⇒ 00:16:19.819 Uttam Kumaran: So that’s sort of like the underlying thesis is that like
203 00:16:20.220 ⇒ 00:16:28.699 Uttam Kumaran: a lot of times when people interact with documents or information, it could just get way more streamlined by handling that with an AI assistant, you know, sort of like
204 00:16:29.010 ⇒ 00:16:34.839 Uttam Kumaran: handling like inputting those documents, telling you what it is, understanding what change you want, and then going and making that change
205 00:16:35.217 ⇒ 00:16:38.850 Uttam Kumaran: sort of. If you take that idea, you can sort of blow it up a lot.
206 00:16:41.100 ⇒ 00:16:42.652 Uttam Kumaran: That’s the gist.
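The "give the AI an API schema and let it act" idea can be sketched as a minimal dispatch loop. The model's JSON reply is faked below, and every name (`create_ticket`, the registry) is illustrative, not a real vendor API:

```python
import json

# Hypothetical tool registry: the only actions the model is allowed to take.
TOOLS = {
    "create_ticket": lambda args: {"id": 101, "title": args["title"]},
}

def handle_model_reply(reply_text):
    """If the model asked for a tool, run it; otherwise return its answer."""
    reply = json.loads(reply_text)
    if reply.get("tool") in TOOLS:
        return TOOLS[reply["tool"]](reply.get("arguments", {}))
    return reply.get("answer")

# Pretend the LLM, shown the API schema, decided to submit info via a tool.
fake_reply = '{"tool": "create_ticket", "arguments": {"title": "Broken login"}}'
result = handle_model_reply(fake_reply)
print(result)
```

In a real agent the tool result would be fed back into the next prompt, which is the "stringing them together" part; the loop itself stays this small.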
207 00:16:43.170 ⇒ 00:16:47.460 Anton Gefvert: So it's basically, like... kind of, from when I was done studying
208 00:16:48.270 ⇒ 00:16:54.862 Anton Gefvert: to now, it's just been, like, a boom of LLMs. And I'm like, what the fuck does this mean?
209 00:16:55.210 ⇒ 00:16:57.899 Uttam Kumaran: You’re not. It’s nothing. It has nothing to do with data or.
210 00:16:57.900 ⇒ 00:16:59.899 Anton Gefvert: Machine learning like.
211 00:17:00.230 ⇒ 00:17:02.039 Anton Gefvert: So what does this mean?
212 00:17:02.370 ⇒ 00:17:05.700 Uttam Kumaran: It’s really quite different than anything on our data side.
213 00:17:05.869 ⇒ 00:17:10.450 Uttam Kumaran: There is partly a data problem, but it’s honestly more like workflow building
214 00:17:10.819 ⇒ 00:17:15.170 Uttam Kumaran: and, like, sort of prompt... it's a lot of, like, I hate the word, but, like, prompt engineering.
215 00:17:15.170 ⇒ 00:17:15.849 Anton Gefvert: Yeah.
216 00:17:16.010 ⇒ 00:17:21.659 Uttam Kumaran: Sort of like building dynamic prompts. But then there’s also other things like we have orchestration. We have observability.
217 00:17:21.800 ⇒ 00:17:24.920 Uttam Kumaran: We'll add testing and evaluations.
218 00:17:25.335 ⇒ 00:17:33.580 Uttam Kumaran: We'll add security and things like that: guardrails. So there are some layers to it. But for us, we're really trying to go into, like, legacy companies where they don't,
219 00:17:33.800 ⇒ 00:17:43.380 Uttam Kumaran: like, they don't use anything, and really helping them sort of start using it. We use a lot of it internally, to, like, update Notion for stuff, to summarize messages,
220 00:17:43.949 ⇒ 00:17:46.309 Uttam Kumaran: to research for sales,
221 00:17:46.550 ⇒ 00:17:49.129 Uttam Kumaran: which has been really, really like fruitful as well.
222 00:17:49.680 ⇒ 00:17:52.670 Anton Gefvert: Yeah, yeah, I haven't really played
223 00:17:52.920 ⇒ 00:17:56.700 Anton Gefvert: around with them a lot. Like, I've used Google copilots and stuff.
224 00:17:56.810 ⇒ 00:18:01.440 Anton Gefvert: But like, now that I’m like rewriting my resume. It’s just fucking all AI written honestly.
225 00:18:01.440 ⇒ 00:18:03.110 Uttam Kumaran: Yeah, dude. That’s it. Yeah.
226 00:18:03.110 ⇒ 00:18:04.710 Anton Gefvert: It's so good. Perfect.
227 00:18:05.090 ⇒ 00:18:08.920 Anton Gefvert: I know what I want to say, but I can’t say it in the way like.
228 00:18:08.920 ⇒ 00:18:15.540 Uttam Kumaran: You should try using Cursor. Like, I use Cursor every day; it's the best. It's, like, an IDE. It's, like, a VS Code
229 00:18:16.040 ⇒ 00:18:17.720 Uttam Kumaran: type IDE.
230 00:18:18.580 ⇒ 00:18:21.350 Uttam Kumaran: But dude. It’s like Autofill on steroids.
231 00:18:22.640 ⇒ 00:18:25.750 Anton Gefvert: I need vim bindings, so I can't. Yeah, I think you can.
232 00:18:25.750 ⇒ 00:18:31.439 Uttam Kumaran: I don't know, actually. I don't know. Yeah, I called my friend yesterday, and he told me he's not using it because I don't know if they have vim bindings.
233 00:18:31.440 ⇒ 00:18:37.210 Anton Gefvert: Yeah. Once you, like... I converted my friend recently. He's like,
234 00:18:37.410 ⇒ 00:18:44.259 Anton Gefvert: bro, you're such a try-hard using vim, what the fuck are you doing? I'm like, bro, it will make your coding experience twice as fun.
235 00:18:44.490 ⇒ 00:18:45.260 Uttam Kumaran: And then he’s.
236 00:18:45.260 ⇒ 00:18:49.794 Anton Gefvert: Started using vim, and he said, bro, this is cocaine.
237 00:18:50.550 ⇒ 00:18:55.600 Uttam Kumaran: Okay, I got it then. And I feel like... I learned vim in school. Like, I studied computer engineering as well.
238 00:18:55.600 ⇒ 00:18:56.250 Anton Gefvert: Yeah, yeah.
239 00:18:56.630 ⇒ 00:18:58.150 Uttam Kumaran: But I have not.
240 00:18:59.120 ⇒ 00:19:00.999 Uttam Kumaran: I do vim here and there.
241 00:19:01.340 ⇒ 00:19:05.989 Uttam Kumaran: but for the most part I just open shit in a text editor, make whatever change I need, and close it.
242 00:19:05.990 ⇒ 00:19:16.239 Anton Gefvert: Yeah. Like, we're using Windows at my job, and I really detest Windows. So, like, I've got Neovim, and I'm just using vim for everything. I'm like, I use git in vim. I
243 00:19:16.690 ⇒ 00:19:20.980 Anton Gefvert: browse files in it, and then I don't fucking touch the file explorer. I'm just like,
244 00:19:20.980 ⇒ 00:19:22.100 Uttam Kumaran: That’s wild.
245 00:19:22.750 ⇒ 00:19:24.700 Uttam Kumaran: That’s sick. Okay.
246 00:19:28.640 ⇒ 00:19:34.979 Uttam Kumaran: Cool. Well, maybe I’ll I don’t know. I’ll keep you in mind if we end up getting any of that stuff. I mean, if you’re interested in any of the AI things.
247 00:19:35.455 ⇒ 00:19:40.060 Uttam Kumaran: Let me know if you’re like, if you’re like, look, I’m gonna dip this job and you’re like
248 00:19:40.530 ⇒ 00:19:47.930 Uttam Kumaran: you want anything. I mean, the stuff we're doing on the AI side isn't that complicated. I'm more interested in just having people that, like...
249 00:19:48.630 ⇒ 00:19:54.949 Uttam Kumaran: I just want people like for me. I’m the sort of like engineer where I’m like, just like I just, I’ll go figure some shit out.
250 00:19:55.391 ⇒ 00:20:08.039 Uttam Kumaran: I'll Google, or whatever, my way into things. The AI stuff, it's like, it just happened, so nobody has experience in it. For me, I'm more interested in people who are like, yo, I wanna, like,
251 00:20:08.330 ⇒ 00:20:09.680 Uttam Kumaran: take this as like
252 00:20:10.030 ⇒ 00:20:14.419 Uttam Kumaran: and take something that’s like, maybe kind of janky and build like a robust system for it.
253 00:20:15.168 ⇒ 00:20:27.499 Uttam Kumaran: And then, of course, we also have, like, data work, but it's probably not as ML-related. But on the AI side, we definitely have, like, opportunity to come in and help build some of these applications for clients.
254 00:20:27.880 ⇒ 00:20:30.330 Anton Gefvert: I mean, I’m not like, necessarily.
255 00:20:31.000 ⇒ 00:20:42.739 Anton Gefvert: I had an interview earlier today where I spoke to, like, a recruiter in Japan. And
256 00:20:42.930 ⇒ 00:20:48.759 Anton Gefvert: I realized, like, what I'm interested in... I don't really like machine learning or AI, or whatever,
257 00:20:48.880 ⇒ 00:20:56.469 Anton Gefvert: per se. I love programming. And most of all, I love learning. Like, whatever I do,
258 00:20:56.660 ⇒ 00:21:01.459 Anton Gefvert: my friend called it: you have these, like, ADHD phases where you just, like, fucking go, like,
259 00:21:01.460 ⇒ 00:21:03.130 Uttam Kumaran: Yeah. 100 miles an hour. Yeah.
260 00:21:03.130 ⇒ 00:21:06.259 Anton Gefvert: Yeah, learn the shit, and then you're like, whatever.
261 00:21:06.400 ⇒ 00:21:07.389 Anton Gefvert: And I think you I mean.
262 00:21:07.390 ⇒ 00:21:11.649 Uttam Kumaran: Dude. You’re gonna love the AI stuff because they’re so it’s like so new.
263 00:21:11.650 ⇒ 00:21:12.290 Anton Gefvert: Yeah, yeah.
264 00:21:12.290 ⇒ 00:21:17.460 Uttam Kumaran: A lot of it is very similar to like the just like general paradigms of building great software.
265 00:21:17.720 ⇒ 00:21:18.590 Anton Gefvert: No, I think so.
266 00:21:18.590 ⇒ 00:21:19.499 Anton Gefvert: I think my
267 00:21:20.220 ⇒ 00:21:30.840 Anton Gefvert: also the fact that you it’s like it’s evolving so quickly. I think that’s why I like I want to get back into more like AI machine learning, or whatever, rather than embedded, because, like.
268 00:21:31.440 ⇒ 00:21:33.889 Anton Gefvert: there’s always new things to learn. Embedded is kind.
269 00:21:34.215 ⇒ 00:21:34.540 Uttam Kumaran: Back.
270 00:21:35.200 ⇒ 00:21:38.710 Anton Gefvert: Kind of solved. I don't know. It's, like, so old, so, like,
271 00:21:38.710 ⇒ 00:21:39.430 Uttam Kumaran: And it’s limited.
272 00:21:39.840 ⇒ 00:21:47.090 Anton Gefvert: Yeah, but it was so fun. The limiting part is fun, though, because you have to, like, figure out fun algorithms, do things efficiently.
273 00:21:47.591 ⇒ 00:21:50.259 Anton Gefvert: But you’re not like learning a lot.
274 00:21:50.570 ⇒ 00:21:51.450 Uttam Kumaran: Hmm.
275 00:21:52.020 ⇒ 00:21:52.485 Anton Gefvert: So.
276 00:21:53.830 ⇒ 00:21:56.510 Anton Gefvert: But Michael had like an
277 00:21:56.810 ⇒ 00:22:01.549 Anton Gefvert: I mean to me. He pitched the idea like dude. You work from Japan
278 00:22:01.910 ⇒ 00:22:04.439 Anton Gefvert: and do like part time at this company, and you.
279 00:22:04.440 ⇒ 00:22:05.250 Uttam Kumaran: Yeah.
280 00:22:05.250 ⇒ 00:22:07.130 Anton Gefvert: American salary, because.
281 00:22:07.421 ⇒ 00:22:10.048 Uttam Kumaran: I also think that's a great pitch.
282 00:22:10.340 ⇒ 00:22:11.070 Anton Gefvert: Yeah, I, I.
283 00:22:11.070 ⇒ 00:22:11.899 Uttam Kumaran: That’s what you should do.
284 00:22:12.430 ⇒ 00:22:20.730 Uttam Kumaran: I mean, that's what I told Mike, too, because I told Mike, I'm like, look, one, you have a friend with a company. So that's often not the case. Second,
285 00:22:20.870 ⇒ 00:22:25.560 Uttam Kumaran: I'm just like a normal dude. So it's like you're not dealing with like weirdos.
286 00:22:25.670 ⇒ 00:22:33.499 Uttam Kumaran: Third, like, I do have work. And it's also, the easiest way to learn AI is when you have like a task. If you don't have a real something to solve,
287 00:22:33.980 ⇒ 00:22:37.730 Uttam Kumaran: It’s kind of like you’re just gonna futz around or maybe do tutorials and shit.
288 00:22:38.120 ⇒ 00:22:38.850 Anton Gefvert: Yeah.
289 00:22:38.850 ⇒ 00:22:46.857 Uttam Kumaran: So I don’t know. Like, if that’s something that can, you know I’m trying to think of something to bring Mike in on as well, but if that’s something that you’re interested in,
290 00:22:47.270 ⇒ 00:23:00.279 Uttam Kumaran: I mean, Mike, I need some help on, like, some of the stuff on the data side, because he did a lot of the same work that we're doing. But if you're interested, like, yeah, I have Miguel, who I think is just working on other shit right now.
291 00:23:00.710 ⇒ 00:23:03.899 Uttam Kumaran: He was supposed to be on this call. He runs our AI team.
292 00:23:05.110 ⇒ 00:23:10.059 Uttam Kumaran: Very chill dude, cool dude. We just started, we just kicked off a client this week.
293 00:23:10.513 ⇒ 00:23:35.089 Uttam Kumaran: And we'll probably close another AI client. The client we closed is a home services company. They do like pest care, lawn care, things like that, but they have like 200 customer service reps, and they have a shitload of Google Docs and Google Sheets about process that's so messy. So our job is basically to come in and, one, help them clean all that up, build a taxonomy, and then, second, build an AI agent that helps them
294 00:23:35.200 ⇒ 00:23:40.540 Uttam Kumaran: answer any question they need, instead of digging through documents, and like,
295 00:23:40.860 ⇒ 00:23:43.729 Uttam Kumaran: they're gonna end up closing. And then we're gonna basically build,
296 00:23:43.840 ⇒ 00:23:50.559 Uttam Kumaran: we're gonna start to try to bill them on success, basically. Like, how often are their agents getting used? How often are they leading to positive,
297 00:23:50.750 ⇒ 00:23:52.979 Uttam Kumaran: successful calls or upsells?
298 00:23:53.240 ⇒ 00:24:06.209 Uttam Kumaran: Another company, they're in telehealth. They want us to build like a text-based LLM that allows people that are signing up for their service to do the sign-up via text,
299 00:24:06.330 ⇒ 00:24:18.089 Uttam Kumaran: because typically right now, they send them to a link. The link has like 10 different forms that you have to fill. Most people drop off like immediately. They have like a 90% churn rate. All that information you can capture over text.
300 00:24:18.390 ⇒ 00:24:19.119 Uttam Kumaran: But you want to.
301 00:24:19.120 ⇒ 00:24:24.130 Uttam Kumaran: Conversational, like, not just this step, and then if this, then this step. It's like.
302 00:24:24.130 ⇒ 00:24:24.900 Anton Gefvert: Alright!
303 00:24:24.900 ⇒ 00:24:29.689 Uttam Kumaran: They should be able to just put in whatever, and the LLM sort of dynamically asks the next question. And then it's like, oh, you're good.
304 00:24:30.000 ⇒ 00:24:45.939 Uttam Kumaran: So that's an option, too. But again, there's also stuff on, like, we're trying to decide on the best infra, we're trying to decide on the best vector store, the best chunking strategy, the best semantic similarity models to use. We're doing a spike on like the best RAG methods for retrieval.
305 00:24:46.650 ⇒ 00:24:50.480 Uttam Kumaran: there’s just like a bunch of shit that we need to figure out for ourselves. And there’s no answer.
306 00:24:50.480 ⇒ 00:24:51.570 Anton Gefvert: What's RAG, man?
307 00:24:51.740 ⇒ 00:24:57.800 Uttam Kumaran: RAG. So basically, RAG is retrieval, like retrieval augmented generation. Basically, it's like,
308 00:24:57.930 ⇒ 00:25:09.329 Uttam Kumaran: you get a request. In a vector store, you have all these documents, and then you have the vectors. Basically, you need to find the right document and then bring that into context.
309 00:25:09.802 ⇒ 00:25:18.380 Uttam Kumaran: You can't search over all documents, right? Because you may have like a hundred thousand documents. Instead, you need to find the one that's most similar to your question and then bring it in.
310 00:25:18.690 ⇒ 00:25:23.349 Uttam Kumaran: How do you do that retrieval well? There's also a concept like chunking, which is like, when you have a document,
311 00:25:23.490 ⇒ 00:25:32.250 Uttam Kumaran: how do you chunk the text into vectors, so that when you search, you can find the right chunk and then therefore find the right document?
312 00:25:32.718 ⇒ 00:25:37.120 Uttam Kumaran: There's also like different OCR strategies. So how do you convert tables,
313 00:25:37.120 ⇒ 00:25:37.560 Anton Gefvert: etc.
314 00:25:37.560 ⇒ 00:25:44.109 Uttam Kumaran: documents, to like something usable. And then over time, I think I also want it. Sorry.
315 00:25:44.400 ⇒ 00:25:49.490 Anton Gefvert: Sorry, I actually wrote my thesis on, like, kind of
316 00:25:50.760 ⇒ 00:25:53.420 Anton Gefvert: in that sense, like, we would, yeah.
317 00:25:54.420 ⇒ 00:26:10.370 Anton Gefvert: basically clustering, or grouping together. It was based off like surveys at a company, so like the employees, what do you think of your employer? And group them together. Okay. But these are like all about
318 00:26:11.150 ⇒ 00:26:14.500 Anton Gefvert: working with so kind of oh, nice. Okay.
319 00:26:18.450 ⇒ 00:26:36.889 Uttam Kumaran: Yeah, exactly. So sentiment training, document retrieval, document creation. Those are a lot of things we'll probably take on early, and then I think we'll start to try to get towards advanced stuff. I want to start working on like image generation, video generation, audio generation, things like that. I think there's still a lot of it that not a lot of people have figured out.
320 00:26:37.400 ⇒ 00:26:37.750 Anton Gefvert: Yeah.
321 00:26:38.007 ⇒ 00:26:42.120 Uttam Kumaran: But yeah, I mean, these are all like complete open problems. I learned everything on Twitter
322 00:26:42.400 ⇒ 00:26:45.070 Uttam Kumaran: so, and I'm running a business.
323 00:26:45.895 ⇒ 00:26:47.870 Uttam Kumaran: An AI one, I guess. So.
324 00:26:47.870 ⇒ 00:26:50.220 Anton Gefvert: You learned everything on YouTube, too?
325 00:26:50.220 ⇒ 00:26:55.520 Uttam Kumaran: Learned everything on Twitter and YouTube. And so now, that's it. So like this is, it's completely like
326 00:26:55.760 ⇒ 00:27:03.559 Uttam Kumaran: nascent, like, it’s just starting right now. So if you’re curious about a lot of that, I think that’s where we have, like a lot of immediate opportunity, and maybe just come in and
327 00:27:03.820 ⇒ 00:27:06.809 Uttam Kumaran: pick off some tickets for some of these projects, and sort of just like.
328 00:27:07.440 ⇒ 00:27:09.730 Uttam Kumaran: just be on the team and sort of do that. If that’s
329 00:27:10.030 ⇒ 00:27:14.330 Uttam Kumaran: something you’re interested in like I feel like that’s a great opportunity, I mean. And again, like.
330 00:27:14.330 ⇒ 00:27:14.990 Anton Gefvert: I.
331 00:27:15.180 ⇒ 00:27:24.240 Uttam Kumaran: Yeah, I think your angle is good, because the people we have on the AI team right now are coming from, like, they graduated recently. They learned how to
332 00:27:24.560 ⇒ 00:27:27.250 Uttam Kumaran: use LLMs. Their whole mind is that.
333 00:27:27.380 ⇒ 00:27:33.179 Uttam Kumaran: And so for me, I come from the other side, where I'm like, they don't know anything about like writing code, deploying code,
334 00:27:33.180 ⇒ 00:27:33.700 Anton Gefvert: Yeah.
335 00:27:33.700 ⇒ 00:27:37.559 Uttam Kumaran: observability. I'm the one that's like, guys, we need, these are like production apps.
336 00:27:37.830 ⇒ 00:27:40.230 Anton Gefvert: So, having another person from that angle.
337 00:27:40.940 ⇒ 00:27:42.660 Uttam Kumaran: Would be really, really helpful.
338 00:27:42.870 ⇒ 00:27:47.440 Anton Gefvert: Yeah, yeah, I mean, I
339 00:27:48.730 ⇒ 00:27:56.589 Anton Gefvert: I would, like, the way Michael pitched it, he was like, well, if you work like 10 hours a week, like
340 00:27:56.910 ⇒ 00:28:15.770 Anton Gefvert: a couple of weeks here and there, like, you'll get like actual wealth in Japan with like an American salary. I'm like, that sounds amazing. And actually, there's a lot of work in Japan where they specify that having work on the side is fine.
341 00:28:16.060 ⇒ 00:28:17.290 Uttam Kumaran: Oh, okay. Okay.
342 00:28:17.980 ⇒ 00:28:18.549 Anton Gefvert: I think it might.
343 00:28:18.550 ⇒ 00:28:20.070 Anton Gefvert: Yeah, I.
344 00:28:20.070 ⇒ 00:28:22.833 Uttam Kumaran: Ideally, a lot of the work is like async. But
345 00:28:23.540 ⇒ 00:28:30.089 Uttam Kumaran: of course, sometimes we have to meet with the team. But the team is everywhere in the world, too, so we don't really have much like requirements.
346 00:28:30.310 ⇒ 00:28:30.916 Anton Gefvert: Yeah.
347 00:28:32.130 ⇒ 00:28:32.760 Uttam Kumaran: On, that.
348 00:28:32.760 ⇒ 00:28:39.460 Anton Gefvert: You’re in America, that’s that’s like from here, minus.
349 00:28:40.100 ⇒ 00:28:41.390 Uttam Kumaran: 7 or 8.
350 00:28:41.390 ⇒ 00:28:47.810 Anton Gefvert: Minus 7. So it’s like Japan is plus 7 or 8. So it’s plus 8,
351 00:28:49.490 ⇒ 00:28:57.289 Anton Gefvert: because you wrap around. Yeah, I mean, it’s doable. You just have to. I’m just trying to figure out if it’s morning or nights for me. If I’d be in.
352 00:28:57.290 ⇒ 00:29:00.069 Uttam Kumaran: Yeah, I mean, but but also some of our team is in Asia.
353 00:29:00.240 ⇒ 00:29:01.570 Anton Gefvert: Oh, okay. Yeah.
354 00:29:01.770 ⇒ 00:29:07.759 Uttam Kumaran: So I don’t really care. I don’t really care. As long as the work gets done. I’m up late, and I’m up early also.
355 00:29:09.700 ⇒ 00:29:17.180 Uttam Kumaran: so I don’t care where the team meets frankly, like I want to come in. I want to be on less meetings so that I can come in and like, actually.
356 00:29:17.440 ⇒ 00:29:17.700 Uttam Kumaran: yeah.
357 00:29:17.700 ⇒ 00:29:18.520 Anton Gefvert: And shit.
358 00:29:18.520 ⇒ 00:29:18.940 Uttam Kumaran: It might.
359 00:29:19.990 ⇒ 00:29:23.649 Uttam Kumaran: The the AI team is now sort of almost operating like on their own, so.
360 00:29:25.570 ⇒ 00:29:29.750 Uttam Kumaran: I actually have to jump. We have a data team meeting right now that I have to go
361 00:29:29.950 ⇒ 00:29:30.340 Anton Gefvert: It’s that’s.
362 00:29:30.340 ⇒ 00:29:31.210 Uttam Kumaran: run.
363 00:29:31.330 ⇒ 00:29:35.290 Uttam Kumaran: But yeah, let me know what you think, or just like WhatsApp me.
364 00:29:35.400 ⇒ 00:29:45.710 Uttam Kumaran: I don’t know. I don’t know what your timeline for Japan is, or I don’t even know what your availability is now. But if you’re curious like, I can add you to slack, and you can start talking to some people on the team, and sort of just like, see what’s up.
365 00:29:46.190 ⇒ 00:29:48.949 Anton Gefvert: Yeah, for sure, it sounds very interesting, and like
366 00:29:49.540 ⇒ 00:29:55.462 Anton Gefvert: my timeline for Japan is as soon as possible. Notice period in Sweden is 3 months, so like, at least 3 months, but.
367 00:29:56.600 ⇒ 00:30:01.369 Uttam Kumaran: So maybe I’ll like, yeah, let’s keep whatsapping. Maybe I want to introduce you to Miguel on the team.
368 00:30:01.370 ⇒ 00:30:01.960 Anton Gefvert: Yep.
369 00:30:02.275 ⇒ 00:30:04.109 Uttam Kumaran: Maybe you can chat with him.
370 00:30:04.110 ⇒ 00:30:04.430 Anton Gefvert: Yeah.
371 00:30:04.430 ⇒ 00:30:11.330 Uttam Kumaran: He'll give you a sense of what's going on. I think the only thing I can tell you is, do not be worried that you don't know AI shit. I'm gonna
372 00:30:11.330 ⇒ 00:30:13.320 Uttam Kumaran: say it again. Don't be worried.
373 00:30:13.770 ⇒ 00:30:14.969 Anton Gefvert: No, I, I.
374 00:30:14.970 ⇒ 00:30:15.580 Uttam Kumaran: Fine.
375 00:30:15.580 ⇒ 00:30:18.099 Anton Gefvert: Very confident in my ability to learn.
376 00:30:18.100 ⇒ 00:30:19.691 Uttam Kumaran: Yeah. And I think,
377 00:30:20.420 ⇒ 00:30:26.909 Uttam Kumaran: it'll help unlock like a new pattern of developing. It's just a different sort of thing.
378 00:30:27.790 ⇒ 00:30:30.940 Uttam Kumaran: And then, yeah, I think maybe if you talk to Miguel and then
379 00:30:31.240 ⇒ 00:30:45.419 Uttam Kumaran: maybe you can let me, well, I guess we'll all 3 talk or something, and we can see if there's anything you can plug into. And then for us, like, yeah, I mean, ideally, if we have tickets and stuff, if there's like work that you can take on, then that would just be it, really.
380 00:30:45.780 ⇒ 00:30:46.125 Anton Gefvert: Yeah.
381 00:30:47.450 ⇒ 00:30:51.359 Anton Gefvert: Sounds cool. Okay, yeah. Cool. Appreciate your time.
382 00:30:51.360 ⇒ 00:30:52.580 Anton Gefvert: Yeah. Yeah.
383 00:30:53.020 ⇒ 00:30:53.700 Anton Gefvert: Hello. Yeah.
384 00:30:53.700 ⇒ 00:30:57.259 Uttam Kumaran: Hoping we can chat again soon. Maybe the 3 of us can hop on and chat soon.
385 00:30:57.260 ⇒ 00:30:57.890 Anton Gefvert: Yeah, absolutely.
386 00:30:58.680 ⇒ 00:31:03.259 Uttam Kumaran: But yeah, otherwise enjoy the week and get some rest. Talk soon. Yeah.
387 00:31:03.260 ⇒ 00:31:07.516 Anton Gefvert: It’s been a grind.
388 00:31:08.580 ⇒ 00:31:09.580 Uttam Kumaran: Yeah, bye.