E65: DeepSeek vs NVIDIA and the $TRUMP $MELANIA Crypto Circus 🎪

🔥 DeepSeek & the AI Disruption – is NVIDIA’s dominance under threat? We break down the shocking efficiency gains of DeepSeek R1 and what it means for the AI arms race
💰 $NVDA’s 15% Drop – Market Overreaction or the Beginning of the End? How DeepSeek’s breakthrough forced a market reckoning for AI stocks, and whether NVIDIA’s margins are in serious danger
🛡 Cybersecurity’s Next Frontier – Why this company’s data recovery solutions might be the missing piece in your cybersecurity portfolio
🥐 This UK High Street King Is Under Pressure – does this British bakery chain deserve a spot in your portfolio, or is Badger just craving sausage rolls?
🤡 $TRUMP & $MELANIA Coins – Political Crypto Chaos – a presidential meme coin clownshow: what happened, who got burned, and why it’s already peak 2025 madness
🌍 Fintech Deep Dive – Badger adds a new fintech stock — here’s why international money transfers are a compelling long-term play

Segments:

00:00 AI Models and Reasoning Capabilities
02:25 DeepSeek’s Impact on US Tech Stocks
05:30 Market Reactions and NVIDIA’s Decline
08:03 Understanding AI and Language Models
12:37 NVIDIA’s Market Position and Challenges
25:48 DeepSeek R1: A Game Changer in AI
35:33 DeepSeek’s Breakthrough and NVIDIA’s Future
37:18 NVIDIA’s Valuation and Investment Dilemma
41:45 The Impact of Efficiency on Hardware Demand
45:51 $TRUMP & $MELANIA Coin Madness
53:19 Fintech Play: Why Badger Bought This Remittance Expert
58:53 Stock Safari: cybersecurity and sausage rolls
01:10:00 Patreon Shoutouts

E65: DeepSeek NVDA RBRK Greggs $TRUMP $MELANIA

[00:00:00] k: if you give these AI models, call it, more time to do the work, they will produce answers that don't hallucinate.

[00:00:15] Luke: going on. Like if you were playing with older generations of large language models, like say, ChatGPT 3, 3.5, 4, a way people would hack them would be by prompting them to say, okay, ChatGPT, go step by step, think through the logic and come up with your answer. And now.

Reasoning models, like the latest evolution of models from Gemini (Alphabet), from OpenAI, from Anthropic, they have reasoning almost kind of built in. And the way the models are designed is to do exactly that. And some of them show you their train of thought and how they've planned to answer the question and how they kind of check themselves as they go [00:01:00] along.

[00:02:00] ​

[00:02:22] Luke: Welcome to the latest Wall Street Wildlife with Christophe and Luke. In today’s episode, what is DeepSeek and why is it threatening to wipe a trillion dollars off of US tech stocks? Also, how the UK government is preparing for dystopia, a presidential crypto circus, And we go on stock safari with cloud computing and sausage rolls.

[00:02:48] k: Damn, Badger. You’re looking beautiful this morning. I’m glad to see that you’re, uh,

yeah. uh, we’re talking about,

see like three more whiskers. You added three more whiskers

to the farm. I had to [00:03:00] trim it the other day. It's getting a bit wiry. It's so thin, but this is the, this is the final day. We're recording on January the 27th, and I've got to have this travesty of a thin-ass beard off my face before my wife arrives in Chamonix on the 1st of February, so I'm gonna go shave immediately after we finish recording this episode.

[00:03:20] Luke: I think I achieved my objective though, which was demonstrating I cannot grow a beard, but at least now you believe me. 

[00:03:26] k: There’s something definitely that’s quite rakish to you because you’re always so polished and smooth and, you know, this gives you a little bit more street cred, you know?

[00:03:38] Luke: It's a sacrifice made for the, for the, uh, the sanctity of my marriage.

[00:03:44] k: A committed husband, indeed. Glad to also know you’re still alive and haven’t fallen off some cliff in the, in the Alps, even with your new, what, anti avalanche jetpack thing.

[00:03:54] Luke: Yeah, I’m all good. As I’ve been a bit lazy, I’ve had two days off and currently have a day off because it’s just [00:04:00] bucketing down with rain in the mountains right now, which is not a good thing. , so yeah, happy to spend the day on the sofa reading and catching up on what’s been happening in U. S. tech because it’s been quite an exciting weekend.

[00:04:12] k: Yeah, so let’s get to it, right?

[00:04:14] Luke: Now we should say, Oh, we should say in preface to this, uh, Christophe and I recorded an interview with Chit Chat Stocks, which is an awesome podcast that we’re great, both great fans of with Brett and Ryan. And we recorded that three days ago. And then the world changed quite a lot in the last three days. So you’re, if you’re listening to us on the wall street wildlife feed, you’re probably going to hear today’s discussion.

And then next week you're going to hear the Chit Chat discussion. So just bear in mind that they're back to front, because what we say now is fresher than what we say next week, which, potentially, uh, might be looking like a poor take.

[00:04:53] k: Right, and a quick, uh, shout out to our Patreon community over at [00:05:00] patreon.com/wallstreetwildlife. That's where you can find Badger and Monkey talking about the latest stuff pretty much in near real time in our community forum, so if you don't want to wait the week to get the newest podcast update on YouTube or wherever, then head over to patreon.com/wallstreetwildlife.

Okay, how do we start? So, it is, uh, Monday, January 27th, right, and the markets just opened after the big news dropped this weekend. And the most interesting fact at the moment is, oh, NVIDIA is down about 12%.

[00:05:36] Luke: that’s relatively substantial

[00:05:38] k: and the reason it's substantial is not because 12 percent in itself is substantial. I mean, in the quant space, uh, there are big moves just on pricing wars, you know, like the computers doing their thing. That's not the issue. The issue is when you're a 3 trillion dollar company and you lose 12%, that's like [00:06:00] the equivalent of, uh, I can't do the math, uh, live, uh, it's too early in the morning, but like,

but it's also impactful because, like, the whole market is almost held up by AI and NVIDIA. And over the weekend, it seems like tech investors are panicking that the AI party might be over. And so it's not just NVIDIA that's dropped; we've seen a retraction in valuations across the NASDAQ, across all tech stocks.
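To put a rough number on that move, here is a back-of-the-envelope sketch using the approximate figures mentioned in the conversation (a $3 trillion market cap and a 12% drop):

```python
# Rough dollar value wiped out by a ~12% drop in a ~$3 trillion company.
market_cap = 3.0e12   # approximate NVIDIA market cap, in dollars
drop = 0.12           # approximate intraday decline

value_lost = market_cap * drop
print(f"Approximate value lost: ${value_lost / 1e9:,.0f} billion")
# -> Approximate value lost: $360 billion
```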

Our main objective with this episode is to walk you through, hopefully on a pretty deep level, what's going on and what we have to think about. Now, I don't know if you know this, Badger, but I am wearing my dodo hat this morning instead of my typical monkey hat, and it's because NVIDIA is going out of business, buddy.

It’s done. Okay.

[00:06:53] Luke: a fricking premature judgment to make. Now we, we’ve both been looking at this topic [00:07:00] separately and I’ve got a funny feeling we’ve got a different take on this and we don’t know what each other have to say. So we’re going to have this conversation live, but you’ve been doing a ton of deep research all morning and yesterday.

So why don't you give us the grounding in what's actually happened.

[00:07:13] k: There's so much to say. I want to say it all at once. One more, one more backstory. It was, uh, because of the Chit Chat Stocks podcast, I had just spent the previous, well, two weeks really brushing up on the NVIDIA case. And that was mostly bullish. I left doing that research feeling really excited about NVIDIA.

It's shocking that, yeah, two days later, now I'm like, oh, I'm really not sure anymore in this moment. For that kind of fast pivot to happen is kind of mind-boggling. My mind's still spinning. So here's the deal. Here's the deal. I'm going to need to lay some basic foundational explanations on the tech, because not [00:08:00] everyone's an expert in this stuff.

In fact, I think few people are. So when we talk about AI and language, um, basically the AI that you're familiar with, right? Like ChatGPT, all the models that most people are using on a daily basis. You have to know that basically there are two parts to it. There's the training of the model, which I think of as ingesting and reading

All the books, or all the articles, right? Which takes massive computing power. And then there’s the second half, which is, now that the model is trained, it has to spit out info, called the inference portion.
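As a minimal sketch of that two-part split (a toy illustration, not any real AI framework): training repeatedly adjusts a model's weights against data, while inference just runs the already-fitted weights forward on new input.

```python
# Toy illustration of training vs. inference with a one-weight "model".

def train(data, weight, lr=0.01, epochs=200):
    """Training phase: repeatedly adjust the weight to fit (x, y) pairs.
    This is the compute-heavy 'ingest all the books' half."""
    for _ in range(epochs):
        for x, y in data:
            error = weight * x - y
            weight -= lr * error * x   # gradient step for squared error
    return weight

def infer(weight, x):
    """Inference phase: run the fixed, already-trained weight on new input."""
    return weight * x

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]   # samples of y = 2x
w = train(data, weight=0.0)
print(infer(w, 10.0))   # ~20.0
```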

[00:08:41] Luke: Yeah, that’s a good analogy, precisely only,

[00:08:45] k: When thinking of these things, with regard to tech, that first half, we've kind of come to the conclusion that there's a limit to it. Because the amount of information that exists in the world, you could [00:09:00] say, for simplicity's sake, most of it has already been uploaded. Even though that took a massive amount of computing power, we're now at the point of, call it, diminishing returns. Right? But it took

[00:09:15] Luke: only in terms of, say, the written material, because you're right. Like most of these models have been trained on like the entire public internet, but models are now being trained on video and audio and other formats as well. And the real world, through cameras. So if you think about training in that respect,

Like we’ve barely scratched the surface

[00:09:39] k: That's right. That's right. But this part, this first half, was, I think, always known ahead of time: that we're going to reach some modest limit where more computing power, if you throw it at it, won't really make much of a difference, because most of the information is already, [00:10:00] call it, in the computers, in the way I'm talking about it.

So that was called the scaling law. And that was already known, and it's kind of like thinking about these models as being commoditized. And it's true. All technologists knew this was going to happen sooner or later, I mean, eventually, call it, right? But then there's this next bit, which has to do more with the inference part.

And there's what's now known as a new scaling law, which depends on something called chain-of-thought models. With this, now, we've seen, uh, people playing with this stuff realize that if you give these AI models, call it, more time to do the work, they will produce answers that don't hallucinate.

[00:10:55] Luke: going on. Like if you were playing with older generations of [00:11:00] large language models, like say, ChatGPT 3, 3.5, 4, a way people would hack them would be by prompting them to say, okay, ChatGPT, go step by step, think through the logic and come up with your answer. And now.

Reasoning models, like the latest evolution of models from Gemini (Alphabet), from OpenAI, from Anthropic, they have reasoning almost kind of built in. And the way the models are designed is to do exactly that. And some of them show you their train of thought and how they've planned to answer the question and how they kind of check themselves as they go along.

And these are definitely producing higher quality results.
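As a rough illustration of that prompting trick, here is a sketch with assumed, generic prompt text only (no particular vendor's API is being shown):

```python
# Older models had to be nudged into showing their reasoning;
# newer "reasoning" models do this planning and self-checking internally.

question = "A train leaves at 3pm doing 60 mph. How far has it gone by 5:30pm?"

# The old "hack": explicitly ask for step-by-step logic.
legacy_prompt = (
    "Think through this step by step, show your working, "
    "then give a final answer.\n\n" + question
)

# With a reasoning model, the plain question is usually enough.
reasoning_prompt = question

print(legacy_prompt)
print(reasoning_prompt)
```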

[00:11:43] k: Right. And this matters, though, because whereas that first portion I said kind of runs into a natural limit, this bit of AI modeling, the more compute power you throw at it, the better it becomes. So under this new scaling law, yeah, [00:12:00] more is still better. You want to keep pushing as much hardware at the problem as possible.

So, so far so good, right? And by the way, this, this stuff that I’m talking about is kind of reserved for what I call genius level use cases, but that’s where things are going. Like, if you know a medical procedure depends on getting the exact right answer or some legal case depends on getting the document exactly right, you need as much of this fancy thinking as possible.

Good enough is not good enough, right? So again, translated to GPU hardware, more is still better. Okay, that's the sort of background. So given all this, I'm going to start talking about NVIDIA, because that's really the use case, right? That's the problem we're trying to figure out: is NVIDIA in trouble or not? So, NVIDIA has had a huge advantage up to now, uh, what, 90 percent [00:13:00] margins in the space. Mostly it's that, as I said in the episode about NVIDIA, the CUDA software programming language is their moat. Translated into English, CUDA is, call it, a programming language. Technically, it's a software layer that allows code to communicate with the GPU chips, and NVIDIA basically has a monopoly on this; the world's leading experts

in AI all use this programming language, and it's only on NVIDIA that it exists. The other advantage that NVIDIA has is that they knew ahead of time that they needed to find a way to interconnect all of the GPUs so that they all basically form a way bigger brain, rather than operating alone. They made this brilliant acquisition of Mellanox for about seven billion, and they were right, and NVIDIA is where it's at up to today, right? But this is where it gets interesting. That's kind of the bull [00:14:00] case. I don't know if you want to add something to that. Like, they have the lead in terms of engineering, the programming, uh, more is better still.

[00:14:08] Luke: Um, let's, let's recap the conversation we had with Brett and Ryan on Chit Chat Stocks, right? Um, like the whole world now is being driven by this idea: AI is going to become increasingly capable. You move beyond the confines of like the digital world into the real world and start driving cars and piloting robots around, around your home and.

Like AI will get baked into everything. And the thesis for Nvidia pretty much is that they have the best software, as you just described CUDA, but also the best hardware and together that’s almost an unassailable moat over competing hardware designers. But remember Nvidia design the chips, [00:15:00] Taiwan semi manufacture the chips using machines created by ASML.

So you've got this like supply chain that's creating like the tools that will allow all the AIs to run and do stuff that should make all of our lives easier, better, faster, more efficient, help companies generate more revenues, create efficiencies. Like, this is like the oil in the whole stock market right now.

[00:15:26] k: That’s right. And, uh, Spent a lot of time on our NVIDIA episode, talking about the softer side. The more, rather the more intangible qualities about NVIDIA. In short, they have a superior culture. They have the smartest people. They’re workaholics. So their lead seems, or seemed until three days ago, like insurmountable.

Right? For all the reasons that we talked about. I mean, that bull case for NVIDIA is really, really powerful. And seductive, right? Because, The whole world, like no one is [00:16:00] denying that AI is just, it’s, it’s completely going to revolutionize everything and test cases and use cases we can’t even dream of. So when you’re the market leader and you have such a incredibly, what’s it called?

process by which you're out-hustling all your competitors, and like, it's just hard, right? It's hard in many ways. It's hard to overcome such a lead with so many assets on the balance sheet. And now, shockingly, here we are going to talk about how, in fact, this might be a peak. That's what I'm going to try to walk through. I mentioned this too on the, the, the podcast that, you know, under capitalism, anytime you have a monopoly, you also have a giant target on your back. When you're a particular monopoly that, as in NVIDIA's case, potentially runs the whole world, nobody, right? Like nobody besides NVIDIA wants that, because as we're finding out, they could, they could charge whatever they want. There's also a [00:17:00] saying that the market will find a way to surmount the insurmountable, even what seems like the insurmountable. And so this is what I learned recently that I did not know just a few days ago: NVIDIA now has a hardware threat. The threat to NVIDIA is coming from multiple potential paths. Let's talk about the hardware first. There are now companies with massive amounts of resources that are actually designing chips in a very different way from the way NVIDIA designs them, and that kind of solves for their moat.

Let me be more specific. One of these companies is simply making bigger chips. Like much bigger. So I don’t know what the, when you look at these things, they’re like, what, like an inch or so, like most of them seem like they’re kind of like one inch by one inch [00:18:00] square, or maybe a little bit bigger, right.

Give or take.

[00:18:03] Luke: but, like, NVIDIA package them with like hundreds on like a giant board and then they layer those boards in like a massive server. So you have this like city block full of those little like postage stamps. Yeah,

[00:18:18] k: So now a company, now companies, the specifics here don't really matter, have figured out how to make their chips way, way bigger. And now that might not seem like a big deal, but what it actually does is it solves for what's known as the interconnect problem. Remember, one of NVIDIA's moats is that they figured out how to connect all the GPUs together.

Right, but that's not a simple problem to solve, unless you make bigger chips, which means they no longer need to be connected to one another, because each one is so friggin large. So that's a hardware alternative to NVIDIA. That's one. There's another [00:19:00] solution that I came across.

I really hate getting too technical with the jargon, but it depends on something called the TPU, the Tensor Processing Unit, which is designed, so to speak, for exact mathematical operations. Unlike traditional GPUs, where the exact timing of things can vary, these chips perform the operations predictably, exactly the same way each time. The reason that's important is because that helps the inference bit of the AI solution. So in other words, they've created a new kind of chip, call it an advance on the GPU. So CPU, GPU, TPU, right? And NVIDIA does not have, call it, a monopoly over TPUs. Does that make sense?

[00:19:53] Luke: and to clarify, like a T, a tensor processing unit, it’s kind of like a type of [00:20:00] GPU. It’s essentially doing the same sort of thing and companies like Alphabet have their own line in TPUs that they manufacture, they put in like phones and they’re starting to use in their data centers too.

[00:20:12] k: Another takeaway for me is NVIDIA has the moat, but their moat is GPUs, and now a new kind of unit is coming along where NVIDIA potentially no longer has the unassailable moat, right? So that's the hardware, I would say the hardware short case. Okay, and we also need to acknowledge, by the way, that all of the big boys, other Mag 7 companies with, what, massive resources, Amazon, Google, Microsoft, Apple, Tesla, they're also developing their own chips internally, and they have the resources with which to do it, right?

So, when NVIDIA has 90 percent margins, those guys are like, mmm, okay. Point two [00:21:00] is NVIDIA is an IP company, which is a fascinating thing to acknowledge, right? Their 3 trillion market cap rests on their proprietary engineering designs. But who actually makes the chips? TSMC, with ASML's machines. And one of the ways that NVIDIA got to the very top was by poaching the most brilliant minds from other companies like Intel over the, call it, 20 years of their ascendancy. Well, now think about this the other way: you know, the major fab companies on whom NVIDIA depends, they die without them, all of a sudden have massive cash flows, they're the ones that make the stuff. What prevents them from saying to the most genius-level chip designers, hey, come work with us, you know, and help us make our own stuff?

It's just [00:22:00] worth noting that NVIDIA is not vertically integrated, and it's, it's a kind of risk: in a brutal capitalistic framework, you never know when, say, money is going to kind of take your best people away and sabotage, you know, the way it works. And I argued against this myself on the Chit Chat episode. Anything you want to add to

that? You haven't got to the meat yet, right? The thing that's actually changed, which is really the.

Oh, oh, 

[00:22:30] Luke: the bit where we’re going to debate, I think

[00:22:32] k: oh, yeah, yeah. This was just me talking about the hardware stuff, right? So, okay. So NVIDIA has, has things to consider from the

[00:22:38] Luke: everything you said so far, like those are known risks to any NVIDIA investor who’s done the homework and that’s been known for years. And yeah,

[00:22:50] k: It's just that, yeah, I would say it's just that now the risks are more visible, they're being quantified, and the products are now, you know, in [00:23:00] hand within other companies as opposed to theoretical. Okay, anyway, the next risk for NVIDIA has to do with the software side of things. Hopefully I don't have to go on and on for too long about this, but CUDA is the software language, and it's proprietary, and they created a moat for themselves by basically ensuring that the world's smartest people only use it. By world's smartest people, I mean the world's most important GPU programmers are only using this language.

So it's sort of like the network effect we see with something like Facebook, uh, right? You're not going to use the other guy because there's nobody doing it over there, right? So you go where everyone's doing it, which is CUDA. Okay, but we now know that while AMD itself had a pretty awful product that nobody wanted to use, that product is getting better. And [00:24:00] we also know there's a bunch of people creating kinds of more open-source alternatives to CUDA. And they're doing it via different kinds of, uh, systems, a different one at Apple, a different one at OpenAI, a different one at Google. And so it's the same point as with the hardware, but again, with the software: when you have a monopoly, people are going to figure out a way around it, and it hasn't happened yet, but it seems like in the next year or two there might be some inflection point where enough developers are now using

the alternate products, and it's good enough. And then the monopoly breaks. The last

[00:24:44] Luke: Not to lean on that point too much, but the one you didn't mention is Meta with Llama. And actually that is an open-source model family for building AI applications that has actually got really wide acceptance [00:25:00] already today.

[00:25:01] k: Yeah. So it's coming from, from different directions. And the last point is CUDA itself, and this is so far beyond my pace and pay grade, but my understanding is that CUDA itself might get abstracted. Which means, without getting too deep into the technical stuff, that not everybody has to learn how to use CUDA, because it just becomes, uh, another layer in the stack, and I can't explain it to you on the, on the, coding side.

So just assume that there’s merit to that. So that takes us to the big news of today, which is the release of a new model, new AI model called DeepSeek. Badger, do you want to maybe, uh, since I just yapped a bunch, give us the sort of background stuff about your understanding of what was [00:26:00] released and why this matters and I’ll fill in the other more technical details.

[00:26:03] Luke: Yeah. So actually DeepSeek has been around for a little while, and I think the AI company was founded out of like some high-frequency trading firm, like in the finance world, trying to do stuff faster. But just a week ago, on the 20th of January, they released DeepSeek R1, which is like their latest model.

And the reason this has got the whole kind of tech industry really paying attention to what's going on here is, according to all the published benchmarks, DeepSeek R1 is giving almost as good results as one of the industry-leading models, o1 from OpenAI, although OpenAI have o3 as well, so it's giving like at par pretty much with the best models we have, but at far less cost, i.e. [00:27:00] like much less processing time, both for training and, critically, for inference. The reason that's got tech investors interested is like everything we said for the last half an hour was about NVIDIA and AI and ever-increasing demand for more and more like AI chips to do all the AI stuff and make the world work.

Well, maybe there is still a ton of efficiency to be had by improving the software, which is essentially what DeepSeek R1 has done. And you can do so much more with less, and maybe that means we don't need as much AI hardware as we all thought we needed,

[00:27:45] k: Right, so a couple of, uh, a couple of, uh, critical views. This is a Chinese company. So bear in mind that, as a Chinese company, [00:28:00] we always have to put an asterisk on whether what we're seeing is like legit or whether there's, you know, some manipulation. So just, that's important to know, but, but,

[00:28:12] Luke: I think that's, I think that's true. And that's a good guideline if you're investing in Chinese stocks, as I was, and I sold a bunch recently. But this DeepSeek R1 is open source. Maybe there are questions about how they built it and whether they did the training as efficiently as they claim, but the inference.

Like the efficiencies are real because people can run it and they can go, Oh heck, this actually does work, gives me the same results at much lower cost.

[00:28:45] k: yeah, absolutely. just a little bit of a gray area, but in the end, the results are what speak for themselves. Two, that efficiency you were talking about is staggering. from what they say, they did this with fewer than 200 employees. [00:29:00] So that’s tiny relatively to all the mega corporations that have AI departments

[00:29:07] Luke: Well, I don't know if that number of employees is like a relevant metric necessarily, because like everyone stands on the shoulders of giants all the time, right? So we, we talk often about like a solopreneur building a potential billion-dollar company. So I don't think the number of engineers necessarily matters.

You could have like 10 super smart guys and I’m sure they could build a better model still.

[00:29:32] k: Yeah, I don't disagree. I think it's just a relative comparison: for something that shakes the world, when we're talking about trillions of dollars, it's relatively a small number of people. So not, not a major point, but the bigger point is that it cost approximately 5 million dollars to train, versus on the scale of 100 million for the other guys.

So that's about 20 times more efficient. I mean, like, [00:30:00] it's just a shock. It shocked the world. The efficiency bit is what shocked the world. And they did it, like you said, by being more clever rather than being more powerful. In essence, that's right. That's one of the main takeaways. And the way they did this on the technical side is they basically figured out that once you're dealing with the inference side of the model, you can sacrifice absolute precision for something we could say is good enough.

And if you let enough good enough work together in more clever ways, the output will still be stupendous, even though it's not like precise to the last decimal point, and that's what allows for so many of the efficiencies,
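A tiny sketch of that precision trade-off (plain Python rounding standing in for real low-precision hardware formats like FP16 or FP8):

```python
# Illustration only: trade numeric precision for cheaper storage/compute.
weights = [0.123456789, -1.987654321, 3.141592653, 0.000123456]

def quantize(x, sig_digits=2):
    """Crudely keep only a couple of significant digits."""
    return float(f"{x:.{sig_digits}g}")

low_precision = [quantize(w) for w in weights]

print("full precision:", weights)
print("low precision: ", low_precision)
# The rounded values are slightly off, but for many inference workloads
# the final answer is still "good enough" -- at a fraction of the cost.
```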

[00:30:47] Luke: And I ran into a really nice analogy, uh, somewhere on Twitter, so I can't attribute this; it wasn't my idea. If you think about, let's say, OpenAI's latest ChatGPT model, [00:31:00] they've trained the model and it's almost like you could imagine they've got this AI with like a world-class lawyer, doctor, engineer, biochemist, physicist, like every discipline, because you've got all the knowledge in there.

And at inference time, i.e. when you go on there and you ask your question, it's like the model asks all of these different experts and tries to come up with the best answer. And the innovation in DeepSeek R1 is that at inference time, when you ask the question, it kind of figures out which of these specialists is going to give you the best answer.

So if you're asking like a legal question, it'll use like the legal part of the model and it won't activate the mass of the rest of the model that's actually not going to be that relevant to the specific question you're asking. So it's like having a phone book with all of these experts in it. We only call up the ones that are actually going to give you an answer, that are actually going to [00:32:00] help you answer the question you're trying to answer.
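A toy sketch of that "phone book of experts" idea (a hugely simplified mixture-of-experts router, not DeepSeek's actual implementation):

```python
# Only the relevant "expert" is activated for a given question,
# instead of running the whole model every time.

EXPERTS = {
    "legal":   lambda q: f"[legal expert] answering: {q}",
    "medical": lambda q: f"[medical expert] answering: {q}",
    "coding":  lambda q: f"[coding expert] answering: {q}",
}

KEYWORDS = {
    "legal":   ["contract", "lawsuit", "liability"],
    "medical": ["symptom", "diagnosis", "dosage"],
    "coding":  ["python", "bug", "function"],
}

def route(question: str) -> str:
    """Pick the matching expert and skip the rest entirely."""
    q = question.lower()
    for name, words in KEYWORDS.items():
        if any(w in q for w in words):
            return EXPERTS[name](question)
    return f"[generalist] answering: {question}"

print(route("Is this contract clause enforceable?"))
print(route("Why does my Python function return None?"))
```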

[00:32:02] k: Right? So I think we've gone into the technical bit from all sides, right? Enough. The punchline here is this, this is what it comes down to. You now know, like the people, uh, running Facebook and all the Magnificent Seven now know, because they, like you said, could look at the open-source code, that they could get amazing results for, call it, 95 percent off the price right now, whatever the discount is, a massive discount.

Right? So how, now, in this moment, do the CEOs of those companies justify paying NVIDIA whatever price NVIDIA was charging, when with this little guy they could get it for free? That's the open question right now, in the moment, that the market is trying to figure out.

[00:32:58] Luke: And I think this is, it [00:33:00] shined a real light on this question, but it's not a new question. Like we've always known, like there are multiple ways to get efficiencies, either with like better, faster hardware, or chucking more hardware at it, or asking the questions in a smarter way with better software. And it was always clear that there were efficiencies to be had in the software. It's just that DeepSeek R1 a week ago has suddenly done it. They've actually delivered a really substantial software efficiency just by designing the system differently. And that means, as you say, it's open source, that means like, if we put NVIDIA aside for a minute and we just talk about, let's say, like the main buyers of NVIDIA hardware, say like the Metas and the Amazons and the Googles and these guys, the Apples, they build their own models, so they can leverage the open-source techniques that DeepSeek has shown off [00:34:00] into their own models, and then without having to go and buy a ton more hardware, they can in theory do a ton more with what they have.

So this is actually, I think, going to be good for the model providers like those guys, the hyperscalers, and it’s going to be especially good for all the businesses and like you and I, people who are trying to use AI. To either make our lives more efficient, or if you’re a business generate more revenues, because you’re going to get all of this incredible capability much cheaper.

So I think it’s, to me, it seems unambiguously good for tech, but the question we’re facing here is, is it good for Nvidia and maybe AMD and Taiwan semi and ASML, like all the guys who essentially. Their end product is the hardware this stuff runs on, because it’s a legitimate question. Maybe we don’t need as much [00:35:00] hardware as we thought we were going to need.

[00:35:01] k: Yeah, and you know what I think of, uh, in this moment is this is still the kind of AI that we know is not human-level. Because human-level has to have the capacity to, quote unquote, understand itself. Which some people say will never happen, and some people, you know, this is the futurists' AGI moment, right?

This is the great singularity that will or won't happen. Well, when you have leaps like this, you all of a sudden have to think, oh shit, it might be sooner than we think. I want to actually read one quote from this long, uh, essay I read to prepare for this, one of the sources I used, because I think it really touches on this key, uh, um, um, point.

We're going to have a link to this source in our show notes. Quote: with R1, DeepSeek essentially cracked one of the holy grails of AI, getting models to reason [00:36:00] step by step without relying on massive supervised datasets. Their DeepSeek R1-Zero experiment showed something remarkable.

Using pure reinforcement learning with carefully crafted reward functions, they managed to get models to develop sophisticated reasoning capabilities completely autonomously. This wasn't just about solving problems; the model organically learned to generate long chains of thought, self-verify its work, and allocate more computation time.

So this is that aha moment, and for me, like, the conclusion is: is NVIDIA now a short? Is it a long? I don't know anymore. I'm completely torn and ambivalent, because this makes me think AGI is coming sooner than we thought. Well, potentially, in all the use cases like humanoid robots, I mean, you know, Optimus-like robots, they might still need a [00:37:00] lot more juice, which NVIDIA is the best at. But then there's all the other reasons we talked about why they won't be the only game in town, and it's an issue of margins now coming down, from a 3 trillion valuation where margins were, you know, as high as they could be. Being pressed, is NVIDIA a good investment now?

[00:37:22] Luke: And I think it is a valuation question, isn’t it? Because like when you’re investing growth stocks, you’re not buying them for the earnings you get this year. You’re buying them because you think those earnings are going to compound and grow and grow and grow in the future. And you’re buying like all those future cash flows.

And up until a week ago, everyone, no one could see an end to the growth that NVIDIA was, was undergoing. Like we chatted about some of the numbers in the chit chat stocks episode. like NVIDIA is a three and a half trillion dollar company, or at least it was yesterday doing doing like a hundred percent revenue growth.

And they were forecasting [00:38:00] like another doubling of revenue again and again. So the market expected this train of growth just to continue and continue. And now people are asking skeptical questions, which they should have been asking already. And they’re now asking those questions of, well, themselves when they consider their allocation to NVIDIA, if they’re an NVIDIA shareholder.
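A rough sketch of why those growth expectations dominate the price (illustrative made-up numbers, not NVIDIA's actual financials):

```python
# Discounted value of ten years of cash flows under two growth assumptions.
# All figures are arbitrary, purely to show how much of a growth stock's
# value sits in the assumed future growth rate.

def present_value(cash_flow, growth, discount=0.10, years=10):
    total = 0.0
    for year in range(1, years + 1):
        cash_flow *= (1 + growth)
        total += cash_flow / (1 + discount) ** year
    return total

optimistic = present_value(100.0, growth=0.40)
tempered   = present_value(100.0, growth=0.15)

print(f"Implied value at 40% growth: {optimistic:,.0f}")
print(f"Implied value at 15% growth: {tempered:,.0f}")
# Trimming the assumed growth rate cuts the implied value by well over half,
# even though nothing about this year's cash flow changed.
```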

[00:38:21] k: You know, what I want more than anything else right now is to be a fly on the wall inside NVIDIA's headquarters and to listen and hear how Jensen is addressing this. And I don't know to what extent he already knew about it or, you know, no, no idea. But one thing I learned about NVIDIA is that, you know, their core principle is that they're always 30 days from being extinguished.

So they have to always be at the front. I would love to understand their thinking in this moment, whether they feel, you know, that they're as confident as they always would be or whether they're experiencing this as an existential [00:39:00] threat.

[00:39:00] Luke: Talking about existential threats, I think I quoted Andrej Karpathy when we spoke to Brett and Ryan, because we talked about DeepSeek in that conversation, and his point was: does this mean you don't need large GPU clusters for large language models? No, but you have to ensure you're not wasteful with what you've got.

And this is a good demonstration that there's still a long way to go with data and algorithms. He's a super smart guy, and what he's saying there is like, yeah, we're always going to get these improvements and efficiencies. And in some ways, right, this is Moore's law in action: we can do things increasingly quickly, at lower cost and lower power consumption. And this is just how technology evolves. And here, okay, we've had like a step change, because seemingly DeepSeek has just gone, wow, like we can do a lot more with a lot less. But that's, you know, Moore's law, and these improvements, they [00:40:00] don't come on this like slow slope.

They come all at once. It’s like a jagged step. And we’ve just taken a bigger step. You know, maybe this is just the way the world operates.

[00:40:11] k: So, uh, question for you, NVIDIA is down 13 percent right now, which, yeah, is, is. Significant. are you tempted to buy given what everything we’ve said?

[00:40:24] Luke: No, because I’ve, I’ve owned it for quite some time since you and I did battle at seven investing over who would get to recommend it. and I’ve been trimming it along the way because of stuff like this, because it has always seemed like it’s been a pretty expensive stock. , like I wish I hadn’t been trimming it because it just continued to go parabolic, but now maybe it’s going to come back in line.

I don’t know. I feel like I’ve got it. About the right size in my portfolio, which is like a one and a half to 2 percent allocation.

[00:40:54] k: So NVIDIA is one of the stocks I sold out of completely, and then I bought back, you [00:41:00] know, my typical one share as a flag, you know, in the portfolio. And this morning, uh, I added an extra share, which I don't know if this is like meaningful at all, but you know, the thinking is, okay, 13 percent dip.

Basically, to me, trying to explain myself to myself, it meant: don't buy the short thesis fully yet. And I just wanted to signal that we've got a long way to go. So I doubled my position to two shares.

[00:41:36] Luke: That’s cool, that’s immaterial, right? But you’re putting like another marker in the portfolio to go, I was bullish at this price point, I guess.

[00:41:44] k: Yeah, yeah, exactly

[00:41:45] Luke: But maybe the question here is, I don’t think anyone can answer this question right now, especially when emotions are running high as they seem to be on Twitter, on X.

Will this efficiency mean that we need less, and so we use less hardware but deliver the same thing? [00:42:00] Or, as I'm kind of inclined to believe, the efficiency means we still need a crap ton of hardware at an ever-increasing rate, and initiatives like Stargate, like the US's push to invest 500 billion dollars in AI over the next couple of years, those are still necessary, but it means we're going to do even more with the hardware that we're buying. Like suddenly we can deliver far more in terms of economic benefit, but we still need more hardware to let us do that.

[00:42:30] k: to be, uh, remains to be seen, right? I wonder how this episode will even age in, in a couple of weeks, because we’re talking about it. Like this is the front lines. This is just the sort of first couple of days where everyone’s trying to process this.

[00:42:46] Luke: Well, it's quite interesting, right? Because like on Friday, you and I, we were both relatively aligned and we told essentially like the bull case for NVIDIA. But then three days later, [00:43:00] you came into this episode and the first half an hour was you going on about like the huge bear case around this stuff.

So you seemingly have like 180’d, albeit you just doubled your stake. I kind of feel the same from last week to this week, and I’m more optimistic about AI. Now, maybe it’s because I’ve got a much bigger allocation to, to me, the undoubted beneficiaries, which are companies like Google and companies like all the small tech companies that are going to use this capability.

Like intuitive surgical, like my biggest allocation to a robotic surgery manufacturer. Like they can suddenly do more with all the data they have at lower cost. They can start to like, bring like FSD to surgery. I see a huge upside for my tech investments and my little NVIDIA allocation. I’m still positive about, I’m certainly not going to be selling my, my 2 percent NVIDIA holding because of this news.

[00:43:58] k: [00:44:00] Yeah, that makes sense. I'm glad you said that. From the broad portfolio perspective, this is good news. For me, I think the final takeaway is I am deeply, I'm sorry, I am, I am significantly concerned about NVIDIA's margins going forward, [00:45:00] and from such a lofty valuation. That would make the company still a truly brilliant company, but a very poor stock potentially going forward.

So let’s see what happens.

[00:45:14] Luke: That’s a, that’s a good distinction to draw. Great, still continues to be a great company, but it was an expensive stock and maybe even looks even more expensive as a result of the weekend’s news

[00:45:25] k: Right. And okay, let me just reiterate one point about that: when your margins are 90 percent and you get to charge whatever you want, as soon as that shifts, even a little, the financial metrics shift with it. Whether it's almost like a death knell is to be determined.
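A quick worked example of that margin sensitivity (illustrative figures only):

```python
# How a modest slip in gross margin flows through to gross profit.
revenue = 100.0   # arbitrary units

for margin in (0.90, 0.80, 0.75):
    print(f"gross margin {margin:.0%} -> gross profit {revenue * margin:.0f}")

# A slide from 90% to 75% cuts gross profit by roughly 17% on the same
# revenue -- and a stock priced for 90% margins can fall much further
# once the market reprices its future earnings.
```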

[00:45:46] Luke: Well, we’ve got a whole other episode to get through. Uh, should we dive into one of our other topics?

[00:45:51] k: Sure, uh, I wanted to say something quick-ish around [00:46:00] crypto. On the one hand, I have never been more bullish about crypto, when the president of the United States himself is now as positive for the industry as you can be, making all the changes, from people who want to see it dead to people who want to see it thrive.

So on that level, like this is like crypto’s new golden era and we’re at the very beginning, right? At the very same time, like I’ve said all along, crypto itself is a meaningless term because you have the legit projects. And then you have the vast majority, which are circus acts, almost literally in existence to, bankrupt people.

It's like a game of, you know, musical chairs. It's pure speculation. It's pure gambling. It's pure nonsense. And it was just shocking, or not shocking, because we should come to expect this, but right before the inauguration, Trump [00:47:00] released his own coin, which, uh, made, made billions. And then, in the most Trump-like script, his wife Melania released a coin, which made people realize that this now, it itself is a circus run by the president. So Melania's coin basically rug pulled, or caused the whole, like, gambling side of things to, like, drop massive percentages and made people, like, lose their money. And it made it more obvious than ever that you just don't know what Trump is liable to do, is willing to do, on that end; that it is mostly circus acts; and that you had better, better be damn sure, if you're investing in the space, to not get yourself involved with any of that nonsense. Keep your eyes very strictly focused on the real-world use cases, which again, for the [00:48:00] 20,000th time, to me is Chainlink and Bitcoin.

[00:48:03] Luke: Fair enough. And maybe arguably Ethereum and a few others. Apart from those top 10 or so really established projects, most of the other stuff is, as you say, a circus. Now let me just put that aside, okay, 'cause I was scratching my head about it when you put this topic on the, uh, on the agenda for today's episode.

Like, how do I feel about it? I mean, it’s just ludicrous, right? It’s, I mean, I would be embarrassed if a leader of the UK government tried to pull the same stunt, like they’d never get away with it, but you’ve got to be, you’ve got to be the Donald to even get away with this. Does it really change anything though?

Like, does it really hurt anybody? That’s probably the bigger question. And I suspect, like, at the margins, and maybe a few people, but probably, the people who got involved and bought, [00:49:00] like, Trump coin or Melania coin, they were probably already, like, crypto bros, crypto guys and girls. Who thought they were playing, they knew the game and they’re playing like the greater fool theory and they thought they were going to like jump off the roundabout just before the music stopped and scam everybody else.

And maybe they got scammed along the way, because like Melania coin effectively like turned off the lights on Trump coin. But if you live in this space, right, probably every week you find some hot idea like Fartcoin, or whatever the thing was, like the Hawk Tuah coin a few weeks before that, and you're probably used to having your pants pulled down.

 Here is the president doing it rather than some like YouTube celebrity. But, hopefully there weren’t too many like mom and pop Trump supporters who got into crypto just because of Trump coin. And then they lost like a chunk of their wealth. That’s where the real pain would be. I’m hoping that’s like few and [00:50:00] far between.

[00:50:00] k: I think this is actually more severe. I'm going to be a little cynical here for a bit. Even though I've been deeply engaged in the space for, for some years now, call it three or four, and, you know, I do things legit, I read, I go deep into the projects, most people, most people, I don't know what the percentage is, like 95 percent, don't know shit about crypto.

They don't understand the technology. They don't really understand anything beyond whatever meme or headline they see. So when the president of the United States engages in the exact kinds of things that give crypto a bad name and, like, make it seem like a circus, it confuses people at, like, best.

And at worst it bankrupts them, you know, it like steals their money. And then the message people take away from that is like, crypto [00:51:00] is a giant, you know, heaping pile of dung, which they're right to say if the president of the United States is doing this kind of stuff. So it's just so stupid. Every serious crypto person that I saw talk about this is like, why, like, really, Donald? You know? It's only worse because at that point he's acting against his own self-interest, too, because he has massive amounts of money invested in this stuff now.

So it's just fucking idiotic. I hope we leave that in, because

[00:51:41] Luke: Yeah. It's an embarrassment, but, you know, we probably shouldn't be too surprised. Probably a little bit surprised if you're David Sacks, who's like the newly appointed AI and crypto czar, and, you know, you've shown up for the job on day one, and you're going to try and help like get crypto [00:52:00] accepted and regulated.

And then suddenly like your boss pulls a fast one like this, and it just brings even more disrepute to the space.

[00:52:10] k: Yeah. The takeaway for me, though, is this is short-term, more fart noises, and something like Bitcoin and Chainlink are as strong as ever, and their projects have never had a more positive backdrop, and I can't wait to see what happens for those in '25. Which, by the way, one last thing: on our Patreon, I ran a poll about, uh, Chainlink, asking people to predict where Chainlink will end the year in terms of hierarchy and ranking.

And we’ve got some interesting feedback, from all our patrons. So check it out if, if you’re interested in casting your vote and making a prediction about Chainlink.

[00:52:49] Luke: And just for context on that, how big is, and also where in like the strata of cryptos is, Chainlink today?

[00:52:57] k: So, uh, when the poll went out, it was about [00:53:00] 12th

[00:53:00] Luke: 'Cause, like, what, the 12th biggest by market cap, is that what you're saying?

[00:53:04] k: Yeah, by market cap. But you have to realize Bitcoin is like, whatever, a trillion; Ethereum is, call it, like 500 billion. In 12th place, tiny Chainlink is only something like 15 billion. So it's still orders of magnitude difference.

[00:53:19] Luke: So something else I’ve been doing on the Patreon lately is using our new monkey and badger trades chat group. And we’re not, we’re not like guys where we’re saying like, you must trade today. And it’s hour by hour changing our views on stuff.

But we are now using that comms channel to tell our Patreons when we’ve just done something, or in this case, when we’re about to do something, because I had a bunch of trades I did in my own investment portfolio. So I’ve kind of cleaned things up a little bit. But I did want to flag. A company I just added to the portfolio because I’ve just bought myself a starter stake in Remitly.

Heard of [00:54:00] Remitly?

[00:54:00] k: I’ve heard of it. I’ve seen the name, but it’s, you gotta hit me with the goods badger.

[00:54:04] Luke: Yeah. So it's not wildly dissimilar. You might remember like a month ago I said I bought a stake in Nubank, the, uh, Brazilian fintech. Remitly is like another fintech, and these guys are experts in remittances. The majority of the volume there is like overseas workers; in Remitly's case, mostly working in the US. You've got like, say, tons of Indian nationals, and people from other like lower-cost parts of the world, developing nations, working in the States, earning money, sending money home, remitting money home to their family or, you know, to buy a house or whatever they might be doing back home.

And so Remitly make that process kind of digital-first, seamless, low cost, fast, efficient, secure, safe. And, uh, they've got just a good track record of doing this. I kind of liked the, I liked the idea of the company. I've had my eye on it for some time. I think I was [00:55:00] reassured with, you know, we just cast aspersions on Trump for his kind of crypto shenanigans, but he has also made some quite smart comments, I think, around focusing his anti-immigration

rhetoric on illegal immigration and maybe being more supportive of legal immigration and things like H 1B visas. So, I don’t think we’ll see some of the nonsense we saw in his First four years, where it’s quite difficult to become like an overseas worker in the States. I think we’re probably going to see a lot of support for that.

And so I feel this sort of plays into companies like Remitly, , and just seems like a really well run firm with a pretty reasonable valuation. And they’ve just flipped into. Being operationally positive, I think like one quarter ago. So that kind of caught my eye. I like that when companies start to become a bit more self sufficient.

So yeah, I’ve added like a starter 1 percent position [00:56:00] in my portfolio.

[00:56:01] k: So this is big news because anytime based on your investing philosophy, anytime you add a company. I think it’s a strong signal because you’re not as active as I am. Same question I had for you, uh, previous week ago, I think a couple weeks ago, is it the banking industry itself because of your inner expertise that allows you to have more confidence in the business model or the moat that you’re looking in the sector some more?

[00:56:31] Luke: I think it's more that, I mean, there is certainly a similarity, I feel like I understand payments pretty well, but I look across all my portfolio and it's hard to find reasonably valued stuff. Like most of the stuff I'm invested in, well, NVIDIA was a good example, is like pretty overvalued. But in the fintech space, I found a few opportunities there where they seem to be at more reasonable valuations.

Like Wise, I've owned for some [00:57:00] time, Wise plc, and actually I still think it's a bargain. Nubank seemed to be pretty reasonable and was put on sale by, like, the macroeconomic conditions in Brazil of late, with like hyperinflation essentially. And Remitly, they haven't taken a big hit, but they seem to be kind of reasonably valued for where they are in their maturity.

So it’s probably more that like FinTech seems to be a bit more reasonable than some of the other sectors.

[00:57:28] k: And I’d like to point out for our listeners that. What tends to happen over time is when you get an interest in a particular sector, like Badger is interested in fintech stuff, it becomes easy to understand each company and their differences, and you end up acquiring that kind of, uh, know what you own level expertise, which makes it easier for you to Understand the thesis one, but also not get shaken out when [00:58:00] some flash bad news hits and you just understand things better.

So for you, I think of you as Adyen, Wise, now Remitly and, uh, well, Melly, I guess, Mercado Libre is also like in that bucket, their payment section, right? There’s a, and what was it, um, Nu, Nu, right? Yeah.

[00:58:23] Luke: And Sea, Sea Limited in Asia. Well, they've got SeaMoney. Yeah, I got a few fintech-y, e-commerce-y investments. So I feel like it's a sector I've got reasonable exposure to

[00:58:34] k: So, uh, right. If we iterate, uh, it’s on our Dolphins, uh, our Dolphins membership tier where you get to hear whenever we make a trade like this in approximately real time, give or take.

[00:58:48] Luke: and chat about it with our, uh, our loyal Patreons. So, yep, absolutely. All right. Should we, uh, should we go on safari with a couple of companies before we wrap up today’s episode?

[00:58:58] k: What do you think?

[00:58:59] Luke: I think we [00:59:00] should. I got one I’m, I’m trying to tell you about, but do you want to tell us about your one first?

[00:59:05] k: Sure. Uh, given the, the intense tech talk that we've front-loaded you all with, I'm, I'm going to try to spare you here. Uh, unfortunately, this is a really sort of tech-heavy investment. So, the company is called Rubrik. The ticker is, uh, RBRK. And I think of this as fulfilling a missing bucket in the cybersecurity realm.

So this is why you're going to love this, Luke. This plays extremely well with Zscaler, CrowdStrike, Palo Alto, and Okta in, in many ways. And all of those companies are both, well, they're working together and they're also competing. So [01:00:00] you have to know that up front. The short version is that these guys are in the business of restoring your data. Like, um, it's not the prevention, so to speak, aspect of cybersecurity, but making sure that systems are in place once, in a sense, something has gone wrong. So it's sort of, uh, I never thought about this really, but, you know, I kind of thought like with CrowdStrike and Zscaler you've got all that you need.

Right. But apparently cyber crime is such an existential threat to companies that this company is now several billion dollars in valuation. Let me, uh, check the exact number for you. It's bigger than I realized: it's 13.4 billion. For a business to be this big, I just reverse-engineer and say, like, there was obviously a market fit, even with a company like CrowdStrike and [01:01:00] Zscaler already existing. So those guys weren't doing what these guys are doing. There's so much jargon I could bore you to death with.

But Rubrik Security Cloud is what they call their product. And the statement is they provide a simple way to test, validate, and document the success of an organization's cyber recovery plans. It provides businesses a way to instantly recover the last known clean copy of data into production while performing forensic investigations out of band in an isolated recovery environment. So that's it. I think you, you of all people, I imagine you owning a company like this to broaden your cybersecurity basket. Their financials, it's like, for me, researching this was deja vu all over again. It's the same kind of stuff as with Cloudflare and, yeah, CrowdStrike: same kind of metrics, same kind of we're losing a bunch of money upfront, [01:02:00] but we're growing at 60 percent in some segments, a hundred percent in other segments.

The net retention rate is 120 percent; customers are just buying more and more of your thing. A high-growth company, not profitable right now, but the trajectory looks like they're approaching cash-flow positive pretty soon, even though the, um, operating losses right now are pretty severe.

And so this is one of those things, like, you know, to tie in with our data conversation earlier, the more data you have, the more needs to be secured. And breaches will happen. These guys solve the 'what now?' problem.
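For context on that 120 percent net retention figure, here is a small sketch of what it implies for revenue from existing customers (illustrative only):

```python
# Net retention of 120% means last year's customers spend ~20% more this
# year, before counting any brand-new customers.

revenue_from_existing = 100.0   # arbitrary starting revenue
nrr = 1.20

for year in range(1, 6):
    revenue_from_existing *= nrr
    print(f"year {year}: {revenue_from_existing:,.1f}")

# After five years the same customer base is worth ~2.5x what it is today,
# which is why software investors watch this metric so closely.
```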

[01:02:43] Luke: Yeah, I guess there’s a ton of money in, uh, in cybersecurity and maybe these guys help you. Like, what’s that saying, like bolting the door after the horse has bolted or whatever it is. Like if you’ve had an incident, these guys can help you at least recover from it cleanly.

[01:02:59] k: Yeah. And, you know, [01:03:00] they project their TAM, uh, to be taken with a pinch of salt, to be about 53 billion not too long from now. So as a 13 billion company at the moment, uh, you know, it's the same thesis as with the other guys: that it's going to keep growing. And the proof is these guys have customers.

So the only, the, the bear case I would make against the company like this. Would be what if CrowdStrike company outcompete them at their own game somehow, which I, as a non expert, have no way of knowing how close that is to happening. So far, obviously, they haven’t been able to do that. Who knows what tomorrow holds.

pure transparency, I actually took a small position in this company. couple of weeks ago. So just like one of those sample size nibbles.

[01:03:58] Luke: Great stuff. [01:04:00] I haven't bought my safari stock yet, but it's one that's been on my radar for about a year and a half, and it suddenly went on sale after delivering less-than-stellar results. So, uh, have you heard of Greggs?

[01:04:15] k: Say that again? Have you heard of Greggs, G-R-E-G-G-S? Greggs. Yeah.

I have not.

[01:04:23] Luke: It’s a, I’m no surprise there. Uh, do you know what a sausage roll is?

[01:04:28] k: vegetarian, I want to say not really. Or unless it 

[01:04:31] Luke: They are famous, they are famous for their vegan sausage roll, actually, funnily enough. No. Uh, so anyway, Greggs are, I suppose they're a bit like, like the UK high street version of, I mean, I might liken them to, like, what's that, a Chipotle, say, for example, you know, like a sort of North American bit of a cultural thing.

And, uh, you know, they're in every major [01:05:00] city. There's like probably two Chipotles within sight of each other; Greggs is kind of like that. They've got thousands of branches across the UK, traditionally seen as a more sort of Northern-y thing, not that I'm advocating this North-South divide, but it's kind of, you know, like traditional hearty, not very healthy food.

 It’s a over a hundred year old company. Uh, surprisingly, and they, yeah, like they’re just fricking everywhere. And I looked at them about a year and a half ago and they seem quite interesting because I have always had a hankering to get me some dividend paying stocks, like some income stocks in the portfolio.

And Greggs have, they had a pretty decent, like, dividend yield, about 3%, not bad. Um, but I was put off investing in them a year and a half ago because they were just going into like a major expansion phase, building like a whole new manufacturing plant in the middle of England. And then like preparing to scale up and massively increase their footprint, like the number of [01:06:00] branches.

Um, so that put me off and I was, that was a mistake as it turned out, because actually the company’s done great. The stock has done great while it’s been expanding, but they delivered less than stellar results, uh, a couple of weeks ago and they’ve taken like a pretty massive hit for like a slow, steady grower and stock price or the valuation is now back to, uh, really where it was several years ago.

So quite tempted to add this one back to the portfolio. I quite like the idea of having like this. I mean, I don’t love the food myself. The coffee is among the worst coffee I’ve ever had in my life. But, uh, well, they have their raving fans in the UK.

[01:06:42] k: Oh my god, where do I start with this? Are you, are you about to add a shitco to your, to your pure, pristine portfolio?

[01:06:53] Luke: not. No, no, 

definitely not. This is actually a rock-solid, like, hundred-year-old brand that has a really [01:07:00] good ethos where if you work at Greggs for like a certain amount of time, like I don't know how long, it's like a partnership model. So actually you become like a part-owner as an employee.

They’ve actually got really good culture. And as I say, they’ve got raving fans. It’s almost like a kind of like meme food. Cause it’s kind of acknowledged as being like not great in terms of health, but it’s convenient and they’re everywhere and it’s cheap. And if we do go into a recession, which is, you know, who knows, right.

But if we do go into a recession, it's probably more likely in the UK than it is in the US, then like some of these sort of cheap eats and these staples, those are the kind of companies that will benefit in that kind of tight fiscal environment. So yeah, it's definitely, definitely not a shitco.

[01:07:49] k: Okay, I’m gonna leave it to our Patreon audience to decide. Like, when, when Badger’s going on about how terrible the coffee is, and, and how bad it is for you, and it’s [01:08:00] sausage rolls. Uh, anyone listen to this, please, uh, in the comments section, let us know, is Badger’s portfolio all of a sudden, uh, does it have a little bit of a smudge of, of mud, uh, kind of leaked in?

One follow up question. It’s, um why, if this is such a steady company that’s, you know, like long and boring, , what was the big problem that led to the underperformance?

[01:08:26] Luke: Yeah, I dunno, I think it was just like a reduction in growth or maybe even like sales going backwards. Honestly, this is like a safari stop for me, so I haven't done my deep research. I haven't bought any shares yet. I'm just kind of flagging this on my radar, but I'm planning to do that research this week because I'd kind of like to add it if I can. I think you can make an argument.

It's an interesting philosophical question. You can make an argument for buying or adding to a stock when its share price is going up and when its share price is going down, but [01:09:00] you're not doing it because of those things. You're kind of doing it in spite of those things. And, uh, here's a company where the valuation seems to be retracting.

So I need to get under the covers of exactly why that is. And if I can alleviate my doubts, or any potential concerns that it's getting hit because actually the brand is doomed and it's selling less stuff and its fans are maybe going off it, then I think that will be a good opportunity to buy.

Well,

[01:09:29] k: So there you have it, folks. In one episode, we went from talking about the most, uh, complex and sophisticated of technologies, NVIDIA chips and DeepSeek tech revolutions, to shitty coffee and sausage rolls,

with a little Melania and Trump crypto coin madness to even things out in the middle. [01:10:00] So we've got you covered. For any of you who want to know more and want to engage with us, uh, one on one, uh, our Patreon is where we're at, so patreon.com/wallstreetwildlife. Special shout-out, uh, to Simon N., Olof H.,

Richard A., Aron H., and my own dear Hannah P. Uh, thanks for joining us on, uh, this great, wonderful safari adventure called Investing.

[01:10:33] Luke: I’m sure our Patreons are, and I know you and you and I are Christophe, but to ask our audience, are you ready to become a beast of an investor?

[01:10:42] k: Your journey starts right friggin here.

​[01:11:00] 
