On a recent episode of Odd Lots, we talked about Intel, and how the former dominant American semiconductor company was stumbling. But big things are happening in the chip industry beyond the manufacturing woes of one company. As it turns out, we're seeing a dramatic rethink of chip architecture and what chips can do, with more emphasis on specialized semiconductors that are really good at performing a specific task. One company that's blazing new ground is Apple, whose M1 chip is earning rave reviews online. We speak with Doug O'Laughlin, a former buy-sider who now writes the newsletter Mule's Musings, about the industry and other things in tech.
Correction: A previous version of this description misspelled Doug O'Laughlin's name.
Hello, and welcome to another episode of the Odd Lots podcast. I'm Joe Weisenthal. And I'm Tracy Alloway. Tracy, I had a thought, and it might be a little controversial. When have you ever not opened with a controversial thought? No, I know, I'm just going to throw this out there and you tell me what you think. What if we made Odd Lots not about markets and finance anymore, but just about semiconductor history? Could we maybe start with a semiconductor series before we completely rebrand the podcast? All right, we'll revisit the question of a total rebrand. But I don't know about you, I found our episode a couple of weeks ago with Stacy Rasgon from Bernstein about the decline of Intel to be fascinating. And after that, I just wanted to talk more chips. It really whetted my appetite. It was definitely a lot more interesting than I had been expecting. I think I mentioned when we recorded it that I hadn't been following all the ins and outs of the semiconductor industry, except of course as it pertained to the US-China trade tensions. So I wasn't aware of some of the competition between the big makers, like just how bad things had gotten for Intel, and I wasn't aware of some of the strategic decisions that were going into the making of their chips, and the way that some other chip makers have proceeded in a very different direction. Yeah, right, it's that different direction that's interesting, because we talked with Stacy a lot about the sort of manufacturing issues that Intel has been facing as it tries to unveil its seven nanometer chips. But there's more to the story than just faster and smaller chips. There's also a fundamental shift happening in what chips are and how they're used and how they're designed. It goes beyond just manufacturing problems. Yeah, and I guess the big news on that front is Apple and its new series of Mac processors.
That news came out this month, and everyone, as far as I can tell, all the technologists with YouTube channels, are very excited about what exactly this means. I don't completely understand the technology, but I am interested in learning why everyone else is so excited and why a lot of people are saying this is a game changer. There's one guy on YouTube who was talking about how this is going to put Apple computers on a different level from other types of computers. So in the way that we don't necessarily compare iPhones with Androids anymore, we will no longer be comparing Macs with, you know, your average PC. It's just two different types of equipment. Yeah, there was a Medium article about the Apple chip that went viral, and articles about chips don't typically go viral, so I think that gives you an indication of the enthusiasm, the excitement, that people have for all this stuff. So definitely a lot to talk about. But I have to admit, I still don't really get it. Like, I'm trying to read about this stuff and understand it, but I don't totally get it. Yeah, I'm in that same place. So today we are going to be talking, as anyone could have guessed, more about chips. And I'm very excited because, in addition to being a sequel of sorts to that other chip episode, our guest, who writes on this stuff and is prominent on Twitter, etcetera, is going to do something we've never done before. He's going to dox himself on this episode. He's been writing and tweeting under a pseudonym, and he's agreed to come on and, with this episode, actually tell us who he is. So this is a totally new thing for us. I don't think we've done this before. No, we definitely haven't, and we should just stress this is a voluntary self-doxxing. We haven't forced him into it or anything like that. That's exactly right. So we're going to be speaking with the author of the newsletter called Mule's Musings.
He tweets under the handle @FoolAllTheTime, great stuff on the matter of chips. But we're going to find out now who he is. So, Mule, thank you for joining the Odd Lots podcast. Who are you, and why are we listening to you? Hey, my name is Doug O'Laughlin, and I am just a huge nerd with an investing background. I previously worked at a buy-side firm in Dallas, and, kind of taking a break between things, started writing a newsletter. And clearly there's been a lot of interest in semiconductors, not just from you guys but from the investing community at large, and it kind of just hit a note. So I'm really obsessed with the whole revolution all the YouTubers are going crazy about. It's not just the technology stuff, there's a lot of business and investing things that could happen, and I think it's truly a revolution. It's about as exciting as it's ever been in semiconductors, which is frankly not that exciting, but it's still very exciting to me. So yeah, that's what I've been doing recently. I have to say that while you were introducing yourself, I just had a look at your Twitter profile page, and I saw the binary code on your profile, and I looked it up, and you must be the first Odd Lots guest we've ever had who has basically done a practical joke in binary code. So kudos for that, and I guess it gives some indication of your interests. And everyone else can go and check it out and translate it into English to see what it actually says. But before we begin, I'm curious, why did you decide to start the Substack, and why were you tweeting under a pseudonym to begin with? Well, in the past I was employed, and you're not supposed to talk publicly about public markets if you're in the business. So that was the big reason for a pseudonym.
And frankly, FinTwit is an amazing community. There's more information there than anywhere else, so it's an awesome place to learn. But obviously you have to be extremely private if you choose to do so. So that was the reason for the pseudonym. And then the Substack, I guess, I'm just kind of in between things. Next year, if everything goes right, I'll take a little bit of a gap. But, you know, I was born to do research and stuff like this. This was pretty much what I was doing in my free time anyway, and I thought, heck, a lot of other people could really learn a lot from this as well. And so then I started writing for a broader audience, and writing in general has been really fruitful. It's a crazy skill to have, and I'm getting a lot better at it, but it's still been a learning journey. So that's why I started the Substack, and I've just been kind of riding the wave. I have to say, I feel a deep, emotional, resonant connection with your story, because I got my start working for a small portfolio management company. I also started blogging in 2005, under a blog called The Stalwart, which is now defunct, but that's why I got my Twitter handle. And it was kind of the same thing. I just wanted to practice writing for the skill of writing and keep up to date, and it was between jobs, so kind of like you. So I'm a big fan of people on your path, and I expect big things. But let's move on, and let's talk chips. I mentioned there's this viral article about the Mac chip that came out, and articles about chips don't typically go viral, but everyone's raving about this new chip from Apple. So why is the chip going viral? Okay, yeah, I can talk for ages about this. So the M1 chip is really exciting.
The reason why is it's kind of like a hybrid, and we've seen it before, frankly. Your iPhone is powered by something that's similar to an M1 chip, but we've never seen it for a Mac or a desktop or any other form factor other than the iPhone. And part of it is because, up until this point, all the laptops in the world have been mostly running x86. Just think of x86 as a language of processors. And for a long time that was kind of the de facto standard. But the iPhone getting a lot better has pretty much made a second language become a lot more popular, which is ARM. And now it's become a lot more mature. In the past, it never was fast enough to really compete in an actual computer, but now it seems that the M1 chip is super mature, the ARM infrastructure is ready, and the laptop looks really compelling. And a big reason for that is it's not really a CPU. It's actually a lot of different dies on one giant die, and it looks like there are a lot of benefits of ARM over x86 that we didn't even know were possible until the M1 came out. So, I mean, I think from what I understand, this is kind of the heart of it. Normally when we talk about chips, certainly when we're talking about chips made by Intel or AMD or something like that, we're talking about CPUs, these central processing units. But the Apple chip, the M1, is not a CPU. It's something else, or only one component of it is a CPU. Can you explain that in a bit more detail? Yeah. So it's called a system on a chip, or SoC, and what that means is that it has a traditional die like any other piece of silicon, but there are little pieces on the silicon that each have different jobs.
And so instead of having one giant piece of silicon that does everything kind of well, this silicon has little parts that each do their job really well, and together it does a lot better as a system. It's like, instead of having a general computer that can do everything kind of well, they have ten specialists in the house. And when you look at a laptop, really you're only doing ten types of workloads, or however many workloads. And so they said, instead of trying to make just a faster CPU, why don't we specialize a little bit? We'll break down all the things that the Mac, the laptop, has to do, and we're going to make it do them really well with these specialized chips. So that's pretty much what the difference is. And this is part of the bigger picture. It's been a revolution anyway. Getting off the CPU has been happening for a long time. And part of the reason why it's become such a big shift now is because, you know, in the last episode with Stacy you guys talked about Moore's law slowing down, or maybe, you know, they love to say it's dead. It's not quite dead, but it's slowing down quite a bit. And so now the road forward definitely seems to be specialization, and the M1 did this really well. It's specialized in a lot of different ways, put it all onto one piece of silicon, and also coupled the memory really close together, which is kind of in the same vein but a little different. And now the M1 just rocks, frankly. It's a little bit shaky as a first product, right, like the Apple Watch one or the iPhone one wasn't the best product ever, but it still is a miracle, quite frankly.
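The "specialists in the house" idea above can be sketched in a few lines of code. This is a hedged toy model, not Apple's actual architecture: the unit names (IPU, NPU, VPU) and the "cost" numbers are invented purely to show why routing each workload to a dedicated block beats running everything on one general-purpose CPU.

```python
# Toy model of the system-on-a-chip idea: route each workload to a
# specialized unit instead of doing everything on one general CPU.
# All unit names and "cost" numbers here are invented for illustration.

GENERAL_COST = 10  # notional cost for the do-everything CPU path

# Each specialized unit handles exactly one kind of workload, cheaply.
SPECIALIZED_UNITS = {
    "image": ("IPU", 1),  # image processing unit
    "ml":    ("NPU", 1),  # machine-learning accelerator
    "video": ("VPU", 2),  # video encode/decode block
}

def run_on_cpu(workload):
    """General-purpose path: anything runs, at the same high cost."""
    return ("CPU", GENERAL_COST)

def run_on_soc(workload):
    """SoC path: use a specialist if one exists, else fall back to the CPU."""
    return SPECIALIZED_UNITS.get(workload, ("CPU", GENERAL_COST))

tasks = ["image", "ml", "video", "spreadsheet"]
cpu_total = sum(run_on_cpu(t)[1] for t in tasks)
soc_total = sum(run_on_soc(t)[1] for t in tasks)
print(cpu_total, soc_total)  # 40 14
```

The SoC only wins on workloads it anticipated (note the "spreadsheet" task still falls back to the CPU), which is exactly the trade-off discussed here: you specialize for the handful of things a laptop actually does all day.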
So you know, you talk about how this sort of emerged out of Apple's chips for the iPhone. And, you know, I buy a new iPhone every few years like anyone else. I don't buy every single one that comes out, but when mine feels like it's getting old or whatever, I buy a new one. But honestly, the only thing I do on my iPhone besides tweeting is taking photos. And so even though I have this incredible computer in my pocket, basically it's just a camera with Twitter attached, or Twitter with a camera attached. And so I'm curious how much this shift towards focused tasks, instead of generalist chips, came out of this fact. There's one thing that it's really important for the iPhone to do for a lot of people, and that is take really clear photos, and there's this recognition that there are just a few simple tasks that, on a phone, you have to get done extremely well for most people. Yeah, that's actually the perfect thing to bring up, because on the Apple phone, right, the A14 or A15 or whatever iteration of the A chip, the biggest specialized die is something called an IPU, or image processing unit. Sorry, can you define "die"? A die is just a piece of silicon. Just think of it as a unit of silicon, if that makes sense. And the biggest piece that's been taking share from everything else is the IPU, and the CPU has become a smaller and smaller part. And similarly, in that same vein, on computers, one of the things that we've been using the most is GPUs to play video games.
So just like the CPU is losing relevance in your phone to an IPU for your images, in a desktop computer, for example, the CPU seems to be losing relevance to a GPU for a lot of people who are using it for games. So that same energy is disrupting one and the other. And I think you asked a question about why Apple, and the iPhone specifically, kind of shifted over to the Mac. What's interesting about that is, in the iPhone itself, they actually had a lot of requirements that you didn't really need in a computer, right? It has a much smaller form factor, it has a battery life constraint, and it also can't have a fan to actively cool it. And all these things, in the beginning, were just a pain for them to manufacture around. But as they got better and better, they pretty much got really close to a laptop in an iPhone, and that's while having a smaller form factor and no active fan. And then they're like, wait, why don't we just do this for a laptop? And so that's kind of what they did. And the M1 really is like a super souped-up Apple chip. And the announcement that you guys just broke the story on is that they're going to actually be making even bigger, beefier versions of this, because the M1 is actually made to be put in a laptop with no active cooling. That's also a big part of it. It's supposed to not heat up, it's supposed to not have a fan. But what if we just let it run loose, actively cool it, put in a fan, and see how hot we can get this thing going? And I think what's going to happen is we're going to be really shocked by the outcome. And so all the manufacturing constraints they had to work around to make the chip fit into a small phone pretty much made it a lot better when you actually scaled it up into a laptop.
And that's kind of part of the M1, you know, the Apple silicon magic. And Apple isn't the only company pursuing this. I think almost every company is looking at Intel and saying, okay, we need to transition off this vendor. And whether it's a data center, or a phone, or anything else, everyone is kind of going this way. And that's what's exciting. This isn't just an Apple story. Qualcomm mentioned they might do heterogeneous chips, which is one way to describe this whole thing, heterogeneous, as in many different types of chips put together into a larger package. And so, you know, Android might have a similar M1-like chip very soon. That's probably a few years away, but the writing's on the wall that everyone has to shift to this kind of M1-style chip, if that makes sense. It does make sense. And this is actually what I wanted to ask you next. So I think I understand why Apple struck upon this design, given its experience with the iPhone, and why it thought that specialized chips for specialized tasks would be more efficient. But why was it able to actually produce the chip ahead of everyone else? Or maybe another way of asking this question is, why was Apple able to do this when, as you just mentioned, the entire industry seems to be moving in this direction, maybe at different speeds? So Apple silicon has always been a leader. They've been crushing it for a long time. But in particular, and this goes back to the last episode you guys had, TSMC really enabled them to do this, and Apple has been buying and building silicon expertise in house for years. This is not an overnight success story.
The original chip, like the A7 or whatever, was when they first started to really make this more heterogeneous, and every year it adds more specialization. It's gotten to the point where now the CPU is, not the least important, but just one aspect of the chip in the phone. So Apple has been doing this for a long time, TSMC has been a huge part of enabling it, and then other companies look to Apple and are seeing, okay, we can do this too. So I hope that kind of answers the question. When it comes to manufacturing, and of course that was the big thing we talked about on our last chip episode, there's this idea that it's getting more and more difficult to shrink these chips to smaller nanometer form factors. Is that an issue here? Or does rethinking the fundamental architecture of the chip, such that it is a heterogeneous chip, sidestep some of these issues? Does that question make sense? Yeah, yeah. Actually, I think that's the huge driver of why this whole thing exists, right? Moore's law slowing down, and the struggle to shrink smaller and smaller, pretty much made them think, hey, we have to improve this somehow. We can't do it through the ways we used to do it. Why don't we specialize a little bit? So it's kind of a mix of both happening at the same time. I definitely think that the desire for better, faster chips was what helped the heterogeneous revolution begin. But TSMC being there essentially opened it up to everyone, because in the past Intel also owned the method to get smaller, right? So you couldn't compete against a general chip that was getting smaller, better, faster, faster than you could specialize your own chips.
But now that TSMC is clearly in the lead, and anyone can reach out to them and work with them, everyone's like, well, why don't we just make our own special little chips? And so that's really been all the things that have made this happen at this moment right now. I mean, on that note, how easy is it going to be for competitors to make these specialized chips? Are they going to be able to do the same thing that Apple's done? And in particular, is Intel going to be able to do it? Given that, as we discussed in the last episode, their business seems to be almost entirely based on this idea of selling, you know, general-purpose CPUs, which then can go into a large variety of desktop PCs or whatever. Yeah, so Intel is already pursuing this. I mean, the people at Intel are smart and they know what they're doing, but they're obviously tied to a very large platform. They have something called Foveros, which is kind of their method of stacking chips together. They have this new platform, and at their Intel Architecture Day they're like, okay, we're going to go all in on chiplets. We're going to do the same thing that everyone else is doing. But that kind of brings it back to the point: Intel is going to be doing the same thing that everyone else is doing. And if they don't have the manufacturing advantage like they used to, pretty much everyone's on the same level playing field, and it's for real right now. AMD, for example, a big reason why they won was kind of a process-heterogeneous thing, where it's like, okay, we don't have to be the fastest, we don't have to have the smallest chip.
But what we're going to do is make sure this really important part is super small, and that way, you know, it's getting across the finish line by having the best of the little pieces that really matter, and having a lot of not-as-good pieces that didn't really matter as much. And so everyone is kind of going to this hybrid playbook, and every company to a certain extent is pursuing this. And yeah, I can talk about individual companies in particular, but I know it's a lot to wrap your mind around sometimes. So when we think of Intel, we think of this sort of basic business model, which I guess still predominates, where one company makes the chips and then another company puts together boxes or laptops or servers, assembling various components, taking Intel chips or chips from elsewhere and then packaging them and distributing them, etcetera. Apple has always kind of eschewed that model, being much more vertically or horizontally integrated, I can never remember which is which, and having a sort of custom setup. Is that fundamental model going to remain, or is there going to be pressure on the very premise of having a separate computer company that's distinct from a chip company? So yeah, that's a great question. I think something that's happening in the near term is definitely verticalization. You're seeing it with Apple, and they're probably furthest down the road. I actually wrote about this quite a bit. Amazon, in some ways, is making sure that their whole data center is pretty much owned by them, using ARM chips that they license and manufacture at TSMC. And in some ways, their whole data center will be vertical, and they'll start to think of the data center as a giant computer.
They'll own the entire thing, and it won't be like how it used to be, where you could build your own computer, or in this case a data center. Instead, you're going to be forced to be vertical, just like Apple kind of is for your phone, if that makes any sense. That's a huge new trend that is happening right now. The AWS announcements in the last week or so, they pretty much announced like three pieces of custom silicon, which is crazy. So they're doing this, but think of it in a much bigger picture versus Apple at a much smaller picture. And same with Microsoft, same with Google especially, and Facebook has even recently hired someone to probably start this path for them as well. So I have a dumb question, kind of going in the opposite direction. But if Apple is making, you know, this new type of chip that everyone is very excited about, could they ever sell it on the market in the way that, you know, Intel sells its chips? Or does it just not make any sense for them to do it? I think it makes a lot of sense. The YouTube community is freaking out right now about whether the M1 can run Windows, because if it can, that would be awesome. And so Apple could actually sell it. I don't know if that would be in their ethos, right? They make a very closed ecosystem. But yes, they could, and even Microsoft would buy it for their high-end Surface Books. A lot of the laptop OEMs would definitely be instant buyers. So yeah, if they wanted to, they could instantly open it up and sell it, and I think it would be really good for their business. It would be gross margin accretive. It's actually probably more profitable than selling the phones outright, which is crazy to think about. And I have two questions. One is very short, and one follows onto what you just said.
Hey, you've said the term ARM, or you've talked about ARM, a few times. What does ARM mean, or what does it stand for? What does it represent? And then, you said the question is, can it run Windows? What are we really talking about when, here's this powerful chip, can it run a piece of software? What would be the tension or the difficulty in running Windows? And how much does this need to sync the chip with the operating system add to all this? Okay, so ARM is kind of one of those languages. I don't remember what it stands for off the top of my head, but it actually got sold to Nvidia. Well, it's in the process of a deal with Nvidia, which is, you know, super high stakes. Is ARM a company? Yes, ARM is a company. Yeah, I mean, I know that ARM is a company, but it's also a language. This is where I always get confused. ARM is a company, and they pretty much license their language of silicon out to anyone who wants to use it. So whenever ARM licenses it, they essentially say, hey, you can use this, we'll even help you tinker with it a little bit, make sure it works out, and you're going to pay, like, 2% of revenues or something like that. It's usually a percentage, and it's pretty cheap for a lot of companies, right? Because if you don't have any silicon expertise in house, building anything from the ground up is probably a billion-dollar-plus enterprise. And so ARM just does that all for you. You can essentially just buy off-the-shelf ARM cores or IP, and then you can build your own little piece yourself. Oh, and the software side of it: x86 is pretty much what everything for the PC has been built on, and ARM, obviously, is what everything mobile has been built on.
And there's kind of been this push and pull, right? Because ARM right now definitely has the ecosystem momentum, but people are really nervous. Even with the laptops, the biggest problem with the new Mac laptops with the M1 in them is that some of the programs won't work, and you pretty much have to redo them to make them ARM-compatible. However, a lot of companies can see what the writing on the wall is, so they are. But there's also something called an emulator, where essentially the ARM chip will run a mini, you know, fake x86, and by using that emulation, it can take that software and run it on that hardware. So it is a big deal. Right now, it seems like Windows, from my understanding, can run on the M1, but the problem is that when it does, it's going to be buggy. Have you ever used a Linux computer, for example? You can't play a lot of video games on a Linux computer because they're not compatible. And that same kind of problem with the software ecosystems is going to have to be hammered out, and that's going to be a huge growing pain for everyone involved. So I'm curious. Everyone is very excited about these specialized chips. And we were talking a little bit about Moore's law, and this idea that maybe it's not completely dead, but people are having to look at new ways to make their chips more efficient rather than just making them faster. After we do these integrated chips, or whatever you want to call them, what's the next big thing that you see down the line for the industry? There's this mix of things going on. We're still getting smaller, right? That's one thing, a huge thing. TSMC just had their five nanometer, as you guys discussed in your last episode.
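The emulation idea described above, an ARM chip running a "mini fake x86" so old software still works, can be sketched in miniature. This is a hedged toy, nothing like how a real translator such as Apple's Rosetta 2 actually works internally: a tiny interpreter executes a made-up foreign instruction set one instruction at a time, which is exactly why naively emulated code pays overhead on every step.

```python
# Minimal toy emulator: interpret a made-up "foreign" instruction set
# in software. Real binary translation (e.g. Rosetta 2) is vastly more
# sophisticated; this only illustrates the per-instruction overhead
# of running one architecture's code on another's hardware.

def emulate(program):
    """Run a list of (op, *args) instructions against a register file."""
    regs = {}
    for op, *args in program:
        if op == "MOV":      # MOV reg, value
            reg, value = args
            regs[reg] = value
        elif op == "ADD":    # ADD dst, src  (dst += src)
            dst, src = args
            regs[dst] += regs[src]
        elif op == "MUL":    # MUL dst, src  (dst *= src)
            dst, src = args
            regs[dst] *= regs[src]
        else:
            raise ValueError(f"unknown instruction: {op}")
    return regs

# Compute (2 + 3) * 4 in the toy instruction set.
result = emulate([
    ("MOV", "a", 2),
    ("MOV", "b", 3),
    ("ADD", "a", "b"),  # a = 5
    ("MOV", "c", 4),
    ("MUL", "a", "c"),  # a = 20
])
print(result["a"])  # 20
```

Every "foreign" instruction costs several native operations here, which is the basic reason emulated software tends to run slower and buggier than native ARM builds.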
You know, the nanometers don't actually mean anything anymore. It's really a marketing term. But you do get benefits from getting smaller, so that's still going to continue, just at a slower pace. The specialization is going to continue until it stops working. I read a paper arguing that there is a theoretical limit to when that stops working, but we're going to keep running down that road until it does. After that, I really don't know. I mean, quantum stuff, you know. I keep hearing and reading and researching and trying to figure out if this is real. Clearly it's a ways away, and that would still only be really useful for a very small subset of problems. And so I think the way that we're going, which is specialization, is probably the surest way forward. And not only will it start from the hardware, but the software will probably have to be written better to couple with the hardware. That's a big trend, where the software and the hardware will kind of specialize together, because that's something you can continue forward with. Right? The less flexible it is, the faster it will be, versus the more flexible, the less fast. And so if you can make it as inflexible as possible, in theory you can make everything into an ASIC, and the speedups that you get between specialization and non-specialization are orders of magnitude. That's the reason why it's so exciting, because if you just made a single program that only had to run one type of thing, you could make it a hundred times faster by running it on the most specialized chip possible. Now, you have to be really, really sure that all you want to do is just run this program.
A great example of this is Google's TPU, the tensor processing unit, built for their ecosystem called TensorFlow, which is focused on deep learning. So Google is like, hey, we know that we're going to be running a lot of TensorFlow in the future, and AI is going to become this huge exponential problem. So we're going to make a chip that only runs TensorFlow, and it's really, really fast. Compared to how it would be done on a CPU, I couldn't tell you exactly, but it's like 10x, 100x, maybe even more. It's a lot faster. And so until the specialization stops working, I don't really see anything that's going to stop it for now. So basically, it sounds like, sort of wrapping some of these thoughts together, as chip architecture has migrated towards more specialized, more specific tasks, you just mentioned Google and its deep learning AI stuff, it really becomes more part and parcel with every technology. So if you're working on machine learning or AI, it's almost like you have to have a chip component to it. Or I guess maybe it makes sense to pursue a chip component, rather than just writing software and then running that on a generic off-the-shelf chip from some other company. Yeah, that's the perfect way to put it together. And frankly, it used to be awesome. Think about how awesome it was: you could just write the software, and every two years it just got faster. You didn't have to think about any of this. And it's a pain to think about all this stuff, where it's like, hey, we have to make a custom chip platform in order for our software to get faster. You know, we miss the old days of Moore's law for a reason.
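A rough software analogy for the TPU point above (purely illustrative, the real gains come from dedicated matrix-multiply silicon, not from anything you can show in Python): the same dot product, a core tensor operation, computed by a generic step-by-step loop versus a routine specialized for exactly that one job.

```python
# Software analogy for hardware specialization (illustrative only).
# The generic path mimics a CPU stepping through every operation;
# the "specialized" path expresses the whole job as one fused routine,
# which is roughly where a TPU's matrix unit would take over in hardware.

def dot_generic(xs, ys):
    """General-purpose path: one element at a time, step by step."""
    total = 0
    for i in range(len(xs)):
        total = total + xs[i] * ys[i]
    return total

def dot_specialized(xs, ys):
    """Specialized path: the dot product as a single fixed operation."""
    return sum(x * y for x, y in zip(xs, ys))

xs, ys = [1, 2, 3], [4, 5, 6]
print(dot_generic(xs, ys), dot_specialized(xs, ys))  # 32 32
```

Both paths give the same answer; the trade-off the guest describes is that the specialized path can only ever do this one thing, which is fine if, like Google with TensorFlow, you know this one thing is most of your workload.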
But now that we're slowing down, clearly we're going to have to build some very specialized, very top-to-bottom stacks of software and hardware that work together to get the best outcome. So I just have a couple more questions, and one that I'm interested in is this: we've been talking about Intel, Apple, Google, Amazon, Nvidia and Taiwan Semi. Nvidia has become this monster, and I'm curious about their role in it, because I mostly think of them as a company that makes chips for video games, but then I always see stuff about how they're also doing things with self-driving cars, and sometimes I see these really cool demos of machine learning and image recognition. What is their role in the ecosystem? What is their competitive advantage, and how did they emerge as such an incredible player? The stock has just gone wild. Yeah, so the company is wild. Jensen Huang is awesome; he's an amazing CEO, but notoriously stubborn. He's very Steve Jobs-esque. Think about GPUs this way: there's a gradient from the most general to the least general, and GPUs are one step past the CPU. They're still pretty flexible, you can make them do a lot of things, and they're a lot faster, but they're not so inflexible that they're a pain to write for. And that's been their huge benefit. The original use of a GPU is essentially to render all the pixels on your screen in parallel, and that's really hard because you have to do it all at the same time. So whenever you hear about parallelization, GPUs are these super-parallel engines that are really good at calculating everything at the same time.
And the reason that's so important is that AI happens to be almost identically that same use case. So they have this huge ecosystem, and they're like, hey, run all your AI workloads on this GPU and we'll speed them up. It's very easy to use, and that's been their huge step: it's much better than the CPU, it's extremely easy to program, and they've added a lot of software on top. So in some ways, what Nvidia is doing is also building its own ecosystem, but instead of starting with the software and moving down, like, say, Apple, they're starting with the hardware and trying to move up. And GPUs are, by the way, specialized; they're less specialized than an ASIC or an IP block or all the other really specific things that just do one thing really well. They can do multiple things pretty well, but they're obviously more specialized than a CPU that can do everything. Okay, so that's been their big role, and they've taken a lot of share as things have started to specialize a little bit. They're like the easiest first step, if that makes sense. But I have one more question. For a long time, I feel like, and correct me if I'm wrong, the semiconductor industry was for decades a cyclical industry, almost like manufacturing or an industrial business. You'd have these up cycles when there was growth, and then you'd have an inventory correction, where the foundries and the chip companies had made too many chips, and then they'd have to cut prices or destock and go into a downturn, and so forth.
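To picture why graphics and AI end up on the same hardware: both apply one small, independent computation to millions of elements at once. A hedged Python sketch (serial here, and the function names are ours; on a GPU, every element's update runs simultaneously, which is exactly the parallelism described above):

```python
# Graphics and deep learning share a shape: "apply this tiny function
# to every element independently," which is what a GPU parallelizes.
def shade(pixel):
    r, g, b = pixel
    gray = 0.299 * r + 0.587 * g + 0.114 * b   # per-pixel, no dependencies
    return (gray, gray, gray)

image = [(255, 0, 0), (0, 255, 0), (0, 0, 255)]
# A serial loop here; on a GPU every pixel's shade() runs at the same
# time, because no pixel's result depends on any other pixel's.
result = [shade(p) for p in image]
print(round(result[0][0], 3))   # 76.245
```

The key property is the absence of dependencies between elements: that's what lets thousands of GPU cores work at once, and it's the same property a batch of neural-network activations has.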
And then it feels like in the last several years, maybe five years, ten years, I don't know, it has stopped being a cyclical industry and become more secular, as chips get put into more and more things, phones and all kinds of different stuff. So it's less of an up-and-down cycle and more just a growth industry. A: is that true? Is my perception correct? And B: is there any slowdown on the horizon for the world's overall demand for chips, or do we just keep finding new places to put them, new needs for them? So yes, historically it was a crazy cyclical industry. If you've ever read The Innovator's Dilemma, the case studies it uses are almost all semiconductor companies, because they're so cyclical and they're constantly destroying each other's moats. They're innovating, growing up, and then dying; it's chaos. Things have definitely changed a little bit since those days. One of the largest markets that really dictates the cyclicality of all the semiconductor markets is memory, and there's been this thesis that it's become a lot more consolidated. There used to be tens of players; now there are like three for DRAM, which is a type of memory, and five or six for NAND. On one hand, that's dampened the cyclicality. They used to do the race-to-the-bottom thing, where whenever things went down they would just add more supply, and then everyone would go bankrupt together, kind of like shale does to some extent. They've done less of that, and that's one thing that's dampened the cyclicality. But going forward, it seems like there are more end markets than ever.
In the past it was really just PCs, right, and there were some industrial and aerospace markets, there have always been small markets, but for the most part it was the PC market. Then phones became a thing, so it became PC and mobile. And now it seems like auto is becoming a thing, so auto is this new third market, plus the industrial IoT stuff, which I keep hearing about but am always confused about how the semiconductors work into it. And then the AI market in itself will probably become so demand-intensive from the cloud customers that it'll have its own cycle. Each of these markets has its own cycle to some extent, but together all the demand is starting to overlap and smooth things out, if that makes sense. To be fair, the market is still cyclical; it will probably always be cyclical. But all the demand aspects are really pulling things forward, and at least in the near term, next year seems really well positioned. For example, there's this huge influx of auto demand in China, and they're saying, well, we can't really meet it because we're struggling with semiconductor capacity. That's a headline that came out in the last few days, and I think it's a sign of all the new things that semiconductors have become a part of: not just your PC, but your car, your Nest thermostat, all the other things in the world that you use. So can I just ask one follow-up on what you just said? You said someone came out talking about Chinese car production ramping up, and that they couldn't make all the chips needed for it. Who was that? It was Volkswagen. Volkswagen, okay, thank you. Sorry, I just wanted to look it up after this.
Yeah, I want to say it was a Reuters article, but it was pretty much, hey, semiconductor manufacturers can't add capacity fast enough. I mean, that's really interesting, because auto is constantly called out as this market that's going to be huge, and it obviously is a very big market. And if we're really going hardcore down the EV route, no comment on the EV SPAC bubble or whatever, then there are going to be a lot more semiconductors, right? It's almost completely digital now. That was fantastic. You're great at explaining this, and we really appreciate you making your public, non-anonymous debut on Odd Lots. I learned a lot too. Thanks for coming on. Thanks, man, I really enjoyed it. So, Tracy, are you cool now with changing the focus of our podcast on a permanent basis? I don't think I'm quite there yet. I admit that semiconductors are a lot more interesting than perhaps I gave them credit for before we recorded these two episodes. But I don't think I want to do Odd Lots, comma, Semiconductor Specialists just yet. Well, I was going to say, maybe when we're a gigantic media brand in our own right, we could do a spinoff show, the Semiconductor Vertical. Yeah, the Semiconductor Vertical, excellent. What I was going to say is I'm quite interested in possibly looking at one of those Apple M1 computers at some point. Yeah, I don't think we intended for this episode to be a giant advertisement for Apple's new chip, but a lot of people are excited about it, and it was really interesting listening to Doug go into some of the technical details about why exactly it's considered such a new thing. Yeah, no, I thought that was super interesting as well. I feel like the next one we do in our series should have a TSMC focus for sure. Yeah, let's do that.
I feel like we've done Intel, and this one was a little more Apple-heavy, but TSMC is just such a fascinating story in its own right, combining their tech prowess and geopolitical significance. Whenever we do our next chip episode, let's do it on that. Agreed. All right, should we leave it there? Yeah, let's leave it there. Okay. This has been another episode of the Odd Lots podcast. I'm Tracy Alloway. You can follow me on Twitter at Tracy Alloway. And I'm Joe Wisenthal. You can follow me on Twitter at The Stalwart. Follow our guest Doug O'Laughlin on Twitter; he's under the handle at Foolallthetime. Also be sure to check out his newsletter, Mule's Musings. Follow our producer Laura Carlson at Laura M. Carlson. Follow the Bloomberg head of podcasts, Francesco Levy, at Francesco Today, and check out all of our podcasts at Bloomberg under the handle at podcasts. Thanks for listening.