In the first part of this week's two-part episode, Ed Zitron walks you through how impossible it will be for OpenAI to stay alive - and how SoftBank’s attempts to help are materially harming the company.
Vote for Better Offline's "Man Who Killed Google Search" as the best business podcast episode in this year's Webbys! Voting is open until April 17! Vote today!
https://vote.webbyawards.com/PublicVoting#/2025/podcasts/individual-episode/business
Vote for Weird Little Guys in this year's Webbys! https://vote.webbyawards.com/PublicVoting#/2025/podcasts/individual-episode/crime-justice
---
LINKS: https://www.tinyurl.com/betterofflinelinks
Newsletter: https://www.wheresyoured.at/
Reddit: https://www.reddit.com/r/BetterOffline/
Discord: chat.wheresyoured.at
Ed's Socials:
https://www.instagram.com/edzitron
Hello and welcome to Better Offline. I'm your host, Ed Zitron. Now, before we go any further, I hate to ask you to do this, but I need your help. I'm up for this year's Webbys for the Best Business Podcast award. I know it's a pain in the arse to register, but can you sign up and vote for Better Offline? I've never won an award in my life, and I have enough listeners that I think we can tip the scales. But on with the show.

The following two-part episode is the culmination of months of research, presenting a case I'd been making in parts since July of last year. OpenAI is a financial abomination, a thing that should not be, an aberration, a symbol of rot at the heart of Silicon Valley, a company that unrepentantly and needlessly burns billions of dollars with no end in sight, helmed by a disingenuous billionaire who continually lies about what it is that it will do, because he, like I've been saying since last year, knows that generative AI can't do much more than it does today. What I'm going to lay out for you is my case that OpenAI can't survive, that it's borderline impossible under any of the current terms for this company to continue, and that its demise will bring about the death of, or significant harm to, several other firms. A cottage industry of despicable, billion-dollar-burn-rate capitalistic monstrosities has sprung up around this stupid fucking company in the hopes of further inflating a bubble set to burst at any moment. My disgust for the parties involved is unrelenting, as is my disgust for a media industry that failed to even attempt to tackle the subject matter I'm going to detail. I believe at the end of these episodes you'll see my point, and at the very least agree that OpenAI's current situation is totally untenable. If I'm right, OpenAI will go down in history as an abdication of due diligence, fiscal responsibility, and frankly common sense, both by the venture capitalists and entities that propped it up and by a tech media that was more concerned with taking detailed notes on my comings and goings than knowing their ass from their ear hole. Those who have failed to hold men like Sam Altman and Dario Amodei accountable have tacitly approved a financially and environmentally destructive movement that will lead to very little actually happening or changing in the world, other than damaging our power grid and the theft of art from millions of people.

These episodes will be full of numbers and statements and very few declarations or personal opinions, though probably a few swear words, if I'm honest. I don't need to get that personal here. The numbers in question, they're damaging, they're staggering, they're worrying, and ultimately they spell collapse. The truth is, we don't even need to talk about tariffs for things to go sideways for this industry. The price of a GPU could rise one hundred percent, or it could halve; it really wouldn't make much of a difference to OpenAI's chances of survival. That's how bad the fundamentals are. And to illustrate that point, I'm going to ask a number of relatively simple questions over the next couple of episodes and make an attempt to answer them.

First, let's start with something simple: how much cash does OpenAI have? At the start of April, OpenAI closed what was called the largest private tech funding round in history, where it raised an astonishing forty billion dollars.
And the reason I'm saying this with a sarcastic inflection is that OpenAI has only actually raised ten billion dollars of the forty billion dollars, with the rest arriving by the end of the year, and even as I record this, I don't know if the money's actually gone through yet. I'll get into that in a minute. Now, a lot can happen in a year, and the remaining thirty billion dollars, twenty billion dollars of which will allegedly be provided by SoftBank, is partially contingent on OpenAI's conversion from a nonprofit to a for-profit by the end of twenty twenty five, and if that fails, SoftBank will only give OpenAI a further twenty billion dollars. I'll get into how fucking stupid this gets later.

The round also valued OpenAI at an astonishing three hundred billion dollars. To put that in context, OpenAI had revenues of four billion dollars in twenty twenty four. This deal values OpenAI at seventy-five times its revenue. That's more than Tesla, even at its most ludicrous peak. I also want to add that, as of writing this sentence, this money is yet to arrive. Maybe it will arrive by the time this is out. Maybe that'll really make me look stupid. But SoftBank's filings, which I'll link to in the spreadsheet for this episode, say that the money will arrive mid-April, it's April fourteenth as I'm recording this, and that SoftBank would be borrowing as much as ten billion dollars of the financing for the round, with the option to syndicate, meaning bring in other investors, the rest of it. For the sake of argument, I'm going to assume the money actually arrives, though filings also suggest that, and I'm quoting, "in certain circumstances" the second thirty billion dollar tranche could arrive in early twenty twenty six. This isn't great. It also seems that SoftBank's ten billion dollar commitment is contingent on getting a loan, as it says it's financed through borrowings from Mizuho Bank Limited, among other financial institutions. SoftBank's had plenty of loans in the past, so I think they're going to get it, but I think this is one of their biggest.

OpenAI also revealed it now has twenty million paying subscribers and over five hundred million weekly active users. If you're wondering why it isn't talking about monthly users, it's because they're likely much higher than five hundred million, which would reveal exactly how poorly OpenAI converts free ChatGPT users to paying ones. The Information reported back in January that OpenAI was generating twenty-five million dollars in revenue a month from its two-hundred-dollar-a-month Pro subscribers, and just so we're clear, they lose money on every one of those too, suggesting that they have around one hundred and twenty-five thousand ChatGPT Pro subscribers, each losing them money somehow. Assuming the other nineteen million, eight hundred and seventy-five thousand paying users are on twenty bucks a month, that puts revenue at about four hundred and twenty-three million dollars a month, or at least five billion dollars a year from ChatGPT subscriptions. This is what reporters mean when they say "annualized revenue," by the way. It's literally the monthly revenue, the money they're making in one month, multiplied by twelve, and you'll be surprised to hear that people play silly buggers with that all the time.
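If you want to check my arithmetic here, this is the whole subscriber calculation in one place. It's a minimal sketch where every input is one of the reported or estimated figures above, and the assumption that every non-Pro paying subscriber is on the twenty-dollar plan is mine, not OpenAI's.

```python
# Back-of-envelope sketch of OpenAI's ChatGPT subscription revenue,
# using the reported/estimated figures above (assumptions, not official numbers).

pro_monthly_revenue = 25_000_000                     # The Information: ~$25M/month from Pro
pro_price = 200                                      # $200/month Pro plan
pro_subscribers = pro_monthly_revenue / pro_price    # ~125,000 Pro subscribers

total_paying = 20_000_000                            # OpenAI's stated paying subscribers
plus_price = 20                                      # assume everyone else is on the $20 plan
plus_subscribers = total_paying - pro_subscribers    # ~19,875,000

monthly_revenue = pro_monthly_revenue + plus_subscribers * plus_price
annualized = monthly_revenue * 12                    # "annualized" = one month times twelve

print(f"Pro subscribers: {pro_subscribers:,.0f}")
print(f"Monthly subscription revenue: ${monthly_revenue / 1e6:,.1f}M")
print(f"Annualized: ${annualized / 1e9:,.2f}B")
```

Run that and you get roughly four hundred and twenty-three million dollars a month and just over five billion dollars annualized, which is where those numbers come from.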
In March, Bloomberg reported that OpenAI expects its revenue to triple to twelve point seven billion dollars in twenty twenty five, assuming a similar split of revenue to twenty twenty four. This would require OpenAI to nearly double its annualized subscription revenue from Q1 twenty twenty five, from five billion dollars right now to around nine point two seven billion dollars, and nearly quadruple API revenue from twenty twenty four's one billion dollars, which includes Microsoft's twenty percent payment for access to OpenAI's models, and that would get them to about three point four three billion. Don't worry too much about the exact figures, and I realize these are messy numbers. It's just unclear how OpenAI intends to pull any of this off. It's an incredible leap, and OpenAI's own plans don't exactly inspire confidence. They're really good at getting free subscribers; they don't seem to be able to get paying ones in quite the same numbers, and even those lose them money. Every time I think about this company, I start feeling a little crazy. Anyway.

The Information reported in February that OpenAI plans to grow its revenue by making three billion dollars a year selling agents, with ChatGPT subscriptions (seven point nine billion dollars) and API calls (one point eight billion dollars) making up the rest. This is, of course, what's the technical term? Oh, it's bollocks. It's complete fucking bollocks. I'm sorry. Agents, by the way, are AI chatbots that can do something like interact with another program on the user's behalf. OpenAI's agents can't even do the simplest tasks. And the three billion dollars of that twelve point seven billion dollar figure appears to be a commitment made by SoftBank to purchase three billion dollars a year of OpenAI's tech.

Now, let's parse out these numbers precisely. Incoming monthly revenue: roughly four hundred and twenty-five million dollars, give or take. Theoretical revenue from SoftBank's completely made-up thing: two hundred and fifty million dollars a month. However, I really can find no proof that SoftBank has begun to make these payments, or indeed how it intends to make them.

Now, let's talk liquidity. OpenAI has ten billion dollars that it's yet to receive as of recording this, but let's assume they get it, and it will be ten billion dollars: seven point five billion from SoftBank, and the rest from a syndicate of investors potentially including Microsoft and Coatue and a few others. There's also an indeterminate amount of remaining capital on the four billion dollar credit facility provided by multiple banks back in October twenty twenty four, which was raised alongside a funding round that valued the company at one hundred and fifty-seven billion dollars. As a note, the October announcement stated that OpenAI had access to over ten billion dollars in liquidity, giving us a sense of how fast it's burning through the cash it has on hand, as that was pretty much all the money they'd raised at the time. Based on reports, OpenAI will not have access to the rest of the forty billion dollars that SoftBank is funding them with until the end of the year, and it's unclear what part of the end of the year that is, but SoftBank's filing says December. Will it be December first? Will it be Christmas time? Will Sam Altman look under the tree and Masayoshi Son will be there? We'll find out, won't we. But we can assume in this case that OpenAI likely has, in the best case scenario, access to roughly sixteen billion dollars in liquidity at any given time.
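And to circle back to those revenue targets from a moment ago, here's the gap laid out as a quick sketch. The splits are my read of the reporting above, not anything OpenAI has disclosed.

```python
# Sketch of the growth the reported $12.7B 2025 target implies, using this
# episode's estimates (assumptions, not OpenAI's disclosed figures).

current_subscription_annualized = 5.07e9   # ChatGPT subscriptions, annualized, from the math above
implied_subscription_2025 = 9.27e9         # what subscriptions would need to hit
api_revenue_2024 = 1.0e9                   # 2024 API revenue, incl. Microsoft's 20% payment
implied_api_2025 = 3.43e9                  # what API revenue would need to hit

print(f"Subscriptions would need to grow {implied_subscription_2025 / current_subscription_annualized:.2f}x")
print(f"API revenue would need to grow {implied_api_2025 / api_revenue_2024:.2f}x")
print(f"Implied total: ${(implied_subscription_2025 + implied_api_2025) / 1e9:.1f}B")
```

That's subscriptions nearly doubling and API revenue growing almost three and a half times, in a single year.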
It's reasonable to believe that OpenAI will raise more debt this year despite this massive raise, and that it does so to the tune of around five or six billion dollars. Without it, I'm genuinely not sure what they're going to do. And as a reminder, kids, OpenAI loses money on every single user, free or paying.

Now, here's my next question: what are OpenAI's obligations? When I put out "How Does OpenAI Survive?" and "OpenAI Is A Bad Business," I used reported information to explain how this company was, at its core, unsustainable. I'll link to the newsletters and the spreadsheet in the podcast notes too. But let's refresh our memory, shall we?

Okay. Compute costs: at least thirteen billion dollars, which is going to Microsoft alone in twenty twenty five, and as much as five hundred and ninety-four million dollars to my favorite company, CoreWeave. CoreWeave's back. Every time I think I've gotten away from CoreWeave, they pop up like a sour turd coming back up the toilet. Anyway, it seems from even a cursory glance that OpenAI's costs are increasing dramatically. The Information reported earlier in the year that OpenAI projects to spend thirteen billion dollars on compute with Microsoft alone in twenty twenty five, nearly tripling what it spent on total compute in twenty twenty four, which was five billion dollars, with three billion for training and two billion for running their models. This suggests that OpenAI's costs are skyrocketing, and that was before the launch of their new image generator, which led to multiple complaints from Sam Altman about a lack of available GPUs, and to OpenAI's terrible little man of a CEO saying that he expected stuff to break and delays in new products. Nevertheless, even if we assume OpenAI factored the compute increases into its projections, it still expects to pay that thirteen billion dollars to Microsoft alone.

This number, however, doesn't include the eleven point nine billion dollar, five-year-long compute deal signed with CoreWeave, a deal that was a result of Microsoft declining to pick up the option to buy that compute for itself. As an aside, from what I'm hearing, and this is sources telling me, most of that Microsoft compute at CoreWeave was for OpenAI anyway; they basically put an OpenAI sticker over Microsoft's name. It's basically the same thing. Now, payments for this deal between OpenAI and CoreWeave start in October twenty twenty five, according to The Information, and assuming that it's evenly paid, or even that anyone gets the cash, and these contracts are weird, you never know exactly how they're paid, this would amount to roughly two point three eight billion dollars a year. However, for the sake of argument, let's consider the payments to be around one hundred and ninety-eight million dollars a month, though there are scenarios, such as, say, CoreWeave's build-out partner not being able to build the data centers, or CoreWeave not having the money to pay to build them, where OpenAI might end up paying less. And don't worry, I'll get to that later.
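And just so you can see where those CoreWeave figures come from, here's a minimal even-split sketch. It assumes the contract value is spread flat across the five years, which, as I said, it may very well not be.

```python
# Sketch of the CoreWeave payment math above, assuming the contract value is
# spread evenly across five years (a simplifying assumption).

coreweave_deal_total = 11.9e9                 # ~$11.9B five-year compute deal
deal_years = 5

per_year = coreweave_deal_total / deal_years  # ~$2.38B a year
per_month = per_year / 12                     # ~$198M a month

print(f"Per year:  ${per_year / 1e9:.2f}B")
print(f"Per month: ${per_month / 1e6:.0f}M")
```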
Let's talk about Stargate. You've heard of this, you've seen this. Stargate is the data center project that OpenAI is going in on, and they've dedicated somewhere in the region of nineteen billion dollars to it, along with another nineteen billion dollars provided by SoftBank at some indeterminate time. And there are other partners too, like Oracle, and it's not even obvious what they're putting in either. One thing I can tell you, though, is that the Trump administration is not actually spending any money on this. I've had tons of people emailing me being like, hey, Donald Trump was there. Yeah, Donald Trump was there. Doesn't mean shit. Anyway, based on reporting from Bloomberg, OpenAI plans to have sixty-four thousand Blackwell GPUs running by the end of twenty twenty six, or roughly three point eight four billion dollars worth of them. I should also note that Bloomberg said that sixteen thousand of these chips would be operational by summer twenty twenty five, though it's unclear if that will actually happen, and as I'll get to later, I've got some more questions about what exactly is happening at Stargate, all told. Though it's unclear who actually pays for what parts of Stargate, it's safe to assume that OpenAI will have to put at least a billion dollars into a project that is meant to be up and running by the end of twenty twenty six, if not more.

Now, Stargate really only has one data center project under development, and it's in Abilene, Texas, and as I've mentioned, it's not really clear how it's going. A recent piece from The Information reported that it was currently empty and incomplete, and if it stays that way, and I quote, OpenAI "could walk away from the deal, which would cost Oracle billions of dollars." Though the article takes great pains to assure the reader that that isn't likely, even an inkling of such a possibility is a bad fucking sign. Business Insider's reporting on the site in Abilene calls it a three point four billion dollar data center development, as did the press release from the developer Crusoe, who we'll get to later, though these numbers don't include GPUs, hardware, or the labor necessary to run them. Right now, Crusoe is, according to Business Insider, building six new data centers, each with a minimum square footage, which will join the two it is already constructing for Oracle. Oracle has signed, according to The Information, a fifteen-year-long lease with Crusoe for its data centers, all of which will be rented to OpenAI.

In any case, OpenAI's exposure could be much, much, much higher than a billion dollars, and I'll explain in greater depth how I've reached that figure when I get there. But nevertheless, it could be much higher. If OpenAI has to contribute significantly to the costs associated with Stargate in general, it could cost them a lot of money. Data centers aren't something you can do funny money math with. Data Center Dynamics reports that the Abilene site is meant to have two hundred megawatts of compute capacity in the first half of twenty twenty five, and then as much as one point two gigawatts by mid twenty twenty six. To give you a sense of the total cost of the project, former Microsoft VP of Energy Brian Janous said in January that it costs twenty-five million dollars a megawatt, or about twenty-five billion dollars a gigawatt, meaning that the initial capital expenditures for Stargate to spin up its first two hundred megawatt data center will be around five billion dollars, spiraling to thirty billion dollars or more for the entire project. The Information has reported that the site, which could be, and I quote, potentially one of the biggest AI data centers, could cost fifty billion to one hundred billion dollars in the coming years. Where's that fucking money coming from?
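Here's that data center math written out as a quick sketch, using the figures above. The roughly sixty-grand-a-chip price isn't something Bloomberg stated outright; it's just what you get when you divide their dollar figure by their chip count.

```python
# Sketch of the Abilene/Stargate cost math above, using Brian Janous's
# $25-million-a-megawatt figure. The per-GPU price is implied, not reported.

cost_per_megawatt = 25e6
initial_capacity_mw = 200            # first phase, first half of 2025
full_capacity_mw = 1_200             # ~1.2 gigawatts by mid-2026

initial_capex = initial_capacity_mw * cost_per_megawatt    # ~$5B
full_capex = full_capacity_mw * cost_per_megawatt           # ~$30B

blackwell_gpus = 64_000
implied_gpu_price = 3.84e9 / blackwell_gpus                  # ~$60,000 per chip
gpu_spend = blackwell_gpus * implied_gpu_price               # ~$3.84B

print(f"First 200MW:       ${initial_capex / 1e9:.0f}B")
print(f"Full 1.2GW:        ${full_capex / 1e9:.0f}B")
print(f"64,000 Blackwells: ${gpu_spend / 1e9:.2f}B")
```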
If we stick with the lower end of these cost estimates, it's likely that OpenAI is on the hook for over five billion dollars for the Abilene site, based on the nineteen billion dollars it's committed to the overall Stargate data center project. This expenditure won't come all at once and will be spread across several years. Still, assuming even the rosiest numbers, it's hard to see how OpenAI doesn't have to pony up at least a billi in twenty twenty five, and that's likely only because development at this site is going to be heavily delayed by tariffs, labor shortages, and Oracle, as reported by The Information, and this is the kind of quote you really want to hear, trusting "scrappy but unproven startups" to develop the project. Is that good? Is that who you want doing this? I know when I get a contractor in to fix something, my first thought is, is this guy scrappy? Anyway.

Let's talk about the other costs. They're at least three point five billion dollars. Based on reporting from The Information from last year, OpenAI will spend at least two point five billion dollars across salaries, data (referring to buying data from other companies), hosting and other cost of sales, and sales and marketing, and then another billion on whatever infrastructure OpenAI owns. I expect the latter cost to balloon with OpenAI's investment in physical infrastructure for Stargate.

Here's another bloody question: how does OpenAI meet its obligations? Based on previous estimates, OpenAI spends about two dollars and twenty-five cents to make a buck. At that rate, it's likely that OpenAI's costs, even at its most rosy revenue projection of twelve point seven billion dollars, are at least twenty-eight billion, meaning that it's on course to burn at least fourteen billion dollars in twenty twenty five alone, assuming that OpenAI has literally all of the money it had from last year. They don't, but for the sake of argument, let's pretend they have ten billion dollars, as well as ten billion from SoftBank. It's still unclear how they pay for everything.
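If you want that two-dollars-twenty-five-to-make-a-buck math in one place, here's a minimal sketch. The ratio itself is a rough estimate, so treat this as directional rather than precise.

```python
# Sketch of the burn math above: if OpenAI spends roughly $2.25 to make a
# dollar and hits its rosiest 2025 revenue projection, this is what falls out.

cost_per_revenue_dollar = 2.25
projected_revenue_2025 = 12.7e9

implied_costs = projected_revenue_2025 * cost_per_revenue_dollar   # ~$28.6B
implied_burn = implied_costs - projected_revenue_2025              # ~$15.9B

print(f"Implied costs: ${implied_costs / 1e9:.1f}B")
print(f"Implied burn:  ${implied_burn / 1e9:.1f}B")
```

That gets you costs in the high twenty-billions and a burn comfortably north of the "at least fourteen billion dollars" I mentioned.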
Now, while OpenAI likely has preferential payment structures with all of their vendors, such as discounted rates with Microsoft's Azure cloud services, it will eventually have to pay someone, especially in the case of costs related to Stargate, many of which are upfront and involve physical things happening. In the event that its costs are as severe as reporting suggests, OpenAI's revenue comes at a terrible cost and will likely be immediately funneled directly into funding the obscene costs behind inference and training models like GPT-4.5, which Sam Altman called a "giant, expensive model," and yeah, nevertheless pushed to every single user. Worse still, OpenAI has, while delaying its next model, GPT-5, promised to launch its o3 reasoning model after all, after saying it wouldn't do so, which is strange, because it turns out o3 is actually way more expensive to run than people thought. The Arc Prize Foundation, the nonprofit that makes the ARC-AGI test for benchmarking models, because real tests don't work on them, they need to make something up, well, they're estimating it will cost about thirty grand a task, which is, well, a lot of money. And yes, again, OpenAI has a new sugar daddy in the form of SoftBank. But SoftBank has to borrow money to meet its obligations for Stargate, and also OpenAI, and this is leading to its financial condition likely deteriorating. And that's S&P Global saying that, you know, the S&P 500 people, those people. That's not what you want to hear.

Let's look at SoftBank. As of right now, SoftBank has committed to the following: at least thirty billion dollars in funding as part of OpenAI's recent forty billion dollar funding round. Now, SoftBank's filings surrounding OpenAI's funding also suggest that they are ultimately on the hook for the entire forty billion, but they can syndicate it, like I mentioned earlier, and reporting suggests that the syndication will happen, and that it will happen with people like Coatue, Microsoft, and other investors. Now here's the funny part. If OpenAI fails to convert to a for-profit, that forty billion dollars gets slashed down to a paltry thirty billion dollars, although again, SoftBank's share is contingent upon whether it finds other investors to join the deal. Then there's another three billion dollars that SoftBank's promised to spend on OpenAI's tech, and then nineteen billion dollars for the Stargate data center project, which SoftBank is taking full financial responsibility for, The Information reports. The total is either fifty-two billion dollars or sixty-two billion dollars, with at least ten billion dollars due by the end of twenty twenty five, but more like twenty. Like, there's so much money. To be clear, SoftBank had to borrow all of that first ten billion dollars. How the fuck are they meant to get the thirty billion dollars?

Now, I kind of mentioned S&P, but I want to really return to that. SoftBank's exposure to OpenAI is materially harming the company, and I'm quoting the Wall Street Journal here: ratings agency S&P Global said Tuesday that SoftBank's financial condition will likely deteriorate as a result of the OpenAI investment, and that its plans to add debt could lead the agency to consider downgrading SoftBank's ratings. While one might argue that SoftBank has a good amount of cash, the Journal also adds that they're somewhat hamstrung in its use as a result of CEO Masayoshi Son's dipshit, reckless gambles. Again quoting from the Wall Street Journal: SoftBank had a decent buffer of thirty-one billion dollars of cash as of December thirty-first, but the company has also pledged to hold much of that in reserve to quell worried investors. SoftBank is committed to not borrowing more than twenty-five percent of the value of its holdings, which means it will likely need to sell some of the other parts of its empire to pay the rest of the OpenAI deal. Worse still, it seems, as mentioned before, that SoftBank will be financing the entirety of the first ten billion, or seven point five billion if they're able to syndicate it. As a result, SoftBank is likely going to have to sell off parts of its actually valuable holdings in companies like Alibaba or Arm, or, worse still, parts of its ailing investments from the Vision Fund, resulting in a material loss on its underwater deals.
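Just to tally SoftBank's commitments in one place, here's a quick sketch of where those fifty-two and sixty-two billion dollar totals come from, treating the three billion dollar tech purchase as a single year's worth.

```python
# Tally of SoftBank's commitments as described above. The swing between the
# two totals is whether OpenAI's for-profit conversion actually happens.

openai_round_if_converted = 40e9   # on the hook for up to the full round
openai_round_if_not = 30e9         # slashed if the conversion fails
annual_tech_purchases = 3e9        # promised yearly spend on OpenAI's tech
stargate_commitment = 19e9         # SoftBank's share of Stargate

total_if_not = openai_round_if_not + annual_tech_purchases + stargate_commitment
total_if_converted = openai_round_if_converted + annual_tech_purchases + stargate_commitment

print(f"If the conversion fails:   ${total_if_not / 1e9:.0f}B")        # ~$52B
print(f"If the conversion happens: ${total_if_converted / 1e9:.0f}B")  # ~$62B
```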
This is an untenable strategy, and I'd like to explain why. First, OpenAI needs at least forty billion dollars a year to survive, and its costs are only increasing. While we don't have much transparency into OpenAI's actual day-to-day finances, we can make educated guesses that its costs are increasing based on the amount of capital it's raising. If OpenAI's costs were flat or only mildly increasing, we'd expect to see raises roughly the same size as the previous one, six point six billion, something in that range. A forty billion dollar raise is nearly six times the previous funding round. Admittedly, multiples like that aren't particularly unusual. If a company raises three hundred grand in a pre-seed round and then a three million dollar Series A, that's a tenfold increase. But we're not talking about hundreds of thousands of dollars, or even millions of dollars. We're talking about billions of dollars. If OpenAI's funding round with SoftBank goes as planned, it will raise the equivalent of the entire GDP of Estonia, a fairly wealthy country that's also a member of NATO and the European Union. That alone should give you a sense of how fucking stupid this all is. Stupid, sure, but undoubtedly necessary. Per The Information, OpenAI plans to spend as much as twenty-eight billion dollars in compute on Microsoft's Azure cloud in twenty twenty eight. Over a third of OpenAI's revenue, per the same article, will come from SoftBank's alleged spend. It's reasonable to believe that OpenAI will, as a result, need to raise in excess of forty billion dollars a year, and probably more like fifty billion a year, until they reach profitability, which, you know, is never. And this actually has a reason: it's due to the growing cost of doing business, as well as the various infrastructure commitments they've made, both in terms of Stargate and deals with third-party suppliers like CoreWeave and, indeed, Microsoft.

OpenAI CEO Sam Altman's statements around costs also suggest that they're going up quite fiercely. In late February, he claimed that OpenAI was out of GPUs. While this suggests that there is demand for some of its products, like its horrifying image generation tech that has made abominations that insult Miyazaki, and I will hate them for it forever, and thus made them go viral, it also means that to meet the demand for those horrible abominations, OpenAI needs to spend more. And at the risk of repeating myself, that demand doesn't necessarily translate into revenue or profitability.

Now, second: SoftBank cannot fund OpenAI long term. I must be clear: OpenAI's costs are projected to be three hundred and twenty billion dollars over the next five years. SoftBank has to overcome significant challenges to fund both OpenAI and its part of Stargate, and when I say fund, I mean fund the current state of both projects, assuming no further obligations or complications. The Information reports that OpenAI forecasts it will spend, as I mentioned, twenty-eight billion dollars on compute with Microsoft alone in twenty twenty eight, and I apologize for repeating myself, but these numbers fucking matter. The same article also reports that OpenAI says it would turn profitable by the end of the decade, after building out Stargate, suggesting that OpenAI's operating expenditures will grow exponentially year over year. According to The Information, it expects its costs to surpass three hundred and twenty billion dollars between twenty twenty five and twenty thirty, with half of that going towards funding model training and development. How the fuck does building Stargate make them profitable? I really can't. Anyway, it won't. It's the same shit. But SoftBank has had to, and will continue having to, go to remarkable lengths to fund OpenAI's current forty billion dollar round, lengths so significant that it may lead to their credit rating being further downgraded.
As I mentioned, even if we assume the best case scenario, that OpenAI successfully converts to a for-profit entity by the end of twenty twenty five and gets that full thirty billion dollars, it seems unlikely, if not impossible, for it to continue raising the amount of capital it needs to continue operations. As I've argued in multiple newsletters and podcasts, there are only a few entities that can provide the kind of funding that OpenAI needs: big tech-focused investment firms like SoftBank, sovereign wealth funds like those of Saudi Arabia and the UAE, and perhaps the largest tech companies. I also want to be clear, because I keep getting messages asking, couldn't the government do it? It's not going to happen. Stop fucking asking me. It's just not going to happen. The government is not sending forty billion dollars to Sam Altman. They're not going to do it. I will apologize personally to each and every one of you if I'm wrong, but they would need to send forty billion dollars in cash to OpenAI. These entities can meet OpenAI's needs for some time, but not all the time. It's not realistic to expect SoftBank or Microsoft or the Saudis or Oracle or whoever to provide forty billion dollars every year for the foreseeable future, and just to be clear, that is what OpenAI needs. Eventually, even the Saudis have to take a break. And I don't know if you remember from a previous episode, but Masayoshi Son gets a lot of his money from them, and they're not happy with him. In fact, Masayoshi Son said a few years ago that he owed Mohammed bin Salman. Not a great guy to owe. Anyway. This really is especially true for SoftBank, by the way. They're bruised, they're battered after several rough years, including a failed multi-billion dollar investment in WeWork. Based on its current promise to not borrow more than twenty-five percent of the value of its holdings, it's near impossible that SoftBank will be able to continue to fund OpenAI at this rate, and forty billion dollars a year may not actually be enough. Based on SoftBank's last reported equity value of its holdings, they have about two hundred and twenty-nine billion dollars of stuff, meaning that they can borrow just over fifty-seven billion while remaining compliant with these guidelines. In any case, it's unclear how SoftBank can fund OpenAI, but it's far clearer that nobody else is willing to.
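And for what it's worth, here's that borrowing headroom math, a minimal sketch using SoftBank's last reported equity value of holdings as the only input.

```python
# Sketch of SoftBank's borrowing headroom under its own pledge not to borrow
# more than 25% of the value of its holdings, per the reporting above.

holdings_value = 229e9
pledged_borrowing_cap = 0.25

max_borrowing = holdings_value * pledged_borrowing_cap   # ~$57B ceiling

print(f"Borrowing ceiling while staying compliant: ${max_borrowing / 1e9:.2f}B")
```

That's a ceiling of just over fifty-seven billion dollars, against obligations that, as we just went through, run well past that.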
Now, we're going to move off the questions for a minute, because I just want to get into some problems, because it turns out that OpenAI, they got doo-doo ass. That's a professional finance term. I did just say that, and we're just going to keep it in. Now, OpenAI has started running into capacity issues, and this is a real problem. It suggests material instability in their business or infrastructure, and it's really not clear how OpenAI expands further. Let me explain. It's important to note that OpenAI does not really have any of its own compute infrastructure. The majority of its compute is provided by Microsoft, though as mentioned previously, OpenAI now has a deal with CoreWeave to take over the capacity that Microsoft was going to have, twelve billion dollars or so of capacity, in the future. Anyway, in the last ninety days, Sam Altman has complained about the lack of GPUs and pressure on OpenAI's servers many times. In my newsletter published a few days ago, I listed six such examples.

Should you be curious, these statements in a bubble seem either harmless or like OpenAI's growth is skyrocketing, the latter of which might indeed be true. But for a company that burns money with every single user, any mention of rate limits or performance issues suggests that OpenAI is having significant capacity issues, and at this point it's unclear what further capacity it can actually expand into outside of what's currently available. Sam Altman's complaining about melting GPUs. You've got the lead on Sora saying, yeah, we're going to have some problems showing you stuff for a minute, just because, like, yeah, we're melting GPUs. There was a whole thing about how you should expect delays on product launches and service problems. None of it's really good. And like I said, it isn't really obvious how OpenAI is going to expand much further. Remember, Microsoft has now pulled as much as two gigawatts of data center projects, walked away from a billion dollar data center development in Ohio, and declined the option, the one I just mentioned, on twelve billion dollars of compute from CoreWeave that OpenAI had to pick up, meaning that OpenAI may be pushing up against the limits of what is physically available. While the total available capacity of GPUs at many providers like Lambda and Crusoe is unknown, and indeed I don't know if Crusoe has a single data center at this point, we know that CoreWeave has approximately three hundred and sixty megawatts available, compared to Microsoft's six point five to seven point five gigawatts, a large chunk of which I think powers OpenAI.

If OpenAI is running into capacity issues, it could be one of the following. They could be running up against the limit of what Microsoft has available or is willing to offer the company. The Information reported in October twenty twenty four that OpenAI was frustrated with Microsoft, saying it wasn't moving fast enough to supply OpenAI with servers. It could also be that while OpenAI's capacity is sufficient, it does not have the resources available to easily handle bursts in user growth in a stable manner. Per The Information's reporting, Microsoft promised OpenAI three hundred thousand NVIDIA GB200 Blackwell chips by the end of twenty twenty five, or roughly eighteen billion dollars worth of them. It's unclear if this has changed since Microsoft allowed OpenAI to seek compute from other companies in late January twenty twenty five. I also don't believe that OpenAI has any other viable options for existing compute infrastructure outside of Microsoft. CoreWeave's current data centers mostly feature NVIDIA's aging Hopper GPUs, and while it could, and likely is, retrofitting its current infrastructure with Blackwell chips, doing so isn't easy or cheap. Blackwell chips require far more powerful cooling and server infrastructure to make them run smoothly, a problem which led to a delay in their delivery to most customers, according to The Information. And even if CoreWeave was able to replace every last Hopper GPU with a Blackwell, and they won't, it still wouldn't match what OpenAI needs to expand. One might argue that it simply needs to wait for the construction of the Stargate data center, or for CoreWeave to finish the gigawatt or so of construction it's working on. I want to be clear how impossible that is. I need to be clear. One of my least favorite responses to my work is people saying they'll just build more data centers.
They'll just go and build them, they're just going to build them right now. They won't. You can't just... data centers don't just grow from the fucking ground. As I've argued in the past, I have serious concerns over the viability of CoreWeave ever completing its alleged contracted one point three gigawatts of capacity. Based on my calculations, it'll have to spend in excess of thirty-nine billion dollars to build it. It's unclear how that will happen, and it doesn't have the money to do so. Like, it actually does not have the cash. I'll get into that later, but they don't have the money. However, even if I were to humor this idea, it's impossible that any of this project is done by the end of twenty twenty five, and I'd argue even in twenty twenty six. I can find no commitments to any timescale, other than the fact that OpenAI will allegedly start paying CoreWeave in October, per The Information, which could very well be for using their current capacity. I also can't find any evidence that Crusoe, the company building the Stargate data center in Texas, has any compute anywhere else. Lambda, a GPU compute company that raised three hundred and twenty million dollars earlier in the year, operates, according to Data Center Dynamics, out of colocation data centers in San Francisco, California, and Allen, Texas, and is backed by more than eight hundred and twenty million dollars in raised funds. All of that just kind of doesn't say that they have a data center of their own at all. A colocation, that means you're in someone else's, buddy. You don't own a house, you're renting one.

And just to be clear, OpenAI's ability to scale is entirely contingent on the availability of whatever data center providers it has relationships with, and all of their growth is coming from these two companies. I'll get there, don't worry. But every time I kind of say this stuff out loud, I feel my soul slightly stripped from me. I feel like I'm in hell, because this is an insane thing. If you or I went to someone and said, hey, look, I'm going to lose five billion dollars, but I promise you God will come out of the computer, they'd institutionalize you or me. They definitely would institutionalize me. It's just fucking stupid. But in any case, this means that OpenAI's only real choice for GPUs is either CoreWeave or Microsoft. And while it's hard to calculate precisely, OpenAI's best case scenario is that sixteen thousand GPUs come online in the summer of twenty twenty five as part of the Stargate data center project, and that's a drop in the bucket compared to the three hundred thousand of the fucking things that Microsoft has previously promised.

Now, one last thing: any capacity or expansion issues that happen with OpenAI will kneecap this company. OpenAI is, regardless of how you or I may feel about generative AI, one of the fastest growing companies of all time. It currently has, according to its own statements, over five hundred million weekly active users. Putting aside that each user is unprofitable, such remarkable growth, especially as it's partially a result of its extremely resource-intensive image generator, is a massive, horrifying strain on their infrastructure. The vast majority of OpenAI's users are free customers using ChatGPT, with only, like I mentioned earlier, twenty million paying subscribers, and the vast majority of them on the cheapest twenty-bucks-a-month plan.
OpenAI's services, even in the case of image generation, are relatively commoditized, meaning that users can, if they really care, go and use any number of other large language model services. They could use Bing, they could use Stable Diffusion, they could even use Grok if they really wanted. I don't like saying it, but free users, they're a burden on the company, especially with such piss-poor conversion rates, losing money with each prompt, which is, by the way, also the case with paying subscribers. And the remarkable popularity of its horrible image generator only threatens to bring more burdens, and more one-off customers that will generate a few abominable Studio Ghibli pictures of Garfield with giant knockers and then never return. If OpenAI's growth continues at this rate, it will run into capacity issues, and it does not have much room to expand. While we don't know how much capacity they're taking up with Microsoft, or indeed whether Microsoft is approaching capacity or otherwise limiting them, we do know that OpenAI has seen reason to beg for access to more GPUs. In simpler terms, even if OpenAI wasn't running out of money, even if OpenAI wasn't horrifyingly unprofitable, it also may not have enough GPUs to continue providing its services in a reliable manner.

If that's the case, there really isn't that much that can be done, other than significantly limiting free users' activity on the platform, which is OpenAI's primary mechanism for revenue growth and customer acquisition, limiting paid users' activity, or changing the economics behind its paid product. And this is quoting Sam Altman: they could potentially "find some way to let people pay for compute they want to use more dynamically." That's not good. Altman's come up with some other ideas, like an idea for paid plans on March fourth where twenty bucks a month gets you credits which you can use across features like Deep Research, o1, GPT-4.5, Sora and so on, with no fixed limits per feature, and you choose what you want. If you run out of credits, you can buy more. I just want to be clear that that is a terrible fucking deal. We have no idea what the credits would be worth, and it would definitely be rigged so that you would have to buy more. He's also brought up things like losing money on the two-hundred-bucks-a-month Pro subscription, but he's a funny one. Buried in an article from The Information from March fifth is a comment that suggests that OpenAI is considering measures like changing its pricing model entirely, with Altman reportedly telling developers in London in February twenty twenty five that OpenAI is primed to charge twenty or thirty percent of Pro customers a higher price because of how many Deep Research queries they're doing, though he suggested an a la carte or pay-as-you-go approach when it comes to agents, where they'd have to charge more than two hundred dollars a month. Just... this fucking guy.

The problem is that all of these measures, even if they succeeded in generating more money for the company, would also need to reduce the burden on OpenAI's available infrastructure. Remember, data centers can take three to six years to build, and even with Stargate's accelerated, and I'd argue unrealistic, timelines, OpenAI isn't unlocking a tenth of the promised compute that Microsoft gave them. Three hundred thousand GPUs is a lot. Sixteen thousand? Really not. So what might these capacity issues look like? What are the consequences? You love pale horses, I love pale horses. Let's get riding.
Though downtime might be the obvious choice, capacity issues at OpenAI will likely manifest in hard limits on what free users can do, some of which I've documented previously. Nevertheless, I believe the real pale horses of capacity issues come from arbitrary limits on any given user group, meaning both free users and paid users. Limits on what a user can do, a reduction in the number of image generations for paid users, the introduction of peak hours, or any increases in prices are a sign that OpenAI is running out of GPUs, which it's already said is happening, publicly. However, the really obvious thing, the real obvious pale horse, would be service degradation: delays in generations of any kind, five hundred status code errors, or ChatGPT just failing to produce an answer. OpenAI has, up until this point, had fairly impressive uptime. Still, if it's running up against the wall, this streak will end. The consequences depend on how often these issues occur and to whom they occur. If free users face service degradation, they'll bounce off the product, as their use is likely far more fleeting than the paid user's, which will begin to erode OpenAI's growth. Ironically, rapid and especially unprecedented growth in one of OpenAI's competitors like Anthropic could also represent a pale horse for OpenAI, though based on the monthly active users I've seen from Anthropic, I don't think that's going to be a problem. Now, if paid users face service degradation, it will likely cause the most harm to the company, as while paid users still lose OpenAI money, in the end at least they provide some money. OpenAI effectively has one choice here: get more GPUs from Microsoft. And really, its future depends heavily on both Microsoft's generosity and there being enough of them, at a time when Microsoft has pulled back from over two gigawatts of data centers, specifically, according to TD Cowen, because of it moving away from providing compute to OpenAI. Now, admittedly, OpenAI has previously spent more on training models than inference, and that's the actual running of them, and the company might be able to smooth downtime issues by shifting capacity. This would, of course, have a knock-on effect on its ability to develop new models, and the company is already losing ground, particularly when it comes to Chinese rivals like DeepSeek.

Now, I know this has been a long episode, and the fact is I'm not even close to finished. I have some more tough, tough questions and tough problems for OpenAI. Tune in to the next episode to hear them. I realize this has been a lot, and I know you're very patient with me. You let me read the long scripts. But this stuff is important. It's important to me, and I think you're going to find it important too. And I'll have a nice sexy conclusion to this at the end of the next one. But before I go, please vote for me in the fucking Webbys. I want to win. It's my pinned post on Bluesky and on Twitter; my handle is edzitron dot com on Bluesky. Please, vote for me in the Webbys. Anyway, catch you next episode.

Thank you for listening to Better Offline. The editor and composer of the Better Offline theme song is Matt Osowski. You can check out more of his music and audio projects at mattosowski dot com, M A T T O S O W S K I dot com. You can email me at ez at betteroffline dot com or visit better offline dot com to find more podcast links and, of course, my newsletter.
I also really recommend you go to chat dot wheresyoured dot at to visit the Discord, and go to r slash BetterOffline to check out our Reddit. Thank you so much for listening.
Better Offline is a production of Cool Zone Media. For more from Cool Zone Media, visit our website coolzonemedia dot com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.