OpenAI Head of Policy Chris Lehane discusses DeepSeek concerns and the future of AI growth. He speaks with Bloomberg's Tom Mackenzie.
We're going to now head over to the AI Action Summit in Paris, where Bloomberg's Tom Mackenzie is sitting down with OpenAI's Chief Global Affairs Officer, Chris Lehane.
Guys, thank you very much indeed. Yes, we're at the Grand Palais in Paris. I'll start with that question about DeepSeek, because it is a key question for people attending this event. Chris Lehane, head of global policy, of course, at OpenAI. How do you characterise the impact of DeepSeek? We were just hearing from Demis Hassabis, who I sat down with earlier. He said their claims around the way they built and trained this model, the cost and the number of chips they were using, he says that's exaggerated.
So I think there can be two things that are true here. One, they have built a really impressive model. It basically competes with what OpenAI had put out back in September. Now, we've since put out other, more advanced models, but clearly a very capable model. I think the second thing Demis said can also be true, which is perhaps, and we've seen news reports on this, that the costs, how the technology was derived, whether they accessed certain types of chips, that what was initially said was not actually the case. I do think the big takeaway, though, even if everything I just said turns out to be accurate and true, is that it is still a very impressive and competitive model. And so to me, the big takeaway in all of this is that this really reaffirmed something that OpenAI has been saying since the summer of 2024: that there are two countries in the world that can build AI at scale, the US and CCP-led China. And what that really means is that there is a global competition right now over whether the world is going to be built on small-d democratic AI rails or authoritarian, autocratic AI rails. And that's the big takeaway from this.
I want to get to that, but I want to push you on what you've seen with the discovery, because I know there's an investigation within OpenAI as to whether or not DeepSeek inappropriately used some of the inferencing data, distilling from your own models. Have you come to a conclusion?
We're still looking at it. Obviously, we've already made public that we had seen some evidence of that taking place. And just so folks understand, because distillation is not a normal word, at least not one I use at the breakfast table, there are different kinds of distillation, and I'll use an analogy. There's a version where you go to the library, take out a book, learn from that book, and that ultimately informs some of your work. That's fine. That takes place a lot; that's part of what happens in the AI space. There's another version where you go into the library, take the book, keep the book, put your name on the book, slap a cover on it, and hand it out as if it's your book. That's replication, and I think that's what we're concerned about, and again what we've seen some evidence of and are continuing to review to get a better understanding. We've talked with government officials about it, and we'll share more as we learn more.
Well, OpenAI's critics have said you trained your models on data, and you haven't been fully transparent in terms of the use of that data. I wonder how you would push back.
Just as there are good calories and bad calories, there is good distillation and problematic distillation.
Have any of your customers pushed back and said, look, we're uncomfortable with the pricing of your models at this juncture?
Well, let me put it this way. You know, OpenAI came out in November 2022, right, and within two months had a hundred million users. We're well over three hundred million today and continuing to see really strong growth. And what we try to do is have different models priced at different levels depending on how you ultimately want to use them. And that's something, just given how fast the technology is moving and the pace it's moving at, that you're constantly trying to evaluate.
Prices go down from here?
Oh yeah. So I think among the most interesting things we have seen, and this is an interesting aspect, is that the efficiency of systems, particularly those using the reasoning technology, is improving. Sam Altman, our CEO, put a blog out last night, maybe technically this morning, I don't know, at some point in the last twenty-four hours, that looked at three observations about AI. The first was that the more you spend on compute to build frontier models, the more powerful the model gets; it's pretty logarithmic. So you're going to require more and more compute, more and more investment in infrastructure. Secondly, over the last year or so, we've seen a big increase in efficiency, which is bringing down the cost of a token, and I think of a token as a unit of computing, by about 150 percent. But even as the costs come down, the number of people using it goes up, so that then puts pressure back on computing. An analogy here: car prices come down, more people drive cars, but then you need more energy, more roads. The third piece that came out, and this is his observation, is that the economic productivity you're getting is super-exponential.
Where are you getting the money to spend $100 billion initially on Stargate? Elon Musk says you probably don't have the money. Even Satya Nadella says, we're good for our $80 billion, that's on Azure.
So, first of all, we have incredible partners. We have SoftBank, which has a proven track record of raising enormous syndicated money from sovereigns and pensions.
So a lot of it is coming from them?
And then we have Oracle, right, which actually builds these. Then you have OpenAI's piece of this. First of all, in terms of the immediate question, there's $100 billion that's going to be going out the door in the immediate future. We already have a facility in Abilene, Texas. You guys need to come down; we'd love to have you. The Oracle guys have done an awesome job down there. No, no, why don't you come see it? We like to show, not tell. And then the OpenAI piece comes in two different parts, and I think this is key to understanding the economic model here. First of all, what we bring to this is the IP. And you can think of compute the same way you think of gas: regular gas, midgrade gas, and then premium gas. The premium gas is what people are going to pay for. The premium compute is going to be the most expensive compute in the world because it's going to be the highest-level compute. You only get that premium compute with our IP: IP going into the chip design, IP going into the data centers, IP going into how the clusters are structured. And then we are also a buyer of the compute, right, so we commit to buying a certain amount of the compute that's coming out, which helps the whole economic model work.
Chris, you worked in the Clinton White House. Is the Trump administration right to slash AI regulation?
I think what the Trump administration is focused on is one thing, and one thing very clearly: who is going to prevail in the competition between democratic and autocratic AI. That's what they get up and think about every day, at least from what I have seen and from what I have heard. And so I think they understand that you really have to be leaning into the innovation. Think about where the comparative advantages are. At the end of the day, this is actually pretty simple: whoever has access to compute is going to be in a strong position. And what makes up compute? It's data, it's energy, it's chips, and it's talent. Right, and think of what the PRC has. They have an enormous amount of data; it's an authoritarian state. Energy: ten nuclear facilities last year, another ten coming on this year. Our chips are better, but they're throwing a ton of money at it. And I think the talent piece is really interesting, because they do have great talent in China. But in capitalist systems, it is capitalism that unleashes the developer, the builder, the entrepreneur, the people who are actually making this stuff and the tools that solve hard problems. And that's where our advantage is.
Is that why you've signed up and partnered with Anduril? I'm thinking of the AI principles at Google: they've adjusted them, and Google is no longer ruling out building tech for defense, for the military. Yeah, you're signing up, you've partnered with Anduril. How far does that relationship go? How much are you prepared to embed tech into weapons systems?
Yeah. And we also just announced a partnership with the National Labs, which, you know, play an incredibly important role in how the US government thinks about national security. And the labs, obviously, like Los Alamos and others, are very big players in the broader national security ecosystem. So, you know, for us, right, we do want to be a partner on innovation with the government. We do believe it's incredibly important that democratic AI prevails, and that means making sure that the government is getting access to the highest capabilities. You know, we'll certainly do it consistent with our values and our principles, but at the end of the day, we do believe it is very consistent with our mission, which is to make sure AI benefits everyone, that you're ensuring AI is going to be built in a democratic way with democratic values.
Chris Lehane, thank you very much indeed. Global Head of Policy at OpenAI.