Sam Bankman-Fried Wants to Save the World

Published May 24, 2022, 4:05 AM

Sam Bankman-Fried is the founder and CEO of the crypto exchange FTX. His problem: How to spend billions of dollars to save humanity.

Sam is one of the most interesting people in crypto -- in large part because he doesn't think crypto is the most interesting thing in the world. He got into the business because he wanted to make as much money as possible in order to give almost all of it away.

He's now worth over $20 billion, and he's already donated hundreds of millions. In the next few years, he could give away billions more.

On today's show, he lists a few of the causes he's supporting -- and explains why he's likely to make massive political donations in 2024.

If you’d like to keep up with the most recent news from this and other Pushkin podcasts, be sure to subscribe to our email list.

Pushkin. One of the most interesting people in the crypto world right now is Sam Bankman-Fried. He's thirty years old, founded the crypto exchange FTX a few years back, and today he is worth around twenty billion dollars, according to Forbes. And yet he still lives with roommates and drives a Toyota Corolla. And maybe the most interesting thing about Sam: he's actually not that into crypto. He didn't get into it because he thinks bitcoin is going to replace the dollar, or that we're on the brink of some revolution in the very meaning of money. He got into crypto so that he could make as much money as possible and then give almost all of it away. So when Sam thinks about really big problems, he doesn't necessarily think about how much the price of bitcoin is falling, or that a big stablecoin fell apart a few weeks ago. He thinks about things like how to save humanity from extinction.

How many people will live, if we play our cards right as a world, like in the ninetieth percentile outcome? How many people will live in the future? The answer is trillions, probably, maybe hundreds of trillions.

I'm Jacob Goldstein, and this is What's Your Problem, the show where entrepreneurs and engineers talk about how they're going to change the world once they solve a few problems. My guest today is Sam Bankman-Fried, and his problem is this: how do you save the world? Before we get to the interview, I just want to take a minute here and set up this one big idea, this really useful intellectual framework that drives almost everything Sam does. It's called expected value.

I try to use it a lot because I think it sort of is the default correct way, in some senses, to calculate something. Like, if you're just trying to do a generic calculation, I think it's usually the right thing to use.

You can understand expected value by understanding how Sam decided to start his company, the crypto exchange FTX.
He was working as a trader making millions of dollars, and when he thought about starting FTX, he knew there was a really good chance it might fail. It might ultimately be worth zero. But on the other hand, if it succeeded, it could be worth tens of billions of dollars. So here is a slightly oversimplified version of how you would use expected value in this case. Say Sam thought there was only a one percent chance that his exchange would be really successful, but that if it were really successful, it would make him twenty billion dollars. The expected value of starting the exchange is the probability of it being successful, one percent, times the value if that success happens, twenty billion dollars, which comes out to two hundred million dollars. A lot. So in twenty nineteen, Sam started FTX. And Sam told me there is a really important lesson here about expected value.

One of the sort of takeaways that often ends up coming from really thinking hard and critically about expected values is that you should go for it way more than is generally understood.

Go big. You should really go really big, even if you probably will fail and wind up with zero.

That's absolutely right. And I think one of the intuitions for why that's the case, for why I think going big is often the right thing to do: well, if you think about it, you know, you've got obviously a number of options available to you. Somewhere on the far right-hand side of the distribution is, like, the best possible thing.

Meaning the really good outcomes.

That's right, the best possible thing you could imagine happening. And the best possible thing is probably really good. You know, it's probably orders of magnitude bigger than whatever you're sort of expecting to do.

Right, it's not a little better, it's wildly better. It's almost unimaginably better.

That's right. And if you're thinking about, well, if I found a company, how is it going to go?
You know, you're probably thinking this might be a million-dollar company, right? But the right-hand tail of that distribution is going to be a billion-dollar company, and that's a thousand times bigger. And so in order for it to be justified to choose that decision, if you really do care linearly about money, if you really do think that getting that marginal dollar is worth a lot, you know, even once you already have a lot of money, then it should lead you to think that the best outcomes might be outcomes that have a ninety-nine percent chance of failure. Right, because a ninety-nine percent chance of failure and a one percent chance of that billion is still ten million, and that's a lot. And so any time that there is some non-zero, non-negligible chance of a really, really good outcome, those are times when you're going to be incentivized, more than seems natural probably, to choose extreme outcomes.

I feel like that maps in different ways both to the work you do for money and your altruism, which obviously are tied up, right? They both seem very much rooted in those extreme outcomes. In the case of your work, it's an extremely large amount of money in a short amount of time, and in the case of the altruism, it's profoundly bad outcomes, like everybody dying. Right? Like, both of those are sort of the same in the kind of expected value universe: there are things we should probably think more about than seems intuitive if we're not using expected value.

That's exactly right. And, you know, when you think, as you said, from the altruism perspective, right: how many people will live, if we play our cards right as a world, like in the ninetieth percentile outcome? How many people will live in the future? The answer is trillions, probably, maybe hundreds of trillions, right? That's thousands of times more than the number of people who have ever lived.
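The back-of-the-envelope arithmetic above can be sketched in a few lines of Python. The one-percent probabilities and the twenty-billion and one-billion-dollar payoffs are the episode's illustrative numbers, not real estimates:

```python
# Expected value of a risky bet: probability of success times the payoff.
# Figures below are the episode's illustrative examples, not real estimates.

def expected_value(p_success: float, payoff: float) -> float:
    """Return the expected value of a bet that pays `payoff` with
    probability `p_success` and otherwise pays nothing."""
    return p_success * payoff

# Starting FTX: a 1% chance of a $20 billion outcome.
ev_ftx = expected_value(0.01, 20_000_000_000)
print(f"${ev_ftx:,.0f}")  # $200,000,000

# The later example: 99% chance of failure, 1% chance of $1 billion.
ev_startup = expected_value(0.01, 1_000_000_000)
print(f"${ev_startup:,.0f}")  # $10,000,000
```

The point of the "go big" argument is that the second bet, despite a ninety-nine percent chance of total failure, still has a ten-million-dollar expected value, far more than a sure-thing million-dollar company.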
And so that's just, that's a huge factor, right? Anything that we do that actually has impact on the whole future of the world is massively important.

It's kind of a ridiculous way to think at some level, right? It just gets so big. And, like, you're just some guy with a lot of money at some level, right, talking about, like, trillions of people and the whole future of humanity. Like, it gets weird.

Right, it does get really weird. You should really stress-test these and think, like, okay, do I really believe this? Like, do I really actually think that there's compelling evidence that, you know, these numbers that I'm looking at are as big as I'm claiming they are, or am I kind of bullshitting myself on this? Like, you know, you should absolutely have some humility around that. But it's not totally implausible. And there are examples of people, people who we've heard of, who are very famous, and people who no one has ever heard of, who have had massive, massive impact on the world, who have had that massive multiplier. And so it's not totally implausible. And so, you know, I think that, like, while we should absolutely have, you know, a healthy dose of humility towards extreme outcomes, you know, we should also acknowledge that they can be real, and that often the highest expected value things are in fact pushing directly towards them.

Well, and in fact you did hit the extreme right tail of the distribution in work, right? You did just get implausibly rich in a ridiculously short period of time. So at least on that one it worked.

That's right, and I think it's certainly been a big update for me in the direction of, like, this stuff is plausible.

Uh huh. So the fact that you got so rich so fast in crypto, does it push your altruism toward, like, well, shit?
If I could make twenty billion dollars in three years, everybody on Earth could in fact die from a pandemic or from some out-of-control AI, and I should spend some of the money to try and reduce the probabilities of that. I mean, is it like that?

Yeah, it absolutely does. I think it absolutely does make me think, you know what, like, these really extreme outcomes are probably plausible, and they're probably plausible enough that I should be taking them really seriously, you know. And that has pretty profound implications, I think, for what we should be doing.

We'll get to those profound implications and to exactly where Sam is giving his money away in just a minute.

That's the end of the ads. Now we're going back to the show. Sam told me he's given away about two hundred million dollars so far, which obviously is a lot, but it is also somewhere around one percent of what he plans to give away eventually. His giving has been broad: anti-poverty, animal welfare, healthcare. But he has started to focus on a few areas. One of the biggest is pandemic preparedness. That is a category that fits right into that expected value framework. You know, a low-probability but super deadly pandemic is worth spending a lot to prevent. Another place where he's been giving is politics. Sam was one of the biggest donors to President Biden's twenty twenty campaign. More recently, he donated over ten million dollars to support a candidate in a Democratic primary for a congressional seat in Oregon, largely because that candidate wanted to focus on pandemic preparedness. The primary was held just last week, and Sam's candidate lost by a lot.

I think that there are a lot of takeaways from it, and, yeah, I think I might do it again. I would do it a bit differently than last time. But, you know, fundamentally, I think it was a well-fought race. I think that, you know, he had a real shot. Um.
And, you know, going back to the discussion of expected values, right: if you're donating to political races such that you think your candidates are going to win ninety-nine percent of the time, you're almost certainly doing something wrong.

Right, because that person doesn't need your money.

Exactly. You should be donating such that you think that you have a pretty substantial chance of losing. And, you know, I firmly stand by that.

Do you expect you'll give a lot of money in the twenty twenty-four election cycle?

I would guess so. I don't know for sure. It's going to depend on who's running, but, you know, I would guess so.

Well, let's say Donald Trump runs for president. Would that cause you to probably give a lot of money to the person who's running against him?

That's a pretty decent guess. And, you know, I think that I'm going to be looking a lot less at, like, political party, um, from that perspective, and a lot more at, you know, sane governance. Like, that is, you know, the thing that I think I care the most about: governance. I think the United States has both a big opportunity and a big responsibility to the world to shepherd the West in a powerful but responsible manner, and everything that we do there has massive, massive ripple effects on what the future looks like.

You've talked before about being surprised at how little money is in politics. It is quite small, the amount of money that is donated, relative to how much money the government spends. Right? Does that lead you to want to donate a lot? Does it follow from that that, like, you'll probably donate a lot of money?

It follows that I might, in the end. That's basically what I think.
I think that, like, there is, in some ways, surprisingly little money in politics, given sort of the scope of its impact. Now, that doesn't necessarily mean there are good things to do donating in politics. Like, it might be that, sure, but, like, how are you actually going to do anything useful? You know? Maybe that's how it turns out, but maybe not.

You know, I mean, it's not necessarily the case that more money can have a meaningful effect on the outcome.

That's right. It's not necessarily the case, but it certainly gestures a little bit in that direction.

I mean, I imagine you have some probability distribution in your mind of how much money you might give in the next election cycle. Like, give me some number.

I would guess north of one hundred million, um. And, you know, as for how much north of that, I don't know. You know, it really does depend on what happens. Like, it's really dependent on exactly who's running where, for what. Like, these are super contingent things. But, um, but yeah, I think that gives maybe some sense of what the sort of scale might be here.

More than one hundred million, sort of spread across many races, organizations, but towards the twenty twenty-four election. So if that's a floor, what's the ceiling? Like, a billion? Might you give a billion?

Yeah, I think that's a decent thing to look at. I mean, I would hate to say, like, a hard ceiling, who knows what's going to happen between now and then, but as, like, at least sort of a soft ceiling, I would say, yeah.

Okay, so the ballpark is, like, one hundred million-ish to a billion-ish.

With, again, a lot of caveats on this. And, you know, there's a world in which it ends up being close to zero, if, you know, things just work out such that there isn't much I'm excited about.

That seems like a very low probability to me, based on what I know about you and the world.
Yeah, I think it's very low that it's actually going to end up being zero.

That does seem pretty unlikely. Yeah, a billion seems way more likely than zero to me.

I think that's probably right.

So, I think the most anybody gave last time, if I have the right numbers, is two hundred and fifteen million. That's for the twenty twenty cycle, the last presidential cycle. It seems like you'll probably give more than that, based only on what you've told me.

I think it is eminently possible. I think that would not surprise me.

I've heard you say that, you know, at some point over the next few years, you hope to find opportunities where you can give away, like, a billion dollars really quickly. What are some of the places you think that might happen? I mean, the twenty twenty-four election is clearly one. What are a few others?

I think pandemic prevention is potentially one of them. I think you look at, like, how much would it cost to, you know, really definitely prevent the next pandemic, or, you can never definitely prevent it, but to have, you know, a really good shot at it. I think you're probably talking tens of billions of dollars.

Which is crazy, that governments aren't spending that money, right?

That's right, it really should be governments spending it. And part of this might be working with governments on that.

Because it's not that much. Like, truly, if they could reduce the risk of a pandemic by half, say, for thirty billion dollars a year, like, do you actually think that, is that true? Or fifty billion?

I think something like that's probably true. I think something that's, like, not too far off from that order of magnitude. And probably, you know, by the time you're talking about, you know, many tens of billions of dollars, um, you know, that's something that you're probably going to need to have governments stepping in on. But, you know, I would be happy to throw in a fair bit to help facilitate that.
So, I should actually have asked this earlier, but to what extent are your political donations a pandemic prevention strategy?

So, I think most of them have been so far. And, you know, going forward, there may become other policy things, like AI policy, that end up being really important. And so it's not to say that, like, pandemics are the only things that are ever going to matter to me policy-wise, but that has been the big one so far.

And is the idea there, like: tens of billions of dollars a year to significantly reduce the risk of another pandemic is not that much for the government, but it's more than you have, right? So you need a lever. You can't actually spend all of your money and meaningfully reduce the chances of another pandemic. And so if you can use political donations to elect candidates who want to spend money to prevent a pandemic, that works.

That's right.

So you're giving lots of money to political candidates. You're also doing a lot of work to, um, shape regulation of crypto in the US. Tell me about the overlap between those two things.

So, most of the giving has not been done with crypto in mind. And I have been doing a ton of policy engagement on that, but that's mostly going to DC and talking with policymakers.

I mean, here's the narrow version of the question. Is part of what you want from your political donations some particular outcome in crypto regulation?

That is not a big part of it.

I wanted to talk more about crypto with Sam, but our time was running short. I do promise to talk more about crypto on the show before too long. And Sam did have time for a quick lightning round. We'll have that lightning round in a minute.

Okay, let's get back to the show. We're gonna close with the lightning round.

Let's just do a lightning round. A few quick questions, and you can answer them fast. What's the least rational thing you do?

Least rational thing?
I know I spend way too much time, like, aimlessly browsing my Facebook feed.

Is it true you still sleep on a beanbag chair? And if so, why?

I did last night. I do many nights. Um, it's, uh, I find it, I kind of, I don't know, it's what I'm used to, is honestly just part of the answer there. It's, like, it's what feels natural for me.

If everything goes well, what problem will you be trying to solve in five years?

I would say the details of what to prioritize for pandemic prevention funding, with, you know, institutes that have been set up and are, you know, online, and a ton of capital into it, and, you know, really great teams who are devoting themselves to building it out.

So the dream is you'll be, like, deep in the weeds figuring out how to prevent a pandemic. Um, I've seen in other interviews you're doing lots of different things during the interview. I couldn't actually tell if you were doing other things during this interview, but were you? And if so, what were you doing?

I was playing a game of Storybook Brawl.

Say the name of the game again?

Storybook Brawl.

How did you do?

I took second place out of eight. Could have been worse. And I apologize, I do have to hop off.

Okay, last one. What's one piece of advice you'd give to somebody trying to solve a hard problem?

One piece of advice? I would say, just keep going. Just keep going, step by step, you know, try and solve it bit by bit, and, you know, eventually, hopefully you'll get there.

Sam Bankman-Fried is the founder and CEO of FTX. Today's show was produced by Edith Russlo, edited by Robert Smith, and engineered by Amanda K. Wong. You can reach us at problem at Pushkin dot fm, or you can find me on Twitter at Jacob Goldstein. I'm Jacob Goldstein, and I'll be back next week with another episode of What's Your Problem.

What's Your Problem?

Every week on What’s Your Problem, entrepreneurs and engineers talk about the future they’re trying  