Just as Facebook was on the verge of becoming Meta Platforms, Inc. in late 2021, a scathing series of articles was published by the Wall Street Journal. The reporting was based on internal documents that detailed the ways Facebook’s platforms “are riddled with flaws that cause harm, often in ways that only the company fully understands.” The source for these internal documents — some tens of thousands of pages — became known as The Facebook Whistleblower. The name behind these revelations is ex-Facebook product manager Frances Haugen.
On this episode, Haugen reveals why she came forward, what she hopes to accomplish with her new book, The Power of One, and what she sees as the perils — and promise — of an ever-changing technology landscape that requires transparency to keep itself honest.
Hi everyone, I'm Katie Couric, and this is Next Question. In September of twenty twenty one, The Wall Street Journal began to roll out a series of eleven stories that would have a major impact on the way people think about technology companies. The series was called The Facebook Files, and the source for the series, maybe one of the biggest journalistic treasure troves of the century, was a former Facebook product manager named Frances Haugen. Frances had only been at Facebook for a couple of years, but during that time she became increasingly alarmed by a disturbing pattern.
The only data that we get out of these companies is how many users do they have? How much time do they spend? How many ads do they click on? What's their revenue? You don't get the societal costs that come as a consequence.
It seemed that Facebook was prioritizing their own profits over public safety and putting people's lives at risk. So she blew the whistle. She made tens of thousands of pages of internal documents available to The Wall Street Journal. And what happened next, testifying before the US Congress, the UK and EU parliaments, and filing a complaint with the SEC, exposed the lengths the tech company would go to to mislead the public and grow its bottom line. Now, Frances has written a book. It's called The Power of One: How I Found the Strength to Tell the Truth and Why I Blew the Whistle on Facebook, and I'm very glad that it's brought her to Next Question. First of all, Frances, it's great to finally meet you. I am a huge admirer of yours.
Excited to be here. Thank you so much for inviting me.
Of course, it's been quite a ride for you since you became the Facebook whistleblower in October of twenty twenty one, now a year and a half or so later. How are you feeling? Are you happy that you came forward? Do you have any regrets? Would you do it all over again?
You know, when I originally came forward, I had very very basic goals, like I wanted to not have to carry a secret that I thought had the potential to really impact the lives of others. I came forward because I was concerned about how Facebook was operating in African countries and in Southeast Asia, and I then and still genuinely believe that if we continue to operate the way we do, there are millions of lives on the line from things like ethnic violence. But the world has changed a lot since I came out. I was really shocked last week when the Surgeon General issued his advisory on social media and mental health for kids. I've been amazed at how just knowing that these companies knew these harms were real across a wide variety of harms has really galvanized the activist community. It's caused legislative conversations around the world that were pretty stalled for a long time. And so if I could do it again, I would totally do it again. I've been incredibly fortunate in how smoothly it's gone, and it's exceeded my wildest expectations.
You've written a book about your experiences. Why did you want to take pen to paper, fingers to laptop, and share your story with the world?
One of the things that kind of baffles tech journalists when I talk to them is, because tech journalists live in a little bit of an echo chamber. You know, like our classic criticism of tech is tech lives in an echo chamber. Tech journalists also live in an echo chamber a little bit. When I say to them, you know, when I take flights, you know, I'm a friendly person. I'm one of those annoying people that talks to their seatmates.
I did that too.
Yeah, and it's amazing. At least half the people I sit next to have never heard of the Facebook whistleblower. And so it's one of these things where culture change, and that's the thing that we really need, we need to reset our relationship with these companies, takes a long time. And this book, I'm hoping, helps a much much larger set of people, a much more diverse set of people, get a seat at the table by kind of laying out, like, what are the conversations we need to be having, like what are the choices we get to make in the next few years, because we are in a moment of inflection and we need to have as many informed people at the table as possible.
You know, you talk about Vivek Murthy, the Surgeon General, issuing a warning about the dangers of social media for kids, and other people becoming more aware, and I'm just curious if you feel this way. But for me, I'm kind of like, what took so long? This doesn't take a brain surgeon, a rocket scientist, or a tech expert to know that people have become incredibly addicted to social media. Tristan Harris was sounding the alarm after he left Google. All kinds of experts were saying, this is dangerous. Why did it take so long for this to really become headline news?
One of the things I talk about in my book is, you know, what's the difference between, say, the automotive industry and social media when it comes to our ability to hold it accountable, or our ability to understand it. Back in nineteen sixty five, it's going to sound shocking, there were no seatbelts in cars, no airbags. I remember that. Yeah, like I listened to my parents tell stories about the kids all jumbling all over each other in the back of the station wagon, and it's like, really, wow, a different world. We now put eight year olds in car seats, right. But the world changed very suddenly when a guy named Ralph Nader came out with a book called Unsafe at Any Speed. And what really changed was that people didn't realize that there was the ability to live in a different world. You know, our fatality rate today is way less per mile driven for cars because of a long series of actions. But the thing that people need to understand is, when Ralph Nader published that book, you know, there were one hundred thousand automotive engineers in the world. When I came forward, I think there were on the order of probably three hundred or four hundred people in the world who really understood how systems like Facebook's work. And of those people, you know, we are educated in such narrow ways, I think a lot of those people didn't understand the larger societal consequences of those choices and decisions. And so Ralph Nader could have a chorus of automotive engineers all say this is happening. When it comes to social media, each of us sees a different world. You know, for many, many, many people who would be the ones asking those questions, when they open social media, they see their friends and family, who are likely relatively similar to themselves. You know, the idea that Facebook could be radically different, radically more dangerous in a place like an African country or in Southeast Asia, it sounds foreign to us. We're like, social media is about looking at pictures of cats. And so I think that's a big part of it. Like, we need to be able to have the right to study social media, we need to have the right to be able to get independent data off these systems, because then we can have definitive conversations.
Is that because we're not having a universal experience? It's a highly, deeply personalized experience for everyone. So it's not as if we're all driving cars; we're all in different vehicles, if you will. So it's not unifying people to realize that they have to demand change.
It's really important for people to understand just how different those worlds are in terms of transparency. And that's part of why I wrote the book. You know, I've gotten to live a lot of that arc of how we write software, or like what does it mean to have experiences online, and I wanted to walk people through, you know, this is what changed from step to step to step, so that more people have that context.
I'm just curious, Frances, I know you weren't super psyched to go to Facebook. When did you realize we're not in Kansas anymore? Something is awry.
It was interesting. Like, when they reached out to me, I was like, the only thing I work on is misinformation. You know, it's interesting. I got there, and I think one of the first moments where I was like, wow, this is chaotic, has to do with the role I had, something called a product manager. So product managers are responsible for helping articulate what is the problem we're trying to solve, how might we solve that, and then, once we come to consensus on a solution, what's the series of engineering tasks that will allow us to execute that solution. I had a role of being a product manager. And Facebook understood that they were a different enough company that they had seen that if people came in from the outside, they didn't succeed at a very high rate, like there was a lot of churn, and so they established a boot camp for two full weeks to just give kind of a basic level of here's how Facebook works. And my manager pulled me out of it after like three days. He was like, you know, things are on fire, like we have to come up with a plan for the next six months, even though you know nothing about the problem or what's going on. Like, we need you to articulate a plan now. That was kind of my first warning, where I was like, oh wow, like the house is on fire. The house is on fire, and people are running around, even having the self-awareness to be like, oh, we know that if people don't get at least a certain amount of bootstrapping, Facebook is very hard to figure out, even internally, how it works. And it was interesting. I showed up for that first meeting, you know, the one that my manager was urgently pushing me to prepare for, and we spent twenty minutes basically discussing, should I have a job? So imagine you show up, you've just gotten hired for this thing, and all the leadership is saying is like, why do you have a team? Like, why does this team exist? Like, think about that for a moment. You know, because activists have told you, because the UN told you, hey, in Myanmar, your negligence around misinformation killed twenty four thousand people. You know of a problem. And yet I could sit in a room full of the leadership of safety, having them be like, should this team even exist? Right? You can imagine that first six months was a little stressful.
What was Facebook doing in the lead-up to the events of January sixth? We'll talk about that right after this. We're back with Frances Haugen, Facebook whistleblower and author of The Power of One. Frances, did Facebook really care about misinformation, or did the company just feel like dealing with it was an exercise in futility?
I think part of the problem was that Facebook had taken the most obvious path to deal with misinformation, which was, let's hire experts, let's hire journalists to help us assess what's true and false. But that kind of approach, you know, fixing safety after the fact, like, oh, we've already hyper amplified you know, extreme content. Now we're going to pluck out the dangerous parts. Those strategies don't scale. You know, Facebook has three billion users. There were maybe a few thousand fact checks being done a month, maybe a few thousand. You can see why you actually need to take a different kind of approach, which is coming in and saying, why are the algorithms rewarding extreme content? How do we change our operations to deal with that fact?
As they say, lies make it around the world before the truth has the chance to tie its shoes. And so what was the alternative? What would that approach have been? I mean, what is the answer?
Let's take, for example, something as simple as, should you have to click on a link before you reshare it? So you hit the nail on the head. One of the problems with third-party fact checking is journalism takes time. At Facebook, on average, it was like two or three days for someone to write a fact check, and they put a huge amount of effort into trying to build prediction systems to guess which are the pieces of content that might go viral, because we have to give the journalists a head start. Alternatives are things like, if you require people to click on a link before they reshare it, that reduces misinformation by like ten or fifteen percent, just because people have to pause and think for a moment. You know, there's one or two billion people on Facebook who live in places where Facebook is the internet. You know, they might have become literate to use Facebook, and as a result, in those places, in some countries, thirty five percent of everything you see in your news feed is a reshare. And so Facebook wasn't willing to take the hit of, you know, zero point one, zero point two percent less profit from reducing the amount of content that was moving through the ecosystem as a whole. So those are kinds of product design ways of dealing with misinformation.
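To make the reshare-friction idea concrete, here is a minimal sketch, assuming a hypothetical session object and link check; it is not Facebook's implementation, just an illustration of gating the reshare action behind a click-through so the user has to pause first.

```python
# Illustrative sketch of "click before you reshare" friction, as described above.
# NOT Facebook's code; the names and structures here are hypothetical.

from dataclasses import dataclass, field


@dataclass
class UserSession:
    user_id: str
    # Links this user has actually opened during the session.
    clicked_links: set = field(default_factory=set)


def can_reshare(session: UserSession, link_url: str) -> bool:
    """Allow a reshare only if the user has clicked through to the link first."""
    return link_url in session.clicked_links


def attempt_reshare(session: UserSession, link_url: str) -> str:
    if can_reshare(session, link_url):
        return "reshared"
    # Instead of blocking outright, prompt the user to open the article,
    # which adds the pause-and-think moment Haugen describes.
    return "prompt: open the link before resharing"


# Usage example
session = UserSession(user_id="u123")
print(attempt_reshare(session, "https://example.com/story"))  # prompt
session.clicked_links.add("https://example.com/story")
print(attempt_reshare(session, "https://example.com/story"))  # reshared
```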
Are third-party fact checkers still the primary way Facebook is ostensibly trying to combat mis- and disinformation?
You know, I don't know, because we have no transparency. You know, right now, we don't have any transparency into how Facebook operates. We know that Mark fired lots and lots of safety people during his year of efficiency, so it's possible that things have changed, but given his recent behavior, it's unlikely things have materially changed.
Let's talk about sort of the public-facing explanation from Facebook. I interviewed Sheryl Sandberg in twenty nineteen, when you were still working at Facebook. I asked her if she felt like Facebook was doing enough to invest in its security. Let's take a listen, and let's hear your reaction to what Sheryl told me.
We've put tremendous engineering resources in, and we're doing things like red teams, asking what do we think the bad guys would do and how would we do it? So we're never going to fully be ahead of everything. But if you look at, if you want to understand what companies care about, you look at where they invest their resources, and if you look back three to five years and you look at today, we've totally changed where we invest our resources. And my job has changed too. If I look at, I've been at Facebook eleven and a half years, for the first eight or so I spent most of my time growing the company and some time protecting the community. We always did some protection, but now that's definitely flipped. My job is a majority building the systems that protect and a minority growing. And so we're definitely changing as a company. We're in a different place across the board on all of these things.
Do you think you're changing enough fast enough?
I hope, so we're trying.
What's your reaction to that, Frances?
It's so interesting, when we listen to media, you know, a film clip, an audio clip from the past, you know, you can often hear the emotional echoes of that moment. And I think back in twenty nineteen, she's quite earnest. Like, the only part of Facebook that was growing was the Civic Integrity team. I think twenty nineteen was the year that the UN report on Myanmar came out and, you know, firmly placed blame on Facebook. They were still living right in the immediate echoes of Cambridge Analytica. For context, right about when you interviewed her, that would have been when the FTC fined Facebook five billion dollars because of privacy violations from Cambridge Analytica. But over the course of the next two years, or even the next year, I think Facebook began to realize that having a big safety team, having people with PhDs asking questions, was putting Facebook in a quite awkward position, because the more people dug in, the more people having the ability to ask questions, they found things, and they found things that were quite disturbing. I think they did a pretty good job in the run-up to the twenty twenty election, but as soon as the twenty twenty election passed, they fired that team, the Civic Integrity team.
One month after the election, yeah, one month, and of course five weeks later was when the Trump supporters, many of whom organized on Facebook, stormed the US Capitol.
And I think part of what happened was, there are papers in the Facebook Files that say, like, we saw all this building up. But I think because now no single person was responsible, no one felt like they had the authority to go in there and intervene.
Where do you draw the line? And nobody has a crystal ball to say these people are going to do that.
Facebook talks about the difference between movements and what are known as adversarial movements. So an adversarial movement knows that they're violating Facebook's policies and actively does countermeasures to try to get around Facebook. That's one way we differentiate, like, does this movement think they're doing something wrong? Right?
Right?
And you saw that extensively with Stop the Steal.
If the Facebook Civic Integrity Team had not been disbanded less than one month after the election, what would they have seen, and what would they have done with the activity they witnessed going on on the platform?
In the run-up to the twenty twenty election, there were a number of things in place where Facebook said, hey, we know we have vulnerability in our system. For example, live videos. This is where, you know, I can film something on my phone and Facebook will put a little announcement at the top of people's feeds. Facebook knew that live video was a particularly big vulnerability for the company, because video is harder to monitor than audio or text, for sure. So you either can deal with that after the fact, or you can say, hey, what's leading to that video going viral? And in the case of live video, Facebook said, hey, you know, every piece of content on Facebook earns a score based on how relevant it is to you, Katie, or to your listener. You know, is it similar to other things that they've seen before? Does this person generally produce content that people would like to engage with? You know, there's a bunch of factors. You earn a score, and that gives you a priority in the news feed. When it came to live video, they would give a boost. They'd say, that score, we're going to multiply it by eight hundred and fifty times to make sure that it will show up at the top of your feed. They said, hey, we know this is dangerous, we're going to only boost it sixty five times in the run-up to the election. It's a little tiny detail, but when they stormed the Capitol, the rioters actively used live video to coordinate. And so there's these little things where they could have had the safety measures on that were in place on election day, but because no one felt they had the authority to say we're in a situation, no one turned those on until the day after they stormed the Capitol. These are little, tiny details, but you have to remember, when people interviewed the rioters after January sixth, they said it seemed real. It seemed real. It seemed like everyone was saying, like, we're about to experience a coup, like we need to go and save our democracy. These little product tweaks would have changed the information environment that those people experienced, and who knows what would have happened with January sixth.
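For readers who want to see the mechanics, here is a minimal sketch of the scoring-and-boost idea Haugen describes; the eight hundred fifty and sixty five multipliers come from her account, while the function names, fields, and relevance numbers are purely illustrative assumptions, not Facebook's real ranking system.

```python
# Illustrative sketch of feed ranking with a live-video boost, per the description above.
# Not Facebook's actual system; field names and scores are hypothetical.

LIVE_VIDEO_BOOST_NORMAL = 850  # multiplier Haugen cites for live video
LIVE_VIDEO_BOOST_SAFETY = 65   # reduced multiplier used in the run-up to the 2020 election


def score_item(base_relevance: float, is_live_video: bool, safety_mode: bool) -> float:
    """Compute a feed-ranking score: base relevance times any live-video boost."""
    boost = 1.0
    if is_live_video:
        boost = LIVE_VIDEO_BOOST_SAFETY if safety_mode else LIVE_VIDEO_BOOST_NORMAL
    return base_relevance * boost


def rank_feed(items: list[dict], safety_mode: bool) -> list[dict]:
    """Sort candidate items by boosted score, highest first."""
    return sorted(
        items,
        key=lambda it: score_item(it["relevance"], it["is_live_video"], safety_mode),
        reverse=True,
    )


# Usage example: a low-relevance live video outranks a regular post only
# when the full 850x boost is applied (safety measures off).
candidates = [
    {"id": "post", "relevance": 0.9, "is_live_video": False},
    {"id": "live", "relevance": 0.005, "is_live_video": True},
]
print([it["id"] for it in rank_feed(candidates, safety_mode=False)])  # ['live', 'post']
print([it["id"] for it in rank_feed(candidates, safety_mode=True)])   # ['post', 'live']
```

The point of the sketch is simply that a single multiplier chosen by the platform can decide whether a low-relevance live video outranks everything else in the feed.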
So why didn't they do it? Because the team had been dissolved.
There was no longer a person in the company who wore the hat of saying, let's make sure we're a positive force in society, right. There was diffuse responsibility spread across little tiny slivers. And I think after they dissolved the team, which was on December second or December third, I don't think there was anyone who felt like they had the authority to say, hey, some people are going to have to work over the holidays, right, like, this is a big enough deal that someone's going to have to do something. And I think that's why Facebook was asleep at the wheel.
When we come back, Frances and I talk about improvements in social media that could have a positive impact on the teen mental health crisis. We're back with Frances Haugen. Do you think Mark Zuckerberg just cares about profit over everything? And is there something about the broader culture of Facebook that makes this almost a Sisyphean task, to try to control, or at least even monitor or remove, really dangerous content?
So I'm glad that you bring up Mark. So, just so people understand how different the leadership of Facebook is versus other companies, Mark Zuckerberg holds about fifty five, fifty six percent of the voting shares that control Facebook, so that means he's the chairman of the board, he's the CEO. If he wants to invest tens of billions of dollars in the metaverse, no one can stop him, because he is the only voice that matters. I do think responsibility goes to the top, right. Like, part of the challenge here is you have a man who has been CEO since he was nineteen years old. Facebook is intimately tied to his identity, and it's very hard for people to admit that their life's work might be hurting other people. And so, unfortunately, there is an internal culture at the company where the people who surround Mark know that being too critical isn't going to get you very far. I think part of why Sheryl left, from when I came out she left maybe six months after the Facebook Files happened, I think is Sheryl was a voice that was trying to push for responsibility, and there wasn't really an appetite internal to the company to do that.
To this point, in twenty nineteen, I spoke with her about whether Facebook's business model ultimately rendered implementing security measures bad for business. Let's hear what she said.
So on this, I'm really pretty proud of our track record. If you look back a number of years and you listen to our earnings calls, so earnings calls are exactly what people are worried about, they're directed at investors, it's our quarterly report. If you actually watch us on earnings calls, we are spending as much time talking about the measures we take on safety and security as we are about our business growth.
Easily.
We actually said many quarters ago, this is so important to us that we are going to make massive investments and change the profitability of our company by making real resource investments. And we have, to the tune of billions and billions of dollars, and we will keep doing it. We've taken action after action after action that is better for protecting the community than it is for our growth, and we're going to continue to do that. Mark has said it over and over again. I have said it over and over again.
Do you believe that, Frances?
Oh, Katie, I'm so glad you played that clip for me, because I am totally going to go get the transcripts now of the investor calls, just to see how things have changed, right. Because I think back in twenty nineteen they were trying, like, they got burned by Cambridge Analytica. They lost a huge amount of goodwill from regulators, from users. I don't think that sentiment she expressed is still true. One of the things that Elon Musk showed was that you could fire all your safety teams and no one batted an eye, right, because we don't have any stats. I want to be super honest with people. Mark Zuckerberg has fired a huge number of safety people in the last six months, and the market has rewarded him. You know, their stock price is going up because Facebook looks more profitable. But he also fired their AI safety team, and then they open sourced their large language model. When people talk about existential risks from AI, allowing for mass proliferation of these technologies doesn't allow us to do thoughtful, slow, intentional development. And so I don't think what she's saying is true anymore. We're living in a very different world.
In fairness to Sheryl, do you think it was true at the time?
I think in twenty nineteen, they were trying hard. If Facebook had continued in the vein they were working in in twenty nineteen, I probably would have never been a whistleblower. You know, I probably would have been like many people who came before me, who kept their heads down and kept trying, kept trying to make it safer, and eventually burned out, because the only part of Facebook that was growing was the safety teams in twenty nineteen. By twenty twenty, they had given up on that. You know, they'd said, we're not getting acknowledged for the effort we're putting in, and these teams are just liabilities.
Let me ask you just, what can be done? We've heard about kids and mental health. We've heard about misinformation and the election. We've heard about so many things that are causing harms to society because of social media platforms like Facebook. Section two thirty protects these social media platforms from liability for the content they may carry. The Supreme Court just made a ruling on that, and I guess now it's up to Congress. But in the best of all possible worlds, what would you like, Frances, to be done to rein in the social media companies, if you had to wave a magic wand?
So I think it's important for people to understand kind of what's the tool chest that's available to us. I think the way forward is more something like what Europe did. So Europe came in and said, hey, you need to be honest with us about the risks, the harms you know about. You need to publicly tell us how you're going to reduce those risks, and you need to get us enough data that we can see if you're making progress on those things. Because, for context, I think the fundamental problem is our relationship with these companies is skewed.
And Congress doesn't seem to really understand the rudiments of the technology that powers Facebook well enough to actually want to do something about it.
I think the thing that's going to push Congress over the line is actually the growing crisis around teenage mental health. Historically, just for people's context, over the last sixty years, we've had only a handful of Surgeon General advisories. It's things like seat belts save lives, smoking causes cancer, breastfeeding helps infants' health, things that we take for granted today. But before those advisories happened, there was ambiguity, there was controversy. Historically, after a Surgeon General advisory is issued, usually within two to three years, some sort of legislative action takes place. I think it'll be really interesting to see how things play out over the next year or two, at least in the context of kids.
And what can be done about that? Tell me how to reverse or stop the negative impact that social media and things like Instagram are having on young people.
So you mentioned earlier that, you know, the business model is working counter to our own well being or safety. Let's take a look at sleep deprivation and kids. So one of the things called out by the Surgeon General was that thirty percent, thirty percent of teenagers say they use social media till midnight or later most weekdays. That's crazy. When we look at risk factors for things like multiple kinds of mental illness, that's not just depression and anxiety, it's also things like bipolar. When we look at risk factors for accidental death, both automotive and just general accidents. When we look at risk factors for substance use, uppers because they're tired, downers because they're depressed. All of those things link back to sleep deprivation. We've known for twenty years that we can influence whether or not people use products. Imagine if, for two hours before eleven, Instagram got a little bit slower and a little bit slower and a little bit slower, like you had to push the post a little harder. Maybe there was a lag on TikTok between videos. Who knows. We've known for twenty years that if you make an app a little bit slower, people use it less. Imagine as you approached your bedtime, you just got tired and went to bed. That feature is not live on Instagram today. That's a meaningful thing that would help kids go to bed.
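Here is a minimal sketch of that bedtime-slowdown idea, assuming a hypothetical feed service that can add artificial latency; the two-hour ramp and three-second cap are invented for illustration and are not any platform's actual parameters.

```python
# Illustrative sketch of a progressive pre-bedtime slowdown, per the idea above.
# Purely hypothetical: no platform's real parameters or API are represented here.

from datetime import datetime, time

RAMP_HOURS = 2.0         # start slowing down two hours before bedtime
MAX_DELAY_SECONDS = 3.0  # maximum artificial latency added at bedtime


def extra_delay_seconds(now: datetime, bedtime: time) -> float:
    """Return how much artificial delay to add, growing linearly toward bedtime."""
    bedtime_dt = now.replace(hour=bedtime.hour, minute=bedtime.minute,
                             second=0, microsecond=0)
    hours_until = (bedtime_dt - now).total_seconds() / 3600.0
    if hours_until <= 0:
        return MAX_DELAY_SECONDS  # at or past bedtime: full slowdown
    if hours_until >= RAMP_HOURS:
        return 0.0                # well before bedtime: no slowdown
    # Linearly ramp from 0 up to MAX_DELAY_SECONDS over the final two hours.
    return MAX_DELAY_SECONDS * (1.0 - hours_until / RAMP_HOURS)


# Usage example: added delay at 8:30 pm, 10:00 pm, and 10:59 pm for an 11:00 pm bedtime.
for hh, mm in [(20, 30), (22, 0), (22, 59)]:
    now = datetime(2023, 6, 1, hh, mm)
    print(f"{hh:02d}:{mm:02d} -> {extra_delay_seconds(now, time(23, 0)):.2f}s")
```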
How about if parents come in and take their kids' phones?
We should definitely do that, right. We ignore the fact that these technologies are extremely powerful and addictive, and they operate at a level of independence that no other consumer product does today.
In closing, Frances, I feel like I have to ask you about AI, which is the new boogeyman of technology, and rightfully so. It was pretty chilling when these AI leaders said that artificial intelligence poses a threat as big as pandemics and nuclear war, and it's sort of like, holy shit. And yet you wonder, since the government has been so impotent when it comes to figuring out how to regulate social media, what they're going to do about this looming threat.
So I think it's always important to remember that these are percentage risks, right. So this is, you know, they say there's a one percent or two percent risk, which is terrifying, right, you know, a one or two percent risk of extinction. We should take those seriously. But I think one of the things that people also need to be honest about is we kind of let the cat out of the bag, right. I think things like, Fortune five hundred companies should get together and say, hey, we will only buy generative AI products that meet this bar of safety. There's a code of practice, a code of conduct, where we're like, we're not going to let our economic might fuel development of AI unless you do it in an intentional, thoughtful, responsible way. I think that's totally a thing that should happen, one hundred percent. I think Sam Altman talks about having licenses, of saying, hey, right now there is a market disincentive to be safe, you know, move fast and break things, to quote Mark Zuckerberg. The fact that Facebook fired its AI safety team, no one's punishing them for that. But when people talk about existential risk, to not have that existential risk, we have to say, no one in the world, and that includes governments and militaries, gets to have AIs more powerful than a certain level. You know, how do we have a just, more stable world? Because if we are just escalating, the path of escalation will lead to all those existential risks.
Is there anything you're excited about when it comes to AI, Frances, so we don't have to end on a terrifying note?
Yeah, we need to talk about short term and long term. The short term on generative AI, I think, is transformative. Right now around the world, there are literally billions of people who don't have doctors. We're going to live in a world in the next ten years where every child in the world has a pediatrician, it just might be a robot pediatrician. We're going to live in a world in the next ten years where every child in the world is going to have the highest quality reading instruction that has ever existed for humanity. You know, an endlessly patient tutor that will sit there, and over and over again, as long as that kid keeps working, will help them learn to read. That's going to be transformative. There are high probability short term rewards that I think are almost certainly going to happen. It is going to transform the world. The thing I try to caution people on is, those existential risks are very low probability, and they're much longer off, and so it is more important for us to try to build a just world where the motivations, the incentives, for doing those existential risks are as low as possible. So one of the things that I am always trying to remind people is, we have invented new communication technologies before. When we invented the printing press, suddenly a bunch of people learned to read, and people started publishing pamphlets on things like, how do you know if your neighbor's a witch? What should you do about that? And chaos ensued. We had wars that killed huge numbers of people when we invented the cheap printing press. We had wars about misinformation, things like, you know, yellow journalism. But we learned and we responded. We developed journalistic ethics. We founded journalism schools to teach those things, journalistic trade associations to help people self regulate. We passed laws on media concentration to make sure that, you know, you got to hear from different voices. We learned how to live in our media environment, our information environment. It feels overwhelming right now because we're the ones who are responsible for figuring out where we go from here, you know, it's about how are we going to learn, how are we going to respond, how are we going to act. And part of why I have faith that we're going to figure this out is, while it may seem impossible right now, every single time before, when we've made a new media technology, we've learned and we've responded. So I will keep on pushing, and I just have a longer time horizon, I think, than many other people do.
From your lips to God's ears. Frances Haugen, thank you so much for talking with me. Your new book is called The Power of One: How I Found the Strength to Tell the Truth and Why I Blew the Whistle on Facebook. Thank you so much. Thanks for listening, everyone. If you have a question for me, or want to share your thoughts about how you navigate this crazy world, reach out. You can leave a short message at six oh nine, five one two, five five oh five, or you can send me a DM on Instagram. I would love to hear from you. Next Question is a production of iHeartMedia and Katie Couric Media. The executive producers are me, Katie Couric, and Courtney Litz. Our supervising producer is Marcy Thompson. Our producers are Adriana Fazzio and Catherine Law. Our audio engineer is Matt Russell, who also composed our theme music. For more information about today's episode, or to sign up for my newsletter, Wake-Up Call, go to the description in the podcast app, or visit us at katiecouric dot com. You can also find me on Instagram and all my social media channels. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.