Social media is much more than a way to stay in touch with friends. According to today's guest, Tristan Harris, Facebook and platforms like it are actually manipulating billions of people's minds. Tristan joins Katie and Brian to explain how tech companies are creating addictions, steering elections and making many of us lonely. He would know: After selling a startup to Google in his twenties, Tristan worked there as an in-house design ethicist, where he studied how tech affects people's attention, well-being, and behavior. Now, as a founder of the Center for Humane Technology, Tristan is on a mission to reform the tech industry. Plus, he offers up some tips on how to track and curb your smartphone addiction.
Hi, Katie. So, how would you describe your relationship with your smartphone? Um, dependent? I've got a problem. It's bad. I would use the word obsessive to describe me. Probably not as much as you, but me too. So that's the focus of today's show: technology, specifically our growing dependence on it, or should I say our addiction to it. Now, we asked all of you to write in and call us with tales from the tech trenches, and you had a lot to say. AngieLovedJoy on Twitter wrote: I have fibromyalgia, and I know I've been overusing my phone when my thumb on the phone-holding hand starts to hurt. Even through the pain, I sometimes soldier through like it's my life's work to check social media and scroll through the news. Ouch. I feel your pain. Well, I don't exactly feel your pain, Angie, but I understand where you're coming from. And Shara Dupuis, which is a beautiful last name, on Twitter told us about the moment she knew she had a problem with tech, writing: I realized it when my children were having to ask me questions multiple times because I was engrossed in my phone. You know, we've all been there. Yes. Nalini Sang on Twitter said: I give myself tests to see how long I will go without looking at my phone, and I fail so quickly. I noticed when a friend's dad forced me to turn off my phone during a dinner. We have a sickness, as I type on my phone. Listen, I agree with all of the tweets that we got about this subject. We are addicted, and we are unfortunately ignoring our children at times, ignoring our spouses or our friends. There's an actual term for it, Brian. It's called phubbing, P-H-U-B-B-I-N-G. That means blowing someone off to look at your telephone. And it's so aggravating, and I'm sure it is for other people, because I'm sure I do it all the time. Never. But you know, it's maybe not entirely our fault, because in part, our phones and these tech platforms are designed to addict us, which we talk about in today's show.
That's right. In an attention economy, that's definitely true. We also got this very interesting voicemail from a listener named Steve, who tried to return to a flip phone. Hi, my name is Steve Glenn. I'm an airline pilot, and for eight months last year I went back to a flip phone. I decided I was spending too much time on my smartphone, and so I got a Kyocera flip phone like they used in the military, and I really enjoyed unplugging a bit and not being as accessible all the time. But I finally had to go back and buy an iPhone, because as an airline captain I had to access the Internet and access company websites using my smartphone. And also, texting is going to a different type of technology, where people send pictures and people send files with texts, and you just can't do that with flip phones anymore. So my experiment was a failure. I had to go back to an iPhone. I try and limit my use of it more than I used to, but I thought you'd like to hear that. Thank you very much. Well, Steve Glenn, a noble effort, we must say, going back to a flip phone. How Luddite-ish of you. But you can see, Brian, why that would be really hard, because so much of our day-to-day activity and interactions require, uh, you know, a smartphone and not a flip phone. There's even an app, Katie, called Moment. Which I have — yes, I've had it a very long time — and it tells you not only how much time you're spending on your phone, but what you're actually doing on your phone. Because you kind of lie to yourself and think, oh, I'm checking my email, when you're really on Instagram and Twitter, and then you get sucked into a vortex of crap. Yeah, and self-loathing for being sucked into a vortex of crap. But anyway, it's a vicious cycle. Clearly, technology is eating up our time and attention and changing our way of life. Loyal listeners, you may recall when we talked with psychologist Jean Twenge about the costs and consequences of endless screen time, especially for kids and teens.
I encourage you all to listen to that episode if you haven't; it's number thirty-six. But for this episode, we wanted to get into these topics from the perspective of a tech insider, and I can't think of many people better on this topic, Katie, than today's guest, Tristan Harris. In his twenties, he sold his tech company to Google and then worked there as an in-house design ethicist, of all things, which I think many people think is probably an oxymoron when it comes to big tech companies. But The Atlantic called Tristan the closest thing Silicon Valley has to a conscience. His latest project is something called the Center for Humane Technology, which he started in February, bringing together many former insiders in the tech world who believe we need to design technology in a different, more humane way. To give you an idea, here's how the Center on its website refers to Snapchat, Instagram, Facebook, and YouTube. They say these are not neutral products; they are part of a system designed to addict us. Now, I originally spoke with Tristan for my Nat Geo documentary series — available, gosh, on demand and, ironically, on YouTube, Facebook, and Hulu — but I wanted to continue our conversation. He's such a smart guy and an excellent communicator. So for today's episode, we talked with Tristan about how tech hooks us and what ethical design means to him. And to start things off, I asked him how he landed at Google and why he left. Well, um, I was a tech entrepreneur. When I was twenty-two, out of Stanford, I started a small tech company, and, um, it's a long story, but after about five or six years we soft-landed the company at Google — we were acquired — and about a year into being at Google, I became kind of disenchanted with where things in the tech industry were heading: that instead of really building tools that were empowering people, it was more and more becoming this race between these different companies of getting people's attention and exploiting people's psychology.
And I felt alarmed by this. And I was working with the Gmail team, and Gmail has its own problems with people feeling addicted. And I made this, you know, this presentation about Google's moral responsibility in shaping a billion people's attention. It was sort of a slide deck, and it exploded. I mean, it went throughout the company. Tens of thousands of people saw it. And, um, that led to becoming a design ethicist, where I basically was asking the question, which no one had asked before: you know, how do you ethically steer a billion people's attention? We should pause you for a second, because you mentioned this slide show. You designed this hundred-and-forty-four-page Google Slides presentation, which was called A Call to Minimize Distraction and Respect Users' Attention. Yeah. And that was in 2013. I mean, this is a long time ago that these concerns first rose up. And you did this as a Google employee, correct, intending to sort of share it within your team. But then I think Larry Page saw it, and other senior leaders within the company. And what was the reaction to the concerns that you were raising? Yeah, I can't imagine they were super jiggy about that. Well, you know, um, I was nervous about that presentation, because, I mean, I was ready at the time to leave the company. I kind of felt like this was the thing that was most concerning to me. Um, but before I left, I wanted to raise alarms about this issue. And I had a whole background prior to this in how people's minds are manipulated and influenced. When I was a kid, I was a magician. I studied at this persuasive tech lab, so I understood that technology could really manipulate people's minds. And that's why I made this presentation. So I sent it to ten people and said, hey, I want your feedback. It was just a slide deck. Hey, give me your feedback. I went home.
I came back the next morning, I checked my email, and I had, like, you know, a hundred emails about this presentation. And I clicked on their links, and when you click on the link, it shows you, inside of Google Slides, the number of people who are looking at it at the same time, and there was something like a hundred and fifty people that morning. When I looked at it later that day, there were four hundred. And just through the next week, it was just exploding. And I had heard that Larry Page had two or three meetings that day where people brought it up in conversation with him. And so it built this kind of momentum. I don't want to overstate its influence — I mean, I don't think it changed the course of anything — but it definitely raised alarms, and people started to talk about it. But you wrote something that I thought was pretty smart and prescient, which was: never before in history had the decisions of a handful of designers — mostly men, white, living in San Francisco, aged twenty-five to thirty-five, working at three companies, that is, Google, Apple, and Facebook — had so much impact on how millions of people around the world spend their attention. We should feel an enormous responsibility to get this right. True now more than ever. Um, and so what did Google do in response to this, besides change your job? Um, not much. I mean, I was given the space to kind of explore these topics, and I didn't get fired. I really just focused on understanding what it meant to hold that responsibility. It's not like we've ever had — there's no academic discipline, there's no university that could teach you, this is how we do manipulation of a billion people's attention; that never existed before. So this was a new field, and so I did mostly a lot of research.
And in terms of things changing at Google, I tried bringing this up with some of the key products that I thought would be most important to change: Android, because that's the home screen and the phone notifications that billions of people live by; Chrome, the web browser people spend most of their time in; and Gmail. But it was really hard to get a concerted focus on saying, this is an enormously important topic, this is literally everything, and we need a whole different way of thinking about this. And I was unsuccessful at getting those teams to change inside Google. Well, you must have been considered such a massive disruptor, Tristan. I mean, here you are saying, hey, hold the phone, everybody, in these companies that are growing like weeds, and obviously the concerns that you were raising were antithetical to their business model, right? You know, it's funny, Katie, because there was never an explicit response of, we can't do that, you're asking us to make less money. No, I never got that response. There'd just be this kind of smiling and nodding, but then no traction, no momentum. Right. So I'd talk to, you know, the Android team and say, well, what if we designed it this way, we'd help people check their phone less, and there'd be this kind of smiling and nodding, and yeah, maybe we could do that, and there are some teams that are kind of working on that, but then nothing much would really happen. There was no concerted effort. And this is why ultimately I did leave: because I realized that there needed to be a much bigger public conversation and public demand for this stuff. But I will say there's a difference with Android, you know, which is how your mobile phone works; that doesn't need to maximize how much time you spend on the phone. It's like, no one goes to work at Android saying, gosh, how do we just steal everyone's time?
No one says that. YouTube, on the other hand — that is their goal. I mean, that is the business model: more time watching videos equals more money for YouTube. That means more attention to the advertisements, right? Exactly. That means more attention to the advertising. Advertising is the driving business model behind this addiction, this stealing of people's attention and time. So, Tristan, let's just step back for a second and talk about how and why these tech companies addict us. It's gotten to the point where we check our phones more than a hundred and fifty times per day. Knowledge workers on average spend a third of their day just doing email. Um, as you said, it's sort of central to the business model of these tech platforms. So can you talk to us about, you know, how and why this happened? Yeah, well, you know, it starts by, you know, someone building an app and saying, I've got to get you to use it. So what they want is they want people, you know, to register accounts. They want people to create new accounts and new users to show up, and then they want each of those accounts to come back every single day. They want you to be hooked. So they have to start finding reasons for, how can I get you to come back tomorrow? So if you're Instagram — and in the early days of Instagram, I know the guys who made it, they went to school with me, Mike and Kevin — you're thinking, okay, how do we keep people coming back to this photo-sharing app? And they didn't always have this feature called the number of followers you have. Right? Why would they add that? We think that that's just natural, that's just the world we live in. But showing you the number of followers you have is a good way to get you to come back tomorrow, because you want to know if that number went up. And you also want to know how many likes you get on each photo you post, and that's another reason to get you to come back.
And so think of these apps as like biological organisms sitting on a table, mutating these new limbs, which are just these things that are good at getting you to come back and to stay longer. And if that thing works, it keeps that limb, and it continues to evolve, and so it's evolving all of these new ways — likes, messages, shares, filters — these are all ways to keep you coming back and hooked. But what I think we miss is these cultural externalities: this entire selfie culture, where, you know, teenagers and a lot of women take these selfies over and over again of, basically, their appearance. And let's say that Instagram didn't have a feature called the number of followers you have. Let's say they never went down that road. Would we have an entire culture where everyone's taking photos of themselves and posting them on social media if we didn't have the notion of followers for ourselves? I mean, these design techniques are all being done because they're good at hooking people, but downstream they create these cultural externalities. You have people being more concerned that their self-worth is directly tied to how many likes they got on that last photo. And, um, I think we're conditioning a whole generation of children to attach their self-worth and belonging to the wrong place. Not just children, Tristan. I mean, I am always checking my likes on Instagram and my followers — has it gone up, and, you know, what comments have I gotten? And on the one hand, I kind of appreciate the community that Instagram allows you to have. You do feel like you're kind of relating to people who have similar interests, and I appreciate hearing from people. But the flip side of that for me is this kind of hunger and almost frantic feeling that, oh, I need to get more followers, I need to make sure they like what I'm posting. And I'm embarrassed to admit that, but I'm a sixty-one-year-old woman.
Think about the impact this has on teenagers who are still developing their self-esteem and sense of worth. Yeah. I think of this a lot like sugar. I mean, think of our human evolutionary instincts. Sugar tastes good to all human animals. We're built to love sugar, right? There's a reason: because it used to be really rare. Um, it's just like social approval. I mean, there's a reason it should feel good to get those likes and to get that social validation and approval, but we weren't built to get it at this level of frequency, dripping into our minds every, you know, five, ten minutes with a new batch of notifications on our phone. Speaking of that, two things really struck me from Katie's Nat Geo hour on tech addiction, which is the physiological reaction to this stimulus: the stress hormone cortisol, which we're getting sort of overdosed on, because we feel nervous about our texts and our emails and checking them or not checking them. And then dopamine, which is this, um, this pleasure neurotransmitter — you know this, I don't, um — that kind of gives us a high when we're getting social validation on these apps. And we're not meant, or we weren't designed, to receive these two things at the level we are, with the frequency we are. That's right. And I think that's how we can think about how we fix this, which is to say, we need to turn the lens back at ourselves and say, what were we built for? You know, how are all of our evolutionary instincts tuned, and how do we respect those instincts? So a good example — a simple one that relates to phones — is trigger colors. Red is a trigger color. So every single time you look at your phone and you see that red dot with the number of notifications, it's triggering you into a little bit of a kind of alarm, or, you know, it's grabbing your attention: maybe I should see, there's something important there that I have to check into.
Right. But do we want to be sounding the alarm inside of our minds every single time we check our phone with notifications? You know, in the same way, social validation or social approval are really important things to care about. It was useful back on the savannah to know, what do my peers think of me? Otherwise, how else are you going to survive in the community or the tribe? But we weren't meant for a virtual community of tens of thousands of people around the world to be dosing us with little bits of social approval and dopamine every ten minutes. And so I think we have to ask, how do we go back to being in alignment with how our human evolutionary instincts work? Before we even talk about that, Tristan, I want to dive in a little deeper on some of the techniques that we can be aware of, and ways we're being manipulated. You mentioned the red lettering for alerts. Can you help us understand other ways they do it, so we can be aware that we're being manipulated? Oh, man, there are just so many. One thing that really struck me in your writing was the bottomless bowl. Can you explain what that is? Sure. Um, you know, so, the bottomless bowl. There's this study — we think we're in control, that we choose how much food we eat — where you give a bunch of people sitting down at a table bowls of soup. Some of the bowls of soup are just regular, all the same size, but some of the other bowls have a little, um, pipe at the bottom that's actually refilling the bowl as you're eating from it. And the question was, would the people who are eating from the bottomless bowl notice and stop eating? And the study basically showed that people don't notice the bottomless bowl. And that is how our technology works. The Instagram feed and the Facebook feed and the Twitter feed all scroll infinitely, right? They could choose to not do that. They could have stopping cues.
Your mind is built for something called a stopping cue, where basically we expect that there's going to be a cue that says, okay, now this is done, and your mind kind of wakes up and figures out, what do I want to do next? But if I'm trying to hook your attention and keep you going for as long as possible, my job is to figure out how I can remove all of those stopping cues so I can keep you sucked in as long as possible. What other things, other than the bottomless bowl? Uh, I was interested that President Trump's digital campaign advisor was able to plant these ads — and of course we're going to get into the political uses of these big tech companies in a minute — but that they could measure whether red or green or blue, the different backgrounds, and the kinds of words seemed to be attracting more people, the design that would suck people in or kind of agitate them, and they were able to figure out what was the most effective way of, in essence, trashing Hillary Clinton. Yeah. I mean, all of these systems are basically asking one question, which is, you know, how do we get your attention? And the best way to get your attention is what works on your evolutionary instincts. Does red work better on your evolutionary instincts, on your mind? Does blue work better? Does this word work better for you? If I use this political message, does that work better for you? We've veered so far away from any authentic relationship between a person trying to just talk to you and a person sitting there scanning all of the things you've ever said on social media and trying to figure out: whenever you talk about a concept like immigration, you always use these three adjectives, and let's say they're positive or they're negative. Right? Now, if I'm a political advertiser, what do I want to do to manipulate you? Well, I want to use your own words back at you, so you'll most agree with me.
So I'm going to start repeating your own viewpoints, your own opinions, back at you with your exact word choices, so you nod your head in agreement, like, man, that person really understands me. And what we've created with social media and Facebook, beyond the addiction layer — the addiction just sets up the kind of matrix; everyone's jacked in, so now, the moment they wake up in the morning, their thoughts are being sort of influenced by these phones. The second layer is that we've sold that to the highest bidder because of advertising, and we've enabled anyone to go in there and say, I can target the precise messages that will resonate with your specific mind. And that's exactly what Cambridge Analytica was in the last election. They were trying to sell campaigns on the ability to specifically persuade people using the things that would be most persuasive per mind. So they'd have a profile that says your mind is influenced, um, you know, more by authority: if I tell you that The New York Times said this was true, then you're really likely to believe it, or if I told you that Fox News said this was true, then you're really likely to believe it. Um, these are all different ways of influencing people. You know, I think we have to realize that all of our minds can be influenced. That's what I learned as a magician. It doesn't matter how smart you are, it doesn't matter what language you speak. Magic and sleight of hand work on every single mind. And what we've just enabled is this, you know, arms race for anybody to go in there and do what they want. It's time to take a quick break. We'll be back with Tristan Harris right after this. And now, back to our conversation with Tristan Harris. Can you talk about how tech in general, and social media in particular, have had a corrosive impact on people's perceptions of the news and the issues, and basically all of these things we've spoken about since Donald Trump became a candidate? Yeah.
Well, oftentimes people think that what shows up inside of your feed is just whatever your friends post, so it's just a neutral tool. But this is not true. When you open up that Facebook feed, there are actually thousands and thousands of things they could show you, and out of those thousands of things, they try to figure out what will be most likely to get you to click, to watch, or to share, or to like. And it turns out that outrage is really good at getting you to click and to share, because you want to tell other people, I can't believe the thing these guys, these politicians, did, you know, today. And so those things end up at the top of everyone's feed. So now everybody's going to these feeds, and the computers, not humans, are selecting, out of all the thousands of things, the outrageous things. And what that creates are these waves of outrage inside of all these human animals. We have these, you know, human evolutionary instincts, and you turn your phone over in the morning, and instead of feeling calm and opening your eyes and taking a breath and asking, like, what are my dreams, or what are my hopes for today, or what am I grateful for, what do I want to do, what am I going to have for breakfast — What am I going to have for breakfast, you know? — instead, the first thing you do after you turn off your alarm, after your phone vibrates, is you open up one of these feeds, and suddenly your mind is, like, filled with outrage. Screw those Democrats! You wake up to exactly that. And we have to ask — I mean, it's this totalizing thing, and it's on both sides. That's the thing: everyone is influenced by this, which is why it should be a unifying issue, right? I think that no one wants this to happen to politics.
I mean, once you sort of see where this is all going — we don't want to live in a society that's just triggered and filled with outrage, or that's manipulating kids from the first moment they wake up in the morning and they see photo after photo after photo of their friends having fun without them. I mean, it's always true that people are having fun without us. But the question is, do I fill your day and your morning with evidence of that? Do I make you believe that's the only thing that's going on? I mean, this is crazy. I agree. And I think it's so corrosive for kids in general. You know, when you see that the suicide rate for teenage girls has, I think, tripled in the last ten years, and if you look at anxiety being the number one mental health disorder in this country, and isolation being one of the biggest problems — it's so ironic: in an era when we're all connected, we've never been so lonely. Yeah. Well, and I think Sherry Turkle was just so prescient with her book Alone Together about this: that, you know, having the virtual experience of connection, of seeing all these people, doesn't mean that, as an animal sitting there with a body, right, and breathing, we don't need that physical presence. Right? It feels a lot better to be physically present with someone than to sit the entire day on a screen getting that virtual form of presence. Can I ask you, Tristan, a little more about a firm like Cambridge Analytica? Because I don't think people fully understand how these firms, or even tech companies, are able to gather and absorb this highly personalized information and then spew stuff out at you. So if I'm talking on my phone with a friend about the fact that I really want to buy a new winter coat, can they pick that up and then start sending me ads for winter coats? Because that's happened to a couple of my friends. They've experimented.
They talked about Emmy Rossum just for fun, to see if suddenly they'd get all this stuff about Emmy Rossum — who I really love, and I think she's a great person, no dis on Emmy — but she started getting all this information about Emmy Rossum. So that, to me, is super creepy. I mean, how do they find out all this stuff about us? Well, I mean, especially Facebook — I mean, their business — I mean, how much have you paid for your Facebook account? Zero dollars. But we're paying in our attention, obviously. Well, that's interesting, right? So we don't pay for Facebook. So the question is, who is paying them? And it's the advertiser, which means that all those people who go to work at Facebook, as much as they say, hey, we want to make the world more open and connected — I'm sorry, but if your business model is serving not people but advertisers, then guess what: all of that information over time is going to be used more and more to make the advertisers more successful. Otherwise those advertisers aren't going to spend their money on Facebook. So in the long run, Facebook has to be, you know, helping their advertisers be successful, and what that's going to mean is enabling them to access more and more personal information when they target, uh, you know, ads to you. And so when you send a message to someone else on Facebook, you know, it'll pick up those keywords, and that will be part of the way it enables advertisers to target you over time. Well, let's talk about Facebook in particular for a second — Facebook and Instagram. Um, we sometimes use the euphemism of social media companies or different technology platforms, but so many of them are really just Facebook. Yeah, I think we're really in large part just talking about Facebook. So the question is, is Facebook the worst offender here? Are they doing things that are qualitatively different and more harmful than Google, for example? Um?
Yeah. I think the challenge — just to ground it for a second — is just how many people use Facebook. There are more than two billion people on Facebook. That's more than a quarter of the world's population. Um, that's about the number of nominal followers of Christianity, who every day are jacked into this system where they start looking at a feed, and their thoughts start flowing into them from this one company in California, you know, with a handful of engineers who make the design and algorithm decisions. And so the reason I say that is that no matter what Facebook does, it's creating exponentially complex consequences for two billion people. So, you know, how many engineers at Facebook speak Burmese? I mean, I don't know — one? Zero, maybe? And yet Facebook is the number one way for people in Burma to access the internet. Facebook is the Internet if you're in Burma. And it started amplifying genocide in Burma, because it was amplifying this fake news about a specific minority group. So the point is that they can't rein in this machine that they've created. They've created this automated machine where, because there are no humans figuring out which thoughts we should put in people's minds, it's just the machine calculating what it should put in people's minds based on what's most engaging, and it's starting to push thoughts into people's minds and elections and democracies around the world in languages that the engineers in Menlo Park, California, don't even speak. And so they've created this kind of monster that they no longer control. So now what's going on is they're trying to go back and say, how do we rein this thing in? How do we quickly, you know, throw all the firefighters at this thing and try to save it as much as possible? But I think the challenge is, we trusted them to try and be thoughtful about this whole thing, and they didn't see from the very beginning that this is the level of influence they had.
That they aren't just, you know, helping us keep in touch with our friends. They're a political actor. They're steering elections, they're creating addictions, they're making people lonely. I know that some activists in Myanmar wrote a letter to Mark Zuckerberg about this, about the very thing that you were describing in Burma, and he wrote them back personally. But, you know, to your point, with all the people in the world, it's impossible to monitor these things, isn't it? So before we talk about what Facebook can and cannot do, let me ask you about Mark Zuckerberg's testimony, because I'm sure you watched it with a great amount of interest. Um, what did you think about it? And did you think he was questioned vigorously enough by members of Congress, you know, some of whom still have flip phones? Well, I think that is the issue, fundamentally. It is clear that, you know, the modern realities of how technology companies work have far outpaced the governance capacity of really probably any government to stay in touch with, you know, all of the ways these things are working and evolving, and the business model. And, you know, we were involved in the November 1st hearings — myself and a few other people — in briefing major Congress members. Unfortunately, there wasn't as much time for these latest hearings, and I think it came across as, you know, a whole hodgepodge of issues. Some people are asking about privacy, some people are asking about housing discrimination ads, some people are asking about election integrity, data breaches, Cambridge Analytica. So to the average person, it feels like, oh man, this is about just a bunch of unrelated things. But I actually want to reframe that. The reason why it felt like it's about a bunch of unrelated things, and why these harms are showing up everywhere, is because Facebook affects every part of society. It affects election campaign pricing; it affects housing discrimination ads, where some groups get discriminated against over others.
It affects elections, it affects people's privacy. And I think the problem is, if you think about it, Facebook is almost like a global government. I mean, how much power does Mark Zuckerberg have, even over someone like President Trump, at controlling people's thoughts and actions? Right. I mean, you could argue that he has more power, power that's completely unaccountable except to him, since he's the major shareholder, at doing whatever he wants, and we're sort of left to, you know, his moral compass and whatever happens to be running between his eyes and ears as the way he's thinking about this. Well, what did you think of his testimony? Well, you know, I thought he was dodging the fundamental issue, which is that the business model of advertising and keeping people engaged is the problem. All of these issues come down to that one issue, and so he was trying to distract Congress from that core issue: that the business model is what incentivizes them to offer better ways for advertisers to take people's personal information and target against it, and to offer better and better ways to keep people hooked on Facebook for as long as possible. And it's also a little misleading when he said that they don't sell data to anyone, because what they actually do is allow advertisers to micro-target people based on their data. So even though the advertisers themselves don't see the data, they get all the benefits of that data. That's right, and Facebook is quick to sort of, you know, smirk when Congress asks them why they sell people's data, because they don't do that. But the point is that it is equivalently the same, because advertisers are paying Facebook not to own and access your data, not to use it and then spread it somewhere else, but to target ads specifically to those people. You can go on Facebook and you can target conspiracy theorists by knowing what keywords they tend to identify with.
So are you saying that until and unless Facebook changes its fundamental business model, the problem can't be solved? That's right. And the thing is that we can't change their business model overnight, and they can't change their business model overnight, even if they actually saw these issues. So I know that there's a lot of work, and we should celebrate the work that they're trying to do right now to try and peel away the problem against their own financial interests. And I think that there are some authentic things that they're doing there. But the question is, there's an upper bound to that. And do they see that the fundamental heart of all of these problems comes down to their business model? They're not on our team. They're not on democracy's team to help strengthen the fabric of society. So long as the people who pay them are the advertisers, they're not going to be able to solve these problems. So if you were Mark Zuckerberg and you were in charge of Facebook, what would you be doing to put Facebook back on the side of democracy and the American people and everything that's good and just in the world? The first thing I would do isn't just to say, I'm sorry, and there's this one bad actor way over there in the corner called Cambridge Analytica, but Facebook is fine. I would tell the world: I'm sorry that I didn't see that our business model was so corrosive, and I feel bad about that. And now what we're gonna do is start a transition plan to get off of this business model. And here's how we're gonna do that, and here's how much time it's gonna take. It's not gonna happen instantly, but here's how we're gonna work on that, here's why, and hey, shareholders, here's why we're going to be regulated anyway in the long run if we don't do this. And I would start with that. I was going to say, antitrust. Now, won't another behemoth just take Facebook's place?
I mean, I hate to say it, but when it comes to profits versus ethics, profits usually win, don't they? That's right. But that's all based on consumer demand. So right now, there's essentially Facebook as a monopoly. And it's a new species of monopoly, because all of our antitrust, anti-monopoly laws don't handle zero-price monopolies. Usually a monopoly means they can price discriminate, so they can control pricing. They can offer one price to you and a different price to someone else, and no one can stop them because, you know, they're a monopoly. The challenge here is it's a monopoly that's free. So all of our antitrust law, which is normally about regulating these things, doesn't handle free monopolies. And so I think if we change consumer demand, and people realize that Facebook doesn't have our best interests at heart, and it never will unless they change their business model, then we're starting to see this movement where people do want an alternative. We're not there yet, but I think slowly that's what's happening. I don't know. You know, sorry to play devil's advocate for a moment, but it seems to me, Tristan, that a lot of this depends on consumer demand, or lack thereof. And there was a big thing, oh, you know, people were going to quit Facebook. Well, to quote Brokeback Mountain, I just can't quit you, Facebook. I mean, it's really, really hard to live without it, because people are so addicted. And furthermore, there was an interesting piece in The New York Times by my friend Andrew Ross Sorkin, and the headline was, our privacy has eroded, are we okay with that? I mean, don't you think that people, it's sort of like being a frog in a slowly boiling pot of water, people have become, you know, complacent and inured to it, and have pretty much accepted this as the new normal? Well, I would frame it a little bit differently.
I would say that it's a testament to how much of the basic fabric of our lives it's become that we can't just delete Facebook. So when I say this, I'm not saying that consumer demand is going to change and everyone's just gonna quit, because, you know, in fact, people I know at Facebook say, with a smirk on their face, well, if you don't like the product, just use a different product. It's like, okay, great, so I'll just switch to that other two-billion-person social network that's right there to my right. You know, it doesn't exist. And right now, what we need to do is change the kind of government and consumer context so that those alternatives can start to exist, because right now it's harder and harder for them to innovate, and we need to make it easier for the alternatives to exist, because we're not going to quit. I mean, I still use Facebook every day, because it's the best, it's the only way I can get my ideas out to a large enough audience. And that speaks to the amount of power that they have, right? And the key thing here is also that it's not just an addiction. They've really taken over the fundamental communications fabric, the fabric of staying in touch with the people who matter to us. We don't have an alternative, and that's why people aren't just going to delete it. But that doesn't mean people don't have a problem with it. But aren't there some good things about it? I mean, about social media writ large, not just Facebook, but all these different modalities, in terms of, like, galvanizing people, getting like-minded people to stand up and want more sensible gun laws, for example, or, you know, stopping genocide in different places because there's kind of a grassroots uprising. Or the fact that, yes, we're lonely, but it is nice to be able to look at video of your grandchild if you live far away. I mean, I can think of countless examples of the positive impact this is having.
I mean, are you just such a Debbie Downer, or is there anything that... Well, I think this is such a critical thing, because this conversation often makes it seem as if there's this all-or-nothing choice: either we have Facebook, with these benefits and these costs, or we just don't use Facebook at all. And the question is, is there a middle way? As I said in my TED Talk, can we have social movements that take off with positive messages without, most of the time, creating viral outrage? I mean, these positive social movements are very, very rare in comparison to the daily outrage that we experience on these things. The experience of finding out that a friend you haven't seen for ten years is also visiting a city, which you only find out about because you have a Facebook account, that experience is great, but it's very, very rare compared to all the time people spend just mindlessly browsing, which they self-report regretting. And I think there's a different way to design all of this stuff. And that's, by the way, what we're trying to do with our work at the Center for Humane Technology: show that there actually is a different way to design these products, that it's not an all-or-nothing choice. So what does a good Facebook look like? Is it a subscription model, where advertisers wouldn't be on the platform, where people would pay to use it, and therefore the business of social media isn't about capturing as much of our attention as possible? Yeah, think of it like a utility. I mean, you know, we would pay to have access to a service that's all about benefiting our lives. So what I mean by that is, right now, if you walked into Facebook today and you just interviewed a random sample of people and you said, what are you doing with your time today?
You, as an engineer at Facebook, you would find every engineer is basically focused on one goal, which is: am I keeping people engaged, scrolling, clicking, liking, hooked, or not? They're only concerned with that goal. And instead, imagine we paid for Facebook. All those engineers go to work and they ask, how do we generate positive benefits in society? How do we make this a resource? And Mark Zuckerberg himself, back in two thousand five, used to talk about Facebook as a social utility. He didn't try to defend this stuff about news feeds and content and all this kind of stuff; that came later, after they hinged their success on the advertising model. Before that, it was a utility. It was an address book. It was something I could use to make things happen in my life. And I think, if we paid for Facebook, you wouldn't just see news feeds without the ads. I'm talking about a radically different service that is entirely built as a utility to empower us to make new social choices together that we wouldn't be able to make without Facebook. Things like, oh, I can find out when my best friend I haven't seen in ten years is visiting town. Or strangers who move to a new city can quickly find the groups they can be in touch with. But it wouldn't be built around news feeds and mindless consumption, which is what it's built around now. I want to ask about fake news, because, listen, these seem to be long-term goals, Tristan, but you know, we have an important midterm election coming up in November, and a hugely important presidential election coming up after that. So what, immediately, can be done about the proliferation of fake news, these bots, Russian influence, and the general chaos we're seeing when it comes to this platform?
Yeah, the thing that people should understand is, you know, if I ask you the question, how much more confident do you feel today, if an election were to be held, that it is less vulnerable to outside manipulation and influence than it was before? Zero. Correct. These platforms are still highly, highly vulnerable, and not just to Russia, but to any state and even non-state actors: China, North Korea. The playbook is now out there, and now anyone can spend money on Facebook to target information to the audiences they want. And it's not just the ads, by the way. There are loads of other techniques. I work with ex-intelligence people who know about those techniques, and it's still possible to create impersonation accounts, to create fake images of things that don't exist, these deepfakes where you can fake videos of people saying things they didn't say. All this is going to get worse, which is why we're doing this work. The short term, as you said, Katie, is: if you care about the integrity of our elections and our democracy, then you ought to care about how these platforms reform their practices to better protect them from manipulation. Is awareness part of the solution, though? I feel like, you know, I had a good friend, I've mentioned this a couple of times on our podcast, who got this video about Huma Abedin and her connections to the Muslim Brotherhood and sent it to me, and she was appalled: have you seen this? And I said, this is bullshit, this is not true stuff. But it was so professionally done. Do you think that people are at least more skeptical and kind of more educated in terms of their consumption of this kind of thing, or am I just a little more sophisticated than your average consumer? Well, I'm very worried about this, because often we have this feeling that, well, I'm the smart one, and it's only those very persuadable persons way over there who were influenced by Russia. But that's not right.
Look, as a magician as a kid, you realize that everyone feels that way. Like, no, no, no, magic is only gonna work on those dumb people, you know, who are uneducated. But I have a PhD, so therefore I'm not manipulated. It's actually the opposite. Usually, the people who are most confident are the ones who are much easier to manipulate. And, you know, we didn't always see society that way. Back in the nineteen forties, the United States government created the Committee for National Morale, and there was also the Institute for Propaganda Analysis, to protect the American psyche from foreign influence. We recognized fundamentally how vulnerable our population was to outside influence, and we actually made it a government-funded campaign to make sure there were large public awareness campaigns. And I think Facebook and other companies need to do much better, step up, and spend millions and millions of dollars on making sure people are aware of these conspiracy theories. It needs deliberate campaigns, because we're very vulnerable. So we need more awareness, for sure, but we also need the platforms to crack down on the ways that they continue to be vulnerable to outside influence. So what you're saying is that, for all the rhetoric, Facebook really hasn't cleaned up the problem of fakes and foreign influence, and it could have as profound an effect this year as it did before? It could totally have as profound an effect this year. Now, I do want to say there are a lot of people working really hard, I'm sure, at Facebook and at Twitter to try and clean this up. My point is just that they're not nearly close enough to make us feel confident about our elections not being manipulated. So to close the gap, I think we need to be highly aware and spread this message to everybody we know: these conspiracy theories are going to be very compelling and look very true, and they'll spread really quickly.
But we're gonna need, you know, there's a reason we have a five-second delay on television. We don't want to just automatically broadcast everything instantly to millions of people and have it reshared that fast across networks. Would Donald Trump have been elected president without Facebook? And I'm not saying that with a specific political bias. I can just tell you, from the perspective of how media works, social media was critical to his election. That's pretty chilling. What about artificial intelligence, Tristan? How concerned are you about that, and this trend to give more and more decision-making authority to robots, or to algorithms, or to technology in general, versus having some kind of human judgment involved? Yeah, this is really the critical thing. When we think of artificial intelligence, people think of the Terminator movies and Arnold Schwarzenegger and, you know, death bots or something like that, and they think it's all about the future. Oh, in the future we'll worry the AI is going to kill everybody, or something like that. And what this misses is that we already live inside of a system that is governed and run by artificial intelligence algorithms. Because when you open up a Facebook news feed, that's an AI trying to figure out, what can I show you that's going to keep you hooked? When you open up YouTube and you wake up two hours later saying, what the hell just happened with my time? The reason is that an AI was playing chess against your mind. Think of your mind as a chessboard. It's sitting there, and it thinks it knows what it wants to do and what its goals are. But YouTube is sitting there trying to play chess against your mind, asking, what are the videos I can put on autoplay next that will keep you on here for as long as possible?
And just like when Garry Kasparov played the AI at chess and Garry lost, it's because the AI was seeing way more moves ahead on the chessboard than Garry could see. And at some point it can see so many moves ahead that it's checkmate against humanity. And the issue we have right now is that when you land on YouTube, it sees that many more moves ahead, and it's not aligned with our goals. YouTube also drove fifteen billion views to Alex Jones's conspiracy theory videos on its own, using AI. So we already have an AI problem right now, which is why it's so critical to bring awareness to these issues. In closing, Tristan, what are steps that all of us can take to protect ourselves from being manipulated by tech, and to keep our kids from getting addicted to it? Exactly, until and unless these companies fix themselves, what can we do in the interim to protect ourselves as much as we can? Absolutely. I mean, obviously we don't want to wait five years for tech companies to finally come around, so we have to have choices we can make right now. And the good news is we can. A bunch of these, by the way, are on our website, humane tech dot com. But you know, you can do things like turn off all notifications. Most people don't realize that notifications on your phone are mostly invented by machines trying to find a way to get you coming back: oh, ten new friends posted some likes over here, don't you want to see what those are? Just turn all that stuff off. Turn off as many notifications on your phone as possible. When I say this, you'll hear it, but you probably won't do it. So actually consider: can I turn off notifications? Am I willing to take that step? I really recommend it. Another thing is setting your phone to grayscale.
If you feel really addicted, every time you turn your phone over and you see those colors, it's lighting up some of those dopamine rewards just by looking at it. And if you set your phone to grayscale, it sort of cuts out a big chunk of that addictive feeling. Our listeners should know how. Yeah, on an iPhone, you go into Settings, then General, then Accessibility, and scroll all the way to the bottom to Accessibility Shortcut. Well, this is a good example of where Apple, you know, when the new iPhone comes out in a few months, could make this a lot easier for people. But I really recommend just not using social media on your phone, at least uninstalling it from your phone and only using it when you're on a desktop. And that will never happen. By the way, I did do that with Facebook. I should probably do it with Instagram as well. Twitter I couldn't give up, but it made a huge difference in my life not to have Facebook on my phone. I tried to turn my phone screen to black and white per Tristan's recommendation. That lasted about fifteen minutes, because I missed it. I was like, the world just doesn't feel good with a black-and-white phone. See, what's my problem, Tristan? You don't have a problem, Katie. No, the thing is, I think people need to be able to switch it back and forth between color and black and white. The point isn't to stay in black and white all the time. It's just that, you know, you keep it in black and white by default, because then you actually get used to it. It actually starts to feel overwhelming to see color. It feels like, whoa, that's too much. And that's kind of where you want to be: to remind yourself that this is a tool in my pocket, it's only a tool, and I want to make sure I'm putting it in its place. That's the role I want it to serve, and black and white can be a helpful reminder.
And what about for kids, real quickly, Tristan? Because a lot of our listeners have children or grandchildren. I find that it's quite depressing when I see parents at the park or playground. They're so busy looking at their phones. And I'm not being judgy here, because I probably would be doing the same thing if my kids were younger. But they are not interacting with their kids in the same way. Or people having dinner. I don't know, I'm sure you all have seen this. Nothing is more depressing than to see a couple at a nice restaurant on their phones, or even a family. I get so bummed out. And again, I'm not being judgy, listeners, but there's just something about it that makes me incredibly sad. You know, this is the culture that these design choices are creating, and this is going to be up to us in the short term to realize: is this the society we want to live in? Is this who we are? And I think that's a question all parents have to start asking themselves. And you can use the tips we just talked about in the meantime to try and use it less. And finally, there are a couple of apps that you recommend to make this easier for all of us. Can you briefly talk about some of those? Yes, you can download and install Moment, which is an app for the iPhone that tracks how much time you spend on it. You know, that helps you see and get a picture of where your time is going. But I really think the much better thing is simply to look at your phone and, you know, delete most of the apps you're not using, turn off notifications, set it to black and white. There are a couple other tips on the website that can really make a big difference. And just simply being aware of this can change your relationship to it. And, you know, I think realizing that you're missing out on life.
Talk about not being present, whether it means staring at your phone when you're in a new city and you don't even notice your surroundings, or never being bored, which is the key to creativity. In the hour I did, Tristan, where you're featured, one of the things I realized and thought about is that the part of our prefrontal cortex that allows us to be creative, that fires up when we have a creative thought, cannot really operate or function when we're constantly distracted by our phones. So, you know, every moment, whether it's driving, or you're in the passenger seat of a car, or you're just, you know, waiting, nobody ever has this time to just sit and think. And that is absolutely key to being creative and coming up with ideas and having epiphanies of all kinds. And that's one thing I think about too: this constant stimulation, which only increases the cortisol and doesn't allow us to have a moment where we can think and consider and contemplate things. That's right. And that's why this is such an invisible problem beneath all other problems, because every choice we make in our lives is on top of the background of how our mind is feeling, thinking, choosing. And how we think, feel, and choose has basically never been more influenced by how our phones shape our attention, whether you care about creativity or mental health or loneliness. We're gonna be publishing something soon that's kind of a ledger of all of these negative externalities, these cultural harms on society, so that people can really see it all in one place, because the effects are so profound and so invisible. Well, I'm so proud of you raising these issues at the ripe old age of thirty-four. Thirty-three, still. But I know The Atlantic called you the closest thing Silicon Valley has to a conscience, and I really appreciate everything you're doing, the consciousness you're raising about these issues.
If people want to learn more, or they want to actually support what you're doing, Tristan, how can we do that? Yeah, just go to the website, humane tech dot com, which is for our Center for Humane Technology, and there are ways to get involved, join the community, make a donation. You know, we see this as a team effort, and this is team humanity, really, protecting and fighting for the world we want. I was more than a little freaked out after speaking with Tristan. What about you, Brian? For sure. But knowledge is power, as they say, and I think it's time for a lot of us to take a hard look at our tech habits, myself included, and just how much time they're sucking up in our lives. That's true, and I think if you're just conscious of it, it does help you. And I think there are times when you can leave it at home. I know parents and people say, well, what if my kid calls, what if there's an emergency? And I feel that way too. But I also feel like you have to take a break from it sometimes and take a walk. Don't listen to our podcast. Wait, wait, did I just say that? But sometimes you just need to leave it at home, get outside, and be untethered. Don't have it permanently, you know, attached to your hand. By the way, I have a friend who has two phones: one for during the week, which has social media and all the bells and whistles of a smartphone, and one for the weekend, which is just texting and calling in case his kids need to reach him. I thought that was an interesting idea. Must be nice to have such rich friends, Brian. Meanwhile, stop checking your phone a hundred and fifty times a day, people. Brian, I downloaded that Moment app, and on my worst day I was on my phone for nearly nine hours. Now, in fairness, I was sick, I had like the flu, and I had nothing to do but lie in bed on my phone. Well, I was watching The Crown, if I recall, and on my phone. And you weren't transfixed by The Crown? I can't believe it. I was.
I was, and it was crazy, because I had to keep rewinding it, because I'd miss things because I'd be on my phone. Anyway, I got a lot of problems, people. That does it for this week's show. Thanks as usual to our pod squad over at Stitcher. That's Gianna Palmer, Jared O'Connell, and Nora Richie. And thanks as well to the team over at Katie Couric Media. That would be Alison Bresnik, Emily Beena, and Beth Demas. Mark Phillips wrote our theme music, and Brian and I are the show's executive producers. For better or worse, I'm at Katie Couric on social media. Instagram is where I shine, people. Brian tweets his little heart out at Goldsmith B. Meanwhile, don't forget to call in with your stories about discrimination and gender bias at work. That number again is nine two nine, two two four, four six three seven, or drop us a line at comments at couric podcast dot com. And by the way, if you haven't already, please leave us a rating over at Apple Podcasts, and be sure to subscribe to the show as well. Thank you so much for listening, and we'll talk to you next week.