The Dark Side of TikTok’s Algorithm

Published Apr 21, 2023, 9:00 AM

Bloomberg senior investigative reporter Olivia Carville is back with her latest reporting on TikTok. She explains how the superpopular app’s algorithm can serve up a stream of anxiety and despair to teens, including videos about eating disorders and suicide. And Jennifer Harriger, a professor of psychology at Pepperdine University, joins to talk about the effect these messages can have on teens and young adults.

Read the story: TikTok’s Algorithm Keeps Pushing Suicide to Vulnerable Kids

Listen to The Big Take podcast every weekday and subscribe to our daily newsletter: https://bloom.bg/3F3EJAK 

Have questions or comments for Wes and the team? Reach us at bigtake@bloomberg.net.

This podcast is produced by the Big Take Podcast team: Supervising Producer: Vicki Vergolina, Senior Producer: Kathryn Fink, Producers: Mo Barrow, Rebecca Chaisson, Michael Falero and Federica Romaniello, Associate Producers: Sam Gebauer and Zaynab Siddiqui. Sound Design/Engineers: Raphael Amsili and Gilda Garcia.

When you look at teen surveys, two thirds of kids say that they have a joyful experience on social media. They like using the products because they give them a sense of connection among other teens and they're sending them content that they want to watch. But on the flip side of that, you also have more than fifty percent of teenagers who engage with social media saying that they feel overwhelmed by all of the drama on these apps, and more than a quarter of these kids saying that they feel worse about themselves after using social media.

From Bloomberg News and iHeartRadio, it's The Big Take. I'm Wes Kasova. Today: the downside of the algorithm that powers TikTok's popularity. Bloomberg investigative correspondent Olivia Carville has been on the show twice now to report about the enormous influence of TikTok, especially on young people. She's back today to tell us about something the company doesn't talk much about: the secret algorithm that keeps many of TikTok's billion-plus users glued to the app for hours every day. And just a note: we'll be talking about some difficult subjects here, including suicide, and some of it is not easy to listen to. If there are kids nearby, you might want to use headphones. Can we start just by really defining what is an algorithm and how does it work in this context?

An algorithm is a mathematical equation that instructs a computer program how to behave. It is not good, it is not bad. It's not benevolent, it's not malevolent. It is just a mathematical equation. And when you think about an algorithm that is powering a social media product like TikTok, its main goal is to keep a user engaged with the product as long as possible, and the algorithm is trained to do just that.

And so if we're talking about TikTok, which is videos, it's trying to show you videos that people have made that you're going to be interested in and watch, and then watch the next one, and the next one, and the next one.

That's right.

The algorithm is learning from you as you use the product. So every movement your finger makes on that app, every like, every rewatch, every follow, every comment, every time you make an action on TikTok, the algorithm is tracking you to then deliver you a steady stream of content that it now knows you want to see. And a number of users of TikTok actually believe the algorithm can read their minds, because they feel like the videos they are sent through TikTok are exactly what they want to see. But they don't understand how TikTok does this, and what's powering it is the algorithm.
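To make the mechanics a little more concrete, here is a minimal, purely illustrative sketch of the kind of engagement-weighted ranking being described. It is not TikTok's actual system; the signal names, weights, topics, and functions are all hypothetical, chosen only to show how likes, rewatches, follows, and comments can be folded into an interest profile that then decides which videos get served next.

```python
# A minimal, purely illustrative sketch of engagement-weighted ranking.
# Nothing here reflects TikTok's actual system: the signal names, weights,
# topics, and functions are hypothetical, chosen only to show how tracked
# actions (views, rewatches, likes, comments, follows) can steer what a
# recommender serves next.
from collections import defaultdict

# Hypothetical weights: stronger actions move the profile more.
SIGNAL_WEIGHTS = {"view": 1.0, "like": 2.0, "comment": 2.5, "rewatch": 3.0, "follow": 4.0}

def build_interest_profile(interactions):
    """Fold a user's interaction history into per-topic interest scores."""
    profile = defaultdict(float)
    for topic, signal in interactions:
        profile[topic] += SIGNAL_WEIGHTS.get(signal, 0.0)
    return profile

def rank_candidates(profile, candidates):
    """Order candidate videos by how well their topic matches the profile."""
    return sorted(candidates, key=lambda video: profile.get(video["topic"], 0.0), reverse=True)

if __name__ == "__main__":
    history = [("dance", "view"), ("soccer", "like"), ("soccer", "rewatch"), ("cooking", "view")]
    candidates = [
        {"id": 1, "topic": "cooking"},
        {"id": 2, "topic": "soccer"},
        {"id": 3, "topic": "dance"},
    ]
    for video in rank_candidates(build_interest_profile(history), candidates):
        print(video["id"], video["topic"])  # soccer ranks first: it drew the strongest signals
```

Even in a toy like this, the feedback loop is visible: whatever a user lingers on scores higher, so more of it comes back, which is the dynamic described throughout this episode.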

And I think we've all had that experience on social media where you see something that appears in your feed and it's not anything that you're interested in necessarily, or anything you asked for, but you do wind up clicking on it and you do find it interesting, and then suddenly there's a lot more of that kind of content sitting there waiting for you the next time you use it.

That's right. Yeah, we can relate this to targeted advertising, when maybe you've been looking to buy a work blazer online and then you open up Instagram and it's right there. A number of women have talked about thinking about egg freezing or getting married and then seeing very targeted advertising relating to those things popping up in their feeds. And algorithms are actually really effective and very useful in some cases. You know, in this infinite world of information, they're delivering you content that you want to see. If algorithms didn't exist, we'd just be wading through endless forms of content that are largely irrelevant to us. So they are useful in that respect, but on the flip side of that, they can also be dangerous.

And you write that TikTok's algorithm, among social media companies in particular, is really really effective.

Yeah, TikTok's algorithm is so successful that we have seen other social media platforms in Silicon Valley try to copy it. TikTok's algorithm is, you know, sending users the vast majority of what they see on the app, and it has become so good at what it does that people are trying to understand it. They want to know what drives it, what powers it, and also how it influences what a user sees. At the heart of the algorithm, what makes the platform so addictive, is the For You page. As a user on TikTok, when you open up the app, the first thing you see is this For You page, which is delivering a steady stream of content that you don't want to look away from because it's so hyper-personalized to what you want to see.

And it's really what the algorithm has decided you want to see. Is that right?

That's a great catch. It's not what you want to see, it is what the algorithm thinks you want to see. Social media apps have really become ubiquitous in our lives now. Everyone uses them, particularly kids. They have a lot of positives, like they enable connection among teens, and particularly through the pandemic that was really crucial. It allowed teens to connect with one another, to talk among their friendship groups. But like all social media products, they have different pros and cons. When you look at Facebook or Instagram, a lot of what you see is driven by who you follow. It's driven by your friendship groups. What distinguishes TikTok from other social media products is that its For You feed is really a steady stream of content sent in from random users around the world. It is not driven by who you follow, who you're friends with. It is driven by what the algorithm thinks you want to see, so what the algorithm can send you to keep you watching for even longer. So it's a totally different business than the social media apps before it, and that's where it's really interesting when you're seeing Facebook and Instagram and Snapchat start to try and tweak their own business models to reflect more of that of TikTok, where they're sending random content to users rather than sending them content from people that they already follow or are already friends with. When you look at TikTok's algorithm, it does a lot of great things for users, feeding them content they do want to see, that they do want to be engaging with. But it can also send content to teenagers that can target their vulnerabilities and in some cases send repetitive videos about very distressing, dangerous, or harmful types of issues. And in the reporting in this story, I've talked to a number of young people, and families of young people, who are aware that the TikTok algorithm has been sending them never-ending streams of content about anorexia or other eating disorder-related content, depression, self-harm, and suicide.

How did they describe the content the algorithm delivered to their for you page?

A number of the young people that I talked to said they started using the platform and they loved TikTok. They loved the silly, funny dancing videos, they loved the content they were seeing, and they liked engaging with it and trying to make their own content. But then they noticed that what their For You feed, or what the algorithm was sending them, started to change. It switched from funny, silly dancing videos to much darker topics. You know, a lot of the young people who were watching this kind of content pointed out that it wasn't just videos to raise awareness of eating disorders and mental health issues. It was videos that actually glorified eating disorders, that embraced it, that encouraged it, that told these young girls to eat less than three hundred calories a day, or try this exercise to make your legs look skinnier. It promoted videos and images of other young women with eating disorders, who suffer from anorexia, and encouraged them to look the same way.

Olivia, your most recent story about TikTok focuses on a teenager named Chase Nasca. Can you tell us about him?

Chase was a sixteen-year-old boy who grew up on Long Island in Bayport. He went to Bayport-Blue Point High School. He loved playing soccer. He was on his high school soccer and swimming teams. He was a very competitive athlete. He was an honors student, so he was excelling academically at school as well. He had a big group of friends in the Bayport area, and in twenty twenty, when the COVID-nineteen pandemic shut down his school, Chase started to use social media more, like most other teens in America did. He would use TikTok in particular, because that's the most popular app among teenagers today, and he would send his friends videos, and often they were funny, kind of silly videos about sports or weightlifting or Batman or basketball. And then towards the end of twenty twenty-one, a number of his friends noticed the content that Chase was sending them through TikTok took a darker turn. It was about suicide. It was posts glorifying suicide, talking about not wanting to live anymore, it's not worth it anymore. Some of these posts really concerned his friends, and they didn't understand what was going on. In February of twenty twenty-two, Chase took his own life, and afterwards his parents just, they didn't understand. They obviously felt blindsided, and they started searching for an answer, looking for a clue. He didn't leave a note. He didn't mention anything to either of them. He was keeping up with his grades, with his sports schedule. And his mum wanted access to his cell phone to get a sense of what he was doing online in the days leading up to his death. So when the police returned his phone, she tried to open it, but like all phones today, it had a passcode and she couldn't break into it. So she ended up figuring out a way to reset his passwords on different social media apps so she could access them. She ended up getting access to both Snapchat and TikTok. Once she gained access to his TikTok app and she opened his For You page, she was horrified at what her son had been sent by TikTok's algorithm. She watched the For You feed for more than an hour and said there were no happy videos, there were no funny videos. It was a non-stop stream of content about heartbreak, unrequited love, depression, anxiety, self-harm, and suicide. His mother, Michelle Nasca, when she opened his TikTok account, she didn't really know what she was looking for. She'd never used the app before. She didn't understand how it worked. She just wanted to be closer to Chase. She wanted to get a sense of what he was doing on these apps, what he was seeing, who he was talking to. And she told me that once she opened TikTok, she realized what she was looking for, and she found it right there.

Then I'm gonna put a shotgun in my mouth and blow the brains out the back of my head... Cool, you gotta kill yourself... Like, right now?

Everything we just heard is content that Chase's mother found on his TikTok account, and it was shown to the public at a recent televised congressional hearing where TikTok's CEO Shou Chew testified. We're going to hear more about that in a minute.

Not only was the For You feed showing her a steady stream of content about depression and suicide, she actually managed to go deeper into his account to see the videos that he had liked, saved, favorited, bookmarked. It became a library of what Chase had seen in the months leading up to his death. There were over three thousand videos saved within this account, preserved within this account, almost like a digital fingerprint of Chase that he had left behind, and all of these videos were glorifying suicide in some way. I've seen a lot of this content and it is really distressing to see it. And I'm not a teenager, you know, my brain is fully developed, and even I struggled watching some of this content, and particularly thinking about how a teenage boy would have felt watching this kind of content. During my reporting, I spoke to one child psychologist, Professor Jennifer Harriger, who has done a lot of work on social media and how it impacts kids today.

Our producer Rebecca Chaisson asked Professor Jennifer Harriger about her research on social media and young people's brains.

A teenage brain has a less developed prefrontal cortex, that's the part of the brain that's associated with judgment and planning and impulse control. And a teenage brain also is more sensitive to rewards and social feedback, and we know that from numerous studies. And so if you think about a teen viewing this type of content, they would have a harder time recognizing that these messages are not necessarily normalized. They would have difficulty critically evaluating the content of the messages, and they would also be more sensitive to the emotional component of the messages. And then when you combine that with the prefrontal cortex, that would also lead to difficulties with them being able to regulate the time that they're spending on social media, so they might be viewing more and more images or content compared to people that might be able to recognize that, oh, it's time for me to get off.

When his mum started reviewing all of this content and all of these videos that Chase had saved, she said the one that devastated her the most, that caused her to actually physically gasp, was a video sent to his account three days before his death. It was a picture of a young man smiling on train tracks with an oncoming train in the background, and the caption said, went for a quick little walk to clear my head. Chase died by stepping in front of an oncoming train.

Olivia, when you asked TikTok about Chase, what did they say?

Chase's parents have filed a wrongful death lawsuit against TikTok, and when a lawsuit's filed, that really removes the ability for any corporation to be able to comment on the case, so it's fairly typical for TikTok to come back to us saying we can't comment on pending litigation. But the company did send a statement through saying that it is committed to the safety and well-being of its users, especially teens, and a spokeswoman from the company said, our hearts break for any family that experiences a tragic loss. We strive to provide a positive and enriching experience and will continue our significant investment in safeguarding our platform. And TikTok has done a lot of work in this area, particularly in the last few years, to try and make the app a safer space for kids. The challenge is that when you find specific accounts like Chase's account, and you can see that the For You feed is continuously sending him dark, depressing, or distressing content, it shows us just how challenging this task is, and that even with all of these efforts, all of these safeguards, or all of these guardrails to protect kids, ultimately TikTok hasn't been able to fully solve this problem. In the March congressional hearing, when TikTok CEO Shou Chew had to testify before Congress, Chase Nasca's account was actually broadcast before the entire congressional hearing.

The content on Chase's For You page was not a window to discovery, as you boldly claimed in your testimony. It wasn't content from a creator that you invited to roam the halls today, or STEM education content that children in China see. Instead, his For You page was sadly a window to discover suicide. It is unacceptable, sir, that even after knowing all these dangers, you still claim TikTok is something grand to behold.

Chew was asked if he would allow his own children to watch that type of content, and then a representative displayed a number of videos that had been sent to Chase that were glorifying suicide.

Would you share this content with your children? With your two children, would you want them to see this?

That was Florida Republican Congressman Gus Bilirakis, who was questioning TikTok CEO Shou Chew, and Chew was not really given a chance to answer the question he was asked.

And during the hearing, Chase Nasca's parents were in the audience, and they were asked to stand up while his account was displayed before all the representatives in the room. And it resulted in probably one of the most emotive scenes from the entire five-hour testimony, when you saw Dean and Michelle Nasca stand up in the audience and break down in tears as Chase's death was explained to the CEO.

After the break, the frustration of some people inside TikTok whose job is trying to protect users from dark content. Olivia, before the break, you had said that TikTok is trying to reduce the amount of this sort of content on the platform. What exactly are they doing? What sort of guardrails, as you put it, are they putting in place?

So TikTok has a trust and safety team. It has more than forty thousand people working on this team to try and make the platform a safer space, particularly for kids, and they have done many things over the years to try and enhance the protection of the app. They've tried things like not allowing users under the age of sixteen to send private messages. They have created a family pairing tool, which allows parents to kind of get a mirror image of what their kids are seeing on TikTok, to check in on what they're doing on the app. They have also, more recently, opened up tools that allow researchers to study some of the content that is displayed on TikTok, although in this case these researchers have to actually get it cleared by TikTok before it's published, so they've received a bit of criticism around that move. But you know, not only are they trying to be a safer space for kids, they're also trying to be more transparent with how the app works, by opening a transparency center in Los Angeles and, you know, sharing more information about what it is that powers TikTok. And the most recent thing the app announced was actually in early March. They said that they were pushing out a feature that allowed users to refresh their For You page. So this means if you didn't like what the app was sending you, you could push a button that effectively resets the whole account as if you've opened a new one, and then you can start using TikTok as a completely new user, and it pushes away all of the content that you may have been sent in the past that the algorithm thinks you want to see.

You also write that the Trust and Safety team pulls down videos when they can find them, but it's not really a fair fight.

Yeah, that's right. I mean, billions of pieces of content are uploaded onto TikTok, you know, on a daily basis, and the trust and safety team is tasked with trying to moderate this content. This is a really complicated, nuanced, complex area. If we just talk about depression for a moment, think about how hard it would be for a human moderator to decide what kind of content to leave up on the platform and what kind of content to pull down. So, for example, say we talk about a video where a user is saying, I'm feeling really sad today. Should that be allowed on TikTok or should it not? And the human moderators have to ultimately decide this. This gets harder as the topics get, you know, more and more complex. So when it comes to suicide, for example, if someone writes a caption like, I don't want to be here tomorrow. Are they joking? Are they being serious? Is this a cry for help? Should we send law enforcement to their door? It's very hard for human moderators to determine what a user means when they send out a video onto the app, and the human moderators have to ultimately decide what to leave up and what to take down.

You spoke to quite a few current and former members of TikTok's trust and safety team, and you say that they were frustrated because they felt they couldn't even get the information they needed to do their jobs.

I spoke to more than a dozen trust and safety representatives for this article. Many of them are former employees of TikTok, some who left as recently as last year, and what they were telling me is that not only is this a really difficult issue for the platform to moderate, but they felt as though they were limited in what they could achieve in their role. And what I mean by that is they're tasked with trying to make the algorithm safer, but they don't know how it works. Many of them described the algorithm as a black box. They said that all decisions pertaining to the algorithm were made in Beijing, where ByteDance, the parent company of TikTok, is based. And even though these forty thousand people who work on trust and safety are largely based outside of China, what they could do is come up with tools to protect users that really put the onus on the user. So you, as the user, have to decide what content you're comfortable seeing. If you, as a user, don't want to see content about anorexia, you can put that in as a hashtag and say you want to filter out any content related to that. But this does not change the underlying algorithm.

It does not change the way the For You feed works. The only way to change what a user sees, what content is being sent out and broadcast to the billion people who use TikTok on a daily basis, is to tweak the underlying algorithm. And the trust and safety team had no access to it.
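To illustrate the distinction being drawn here, the sketch below is a hypothetical toy example in the same spirit as the earlier one: a user-side keyword filter applied after the ranker has already produced its ordering. Blocked items are dropped from what the user sees, but the scoring that produced the ordering is never touched. All names and values are made up for illustration.

```python
# Hypothetical sketch: a user-side filter sits *after* the ranker.
# Removing topics the user opted out of trims the visible feed, but the
# underlying scoring and ordering logic is left completely unchanged.

def rank(profile, candidates):
    """Toy ranker: order candidate videos by the user's per-topic interest score."""
    return sorted(candidates, key=lambda video: profile.get(video["topic"], 0.0), reverse=True)

def apply_user_filter(ranked, blocked_topics):
    """Post-hoc filter: drop videos whose topic the user has chosen to block."""
    return [video for video in ranked if video["topic"] not in blocked_topics]

if __name__ == "__main__":
    profile = {"pranks": 5.0, "soccer": 3.0, "cooking": 1.0}
    candidates = [
        {"id": 1, "topic": "cooking"},
        {"id": 2, "topic": "pranks"},
        {"id": 3, "topic": "soccer"},
    ]
    ranked = rank(profile, candidates)               # ranking logic untouched
    visible = apply_user_filter(ranked, {"pranks"})  # only the output is trimmed
    print([video["id"] for video in visible])        # -> [3, 1]
```

In a setup like this, the only way to change what the ranker favors is to change the profile or the scoring itself, which is exactly the access the trust and safety employees say they did not have.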

And you write that members of the trust and safety team wrote to the higher-ups at TikTok, asking questions about how the algorithm works so that they would be able to make suggestions about how to make it safer. What did the company say to those requests?

Many of those requests were just flat-out ignored. They said that it was a one-way information flow. Trust and safety was expected to provide information to the engineering team that was largely based in Beijing, but when they asked follow-up questions, or asked for the engineers who programmed the algorithm to talk to trust and safety, they were stonewalled. They couldn't get any answers. They said that the secrecy pertaining to the algorithm isn't just an external thing, it is a deep internal thing as well.

And what does TikTok say about that.

In a response to a number of questions I asked the company, TikTok said that it takes these concerns that have been voiced by employees seriously. It says that members of its trust and safety team do work directly with engineering, and that it's made significant changes in the past few years to try and make the app a safer space. It's also been trying to make the app more transparent. It's been releasing transparency reports, telling the public how many videos it removes every quarter, and opening this transparency center in LA to try and allow researchers, third party members, journalists to get a better sense of how the algorithm works. So the company is definitely working to try and address this issue.

You write about one person in particular who shows just how difficult this is. Can you tell us about him?

Charles Bahr was an early employee on TikTok's sales team in Germany. What he told me was that before he started working at TikTok, he would see a lot of fun, silly, kind of joking videos, and he loved the product. He thought it was amazing and way better than any other social media product out there. But once he joined TikTok and started to post about working as an employee at the company, he would get a lot of videos sent to him from users who felt like specific content should come down, was dangerous, was harmful. One of the first really scary videos he remembers being sent was of a man shooting himself in the head. And when Charles would engage with this content, and what I mean by engage is just open it, just watch it, just forward it on, maybe to the trust and safety team, which he did on many occasions, the algorithm was learning from him that he was opening, engaging with, and watching dark, depressing content, so it started to send it to him. It became so dark that he said on many occasions it led him to cry. And you know, that just goes to show that even TikTok's own employees didn't understand the algorithm and didn't understand how to change what it was sending them. So he raised these internal concerns, and he says a lot of his questions were not answered. After Charles raised these concerns internally, he was actually fired from TikTok in late twenty twenty-one. He was accused of expense fraud and misusing company tools. TikTok has said to me in an on-record statement that it can't really comment on Charles's criticisms of the company, and it can't validate any of the concerns that he raised internally because they can't find those messages. He actually still maintains his innocence to this day. He sued TikTok for wrongful dismissal and they settled outside of court.

When we come back, the pressure builds on social media companies to do something to fix this problem. Olivia, we talked about the congressional hearing where TikTok's CEO was asked a lot of tough questions, and there's growing pressure on the company to do something about this kind of content that the algorithm is pushing to users.

I mean, the winds of change are coming for social media. We're seeing movements in so many different areas now. We're seeing dozens of lawsuits filed accusing these companies of wrongful death or product harm liability. Many of these cases are ongoing before the courts today. We're seeing Section two thirty, which is a section of the Communications Decency Act that essentially provides tech companies with a liability shield, saying they cannot be sued for the content that's posted on their platforms; there are Section two thirty cases before the Supreme Court right now. We've also seen more and more cases like the Chase Nasca case, and parents involved in those cases willing to speak out. We're seeing researchers and academics also push back against big tech, calling for access to these algorithms to understand how to study them, to see the impact that they have on kids and on society in general.

As a practical matter, what can be done to fix this problem?

Well, in terms of what can be done, I mean a lot of these academics are calling for more transparency around how these algorithms work.

And Professor Jennifer Harriger, who we heard from earlier, said something similar: that research is hard to do if you don't have access to all the information.

We have not been able to find a way to actually test the algorithm itself and whether that is increasing risk. And the reason for that is that the companies are not very transparent about how the algorithm works, so there's no way for a researcher to test whether or not the algorithm is making this experience more dangerous for the child without knowing how the algorithm is actually working. If someone is experiencing negative effects based on the content that they're viewing on a social media platform, it would be difficult to know if it's because they themselves have sought out the content, or if the algorithm is pushing more content on them that they would not normally have been exposed to.

We asked Jennifer Harriger for some practical advice for parents who are hearing all this and wondering how they can help their kids stay emotionally safe on social media.

If a child's already on social media, then I would recommend that the parents help create a plan to, first of all, limit the time that the child is on social media, because they probably will have a very difficult time getting off of it themselves without some sort of prompt. And I would also recommend that parents talk with their children and help them with some media literacy. So how do you critically evaluate the messages that you're being exposed to? What kinds of things are you seeing on social media? Let's talk about it. How are you feeling when you're on social media? If you feel a lot worse about yourself when you're on it, you know, is this the best thing for you right now?

Olivia, given all the pressure lately for more transparency, do you think social media companies will begin to allow some visibility into the way their algorithms work?

One of the biggest difficulties we've seen is that tech companies are notoriously secretive about this data and about their algorithms. They protect that intellectual property fiercely. And this isn't just TikTok, this is, you know, a big tech social media thing. None of these companies want third-party researchers coming in and studying their algorithm. That's their secret sauce, that's what makes them money, that's their whole business model. So now we're getting to a point where researchers are saying, well, look, we've seen rising rates of mental health issues, hopelessness, suicide, and self-harm among teens. That has tracked precisely with the explosive use of social media. So, you know, correlation is very easy to prove here, but causation is very difficult to prove. The reason why that's difficult to prove is because these researchers cannot gain access to what's happening inside these companies and how these algorithms work and how they're affecting their users. So one of the things that a lot of people are calling for, one of the things that we should be talking about, is ways to put pressure on these companies to be more transparent, to not only allow people to study the public content that might be on their platforms, which is what TikTok has recently done, but to actually study the underlying algorithm, the recommendation engine, how it's weighted, what it thinks you know you want to see, and how it makes those decisions. And that's what I've heard time and time again from the researchers I've talked to: we need access to study these, because these algorithms and these companies are having a huge impact on the youth of today and we do not know what the long-term consequences are going to be. The vast majority of kids do really like social media, but it has this darker side, and the darker side we don't understand because we can't study it.

Olivia, thanks so much for being here.

Thanks for having me.

Thanks for listening to us here at The Big Take. It's a daily podcast from Bloomberg and iHeartRadio. For more shows from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen. And we'd love to hear from you. Email us questions or comments to bigtake@bloomberg.net. The supervising producer of The Big Take is Vicki Vergolina. Our senior producer is Kathryn Fink. Rebecca Chaisson is our producer. Our associate producer is Sam Gebauer. Gilda Garcia is our engineer. Our original music was composed by Leo Sidran. I'm Wes Kasova. We'll be back on Monday with another Big Take. Have a great weekend.
