Legislators recently met to vent their frustration at big tech’s lack of accountability for the production and trading of child sexual abuse material. Will it make a difference?
If you would like to reach out to the Betrayal Team, email us at betrayalpod@gmail.com.
To report a case of child sexual exploitation, call The National Center for Missing and Exploited Children's CyberTipline at 1-800-THE-LOST
If you or someone you know is worried about their sexual thoughts and feelings towards children, reach out to stopitnow.org
In the UK, reach out to stopitnow.org.uk
Hey guys, it's Andrea Gunning with some big Betrayal news. I have been on location with some of the people you heard in season two, Ashley Avea and their family, to shoot a docuseries for Hulu. I'll let you know when the docuseries is available on Hulu later this year. We're all so excited to announce that Betrayal will become a weekly series starting this summer. Thanks to your support of this podcast, we'll be able to bring you many real-life stories of betrayal, making this community even stronger. So if you've been thinking about sharing your story, now is the time. Email us at betrayalpod@gmail.com. That's betrayalpod@gmail.com. I want to share some news that affects children everywhere. Our second season of Betrayal focused on families destroyed by child sexual abuse material, also called CSAM. The National Center for Missing and Exploited Children has reviewed over 322 million images and videos of child sexual exploitation. It's hard to wrap your head around that. It's why we couldn't stay away from the topic last season. It's also been a big issue in Washington recently. Betrayal producer Carrie Hartman has been following developments. Carrie, I know you watched it. What did you see?
Yeah, I watched it. It was fascinating. The Senate Judiciary Committee subpoenaed five CEOs of some of the biggest tech companies: Discord, Snap, Meta, X (formerly Twitter), and TikTok. The committee wants to advance several bills that address online safety for children. This hearing got a ton of publicity, and at the beginning, the Senate Judiciary chair explained how the committee was feeling.
These apps have changed the ways we live, work, and play, but as investigations have detailed, social media and messaging apps have also given predators powerful new tools to sexually exploit children. Your carefully crafted algorithms can be a powerful force in the lives of our children. Today we'll hear from the CEOs of those companies. Their constant pursuit of engagement and profit over basic safety has put our kids and grandkids at risk. But the tech industry alone is not to blame for the situation we're in. Those of us in Congress need to look in the mirror.
This was a major issue for two New York Times reporters that you talked with earlier this season.
Yeah, why don't we actually revisit that interview with Gabriel Dance and Michael Keller.
We spoke with people who said that as early as 2000, tech companies knew this was a very serious problem and were doing nothing to solve it. In 2009, when they introduced scanning technology, we knew that it could be effective in helping stem the problem. Still, tech companies were not using it.
I would say if you talk with most technology policy people, their answer would be that technology companies don't have that much pressure to get rid of harmful content on their platforms, because Section 230 of the Communications Decency Act shields technology companies from any liability for content that users post.
Can you explain more about what Section 230 does?
Okay. So, Section 230 means any lawsuit holding a tech company liable for damages won't go anywhere; they have immunity. So if Facebook, Snapchat, or X is caught storing or transmitting images of CSAM, for example, parents can't hold the company responsible. And try to imagine if it was your child's photo, and if that child was tricked into sending it. But Section 230 was passed almost thirty years ago, back in 1996. No one could have imagined back then TikTok or Instagram or even sextortion. People still had their photos developed at the drugstore. And I have to tell you how real this is. I mean, this happened to a close friend of mine, to her child. You take a vulnerable kid and a savvy adult with no conscience and no barriers.
Right. So why was there a hearing now?
It seems in recent months that frustration with tech's immunity is just getting bigger on both sides of the aisle. And look, this isn't the first time Congress has summoned tech leaders for a shaming session. But I was really curious: was this more than a shaming session? So I reached out to Politico technology reporter Rebecca Kern. She was in the room for this whole thing, and she shared some of her thoughts.
Oh, interesting. I've been covering efforts in Congress to regulate social media companies and how they handle kids' online safety issues. Typically there's a lot of posturing from the senators, but in the room, the emotion was very palpable, because this time the committee members invited families whose children have died as a result, they say, of content they've been exposed to on the platforms. A number of children have committed suicide over cyberbullying, over a new phenomenon that I know you guys have covered in the podcast called sextortion, where organized criminal groups create fake accounts that purport to be other children, extort illicit images from children, and then blackmail them for money.
My gosh. Yeah. And the committee chair, Dick Durbin, co-sponsored the STOP CSAM bill. That bill would hold platforms responsible if they host CSAM or make it available. And you're probably thinking, well, who would make those images available? But haven't you ever searched for something? Like, you just took up skiing recently, right? So you want to see more images of skiing, and then the platform's algorithm recommends more content because it thinks that you like that. Well, it does the same thing with nefarious and dangerous content. And Senator Ted Cruz went after Meta on exactly that point.

Mr. Zuckerberg, in June of 2023, The Wall Street Journal reported that Instagram's recommendation systems were actively connecting pedophiles to accounts that were advertising the sale of child sexual abuse material. In other words, this material wasn't just living in the dark corners of Instagram. Instagram was helping pedophiles find it by promoting graphic hashtags, including #pedowhore and #preteensex, to potential buyers. Instagram also displayed the following warning screen to individuals who were searching for child abuse material: "These results may contain images of child sexual abuse." And then you gave users two choices: get resources or see results anyway. In what sane universe is there a link for "see results anyway"?
How did Mark Zuckerberg respond to that?
There's no good answer for that, But here's what he said.
Well, because we might be wrong, we try to trigger this warning when we think that there's any chance that there is.
Okay. Here's more from Rebecca Kern.
Tech companies will admit it, and it is for sure not something they want on their platforms. They don't want to be hosting CSAM, and they take great efforts to remove it, and I will give them credit: they invest millions of dollars into AI and machine learning to detect it early. But it's still there, and it gets spread across multiple platforms.
These companies are self-policing and self-reporting, but we're depending on them to find it and shut it down.
It's interesting that you bring that up, because a senator from Rhode Island, Senator Sheldon Whitehouse, commented exactly on that issue.
We are here in this hearing because, as a collective, your platforms really suck at policing themselves. In my view, Section 230 is a very significant part of that problem.
Listen, there were great soundbites from senators, but that doesn't translate to policy, right? Rebecca Kern pointed out that Section 230 served an important purpose, at least for a while.
We wouldn't be leading the globe in these innovations without Section 230 allowing them to flourish without lawsuits. But a lot of other senators are saying, okay, we allowed them to flourish and grow. Now we need to rein them in, and we're an outlier in the whole globe. Europe has been able to pass regulations and hold them accountable, and so a lot of people say it's time to take away this, quote unquote, sweetheart deal that we have given to tech companies.
Did any comments stand out to you while you were watching?
There were a lot of them, but this one from Amy Klobuchar kind of got me.
When a Boeing plane lost a door in mid flight several weeks ago, nobody questioned the decision to ground a fleet of over seven hundred planes.
So why aren't we taking this same type of decisive action on the danger of these platforms when we know these kids are dying?
She has a point, right? When everyone is worried about their own physical safety, boom, it's done.

Exactly.
And I got to tell you about another moment that really took the room down, and that was when Meta CEO Zuckerberg testified that social media doesn't really do any harm to kids.
With so much of our lives spent on mobile devices and social media, it's important to look into the effects on teen mental health and well-being. I take this very seriously. Mental health is a complex issue, and the existing body of scientific work has not shown a causal link between using social media and young people having worse mental health outcomes.
Did he say that with a straight face?
He did, and there was some laughter. I mean, it was one very short moment of levity. But you know, it's just so absurd. You don't have to be a social scientist or a psychologist to understand that social media impacts kids a lot.
Was there anyone there defending the work of technology companies? I mean, there are ways they've enriched all of our lives. Can you even remember life before Amazon?
Life before Amazon? You mean going to a store and having to wait in line? No, of course not. But all kidding aside, some senators mentioned that and did praise these companies for adding some value to society. But this hearing wasn't set up for pushback. It was really about these tech companies being told: draconian measures are coming if you don't do a better job. But outside of this, there is an advocate for the tech companies called NetChoice, and they are pushing back pretty hard. They have filed several lawsuits against states that are tired of waiting for the federal government to do something.
Can you give me an example.
Sure, here's one. NetChoice is suing the Ohio Attorney General over the Social Media Parental Notification Act. This law requires companies to obtain parental consent before individuals younger than sixteen can use platforms like Facebook, Instagram, YouTube, and Snapchat. So NetChoice does not support any of these bills being pushed by the Judiciary Committee. What do they support? Well, free speech is what they hang their hat on. Free speech, free speech all the way. But one thing that they did promote that will be familiar to our season two listeners is to hold child abusers accountable by prosecuting more of them. You know, far too many reports of CSAM offenses are not investigated, not prosecuted. We talked about this, Andrea, like it's triage, right? There's not enough law enforcement to go after all the people that are breaking these laws, and when they're able to go after them, they can prosecute them and at least put them away for some kind of prison time. But despite NetChoice, there was some movement on one of the bills, called KOSA, or the Kids Online Safety Act. Now, this bill wouldn't repeal Section 230, so we asked Rebecca Kern what it would do.
That one specifically would hold tech companies accountable, imposing a duty of care for them to make sure that their recommendation systems, their algorithms, do not recommend harmful, quote unquote, content. That is the key word: how do you define harmful? For them, they're saying it's suicide content, it's eating disorder content.
And Rebecca pointed out that some groups are worried about KOSA moving forward.
Progressive LGBTQ groups are saying, we're worried that this bill also empowers state attorneys general to sue over harmful content, and how they would define harmful content, maybe like trans content or LGBTQ content that these communities would want to see on the platforms. Some conservative-leaning AGs may want to take that down. So they said this could have an inadvertently negative impact for certain vulnerable youth.
While the CEOs were on the hot seat, you know, the day before they were called to the hearing, they did make some concessions that are worth mentioning. Here is X CEO Linda Yaccarino.

X supports the STOP CSAM Act. The Kids Online Safety Act should continue to progress, and we will continue to engage with it and ensure the protection of freedom of speech.
And you know, Snap CEO Evan Spiegel also came out in support of KOSA. And look, it's not everything, but maybe it's a start. Here's Politico's Rebecca Kern.
Again, these are the constant battles these platforms have to deal with: between privacy, which is such a strong protection in our country, and free speech and other protections, and safety. And there's, you know, no real mandate to put safety first.
Do you think Section two thirty has a chance of being repealed?
I asked Rebecca that question, and she seemed pretty doubtful. You know, it's not just the law passing, but it's the lawsuits that would follow, and how many years it would be caught up in court.
I can't help but wonder: did this hearing make a difference?
If you're asking will it create more safety for children online? I think there is a reason for hope. There was some movement we've never seen before. But people need to keep applying pressure because that does make a difference.
Thank you to Politico's Rebecca Kern for her insight, and thanks to our listeners for your support of Betrayal. Remember, if you want to share your story for the new weekly series of Betrayal coming this summer, email us at betrayalpod@gmail.com. That's betrayalpod@gmail.com. Betrayal is a production of Glass Podcasts, a division of Glass Entertainment Group, in partnership with iHeart Podcasts. The show was executive produced by Nancy Glass and Jennifer Faison, hosted and produced by me, Andrea Gunning, written and produced by Carrie Hartman, also produced by Ben Fetterman, associate producer Kristin Melcuririe. Our iHeart team is Ali Perry and Jessica Krincheck. Audio editing and mixing by Matt Alvecchio. Betrayal's theme composed by Oliver Bains. Music library provided by My Music. For more podcasts from iHeart, visit the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.