Meta announces they will stop promoting "political" content, kicks the can on what "political" means. Internet shopping company Temu ran ads during the Super Bowl, but an awful lot of people claim that their app compromised their private information, and experts say it looks like malware. We don't recommend installing it! An unscrupulous anti-abortion company used geolocation data from a data broker to target people who had visited clinics with a national abortion misinformation campaign. On the streets of San Francisco strangers came together, put aside their differences, and joined forces to destroy a self-driving Waymo car. It's a news roundup!
Uber, Lyft, DoorDash drivers in the U.S. to strike on Valentine’s Day for fair pay: https://www.nbcnews.com/tech/tech-news/uber-lyft-doordash-drivers-us-strike-valentines-day-fair-pay-rcna138450
Temu app contains ‘most dangerous’ spyware in circulation: class action lawsuit: https://www.fashiondive.com/news/temu-class-action-lawsuit-data-collection/699328/
Would you buy this Temu purse? https://www.instagram.com/p/Crl1copvr_u/
Instagram and Threads will no longer promote ‘political’ content. No one knows what they define as ‘political’ but it's a safe bet that (Daniel Day Lewis Voice) There Will Be Bias. https://www.cnn.com/2024/02/15/success/instagram-threads-political-content/index.html
A data broker company tracked visits to 600 Planned Parenthood locations around the country, then sold the data to fuel an anti-abortion misinformation ad campaign, senator says: https://www.politico.com/news/2024/02/13/planned-parenthood-location-track-abortion-ads-00141172
Protesters Gather Outside OpenAI Headquarters: https://www.bloomberg.com/news/newsletters/2024-02-13/ai-protest-at-openai-hq-in-san-francisco-focuses-on-military-work?leadSource=reddit_wall
A crowd destroyed a driverless Waymo car in San Francisco: https://www.theverge.com/2024/2/11/24069251/waymo-driverless-taxi-fire-vandalized-video-san-francisco-china-town
Nearly 1 in 5 Americans believe Taylor Swift is involved in an election conspiracy theory: https://www.monmouth.edu/polling-institute/reports/MonmouthPoll_US_021424/
Weird times for media outlets: https://www.newyorker.com/news/the-weekend-essay/is-the-media-prepared-for-an-extinction-level-event
There Are No Girls on the Internet is a production of iHeartRadio and Unbossed Creative. I'm Bridget Todd, and this is There Are No Girls on the Internet, where we explore the intersection of identity, social media, and technology. And this is another installment of our weekly roundup of news stories that you might have missed on the Internet. Mike, happy day after Valentine's Day. Did you see that all of the Lyft drivers and Uber drivers and DoorDash folks were striking? They were doing a Valentine's Day strike for better wages.
I did, and I love to see it.
I love to see people and workers taking collective action to demand better pay, respect, to increase their visibility.
Love to see it.
Okay, so I agree with all of that, plus plus plus plus to all of that. However, I have to admit that even though I talk so much crap about Uber and Lyft and DoorDash and the way that they run their companies on this podcast, I do use those services. Like, I take Uber and Lyft a lot here in DC where I live. And so this news that Uber drivers and Lyft drivers were striking on Valentine's Day, and asking folks not to break the strike and cross the picket lines and use those services, set me into a little bit of an existential tailspin. If you live in DC, you know what I'm talking about. I've lived in cities my whole life, and I've lived in DC the longest, and I would say Uber and Lyft have really changed the landscape of the city. And it's interesting. I was like, how am I going to get to my dinner? Like, what am I going to do? Should I drive? But if I'm drinking, am I risking a DUI? Is it even safe to be out driving at night? I'm not the best driver. I was racking my brain trying to figure out what to do, and wouldn't you know it, it was completely fine. All I had to do was walk out of my apartment. Oh, there's a cab. I got in it. I told them where I wanted to go, they took me there without incident, I went and had my dinner, came back out of the restaurant, oh, there's a cab, just got in it. It is so interesting to me how Uber and Lyft, their rise and their ubiquity, had made me think, like, oh, there is no other alternative. Hey, I would never cross a picket line. There's a scene from that sitcom The Nanny with Fran Drescher, where she's meant to be going to a fancy dinner but all the busboys are outside striking, and she's like, one thing my mother told me is you never cross a picket line. And I agree with Fran Drescher's mom's advice on this. But it's so funny to me how I had convinced myself that if I wasn't taking Uber or Lyft, there were no other options.
And it's like, oh, actually, a cab was very easy. It was right there, no problem, do you know what I mean? How strange it is that technology makes us think we have fewer options than we actually do.
Yeah, it is funny how that works, because yeah, like the cabs are still there, Metro is still there, buses are still there. There's a bunch of bikes around.
Don't even get me started on bikes.
Okay, well I won't.
I mean, yes to the city-run bike share program in DC. But again, I'm gonna circle back and make this a deeper complaint about Uber and Lyft. Uber purchased those Jump bikes, and then the Lime bikes, and basically junked them. Biking was a much more viable option in DC, I can't speak for other cities, before Uber and Lyft got involved in it, and now it is like a husk of its former self. You have to acknowledge that.
Yeah, so I have a lot of thoughts about what has happened to the dockless vehicles here. I remember back in, I don't know, it must have been twenty seventeen, twenty eighteen, remarking to a friend that we were living through peak dockless vehicle, because that was the period when all these companies, like Jump, Bird, Spin, Lime, a whole bunch of others that are now gone, had so much venture capital that they were just dumping electric bikes and electric scooters all over the city, and you could, you know, check them out on your phone using the app for like pennies, right? It was ridiculously cheap. There were a million of these things. Yeah, they were a blight on the city. People did dumb things like parking them on sidewalks in ways that made the sidewalks inaccessible, which was definitely not cool. But for somebody who wanted to get around the city, it was amazing. And now most of them are gone. Like you say, it's just a handful that are left. It is kind of sad.
Okay, so this is something we don't see eye to eye on. I think that bit that you just said about them being a blight on the city, I don't think that deserves to be glossed over, because in DC, they are everywhere. They are blocking intersections, they are on sidewalks. It's a nightmare. And half of them don't even work. So it's like, at this point, I think it was different when it was like, oh, well, these are bikes that are working that you could use to get around. Here in DC, part of me is like, at this point, it's just e-waste littering the streets for no reason. Half of them do not work.
It would be nice if more of them worked. Yeah, we don't see eye to eye on that.
I don't know.
I think it's super convenient to have like multiple transportation options if I want to get downtown and then I can just like ditch the thing and be done with it.
I mean, if it actually worked that way, yeah, that does sound like a good situation. I guess Uber and Lyft had to get involved and innovate that convenient reality out of existence. So thank you, Uber and Lyft. Okay, so, speaking of another tech company that maybe is doing some shady stuff. You watched the Super Bowl. Did you see all of those very weird, possibly AI-generated ads for Temu? Apparently I've been pronouncing it wrong this whole time.
I did see those ads, and they were so weird. It was weird in the same way that their advertisements for products, which are all over the web and social media, are also weird. It's just like just a little bit off, like appealing and sort of interesting, but also just like strange and a little bit off, as if like every step of the production process were performed by AI.
Did I ever show you the Temu ad that I got for the purse that looks like a gun?
You did show me that.
I'll put a link to it in the show notes, and I'll try to describe it. I know this is not a visual medium. It's like a handbag on a chain that hangs about hip level, and the handbag itself is shaped like a gun. There's really not more to it than that. I'm sure someone somewhere was like, maybe this is edgy design, but if you were to go out, it just looks like you're holding a gun on your hip, which, I don't know, I don't think that's the kind of purse that me, as a Black person, wants to be out and about carrying. Like, oh, just going to the bank with this visible gun on my hip. Thanks, Temu. But that's what you're sort of getting at, that even the products somehow seem like they were designed with no human at any step in the process.
Yeah. And I frequently get ads for products where it's not even clear what it is. It's some sort of interesting-looking gadget that's somewhere between high tech and steampunk, but does not have a clear function from the ad. And sometimes I'm genuinely curious. They do a good job of hooking my curiosity, and I'll almost click on it, until I see that it is Temu.
Well, let's see if you will ever click on another Temu ad again after we talk about what's actually going on there. So Temu did have a Super Bowl commercial back in twenty twenty three, but this year they ran four identical commercials with a jingle that goes "shop like a billionaire," and they also advertised fifteen million dollars in giveaways. I guess those ads are really paying off, because people are really downloading Temu after those ads, according to NBC, who talked to mobile intelligence company Apptopia's vice president of research, Tom Grant, who said in a statement that Temu's app downloads increased thirty-four percent on Super Bowl Sunday from the day before, which was Temu's fastest day-over-day growth since November. So this is the part where I have to give a big, loud warning about Temu. Please be careful with Temu. I would even go so far as to say, maybe don't use Temu, and by use it, I mean don't even download it, let alone make a purchase on it. And after that Super Bowl ad, maybe be the person in your family who warns your mom, your aunties, your cousins, whatever, about Temu too. So when I first started researching this segment, I was like, I don't want to get sued, I'm gonna say everything very carefully. I don't really need to do that anymore, and you'll find out why in a minute. So there are multiple reports of people saying that they have had their identity stolen after they downloaded Temu. I say "downloaded" with intention, because they didn't necessarily have to make purchases. Temu is one of those apps where, in order to see the great deals or find out what that item is that's been advertised to you, the only way to see it is to download the app. And so, Mike, you're saying, oh, I see these novelty goods that pique my curiosity on social media, but I always stop short of clicking on them. That is good.
I think that that is part of Temu's marketing strategy: piquing your interest with these deals or these novelty goods, getting you to go through the process of downloading the app, and that is where they get you.
Yeah, it seems pretty obvious, right? Like, if their business model were selling goods, they would make it possible to buy those goods without downloading the app. But it seems like one hundred percent of what they're trying to do is to get me to download the app, which is sketch.
So there are hella posts on Reddit from people who think they have had their identity stolen after getting mixed up with Temu. Here's one: "I idiotically installed Temu on my phone last week to see what these deals were that I keep hearing about. Uninstalled the app after ten minutes. Today, I woke up to see an odd notification on my phone pertaining to an approved Facebook ad. Apparently they made a page on my account, made a post about selling women's lingerie, all in Mandarin (Temu is a Chinese-owned company), had Facebook boost it as an ad, and now people are asking why I'm selling women's lingerie and if I am Chinese. I've deleted the page and now I'm worried Ads Manager will demand payment for the three hours of ads they've run on my page." Mike, there are so many posts like this, people saying, I didn't even purchase anything, it just piqued my curiosity, I downloaded the app, I had it on my phone for less than ten minutes, and now some sort of weird behavior is going on in my name.

And it is so consistent with their slightly off branding. Like, it's not just a normal scam that takes your money. It's, now people are asking why I'm selling women's lingerie, and if I'm Chinese.
I know, it's not just a normal scam, it is a weird scam. This is what I'm saying about this company. It seems like there are no humans anywhere in the mix, and it's the kind of scam that is written by AI. It can't even be a normal scam. It has to be a weird scam. Even their scams are weird.
Yeah, it's like a scam within a scam.
Okay. So when I was thinking about how to talk about this issue, I was like, oh, I just want to be like, people are saying, here are some experiences. I'll just share people's experiences and let the listeners decide, because I'm not trying to get sued by Temu. I'm sure they're very litigious. However, come to find out that Temu has faced multiple class action lawsuits alleging that they are misleading customers about the scope and reach of data access and collection, and that they have intentionally loaded dangerous malware and spyware onto users' devices. This is from a piece in Fashion Dive that we will put in the show notes. So attorneys for the plaintiffs claim that Temu collects data beyond what is necessary for an online shopping app, including biometric information such as facial characteristics, voiceprints, and fingerprints. The lawsuit quotes experts who said that Temu gains access to literally everything on your phone, and the suit further alleges that Temu is able to read private messages, make changes to a phone's settings, and track notifications.
I mean, that's just like a ridiculous amount of privileges for an app that is about purchasing goods.
Like, I can see why they would want to harvest every piece of information about you, including your biometrics and the content of your private messages. But it's difficult to think about a world where that's okay for an app to do.

Well, Mike, do you want to shop like a billionaire or not? Do you want these eye-catching deals or not? Is it just the cost of doing business to shop like a billionaire?
What does that even mean?
Like, are billionaires out there being like, oh, this thing's only three bucks?
What a deal?
I feel like that's exactly not what billionaires do.
I know. Even the slogan doesn't make sense, Like nothing about it makes any sense.
The only thing that makes sense is that they're like clearly trying to just harvest all of your data.
Well, the newest complaint against Temu cites experts who have studied the Temu app and concluded that it is quote "purposefully and intentionally loaded with tools to execute virulent and dangerous malware and spyware activities" on user devices. The complaint further alleges great efforts were taken to intentionally hide the malicious intent and intrusiveness of the software.
My god, that's intense.
Like, we talk a lot about bad apps, bad companies that are not taking appropriate steps to protect their users, but this is just malicious, straight up.

According to this complaint, like, straight-up malware. I guess you probably never expect it when you buy something online from a company that's kind of sketch. Like, when I first started making purchases, I think I've made two purchases on TikTok Shop, I was like, oh my god, is this it? Am I about to get my identity stolen? I thought people were going off of mostly vibes, that it felt sketchy, didn't feel right, whatever whatever. No, this is legit, real-deal malware. All of this to say, I guess I feel pretty comfortable saying people should really be thinking long and hard before downloading the app, even if you only download it to look, because some novelty item that you can't believe is real, like a gun-shaped purse, has come across your feed. Just think really long and hard before you click. And tell your aunties and your mamas and your cousins, like, send them this segment if they're thinking about buying something on Temu, or if you know the kind of person who can't resist a good deal online and wants to shop like a billionaire.
Yeah, I wonder if we're gonna hear from them. But you know, based on these experts and many lawsuits and many Reddit posts, it definitely seems like the sort of thing that should be avoided.
From what I have seen, I think that Temu might be litigious. They got into it with Shein back in twenty twenty three. So Shein is another kind of crappy fast-fashion retailer, and Temu is one of their big rivals. Temu sued Shein alleging quote "mafia-style intimidation," and so something tells me that this company is kind of litigious. I don't know all the details of what happened between Temu and Shein, but I was like, oh, I don't want to get mixed up with this company. But now I feel very confident saying that folks should really think twice before downloading it, at the very least. So, Temu, if y'all are listening, feel free to come on the show and plead your case. I would love to hear how you justify this. Maybe they'll just hack our remote recording software, and it'll be like, God, they got my identity, they're so good.
Yeah, hold on, I'm getting a notification about something that's trying to install.
Mike, why does this ad say that you're selling Chinese lingerie?
The scam is coming from inside of me? Oh no, they are good. They are good. It is a cool little gadget, though.
Let's take a quick break.

And we're back.
Let's talk about another big story going on, which is Meta deprioritizing political content. So we have talked a lot about the different changes happening to our social media landscape. Honestly, it feels like every day it's a new thing. I am exhausted trying to keep up with it. I cannot imagine how people who don't do this for a living feel, I guess I'll put it that way. And so this is a change that I think could have big consequences, especially for traditionally marginalized people. So Meta, the parent company of Facebook, Instagram, and Threads, announced this week that they would be deprioritizing political content on Threads, Facebook, and Instagram. This comes from Adam Mosseri, my personal nemesis, who said the company will no longer quote "proactively amplify" political content from accounts you don't follow. Mosseri said that the platform will still show you content from people that you have chosen to follow, but that the company will avoid recommending political content to the broader masses: "Our goal is to preserve the ability for people to choose to interact with political content while respecting each person's appetite for it." So I think that this is likely Facebook kind of admitting, or acknowledging, that they have no real ability to control things like hate speech or conspiracy theories on their platforms, the kind of things that led to, among other things, January sixth, a genocide in Myanmar, really chaos and havoc globally. I think that this is a decision that is like, yep, we don't know how to control it, and honestly, we can't be bothered to figure it out. I also think it dovetails with a story that we covered pretty extensively last summer about Facebook dropping news content altogether.
I think that they've kind of decided that news and political content is just like not worth it, and so they're just going to be like, Yep, we can't figure out how to manage it responsibly, and we're just going to sort of abdicate that responsibility.
Yeah, I think that's exactly right. I think they're responding to those two different pressures. The one pressure: they don't really want to be in the news business anymore.
There's this article in The New Yorker from yesterday about how people are just burnt out on news in general, and how that's rippling across media. This is Meta, so that's probably what's driving their real decision: it's just not profitable to be serving this kind of content. But then also, yeah, I think they just don't want to be seen as having responsibility for moderating political content responsibly, and so they're trying to take their hands off of it entirely.
But you know, good luck with that.
Yeah, I have two things to say about that. I was reading this piece from Charlie Warzel at The Atlantic about how perhaps that does in some ways reflect what people are saying they want from these companies. Warzel argues that people do not necessarily want to feel like Facebook is just picking and choosing what they see and moderating things, but they also don't want it to be the wild wild West of lies and disinformation and hate speech. And so in some ways, Warzel argues, Facebook is in a tough situation of wanting to give people what they want but also not wanting to do too much, and trying to figure out what it is people want in that sweet spot. I also want to say something about this idea of folks being burnt out on news. That really resonates with me, and it's one of those things that I don't really know what to do with. I actually started and then abandoned a Patreon episode about this, and so if you're wondering where the most recent Patreon episode is, it's still floating around in my brain, because it's a tough thing to navigate. I feel like I'm navigating it, and I'm really curious how listeners are feeling about it. It does feel like we've reached a fever pitch where there is so much difficult news, and that is real. I can speak for myself personally: a lot of the coverage coming out of Palestine, for instance, is the kind of thing that I have historically had a lot of issues with following very closely. It's the kind of tragedy that I would get very invested in and feel the need to follow quite carefully. But also it's the kind of thing that nobody could follow closely without it really getting to you. Journalists on the ground have talked about this too. And I do think it makes sense that it feels so absurd that all of this is going on while meanwhile we're meant to be going to work and living our lives as if it's not. It's really difficult.
And I say that from a place of extreme privilege, of having the ability to say, I don't want to see pictures of dead children anymore because it's really wreaking havoc on my mental health. If I was in Palestine, I would not have the ability to say that, right? It would be like, oh yeah, I'm sure you would like to not be experiencing this happening in your real life. And so I say that cautiously, really aware of the fact that it's a privilege to be able to say, oh, this news is such a bummer. But I also want to acknowledge it is a lot. When you think about what's happening globally, not just in Palestine but all over, part of me understands what people are saying when they say that they're burnt out on news content. And even as somebody who makes a podcast, like, I see the numbers for how episodes perform, and it does seem like people are looking for content that's not necessarily anchored to the news. And yeah, all of that to say, I'm curious how that lands for folks. I'm curious how folks are navigating that in their own lives. I'm definitely trying to navigate it as somebody who creates a thing. And I think it's important that we all stay engaged and informed and checked in. It's so easy just to retreat. Again, speaking for myself, it is so easy just to retreat into Real Housewives and nostalgia and watching The Office a million times and things that feel good. But we still have to find ways to stay informed and checked in. And yeah, I'm curious, Mike, if you have thoughts around that, or listeners, if you have thoughts around that, please let me know. I'm genuinely very curious how folks are navigating this time.
Yeah, that all really resonates. It's tough times. It's been tough times for years, right? There are terrible wars happening, domestic politics is anxiety-provoking, to say the least, and it's really difficult to stay engaged. That New Yorker article was interesting; we should link to it in the show notes. I feel like it's in some ways not surprising that collectively a lot of people in society are feeling this way right now. We've had ten, almost twenty years of social media where algorithms and platforms and bad actors and nefarious, demagogic politicians (you know who I'm talking about) have really perfected the art of dialing up outrage to get us engaged, to get our attention, to get our money, to get our votes.
And it's just exhausting, right? It's just gotten more and more exhausting. And I guess maybe in some ways this move by Meta can be seen as a reaction to that, that we need something other than a continual ratcheting up of outrage and terrible stuff. But just stripping political content from the platform doesn't really seem like the right approach either, because there are legit problems and issues in the world that people need to be informed about and should be engaged with.
Well, that's what I'm saying. I think that Facebook was a huge driver of the polarization that you described a moment ago, this ratcheting up of more and more hate, more and more engagement farming by making people angry and outraged and more divided and less trustful of each other, and all that. And I don't like that they can just be like, oopsies, we don't want to do that anymore. You can't just create this chaos globally and then step back, wash your hands of it, and say it's not your problem, because it is your problem. It's a problem that you caused. And I think it's pretty telling that now they're like, eh, we don't want to do it anymore. And I have seen people, some of whom are people that I trust and sometimes agree with, say that this is a good thing, that less politics on platforms means less chance for polarization and all of that stuff, right? But I am not so sure. And I think that a change like this has the ability to be very bad and destabilizing at a time when our media ecosystem is already not great. If Threads and Instagram and Facebook are no longer going to be prioritizing any kind of political content, that means, first of all, that there's going to be a bigger strain on Twitter to be a place where you go for news. And Twitter is a mess, and so we're going to make Twitter a bigger piece of the pie of our media diet as it pertains to politics and news and what's going on. That is a problem, because Twitter is a cesspool right now. And I also think Facebook has really not made it clear what they consider political content. They haven't given any information as to what they consider political, and I just think the idea that there's one big bucket of clearly defined political content is so incorrect. That is not how people live their lives.
The reality is everything is intertwined and intersectional, right? The Super Bowl is political, Star Wars is political, Real Housewives is political. Everything is political. Everything is politics. And I think this idea that, oh, we're going to start moderating what is and is not political, without giving any insight into what that's going to look like or any kind of concrete examples, is very worrying to me. So I guess CNN reached out to Adam Mosseri, who conveniently was traveling and could not give any information. And you know, Adam Mosseri doesn't really do interviews. What he does is tweet and then selectively respond to some people's tweets. Like, he tweeted about this change, Mark Cuban was like, oh, how are you going to be deciding what's political and what's not, and Mosseri replied to Cuban's tweet. So that's how ineffectively we're getting information about something that's poised to change our entire media ecosystem. What a gem of a guy, Adam Mosseri. However, even though he did not respond to requests to clarify what is going to be considered political content, a Meta spokesperson did give this very vague statement: "Informed by research, our definition of political content is content likely to be about topics related to government or elections; for example, posts about laws, elections, or social topics. These global issues are complex and dynamic, which means this definition will evolve as we continue to engage with the people and communities who use our platforms and external experts to refine our approach." That really doesn't say anything at all to me. They threw in the words "communities" and "people," and basically they're like, we don't know, y'all. That statement is so vague. It basically says nothing.
Yeah, it really reads like it says nothing. One gets the sense that they don't want to be in the business of deciding what kind of speech is okay and what kind of speech isn't. They don't want to have to make those value judgments, because they're going to make people mad. So they're just going to blanketly exclude any sort of political content. But all they've done is shift those value judgments from what speech is harmful or hateful to what speech is political.
And in doing so, that is a value judgment. By saying we're just going to deprioritize political content, that is them picking and choosing what is politics and what is something else: whose issues are going to be considered political, and whose are going to be considered okay. I feel like we have a lot of precedent for Facebook doing that. Facebook already is involved in the business of making those kinds of decisions, so we do have some precedent for how they might go about deciding what is or is not political. And spoiler alert: I believe that if it's something that pertains to somebody who is traditionally marginalized, women, Black folks, queer folks, folks of color, trans folks, it is going to be much more easily deemed political, and if it doesn't, it is going to be much more easily deemed by Facebook as not political. Facebook has already established such a clear precedent of doing this. We've talked about it before, like when it comes to how they moderate their ads. For instance, ads related to the sexual health of people who are not cis men, so queer folks, trans folks, women, ads for things like pelvic pain during sex, Facebook has deemed sexually explicit and will not allow them on the platform. However, products that deal with things like erectile dysfunction, Facebook has deemed health products, and so they're fine. So I would argue that Facebook is already in the business of deciding what is and is not acceptable, what is and is not a legitimate political issue or a legitimate health issue or whatever. The fact that Facebook is already baking this kind of bias into decisions like that does not fill me with confidence as they continue with this new direction of deprioritizing political content.
They're so powerful. They have so much power to shape discourse, to shape the information ecosystem in which we all live. It's laughable to think that they could just wash their hands of a huge swath of content, you know, and just abdicate any responsibility for political content. It's like the Spider-Man quote: with great power comes great responsibility. And they're not even attempting to meet it.
I agree. And from a larger, broader perspective, the data is super clear that the majority of the people making these decisions are white cis men, and not even just white cis men, but a very specific type of white cis man. They live on the coasts, they make a certain amount of money, a very specific, narrow subsection of people making decisions that impact all of us. So I would step back and say, are we really served by a media landscape where this small number of hyper-specific white men make decisions that impact everybody? I would say no. And I just can't imagine this is going to be a change that helps people who are traditionally marginalized, people who already have a difficult time building platforms and getting messages out there about what's happening in their lives and their experiences, things people need to know about. People might not want to hear about them, but people need to know about them. I cannot imagine that this is going to make the platform more hospitable to folks like that. And so, yeah, I guess we'll see. We really will.
I mean, there's a world where Meta makes this sort of announcement and sets up a very transparent, accountable process for deciding what sort of content is political or not, with built-in checks to prevent exactly those kinds of biases against women, transgender people, all the groups who typically get excluded from those kinds of decisions. There's a world where that process is part of this, and this new policy positively affirms some values. But that's not the world that we live in, right? We live in a world where we just get a vague statement that basically says it's going to be whatever they decide.
It is exactly. And we already know so much about how Facebook has demonstrated very clear bias in how these decisions get made. The Guardian published this thing in twenty seventeen called the Facebook Files, which showed the internal training documents that Facebook uses to train their moderators on how to decide what content is okay and what content breaks their rules. And it is so clear that that training material is biased against marginalized people. It's just very clear. We'll throw some links to the pieces in the show notes. So Facebook already has this precedent for coding bias against marginalized people into how decisions get made within the company. And so, yeah, I just don't think that we should be leaving it up to this very small subsection of white, straight cis men to make decisions about what kind of content the rest of us are seeing. And it's not even really about Facebook taking a step back. I would argue that that dynamic is not a healthy one. It's not a dynamic that leads to anybody being better informed and having a better media diet.
You know, it's interesting thinking about this story in contrast with the Temu story that we talked about, where on the one hand, unscrupulous companies are going out of their way to collect all of the information about us that they possibly can, and at the same time, we have this trend where social media platforms are restricting the types of information that we're exposed to and the types of discourse that we're able to have.
It's like opposite directions.
And yeah, it really goes back to what you were saying about folks feeling squeezed and burnt out and exploited, and also feeling less nourished by these online experiences. Who wants an online experience where, on the one hand, when I just go look up an app, they steal my identity, and on the other hand, I can't even be nourished by the content that I'm seeing on the internet? I do think something's got to give.
More after a quick break. Let's get right back into it.
So the online world: pretty scary, lots of threats to privacy. Fortunately, we can feel safe when we're just walking around out in the offline world, right?
You sure about that? That's my I Think You Should Leave Tim Robinson impression. But no, you can't feel safe in the IRL world either, because data brokers are out there selling your sensitive information to whoever. So last year, a Wall Street Journal report revealed that the anti-choice organization Veritas Society was using cell phone location data shared with online advertisers to target people who were visiting a Wisconsin Planned Parenthood clinic with misinformation about reproductive health. So if you were in Wisconsin and you went to Planned Parenthood, they would then bombard you with inaccurate information about abortion. After that, Senator Ron Wyden started investigating, and this week Wyden found that it went far beyond Wisconsin. Veritas was actually running the largest location-data-driven anti-abortion ad campaign in the nation that we know of, and they were doing it by getting cell phone location data from a broker called Near Intelligence and using that information to target people who had visited six hundred Planned Parenthood locations across the country with these misleading anti-abortion ads. So Wyden is now calling on the FTC and the SEC to quickly take action against the data broker Near Intelligence to really make sure that the privacy of patients whose data was exploited is being protected. He sent letters to the SEC and FTC outlining exactly that. I should say this is not really new, but it is the first time that we know of that location data has been used in this way at this scale. As early as twenty fifteen, you even had one ad firm bragging that it could quote tag all the smartphones entering and leaving nearly seven hundred Planned Parenthood clinics in the United States. And that was back in twenty fifteen. So this is something that unfortunately has been going on for a while, but we have never seen it at the scale and scope of this Veritas Society situation.
Gross.
I know, it's really bad.
According to Politico, which spoke to Justin Sherman, a researcher who studies data brokers at Duke University, this campaign's scale is unprecedented. Sherman says, quote, this is the largest targeting campaign we've seen to date against reproductive health clinics based on broker data. Also according to Politico, in a February twenty twenty three filing, the company said that it ensures that the data it obtains was collected with the user's permission. However, Near Intelligence's former chief privacy officer told Wyden's staff that the company collected and sold data without consent.
Okay, the idea that people who traveled to an abortion clinic or a Planned Parenthood clinic somehow willingly and with informed consent gave permission to this company to collect their geolocation data is absurd. It's absurd that they would even make the claim that this data was collected with the user's permission.
So Near Intelligence's former chief privacy officer, Jay Angelo, is really just singing like a canary here. God love him, Jay Angelo. Angelo says that while the company stopped selling location data belonging to Europeans, it continued for Americans because of the lack of federal privacy regulations.
Mister Angelo! Boom, there it is, in black and white.
Yeah, he's just saying it. Honestly, I appreciate the clarity, Jay. Mister Angelo revealed that while he had put a stop to the company's sale of data about Europeans, which is subject to Europe's strong privacy law, the company was still selling location data about Americans. This is kind of gross, but it sounds like this sensitive data was just part of, like, a going-out-of-business sale for this company. Near Intelligence filed for bankruptcy in December and started selling off its businesses and assets, which could include a trove of data collected from Planned Parenthood facilities. So Senator Wyden is asking the FTC to prevent the company from selling any more of that data. The company's privacy policy does say that in the event of bankruptcy or sale of assets, the company could transfer any of the data it collects to its successor. That is really gross, especially when we're talking about something as sensitive as health data, but apparently it is not uncommon. And this concern is part of a larger FTC crackdown on data brokers selling health data. Specifically, Politico writes that in recent months, the FTC has been cracking down on data brokers collecting and sharing health-related information. The agency's settlement against the location data broker X-Mode highlights that the company ran ads targeted to people who visited medical facilities, and its lawsuit against the data broker Kochava also notes that the data tracked people who visited reproductive healthcare clinics. So this is really gross. It is gross to think of your sensitive data being sold off as just part of a going-out-of-business sale or a fire sale, but it does sound like that is exactly what's going on, and Near Intelligence is not the only company that has been doing it.
Yeah, and I think people have a false sense of security that HIPAA protects them in these cases. HIPAA is the law in the US that protects the privacy of some health information.
But people often have a really inflated understanding of what HIPAA actually protects, and in a lot of cases it offers no protection at all where people would expect it to. So we really need better privacy laws in this country. That's really the long and short of it.
Well, that's the thing. I don't think that any regular person would assume that when you go to a private healthcare provider, a healthcare facility, something that is so intimate and private, that that data could be sold in this way. I think that most people would probably assume that we had protections that would prevent that. We just deserve better. We should have those protections. It is common sense. But I guess if it makes some company a little bit richer, that's the grift. It's like, okay, well, what won't we sell about people? It doesn't matter how intimate or how sensitive it is. They will sell anything if it makes them a buck.
It's disgusting, and it also underscores the importance of jealously protecting our privacy, even for things that aren't sensitive. Right? You might imagine that geolocation data from your cell phone about where you're going, you know, if you go to the grocery store, you go to a coffee shop, that's not so sensitive. So maybe it's fine, who cares if somebody is tracking that and associating it with my online shopping history. But that kind of willing surrender of our privacy, allowing ourselves to be tracked, our data to be harvested, these profiles to be created of us and sold from one broker to another, even if ninety-five percent of it is not sensitive, there's some sensitive stuff in there. And once it's out there, it's out there, and somebody's going to find it and exploit it to do gross things, like serving misinformation to people who have visited Planned Parenthood clinics.
So we did a story a couple of weeks ago about how OpenAI quietly, and I say quietly because I think they tried to make it quiet, but folks like us made it loud, dropped their ban on using AI for military purposes. Well, this week, OpenAI's headquarters in San Francisco was protested over this change. Protesters outside had signs that said things like Don't Trust Sam Altman, a reference to OpenAI's CEO, who was fired and then rehired, and basically now seems like maybe he thinks he can do whatever he wants because he came back after being fired. Bloomberg did a really interesting interview with the two activists who organized this protest. Holly Elmore, who helped organize it, said that the underlying problem was much bigger than OpenAI's military work. It's the way that OpenAI publicly stated it would not work with militaries and then retracted that commitment. There's no teeth to those boundaries, Elmore said. Even when they are very sensible limits set by the companies, they can just change them whenever they want. Sam Kirchner, another organizer of the protest, sounds like an interesting person. Sam, if you're listening to this, you sound interesting as hell, and I want you to come on the show. Actually, both of you sound interesting as hell, and I want you to come on the show. But Sam Kirchner says that she protests regularly in Seattle against artificial general intelligence. OpenAI and other companies are aiming to create AGI, which they define as AI as smart as the average person. Kirchner says that AGI will remove meaning from human existence by hindering our ability to contribute and make discoveries. And I think it's really interesting that Kirchner's objection is more existential. It's like, I object to the existential threat to humanity that AGI specifically poses.
Yeah, Sam is certainly not the only one talking about it as an existential threat.
In so many ways.
So I think that this protest really shows that people are paying attention to what's happening around technology like AI and saying, wait, this isn't good. And we can't lose sight of the fact that all of this is happening against the backdrop of what's happening in Palestine. So it is clear that this is not just theoretical, it is very much real life. Someone like Sam Altman just saying JK and deciding that their technology can in fact be used by the military comes with very high stakes. And to be super clear, as far as we know, OpenAI is not making weapons, and they say that they will not. But as the organizers of this protest point out, they also said that they would not let militaries use their tools, and now they've gone back on that. So why should we trust people like Sam Altman to do what they say? It's very clear from this protest that people are paying attention and being like, yeah, you say one thing and then you do another. It really matters what you do and what you say, and we feel like we cannot trust you at your word to stick by anything.
Yeah, I guess my role today is to connect each story with the previous one we talked about, but it reminds me of Facebook, which is so powerful and has an unavoidable responsibility for shaping what our information ecosystem looks like. OpenAI is really pushing the boundaries of this incredibly powerful technology, and a lot of people, like Sam Kirchner, feel it as an existential threat. That, combined with how unaccountable these companies are, is really unsettling and scary, right? They have so much power, so little accountability. It feels like a scary time to be a human.
Well, the humans are fighting back. Speaking of connecting stories, we've got to talk about another, I'll call it a protest, that happened in San Francisco. People on the street all got together and destroyed a Waymo self-driving vehicle. Now, the vehicle was not transporting anybody at the time, and nobody was hurt. But listen to how The Verge reports it: a person jumped on the hood of a Waymo driverless taxi and smashed its windshield in San Francisco's Chinatown, generating applause, before a crowd formed around the car, covered it in spray paint, broke its windows, and ultimately set it on fire. The fire department arrived minutes later, but by then the flames had fully engulfed the car. Just the idea that somebody was like, I'm gonna smash this car, and then a group of strangers were all like, we like this and we will help you destroy this car, and they all came together and set it on fire.
Like I said at the top of the show, I do enjoy seeing people come together for collective action.
I mean, I'm sorry, this sounds kind of cool, right? And I'm not the only person who thinks this.
Right. Yeah, you'd hate to think that with all these problems that we've been talking about, people feel like their only path of recourse is to go into the street and smash things and burn them. But maybe that's where we are. I mean, they did effectively get that car off the street.
So I should say there is no word on motive, but this does all take place against the backdrop of rising tensions between citizens in California and self-driving car companies. We reported on this awful story a while back, that Cruise, which is Waymo's rival, lost its ability to operate in California after one of its cars hit and dragged a pedestrian. And worse, the company basically lied about it to save their own asses. At every step of the way, they tried to mislead regulators about what happened, including giving them doctored footage of the incident. So yeah, I can understand why citizens are not loving these companies. Even as I was putting together the research for this segment, I found two unrelated stories that happened days apart involving close calls where self-driving cars almost hit children. In one, the car was fully stopped and then started moving when two people had gotten maybe a third of the way across the intersection. The mother says that the car then started to accelerate toward them as if they were not there. Her seven-year-old child had to run and swerve out of the way to avoid being hit. And just the day before, another Cruise incident was caught on camera and posted to Reddit, where a car came barreling toward two women and a child who were fully in a crosswalk. So it is clear that people do not want these cars on their streets. City officials and residents in San Francisco opposed the cars being given a license for twenty-four-seven operation last year, and some residents are even doing a less involved form of protest by putting orange cones on top of the cars' hoods as a kind of, like, less flamey protest.
Less flamey.
It's not good when you have a brand that people hate so much that they spontaneously come together to set your product on fire, right? You would think that that would maybe prompt some reflection, or doing things differently, or maybe just not doing things at all, in this case.
Yeah, but I get it. I mean, why would you want to risk your life and the life of your child so that a self-driving car company can make money that you never get a cut of? When I walk out on my street, I never agreed to be an unpaid, unconsenting crash test dummy for Cruise or Waymo. I don't think anybody agreed to that. So I can understand why residents are fighting back in any way they can against a system that says your life and the life of your child are worth less than our company making money. I can understand why people are angry and frustrated at that dynamic, because it's not one that is working for anybody except these car companies.
I'd be curious to see national poll data about what Americans think about these cars.
Well, I can't give you national poll data about that. But you know what I can give you national poll data about?
What's that, Bridget?
The Taylor Swift psyop campaign that we're all currently witnessing. We have talked about this a little bit on the show before. Taylor Swift is the focal point of a new conspiracy theory. It gets a little bit fuzzy for me once I try to describe what is actually going on here; I kind of lose the thread. But as far as I can tell, it's that Taylor Swift is being artificially pumped up and propped up to help the Democrats, and also vaccines. You have voices like former Republican presidential candidate Vivek Ramaswamy saying this, so this is not just coming from fringe places on the internet. It is being elevated at a much more mainstream level. And now Monmouth has put some numbers behind this. They did a national poll that found that nearly one in five Americans, or eighteen percent, believe that Taylor Swift is part of a covert government effort to re-elect President Biden. Among this group, seventy-one percent identified with the GOP, eighty-three percent are likely to vote for Trump, and seventy-three percent also believe that Biden won twenty twenty by fraud.
One in five Americans believing that Taylor Swift is part of a covert government effort is so many people. Like, we are so screwed here. We're cooked.
We're cooked. This is not good. The data is not good. It's not looking good for humanity.
No.
I will say this, though I don't want to say too much. I don't want to get the Swifties angry at me. I don't need Swifties and Temu coming after me in this episode. However, I am curious how Taylor Swift's recent vibe has impacted this conspiracy theory, because I do feel like something has kind of shifted, where people are turning on Taylor Swift a little bit after her speeches at the Grammys. I don't know, I'm not tapped into the Swiftieverse, so forgive me if I'm speaking out of turn, but I have witnessed that people seem to be turning on her a little bit. And so I am curious how that fits into this larger conspiracy theory that she is being artificially propped up to help Biden win. Like, I remember, I think on the day of the Super Bowl, Trump was on Truth Social basically bad-mouthing Taylor Swift and being like, oh, she's planning on endorsing Biden at the Super Bowl. How would she have even done that? She wasn't performing at the halftime show, she's not a player. Was the plan that she was going to wear a Biden shirt? What was the plan?
It doesn't even need to be connected to reality, right? He just says things. Sometimes they make sense, sometimes they don't, but they definitely get people going, you know. And I'm sure that for people who believe in this conspiracy, this new news cycle where, like you say, people are maybe turning on her a little bit, I'm sure that's only more evidence that makes them believe in it further. That's just how these conspiracy theories work.
Oh, that is absolutely the thing with conspiracy theories. I shouldn't even be trying to ask rational, logical questions about what's going on here, because let's say that Taylor Swift came out and endorsed Trump somehow. That would be used as evidence that, see, they were right all along. It doesn't matter what it is. If you've ever studied how movements like QAnon work, any new data point can be used to confirm their previously held ideas. So, Mike, it's almost not even worth it to try to find the logic in what they're saying or what they think is happening here.
One in five.
I don't even have that much to say about that. It's just so bad.
Yeah, like, what kind of a conversation can we even have with those people? I mean, to be clear, I'm coming from a place where I don't believe that Taylor Swift is part of a covert government psyop, and I'm not really aware of any evidence suggesting that she might be. She seems pretty apolitical most of the time. But yeah, here we are, trying to make sense of it, and it's not a sense-making thing. These eighteen percent of Americans, they just believe it.
Well, even though this is not a sense-making thing, I hope that we have helped people make sense of this wacky and wild news and media and tech climate that we're in. Mike, thank you for being here and helping us do that.
Bridget, thanks for having me. It's always a pleasure.
And thanks to all of you for listening. I will see you on the internet. Got a story about an interesting thing in tech, or just want to say hi? You can reach us at hello at tangoti dot com. You can also find transcripts for today's episode at tangoti dot com. There Are No Girls on the Internet was created by me, Bridget Todd. It's a production of iHeartRadio and Unbossed Creative. Jonathan Strickland is our executive producer. Tari Harrison is our producer and sound engineer. Michael Amato is our contributing producer. I'm your host, Bridget Todd. If you want to help us grow, rate and review us on Apple Podcasts. For more podcasts from iHeartRadio, check out the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.