DISINFORMED: Boys are being radicalized by sexist, racist online networks, and Laura Bates went undercover to expose them

Published Apr 20, 2021, 7:40 PM

Sometimes called "the manosphere," a network of sexist, racist online communities is taking advantage of algorithms to indoctrinate young men into extreme ideology.


Everyday Sexism Project founder Laura Bates researched these spaces for her book Men Who Hate Women: From Incels to Pickup Artists, the Truth About Extreme Misogyny and How It Affects Us All.


Check out Laura’s book: https://www.npr.org/2021/03/13/976379494/manosphere-world-of-incels-exposed-in-laura-bates-book-men-who-hate-women


Learn more about your ad-choices at https://www.iheartpodcastnetwork.com

Just a quick heads up: this episode talks about sexual violence. You're listening to Disinformed, a mini series from There Are No Girls on the Internet. I'm Bridget Todd. When I was first coming of age online, we talked about the internet and the real world as two totally different spheres. The internet was where you went to escape from the real world. They were not connected. But by now it's easy to see that that is no longer the case. What happens online is what happens in the real world, and this is especially true for younger folks. So what happens if this generation is immersed in an internet that's serving them up increasingly extremist content? In 2012, feminist activist Laura Bates created the Everyday Sexism Project, a database cataloging women's experiences with sexism from around the world. After she was targeted by misogynists online, she went undercover to investigate the unseen, organized movement of thousands of men wishing violence and worse on women, the terrorism that we're not really talking about. She found that these men are taking advantage of social media algorithms to recruit and radicalize boys into a world of racism and misogyny, in online spaces that are sometimes called the manosphere. Her book Men Who Hate Women chronicles what she learned about how it operates and why we should all be paying attention. Well, I think, as a woman who writes and speaks about feminism, especially online, I was already aware of these communities. I think that for most women, and particularly feminists, and more so for trans women and for women of color online, these men come to you; they make themselves known to you in online abuse. So I've been aware of them for a long time, these men and members of communities who have been sending me perhaps two hundred rape and death threats a day for the best part of the last ten years. But for a very long time, there was an argument in feminist communities that we shouldn't give them the oxygen of publicity, and that was an argument I was sympathetic to, I largely agreed with. But something changed for me. I do a lot of work in schools with young people, talking and working with young people around sexual consent, healthy relationships, gender stereotypes, and suddenly, after visiting perhaps two schools a week on average for about nine years, in the last two years there was a really marked shift in the responses I was seeing from boys at schools. And don't get me wrong, those conversations have always been awkward, sometimes stilted, sometimes challenging, but they had also generally been very rewarding conversations. And suddenly what I was finding was that boys were showing up with a very determined resistance to engaging at all. They were showing up already with very extreme views about women and about feminism, that I was a feminati bitch coming in to tell them why I hated men, and they were showing up with misconceptions, really believing very deeply that there is a huge feminist conspiracy at the heart of our government undermining white men, who are the real victims of today's society, quoting the same false statistics about false rape allegations being rife, about men being the majority of victims of domestic abuse. And at that point I suddenly joined the dots and realized that these boys were being radicalized by exposure to the extremist online communities that I'd been aware of as a feminist for a long time.
At that point, I thought, if these groups are managing to reach teenage boys in a quite scary and pervasive way with their online propaganda, then actually perhaps not giving them the oxygen of publicity isn't working. Perhaps it's actually allowing them to operate under the radar and to have a quite major and worrying impact without parents or educators or wider society having much idea that these communities exist at all. And that was what really pushed me into wanting to write the book, because I felt that suddenly these communities were posing a very real offline threat in terms of that grooming and radicalization, which of course we'd never use those terms to describe when it's white men doing it, but also in terms of, of course, their attacks, the fact that these men were increasingly carrying out violent attacks in real life, and yet many people didn't know that this was a form of extremism that even existed. Yeah, that is so true. It's actually one of the things that prompted me to start this podcast. You know, there had been a mass shooting in the United States, and I thought, just like in many of the cases of mass violence, this person who was the perpetrator had a history of domestic violence allegations, had a history of women speaking up and saying, hey, this is someone who's been harassing me. But of course we know those kinds of warning signs are often ignored because it's women speaking up. And I thought, wow, if only we took seriously the experiences folks were having online and really thought about that seriously, some of these mass incidents of violence could be prevented. And so I'm happy to see you digging into that work. I'm wondering, could you tell me a bit more about some of the ways that you've seen young men be targeted, radicalized, and sort of sucked into these online spaces that teach them some pretty toxic stuff? Yeah, absolutely. It's very sophisticated. They have extremely well honed radicalization techniques, and they're very open about it. So these men will share their recruitment strategies with each other quite openly in their online communities. They talk specifically about using memes, using cultural delivery methods, cultural touch points, funny Instagram memes, viral YouTube videos, taking advantage of algorithms on platforms like YouTube that will automatically direct young people to increasingly extreme content, and they describe these delivery systems, in their own words, as adding cherry flavor to children's medicine. They deliberately target boys as young as eleven, and they don't wait for boys to come to them in their communities; that's quite a common misconception. They find boys where boys are. So they are in online gaming, they're in gaming strategy chat rooms, they're showing up a lot on bodybuilding forums, which is a very clever space for them to operate in, because of course there they have a ready made group of young men already sort of self-selecting as having preoccupations with traditionally masculine ways of showing up in the world and portraying themselves. So they are very clever about finding these spaces where boys might be isolated and vulnerable online, and quite mainstream spaces, and then they groom them very carefully and very gradually.
So at first it's just a few misogynistic jokes here and there, and then if a boy seems to respond to that, then maybe it's a link to a pickup artist website or to a YouTube video, and maybe from there it kind of gets very gradually further down the rabbit hole. And they very explicitly see recruiting boys into anti-feminism and misogyny as a gateway to white supremacy and neo-Nazism, so they are seeing it as a kind of pathway for radicalization. And some really interesting studies have been done that really back that up: that many men who end up playing pivotal roles in the white supremacist movement, for example, are men who made their name and created a following and cultivated particular skills of online abuse in anti-feminist movements like fathers' rights activism or Gamergate. I'm not surprised to hear you say this at all. I know, after the insurrection at the Capitol in January, so many of the men who were there and involved were really well known to people. You know, they had these big online personas, oftentimes live streaming what they did, and a lot of them had prior allegations of domestic violence and harassment of women against them. And so I do think it's difficult, but important, to demonstrate the ways that white supremacy and misogyny are sort of, it's like that Venn diagram that's just a circle, you know. It's one and the same, and they're very much linked. Absolutely. This came out so clearly in researching the book. There's this public perception that we're talking about two separate groups, two separate problems, but in every way, when you research this, you realize that that's not right, that this is part of the same whole. So you're seeing, for example, the lexicon, the kind of unique language of the manosphere and male supremacist communities, showing up very much as part of the language of white supremacy, of alt-right and far-right communities. You can see it there in the language, but you can see it also in the fundamental roots of their ideology. Incels aren't just furious that women aren't sleeping with them; they're furious if women are sleeping with non-white men instead of them. That's something that absolutely outrages them, and that's a repeated focus that they come back to. And white supremacists, of course, are also obsessed with misogynistic issues. You know, they're obsessed with the idea of birthrates, with the idea of so-called replacement theory. They obsess over keeping white women in sexual slavery to breed a so-called master race. They're obsessed with the forced sterilization of women of color. They talk about, for example, the idea of white women as a kind of dehumanized, fragile commodity that they see as being plundered by invading immigrant men. So very clearly, if you can look at that and not see misogyny, then you're only seeing half of the picture. In the same way, if you look at what's going on with incels and the manosphere and don't recognize the racism inherent in that, then it really holds us back, I think, from meaningfully tackling the problem. Yeah, I would love to talk a little bit more about that.
You mentioned earlier that platforms like YouTube are sometimes serving up these ideas through their algorithm or through their recommended videos. I know, through the research I've done for this show, that Facebook's own internal reports show that most of the time, I think it was eighty-six percent of the time, when somebody joined an extremist group, it's because Facebook's algorithm had recommended it to them. I guess my question to you is, what do you think is the responsibility of social media platforms, platforms like YouTube? What role should they be playing in curbing this kind of radicalization of men? Well, I think first of all, they, and we, have to recognize their influence, because I think adults think of YouTube in particular as the kind of benign home of movie trailers and grumpy cat videos, and there's a real lack of awareness that it has this pivotal role, that most US teens use YouTube, that they say explicitly that it's where they're getting their news from as well, and that YouTube has this massive kind of stranglehold, if you like. There are one point five billion YouTube users in the world, so that's more than the number of households that own televisions, and YouTube accounts for thirty-seven percent of mobile internet traffic internationally, and seventy percent of the YouTube videos that are watched are recommended to users by the algorithm. So if you put those two statistics together, you realize that a quarter of all international mobile traffic is accounted for by this one algorithm, designed by a workforce that's sixty percent male and five percent black, telling people what to watch. And so suddenly that becomes a power that should be interrogated, that should be held to account, that should be accountable for the impact it's having. And what I find really frustrating is that there tends to be this tendency from social media companies to go, hey, this is so difficult, you know, this is a huge thing, there's not much we can do, the algorithm wasn't built to do this. And of course I'm sure that's true, but I don't think it's a coincidence either that it was built by a community of people that is not particularly diverse and that didn't foresee these issues happening. The argument that we can't hold them accountable and that there's simply nothing they can do because it's too difficult is absolute nonsense. When you're talking about companies that have an income that's bigger than the income of some small countries, of course they have the money to tackle this. If they really cared, if it was a priority, they could hire twenty thousand human moderators and upskill them and train them in coordination with the specific community groups who are affected by these issues. There's a lack of will there. We have to acknowledge that that's what's going on. Yeah, something that you brought up that I really harp on a lot, and I think it's complicated, but I think it's really important to understand. I talk a lot on this show about the importance of having inclusive teams in tech spaces, and why underrepresented folks, so women, trans folks, queer folks, people of color, black folks, all need to feel like tech is our domain, like we need to be able to take up space in those areas, not just because it's the right thing to do, or because, you know, I'm a black woman and I want to be in these spaces, but because of exactly what you just described.
When you don't have inclusive teams building the tools that really do run our world and really do control how and what we, you know, come across in terms of the media landscape, that is when things get missed, right? And so if you have a team that is building an algorithm that is going to control so much of what we consume, and that team is mostly white and mostly male, it's not surprising to me that these things go on to create unforeseen harms that this mostly homogeneous team did not account for. And so, you know, I think it's critical that underrepresented voices and people and identities are seen in these spaces. It's critical to, I think, get to a place where tech is not causing so much harm, because right now I feel like it's a bit out of control. And I think that sometimes that can be kind of a complicated chain of things to sell someone on. Absolutely, and I really think that that lack of representation is deliberately and actively exacerbated by the lack of transparency and the lack of accountability that these platforms have in dealing with harassment. Because if you have a company, and this is absolutely the M.O. of most of the major social media platforms, instead of having a clear strategy on harassment and on these issues that works for everybody, what they do is they judge each case separately. And what that means is that when a case hits the media and hits the headlines, they take action, because they want to clear it up, they want to protect their reputation. So that means that the cases that get acted on are dependent on media attention. And we know that media attention is likely to be bestowed on a young, privileged, white, attractive, non-disabled, cisgender woman who has a problem. So you get someone like Taylor Swift, she's being abused online, and suddenly Instagram is taking action. It's announcing new policies, it's announcing tools all over the place to protect her and to deal with it. But if you have a woman whose profile is less likely to be picked up by our racist and misogynistic media, that means that Facebook, or whoever it is, is not going to get a call from a journalist saying, what are you doing about the harassment of this particular woman? And that means that the voices that we most desperately need in those spaces are the ones being driven offline and being driven out of those spaces. So it's such a vicious circle, and it really is kind of self-perpetuating, I think. Let's take a quick break. Basically, the only way to get social media companies to take it seriously when you're being harassed online is to be famous enough for it to generate negative media attention. It's clear to me that tech companies are not optimizing their platforms to be safer from the jump. Instead, they only take action when they're reacting to bad press. I wonder if it's because they are optimizing their platforms for harassers, if they are optimizing their platforms because it works in their favor that these vast networks of men who are spending a huge amount of their time online are staying on those platforms. I do wonder if there is a deliberate decision going on here about which groups of users they're protecting, and at what cost to other groups. Have you ever felt like negative, inflammatory, or polarizing content gets more attention on your social media feed? It might not just be in your head.
According to "Angry by Design: Toxic Communication and Technical Architectures," published in Humanities and Social Sciences Communications, communication platforms like YouTube and Facebook privilege incendiary content, setting up a stimulus-response loop that promotes outrage expression. So rather than showing us positive content, algorithms keep us angry and on edge by surfacing a steady diet of outrage. These algorithms are optimizing for content that gets a lot of clicks and attention, and that content happens to be very negative. And so part of me always wonders, you know, is that why I see the tweets that are inflammatory or super negative or super, you know, partisan or whatever? Is the reason I see those more because they just get more attention, and therefore these algorithms are always going to privilege that kind of content? And, you know, part of me feels like, well, maybe it's just the platforms showing us what we, as, you know, messy humans, just want to see, right? Like, oh, I'm not interested in the positive content, I want to see the negative content. But even still, it's like, is that an online space where I want to be, where folks want to be? A space that often serves you content that is really negative rather than content that makes you feel good? It just, I don't know, it makes me wonder why I personally spend time in spaces that make me feel so miserable sometimes. Yeah, absolutely. And that was the experience of so many of the women I interviewed for the book, and in many cases the choice eventually, for their own mental health, was that they removed themselves from those spaces. And I think it's really interesting that the average white man's experience of social media is not necessarily a net negative in the same way. There have been so many examples where white men versus their female colleagues or their colleagues of color have created similar accounts and have released similar kinds of information or opinions, and the results, in terms of the responses that they've received, have been so dramatically different. And I think it's really interesting to think as well about how all of this intersects with a kind of capitalist urge for social media to be selling us things, because we know absolutely from former engineers at YouTube, Guillaume Chaslot is a good example of this, who I quote in the book, he's a guy who comes forward and says, look, the YouTube algorithm is not about optimizing for you to see the highest quality content or the most relevant content. It's about whatever will keep you watching for the longest, because all that matters to the algorithm is getting you to see as many adverts as possible, essentially. And we know that the things that keep people watching for the longest are increasingly extreme content. And as many researchers have pointed out, that's not such a big deal if you're looking for a video about, you know, cooking, and then suddenly you're being taken to an all-you-can-eat contest video. But it does become serious if you're looking at something about women and suddenly you're being told that the gender pay gap is a myth, and from there that actually women are lying bitches who make up rape allegations, and from there that maybe it would be better if women were forced into sexual slavery. Then suddenly it becomes pretty important that that's what's going on.
You know, you mentioned some of the experiences the women that you interviewed for your book shared with you, that eventually they just leave these spaces, they leave social media. We talk so much about things like deplatforming and people being kicked off social media, or, you know, quote, cancel culture, and we never really talk about the issues of the women and the voices who are just kept from speaking, just kept from putting their opinions out there, who are pushed off of these spaces because they are no longer safe. I feel that that often gets missed in the conversation. Like, isn't that a form of deplatforming? Isn't that a speech issue, that these women are functionally unable to, you know, express themselves? It's ironic that the freedom of speech debate is usually dominated by people who are claiming to have been silenced, often in major media platforms, writing and speaking out about their silencing, usually white men speaking out about how white men can't say anything anymore, even as they're saying it in every paper and being picked up by those same algorithms. But we don't tend to think about freedom of speech in the negative, because the people who don't get to speak aren't there to tell us that they've lost their voices in the first place. And I think there's a risk for our political discourse as well, that we have a generation of young people who are honing their political skills online. They're often described as a politically disengaged or apathetic generation, but actually I think it's a generation who are just politically active in very new and different ways, and a lot of that involves technology. So these are young people who are debating on Twitter, who are cutting their teeth in political debate online. And if those online platforms are unsafe for women and girls, for various minoritized communities, then we are kind of weeding them out from the public discourse before they ever get to the point where they might reach those public spaces later on, where suddenly we'll really feel the loss of them. And it might be that we don't realize that that's happening until it's too late. That's such a good point. I hadn't even really thought about that. You know, what does a landscape look like when a big section of that generation just goes quiet, goes silent? That's a real problem. More after a quick break. Let's get right back into it. I got my first computer when I was in junior high, but most younger folks have lived their entire lives online, in ways that I could never imagine when I first got online. So if the internet is the backdrop upon which we're all living our lives, what happens when that backdrop is really harmful or toxic? What's it doing to a generation of younger folks who are coming up completely immersed in it? And we've never really seen it, because we are kind of hovering on the edge of the first real generation coming of age who have lived their whole lives on social media. We have this kind of unique political moment that never really gets picked up on.
I find it wild that we're living at this moment in history that has never happened before and will never happen again, and yet it's never really discussed, where a generation of non digital natives is parenting and educating a generation of digital natives, and there is this chasm there, this huge gap in culture and understanding of what their world is like, what the day-to-day landscape of their online world is like, amongst parents who grew up in a pre-internet age. And that generation will be coming of age, and what will that look like when our political representatives are being drawn from a generation who have literally grown up with the sheer daily bombardment of racism and misogyny and transphobia that comes with living your life on the internet in the way that young people do today? And I don't think we'll know what that looks like until we get there, and I think when we do, it will be a shock. Wow. Yeah, I hadn't even thought about that, but you know, it's true, and I do think about this a lot. I don't know if you're a parent. I don't have children, but I have a lot of young people in my life, and I think about how hard it must be, or how different it must be, to be wading through this very complex online experience that young people today have to wade through. And, as you said, that experience is often racially charged, politically charged, and I think it very much overlaps with real life. You know, when I was coming up online, people talked about your online world and your real world as separate things. Today it's very clear that those overlap. Things that you say online get you fired from your job or in trouble at school, and they're largely wading through it with parents who have not had that experience. You're exactly right. I hadn't even thought about that. And I think we see it in the way that we report on things that happen online; we don't take them seriously, we don't look seriously at them, and, as you were saying at the beginning of our interview, that lack of scrutiny allows for these things to just go unchecked. And so, you know, if you're a reporter who spent all this time being like, oh, kids on the internet, what do they do, take selfies? Just trivializing it when it really becomes an issue that we should pay attention to, you know, it can be a problem. Absolutely. One of the experts I interviewed for the book, Dr. Carlene Firmin, said that one of the things that she really sees in her work with young people around sexual violence is that it's extremely rare for her to come across a case of sexual violence that doesn't have an online component, and that adults find it difficult to recognize that connection, but for young people, there is no meaningful distinction between their real lives and their online lives. And she said that the normalization of violence in the online world is having a massive impact on young people's perceptions of violence offline. And that's very much something that I see in my work as well. When I'm working with young people, it's really common to hear them say things like, rape is a compliment, really, or, it's not rape if she enjoys it. You hear young people repeating stuff that they've seen in online porn and assuming that that's how they'll be called upon to behave in a sexual relationship.
So, for example, I've visited a school where they'd had a rape case involving a fourteen-year-old boy, and a teacher had said, why didn't you stop when she was crying? And he had said, because it's normal for girls to cry during sex, because that's what he'd seen online. And so much of what very mainstream, readily available online pornography is showing kids is that sex, apparently, is something very violent and aggressive that's done by men to women. Again, it's racialized; there are these fetishized racial stereotypes there. It very much suggests that it's normal for women to be hurt, humiliated, and degraded during sex. And so I'm seeing young people who've experienced quite extreme sexual abuse and wouldn't recognize that what's happened to them is wrong, or that they would ever have the right to tell someone, or that support would be available, because it's so normalized by what they're seeing online. And there's a real barrier to dealing with that when, for the majority of the parents I speak to, their conception of what online porn looks like is a kind of Playboy centerfold, or FHM photos, but you see them on a website. There's very little recognition of what the reality of that landscape looks like for the teenagers who are immersed in it on a daily basis. Wow. Yeah, I mean, it's got to be so incredibly difficult to navigate as a young person. Talking to your parents about porn and sex is already so hard, and then to have that kind of extreme barrier in terms of your parents' ability to even understand what you're experiencing, that's got to be so hard. I say this in so many different ways, but I don't envy kids today who have to navigate this largely on their own. It must be very complicated. Yeah, me neither. And then when you add into that mix the fact that you've got this generation of teenage boys being actively groomed and sought after by these extremist groups, really training them and radicalizing them to dehumanize their female peers, to objectify their female peers, then that kind of almost sets a match to the whole tinder pile, and up it goes. In Atlanta last month, eight people were murdered by a gunman at three spas. Initially, Sheriff's Captain Jay Baker parroted the perpetrator's own words that his heinous crimes were not motivated by race. But six of his victims are Asian women, whom he conflated with sexual temptation. So why the rush to not describe his actions as a racially motivated and gendered act of terror against Asian women? Well, the investigators, they interviewed him this morning, and they got that impression that, yes, he understood the gravity of it, and he was pretty much fed up and been kind of at the end of his rope. And yesterday was a really bad day for him, and this is what he did. I'm not going to get into, I don't know whether there was remorse or not. We see this time and time again. Yet when we talk about these issues, like in Atlanta, that awful sheriff said, oh, well, this guy had a bad day, he said he was a sex addict, blah blah blah. I've never seen a situation where they just take the perpetrator at their word, like, oh, not motivated by race, checks out. You know, I wonder, why do you think it is so difficult for people to recognize this as extremism, or even in some cases, like, terrorism? We almost never describe the kind of havoc that extremist views on women and folks of color wreak
domestically in the United States; we almost never talk about that as terrorism. Like, why do you think that is? I think as a society we are very comfortable with homogenizing groups other than white men, but we tend to afford white men the dignity of discrete and unique identities. And so when a white man commits a mass atrocity, because he doesn't fit the stereotype our society has of a terrorist, we look for other reasons, and we dig into mental health, and we see these kind of almost glowing photographs of him as a cherubic child, and we get a quote from his neighbor to say that he was a really nice guy who helped out with DIY around the neighborhood, and something from his granny to say how sweet he was as a boy. And it's almost like, as a society, we can't compute the fact that this could have happened, because we're not able to recognize white men as a group who can be radicalized and who can carry out extremist acts. Even when we reach a situation where they actually become the majority of perpetrators of mass shootings, we still fail to recognize it. And I think partly it's an inability to see those white men in that way when the media reports on them in such a forgiving and normalizing and excusing way. But it's also about who the victims are, and a society that is used to seeing those folks as victims as well. In the case of men who have gone out and massacred women because they are women, for example, we haven't called these men terrorists. We never called Elliot Rodger a terrorist. Alek Minassian, the Toronto van attacker who murdered ten people and injured sixteen, the majority of the people he murdered being women, explicitly told police, I've murdered these women because I hate women, because women won't have sex with me. And the police came out and gave a press conference and said there's no evidence of terrorism here, the city is safe. And I think in part it's because brutality against people of color and violence against women are the wallpaper of our daily lives. These are things that have become almost normal. If one in three women on the planet is raped or beaten in her lifetime, then a man violently murdering women isn't exceptional, and so we struggle to recognize it as terrorism. And I think the police and the criminal justice system are at fault here. The media is at fault here. The way that all of us talk about these crimes and about these criminals is at fault. And unless we start to recognize these particular forms of terrorism and extremism, it's very difficult to see how we'll move forward, because right now, because we don't call those men terrorists, we don't look at their often online journeys that have filled them with hate as forms of radicalization. And if we don't do that, it means that nobody is looking at the grooming of the next generation of these attackers. Nobody is looking at boys. When I was researching the book, I rang up a lot of counter-extremism and counter-terror organizations globally, and I was really shocked that when I used the word incel, typically there would be a long silence on the other end of the phone, and then somebody would say, can you spell that for me? There was a US Government Accountability Office report which was literally looking at the U.S.
government's response to forms of terrorism, and in the period of the report, there were three massacres carried out by male supremacists, by people with incel backgrounds online, and they weren't included in the report. But over this ten-year period they were tracing and carefully tracking people with extreme views on federal ownership of public land or extreme views on the environment, and shockingly, in the period of the report, no one was killed by people with those views. So we're looking at the wrong people. And you know, you look at someone like Donald Trump describing Black Lives Matter activists as extremists, as terrorists, and you realize it matters if the people who are choosing who is designated as a terrorist or an extremist aren't representative of the communities that they're serving and have the ability to carry deep prejudices with them. So I guess it also comes back to that issue we discussed earlier about representation in the people who are policing and governing our countries and how we respond to this threat. You know, the people who are talking about it, who are investigating it, who are, you know, mired in it, they also have things they can't see. We all have biases, and so I think it's critical that we think about inclusion in terms of this. If we don't think about it and aren't able to talk about it, it will hurt us all as a society. I completely agree. And just as a side note, you know, your work with Everyday Sexism, creating a database for women to talk about their experiences with sexism and misogyny online, even things like that, where you're building these spaces and these platforms where you can sort of flip the script a little bit and say, hey, we're here, we have experiences that we want to talk about online. You might think that the online experience is just, you know, dominated by men, but actually here is a slice of what we deal with or what we experience. I think doing that kind of archival, chronicling, storytelling work is also very important in terms of getting us to a culture shift where we think of these spaces as, you know, places where we are and places where we're showing up and taking up space. If that makes sense. Absolutely, I hope so. I hope the internet is in a transitional phase. You know, I feel like the internet is in its infancy. It's basically still teething. These are teething problems. It's so new, relatively speaking. Surely we can't carry on like this. Surely it can't be forever a space where women, where people of color, where trans folks have to practically brace themselves to step in. You know, that can't be sustainable in the long run. I really hope and believe that this can't just be the way that things are. This has to be a transitional period. I hope you're right. I hope you're right. I want to take a moment to thank all of y'all for the reviews that many of you have left. They really do help the show grow, and yes, I personally read each and every one. I wanted to highlight one particular review because it really stuck with me. This is from listener Lisa Hazn: I've been listening for a while but hadn't bothered rating the show, and noticed the weirdly high proportion of one-star reviews. Scrolled through, thinking I couldn't believe anything shocks me anymore, but did some guy really just say a black woman's podcast is as bad as Jim Crow?
How does someone go to the trouble to find the show, leave a rating, type that out, hit send, and never have a single moment of self-awareness? Joke's on them, though; people like that are just enthusiastically proving the point that this podcast is necessary and that more people with diverse perspectives need to have platforms. And Lisa, I just have to tell you, this review made me legit laugh out loud. In my first few weeks of making this podcast, it's true, someone did leave a review comparing the podcast to Jim Crow, which I gotta say feels a little bit harsh. And Lisa, your review reminded me of how hard it was to keep pushing through even when vocal critics like that left that kind of feedback. But reading your review today, you know, fifty episodes in, just made me really happy that I stuck with it. And I wanted to share this for anybody who's maybe thinking about starting a podcast of their own, or making anything at all, really: just ignore the trolls, ignore the haters, ignore that discouraging little voice in your head that tells you to stop, and just keep going. So, Lisa, thank you for the reminder. If you enjoyed this podcast, please help us grow by subscribing. Got a story about an interesting thing in tech, or just want to say hi? We'd love to hear from you at hello at tangoti dot com. Disinformed is brought to you by There Are No Girls on the Internet. It's a production of iHeartRadio and Unbossed Creative. Jonathan Strickland is our executive producer. Tari Harrison is our supervising producer and engineer. Michael Amato is our contributing producer. I'm your host, Bridget Todd. For more great podcasts, check out the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
