It’s a new year, which is a great opportunity to take stock of what we’re leaving behind in tech.
Bridget joined Sam and Anney over at Stuff Mom Never Told You to get into what’s not coming with us in 2024.
There Are No Girls on the Internet is a production of iHeartRadio and Unbossed Creative. I'm Bridget Todd, and this is There Are No Girls on the Internet. Happy New Year. Now that the holidays are officially over, I'm looking at what's in and what's out in tech for twenty twenty four. And to do that, I joined my friends Samantha and Anney over at the podcast Stuff Mom Never Told You to run through what we should be leaving in twenty twenty three. I gotta ask: what tech things are you leaving in twenty twenty three? After you listen to the episode, be sure to let me know.
For the topic today that you brought us, you're talking about some of the things in the tech world that we should leave behind.
Or won't, exactly?
Yeah?
You all know that cartoon that you might see on Twitter a lot, where it's like a woman stepping from one year to the next, and behind her she's leaving all of this baggage from the previous year, like fake friends or heartbreak, and then in a bag slung over her shoulder, she's carrying what she's bringing into the next year, like love or focus. Do you know the image that I'm talking about?
I don't.
I had never seen it, but I'm looking at it now.
Yes. So, a bit about that image.
It was created by a British-Ghanaian graphic artist named Peniel Enchill, and it first appeared on the internet in twenty fourteen. BuzzFeed has a great interview with her. But basically, this image has been totally memified, and it pops up around this time of year every year to fit the theme of, like, new year, new me. Sometimes it's a joke, and the artist says that she doesn't always agree with or like the way that people do their spins on it. But it's really become part of the shorthand of this time of year, of what we're leaving behind and what we're taking with us into the new year. So when that picture was making its rounds recently, I had just wrapped up podcasting for my own podcast, There Are No Girls on the Internet, talking about technology and identity, and really asking questions about what we get right and what we get wrong in those arenas. So this image got me thinking: what tech things should just stay in twenty twenty three, that we should not be bringing with us into twenty twenty four? So I've got my list of things that should stay. We're done with them; we should not be bringing them with us into the new year.
Yeah. And I was thinking about this too. This has been a big year for tech. It has felt very monumental, like a lot of things are changing, and still, looking at this list, I was like, oh yeah, that did all happen this year. It can feel strange because time feels so strange now. So what is on your list? Let's get into it.
So, as you said, it's been a big year for conversations around technology, and I would say there is probably not a bigger conversation right now than the one happening around AI. We're all, rightly, talking about how AI could change the way that we do our jobs or the way that artistic projects get created, and those are all important conversations to be having. But there is one very serious conversation about the reality of AI, and that is how AI is being used to do things like violate consent, and further use technology to make it seem like our bodies are just up for grabs once we show up online.
So, case in point: nudify apps.
So nudify apps are kind of the catch-all term for apps or platforms that promise to use AI to generate non-consensual nude or semi-nude images of anyone. These kinds of apps have been exploding in popularity. In September alone, twenty-four million people visited nudify apps or undressing websites, according to the social network analysis company Graphika. In their analysis, they really talk about how this year specifically, we've seen this shift where nudify apps went from a niche, underground, custom thing, where if you were a singular creep with a singular fixation on one person, you could find a marketplace for non-consensual images of somebody to be made, but it was a niche, custom thing. Now, in twenty twenty three, we have really seen that shift to where these are fleshed-out, monetized online businesses, complete with advertising apparatuses. Graphika found that the volume of referral link spam for these kinds of services has increased by more than two thousand percent on social media platforms like Reddit and Twitter since the beginning of twenty twenty three, and a set of fifty-two Telegram groups used to access non-consensual image sites like these contained at least one million users as of September of twenty twenty three. So they have really exploded in popularity, yet I think we've only recently come to have any kind of conversation about what that means, not just for the women who are overwhelmingly targeted by these kinds of apps, but also for our digital landscape more broadly.
Okay, I'm not gonna lie.
None of this is familiar to me as many social media things as I'm involved in.
Didn't know this was a thing. Didn't know.
It's pretty despicable. It first got on my radar when it seemed to be rolled out to more mainstream platforms with celebrities, so it would be like, oh, you can get a nude of any celebrity. And now it's not just celebrities; you can generate a non-consensual nude of anybody. And one of the ways that this is becoming more and more ubiquitous is just seeing the advertisements for it on social media. After some journalists at Bloomberg were looking into the popularity of these kinds of apps, they contacted both TikTok and Facebook to ask them about it, you know: is it okay that these really gross apps are being allowed to advertise on your platforms? And both Facebook and TikTok, I will say, to their credit, when Bloomberg reached out to them, did some work to block search terms related to nudify apps. But one social media platform notably did not do that. Can you guess what platform that was?
Probably Twitter? Obviously. I'm not going to name names.

No, let's still call it Twitter. So it was Twitter. They were like, we're actually fine with these nudify apps. Like, we don't see a problem.
So this is from Vice.
Searching the word undress on TikTok brings up no results in either the top or video tabs. Instead, the platform warns users that the phrase may be associated with activity that violates the platform's guidelines. Searching the same term on Instagram similarly brings up no results. Searching undress on Twitter, however, readily surfaces a verified account with nearly twenty thousand followers promoting nudify app services. So let's say that you were to search Twitter for a misspelling of the word undress. Twitter is actually like, wait, did you mean to search for undress? And it prompts you to search for undress instead. So where other platforms are like, no, we can't have that on our platforms, Twitter is not just allowing it on the platform, but helping people search for it when they get it wrong. And if you ever see nudify apps advertised on these platforms, sometimes a word is misspelled, or there's a space between some of the letters, to try to evade being picked up and knocked off the platform.
But generally, it does not seem like Twitter has a problem with these kinds of services being advertised on its platform.
And I feel like this is a big question with legalities, especially on public platforms like that in general. But like, I'm talking about making nudes and such. It seems like it should be illegal. It seems like this could be one of those things, especially if you think about revenge porn and all of that being illegal in so many places. Like, is that not a thing? Can you not take it to court, or at least try to stop these images from happening?
So that is a great question.
When I first saw these, I was like, this has got to be illegal. And it turns out that right now, depending on where you live, it is probably not illegal to do this. It depends on your jurisdiction. But there is currently no federal law criminalizing using AI to generate non-consensual deepfake images. Representative Yvette Clarke out of New York has actually proposed legislation that would criminalize making non-consensual deepfakes, but as of right now, kind of unbelievably, there is not any federal legislation that prevents somebody from doing that. And yeah, I just think that as this kind of technology becomes more ubiquitous in our culture, it adds to this idea that just by showing up on the Internet, women are fair game for anybody who wants to sexualize us. And it's getting to a point where the idea is, well, if you didn't want someone to use your images in that way, why did you post them to Instagram? Right?
Like?
And I think that we really have got to have a serious think and a serious conversation about whether that is the social media climate we want. And I think that that kind of thinking is the kind of thinking that we need to leave behind in twenty twenty three, right? Like, women should be able to show up online without non-consensual sexualization just being the cost of showing up. So let's leave that behind in twenty twenty three. We are not bringing it with us.

Please, please. Yeah. And that's one of the very frustrating things, since I did see a lot about this, and we even had an episode, Bridget, about journalists getting targeted by this stuff, yeah, and revenge porn and things like that. And this is just making it so much easier, and it's harder for some people, for all of us, to ascertain what is real and what is not real. But it does disproportionately impact women and marginalized folks. And that's just one example of technology doing that that you have on your list.
Right, yes. And just like what you said, I think there are specific groups that we see being targeted for this kind of harm first. And when it's allowed to happen to those groups, nobody really does anything. And then it's like, oh, surprise, now it's everybody's problem. Now we just live in a society where this is commonplace. And maybe, when this was happening to specific groups of people, if you had done something and taken it seriously, then we wouldn't have allowed it to become ubiquitous. Right? And I don't think anybody wants to live in a culture where anybody is fair game, just because they put a picture of themselves on the Internet, to have that picture be distorted and sexualized.
So absolutely right.
So that brings me to another thing that we should not be taking with us into twenty twenty four, and that is content moderation policies that really hurt women and other marginalized people.
So, as you were talking about with AI: right now, AI is used in content moderation, and it really does a lot of the work of deciding what gets amplified and what gets suppressed on social media. This technology, however, also objectifies women's bodies. It's much more likely to flag images that involve or include women as racy or inappropriate, and thus those images are more likely to be suppressed on social media sites. The Guardian actually put together a really interesting investigation into this, where they had journalists use AI tools to analyze hundreds of photos of men and women in their underwear, working out, undergoing medical tests, and with partial nudity, and found that the AI tools will tag photos of women in everyday situations as sexually suggestive. I'm talking about images of women in their underwear fully covered, or women working out at the gym fully covered, right? Images that we would recognize as not racy, but because they include women, the tools being used to make decisions about how content is moderated will be like, no, that's racy, can't have that on social media.
And so, as a result of this, social media companies that leverage these algorithms have suppressed honestly countless images featuring women's bodies. We know that this hurts women-led businesses; I was reading about how a shapewear company essentially can't advertise on social media, because images of women's bodies fully covered in shapewear will just be suppressed by these algorithms, and they essentially cannot advertise their product on social media, which is where you advertise products in twenty twenty three. And not only does it hurt female-led businesses; it also has medical impacts. They found that this disparity is also true for medical images. The AI tools were shown images from the US National Cancer Institute demonstrating how to do a clinical breast exam. Google's AI gave this photo the highest score for raciness. Microsoft's AI was eighty-two percent confident that these images of women doing breast exams were sexually explicit in nature, and Amazon classified them as representing explicit nudity. This is also true for pregnant bellies. If you are heavily pregnant and showing a pregnant belly on social media platforms, AI is much more likely to deem that image to be racy and then suppress it with their content moderation tools.
And so you really get a sense of the way that these platforms are creating a disproportionate cost for being a woman who shows up online. It prevents women from being able to express themselves. It prevents women from being able to get medical information about our bodies. And ultimately, it's just not fair. We shouldn't have to deal with this just because we showed up on a social media platform with our bodies. There's nothing wrong with women's bodies; they're not racy or explicit just by us having them.

Right, right. And I believe Sam and I talked about this with YouTube, because YouTube had a similar strange thing happening, where it was flagging videos of children, of young girls, as, like, this is so sexual, and it's literally just young girls. And it's telling about what is going on in our society, the problems of objectifying and sexualizing women. But also, this is a big thing that we've talked about with you, Bridget: it matters who's making these things, who's doing this stuff. And AI is a pretty new space, but I've already seen a lot of conversations about the importance of who is working on it. So that's part of what is going on here, right?
So The Guardian spoke to Margaret Mitchell of the AI company Hugging Face, who said that she believes the photos used to train these algorithms were probably labeled by straight men, who might associate men working out with fitness but consider an image of a woman working out as racy or sexual or explicit, even though it's the same theme, the same content. And so it's possible that these ratings seem gender-biased in the US and in Europe because the labelers might be from a place with a more conservative culture. And so, yeah, it really matters who is building technology, who's in the room when the technology gets built, who is training it, who is rolling it out, who is thinking about it, who's writing about it, who's talking about it. If these people are mostly men, then of course women and other marginalized people are not showing up in an equitable way. With all the conversations that we have around things like inclusion and diversity and equity in tech, I'm not harping on those just because it's the nice thing or the right thing to do, which it is; it's because otherwise the technology that gets built is worse, is less inclusive, is more dangerous, includes fewer people.
And that ends up hurting all of us. Yes, it does.
And speaking of that, you have another point on here, going back to something we were talking about earlier about women in journalism.

Yes, yes, I'm so glad, that was a great transition. You know, one of the reasons why I started my own podcast about this is that we unfortunately have a tech culture that can treat women like perpetual outsiders, whether by accident or with intention, and that is something that we've got to leave in twenty twenty three. Really, I have left it in many, many years past, but this is the time that we should be leaving it in the past, because, exactly as you said, it is so critical. These tools are going to be shaping our world and how we understand our world, so we've got to make sure that the people who are publicly talking about it and are included in that conversation are not treated like perpetual outsiders. And so, back to that AI company I mentioned before, Hugging Face, which you can sort of think of as a competitor to OpenAI, the company that makes ChatGPT, run by that guy Sam Altman.
Hugging Face has a lot of women who work there, and these women do a ton of public speaking about tech and AI in the media, which is great. And I also think it's important because it can help to change the face of who we think of as somebody who gets to speak, or gets to have an opinion, when it comes to technology. And that's great.
However, the women at Hugging Face also noticed that when they were doing public speaking about AI with the press, they would sometimes get sexist or otherwise kind of messed-up questions during interviews, questions that really highlighted that they were not necessarily being treated like people who belonged in the space. Margaret Mitchell, Hugging Face's chief ethics scientist, framed it as a research question. She asked: what are the patterns in how journalists talk to and about women in AI? She found that, compared to male peers, there is a disproportionate focus from the press on their ages, their motherhood, their physical appearance or behaviors, their failures, and what AI gossip they could provide, rather than their technical work. And these are people who are incredibly accomplished. They're doing very important work that they've gone to school for and been trained for. If, in an article, you get to interview them, and you ask about their age, or you ask if they have kids, or you ask about the way they look, it's so limiting. You have somebody who has dedicated their life to this very important technology, and this is what you ask them? It's such a miss on the part of the journalists.
It's like, I thought by now, at this point in AI, there would be more things to talk about. There are so many questions that we should be asking: things like, how are sexist issues being handled in AI? How are you protecting women in AI? And not about the individual person, like, so you got a kid, you gotta be on the computer while you have a kid, how are you gonna do that? This seems so far-fetched. Why is this? Are we still doing this?
What's frustrating is that they would never ask a man. If you were talking to a powerful man, you would never be like, oh, well, who watches your kids while you're working? Or, how do you juggle being a dad and being a scientist? These are things that would not come up. They're only coming up because they're women. Doctor Sasha Luccioni, who is an AI researcher and climate lead at Hugging Face, was featured in this pretty glowing piece for Adweek. The piece was great, but the headline of the article read: this AI ethics expert juggles motherhood and a tech career. And people had to raise hell to get them to change it. And yeah, I do think that there is a place for conversations about what it looks like to juggle career and family and all of that, but those are not conversations that should only be happening with women, and there's a time and a place for them. If you're meant to be interviewing somebody about their technical prowess, I don't see how that is relevant to their technical expertise.
Right. And that piece, the title in itself is so condescending. It has that note of, like, aw, look at you, how cute, look at you doing that! You go, you go girl, I'm so proud of you. Instead of seeing a professional scientist who has more degrees and more experience than the person who probably wrote the article. I don't know that for sure, but just that level of, like, wow, really, what are you doing? Like, is it because you feel insecure that you need to be condescending, but maybe passive-aggressively, like, but no, no, no, I'm really impressed?
Really, it is so condescending. And so, in an effort to help the space be better, the women who work at Hugging Face actually developed a guide for journalists, to help them get it right and to help create a better dynamic, so that it's not just condescending question after sexist question. The guide reads: the real achievements of women on our team often get overshadowed by a focus on personal, and sometimes very intrusive, details that are not relevant to their work. With all the amazing press attention we get at Hugging Face, we're bound to see some journalists rely on outdated tropes. Lately, we've seen more reporters ignoring the amazing achievements of our shes and theys and instead focusing on stereotypes in tech. One of the things that I love about the guide that they put out is that it really highlights the importance of not treating tech like a place where women don't belong. So, for instance, the guide reads: don't rely on antiquated stereotypes about women in tech. This includes describing women as outsiders in the field, which only serves to reinforce the idea that women don't belong in tech. An example of a problematic sentence they give: despite being a woman in a male-dominated field, Brooke Brookie has made a name for herself in the tech industry. Better is: through her brilliant results on magic and large language models, Brooke Brookie made a name for herself in the tech industry. And so this, I think, is really key, and I think people do it without even really meaning to: in the tech industry, women and queer folks and trans folks and Black folks and folks with disabilities, and all different kinds of people who are traditionally marginalized in our society, have been there from the beginning of technology, from the very start.
We have this idea that tech is this white male cis boys' club and anybody else is trying to break their way in, and I can understand why people feel that way. But we have been there from the very beginning. And if you don't always hear our stories or our voices, it's not because we're not there; it's because specific choices have been made to sideline us and to turn down our voices. And so really starting from a place of, we do belong in these conversations, we do belong in tech, we do belong in this sector, we're not outsiders, I think is really, really important. And it also just really matters: if the people who are building technology and talking about technology and thinking about technology are only one type of person, the tech that gets built is going to be a whole lot worse.
And so all of us benefit when more voices are included and feel included and feel empowered to join the conversation.
You would think that would be obvious, and also that it would save money. It would save money on all the revamping and redoing of everything that could go wrong when you don't have the right people, or at least all the people, represented in this conversation, especially if you're wanting these people to use your technology.
You're so right. And what's interesting is, I once read this book called Mother of Invention, all about how things like misogyny and bias against women stifle innovation. And something that really struck me from that book is how much money gets left on the table because of things like gender bias and misogyny. You would think in a capitalistic society that for-profit companies would want money above anything, but you might be surprised: they are perfectly willing to lose money if it means further entrenching gender bias and misogyny.
Yeah, which is so nonsensical.
Right, I guess it's power over money, even though money can be considered power.
Yeah.
Yeah. When I was trying to do a wrap-up episode like this about video games and board games, I ran across a kind of stunning, upsetting amount of examples of how often people in power, who are dudes, were like, women just don't make good stuff. And they would literally be like, uh, people don't buy those things, those party games. Why, yes they do.
Yeah, look at the demographics of who is actually playing video games: it is a lot of women. And so, not making video games and stuff that women want, when a big chunk of your customer base is women, that is on you. That is you not knowing how to do it, or being unwilling to do it.
It's like, anything that you could say is just an excuse to not do it, right?
Like, you look at Animal Crossing and the fact that they made bank, especially during COVID and quarantine, and you're like, are you sure? Are you sure?
Yeah?
The guy in question was saying that all the games women like to play, which in his mind are, like, party games or animal mobile games, aren't good and don't make money. And it's like, uh, are you sure? Why? Oh.
This also reminds me, and it's kind of upsetting, but it reminds me of when Sally Ride was getting ready to go to space, and they would have these press conferences with her and all these men, and the men there were asking all these questions related to the fact that she's a woman. They'd be like, how are you gonna put your makeup on in space?
What?

It's like it's still there, that kind of thing. We still gender these things and we still other people. And it does really matter who's writing about it and whose stories are getting published or getting that traction, because if it's mostly white men's stories getting all of the attention, then that does help reinforce this idea that, yeah, it's this white male space, when it isn't, and it hasn't been, as you said.
Did you ever hear?
I mean, I'm sure you all have heard this story, but it really sticks with me, that when Sally Ride was going to space, she was going to be there for one week, and the scientists or engineers at NASA, whoever, gave her, was it two hundred tampons or one hundred tampons? She'd be there for one week. And when she was like, oh, one hundred tampons, they were like, will that be enough? These are the greatest scientific minds! Like, nobody can tell me that male scientists are better than women when that's what's going on.
Right?
Can you imagine if you honestly believed that was true, and then expected the women in your life, or the people who menstruate in your life, to just be going through two hundred in a week? And we have to pay that much! When you're looking at tampons and they're, like, thirty for, what is it, eight dollars? So it's like, how much? I have so many questions.
I have so many. There's so many questions.
There's so many questions. And, to be fair, as with everything: how did you think this was going to work? Who invented this? This doesn't work like that.
What are you doing?
God. Sometimes, and this is such a non sequitur, but sometimes when I think about gender dynamics, I remember reading this tweet from a guy who was like, people think Taylor Swift is so pretty, but here she is without makeup. And it's like, did you think that she was born with cat eyes and, like, bright red lips? You clearly thought that, but you assumed all of us thought that. And then you're like, you're supposed to be getting one over on us women? The whole thing is just laughable.
Yeah, it's such a great example of whose experience we consider the norm in society. Because I was thinking about that recently too, about this whole idea, which I feel like has largely faded away but not entirely, of women who would wake up early and put on makeup so their partner would not see them without makeup. A lot of us feel that we have to do this thing, because otherwise we'll get this complaint from some random jackass on social media. But then it's also like, dude, what do you think? It's a strange dynamic that is happening, where you feel the pressure to do it, but then also, shouldn't people realize how much work it is, right?
Yeah. And it goes into another question that I'm kind of concerned about. I've been noticing more and more that on TikTok they just have automatic filters, and a lot of people talk about the fact that they use different filters, especially because they don't want to wear makeup and all that, and it makes them feel better, which, fine, whatever you do, you do you. But then there's this level of expectation, where especially men, and I'm gonna say especially cis heterosexual men, think that this is what you should look like, and if you don't look like that in real life, it's holy, you catfished me, you've lied to me. There's this new level of a beauty standard that might seem like, okay, this is good for us because it makes us kind of feel better. But at the same time, what is it leading to?
That's such an interesting question. And I guess I just think that a lot of men, like cis heterosexual men, are really willing to live in a fantasy world. Some of those TikTok filters give you, like, glitter on your eyes. Did you think that somebody came out of the womb with glitter on their eyes? Honestly, it's one of the reasons why I don't really remove my body hair: when you add up how much time it takes to do it, it's a lot of work. And it's like, do I want to do extra work, and pay whatever extra money it takes to buy the razor and this and that, to feed into a fantasy that adult women don't have body hair? No, I don't. You should by now know that adult women generally do have body hair, and I'm not interested in perpetuating this very weird fantasy world where that becomes how women are in the world.
And then, if you see a woman who is not doing that, it's out of sync with how you think women should exist. I guess, I don't know if that makes sense.

No, yeah, it absolutely does. It kind of, again, feeds into this narrative that women are supposed to meet this higher beauty standard in order to fit a norm, which is so much less work for men in general.
Yeah. And when we're looking at tech, so many of the things that we've talked about on here play into that, whether it's filters or social media, these kinds of beauty standards that are getting perpetuated, just the vitriol women can receive just by putting, here's my face, on social media. It is really unfortunate, because we can't ignore that that does happen. And I know I've told this story before, but I have a lovely collage of all of the horrible things people have said to me online. But it's also like, why can't we just, these are women who are doing amazing things in AI. We don't have to talk about how they look. We really don't, right? We really don't.
We don't need to know how many children she has. We don't need to know she has a husband or a partner. That's not what we want to know. Are you doing better AI that protects people? Yeah, wonderful, let's support that.
Yes, and it's just such a missed opportunity, because the majority of people who are making sure that AI is safe, or more inclusive, or doesn't harm people, or is ethical, or who are doing really interesting work, those people tend to also be women, people of color, people who are not necessarily treated as the norm in tech. And so when you have an opportunity to talk to these people, actually use it, because what they are working on probably really matters, and it matters to all of us. Even if you're not somebody who is techy, somebody who thinks of yourself as a person who thinks about AI a lot.
This stuff is going to matter to everybody.
And I guess that brings me to one of the things that I'm looking forward to in twenty twenty four. If I'm the girl in the cartoon, I've got this in my bag slung over my shoulder, and I'm taking it with me to the next year: all of us, each and every one of us, being more involved in conversations around tech. I do think I have seen a shift this year in how regular people like you and me and people listening are thinking about technology and talking about technology. I think that we are done with this idea that tech is only decided by a bunch of super smart genius white guys who don't have to be accountable to us at all because we're not smart enough to understand their brilliance. That is out. I think that in twenty twenty three we came to the realization that these people have been using that dynamic as a way to essentially get rich off of us, and I think that we're going to start asking some questions about that dynamic and pushing back. Like, should these companies and tech leaders be able to just get rich off of us without us asking any questions or having any say? I think I'm starting to see people be like, no, actually, I am smart enough to understand that I'm being taken advantage of and exploited, and I have questions about that. So let's keep asking those questions in twenty twenty four. Let's bring that dynamic into the new year with us.
Yes, yes, and of course your show is always such a good part of that conversation, so we're very eagerly awaiting the next season. Do you call it seasons, Bridget?
We call it seasons, but they're really just, like, when I get tired of making the show and I have to take a break.
So we are taking a break, but we'll be back real soon.
Well, you're here with us answering those questions and asking those questions and allowing us to ask you those questions, so we are very grateful for that and excited for that in the new year.
Yes, yes, and we are hoping maybe we'll get to hang out IRL and do some things in twenty twenty four.
One day, one day.
It's gonna happen soon. Stay tuned, folks.
Yes. Well, Bridget, thank you so much, as always, for being here at the end of this year.
Where can the good listeners find you?
You can check out my podcast, There Are No Girls on the Internet, on iHeartRadio. You can find me on Instagram at Bridget Marie in DC, or on Twitter at Bridget Marie.
Yes, and definitely go do that, listeners, if you haven't already. Bridget, I hope you have a good, relaxing holiday and weirdo Christmas.
Yes, thank you all. Happy merry whatever to all of y'all.
Yes, thank you, and listeners, if you would like to contact us, you can. Our email is Stephanie and mom Stuff at iHeartMedia dot com. You can find us on Twitter at Mom Stuff Podcast, or on Instagram and TikTok at Stuff I Never Told You. We have a TeePublic store, and we do have a book. Thanks as always to our super producer Christina, our executive producer Maya, and our contributor Joey.
Thank you and happy holidays.
Yes, yes, and thanks to you for listening.
Stuff Mom Never Told You is a production of iHeartRadio. For more podcasts from iHeartRadio, you can check out the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.