A new digital player is helping to shape public discourse online. How an audience feels about a brand, a narrative, a personality or a scandal is being moulded by the rise of… bots. PR bots, to be exact. These bots can be programmed to target your social media algorithms and the content you do or don’t get served. In today’s podcast, we’ll explore how these digital armies can be mobilised to influence public opinion and even elections.
Hosts: Chloe Christie and Zara Seidler
Want to support The Daily Aus? That's so kind! The best way to do that is to click ‘follow’ on Spotify or Apple and to leave us a five-star review. We would be so grateful.
The Daily Aus is a media company focused on delivering accessible and digestible news to young people. We are completely independent.
Want more from TDA?
Subscribe to The Daily Aus newsletter
Subscribe to The Daily Aus’ YouTube Channel
Have feedback for us?
We’re always looking for new ways to improve what we do. If you’ve got feedback, we’re all ears. Tell us here.
Alrighty. And this is The Daily... this is The Daily Aus. Oh, now it makes sense.
Good morning and welcome to The Daily Aus.
I'm Zara, and over the next little while we're going to be bringing you a bonus series featuring our favorite deep dives from 2024. We've put together the best deep dives to listen to on the beach, road-tripping when you don't want to talk to the person next to you, or just reflecting on the year that was. Welcome to TDA's Summer Series. Earlier this year, we did a deep dive into the new digital player helping to shape public discourse online. Now, the TL;DR is that bots are targeting your social media algorithms and the content that you get served online. In today's deep dive, Chloe explores how these digital armies are being mobilized to influence public opinion. It's a fascinating element of our online world. So without further ado, let's get into the episode. Now, Chloe, you've become something of a bot expert in the last couple of weeks.
But for anyone that hasn't spent as much time in the weeds of bot mania as you have, can you just explain: what exactly is a bot?
I have been in the weeds of the bot mania. It's quite a place to be, and it can sound really techy when we talk about this stuff, but it's pretty much every time you jump on social media and see a spammy comment on a popular Instagram account. Just look at The Daily Aus's account; you'll see plenty of them.
Excellent plug, excellent plug.
You'll see them most often in the comment section within seconds of making a post on Instagram.
Yeah, and I feel like, you know, even if you just look at our Instagram, you're seeing things like "how coach Sarah changed my life", and they swap the name Sarah out for multiple different women.
But there you go.
There are some, like, semi-nude women that have come up in GIFs in our comment section.
"Click on my bio, you'll get rich fast."
So those sorts of comments, you're saying, those are coming from bot accounts, right?
Yep, so that is a bot account. But when we look at bots more broadly, bots are programmed to perform automated, repetitive tasks over a network. They're deliberately designed to mimic human behavior: they look like humans, they sound like real users. Only bots can generate content at a speed and a scale that we humans simply couldn't, so that's thousands upon thousands of comments a day. There are some helpful bots (ChatGPT is a chatbot, and so are Siri and Alexa), but others, as we know, are less helpful.
They are less helpful, but they are absolutely everywhere. Like, it takes, you know, one second of looking, as you said, at a popular account and you just get flooded with this stuff. I'm just using anecdotal evidence here, so how significant is the presence of these bots in the online world?
Yeah, it's wild. And to give you a sense of the scale of things, nearly half of all online traffic in 2023 came from fake users.
That is crazy.
So bots, yeah. Now, that's according to a new report by IT security firm Imperva, which also found that bad bots, programmed to defraud and scam users, accounted for nearly one-third of all of that traffic.
Wow.
I mean, I knew they were everywhere, but that's... they're really everywhere.
They're everywhere.
And so when we're thinking about bots, you know, I think that anyone that has spent time on the internet knows not to click on, you know, "come earn money with me with your crypto dollars" in our comment section. But what are some of the other ways that bots show up, or the other ways that they can, I guess, influence behaviors or conversations?
Well, that's a whole different podcast about how people are scammed and defrauded into clicking on things that they shouldn't. But what I was really interested in was the way bots can be used to make an opinion seem like a fact, or to make it feel and seem as though it has either widespread support or widespread opposition. What does that do to real users online? Because if you can gear thousands upon thousands of accounts to push a certain narrative, that can be really dangerous if we think about elections or democracy or just public discourse at large.
It's a really interesting topic, but it does still seem quite up there, and I would love to bring it down here. Is there an example that you can provide, I guess, to give some orientation as to what you're actually talking about?
So let's just say you're scrolling on X and you read a bunch of different tweets, retweets, shares and comments on those tweets saying apples are really, really bad for you and instead you should be buying oranges. This is really basic, but it helps to wrap our heads around it. You're probably going to start feeling a little bit suss about apples, even though you might really like them. You just sort of have a few questions about them. You might even consider buying more oranges. And bots can create these spaces that feel like communities sharing ideas, like it's just normal people talking about how we all don't really like apples, but there is an agenda at play. Now, when you consider the influence that, as I said, this could have on politics or elections...
Not just apples and oranges. Not just...
apples and oranges, it can be really scary.
What do the experts say about bots?
Because you know, they are a fairly new phenomenon, but there must be a body of research out there.
I reached out to a bot doc.
Of course, a bot doc has a doc of bots.
She does have a real name. Her name is Dr Sophia Melanson Ricciardone from Canada's McMaster University. Now, she did a PhD on "botaganda": when bot armies generate mass content to saturate social media feeds and then manipulate audiences. So Dr Melanson Ricciardone told me about something called hashtag flooding, which is essentially a tweet containing nothing but popularized keywords and catchphrases in the form of hashtag
after hashtag, after hashtag. So annoying, so annoying.
Now, this tactic, as well as the rapid resharing of human posts, so 25,000 in ten minutes, can create, like we were saying before, this illusion of widespread support or widespread opposition for specific viewpoints. She says that this happens as though the idea embedded in the tweet came from grassroots popularization. So when we're talking about mimicking human behavior, it's this feeling of, oh, everyone thinks...
this, and you want to be a part of the everyone. Like, that's the human condition, right? Exactly.
It's the concept of herd mentality, which is the idea that individuals naturally want to conform to the dominant view of the community.
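To make the hashtag-flooding idea a bit more concrete for anyone reading along, here is a minimal Python sketch of the kind of heuristic a researcher might use to flag posts that are nothing but hashtags. The threshold, function name and example posts are illustrative assumptions only; they are not Dr Melanson Ricciardone's method or anything described in the episode.

```python
import re

# Hypothetical heuristic: flag a post as possible "hashtag flooding" when
# nearly all of its tokens are hashtags (an assumed threshold of 80%).
HASHTAG = re.compile(r"^#\w+$")

def looks_like_hashtag_flood(text: str, min_tokens: int = 4, ratio: float = 0.8) -> bool:
    tokens = text.split()
    if len(tokens) < min_tokens:
        return False  # too short to judge either way
    hashtag_count = sum(1 for t in tokens if HASHTAG.match(t))
    return hashtag_count / len(tokens) >= ratio

# A post made of nothing but catchphrases-as-hashtags is flagged;
# an ordinary post with a single hashtag is not.
print(looks_like_hashtag_flood("#JusticeIsServed #Truth #WakeUp #Trending"))  # True
print(looks_like_hashtag_flood("Had a great day at the beach #summer"))       # False
```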
Yeah, I mean, it's so fascinating. And the reason that you came to, I don't want to say, become obsessed with this topic, but what started your fascination was one example that we've seen in the last couple of years.
Talk me through the Johnny Depp and Amber Heard story.
I don't know if many people remember what the internet looked like in 2022 during the Johnny Depp and Amber Heard defamation trial, but I recently listened to a six-part podcast that brought me right back there. So, for people who need a refresher on the story: essentially, Johnny Depp filed a lawsuit against his ex-wife Amber Heard over an opinion article Heard wrote for The Washington Post, in which she alleged she had experienced domestic abuse. Heard didn't name Depp, but he launched defamation proceedings against her, arguing he was identifiable from the article, and Depp denies claims that he physically abused her. Now, a jury in Fairfax County, Virginia ultimately sided with Depp, and Heard was found to have defamed him. But I think you might remember that the court proceedings were streamed online.
Yeah, so I was going to say, can you, in the minds of our listeners, connect what we've just been talking about, which is, you know, all of this stuff about bots, and this case? What's the connection here?
So as the case played out in the courtroom, the internet mounted its own unofficial trial of Amber Heard. This is the premise of the podcast I was talking about earlier, Who Trolled Amber?, by UK media organization Tortoise. The podcast found that a large part of the online hate campaign against her was actually manufactured and executed by bot accounts. Now, I spoke to Xavier Greenwood, who produced the Who Trolled Amber? podcast.
Some of the main themes were that Amber Heard deserves prison, Amber Heard is a gold digger, Amber Heard is a fraud, Amber Heard is a liar. To give you a sense of the scale of things, there was a hashtag, #JusticeForJohnnyDepp, that was viewed 15.7 billion times; #JusticeForAmberHeard was viewed a fraction of that. And my friends were suddenly saying things that didn't really sound like them. They were saying, well, you know, what if this time she was the abuser? And these were people who typically, in the MeToo movement, would be a bit more cynical or maybe a bit more reserved in making a judgment like that. So yeah, from the beginning we sort of saw that this was quite suspicious.
What else did this investigation uncover?
The team at Tortoise brought together a database of over one million anti-Amber Heard tweets, and they found that more than half of them were inauthentic. What that means is either they were posted from spam accounts, with three followers, built in the last two months, or they were amplified in an inauthentic way, so reshared and reshared thousands upon thousands of times.
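For anyone curious how criteria like the ones Chloe describes could be turned into a rule of thumb, here is a minimal Python sketch. Tortoise's actual methodology isn't detailed in the episode, so the thresholds, field names and the helper function itself are assumptions for illustration only.

```python
from datetime import datetime, timedelta

# Hypothetical rule of thumb, loosely following the criteria described above:
# treat a tweet as "inauthentic" if it came from a very new account with
# almost no followers, or if it was amplified at an implausible scale.
def looks_inauthentic(account_created: datetime,
                      follower_count: int,
                      reshare_count: int,
                      now: datetime | None = None) -> bool:
    now = now or datetime.utcnow()
    is_spam_account = (follower_count <= 3
                       and now - account_created <= timedelta(days=60))
    is_mass_amplified = reshare_count >= 10_000  # assumed cut-off
    return is_spam_account or is_mass_amplified

# Example: a two-week-old account with three followers is flagged.
print(looks_inauthentic(datetime.utcnow() - timedelta(days=14), 3, 120))  # True
```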
Okay, I think the thing that I wonder when I hear about this story is, like, who is behind this trolling campaign? It's very clear who benefits from it, but who's behind it?
We don't know. According to the investigation, the team could hypothesize multiple different scenarios, and there likely were multiple different agendas at play.
Yeah, it's crazy. So this could...
be genuine Johnny Depp fans, who can acquire bots for pretty cheap online and push support for their favorite actor. In other scenarios, Xavier floats what he calls a more abstract theory.
I think it's sort of fairly widely known now that authoritarian regimes who want to sow discord or cause confusion in the West seek out wedge issues. So they seek out issues which divide Britons, divide Americans, divide Australians. They try to sort of drive a deeper wedge.
But something that really stuck with me from my conversation with Xavier was a point he made about the future of political discourse on social media.
If you can attack a celebrity who has an enormous amount of resources, as Amber Heard did, what's to stop someone doing the same thing when it comes to attacking a politician, or when it comes to trying to sway an election?
I mean, we're talking there about elections, and we of course know that there is one coming up just around the corner in the US. Are we expecting to see this sort of interference from bots in the US election this year?
We already have. Last month, the Biden administration charged Russian media executives over an alleged targeted online campaign to influence voters in the US and push hidden Russian government messaging. It comes after US officials seized 32 internet domain names that were covertly targeting specific demographics on social media, promoting AI-generated false narratives and pushing them to those groups.
Okay, so there are already allegations that we've seen this at play when it comes to the US election. Yeah. I mean, I think that a natural endpoint here is that if we're talking about the fact that even the most powerful institutions, people, whatever, in the world can be susceptible to this kind of botaganda, as we'll call it, what is, like, an average Joe like you or I meant to do when it comes to protecting yourself against something
like this? Well, that was what came back in so many of the conversations I was having. For the average Joe like you and me, a lot of it comes down to awareness: staying vigilant, staying curious, questioning what you're getting served and who's pushing it. And that goes for anything that you've seen online, but particularly when you're thinking something looks a bit spammy, it likely might be. It's important to be critical about the content you're seeing before you form an opinion, especially ahead of an election. Now, as for what comes next, Xavier Greenwood says we're largely in uncharted waters.
What we saw with the Amber Heard trial, we may well continue to see in an even more intense way in the future. To some extent, the genie is out of the box.
Thanks for listening to today's episode of TDA's Summer Series. We'll be back again tomorrow with another of our favorite deep dives, but until then, have a wonderfully warm day.
My name is Lily Madden and I'm a proud Arrernte, Bundjalung, Kalkadoon woman from Gadigal Country. The Daily Aus acknowledges that this podcast is recorded on the lands of the Gadigal people and pays respect to all Aboriginal and Torres Strait Islander nations. We pay our respects to the First Peoples of these countries, both past and present.