The morning after we learned we were going to have to wait for the final results from the Iowa caucuses, we sent this tweet:
PSA: Keep an eye out for trolls and bots seeking to fan the flames while we wait for #IACaucus results. If you see an account weigh in that feels weird, check it. Don't immediately retweet or like what it says. That feeds right into their goal of sowing distrust. #IACaucuses
— Iowa Public Radio (@IowaPublicRadio) February 4, 2020
As the chaos surrounding the delay in results reporting continues, we are seeing interaction from bot-like accounts and lots of comments about Russians, trolls, and fake news on our posts on social media.
In the last 72 hours, the firestorm has had a lot of fresh fuel.
Conspiracy theories and vitriolic commentary are running wild, due in part to an error in the release of the caucus results, sparse communication from the Iowa Democratic Party, and a call from the Democratic National Committee for a recanvass of the results.
Meanwhile, the Senate acquitted the president on two articles of impeachment the day after a politically divisive State of the Union address. Political divisions are growing deeper. Public trust is on the decline.
To echo our tweet from earlier this week: what we post, and how we comment on social media right now, matters.
Darren Linvill is an associate professor at Clemson University. His research explores state sponsored social media disinformation and its influence on civil discourse. In the last few years, he's presented about his work to the U.S. Senate Intelligence Committee, the Department of Homeland Security, and U.S. Army Cyber Command.
He says the most noticeable effect foreign actors have had through social media is to change how we perceive each other.
“That’s the single biggest effect that Russians have had on Americans. They changed how we engage with each other,” Linvill says. "All too often we now view people who may simply disagree with us as not simply having a different perspective, but not even being a real person."
Through his research, Linvill has found that the so-called trolls in your feeds often aren’t Russians.
“Most often, it’s someone trying to make a buck, or it’s just a jerk,” Linvill explains. “It’s almost always one of those two things, which is often the case in life. Sometimes they’re Russians, but jerks are much more common.”
Let’s get some terminology straight.
What is a bot?
A bot is an account that is automated and run by a computer. Bots are limited in what they can do. Think about them like social media robots. They are really good at math and at posting things 24 hours a day.
What is a troll?
Trolls are people on social media who operate fake accounts.
While we often hear this term used to describe someone who behaves like a jerk on social media, it's important to distinguish between that colloquial use and its use to describe "sock puppet" accounts. These are profiles of fictional people operated by real people.
People who operate "sock puppet” troll accounts aren't always malicious. Sometimes they're just trying to make money.
“In an off-shoot of my work, I identified a big circle of Black Lives Matter accounts that were being operated out of Vietnam trying to sell t-shirts,” Linvill says.
The fact that "sock puppet" troll accounts aren't always inciting controversy and can appear to be advocates for causes and ideas you believe in is a very important point to emphasize.
“The people who are in St. Petersburg who are doing this, or the people who are doing this in the United States are the account who is pretending to be your friend,” Linvill says. “It’s the account that agrees with you, not the account that you think is a jerk. You don’t actually persuade people to think something differently. You pull them with you. You entrench them in their beliefs.”
Accounts that post comments that are argumentative, rude, or demeaning most often aren't the ones trying to influence public opinion.
What should you look for to identify a bot or a troll account?
Some red flags that help identify troll accounts are very obvious; others are more technical.
Ask these questions:
- Is the account anonymous?
- Does the profile provide a picture of a person and some description of who the person is?
- Are they only talking about politics?
- Are they only attacking one political candidate?
- What kinds of memes are they sharing?
- From the photos provided, does the person who appears to be running the account seem to be unrealistically attractive?
- Where possible, check to see that the account is verified.
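For readers who want to see the checklist above made concrete, here is a minimal sketch of it as a heuristic score. The `account` fields are illustrative assumptions, not a real social media API; some items on the list, like judging memes or photos, still require human judgment.

```python
# Hypothetical sketch of the red-flag checklist above.
# The `account` dict fields are illustrative assumptions, not a real API.

def red_flag_score(account: dict) -> int:
    """Count how many of the checklist's machine-checkable warning signs an account shows."""
    score = 0
    if account.get("anonymous", False):              # no real name or self-description
        score += 1
    if not account.get("has_profile_photo", False):  # no picture of a person
        score += 1
    if account.get("politics_only", False):          # posts only about politics
        score += 1
    if account.get("single_target_attacks", False):  # attacks only one candidate
        score += 1
    if not account.get("verified", False):           # account is not verified
        score += 1
    # Items like "what memes are they sharing?" or "does the photo look
    # unrealistically attractive?" need a human eye and are not scored here.
    return score

suspect = {
    "anonymous": True,
    "has_profile_photo": False,
    "politics_only": True,
    "single_target_attacks": True,
    "verified": False,
}
print(red_flag_score(suspect))  # → 5
```

A high score doesn't prove an account is fake; it just flags that a closer, human look is warranted before retweeting or replying.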
Think about what you post on your own social media accounts. Real people tend to talk about real events in their lives.
“Troll accounts are very often one of two things,” Linvill continues. “They are either a beautiful woman between the ages of 20 and 25 years old, or they are a veteran. Ideology is sold in exactly the same way that you sell Budweiser. Foreign actors spreading disinformation learned it all from Madison Avenue. It’s just marketing. You sell things with pretty people and veterans.”
When news is polarizing, it's also important to share factual and current information.
On Monday night, as we were waiting on Iowa caucus results, someone retweeted an NBC tweet from 2008, which reported that Joe Biden had dropped out of the presidential race.
According to Linvill, the account that first retweeted that old post was not a Russian actor or a bot, but someone trying to stir the pot online. It worked, and NBC eventually issued a statement that the tweet was from a past election.
Editor’s note: The tweet above ^ is from 2008, not tonight. This is being noted because this tweet is being tweeted by some users tonight as if it is new tonight. Again, the tweet above is from 2008, not 2020.
— Breaking News (@BreakingNews) February 4, 2020
The news has moved fast in the last few days. The commentary online has been furious. We've been reminded that technology can be a powerful tool, or a giant stumbling block. When all is said and done, it's how we use it that matters.