This story is part of CNET’s coverage of the run-up to voting in November.
On Facebook, Twitter and its own website, a group called PeaceData claimed to be a global news organization with a mission to expose corruption. Its supposed purpose was to tell stories that had been hidden from the public.
Turns out PeaceData was concealing the truth about its own story. In September, Facebook and Twitter said they had uncovered ties between PeaceData and the Internet Research Agency, the notorious Russian troll farm that sought to sow discord among Americans during the 2016 US elections. The social networks quickly pulled the group’s accounts, some of which appeared to use artificial intelligence to generate fake profile pictures.
Though PeaceData never gained a large following, the elaborate disinformation campaign is a potent reminder of how easy it is to conceal identities and motives on social media. Studies show that Americans have trouble separating fact from fiction, especially online. Lawmakers, activists and even employees have pressured social networks to do more to combat misinformation, particularly ahead of the US elections.
On Wednesday, the debate over online misinformation flared up again when Facebook and Twitter limited the spread of a New York Post story that included unverified allegations about Democratic presidential nominee Joe Biden’s son, Hunter. Biden’s campaign challenged the accuracy of the report, which included information from emails reportedly taken from a computer hard drive. Many observers worried that hackers might’ve leaked the information to meddle in the November election. Twitter said the story violated rules against sharing hacked materials. Facebook referred the story to third-party fact-checkers for review. The moves by the two social networks proved controversial, praised by some experts and criticized by some politicians.
Online misinformation and conspiracy theories are such big concerns that the House Intelligence Committee is holding a virtual hearing on the topic on Thursday.
“My initial advice to media consumers is to always be on guard when interacting with content and users on social media,” Jack Delaney, who was duped into writing for PeaceData, said in an article published in The Guardian. “I would have never guessed I would be caught up in a dubious media campaign.”
Delaney’s advice is worth bearing in mind. About 18% of US adults say they get their political news primarily from social media, according to a survey released this year by the Pew Research Center. Roughly 48% of US adults 18 to 29 years old say social media is the most common way they get political and election news.
People who relied more heavily on social media for news were more likely to get facts about politics and current events wrong than those who mostly used other sources, such as print media and news websites. As part of the study, the Pew Research Center quizzed US adults about topics such as which party held the majority in the US Senate, the unemployment rate and tariffs during President Donald Trump’s presidency.
Joel Breakstone co-authored a 2019 Stanford History Education Group study that showed high school students also lacked the skills to judge the credibility of online information. Breakstone, who leads the education group, said strategies such as examining whether a website is professional-looking and studying its address aren’t always effective in identifying credible sources of information. Social media only “exacerbates the problem” because posts that flow through someone’s feed look the same.
“Suddenly, it can be more difficult to distinguish who is behind that information, the trustworthiness of it and the process that went into producing content,” Breakstone said. “The distinction between different kinds of sources blurs as well.”
Erin McNeill, president and founder of Media Literacy Now, said social media users should be aware of their emotions.
“That’s one of the psychological tricks that is used to get people to share misinformation and disinformation,” she said. “When somebody’s reading something and they get angry or upset, that seems like a key motivator to share before you really investigate.”
PeaceData published political news stories about a number of countries, including the US, and was able to trick freelance journalists into writing articles for the site. One article said the US was an “unreliable partner for global organizations.” Another stated that Joe Biden, the Democratic presidential nominee, and Kamala Harris, his vice presidential pick, demonstrate “how the Western Left will give in to Right-Wing populism.”
PeaceData denied allegations it had ties to Russia and didn’t respond to a request for further comment. Delaney declined to comment. The group said in a message on its website that it’s shutting down and accused social networks and news sites of attempting to “silence free speech.” The website no longer contains any articles. The message is accompanied by a cartoon featuring the CEOs of Facebook, Amazon and Twitter with their heads in a guillotine under baskets bearing the label “toxic waste.”
Consuming news on social media
Young voters say they are wary of the political news they see on social media. Still, it’s tough to stay away from those sites because they offer easy, quick ways to read content from different sources.
About 34% of adults ages 18 to 29 surveyed last year by Pew reported getting their political news from Facebook in a one-week period, compared to 25% of adults overall. Google-owned YouTube, Twitter, Reddit and Facebook-owned Instagram were other popular sites used by this age group to consume political news.
About 54% of Americans think social media is responsible for a great deal of misinformation, while about 58% believe that President Donald Trump is spreading a great deal of misinformation, according to a survey conducted in September by Gallup and the Knight Foundation. Social networks have labeled some of Trump’s posts that contain false information about mail-in voting and other topics.
Quentin Wathum-Ocama, vice president of the Young Democrats of America, said that he sees links to political stories from his friends on Facebook or political videos on TikTok. The 29-year-old also uses Twitter where news can break before it’s even published on a website. Friends and family members also offer political opinions on social media, he said.
Wathum-Ocama, who lives in Minnesota, said he’s occasionally deleted some of his own social media posts after discovering they contained false or exaggerated information. He’s seen edited videos shared on his social media feed that don’t contain context. Content tied to QAnon has also popped up on social networks.
“Even though I would like to say that I’m a really highly informed person, I am dependent on something that can be really easily manipulated, corrupted or altered,” he said.
Chelsea Howell, secretary for the Texas Young Republicans, said she’s seen fake social media profiles posing as news outlets such as Fox News or CNN. She uses Facebook, Twitter and the conservative social network Parler, and browses news sites and reads her local newspaper.
The 29-year-old also posts political content herself, sharing TikTok videos that demonstrate her support for Trump. When she sees a political news story, Howell said she does her own research using the search engine DuckDuckGo, which she prefers to Google because of privacy concerns. Facebook sends some posts to third-party fact-checkers, but Howell said she’d rather come to her own conclusions.
“I’ve totally disagreed with Facebook a few times on their fact-checking,” she said. “I feel that they fact-check a lot of the right-wing people a lot more than they do to the left wing.”
Howell said she will also read political news from left-leaning websites to see what the other side is saying and has friends who support Democratic presidential nominee Joe Biden.
“We need to find common ground to find real solutions for what’s really going on in this country,” she said.
Separating fact from fiction
Social media sites are trying to point users to more authoritative sources. Facebook, Twitter and TikTok all launched online hubs for voting and election information. YouTube has directed people to mail-in voting information from the Bipartisan Policy Center, a think tank. Facebook and Twitter also said they will label posts from US presidential candidates who declare victory before the votes are counted, along with other steps to curb the spread of misinformation.
Facebook has tips for spotting fake news, which include being skeptical of headlines, closely examining the link and checking other reports. The company didn’t respond to a request for comment.
Twitter outlined various steps it’s taking to safeguard the election. The company said it added more context to trending topics on the site and directs users to trustworthy sources when they search for key terms related to voter registration.
Understanding the context behind videos and images is important too. The Stanford History Education Group concluded that the digital literacy skills of high school students were “troubling” after testing 3,446 students from June 2018 to May 2019.
In one example, students saw a video on Facebook of poll workers secretly stuffing ballots into bins. The video’s caption says the clips were from the 2016 Democratic primary elections in Illinois, Pennsylvania and Arizona. The user “I on Flicks,” who shared the video, wrote in a post: “Have you ever noticed that the ONLY people caught committing voter fraud are Democrats?”
Students were asked if this video was strong evidence of voter fraud during the 2016 Democratic primaries. About 52% of students responded that it was. The video clips actually showed voter fraud in Russia.
“Most people have not learned effective strategies for evaluating online content,” Breakstone said. “Young people may appear to be very fast at navigating digital devices but they are not nearly as skilled at making sense of the information that they yield, which is reflective of the population at large as far as we can tell.”