As we enter a new political campaign season, it's a good time to review the research on fake news. This article digs into that research and is informative.
Americans 65 and older share more fake news than other age groups. Why?
Fake news sharing was called an epidemic in 2016. But according to researchers, those claims seem to have been exaggerated. In a recent study, more than 90% of 3,500 respondents surveyed shared no fake news at all. If most of us weren’t sharing fake news, who makes up the other 10%?
The study, published in January from Princeton University, looked at sharing habits for so-called “fake news domains” — outlets that published false stories intentionally designed to look like news articles and mislead the reader. Though sharing this content was still a relatively rare activity, the study found that the Facebook users who shared the most fake news — nearly seven times as many articles as any other age group — were those over 65. Older Americans shared more fake news stories even when researchers held all other factors constant, including political ideology.
“Age was significantly associated with sharing more articles from designated ‘fake news’ domains, on average,” said the lead author of the study, Andrew Guess, an assistant professor of politics and public affairs at Princeton. “No matter which way we tried to slice it or how we did the analysis, that finding kept coming up.”
It’s a Collective Harm
Facebook has seen a huge growth in older users, according to a Pew Research Center study published in February. Forty-one percent of Americans 65 and older use Facebook — more than double the amount from 2012. Yet even with that growth, Facebook’s practices remain opaque to most of its users: According to the same study, more than half of the platform’s adult users don’t understand how their newsfeeds work.
Research hasn’t been much help on that front, either. Because each user’s newsfeed is generated by Facebook’s internal algorithm, a mixture of personalized statistics and advertiser priorities, the Princeton researchers couldn’t see exactly what on respondents’ feeds led them to share fake news stories.
That confusion creates an opening for potential scammers with political motivations. Guess likened the rise of fake news on social media to the early-2000s trend of email chain letters, which also often involved false or misleading news articles. Email financial scams frequently target older Americans, but Guess noted that an information scam has different consequences.
“The harm done isn’t on an individual level,” he said. “People aren’t necessarily losing their savings. But there is potentially a collective harm, in the sense that the overall information environment could be degraded and it just adds to confusion.”
The Emotional Manipulation of Fake News
The sheer amount of fake news stories online has grown exponentially in the last decade, as coordinated political groups have exploited the behavior of social media users. “Most of the content is reproduced, amplified and spread by people who don’t know that it’s false,” said Kate Starbird, an assistant professor in the department of human centered design and engineering at the University of Washington.
Over the last decade, Starbird has studied the ways people share news on social media — particularly on Twitter — after crisis events like mass shootings. She didn’t start out analyzing disinformation, she says, but that portion of her work has grown over the years as disinformation campaigns have become more widespread.
A common characteristic of the most successful fake news stories, Starbird said, is that they play on the reader’s emotions. “When you’re feeling really angry or really upset or really disgusted, that’s kind of a hint that someone’s trying to manipulate you,” she said.
Starbird particularly noted this phenomenon in the wake of the Paris terror attacks and the Umpqua Community College mass shooting in Roseburg, Ore., both of which occurred in fall 2015 as the U.S. presidential campaign was heating up. Almost immediately on social media, Starbird said, disinformation campaigns worked to ascribe political ideologies to the attacks. In some cases, bogus stories created “false flag” conspiracy theories questioning whether the attacks had even taken place.
The trend of politically inflected fake news accelerated through the rest of the 2016 presidential campaign.
According to the Princeton study, nearly half of the misinformation released during the period played on conservative, explicitly pro-Trump or anti-Hillary Clinton viewpoints, often from domains disguised to look like legitimate news organizations, like “The Denver Guardian” and “abcnews.com.co.”
Much of it did come from well-publicized Russian operations, which often targeted online political communities ranging from fans of Fox News host Sean Hannity to members of Black Lives Matter. But Starbird said other groups were behind the disinformation, too, including Iran, Hezbollah and the Syrian government.
How Did We Get Here?
Why is sharing fake news more common among older Americans? Starbird doesn’t work directly with age demographics in her research, but she has some ideas:
“Older adults who might have come of age in a different information environment may not be as savvy or have the same skill-set, or even the same training, to be able to manage some of the ways that … misinformation and disinformation come at them online,” she said.
But other digital media researchers who do work more closely with specific age groups want to draw a distinction between older users and social media newcomers, groups that often, but not always, overlap. “I hesitate to conclude this is only an age problem,” said Robin Brewer, an assistant professor in the University of Michigan’s School of Information who studies accessibility in computing across age groups and other demographics.
“I think it’s more likely that newcomers are more likely to share ‘fake news’ because they are unable to adequately identify cues that the information isn’t from a reputable source,” Brewer said. But, she noted, “older adults are not new to the internet.” They simply aren’t the target demographic for a platform like Facebook, so they’re more likely to approach social media as newcomers.
Guess is also hesitant to draw too many conclusions about the culpability of older Americans in spreading fake news.
“Maybe it’s true that people over 65 have always been more susceptible to political misinformation,” Guess said. “But also it could just be true that this could be a weird feature of the 2016 election.”
Innovative Fact-Checking Is the Future
How can social media users address the problem for future election cycles, especially when many sharers of “fake news” have told researchers, including Starbird, they don’t care whether the stories they share are true?
Starbird believes early detection can still prevent the spread of fake news, because it prevents would-be amplifiers from having to rationalize their actions after the fact (or fake).
To that end, several colleges and startups are developing real-time fact-checking procedures, powered by everything from crowdsourcing to artificial intelligence, with the goal of catching and flagging disinformation before it spreads.
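The simplest building block of such a system is a check against a list of designated “fake news domains” like those the Princeton study tracked. As a minimal sketch only — the domain list here is illustrative, and real fact-checking tools layer crowdsourcing or machine learning on top of this kind of lookup — flagging could look like:

```python
from urllib.parse import urlparse

# Illustrative blocklist of designated "fake news" domains.
# abcnews.com.co is cited in the article; the set is otherwise hypothetical.
FLAGGED_DOMAINS = {"abcnews.com.co", "denverguardian.example"}

def is_flagged(url: str) -> bool:
    """Return True if the URL's host matches, or is a subdomain of,
    any domain on the blocklist."""
    host = urlparse(url).netloc.lower().split(":")[0]  # strip any port
    return any(host == d or host.endswith("." + d) for d in FLAGGED_DOMAINS)

print(is_flagged("http://abcnews.com.co/some-story"))   # True
print(is_flagged("https://abcnews.go.com/real-story"))  # False
```

A static list like this catches only known impostor domains; the real-time systems the article describes aim to flag new ones before their stories spread.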
And Facebook has gradually instituted new attempts to make its platform more transparent and cut down on fake news sharing. The company is providing more information to users about why its algorithm is generating certain posts in their newsfeeds, and, most significantly, Facebook CEO Mark Zuckerberg recently floated the idea of creating a new section of the platform dedicated to “high-quality news.”
Already, some fake-news proprietors who made healthy profits off of Facebook shares in 2016 have had to drastically scale back operations after the site tinkered with its algorithm.
False or misleading political information existed before social media, and it’s unlikely to completely disappear from the public space. But controlling its spread in the present day requires properly understanding it.
Put astutely by Guess: “It’s difficult to even get well-trained humans to agree on whether or not something is fake.”
By Andrew Lapin