New MIT Sloan research measures exposure to misinformation from political elites on Twitter
Findings show a clear partisan split in exposure to political elites who make false claims and reveal that users who follow elites with high falsity scores are more likely to share misinformation themselves
CAMBRIDGE, Mass., Nov. 29, 2022 /PRNewswire/ — When talking about misinformation online, viral “fake news” has gotten the lion’s share of attention. But what happens when political elites themselves propagate falsehoods online?
In a new paper published in Nature Communications, MIT Sloan School of Management professor David Rand and research affiliate Mohsen Mosleh developed a falsity scoring system for political elites—influential political players and organizations—and a method for estimating individual Twitter users’ exposure to misinformation from those elites.
Those tools, which Rand and Mosleh have made public via a web app that allows people to check the misinformation exposure rating and ideological bias of any Twitter user and an API for researchers, show that users who follow political elites with high falsity scores are more likely to share misinformation. They also reveal a clear partisan difference in exposure to politicians who make false claims.
In their study, the researchers identified 816 political elites—influential political players and organizations, among them all sitting U.S. representatives and senators—on Twitter and gave each a falsity score by averaging the ratings from all of that elite's fact-checks in the public fact-checking database PolitiFact. They then assigned individual Twitter users misinformation exposure scores by averaging the falsity scores of the elite accounts they follow.
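The two-step scoring method described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' code: the handles, fact-check ratings, and the 0-to-1 numeric mapping of fact-check verdicts are all invented for the example.

```python
from statistics import mean

# Step 1: each elite's falsity score = the mean of that elite's
# fact-check ratings (here 0 = true, 1 = false -- an assumed scale).
fact_checks = {
    "@elite_a": [0.0, 0.2, 0.4],   # mostly accurate claims
    "@elite_b": [0.8, 1.0, 0.6],   # mostly false claims
}
falsity = {elite: mean(ratings) for elite, ratings in fact_checks.items()}

# Step 2: each user's misinformation exposure score = the mean falsity
# of the rated elite accounts that user follows.
follows = {
    "user_1": ["@elite_a"],
    "user_2": ["@elite_a", "@elite_b"],
}
exposure = {
    user: mean(falsity[e] for e in elites if e in falsity)
    for user, elites in follows.items()
}

print(exposure)  # user_2 scores higher: they also follow a less honest elite
```

Following one frequently fact-checked dishonest account is enough to raise a user's score, which is what makes the measure a proxy for exposure rather than for the user's own posting behavior.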
The researchers found a strong negative correlation between these misinformation exposure scores and the quality of information users themselves share on Twitter. The more false information from elites a person sees (as proxied by the extent to which they follow elites who make false statements), the more likely they are to share news from questionable sources (as identified by fact-checkers or by a politically balanced group of laypeople).
“This finding, if not necessarily surprising, is important because it points to the influential role that political elites can play in transmitting misinformation online,” Rand said. “Moreover, a more unexpected result is that when we look at the political extremes, there’s a big partisan asymmetry. Among conservatives, more politically extreme users were substantially more likely to follow dishonest politicians. Among liberals, on the other hand, this pattern was much more muted.”
Twitter users with higher misinformation exposure ratings were also more likely to use toxic language and express moral outrage, as measured using existing classifiers. "People on Twitter are also more likely to follow and engage with accounts with similar misinformation exposure, creating 'falsehood echo chambers,'" Mosleh said.
The work is significant, Rand explains, because the character of online misinformation has evolved over the last few years. "Originally, my group's work focused on 'bottom-up' viral misinformation, as typified in the 2016 election when Macedonian teenagers posted false claims that went viral and got lots of clicks," he said. "We showed how typical users do not want to share false claims but can get distracted, so shifting users' attention back to the idea of accuracy can reduce misinformation sharing."
While the prevailing view held that social media was dangerous because anyone could say anything they wanted, regardless of their authority, Rand found that a different type of misinformation arose during the 2020 election cycle. When it came to issues like election integrity and COVID-19, there were coordinated, top-down misinformation campaigns by political elites. "This is important because it means that policies which give elites a free pass—like Twitter's reinstatement of suspended accounts of users like Donald Trump, or Meta's prohibition on fact-checking active politicians—can be problematic," Rand said. "It is essential to hold political elites accountable when they spread false or misleading claims."
While false viral information is problematic in its own right, elites have a unique power to amplify and lend weight to specific claims. Extensive research has shown that people are more likely to believe claims from political figures they support and less likely to believe information shared by those they oppose. Simple exposure to false claims, especially from trusted leaders, can affect social media users' opinions—yet most research until now has focused on the content users share themselves, which is just a small fraction of what they see. In the future, the researchers are interested in tracking how exposure to this information affects user behavior.
“We hope that people will use this methodology to do studies that predict what kinds of users are following more or less honest politicians, and more generally get people to think about the choice of who they’re following and what they’re exposed to,” Rand said. “What happens if you learn that a lot of the elites you’re following tend to lie? My guess is that many people don’t want to be following elites that tend to lie, but they don’t really think about it or know it.”
Media Contact:
Casey Bayer,
[email protected],
914.584.9095
SOURCE MIT Sloan School of Management