According To Pearson/NORC Poll, Most Americans Think Misinformation Is A Problem

According to the results of a poll released by the Pearson Institute and the Associated Press-NORC Center, 95% of Americans believe that misinformation regarding current events and issues is a problem, with 81% calling it a major problem.

Additionally, 91% say that social media companies are responsible for the spread of misinformation, and 93% say the same of social media users. More Americans blame social media users, social media companies, and U.S. politicians for the spread of misinformation than blame the U.S. government or foreign governments, though older adults are more likely than younger adults to blame foreign governments.

41% are worried they have been exposed to misinformation, but just 20% are worried they have spread it themselves. The poll, which surveyed 1,071 adults, found that younger adults are more likely than older adults to worry that they may have spread misinformation.

Lastly, most Americans felt that social media companies and users, the U.S. government, and U.S. politicians all share responsibility for dealing with the spread of misinformation.

The results of this poll shouldn’t be too surprising, as the threat and spread of misinformation have grown rapidly alongside the rise of social media over the past decade.

In addition, major events have been at the center of misinformation, such as elections, natural disasters, and the COVID-19 pandemic. Many people have had their opinions on the virus and vaccines affected by the fake news swirling around them, which shows that something as simple as a lie or an exaggeration in an article can have massive negative impacts.

Social media platforms have made attempts to combat misinformation in the past. Back in 2017, Facebook discussed some of the steps it was taking to limit the problem, such as improving fake account detection, working with fact-checking organizations to identify fake news, and making it harder for parties guilty of spreading misinformation to buy ads. Facebook also promised users easier ways to report fake news and improved news feed rankings.

Those improvements clearly haven’t done much, if anything at all. In 2020, Forbes reported on a study that found Facebook was the leading social media referrer to fake news, directing users to untrustworthy sources over 15% of the time while referring to hard news just 6% of the time. The margin between social media sites wasn’t close, either: Google came in at 3.3% untrustworthy versus 6.2% hard news, while Twitter had 1% untrustworthy versus 1.5% hard news.

Speaking to 60 Minutes, Facebook whistleblower Frances Haugen explained how the tech giant prioritized what content users would see in their news feeds, which helped lead to the spread of misinformation designed to provoke fierce reactions.

“And one of the consequences of how Facebook is picking out that content today is it is — optimizing for content that gets engagement, or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarizing, it’s easier to inspire people to anger than it is to other emotions.”

If you are worried about taking the bait on misinformation or spreading it around yourself, there are plenty of ways to train yourself to have a keener eye. According to The Verge, looking at factors such as the sources behind surveys and infographics, quotes, names and keywords, and the time-sensitivity of an article can all help you determine whether or not misinformation may be afoot.

You should also take the time to consider other details, such as who is providing the information and how different media sources are presenting the story. The Verge also urges readers to examine their own feelings: does the article stir strong emotions in you? Do you feel the urge to share it instantly? If an article feeds reactions more than it emphasizes actual facts or information, that could be a red flag.