
Facebook To Begin Flagging Posts Containing False Information About Covid-19

Facebook is taking a major step to prevent the spread of misinformation about the coronavirus pandemic. The platform announced that within the next few weeks, users who have previously liked, reacted to, or commented on posts deemed “harmful misinformation” about Covid-19 will be directed to information from authoritative sources on the pandemic, such as the World Health Organization. 

Users and tech experts alike were surprised by this development, especially considering that company executives have in the past been quite adamant about not policing the information spread across the platform. 


“We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook. The notifications will apply only to Facebook and not our other platforms like Instagram and WhatsApp,” wrote Guy Rosen, Facebook’s vice president of integrity.

Social media in general is known for how quickly “fake news” spreads, often within seconds of being posted. From the beginning of the pandemic, dangerous lies about Covid-19 circulated on platforms like Facebook, which historically have lacked systems for filtering out posts containing misinformation. 

However, given the scale of the global pandemic, platforms such as YouTube and Twitter have joined Facebook in trying to flag as many fake posts as possible. In March alone, Facebook claims to have flagged over 40 million posts from around the world containing “false information” about the coronavirus. 


Facebook included a visual of what users who have interacted with false posts will see on their feeds. The design nudges users with concerns about the pandemic to click through to resources from the WHO specifically. The company is still refining the feature, and a spokesperson said Facebook will continue to “iterate on these designs.”

However, critics of Facebook aren’t satisfied with this minimal first step in combating the spread of lies about Covid-19, especially considering that this is literally a life-or-death situation. Facebook has long faced complaints of “lax moderation”; just this past fall, the company was criticized for allowing lies about immigration and politics to spread. 

“[T]he company has taken a key first step in cleaning up the dangerous infodemic surrounding the coronavirus, but it has the power to do so much more to fully protect people from misinformation. [We’ve] been pushing for stronger fact-checking and for corrections to be issued more broadly on the platform, not just on content about Covid-19. New research commissioned by our organization shows that Facebook corrections have a major impact in shaping users’ views and can effectively reduce people’s belief in misinformation by 50%,” wrote Fadi Quran, campaign director at the nonprofit activist group Avaaz.

Social media platforms are widely used and easily accessible to practically anyone around the world, so it’s hard for these companies to remove every bit of false information spread on their platforms. For now, it’s up to us, the users, to check the sources we’re getting our information from. When in doubt, consult the World Health Organization or Centers for Disease Control websites for the most accurate information about Covid-19.