Facebook to warn users if they’ve liked or commented on COVID-19 misinformation posts

Facebook will now let users know if they’ve liked, commented on or reacted to posts containing harmful misinformation about COVID-19 that the company’s moderators later removed.

The company announced the new measure Thursday and said the messages will also connect people to COVID-19 myths that have been debunked by the World Health Organization, including those spread in posts Facebook has removed “for leading to imminent physical harm.”

The new feature will roll out in the coming weeks.

“We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook,” said the social media company in a post on its corporate website.

Facebook has also added a new section to its website, called Get the Facts, which includes fact-checked articles from the company’s partners that debunk misinformation about COVID-19.

The company said it has been working to connect people to accurate information from health experts since the virus was declared a public health emergency in January.  It has also partnered with fact-checking organizations around the world to review content.  

“We’ve now directed over 2 billion people to resources from the WHO and other health authorities through our COVID-19 Information Centre and pop-ups on Facebook and Instagram with over 350 million people clicking through to learn more,” said the company.

Facebook has also banned ads that promote COVID-19 treatments or cures.

Avaaz, a US-based activist network, said it was “strongly encouraged” by conversations it had with Facebook earlier this week about its new messaging feature.  

“We were strongly encouraged by unprecedented commitments from Facebook to institute retroactive alerts to fight coronavirus misinformation, an important and necessary first step that could potentially save lives.”

In a survey released Wednesday, however, Avaaz found that Facebook’s current policies are “insufficient and don’t protect users,” despite the “commendable efforts” of Facebook’s anti-misinformation team.

The advocacy group’s survey found “significant delays” in Facebook’s implementation of its anti-misinformation policies, meaning millions of users see misinformation before the social media company labels it with a fact check and warning screen, or removes it.

Specifically, the survey found that it took an average of 22 days for Facebook to issue warning labels on false content.

In one example from the survey, a Facebook post claiming the coronavirus is “destroyed by chlorine dioxide” was viewed nearly 200,000 times.