Facebook to inform users if they've engaged with posts containing coronavirus misinformation

Apr 16 2020, 1:06 pm

Facebook is boosting its efforts to combat misinformation about the COVID-19 pandemic on its platform.

The company announced Thursday that it will begin displaying messages in the News Feed to users who have liked, reacted to, or commented on “harmful misinformation” about the pandemic that Facebook has since removed.

“These messages will connect people to COVID-19 myths debunked by the WHO, including ones we’ve removed from our platform for leading to imminent physical harm,” said Facebook in a statement, adding that people will start seeing the messages in the coming weeks.

The social media platform says stopping the spread of misinformation about COVID-19 is a priority, and it is currently collaborating with over 60 fact-checking organizations.

“For example, during the month of March, we displayed warnings on about 40 million posts on Facebook, based on around 4,000 articles by our independent fact-checking partners. When people saw those warning labels, 95% of the time they did not go on to view the original content,” said the company, adding that to date it has removed “hundreds of thousands” of posts containing misinformation.

Daily Hive Staff
