Facebook Steps Up Efforts To Combat The Spread Of Coronavirus Misinformation
As a global pandemic takes hold, more people are turning to Facebook in search of news about the coronavirus.
But the traffic load on the social media platform is also testing its ability to crack down on a spike in virus-related misinformation. Users are being confronted with phony cures and conspiracy theories around the virus' origin. (Note: Facebook is a financial supporter of NPR.)
Nick Clegg, Facebook's vice president of global affairs and communications, says he can't quantify the volume of misinformation around the virus, but that the company will remove coronavirus-related information that has the potential to cause physical harm.
"We do not allow misinformation to circulate on Facebook which can lead to real-world harm," Clegg said in an interview with All Things Considered on Wednesday. "So if people say drinking bleach is going to help you vaccinate yourself against coronavirus — that is dangerous. We will not allow that to happen. We won't even allow folk to say social distancing makes no difference in dealing with this pandemic."
On Wednesday the company outlined the efforts it's taking to prevent the spread of inaccurate content during the public health crisis.
Through pop-ups and a new COVID-19 information center on Facebook, the company says it has directed more than 1 billion people on Facebook and Instagram to resources from the World Health Organization, the Centers for Disease Control and Prevention, and regional health authorities, and that over 100 million users have clicked on the content.
The company's moves to curb pandemic-related misinformation on the site are aggressive in comparison to its hands-off approach in the moderation of political messaging.
Particularly in the wake of the 2016 U.S. presidential election, and now during the 2020 race, critics and lawmakers have slammed the company for not doing enough to combat the circulation of false claims from politicians via ads and other messaging.
"What politicians say on the campaign trail about each other is not what a medic or an epidemiologist says about a pandemic," Clegg says. "They're completely different forms of information. One is underpinned by science and established expertise, which no one questions." He adds that it's easier for the company to act under the "strict expertise and guidance" of institutions like the WHO and CDC.
"The other is a highly contested form of speech. That is the whole point about political speech in a democracy."
But, he says, Facebook does have limits when it comes to political content.
"You cannot use your freedom as a politician in the United States, for instance, to say things which will lead to real-world harm," Clegg says.
There's still room for gray area, and it's unclear whether these criteria apply to high-level officials, including the president himself.
In a still-live post from the White House's page on Facebook, President Trump gave a press briefing in which he embraced chloroquine as a promising treatment in the fight against the coronavirus. An Arizona man later died, and his wife was hospitalized, after the couple consumed a form of the chemical.
Last week, according to a Facebook internal report obtained by The New York Times, more than half of the stories being read on the platform in the U.S. were coronavirus-related. The company also recently reported a 50% increase in "total messaging" over the last month in several of the countries most impacted by the virus.
At the same time, the company's increased reliance on artificial intelligence for content moderation could further compromise its ability to effectively police content. Facebook has acknowledged that human content moderators are the best line of defense.
But those contracted employees, who weed through hours of sensitive and often disturbing content — and can suffer serious mental health side effects as a result — were placed on paid leave last week after Facebook failed to come up with an option for them to continue their work remotely.
Clegg added that a number of full-time employees will be trained to review some of the more harmful content, in areas including child safety, terrorism, suicide and self-injury.
But, he said, users should expect more mistakes as Facebook makes these adjustments amid the increased flow of content.
"It is perfectly possible there will be occasional mistakes and gaps or a slightly slower response than would've been the case in normal times," Clegg said. "These are not normal times."
In a press call on Wednesday, Facebook CEO Mark Zuckerberg said those mistakes will inevitably include content that shouldn't be taken down.
Copyright 2020 NPR. To see more, visit https://www.npr.org.