**This article has been updated with comments from the European Commission.**
While medical professionals continue to raise the alarm about the dangers of inaccurate information on social networks, a new report estimates that content from health misinformation networks on Facebook generated 3.8 billion views last year, with top offenders drawing almost four times as many views as leading health institutions such as the World Health Organisation.
Despite a move in April by Facebook to establish a new system for alerting users who have engaged with misinformation, false content has continued to spread through the network, a report by rights group Avaaz found.
Researchers examined 82 websites and 42 “superspreader” Facebook pages generating widely shared misinformation, identified as such by independent fact-checkers, and found that inaccurate health information is being spread by an entire ecosystem of actors.
Results revealed that only 16% of the fact-checked misinformation analysed carried a warning from Facebook, while the rest remained on the platform without a label.
Meanwhile, the report uncovered a gap in Facebook’s ability to detect clones and variations of content originally flagged as false, leaving such content without a warning label, particularly when it appeared in other languages.
While Facebook’s efforts are commendable, the social media giant must do more, lead researcher Luca Nicotra told this site.
“They do a lot of policy announcements but then the numbers don’t show necessarily [the results],” Nicotra said, calling for more transparency about how policies are implemented by the company.
“In general, you don’t understand why Facebook is not implementing a systemic solution but during a pandemic it’s just unacceptable.”
“They need to step up their efforts and implement a systemic solution that addresses the problem with their algorithm right now,” said Nicotra.
According to the researchers, retroactively distributing corrections from independent fact-checkers to every user exposed to false or misleading information could halve belief in misinformation, while downgrading posts and actors in the ecosystem could decrease their reach by up to 80%.
Meanwhile, mounting evidence points to a grave impact on public health if no action is taken. One study published in the journal Nature in May predicted that anti-vaccination views will dominate in a decade, potentially amplifying outbreaks and leading to the reappearance of diseases.
Nicotra said that misinformation created “mistrust at a scale” towards the medical community, concerns that many medical professionals share.
“Trust is one of the main stems of the patient-doctor relationship,” said Dr. João Miguel Grenho, Secretary General of the European Union of Medical Specialists.
“What we are seeing is that the misinformation and the velocity of the spread of false information is undermining that trust and is putting into question its existence.”
Dr. Grenho said that medical professionals increasingly see patients who fundamentally do not trust their doctors or who “just embark on this misinformation and false treatments options [on their own], and they just don’t come to us until it’s too late.”
“The spread of the misinformation is like wildfire on dry land, it’s just spreading and making all of our work very difficult and challenging,” said Dr. Grenho, adding “we need to start to put some responsibility back on the vehicles of misinformation.”
Dr. Grenho said that we may see a backlash against the COVID-19 vaccine once it becomes available, which would undermine efforts to build herd immunity.
The concern is shared by Dr. Frank Ulrich Montgomery, council chair at the World Medical Association, representing over 10 million physicians.
The doctor said that besides undermining vaccine coverage, the other damage from the misinformation surrounding the pandemic is the denial of the existence or severity of the disease.
“It is hard to get this information out of Facebook but it is even harder to get it out of people’s heads,” he told EURACTIV.
Asked about the report, an EU Commission spokesperson told EURACTIV that while online platforms, including Facebook, have been promoting content from reliable sources and actively trying to combat disinformation, a “significant flow of falsehoods and conspiracy theories about the coronavirus” still spreads through social media channels.
“We need stronger action to improve transparency and platforms’ accountability, as well as access to non-personal data, and users’ awareness,” they said, adding that the Commission published a joint communication on disinformation back in June and has established a close monitoring programme on how platforms deal with the coronavirus.
“We are working with all actors, including online platforms – signatories of the Code of Practice against disinformation,” they said, adding that the evaluation of the Code of Practice will be published in September and that the Commission will also soon outline next steps in the European democracy action plan and the Digital Services Act.
In reaction to the report, Věra Jourová, the European Commission Vice-President responsible for values and transparency, wrote on Twitter that “disinformation related to health is a major threat for our societies”.
“We need stronger action to improve transparency, access to data, [and] users’ awareness.”
The EU executive is due to put forward its plans to regulate the online ecosystem and fight disinformation later this year.
* Natasha Foote contributed to this article.
[Edited by Benjamin Fox]