An expert group advising the European Commission on so-called fake news wants tech giants to be more transparent about their advertising revenue. Their new report pressures social media firms like Facebook and Twitter to deal with the spread of false information on their platforms.
Platforms should change their advertising policies to “discourage the dissemination and amplification of disinformation for profit”, the group wrote in a report published on Monday (12 March).
The Commission will announce its own non-binding recommendations on fake news on 25 April. It set up the expert group late last year and selected 39 researchers and representatives from tech platforms, media and NGOs to draft a report ahead of the upcoming strategy.
The expert group “deliberately” rejected the term “fake news”, and said its recommendations address “disinformation” that is not illegal but is “designed, presented and promoted to intentionally cause public harm or for profit”. So far, the Commission has described its April announcement as an effort to address fake news.
Members of the group began meeting in January. The experts’ work ended with the publication of Monday’s report, but they could reconvene in November to judge whether companies are taking on the recommendations, which focus largely on big online platforms’ role in spreading disinformation.
“The growing power of platforms to enable—and potentially to interfere with—the free circulation of information comes with growing responsibilities,” the report reads.
It urged the Commission to consider how disinformation could spread online before the 2019 European elections.
Social media platforms have faced mounting pressure in recent months to remove posts containing false information and to shut down fake user accounts. Political pressure intensified after Facebook revealed last autumn that a Russian company with ties to the Kremlin had purchased political advertisements ahead of the 2016 election in the United States.
The Commission’s expert group wants social media companies to be more transparent about who pays to advertise on their platforms. The report outlined ten principles that platforms should follow, including “adhering to ‘follow-the-money’ principle, whilst preventing incentives that lead[s] to disinformation”.
Lobbyists from Google, Twitter and Facebook were part of the expert group and approved the report.
The document calls on platforms to clearly label when advertisements are sponsored or paid for by political groups, and to be transparent with their users about when their personal data is used for targeted advertising.
EU Digital Commissioner Mariya Gabriel said that the Commission will consider the group’s recommendations, but will not commit to including them in the strategy she will announce next month.
Gabriel also insisted that the Commission will not yet introduce binding legislation forcing platforms to remove false information, but may consider regulation in the future.
“That’s very much the approach we need to stick to: no legislation at this point. Let’s try and identify the problem, let’s put in place a number of measures, let’s monitor the implementation of these measures. And then the Commission reserves the right to see what to do in the future,” she told reporters on Monday.
Gabriel spoke at a joint news conference with the chair of the expert group, Madeleine de Cock Buning, a professor of media law at Utrecht University.
De Cock Buning said she was confident that platforms will adjust their advertising policies and take measures to be more transparent about political ads.
“It is also in their own best interest to do this because they want to be publicly on the right side of the spectrum,” she told reporters.
“It’s good for their business models to stand aboard and that’s a very important success factor for self-regulation,” she added.
Sources involved in the group’s work pointed to Unilever’s announcement in February that it would pull its ads from Facebook and Google if the platforms do not address their role in the spread of false information online.
“As one of the largest advertisers in the world, we cannot have an environment where our consumers don’t trust what they see online,” the multinational’s chief marketing officer Keith Weed said at a conference last month in California.
One source with knowledge of the EU plans said that Facebook “realised the problem too late” in the US, but that there is not yet as much public outcry in Europe as there was after the US election.
“We are not at that point, we have a chance,” the source said.
“I think in Europe we have a sound basis to make sure whatever content is supported by funding the platform that received this money should advertise this content in the same way broadcasters do with sponsored and promoted content,” the source added.
The expert group’s report also caused some controversy. One of its 39 members did not back the final text.
Monique Goyens, director general of BEUC, the umbrella association representing consumer groups from EU countries, said in a statement on Monday that she “deplores that the report does not tackle the root causes of fake news”.
Goyens wants the Commission’s powerful competition policy department DG Comp to investigate whether online platforms’ advertising policies are anti-competitive.
“Platforms such as Google or Facebook massively benefit from users reading and sharing fake news articles which contain advertisements. But this expert group chose to ignore this business model. This is head-in-the-sand politics,” she said in a statement.
“We are disappointed that the expert group did not agree on bolder steps. There is a window of opportunity to avoid fake news becoming an issue during the European Parliament elections and this opportunity should not be missed,” BEUC spokesman Johannes Kleis told EURACTIV.
One source with knowledge of the group’s discussions insisted that the final report reflects Goyens’ concerns about the platforms’ business model even though it does not explicitly call for an antitrust investigation.
The group could still call for further measures in November.
“If we’re not effective enough, anti-competition measures or even regulatory measures could kick in,” the source said.
The report also recommends a number of other measures to make online platforms more transparent and to help citizens identify which media reports are reliable.