Online platforms face EU regulation on transparency and business contracts

European Commission Vice-President Andrus Ansip announced that he may propose legislation affecting online platforms like Google and Facebook by the end of 2017. [Jason Howie/Flickr]

Internet platforms like Google, Facebook and Amazon Marketplace face regulation over their contracts with other businesses by the end of the year, under possible new EU legislation announced today (10 May).

The proposal will affect “unfair contractual clauses and trading practices identified in platform-to-business relationships” and follows a 2016 Commission strategy that suggested “problem-driven” EU action but no across-the-board regulation of online platforms.

One Commission official said the rules would focus on forcing platforms to be more transparent and give companies options for redress if a platform wrongly removes their products from search results.

A Commission document published today said that “there is widespread concern that some platforms may favour their own products or services, otherwise discriminate between different suppliers and sellers and restrict access to, and the use of, personal and non-personal data, including that which is directly generated by a company’s activities on the platforms”.

Some online platforms remove products from search results “without due notice or without any effective possibility to contest the platform’s decision”.

The official said details of the plans have not yet been finalised, but that the EU executive is also analysing transparency measures for platforms’ algorithms. “We cannot say here today that we will come out with legislation on algorithms,” the official said.

The Commission’s announcement has drawn fury from tech companies.

Siada El Ramly, director of EDiMA, a Brussels-based association representing platforms like Amazon, Airbnb, Google and Facebook, said she was “astounded” by the announcement, which she called a “complete contradiction” of the Commission’s communication last year that promised not to introduce sweeping legislation.

The Commission published that communication last April, one week after EU Competition Commissioner Margrethe Vestager accused Google of using its Android mobile operating system to harm competitors by pre-installing Google Search and its own Chrome browser on Android mobile phones.

Some 66% of small and medium-sized businesses that responded to a 2016 Eurobarometer survey said their sales are significantly affected by how platforms position them in search results.

Georgios Petropoulos, a researcher at the Bruegel think tank, said new legislation could help smaller European companies in the online platform market, which is now dominated by American giants like Amazon and Google.

“Clearly we need some regulatory clarity to incentivise investment in platforms and to incentivise the scale-up of startups. Here we see some problems with the diversity of business models in some platforms,” he said.

Andrus Ansip, the Commission Vice-President for digital policies, announced the action on platforms during a press conference on the EU executive’s flagship Digital Single Market initiative, two years after its launch.

He also said that the Commission will by the end of this year introduce measures to better coordinate how online platforms remove users’ posts that contain illegal content once they receive notice about it.

Ansip told reporters before today’s announcement that rules on removing illegal content from platforms are unclear because there is “no common understanding of what ‘notice’ is”.

The Commission has come under pressure to act on platforms’ role in removing illegal content, including hate speech and copyright breaches. Several EU countries are pushing to pass their own legislation that takes a stricter, more legally binding approach than the EU executive, which has until now stuck to softer, non-binding agreements with companies to encourage them to remove more hate speech.

“We will get a fragmented Europe if we do not provide clarifications,” Ansip said.

Germany’s justice ministry proposed a new law in March that would require internet companies to remove hate speech or face fines of up to €50 million. The Bundestag has not yet voted on the bill. Several Commission officials have said they are concerned that the law might violate freedom of expression and platform liability rules, but Ansip said today that it is still too soon for him to react. The Commission is required to respond to Germany’s notification of the law by next month and can either approve it or ask the German government to address its concerns.

A European Parliament report that was passed by the Culture Committee (CULT) last month on draft EU audiovisual rules would require video-sharing platforms like YouTube to remove any illegal posts.

The Commission insists that its measure to guide platforms later this year will respect freedom of speech and will not affect the EU e-Commerce directive, a seventeen-year-old law that shields internet companies from liability for what their users post, provided they act once notified of illegal content.

The EU guidance on the rules for removing content will outline requirements like a minimum procedure to take down posts and dispute resolution for users who think their posts are wrongly removed.

Ansip told reporters today that the measure will clarify when firms will be required to react once they are notified of illegal content on their platforms, and what rights users have to object if their posts are removed.

A group of 24 MEPs, led by Dutch Liberal Marietje Schaake, wrote to Ansip today asking him to propose hard legislation, instead of non-binding guidelines, to specify when and how platforms should be required to remove illegal content. Legislation could create transparency requirements for platforms and prevent EU countries from introducing their own national laws against hate speech, they argued.

“A notice and action directive could counter these trends towards fragmentation. It would provide a framework which guarantees legal certainty for businesses and users alike, while increasing the transparency, effectiveness, and proportionality of takedown procedures,” the MEPs wrote.
