Welcome to EURACTIV’s Digital Brief, your weekly update on all things digital in the EU. You can subscribe to the newsletter here.
“We will not let Kremlin apologists pour their toxic lies justifying Putin’s war or sow the seeds of division in our Union.”
– European Commission President Ursula von der Leyen
Story of the week: Commission President Ursula von der Leyen on Sunday (27 March) promised to halt the spread of “toxic and harmful disinformation in Europe” by banning Russian state media outlets RT and Sputnik, which she described as “the Kremlin’s media machine in the EU.” RT and Sputnik pushed back against the proposal, with the French branch of the former pledging to “take all possible legal recourse” against any measures introduced.
The measures came into force on Wednesday in the form of economic sanctions on Sputnik and five RT entities, banning the broadcast and distribution of their content within the EU, both online and on-air. The sanctions, designed to work in parallel with existing media regulations, target the organisations as legal entities, meaning their journalists can continue working, although the material they produce cannot be distributed. The European Federation of Journalists condemned the ban, stressing that censorship is not the right tool to fight disinformation, that media regulation falls outside the Commission’s competencies, and that the likely retaliation will further impoverish Russia’s media landscape.
But Moscow is far from bothered about media freedom. While RT laments breaches of freedom of expression, Russia has shut down the last independent media critical of the government. Ekho Moskvy and Dozhd TV are both no more after the Russian media regulator warned against organisations spreading “false information” in reporting the conflict. The authority cautioned that describing Russia’s aggression as an “attack, invasion, or declaration of war” would lead to penalties. Access to social media has also been restricted, and legislation is underway to criminalise the distribution of “unofficial” information, with a potential prison sentence of up to 15 years for those found guilty. Meanwhile, Kyiv’s TV tower was bombed, resulting in at least five deaths. Read more.
Don’t miss: China is expected to make a new push for a centralised version of internet governance at this week’s World Telecommunication Standardisation Assembly in Geneva. The idea is not new and was already rebuffed in 2019. The technical justification rests on developing specific features, such as reserving bandwidth for applications that require a certain connection speed (low data latency, in the jargon), for instance online gaming or the metaverse. However, the intent is political: this setup would require authenticating traffic in order to charge extra for such services, giving internet providers the capacity to control traffic on every connected device. The WTSA sets out the four-year mandate for the ITU, the UN’s telecommunications agency, which elects its new secretary-general in September. The leading candidates are an American and a Russian. Just one month ago, Russia and China signed a joint statement calling for a more sovereign internet and greater participation of the ITU in the internet governance debate.
Ironically, it was Russia that asked not to politicise the internet governance debate after the WTSA kicked off with emotional statements on the Ukrainian conflict, in which Western countries supported Ukraine and none supported Russia. Western delegates have asked to bar Russia from holding any leadership position and called on the ITU secretary-general to support the motion. These events have completely overshadowed the Chinese proposal, which is now expected not as a single motion but in bits and pieces scattered across different documents. The most remarkable attempt so far has been a proposal to create a fast-track procedure for developing countries to set discussion points in working parties, something Western delegates see as a way to circumvent scrutiny of what is proposed.
Also this week:
- Tech companies mobilised against Russian propaganda, and many pulled out of the market.
- EU lawmakers position themselves on the AI Act policy discussions as delays pile up.
- The third political trilogue on the DMA tried to pave the way for a final compromise.
- European telecom companies are offering free services to Ukrainian refugees.
Before we start: Cyberattacks are hardly limited to one country in an interconnected world. We discuss the impact of Russian cyberattacks against Ukraine on European companies with Dr Vera Demary, head of digitalisation at the German Economic Institute, and Iva Tasheva, cybersecurity consultant at CyEn.
Artificial Intelligence
Key discussion points. At an event on Thursday, IMCO co-rapporteur Brando Benifei outlined the main points for discussion on the AI Act, both internally in the Parliament and in future negotiations with the Council. On the definition of AI, Benifei said that “this is the AI Act, not the Algorithm Act,” while recognising that the definition will likely not change much. On the list of high-risk use cases, the MEP signalled that he does not see eye to eye with certain things proposed in the Council, such as the parts on financial services, ICT infrastructure and democratic processes, the latter considered too limited. The Italian lawmaker stressed that he is not happy with the Commission’s version of the certification mechanism based on a self-assessment because “we don’t want to find out biases in systems after they have destroyed families and ruined lives, as it happened in certain countries”. His comments were in reference to the childcare benefits scandal in the Netherlands and the Post Office scandal in the UK. Finally, the social democrat wants to review the chain of responsibilities to ensure that big corporations do not offload the burden onto end-users and small system providers.
JURI report. The draft report from Axel Voss started circulating this week even before it was finalised. It includes strong wording on the preservation of trade secrets, including preventing market surveillance authorities from requesting documentation on the conformity assessment. The fines have been lowered from 6% to 4% of annual turnover, and new wording has been added to determine the final amount. On general-purpose AI, Voss kept the definition suggested in the Council and excluded it from the scope of the AI Act with a new Art. 2.3.b. The liability conditions have also been aligned with the Council’s text in a new Art. 23.a. On biases, new ethical standards have been included for the development of technical standards. The quality of datasets is to be assessed on how far they minimise disparities in outcomes linked to a particular demographic trait or stereotype. The approach to high-risk AI has been completely changed, leaving it to users’ discretion whether an AI system applied to a critical sector such as education or law enforcement is high-risk.
ITRE report. ITRE rapporteur Eva Maydell also made her move, submitting her draft report alongside Voss’s but positioning herself on the more moderate side of the EPP. The overall strategy is that if Europe is to succeed in making its values the international standards in AI, it should enable its companies to be influential actors in the AI market. Supporting SMEs and startups is therefore a strategic choice, and the report includes SME exemptions, albeit with safeguards. The AI definition has been virtually copy-pasted from the OECD’s, with the idea of keeping the two aligned if either changes. For general-purpose AI, the providers of these systems should collaborate with system providers and be registered in a database. In terms of robustness, accuracy and security, the draft report states that these requirements should be reasonable to implement, with the same metrics and under a single authority to avoid market fragmentation. Finally, the Bulgarian MEP is pushing to make regulatory sandboxes accessible in all countries with the support of the European Commission, and for a more detailed explanation in the annex of how these should work. The draft report also mandates collaboration with the existing Digital Innovation Hubs and introduces a differentiation between sandboxes for hardware and software.
Likely delays. Interestingly, Benifei also warned the Council that it “needs to keep up” if the trilogue negotiations are to start before the end of the year as planned. This week, a new IMCO/LIBE timeline was circulated to the other committees, confirming rumours of delays on the Parliament’s side. The joint draft report is now expected on 11 April, the deadline for amendments has been pushed back by almost one month to 18 May, and the final committee vote to the end of October. The plenary vote in November has, however, not changed. Expect the dates to shift further as the more contentious points come to a head during work on the compromise amendments. Besides the heated debate on facial recognition, another dividing line between left- and right-wing MEPs is likely to be the compliance mechanism and the extent to which companies will be allowed to keep the review internal.
Data & privacy
European Parliament leaders have approved the establishment of a committee to investigate the abuse of Pegasus spyware in the EU. It was revealed last year that governments worldwide had been using military tech to infiltrate the phones of political leaders, journalists, activists and others. The committee will look into allegations that both Hungary and Poland deployed the tools, but MEP Sophie in’t Veld warned today that the inquiry could turn its attention to a long list of governments as new information emerges. Read more.
Debunking together. Sixty fact-checking organisations from more than 50 countries have been working on an international database to fight the constant flow of disinformation around the Russian-Ukrainian war. In less than a week, the network debunked hundreds of fake news stories on the conflict. The website includes an interactive map with a breakdown per country.
Digital Markets Act
Third out, one to go. The third political trilogue took place on Tuesday this week, despite the extraordinary plenary session in response to the Ukrainian conflict. The meeting was maintained at the insistence of the rapporteur Andreas Schwab, who told EURACTIV the meeting was very good and that he was still convinced an agreement could be reached by April. The trilogue did not result in any major breakthroughs, but it formally adopted some compromises reached at the technical level and prepared the groundwork for the upcoming technical meetings, which will intensify the work ahead of the (potentially final) trilogue on 24 March. The negotiations are marked by increased secrecy, as the negotiators are still trying to nail down some key aspects of the obligations and enforcement.
It’s all about enforcement. The Commission is under increasing pressure to provide sufficient resources, which might mean around 250 staff members. A potential arrangement could be a joint task force between DG COMP and DG CONNECT. The Apple case in the Netherlands is also influencing the discussions, as policymakers are concerned that gatekeepers might circumvent DMA obligations (although the DMA’s sanctions are much higher). The right of third parties to contribute to investigations might also mitigate the lack of capacity, but it continues to be resisted by the Commission, which does not want to lose control over implementation.
Interoperability, yes or no? An agreement on the interoperability obligations, at least for messaging services, seems increasingly likely following a change of attitude on the Council’s side, potentially driven by the German government. However, the day after the trilogue, the Commission circulated a working paper attacking the obligations as unnecessary and technically risky, according to MLex. The EU executive reportedly stressed that interoperability would create a security risk for encrypted messages and a content-moderation headache for social media. Moreover, the Commission made the case that these obligations would favour the gatekeepers’ dominant positions and reduce the incentive to innovate.
More to discuss. The obligations on default settings, fair treatment and self-preferencing remain on the table for further technical discussions. On targeted advertising, the Commission intervened with the same working paper, making a case for moving the provisions to the DSA. An agreement seems to be nearing on including the transaction-based model in the definition of active users for online marketplaces. The quantitative thresholds are another point of contention, but they will only be addressed once the obligations are sorted.
Digital Services Act
Ukrainian disruptions. The Ukrainian conflict has disrupted what was already slow progress on the DSA. As the proposal is meant to be the EU’s rulebook for content moderation, the negotiators have been discussing how to expand the anti-disinformation measures given the war propaganda around the Russian invasion of Ukraine.
Chips for industry. The industry working party held its first meeting on the Chips Act on Thursday, with three more planned for this month, on 10, 16 and 31 March. This working party will be responsible for the three pillars of the proposal, whereas the research working party will lead on the joint undertaking part, according to an internal document seen by EURACTIV. However, the file had been expected to land on the desk of the telecom working party, which was not informed of this rearrangement, an EU diplomatic source told EURACTIV.
Matching gaps. To tackle the gender gap within the tech sector, the gender pay, pensions and care gaps must also be closed, Commission officials have said. Policymakers stressed the importance of tackling broader issues of gender inequality if the digital skills gap was to be closed and gender balance within STEM studies and careers achieved. Read more.
Big Tech mobilisation. In response to calls to tackle the issue of Kremlin-backed disinformation surrounding the war, tech platforms have been rolling out measures to clamp down on misleading content. Amongst those appealing for action were the leaders of Lithuania, Latvia, Estonia and Poland, who sent an open letter to Meta, Twitter, Google and YouTube. Google this week announced that it would block access to the YouTube accounts of RT and Sputnik in Europe, and other platforms have taken similar steps. An EU official told EURACTIV that there had been a good collaboration between tech companies and the Commission, describing it as unprecedented compared to the Trump era. Read more.
Rus-exit. Apple joins several other companies in exiting the Russian market over the invasion of Ukraine, as the iPhone-maker announced this week that it would no longer sell its products in Russia. The decision followed a letter to Apple’s CEO from Ukraine’s Deputy Prime Minister calling on the company to block Russians’ access to its store and support US sanctions. Apple also says it has “limited” Apple Pay, its mobile payment platform, and disabled features on Apple Maps in Ukraine as a safety measure. Read more.
Research & Innovation
Research cut-out. The European Commission will suspend further research cooperation projects with Russian partners in response to the country’s invasion of Ukraine, innovation commissioner Mariya Gabriel announced. The preparation of grants to four Horizon Europe projects has been suspended, as have payments under existing contracts. Several EU countries have also followed suit, urging their academics to cut ties with Russian and Belarusian counterparts, but there remains some debate over whether this is the correct approach, with some in academia appealing for communication between researchers to be sustained.
Free services for refugees. Telecom operators in Europe have voluntarily taken measures in response to the humanitarian crisis unfolding as hundreds of thousands of refugees leave Ukraine. While the measures vary between countries, they include lifting fees on international calls to Ukraine, providing free Wi-Fi at refugee camps, and distributing SIM cards to those fleeing the conflict. Work is also underway to halt roaming charges with Ukraine, though the complexities of ensuring the coordination of all operators make implementation more complicated. Read more.
What else we’re reading this week:
Europe Is in Danger of Using the Wrong Definition of AI (Wired)
Police use of Pegasus malware not illegal, Israeli inquiry finds (The Guardian)
Hackers Breach Russian Space Research Institute Website (Vice)
[Edited by Alice Taylor]