France pitches changes to the supervisory board, market surveillance in AI regulation


Yet another compromise text on the AI Act has been circulated amongst the diplomats of the EU Council by the French Presidency ahead of a working party meeting on Tuesday (10 May).

The new text, seen by EURACTIV, makes significant modifications to the provisions on the European Artificial Intelligence Board, market surveillance, guidelines and codes of conduct.

Member states have been generally happy with the direction the French Presidency has given to the file, an EU diplomat told EURACTIV.

European Artificial Intelligence Board

The structure of the Board has been changed to include a representative for each member state, instead of the national supervisory authority. This representative will be designated for a three-year mandate, renewable once.

Eight independent experts have been added to the board, two for each of the categories representing SMEs and start-ups, large companies, academia and civil society. These experts will be selected by the national representatives “in a fair and transparent selection process,” the compromise text reads.

The European Data Protection Supervisor has been demoted from full member to mere observer. The Commission’s role has also been significantly downsized from chair of the Board to a participant without voting rights.

In the French proposal, the rules of procedure are to be adopted by the national representatives alone, with a two-thirds majority. These procedural rules should define the selection process for the independent experts and the selection, mandate and tasks of the board’s chair, who will have to be a national representative.

The board may establish temporary or permanent sub-groups on specific topics, to which representatives of businesses, civil society, affected persons, researchers, standardisation organisations and notified bodies may be invited as observers.

The board will advise the Commission, upon request or on its own initiative, on any relevant matter related to the implementation of the regulation, including updates to the annexes on the definition of AI and the list of systems considered high-risk, the preparation of secondary legislation and the development of guidance.

One of the board’s specific tasks is to harmonise administrative practices at the national level on derogations from the conformity assessment procedure, the functioning of regulatory sandboxes and testing in real-world conditions.

French EU presidency wants ‘proportionate’ fines, extended deadlines in AI Act

The French presidency of the EU Council has made a series of proposals regarding the enforcement of the EU’s Artificial Intelligence Act, in a new compromise text seen by EURACTIV.

Guidelines

A new article has been introduced requiring the European Commission to provide guidelines, on its own initiative or upon the board’s request, on how to apply the AI regulation, notably regarding compliance with the requirements for high-risk systems, the prohibited practices and how to implement significant changes to existing systems.

The guidance would also cover the identification of criteria and use cases for high-risk AI systems, how to implement the transparency obligations in practice, and how the AI regulation will interact with other EU legislation.

“When issuing such guidelines, the Commission shall pay particular attention to the needs of SMEs including start-ups and sectors most likely to be affected by this Regulation,” the text adds.

Market surveillance

The modification in this part of the text is “intended to clarify the powers of the market surveillance authorities and the modality in which they will exercise their power, as well as the extent to which they should have access to relevant data and information, especially the source code.”

The market surveillance authorities would have to be granted full access to the source code of a high-risk AI system upon a ‘reasoned request’, namely when the code is needed to assess the system’s conformity and the data and documentation already provided have been deemed insufficient.

The compromise text designates the market surveillance authorities as responsible for supervising high-risk systems used by financial institutions. The national authorities will have to inform the European Central Bank without delay of any information relevant to its supervisory tasks.

The procedure for flagging measures taken against non-compliant AI systems to other member states and the Commission has been significantly changed. These cases now cover AI systems breaching the prohibited practices, high-risk systems failing to meet their requirements, and failures to comply with the transparency obligations set out for deep-fakes and emotion recognition.

Generally, the Commission and EU countries will have three months to object to such actions. In the case of suspected breaches of the ban on prohibited practices, the deadline has been shortened to 30 days.

If an objection is raised, the Commission shall enter into consultation with the relevant national authorities. The EU executive will decide within nine months whether the measure is justified; for prohibited practices, the deadline is 60 days.

The Commission may overrule the national authority’s decision. However, if the Commission deems the measures appropriate, all other authorities will have to replicate them, including by withdrawing the AI system from the market.

Codes of conduct

The article on the codes of conduct has been amended to clarify that these will be voluntary tools for AI systems that do not fall in the high-risk category. The codes of conduct have been expanded to cover the obligations of users of AI systems.

French Presidency pushes for alignment with the new legislative framework in AI Act

France is proposing several changes to the Artificial Intelligence (AI) Act to ensure better alignment with the new legislative framework, the EU’s legislation that regulates market surveillance and conformity assessment procedures.

[Edited by Alice Taylor]
