AI Act Draft Standardisation Request

Earlier this week the European Commission issued a second draft standardisation request to the European Standardisation Organisations in support of safe and trustworthy artificial intelligence. The draft contains the rationale and terms of the request, a ten-item list of standards and standardisation deliverables to be drafted, and a set of requirements for the prospective standards. CEN and CENELEC are the primary recipients of the request, with ETSI mentioned as a contributor to its fulfilment.

The required standards will support the implementation of the AI Act when it becomes applicable. The regulation is still being discussed in the Council of the European Union and the European Parliament, will soon go through the interinstitutional negotiations (trilogue), and will almost certainly undergo some changes. The Commission may amend the standardisation request accordingly, to reflect changes in the final text of the AI Act or as a result of additional input from relevant stakeholders.

In contrast, other aspects of the request are unlikely to change. First, the Commission encourages collaboration between European and international SDOs, and the possible adoption of standards developed by ISO/IEC, on the basis of the Vienna and Frankfurt agreements. Furthermore, in line with Regulation (EU) No 1025/2012 on European standardisation, the Commission requires the ESOs to involve SMEs and civil society organisations in the standardisation process.

🤩 Currently, trade unions, consumer organisations, and the specialist non-profit ForHumanity represent civil society in the development of these standards.

Finally, the standards produced in response to the request should be aligned with the Commission's policy objectives in the field of AI. These include, in addition to the specific objectives of the AI Act, the safety of AI products and services, the respect of fundamental rights and European values, the digital sovereignty of the Union, the growth of the AI market, the public interest, and the rights of persons with disabilities.

The standardisation request identifies ten areas in which CEN and CENELEC are to produce or adopt standards. They correspond to some of the requirements and obligations put forward in Chapters 2 and 3 of Title III of the AI Act.

  • risk management system for AI systems, Art. 9

  • governance and quality of datasets used to build AI systems, Art. 10

  • record keeping through logging capabilities by AI systems, Art. 12

  • transparency and information provisions for the users of AI systems, Art. 13

  • human oversight of AI systems, Art. 14

  • accuracy specifications for AI systems, Art. 15

  • robustness specifications for AI systems, Art. 15

  • cybersecurity specifications for AI systems, Art. 15

  • quality management system for providers of AI systems, including post-market monitoring process, Art. 17

  • conformity assessment for AI systems, Art. 19

In Annex II, the standardisation request also outlines general requirements for all future standards. They shall reflect the state of the art so as to minimise risks to health, safety, and fundamental rights. Moreover, they shall be based on a common terminology and be consistent with other harmonised European standards. Furthermore, as noted above, all relevant stakeholders, including representatives of SMEs and civil society organisations, should be involved in the standardisation process. Finally, standards should address horizontal risks, but may also provide vertical specifications.

🤩 Vertical specifications may be necessary for standards on human oversight and accuracy, because you cannot decide how accurate a system needs to be without knowing what it is for.
