CLASSICA D8.2 Bias

Publication: Book/anthology/dissertation/report › Report › Research › peer-reviewed


Abstract

This report presents CLASSICA’s ‘ethics by design’ approach, which addresses requirements and recommendations for avoiding bias in AI development. This work focuses on the requirements concerning bias in the EU AI Act, the ethical guidelines developed by the High-Level Expert Group on AI, and the WHO guidance on AI for health. The design choices mitigating biases in the CLASSICA project include, among others, prospective collection of data, multi-site studies, and a randomised controlled trial. To ensure reliable and unbiased performance measurement, training and validation datasets are kept independent of testing datasets. Additionally, to detect biases in the system’s performance, performance can be evaluated in sex and age subgroups of patients. Subgroups for which the system exhibits poor performance can be excluded from the future intended population.
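The subgroup evaluation described above can be illustrated with a minimal sketch. This is a hypothetical example, not CLASSICA’s actual implementation: the record fields (`sex`, `age_band`, `pred`, `label`), the accuracy metric, and the acceptance threshold are all illustrative assumptions.

```python
# Hypothetical sketch of per-subgroup performance evaluation for bias
# detection. Field names, metric choice, and threshold are assumptions.

def subgroup_accuracy(records, key):
    """Group prediction records by the attribute `key` and compute
    accuracy (fraction of correct predictions) per subgroup."""
    groups = {}
    for r in records:
        groups.setdefault(r[key], []).append(r)
    return {
        group: sum(r["pred"] == r["label"] for r in rs) / len(rs)
        for group, rs in groups.items()
    }

# Toy test-set records (made up for illustration).
records = [
    {"sex": "F", "age_band": "<65", "pred": 1, "label": 1},
    {"sex": "F", "age_band": "65+", "pred": 0, "label": 1},
    {"sex": "M", "age_band": "<65", "pred": 1, "label": 1},
    {"sex": "M", "age_band": "65+", "pred": 0, "label": 0},
]

by_sex = subgroup_accuracy(records, "sex")       # e.g. {"F": 0.5, "M": 1.0}
by_age = subgroup_accuracy(records, "age_band")

# Flag subgroups falling below an (assumed) acceptance threshold; such
# subgroups could be candidates for exclusion from the intended population.
THRESHOLD = 0.75
flagged = [g for g, acc in by_sex.items() if acc < THRESHOLD]
```

In a real evaluation one would use clinically appropriate metrics (e.g. sensitivity/specificity) with confidence intervals, since small subgroups make point estimates unreliable.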

The requirements of the AI Act concerning bias will apply from 2 August 2027 and, unlike the ethical guidelines, are legally binding. Many of the AI Act’s requirements have been addressed in the CLASSICA project by the abovementioned study design choices. To ensure that the AI Act’s requirements related to bias are fully implemented, ARCTUR (technical partner) should 1) review its quality management system and include additional policies/methods for data management and for bias detection and mitigation, as needed, and 2) ensure that the user interface of the system allows for correct interpretation of the system’s scores and makes surgeons aware of potential automation bias (i.e., overreliance on the output of the system).
Original language: English
Publisher: EU
Number of pages: 13
Status: Published - 2025

Keywords

  • Det Juridiske Fakultet
