The project focuses on the design, development, and evaluation of an explainable AI (XAI) framework aimed specifically at healthcare professionals. The framework seeks to turn the traditional ‘black box’ of AI systems into an understandable, useful, and clinically relevant tool, allowing healthcare personnel to gauge how much confidence to place in the predictions and recommendations these models generate.
To achieve this goal, a comprehensive methodological approach based on three complementary axes is adopted. First, a genuinely user-centred design is applied: explanations are developed to align with existing clinical workflows, and intuitive interfaces are created that respect the time and cognitive constraints of healthcare professionals.
Second, multimodal explanations are incorporated, combining different sources and formats of information to offer a richer and more robust understanding of the behaviour of AI models, while ensuring their technical reliability in critical clinical environments.
Finally, the framework integrates ethical and diversity considerations throughout, actively mitigating algorithmic biases and ensuring compliance with current European regulations on Artificial Intelligence.
Project funded by the Call for Grants for the recruitment of doctoral students by companies, research centres and technology centres: Industrial Doctorates 2024.
