School of Education

(Over)Trusting AI Recommendations: How System and Person Variables Affect Dimensions of Complacency

A new article co-authored by Dr Lenka Schnaubert has been published in the International Journal of Human-Computer Interaction.

Abstract

Over-trusting AI systems can lead to complacency and decision errors. However, both human and system variables may affect complacency, and understanding their interplay is important for HCI. In our experiment, 90 participants worked on traffic-route problems guided by AI recommendations and were assigned to either a transparent system that provided reasons for its recommendations or a non-transparent system. We found that transparent systems lowered the potential to alleviate workload (albeit not the tendency to neglect monitoring) but simultaneously fostered actual complacent behaviour. In contrast, performance expectancy fostered the potential to alleviate workload, but not complacent behaviour. Interaction analyses showed that the effects of performance expectancy depend on system transparency. These findings contribute to our understanding of how system- and person-related variables interact in affecting complacency, stress the differences between dimensions of complacency, and underline the need to carefully consider transparency and performance expectancy in AI research and design.

Please visit the publisher's website to read the full article.


Posted on Monday 25th March 2024
