TY - JOUR
T1 - Explainable Artificial Intelligence for Mental Disorder Screening
T2 - A Computational Design Science Approach
AU - Tutun, Salih
AU - Topuz, Kazim
AU - Tosyali, Ali
AU - Bhattacherjee, Anol
AU - Li, Gorden
N1 - Publisher Copyright:
© 2025 Taylor & Francis Group, LLC.
PY - 2024
Y1 - 2024
N2 - Mental disorders affect nearly one billion people globally, 94% of whom are undiagnosed and untreated due to an acute shortage of trained clinicians. In response to this crisis, this study introduces mental disorder scan (MDscan), a novel artifact for screening ten mental disorders using data from the SCL-90-R mental disorder screening instrument, an explainable artificial intelligence approach, and our own ShapRadiation algorithm. MDscan converts 90 mental health indicators for each patient into an easily interpretable diagnostic image for mental disorders, similar to radiological images, and explains which indicators contributed to that prediction, increasing clinicians’ ability to screen more patients in less time. A field evaluation with clinical data shows that MDscan has high classification accuracy, with average F1 scores between 0.77 and 0.94, compared against prerecorded ground truth. Furthermore, unlike traditional black-box models, MDscan’s transparency and explainability can help enhance trust in artificial intelligence (AI) applications for clinical use.
KW - Explainable AI
KW - algorithmic diagnosis
KW - computational approach
KW - field experiments
KW - health screening
KW - image generation
KW - mental health
UR - https://www.scopus.com/pages/publications/85214234624
DO - 10.1080/07421222.2024.2415771
M3 - Article
AN - SCOPUS:85214234624
SN - 0742-1222
VL - 41
SP - 958
EP - 981
JO - Journal of Management Information Systems
JF - Journal of Management Information Systems
IS - 4
ER -