Center for Leadership and People Management

Human-AI-Interaction in Healthcare

Our project aims to improve clinical decision-making and reduce medical errors by increasing the usability of AI health technology. With the rise of “shiny” new technology, enhancing clinical outcomes and thereby ensuring patient safety must remain a priority in healthcare and medicine. Providing healthcare professionals with better technology should not only reduce error rates but also free up time for more direct patient interaction, which ultimately leads to better care.

Within our research project, we will address three broad open questions:

  1. How are healthcare providers’ perspectives regarding AI related to how they interact with the technology?
  2. How does the presentation of AI-enabled advice to healthcare providers influence clinical decision-making?
  3. Can a human-factors-optimized AI-enabled support system reduce medical errors and improve patient safety?

We believe this complex research program requires interdisciplinary collaboration between psychology, medicine, and computer science.

Our international team combines expertise in psychology, medicine, and computer science, and reaches healthcare professionals from the EU to North America to gain multifaceted insights and perspectives across different healthcare systems.

More about our project and our team can be found here: https://clinaid-lab.com/.

Publications:

Cecil, J., Lermer, E., Hudecek, M. F. C., Sauer, J., & Gaube, S. (2023). The effect of AI-generated advice on decision-making in personnel selection. OSF. https://dx.doi.org/10.31219/osf.io/349xe

Gaube, S., Suresh, H., Raue, M., Lermer, E., Koch, T. K., Hudecek, M. F. C., Ackery, A. D., Grover, S. C., Coughlin, J. F., Frey, D., Kitamura, F. C., Ghassemi, M., & Colak, E. (2023). Non-task expert physicians benefit from correct explainable AI advice when reviewing X-rays. Scientific Reports, 13(1). https://doi.org/10.1038/s41598-023-28633-w

Hummelsberger, P., Koch, T. K., Rauh, S., Dorn, J., Lermer, E., Raue, M., Hudecek, M. F. C., Schicho, A., Colak, E., Ghassemi, M., & Gaube, S. (2023). Insights on the current state and future outlook of artificial intelligence in healthcare from expert interviews. JMIR Preprints. https://doi.org/10.2196/preprints.47353

Kleine, A. K., Lermer, E., Cecil, J., Heinrich, A., & Gaube, S. (2023). Advancing mental health care with AI-enabled precision psychiatry tools: A patent review. PsyArXiv. https://doi.org/10.31234/osf.io/wmr38

Kleine, A. K., Kokje, E., Gaube, S., & Lermer, E. (2023). Attitudes towards the adoption of two AI-enabled mental health tools among prospective psychotherapists: A cross-sectional study. JMIR Human Factors. https://doi.org/10.31234/osf.io/c8yr3

Gaube, S., Cecil, J., Wagner, S., & Schicho, A. (2021). The relationship between health IT characteristics and organizational variables among German healthcare workers. Scientific Reports, 11, 17752. https://doi.org/10.1038/s41598-021-96851-1

Gaube, S., Suresh, H., Raue, M., Merritt, A., Berkowitz, S. J., Lermer, E., … & Ghassemi, M. (2021). Do as AI say: Susceptibility in deployment of clinical decision-aids. npj Digital Medicine, 4(31). https://doi.org/10.1038/s41746-021-00385-9

Media:

LMU newsroom (https://www.lmu.de/en/newsroom/news-overview/news/research-project-can-artificial-intelligence-improve-medical-diagnostics.html)

Forbes (https://www.forbes.com/sites/michaelmillenson/2021/06/27/medical-ai-confronts-pesky-problem-people/?sh=6ca86841242f)