Chatbot breakthrough in the 2020s? An ethical reflection on the trend of automated consultations in health care

Research output: Contribution to journal › Article › Scientific › peer-reviewed



Many experts have emphasised that chatbots are not sufficiently mature to be able to technically diagnose patient conditions or replace the judgements of health professionals. The COVID-19 pandemic, however, has significantly increased the utilisation of health-oriented chatbots, for instance, as a conversational interface to answer questions, recommend care options, check symptoms and complete tasks such as booking appointments. In this paper, we take a proactive approach and consider how the emergence of task-oriented chatbots as partially automated consulting systems can influence clinical practices and expert–client relationships. We suggest the need for new approaches in professional ethics as the large-scale deployment of artificial intelligence may revolutionise professional decision-making and client–expert interaction in healthcare organisations. We argue that the implementation of chatbots amplifies the project of rationality and automation in clinical practice and alters traditional decision-making practices based on epistemic probability and prudence. This article contributes to the discussion on the ethical challenges posed by chatbots from the perspective of healthcare professional ethics.

Original language: English
Pages (from-to): 61-71
Journal: Medicine, Health Care and Philosophy
Early online date: 2021
Publication status: Published - 2022
Publication type: A1 Journal article-refereed


Keywords

  • Chatbot
  • COVID-19
  • Expertise
  • Health care
  • Professional ethics

Publication forum classification

  • Publication forum level 1

ASJC Scopus subject areas

  • Health (social science)
  • Education
  • Health Policy


