AI is unlikely to replace a human clinician anytime soon. However, it may help with synthesizing and collecting patient data.
Is the mental health field ripe for disruption? Could AI bots provide timely interventions for patients at a moment's notice, without the need to track down a counselor? Some AI proponents say AI is ready for such a role. But psychiatrists say the notion is far-fetched.
A team of researchers, led by P. Murali Doraiswamy of the Duke University School of Medicine, recently surveyed psychiatrists on the prospect of AI playing the role of mental health counselor. In their survey of 791 psychiatrists, they measured opinions about the likelihood that AI and machine learning would be able to fully replace the average psychiatrist in performing 10 key tasks carried out in mental health care (e.g., mental status exams, suicidality assessments, treatment planning).
On many levels, the premise of AI playing a role in mental health is feasible. “Smartphones and wearable sensors offer people the ability to monitor themselves and to benefit from the way deep learning can analyze the data,” the researchers state. “Indeed, these techniques are already being used to detect the changes in mood that indicate bipolar disorder or to detect people at risk of depression. So the scene is set for artificial intelligence to become a disruptive force in psychiatry.”
However, only 4% of the respondents felt that AI/ML was likely to replace a human clinician for providing empathetic care. At the same time, about 50% of the doctors acknowledged that their jobs “could be changed substantially” by future AI/ML. "Documenting, or updating medical records, and synthesizing information to reach a diagnosis were the two tasks where a majority predicted that future AI/ML would replace human doctors,” the researchers point out. Additional areas where AI may be of value in the near future include “reducing administrative burden, 24/7 monitoring, individualized drug targets to reduce side effects, integration of new streams of data from wearables and genetics, reducing human errors, scaling care to areas with psychiatrist shortage, and elucidating brain etiologies that are currently opaque.”
It’s likely that psychiatrists are skeptical about the clinical value of AI because of the current hype around the technology, the researchers speculate. Plus, “We are still far away from an AI that accurately recognizes and understands the full range of human emotions and mental illnesses,” they point out. “Further, unlike pattern-based fields like radiology or pathology where AI can sometimes outperform doctors, psychiatry requires greater integration of cultural and psychosocial factors with medical comorbidities.”