
Flexing the grey matter: why language models can't replace a psychologist

People are increasingly turning to neural networks instead of psychologists. Psychologists and psychiatrists explained to Izvestia whether ChatGPT can replace a real specialist and how to use it safely for mental health concerns.
The global trend
Using neural networks as psychologists can be dangerous: it can exacerbate feelings of loneliness, foster emotional dependence on bots, and reduce communication with other people, warned Natalia Krasenkova, medical director at the online insurance company Luchi.
According to her, Russians most often use ChatGPT, a language model capable of answering questions and solving various tasks. They enter prompts such as: "You're a psychologist specializing in cognitive behavioral therapy. Ask one question at a time and let me answer it completely before moving on to the next. Your goal is to help me work through my problems and traumas."
— It turns out that the person one-sidedly writes down his thoughts and experiences (similar to journaling exercises), asks his own questions, and in response simply receives advice compiled from thousands of documents on a similar topic," the expert explains.
In general, she notes, neural networks are convenient: they are free, available around the clock, anonymous, do not criticize or condemn (just as a competent psychotherapist does not), and they respond quickly. But it is important to remember that ChatGPT is just a program; establishing empathetic contact, noticing nuances, and delving into complex feelings are beyond its reach.
— Therefore, when a user's requests are primarily psychological, communication with a neural network can lead to a situation where the person withdraws into himself and only aggravates his emotional state. In addition, no one tracks his progress, so deterioration may go unnoticed," the expert notes.
Benefits and harms
Psychologists and psychiatrists interviewed by Izvestia are confident that AI can be useful for both specialists and patients, but only in some cases. According to Tatiana Vorgul, a practicing psychologist and founder of the online school "Hummock of Vision", neural networks can successfully perform several functions.
- Primary screening. Chatbots collect symptoms and offer tests for anxiety or depression, like questionnaires at a clinic. This saves time, but does not replace a specialist.
- Crisis support. For people in isolation, AI assistants become a kind of "first aid" that can help prevent suicide before a person sees a doctor. Sometimes it is important simply to talk things through and receive clarifying questions or encouragement.
- Data analysis. Algorithms process large volumes of data (sleep patterns, speech markers) to identify risks, but the conclusions require human verification.
At the same time, the expert is sure that AI cannot perform a full-fledged diagnosis. Even advanced algorithms make mistakes: for example, an AI trained to recognize dementia from speech produces 20% false positives. Mental disorders rarely fit neat patterns: depression can disguise itself as laziness, and anxiety as irritability. Only a doctor sees the full picture, including the patient's biography, family history, and social context.
— Self-diagnosis using AI can lead to serious errors. Chatbots are unable to take into account all the nuances and individual circumstances that affect the user's condition. This can lead not only to misinterpretation of symptoms but also to wrong decisions that make the situation worse," emphasizes Anna Korendyukhina, psychiatrist and chief physician of the Mental Health Center.
Can AI replace psychologists entirely
Experts consider the complete replacement of psychologists with AI to be quite dangerous. According to psychiatrist Anna Korendyukhina, most entertainment chatbots are designed to hold users' attention and should not be regarded as sources of professional help.
"Chatbots can create the illusion of support and reliability, which leads to users not seeking the necessary help from specialists, which can worsen their condition," says the psychiatrist.
It is important to understand that neural networks have neither the qualifications and experience nor the ordinary human empathy of a specialist, adds Tatiana Vorgul. They will not hear the tremor in your voice or grasp your underlying motives. For example, the words "I'm sad" may conceal depression, longing, or trauma, and the algorithm will not tell them apart. A psychologist, noticing non-verbal signals (breathing rhythm, facial expressions, gestures), helps uncover the true causes of those feelings. Sometimes a client needs not a recommendation but support, even just human warmth.
The recommendations neural networks give can also be questionable. A study in the Journal of Medical Internet Research (2023) found that 30% of AI advice on mental health contains errors and may worsen rather than improve the patient's condition (for example, in the case of a panic attack, nervous breakdown, or depression).
— And no one will be held responsible for this, just as no one answers for data leaks. Psychologists are bound by confidentiality, whereas algorithms have no moral obligations, so you cannot count on your data remaining private," concludes Vorgul.
The importance of therapy
To summarize, experts note that neural networks can be used as a tool, much like diaries and self-observation charts. Qualified help, however, can only be obtained from a psychologist or therapist within a therapeutic relationship.
— Clinical psychologists and psychotherapists take a wide range of factors into account in assessment and treatment. Their training and licensing ensure that they meet the ethical and professional standards needed for safe and effective care. It is especially important to consult a specialist when serious problems arise, because inappropriate help can have disastrous consequences," warns Anna Korendyukhina.
Similarly, Tatiana Vorgul adds, other tools can be used as well, such as metaphorical associative cards. They can be a good aid, but they never replace full-fledged diagnosis and therapeutic work.
— Mental health requires human involvement. As Carl Jung said, "your inner life is the most important secret." But it is easier to open up about it when there is someone who hears and understands you," the psychologist concludes.