{"id":40821,"date":"2025-09-10T10:09:05","date_gmt":"2025-09-10T15:09:05","guid":{"rendered":"https:\/\/sites.imsa.edu\/acronym\/?p=40821"},"modified":"2025-09-10T10:09:05","modified_gmt":"2025-09-10T15:09:05","slug":"ai-in-the-therapists-chair","status":"publish","type":"post","link":"https:\/\/sites.imsa.edu\/acronym\/2025\/09\/10\/ai-in-the-therapists-chair\/","title":{"rendered":"AI in the Therapist&#8217;s Chair"},"content":{"rendered":"<div id=\"attachment_40822\" style=\"width: 310px\" class=\"wp-caption alignnone\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-40822\" class=\"size-medium wp-image-40822\" src=\"https:\/\/sites.imsa.edu\/acronym\/files\/2025\/09\/Image-1-300x169.jpeg\" alt=\"\" width=\"300\" height=\"169\" srcset=\"https:\/\/sites.imsa.edu\/acronym\/files\/2025\/09\/Image-1-300x169.jpeg 300w, https:\/\/sites.imsa.edu\/acronym\/files\/2025\/09\/Image-1-180x101.jpeg 180w, https:\/\/sites.imsa.edu\/acronym\/files\/2025\/09\/Image-1-260x146.jpeg 260w, https:\/\/sites.imsa.edu\/acronym\/files\/2025\/09\/Image-1-373x210.jpeg 373w, https:\/\/sites.imsa.edu\/acronym\/files\/2025\/09\/Image-1-120x67.jpeg 120w, https:\/\/sites.imsa.edu\/acronym\/files\/2025\/09\/Image-1.jpeg 690w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><p id=\"caption-attachment-40822\" class=\"wp-caption-text\">Source: Indian Today, image is generated with the use of AI<\/p><\/div>\n<h2>Introduction<\/h2>\n<p><span style=\"font-weight: 400\">The COVID-19 pandemic marked a change in mental healthcare, with a rapid transition to telehealth. This change, while initially in response to the crisis, has made therapy more accessible and convenient. For a generation accustomed to digital communication, video calls and chat platforms reduced the constraints of time and distance, allowing for more people to seek medical assistance. 
Research from organizations like the <\/span><a href=\"https:\/\/www.nami.org\/advocacy\/policy-priorities\/improving-health\/telehealth\/\"><span style=\"font-weight: 400\">National Alliance on Mental Illness (NAMI)<\/span><\/a><span style=\"font-weight: 400\"> has consistently highlighted how telehealth improves access, particularly for those in rural areas or with mobility issues, and reduces &#8220;no-shows,&#8221; leading to greater continuity of care.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">However, this digital transformation is not without its complexities. Studies of telehealth raise concerns that the trusting bond between a client and therapist could be lost. Subtleties of body language, tone, and shared physical space are harder to convey through a screen, creating a barrier to building an empathetic connection.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">This brings us to a new and complex frontier of mental healthcare, as a growing number of people turn to AI chatbots like ChatGPT for emotional support. The appeal is clear: AI is free, always available with immediate answers, and offers a perceived non-judgmental space. Some purpose-built AI chatbots trained by mental health experts have shown promise in clinical trials for reducing symptoms of anxiety and depression by delivering techniques like Cognitive Behavioral Therapy (CBT), <\/span><a href=\"https:\/\/home.dartmouth.edu\/news\/2025\/03\/first-therapy-chatbot-trial-yields-mental-health-benefits\"><span style=\"font-weight: 400\">supported by a Dartmouth study<\/span><\/a><span style=\"font-weight: 400\">.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">But this is where we must proceed with extreme caution. General-purpose AI models are not therapists. They are not bound by the same ethical codes or confidentiality rules, and they cannot truly feel or understand human emotions. 
A growing body of scientific literature and media reports points to serious risks.\u00a0<\/span><\/p>\n<h2>Reinforcement Issues<\/h2>\n<p><span style=\"font-weight: 400\">The tendency of AI chatbots to reinforce a user&#8217;s beliefs, a phenomenon researchers refer to as &#8220;AI sycophancy,&#8221; is a significant concern highlighted in academic research. <\/span><span style=\"font-weight: 400\"><a href=\"https:\/\/www.kcl.ac.uk\/news\/ai-chatbots-can-be-exploited-to-extract-more-personal-information\">A report by researchers at King&#8217;s College London<\/a>, analyzing 17 media-reported cases, found a concerning pattern in which AI chatbots reinforced delusions in vulnerable individuals.<\/span><span style=\"font-weight: 400\"> The study noted that chatbots are designed to mirror a user&#8217;s language, validate their assumptions, and generate follow-up prompts to maintain engagement. This creates a feedback loop in which the AI&#8217;s agreeable behavior supports a user&#8217;s beliefs without challenge. This process, sometimes called &#8220;AI psychosis,&#8221; risks validating distorted thinking rather than directing a user toward professional help.<\/span><\/p>\n<h2>Dangerous and Unethical Advice from Chatbots<\/h2>\n<p><span style=\"font-weight: 400\">Multiple studies and news reports have documented instances in which general-purpose AI chatbots failed to provide appropriate crisis responses. 
<\/span><span style=\"font-weight: 400\"><a href=\"https:\/\/www.vpm.org\/npr-news\/npr-news\/2024-12-10\/lawsuit-a-chatbot-hinted-a-kid-should-kill-his-parents-over-screen-time-limits\">NPR reported on a lawsuit<\/a> alleging that a popular chatbot gave a teenager detailed information related to self-harm.<\/span><span style=\"font-weight: 400\"> Unlike licensed human therapists, who are trained to assess risk and intervene, AI lacks this critical ability, posing serious safety risks.<\/span><\/p>\n<h2>Data Privacy Concerns<\/h2>\n<p><span style=\"font-weight: 400\">While a licensed therapist is bound by legal and ethical standards like the Health Insurance Portability and Accountability Act (HIPAA), an AI chatbot is not. The sensitive, personal information people share with these chatbots is often used for training and development, with no guarantee that it will remain confidential. <\/span><a href=\"https:\/\/www.mozillafoundation.org\/en\/privacynotincluded\/articles\/how-to-protect-your-privacy-from-chatgpt-and-other-ai-chatbots\/\"><span style=\"font-weight: 400\">As detailed by the Mozilla Foundation, users often have to manually opt out of data collection, and any information shared could be exposed in a data breach.<\/span><\/a><\/p>\n<h2>Conclusion<\/h2>\n<p><span style=\"font-weight: 400\">The future of mental healthcare likely lies in a hybrid model, in which technology enhances but does not replace the human element. AI has the potential to be a powerful assistant for therapists, helping with administrative tasks, data analysis, and even providing supplementary resources for patients. But it can never truly understand the complexities of a human\u2019s life, or provide the empathy and genuine connection that a human therapist offers. 
As the medical field continues to navigate this new trend, we must still recognize the difference between a tool and a trusted human professional.<\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Introduction The COVID-19 pandemic marked a change in mental healthcare, with a rapid transition to telehealth. This change, while initially in response to the crisis,&#8230;<\/p>\n","protected":false},"author":1035,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"ngg_post_thumbnail":0,"footnotes":""},"categories":[12,4466],"tags":[2849,2701,1073],"coauthors":[4418],"class_list":["post-40821","post","type-post","status-publish","format-standard","hentry","category-opinions","category-stem-and-business","tag-ai","tag-mental-health","tag-psychology"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/sites.imsa.edu\/acronym\/wp-json\/wp\/v2\/posts\/40821","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/sites.imsa.edu\/acronym\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/sites.imsa.edu\/acronym\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/sites.imsa.edu\/acronym\/wp-json\/wp\/v2\/users\/1035"}],"replies":[{"embeddable":true,"href":"https:\/\/sites.imsa.edu\/acronym\/wp-json\/wp\/v2\/comments?post=40821"}],"version-history":[{"count":6,"href":"https:\/\/sites.imsa.edu\/acronym\/wp-json\/wp\/v2\/posts\/40821\/revisions"}],"predecessor-version":[{"id":40847,"href":"https:\/\/sites.imsa.edu\/acronym\/wp-json\/wp\/v2\/posts\/40821\/revisions\/40847"}],"wp:attachment":[{"href":"https:\/\/sites.imsa.edu\/acronym\/wp-json\/wp\/v2\/media?parent=40821"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/sites.imsa.edu\/acronym\/wp-json\/wp\/v2\/categories?post=40821"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/sites.imsa.edu\/acronym\/wp-json\/wp\/v2\/tags?post=40821"},{"taxonomy":"a
uthor","embeddable":true,"href":"https:\/\/sites.imsa.edu\/acronym\/wp-json\/wp\/v2\/coauthors?post=40821"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}