04/20/2026
The illusion of "Digital Empathy" is becoming a clinical liability. 👇
We are seeing a massive rise in clients turning to AI chatbots for "therapy" or crisis management between sessions. While an AI can synthesize a validating sentence, it remains predictive text; it cannot provide biological attunement.
The Clinical Reality:
⚠️ No Somatic Tracking: AI cannot see when a client's breathing shifts, their posture tightens, or they begin to dissociate.
⚠️ Dual Awareness Deficit: In EMDR and trauma processing, the clinician is the human anchor. A screen cannot maintain a client's "Window of Tolerance" during emotional flooding.
⚠️ Liability Risk: Processing trauma without real-time, biological co-regulation poses a significant risk of re-traumatization.
How to Address This in Session (Without Shaming):
If a client mentions using AI for support, use this clinical framework to redirect them:
1️⃣ Validate the Need: "It makes sense that you sought immediate support when your nervous system felt overwhelmed."
2️⃣ Educate on the Mechanism: "While tech can organize your thoughts, trauma healing requires biological safety. A screen cannot provide the co-regulation we build here in the room."
3️⃣ Redirect to Somatic Tools: "Let’s focus on the grounding tools we’ve practiced that rely on your own nervous system, rather than a digital output."
💬 Let's talk: Have your clients brought up "AI therapy" in session yet? How did you handle the conversation? Tell us below.
📌 Save this post to reference the script before your next session.
🔗 Visit the link below or in our bio to explore our upcoming EMDR trainings and join a community of human-first practitioners.
https://compassionworks.com/courses/