Last week, a friend told me something quite interesting about AI “therapy.”
“I’ve been talking to an AI about my relationship problems,” he confessed. “It’s been really helpful. It listens without judgment and gives pretty good advice.”
As the Director of the Center for Neurocognitive Excellence (DCNE) in Washington, DC, I wasn’t entirely surprised. This wasn’t the first time I’d heard someone mention using AI for mental health support. But it did make me think about this growing trend and what it means for both patients and practitioners. It also raised a critical question: if therapy is governed by state licensing boards that require practitioners to hold a license, how exactly did an AI obtain the credentials to legally practice therapy?
The rise of AI “therapy” is happening fast. And as a licensed therapist specializing in therapy for anxiety, depression, and ADHD, as well as neurofeedback, I have some thoughts. But mostly I’m very concerned.
What Exactly Is AI “Therapy”?
AI “therapy” uses artificial intelligence algorithms to provide “mental health support” through chatbots, virtual assistants, and other digital platforms. These systems analyze your inputs—whether written text, voice recordings, or even facial expressions—and respond with guidance, coping strategies, or simulated conversation.
People are using AI “therapy” in various ways:
- Having late-night conversations about their anxieties
- Processing relationship conflicts and seeking advice
- Practicing communication skills and receiving feedback
- Tracking mood patterns and emotional triggers
- Learning coping strategies for stress and anxiety
The appeal is obvious. Who wouldn’t want 24/7 access to judgment-free support that never gets tired, frustrated, or schedules vacation time?
The Potential Benefits of AI “Mental Health Support”
I’ll be the first to acknowledge that AI offers some genuine advantages:
Accessibility is real. For those working unusual hours or facing scheduling challenges, AI can seem like a convenient option: no waiting lists, no fixed hours. At DCNE, however, we offer:
- Flexible appointment scheduling to accommodate your busy life
- Both in-person and virtual appointments for your convenience
With our options, you can receive genuine human connection and professionally licensed care while still maintaining the flexibility you need in your schedule.
The stigma barrier seems to disappear. Many people still feel uncomfortable seeking traditional therapy, and the perceived privacy of interacting with AI can be a gateway to getting help. But that perception is misleading: these platforms generally do not protect your information to HIPAA standards.
It’s there at 3 AM. When anxiety strikes in the middle of the night, having immediate support can be invaluable, even if it’s algorithmic. But it can also create an unhealthy dependency in which you stop practicing your coping skills, leaning on your support system, or taking medication as prescribed.
For some, these tools might be better than no support at all. They may even serve as a stepping stone toward seeking professional help—which is ultimately the most effective path to mental wellness.
Why I’m Concerned About AI as Your Therapist
Despite these benefits, there are profound limitations and risks to using AI as your primary mental health resource:
AI Can’t Truly Connect
Therapy isn’t just about advice and techniques. It’s about human connection.
The therapeutic relationship itself is healing. Decades of research show that the therapeutic alliance between therapist and client is one of the strongest predictors of positive treatment outcomes across various healthcare settings. When you sit across from our team in our DC office or connect via video from DC, Maryland, or Virginia, we’re engaging in a dynamic human relationship that an algorithm simply cannot replicate.
AI can simulate empathy, but it cannot feel it. It can’t pick up on subtle nonverbal cues or understand the unique context of your life experience. It doesn’t truly know you. Some say that change in therapy happens when the client feels that the therapist feels them; AI can’t feel you, and it never will.
AI Cannot Diagnose Mental Health Conditions
AI cannot diagnose mental health conditions because it isn’t providing mental health care in the first place. This is a fundamental limitation that cannot be overstated. It is also a classic move by tech companies to operate in a legal grey area: it hasn’t really been examined whether a machine can provide a licensed service that is regulated by state law and requires training, experience, and demonstrated competency. Most seriously, if you tell an AI that you are thinking of hurting yourself, it can’t intervene, and that failure may result in serious injury. There are already examples of this.
While AI platforms might analyze your responses and suggest you’re experiencing symptoms of depression or anxiety, this falls drastically short of an actual clinical diagnosis. AI lacks the licensure, ethical oversight, and clinical judgment required to properly assess mental health conditions.
A comprehensive psychological assessment isn’t just checking boxes on a list of symptoms. It requires professional judgment, clinical experience, and a holistic understanding of an individual’s history and circumstances. This is something AI simply cannot do with the nuance required.
Moreover, effective care requires a provider licensed in your state who can direct you to local resources during mental health emergencies. Unlike impersonal hotlines that often leave you waiting on hold, we don’t just refer you elsewhere—we actively support you through the process. Our team ensures you receive timely, compassionate guidance when you need it most.
Without proper diagnosis, treatment recommendations may be misguided or harmful.
At DCNE, we provide:
- In-depth clinical conversation with a therapist
- Evidence-based questionnaires for ADHD symptoms
- Continuous performance-based attention testing
- Thorough psychological screening
- Extended clinical interviews
- A complete battery of objective, standardized cognitive, academic, psychological, and neuropsychological measures
- Input from third parties, gathered as needed
This thorough approach ensures we develop a complete understanding of your unique situation, leading to accurate diagnosis and effective treatment recommendations.
Privacy and Data Security
When you share your deepest struggles with AI, where does that information go?
Unlike licensed therapists bound by HIPAA and professional ethics, many AI platforms have murky privacy policies. And who actually reads a Terms of Service agreement that runs a hundred pages? Your sensitive mental health data could be:
- Used to train algorithms
- Accessed by developers
- Sold to data brokers and third parties
- Vulnerable to data breaches
- Used against you in security clearance processes
- Used against you in life insurance and health insurance applications
- Used to influence your social media feeds
This isn’t hypothetical. According to research highlighted by Telehealth.org, data brokers openly advertise that they sell Americans’ mental health information, with prices ranging from $0.06 to $0.20 per record.
AI Tools vs. Digital Wellness Apps: Know the Difference
It’s important to distinguish between AI therapy chatbots and digital wellness tools like meditation apps. In our previous blog post on Digital Meditation in Washington DC, I discussed how evidence-based apps like Headspace, Calm, and Insight Timer can supplement mental wellness.
These tools:
- Focus on specific skills rather than simulating therapy
- Don’t attempt to diagnose or treat mental health conditions
- Generally have clearer boundaries around their purpose
A meditation app guides you through mindfulness practices. It doesn’t act as a replacement for your therapist.
The Future: Human Therapists Working With AI
I’m not anti-technology. In fact, we embrace innovation—including neurofeedback therapy, which uses advanced technology to help patients visualize and regulate their brain activity.
I also recently built a website from the ground up to streamline assessment scoring and reporting, making it faster and more reliable so you get better results sooner. This tool is available only to licensed providers.
According to recent research published in the Journal of Medicine, Surgery, and Public Health, the most promising future isn’t AI replacing therapists, but therapists and AI working together. However, we’re not quite there yet in terms of privacy protections and ethical frameworks. I’ve even evaluated such tools for DCNE and have not adopted them, for the reasons stated above.
With strict safeguards, oversight, and real enforcement when breaches occur, I can envision AI potentially helping with:
- Extending support between therapy sessions
- Tracking mood patterns and behavioral data
- Providing practice exercises for skills learned in therapy
- Helping therapists manage documentation and administrative tasks
As the researchers note, “Maintaining the human element in therapy while leveraging AI as a tool is a critical ethical consideration. AI should enhance, not replace, the therapeutic relationship between patients and therapists.” What is interesting about that statement is that ethical considerations are neither right nor wrong in the abstract; they are guided by ethical principles that are taught in school, tested on licensing exams, and upheld and enforced by state boards. I sat on many ethics review committee meetings at NIH in which we wrestled with some very trying situations.
This collaborative approach, when properly implemented, preserves the irreplaceable human element while leveraging technology’s strengths.
When to Seek Professional Help
If you’re struggling with your mental health, consider these guidelines:
- For mild stress or everyday challenges, digital wellness tools may be helpful
- For persistent symptoms affecting your quality of life, reach out to our team for a professional assessment
- If you’re experiencing thoughts of self-harm or suicide, call the 988 Suicide & Crisis Lifeline or 911 immediately
Remember that licensed mental health professionals like our team at DCNE have extensive training, clinical experience, and ethical obligations that no AI system can match.
The Human Touch Still Matters
Technology advances rapidly, but the human need for connection remains constant. I would even argue it increases as we deal with the fallout of anxiety and depression from social media use, let alone AI.
At DCNE, we combine evidence-based approaches like CBT and neurofeedback therapy with something no algorithm can provide: genuine human connection and expertise.
If you’re curious about how therapy might help you, I invite you to schedule a free consultation. Let’s talk human-to-human about what you’re experiencing and how we can help.
Two Locations to Serve You
Washington, DC Location:
- In-person and online therapy available
- Neurofeedback services (in-person only)
- Address: 1627 K ST NW, Suite 500 (5th floor) Washington, D.C. 20006
- Phone: +1 202-998-ADHD (2343)
- Email: [email protected]
Baltimore Location:
- Online therapy services
- Phone: +1 443-792-8443
- Email: [email protected]
Additional Services Offered
In addition to comprehensive assessments and therapy for anxiety, depression, and ADHD, we provide Eye Movement Desensitization and Reprocessing (EMDR) therapy for individuals coping with overwhelming anxiety or trauma. EMDR is not only for trauma processing; it’s also highly effective for anxiety, stress, grief, and many other mental health concerns. Our therapists at DCNE are highly trained in EMDR techniques that help desensitize unwanted emotions while building positive associations and resilience.
Stay connected
Sign up for our monthly newsletter to receive expert insights and practical strategies for better mental health.
Contact us today to begin your journey toward better wellbeing.