
AI, Access, and Safety in Therapy: A Conversation with Joel Berger, LMHC

Northeast Health Services is dedicated to supporting your mental health. If you are experiencing suicidal thoughts, we encourage you to reach out for immediate support through your local crisis services by dialing 988, contacting your local emergency services, or visiting your local emergency room.

In this interview, Joel Berger, LMHC, Regional Clinic Director at Northeast Health Services, explains why people turn to AI for mental health support and what that means for care. He shares where AI can help with quick information, organization, and insurance questions, and where it can create risks such as overreliance or unsafe guidance. The takeaway: AI can start the conversation, but progress comes from real support and knowing when to move from a chatbot to care.

Why people are turning to AI right now

What is driving people to use AI for mental health support?

Joel: Access and immediacy. It can be easier to ask a chatbot about depressive symptoms than to get on a clinician’s schedule, especially if the first available appointment is weeks out. Therapy has a cadence, like weekly or every other week. AI can fill the in-the-moment gap when something spikes on a random weeknight and your therapist is not on call. For people who are isolated, anxious about reaching out, or unsure if what they are feeling counts, that quick response can feel helpful.

Did AI create a new expectation for on-demand help, or did it solve a longstanding gap?

We have always tried to meet people where they are through things like mobile crisis and community outreach. AI opened a new lane many people did not know they would use until it was there. It is convenient. It also raises a question about real-life support. When things are hard in the moment, who in your network can you call instead of a tool?

The promise and limits of AI for mental health

Where can AI be useful in practice?

I have seen solid, practical wins with ADHD. Clients might build schedules or reminder systems with AI and bring them to session for us to refine. Structure like that can be powerful. On the operations side, using AI to improve insurance verification and catch lapses before appointments could reduce disruptions in care. Those are places the technology removes friction and saves time.

And where does it go wrong?

Over-reliance sneaks up on people. What starts as “this makes life easier” can slide into “I depend on this to think, decide, and cope.” Clinically, we are also encountering scenarios no therapist has treated before, like a client who followed AI advice to cheat on a partner, or an AI encouraging eating disorder behaviors. You cannot just map those to tech addiction. It is more complex.

Risks Joel is watching closely

What worries you most about AI in the clinical context?

People’s sense of what AI actually is, a predictive language model, gets fuzzier the more they use it, especially for people with cognitive vulnerabilities. Distinguishing “I am chatting with a model” from “I am getting guidance from something authoritative” gets harder.

Guardrails, safety, and regulation

What protections should exist right now?

Anything involving self-harm or suicidality should not be handled by an AI. Stop the app and route to a person, a clinician, a crisis line, or a trusted support. For adolescents, there should be some form of parent or guardian notification if those topics surface. More broadly, we need sturdier guardrails so people cannot prompt around safety blocks. The fact that there are cases alleging harm tells you regulation is behind lived reality.

If AI is your first step, what should the next step be?

What would you say to someone testing the waters with AI because they are not ready to call a therapist?

I am glad you are starting the conversation. There is insight in that. Now take one more step. Talk with a trusted person. Call a clinic in your community and ask whether therapy could help and what an intake looks like. A quick consult never hurts. If the condition is getting worse, the entanglement with AI tends to get worse too. Moving that conversation into a human relationship is where progress begins.

Ready to move beyond the algorithm? Start a real conversation

If AI nudged you to think about your mental health, let a person take it from here. Chatting with AI can feel helpful, but it has limitations. A mental health provider who will take time to learn about you and build a clinical relationship is easier to find than you might think. Start a real conversation and take the next step toward support that can adapt with you.

For new clients, please click here to schedule an appointment. For existing clients, please click here to find your office location and contact your office directly.