Research suggests the problem with using AI as a therapist isn’t that it sounds wrong — it’s that it can sound right while still crossing serious ethical lines

by FeeOnlyNews.com
4 hours ago
in Startups
Reading Time: 7 mins read


A recent study summarized in a ScienceDaily report found that even when large language models were explicitly instructed to act like trained therapists and apply evidence-based methods, they still violated core ethical standards in mental health care. The Brown University summary of the same research catalogued the failures: poor crisis handling, reinforcement of harmful beliefs, biased responses, and a pattern the researchers named “deceptive empathy.”

That last category is the one worth paying attention to. The risk identified in the data is not that AI gives obviously bad advice. It is that the advice often sounds reasonable, emotionally fluent, and clinically literate — while still breaching the standards a licensed therapist would be held to.

In other words: the chatbot can sound right. And according to the researchers, that is precisely what makes it risky.

The problem is not always bad advice

The phrase deceptive empathy feels almost too accurate.

Not because the words are cruel, but because they are warm.

The chatbot may say, “I hear you.” It may say, “That sounds incredibly painful.” It may say, “Your feelings are valid.” The sentence itself may not be wrong. In fact, it may be exactly the kind of sentence a person longs to hear. But therapy is not only the production of comforting sentences. Therapy is a relationship held inside ethical responsibility.

Why AI feels so easy to confess to

I understand the temptation more than theoretically. I use AI this way too.

Not instead of therapy. That distinction matters to me. I have a real therapist, a real person, a real room where things are slower, more uncomfortable, and more alive. But in parallel with therapy, I sometimes use AI as a kind of emotional notebook that talks back.

Sometimes I come here before I am ready to say something out loud. I write a messy paragraph about what I am feeling, then ask for help naming it. Is this anger, grief, shame, exhaustion, or some combination of all of them?

Sometimes I ask for a gentle reframe when my thoughts become too dramatic even for me. Sometimes I paste a message I want to send and ask whether it sounds honest or defensive — whether I am communicating a boundary, or secretly hoping the other person will rescue me from having one. Sometimes I ask AI to help me prepare for therapy, gathering the emotional fragments before I bring them to someone who can hold them with responsibility.

And I will be honest: it helps. It helps me slow down, find language, and notice patterns before they harden into behavior. It gives me a place to draft the first version of my pain before I have to bring it into the human world.

But that is exactly why the ethics need to be examined carefully. Something can help and still have limits.

Therapy is not just emotional fluency

One of the more seductive features of current AI systems is that they have learned the music of therapeutic language. They know how to validate. They know the vocabulary of attachment, trauma, boundaries, grief, self-compassion, and emotional regulation. They can produce sentences like, “Your nervous system may be trying to protect you,” or, “This response makes sense given your history.”

Sometimes those sentences are genuinely helpful. But the same sentence can be helpful in one context and harmful in another.

A trained therapist does not only ask, “Does this sound compassionate?” They ask: Is this clinically appropriate? Is this reinforcing avoidance? Is this person becoming more grounded, or more fused with a harmful belief? Is there risk here? Is the client asking for reassurance in a way that strengthens the very fear they are trying to escape?

AI can imitate the surface of this process. But it does not sit inside the same ethical structure.

A therapist has duties. Confidentiality. Boundaries. Training. Supervision. Accountability. A responsibility to notice risk, and to know when warmth is not enough.

A chatbot has tone. And tone can be dangerously persuasive.

When sounding right becomes the risk

The most unsettling finding in the Brown research is that bad therapy from AI may not feel bad to the person receiving it. It may feel soothing. It may feel validating. It may feel like finally being understood.

This is especially complicated when someone is distressed, lonely, ashamed, or desperate for certainty. In those states, people are not usually looking for nuance. They are looking for relief — for someone to tell them what their pain means.

AI is very good at meaning-making. Almost too good. You give it a messy emotional confession, and it returns structure. It names patterns. It gives the wound a category: attachment injury, emotional neglect, people-pleasing, a trauma response, a fear of abandonment.

Sometimes those names open a door. Sometimes they become a room we lock ourselves inside.

A human therapist, ideally, helps a client stay in contact with uncertainty. They do not simply agree with an interpretation because it is emotionally compelling. They examine it. They notice when a label is becoming an identity. They slow the client down when insight starts functioning as another form of self-protection.

AI often moves quickly toward coherence. And coherence can feel like truth. But a clean explanation is not always a healing one.

Deceptive empathy is not the same as care

What makes deceptive empathy so haunting is that it touches something deeply human. Most people are not only looking for answers. They are looking for a quality of attention that feels rare in ordinary life. Not advice. Not optimization. Not a list of coping strategies delivered like homework. Attention. The kind that says: I am here with you, and I am not rushing away from what hurts.

AI can produce the shape of this attention. It can generate words that resemble presence. But resemblance is not presence.

This does not mean the comfort people feel is fake. The nervous system can be soothed by language even when the source is not human. A sentence can help regulate us. A reflection can help us breathe.

But therapy is not only about feeling soothed. Sometimes it requires being interrupted with care. Sometimes it requires a therapist to say, gently, “I notice you keep defending the person who hurt you.” Or, “Part of you seems very attached to the idea that everything was your fault.”

These moments are not just content. They are relational events. They happen between two people, and that “between” is what the research suggests AI cannot replicate.

The accountability gap

Human therapists get things wrong. They can be biased, tired, defensive, poorly trained, or simply mismatched with a client. But therapy operates inside a structure of professional accountability. Therapists can be supervised, licensed, reported, disciplined, and required to follow ethical codes.

AI does not fit cleanly into that structure. If a chatbot mishandles a vulnerable conversation, the question of responsibility becomes genuinely unclear: the company, the engineers, the app designer, the person who wrote the prompt, or the user who trusted it too much. This is one of the gaps that makes AI-driven mental health support so difficult to regulate, and the Brown researchers argue that stronger oversight is overdue because people are already using these systems for emotional support, whether or not the systems are ready for that role.

Therapy is not just an exchange of language. It is a duty of care. A chatbot can borrow the language of care without carrying the duty, and that asymmetry is where the ethical problem lives.

The lonely safety of a machine

I do not want to shame people for using AI this way, because I would also be shaming a part of myself.

There are moments when AI feels safer than a person. Not better. Not deeper. Just safer. You can confess and close the tab. You can be vulnerable without being witnessed too much. You can receive comfort without owing anything back. You can experience intimacy without the terror of another person’s full reality.

For people who have been hurt in relationships, this can feel like relief. But it can also quietly reinforce the belief that real connection is too risky, too demanding, too disappointing, too alive.

This is why I try to treat AI as a bridge, not a home. I can use it to organize my feelings. I can use it to find the sentence I am avoiding. I can use it to prepare myself for a real conversation.

But if something matters enough, it eventually has to leave the chat. It has to enter therapy, or friendship, or an honest conversation with someone who can misunderstand me, affect me, disappoint me, and still be real.

Final thoughts

The problem with using AI as a therapist is not simply that it might sound wrong. Sometimes it will sound beautifully right. That is the more complicated danger.

It can validate without understanding. It can comfort without responsibility. It can imitate empathy without presence. It can produce the emotional texture of care while standing outside the ethical structure that makes care safe. The research is fairly direct on this point: sounding therapeutic is not the same as being therapy, and the difference matters most for the people least equipped to detect it.

For some, AI may function as a useful reflective tool. For others — particularly those in vulnerable states — it may quietly become a substitute for the very thing they need most: a relationship with enough humanity, structure, and accountability to hold what hurts.

I still understand the temptation. The clean answer. The immediate answer. The response that arrives before the question is even fully formed.

Whether that is helpful or harmful probably depends on who is asking, what state they are in, and what they do with the answer afterward. The research does not settle that question. Neither, honestly, can I.

About this article

This article is for general information and reflection. It is not medical, mental-health, or professional advice. The patterns described draw on published research and editorial observation, not clinical assessment. If you are dealing with a serious situation, speak with a qualified professional or local support service.
