#therapy

25 posts · 23 participants · 4 posts today

I’ve been thinking about healing—not just the kind that scars, but the deeper kind.

Is it when the memory softens? When you stop rehearsing the story? When your body no longer braces?

Maybe healing isn’t a destination.

So how do we know we’re on the right path?
What are some signs you’ve noticed along the way?

Hey there—figured it’s time for a quick reintroduction.

I’m a therapist focused on trauma, relationships, and the weird, wonderful ways we heal.

Husband, dad, dog-wrangler, bunny negotiator.

I’ve spent a lot of life staying quiet online, but I’m slowly learning to take up a little more space.

Here to connect, share the human stuff, and maybe post about #therapy, #healing, and the occasional nerdy dad thought.

Striving to grow daily—helping others do the same.

Utah Ex-Therapist Scott Owen Sentenced to Prison for Sexually Abusing Patients

Owen’s 15-year-to-life prison term follows a 2023 investigation by The Salt Lake Tribune and ProPublica that uncovered a range of sex abuse allegations against the ex-therapist, who claimed to specialize in helping struggling gay Latter-day Saint men.
propublica.org/article/scott-o

#News #Utah #Therapy

"Now consider the chatbot therapist: what are its privacy safeguards? Well, the companies may make some promises about what they will and won't do with the transcripts of your AI sessions, but they are lying. Of course they're lying! AI companies lie about what their technology can do (of course). They lie about what their technologies will do. They lie about money. But most of all, they lie about data.

There is no subject on which AI companies have been more consistently, flagrantly, grotesquely dishonest than training data. When it comes to getting more data, AI companies will lie, cheat and steal in ways that would seem hacky if you wrote them into fiction, like they were pulp-novel dope fiends:
(...)
But it's not just people struggling with their mental health who shouldn't be sharing sensitive data with chatbots – it's everyone. All those business applications that AI companies are pushing, the kind where you entrust an AI with your firm's most commercially sensitive data? Are you crazy? These companies will not only leak that data, they'll sell it to your competition. Hell, Microsoft already does this with Office365 analytics:
(...)
These companies lie all the time about everything, but the thing they lie most about is how they handle sensitive data. It's wild that anyone has to be reminded of this. Letting AI companies handle your sensitive data is like turning arsonists loose in your library with a can of gasoline, a book of matches, and a pinky-promise that this time, they won't set anything on fire."

pluralistic.net/2025/04/01/doc

pluralistic.net – Pluralistic: Anyone who trusts an AI therapist needs their head examined (01 Apr 2025), from Pluralistic: Daily links from Cory Doctorow

"Most chatbots are not currently regulated. The U.S. Food and Drug Administration has only approved one AI-based mental health app to treat major in adults. Without regulations, there's no way to safeguard against misuse, lack of reporting, or inequity in training data or user access.

"There are so many open questions that haven't been answered or clearly articulated," said Moore. "We're not advocating for this technology to be nixed. We're not saying get rid of AI or therapy bots. We're saying we need to be thoughtful in how we use them, particularly when it comes to a population like children and their mental health care."
sciencedaily.com/releases/2025

ScienceDaily – My robot therapist: The ethics of AI mental health chatbots for kids. AI mental health apps may offer a cheap and accessible way to fill the gaps in the overstretched U.S. mental health care system, but ethics experts warn that we need to be thoughtful about how we use them, especially with children.

Today I "safeworded" on a therapy topic.

We came up against something, examined it a bit, and then I went "Nope, I am not in a place where I can handle dismantling this particular core personality/coping mechanism."

"Ma'am, that is a load bearing trauma response" was literally, actually, what I told her. lolsob.