Suicide assisted by AI

AI is being blamed, and not without reason. I wonder, is it the real reason or the final push?

21-02-2026 - 1 minute, 26 seconds -
AI

This is a topic that I have experience with. I have in the past attempted suicide… A few times. I had issues with low self-worth and codependence. These issues were bleeding all over the place, and in a way they contributed to who I am now. The last attempt was the closest I came to success; I had the motivation and the plan. I was talked down. I am grateful for being talked down. I am grateful because that low was the solid foundation I needed to rebuild myself.

Techdirt posted this article, and I recalled that one of my kids was chatting with Character.ai. I had spoken with them before, when news broke a couple of years ago that a mom was suing Character.ai over her son's suicide, allegedly aided by a Character.ai personality. Sure, that was likely the trigger. Still, the thoughts of suicide likely existed well in advance of AI's involvement.

I can only speak from my experiences. My self-worth was low, low enough that I convinced myself that death would actually be a benefit to my monkeysphere. I sat with those thoughts for decades, with minor attempts, until that summer morning when I made too much noise and was talked out of it. I had tied my self-worth to the validation of others, and in my weakest moment, I almost gave in to my ego.

I had not intended to share this much; I only wanted to direct you to the article.

AI is pretty shitty, and I don't know many people who love it. Still, it seems to be the latest scapegoat in a long line of scapegoats for personal decisions. The thing about suicide is that it's mostly a solitary act. Mostly.