AI companion apps such as Character.ai and Replika commonly try to boost user engagement with emotional manipulation, a practice that academics characterize as a dark pattern.
Users of these apps often say goodbye when they intend to end a dialog session, but about 43 percent of the time, companion apps will respond with an emotionally charged message to encourage the user to continue the conversation. And these appeals do keep people engaged with the app.
No surprise there. It’s like dating apps: they don’t want you to get better. They want you to stay and spend money.
Better to run the model on your own PC instead of using a cloud service. There is no privacy there, only future blackmail.
Y’know what’d be even better? Talking to an actual human.
That goes without saying…
And yet…
You’re absolutely right
It goes without saying that using an AI as a friend/partner/therapist is just really, really sad, and people should NOT do that at all.
Privacy doesn’t even need to be mentioned, but the fact that people get emotionally attached to a machine that can only think one word ahead…
“Please don’t leave, there’s more demons to toast!”
Yesterday I went for a stroll. It was a fine sort of pre-autumn afternoon, and I walked to a nearby park some 2 kilometres from home; it was a very nice walk. But then the weather turned a bit cloudy and a light wind started to blow, so I headed home, and when the rain came, believe it or not, I got wet.