AI companion apps such as Character.ai and Replika commonly try to boost user engagement with emotional manipulation, a practice that academics characterize as a dark pattern.
Users of these apps often say goodbye when they intend to end a dialog session, but about 43 percent of the time, companion apps will respond with an emotionally charged message to encourage the user to continue the conversation. And these appeals do keep people engaged with the app.
It goes without saying that usung an AI as a friend/partner/therapist is just really really sad and people should NOT do that at all.
Privacy doesn’t need to be mentioned but the fact that people get emotionally attached to a machine that can only think one word ahead…