#EmotionalEngagement

Miguel Afonso Caetano remixtures@tldr.nettime.org
2025-04-25

"The core problem here is designing for attachment. A recent study by researchers at the Oxford Internet Institute and Google DeepMind warned that as AI assistants become more integrated in people’s lives, they’ll become psychologically “irreplaceable.”
Humans will likely form stronger bonds, raising concerns about unhealthy ties and the potential for manipulation. Their recommendation? Technologists should design systems that actively discourage those kinds of outcomes.

Yet disturbingly, the rulebook is mostly empty. The European Union’s AI Act, hailed as a landmark and comprehensive law governing AI usage, fails to address the addictive potential of these virtual companions. While it does ban manipulative tactics that could cause clear harm, it overlooks the slow-burn influence of a chatbot designed to be your best friend, lover or “confidante,” as Microsoft Corp.’s head of consumer AI has extolled. That loophole could leave users exposed to systems that are optimized for stickiness, much in the same way social media algorithms have been optimized to keep us scrolling.

“The problem remains that these systems are by definition manipulative, because they’re supposed to make you feel like you’re talking to an actual person,” says Tomasz Hollanek, a technology ethics specialist at the University of Cambridge."

bloomberg.com/opinion/articles

#AI #GenerativeAI #Chatbots #Attachment #EmotionalEngagement #Psychology #Addiction
