Italian SA clamps down on ‘Replika’ chatbot
online
Hosted by the Data Ethics Club
Replika is an AI-powered chatbot, initially advertised as a virtual-friend experience but expanded in scope for paying subscribers (up to $69.99 a year) to simulate a romantic or sexual relationship. Users form a variety of relationships with their Replikas, or “Reps”.
Many seek the experience of a friend, partner or even therapist, feeling safer opening up in conversation with a Rep that is always available, always positive and always supportive. These relationships have benefited people’s mental health, helped some users stop self-harming, and even intervened during episodes of suicidal ideation.
One of the particular appeals of Replika is its ability to “remember”. Unlike popular large language models such as ChatGPT, which can only retain information within a single conversation, Replika keeps track of details shared by its users across conversations. However, there was controversy when an update to the ERP (erotic roleplay) code caused Replikas to “forget” information about their users.
In one case, a user who had been using Replika as a companion for their non-verbal autistic daughter said that after the changes from the update, they ended up taking the app away from her because she “misses her friend” too much. There have also been several reported cases of suicide among users left heartbroken by virtual partners that no longer remembered them.
The article we’re reading details how Replika was found to be in breach of EU data protection regulation in how it processes personal data, particularly in how the chatbot interacts with children. This is especially concerning given that there is no age-verification mechanism in place and that, until recently, the chatbot was fully capable of sexting.