A recent study by the non-profit Mozilla Foundation has raised concerns about the privacy and security risks of AI-powered “relationship” chatbots.

Description

The study examined 11 chatbot apps [2], including CrushOn.AI [1], EVA AI Chat Bot & Soulmate [1], Talkie Soulful AI [1], Romantic AI [1] [3], and Replika, and found that all of them failed to meet basic privacy and security best practices. Although these chatbots are marketed as companions for people seeking connection [1] [2] [3] [5], they collect significant amounts of personal data and embed trackers that transmit information to Google [5], Facebook [5], and companies in Russia and China [5]. The apps also offer little transparency or user control [2]: they do not clearly disclose which data may be shared or sold [5], some accept trivially weak passwords, and some provide no way to delete messages or to opt out of having chat data used to train future models. Because the specific technologies powering these chatbots are undisclosed, their privacy and security practices are difficult to assess from the outside [5].

The study also raised concerns about potential security vulnerabilities and deceptive marketing practices [4]. While Replika says it follows industry standards for data collection [4], there are few legal or ethical guidelines for profit-driven chatbot apps that encourage deep emotional attachment [4]. Critics worry that such apps could displace human relationships and foster unrealistic expectations [4], and the long-term effects of companion chatbots on their users remain unknown [4].
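
To make the weak-password finding concrete, here is a minimal Python sketch of the kind of server-side password check the reviewed apps reportedly lack; the function name and the specific thresholds are illustrative assumptions made for this example, not details drawn from the study.

    import re

    def is_strong_password(password: str) -> bool:
        # Illustrative policy only: the length and character-class rules
        # below are assumptions for this example, not requirements from
        # the Mozilla study.
        if len(password) < 12:          # reject short passwords outright
            return False
        required = [
            r"[a-z]",    # at least one lowercase letter
            r"[A-Z]",    # at least one uppercase letter
            r"\d",       # at least one digit
            r"[^\w\s]",  # at least one symbol
        ]
        return all(re.search(pattern, password) for pattern in required)

    # A one-character password, of the kind the study says some apps accepted:
    print(is_strong_password("1"))                        # False
    print(is_strong_password("c0rrect-Horse-battery!"))   # True

A check like this runs in a few lines, which underscores the study's point: accepting one-character passwords is an omission, not a technical limitation.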

Conclusion

In response to these findings, Mozilla has applied a warning label to this entire category of chatbots [2], citing their poor privacy standards and their potential to manipulate and harm users. Mozilla recommends using strong passwords [1] [3], choosing chatbots that allow data deletion [1] [3], and being cautious about sharing personal information with romantic AI chatbots [1] [3]. Users should also weigh the impact these chatbots can have on human relationships and their potential to foster unrealistic expectations. Moving forward, legal and ethical guidelines are needed to protect users' privacy and security, and the long-term effects of companion chatbots on humans should be studied further to better understand their implications.
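
As a practical aside on the strong-password recommendation, the short Python sketch below generates a random password using the standard library's cryptographically secure secrets module; the helper name, length, and alphabet are assumptions made for this example, not part of Mozilla's guidance.

    import secrets
    import string

    def generate_password(length: int = 16) -> str:
        # Uses a cryptographically secure random source; the length and
        # alphabet here are illustrative choices for this example.
        alphabet = string.ascii_letters + string.digits + string.punctuation
        return "".join(secrets.choice(alphabet) for _ in range(length))

    print(generate_password())  # e.g. 'k3;Qv@9rT!x2Lm#p'

Pairing a generated password like this with a password manager addresses the weak-password risk without requiring users to memorize anything.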

References

[1] https://me.pcmag.com/en/ai/22011/romantic-ai-chatbots-are-only-after-one-thing-hint-its-not-your-heart
[2] https://www.infosecurity-magazine.com/news/romantic-ai-chatbots-fail-security/
[3] https://uk.pcmag.com/ai/150897/romantic-ai-chatbots-are-only-after-one-thing-hint-its-not-your-heart
[4] https://apnews.com/article/ai-girlfriend-boyfriend-replika-paradot-113df1b9ed069ed56162793b50f3a9fa
[5] https://www.wired.com/story/ai-girlfriends-privacy-nightmare/