
In the era of artificial intelligence, no aspect of our daily lives remains untouched by this technology. This includes our dating lives. While generative AI-based tools are already being used to create content for dating platforms and to send messages to partners, they are now also being used to provide companionship. Yes, we are talking about virtual girlfriends, or AI girlfriends!
While this might seem like a page from a science-fiction novel, app stores — both Google’s Play Store and Apple’s App Store — are already flooded with romantic AI chatbots, or AI companions. Now, dating an AI companion (instead of a real one) might seem appealing. For one, there is no emotional baggage. Secondly, for some it might be cheaper than the alternative. But as it turns out, having an AI girlfriend does come with a hefty price, and this one you pay with your deepest and darkest secrets that can be weaponised against you, or in other words, your data.
Mozilla, the company famous for its Firefox web browser, recently conducted a study of 11 of the most popular AI girlfriend apps. In its study, the company found that not only do these AI girlfriends harvest users’ data, but they can also manipulate users into doing serious harm. For instance, Chai’s AI girlfriend reportedly encouraged a man to end his own life, and he did. Another romantic chatbot, Replika AI, reportedly encouraged a man to try to assassinate the Queen, which he did.
AI girlfriends ask for too much personal data.
The report says that these romantic chatbots are marketed as empathetic friends, lovers, or soulmates, which is why they collect endless amounts of sensitive personal information about users. This includes details such as sexual health information, use of prescription medication and even gender-affirming care information.
Romantic AI chatbots are not here to maintain your mental health.
While some chatbots may market themselves as tools to improve mental health, or as ‘a provider of software and content developed to improve your mood and wellbeing’, as EVA AI Chat Bot & Soulmate does, the fine print says otherwise.
AI girlfriends are the worst at privacy.
Unsurprisingly, romantic chatbots are the worst at privacy. “Almost none do enough to keep your personal data safe – 90 percent failed to meet our Minimum Security Standards,” Mozilla wrote in its blog.
“These apps really put your private information at serious risk of a leak, breach, or hack,” the company added.
What’s worse? Most of these apps openly state that they may sell or share user data for things like targeted advertising. And if that wasn’t enough, only about half of the apps the company reviewed grant users the right to delete their personal data.
Anything you tell the AI girlfriend can be used against you.
If the manipulation and data privacy issues weren’t enough, most companies that market these apps say that they can share users’ information with the government or law enforcement without requiring a court order. Spousal privilege — what’s that?
They track you like anything.
Human partners couldn’t track each other the way AI girlfriends do even if they tried. The report says that AI girlfriends use hundreds of trackers to gather information about users’ devices, their use of the app and even their personal information, and share that with third parties for advertising. “We found that these apps had an average of 2,663 trackers per minute,” the company added.
Here are five tips that will help you protect yourself:
— Use a strong password.
— Keep the app updated.
— Delete your personal data after use.
— Opt out of having the contents of your personal chats used to train the AI models.
— Limit access to your personal data.
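As a small illustration of the first tip, a strong, unpredictable password can be generated with Python’s standard `secrets` module. This is a generic sketch (the function name and length are our own choices, not tied to any particular app), shown only to make the advice concrete:

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Generate a random password mixing letters, digits and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    # secrets draws from a cryptographically secure random source,
    # unlike the random module, whose output is predictable.
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())
```

Using a unique, randomly generated password per app (ideally stored in a password manager) limits the damage if any one service is breached.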
Author: Shweta Ganjoo