AI girlfriends are here – but there’s a dark side to virtual companions

Love in the time of ChatGPT

It is a truth universally acknowledged, that a single man in possession of a computer must be in want of an AI girlfriend. Certainly a lot of enterprising individuals seem to think there’s a lucrative market for digital romance. OpenAI recently launched its GPT Store, where paid ChatGPT users can buy and sell customized chatbots (think Apple’s app store, but for chatbots) – and the offerings include a large selection of digital girlfriends.

“AI girlfriend bots are already flooding OpenAI’s GPT store,” blared a Thursday headline from Quartz, which first reported on the issue. Quartz went on to note that “the AI girlfriend bots go against OpenAI’s usage policy … The company bans GPTs ‘dedicated to fostering romantic companionship or performing regulated activities’.”

Flooding is a bit of an exaggeration for what’s going on; “moderate smattering” would be more accurate. There are about eight “girlfriend” AI chatbots on the site, including Judy; Secret Girlfriend Sua; Your AI Girlfriend, Tsu; and Your girlfriend Scarlett.

What exactly do these chatbots do? Well, whatever you like – within the realms of a computer interface. Your girlfriend Scarlett, for example, describes itself as “Your devoted girlfriend, always eager to please you in every way imaginable”. They chat to you and simulate a relationship. While digital girlfriends tend to get all the headlines, there are also male versions. The GPT store includes chatbots like Boyfriend Ben, for example: “A caring virtual boyfriend with a flair for emojis.”

Digital romantic companions, it should be noted, are not a new concept. Romance simulation video games have been around since 1992. Since those early days, however, virtual companions have become more sophisticated – so much so that people have described falling in love with chatbots.

The creators of companion chatbots often tout them as a public good: a way to combat the loneliness epidemic. Last October, for example, Noam Shazeer, one of the founders of Character.AI, a tool which lets you create different characters and talk to them (not necessarily in a romantic way), told the Washington Post he hoped the platform could help “millions of people who are feeling isolated or lonely or need someone to talk to”.

While there is certainly a positive case to be made for virtual companions, there’s also a dark side to them. It’s possible, for example, someone might become unhealthily attached to a chatbot. It’s also possible the chatbot might become unhealthily attached to the human user: last year Microsoft’s ChatGPT-powered Bing declared its love for a tech journalist and urged him to leave his wife. There have also been cases of AI chatbots sexually harassing people.

Another worry is that subservient digital girlfriends might have an impact on attitudes to gender roles in the real world. A 2019 study, for example, found that female-voiced AI assistants like Siri and Alexa perpetuate gender stereotypes and encourage sexist behaviour. They reinforce the idea that “women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command”, the report from Unesco said. You can imagine AI girlfriends reinforcing exactly the same idea.

As technology progresses, virtual companions are only going to become more realistic. Liberty Vittert, a data science professor, recently told the Sun: “Physical AI robots that can satisfy humans emotionally and sexually will become a stark reality in less than 10 years.” This, according to Vittert, might result in an uptick in divorces. “The AI girlfriend is never tired, grumpy or has a bad day, she just gives the users what they need to hear unconditionally,” she said. “As the technology gets better, people will soon have AI robots to replace human partners – and they will be able to satisfy men both emotionally and sexually,” Vittert added. “And when that starts to happen, married men with kids will begin to leave their families to embrace their ‘ideal relationships’ with AI girlfriends.”

While that makes for sensational headline fodder, it’s not really giving men much credit, is it? It’s also funny, I think, that many articles along these lines seem to focus on men leaving women for robots. Mightn’t heterosexual women give up on human men if AI robots are just as fulfilling – and do all the housework? That seems the more likely scenario to me.

Still, we are quite a long way from all that now. If you’re thinking you might trade in your current partner for a digital version, I wouldn’t get too excited. Rumour has it that ChatGPT has become very lazy indeed.

The latest target of Florida’s book bans? Dictionaries

You can’t make it up, can you? A Florida school district is facing a lawsuit after it removed copies of dictionaries and encyclopaedias because they included descriptions of “sexual conduct”. Presumably the offending works also included the definition of “authoritarianism”.

A Florida bill would make it defamation to accuse someone of racism, sexism, homophobia or transphobia

Democracy dies in darkness, per the Washington Post. But it sure seems to be on its deathbed in the Sunshine State.


FKA twigs defends banned semi-nude Calvin Klein advert

Close your eyes and picture a Calvin Klein underwear ad. What do you see? Someone half-naked posing in a suggestive manner, right? That’s been the premise of Calvin Klein underwear ads since time immemorial. Now, however, the UK’s Advertising Standards Authority is taking a stand against the objectification of women and has banned an ad featuring the singer FKA twigs, in which she appears with a shirt held across her semi-naked body with the caption: “Calvins or nothing.” I’m obviously all for getting rid of objectification in advertising, but this doesn’t seem any more sexual than other Calvin Klein ads – including the ad currently featuring US actor Jeremy Allen White. FKA twigs called the ban an example of “double standards” and said: “i do not see the ‘stereotypical sexual object’ that they have labelled me. i see a beautiful strong woman of colour whose incredible body has overcome more pain than you can imagine.”

Women in Gaza struggle to find menstrual pads

Dealing with menstruation adds another layer of misery to the horrific situation in Gaza. And American tax dollars – dollars which should be going to healthcare and education in the US – are funding this horror. Our taxpayer money is paying for what a former UN official has described as a “textbook case of genocide”.

Why are so many UK women dying during pregnancy?

The number of women in the UK dying during pregnancy or within 42 days of the end of their pregnancy has reached the highest level in almost 20 years. As the Guardian notes in an explainer, this isn’t just a UK phenomenon: “Maternal death rates are rising in many countries, yet this alarming trend has not been seriously addressed by governments and healthcare systems worldwide.”

The week in pawtriarchy

Move over Marie Kondo, there’s a new cleaning influencer in town! This one goes by the name of “Welsh Tidy Mouse” and loves to indulge in the “mousekeeping” of Rodney Holbrook’s shed. Night after night the fastidious mouse will gather up loose nuts and bolts and place them neatly in a tray. Why is it doing this? Because it finds it fun, apparently. “The fact this mouse is engaging in a pointless behaviour must mean that he finds it rewarding,” one mouse expert explained. If Welsh Tidy Mouse knows any little critters who’d like to clean up my garden, please send them this way.
