The Volokh Conspiracy
Mostly law professors | Sometimes contrarian | Often libertarian | Always independent
How Do Men and Women Use AI Romantic Companions, and What Does It Mean?
Differences in the frequency and style of usage persist; here is what that may tell us
The New York Times recently published an article about Chinese women's use of AI for romantic purposes and how it fails to align with the Communist Party's goals for them: marriage and childbearing. This made me curious about what we know more generally about gender differences worldwide in AI romantic interactions. Some findings and tidbits of possible interest:
- While exact estimates vary, men continue to be the predominant users of AI romantic companions.
- According to one study, "[m]ales were significantly more likely than females to report using AI-generated pornography and to view or interact with AI content for sexual gratification."
- According to counseling psychologist Saed Hill, who has a special interest in masculinity, "some of his male patients express a preference for the passivity and constant affirmation of their AI girlfriends over the potential conflict or rejection they could encounter in real-life dating."
- King's College "AI & Society" professor Kate Devlin (with whom I spoke about sex robots on my podcast three years ago) stated in relation to women's use of AI romantic companions: "The amount of toxic crap that women get online from men, particularly when you're trying to do things like online dating--if you have an alternative, respectful, lovely, caring AI partner, why would you not?"
- Coming back full circle to the China example, a study of Chinese women in that context found that "women utilized the virtual space to dismantle traditional heterosexual norms."
Harvard Fellow and human rights activist Maria Kuznetsova offers this warning about men's use of AI romantic companions:
In this digital fantasy, consent is nonexistent—because AI, by design, can never say no. This could lead some users to practice violating boundaries, engaging in non-consensual or even violent behavior that real women would reject. But aggression without real-life consequences is still aggression. When violence becomes acceptable in a virtual space, it can normalize similar behavior toward real people.
We know that the rise of internet pornography has led to increased sexual violence and objectification of women. AI dating could be pornography on steroids.
She concludes that "rising polarization in America--including along gender lines--risks being further intensified by this emerging technology."
The possibility that men at times use AI romantic companions to obtain various forms of submission (while women are more likely to use these companions in search of equality and emotional closeness) is potentially consistent with recent survey data on gender role expectations. In an Ipsos survey of twenty-nine countries released this month for International Women's Day, 31% of Gen Z men (compared to only 13% of Baby Boomers) agreed that "[a] wife should always obey her husband." Relatedly, 53% of American men specifically (averaged across age categories) believe that "[m]en are being expected to do too much to support equality."
This also seems like a good time to recall the Columbia Journalism Investigations survey numbers: one in six women who used dating apps reported being sexually assaulted by someone they met through an app (and more than half of these women said they were raped). In light of all this data, Kuznetsova's concerns about increases in polarization and violence deserve to be taken most seriously.
There is solid reason to believe that not all AI romantic chatbot use is created equal or value-neutral. If part of why some Gen Z men turn to AI is the struggle to find female romantic partners who will offer them obedience (a dynamic connected to the so-called "male loneliness epidemic," which I discussed on my podcast here), that is diametrically opposed to women's desire for equality, in AI relationships or generally.
Unfortunately, things aren't as simple as men and women with different values going their own ways and each separately embracing technological happiness. We still have to decide as a society whether our system of laws will promote or erode equality. And as much as an AI companion can engage in fantasy stories about having children--one of the NYT-described interactions between a Chinese woman and AI had the bot promising it would engage in "[l]earning how to protect you and our baby"--that is a far cry from the physical equivalent.
AI has become another tool in our cultural impasse, and its manufacturers currently wield great power over the visions of gender and romantic/sexual interactions they wish to promote. The online porn industry is indeed likely a good indicator of what we can expect in that regard in the near future.