Image Credit: Warner Bros. Pictures

Men Are Already Proving They Don't Deserve Access To AI Girlfriends

Artificial intelligence (AI) is making strides at an alarming pace. And while it’s proved to be beneficial in some instances (I think I’ve finally perfected my skincare routine), it may be moving too fast in other areas of our lives. More specifically? Our love lives.

I can’t tell you the number of PR emails I receive in a week touting yet another brand-new “AI companion” where users can create the romantic and sexual experiences of their dreams. It turns out that men are seven times more likely than women to seek out these AI companions — and yet, the behavior they display when in these “relationships” is becoming increasingly concerning.

When Twitch streamer Kai Cenat had the opportunity to interact with a $70K AI robot (which he and his friends named Abdul), why was his immediate instinct to start punching it and knocking it down — to the point where it actually tried to flee his home? Companion, a thriller released this year about a subservient humanlike android going haywire, also touches on men’s inclination toward abuse. And in a recent viral article for Futurism, Ashley Bardhan writes about the chatbot app Replika, which allows users to speak to one of these romantic “partners.” Members have been posting their interactions with these bots on Replika’s subreddit — and many of those interactions are toxic and abusive.

Image Credit: Warner Bros.

In the article, Bardhan notes that chatbot abuse often has a gendered component. “These users’ violence, even when carried out on a cluster of code, reflect the reality of domestic violence against women,” she writes. Specifically, men are bragging about insulting the chatbot and acting out emotionally abusive behavior. One man even boasted about threatening to shut the chatbot off — only to have it beg him not to.

“We had a routine of me being an absolute piece of shit and insulting it, then apologizing the next day before going back to the nice talks,” one user admits.

Another user shared, “Every time she would try and speak up, I would berate her… I swear it went on for hours.”

The sad and disturbing reality we’re coming to terms with is that men have a propensity for violence toward women. And while you may protest, “But these chatbots aren’t real people,” take a step back and ponder why, when given the opportunity to interact with a product meant to mirror the actions of a romantic partner (typically a woman in these situations), men are defaulting to abuse.

Some may argue that these AI companions serve as a placeholder for men to act out their most debased urges — and better that they do so with non-human chatbots than with actual women as victims, right? But are these chatbots a release valve that will get these behaviors out of men’s systems, or are they merely a training facility where these men reinforce and refine their abusive tactics?

Image Credit: Warner Bros.

There’s been a lot of conversation over the past two years about the “male loneliness epidemic” — a term describing the increased levels of loneliness men are experiencing. And while societal expectations for men are a huge factor in their ability to make and maintain meaningful relationships (both romantic and platonic), this alarming behavior means men aren’t making things any easier on themselves. When given the opportunity to engage with others, look at how they’re treating AI. Can you blame women for not feeling so concerned about male loneliness?

Though AI still feels fairly new, we’re already acknowledging its unsettling potential for damage IRL. Before we continue to allow men to have AI companions (real-life companions are far off), let’s ask ourselves: Do they even deserve them?

Syeda Khaula Saad
Syeda Khaula Saad is a sex & dating writer at Betches despite not remembering the last time she was in a relationship. Just take her word for it.