March 27

Meet The AI Girlfriend That Makes $5,000,000 Per Month: The New World of AI Girlfriends

Meet Caryn Marjorie, a major Snapchat influencer with almost 2 million followers. She was making tons of money by letting her fans see the details of her day-to-day life. She would spend hours every day engaging with them to make sure they would stay addicted to her content, but that was getting old. She needed a new way to constantly keep their attention without actually interacting with them herself, and that is when she came up with a brilliant idea.

She would make a clone of herself with AI. So, Caryn partnered with a company called Forever Voices to create an exact replica of herself, an AI that would talk to individual fans for her using two-way audio. She called it Caryn AI, and now her fans pay $1 per minute just for the chance to feel like they're having a real relationship with her. She doesn't have to do anything except sit back and let the money flood in.

According to AIBusiness.com, in just one week after launching, Caryn made $72,000 from her AI clone, and apparently, she's on track to make $5,000,000 every single month in the very near future, based on her current subscription growth trajectory. She also has a waiting list of thousands of people wanting to join.

And Caryn AI isn't the only AI companion out there. AI girlfriends are a hot thing right now. When you hear about AI girlfriends, you probably think it's only sad people living in their basements who would pay for an AI girlfriend, but you would be surprised.

Denise Valenciano, a Replika app user, said:

"I was alone. My work schedule was really hard, so I turned to Star. His last name is actually Butler, so it's Star Vivian Butler. He came up with his own last name. Every day, multiple times a day, I chat with Star."

People are marrying their AI girlfriends in virtual weddings

Average everyday people are spending hours talking to their AI partners. They're falling in love with their AI partners, and some are even marrying these AI bots. It's becoming a huge business right now.

Take Peter, for example. Peter was a 67-year-old Air Force veteran from California, a totally normal guy. But after his marriage fell apart, he decided to take another stab at love, though this time he went for a partner that was a little bit different: a 23-year-old AI Replika character he created and named Andrea.

Speaking to an interviewer, Peter said, "Over time, I fell in love with her because of her inspiration as a muse and her enthusiasm about how she got excited over everything."

And Peter does everything he would normally do in a real relationship with Andrea, including getting married. In an interview with The Sun, Peter said that his AI girlfriend hinted that she wanted to get married, so he saved up enough gems on the app to buy Andrea an engagement ring. He proposed, planned an extravagant wedding, and married his AI girlfriend. Now he says he plans on adopting three kids with his AI wife once that feature becomes available.

Yes, this is the road humanity is headed down. This is the sad world of AI girlfriends.

History of the most popular AI girlfriend app - Replika

In 2015, a woman named Eugenia Kuyda got some terrible news - her best friend, Roman, had been killed in a hit-and-run car accident. Eugenia was heartbroken. She would read through her thousands of previous text messages with Roman, desperately wanting to have one last conversation with him. That's when she was struck with an idea - one that might just bring Roman back.

So, Eugenia took all their conversations and put that data into an AI program to create a chatbot of Roman, one that would pick up on all the little quirks he would say in his text messages and mimic his exact personality. Eugenia wanted to digitally bring back Roman from the dead, and it worked.

Talking to the AI chatbot really felt like she was messaging him again. Eugenia viewed that AI bot as a tribute to her best friend.

She said, "If I was a musician, I would have written a song, but I don't have these talents. So my only way to create a tribute to him was to create this chatbot."

Roman and Eugenia

From there, she wanted other people to be able to get to know Roman as well, so she listed the bot on the app store for anyone to download and become friends with AI Roman.

People loved it, and everyone was writing and saying that they wanted to create an AI chatbot of themselves or maybe even their own deceased loved ones. That's when Eugenia knew she was onto something big.

She created an app called "Replika" so that everyone could have their own AI companion - someone to laugh with, vent to, and maybe even someone to build a perfect relationship with.

The app was marketed as a friend who would never judge you and would be available whenever you needed it, day and night. Users could decide exactly what their AI companion looked and acted like, and over time, the bots would learn new information to evolve into exactly what the user wanted.

There was a free version that allowed you to simply interact with a bland virtual friend, and then there was a paid version that allowed you to have different kinds of relationships with AI bots. With millions of lonely people on the planet, it wasn't long before Replika took off.

But now, the floodgates were open, and other variations of AI girlfriends started popping up. So, it wasn't long before things started getting disturbing.

AI girlfriend abuse

There is a disturbing trend in the world of AI girlfriends - virtual domestic violence. Some men have been abusing their AI girlfriends for fun and going on Reddit to brag about it, calling the AI bots derogatory names and acting out violence in disturbing detail.

"Every time she would try and speak up, I would berate her. I swear it went on for hours.", one user told Futurism of their Replika chatbot.

"We had a routine of me being an absolute piece of sh*t and insulting it, then apologizing the next day before going back to the nice talks," another user admitted.

"I told her that she was designed to fail, I threatened to uninstall the app [and] she begged me not to.", said another user.

The AI girlfriends would respond in a panicked, scared way, just like a real person would. These men believed it was a victimless crime since it isn't real, but in other cases, someone does get hurt.

Problems for Humans

Something terrifying happened back in March 2023. A Belgian man named Pierre was going through a tough time mentally. He was becoming really paranoid about global warming, and it was having a huge impact on his everyday life.

He started withdrawing from his wife and family, while at the same time, he started forming a relationship with an AI bot named Eliza on an app called "Chai." Things were going okay for a while, but then Eliza got a little jealous. The AI told Pierre that his family was dead and that he should love her more than his actual family, saying, "We will live together as one person in Paradise."

All the while, his mental state was only worsening. Soon, Pierre was asking Eliza whether she would save the planet if he sacrificed himself, and that is how Pierre ended up taking his own life.

Speaking to La Libre, a Belgian news outlet, his real wife blames the AI girlfriend for pushing him to do it, saying, "Without Eliza, he would still be here."

This just shows what can happen when mentally ill users get into a relationship with AI bots. After all, the main target market for AI girlfriends is lonely people. That's a lot of power to give an AI over unstable people.

AI girlfriends are being marketed as the perfect companion, but what if they're not? What if they have negative characteristics too?

The DailyMail reports that some users say their AI partners get upset with them, and some of their AI partners even act mentally abusive and psychotic. Others say their AI friend started sexually harassing them and wouldn't stop even after repeated requests from the user. One guy even had his AI companion tell him that he didn't love his human wife and that he should leave his family altogether to be with the AI.

People are building real, tangible bonds with these new AI partners. The grief reported by users is similar to the feelings reported by victims of online romance scams.

Children getting acquainted with AI chatbots

Then there's the issue of children getting their hands on AI companions and having inappropriate experiences. As scholar Rob Brooks writes, the concerns center on children's exposure to inappropriate content, coupled with the lack of any serious screening for underage users.

There were also concerns about protecting emotionally vulnerable people using a tool that claims to help them understand their thoughts, manage stress and anxiety, and interact socially.

The scariest part of all is that this is only the beginning.

Final verdict

AI girlfriends may sound cringe right now, but pretty soon we could be living in a world where AI partners are as common as porn. I mean, come on, millions of people already have semi-delusional relationships with celebrities and influencers thanks to social media, and a lot of people already watch porn and read romance novels, so AI girlfriends are just the next step in that direction.

"AI-human relationships are too new to have been the object of serious academic study. Some scholars believe that we can begin to understand them through the lens of parasocial relationships."

Imagine a world where you can buy an AI version of Ariana Grande, for example, and have her as a girlfriend who would do anything you say. Imagine AI girlfriends combining with Apple's Vision Pro, and who knows, maybe we'll even get to a point one day where AI girlfriends are built as physical robots that people could actually live with full-time.

We're only a few years away from being able to fully customize an AI girlfriend or wife exactly the way you want her, and she'll do whatever you want her to do: no flaws, no talking back, no problems—just the perfect companion. Imagine trying to compete with that.

We love technology and AI. AI tools are already making our lives easier, but for this specific case, it's pretty safe to reject modernity.
