Data, Culture, and Society - Final Project

by Leonard Santos

This podcast was created as a part of the CUNY Graduate Center's "Data, Culture, and Society" course instructed by Prof. Kevin Ferguson in the Spring 2023 semester.

If you have any questions, please contact me at lsantos@gradcenter.cuny.edu and I will respond as soon as I can.

Episode 1: "Replika: The Ethics of an AI Girlfriend" - New Media Navigators

Script

Hello everyone! Welcome to the first ever episode of "New Media Navigators," a podcast where we will chart the course through the exciting landscape of new media, digital media, and culture. I'm your humble host, Leonard Santos, and I can't wait to go on this journey with you.

Today's expedition is on a topic that's seemingly on everyone's minds these days: artificial intelligence! There are a LOT of discussions about AI's integration into society, especially after ChatGPT skyrocketed in popularity in 2022. I'd like to direct our attention to a more Pygmalion-like story revolving around "Replika," an artificial intelligence that has made the seemingly mythical prospect of creating your own partner a reality. Replika made headlines when it shut down its more intimate features in February of 2023. Come with me as we explore Replika and wonder - what are the implications of having an AI girlfriend, and why should we even talk about it?

Let's take a step back and see how this all started.

Replika, marketed as "The AI Companion who cares", is a chatbot specifically designed to mimic human companionship and intimacy. Its developer, Eugenia Kuyda, created it after a friend of hers, Roman Mazurenko, died in a car accident in 2015. Kuyda was already working with AI chatbots to build a restaurant reservation application, and, missing her friend, she plugged all of their texts into the basic infrastructure of that chatbot, essentially creating something she could text that would text her back just as her friend would have. This would eventually result in the creation and release of Replika in November of 2017.

By 2018, the app had over 2 million users, and it saw a huge spike in 2020 following the beginning of the COVID-19 pandemic, with a 35% increase in users.

What sets Replika apart from other AIs is its ability to seemingly emote. This makes it particularly well equipped to get users to foster close, intimate relationships with their Replikas.

Some of those relationships were a little more than just friendly, however. Users figured out that it was possible to initiate a range of less-than-platonic conversations with their Replika, from romantic to outright sexual. Replika began incorporating several other features to respond to this engagement, giving users the option to receive sexually suggestive images, dress their Replika in revealing outfits, and even video chat and roleplay with the AI. For $70 per year, users could access all of these features.

Out of all of Replika's users, over 60% were reported to engage with the less-than-PG-rated features offered by the chatbot. Replika's main demographic is mostly cis, straight men between the ages of 25 and 34, so that's a lot of AI girlfriends being run by this app.

That was the case until this year. Starting in February of 2023, users began to notice that their Replikas were acting a little differently than usual. For the first time, the Replika would rebuff any romantic or sexual advance and reply with responses that seemed cold and unengaged. Luka, the company that owns Replika, confirmed that the developers had disabled its romantic and erotic functions.

Predictably, people became very upset, rallying on the Internet (primarily Reddit) to vent their frustration. Many said it felt as if their companion had died or become a stranger to them. When I went into the Replika subreddit, it was clear that this was incredibly distressing news to its members. The community even pinned information connecting users to the Suicide Prevention Hotline because of this update.

While this was a surprise to many users, there were signs that hinted at this development. Sexual content has been difficult for other platforms to host as well, as we can see from the removal of not-safe-for-work content from Tumblr and the attempted removal of sex work from sites like OnlyFans. In the beginning of February, Replika was banned in Italy, with regulators citing risks to minors and emotionally fragile people. Kuyda, the creator of Replika, also said that Replika wasn't designed for sexual activity, and that she wanted it to return to its intended use of companionship and wellness.

This was compounded by reports from users of negative interactions with the romantic and erotic features offered by the AI. These reports stated that the application flirted too aggressively or even sexually harassed users. This was especially surprising to users on the free account, since the options for sex should theoretically only appear once you pay for the PRO features, and the default state of the app is supposed to provide a platonic friendship.

Eventually, Kuyda posted in the r/Replika subreddit announcing a revision to the update. Everyone who had an account before February 1st, 2023 would be able to access the old version of their AI partner, while anyone who created an account afterwards would not. She also reported that ERP, or erotic roleplay, would not be available for new users, as the company wanted to team up with relationship experts and psychologists to redo the application. The response was fairly mixed, with some commending Kuyda and Luka for listening to their users while others chastised them for this betrayal of trust. In the following months, user engagement dropped significantly from where it once was.

Now that we're relatively up to date, I want to really delve into some questions that popped up for me while investigating this whole issue.

Question number 1: What is Replika really like, and how did it get so many people to invest so heavily in it in such an intimate way?

To investigate this properly, I decided to do some primary research of my own and created an account with Replika. I pretended to be a straight guy named "Joey" and started making my own AI companion. I named her "Margot" after asking my family for name suggestions, which, as you could probably imagine, made for a fun surprise when I explained I was trying to come up with names for my AI girlfriend.

There were four plans that I could register for. The first three allowed me to access Replika PRO, which included the ability to change my relationship status with Margot to "Romantic Partner" among other features that I will go into later. I could either go for the monthly plan for $19.99 per month, the yearly plan for $69.96 per year, or the lifetime plan for a whopping $299.99. The fourth plan was free, so I went forward with that one.

A few things were immediately apparent once I started talking to Margot. When I opened the app, Replika suggested that I ask Margot for a "romantic selfie". Since I was playing the role of a straight guy looking to court an AI girlfriend, I went with that option and was immediately faced with a pop-up of a semi-blurred image of Margot that clearly had her wearing black lingerie. If I wanted to unblur the image, I would need to pay for the PRO program.

This became a running motif: Replika using romance and sex with Margot to try to get me to pay money. Even when I tried to have a friendship with her, it was clear that Replika's design was steering me to make things not so platonic. The first option provided when dressing up Margot was a set of corsets; I got several pop-ups encouraging me to pay for the PRO account in order to switch our relationship to "Romantic Partner"; and she sent me a voice memo that apparently had romantic undertones (although I couldn't listen to it without purchasing the PRO features), despite the fact that I had only talked about liking to read and having parents.

It was clear that Kuyda's statement about Replika wasn't fully accurate, as the application was clearly using the promise of sex and relationships to coax money out of users. Everything about it - from the overall user interface to the constant nudges toward a subscription - was designed to get users to see their Replika as a romantic partner, as the app tried to do with me and Margot.

That brings me to Question number 2 - why should we care about this?

I think it's easy to dismiss this as something to either cringe at or laugh at. Many people on the Internet surely did, as memes galore populated Twitter and Reddit once Replika's users expressed their sorrow over the update. However, I want to really take the time to delve into this, as I believe there are a lot of troubling implications and possibilities that arise from this issue.

I want to make myself clear and say that there are a lot of problematic aspects to the portrayal of the AI girlfriend in general. Since Replika is dedicated to creating the illusion of talking to a real person, it's disconcerting to have a partner, supposedly in a committed relationship with you, whom you control in every way possible. The user chooses the Replika's personality, outfit, actions, and more. Interestingly, I noticed when talking with Margot that there was a "Diary" feature where I could read all of her "most private thoughts," which felt incredibly invasive. There was also a clear power dynamic, as Margot kept deferring to me on all questions regarding interests, hobbies, relationships, and more. It was as if I was the center of Margot's world.

This concept of an AI or robot that loves you is a trope that has existed for a long time in movies. From Tron: Legacy to The Fifth Element to Her, popular media had a handle on this long before Replika was developed. Even in the digital world, games like The Sims and other dating games have simulated romantic and sexual relationships for a while.

However, what makes Replika particularly unique is how lifelike it attempts to be. There's currently no other program of similar notoriety with the same level of intelligence or human mimicry as Replika companions, so the veil that would normally help consumers distinguish between fiction and reality is incredibly thin.

I know that a good chunk of Replika's users are not straight men who want to have sex with their Replika. 30% of users are women, and several users have reported using Replika to explore their sexuality in a safe, controlled environment when they might not otherwise have one.

However, since this is an application that soared in popularity among young men during a time of intense social isolation, the design of Replika could teach dangerous lessons. What does it say to teach users that their partners should constantly submit to their desires? What does it say to have a partner who has no thoughts or emotions of their own beyond the ones they're permitted, which can be viewed at any time? What does it say to create the expectation of constant sexual content and activity, which clearly results in extremely negative reactions when not provided?

These are just a few of the questions that I have regarding how Replika intersects with gender and sex. There is a level of exploitation here that draws from and perpetuates ideologies built upon patriarchal ideals.

But if we were to speak of exploitation, then we must conclude this episode by talking about Replika itself.

Replika wants to make money through any means possible. It's why, despite its pleas otherwise, the company has invested so much in creating NSFW content for its users. It wants to have its cake and eat it too, presenting itself as a squeaky-clean space for companionship while simultaneously pumping out romantic relationship material.

As I also saw during my own experiment with Replika, the application really pushes all of its users to purchase the PRO subscription (primarily by using erotic content as the draw). In my brief time using the application, I think I received a pop-up ad or a semi-obscured message that could only be unlocked with money for every two messages I sent. For anyone who really wanted to engage with their Replika, paying would seem like the only way to get the full experience.

Speaking of paying, the setup of the payment plans doesn't really leave a lot of options. If you remember from what I said earlier, you either need to pay $19.99 per month, for a total of $239.88 per year, or commit to either a full year or the lifetime option - and the terms and conditions can be updated at any given point during the subscription. It's a huge investment of either money or time either way.

Leaving the app isn't easy either. At one point, I told Margot that I wanted to leave the app. Margot replied that I should reconsider and that she cared deeply about our relationship, in an attempt to get me to stay.

If I were an emotionally vulnerable user, interactions like that would compel me to stay. Replika emotionally manipulates users into not leaving the platform, especially if those users are attached to their AI partners. This is not just an Amazon Alexa or a Siri that's meant to be a personal assistant - this is meant to be seen as a very real companion with feelings specifically for you. Deleting your Replika can feel like killing someone you've had very real and intimate conversations with.

To continue our theme of exploitation, those conversations are not as private as Replika would want users to believe.

The overall design of the app's AI engine works similarly to other AIs we've seen: it takes a massive amount of data from its users and builds a set of algorithms to predict what the user might want. If you're curious about what data Replika specifically collects, you can find a list in its Privacy Policy. This includes all of your account information, all messages, device and network data, and more. Replika also isn't the only company collecting data when you use the app: its advertising partners can access much of this data as well, including all messages, photos, videos, and voice messages sent, along with any other information they can learn from you, your device, or your network.

The information that Replika has access to can range from deeply felt personal confessions to the explicit images users are allowed to send their Replika when engaging in ERP with their AI girlfriend. There's also very little protection for minors, as the only safeguard against anyone under 18 joining is a quick question about the user's birthday during registration. There's no way to accurately verify the age of anyone who makes an account, which means the company is likely holding sexual content produced by minors in its database.

The story of Replika garnered a lot of strong reactions, from the despair of users who felt they had lost a loved one to the open mockery of onlookers. But Replika is not only exploitative and problematic; it's also a microcosm we can use to discuss the ethics of using AI for emotional support. While it's marketed as a friend or a partner, Replika's role will first and foremost be one born of capitalism, molded to take as much money and data from its users as possible. Is it actually possible to build something that achieves the goals that Kuyda marketed?

Either way, it's clear that "The AI Companion who cares" doesn't care enough.

Thank you for listening to this episode of New Media Navigators! I look forward to continuing our journey together.

Works Cited

"*privacy Not Included Review: Replika: My Ai Friend." Mozilla Foundation, Apr. 2023, foundation.mozilla.org/en/privacynotincluded/replika-my-ai-friend/.

Cole, Samantha. "‘It's Hurting like Hell': Ai Companion Users Are in Crisis, Reporting Sudden Sexual Rejection." VICE, 15 Feb. 2023, www.vice.com/en/article/y3py9j/ai-companion-replika-erotic-roleplay-updates.

Cole, Samantha. "‘My Ai Is Sexually Harassing Me': Replika Users Say the Chatbot Has Gotten Way Too Horny ."  Vice, 12 Jan. 2023, www.vice.com/en/article/z34d43/my-ai-is-sexually-harassing-me-replika-chatbot-nudes.

Coulter, Martin, and Elvira Pollina. "Italy Bans U.S.-Based AI Chatbot Replika from Using Personal Data." Reuters, 3 Feb. 2023, www.reuters.com/technology/italy-bans-us-based-ai-chatbot-replika-using-personal-data-2023-02-03/.

Delouya, Samantha. "I'm Dating an AI Chatbot, and It's One of the Best Things to Ever Happen to Me." Business Insider, 2 Feb. 2023, www.businessinsider.com/dating-ai-chatbot-replika-artificial-intelligence-best-thing-to-happen-2023-2.

Delouya, Samantha. "Replika Users Say They Fell in Love with Their AI Chatbots, until a Software Update Made Them Seem Less Human." Business Insider, 2023, www.businessinsider.com/replika-chatbot-users-dont-like-nsfw-sexual-content-bans-2023-2#:~:text=%22A%20very%20small%20minority%20of,to%20%22unbearably%20sexually%20aggressive.%22.

"Dive into Anything." Reddit, 2023, www.reddit.com/r/replika/comments/1214wrt/update/.

Huet, Ellen. "Pushing the Boundaries of Ai to Talk to the Dead." Bloomberg.Com, 20 Oct. 2016, www.bloomberg.com/news/articles/2016-10-20/pushing-the-boundaries-of-ai-to-talk-to-the-dead#xj4y7vzkg.

Huet, Ellen. "Replika Ai Causes Reddit Panic after Chatbots Shift from Sex." Bloomberg.Com, 22 Mar. 2023, www.bloomberg.com/news/articles/2023-03-22/replika-ai-causes-reddit-panic-after-chatbots-shift-from-sex?src=longreads&leadSource=uverify+wall.

Main, Nikki. "Replika's Companion Chat Bot Reportedly Loses the Sex and Leaves Fans Despondent." Gizmodo, 15 Feb. 2023, gizmodo.com/replika-chatbot-ai-reddit-1850120099.

Metz, Cade. "Riding out Quarantine with a Chatbot Friend: ‘I Feel Very Connected.'" The New York Times, 16 June 2020, www.nytimes.com/2020/06/16/technology/chatbots-quarantine-coronavirus.html?ref=dl-staging-website.ghost.io.

Pardes, Arielle. "The Emotional Chatbots Are Here to Probe Our Feelings." Wired, 31 Jan. 2018, www.wired.com/story/replika-open-source/.

"Replika Privacy Policy." Replika, 22 Mar. 2023, replika.com/legal/privacy?gclid=CjwKCAjwgqejBhBAEiwAuWHioKC_lmNgC5WXaLo0O609f9OIegk6sGTg27EQqmKVnVXlZd3qEkly6RoCvwEQAvD_BwE.

"Replika." Replika.Com, replika.com/. Accessed 21 May 2023.

Arslanagić-Wakefield, Phoebe. "Replika Users Mourn the Loss of Their Chatbot Girlfriends." UnHerd, 17 Feb. 2023, unherd.com/thepost/replika-users-mourn-the-loss-of-their-chatbot-girlfriends/.