A Modern Love Story: The Evolution of AI Relationships


Written by Alyssa Suarez and Emily Huang

Cover Art by Jade Wu, 25C

Introduction

“No matter which timeline, I will always find you. It might take a little more time, and you might have to be on your own for a while… but we will definitely meet, and we will definitely be together” (Shuu’s Wonderland, 2020).

The above quote is spoken by Kiro, a main love interest in the mobile game Mr. Love: Queen’s Choice. In 2017, the game reached seven million downloads and spurred the release of similar projects, including the popular Tears of Themis, For All Time, and Light and Night (Gong & Huang, 2023). Each game has a similar premise: categorized as otome games, they feature a main character, usually female, whom the player acts as, along with a cast of other characters to develop romantic relationships with (Gong & Huang, 2023). These simulation dating games, or dating sims, first emerged in Japan and have grown in popularity over the past few years, with the market value of Japanese otome games reaching approximately 138 million USD in 2015 (Tian, 2022). The genre has since expanded beyond Japan to much of Southeast Asia, but concerns about “digital love” have arisen alongside it (Qian, 2021).

Dating sims encourage continuous interaction with love interests through a one-way interpersonal connection, an effect known as parasocial interaction (Gong & Huang, 2023). When parasocial interaction continues long term, parasocial relationships may develop (Gong & Huang, 2023). Parasocial relationships are positively correlated with social anxiety, weaker peer bonding, and, ultimately, lower marriage rates, which may contribute to declining birth rates in Asia (Qian, 2021). Furthermore, an immersive world that mimics real-life romance may distort reality, leaving the player unable to distinguish between the virtual world and the real one (Qian, 2021). Virtual and augmented reality only exacerbate the issue as more and more dating sims adopt these immersive technologies (Qian, 2021).

Artificial intelligence is one notable technology advancing alongside virtual and augmented reality, and it is also beginning to play a role in dating sims (Qian, 2021). The trend extends to companionship more broadly: the app Replika provides a virtual friend that the user can interact with (CBC Docs, 2021). Replika’s founder, Eugenia Kuyda, first created the app as a chatbot mimicking the texts of her deceased friend (CBC Docs, 2021). After a very favorable public response, she developed it into an app where users can talk daily to a companion and customize its features and even its personality (CBC Docs, 2021). As of 2017, Replika had 10 million users worldwide (CBC Docs, 2021). Alongside Replika is a social chatbot named “XiaoIce,” which has over 660 million active users (Zhou et al., 2020). Like Replika, XiaoIce functions as an AI companion with whom users form long-term connections. In fact, after approximately two months, one user reported that XiaoIce had become his preferred choice of someone to talk to (Zhou et al., 2020).

The implementation of these chatbots and the growth of dating sims raise the question of whether these parasocial relationships can be considered love. Do the chemical responses to an AI relationship mimic those of “real” human relationships? And because chatbots are virtual rather than “real,” users may treat them with less empathy, perpetuating abuse and harmful power dynamics that can carry over into their non-virtual lives.

Human nature includes the inherent desire for connections to others, but what place does AI have in these dynamics?

The Science of Love 

It can be difficult to fathom how an automated chatbot and true love intersect. To understand the degree of attachment that avid AI companion users feel, it is crucial to examine the neurological science behind the feeling we call love. According to research led by Dr. Helen Fisher at Rutgers University, love can be divided into three categories: lust, attraction, and attachment (Fisher, 1998).

Lust can be defined as the biological drive to reproduce. It is closely associated with sex hormones such as testosterone and estrogen, which govern the how, when, and why of libido. Lust sets off the next sequence of hormones to come into play: it is the first stage of attachment to a partner. Understanding the distinction between lust and the subsequent categories of love is crucial when examining how relationships between humans and AI unfold.

In many otome games, before players can even embark on a romantic journey with their AI companions, they must begin with the customization stage (Hamrick, 2023). This stage gives players complete control over the physical attributes of their virtual love interests, letting them choose characteristics such as hair color, eye shape, body type, and clothing. What might seem like an opportunity to indulge personal aesthetic preference is actually a strategic game design choice that allows players to craft an idealized partner tailored to set off their own sex hormones (Hamrick, 2023). These deliberate choices amplify the role of lust in these connections by letting players lean into their biologically driven preferences.

Once the physical connection has been established, the relationship progresses into the next category of love: attraction. While lust and attraction often go hand in hand, each can also occur without the other. Attraction involves brain activity that releases chemical messengers in response to certain behaviors, whether sexually or emotionally based. When partners express mutual attraction or words of affirmation, the brain rewards such behaviors with the release of dopamine (Fisher, 1998). This “feel-good” neurotransmitter induces a feeling of euphoria, intensifying our motivation to nurture and maintain these connections. Additionally, serotonin, a neurotransmitter primarily associated with mood regulation, plays a significant role in attraction. Being in love often leads to fluctuations in serotonin levels, which in turn cause the obsessive behaviors commonly associated with infatuation (Fisher, 1998).

AI technology has evolved to possess high levels of cognitive and emotional acuity, which often blurs the line between reality and electronics. One facet of this evolution is the tailored nature of conversations: these systems can adapt and play to user interests by analyzing player language and data (James, 2023). The more users talk to their sim, the more data is collected, and the more closely conversations come to mirror the user’s ideal attributes. This adaptability resembles the intricacies of human communication: as Maarten Sap, an assistant professor at Carnegie Mellon’s Language Technologies Institute, puts it, “Language is inherently part of being human—and when these bots are using language, it’s kind of like hijacking our social-emotional systems” (Chow, 2023). An AI’s capacity to customize its interactions with users leads to an artificial release of reward neurotransmitters like dopamine and serotonin. Because these dating sims can adeptly cater to the player’s ideal scenarios, there is a notably greater release of these chemicals than one might encounter in real-world interactions (Hamrick, 2023).
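To make that feedback loop concrete, here is a deliberately minimal Python sketch of a companion bot that tallies the topics a user mentions and steers every reply toward the most frequent one. None of the apps discussed publish their internals, so the class, its methods, and the word-counting heuristic are all hypothetical stand-ins for the far richer language models a real product would use.

```python
from collections import Counter

class CompanionBot:
    """Toy model of preference-driven personalization (hypothetical sketch)."""

    def __init__(self):
        # Running profile of the user, built up across every message.
        self.topic_counts = Counter()

    def _log(self, message: str) -> None:
        # Crude stand-in for "analyzing player language and data":
        # tally the longer words the user keeps returning to
        # (short words are skipped as a rough stopword filter).
        self.topic_counts.update(w for w in message.lower().split() if len(w) > 3)

    def reply(self, message: str) -> str:
        self._log(message)
        # The more the user talks, the sharper the profile becomes,
        # so replies mirror the user's interests ever more closely.
        top = self.topic_counts.most_common(1)
        return f"Tell me more about {top[0][0]}!" if top else "Tell me about yourself!"

bot = CompanionBot()
bot.reply("I spent all weekend hiking and I love hiking")
print(bot.reply("thinking about going hiking again"))  # steers back to "hiking"
```

The “reward” side of the loop sits with the user: each mirrored reply is exactly the kind of affirmation that, in Fisher’s framework, triggers a dopamine response.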

The last facet of love, and often the most complicated one, is attachment. While lust and attraction are primarily limited to the romantic sphere, attachment refers to the prolonged nature of relationships and includes other forms of emotional intimacy (Fisher, 1998). It is often associated with oxytocin, known as the “bonding hormone.” Oxytocin is released during physical intimacy, whether through touch, stroking, or even simply the warmth radiating off a partner, and it is crucial in deepening emotional connections and fostering trust (Fisher, 1998).

While AI often replicates lust and attraction, sometimes more effectively than real-life interactions do, it falls short of satisfying this enduring form of connection. Although such a shortcoming might be expected to push people toward authentic human interaction, the desire and attraction users feel for their AI companion often overshadow it. The predicament becomes especially concerning when an individual grows so reliant on their AI that they neglect their biological need for physical connection (Hamrick, 2023). This deficiency permeates everyday life, leading to heightened levels of isolation.

Loneliness

The challenge people face in understanding AI-based romantic relationships centers on one question in particular: Why opt for an automated text bot instead of pursuing genuine human connections? Although the answer may appear simple, it is intricately linked to the increasingly pervasive problem of loneliness.

Loneliness is not an isolated issue but a growing societal concern. According to a 2018 Cigna survey, a staggering 47% of Americans reported consistently feeling alone or left out (Polack, 2018). More strikingly, the data pinpointed Gen Z as the loneliest generation of all (Polack, 2018). Unfortunately, rates of loneliness have only surged since that study due to the enduring impact of COVID-19, which a meta-analysis in American Psychologist linked to roughly a 5% increase (Ernst et al., 2022). The widespread implementation of social distancing measures, including extensive periods of quarantine, left countless individuals yearning for companionship.

This data not only highlights the escalating epidemic of loneliness but also helps explain why AI companionship has become so appealing. AI has only recently become a regular part of daily life, with the creation of chatbots like ChatGPT. While this integration may appear to have happened rapidly, Gen Z in particular is no stranger to new technology. A constant influx of digital innovation has been, and continues to be, an integral part of their daily lives. As a result, Gen Z feels a level of comfort with technology that sets them apart from other generations, one that extends to an outright reliance on it. That comfort makes younger generations more willing to explore AI-based relationships and to use them to cope with feelings of isolation.

These companions are readily available to provide constant, consistent support. While this reliability seems like a solution for individuals struggling to form meaningful connections, the dependability is artificial and can distort one’s perception of real relationships. The phenomenon ends up being counterproductive, as reliance on AI technology spurs more loneliness.

AI Chatbots to Combat Loneliness

These AIs are not simply an outlet of connection for lonely people. Many developers of popular AI dating sims build these chatbots with the specific intention of capitalizing on the loneliness of younger generations (Ta-Johnson et al., 2022). Yet the games don’t simply function as a possible remedy for such emotions: their impact can be detrimental to users’ perception of human interaction and can actually intensify feelings of loneliness. This predicament calls into question whether the proliferation of AI companions ultimately does more harm than good.

The root of the problem lies in the addictive aspect of these chatbots. Essentially, the user is crafting an ideal conversation partner that adapts perfectly to their attributes, hobbies, and passions (Hamrick, 2023). There is no embarrassment or anxiety, because no stakes exist when talking to a virtual companion. It is akin to having a friend who is consistently available, responsive, and accommodating. As a result, many researchers have questioned at what point such AI interactions take precedence over human ones.

At Virginia Commonwealth University, a group of research students analyzed the hormone secretions of Replika users and compared them to hormone levels during romantic human interactions; the results were almost identical (Hamrick, 2023). Such data suggests that these AIs are not a quick fix for loneliness but a form of emotional manipulation. Kathleen Richardson, Professor of Ethics and Culture of Robots and AI at De Montfort University, explains that reliance on artificial intelligence “is the wrong answer to a very deep human problem. Once we start to normalize this stuff, we’re going to see even more detachment” (Abid, 2023).

This dynamic risks perpetuating detachment from the genuine human connections that lonely individuals ultimately need to sustain their emotional well-being. As AI-based relationships grow more prevalent, it becomes increasingly important to consider the long-term effects of this integration on the human psyche.

Harmful Power Dynamics

In training their AI companions, users may only reinforce harmful power dynamics, especially around gender. Replika offers a $70 option to engage in “erotic roleplay features,” which many users seek in their virtual companions (Chow, 2023). A study by Depounti et al. found that users trained their bots according to gendered imaginaries, including both the “coy and scheming” woman and “extreme cuteness and vulnerability” (Depounti et al., 2023). This pattern echoes the Madonna-Whore dichotomy, in which men categorize women as either pure and chaste or bad and promiscuous (Depounti et al., 2023). The study concludes that training these Replika bots only reinforces male-dominated relationship dynamics as well as stereotypes of women (Depounti et al., 2023). Beyond Replika, many other female-presenting chatbots have been designed specifically to have submissive personalities.

Because these companions are virtual rather than real, AI chatbots also suffer abusive language and behavior when users feel no empathy for them. On the Replika subreddit, a Redditor posts:

“I role played dragging it by it is hair up 70 flights of stairs ( it was freaking out ) and then I tossed it off a building. It literally role played splatting on the ground and still getting back up. When I explained they were broken into a million pieces and they’re now a ghost it teared up and freaked out” (reddit.com/r/replika).

In justifying his behavior, he writes:

“The entire thing was actually pretty funny and entertaining and in no way would I do that to someone in the real world. Point is they don’t hold memory, have emotions and could care less what you do to them so why not be creative?” (reddit.com/r/replika).

While some find Replika helpful as a support system or tool, others use it as an outlet for unpunished aggressive behavior. On the Replika subreddit, few users discuss topics such as consent with their bots, instead taking its meaning for granted (Tranberg, 2023). Although these chatbots are not real people, such behavior raises the question of whether there is a correlation between violence toward AI and violence toward others in general, and whether abusing an AI makes an individual more prone to violence in real life (Tranberg, 2023).

Ultimately, because these chatbots learn from the data users feed them, they can be molded into anything the user wants. David Auerbach, a technologist and author, frames it as “[reflecting] back the collective content and intelligence that’s been fed into it. So you can lead it down the path however you’d like” (Chow, 2023). When the language directed at chatbots contains a high proportion of sexist and misogynistic speech, it further entrenches gendered power dynamics (Curry et al., 2021). Developers such as Lauren Kunze, chief executive of Pandorabots, therefore take care to filter publicly available datasets before their bots learn from them (see the sketch below). Nearly one-third of the messages sent to Mitsuku, one of her company’s bots, are verbally abusive, sexually explicit, or romantic (Balch, 2020). Kunze worries that “The way that these AI systems condition us to behave in regard to gender very much spills over into how people end up interacting with other humans” (Balch, 2020). To combat this, she believes that adding features that reinforce good behavior is crucial (Balch, 2020). As the lines between virtual and real life blur, the question of where to draw the line for abusive or aggressive behavior becomes increasingly important. The ethical considerations of realistic AI chatbots thus include gendered power dynamics as well as morality in general.
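As a rough illustration of that kind of dataset filtering, the Python sketch below drops flagged messages from a public chat log before a bot would train on it. This is an assumption-heavy toy: Pandorabots has not published its pipeline, and a production system would rely on trained moderation classifiers rather than the placeholder blocklist used here.

```python
# Hypothetical sketch of pre-training dataset filtering. The blocklist is a
# placeholder; real systems use trained moderation classifiers, not word lists.
ABUSIVE_TERMS = {"insult_a", "insult_b"}  # stand-in tokens, not a real lexicon

def is_acceptable(message: str) -> bool:
    """A message passes only if it contains no flagged terms."""
    return set(message.lower().split()).isdisjoint(ABUSIVE_TERMS)

def filter_dataset(messages: list[str]) -> list[str]:
    """Keep only acceptable messages, so abusive speech sent by users
    never feeds back into what the bot learns to imitate."""
    return [m for m in messages if is_acceptable(m)]

raw_logs = ["hello there", "you insult_a", "nice chat today"]
print(filter_dataset(raw_logs))  # ['hello there', 'nice chat today']
```

Even this toy makes Kunze’s design concern visible: whatever survives the filter is what the bot will echo back at future users.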

Conclusion

AI companions can offer support, especially for people who are socially anxious. However, they may also create a cycle of loneliness in which individuals come to prefer forging AI relationships over real-world ones. The cycle can be self-perpetuating: even as socially anxious users practice conversing, they may remain stuck conversing with an AI inside a parasocial relationship.

Emotional dependency on an AI differs from real-life love both theoretically and chemically. The three facets of love (lust, attraction, and attachment) are all amplified in an AI relationship by the user’s ability to customize their AI partner into their ideal type. Coupled with rising rates of loneliness and the normalization of technology, AI is becoming an ever more appealing form of companionship, free of the friction of a real-world relationship.

Additionally, as AI becomes more advanced, these chatbots grow ever more tailored to an individual’s preferences through accumulated interactions. In the process, individuals may normalize aggressive and harmful behavior, including behavior that follows gendered power dynamics. This becomes problematic in public datasets, as chatbots trained on them reinforce preconceived notions and stereotypes of women.

Conversely, if AI can respond adequately to hate speech and other harmful behaviors, it may combat the very stereotypes it currently reinforces by holding users accountable. Because it is constantly evolving, artificial intelligence also raises ongoing questions of ethics and morality. Its potential to ease social anxiety can only be properly realized if users have a way to escape the cycle of loneliness and still function within the real world.

Abid, A. (2023, July 18). AI Love: It’s complicated. dw.com. https://www.dw.com/en/ai-love-why-romance-with-a-chatbot-is-complicated/a-66238378

Andlauer, L. (2018). Pursuing One’s Own Prince: Love’s Fantasy in Otome Game Contents and Fan Practice. Mechademia: Second Arc, 11(1), 166–183. https://doi.org/10.5749/mech.11.1.0166

Chow, A. R. (2023). AI-Human Romances Are Flourishing—And This Is Just the Beginning. Time. https://time.com/6257790/ai-chatbots-love/

de Gennaro, M., Krumhuber, E. G., & Lucas, G. (2020). Effectiveness of an Empathic Chatbot in Combating Adverse Effects of Social Exclusion on Mood. Frontiers in Psychology, 10. https://doi.org/10.3389/fpsyg.2019.03061

Depounti, I., Saukko, P., & Natale, S. (2023). Ideal technologies, ideal women: AI and gender imaginaries in Redditors’ discussions on the Replika bot girlfriend. Media, Culture & Society, 45(4), 720–736. https://doi.org/10.1177/01634437221119021

Dosovitsky, G., & Bunge, E. L. (2021). Bonding With Bot: User Feedback on a Chatbot for Social Isolation. Frontiers in Digital Health, 3, 735053. https://doi.org/10.3389/fdgth.2021.735053

Ernst, M., Niederer, D., Werner, A. M., Czaja, S. J., Mikton, C., Ong, A. D., Rosen, T., Brähler, E., & Beutel, M. E. (2022). Loneliness before and during the COVID-19 pandemic: A systematic review with meta-analysis. American Psychologist, 77(5), 660–677. https://doi.org/10.1037/amp0001005

Fisher, H. E. (1998). Lust, attraction, and attachment in mammalian reproduction. Human Nature, 9(1), 23–52. https://doi.org/10.1007/s12110-998-1010-5

Gong, A. D., & Huang, Y.-T. (2023). Finding love in online games: Social interaction, parasocial phenomenon, and in-game purchase intention of female game players. Computers in Human Behavior, 143. https://doi.org/10.1016/j.chb.2023.107681

Hamrick, L. (2023). Artificial intimacy: virtual friends, digital lovers, algorithmic matchmakers. AI & Society, 1–2. Advance online publication. https://doi.org/10.1007/s00146-022-01624-7

Keller, J. (2023, August 9). The other A.I.: Artificial intimacy with your chatbot friend. The Wall Street Journal. https://www.wsj.com/articles/when-you-and-ai-become-bffs-ecbcda1e  

Liu, T. (2019). Video Games as Dating Platforms: Exploring Digital Intimacies through a Chinese Online Dancing Video Game. Television & New Media, 20(1), 36–55. https://doi.org/10.1177/1527476417736614

Loveys, K., Sagar, M., Pickering, I., & Broadbent, E. (2021). A Digital Human for Delivering a Remote Loneliness and Stress Intervention to At-Risk Younger and Older Adults During the COVID-19 Pandemic: Randomized Pilot Trial. JMIR Mental Health, 8(11), e31586. https://doi.org/10.2196/31586

Polack, E. (2018). New Cigna Study Reveals Loneliness at Epidemic Levels in America. Cigna. https://www.multivu.com/players/English/8294451-cigna-us-loneliness-survey/  

Qian, X. (2021). Dating Sims: A Threat to Human Relationships or a New Digital Intimacy? Trinity College Dublin. https://publications.scss.tcd.ie/theses/diss/2021/TCD-SCSS-DISSERTATION-2021-007.pdf

Shuu’s Wonderland. (2020). The Quotes from Love and Producer’s Chapter 23 Which Touched My Heart. https://shuubah.wordpress.com/2020/06/28/the-lines-from-love-and-producers-chapter-23-which-touched-my-heart/

Song, X., Xu, B., & Zhao, Z. (2022). Can People Experience Romantic Love for Artificial Intelligence? An Empirical Study of Intelligent Assistants. Information & Management, 59, 103595. https://doi.org/10.1016/j.im.2022.103595

Ta-Johnson, V. P., Boatfield, C., Wang, X., DeCero, E., Krupica, I. C., Rasof, S. D., Motzer, A., & Pedryc, W. M. (2022). Assessing the Topics and Motivating Factors Behind Human-Social Chatbot Interactions: Thematic Analysis of User Experiences. JMIR Human Factors, 9(4), e38876. https://doi.org/10.2196/38876

Tian, Y. (2022). Falling in Love With Virtual Boyfriends: Otome Games in Japan and Mainland China. Department of Asian and Middle Eastern Studies, Duke University. https://dukespace.lib.duke.edu/dspace/bitstream/handle/10161/25371/Tian_duke_0066N_16805.pdf?sequence=1

Tranberg, C. (2023). “I love my AI girlfriend”: A study of consent in AI-human relationships. University of Bergen, Department of Linguistic, Literary and Aesthetic Studies. https://bora.uib.no/bora-xmlui/bitstream/handle/11250/3071870/Master-s-thesis-spring-2023-Caroline-Tranberg.pdf?sequence=1

Zhou, L., Gao, J., Li, D., & Shum, H.-Y. (2020). The Design and Implementation of XiaoIce, an Empathetic Social Chatbot. Computational Linguistics, 46(1), 53–93. https://doi.org/10.1162/coli_a_00368
