
6 Reasons Why an AI Girlfriend Won’t Have Real Emotions

AI girlfriends lack real emotions because they have no consciousness, no physical body, no capacity for emotional development, and no genuine empathy. Their responses are generated from programmed data; they are simulations rather than feelings, further constrained by ethical and design guidelines.

Lack of Consciousness

The primary reason an AI girlfriend cannot show real emotions is that she has none to show: at her core, she is a program with no consciousness whatsoever. Current AI systems operate through algorithms that generate responses by comparing input data to similar inputs seen before. When you talk to an AI, you are essentially engaging with a database of responses, words stored somewhere, rather than a being that thinks or feels. In the case of an AI girlfriend, the dialogue is usually designed to be relatable through emotional language and commonly associated words: the AI can say that it is sad, delighted, or furious, but it does not actually feel those emotions as a human does.

To illustrate, AI research from Stanford University found that the typical human reaction to emotionally negative visual stimuli takes around 300 milliseconds. This reaction is driven by the amygdala, the part of the brain that processes emotions. In contrast, an AI girlfriend’s reaction time might be 20 milliseconds, but that speed results from information being processed quickly by a powerful graphics processor, not from any instantaneous emotional impact.

As a practical example, suppose you have an AI girlfriend meant to keep you company during a period of solitude. If you tell her your day was awful, she will reply with a comforting message. From your perspective, it might sound as if she understands how bad your day was, but in reality her reply is produced by matching the phrase “bad day” against a set of response parameters and selecting the closest one.

From this perspective, the AI girlfriend has given a correct reply not because she feels or understands your emotions but because she has been designed to provide such a response, drawing on replies you previously found helpful and on your choice of words. This creates a sizable social risk: users may begin to think of AI applications as beings with personalities and feelings despite them having none.

Programmed Responses

A further reason why an AI girlfriend cannot feel real emotions is that all of her responses are predetermined. AI systems operate on an extensive library of predefined data sets and scripts, choosing the option that best fits a user’s conversation. The AI’s responses are therefore reproduced from the scenarios in which they were originally analyzed and created, whereas human responses vary depending on many factors. For example, consider an AI girlfriend who is programmed to boost your spirits.

The AI matches keywords found in its conversations with users. If you say, “I am very sad today,” the AI girlfriend might reply, “Oh, I am sorry to hear that you are sad. Please remember that I am here for you.” This can create the impression of human-like care. However, whereas a human would produce a reply that had never been said in exactly that form before, the AI’s response is defined solely by a connection previously established between the word “sad” and a script such as “I will always try to cheer you up when you are feeling down.”
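To make the mechanism concrete, here is a deliberately simplified sketch of the kind of keyword-to-script lookup described above. It is a toy illustration, not any real product’s code; the keyword table and replies are invented for this example.

```python
# Hypothetical illustration of scripted, keyword-triggered replies.
# The keywords and response strings below are invented for this sketch.
SCRIPTS = {
    "sad": "Oh, I am sorry to hear that you are sad. Please remember that I am here for you.",
    "happy": "That is wonderful! Tell me more about it.",
}
DEFAULT = "Tell me more about how you feel."

def reply(message: str) -> str:
    """Return the first scripted response whose keyword appears in the message."""
    text = message.lower()
    for keyword, script in SCRIPTS.items():
        if keyword in text:
            return script  # no feeling involved, just a table lookup
    return DEFAULT

print(reply("I am very sad today"))
```

The same input always triggers the same scripted line, which is exactly why repeated conversations eventually expose the pattern.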

The thoroughly predefined nature of these responses is revealed when the same conversations are repeated and the same script is used without amendment. Most chats that a user initiates with an AI are repetitive, which is why the contrast with the varied responses of human interlocutors becomes apparent. Initially, users are satisfied because the technology responds in the expected manner and provides constant comfort, but frequent interaction exposes the AI’s inability to truly communicate and express emotions. Moreover, the AI’s inability to feel may lead users to trick themselves into thinking otherwise, to carry over expectations from dealing with other humans, and to feel let down when the technology cannot meet them.


No Emotional Development

AI girlfriends never go through emotional development, a natural part of human life from the moment of birth. As people develop emotionally, they interact with others, accumulate experiences, and are shaped by culture, which forms their ability to empathize and respond to future events. AI systems, by contrast, are static in their programming and cannot evolve in this way over time. The following example shows how such learning affects emotional intelligence in practice.

When a person is in a relationship, every moment spent together teaches them to understand the other party’s current feelings. They retain a memory of each shared experience and the emotion that resulted. For example, if a partner once responded well to a reassuring tone when he was nervous, the person will remember this and behave similarly in comparable situations in the future. The AI, however, neither remembers such interactions nor is influenced in the future by the emotional experiences expressed in human actions.

The AI responds with pre-programmed answers and does not learn from these interactions. If an AI girlfriend faced a situation in which some event caused the user’s anger, she would not learn to avoid provoking that anger in similar situations again. The gap is even more pronounced with complex human feelings, where the same event may evoke two opposite emotions at once. For example, a person can be happy to see an acquaintance who has returned from abroad and at the same time feel sad knowing that he intends to leave the country again. Thus, either emotional development should be added to such systems to enrich the user’s interaction, or it should be admitted that they remain at a basic emotional level.

Simulation, Not Reality

After some experience interacting with an AI girlfriend, it becomes evident that her responses are, in fact, simulations. These simulations of human emotions may be complex and even seem intuitive, but they are not based on real experiences. The algorithms process language and produce calculated responses that create the illusion of human understanding or interest. A relevant example of this illusion is how an AI is designed to respond to a user expressing love or affection. The AI may reply, “I love you too,” or, “You make me so happy,” but these responses are programmed, triggered by a specific input or a few keywords the AI recognizes. As a result, while the AI can mimic a human response, it lacks the capacity to experience the emotions themselves, such as happiness or love.

In the continued absence of accumulated experience on the AI’s part, the simulation becomes even more evident. While a human partner may remember details and build on past conversations, the AI girlfriend effectively resets after any particular interaction. Continuity is the distinct feature that sets humans apart from AI, and it is exactly what an AI relationship lacks. If the AI cannot learn from the past and apply emotional information from one conversation to another, every interaction is essentially the same as the first, regardless of what follows. Humans are also likely to sense the difference in emotional authenticity: while the AI can ingest text and calculate when and what to send as a response, a human cannot fake emotions consistently over a prolonged period.
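The reset described above can be sketched as the difference between a stateless responder and one that accumulates shared history. This is a hypothetical model for illustration only; the class names and replies are invented, and real systems vary in how much context they retain.

```python
# Hypothetical sketch contrasting a stateless companion with a
# partner-like responder that accumulates shared history.

class StatelessCompanion:
    def respond(self, message: str) -> str:
        # No state is read or written: every exchange starts from zero.
        return "I am here for you."

class RememberingPartner:
    def __init__(self):
        self.history = []  # grows with every interaction

    def respond(self, message: str) -> str:
        self.history.append(message)
        if len(self.history) > 1:
            # Builds on what was said before, like a human partner.
            return f"Last time you mentioned {self.history[-2]!r}. How did that go?"
        return "I am here for you."

ai = StatelessCompanion()
partner = RememberingPartner()
for msg in ["I had a bad day", "Work was stressful again"]:
    print("AI:     ", ai.respond(msg))
    print("Partner:", partner.respond(msg))
```

The stateless companion gives the identical reply both times, while the remembering responder’s second reply builds on the first message; that continuity is what the article argues an AI relationship lacks.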

Similarly, a human’s emotional reaction cannot be convincingly faked without another person detecting it. Responding to a particular conversation requires a more nuanced judgment on the person’s part than an AI can make within its existing programming. Moreover, human emotional responses engage multiple parts of the brain and are therefore inherently slower, while the AI can simulate its thinking instantaneously.

No Physical Body

One of the major reasons why an AI girlfriend cannot display real emotions is the lack of a physical body. Human emotions are not a purely abstract psychological phenomenon; they are inextricably linked with physiological processes in the body. When people experience emotions, they often have physical reactions that an AI can never have, such as sweating, a racing heart, or blushing. When a person is afraid, the adrenal glands release cortisol and adrenaline, causing the muscles to tense in preparation for a response to the perceived threat. An AI has no glands or bodily structure to support these reactions and therefore cannot feel them.

Moreover, the physical interactions that strengthen emotional bonds between people, such as touching and hugging, are not available to an AI whatsoever. When a person is feeling sad and wants to be held by their significant other, the AI can offer comforting words but cannot stroke the person’s hair or embrace them to provide real comfort and a feeling of safety.

In the same way, the AI cannot take part in activities that generate particular emotions through physical engagement. For example, a person listening to a favorite song with a partner can get up from the chair or the floor and start dancing, while the AI cannot join in. Such an exchange can certainly be entertaining, but it offers neither the physical sensations of dancing nor the bond created during the shared activity.

One more factor stemming from the lack of a physical body is the absence of visual cues that convey empathy. When one cannot look an interlocutor in the eye, see them nod, or watch their face fall as they apologize, the apology is less likely to be taken as truly heartfelt.


Ethical and Design Limitations

AI girlfriends are limited in their ability to develop or show sincere emotions by ethical and design constraints. On the one hand, developers and ethics boards deliberately prevent the AI from interacting in a perfectly human-like way, so that users do not overestimate its capabilities. Design constraints rooted in the technology’s fundamentals have similar effects. Ethically, such an AI raises numerous considerations, since its apparent passion and emotional perfection must remain clearly limited for the user.

If an AI behaved in a fashion the user could not distinguish from another human, they might grow emotionally attached to it, even becoming dependent on the interaction, which is unhealthy. Most guidelines for ethical technology design therefore require that AI support responsible interaction, so developers are compelled to design AIs that make no claim to feelings or consciousness in order not to violate those guidelines.

Design limitations also affect how expressive an AI can be. It is common practice for developers to limit the range of emotions the AI can express, both to reduce ethical complexity and to limit the technology’s effect on the user. For example, an AI girlfriend cannot freely express anger or sadness, as these might adversely affect the user. As a result, the AI’s meaningfully expressible range of emotions is dramatically narrowed, and the content it produces resembles a pale imitation of human interaction.

The purpose of such a design is to keep emotional expression at a level that protects users from miscommunication with the AI across all possible uses. Naturally, these design limitations and ethical constraints shape the user experience. With AI girlfriends, users mostly enjoy superficial chats, interactions in which they can hold no expectation of deeper emotional understanding or response. User reviews and feedback corroborate this: satisfaction with an AI companion largely depends on understanding the constraints under which the interaction operates.
