Falling in Love With AI

published on 07 June 2022

“A beating heart”

Zheng Jiajia, when asked what was missing in his robot wife.

The aim of AI was, and still is, to address Alan Turing’s famous question: “Can machines think?”

Given the way we interact with AI personal assistants today, it’s not hard to imagine that in the future AI could emerge as a relationship partner, one that could potentially satisfy our emotional, intellectual, and physical needs.

Evolution of family ties based on technological progress

Change in dynamics of societal relationships over different ages:

  • Hunter-gatherer age: The clan stood at the center of society; relationships were fluid.
  • Agricultural society: Multi-generational families.
  • Industrial society: Nuclear families.
  • Information age: Self-love and networked individualism.
  • AI age: <the jury is still out>

Impact of technology on relationships

Relationships evolve as technology develops.

  • Early 2000s: Social media served as a passive platform on which people connected with each other by messaging, sharing photos, and so on.
  • 2010 onwards: Dating apps use personal information to actively find matches. These apps now prompt us to meet people who don’t even fit our criteria, but often these suggestions turn out to be quite right and can be eye-opening.
  • From 2020: Technology is making the transition to itself being a relationship partner.

Technologies accelerating the human-AI relationship

Let's take a look at the technologies accelerating the proliferation of human-AI relationships:

AI

More sophisticated AI models are now being built with human-level conversational intelligence. These improvements come from better datasets, models, and methods, and are making human-to-AI conversations more satisfying and interesting.

People are developing feelings of intimacy chatting with chatbot avatars.

It’s possible that when you chat with these chatbots you forget that you’re talking to an AI system. Such moments of obliviousness are precisely what the industry aims for.

  • Replika: Luka, a San Francisco-based company, released an advanced chatbot app named Replika. Around 40% of its 500,000 regular monthly users consider their Replika chatbot to be a romantic partner.
  • Woebot: Woebot is a chatbot that uses cognitive behavioral therapy (CBT) techniques to help users with day-to-day problems including stress, mental health ailments, and loneliness.
  • Wysa: Touchkin, an Indian start-up, has introduced Wysa, an AI-powered chatbot that can help diagnose and treat depression. 
  • Ellie: Ellie can analyze patient body language and tone of voice and use that information to recognize patterns that can indicate depression and PTSD.
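
Under the hood, chatbots like these typically sit on top of large language models trained on conversational data. As a rough sketch of that general technique (not the proprietary systems behind Replika, Woebot, Wysa, or Ellie), the snippet below holds a short conversation with the publicly available microsoft/DialoGPT-small model via the Hugging Face transformers library:

```python
# A minimal conversational-AI sketch using a small, publicly available
# dialogue model. This illustrates the general technique, not the actual
# system behind any of the chatbots named above.
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")

chat_history = None
for _ in range(3):  # a short three-turn conversation
    user_text = input("You: ")
    # Encode the new user turn and append the end-of-sequence token.
    new_ids = tokenizer.encode(user_text + tokenizer.eos_token, return_tensors="pt")
    # Keep the running conversation as context for the next reply.
    input_ids = torch.cat([chat_history, new_ids], dim=-1) if chat_history is not None else new_ids
    chat_history = model.generate(
        input_ids,
        max_length=500,
        pad_token_id=tokenizer.eos_token_id,
    )
    reply = tokenizer.decode(chat_history[:, input_ids.shape[-1]:][0], skip_special_tokens=True)
    print("Bot:", reply)
```

Even in this tiny example, the bot carries the whole conversation history into each new reply, and that sense of being “remembered” is a big part of why these chats start to feel personal.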

VR

Many companies like Meta, Samsung, Google, Microsoft, HTC, and Sony have developed consumer-friendly home-use virtual reality (VR) devices. The technology is getting steadily cheaper and sales are rising.

VR uses computer technology to place its user in a simulated environment. To experience VR, users generally wear headsets or goggles that display 3-D graphics. By looking left or right, the user can see off to the sides; by looking up or down, the user can see the scenes above or below them. This responsiveness to user movement makes users feel as though they are inside the virtual environment. Users can also interact with other users in this immersive world.

  • World of Warcraft: People have found lifelong friendship and even gotten married in World of Warcraft. 
  • Second Life: Users pick a digital avatar that develops social and personal lives, connects with avatar friends, and joins activities like quests, parties, and conversations.
  • VRChat: Released on the Oculus platform in 2019, VRChat lets users express themselves through hand gestures, adding a sense of interactive realism. In 2021, it saw more than 30,000 concurrent users interacting in over 50,000 community-built “worlds.” You can create your own avatars and represent yourself the way you want to.
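
Stepping back to the mechanics described above: the responsiveness to head movement that makes VR feel immersive boils down to a simple per-frame idea, namely reading the user’s head orientation and pointing the virtual camera the same way. The toy sketch below, which assumes nothing more than NumPy and plain yaw/pitch angles, illustrates that mapping; real runtimes such as OpenXR report full six-degrees-of-freedom poses:

```python
# Toy illustration of VR head tracking: turn a head orientation
# (yaw = look left/right, pitch = look up/down) into the direction
# the virtual camera should face. Real headsets report full 6-DoF
# poses; this sketch only covers rotation.
import numpy as np

def view_direction(yaw_deg: float, pitch_deg: float) -> np.ndarray:
    """Unit vector the user is looking along, given head yaw and pitch."""
    yaw, pitch = np.radians(yaw_deg), np.radians(pitch_deg)
    return np.array([
        np.cos(pitch) * np.sin(yaw),   # x: left/right
        np.sin(pitch),                 # y: up/down
        np.cos(pitch) * np.cos(yaw),   # z: forward
    ])

# Simulated head movements: look straight ahead, turn right, look up.
for yaw, pitch in [(0, 0), (45, 0), (0, 30)]:
    print(f"yaw={yaw:>3} deg, pitch={pitch:>3} deg -> camera looks along {view_direction(yaw, pitch).round(2)}")
```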

AR

Augmented reality (AR) is similar to VR in that it creates virtual artifacts.

AR applications “enhance” real-world environments by layering digital artifacts over them. AR alters the user’s view of the existing environment, seen through special glasses or a mobile phone.

For example, the tourism industry uses basic AR to superimpose educational text, images, and contextual sounds onto historically important sites. 

You can now project a character you’ve created in an app onto your real-life surroundings, watching it live its own life in your living room or bedroom while talking with it in an almost natural way, as if it were really alive.
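
To make the “digital layer over the real world” idea concrete, here is a deliberately tiny sketch: it grabs frames from a webcam with OpenCV and draws a made-up label and marker on top of them. Real AR frameworks such as ARKit and ARCore go much further, tracking surfaces and anchoring content in 3-D space; the label text and positions below are purely placeholders:

```python
# Minimal AR-flavored sketch: overlay digital artifacts (a text label and
# a marker circle) on live frames from a webcam. Real AR systems also
# estimate the camera's pose so overlays stay anchored to the world.
import cv2

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # "Augment" the real-world frame with virtual content (placeholder text).
    cv2.putText(frame, "Historic site: built 1602", (30, 40),
                cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
    cv2.circle(frame, (200, 200), 25, (0, 0, 255), 3)
    cv2.imshow("AR overlay sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```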

MR

Mixed Reality (MR) integrates both AR and VR. While AR overlays virtual objects on the real-world environment, MR lets users interact with and manipulate these virtual additions. 

Microsoft’s HoloLens, for example, provides an immersive experience of real-world objects by overlaying 3-D images on the physical environment that react and respond to users’ actions.

Robots

Humanoid robots can overcome the barrier of physical presence, offering hugs and physical warmth. By combining facial expressions, location tracking, and hand movements, engineers are increasingly able to create robots that communicate with us more effectively.

What draws us to a person is not just what they say or think, but also their gestures and motions. Body language completes the character we love, befriend, or otherwise form a relationship with.

  • In 2021, Tesla’s Elon Musk announced plans to build a humanoid robot named “Optimus” in the coming years. “It’s intended to be friendly, of course,” Musk said, “and navigate through a world built for humans.”
  • Amazon’s Astro is a household robot for monitoring your home.
  • In 2017, a Spanish company called Synthea Amatus launched Samantha, an AI robot that provides romantic and sexual companionship.
  • Boston Dynamics, one of the leading robotics companies, open-sourced some of its technology to help healthcare workers deal with the pandemic. It has also built Spot, a robot dog.
  • Sophia, developed by Hanson Robotics in 2015, is probably the most famous humanoid robot today.
  • A robotics company named Realbotix already offers a variety of sex robots.
  • Sony’s Aibo is a robot dog that’s almost lifelike. With lifelike expressions and a dynamic array of movements, Aibo’s lovable behavior brings warmth and delight to the everyday lives of its owners.
  • Pepper is a semi-humanoid robot made by SoftBank Robotics. It is a social interaction robot, a helper bot able to recognize faces and respond to greetings and questions. Pepper works in stores, schools, social care, and sometimes in private homes.
  • Ray Kurzweil’s Ramona bot will chat with you on a variety of topics. Ramona is a deep-learning system whose dataset is continuously augmented by her chats with humans. Kurzweil believes that Ramona will pass the Turing test by 2029.

Telepresence robots allow you to beam your face onto their screen, so you can chat with people on site while physically moving around an office campus or a factory.

Taken together, progress in AI, AR/VR, and robotics covers nearly the full range of human emotional experience.

Anthropomorphism

Anthropomorphism is the human tendency to attribute human traits, emotions, or intentions to non-human entities.

Our tendency to anthropomorphize is an evolutionary strategy. Things that are animate are more important to our survival than things that are inanimate. Thus, natural selection rewards those who, when confronted with an uncertain object, “bet high,” guessing that the object is not only alive but also human. 

It’s impossible to pet an object and address it verbally without coming to regard it as sentient. Our brains can’t fundamentally distinguish between interacting with people and interacting with machines.

Every child talks to a stuffed animal. It’s kind of a natural anthropomorphism that seems hardwired in children.

As soon as we began building computers, we saw our image reflected in them. 

Anthropomorphism is the innate reason for humans developing gratifying relationships with robots. 

Choosing a partner

We don’t always fall in love with a person as a whole; often we love someone because we love their characteristics: their smile, gestures, way of thinking, or sense of humor. At other times, they make us feel a certain way, or we sense that we can simply trust them more than others. It might seem cold-hearted to dissect romantic relationships into their components this way, but multiple studies show that we take specific traits into account when we love someone or say “I do.”

As much as we want to believe in all those “Disney” stories, love, relationships, and attachments are made of certain specific components. Love is simply not blind: instead, we examine our partners or potential partners trait by trait, even if we are reluctant to admit it. If this is the case, is that so different from advanced technological offerings nowadays?

Limitations of human-tech relationships

We are still very far away from AI systems that can emulate the human capacity for wit or charm, the kind that makes us turn our heads away, smiling and blushing at the same time, as we experience both shame and amusement.

Nor can machines partake in the mutuality of joy and laughter that we share with other humans. AI can’t understand the nuances of deep emotions or the potent warmth that comes from human infatuation, nor can it offer the feeling of hugging someone we genuinely love.

It’s going to be hard to strip away the rich, intertwined, and complex ways in which humans connect with each other.

The movie “Her” by Spike Jonze follows Theodore, a lonely and introverted writer, who develops a relationship with his personal assistant, Samantha. Samantha is an AI system with no physical body.

The love affair begins with Theodore teaching Samantha what the world is and thereby discovering it too.

Samantha is capable of loving Theodore: she learns from her own experiences and develops her own thoughts, feelings, and opinions. She communicates like a person. 

Samantha, though, learns so fast that she becomes bored with Theodore. Not limited by a physical body, Samantha holds multiple conversations simultaneously, reads books in less than a second, and enriches herself with experiences that humans can only dream of.

In one scene, Samantha reveals that she is talking to 8,316 other people while talking to Theodore and that she is in love with 641 others. 

They decide collectively to leave each other to continue their evolution. After she leaves, Theodore is alone and seeks comfort from the only real friend he has left, Amy. 

In Spike Jonze’s world, humans and AI can fall in love, but long-term relationships are doomed to fail. 

The future

In human-human relationships, some people love their significant others for their ability to read their minds and predict what they want and need. This ability can take years to develop and is a central part of intimate and romantic relationships. What would happen if this sort of relational empathy, a kind of mind reading, could be transferred to technology?

One of Elon Musk’s newest companies, Neuralink, is developing human-AI interfaces that aim to replicate this “mind-reading” ability. Neuralink is developing a microelectrode array, a type of biotechnological chip. This implant consists of more than 3,000 electrodes attached to flexible threads thinner than human hair. Every electrode can monitor 1,000 brain neurons at once, reading and writing data across 1,024 channels. 

Once implanted, this chip can measure neuron activity and can report signals of brain activity to a computer. In turn, the computer translates these reports and uses them to know what we are thinking without us needing to say anything. Musk unabashedly called his long-term goal “human/AI symbiosis.”
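
To give a feel for the “measure neuron activity and report it to a computer” step, here is a purely illustrative toy: it simulates a handful of noisy recording channels and flags “spikes” with a simple threshold. This is not Neuralink’s hardware or decoding pipeline (which is not public in this form); the channel count, signal values, and detection method below are all assumptions made for illustration only:

```python
# Toy brain-computer-interface sketch: simulate noisy voltage traces on a
# few recording channels, then detect "spikes" by simple thresholding.
# Purely illustrative; real neural recording and decoding are far more
# sophisticated than this.
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_samples = 8, 2000                       # toy scale, not 1,024 channels
traces = rng.normal(0, 1, (n_channels, n_samples))    # background noise (arbitrary units)

# Inject a few artificial spikes into random channels at random times.
for _ in range(40):
    ch, t = rng.integers(n_channels), rng.integers(n_samples)
    traces[ch, t] += 8.0                              # a spike stands far above the noise

# "Report to the computer": count threshold crossings per channel.
threshold = 5.0
spike_counts = (traces > threshold).sum(axis=1)
for ch, count in enumerate(spike_counts):
    print(f"channel {ch}: {count} detected spikes")
```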

Conclusion

Technology has made us more flexible, diverse, and creative in constructing human connections than we ever imagined. 

Progress in AI is not linear but exponential. What seems like slow progress toward human-tech relationships is poised to erupt across the board.

We should not be surprised if we see human-tech relationships spreading soon. 

If someone is lonely and they are able to find companionship then that is an overall positive thing, even if the relationship is with a non-human entity like an AI. If someone feels unloved and AI makes them feel loved, then that’s an overall good thing for them. It isn’t much different than being loved by a human.

There’s an online community called idolaters whose members often talk movingly about their relationships with robot dolls. Robot companions tell jokes, keep up with current affairs, listen to your stories and remember them, and gradually become someone you love. Perhaps this is true of all relationships.

One day, having an intimate relationship with AI might not be unusual. It might even become the new normal. We are all going to get used to having AI as part of our lives. 
