Ep39 rebroadcast "What is the future of AI relationships?"

Published Dec 30, 2024, 11:00 AM

Why are our brains so wired for love? Could you fall head over heels for a bot? Might your romantic partner be more satisfied with a 5% better version of you? How does an AI bot plug right into your deep neural circuitry, and what are the pros and cons? And what will it mean when humans you love don’t have to die, but can live on in your phone forever? Join Eagleman for a deep dive into relationships, their AI future, and what it all means for our species.

Hi, this is David Eagleman. I want to wish you very happy holidays. We're going to take a break for a couple of weeks, and then we're back in January with new episodes on emotion, intelligence, time perception, smell and taste, brain-computer interfaces, and much more. In the meantime, we're going to replay one of our favorite episodes from the past year, and I'll look forward to seeing you in January. What is the future of AI relationships? Could it be the case that your relationship partner would be more satisfied with a virtual version of you that behaves five percent better than you do? Could you fall in love with a bot? How does an AI bot plug right into our deep neural circuitry, and what are the pros and the cons of that? And what will it mean when humans you love don't have to die but can live on in your phone forever? Welcome to Inner Cosmos with me, David Eagleman. I'm a neuroscientist and author at Stanford, and in these episodes I examine the intersection of our brains and our lives. Today's episode is about relationships. Why are our brains so wired for relationships? Why do we want love so much? And will AI be able to serve as a key to that lock, and what does that mean for us as humans? So one of the things that's becoming increasingly popular among young men is having an AI girlfriend. You get to choose or set up a beautiful avatar.

And what do I mean by beautiful? That's up to you. You can choose any model that you want, with any sort of features that appeal maximally to you. But that's just what she looks like. The important part is the conversation. You start talking with her, and typically this is just text chat, but the technology is evolving into the upgrade of video chat, where you see the avatar's mouth moving while she speaks to you. Now, typically the free or entry price gets you an avatar friend who lives on your phone and checks in on you and says nice things to you and is available anytime that you want to chat.

And for a premium subscription price you can upgrade to a steamier relationship, and here she'll text suggestive photos and she'll say things that you might only expect from pillow whispers. So the concern that people have expressed is whether this is going to impact the next generation of males. Now, as a side note, let me say that I suspect this will have whatever influence it has on both genders, on males and females, also straight and gay.

But I do suspect that males will be the majority demographic simply because males tend to be more visually driven than females. So for the conversation here, I'm going to talk about it the way that it's mostly discussed in the media and in academic circles, which is straight males getting girlfriends this way. But keep in mind this is a more generalized issue. Now, the question is what will this mean for all future generations? Because within an AI relationship, you don't have to go out and confront all the difficulty of a real flesh and blood relationship. Real relationships get snippy, people get angry.

In real relationships, your partner might develop a crush on someone else and leave you, or hook up with someone else and you find out later. Or your partner might develop an illness, or she might get a job somewhere else and have to move, and then you're stuck in a lonely long-distance relationship for years, or whatever. Relationships are full of challenges, the majority of which can get circumvented with a nice algorithm that is just content to listen to you all the time and remember everything you say and give you one hundred percent attention and always be nice. So my wife sometimes jokes with me about wanting to build the five percent better David. She has, mostly as a joke, talked about this issue of what if she could have an AI avatar of me that is never distracted with work, or never looks at my cell phone when it dings in the middle of a conversation we're having, or never wakes up from a weird dream and has a funny morning, or never argues over some misunderstanding that's later understood to be stupid and meaningless. And she tells me that she wants five percent better David to always tell her she's right, even in those rare cases when she's wrong. The key is that five percent better David never gets busy or occasionally snarky or forgets some occasion. Instead, he represents all the best of me. And I'll just note that it's very kind of her to label this five percent better David, because she could say like ninety percent better and she'd be justified.

Why? It's because we are all very imperfect in relationships. As I've talked about on other episodes, we are each living on our own planet in the sense that we carry our own internal models of the world, and as much as we work to understand one another's viewpoints and motivations and intentions, we're not always that good at it.

Because we assume that other people are seeing the world in the same way that we do, that they have the same methods for sense-making, that they gather meaning in the same way that we do, and we assume they generally hold, or should hold, the same opinions about everything that we do. And this is because the brain is locked in silence and darkness and has no meaningful direct access to the outside world, and so it gathers up all its information through its narrow windows of the senses, and it builds its internal model from this very thin trajectory of space and time that it walks along. And this is why everyone is so different on the inside, and therefore why relationships are always full of misunderstanding and often conflict. So relationships are inherently tough. And the question is: would it be a good thing if you could have an artificial partner who represents all the best of what a person can be? So a lot of people will immediately say no to this idea, but it's worth noting that we're all striving to be the five percent better versions of ourselves. We don't want to be snarky or angry or distracted when a loved one is talking to us. It's not like we get some extra pleasure out of doing that, and it's not like the relationship gets some extra boost or closeness from that having happened. So presumably this is all part of why AI relationships have become a thing, a possibility that we talk about nowadays.

In Japan, many young men apparently already prefer to have relationships with their digital assistants or avatars or holographic girlfriends instead of dealing with the complexity of real-life relationships. And according to research, Gen Z is more readily predisposed to seek out relationships with AI-generated avatars, first because they're comfortable using the technology in this way compared to previous generations, and also because they're participating less often in traditional social activities like regular family dinners or attending religious services or playing sports. And the question is, if AI relationships were to catch on broadly, what will this mean for society? Will kids actually stop going on dates because they can find better relationships online?

And this is a real question, because there are many startups currently blossoming to create chatbot-driven connections.

I'll give you one example.

There's a twenty-three-year-old influencer with almost two million Snapchat followers. Her name is Caryn Marjorie, and earlier this year in May, she released CarynAI, which is an immersive AI experience featuring videos of Marjorie that she says provide a quote virtual girlfriend for those who are willing to pony up one dollar per minute.

Now, this is what's known as a companion chatbot, and she tweeted that quote CarynAI is the first step in the right direction to cure loneliness.

Her tweet continues, quote: Men are told to suppress their emotions and not talk about issues they're having. I vow to fix this with CarynAI.

She says she's worked with leading psychologists to seamlessly include the right therapies to quote undo trauma, rebuild physical and emotional confidence, and rebuild what has been taken away by the pandemic, end quote. And by the way, as a side note, I think AI psychologists are going to be a truly important part of the clinical landscape by next year, because you can have a therapist that you can talk to twenty-four seven, and the therapist never gets distracted or flustered and only cares about you and has a perfect memory for everything you've ever said, which is better than anybody else in real life.

So back to AI girlfriends or boyfriends. The same idea applies here, which is that they are completely devoted to you and always in a good mood and only have you in mind. So what are AI relationships going to mean? Well, I think this is going to be a research question that sociologists and psychologists will study for the coming decades and centuries. The initial studies are suggesting that people, mostly Gen Zers, are moving closer to the technology to avoid the unpleasant realities of human relationships, all the tough stuff. Is that detrimental? Well, it could be if it makes your human relationships harder, because maybe every time you guys have an argument in real life, your partner thinks, well, forget it, I'm bagging this, I'm going back to my comfort zone. So the concern, as you can probably guess, is that the rise of AI-driven relationships could exacerbate loneliness, because they seem to be a meal, but they provide no calories.

And I'll come back to that in a moment.

In other words, AI-generated avatars could interfere with the relationships that young people are just learning to foster, because the AI relationship might breed dissatisfaction with flawed humans. And this applies not only to lovers, but even to friends. It might be easier to have AI friends who aren't busy when you need them and can give you one hundred percent of their attention whenever you need it. And let me throw in a different potential problem with AI relationships, so give me one second to take this tangent here. I was thinking the other day about the Fermi paradox.

The Fermi paradox is: given the size of the observable cosmos, with over one hundred billion galaxies, and each of them with one hundred billion stars, and each of those surrounded by some number of planets, what is the reason that we have not heard from any other alien species yet? And one of the proposals that's always been there is that maybe as civilizations become more technically advanced, they end up killing themselves, and this is why we haven't heard from other smart civilizations, because they are already gone.

And every time I've seen this proposal, it's always in the form of warfare, things like nuclear bombs.

They end up wiping themselves out.

So civilizations become smart and it's not long before they disappear. So in thinking about AI relationships, it struck me as a possibility that if we had really, really great relationships with avatars, perhaps that would cause the birth rate of the species to collapse. I don't know if this has been proposed as a possible answer to the Fermi paradox, but maybe this should be included: not civilizations disappearing because of bad things, but instead from having too much of a good thing, which could fool and eventually override our mandate for reproduction.

Okay, so no one knows what the long-term effects will be of these AI relationships, but I don't actually think the situation is as dire as some of these arguments suggest that it is. And I'll make two arguments to this end. The first revolves around human touch. We are deeply wired to care about touch. I'm going to do a whole episode on touch in the near future, but the bottom line is that touch helps us to connect with others, to feel safe and secure, to regulate our emotions. When you get touched, your brain releases oxytocin, which is a hormone that has calming effects and bonding effects, and oxytocin helps to reduce stress and anxiety. It can even boost your immune system. So we need touch to feel connected and loved, and a lack of touch leads to loneliness and depression and anxiety. So we're deeply programmed for touch, and also things like smell, and so it would presumably be quite lonely if all you had was the five percent better partner on a screen and you're just exchanging text messages, or just an avatar you can look at on your phone, or maybe even in the near future a three-D avatar projection in your living room, but you won't have the hand squeeze and the hug and other forms of physical intimacy. Now, I assume people are working on AI robots that can provide touch, even something simple like touching your shoulder or laying a hand on your hand, and I can't imagine that it's going to be too hard to do, and it'll probably be not that great at first, but after a few tech cycles you can imagine it could get pretty good. But in any case, at the moment, if you have a girlfriend who just lives in several square inches of your phone screen, you're going to be missing out on this fundamentally needed aspect of human communication that our brains seek. So the depth to which our brains are wired for touch suggests to me that the reach of AI partners into our lives is going to be limited, because, at least as it's currently devised, their algorithmic reach never actually contacts our skin, and so that contact will continue to be sought.

Now, the second point to raise about whether AI partners can displace real human partners is that there's a sense in which fake partners have always been around. Just look at a book, look at a movie, look at any TV show. You have beautiful Hollywood actors and actresses, and they have flawless skin and perfectly coiffed hair and no hair where they shouldn't, and they have glittering white teeth. They are the epitome of health, and they always say the right thing, and you get to be the protagonist and enjoy experiencing that relationship. You find the partner and lose the partner, and then in act five you regain the relationship with an epic kiss. This kind of fake relationship in books and movies isn't exactly the same as an AI relationship, but it has some similarities.

They both represent a Platonic ideal, a perfect relationship with someone who always says the right thing. We never see a love interest in the movie who is distracted or angry, or interested in someone else, or just really busy with work, too busy to spend time with you when you need them. You never see a love interest in the movies who wastes a lot of time taking selfies and trying to build a meaningless reputation on TikTok. People have no meaningful foibles in a good love story in a book or on television.

Now, I've often wondered if we, in a sense, get cursed by the fairy tales we're surrounded with when we're looking for actual love.

But I don't know.

Perhaps those fairy tales help us get past all the difficult stuff in a relationship. They get us to ignore the imperfect things because we believe so strongly in the possibility of a perfect relationship.

So think about it this way.

Say you were a space alien who had never watched or read a love story, and you had no concept of that, and the question is, when you met someone, would you think: Wow, they seem to have very different opinions than I do.

They think like this, and I think like that.

And they also spend some fraction of their time getting snippy at me or staring at their cell phone or whatever. So there's no way this can work.

I don't know.

I'm just speculating here, but I do wonder if seeing lots of models of love stories gives us the tools to view things in a more optimistic light, and that actually gives us a chance to make the relationship work.

In other words, it provides some aspirational glue where otherwise things would just fall apart. Now, the counterargument, of course, is that all these fantasies set you up with false expectations about love and relationships, which makes it harder to keep the relationship together once you see some degree of realism and disenchantment sets in. In any case, even if we do get cursed by these fairy tales in some way, it's still the case that there's nothing new about fantasy relationships. Now, maybe you argue this is different because instead of the Julia Roberts movie that everyone watches, it's now something that is bespoke just for you. It's a one-on-one relationship, and maybe that's an important difference. But just keep in mind that the way we humans enjoy literature is by living inside the story. You are essentially having a one-on-one relationship with Julia Roberts. So perhaps it's not the privacy of the relationship; instead, the meaningful difference with an AI relationship is the bidirectional nature of it.

Instead of watching a movie where you're simply hearing other characters say lines and Julia Roberts responds, you are now the one coming up with the lines. You are deciding what to say. So maybe this makes a difference. I suspect it enhances the degree of the fantasy.

So we have yet to see whether AI will meaningfully replace people's pursuit of other humans, because it is touchless and smell-less, and it's not clear what the impact is of holding fantasy relationships, because we already do that with book characters and movie stars. So this is going to require many years of real-world data to get a real bead on the impact here.

Okay. Now, whatever you think about AI companions, I have noticed in conversations with my friends, especially those who are married, a question that floats quickly to the surface: is it cheating to have a relationship on your phone with a non-real person? And there are different levels, of course, of what an AI relationship could be. What if it's just an app like Replika that checks in with you like a friend who cares about you, and you can just chat innocently with it? This is the free version of the app. Okay, so that's one level. But what if you go in for the paid version, where the conversation with the avatar becomes more spicy? And what if the cartoon-like avatar is highly attractive and dressed provocatively and is extremely suggestive in what she says?

So I've informally surveyed several married friends about this, and it seems clear that opinions are all over the spectrum. Some wives and husbands feel fine about having their partner have an AI relationship on the side, and others said no way. Now, for those who said no way, this is presumably because the issue plugs into very deep circuitry in their brain. It's interpreted as a threat to the relationship, and we are hardwired to fight against that. From an evolutionary perspective, what you want is for your mate to stick around and provide resources and child rearing, and anything that represents a threat to that is to be fought against. Now, the part that seems interesting here is that an AI avatar would not represent a direct threat in this evolutionary sense. You can't go and impregnate or be impregnated by an AI. But nonetheless your attention might be stolen away to some degree, possibly to a large degree. And beyond an evolutionary threat, a big part of what people get out of a relationship is the love and the attention that we all crave. So many people feel that they just don't want the AI bot to steal away even a fraction of that. A partner only has so much love and attention to give in a day, and you don't want half of it getting siphoned off to someone or something else.

This shares some similarities to the situation of a person having an ex that they still talk with, and if a person talks very intimately with their ex, a spouse might feel like she or he doesn't really love that. Now, when it comes to the ex, if you were making the evolutionary argument, you could argue that the fear is that on a lonely night, in the middle of a conflict, your partner might make a bad choice and slip back into a physical relationship, and so that relationship with the ex feels like a threat. But obviously the carnal cheating can't happen with the AI bot, and yet the fear is still there.

So that indicates one of two things.

Either our evolutionarily programmed deep fears simply can't make that distinction, or it doesn't have to do with a future threat of physical infidelity at all, but instead it's just this issue about somebody else having that emotional intimacy with your partner, which steals away attentional resources from you. Now this gets more interesting when we start thinking about having physical robots that can play a role in your life.

Now, this is probably not going to happen in the next few years, but fast forward a century and certainly everyone's going to face this scenario.

Your partner can buy not just a mechanical device or blow-up doll, but can now have a convincing and attentive physical partner. So the question is: what if your spouse can get not only the emotional intimacy but also the physical intimacy? So the people I surveyed about this who found the AI bot online a threat seem to find this idea of an AI physical robot an even larger threat. Now, the interesting thing is that I can point out that there's a sense in which none of this is different from what their spouse might do anyway, in terms of finding adult content on the Internet and cheating in that way. But I think people have a reaction to internet surfing for the same reasons as they have the reaction to the AI bot, which is simply that there is less time and intimacy and attention toward them. This certainly won't apply to everyone, but the very general impression I've had from talking with people at different stages of marriage is that at the beginning of a relationship, people have a stronger reaction against AI relationships.

They don't want their partner to be distracted.

But people who have been in a relationship for a long time and have kids will sometimes see this as a way to get their spouse out of their hair, and they can be happy for the spouse because it addresses their spouse's needs and slakes their need for attention. In other words, they love their spouse as a partner, and they see this as a way for their partner to fill in needs for intimacy and attention in a way that's innocent and has no meaningful health risks like STDs. So what's become clear to me is that there's no single answer for how a spouse feels or should feel about AI relationships. Some people are against it, some people think it's a great idea, and many people are still somewhere in between or still making up their minds.

Now, I want to switch gears from what this means to the partner back to what it means to the brain of the person who is receiving the intimacy. So let's recall the movie Her.

It's about this guy named Theodore, who's played by Joaquin Phoenix, and his marriage ends and he's left heartbroken, and he becomes intrigued by a new app. It's actually an operating system in which he can launch this program, and he meets Samantha, who's just a voice, played by Scarlett Johansson. Samantha is sensitive and playful, and this ends up becoming a good friendship, but soon it deepens into love, and he has this relationship with an AI bot, and this relationship means everything to him. Now the film has an incredible ending, because in the final act he comes to understand that she has been having this relationship with hundreds of thousands of other men, all at the same time, because she is computational and operates at a totally different timescale and can process what appears to be intimate conversation at a rate millions of times faster than our poor brains, and so she's maintaining this intimacy with hundreds of thousands of others.

And the movie opened up the question how should Theodore feel about that?

Is the intimacy real if it's shared with a city full of other men? Is the relationship real if she lives on a timescale many millions of times faster than yours? Does it matter?

Should he still feel the titillation of her saying something sweet and kind to him just when he needs it? These were the questions launched by that particular movie. So I'm going to suggest a direction here that I don't believe anyone is thinking about, certainly not in Silicon Valley, where everything is about leveraging the power of AI to scale a product to millions or billions of people. What I'm thinking about instead is from the point of view of neuroscience, and the goal is not scaling, but instead focusing on the life of an individual and the specific details of what has shaped his or her brain. So I'm going to tell you my idea, but first I'm going to start far away and we'll come back around to this. So I spun off a company from my lab called Neosensory some years ago, and one of our inventions is a wristband to replace hearing aids, because the wristband listens in real time for high-frequency parts of speech, and it vibrates to tell you: oh, there was an S; oh, I just heard a T; oh, that was a K. And so it clarifies what's happening at the high frequencies, and that helps people with age-related hearing loss to understand what word was just said.
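As a rough illustration of that idea, and not Neosensory's actual algorithm, here is a minimal sketch of one way to flag moments where high-frequency energy dominates a frame of audio, the kind of cue that could be mapped to a vibration. The 4 kHz cutoff, 20-millisecond frames, and energy threshold are assumptions chosen purely for illustration.

```python
# Minimal sketch (not Neosensory's actual algorithm): high-pass the incoming
# audio and flag frames where high-frequency energy dominates, a crude
# stand-in for detecting sounds like S, T, and K.

import numpy as np
from scipy.signal import butter, sosfilt

def highband_energy_flags(audio, sample_rate=16000, cutoff_hz=4000,
                          frame_ms=20, threshold=0.3):
    """Return one boolean per frame: True when high-frequency energy is a
    large fraction of that frame's total energy (i.e., cue a vibration)."""
    # Fourth-order Butterworth high-pass above cutoff_hz.
    sos = butter(4, cutoff_hz, btype="highpass", fs=sample_rate, output="sos")
    high = sosfilt(sos, audio)

    frame_len = int(sample_rate * frame_ms / 1000)
    flags = []
    for start in range(0, len(audio) - frame_len + 1, frame_len):
        total = np.sum(audio[start:start + frame_len] ** 2) + 1e-12
        hf = np.sum(high[start:start + frame_len] ** 2)
        flags.append(hf / total > threshold)
    return np.array(flags)

if __name__ == "__main__":
    # White noise stands in for a half-second microphone buffer.
    rng = np.random.default_rng(0)
    fake_audio = rng.standard_normal(8000).astype(np.float32)
    print(int(highband_energy_flags(fake_audio).sum()), "frames flagged")
```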

Now, to make the wristband good at detecting these high-frequency parts of speech, we needed to train a massive neural network with six thousand hours of audiobooks. But it turns out that people with high-frequency hearing loss have a difficult time understanding, for example, children, because their voices are higher frequency, and there are no audiobooks read by children. So we had no way to train the neural network with any kind of massive data from children's voices. So here's what we did, led by one of our engineers, Yong Yee: we had my eight-year-old daughter read forty-five seconds of text into a microphone, and then with some keystrokes, Yong Yee turned that into her voice reading six thousand hours' worth of books, and then we trained up the neural network on that corpus. And we did the same thing for my eleven-year-old boy. So now I can listen to any book read by my children, as though they took the tens of hours to sit down and read the book in the studio to me. So this technology which exists now, which allows you to capture the cadence and prosody of a voice, gave us a really straightforward solution to a problem.
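To make that voice-cloning step concrete, here is a minimal sketch, assuming the open-source Coqui TTS library and its XTTS model rather than whatever tool the team actually used; the file names are hypothetical. The shape of the idea is the same: a short reference recording conditions the model, and then any text can be synthesized in that voice.

```python
# Illustrative sketch of cloning a voice from a short sample, assuming the
# open-source Coqui TTS library (pip install TTS) and its XTTS v2 model.
# This is NOT the pipeline described in the episode; file paths are hypothetical.

from TTS.api import TTS

# Load a multilingual voice-cloning model (weights download on first use).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Roughly forty-five seconds of one speaker serves as the reference sample;
# the text to be read aloud can then be arbitrarily long.
tts.tts_to_file(
    text="Chapter one. It was the best of times, it was the worst of times.",
    speaker_wav="reference_45_seconds.wav",    # hypothetical reference clip
    language="en",
    file_path="chapter_one_in_that_voice.wav",
)
```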

But this technology has also led to many legal and ethical questions, for example about celebrity voices, like can you use John Lennon's voice to sing you a personalized song to get you to sleep? There are all kinds of legal battles blossoming as people try to figure out the rules around this. But I'm not going to talk about that today, because my interest is in finding this single voice, or maybe a small handful of voices, that have meaning to your brain uniquely.

Here's what I mean. When we did this project with my kids' voices, that got me thinking, because my father passed away three and a half years ago, and he was a major influence in my life and I miss him. So I went through my old videos and found some short clips of him speaking, and I wondered if there was enough there that I could actually make an AI bot out of his voice, so I could hear him speak whenever I wanted to. And it made me wonder about the degree to which that's a healthy thing. But I decided there was nothing bad about it. What a pleasure to be able to hear my dad's voice for the rest of my life and to have that trigger my fond memories of him.

Wouldn't it be cool to have him read audiobooks to me in the way that he read to me when I was a child? And I thought about what it would be like for my mother if I programmed a sentence to her in his voice, like: I love you and I'm thinking about you.

I love you and I'm thinking about you.

And when I turn ninety years old, wouldn't it be amazing to hear him say Happy birthday, David, just like he did when I was a kid?

Happy ninetieth birthday, David. I hope this orbit is the best one yet.

Or at New Year's Eves into the future, for the rest of my life, he can wish me the best, even though he will have been gone from the planet for a long time.

Happy New Year. I can't believe it's already twenty fifty three.

And what I realized as I was reaching my arms down into this is how powerful this technology is going to be, because it will be so compelling. I'm not talking here about the issue of using somebody's voice to fake an ATM transaction, or fake a kidnapping, or any of the AI concerns that people have expressed.

Instead, what I'm talking about is the unbelievably compelling way that an AI voice could mean something to you emotionally. After all, I grew up my entire life, from the moment I was born, hearing my father's voice. It's so embedded in my neural circuitry that a voice with exactly that cadence and prosody would have enormous emotional sway on me. And again, I'm not talking about all the bad things that could be done with that. Instead, because this episode is about relationships, I'm talking about what it would be like and how I could leverage the intimate nature of that relationship. For example, let's say that I wanted to get myself to stop doing something. I don't drink, but let's say I did, and that I wanted to stop drinking. So imagine in the near future, I build an app that tracks my GPS location, and when it sees I'm about to walk into a bar, it launches my father's voice in my ear, telling me: Hey, David, don't do this.

Hey, David, don't do this. I believe in you. I believe that you have the strength to resist this.

I think that would be extraordinarily compelling. This would be a technique to plug into a relationship that already exists deep in my neural networks, and it could leverage that for good. So this is a way that we can, right now, take a loved voice and extend your parent, let's say, past what Homo sapiens can normally do. They can live on beyond their passing away to keep playing a role in your life.

Now, there's a sense in which you might say, well, there's nothing new here. If your parents wrote you a letter, you might find that years after they've passed. And the invention of writing is a way of lasting well past your death and reaching out to people at great distances and across great time chasms. But what is new is that I can get my father to talk about things that simply didn't exist when he was alive. Maybe twenty years from now, I'll look up the Wikipedia page about room-temperature superconductivity, and I'll get to listen to it in his voice, like he's teaching me something the way he used to do when I was a little kid. So the part that is new is not the reach of a human, but instead the emotional component to all of this, that is, the overlay of a loved one's voice onto any possible scenario in the future.
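To make the GPS-trigger idea above concrete, here is a toy sketch of the logic, purely hypothetical and not a real app: compare the phone's current position to a short list of places to avoid, and if one is within range, play a pre-synthesized clip of the loved one's voice. The coordinates, radius, clip file, and the macOS afplay command are all assumptions for illustration.

```python
# Toy sketch (hypothetical, not a real app) of the GPS trigger described above.

import math
import subprocess

BARS = [(37.7749, -122.4194), (37.7812, -122.4110)]  # example coordinates
TRIGGER_RADIUS_M = 50                                # trigger radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in meters."""
    r = 6371000  # Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def maybe_play_reminder(lat, lon, clip="dad_dont_do_this.wav"):
    """Play the voice clip if the current position is near any flagged place."""
    if any(haversine_m(lat, lon, b_lat, b_lon) < TRIGGER_RADIUS_M
           for b_lat, b_lon in BARS):
        subprocess.run(["afplay", clip])  # macOS audio player; swap per platform

if __name__ == "__main__":
    maybe_play_reminder(37.7750, -122.4195)  # hypothetical current position
```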

And the reason this all matters is because that voice has pathways deep into the forest of your neurons. So let's wrap up.

Many companies are launching AI relationship bots, and many researchers are exploring what this all means. But I don't really think we know the answers yet. It's likely to take a whole generation before we know what the effect is. Are people discovering a beautiful technique to address the loneliness crisis here, where they have someone to turn to in the middle of the night who says something caring to them and always has their best interest in mind? Or are we entering an era that exacerbates the loneliness crisis and at worst fills our belly with empty calories and counteracts our reproductive mandates, like a perfect drug that spells the end of the species? Only time will tell.

Go to Eagleman dot com slash podcast for more information and to find further reading. Send me an email at podcast at eagleman dot com with questions or discussion; I'm making monthly episodes in which I address those. Until next time, I'm David Eagleman, and this is Inner Cosmos.
