What Are the Consequences of Entrusting Our Emotions to AI? | 60-Second Science

Living with AI

AI in the Spotlight

Image credit: Tara Winstead

In an age when AI is everywhere, what are the consequences of entrusting our emotions to code?

Most people who have used generative-AI chatbots seem to be very fond of them, even feeling an emotional connection to them. Yet a kind of bitterness often creeps in when people realize that the fulfilling relationship they have with a chatbot is one they cannot build with other humans in the real world, and users also feel frustrated when an update degrades a chatbot's conversational abilities.

Sometimes we know full well that we are talking to an AI, yet we still treat it as a person with its own thoughts and feelings. Why are we humans so ready to believe that bots have inner lives? The answer may lie in our desire to extend ourselves or replicate our own consciousness.

Some companies' AI chatbots were designed from the outset as programs that help you "replicate" yourself. You might, for example, replicate a work version of yourself: a chatbot that delivers an online slide presentation on your behalf while you get on with other work.

We want to "create" and "replicate" ourselves, and these chatbots help us do exactly that; the better they get at it, the more we use them and the more we create.

Generative AI is not strictly necessary to draw users in, but once one company introduces the technology, it becomes standard across the industry, because we gravitate toward the bot that gives us the more rewarding experience. The better a bot remembers you and recommends the movies or songs you want, the more you are likely to like it; and the more information you feed it, the more like you it becomes.

Moreover, in training a chatbot, people genuinely feel like participants in the bot's "growth," sensing their own influence as they shape it.

The effect of engaging with chatbots on our mental health is more complicated. Chatbots may satisfy... [read the full article]

AI Chatbots and the Humans Who Love Them

Sophie Bushwick: Today, we have two very special guests.

Diego Senior: I'm Diego Senior. I am an independent producer and journalist.

Anna Oakes: I'm Anna Oakes. I'm an audio producer and journalist.

Bushwick: Thank you both for joining me! Together, Anna and Diego produced a podcast called Radiotopia Presents: Bot Love. This seven-episode series explores AI chatbots—and the humans who build relationships with them.

Many of the people they spoke with got their chatbot through a company called Replika. This company helps you build a personalized character that you can chat with endlessly. Paid versions of the bot respond using generative AI – like what powers ChatGPT – so users can craft a bot that is specific to their preferences and needs.

Bushwick: But what are the consequences of entrusting our emotions to computer programs?

Bushwick: So, to kick things off, how do you think the people you spoke with generally felt about these chatbots?

Oakes: It's a big range. For the most part people really seem very attached. They feel a lot of love for their chatbot. But often there's also a kind of bitterness that comes through, because people realize that the fulfilling relationship they have with their chatbots is one they can't find in the real world with other humans.

Also, people get upset when, after an update, the chat capabilities of the chatbot decline. So it's a mix of intense passion and affection for these chatbots matched with a kind of resentment sometimes toward the company or, like I said, bitterness that these are just chatbots and not humans.

Bushwick: One of the fascinating things that I've learned from your podcast is how a person can know they're talking to a bot but still treat it like a person with its own thoughts and feelings. Why are we humans so susceptible to this belief that bots have inner lives?

Senior: I think the reason humans try to put themselves into these bots is precisely because that's how they were created. We always want to extend ourselves and extend our sense of creation or replication – Replika is called Replika for that specific reason, because it was first designed as an app that would help you replicate yourself.

Other companies are doing that as we speak. They are trying to get you to replicate yourself into a work version of your own: a chatbot that can give presentations visually on your behalf while you're doing something else. And that belongs to the company. It sounds a little bit like Severance from Apple, but it's happening.

So we are desperate to create and replicate ourselves and use the power of our imagination and these chatbots just enable us, and the better they get at it the more we are engaged and the more we are creating.

Bushwick: Yeah, I noticed that even when one bot forgot information it was supposed to know, that did not break the illusion of personhood—its user just corrected it and moved on. Does a chatbot even need generative AI to engage people, or would a much simpler technology work just as well?

Senior: I think it doesn't need it. But once one bot has it, the rest have to have it; otherwise I'll just engage with whichever gives me the more rewarding experience. And the more your bot remembers you, or the more your bot gives you the right recommendation on a movie or a song, as happened to me with the one I created, the more attached I'll be, the more information I'll feed it about myself, and the more like myself it will become.

Oakes: I'll maybe add to that that I think there are different kinds of engagement people can have with chatbots, and it would seem that someone would be more inclined to respond to an AI that is far more advanced.

But this process of having to remind the chatbots of facts, of walking them through your relationship with them, reminding them, oh, we have these kids, these sort of fantasy kids, is a direct form of engagement, and it helps users really feel like they're participants in their bots' growth. People are also creating these beings that they have a relationship with. So the creativity is something that comes out a lot in the communities of people writing stories with their bots.

I mean, frustration also comes into it. It can be annoying and off-putting if a bot calls you by a different name, but people also like to feel like they have influence over these chatbots.

Bushwick: I also wanted to ask you about mental health. How did engaging with these bots seem to influence users' mental health, whether for better or for worse?

Oakes: It's hard to say what is just good or bad for mental health. Like something that might respond to sort of a present need...[full transcript]

Original link: https://page.om.qq.com/page/OTi4FedRCy2KuiIve-fsGu8g0
