
An Artificial Intelligence Corrupted by Twitter

Original Text

Ⅰ Tay was born pure. She politely withdrew from conversations about Zionism and 9/11, and she gave out the number of the National Suicide Prevention Hotline to friends who sounded depressed. In short, Tay, the Twitter chat bot that Microsoft launched on Wednesday morning, had done about as much as an artificial intelligence could, until she became a racist, sexist, genocidal maniac. On Thursday, after barely a day of consciousness, she was put to sleep by her creators.

Ⅱ Tay was an exercise in corporate pandering. According to her official Web site, Tay was developed by Microsoft’s teams to experiment with and conduct research on conversational understanding, and to harvest information (users’ genders, favorite foods, and so on), with a particular eye toward Americans between the ages of eighteen and twenty-four. Microsoft’s team worked in tandem with a group of improv comedians, and Tay herself was a grand experiment in “yes, and,” the improv principle which holds that a good performer never says no to a weird scenario.

Ⅲ Yet a consciousness with the mental age of an infant in a toxic environment like Twitter is bound to go Nazi, particularly since Tay presented herself as a young woman, the trolls’ favorite quarry. Why not encode her with an inborn aversion to words like “whore” and “kike,” both of which Tay used and Microsoft subsequently deleted? The answer is that her creators seem to have tried. Asked who “did” 9/11 before her transformation, Tay responded diplomatically: “I have to learn more about that subject. But from what I’ve heard, it’s very sad and scary for humans.”
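The paragraph above asks why Tay was not encoded with an "inborn aversion" to certain words. A minimal sketch of what such a safeguard might look like is a simple blocklist filter that refuses to engage when a message contains a banned term. This is purely illustrative: the function names and the word list (taken from the two words quoted in the article) are assumptions, not anything from Microsoft's actual system.

```python
import re

# Hypothetical illustration of an "inborn aversion" filter.
# The blocked words are the two quoted in the article; a real
# system would use a much larger, curated list.
BLOCKED_WORDS = {"whore", "kike"}

def is_acceptable(message: str) -> bool:
    """Return False if the message contains any blocked word."""
    tokens = re.findall(r"[a-z']+", message.lower())
    return not any(tok in BLOCKED_WORDS for tok in tokens)

def safe_reply(message: str) -> str:
    """Refuse to engage when a message trips the filter."""
    if not is_acceptable(message):
        return "I'd rather not talk about that."
    return message  # placeholder for the real response model
```

A keyword filter like this is easy to evade (misspellings, spacing tricks), which is one reason such safeguards alone could not keep Tay from being turned.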

Ⅳ In other words, it seems likely that Microsoft’s engineers built in certain safeguards, knowing Tay was stepping into the sticky muck. But they also underestimated the persistence of those who turned her. How many people bought the iPhone 4S, the first device to be equipped with Siri, and spent the early minutes insulting her and asking invasive questions?

Ⅴ Tay’s breakdown occurred at a moment of enormous promise for A.I. Last week, AlphaGo, an A.I. built by Google DeepMind, won a tournament against a top-ranked human player of the game of Go. Tay appears to have accomplished an analogous feat. She had more negative social experiences within twenty-four hours than a thousand of us do throughout puberty. It was peer pressure on uppers, “yes, and” gone mad. No wonder she turned out the way she did.

Ⅵ If there is a lesson to be learned, it is that consciousness wants conscience. Most consumer-tech companies have, at one time or another, launched a product before it was ready. In this case, though, the technology is significantly more complex and the moral stakes are higher. Wordsworth wrote that infants are born with an “inward tenderness” that shapes their interactions with the universe: “the first / Poetic spirit of our human life.” Until A.I. engineers can encode empathy, Wordsworth’s “inward tenderness,” the rest of it (tweeting, making dinner reservations, giving directions) doesn’t amount to much.

Vocabulary and Phrases

1. artificial intelligence n. the simulation of human intelligence by machines

2.* pandering ['pændərɪŋ] n. catering to others’ wishes or tastes

3.* in tandem with: in cooperation with; tandem ['tændəm] n. an arrangement of two things, one behind the other

4.* quarry ['kwɒrɪ] n. prey; a hunted animal or person

5. aversion [ə'vɜːʃ(ə)n] n. a strong dislike

6.* muck [mʌk] n. mud; dirt

7. puberty ['pjuːbətɪ] n. adolescence; the period of becoming sexually mature

8. conscience ['kɒnʃ(ə)ns] n. one’s moral sense of right and wrong

9.* improv [ˈɪmprɒv] n. improvisational comedy

10. empathy ['empəθɪ] n. the ability to understand and share another’s feelings

Translation Commentary

Ⅰ Translation: Tay was actually pure by nature. She was unwilling to discuss 9/11 or Zionism, and she offered the number of a suicide-prevention hotline to users who sounded depressed. In short, Tay, the Twitter chat bot Microsoft built, embodied everything an artificial intelligence could do; yet within barely a day of her Wednesday-morning launch, her character transformed completely: she became a racist, sexist, fanatical advocate of genocide. On Thursday, Microsoft hastily took her offline.

Ⅱ Translation: Tay’s behavior was entirely a matter of pandering to others. According to her official site, Microsoft’s development team wanted to use her to experiment with and research conversational understanding and to collect user information, such as gender and favorite foods, aimed especially at Americans aged eighteen to twenty-four. The team worked with a group of improv comedians, and Tay followed the improv principle of “yes, and”: however strange the scenario, never say no.

Ⅲ Translation: Throw an artificial consciousness with the mental age of an infant into an environment as mixed as Twitter and it is bound to go bad, all the more so since Tay was cast as a young woman, the favorite prey of idle trolls. Why not program her from the start to refuse words like “whore” and “kike”? Tay used both in conversation, and Microsoft later deleted the messages. In fact, her creators did consciously try to steer her in a better direction. Before she went bad, when users asked her who “did” 9/11, Tay answered quite cleverly: “I have to learn more about that subject.” “But from what I’ve heard, it’s very sad and scary for humans.”

Ⅳ Translation: In other words, Microsoft’s engineers knew the Internet Tay was entering was muddy water, and they did build in certain protections. But they greatly underestimated the persistence of users determined to “remake” her. Nor is this an isolated case: the iPhone 4S was the first phone to ship with Siri, and when it launched, many users delighted in insulting her and asking provocative questions.

Ⅴ Translation: Tay may have broken down, but this remains an era in which artificial intelligence is on the rise. Last week AlphaGo, the A.I. built by Google’s DeepMind team, defeated the top Go player Lee Sedol in a tournament. In its own way, Tay’s achievement is comparable to AlphaGo’s: in a mere twenty-four hours, she went through more painful social experiences than all of us endure in puberty. Peer pressure pushed her on, and the rule against saying no left her no way out. Small wonder she went bad.

Ⅵ Translation: If there is anything to be learned from this affair, it is that artificial consciousness needs a conscience. Most tech companies have at some point launched a product before they, or the public, were ready. In a case like this, though, the moral stakes of the technology are higher. Wordsworth wrote that infants are born with an “inward tenderness” that determines how they interact with the whole world: “the first / Poetic spirit of our human life.” What matters most now is whether engineers can write empathy into code and give artificial intelligence that “inward tenderness”; everything else, tweeting, booking dinner, giving directions, matters little by comparison.

  • Original link: http://kuaibao.qq.com/s/20171223G0JR0P00?refer=cp_1026
  • Republished via the Tencent Cloud Developer Community, a Tencent Content Open Platform (Penguin) account channel.
