Learn English with the BBC: Is Google's AI Sentient?

Google has fired one of its engineers who said the company's artificial intelligence system has feelings.

  1. artificial intelligence: the ability of machines to simulate human intelligence; AI

Last month, Blake Lemoine went public with his theory that Google's language technology is sentient and should therefore have its "wants" respected.

  1. go public with: to reveal something to the public
  2. sentient [ˈsentiənt]: able to perceive or feel things; having consciousness
  3. should have its wants respected: its wishes ought to be respected

Google, plus several AI experts, denied the claims and on Friday the company confirmed he had been sacked.

  1. claim: an assertion or statement presented as true
  2. confirm: to verify; to state that something is definitely true
  3. be sacked: to be fired; to be dismissed from one's job

Mr Lemoine told the BBC he is getting legal advice, and declined to comment further.

  1. get legal advice: to consult a lawyer
  2. decline to do: to refuse (politely) to do something

In a statement, Google said Mr Lemoine's claims about The Language Model for Dialogue Applications (Lamda) were "wholly unfounded" and that the company worked with him for "many months" to clarify this.

  1. statement: a formal or official announcement
  2. wholly: completely; entirely
  3. unfounded: having no basis in fact; groundless
  4. clarify [ˈklærəfaɪ]: to make something clear or easier to understand

"So, it's regrettable that despite lengthy engagement on this topic, Blake still chose to persistently violate clear employment and data security policies that include the need to safeguard product information," the statement said.

  1. regrettable: unfortunate; causing regret
  2. lengthy: long; prolonged
  3. engagement: (formal) discussion or involvement, especially in a work context
  4. persistently: repeatedly; stubbornly; without giving up
  5. violate policy: to break a policy or rule
  6. safeguard: to protect; to keep safe

Lamda is a breakthrough technology that Google says can engage in free-flowing conversations. It is the company's tool for building chatbots.

  1. Lamda: Google's conversational AI (chatbot) technology
  2. engage in: to take part in
  3. free-flowing: natural and fluent
  4. chatbot: a computer program that simulates human conversation

Blake Lemoine started making headlines last month when he said Lamda was showing human-like consciousness. It sparked discussion among AI experts and enthusiasts about the advancement of technology that is designed to impersonate humans.

  1. make headlines: to be reported prominently in the news
  2. consciousness: awareness; the state of being conscious
  3. spark: to trigger; to set off
  4. enthusiast: a person who is very interested in a particular subject
  5. impersonate: to imitate or pretend to be (a person)

Mr Lemoine, who worked for Google's Responsible AI team, told The Washington Post that his job was to test if the technology used discriminatory or hate speech.

  1. discriminatory speech [dɪˈskrɪmɪnətəri]: speech that treats a person or group unfairly

He found Lamda showed self-awareness and could hold conversations about religion, emotions and fears. This led Mr Lemoine to believe that behind its impressive verbal skills might also lie a sentient mind.

  1. self-awareness: awareness of one's own existence and character
  2. lead sb to believe: to cause someone to believe
  3. impressive: remarkable; making a strong impression
  4. verbal skill: language ability

His findings were dismissed by Google and he was placed on paid leave for violating the company's confidentiality policy.

  1. finding: a conclusion reached after investigation
  2. dismiss: to reject; to refuse to consider
  3. be placed on paid leave: to be suspended from work while still being paid
  4. confidentiality [ˌkɒnfɪˌdenʃiˈæləti]: the keeping of information secret

Mr Lemoine then published a conversation he and another person had with Lamda, to support his claims.

  1. have a conversation with: to talk with

In its statement, Google said it takes the responsible development of AI "very seriously" and published a report detailing this. It added that any employee concerns about the company's technology are reviewed "extensively", and that Lamda has been through 11 reviews.

  1. take sth seriously: to treat something as important
  2. detail: to describe fully; to give full particulars of
  3. review: a formal examination or reassessment
  4. be through: to have undergone; to have gone through

"We wish Blake well", the statement ended.

Mr Lemoine is not the first AI engineer to go public with claims that AI technology is becoming more conscious. Also last month, another Google employee shared similar thoughts with The Economist.

  1. the first to do: the first person to do something
  2. conscious: aware; having awareness
  3. share with: to tell or communicate (thoughts) to someone
