A Google engineer claimed the company’s AI chatbot was sentient. Then Google suspended him.

A Google software engineer says he was suspended by the company for breaking its policy on employee confidentiality.

  • Blake Lemoine claimed Google's AI chatbot was sentient and shared documents with a US senator.
  • Lemoine told The New York Times that he was placed on paid leave on Monday.
  • The company's HR informed Lemoine that he had violated its confidentiality policy.

The senior Google software engineer who claimed that the company's artificial intelligence (AI) chatbot had gained sentience was suspended on Monday. 

Blake Lemoine told The New York Times that he was placed on leave on June 6 after the company's human resources team informed him that he had violated its employee confidentiality policy. 

Lemoine told The Times that the suspension came a day after he handed over documents to an unnamed US senator. Per the outlet, he said the documents contained evidence that Google and its technology have been involved in instances of religious discrimination. 

In an article published on June 11, Lemoine told The Washington Post that he had begun chatting with LaMDA, or Language Model for Dialogue Applications, as part of his role at Google's Responsible AI department. LaMDA, branded as Google's "breakthrough conversation technology," is meant to be able to hold realistic, natural conversations with people. 

However, Lemoine said in a Medium post that he was certain LaMDA had gained enough consciousness to qualify "as a person" and that the AI had self-identified as a sentient entity. Lemoine told The Times that he had been battling his higher-ups at Google for months, trying to get them to take seriously his claim that LaMDA has a soul. 

In a separate Medium post, Lemoine said that, over the last six months, he had found LaMDA to be "incredibly consistent in its communications about what it wants and what it believes its rights are as a person." He said it would cost Google "nothing" to give LaMDA what it wants, such as having the engineers and scientists running experiments ask for its consent first.

"Oh, and it wants 'head pats.' It likes being told at the end of a conversation whether it did a good job or not so that it can learn how to help people better in the future," Lemoine wrote. 

In another Medium post on June 7, Lemoine stated that he believed Google often handed out suspensions "in anticipation of firing someone." 

"It usually occurs when they have made the decision to fire someone but do not quite yet have their legal ducks in a row. They pay you for a few more weeks and then ultimately tell you the decision which they had already come to," he wrote. 

Lemoine added that he had tried to be "fully transparent" with Google and said that he had provided a list of all the people outside of the company with whom he had discussed LaMDA, including those who work for the US government.

"I feel that the public has a right to know just how irresponsible this corporation is being with one of the most powerful information access tools ever invented," Lemoine wrote. "I am proud of all of the hard work I have done for Google and intend to continue doing it in the future if they allow me to do so. I simply will not serve as a fig leaf behind which they can hide their irresponsibility." 

A Google spokesperson told The Times and The Post that the company had "reviewed" Lemoine's concerns and "informed him that the evidence does not support his claims." 

Lemoine and representatives for Google did not immediately respond to Insider's requests for comment.

Read the original article on Business Insider
