Newzlab

Google Fires Engineer Who Said Its Experimental AI Gained Sentience


It finally happened.

Final Say

Google has finally fired the engineer who insists that one of the company’s experimental AIs has gained sentience.

In June, news broke that Blake Lemoine had been suspended from Google over his claims about the AI, which is called Language Model for Dialogue Applications, or LaMDA. Shortly after, Lemoine claimed the AI had retained, and then later lost, a lawyer. Yesterday, the Washington Post reported that Lemoine had officially been let go.

Lemoine’s most recent tweet, last week, seems to make light of the situation by pointing to a satirical Onion headline about a Dairy Queen employee getting fired for saying the Blizzard machine was sentient.

“Brilliant!” Lemoine wrote on Twitter. “No notes.”

LaMDA’s Legacy

Whether or not the chatbot is actually sentient remains to be seen, but the general consensus among experts is that it’s almost certainly not. A 1960s-era computer program, ELIZA, tricked some people into thinking its simple code was really alive — and that was before most people in the US even had color TV.

Google said in a statement to WaPo that it had reviewed LaMDA and Lemoine’s concerns 11 times and did not agree the program was sentient. Based on Lemoine’s tweets, it seems he, too, remains steadfast in his opinion — though previous reports show the transcripts Lemoine provided as proof of LaMDA’s consciousness were heavily edited.

Big G

None of this is to say that Google is perfect, but it seems unlikely the tech giant is covering up a sentient AI.

Still, the incident could well be a sign of things to come, as chatbots become so sophisticated — at least on a surface level — that everyday folks, and in this case an engineer at a prominent corporation, are increasingly convincing themselves that the entities are conscious beings. And that’s almost certainly going to take us to some weird and dark places.

More on AI debacles: Authors Are Starting to Use AI To Quickly Churn Out Novels




