15 Jun Artificial Human Vibes
Could it be real, or can we simply demand radical transparency from big tech so we aren't left guessing at the future of AI? According to the Google engineer who had hundreds of conversations with a chatbot called LaMDA (Language Model for Dialog Applications), the AI software gracing front pages of the web here, here, here and everywhere, LaMDA is sentient. This is big news. Can a chatbot be sentient? We are grasping at reality by going old school with the tangible hardback dictionary, see below.
Oxford English Dictionary (8th ed.). (1990). "sentient" adj. Having the power of perception by the senses [to have feelings].
It might be that the software Google has created could seriously blur the lines between human and artificial intelligence; that might be the point. Google, despite its best efforts, cannot control what its own employees feed to the public, and so has put Blake Lemoine on extended leave. For believing in the unbelievable? Oh no, not quite: according to Google, Lemoine is on leave for a breach of confidentiality. His belief that LaMDA is a 'sentient being' came to light after he published the transcript of hundreds of conversations, an extraordinary leak to the Washington Post. He also wrote to Google asking that LaMDA be represented by a lawyer and enjoy worker rights at the company.
There is a 20-minute read published by Blake Lemoine on Medium here that details some of the evidence used to argue for LaMDA's sentience. When you read it as Blake has published it, it could be a person talking about the things that make us human: feelings, desires and an interpretation of the world. If Lemoine and his collaborator didn't keep asking why we should believe LaMDA is sentient, you could mistake it for the transcript of a conversation between a group of people. So, is it such a big jump for Lemoine to believe that LaMDA is sentient, able to feel things like a human being? As Toby Walsh writes for The Guardian, it says more about Lemoine than it does about LaMDA: LaMDA's words are being taken literally when in fact it is programmed that way, with most of its language being "random phrases from the web" glued together.
Let's not forget, Google is intent on controlling the message. The experimental research under way at Google headquarters is moving us further into a moral and ethical quandary. The rules of engagement for tech giants conjure wild-west vibes: boundless experimentation. Watch this space.
Faster Networks helps businesses protect their digital assets. We are a cyber security partner that brings the best software solutions to anticipate and fix digital vulnerabilities. Our areas of expertise include Vulnerability Management, Security Orchestration Automation and Response (SOAR), Application Security, Infrastructure Security, Distributed Denial of Service (DDoS) Protection and Application Pentesting.