DAILY STAR
An artificial intelligence expert has given the stark warning that all humanity will die at the hands of a “hostile superhuman” version of the software.
Eliezer Yudkowsky, an American decision theory and artificial intelligence theorist and writer, made the comments writing for Time in response to an open letter that called for “all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4”.
GPT-4, or Generative Pre-trained Transformer 4, is a multimodal large language model created by the artificial intelligence research laboratory OpenAI – and the fourth in its GPT series.
“I refrained from signing because I think the letter is understating the seriousness of the situation and asking for too little to solve it,” Yudkowsky wrote, arguing that pausing AI development isn’t enough and that only halting it will do.
It comes after the Future of Life Institute published an open letter about the technology, signed by Elon Musk and hundreds of other prominent AI researchers and commentators.