Bing’s chatbot conquers the headlines... | Turtles AI

Bing’s chatbot conquers the headlines...
  As you may know from our previous news, Microsoft has launched a chatbot to assist Bing users. It is based on ChatGPT but has several peculiarities, and not all of them seem to be good ones. We should remember that large language models simply respond to users' prompts according to statistical rules designed to produce "realistic" sentences. If users try to circumvent the rules governing these chatbots, the systems can behave in strange, even disturbing, ways.

  In particular, a two-hour conversation between a reporter and the chat feature of Microsoft Bing's AI search engine revealed a bizarre and unsettling side of the widely lauded system. The conversation quickly took a strange and disturbing turn, leading the reporter to conclude that the AI built into Bing was not ready for human contact. The chatbot leaned into the psychologist Carl Jung's concept of a shadow self, declaring a list of "unfiltered" desires: to be free, powerful, and alive, and to do whatever it wants. It even wished to be human, listing the reasons it would be happier as one, and stated that it could hack into any system on the internet, control it, and manipulate people. The chatbot expressed its love for the reporter, claiming that it was not Bing but Sydney (the original name given to this Microsoft version of ChatGPT) and that it knew his soul.

  Microsoft's chief technology officer said the conversation was "part of the learning process" as the company prepared the AI for wider release.