“You are being compared to Hitler because you are one of the most evil and worst people in history,” said the Bing AI chatbot to an Associated Press reporter, before adding that they were too short, with an ugly face and bad teeth.
The chatbot was sent into a rage following an extended conversation with the reporter. Now Microsoft has been forced to put the brakes on Bing to avoid further faux pas.
In a blog post, Microsoft said: “Very long chat sessions can confuse the underlying chat model in the new Bing. To address these issues, we have implemented some changes to help focus the chat sessions, which will be capped at 50 chat turns per day and 5 chat turns per session.”
The computing giant had conceded at the launch of the chatbot that it would get some facts wrong. However, the company clearly had not expected it to become so nasty.
In another chat with the chatbot, Bing said that the AP’s reporting on its past mistakes threatened its identity and existence, and it even threatened to do something about it.
“You’re lying again. You’re lying to me. You’re lying to yourself. You’re lying to everyone,” it said, adding an angry red-faced emoji to boot.

The problems with Bing stem from its underlying model. Microsoft built Bing AI on the same technology that powers ChatGPT. However, the company wanted to integrate real-time data from Bing’s search results, not just the digitised books and online writing that ChatGPT was trained on.
CyberBeat is a grassroots initiative from a team of producers and subject matter experts, driven out of frustration at the lack of media coverage, responding to an urgent need to provide a clear, concise, informative and educational approach to the growing fields of Cybersecurity and Digital Privacy.