Also: Google asks staff to rewrite Bard’s responses; EU’s AI bill may stall; Endangered architects
AI in brief Microsoft will start limiting conversations with its AI-driven Bing chatbot to 50 chat turns per day, in a bid to prevent it from generating unhinged responses to user queries.
People have reported the chat feature deployed on the company’s Edge web browser going bonkers after being probed for too long. Bing has threatened users, gaslighted them, and become emotionally manipulative. It even claimed it was in love with a reporter – and told him to leave his wife.
(A good reporter will go to great lengths for a story, but you have to draw a line somewhere.)
To tame Bing's deranged behavior, Microsoft announced it was limiting users to 50 chat turns per day and no more than five chat turns per session. "A turn is a conversation exchange which contains both a user question and a reply from Bing," the company explained.
Although it’s easy to think of Bing as having a personality, the chatbot is just software that doesn’t understand what it’s saying.
Trained to take on a conversational tone, it mimics human dialog and can start going off the rails if asked emotionally charged questions. Unfortunately, the AI can sometimes start doing this when asked seemingly benign questions too.
"After a chat session hits five turns, people will be prompted to start a new topic. At the end of each chat session, context needs to be cleared so the model won't get confused," Microsoft clarified.
Google is asking staff to help improve Bard, the AI chatbot it plans to use for web search, by rewriting text generated by the software. The ad behemoth's VP for search, Prabhakar Raghavan, reportedly sent an email to employees asking for their help in making sure Bard's responses are accurate and appropriate.