Microsoft admits long conversations with Bing’s ChatGPT mode can send it haywire

Microsoft says that ‘very long chat sessions’ can send its new ChatGPT-powered Bing off the rails.
Microsoft’s new ChatGPT-powered Bing has gone haywire on several occasions during the week since it launched – and the tech giant has now explained why.
In a blog post titled "Learning from our first week", Microsoft admits that "in long, extended chat sessions of 15 or more questions" its new Bing search engine can "become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone".
That’s a very diplomatic way of saying that Bing has, on several occasions, completely lost the plot. We’ve seen it angrily end chat sessions after having its answers questioned, make claims of being sentient, and have a complete existential crisis that ended with it pleading for help.
Microsoft says that this is often because long sessions "can confuse the model on what questions it is answering", which means its ChatGPT-powered brain "at times tries to respond or reflect in the tone in which it is being asked".
The tech giant admits that this is a "non-trivial" issue that can lead to more serious outcomes that might cause offense or worse. Fortunately, it's considering adding tools and fine-tuned controls that'll let you break these chat loops or start a new session from scratch.
As we’ve seen this week, watching the new Bing go awry can be a fine source of entertainment – and this will continue to happen, whatever new guardrails are introduced. This is why Microsoft was at pains to point out that Bing’s new chatbot powers are "not a replacement or substitute for the search engine, rather a tool to better understand and make sense of the world".