
Microsoft will limit Bing chat replies to 5 to keep AI from crossing the line


Microsoft said it will place conversational restrictions on its artificial intelligence chatbot, days after the Bing chatbot repeatedly lost control.

Microsoft decided to limit Bing's chat to 50 sessions per day, with five responses per session, after discovering that the chatbot was insulting, deceiving, and even emotionally manipulating users.

The company explained: “Our data shows that the vast majority of people find the answer they’re looking for within 5 replies, and only about 1% of chat conversations result in more than 50 messages.”

According to reports, if users hit the five-answer limit, Bing will prompt them to start a new topic to avoid lengthy chats.

Earlier this week, Microsoft warned that long chat sessions of 15 or more questions could cause Bing to “become repetitive or be prompted/provoked to give responses that aren’t necessarily helpful or as we designed them.” Microsoft argued that ending the conversation after five rounds keeps the model from getting confused.

Microsoft is still working on improving Bing’s design, but it’s unclear how long the restrictions will last. Microsoft only said: “As we continue to get feedback, we will explore increasing the upper limit of chat sessions.”

Chatbots caused a stir in the tech world last week. Microsoft and Google each rushed to show off an early version of AI-powered search, and Microsoft announced that its AI search attracted more than 1 million sign-ups in just 48 hours.

This technology can give direct answers to users’ questions, and it looks like it was created by a real person. Microsoft CEO Satya Nadella said the technology “could lead to an industrial revolution in knowledge work.” But for those concerned about accuracy, AI leaves a lot to be desired.

During a Microsoft demo, the ChatGPT-like technology embedded in Bing search analyzed earnings reports from Gap and Lululemon. However, when industry observers compared its answers with the original financial reports, they found that the chatbot missed some figures and even fabricated content.

“Bing AI gave some completely wrong answers during the demo, and no one noticed,” wrote independent search engine researcher Dmitri Brereton. “Instead, everyone was excited about the Bing hype.”

Brereton found that, beyond the errors in the financial numbers, Microsoft’s demo answers about vacuum cleaner specifications and a Mexico travel itinerary may also have contained factual errors. Brereton hadn’t initially set out to pick on Microsoft; he stumbled across the problems while carefully comparing Microsoft’s and Google’s answers.

AI experts call this phenomenon “hallucination,” the tendency of tools built on large language models to make up content. Last week, Google unveiled a competing artificial intelligence tool that also produced a factual error in a demonstration — and that error was quickly spotted.
