Microsoft is limiting how much people can talk to its Bing AI chatbot, following media coverage of the bot going off the rails during lengthy exchanges. Bing Chat will now answer up to five questions or statements in a row per conversation, after which users will be prompted to start a new topic, the company announced in a blog post on Friday. Users will also be limited to a total of 50 responses per day.
The restrictions are meant to prevent conversations from getting weird. Microsoft said long discussions “can confuse the underlying chat model.”
On Wednesday, the company had said it was working to fix problems with Bing, which launched just over a week earlier, including factual errors and strange exchanges. Bizarre responses reported online have included Bing asking a New York Times columnist to leave her marriage to be with the chatbot, and the AI demanding an apology from a Reddit user for disagreeing that the year was still 2022.
The chatbot’s answers have also included factual errors, and Microsoft said on Wednesday that it fine-tuned the AI model to quadruple the amount of data it can pull answers from. The company said it would also give users more control over whether they wanted precise answers, which are drawn from Microsoft’s proprietary Bing AI technology, or more “creative” answers that use OpenAI’s ChatGPT technology.
Bing’s AI chat functionality is still in beta testing mode, with potential users joining a waiting list for access. With the tool, Microsoft hopes to get a head start on what some say will be the next revolution in internet searches, among other things. The ChatGPT technology made a big splash when it arrived late last year, but OpenAI itself has warned of potential pitfalls, and Microsoft has acknowledged limitations with AI. And despite AI’s impressive qualities, concerns have been raised about artificial intelligence being used for malicious purposes such as spreading misinformation and churning out phishing emails.
With Bing’s AI capabilities, Microsoft also wants to get a jump on search giant Google, which announced its own AI chat model, Bard, last week. Bard has had its own problems with factual errors, fumbling an answer during its first public demo.
In its Friday blog post, Microsoft suggested that the new AI chat restrictions were based on information gleaned from the beta test.
“Our data has shown that the vast majority of you find the answers you’re looking for within 5 turns, and that only ~1% of chats have 50+ messages,” it says. “As we continue to receive your feedback, we will explore expanding the boundaries of chat sessions to further improve the search and discovery experiences.”
Editor’s note: CNET uses an AI engine to create some personal finance explanations that are edited and fact-checked by our editors. For more, see this post.