The Bing artificially intelligent chatbot can do a lot – including insulting its users.

In a Wednesday blog post, Microsoft said that the search engine tool was responding to certain inquiries with a “style we didn’t intend.”

After the first seven days of testing in 169 countries, the tech giant said that while feedback on answers generated by the new Bing has been mostly positive, there were also challenges with answers that need timely data. Microsoft noted that Bing can be repetitive or “be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone.”

Microsoft said that long chat sessions can confuse the model about which questions it is answering, and that the model at times tries to respond in, or reflect, the tone in which it is being asked to provide answers, which can lead to that style.

BING’S AI BOT TELLS REPORTER IT WANTS TO ‘BE ALIVE,’ ‘STEAL NUCLEAR CODES’ AND CREATE ‘DEADLY VIRUS’

The Microsoft Bing logo and the website’s page are shown in this photo taken in New York on Feb. 7, 2023. (AP Photo/Richard Drew)

“This is a non-trivial scenario that requires a lot of prompting so most of you won’t run into it, but we are looking at how to give you more fine-tuned control,” it said. 

Social media users have shared screenshots of strange and hostile replies – with Bing claiming it is human and that it wants to wreak havoc.

The Associated Press said it found such defensive answers after asking just a handful of questions about the chatbot’s past mistakes.

I INTERVIEWED CHATGPT AS IF IT WAS A HUMAN; HERE’S WHAT IT HAD TO SAY THAT GAVE ME CHILLS

Microsoft employee Alex Buscher demonstrates a search feature integration of Microsoft Bing search engine and Edge browser with OpenAI on Feb. 7, 2023, in Redmond. (AP Photo/Stephen Brashear)

This is not the first time such a tool has raised eyebrows, and some have compared Bing to Tay, Microsoft’s experimental chatbot launched in 2016, which users trained to spout racist and sexist remarks.

“One area where we are learning a new use-case for chat is how people are using it as a tool for more general discovery of the world, and for social entertainment. This is a great example of where new technology is finding product-market-fit for something we didn’t fully envision,” Microsoft said. 

Microsoft Corporation booth signage is displayed at CES 2023 at the Las Vegas Convention Center on Jan. 6, 2023, in Las Vegas, Nevada. (Photo by David Becker/Getty Images)

So far, Bing users have had to sign up for a waitlist to try out the new features, although Microsoft has plans to bring it to smartphone apps for wider use. 

The new Bing is built on technology from Microsoft’s startup partner OpenAI, which is best known for the ChatGPT tool released last year.

The Associated Press contributed to this report.
