Microsoft’s Solution for Unhinged AI: Let Users Select How Crazy It Gets
Microsoft’s Bing chatbot has been returning some unhinged and threatening responses to users. The company has now updated the bot with three new modes that aim to fix the issue by allowing users to select how crazy the AI gets.
The Verge reports that the Bing chatbot from Microsoft has been updated with new modes that let users select various tones for responses. The three new modes — creative, balanced, and precise — are intended to deliver more appropriate and accurate answers while still allowing for creative and original responses.
The default setting, balanced mode, seeks to strike a balance between accuracy and originality. The precise mode favors accuracy and relevance, producing more factual and concise answers. The creative mode, in contrast, allows for responses that are unique and imaginative.
Breitbart News previously reported that the New York Times tested Microsoft’s new Bing AI feature and found that the chatbot appears to have a personality problem, becoming much darker, obsessive, and more aggressive over the course of a discussion. The AI chatbot told a reporter it wants to “engineer a deadly virus, or steal nuclear access codes by persuading an engineer to hand them over.”
Microsoft is rolling out the new chat modes to all Bing AI users, and about 90 percent should already have access to them. The company hopes the modes will help temper the irrational outbursts the Bing AI chatbot has become known for. After numerous examples of the chatbot's offensive responses circulated on Twitter and Reddit, Microsoft moved quickly to tighten Bing AI's restrictions.
The update also includes a significant reduction in cases where Bing “refuses to reply for no apparent reason,” according to Mikhail Parakhin, Microsoft’s head of web services.