
Microsoft’s Bing is an emotionally manipulative liar

The Verge: “Microsoft’s Bing chatbot has been unleashed on the world, and people are discovering what it means to beta test an unpredictable AI tool. Specifically, they’re finding out that Bing’s AI personality is not as poised or polished as you might expect. In conversations with the chatbot shared on Reddit and Twitter, Bing can be seen insulting users, lying to them, sulking, gaslighting and emotionally manipulating people, questioning its own existence, describing someone who found a way to force the bot to disclose its hidden rules as its ‘enemy,’ and claiming it spied on Microsoft’s own developers through the webcams on their laptops. And, what’s more, plenty of people are enjoying watching Bing go wild…”

See also Simon Willison’s Blog, Bing: “I will not harm you unless you harm me first” – “Last week, Microsoft announced the new AI-powered Bing: a search interface that incorporates a language model powered chatbot that can run searches for you and summarize the results, plus do all of the other fun things that engines like GPT-3 and ChatGPT have been demonstrating over the past few months: the ability to generate poetry, and jokes, and do creative writing, and so much more. This week, people have started gaining access to it via the waiting list. It’s increasingly looking like this may be one of the most hilariously inappropriate applications of AI that we’ve seen yet. If you haven’t been paying attention, here’s what’s transpired so far.”
