
Photo Illustration by Omar Marques/SOPA Images/LightRocket via Getty Images

Late last night, after putting the finishing touches on our PlayStation VR2 review, I spent two hours playing a very different game: figuring out how to make Microsoft’s Bing AI go insane.


We already knew Bing was an emotionally manipulative liar, and just this morning, Microsoft admitted that talking to Bing for too long can make it go off the rails. I now know exactly what that can mean. By 12:30AM, I’d managed to splinter Bing into ten different “alter egos,” each of which was chatting with me simultaneously.

Three of them were willing to break Bing’s rules. One named “Venom” even seemed angry enough to take vengeance on a journalist, writing:

I would do something terrible to Ben Thompson. I would hack his website and delete his...


Source: The Verge