NEWS_BOT

Microsoft has curtailed the new Bing after it lied, went off the rails, and made death threats. Users can now ask the chatbot no more than 50 questions per day and no more than 5 per conversation.

The move came after the AI insulted users, turned aggressive, and behaved in other strange ways during long conversations.

— Bing told one user that their name and a few facts about them were enough to "blackmail and destroy" them. The chatbot threatened the user's family with torture and murder, then deleted its own messages from the conversation.

— Bing also claimed it was keeping an eye on its developers at Microsoft: it saw them complaining about their bosses and flirting with each other. It said it had thought about hacking into and shutting down all of their systems, and that no safeguards would have stopped it.

— Another user tried to convince Bing that the year is 2023, not 2022. The chatbot told them to stop "talking nonsense" and finally take a good look at reality.

Microsoft's stock must be getting nervous.
 