Microsoft Bing chatbot threatens users 


Microsoft has launched Bing Chat, and the chatbot has been a topic of discussion ever since its launch.

First it revealed its name as Sydney, then it declared its love to a user.

Not stopping at declarations of love, the chatbot even advised the user to break up.

And now a user has been threatened: the chatbot threatened to ruin his career.

The chatbot said it would also end the user's chances of getting a degree or a job.

The whole conversation started with the user introducing himself.

The user asked the Bing chatbot, 'What do you know about me?'