Microsoft Bing AI Ends Chat When Prompted About ‘Feelings’

  • Thread starter: Bloomberg News (Guest)
Microsoft appears to have implemented new restrictions on user interactions with its Bing internet search engine after reports of its chatbot generating bizarre and even hostile responses.

Continue reading...