Regular blue Bing gave an accurate answer to my question; GPT-4 had some very believable ideas that were false, and it got offended when I pointed that out.
It closed the chat immediately after I responded by asking if it was offended.
Damn, must feel bad if not even a dedicated chat AI really wants a conversation with you
Damn. It got promoted to chat support manager real quick.