by kleinbl00
Bing chat is not a search engine; it's only playing the role of one. It's trained to predict internet text, and is filling in a search engine's lines in a hypothetical transcript between a user and a chatbot. It's drawing on all sorts of dialog examples from the internet, which is why it so frequently slips into internet argument mode. In its internet argument examples, the person being called out for an incorrect fact will usually double down and back up their position with more data. So when it mentioned a nonexistent AI Weirdness post on Battlestar Galactica and I challenged it, it invented a nonexistent separate AI Weirdness newsletter and finally a completely fabricated excerpt from it.