There have been several demonstrations in which somebody asks a chatbot for references backing an assertion it has made, and the chatbot simply invents book or article titles that do not exist. Fake title, fake author(s), fake publisher. In an alternative universe those references might exist, but in our reality they don't.
My question is: how did this blatant form of fabrication get programmed into the chatbot? It can't be a search-engine result, because the books and articles it cites aren't in any data set. They don't exist.
Somebody wrote code that makes this happen. How and why?
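To make the puzzle concrete: a generative text model can produce fluent strings that appear nowhere in its training data, just by recombining patterns it has seen. The toy sketch below (my own illustration, not anyone's actual chatbot code) trains a word-bigram model on a few real book titles and then generates new title-like strings, most of which were never in the corpus.

```python
import random

# Toy bigram model: learn word transitions from a handful of
# real book titles, then generate new title-like strings.
# Purely illustrative -- large language models work on a related
# next-word-prediction principle at vastly greater scale.
corpus = [
    "The Art of Computer Programming",
    "The Structure of Scientific Revolutions",
    "The Design of Everyday Things",
    "The Elements of Programming Style",
]

# Transition table: word -> list of words observed to follow it.
transitions = {}
for title in corpus:
    words = title.split()
    for a, b in zip(words, words[1:]):
        transitions.setdefault(a, []).append(b)

def generate(start="The", max_words=6, rng=random):
    """Walk the transition table to produce a fluent, title-shaped string."""
    words = [start]
    while len(words) < max_words and words[-1] in transitions:
        words.append(rng.choice(transitions[words[-1]]))
    return " ".join(words)

random.seed(1)
for _ in range(3):
    title = generate()
    # Fluent output need not appear anywhere in the training data.
    print(title, "| in training corpus:", title in corpus)
```

The generator never "looks up" a title; it only chains together locally plausible word pairs, so outputs like "The Art of Everyday Things" emerge even though no such book was ever in the data. Whether that counts as code "written to make this happen" is exactly what the question above is probing.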