Chatbot mystery (to me)

There have been several demonstrations where somebody asks a chatbot for references to back up an assertion it has made, and the chatbot simply makes up book or article titles that do not exist. Fake title, fake author(s), fake publisher. In an alternative universe those references might exist, but in our reality they don't.

My question is: how did this blatant form of fabrication get programmed into the chatbot? It can't be a search engine result, because the books and articles it cites were never in any data set. They don't exist.

Somebody wrote code that makes this happen. How and why?


Thurrott © 2024 Thurrott LLC