1 point by logicallee 7 hours ago | 2 comments
We all know that AIs have a hallucination problem, so we have to check sources. Lately, I've noticed that ChatGPT 5.2 with web search cites its sources very well, and I've stopped noticing hallucinations. Are the hallucinations just more subtle now? Does it still hallucinate, and if so, can you give recent examples?
tjr 7 hours ago
With the caveat that I do not pay for ChatGPT; using the free plan --

A couple of days ago I asked it to tell me about the Duncan Tournament yo-yo (one of the early commercial wooden yo-yos, still made in reproductions today). It generated multiple paragraphs of text describing the Tournament as "butterfly"-shaped, which is not true.

I asked it if the Tournament is what became the Imperial yo-yo (a modern plastic yo-yo with the same basic shape). It said no, across multiple paragraphs, again claiming that the Tournament was butterfly-shaped.

I then just plainly told it that the Tournament was NOT butterfly-shaped, and it apologized for misspeaking.

logicallee 4 hours ago
Thank you for that example.