Tuesday 17 October 2023

New top story on Hacker News: Ask HN: When LLMs make stuff up, call it 'confabulating', not 'hallucinating'

23 points by irdc | 25 comments on Hacker News.
In humans, a hallucination is formally defined as a sensory experience that occurs without an external stimulus. LLMs have no sensors with which to experience the world (other than their text input) and (probably) don’t even have subjective experience in the way humans do. A more suitable term is confabulation: what humans do when, due to a memory error (e.g. from Korsakoff syndrome), we produce distorted memories of ourselves. These confabulations can sound very believable to outsiders, which makes the comparison with LLMs making stuff up rather apt. So please call it confabulating, not hallucinating, when LLMs make stuff up.
