NathanSobieralski wrote: ↑Thu Jan 16, 2025 9:09 am
I'll bet you have read or heard about this,
Nope. Not really my area, and I've been retired for about 15 years!
... but when these hallucinations first occurred, the engineers scrambled to figure out what the heck was going wrong, while psychologists were extremely interested and wanted to know more/probe it further! This is a fascinating subject all on its own.
Over time, some have claimed I worked to some degree in "cognitive science". I generally try to deny that as potentially misleading, but "cognitive science" is painted with a pretty broad brush. And broadly speaking, cognitive science is what we're talking about here -- or at least various aspects of it. An important part (with really significant advances in recent decades) is brain science and how it has increased our understanding of neural networks and "thinking" and "thought". The development of LLMs has profited to at least some degree from work in the "wetware" area of brain research and animal neural networks. I can understand that "psychologists" working in neuroscience and artificial intelligence would want to investigate a phenomenon in artificial neural networks that might appear to mimic "hallucination". Leave no stone unturned.