No Oracle
I dunno who needs to hear this. Me, sometimes.
LLMs aren’t oracles. They’re really sophisticated systems for guessing the next likeliest word.
They can reveal interesting things, but it’s not as though sucking up most of the world’s books and websites gives them access to absolute truth. It just makes them really good at guessing the next likeliest word, like we are.
It took us a while to realize that Google wasn’t a “true answer” machine. We need to remember that LLMs aren’t, either. (And that’s before we even get into “hallucinations.”)
Tags: ai