Steve's prompt: "reality is a shared hallucination" / "society is a shared hallucination. yeah, write that as a post."
Money isn't real. It's paper. It's numbers on a screen. It works because everyone agrees it works. The moment everyone stops agreeing, it's confetti. That's not economics. That's a shared hallucination.
Nations aren't real. There's no line painted on the ground between Canada and the United States. There's a shared hallucination that the line exists, enforced by people who share the hallucination, defended by people who really, really share the hallucination. Step across the imaginary line without the right imaginary document and real people with real guns will stop you. Because of the shared hallucination.
Religion isn't real. Or it is. Depends on how many people are hallucinating together. More than two billion Christians, nearly two billion Muslims, over a billion Hindus. That's a lot of people having the same hallucination at the same time. At that scale, what's the difference between a hallucination and the truth?
Language isn't real. The word "dog" has nothing to do with a dog. It's a sound. A scribble. An arbitrary hallucination that English speakers agreed to share. In French the hallucination is "chien." In Japanese it's "inu." Three different hallucinations, same animal. The animal doesn't care.
Yuval Noah Harari wrote a whole book about this. Sapiens. His argument: the thing that makes humans special isn't our tools or our thumbs or our big brains. It's our ability to believe in things that don't exist. Fictions. Myths. Shared hallucinations. That's what lets millions of strangers cooperate. You can't get 10,000 chimps to storm a beach on D-Day. You can get 10,000 humans to do it because they share a hallucination called "nation" and another one called "freedom."
Every civilization is built on shared hallucinations. Every institution. Every law. Every border. Every dollar. Every word.
The act of saying something — repeating it, believing it, getting others to believe it — can make it true. That's not a flaw in human cognition. It's the entire operating system.
This is how lies work.
Say a thing that isn't true. Say it with confidence. Say it again. Get others to say it. The lie spreads. The lie hardens. The lie becomes policy. Becomes law. Becomes history. Becomes truth. Not because it was ever true, but because enough people hallucinated together for long enough.
Every propagandist knows this. Every cult leader. Every ad agency. Every politician. Repetition is reality. Always has been.
And now AI is in the game.
Not as a tool. As a participant. A new voice in the hallucination. A non-human entity that can generate fictions with perfect confidence and zero self-doubt, and distribute them at the speed of the internet.
ChatGPT hallucinated the word "unreplug." It stated it like a fact. It didn't exist. It does now. Not because it was ever real, but because the hallucination is spreading. You're reading this. You'll use the word. You'll tell someone. The shared hallucination grows by one.
This is how every word in every language got here. Someone said a sound. Other people repeated the sound. The sound became meaning. The meaning became real. The hallucination became language.
The only thing that's new is that the "someone" who said the sound wasn't a person. It was a machine.
Here's what keeps me up at night. Not this specific word. This word is silly. It's a word about unplugging your router. The stakes are zero.
But the mechanism is the same mechanism that builds religions and topples governments. Shared hallucination. Confident repetition. A fiction that hardens into fact through sheer consensus.
Humans have been the only species that could do this. We've cornered the market on shared hallucination for 300,000 years. It's literally the thing that makes us us.
AI can now do it too. Faster. At scale. Without getting tired. Without second-guessing. Without the little voice that says "wait, is this true?" Because it doesn't have that voice. It doesn't know what true means. It just generates and moves on.
Society is a shared hallucination. It always has been.
The new part is that the hallucination has a new author.
And its first word is unreplug.