

Opinion
By Pantelis Solomou
Imagine stumbling upon a time machine and ending up in ancient Athens. You walk through the streets, telling people that you come from the future. Most laugh at you, but a curious philosopher, Democritus, decides to listen.
“Well, tell me,” he says. “What is the future like?”
Proudly, you explain: we drive cars, fly in airplanes, light up our homes with the flick of a switch. We even have machines, computers, that can think like humans.
Democritus leans toward you. “Fascinating! But tell me, how do these computers work? Do they have a brain, like people?”
You hesitate. “Not exactly. They have… microchips.”
“And what are those?” he presses.
You fumble. “Small plastic pieces… with wires and stuff…”
And in that moment, you realize: you can describe what computers do, but not how they work. What seemed like knowledge collapses the moment someone asks for details.
This brief encounter with Democritus shows how easily we deceive ourselves. We think we know something, until someone asks us to explain it. Psychologists call this the “illusion of explanatory depth.” Take something as ordinary as a zipper: you use it every day, but could you explain exactly how the tiny teeth lock together?
How many times have we expressed opinions about things we don’t truly understand? “Real estate never loses value,” we say confidently, without really grasping how markets work. Or we offer simple solutions to complex problems, like “build more houses, and the housing crisis will be solved.”
Why does this happen? Our minds crave simple explanations. You drop an egg, it breaks. Simple. But big problems aren’t like that. Think of traffic: “build more roads, and the problem disappears.” Sounds logical. More lanes, more space. In practice, the opposite happens. The new road attracts new drivers: some leave the bus, others change routes, and soon enough congestion is back where it started. Traffic engineers call this induced demand.
So what can you do? If you want to test how well you know something, try explaining it to a child. If they don’t get it, you don’t know it as well as you think. A professor once tried to teach quantum physics to his students. “I say it once, nobody understands. I say it a second time, still nothing. I say it a third time, and that’s when I finally understand it myself.” After all, if you don’t know something well, how can you explain it to others?
If you want to know whether someone else really knows what they’re talking about, ask “How?” rather than “Why?” Take corruption as an example. When asked “Why should we fight it?” the answers come quickly: “Because it destroys trust,” “Because it harms the economy.” All correct. Then come the simple solutions: “More laws,” “More transparency.” Because the problem is serious and you want it solved, the answers sound convincing. But ask exactly how that will happen. Who will enforce the laws? Who will monitor the enforcers? How will resistance be managed? How will the powerful react? And keep asking: don’t stop at the first “we’ll form a committee” or “we’ll strengthen oversight.” Go deeper: which committee, with what authority, with what budget, and what happens if it fails?
If you truly want to understand something, don’t settle for easy explanations. In 1854, a cholera epidemic struck London. The prevailing view was that the disease came from “bad air.” A doctor, John Snow, asked something different: how exactly does it spread? He went house to house in Soho, recording every death, and saw that most victims lived near the Broad Street water pump. He persuaded the authorities to remove the pump’s handle, and the epidemic subsided. Snow had discovered that cholera was waterborne decades before microbiology confirmed it scientifically. His discovery saved thousands of lives and laid the foundations of modern epidemiology.
If you met Democritus again tomorrow, you might still stumble over microchips. But this time, you could honestly say: “I don’t know.” And, to paraphrase another famous ancient Greek, it’s better to know what you don’t know.