“They won’t listen. Do you know why? Because they have certain fixed notions about the past. Any change would be blasphemy in their eyes, even if it were the truth. They don’t want the truth; they want their traditions.” ~Isaac Asimov, Pebble in the Sky
I’ll never forget the wonderful teacher I had for Honors English my first year in college. Like the other mentors in my life, he began to teach me by first challenging my assumptions. “How do you know what you think you know?” he asked. By this question he introduced me to the study of knowledge (epistemology), and in particular, the knowledge that may be imparted via books, both fiction and non-fiction.
How do you know what you think you know? It seems to me that the older we grow, the less we question reality. The Talmud warns, “We do not see things as they are; we see them as we are.” We jokingly disparage others’ world views, asking, “What color is the sky in your little world?” But we rarely question the color of the sky in our own.
Independent, verifiable proof is one of the benefits of science and scientific thought, as I have previously written. Other people may verify our claims about reality for themselves if we publish our data sets and our experiments are reproducible. But even if everyone accepts the validity of our data, we rarely agree on what we should do about it. Science can answer life’s “What?” questions, and even the “How?” and “When?” questions, but the “Why?” questions remain the subject of religion and philosophy, and the “What should we do about it?” questions, the subject of politics and public policy. “Ay, there’s the rub,” as Shakespeare wrote in Hamlet.
So we have two problems. To get things done, we first must agree on the facts. Then we must agree on what to do with this information. In formal logic, this is an “A AND B” situation. Just getting to A is a problem, as humans appear to be “hard-wired” to view the world in certain ways, and to filter all incoming data according to these world views: “Your What does not fit within my Why philosophy.” We selectively accept the data that fits our world view and reject that which does not (“confirmation bias”).