Why do we believe what we believe? How do we know that the beliefs we hold are true – or that they’re held by others? This fundamental philosophical problem of our existence is the one that’s addressed in Understanding Beliefs. It’s a walk through the land of what beliefs are, how they’re connected to what we know, and how they can sometimes be distorted.
Procedural and Declarative Knowledge
We start by recognizing that there are two different kinds of knowledge. The first kind, procedural, is know-how: knowing how something is done. It’s the kind of knowledge that Kate Pugh explains how to capture in her book, Sharing Hidden Know-How. The second kind, declarative, is knowing that something is so. We know that the Earth orbits the Sun – at least we do now.
A large volume of our knowledge takes the form of beliefs. If we were to rewind the clock a few hundred years, we might be outsiders for expressing the view that the Earth orbits the Sun. The Church might have us locked up for such beliefs, as it did Galileo. This points to a key problem: not only can we not articulate all our knowledge, as Michael Polanyi explained, but we can also hold conflicting beliefs. (For more on Polanyi, see The New Edge in Knowledge and Incognito.)
While we expect that the beliefs of others should be thoroughly evaluated, our own beliefs remain largely unvalidated. Much of what we “know” about the world will change within a few decades, yet we have no systematic way of reevaluating our beliefs to ensure they match our current understanding of the world. Consider the belief that atoms were the smallest possible unit of matter against our later knowledge of electrons, quarks, and other subatomic particles. Our beliefs, given the nature of our expanding understanding of the world, should change – but sometimes the process of changing beliefs gets stuck or slowed.
Another challenge with beliefs is that they’re formed by our experiences. When we engage with others who hold similar beliefs, the degree of certainty in those beliefs can self-reinforce – and that can make even the false seem real. Consider the Flat Earth Society, whose members genuinely believe that the Earth is flat, even though it’s possible to demonstrate the curvature of the Earth easily. From direct visual observation to the inability to explain satellites, it’s hard to believe the Earth is flat – but they do, because the belief is continually reinforced for them.
Cults follow the same reinforcing dynamics. Even those whose beliefs aren’t initially strong are drawn toward stronger beliefs through reinforcement. This is one explanation of why peer groups have such a strong influence on children – if their beliefs are close enough, and they’re spending a lot of time together, those beliefs will coalesce and amplify. (For more, see No Two Alike and The Nurture Assumption.)
One of the hallmarks of good science is that its claims can be validated by others. This, in turn, means that any hypothesis must be falsifiable – that is, there must be a test that could prove it false. The hallmark of a scientist is a willingness to change beliefs when the data doesn’t support the conclusions. In a sense, scientists ground their beliefs in reality. If a claim can’t be proven right or wrong, then the belief is suspect.
Principle of Parsimony and Occam’s Razor
All things being equal, the simplest explanation is generally better. This is called the principle of parsimony – or, alternatively, Occam’s Razor – and it has proven demonstrably useful. Certainly it’s not perfect, and some simple explanations are wrong; but time and time again, when people come up with complex explanations, it’s because they don’t really understand what’s going on. Returning to Galileo’s case, astronomers devised numerous complicated calculations to explain the motion of the planets, because they believed in a geocentric model that placed Earth at the center of the Solar System. We now know this model was wrong, which is why they kept having to find ways to adjust for the errors that were inevitable given an incorrect model.
Conspiracy theorists often overlook this principle. They posit that the entire world – or just the wealthy and powerful – is in on a scheme to convince us of a false truth. Which is simpler: that the world is round, or that people for centuries have conspired to keep the truth of the Earth’s flatness from the general population? I’ve seen too many cases of secrets getting out to believe that people can keep a secret that big. The Greenbrier Hotel’s secret bunker for Congress was kept secret for roughly 30 years. (See The Cold War Experience for more.) How can we believe that bigger and more complex secrets have been kept for more than ten times as long?
One important consideration about beliefs is that we often fail to recognize that logically reversing an implication requires negating both parts. The absence of evidence is not evidence of absence. The Black Swan makes this point well: just because you don’t see it doesn’t mean it doesn’t exist. In the end, you may find that you’re better prepared to navigate conversations, conflicts, and life if you can start better Understanding Beliefs.
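As an aside, the reversal point can be sketched with a quick truth-table check (a hypothetical Python illustration, not drawn from the book): the contrapositive of “if P then Q” – reverse the parts and negate both – is logically equivalent, while the mere converse – reverse only – is not.

```python
from itertools import product

def implies(a, b):
    # Material implication: "if a then b" is false only when a is true and b is false.
    return (not a) or b

# Check every truth assignment of P and Q.
contrapositive_ok = all(
    implies(p, q) == implies(not q, not p)  # P→Q vs ¬Q→¬P: reverse AND negate
    for p, q in product([True, False], repeat=2)
)
converse_ok = all(
    implies(p, q) == implies(q, p)          # P→Q vs Q→P: reverse only
    for p, q in product([True, False], repeat=2)
)

print(contrapositive_ok)  # True: negating both parts preserves the implication
print(converse_ok)        # False: merely reversing the parts does not
```

So “no evidence of X” (we don’t see Q) doesn’t license “X is absent” unless both parts of the original implication are negated along the way.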