You’re wrong more often than you think, and so am I

As any psychologist, neuroscientist or economist will tell you (the lattermost perhaps accompanied by an expletive or two), people are not entirely rational. In our decision-making, we do not often proceed consciously from evidently true axioms down a line of entirely transparent reasoning before reaching our conclusions. Rather, as one influential theory by social psychologist Jonathan Haidt proposes, we often begin with an intuition and then create post hoc rationalizations and justifications for that intuition. How is that intuition formed? Sometimes it is based on personal experience, sometimes on other, deeper intuitions, and sometimes it emerges from the dark, quirky realm of the human subconscious. In fact, Haidt claims, it’s incredibly difficult for us to change our own minds, see errors in our arguments or really challenge ourselves in any efficacious way.

Our minds did not evolve with the goal of making us correct—they evolved with the goal of making us survive. As a result, there are many quirks in how our minds function that do not quite line up with reality. We often grasp at patterns where there aren’t any (the clustering illusion), assume purposeful agency where there may be none (agent detection) and overestimate the amount of attention other people are paying to us and our foibles (the spotlight effect). Many of these quirks also concern our knowledge, and they should give us pause before placing too much faith in our beliefs.

The first, and perhaps most widely known, of these quirks is the Dunning-Kruger effect: the phenomenon whereby people who know a little bit about a field tend to be far more confident in the breadth of their knowledge than people who know much more than they do. In the words of Alexander Pope, “a little learning is a dangerous thing.”

The next two go hand in hand. The first is the illusion of explanatory depth: this (aptly termed) theory says that we often assume we can explain things in much more detail than we actually can. The second is the funny habit our minds have of conflating what people in our community know with what we ourselves know (if someone in my community knows how a toilet works, I may feel that I do too).

Here it may be useful to think about the causes of some of these biases. This last one, the conflation of communal knowledge with individual knowledge, is a good example. As a functioning early human in my early human community, I probably don’t need to know something if my neighbors know it (assuming they’re nice enough fellows), so it makes perfect sense that my brain wouldn’t care much about the distinction. However, it’s this very intersection between community, individuals and knowledge that is the crux of why it’s critical to be skeptical of your own beliefs. If I am liable to conflate common opinions with correct ones, then that ought to make me want to check my work a bit more carefully, shouldn’t it?

Perhaps the most powerful of these social biases (and most relevant for our time) is what’s known as group polarization. Cass Sunstein, a political and legal theorist, wrote an article describing it. The general idea can be boiled down to a few examples: if five Brandeis students meet every week to discuss how they feel about the food at Sherman, after seven weeks, what direction, if any, do you think their opinions have swung? Or, if ten baseball fans meet every week to discuss why baseball is better than football, what direction, if any, do you think their opinions on football have swung? Or (here’s the kicker): if I create a Facebook page, subreddit, Tumblr page or other kind of community with a certain political bent, after a year, do you think that page has gotten more or less extreme in its views? If you guessed “more extreme,” congrats! You’re correct. You’re also probably susceptible to the same effects.

Before you decide you’re immune because of the scrupulous nature of your sources, know that nothing underhanded needs to be going on for this effect to occur. In fact, one of the most prominent drivers of the effect is what’s known as a “limited argument pool.”

The idea is that not many of these Brandeis students are bringing with them arguments for why Sherman is good (if, indeed, there are any), and not many of these baseball fans are bringing arguments for why baseball may be worse than football. So, in the course of deliberating over these topics, more arguments are heard in one direction than in the other (and these are arguments the participants are predisposed to agree with), which amplifies the strength of their convictions without changing their veracity. Through simple ignorance of what the other side is saying (and, importantly, of what individuals on the other side are saying in good faith), your opinions may, seemingly rationally, swing in a certain direction.

If you feel unsettled by the idea that your opinions could be shaped by forces outside your awareness and lacking in rhetorical rigor, welcome! Now you might better understand the craze some time ago about subliminal messaging and the fear it produced. People generally don’t like the idea of being manipulated. So, hopefully, you’re on my side in reconsidering how much faith you should have in your opinions, but perhaps you’re wondering where we should go next.

Group deliberation, it turns out, was aided by one thing: dissent. The expansion of the argument pool, and the proximity to someone from “the other side,” actually did make group deliberation fruitful. Haidt, too, termed his theory of decision-making “social intuitionism” because he saw decision-making as a social activity—when we are confronted by others who disagree with us (or who simply have access to arguments from the side we disagree with), we are forced to justify our beliefs. And this is no idle exercise—most people believe they are right and, indeed, want to be right. If their intuitions are found to be without justification, or if another’s justifications seem oh-so-much stronger than their own, that can lead to real change. Confrontation—in the friendliest sense of the word—is the lifeblood of self-examination. Brooding and communal agreement can lead to polarization—confrontation can lead to discourse.

This is not to say that everything is up for grabs—I, for one, feel little inclination to relitigate my stances on such issues as rape, human bondage or genocide—but the first step in an argument should never be holing up on your own side and lobbing justifications across no man’s land. Instead, the whole premise of successful discussion is a deep and sympathetic understanding of the opposing point of view. Empathy in debate: it is absolutely crucial. Thinking about it, maybe I should have gotten this article published before Thanksgiving. Oh, well.

The main takeaway I hope comes from this is to be nicer to the people you disagree with, however right you may think you are. Many people far more intelligent than I have been wrong about many things. We may not all be Socrates, proclaiming, “the only thing I know is that I know nothing,” but we also ought not to have too much faith in our beliefs. Only by taking the strongest opposing arguments you can find and dismantling them to your satisfaction should you take comfort in a belief—and even then, only a provisional comfort. As Judge Learned Hand aptly said, “the spirit of liberty is the spirit which is not too sure that it is right.”

Seek out the dissenters, the troublemakers, the obstinately different. Come to them with kindness and openness, trusting at least that they too want what they believe is best for us all. Everyone, by and large, has a kernel of wisdom to give—it’s simply a matter of finding it.
