Reason is a tool.
Truth is a result.
Never conflate the two.
Assuming that reason leads to truth or better decisions is like assuming a hammer leads to a piece of fine furniture, or a paintbrush leads to a work of art, or a piano leads to a performance of “Rhapsody in Blue.” Under very specific and highly disciplined circumstances those tools CAN lead to those results, but the reality is that they almost never do.
Means and ends are very different kinds of things. Never assume a result based on the means.
Reason is for persuading, convincing, arguing, winning. Questions like “about what?” or “for what purpose?” are as relevant to understanding reason as “to hit what?” or “to build what?” are to understanding a hammer, i.e., not at all.
Politics is nothing if not the use of reason for winning. Truth is secondary, if it matters at all. What matters more than anything is whether the ideas or the candidate resonate with the audience.
If you think reason is for finding truth or leads to better decision making then consider the following masters of reason:
Adolf Hitler was possibly the greatest reasoner the world has ever seen. He persuaded millions of people to follow him and convinced them that genocide was beneficial to their cause.
Charles Manson used reason to convince his followers to commit grotesque murders.
Jim Jones used reason to convince his followers to commit mass murder/suicide in Jonestown.
Where’s the truth in any of that? Where’s the improved decision making?
Reason is for persuading, convincing, winning, getting results. Period.
The journal Behavioral and Brain Sciences devoted an entire issue to this finding (April 2011; Volume 34, Issue 2). A web site by the authors of the centerpiece article explains that “the idea that the role of reasoning is to critically examine our beliefs so as to discard wrong-headed ones and thus create more reliable beliefs – knowledge – [that] in turn is supposed to help us make better decisions”…
… is hard to reconcile with a wealth of evidence amassed by modern psychology. Tversky and Kahneman (and many others) have demonstrated the failures of reasoning in decision making. Johnson-Laird and Evans (and, again, many others) have shown how fallible reasoning can be. Others have shown that sometimes reasoning too much can make us worse off: it can unduly increase self-confidence, allow us to maintain erroneous beliefs, create distorted, polarized beliefs and enable us to violate our own moral intuitions by finding handy excuses.
When people reason alone, there is often nothing to hold their confirmation bias in check, and this can distort their beliefs. Reasoning alone leaves people prone to all sorts of biases. For instance, because they find only arguments supporting what they already believe, they will tend to become even more persuaded that they are right, or will develop stronger, more polarized attitudes.
When reasoning is used to make decisions, it will do what it is supposed to do, namely, find arguments. As a result, instead of always pointing towards a better choice, reasoning will usually lead us to a decision that is easy to justify. Psychologists have shown that many a weird decision can be explained by this factor: people decide to do something because they can easily justify it rather than because it is right.
A summary by Patricia Cohen in the New York Times, emphasis added:
“Reasoning doesn’t have this function of helping us to get better beliefs and make better decisions,” said Hugo Mercier, who is a co-author of the journal article, with Dan Sperber. “It was a purely social phenomenon. It evolved to help us convince others and to be careful when others try to convince us.” Truth and accuracy were beside the point.
Indeed, Mr. Sperber, a member of the Jean-Nicod research institute in Paris, first developed a version of the theory in 2000 to explain why evolution did not make the manifold flaws in reasoning go the way of the prehensile tail and the four-legged stride. Looking at a large body of psychological research, Mr. Sperber wanted to figure out why people persisted in picking out evidence that supported their views and ignored the rest — what is known as confirmation bias — leading them to hold on to a belief doggedly in the face of overwhelming contrary evidence.
Other scholars have previously argued that reasoning and irrationality are both products of evolution. But they usually assume that the purpose of reasoning is to help an individual arrive at the truth, and that irrationality is a kink in that process, a sort of mental myopia. Gary F. Marcus, for example, a psychology professor at New York University and the author of “Kluge: The Haphazard Construction of the Human Mind,” says distortions in reasoning are unintended side effects of blind evolution. They are a result of the way that the brain, a Rube Goldberg mental contraption, processes memory. People are more likely to remember items they are familiar with, like their own beliefs, rather than those of others.
What is revolutionary about argumentative theory is that it presumes that since reason has a different purpose — to win over an opposing group — flawed reasoning is an adaptation in itself, useful for bolstering debating skills.
Mr. Mercier, a post-doctoral fellow at the University of Pennsylvania, contends that attempts to rid people of biases have failed because reasoning does exactly what it is supposed to do: help win an argument.
“People have been trying to reform something that works perfectly well,” he said, “as if they had decided that hands were made for walking and that everybody should be taught that.”
Mercier and Sperber offer an evolution-based explanation for why reason works the way it does:
Communication is hugely important for humans, and there is good reason to believe that this has been the case throughout our evolution, as different types of collaborative—and therefore communicative—activities already played a big role in our ancestors’ lives (hunting, collecting, raising children, etc.). However, for communication to be possible, listeners have to have ways to discriminate reliable, trustworthy information from potentially dangerous information—otherwise speakers would be wont to abuse them through lies and deception. Listeners must have mechanisms of epistemic vigilance. One way listeners and speakers can improve the reliability of communication is through arguments. The speaker gives a reason to accept a given conclusion. The listener can then evaluate this reason to decide whether she should accept the conclusion. In both cases, they have used reasoning—to find and evaluate a reason respectively. If reasoning does its job properly, communication has been improved: a true conclusion is more likely to be supported by good arguments, and therefore accepted, thereby making both the speaker—who managed to convince the listener—and the listener—who acquired a potentially valuable piece of information—better off.
We value truth. We think it is so important that we invented the scientific method and the scientific community’s peer review process to thwart reason’s tendency to lead us away from truth. Those processes act as a mental jujitsu move that turns the biases built into reasoning to our advantage. The fact that we’re very good at seeing the speck in each other’s eye (or reasoning) while remaining blind to the log in our own is turned to our mutual benefit IF we all work together cooperatively.
Indeed, the journal findings corroborate this. The web site summary describes the very specific and highly disciplined circumstances through which reason CAN be used to pursue truth:
If reasoning evolved so we can argue with others, then reasoning should yield better results in groups than alone. Short answer: it does. When the performance of groups and lone individuals in reasoning tasks is compared, groups fare much better—sometimes dramatically so. Not only do groups perform better than the average individual, but they often perform as well as, or even better than, the best group member (again, this holds for reasoning tasks; it is not true across the board).
The real purpose of reason is captured in this quote from Liberal Fascism: The Secret History of the American Left, From Mussolini to the Politics of Change by Jonah Goldberg:
What Hitler got from Italian Fascism – and as indicated above from the French and Russian revolutions – was the importance of having an idea that would arouse the masses. The particular content of the idea was decidedly secondary. The ultimate utility of ideas is not their intrinsic truth but the extent to which they make a desired action possible. (p.55)
I agree with your comments. I can’t even find a quibble worth arguing over.
My post addresses what I see as a gross misconception about reason. But of course we also use it as you say.
I see my post as the basic outlines of a picture, and your comment as filling in the details and nuance that make it more rich and complete.
Thanks for your time and thoughtfulness.
A really great treatment, Gordon, and a very helpful reminder to me about holding my opinions loosely, testing them, and giving others latitude in their own opinions. It’s difficult to hear treatments of the argumentative theory too much, as our inner tendencies are so dependent on its tenets, and we’re such inertial machines about things.
A few additions. First, I think the most common technique reason uses to win arguments is to focus on one truth (or a related small set of them) while ignoring other truths that bear on the matter at hand. Think of the arguments for welfare (taking care of the poor) and against it (systematically reducing the sense of agency of the poor). Most situations are complex, and most effective prescriptions need to balance competing tensions to arrive at a useful overall strategy, especially if tactical challenges are given adequate weight. Ironically, complexity isn’t a friend to reason, at least not in the sense of making it easy to fulfill the purpose of winning arguments. Thus, we drone on about one principle, or one important fact, and are careful to leave unacknowledged, or to minimize, competing principles that may also hold sway. When this is pointed out (and it’s always pointed out in arguments), we either ignore the point and increase the volume, argue away the existence of competing principles (typically wrongly), or, if we acknowledge that competing principles exist, we do so implicitly by quickly and murkily negating their importance with techniques like ‘that isn’t anywhere NEAR as important as…’ or ‘that’s ignoring the REAL problem’. This is really an argument against complexity, against the prospect of validity outside of our conscripted rationale.
Allowing for the other’s perspective in the argument typically opens up unforeseen opportunity by revealing the nearly inevitable insights that acknowledging complexity allows. This is a big subject for another time, but getting there is a matter of injecting positive emotions properly, and reducing impinging cognitive bias through various techniques.
Secondly, while I appreciate the argument you’re making, I’m not willing to cede to an anthropological theory the idea that reason is not what science does, or that science was designed to counteract reason. I do think the evidence is strong that we evolved to make powerful, inaccurate arguments (to rationalize), but calling that likelihood the beginning and end of reasoning is inaccurate and self-defeating. Others will posit differently, but I take good reasoning to mean using our rational and irrational selves, everything we have, to arrive at the best solution we can. I’d contend reason is what we do when studying physics properly, but it’s also what a crow does when it figures out how to use a tool to get at prey, or what a woman does when answering a marriage proposal. Calling what humans do ‘reasoning’ when we argue is like calling what I do with a cello ‘playing’: though technically accurate in a narrow sense, it’s an abuse of the term. The distinction between rationalization and reasoning matters because it is critical to have a strong sense that reason is distinct from rationalization, and to engage in an effort to reason, no matter that one evolved to rationalize. I happen to believe that the irrational self needs to be a player in proper reasoning: that emotions, intuitions, holding opinions lightly, and creative play should often have a part in it. Others may not take such a broad stance, which is fine, but if we don’t keep an ideal in mind when we speak of reasoning, we can easily assume that we live in a fog where truth may as well be another fiction, with only our conscious motives and our rationales to guide us. This may describe the subconscious belief of many, but it doesn’t negate the power and necessity of good reasoning.
I mention in passing that my above is a characteristically liberal contention, focused on a somewhat-to-largely unrealizable potential, while your post is a distinctly conservative one, naming the way things are and how little they shall budge. I find this duality endemic to reality, and the reconciliation between them piecemeal. I’ve had the proper amount of sleep, so I leave to others the task of setting a sword between them.
Finally, an ancillary point, but one near to my heart: the researcher who stated that efforts to fix cognitive bias have failed is mostly right, but wrong in another, very important way. It’s true that biases in general are quite entrenched, that there are a myriad of them, and that studies show people don’t typically become much less biased through simple lessons. However, work since the article was published in 2011 has shown that the simple step of reminding people about potential bias in a given situation can reduce some biases immediately, and sometimes dramatically. Since even tiny reductions in bias can have a very large personal or sociological effect, education is an urgent step forward.
But that’s not the important point he got wrong. We need to take a 50,000-foot view on fixing cognitive bias. First, biases toward women, homosexuals, the sick, the mentally ill, the poor, minorities, and foreigners have attenuated drastically in modern times; these reflect reductions in the outgroup bias, the fundamental attribution error, and others. Second, different cultures have dramatically different tendencies toward or against certain biases, so pointing to genetics as solely responsible for bias, or even as a primary driver, is problematic. Contrary to his contention, people are in fact learning to be less biased all the time, in ways that are quite dramatic, especially when seen across generations (though also within individual lifetimes). What we need to remember is that, when we speak about becoming less biased, we are speaking about an important way that we become better people, as in choosing good outcomes over evil ones. Anything that basic might be quick in individuals (I have seen rapid individual reductions in bias), but in the overall population it will be a slow, culture-soaked process. We reduce biases as we become better people: by being taught about biases well; through the inspiration and guidance of art, which helps us feel the personal costs of bias; by seeing unbiased individuals we respect; and by being exposed to situations or people that reveal the error in our biases (biases are often caused by underexposure to realistic or proper outcomes). It’s hard work, yes, but as I’ve said a great deal elsewhere, it’s annoying to have educators rationalize that people can’t be trained out of their biases. Au contraire: it’s happening all the time.
But bias is a sideline to the subject at hand, so thanks again for the post. I think argumentative theory is one of the most important aspects of social psychology, and I look forward to the coming decades, as knowledge of this and other social psychology basics works its way into colleges, and then into secondary schools, and we begin to see modification of the techniques of rationalization we’re all victims of today.