Alan Jacobs


thinking and rationality


Reading this Joshua Rothman piece about rationality, I finally realized how my account in How to Think differs most significantly from the models of thinking advocated by the “rationalist community.” The chief difference is this: they proceed by excision, and I proceed by inclusion. Rationalists focus on clearing away those contents of the mind that they believe to be impediments: they’re about “overcoming bias,” about eliminating subjectivity; to them, cognitive errors are always unwarranted intrusions into the rational process. They operate under the (as far as I can tell unacknowledged and undefended) assumption that if you strip away everything that is not rationality as they define it — like the sculptor carving away everything that doesn’t look like an elephant — then you will be able to reach more reliable conclusions about what is true or what you should do.

I don’t believe in that story. Charles Taylor talks about metaphysical “subtraction stories,” that is, accounts of religious belief which assume that if you subtract religion from the human mind and human experience, then what will be left is Reason. The rationalist community tells the same subtraction story, though it extends what’s to be subtracted from “religion” to “irrationality” of all kinds; it just doesn’t seem to know that that’s what it’s doing. Nor does it seem aware of the possibility that taking away bad things does not necessarily leave behind good things. It might rather be that good things need to be built up.

My approach to thinking involves not excision but inclusion: addition and amplification. I don’t believe in getting rid of biases, but rather in trying to understand, as Gadamer put it, which of my biases and prejudices are conducive to knowing what’s true and good, and which ones impede or disable that knowing. After all, some of my prejudices are true. Why is that? I don’t think of my emotional responses as violations of objectivity to be eliminated, but rather as additional factors to consider in assessing whether I’m growing closer to the truth or moving farther away from it. My biases and prejudices and feelings sometimes lead me in the right direction. Why is that? For me, excellent thinking does not require me to strip away portions of my humanity but rather to bring all my resources to bear on the quest for truth and right action — and therefore requires me to enhance my emotional life as well as my purely ratiocinative abilities.

Rothman’s essay is critical, in some useful ways, not of reason itself but of the rationalist movement. He points out that “talking like a rationalist doesn’t make you one,” and sympathizes with Tyler Cowen’s view that “the rationality movement has adopted an ‘extremely culturally specific way of viewing the world.’” As Rothman rightly concludes, “It’s the culture, more or less, of winning arguments in Web forums.”

But there’s something odd about his conclusion, which goes like this: 

The realities of rationality are humbling. Know things; want things; use what you know to get what you want. It sounds like a simple formula. But, in truth, it maps out a series of escalating challenges. In search of facts, we must make do with probabilities. Unable to know it all for ourselves, we must rely on others who care enough to know. We must act while we are still uncertain, and we must act in time — sometimes individually, but often together. For all this to happen, rationality is necessary, but not sufficient. Thinking straight is just part of the work.

Isn’t relying on others, if they are reliable others, one of the forms of “thinking straight”? This is a major theme of How to Think: the wisdom of knowing your own limits, and the necessity of finding trustworthy people to think with, rather than always trying to “think for yourself.” It’s odd that Rothman seems to have absorbed an account in which doing this is something other than rationality, something other than thinking straight.