Tuesday, January 24, 2017

The Philosopher's Rationalization-O-Meter

Usually when someone disagrees with me about a philosophical issue, I think they're about 20% correct. Once in a while, I think a comment is just straightforwardly wrong. Very rarely, I find myself convinced that the person who disagrees is correct and my original view was mistaken. But for the most part, the pattern is remarkably consistent: the critic has a piece of the truth, but I have more of it.

My inner skeptic finds this to be a highly suspicious state of affairs.

Let me clarify what I mean by "about 20% correct". I mean this: There's some merit in what the disagreeing person says, but on the whole my view is still closer to correct. Maybe there's some nuance that they're noticing, which I elided, but which doesn't undermine the big picture. Or maybe I wasn't careful or clear about some subsidiary point. Or maybe there's a plausible argument on the other side which isn't decisively refutable but which also isn't the best conclusion to draw from the full range of evidence holistically considered. Or maybe they've made a nice counterpoint which I hadn't previously considered but to which I have an excellent rejoinder available.

In contrast, for me to think that someone who disagrees with me is "mostly correct", I would have to be convinced that my initial view was probably mistaken. For example, if I argued that we ought to expect superintelligent AI to be phenomenally conscious, the critic ought to convince me that I was probably mistaken to assert that. Or if I argue that indifference is a type of racism, the critic ought to convince me that it's probably better to restrict the idea of "racism" to more active forms of prejudice.

From an abstract point of view, how often ought I expect to be convinced by those who object to my arguments, if I were admirably open-minded and rational?

For two reasons, the number should be below 50%:

1. For most of the issues I write about, I have given the matter more thought than most (not all!) of those who disagree with me. Mostly I write about issues that I have been considering for a long time or that are closely related to issues I've been considering for a long time.

2. Some (most?) philosophical disputes are such that even ideally good reasoners, fully informed of the relevant evidence, might persistently disagree without thereby being irrational. People might reasonably have different starting points or foundational assumptions that justify persistent disagreement.

Still, even taking 1 and 2 together, it seems that it should not be a rarity for a critic to raise an interesting, novel objection that I hadn't previously considered and which ought to persuade me. This is clear when I consider other philosophers: Often they get objections (sometimes from me) which, in my judgment, nicely illuminate what is incorrect in their views, and which should rationally lead them to change their views -- if only they weren't so defensively set upon rebutting all critiques! I doubt I am a much better philosopher than they are, wise enough to have wholly excellent opinions; so I must sometimes hear criticisms that ought to cause me to relinquish my views.

Let me venture to put some numbers on this.

Let's begin by excluding positions on which I have published at least one full-length paper. For those positions, considerations 1 and 2 plausibly suggest rational steadfastness in the large majority of cases.

A more revealing target is half-baked or three-quarters-baked positions on contentious issues: anything from a position I have expressed verbally, after a bit of thought, in a seminar or informal discussion, up to approximately a blog post, if the issue is fairly new to me.

Suppose that about 20% of the time what I say is off-base in a way that should be discoverable to me if I gave it more thought, in a reasonably open-minded, even-handed way. Now if I'm defending that off-base position in dialogue with someone substantially more expert than I, or with a couple of peers, or with a somewhat larger group of people who are less expert than I but still thoughtful and informed, maybe I should expect that about half to three-quarters of the time I'll hear an objection that ought to move me. Multiplying and rounding, let's say that about 1/8 of the time, when I put forward a half- or three-quarters-baked idea to some interlocutors, I ought to hear an objection that makes me think, whoops, I guess I'm probably mistaken!
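
To make that multiplication explicit, here is a minimal sketch of the arithmetic (Python; the 20% off-base rate and the 50-75% chance of hearing the decisive objection are the illustrative figures from above, not measured values):

```python
# Illustrative arithmetic for the "whoops" rate, using the figures above.
p_off_base = 0.20                     # chance a half- to three-quarters-baked claim is discoverably off-base
p_hear_low, p_hear_high = 0.50, 0.75  # chance the discussion surfaces the decisive objection

print(f"{p_off_base * p_hear_low:.2f} to {p_off_base * p_hear_high:.2f}")
# 0.10 to 0.15 -- roughly 1/8 after rounding
```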

I hope this isn't too horrible an estimate, at least for a mature philosopher. For someone still maturing as a philosopher, the estimate should presumably be higher -- maybe 1/4. The estimate should similarly be higher if the half- or three-quarters-baked idea is a critique of someone more expert than you, concerning the topic of their philosophical expertise (e.g., pushing back against a Kant expert's interpretation of a passage of Kant that you're interested in).

Here then are two opposed epistemic vices: being too deferential or being too stubborn. The cartoon of excessive deferentiality would be the person who instantly withdraws in the face of criticism, too quickly allowing that they are probably mistaken. Students are sometimes like this, but it's hard for a really deferential person to make it far as a professional philosopher in U.S. academic culture. The cartoon of excessive stubbornness is the person who is always ready to cook up some post-hoc rationalization of whatever half-baked position happens to come out of their mouth, always fighting back, never yielding, never seeing any merit in any criticisms of their views, however wrong their views plainly are. This is perhaps the more common vice in professional philosophy in the U.S., though of course no one is quite as bad as the cartoon.

Here's a third, more subtle epistemic vice: always giving the same amount of deference. Cartoon version: For any criticism you hear, you think there's 20% truth in it (so you're partly deferential) but you never think there's more than 20% truth in it (so you're mostly stubborn). This is what my inner skeptic was worried about at the beginning of this post. I might be too close to this cartoon, always a little deferential but mostly stubborn, without sufficient sensitivity to the quality of the particular criticism being directed at me.

We can now construct a rationalization-o-meter. Stubborn rationalization, in a mature philosopher, is revealed by failing to think that your critics are right and you are wrong at least 1/8 of the time when you're putting forward half- to three-quarters-baked ideas. If you stand firm in 15 out of 16 cases, then either you're unusually wise in your half-baked thoughts, or you're at .5 on the rationalization-o-meter (50% of the time that you should yield, you offer post-hoc rationalizations instead). If you're still maturing or if you're critiquing an expert on their own turf, the meter should read correspondingly higher, since the normative target rises to thinking you were demonstrably off-base 1/4 or even half the time.
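
As a rough illustration of how the .5 reading falls out, here is a small sketch (Python). The formula -- one minus the ratio of your actual yield rate to the normative target, floored at zero -- is my reconstruction of the example above, offered only as an illustration:

```python
def rationalization_meter(actual_yield_rate, target_yield_rate):
    """Fraction of should-yield cases met with post-hoc rationalization instead.

    Assumes the reading is 1 - (actual / target), clipped to the range [0, 1].
    """
    return max(0.0, min(1.0, 1.0 - actual_yield_rate / target_yield_rate))

# Mature philosopher, half- to three-quarters-baked ideas: the target is 1/8.
# Standing firm in 15 of 16 cases means yielding in only 1 of 16.
print(rationalization_meter(actual_yield_rate=1/16, target_yield_rate=1/8))  # 0.5
```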

Insensitivity is revealed by having too little variation in how much truth you find in critics' remarks. I'd try to build an insensitivity-o-meter, but I'm sure you all will raise somewhat legitimate but non-decisive concerns against it.

[image modified from source]

9 comments:

  1. *sloshing wine around in a glass wildly whilst gesticulating and speaking* Do we have an inclination to think it's a question of attitude? Like there is some seat of open-mindedness, if we just don't fall to a vice (stubbornness or even deferentialism)?

    But what if the arrogant person could be right and the humble wrong? What if there was no correlation between attitude and factuality? It'd suck! *splosh*

    Personally I think a credible position has the irony of having a way it could be wrong built into it.

    Okay, it's an indulgence to treat having a built in way of being wrong as making for credibility, but perhaps it'd give an insight on how attitude relates to fact. Perhaps that nature is a bit ruthless and wants us/breeds us to take on a position and damn well stick to it, to see if we survive the wilds with that position. Indeed statisticians will say if you use a tactic you should use it over and over and over as a proper test. Belligerently sticking to a position is what statisticians advise, rather than chopping and changing with such tiny sample size of using any particular position that you just don't know if they were effective or not.

    But what to do in an era where the wilds won't take us if our position is false (false in survival terms, anyway)? But chopping and changing is its own superstition? What to do indeed *swish*? Perhaps we are inclined to find the right attitudinal seat, but none of the seats apply - except as a tested belligerence? *splashes wine on carpet*

    And off that topic, while I feel a few bruises of guilt after all that, I'll go for some more bruises by saying why would thinking about something for a long time be relevant to how correct the position is? Geocentrism was what some people thought their whole lives. On the other hand, I don't think being able to lose a long held thought necessarily leads to a great thing. Not by human measure, anyway. *having drunk all the wine, he leaves, da jerk!*

  2. I'm highly suspicious as well. I know I hold some false beliefs (although of course, I can’t point any of them out!), but I of course hold my beliefs because I take them to be true. This often leads me to wonder how many false beliefs I hold.

    I’m no mature philosopher, so I can take your example and say that I’m off-base 1/4 of the time. Let’s say I’m talking to an expert within a certain field – then we can say that they’d be able to stand firm in 15/16 cases. But even in this scenario, the expert has a chance to incorrectly “correct” me, because the expert is of course prone to holding false beliefs and using faulty lines of reasoning as well (they’re just less prone than I am).

    So when talking about a rationalization meter, wouldn’t we have to take into account the “false positives” or “false corrections” that an expert or lay person may commit? People can be persuaded by poor lines of reasoning, professionals included!

  3. How do you know you have false beliefs, Robert?

    Not the usual sort of question but I'm wondering what you base it on?

  4. I like your argument, I like it for serious weighty academic and political discussions, but I kept coming back to my personal relationships. I see the truth in the scenarios in my relationship with my wife and my children. I can't wait to tell my ten-year-old, who vehemently objects to this or that, that I am 20% persuaded by their argument, but that is not enough for me to change my position. Because it's true, I am often persuaded but not enough to change...

  5. If I understand correctly, some models of Bayesian updating of one's credence based on the opinion of an expert depend not only on the magnitude of the expert's credence, but also on your estimate of how responsive his credence is to new facts. It seems to me, to take the example of a specific religious belief, that I will downweight the effect of his belief on mine if I think he is fixed in his beliefs, i.e., he will disregard the type of evidence I would think important in that context.

  6. Thanks for the comments, folks!

    Callan: You write: "Personally I think a credible position has the irony of having a way it could be wrong built into it." That reminds me of Popper on falsifiability. I wouldn't want to make it an exceptionless principle (e.g., it's tricky to apply to mathematical beliefs), but there's something more substantial in risky claims and predictions -- and that can be admirably combined with a willingness to allow that they were wrong, if the balance of further evidence or argument eventually tilts the other way. How this relates to the feeling of confidence, though, must be complex! (It's a good thing most of our furniture is cheap, second-hand stuff, given how messy the guests are with the wine.)

    Robert: For sure! I think the non-expert can often be closer to right than the expert. One type of case that drives me bananas is in interpreting historical philosophers. The non-expert says, "the author seems to be saying P". The expert says, "no, no, that's all wrong, defer to my expertise on this". But in fact that expert has a minority view of the text and the majority of experts agree that the author is saying P. (Stipulate further, if desired, that the expert majority is correct in favoring P and the local expert is in fact off the mark.)

    Gray: Yes, in personal arguments too. I agree.

    David: Nice point. I agree with that.

  7. Eric,

    "if the balance of further evidence or argument eventually tilts the other way."

    That sounds either like the claim has a way of being wrong built into it, or I'm left wondering why someone with a claim would give it up after further evidence or argument?

  8. Also "It's a good thing most of our furniture is cheap, second-hand stuff, given how messy the guests are with the wine."

    Mmmm, right at home! :)

  9. I suppose this is quite peripheral to your point, but quite often we focus on the parts of an argument where we are expert and most likely to be right. So in a complex debate both parties could rationally feel they are over 50% right based on how they allocate percentages. They may even both be correct that the part where they are wrong is irrelevant to them.
