Monday, November 13, 2006
The Best-Guess Phenomenon and Degree of Belief

You're a lost tourist. You hit up a local for directions. With apparent confidence and fluency, the local sends you in utterly the wrong direction. I experience this so regularly in travelling that I now never ask just a single local for directions. About half the time, the second gives a completely different version from the first. Occasionally, someone will express a degree of uncertainty. Almost always, if she's at all uncertain, she's completely wrong; and many who seem perfectly confident are equally wrong.
I've been on the other end of this, too, at least twice, realizing that my directions were seriously mistaken only after having given them. In one case, I sent a Mexican family so far astray that I fear it would take them at least an hour to recover, were they unwise enough to trust me!
Similarly, but more seriously: The doctor tells you that you have disease X, with evident assurance, no visible uncertainty. Another tells you that you have disease Y. Or even: The doctor starts out seemingly uncertain, undecided, then settles on something as -- you might think -- her best guess; then, as she spells out the guess to you, she becomes seemingly confident about it, confident enough, apparently, to stake your health on it. But she's wrong, and in fact she changes her opinion easily when the next doctor you see calls up the first and describes his different, better diagnosis.
Let's call this the "best guess phenomenon". In certain situations, when Person A is presumably an expert and Person B has no resources to challenge Person A's opinion, Person A will give her best guess, conveying it with authority and confidence regardless of how well-founded the opinion is. No malice is intended, nor any disguise. It's not that Person A knows she's uncertain and aims to conceal that fact. Rather, the situation invites Person A to take on the mantle of expertise, with very little sensitivity to the proper degree of confidence.
One model I think won't suffice for such cases: Conventional philosophical/economic treatments in terms of "degree of belief" on a scale from 0 to 1. Best-guess phenomena are not, I think, best described as cases in which Person A has an irrationally high degree of confidence. For example, if asked to make a serious wager -- if, say, the local wanted to get there herself, or if the doctor's own health were at stake -- she'd balk, admit uncertainty, consult elsewhere. Rather, it's more that degree of confidence doesn't arise as an issue: Person A is neither certain nor uncertain, really. She's just talking, playing authority as part of a social role, without much thought about how much certainty is justified.
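To make the conventional picture concrete, here's a minimal betting-odds sketch of "degree of belief" (the symbols p, W, and L are just mine, for illustration): an agent with credence p that her answer is right should accept a wager that wins W if she's right and loses L if she's wrong exactly when its expected value is positive, that is, when

\[
pW - (1 - p)L > 0, \qquad \text{equivalently} \qquad p > \frac{L}{W + L}.
\]

So with p = 0.95, W = 10, and L = 100, the expected value is 0.95(10) - 0.05(100) = 4.5 > 0, and the model predicts she'd take even that lopsided bet. That's just what the balking local or doctor won't do -- which is part of why assigning her a single number from 0 to 1 seems to miss what's going on.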
6 comments:
I like this analysis. Your concluding remark reminds me a bit of Harry Frankfurt's theory of 'bullshit' (although you specify that the answer is not known to be false by either party). Perhaps at times we need a confident answer to take action, and so we go along with it; perhaps we are neither certain nor uncertain in receiving the answer at times, just playing submission as part of a social role.
But given how dependent we seem to be on confident assertion in conceptualizing “belief,” this analysis might derail us a little bit. The more we can pull apart our “believing” dispositions from the actual utterances we wind up making, the more mysterious this belief stuff is. So it seems to me, anyway. Is this the kind of thing you are after, or am I getting this wrong?
Michael
Thanks for your comment, Michael! I agree that there is some similarity between this thought here and Frankfurt's analysis of "bullshit". I disagree, though, about the importance of hewing close to people's linguistic dispositions in attributing beliefs. In fact, I think it's helpful to distinguish between occurrent judgment as expressed in sincere utterance and genuine, broadly dispositional belief as revealed by one's broad pattern of behavior. For example, there's the racist who sincerely claims that "all the races are intellectually equal" yet whose behavior reveals that she doesn't really believe this....
I like your point here, kboughan -- though you attached it to the wrong post. I think perhaps my gut vs. rational moralist distinction is indeed a bit simplistic, and "gut thinkers" may often do a lot of cogitation. I do suspect there's some cognitive difference to be found between people who are more or less "gut" driven in their ethics, but how exactly to spell that out may be tricky. Joshua Greene has done some fMRI research that's suggestive on this front, though, indicating that more calculating moral decisions involve different brain areas than more visceral moral decisions.
‘For example, there's the racist who sincerely claims that "all the races are intellectually equal" yet whose behavior reveals that she doesn't really believe this....’
Ok, thanks. This makes sense. However, what one “really believes” is growing more perplexing to me. One thing I had in mind when writing my original comment was that occurrent judgment seems to be what our common usage of the word “belief” is rooted in -- how we “conceptualize” believings, for lack of a better word. Even in philosophy, we move from our concrete assertions to “that p,” and from “that p” to internal mental states with static propositional content. If it were not for the concrete assertions, it is hard to imagine how we would be talking about “beliefs.” For instance, if we had a pre-language community replete with familiar social practices and gestures, would we characterize the people of this community as having beliefs? If these people do not yet use a language, it is hard to view their seeing, predicting, and engaging with the world as a matter of having a certain set of propositional beliefs. But maybe they do, depending on how we wind up defining “beliefs.” Sorry if this is murky; I see that you have written some about beliefs, but I have not yet had a chance to take a look.
Thanks
Michael
Michael, I think you nicely articulate some of what drives people to language-centered views of belief. However, I think in the end such accounts are not the most useful -- or, at least, not the most useful for understanding the kinds of murky cases about belief that most interest me: gradual cognitive change and gradual forgetting, self-deception, thoughtlessness, cases where one behaves inconsistently over time, etc.
Linguistic approaches to belief tend to be clean, inviting a one-to-one match to discretely believed (or discretely not believed) contents. Murky cases seem puzzling on such accounts -- does Kripke's Pierre, for example, "really believe" that London is pretty, or that it is not pretty, or both, or neither? The error in such thinking is, in my view, the initial commitment to a clean view of belief on which such questions get discrete yes-or-no answers. Broadly dispositional accounts of belief make no such commitment to discrete contents.
(That's the short version: For the full version see my 2001 and 2002 essays on belief, linked to from my homepage.)
That sounds right to me so far. I’ll check out those essays of yours here shortly. Thanks! I’m inclined to think of the ‘content’ of explicit belief as indeterminate until the time of utterance. For me, stable judgments do not evidence clean, discrete content.
I did just locate a post of yours from July that I think addresses one of my worries here:
“Maybe we need two different words for these two different types of state, so as not to confuse things by applying the term "belief" to both. (Indeed, maybe the English "belief" does pretty much just capture the explicit, not-especially-well-connected-with-action type of state. "These are our beliefs....")”
http://schwitzsplinters.blogspot.com/2006/07/what-we-believe.html
Thanks
Michael