Monday, August 28, 2006

Implicit Belief and Tokens in the "Belief Box"

You believe the number of planets is less than -- well, let's be safe, 100! You also believe that the number of planets is less than 101, and that the number of planets is less than 102, etc. Or are you only disposed to believe such things, when prompted to reflect on them, while prior to reflection you neither believe nor disbelieve them?

Most philosophers of mind who've discussed the issue (e.g., Fodor, Field, Lycan, Dennett) are inclined to think everyone does believe that the number of planets is less than 157, though they may never have explicitly entertained that idea.

Now some of these same authors also think that, paradigmatically, believing something requires having a sentence "tokened" somewhere in the mind (metaphorically, in the "belief box"). So, then, do I have an indefinitely large number of tokens in the belief box pertaining to the number of planets? That's a rather uncomfortable view, I think, if we take the idea of "tokening" relatively realistically (as Fodor and others would like us to do). Consequently, the idea of "implicit belief" might be appealing. As Fodor and Dennett describe it, only explicit beliefs require actual tokenings in the belief box. We believe something "implicitly" if its content is swiftly derivable from something we explicitly believe.
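To make the picture concrete, here is a minimal toy sketch in Python (my own illustration, not Fodor's or Dennett's actual architecture; the predicate name and the single "swift derivation" rule are placeholders): one tokened sentence does duty for indefinitely many implicit beliefs.

```python
# A toy "belief box": explicit beliefs are stored tokens; implicit beliefs
# are whatever is swiftly derivable from them. (Illustrative only; the
# representation and the one derivation rule are placeholders, not anyone's
# actual proposal.)

EXPLICIT_BELIEFS = {
    ("number_of_planets_less_than", 100),   # the one tokened sentence
}

def believes_explicitly(content):
    """True just in case the content is literally tokened in the box."""
    return content in EXPLICIT_BELIEFS

def believes_implicitly(content):
    """True if the content is swiftly derivable from an explicit token.
    The only 'swift' rule here: less-than is upward-closed in its bound."""
    predicate, bound = content
    return any(pred == predicate and explicit_bound <= bound
               for pred, explicit_bound in EXPLICIT_BELIEFS)

# One token suffices for indefinitely many implicit beliefs:
assert believes_explicitly(("number_of_planets_less_than", 100))
assert not believes_explicitly(("number_of_planets_less_than", 157))
assert believes_implicitly(("number_of_planets_less_than", 157))
assert believes_implicitly(("number_of_planets_less_than", 10**6))
```

The toy also makes vivid just how sharp the line between the tokened and the merely derivable is on this picture, which is part of what concern (1) below questions.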

Here are some concerns about that idea:

(1.) It draws a sharp distinction -- between the real cognitive structure of explicit and implicit beliefs -- where folk-psychologically and (so far) empirically, none (or only a gradation of degree) is discernible. Five minutes ago, did you believe that Quebec was north of South Carolina (or New York), that the freeways are slow at 5:17 p.m., that Clint Eastwood is a millionaire, that most philosophers aren't rich, etc.? Some of these contents I have probably previously entertained, others not; I don't know which, and it doesn't seem to matter. I note no sharp distinction between the implicit and explicit among them. (On the other hand, there is a sharp distinction, I think, between my belief that the number of my street address is less than 10,000 [it's 6855] and my just-now-generated belief that it's divisible by three, which took some figuring [using the rule that a number is divisible by three if and only if the sum of its digits is divisible by three: 6+8+5+5 = 24, which is divisible by three].)

UPDATE (2:39 p.m.) on (2) and (3):

(2.) Dennett imagines a chess-playing computer that, as a result of its programmed strategies, always tries to get its queen out early -- though there's nothing explicitly represented in the programming from which "get the queen out early" is swiftly derivable. Although most people wouldn't ascribe literal beliefs to a chess-playing computer, one can imagine a similar structural situation arising in a person, if one accepts the rules-and-representations approach to cognition Fodor and others are offering. (A toy sketch of how such a disposition might emerge without being tokened appears after these concerns.) In such a case, we might still want to say, as Fodor grants, that the machine or person (implicitly) believes one should get the queen out early. But now we need a different account of implicit belief -- and there's a threat of sliding over to the view that to have a belief is just to exhibit a certain pattern of behavior (and experience?), as Dennett, Ryle, and others have suggested; and that would be a change of position for most representationalists about belief.

(3.) Briefly: If beliefs commonly arise and (especially) are forgotten gradually, this puts strain on the central idea in the belief box model of the explicit-implicit distinction, which seems to assume that there's generally a distinct and discrete fact of the matter about what is "tokened" in the belief box. (At least I've never seen a convincing account of belief boxes that makes any useful sense of in-between cases and gradual change.)
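Returning to the chess case in (2): here is a small sketch of how a "get the queen out early" disposition could fall out of explicitly represented rules that never mention the queen. It assumes the third-party python-chess library, and the single mobility heuristic is my own stand-in, not Dennett's imagined program; the point is only structural.

```python
# Toy move chooser whose only explicitly represented rule is "maximize the
# total reach of my pieces". Nothing in the code tokens "get the queen out
# early"; but since the queen is the most mobile piece, the chooser tends
# to bring her out within the first few moves.
# Assumes the third-party python-chess library (pip install chess).
import chess

def side_reach(board: chess.Board, color: chess.Color) -> int:
    """Count the squares attacked by all of `color`'s pieces (a crude mobility measure)."""
    return sum(len(board.attacks(sq))
               for sq in chess.SQUARES
               if (piece := board.piece_at(sq)) and piece.color == color)

def choose_move(board: chess.Board) -> chess.Move:
    """Pick the legal move that maximizes the mover's total reach afterward."""
    mover = board.turn
    def score(move: chess.Move) -> int:
        board.push(move)                 # try the move
        reach = side_reach(board, mover)
        board.pop()                      # undo it
        return reach
    return max(list(board.legal_moves), key=score)

board = chess.Board()
for _ in range(8):                       # play eight plies by the same rule
    move = choose_move(board)
    print(board.san(move))
    board.push(move)
```

Whether we would then say the program (or a person wired this way) implicitly believes one should get the queen out early is exactly the question: the content isn't swiftly derivable from any stored sentence, only from the pattern of behavior the stored rules produce.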

(Thanks to Trent Dougherty for discussion of this issue [at This Is the Name of This Blog] in connection with my Stanford Encyclopedia entry on belief.)
