In a forthcoming paper, Blake Myers-Schulz and I pick up a mostly-cold torch from Colin Radford (whose seminal work on this topic was in the 1960s) and challenge the belief condition. Can one know that something is the case even if one doesn't believe that it's the case? We offer five plausible cases (one adapted from Radford) along with empirical evidence that our intuitions [note 1] about these cases are not idiosyncratic.
This paper has already drawn several follow-up studies, some critical and some supportive -- but interestingly, even the critical studies can be read as contributing to an emerging consensus that problematizes the belief condition. (I don't predict the consensus will last. They never do in philosophy. But still!)
First, to give you a feel for it, our cases:
1. An unconfident examinee who feels like she is guessing the answer but non-accidentally gets it right;
2. An absent-minded driver who momentarily forgets that a bridge he normally takes to work is closed and continues en route toward that bridge;
3. A prejudiced professor, who intellectually appreciates that her athletic students are just as capable as her non-athletic students but who nonetheless is persistently biased in her reactions to student-athletes;
4. A freaked-out movie-watcher who seems to have the momentary impression that the scenario depicted in a horror film is real;
5. A self-deceived husband who has lots of evidence that his wife is cheating and some emotional responses that seem to reveal that he knows this, but who refuses to admit the truth to himself.

Now maybe not all the cases work, but we think in each case there's at least some plausibility to the thought that the person in question knows (that Queen Elizabeth died in 1603, that the bridge is closed, that athletic students are just as capable, that aliens won't come out of her faucet, that his wife is cheating) but does not believe -- at least not as fully and determinately as she knows. And lots of undergraduates seem to agree with us! So we think the B condition on knowledge should at least be open for discussion. It should not be regarded as uncontroversially nonproblematic.
Follow-up studies (e.g., here, here, here, and here) have added some new plausible cases. Our favorite of these is:
6. A religious fundamentalist geocentrist who aces her astronomy class -- seeming to know that Earth revolves around the sun but not to believe it.

Although some of these follow-up studies are pitched as in agreement with us and others as critique, we think there's actually a pretty clear thread of consensus through it all, from a bird's-eye view:
Knowledge requires some sort of psychological connection to the justified, true proposition -- something broadly like a belief condition; but it doesn't seem to require full-on act-it-and-live-it-and-breathe-it belief. However reasonable it might be to think the Earth goes round the sun, that fact has to register with me cognitively in some way if I am to qualify as knowing it; but the fact needn't play the full functional role of belief as envisioned in behaviorally rich accounts of belief like my own.

But how exactly should we conceptualize this somewhat weak but broadly beliefish psychological-connectedness condition? At this point, that's wide open.
Blake and I suggest that one must have the capacity to act on the stored information that P; Rose and Schaffer seem to suggest that what's crucial is that the information be "available to the mind"; Buckwalter and colleagues suggest that one must believe, but only in some "thin" sense of belief; Murray and colleagues suggest that one needs to be disposed to "assent" to the content. None of these approaches is well specified (and I've simplified them somewhat; apologies). Figuring out what's going on with the B condition thus seems like a potentially fruitful task, one that brings together core issues in epistemology and philosophy of mind.
-------------------------------------------------
[note 1] Yes, I use the word "intuition". Herman Cappelen has me worried about the term. But I stand firm!