I know where my car is parked. It's in the student lot on the other side of the freeway, Lot 30. How confident am I that my car is parked there? Well, bracketing radically skeptical doubts, I'd say about 99.9% confident. I seem to have a specific memory of parking this morning, but maybe that specific memory is wrong; or maybe the car has been stolen or towed or borrowed by my wife due to some weird emergency. Maybe about once in every three years of parking, something like that will happen. Let's assume (from a god's-eye perspective) that no such thing has happened. I know, but I'm not 100% confident.
Justified degree of confidence doesn't align neatly with the presence or absence of knowledge, at least if we assume that it's true that I know where my car is parked (with 99.9% confidence) but false that I know that my lottery ticket will lose (despite 99.9999% confidence it will lose). (For puzzles about such cases, see Hawthorne 2004 and subsequent discussion.) My question for this post is, how far can this go? In particular, can I know something about which I'm less than 50% confident?
"I know that my car is parked in Lot 30; I'm 99.9% confident it's there." -- although that might sound a little jarring to some ears (if I'm only 99.9% confident, maybe I don't really know?), it sounds fine to me, perhaps partly because I've soaked so long in fallibilist epistemology. "I know that my car is parked in Lot 30; I'm 80% confident it's there." -- this sounds a bit odder, though perhaps not intolerably peculiar. Maybe "I'm pretty sure" would be better than "I know"? But "I know that my car is parked in Lot 30; I'm 40% confident it's there." -- that just sounds like a bizarre mistake.
On the other hand, Blake Myers-Schulz and I have argued that we can know things that we don't believe (or about which we are in an indeterminate state between believing and failing to believe). Maybe some of our cases constitute knowledge of some proposition simultaneously with < 50% confidence in that proposition?
I see at least three types of cases that might fit: self-deception cases, temporary doubt cases, and mistaken dogma cases.
Self-deception. Gernot knows that 250 pounds is an unhealthy weight for him. He's unhappy about his weight; he starts half-hearted programs to lose weight; he is disposed to agree when the doctor tells him that he's too heavy. He has seen and regretted the effects of excessive weight on his health. Nonetheless he is disposed, in most circumstances, to say to himself that he's approximately on the fence about whether 250 pounds is too heavy, that he's 60% confident that 250 is a healthy weight for him and 40% confident he's too heavy.
Temporary doubt. Kate studied hard for her test. She knows that Queen Elizabeth died in 1603, and that's what she writes on her exam. But in the moment of writing, due to anxiety, she feels like she's only guessing, and she thinks it's probably false that Elizabeth died in 1603. 1603 is just her best guess -- a guess about which she feels only 40% confident (more confident than about any other year).
Mistaken dogma. Kaipeng knows (as do we all) that death is bad. But he has read some Stoic works arguing that death is not bad. He feels somewhat convinced by the Stoic arguments. He'd (right now, if asked) sincerely say that he has only a 40% credence that death is bad; and yet he'd (right now, if transported) tremble on the battlefield, regret a friend's death, etc. Alternatively: Karen was raised a religious geocentrist. She takes an astronomy class in college and learns that the Earth goes around the sun, answering correctly (and in detail) when tested about the material. She now knows that the Earth goes around the sun, though she feels only 40% confident that it does and retains 60% confidence in her religious geocentrism.
The examples -- mostly adapted from Schwitzgebel 2010, Myers-Schulz and Schwitzgebel 2013, and Murray, Sytsma, and Livengood 2013 -- require fleshing out and perhaps also a bit of theory to be convincing. I offer a variety because I suspect different examples will resonate with different readers. I aim only for an existence claim: As long as there is a way of fleshing out one of these examples so that the subject knows a proposition toward which she has only 40% confidence, I'll consider it success.
As I just said, it might help to have a bit of theory here. So consider this model of knowledge and confidence:
You know some proposition P if you have it -- metaphorically! -- stored in your memory and available for retrieval in such a way that we can rightly hold you responsible for acting or not acting on account of it (and P is true, justified, etc.).
You're confident about some proposition P just in case you'd wager on it, and endorse it, and have a certain feeling of confidence in doing so. (If the wagering, expressing, and feeling come apart, it's a non-canonical, in-between case.)
There will be cases where a known proposition -- because it is unpleasant, or momentarily doubted, or in conflict with something else one wants to endorse -- does not effectively guide how you would wager or govern how you feel. But we can accuse you. We can say, "You know that! Come on!"
So why won't you say "I know that P but I'm only 40% confident in P"? Because such utterances, as explicit endorsements, reflect one's feelings of confidence -- exactly what comes apart from knowledge in these types of cases.
12 comments:
First of all I think it's useful to be explicit about what we're doing: trying to elucidate linguistic intuitions. That means it's a falsifiable empirical exercise (we're drawing on empirical experience of our own and others' language use), and there is no guarantee that the results will be coherent.
I often work on the assumption that natural language word meanings take the form of exemplars, without absolute boundaries. We can model them as bundles of features, but a single feature will not generally be definitive. So for "know" the exemplary meaning might be knowing a fact, like London is the capital of England. (Could be modeled as a justified true belief or similar.) Anything that is reasonably similar to this could reasonably be linguistically intuited to be "knowing".
(There's a potential complication: there may be two or more exemplars for a given word. Know might have exemplar 1: knowing a fact; exemplar 2: knowing a person. And any given use of "know" might approach one or both of these exemplars to differing degrees.)
So it is intuitively OK to say that you know Quitos is the capital of Ecuador, because it only differs in being untrue. On the other hand, because of this obvious difference, it is equally OK to say the opposite: that you don't "know" this "fact", because it's not true.
On knowing but not believing: in the second person this is very common. When someone makes an outrageous claim, we say, "Oh, come on, you know that's not true." And sometimes the person we're talking to reflects for a moment and says, "Yes, you're right." That is, they concede that though they just now believed X, they in fact all along knew not-X.
But that's probably a failure of mindfulness. What you're getting at is cases in which a person is fully mindful of both their knowledge of fact X and their low confidence in X. In those cases, I think your examples are good. It would be intuitively OK to say the person knows X, because the knowing is fairly close to an exemplary form of knowing. However, it would also be OK to say that they don't know X, because of the obvious contradiction.
I don't think you could construct an example where a person thinks X is probably false AND it would be OK to say the person knows X AND it would not be OK to say that she doesn't know X.
I should really read more - I hadn't looked properly at your last four paragraphs, and I think I've just repeated much of what you said. Sorry!
Hi Eric,
I'm waiting for Scott Bakker to turn up saying something, something using our basic heuristics outside our native problem ecologies something, something dark side!
I think the geocentrism example is unfair - it assumes we're in possession of absolutely true knowledge in saying geocentrism is incorrect.
Reading it from the context of us not definitely knowing, you might go from reading it as mistaken dogma to being instead fairly open-minded? It probably describes where we all are at best, in terms of theory, really. At best we look like we're entering a mistaken dogma.
I'm not sure about the weight example - he seems more in direct contact with evidence. Evidence of his own choosing. On the other hand it's similar to geocentrism - what if his gene line is good for this weight?
Temporary doubt sounds more like erratic certitude - amongst all the years, she's picked the one she's most certain of! Not the one she doubts the most.
Another angle is how they can all seem the same sort of 40% belief, though, even though they come from quite different origins. Perhaps it says something about self-reflection - how we sense the belief, but the origins of it can elude us? Even as the beliefs can all feel the same 40%.
For whatever it's worth, anyway. Regardless, I don't know what I've been soaking in, but 40% doesn't sound like a bizarre mistake to me? I presume it'd be someone who's gone through a lot of crazy events that they've had little time to absorb or could not absorb, leaving them with a fragmented grasp of what is where. Or, on a broader scale, a person going through a regular life.
chinaphil: Thanks for the comment! I agree with most of that, though I'd slightly tweak the metaphilosophy: Given the flexibility/indeterminacy of ordinary language, we can resolve in favor of one approach rather than another on pragmatic grounds.
Callan: Yes, I agree they all have pretty different structures and backstories -- I'm going for a shotgun approach here and just hoping one rings true to the reader. On your last point, would it seem a bizarre mistake to you to say "I know P but I'm 60% confident P is false"?
Can we finally agree that "knowledge" is a rough and ready colloquial term, and that it means different things to different people in different contexts, and that, properly speaking, there really is no such thing as philosophy of knowledge? Unless, of course, you are an internalist, and you consider it a subcategory of philosophy of mind . . .
-John Gregg
http://www.jrg3.net/mind/
Eric,
I get a twinge. It doesn't seem a state to declare - I feel it's more a state of indecision. To declare it as somehow fixed and going to stay that way into the indefinite future, I guess that does sound almost like a mental condition/bizarre mistake.
But I tend to find myself reading it with charity, that the person is transitioning to a new commitment, rather than remaining still with such an arrangement.
Dog just dragged this in: http://pss.sagepub.com/content/early/2014/11/10/0956797614553944.full.pdf
You could say FOKs dissociate even when there's nothing to dissociate from!
Callan -- Yes, I think transitioning cases are sometimes like this. You know, you have the insight, but you don't quite trust that you really do have it, that you won't soon see what's wrong with it.
Scott: Yes!
So what are you shooting for, Eric? A name for it? A mutually recognised feel for it so as to be able to reference it in conversation? Just asking - it'd seem a good idea to get some recognition of it in regular conversation. Try bugging that Kaplan guy to put it in Big Bang, so as to create a massively recognised lexicon for it! Heh... OK, cheeky suggestion!
"that you won't soon see what's wrong with it."
I guess it's a procedure question - do you have to just see what's wrong with it OR keep it? Does that make it a binary between still being kept or completely thrown out the window?
Can't it be kept in a mothball status - potentially it is true, and of course one looks silly for having mothballed it if that turns out to be the case, but it garners no more commitments. Yet at the same time it's part of you, really, so it's kept in storage like one might keep artworks. Or is that a bit of a hoarding thing? Certainly one might need some procedure so that only a limited amount is stored like this?
I personally do have a thing against ruthless theory culling. I don't think it's us, really, as a species.
I was wondering whether a difference between "I know where my car is" and "I know where my car is parked" might have an impact on intuitions about credence weighing in these cases. If you ask me "Do you know where your car is?", I would be inclined to say "Yes, I'm pretty certain I know where it is." If you ask me "Do you know where your car is parked?", I feel an urge to reply with stronger language, something like "Yes, I'm quite certain I know where it's parked." I don't really know why this is (maybe I'm idiosyncratic), but I think it has something to do with the salience of the parking. It ties things more closely to my own actions, about which I would feel very confident in my knowledge.

Whatever has happened since I parked my car, I feel almost as certain as I can be that I know where I parked it (even if I don't know where it's parked right now; e.g., if I later learn that it is in an impound lot somewhere, I still know where I parked it). It further seems like if I know where I parked my car, I should be able to answer where my car is parked. In normal circumstances, it's a perfectly reasonable inference, something it would be reasonable to conclude on the basis of something I have very high confidence in. This connection becomes less salient when my knowledge of my parking is removed from the context, when I'm asked the more general question "Do you know where your car is?" Perhaps this makes a difference to the intuitions and explains why it might seem strange to assign anything but high credence to "I know where my car is parked."
Callan: I wouldn't advocate any one answer to those questions, since I think there are a diversity of types of cases. So, it can be mothballed, or not, etc.
Anon: I share that intuition about the pair of sentences you mention. Perhaps your diagnosis is correct. Another thought is that the "parked" question seems to build in the presupposition that the car is still parked and so invites (or even provides evidence for, depending on whether the person asking might have some relevant knowledge) an answer that shares that assumption.
Relatedly, the question builds in the assumption that I do in fact have a car; as the alternatives get more remote we might not want to build in that presupposition either. I'm not sure there's such a thing as a perfectly presuppositionless question, though, so that might not make sense as an ideal.
Eric,
I thought this would be about figuring out how to narrow down the cases - so it can be 40% and not mothballed? As I said before, I think 40% is a transitional state. If it's not headed toward mothballing (or not headed towards recommitment), then it wouldn't be in transition.
Maybe that's me being impatient/being from an impatient species and expecting it to transition faster. But if it's not transitioning at all - yeah, I'd call that a problem.