Friday, January 10, 2014

Skeptical Fog vs. Real Fog

I am a passenger in a jumbo jet that is descending through turbulent night fog into New York City. I'm not usually nervous about flying, but the turbulence is getting to me. I know that the odds of dying in a jet crash with a major US airline are well below one in a million, but descent in difficult weather conditions is among the most dangerous parts of flight -- so maybe I should estimate my odds of death in the next few minutes as about one in a million or one in ten million? I can't say those are odds I'm entirely happy about.

But then I think: Maybe some radically skeptical scenario is true. Maybe, for example, I'm a short-term sim -- an artificial, computerized being in a small world, doomed soon to be shut down or deleted. I don't think that is at all likely, but I don't entirely rule it out. I have about a 1% credence that some radically skeptical scenario or other is true, and about a 0.1% credence, specifically, that I'm in a short-term sim. In a substantial portion of these radically skeptical scenarios, my life will be over soon. So my credence that my life will soon end for some skeptical-scenario-type reason is maybe about one in a thousand or one in ten thousand -- orders of magnitude higher than my credence that my life will soon end for an ordinary plane-crash-type reason.
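For concreteness, the arithmetic can be sketched as follows. (The conditional probability of an imminent end given a skeptical scenario is an illustrative stand-in for "a substantial portion"; the other numbers are the rough estimates above.)

```python
import math

# Credence arithmetic from the paragraph above (all numbers are rough estimates).
p_crash = 1e-6              # odds of dying in this descent: roughly 1e-6 to 1e-7
p_skeptical = 0.01          # credence that some radically skeptical scenario is true
p_end_if_skeptical = 0.1    # illustrative: portion of those scenarios in which life ends soon

# Credence that my life soon ends for skeptical-scenario-type reasons:
p_skeptical_death = p_skeptical * p_end_if_skeptical   # about one in a thousand

# How many orders of magnitude separate the two worries:
gap = math.log10(p_skeptical_death / p_crash)          # about 3
```

On these stipulated numbers, the skeptical worry outstrips the crash worry by about three orders of magnitude, which is the comparison at issue.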

Still, the plane-crash possibility worries me more than the skeptical possibility.

Does the fact that these skeptical reflections leave me emotionally cold show that I don't really, "deep down", have even a one-in-a-million credence in at least the imminent-death versions of the skeptical scenarios? Now, maybe I shouldn't worry about those scenarios even if I truly assign a non-trivial credence to them. After all, there's nothing I can do about them, no action I can reasonably take in light of them. I can't, for example, buy sim-insurance. But if that's why the scenarios leave me unmoved, the same is true of the descending plane. There's nothing I can do about the fog; I need to just sit tight. And as a general matter, helplessness doesn't eliminate anxiety.

Here my interest in radical skepticism intersects with another of my interests, the nature of belief. What would be involved in really believing that there is a non-trivial chance that one will soon die because some radically skeptical scenario is true? Does genuine belief require only saying these things to oneself, with apparent sincerity, and thinking that one accepts them? Or do they need to get into one's gut?

My view is that it's an in-between case. To believe, on my account, is to have a certain dispositional profile -- to be disposed to reason, and to act and react, both inwardly and outwardly, as ordinary people would expect someone with that belief to do, given their other related attitudes. So, for example, to believe that something carries a 1/10,000 risk of death is in part to be disposed sincerely to say it does and to draw conclusions from that fact (e.g., that it's riskier than something with a 1/1,000,000 risk of death); but it is also to have certain emotional reactions, to spontaneously draw upon it in one's everyday thinking, and to guide one's actions in light of it. I match the dispositional profile, to some extent, for believing there's a small but non-trivial chance I might soon die for skeptical-scenario-type reasons -- for example, I will sincerely say this when reflecting in my armchair -- but in other important ways I seem not to match the relevant dispositional profile.

It is not at all uncommon for people intellectually to accept certain propositions -- for example, that their marriage is one of the most valuable things in their lives, or that it's more important for their children to be happy than to get good grades, or that custodians deserve as much respect as professors -- while in their emotional reactions and spontaneous thinking, they do not very closely match the dispositional profile constitutive of believing such things. I have argued that this is one important way in which we can occupy the messy middle space between being accurately describable as believing something and being accurately describable as failing to believe it. My own low-but-not-negligible credence in radically skeptical scenarios is something like this, I suppose.


P.D. Magnus said...

I think that the model of belief which quantifies credence as probability just breaks down in sceptical cases, which I think is stronger than your claim that they are boundary cases. I just have no idea what it would mean, as a psychological matter, to assign 1% credence to being in a simulation.

Eric Schwitzgebel said...

PD: Interesting point. For example, a conventional wager scenario to determine credence might be tricky to work out. I'll wager $1 to earn $3 if it rains tomorrow and $0 if it doesn't rain; but would I wager $1 to earn $10,000 if this is a sim? How would I trust that the payout would occur? What would it even mean for such a payout to occur? The issues are non-obvious.

And yet I think we can still at least roughly gesture at skeptical-scenario credences in this way. It seems much more confident to say one-in-a-million that I'm dreaming than to say (more reasonably, I think) one-in-a-thousand. The degree of confidence is *comparable* to the degree of confidence that would be expressed by my regarding certain real-world bets as fair, on the assumption that no skeptical scenario is true.
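On one standard (and here merely illustrative) reading of "wager $1 to earn $3" as staking $1 for a $3 return on a win, the credence at which such a bet breaks even is just the stake divided by the return:

```python
def implied_credence(stake: float, total_return: float) -> float:
    """Credence at which the bet has zero expected value:
    p * total_return == stake, so p = stake / total_return."""
    return stake / total_return

rain_bet = implied_credence(1.0, 3.0)       # about 1/3: fair if P(rain) is one in three
sim_bet = implied_credence(1.0, 10_000.0)   # 1e-4: a long shot of the relevant magnitude
```

The point is only that the *magnitudes* are comparable; as the worry above notes, what it would mean to collect on the sim bet is another matter.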

Marco Devillers said...

Of course you don't. I've said it before: It's bloody impossible to walk around with a weighted belief system as a human being. So you're talking nonsense. Delightful intellectual nonsense, though.

Personally, I don't believe the world consists of things; I am not a Platonist. But I am forced to reason about the world as if it were true; I am a handicapped being.

Eric Schwitzgebel said...

I agree, Marco, that in some ways it's easier to reason from simple yes-no belief than from weighted belief. But we do also regularly and spontaneously make judgments based on the relative probabilities of events that are less likely than 50%, with some skill and flexibility. For some reason, though, this seems harder to do with philosophical theses than with, e.g., the chance that someone is lying to you or the odds in tactical competitive sports.

Callan S. said...

Well, you could still not get on planes in the future.

What if there was something you could do about your impending deletion? What if the answer is right in front of you and if you just knew what to do you could save yourself? But you're going to die!

There, does it start to feel genuinely troubling, at least a small amount?

Potential salvation is what adds the terror? ;)

Marco Devillers said...

I don't think it's easier; I think you're forced by nature to think in yes/no beliefs. Even a 'likely true' belief doesn't seem to have weights associated with it and seems, in essence, to be a yes/no decision.

Whatever. I am stuck on another personal train of rocking-chair thoughts I amuse myself with, so I guess my own beliefs are colored at the moment.