Wednesday, August 11, 2010

Can You Believe That You Are Dead? (by guest blogger Lisa Bortolotti)

Irrationality is considered a defining feature of delusions in many influential definitions. But in what sense are delusions irrational?

Delusions can be procedurally irrational if they are badly integrated into one’s system of beliefs. They can also be inconsistent with other beliefs one has. Lucy, who has Cotard delusion, believes at the same time that dead people are motionless and speechless, that she can move and talk, and that she is dead. Here there is an apparent inconsistency that is “tolerated”, that is, it doesn’t lead her to revise or reject any of the beliefs. Typical delusions are epistemically irrational, that is, they are badly supported by the evidence available to the subject, and they are not adequately revised in the light of new evidence. John, who suffers from anosognosia, doesn’t acknowledge that one of his legs was amputated and explains the fact that he can no longer climb stairs by a violent attack of arthritis.

These examples are striking. For many philosophers, the irrationality of delusions is a reason to deny that delusions are beliefs. Lucy can’t really believe that she’s dead; maybe what she means is that she feels empty and detached, as if she were dead. John can’t really believe that he has both legs, because there is no problem with his visual perception. Maybe he wishes he still had both legs. This way of discounting delusions as metaphorical talk or wishful thinking is appealing. It is based on the view that there is a rationality constraint on the ascription of beliefs. We wouldn’t be charitable interpreters if we ascribed to Lucy the belief that she’s dead and to John the belief that he has arthritis.

I want to resist the idea that people with delusions don’t mean what they say. First, people often act on their delusions and base even important decisions in their lives on the almost unshakeable conviction that the content of their delusions is true. We couldn’t make sense of their behaviour at all if we couldn’t ascribe to them delusional beliefs. Second, the type of irrationality delusions exhibit is not qualitatively different from the irrationality of ordinary beliefs. Delusions may be irrational to a greater extent than ordinary beliefs, and the examples we considered were certainly puzzling, but procedural and epistemic irrationality can be found closer to home.

Students believe that wearing clothes of a certain colour will bring them good luck during the exam, and nurses believe that more accidents occur on nights of a full moon. These beliefs are certainly inconsistent with other beliefs well-educated people have about what counts as the probable cause of an event. Prejudiced beliefs about black people being more violent are groundless generalisations that can be just as insulated from evidence and as resistant to change as clinical delusions.

Maybe what makes delusions so puzzling is only that they are statistically less common (not necessarily more irrational) than other procedurally and epistemically irrational beliefs.

12 comments:

Eric Schwitzgebel said...

Lisa, you write: "We couldn’t make sense of their behaviour at all if we couldn’t ascribe to them delusional beliefs." I'm not sure that is true. I can see how one might think belief necessary to explain behavior if one holds a view according to which intentional action has to arise from the right sort of belief-desire pair, but I don't think we need to accept such a theory of intentional action.

I'm inclined to think that there are many "in-between" cases of belief -- cases in which it seems not quite right to say the person believes and not quite right to say the person fails to believe. The person is mixed up, lacking a harmonious set of dispositions on the issue. Maybe some cases of this are (a.) the person who sincerely, explicitly says that all the races are intellectually equal but who is pervasively racist in her day-to-day behavior, (b.) the person who would recall or recognize the truth of a proposition in a few contexts but not in most contexts, (c.) certain types of ambivalent self-deception.

If such in-between cases can exist, why not think that delusions might be among them?

Anonymous said...

What I meant was to challenge the view that delusions are always or necessarily metaphorical 'as if' talk or wishful thinking. If they were always or necessarily meant to be about a non-actual reality, and never genuinely believed, then their pervasive effects on apparently intentional action would be deeply mysterious.

But of course you are right that there can be in-between cases and your examples are convincing. What I would like to avoid, though, is the policy of considering all instances of less-than-ideally-rational beliefs as an in-between case.

jmark said...

Lisa,

I wonder if there is not another category of cases that fall outside the categories that Eric describes, but that are not clearly delusional either. Andrew Solomon, in his book on depression, describes being terrified of showering and yet knowing that there is nothing particularly scary about showers. Some persons with OCD describe recognizing that it is irrational to think they need to wash their hands again, yet seem to believe that they must wash them again nonetheless. In these kinds of cases the agent not only has conflicting beliefs, but explicitly acknowledges that the beliefs conflict and is often distraught by the epistemic conflict.

I wonder if you would find such cases importantly different from the kinds of cases you describe, or if, alternatively, you think they should be subsumed under one of Eric's categories of "in between" cases?

Anonymous said...

Hi! The case of obsessive thoughts is incredibly interesting - I do compare it to the case of delusions in chapter 1 of the Delusions book.
The main difference seems to be as follows: the person with obsessive thoughts does not necessarily believe that the fear of contamination is justified and that it warrants the repetitive hand-washing, whereas the person with a delusion of, say, persecution, may have insight into the implausibility of her belief, but is concerned with a danger that is real for her, and thus engages in safety behaviours that are (from her point of view) justified by her belief.
This is the textbook version of the difference, as far as I understand it. But in many cases obsessive thoughts and delusions are comorbid, so the distinction may not be so clear-cut.
I wouldn't consider obsessive thoughts as beliefs (maybe they belong to Eric's in-between category) but MOST delusions seem to be beliefs - their folk psychological role is that of beliefs. Not ALL the things we call delusions, though. Some 'delusional' thoughts that are reported but not endorsed at all, either behaviourally or with reasons, fail to play the folk psychological role of beliefs and may fall into the in-between category.

Matthew said...

Re: obsessions. You're right, Lisa. Classically they are referred to as 'ego-dystonic': the patient will recognise them as products of their own mind (cf. thought insertion), yet distressing, and will realize that they are irrational and/or silly. Delusions, in contrast, are ego-syntonic, seen as integral to the patient's identity and as true facts about the world/themselves.

CP said...

Lisa,

The LU case in your ‘Cotard’ link is fascinating. I think it highlights differences between confusion and irrationality, though. How can we fairly and consistently attribute beliefs to a person who was deeply confused? The sorts of behaviours that might lead to belief attributions (e.g. assenting to ’I am dead’) were in evidence, hence perhaps Eric may wish to apply an ’in-between’ label to LU, but her attitudes were in flux until they were eventually resolved (if anything, being precisely 40% confident that one is dead is even weirder than being 100% confident).

Admittedly, LU’s confusion could have been accompanied by fleeting but genuine beliefs. She believed she may have been in an afterlife, for instance. (I take the fact that her understanding of what it is to be dead sometimes changed to include being corpselike and immobile to be evidence of her confusion.) This belief probably wasn’t irrational, though - if one believes in an afterlife and doesn’t have specific beliefs as to what it’s like, is it really so irrational to believe one is in it right now? This isn’t to say that it was the beliefs that were confusing LU - the paper you link to strongly suggests that it was her new emotional detachment that she was trying to understand.

I contrast this state of confusion with, for instance, the attitude of a student who understands what it means to say that simply wearing red will increase her chances of success in exams, consistently and sincerely holds that this is the case and consistently does wear red to exams.

CP said...

On another tack, there are some delusion-like states that are not very belief-like. For instance, it is common to identify a hallucination as a delusion (or is ‘delusion’ being used technically, so that this is not allowed?). Perhaps if I hallucinate a tree, you might want to call my ensuing belief that there is a tree there my ‘delusion’. But is that belief really irrational? There are few better reasons for believing there to be a tree than seeing a tree. And I might be reluctant to abandon the belief in the face of further evidence, but anyone would be, and quite rightly. I think you either have to take the original (non-belief) experience of a tree to be the delusion, or claim that my belief in the tree is the delusion, despite its relative non-irrationality.

Anonymous said...

Dear CP, 'delusion' means something different from 'hallucination'. A hallucination can be at the origin of a delusion, when you hear voices and then come to believe that it's the devil tempting you. But hallucinations don't necessarily lead to delusions. For a definition of delusions, see my previous post or my SEP entry on delusions.

Concerning McKay and Cipolotti's 2007 case of LU, you are right that the young woman appears to be deeply confused, and that her behaviour is much more puzzling than that of superstitious students, even if some inconsistency can be detected in both instances of behaviour.

That said, I think it can't be ruled out a priori that LU believes that she is dead just because the belief is irrational and inconsistent with her other beliefs. I agree with you that the job of the interpreter in this case is a tricky one... Other cases of Cotard delusion may be easier to adjudicate, as the person doesn't commit to openly inconsistent beliefs and endorses the delusion with full conviction for a longer period of time.

Kapitano said...

I think we need to make a distinction between the cause of a belief, the belief itself, the cognitive processes which led to the formation of that belief, and the rationalisations used to justify it afterwards.

In the case of 'dead' Lucy, the cause is (if I've read articles on Cotard delusion correctly) a physical one. Some part of her brain concerned with her ability to feel emotions about her own body isn't functioning properly.

When she sees her own arm, or her own face in the mirror, she doesn't feel an emotional connection between her 'self' and the visible flesh. Trying to make sense of this loss, she rationalises that her body is a machine, or it belongs to someone else...or that it used to be hers but has since died.

In the case of the amputee John, it may be that the mechanism by which the brain updates its own body map hasn't worked, so the map which tells him he has two arms also tells him he still has two legs.

He sees a gap where one leg should be, but avoids thinking about it, much as an old person who wants to believe they still look young avoids thinking about what they see in the mirror, though their vision is not impaired.

When he stumbles on the stairs, his explanation of arthritis is a convenient ad-hoc rationalisation.

Anonymous said...

Dear Kapitano, of course we can distinguish different types of beliefs on the basis of how they are formed - perception, inference, testimony and so on. I think all processes giving rise to mental states are 'physical' ones, but in the case of some delusions (Cotard and Capgras) it is even more obvious that the experience from which the report originates has a physical cause, a type of brain damage.

Your observation about certain delusional reports playing the role of post-hoc rationalisations is very interesting. If you are relatively new to delusions or you haven't read the following papers yet, I recommend the chapters by Martin and Anne Davies and by John Campbell in 'Psychiatry as Cognitive Neuroscience' (OUP 2009) and the papers by Neil Levy and Al Mele in 'Delusions and Self-Deception' (Psychology Press 2008).

Anonymous said...

"Maybe what makes delusions so puzzling is only that they are statistically less common (not necessarily more irrational) than other procedurally and epistemically irrational beliefs." Interesting point and I certainly agree with the point on "not necessarily more irrational." But then there are several irrational beliefs that are not common and are also not considered delusional, but simply unique or eccentric. Perhaps a way of defining them is just that the social group evaluating the belief finds it to be delusional.

For example, nowadays grandiose delusional ideas might be considered, together with other factors, the hypomanic side of bipolar disorder. A hundred years ago they would only have been considered a mood swing. Have people changed? No, just their social agreement on what a delusion is.

If at some point pharma can find a drug to cure boredom, a bored view of the world would become delusional; after all, all of us would then see the world as an ever-exciting place. Would boredom be considered a delusion, given that the social agreement has changed? Probably yes.

Emilia said...

Kapitano, I think you're right about the need to distinguish between the cause, the belief and the rationalization. It is also true that in some delusions (such as Cotard) there is obviously a physical, more specifically neurological, cause. But some other types of delusions don't seem to have such a clear neurological basis (of course, I do not doubt that all mental activity has a physical basis in the brain).

But I'm not sure it's true that they are all "ad-hoc rationalizations", as you say. The question that immediately arises for me is why they all arrive at the same rationalization (e.g. that they are dead, or that their relative has been kidnapped and replaced by an impostor). Why such similar content? I think it would be interesting to explore this aspect.