Tuesday, August 24, 2010

Delusions and Action (by guest blogger Lisa Bortolotti)

As I suggested in my previous post, we sometimes have the impression that people do not fully endorse their delusions. In some circumstances, they don’t seem to act in a way that is consistent with genuinely believing the content of their delusions. For instance, a person with persecutory delusions may accuse the nurses in the hospital of wanting to poison him, and yet happily eat the food he’s given; a person with Capgras delusion may claim that his wife has been replaced by an impostor but do nothing to look for his wife or make life difficult for the alleged “impostor”.

Some philosophers, such as Shaun Gallagher, Keith Frankish and Greg Currie, have argued on the basis of this phenomenon (which is sometimes called “double bookkeeping”) that delusions are not beliefs. They assume that action guidance is a core feature of beliefs and maintain that, if delusions are not action guiding, then they are not beliefs. Although I have sympathies with the view that action guidance is an important aspect of many of our beliefs, I find the argument against the belief status of delusions a bit too quick.

First, as psychiatrists know all too well, delusions lead people to act. People who believe that they are dead (Cotard delusion) may become akinetic, and may stop eating and washing as a result. People who suffer from delusions of guilt, and believe they should be punished for something evil they have done, engage in self-mutilation. People who falsely believe they are in danger (persecutory delusions) avoid the alleged source of danger and adopt so-called “safety behaviours”. The list could go on. In general, it isn’t true that delusions are inert.

Second, when delusions don’t cause people to act, a plausible explanation is that the motivation to act is not acquired or not sustained. Independent evidence suggests that people with schizophrenia have meta-representational deficits, flattened affect and emotional disturbances, which can adversely affect motivation. Moreover, as Matthew Broome argues, the physical environment surrounding people with the delusion doesn’t support the action that would ensue from believing the content of the delusion. The content of one’s delusion may be so bizarre (e.g., “There’s a nuclear reactor in my belly”) that no appropriate action presents itself. The social environment might be equally unsupportive. One may stop talking about one’s delusion and acting on it to avoid incredulity or abuse from others.

My view that delusions are continuous with ordinary beliefs is not challenged by these considerations: we all, perhaps to a lesser extent than people with delusions, act in ways that are inconsistent with some of the beliefs we report - when we’re hypocritical - and we may fail to act on some of our beliefs for lack of motivation - when we’re weak-willed.


Keith Frankish said...

Hi Lisa. I think you're right that delusions are continuous with ordinary beliefs, but I'd deny that all delusions are beliefs.

As you know, I think that delusions are acceptances, where an acceptance is a commitment to treating a proposition as true, at least in some contexts. Now, some acceptances are beliefs -- specifically (I'd argue), those that are unrestricted as to context. But others are not; a lawyer can accept that her client is innocent for the purposes of defending him without believing that he is innocent.

And I think the same goes for delusions: some are beliefs (albeit odd and deviant ones), but some are too localized, unintegrated, and inert to deserve the title. Moreover, I suspect there's a large class whose status is indeterminate. In fact, I'd argue, nothing much hangs on whether a delusion gets the folk title of 'belief'; the important psychological type here is acceptance.

So I don’t think there's a big difference between us here. And I think you’re right about the role of motivation. By default, we are disposed to act upon our acceptances -- to honour the commitments we've made. But delusions often conflict with the patient's other acceptances and goals, and dictate actions which they find undesirable, so that their motivation to act on them is severely diminished or restricted. In fact, I think this points to a central conflict in delusion: the patient is strongly motivated to make a commitment to the delusion content (perhaps for emotional reasons) but also strongly motivated to refrain from executing that commitment consistently.

Anonymous said...

Hi Keith.

As you know, I like your position on this issue because it is very nuanced and very plausible (sorry if I couldn't do justice to it in the post due to the limited word count).

I agree that not all the reports we call delusional are reports of beliefs, but when in the literature people focus on the alleged 'behavioural inertness' of delusions as a category, then they often forget how motivation works in 'normal' beliefs. That was the (very minimal) point of the post.

When people don't act on their reports, sometimes it is because they do not truly endorse them (and then we should suspect that the reports are not reports of beliefs). At other times it is because they fail to acquire or sustain the motivation to act.

George Graham said...

Hi Lisa and Others,

I am always on the lookout for what people are saying about delusions, and came across this wonderful exchange about delusions on E's excellent blog site.

Many, many years ago, I worked in the psychiatric ward of a Harvard teaching hospital, and helped to care for numerous persons with delusions. Delusions of all sorts and in a variety of different disorders. I was a mere aide or orderly, but had a philosophy MA in my conceptual quiver, and so was not without some intellectual equipment with which to try to parse just what is present in a person with such a diagnosis.

In some persons/cases, beliefs of various sorts were central to being (diagnosed as) deluded, in a normal sense of 'belief' (appositely connected with action, feeling, and so on). But other cases were much more 'hodge-podgy', if I might use such a term. A patient, for example, may have obsessive thoughts that they (mis?)report as beliefs. Or they may constantly make bizarre claims or assertions on which they never act. One patient, a priest, kept referring to me as the devil and asserting that he should stay away from me, but he seemed to like to talk to me. What was I to make of this? Was he a victim also of weakness of epistemic will, believing that I was the devil but unable to resist communicating with me? Or was his attitude no belief at all, just something that he claimed or asserted, perhaps for its effect on me?

What I ended up making of these experiences on the ward, though only years and years later, is a view of delusions that, together with G. Lynn Stephens, I have come to call the 'delusional stance' conception of delusions. Namely: a view that recognizes that beliefs may be central to a delusion -- but also that they may not be, at least if psychiatric diagnostic practice is to be honored.

What is central to delusion qua delusion is a certain complex failure, on the part of a subject of delusion, in epistemic self-management. A failure (among other things) to appreciate the harmfulness of certain of their attitudes, thoughts, and feelings, and to appreciate just how distant such states are from being properly warranted. But yet clinging to them. Being resistant to their expulsion.

There is no doubt that the very idea of delusion is in need of regimentation. But I, for one, do not believe that the regiment ought to line up behind General Belief. The clinical infantry is too various for that.

Anonymous said...

Dear George

You are right that not all the things we call delusions are best described as beliefs, and that sometimes we can misinterpret obsessive thoughts as delusions (I made this point in reply to a comment on a previous post). I'm also open to the idea that there may be a delusional 'stance'.

Delusional mood, for instance, is not enough for beliefs, but it is related to, and can precede, delusions. This is when people feel a heightened sense of significance in their own experience without being able to identify anything specific as the object of their feeling.

What I'm worried about, and for reasons that have little to do with psychiatry, is the tendency in philosophy to idealize beliefs. Some of the arguments against the view that delusions are (partly, or sometimes) beliefs derive from the conviction that beliefs are largely true and rational. By showing that the gap between delusions and 'normal' beliefs is much less vast than it might seem on the surface, I want to resist the idealization of beliefs and promote a more psychologically realistic view of our mental life.

As an added bonus, a realistic perspective on our mental life is also beneficial to our understanding of psychiatric disorders as continuous with normal functioning.

George Graham said...


I am certainly with you on not wishing to idealize our mental life and in appreciating that the boundary between normality, or the normatively appropriate, and abnormality, or the normatively inappropriate, when it comes to mentality is not marked with sharp divides or precisely determinate discontinuities. This is especially the case with respect to the rationality/irrationality of propositional attitudes and with respect to beliefs in particular.

In a paper about to appear in Philosophy, Psychiatry, and Psychology, Marga Reimer (who, too, likes to think of delusions as beliefs) asks the following fascinating question: What is the difference between a delusion (if we may assume that delusions are a type of belief) and the sort of (equally bizarre) belief or opinion held by a philosopher that, say, she is a fiction or that the external world does not exist? One price of conceiving of delusions as a species of belief is that questions like that (and there are others of their kind, as you know) must come to the forefront of defense of the conception. The price is not necessarily steep, but it is one real cost of the conception.

One may attempt to pay it either "upstream" in an account of the etiology of delusions or "downstream" in an account of their effect on a person's psychological economy (and in comparison and contrast with various 'bizarre philosophical' attitudes); or perhaps paid in two related installments, one up, one down. But it must be paid, of course.

No need to respond to this in the blog. Best wishes with this fascinating work.


Anonymous said...

Hi George - thank you again for your comments.

Those who think that (most) delusions are beliefs (like myself) are not committed to the view that being a belief is all there is to delusions. They can have an account of what type of beliefs delusions are, and of the ways in which delusions can be differentiated from other beliefs.

There are criteria we can use to distinguish a delusional from an extravagant but not delusional belief, and these may have to do, as you anticipated, with aetiology, with effects on functioning, or with epistemic norms for beliefs.

Spelling out these criteria is not a cost that must be paid only by the supporters of the doxastic account; it's an integral part of the job of distinguishing delusions from non-delusions.

Even those who think that delusions are other than beliefs - acts of imagination, alternative realities, stances towards the experienced world, etc. - will have to account for what makes an act of imagination, an alternative reality or a stance delusional.

Not all accounts are equally satisfactory, and they should be assessed on the basis of elegance, economy, empirical adequacy, coherence with other philosophical and psychological theories, vindication of the phenomenology of delusions, and capacity to capture the complexities of clinical practice.

The doxastic account of delusions is not worse off than its competitors with respect to the desiderata above, but whether we go for a doxastic or an anti-doxastic view, no account of something as messy as delusions will ever be as neat as a one-line definition.

Michael Caton said...

Interesting post. What's really interesting are the organic disorders you name that have delusional belief components. Lateralized neglect is the one that most fascinates me. Why deny that the limb is connected, or deny that there can be a part of space you're not aware of? If someone could reproducibly show me a limb that looked like my own but wasn't connected to me in any way I could see, I would be disturbed, but I would eventually be forced to conclude that yes, this is my limb, and yes, I must be suffering some kind of perceptual problem. But often people seem uninterested in asking these kinds of questions, not just with delusions that have a clear organic etiology, but with delusions in general. So we might ask to what extent the existence of delusional beliefs is at least partly a disorder of belief integration. That is, what is different about people with strong delusional beliefs that allows them to compartmentalize? Such beliefs don't survive contact with other beliefs in most people's minds.

Anonymous said...

Hi Michael

Failure of belief integration seems to be a core feature of delusions, especially of monothematic delusions. I think we are all pretty bad at integrating beliefs into coherent systems (there's some interesting psychological literature on this), but people with delusions display failures of integration that are more pronounced than those in the rest of the population.

I don't have a clear answer to the question why that's the case, but if we adopt the two-factor theory of delusions, then we can characterise delusions as (partly) disorders of integration.

According to the theory, a neuropsychological deficit giving rise to a delusional experience is not sufficient for delusion formation. The other factor is an impairment in the capacity to assess hypotheses.

Suppose you see a light flashing in the night and hypothesise that an alien spaceship has just landed in your front garden. You quickly dismiss that hypothesis as implausible, since it does not fit with the other things you believe and is less likely to be true than alternative hypotheses.

People with delusions may not be very good at discounting implausible hypotheses, and may end up endorsing such hypotheses as beliefs. One way of describing this is to say that they are less conservative than the rest of the population, and more likely to accept an explanation of their experience that conflicts with some of their previous beliefs (they "jump to conclusions").
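For readers who like a concrete illustration, the weakened-second-factor idea can be sketched in rough Bayesian terms. This is only a toy model of my own, not anything from the two-factor literature: two hypotheses explain the same flashing-light experience about equally well, and what differs between a "conservative" and a "less conservative" believer is how heavily prior beliefs discount the bizarre hypothesis. All the numbers are made up for illustration.

```python
def posterior(prior: float, likelihood: float) -> float:
    """Unnormalized posterior: prior plausibility of a hypothesis times
    how well it explains the experience."""
    return prior * likelihood

# Both hypotheses account for the flashing light reasonably well.
likelihood_car = 0.7      # "a car drove past with its lights on"
likelihood_alien = 0.9    # "an alien spaceship has just landed"

# A conservative believer assigns a vanishingly small prior to aliens,
# so the mundane hypothesis wins despite fitting slightly less well.
conservative = {
    "car": posterior(0.2, likelihood_car),
    "alien": posterior(1e-9, likelihood_alien),
}

# With a weakened second factor, the bizarre hypothesis is not discounted
# nearly enough, and it ends up being endorsed.
less_conservative = {
    "car": posterior(0.2, likelihood_car),
    "alien": posterior(0.3, likelihood_alien),
}

def best(hypotheses: dict) -> str:
    """Return the hypothesis with the highest unnormalized posterior."""
    return max(hypotheses, key=hypotheses.get)

print(best(conservative))       # car
print(best(less_conservative))  # alien
```

The sketch makes vivid that the deficit need not lie in how well the delusional hypothesis explains the experience (it may explain it very well); it lies in the failure to weigh that explanation against prior beliefs.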

I know this is not entirely satisfactory, because, to my knowledge, we don't have a grip on what causes this impairment in hypothesis evaluation, but it's a first step towards the understanding of the aetiology of delusions.