
Tuesday, September 14, 2010

Can We All Become Delusional with Hypnosis? by guest blogger Lisa Bortolotti

Recent studies on hypnosis have suggested that delusions can be temporarily created in healthy subjects (see work by Amanda Barnier and Rochelle Cox). When you are given a hypnotic suggestion that you will see a stranger when you look in the mirror, it is probable that your behaviour in the hypnotic session will strikingly resemble that of a patient with a delusion of mirrored self misidentification. Both the hypnotic subject and the delusional patient deny that they see themselves in the mirror and claim instead that they see a stranger who looks a bit like them. Their beliefs are resistant to challenges and often accompanied by complex rationalisations of their weird experience.

Why would we want to create delusions in healthy subjects? It’s difficult to study the phenomenon of delusions in the wild, especially the mechanisms responsible for their formation. Here are some reasons why we may need the controlled environment of the lab:

1. it is not always possible to investigate a clinical delusion in isolation from states of anxiety or depression that affect behaviour - comorbidity makes it harder to detect which behaviours are due to the delusion under investigation, and which are present for independent reasons;

2. ethical considerations significantly constrain the type of questioning that is appropriate with clinical patients because it is important to avoid causing distress to them, and to preserve trust and cooperation, which are beneficial for treatment;

3. for delusions that are rare, such as the delusion of mirrored self misidentification, it is difficult to find a sufficient number of clinical cases for a scientific study.

Evidence from the manifestation of hypnotically induced delusions has the potential to inform therapy for clinical delusions. Moreover, the use of hypnosis as a model for delusions can also inform theories of delusion formation, as analogies can be found in the underlying mechanisms. There are good reasons to expect that the hypnotic process results in neural patterns that are similar to those found in the clinical cases.

Given that during the hypnotic session healthy subjects engage in behaviour that is almost indistinguishable from that of clinical patients, reflecting on this promising research programme not only helps the science of delusions but also invites us to challenge the perceived gap between the normal and the abnormal.

[This is Lisa's last guest post. Thanks, Lisa!]

Sunday, September 05, 2010

Are People Responsible for Acting on Delusions? by guest blogger Lisa Bortolotti

Consider this case. Bill suffers from auditory hallucinations in which someone is constantly insulting him. He comes to believe that his neighbour is persecuting him in this way. Exasperated, Bill breaks into the neighbour’s flat and assaults him. Is Bill responsible for his action? Matthew Broome, Matteo Mameli and I have discussed a similar case in a recent paper. On the one hand, even if it had been true that the neighbour was insulting Bill, the violence of Bill’s reaction couldn’t be justified, and thus it is not obvious that the psychotic symptoms are to blame for the assault. On the other hand, psychotic symptoms such as hallucinations and delusions don’t come in isolation, and it is possible that if Bill hadn’t suffered from a psychiatric illness, then he wouldn’t have acted as he did.

In the philosophy of David Velleman, autonomy and responsibility are linked to self narratives. We tell stories about ourselves that help us recollect memories of past experiences and that give a sense of direction to our lives. Velleman’s view is that these narratives can also produce changes in behaviour. Suppose that I have an image of myself as an active person, but recently I have been neglecting my daily walk and spending the time in front of the TV. So I tell myself: “I have to get out more or I’ll become a couch potato”. I want my behaviour to match my positive self-image so that I can become the person I want to be. Our narratives don’t just describe our past; they can also issue intimations and shape the future.

According to Phil Gerrans, who has applied the notion of self narratives to the study of delusions, when experiences are accompanied by salience, they become integrated into a self narrative as dominant events. People with delusions tend to ascribe excessive significance to some of these experiences and, as a result, thoughts and behaviours acquire pathological characteristics (e.g. as when Bill is exasperated by the idea of someone insulting him). Gerrans’ account vindicates the apparent success of medication and cognitive behavioural therapy (CBT) in the treatment of delusions. Dopamine antagonists stop the generation of inappropriate salience, and by taking such medication, people become less preoccupied with their abnormal experiences and more open to external challenges to their pathological beliefs (“How can I hear my neighbour’s voice so clearly through thick walls?”). In CBT people are encouraged to refocus attention on a different set of experiences from those contributing to the delusional belief, and to stop weaving the delusional experiences into their self narratives by constructing scenarios in which such experiences make sense even if the delusional belief were false (“Maybe the voice I’ve heard was not my neighbour’s.”)

As Gerrans explains, self narratives are constructed unreliably in the light of abnormal experiences and delusional beliefs. If we take seriously the idea that self narratives may play an important role in the governance of behaviour, and accept that narratives constructed by people with delusions are unreliable, then it’s not surprising that people with delusions are not very successful at governing themselves.

Tuesday, August 24, 2010

Delusions and Action (by guest blogger Lisa Bortolotti)

As I suggested in my previous post, we sometimes have the impression that people do not fully endorse their delusions. In some circumstances, they don’t seem to act in a way that is consistent with genuinely believing the content of their delusions. For instance, a person with persecutory delusions may accuse the nurses in the hospital of wanting to poison him, and yet happily eat the food he’s given; a person with Capgras delusion may claim that his wife has been replaced by an impostor, but do nothing to look for his wife or to make life difficult for the alleged “impostor”.

Some philosophers, such as Shaun Gallagher, Keith Frankish and Greg Currie, have argued on the basis of this phenomenon (which is sometimes called “double bookkeeping”) that delusions are not beliefs. They assume that action guidance is a core feature of beliefs and maintain that, if delusions are not action guiding, then they are not beliefs. Although I am sympathetic to the view that action guidance is an important aspect of many of our beliefs, I find the argument against the belief status of delusions a bit too quick.

First, as psychiatrists know all too well, delusions lead people to act. People who believe that they are dead (Cotard delusion) may become akinetic, and may stop eating and washing as a result. People who suffer from delusions of guilt, and believe they should be punished for something evil they have done, engage in self-mutilation. People who falsely believe they are in danger (persecutory delusions) avoid the alleged source of danger and adopt so-called “safety behaviours”. The list could go on. In general it isn’t true that delusions are inert.

Second, when delusions don’t cause people to act, a plausible explanation is that the motivation to act is not acquired or not sustained. Independent evidence suggests that people with schizophrenia have meta-representational deficits, flattened affect and emotional disturbances, all of which can adversely affect motivation. Moreover, as Matthew Broome argues, the physical environment surrounding people with the delusion doesn’t support the action that would ensue from believing the content of the delusion. The content of one’s delusion may be so bizarre (e.g., “There’s a nuclear reactor in my belly”) that no appropriate action presents itself. The social environment might be equally unsupportive: one may stop talking about one’s delusion and acting on it in order to avoid incredulity or abuse from others.

My view that delusions are continuous with ordinary beliefs is not challenged by these considerations: we all, perhaps to a lesser extent than people with delusions, act in ways that are inconsistent with some of the beliefs we report - when we’re hypocritical - and fail to act on some of our beliefs for lack of motivation - when we’re weak-willed.

Monday, August 16, 2010

Delusions and Self-Knowledge (by guest blogger Lisa Bortolotti)

Suppose that Chloe suffers from a delusion of erotomania and believes that President Obama is secretly in love with her. Chloe has never met him, so how does she know about his feelings? When probed, Chloe may offer no reason in support of her belief or offer reasons that others would consider unsatisfactory or irrelevant (e.g., “He is sending me love messages that only I can decipher”).

One explanation is that the belief is so certain for Chloe that she doesn’t feel the need to provide a justification for it. John Campbell argued that at least some delusions play the role of framework beliefs, a notion introduced by Wittgenstein in On Certainty. Framework beliefs (e.g., “The Earth existed long before my birth”) are central to our world-view and become virtually indubitable. They are the pillars on which the rest of our belief system rests, and can’t themselves be justified on the basis of beliefs that are more certain. However, they are manifested in our way of life - we wouldn’t believe our grandparents’ war stories if we thought that the Earth had come into existence at the same time as we did. In my view, delusions are unlikely to play the same role as framework beliefs. Framework beliefs are typically shared by an entire linguistic community; delusions are not. Framework beliefs are perfectly integrated in a belief system, whereas delusions are often in conflict with other beliefs.

What puzzles us about those delusions that seem to come out of nowhere is that the person reports them with conviction but doesn’t seem to genuinely endorse them, whereas there is no doubt that framework beliefs are endorsed. Richard Moran developed the notion of authorship, which captures the sense in which we know what our beliefs are on the basis of the fact that we endorse their content. We can introspect some of our beliefs. We can infer some of our beliefs from our past behaviour. But at times we know that we believe that p because we have made up our mind that p based on evidence for p. This mode of knowledge is direct like introspection, but it’s not as passive as perceiving a belief floating around in our stream of consciousness, and it doesn’t involve looking inward but outward, at the evidence for p. I know that I believe that the death penalty should be abolished because I have good reasons to believe that the death penalty should be abolished.

When I justify my beliefs with reasons that I regard as my best reasons, according to Moran I’m the author of the belief. The notion of authorship combines aspects of rationality and self-knowledge that we tend to take for granted. We expect that, if Chloe is convinced that Obama is in love with her, she must have some reasons to believe that, and she must be able to justify her belief on the basis of those reasons. But in the case of delusions, authorship can be fully or partially compromised. This suggests that people like Chloe experience a failure of self-knowledge.

Wednesday, August 11, 2010

Can You Believe That You Are Dead? (by guest blogger Lisa Bortolotti)

Irrationality is considered a defining feature of delusions in many influential definitions. But in what sense are delusions irrational?

Delusions can be procedurally irrational if they are badly integrated in one’s system of beliefs. They can also be inconsistent with other beliefs one has. Lucy, who has Cotard delusion, believes at the same time that dead people are motionless and speechless, that she can move and talk, and that she is dead. Here there is an apparent inconsistency that is “tolerated”, that is, it doesn’t lead her to revise or reject one of the beliefs. Typical delusions are epistemically irrational, that is, they are badly supported by the evidence available to the subject, and they are not adequately revised in the light of new evidence. John, who suffers from anosognosia, doesn’t acknowledge that one of his legs was amputated and explains the fact that he can’t climb stairs any longer by appealing to a violent attack of arthritis.

These examples are striking. For many philosophers, the irrationality of delusions is a reason to deny that delusions are beliefs. Lucy can’t really believe that she’s dead, maybe what she means is that she feels empty and detached, as if she were dead. John can’t really believe that he has both legs because there is no problem with his visual perception. Maybe he wishes he still had both legs. This way of discounting delusions as metaphorical talk or wishful thinking is appealing. It is based on the view that there is a rationality constraint on the ascription of beliefs. We wouldn’t be charitable interpreters if we ascribed to Lucy the belief that she’s dead and to John the belief that he has arthritis.

I want to resist the idea that people with delusions don’t mean what they say. First, people often act on their delusions and base even important decisions in their lives on the almost unshakeable conviction that the content of their delusions is true. We couldn’t make sense of their behaviour at all if we couldn’t ascribe to them delusional beliefs. Second, the type of irrationality delusions exhibit is not qualitatively different from the irrationality of ordinary beliefs. Delusions may be irrational to a greater extent than ordinary beliefs, and the examples we considered were certainly puzzling, but procedural and epistemic irrationality can be found closer to home.

Students believe that wearing clothes of a certain colour will bring them good luck in an exam, and nurses believe that more accidents occur on nights with a full moon. These beliefs are certainly inconsistent with other beliefs well-educated people have about what counts as the probable cause of an event. Prejudiced beliefs about black people being more violent are groundless generalisations that can be just as insulated from evidence and as resistant to change as clinical delusions.

Maybe what makes delusions so puzzling is only that they are statistically less common (not necessarily more irrational) than other procedurally and epistemically irrational beliefs.

Friday, August 06, 2010

Clinical Delusions: What Are They? (by guest blogger Lisa Bortolotti)

(Lisa Bortolotti is a Senior Lecturer in Philosophy at the University of Birmingham and the author of Delusions and Other Irrational Beliefs.)

In the last five years I have been working on the nature of clinical delusions, and have asked what they can tell us about the philosophy and psychology of belief. Clinical delusions are a symptom of a variety of psychiatric disorders, among which are schizophrenia and dementia. Some delusions have fairly mundane content, such as delusions of persecution or jealousy. Other delusions are very bizarre, and people may come to assert that they are dead (Cotard delusion) or that their spouse or family member has been replaced by an impostor (Capgras delusion).

In the Diagnostic and Statistical Manual of Mental Disorders (American Psychiatric Association, DSM-IV-TR, 2000), delusions are defined in epistemic terms, as beliefs that are false, insufficiently supported by the available evidence, resistant to counterevidence and not shared by other people belonging to the same cultural group. In the philosophical literature it is an open question whether delusions should be considered genuine instances of belief.

According to the two-factor theory of delusions, delusions are explanatory hypotheses for an abnormal experience which is due to brain damage. The first factor contributing to the formation of the delusion is a neuropsychological deficit, and the second factor is an impairment in the evaluation of hypotheses. Imagine that, overnight, Julia’s sister appears different to Julia, and this is a powerful experience. One possible explanation is that an alien has abducted Julia’s sister during the night and replaced her with an almost identical replica without anybody else noticing. This hypothesis is implausible and even Julia would consider it highly improbable, but if her hypothesis-evaluation system doesn’t work properly and doesn’t dismiss it, Julia may endorse it as something she truly believes. As a result, she may become hostile and even aggressive towards the alleged impostor.

Even from the oversimplified example above, one can see where the tension is. On the one hand, delusions seem to be just like any other beliefs. They are reported with sincerity and they can affect the person’s other intentional states and behaviour. They “make sense” of a very unusual experience. On the other hand, there is a neuropsychological deficit at the origin of delusions that is not present in the non-clinical population. The affective channel of Julia’s facial recognition process is damaged. The good functioning of the hypothesis-evaluation system is also compromised, maybe due to exaggerated versions of common reasoning biases. Julia “jumped to conclusions” as she accepted her initial hypothesis on the basis of insufficient evidence and without considering other, more probable, alternatives. This unusual aetiology and the apparent extreme irrationality might seem to be in tension with the view that delusions are “beliefs” in the ordinary sense of that term.

However, in my view, the main difference between clinical delusions and other irrational beliefs is that delusions severely undermine well-being. People with schizophrenia are often isolated and withdrawn and their life plans are disrupted. But on purely epistemic grounds we can’t easily tell delusions apart from the false beliefs that we ourselves report and ascribe to others on an everyday basis, such as: “Women make poor scientists” or “I failed the exam because the teacher hates me”. Irrationality is indeed all around us.