Monday, May 19, 2008

Defining "Consciousness"

Scientists, students, ordinary folks, and even philosophers sometimes find the word "consciousness" baffling or suspicious. Understandably, they want a definition. To the embarrassment of us in "consciousness studies", it proves surprisingly difficult to give one. Why? The reason is this: The two most respectable avenues for scientific definition are both blocked in the case of consciousness.

Analytic definitions break a concept into more basic parts: A "bachelor" is a marriageable but unmarried man. A "triangle" is a closed, three-sided planar figure. In the case of consciousness, analytic definition is impossible because consciousness is already a basic concept. It's not a concept that can be analyzed in terms of something else. One can give synonyms ("stream of experience", "qualia", "phenomenology") but synonyms are not scientifically satisfying in the way analytic definitions are.

Functional definitions characterize terms by means of their causal role: A "heart" is the primary organ for pumping blood; "currency" is whatever serves as the medium of exchange. Someday maybe a functional definition of consciousness will be possible; but first we have to know what kind of causal role (if any) consciousness plays; and we're a long way from knowing this. Various philosophers and psychologists have theories, of course, but to define consciousness in terms of one of these contentious theories begs the question.

So maybe the best we can do is definition by instance and counterinstance: "Furniture" includes these things (tables, desks, chairs, beds) and not these (doors, toys, clothes). "Square" refers to these shapes and not these. Hopefully with enough instances and counterinstances one begins to get the idea. So also with consciousness: Consciousness includes inner speech; visual imagery; felt emotions; dreams; hallucinations; vivid visual, auditory, and other sensations. It does not include: Immune system response, early visual processing, myelination of axons, or what goes on in dreamless sleep.

Unfortunately, definition by instance and counterinstance leaves unclear what to do about cases that don't obviously fit with either the given instances or counterinstances: If all the given instances and counterinstances of "square" are in Euclidean space, what does one do with non-Euclidean figures? Are paintings "furniture"?

Now of course some cases are just vague and rightly remain so. But one question that interests me turns on exactly the kinds of cases that definition of consciousness by instance and counterinstance leaves unclear: Whether unattended or peripheral stimuli (the hum of the refrigerator in the background when you're not thinking about it, the feeling of your feet in your shoes) are conscious. It would beg the question to include such cases in one's instances or counterinstances. But this class of potential instances is so large and important that to ignore it risks leaving unclear exactly what sort of phenomenon "consciousness" is.

Now maybe (this is my hope) there really is just one property -- what consciousness researchers call "consciousness" -- that stands out as the obvious referent of any term meant to fit the instances and counterinstances above, so that no human would accidentally glom on to another property, given the definition, just as no human would glom on to "undetached rabbit part" as the referent of the term "rabbit" used in the normal way. But this may not be true; and if it's also not acceptable (as I think it's not) simply to lump cases like the sound of the fridge and the feeling of one's shoes into the class of vague, in-between cases, then even definition by instance and counterinstance fails as a means of characterizing consciousness.

Are we left then with Ned Block's paraphrase of Louis Armstrong: "If you got to ask [what consciousness/jazz is], you ain't never gonna get to know"?


Anonymous said...

Hi Eric,

I'm thinking that maybe the philosophical notion of consciousness is tightly related to a kind of Cartesian approach, where it seems to be defined negatively.

That is, the world is described through certain notions which we can approach through sciences, and then basically we acknowledge that there is something of which we are aware, but which is missing in that given description. So, I guess we lump all those things (you mentioned qualia, stream of experience, probably also intentionality, and so on...) into this consciousness category.

That there is this negative meaning to consciousness can be seen if we take, for example, a naive-realist view of colors. The moment we take such a naive-realist view and say that colors are properties of objects, we won't be inclined any more to see them (as qualia) as an example of consciousness.

Justin (koavf) said...

Or perhaps: "Consciousness? I know it when I see (by) it."

Anibal Monasterio Astobiza said...

What if we provide an evolutionary view to define consciousness?
As Searle did with his "biological naturalism" paradigm and more recently Revonsuo with his "biological realism", perhaps consciousness is a biological adaptation and has real survival value.
In this sense we can define the distinct states of animal consciousness in its evolutionary path or phylogenetics (including human consciousness) and look for biological markers of what it is to be conscious.
Nevertheless, I recognize that because consciousness is the default background for meaning, a definition of consciousness is trapped in logical problems.

arnold Trehub said...

Here's my candidate definition:

Consciousness is a transparent phenomenal experience of the world from a privileged egocentric perspective.

What are the problems with this definition?

MT said...

But rabbits are instantially defined, aren't they? With rabbits and birds and even people ("for what is a man?") at least some philosophers would say we're talking about kinds. If consciousness(es) too is(are) just a kind of thing--and why not?--then what's wrong or necessarily incomplete about its instances? If you want to think biologically, I suppose you'd want a definition that entailed an anatomy, so that we could compare consciousnesses across species. An anatomical analysis is extremely special, but does it yield what's essential (like cutting open the cadaver of your best friend)? And what about the problem of anatomical diversity? When breeders use artificial selection to give rabbits floppy ears, they're still rabbits, arguably at least. It's hard to know what anatomy is essential to rabbithood, and we might expect consciousness anatomy to present the same problem. Maybe a definition has fulfilled its duty if it only tells us to what a word refers. Why expect it to tell us the anatomy of the thing or things referred to, and to provide the understandings this sort of information can yield? How rabbits' physiology relates to where they live is a little like what a word means in a poem, or in a phrase calculated to make a friend cry. There's a lot more meaning out there in the world with regard to rabbits and particular words, and we expect a definition only to allow them all, not to specify them.

Eric Schwitzgebel said...

Thanks for all the comments, folks!

Tanasije: It's an interesting thought that consciousness is defined negatively -- for example, it's what's going on with you when you see color, if you don't assume the external world exists. But I don't think that's how I'm thinking of it. For example, emotional experience and pain don't seem negatively definable in quite the same way; and it's not straightforward how a non-dualist would carve off the "external world" -- does that include the brain, or some parts of it?

Justin: Yeah, I think I know it when I see* it, too -- like jazz!

Anibal: What I'm worried about here is that if we define consciousness in terms of evolutionary function, we beg questions about its causal significance, questions that are now open. Of course, if we can establish that consciousness, defined less contentiously, does play a certain evolutionary role, then maybe we can redefine it (perhaps more precisely) evolutionarily; but I don't see the argument there yet. I'm not convinced that Searle's view, for example, is entirely coherent.

Arnold: My feeling is that the problem with your definition is the word "phenomenal", which I take as a synonym for "conscious"; so the definition is circular (and also everything else in the definition apart from that word is needless).

MT: I agree with you, and I think the instantial definition is the best hope. In fact, I think it probably works. But here's the problem: If all your examples of non-rabbits are also non-mammals, how do we know whether the term "rabbit" refers to rabbits or to mammals? Similarly, if all your examples of consciousness are of focal consciousness, how do we know whether (to put it in terms of the rich view, according to which we have constant tactile experience of our feet in our shoes, etc.) the word "consciousness" refers to all conscious experience or only to focal consciousness?

Anonymous said...

Let me first say that when I said 'defined negatively', I didn't use 'definition' in a formal way, where when one gives a definition, it is supposed to pick out all instances of the phenomenon in question, and only instances of the phenomenon in question. I was thinking more of how what philosophers usually mean by 'consciousness' is connected to such a negative approach of distinguishing those phenomena (related to subjects) from the phenomena of the world as described by the sciences.

Further, I'm not saying that emotions, or pain, or intentionality, etc... don't have natures of their own separate from the world-phenomena, but that we put all those into one kind of 'consciousness phenomena' because of the mentioned negative approach.

I mentioned that if one takes a naive-realist view of colors, and says that they are in the world, one would be inclined to remove them from "conscious phenomena". The same goes for pains: if we take a naive-realist view of pain (e.g. that pains are where we feel them, and that they could be there even if we don't notice them) we would also be inclined to remove them from the "consciousness phenomena" group. Or, say, if one is an externalist about concepts, one will remove concepts from the "phenomena of consciousness" and see them in the world (while an internalist would consider them phenomena of consciousness), and so on...

You are right that an externalist would not carve the external world in the same way, and that is, I guess, part of what I'm saying...

That is, I think that we don't have some one specific natural phenomenon of which we are "directly" aware, such that people might disagree on theories about it while agreeing that they are speaking of the same thing. Instead I think we tend to group different phenomena under this notion, and that as such it is a kind of theoretically gerrymandered notion.

arnold Trehub said...


Can I take it then that you would agree that consciousness is a transparent experience of the world from a privileged egocentric perspective?

Eric Schwitzgebel said...

Thanks for clarifying, Tanasije! I don't see why the phenomena of consciousness can't be known directly, or why the concept is gerrymandered; but I know that's a big issue!

Eric Schwitzgebel said...

Arnold, I didn't mean to imply that. I said "needless" but what I should have said is "needless and contentious"! I find both the notions of transparency and privilege problematic, though there may be weak senses of the terms on which I agree that the epistemology of consciousness is transparent and privileged.

MT said...

If all your examples of non-rabbits are also non-mammals, how do we know whether the term "rabbit" refers to rabbits or to mammals?

You're describing a circumstance in which the mammalia is monophyletic--rabbits (and their evolutionary antecedents as well, if I understand the concept of monophyly) representing all present and past representatives of the entire class. In such a world taxonomists or systematists would be rabbits or perhaps intelligent reptiles, and the class we call mammalia in our world would be called "Rabbitia" (unless "Leporidae" wins by a hare). This class would be a heuristic or theoretical conceit by which we place rabbits at a fitting phylogenetic distance from reptiles, based on 1) the assumption of common descent, 2) how dissimilar rabbits and reptiles seem to the systematists relative to 3) the dissimilarities between the more instantiated taxa of our system.

It's barely an academic question what the characteristics of the class to which rabbits belong are in this case. The question arises after the discovery of something like but unlike rabbits--in the fossil record or an unexplored ecosystem. Then at least you are constrained to conceive the class as broad enough to encompass not only rabbits but the new things as well...unless they are classified as a subspecies (highly unlikely if rabbits are the systematists in this hypothetical universe--it would be like us classifying chimps as a variant of human).

Similarly, if all your examples of consciousness are of focal consciousness, how do we know whether (to put it in terms of the rich view, according to which we have constant tactile experience of our feet in our shoes, etc.) the word "consciousness" refers to all conscious experience or only to focal consciousness?

How do we know? We just know. It depends whether we're rabbits or reptiles, as it were. i.e. According to how we conceive our consciousness, some stuff will be significantly different from our consciousness or merely a subspecies of it. Of course, it's not an individual decision. We meet at conferences annually to hash it out.

Anibal Monasterio Astobiza said...

To me consciousness is a proxy for knowledge, as knowledge (information/cognition) is the general way animals and their central nervous systems react and adapt to a constantly changing environment.

Of course most cognition is nonconscious, but with increasing complexity higher forms of consciousness evolve to command with precision thoughts, reflections, will, motor plans...

The metaphysical question about how electro-chemical events produce experience, or how matter becomes imagination (Edelman and Tononi 2000), with its blossoming philosophical landscape of theories ranging from identity theories to functionalism to dualism, is very important but not the proper question, because in the light of evolution consciousness is an outstanding feature and evolution cannot produce too many spandrels or by-products.

The right question is: what !%*¿?·# (bad word) function does consciousness serve? And the transition from absence to lower forms to higher forms of consciousness on evolutionary timescales is almost the right question.

A. Rechtschaffen is credited with saying, in relation to sleep:

"If sleep does not serve an absolutely vital function, then it is the biggest mistake the evolutionary process ever made"

We can say the same about consciousness.

arnold Trehub said...

Just look around you, Eric. Don't you now have a transparent phenomenal experience of the world from your own privileged egocentric perspective? If not, can you tell us why not?

Eric Schwitzgebel said...

Thanks for the reply, MT. Unfortunately, I didn't explain my concern well enough and threw you off track. I was not thinking of rabbits as the only mammals but rather as the only mammals used as positive or negative instances in defining the term. There are still deer, sheep, etc., but my idea was that they have not been referred to one way or another in the definition. That, obviously, would be a problem!

I'm not sure I fully understand your second point: Do you think it's merely a conceptual question whether we count the unattended hum of the fridge as part of our consciousness? Or is there a substantive question there -- whether, we might say, one really consciously experiences that hum -- entangled with the conceptual issues?

Eric Schwitzgebel said...

Anibal: I think that's an interesting argument about the evolution of consciousness. Nichols and Grantham made a similar argument in an essay in Philosophy of Science several years back, if I recall correctly. I'm not *sure* I'm convinced. I know you mean to exclude this, but I don't see why it couldn't be that knowledge (or something like that) is selected for and phenomenal consciousness rides along for free.

Eric Schwitzgebel said...

Arnold, I'm not sure I have privileged knowledge of my own consciousness in any strong sense, since I think there are many cases where we are wrong about our current conscious experience. And I'm not sure my consciousness is "transparent" in any strong sense, either, since cognitive phenomenology, inner speech, and emotion don't seem to be "transparent" in the way that sensory experience of the external world is often taken to be transparent.

So there's a start!

arnold Trehub said...

Eric, I think this is a good start.

1. What is the difference between a strong sense and a weak sense of privileged knowledge?

2. If you/I were wrong about our current conscious experience, wouldn't this simply be an instance of a failure in our *understanding/classification/report* of the features of our privileged raw conscious content, rather than evidence that we do not have a privileged uninterpreted experience of the world?

3. It seems to me that cognitive phenomenology, inner speech, and emotion are phenomenologically transparent as long as we do not experience the media (the particular brain mechanisms) that carry this content.

MT said...

Eric, thanks for alerting me to my misunderstanding. Incidentally, I also misused "monophyletic." I think there's a word for what I had in mind, but that's not it. What you call my second point was an extension of the first--that the decision to associate or dissociate something, conceptually, with the very kind or the very category of which the decider is a member is subjective, and a decision made with an eye to the extended family of kinds.

If consciousness is a kind, it might be useful to think of what kind of a kind it is. Is it like a family or genus, to which many species of things belong--dog consciousness, plant consciousness, computer consciousness? Or is consciousness a species of thing that exists only in awake and legally competent people?

Conceiving consciousness liberally means seeing it as existing elsewhere than in awake and legally competent people, and so viewing it as encompassing at least a genus.

By defining consciousness we game the taxonomic system. We could draw a circle around something that might seem to exist only in awake legally competent people--"language," for example (displacing the taxonomic problem to that thing we've circled).

If we wish to understand consciousness as a thing in the world, we'll recognize it as biological and define it with an eye to the neuroanatomy, physiology and behavior of other animals, so that how it evolved may become clear to us. It seems you want to use the word "consciousness" foremost for other uses and other understandings to do with subjective experience. Isn't it the very pitfall of consciousness itself to assume one word ought to work for both? If you decide you're talking about a word that's only for phenomenology, you're in a realm of very soft science, ripe for speculation. You could more or less define it as you like and anger very few.

Anonymous said...

Howdy Folks,

Great discussion! Seems to me that consciousness doesn't need to be so complex. In basic terms, any mobile species must possess some primary skills in order to navigate at all. First of all, it must have a way of referencing time and space. That requires memory and some way of looking at the relationship between self, non-self and motion. Seems to me that navigating successfully requires at least this much awareness.


Eric Schwitzgebel said...

Thanks for the interesting discussion, folks!

Arnold: (1.) Weak sense: I have a means of learning about my conscious experience that is different in kind from what other people have -- but not necessarily better. (2.) I agree we have an uninterpreted experience of the world (except insofar as interpretation is built into experience); but to become knowledge or judgment, it requires some interpretation or classification. (3.) If that's all you mean by "transparent", I'll give you that! Tye, Dretske, and Harman seem to mean more. (But exactly what more is not entirely clear.)

MT: I agree with the spirit of your remarks, but I think the problem is that we only have uncontentious knowledge of instances of consciousness in a limited range of cases. If we define it in terms of those cases only, we risk being too narrow in scope. If we explicitly include other cases, we beg the question against views that would exclude those cases. The best hope, I think, is to define it leaving open the contentious cases and then hope (to what extent this hope is justified I don't know) that the referent is clear enough that we're all talking about the same thing!

Jim: Would it follow that self-propelled vacuum cleaners (the sophisticated kind from MIT) would be conscious?

Anonymous said...

Howdy Eric,

The smart vacuum certainly has a built-in awareness of time and the floor it must navigate. That awareness depends on sensors, a CPU, memory and algorithms, all part of the "system" that manages its behavior. That system took billions of man-hours to create and yet it doesn't even come close to the vacuuming skills of a catfish, for example.

What AI is attempting to do is duplicate the efficiency of natural systems. What AI needs is a form of active awareness or consciousness that can make intelligent choices.

My point is that I think biological systems have already done that. We have copied all the elements of consciousness, but we can't breathe life into them. The element we are missing is consciousness itself.

So, I guess my definition of consciousness is Life itself.

Over at my place, we are looking for the operating system that executes DNA. It's a manuscript in progress. Please stop by and check it out.


Eric Schwitzgebel said...

Neat blog, Jim! I don't know what it takes to breathe consciousness into a system, but as the case of the smart vacuum shows (if we're right that the vacuum does not have consciousness), mere low-level responsiveness to the environment and low-level navigational abilities are not enough. So what more is needed? That's the million-dollar question!

I'm not sure life itself will do: It seems we can have life without consciousness (mushrooms, bacteria) and maybe also consciousness without life (sufficiently advanced AI? God?).

Anonymous said...

Thanks Eric.

Yes, the million dollar question... but it brings us right back to that elusive definition. If indeed consciousness is only a function of higher animals, then we can be sure that it takes more than life. However, if consciousness is a function of something more basic, like a coherent electromagnetic field, then all living creatures could possess a version of it.

I think we tend to define consciousness in strictly human terms because we have always thought we were the only conscious ones around. Most indigenous peoples and some eastern religions recognize consciousness in all things. My personal experience bears this out, so I go along with them. It seems to me that the systems approach, looking at Life as a biological information processing system first, with protein as secondary output, leads to the same conclusion.

Anyway, thanks for listening.

arnold Trehub said...

Eric, it seems to me that your conception of conscious experience as knowledge is problematic. I would claim that your conscious knowledge is only a subset of your occurrent phenomenal world.

Take the moon illusion as a good example. The moon is experienced as large when seen near the horizon, and it is experienced as small when it is seen high in the sky. The naive observer typically thinks that the moon is necessarily more distant when it is overhead than it is near the horizon. This is his raw perception and his reflective experience, which can be taken as his knowledge about his immediate phenomenal experience of the moon. In fact, his conscious experience of the moon illusion is neither true nor false -- it just *is* what it is. But at the same time, his *knowledge* about his own conscious experience is false because the moon is actually no farther away at its zenith than it is at the horizon. So I actually have better knowledge *about* the observer's phenomenal experience than he does. But no one can share the observer's own egocentric perspective, and therefore his conscious experience of the moon and his reflections about it remain privileged as his own conscious phenomena.

MT said...

Even if we were to agree what kind of thing we mean to refer to when we say "consciousness," and even though I don't know what might be agreed exactly, I suspect we'll still be identifying instances of it by means so indirect that we can be duped by ventriloquists, that we'll reasonably distrust these means except when we are applying them to bodies that look human, and we won't necessarily know it to observe it in ourselves. The definition seems certain to amount to an untestable hypothesis, in other words--Popper's definition of an unworthy objective. Granted, he wasn't talking about philosophy, but even in a philosophical endeavor, if I'm right that you're shooting at something untestable, shouldn't that be troubling? I suppose "2" isn't exactly testable, yet has its uses. At least with "2" our engineers build bridges and other cool and popular stuff. What are we going to get from a precisely defined "consciousness"?

Eric Schwitzgebel said...

Thanks for the continuing discussion, folks!

LifeOS: I agree with what you say in the first paragraph, but I'm not sure what justifies what you say in the second!

Arnold: I think we're talking past each other a bit. In the moon illusion case, I think the naive observer's knowledge about the moon's distance is false, but he may be quite right about his conscious experience. Also, I'm not sure why you think I characterize consciousness in terms of knowledge. What I meant to convey in my earlier remark is that knowledge about consciousness -- at least the kinds of judgment about consciousness that are the reportable products of introspection -- necessarily involves conceptualization and categorization.

MT: I think we see eye to eye on all you just said! (Including the non-dogmatic tone.)

arnold Trehub said...

Eric, I certainly agree with you that introspecting and reporting about our conscious experience necessarily involves making judgements about our inner experience that can be wrong. I guess what I don't share is your pessimism about the prospects for a better scientific understanding of the nature of consciousness. I think we have made encouraging progress in understanding the kinds of brain mechanisms and systems that might constitute our phenomenal experience.

MT said...

I think we see eye to eye on all you just said!

Low blow. I suppose obvious points were right down there with the untestable ones in Popper's book ;-)

Justin (koavf) said...

Nice spam, "Paul."

Please don't click on the above link. Eric, I'd like to request that you delete it.


Eric Schwitzgebel said...

Thanks for catching that, Justin! One of the problems of having open comments....

Parag Jasani said...

First-ever causal explanation of the mechanism responsible for human consciousness (you can verify the same with your subjective experiences):

While interacting in our day-to-day life, we need to act or react to bodily processes and the happenings in the world, sometimes instantly, to obtain beneficial outcomes.

Consciousness is designed by the evolutionary process to allow data from such interactions that require judgmental power to become available for making decisions, thereby benefiting from the capability of making free will decisions (if there were no free will, there would be no requirement for consciousness).

To understand how interactions are continuously scrutinized for the requirement of judgmental power and how free will decisions are made, visit (based on Dichotomized Operating System model - DOS model)

Arnold said...

Is observation replacing consciousness...