The Features
Take a moment to introspect. Examine a few of your conscious experiences. What features do they share -- and might these features be common to all possible experiences? Let's call any such necessarily universal features essential.
Consider your visual experience of this text. Next, form an image of your house or apartment as viewed from the street. Think about what you'd do if asked to escort a crocodile across the country. Conjure some vivid annoyance at your second-least-favorite politician. Notice some other experiences as well -- a diverse array. Let's not risk too narrow a sample.
Of course, all of these examples share an important feature: You are introspecting them as they occur. So to do this exercise more properly, consider also some past experiences you weren’t introspecting at the time. Try recalling some emotions, thoughts, pains, hungers, imagery, sensations. If you feel unconfident -- good! You should be. You can re-evaluate later.
Each of the following features is sometimes described as universal to human experience.
1. Luminosity. Are all of your experiences inherently self-representational? Does the having of them entail, in some sense, being aware of having them? Does the very experiencing of them entail knowing them or at least being in a position to know them? Note: These are related, rather than equivalent, formulations of a luminosity principle.
2. Subjectivity. Does having these experiences entail having a sense of oneself as a subject of experience? Does the experience have, so to speak, a "for-me"-ness? Do the experiences entail the perspective of an experiencer? Again, these are not equivalent formulations.
3. Unity. If, at any moment, there's more than one experience, or experience-part, or experience-aspect, are they all subsumed within some larger experience, or joined together in a single stream, so that you experience not just A and B and C separately but A-with-B-with-C?
4. Access. Are these experiences all available for a variety of "downstream" cognitive processes, like inference and planning, verbal report, and long-term memory? Presumably yes, since you're remembering and considering them now. (I'll discuss the methodological consequences of this below.)
5. Intentionality. Are all of your experiences "intentional" in the sense of being about or directed at something? Your image of your house concerns your house and not anyone else's, no matter how visually similar. Your thoughts about Awful Politician are about, specifically, Awful Politician. Your thoughts about squares are about squares. Are all of your experiences directed at something in this way? Or can you have, for example, a diffuse mood or euphoric orgasm that isn't really about anything?
6. Flexibility. Can these experiences, including any fleeting ones, all potentially interact flexibly with other thoughts, experiences, or aspects of your cognition -- as opposed to being merely, for example, parts of a simple reflex from stimulus to response?
7. Determinacy. Are all such experiences determinately conscious, rather than intermediately or kind-of or borderline conscious? Compare: There are borderline cases of being bald, or green, or an extravert. Some theorists hold that borderline experientiality is impossible. Either something is genuinely experienced, however dimly, or it is not experienced at all.
8. Wonderfulness. Are your experiences wonderful, mysterious, or meta-problematic -- there is no standard term for this -- in the following technical sense: Do they seem (perhaps erroneously) irreducible to anything physical or functional, conceivably existing in a ghost or without a body?
9. Specious present. Are all of your experiences felt as temporally extended, smeared out across a fraction of a second to a couple of seconds, rather than being strictly instantaneous?
10. Privacy. Are all of your experiences directly knowable only to you, through some privileged introspective process that others could never in principle share, regardless of how telepathic or closely connected they might be?
I've presented these possibly essential features of experience concisely and generally. For present purposes, an approximate understanding suffices.
I've bored/excited you [choose one] with this list for two reasons. First, if any of these features are genuinely essential for consciousness, that sets constraints on what animals or AI systems could be conscious. If luminosity is essential, no entity could be conscious without self-representation. If unity is essential, disunified entities are out. If access is essential, consciousness requires certain kinds of cognitive availability. And so on.
I'll save my second reason for the end of this post.
Introspection and Memory Can't Reveal What's Essential
Three huge problems ruin arguments for the essentiality of any of these features, if those arguments are based wholly on introspective and memorial reflection. The problems are: unreliability, selection bias, and the narrow evidence base.
Unreliability. Even experts disagree: thoughtful researchers arrive at very different views. Given this, either our introspective processes are unreliable, or seemingly ordinary people differ wildly in the structure of their experience. I won't detail the gory history of introspective disagreement about the structure of conscious experience -- that was the topic of my 2011 book. Employing appropriate epistemic caution, doesn't it seem possible that you could be wrong about whether such features are universal in your experience? The matter doesn't seem nearly as indubitable as that you are experiencing red when you're looking directly at a nearby bright red object in good light, or that you're experiencing pain when you drop a barbell on your toe.
Selection bias. If any of your experiences are unknowable, you won't, of course, know about them. To infer luminosity from your knowledge of all the experiences you know about would be like inferring that everyone is a freemason from a sampling of regulars at the masonic lodge. Likewise, if any of your experiences fail to impact downstream cognition, you won't reflect on or remember them. Methodological paradox doesn't infect the other features quite as inevitably, but selection bias remains a major risk. Maybe we have disunified experiences that elude our introspective focus and are quickly forgotten. Similarly, perhaps, for indeterminate or inflexible experiences, or atemporal experiences, or experiences unaccompanied by self-representation.
Narrow evidence base. The gravest problem lies in generalization beyond the human case. Waive worries about unreliability and selection bias. Assume that you have correctly discerned that, say, seven of the ten proposed features belong to all of your experiences. Go ahead and generalize to all ordinary adult humans. It still doesn't follow that these features are essential to all possible conscious experiences, had by any entity. Maybe lizards or garden snails lack luminosity, subjectivity, or unity. Since you can't crawl inside their heads, you can't know by introspection or experiential memory. (In saying this, am I assuming privacy? Yes, relative to you and lizards, but not as a universal principle.) Even if we could somehow establish universality among animals, it wouldn't follow that those same features are universal to AI cases. Maybe AI systems can be more disunified than any conscious animal. Maybe AI systems can be built to directly access each other's experiences in defiance of animal privacy. Maybe AI systems needn't have the impression of the wonderful irreducibility of consciousness. Maybe some of their conscious experiences could occur in inflexible reflex patterns.
Nor Will Armchair Conceptual Analysis Tell Us What's Essential
If you want to say that all conscious systems must have one or more of unity, flexibility, privacy, luminosity, subjectivity, etc., you'll need to justify this insistence with something sturdier than generalization from human cases. I see two candidate justifiers: the right theory of consciousness or the right concept of consciousness.
Concerning the concept of consciousness, I attest the following. None of these features are essential to my concept of consciousness. Nor, presumably, are those features essential to the concepts of anyone who denies their universal applicability. One or more of these features might be universally present in humans, or even in all animals and AI systems that could ever be bred or built; but if so, that's a fact about the world, not a fact that follows simply from our shared concept of consciousness.
In defining a concept, you get one property for free. Every other property must be logically proved or empirically discovered. I can define a rectangle via one (conjunctive) property: that of being a closed, right-angled, planar figure with four straight sides. From this, it logically follows that it must have four interior angles. I can define gold as whatever element or compound is common to certain shiny, yellowish samples, and then empirically discover that it is element 79.
Regarding consciousness, then: None of the ten purported essential properties logically follow from phenomenal consciousness as ordinarily defined and understood (generally by pointing to examples). None are quite the same as the target concept. You can choose to define "consciousness" differently, for example, via the conjunctive property of being both a conscious experience in the ordinary sense and one that is knowable by the subject as it occurs. Then of course luminosity follows. But you've changed the topic, winning by definitional theft what you couldn't earn by analytic hard work.
Could luminosity, subjectivity, unity, etc., covertly belong to the concept of consciousness, so that the right type of armchair (not empirical) reflection would reveal that all possible conscious experiences in every possible conscious entity must necessarily be luminous, subjective, or unified? Could subtle analytic hard work reveal something I'm missing? I can't prove otherwise. If you think so, I await your impressive argument. Even Kant held only that luminosity, subjectivity, and unity were necessary features of our experience, not of all possible experiences in all possible beings.
Set aside purely conceptual arguments, then. If we hope to defend the essentiality of any of these ten features, we'll need an empirically justified universal theory of consciousness.
That brings me to the second reason I've presented this feature list. I conjecture that universal theories of consciousness, intended to apply to all possible beings, do not justify the universality of (one or more of) these features so much as circularly assume it. Developing this conjecture will have to wait for another day.