Thursday, November 12, 2020

The Nesting Problem for Theories of Consciousness

In 2016, Tomer Fekete, Cees Van Leeuwen, and Shimon Edelman articulated a general problem for computational theories of consciousness, which they called the Boundary Problem. The problem extends to most mainstream functional or biological theories of consciousness, and I will call it the Nesting Problem.

Consider your favorite functional, biological, informational, or computational criterion of consciousness, criterion C. When a system has C, that system is, according to the theory, conscious. Maybe C involves a certain kind of behaviorally sophisticated reactivity to inputs (as in the Turing Test), or maybe C involves structured meta-representations of a certain sort, or information sharing in a global workspace, or whatever. Unless you possess a fairly unusual and specific theory, probably the following will be true: Not only will the whole animal (alternatively, the whole brain) meet criterion C; so also will some subparts of the animal and some larger systems to which the animal belongs.

If there are relatively functionally isolated cognitive processes, for example, they will also have inputs and outputs, and integrate information, and maybe have some self-monitoring or higher-order representational tracking -- possibly enough, in at least one subsystem, if the boundaries are drawn just so, to meet criterion C. Arguably too, groups of people organized as companies or nations receive group-level inputs, engage in group-level information processing and self-representation, and act collectively. These groups might also meet criterion C.[1]

Various puzzles, or problems, or at least questions immediately follow, which few mainstream theorists of consciousness have engaged seriously and in detail.[2] First: Are all these subsystems and groups conscious? Maybe so! Maybe meeting C truly is sufficient, and there's a kind of consciousness transpiring at these higher and/or lower levels. How would that consciousness relate to consciousness at the animal level? Is there, for example, a stream of experience in the visual cortex, or in the enteric nervous system (the half billion neurons lining your gut), that is distinct from, or alternatively contributes to, the experience of the animal as a whole?

Second: If we want to attribute consciousness only to the animal (alternatively, the whole brain) and not to its subsystems or to groups, on what grounds do we justify denying consciousness to subsystems or groups? For many theories, this will require adjustment to or at least refinement of criterion C or alternatively the defense of a general "exclusion postulate" or "anti-nesting principle", which specifically forbids nesting levels of consciousness.

Suppose, for example, that you think that, in humans, consciousness occurs in the thalamocortical neural loop. Why there? Maybe because it's a big hub of information connectivity around the brain. Well, the world has lots of hubs of complex information connectivity, both at smaller scales and at larger scales. What makes one scale special? Maybe it has the most connectivity? Sure, that could be. If so, then maybe you're committed to saying that connectivity above some threshold is necessary for consciousness. But then we should probably theorize that threshold. Why is it that amount rather than some other amount? And how should we think about the discontinuity between systems that barely exceed the threshold and those that barely fall short?
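To make the threshold worry concrete, here is a deliberately toy sketch. Everything in it is invented for illustration: the little graph, the edge-density "connectivity" measure, and the 0.5 cutoff do not correspond to any published theory's actual criterion. The point is purely structural: a small "brain", one of its subsystems, and a larger group containing that brain can all clear the same threshold.

    from itertools import combinations

    def connectivity(nodes, edges):
        # Toy measure: fraction of possible node pairs that are actually linked.
        possible = len(list(combinations(sorted(nodes), 2)))
        within = sum(1 for a, b in edges if a in nodes and b in nodes)
        return within / possible if possible else 0.0

    THRESHOLD = 0.5  # arbitrary stand-in for "enough connectivity for consciousness"

    # A densely linked six-node "brain"...
    brain_nodes = {"a", "b", "c", "d", "e", "f"}
    brain_edges = {("a", "b"), ("a", "c"), ("b", "c"), ("c", "d"), ("d", "e"),
                   ("d", "f"), ("e", "f"), ("b", "e"), ("a", "f")}

    # ...one of its subsystems (nodes a, b, c)...
    sub_nodes = {"a", "b", "c"}

    # ...and a "group": the brain plus two outside nodes richly linked to it.
    group_nodes = brain_nodes | {"x", "y"}
    group_edges = brain_edges | {("x", "a"), ("x", "d"), ("x", "f"), ("y", "b"),
                                 ("y", "c"), ("y", "d"), ("y", "e"), ("y", "x")}

    for label, nodes, edges in [("whole brain", brain_nodes, brain_edges),
                                ("subsystem", sub_nodes, brain_edges),
                                ("group", group_nodes, group_edges)]:
        score = connectivity(nodes, edges)
        print(f"{label}: connectivity {score:.2f} -> meets criterion C? {score >= THRESHOLD}")

All three levels come out as meeting C (edge densities of 0.60, 1.00, and 0.61), which is just the nesting problem in miniature: nothing in the bare threshold itself singles out one level as the conscious one.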

Or maybe instead of a threshold, it's a comparative matter: Whenever systems nest, whichever has the most connectivity is the conscious system.  But that principle can lead to some odd results. Or maybe it's not really C (connectivity, in this example) alone but C plus such-and-such other features, which groups and subsystems lack. Also fine! But again, let's theorize that. Or maybe groups and subsystems are also conscious -- consciousness happens simultaneously at many levels of organization. Fine, too! Then think through the consequences of that.[3]

My point is not that these approaches won't work or that there's anything wrong with them. My point is that this is a fundamental question about consciousness which is open to a variety of very different views, each of which brings challenges and puzzles -- challenges and puzzles which philosophers and scientists of consciousness, with a few exceptions, have not yet seriously explored.

--------------------------

Notes

[1] For an extended argument that the United States, conceived of as an entity with people as parts, meets most materialist criteria for being a conscious entity, see my essay here. Philip Pettit also appears to argue for something in this vicinity.

[2] Giulio Tononi is an important exception (e.g., in Oizumi, Albantakis, and Tononi 2014 and Tononi and Koch 2015).

[3] Luke Roelofs explores a panpsychist version of this approach in his recent book Combining Minds, which was the inspiration for this post.


37 comments:

  1. Psychological-consciousness or physiological/neurological-consciousness...are problems...

    Phenomenological-consciousness's problem is understanding...what is an object...
    ...that understanding and consciousness are objects of self, of an objective self...

    You wrote, 'a general problem for computational theories of consciousness, which they called the Boundary Problem.', thanks for searches...
    ...Do you have a view about the nestinglessness of phenomenological-consciousness...

  2. Thanks for this post; it was an interesting read. I do have a question that I hope you can answer.

    We seem to be, in many of these instances, conflating Consciousness with what may end up being other, non-consciousness related things.

    We have the following givens when talking about Consciousness.

    1) Our only experience of Consciousness right now is in biological organisms, and, at that, in one particular biological organism, Homo sapiens.

    2) Along with Consciousness, biological organisms have evolved other traits and features over time in order to survive. Things such as locomotion, vision, hearing, etc.

    3) So Consciousness is either a feature we evolved to aid in survival or an epiphenomenon arising from something else that evolved for survival.

    Now, when talking about other biological features, we don't go assigning them to anything other than the organism itself, so why do it with Consciousness?

    For example, we wouldn't say that the locomotive ability of an animal means that we must somehow consider that all its parts have some small bits of Locomotion in them. Nor would we say that all the individual locomotions of each member of a group of animals somehow add up to a group-level locomotion feature. We would consider these sorts of conjectures absurd for most biological features but not when talking about Consciousness.

    Any thoughts why?

    Thanks,
    Caesar

  3. To be fair to many of these theories, it matters what domain the theory claims to apply to. A theory that simply says what it postulates *is* consciousness, like IIT, will claim that any system that meets its criteria is conscious. However, many other theories, like higher order or global workspace, make no such broad claims. Their domain is organic brains. In other words, they're only claiming to be a theory of how consciousness works in an overall brain, not consciousness in any system anywhere. That's not to say those theories may not pertain to AI, but you have to bring in a lot of other neuroscientific concepts.

    That said, each of these theories involves a particular definition of consciousness. And that's problematic, since there is no consensus on such a definition. In truth, no one consistent definition of consciousness seems to include all systems we intuitively see as conscious and reliably exclude systems we intuitively see as not.

    I think the reality is, consciousness is in the eye of the beholder. We can talk about systems that have memory, perception, attention, learning, imagination, emotion, and introspection, but which collections of such capabilities are conscious will always be a judgment call.

    This shouldn't be surprising. The concept of consciousness began rooted in Cartesian dualism. As that notion has become untenable, that conception of consciousness has as well.

    Put another way, when Alice ponders Bob's consciousness, she's pondering how much Aliceness Bob has. When Bob ponders Alice's consciousness, he's pondering how much Bobness Alice has. When people ponder animal consciousness, it's a question of how much humanness the animal might have. And when we ponder a machine, it's a question of how much lifeness it might have. In other words, what we really seem to mean by "consciousness" is: "like us."

    When we look at it that way, it becomes much more obvious that there's no fact of the matter, just degrees of similarity.

  4. Thanks for the comments, folks!

    Arnold: My own view is that there are a variety of options, all with bizarre-seeming implications, and we don't have a good method for settling among them.

    Caesar: I'm not sure about the exact meaning of claim 1. In a certain sense, my only experience of consciousness is in a single individual organism: me. In another sense, I experience other people and other mammals (such as my pet dog Cocoa) as conscious. Claim 3 seems right (on a broad reading of "epiphenomenon"), but is quite weak. The analogy to locomotion doesn't quite work, I think. That is a trait that is clearly applicable to the whole, yes. But other traits are applicable both to parts and wholes, such as being alive or representing. With being alive and representing, there doesn't seem to be any big obvious problem with nesting attributions (there might be non-obvious problems), but with consciousness, nesting attributions seem to lead to consequences that many people would find bizarre.

    SelfAware: Right, good point that some of the more empirically-grounded theories of consciousness are explicitly limited to certain types of system and don't aim at a general C. I do think the question still arises about how to extend their domain-specific C to other entities or other levels, even if the theorist avoids commitment about how it would extend. The shift to being in "the eye of the beholder" seems wrong to me though. I have trouble wrapping my mind around non-realist views of that sort. I'm a phenomenal realist in the weak sense of that term (not committed to denying reduction, or materiality, or anything spooky): I think there are facts of the matter, independent of our judgments about those facts, about what organisms are or are not conscious. (Such a view is compatible with in-between or indeterminate cases.) I'm happy to hear more, but that's probably an immovable starting point for me.

  5. Eric,
    I should clarify that I think for any precise definition of "consciousness" there is a fact of the matter. The difficulty is in getting everyone to agree with such a definition.

    It's worth noting that consciousness isn't the only concept that's difficult to define. Life is another one. Are viruses alive? What about viroids? Or prions? The United States? :-) We can talk about whether these things replicate, maintain homeostasis, undergo evolution, etc. Each of those questions seems to have a definite answer. On the other hand, whether the entities in question have a single vitalistic life force depends on what we mean.

    Likewise with consciousness, we can talk about whether a system has distance senses it uses to build image maps, exogenous attention, endogenous attention, episodic memory, various types of learning, introspection, etc. Each of these is easier to get a definite answer on than an overall assessment of consciousness.

    The overall conclusion I reach is that consciousness, like life, is unlikely to ever be solved with a single theory. It will likely be a galaxy of theories, many of them complex and difficult for the lay public to understand, just as the various microbiological ones on the border between biology and chemistry are.

  6. As I write my PhD sample there is definitely tension between addressing the fringe and more orthodox issues. I rather expect that my panpsychist views make the more conservative approach the wiser one.

  7. Professor,
    I wonder if you could clarify your current stance on IIT? I was just reading Scott Aaronson’s 2014 assessment, though the conclusion left me unfulfilled. Instead of explaining that some sort of non-human object had high Phi / consciousness, what he actually said was:

    “We could achieve pretty-good information integration using a bipartite expander graph together with some fixed, simple choice of logic gates at the nodes: for example, XOR gates.”

    What the hell is that? A mathematical description fit for the printed page? A general symbolic representation? But fortunately he also linked to your 2012 post about IIT implying that the USA itself should be conscious. Apparently there was a hiccup however because you weren’t using the latest IIT version. Though I think you did do a good job undressing that version, apparently right now you’re giving IIT a pass. Are they allowed to simply state things such as “By the way, we’re no longer panpsychists” or “Nestings and expansions of our theory are no longer considered valid interpretations”?

    Anyway I’m pleased that you’ve observed this trait in various other proposals. It doesn’t make them “wrong” as you’ve noted, though the point I’d include as well is that their open nature should put them in the class of “supernatural” when they are taken alone, or at least in need of a statement that they’re agnostic regarding whatever it is that effectively does create qualia. To stay right with John Searle and my own “thumb pain” version of his Chinese room, in a natural world processed information will require mechanical instantiation.

    In a 2002 paper UK professor Johnjoe McFadden offered such theories a potential escape, though without takers. Note that there is no place in the body which produces the complex electromagnetic radiation which McFadden theorizes to exist as qualia, except for the brain.

    Beyond his recent paper I've heard that he's finally writing a book on his cemi field theory, hopefully to be published next year. Personally I think he should quit being so congenial and go on the offensive, preferably by incorporating my own thought experiment. Do you remember it, professor? The TL;DR is that information-based consciousness theories imply that if the right information on paper were properly converted into other information on paper, then something in this paper shuffle would thus feel the qualia that we know of when our thumbs get whacked. To avoid such funkiness a given theorist could either admit "Beyond my theory itself I have no opinion about which mechanisms create qualia", or try to incorporate a theory which does specifically address those mechanics, such as McFadden's.

  8. I think there are good reasons to accept nesting despite its initial counterintuitive feel for many of us. For one, it's a medical reality that some people who have lost parts of their brains can report on the before/after phenomenal differences. I've argued that the best (though certainly not the only) interpretation of this is that the portion which survives the disconnection was already independently conscious beforehand. Our brains include multiple overlapping independently conscious regions. It seems crazy until you compare it to the alternatives, and some of the initially compelling objections (I only introspect myself, not others) don't hold up. Consciousness then would be like other physical properties that permit nesting and overlap: temperature, charge, volume, mass, energy.

  9. Thanks for the continuing comments, folks!

    SelfAware: Ah, that's clearer. For sure, there are complications about defining the term. I *hope* that there is a single best natural-kind "reference magnet" once we get our positive and negative examples lined up right -- i.e., a fact in nature about what-it's-like/consciousness/experience that is the natural referent of terms like "phenomenal consciousness" in philosophers' usage. But that is a risky assumption. (I have a paper about this in JCS in 2016.)

    Patrick: Some people find panpsychism laughably absurd. I don't. But there is a risk!

    P Eric: I definitely don't want to give IIT a pass, though I am politic in this post. I think the critique of Exclusion that I posted in 2014 (a version is also in my 2015 USA consciousness paper) is devastating to IIT 3.0 and later and has never been seriously addressed by friends of IIT, despite some off-the-cuff remarks in Tononi and Koch 2015. Luis Favela has the best engagement with it, in one of his papers (2018?), and suggests that IIT might need to revise Exclusion to be workable, though it's also a brief treatment and he doesn't have a full positive solution. On the expander graphs: I think they can be instantiated physically though of course they are usually instantiated virtually in typical serial computers. On paper: Yes. A related case is simply a Turing machine instantiated as Turing imagined on printed paper with a read/write head. If you accept that the Turing-machine equivalent of any conscious being is also conscious, and that conscious beings do have Turing-machine equivalents, then such an arrangement would be conscious.

    jblackmon: That is an interesting idea, and the comparison to physical quantities is similar to Roelofs. It has some wild-seeming implications about the number of different streams of consciousness going on in your head, and working out their relationships and boundaries will be weird and interesting. But something wild and weird must be true. I completely agree that all the alternative views also have bizarre implications. There are no non-bizarre options left, and something "crazy"-seeming must be true! (On the last point, see my 2014 paper in AJP.)

  10. I've yet to read Roelofs. Unger's 'The Problem of the Many' makes clear how absurd he finds the view. I argue for it in my 2016 paper on hemispherectomies and independently conscious brain regions.

  11. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4731102/ ...
    ...'For many decades, neuroscientists understood the brain as a 'stimulus–response' organ, consisting … In this traditional model, learning and experience merely modulate neural activity that is driven by sensory events in the world'...

    Isn't it still too soon to attach consciousness to anything...
    ...that "many deep brain neural prediction system layer's" work, is to sort...
    ...not 'stream consciousness'...

  12. https://phys.org/
    Does the human brain resemble the Universe?..by Università di Bologna, NOVEMBER 16, 2020

    Article about the possible commonality of universes, cosmoses, brain-systems, bio-systems, AI-systems and consciousness (systems)...thanks

  13. I was naive last year, and, not realizing the risk, I believe I leaned into the panpsychism a bit too much. I read Phenomenal Consciousness, Defined and Defended as Innocently as I Can Manage and it has been a guide for me as I rewrite my sample.

    Your blog was/is incredibly helpful in the admission process, even though I found it late in the game. I'm hopeful that I can improve on last year's outcome of two wait-lists and one (disappointing) offer.

  14. In the full knowledge of how dumb it is to declare that there's an easy solution to a hard problem... I'm going to declare that I still think there is a reasonably straightforward solution to this problem.

    First, it seems sensible to say that it seems fine that consciousness exists within some subset of the thing to which we normally assign consciousness - and that doesn't imply multiple streams of consciousness at all. So for example, I think consciousness is a feature of humans, and humans are 100% physical beings (is that called physicalism? I can never get these terms straight). If you cut a finger off a person, that would make them physically smaller, but wouldn't alter the fact that they're conscious. Clearly, a human being's consciousness can survive the removal of some physical parts of their body. So the fact that the term "conscious" can be applied to a complete human, but also some subset of a human, shouldn't strike us as inherently problematic. The fact that we *apply the term conscious* to a human, and also to some subset of a human, doesn't mean that two consciousnesses exist. It's just a feature of the way the word "conscious" (and most other words) works.

    Nesting would exist as a *problem* iff we see certain things in the universe as conscious, AND believe that certain subsets of those things are also conscious, AND we believe that the consciousnesses we see in the things and their subsets are different consciousnesses. But in general, it's normal to see things (people) as conscious, and some subset of each of those people as conscious, simply because we usually work with "people" as a unit, without worrying about exactly which section of the person does what. In this sense, consciousness would be no different from redheadedness or jollity.

    Even if we do see separate consciousnesses, I'm not keen to jump straight to Tononi's exclusion principle "at any given time there is only one experience having its full content...[within] a particular spatial and temporal grain". That just doesn't sound much like life as I know it. I feel like I'm having several experiences at the same time - as I write I'm sitting at the table, my wife and son are both at the table too, and I'm aware of them and of what I'm writing here. I don't see any reason to define this experience as "only one"; nor do I think that I necessarily spatially exclude other consciousnesses. For example, I could have a "split personality," with two persons inside my body having different experiences at the same time (I don't know if that's psychologically possible, but I don't think there's any conceptual problem with it).

    So, I'm still leaning towards a Dennett-style view that there's nothing to see here. These worries about nesting and exclusion just seem to be... invented.

  15. Thanks for the continuing comments, folks!

    Jblackmon: Yes, interesting article! I think Unger misses the severity of the problem by assimilating it too quickly to superficially similar-seeming problems about object constitution.

    Arnold: Thanks for the suggestion!

    Patrick: Thanks. I'm glad you found the series, and the article, helpful.

    chinaphil: As you say, the problem is:
    "Nesting would exist as a *problem* iff we see certain things in the universe as conscious, AND believe that certain subsets of those things are also conscious, AND and we believe that the consciousnesses we see in the things and their subsets are different consciousnesses." As I see it, it's not a Tibbles/Tib+tail problem, like your finger case, which I'll leave to be sorted out in the metaphysics of objects. It's rather that you (arguably) get different streams of consciousness at different levels. I might also pull apart a couple of things in your last paragraph: A single unified stream of experience can presumably have several elements in it (see, e.g., Tim Bayne) or the elements could really belong to separate streams of experience that are only partly unified. The first option doesn't challenge Exclusion at all and is a very of saying "exactly one stream, exactly one privileged level". The second is more challenging to standard views, introducing questions and puzzles that people haven't yet worked out very well -- for example, concerning unity, introspective privilege, resolution of conflict, and a potentially unintuitive multiplicity of streams of consciousness.

  16. Btw, I meant to refer to Unger's 'The Mental Problems of the Many', which is not just the good ol' problem of the many.

  17. I believe a clear and useful consensus definition can be crafted of consciousness as sentience, which means feeling (not intelligence as is often incorrectly assumed). Why not make the attempt and see how it develops? Here’s my starter contribution:

    Definition:

    Consciousness, noun. A biological, embodied simulation in feelings of external and internal sensory events and neurochemical states that is produced by activities of the brain.

    Description:

    That consciousness is a simulation is routinely overlooked, but is obvious when considering, for example, that we don’t see wavelengths of light, we see colors; we don’t hear compression waves in the air, we hear sounds ... and so on for each sensory mode. Consciousness is the simulation. The feelings are the contents of consciousness. Feelings are physical feelings like touch, pain, hunger, sight, sound and smell. Each sensory track is simulated in unique feelings. Diverse elements of a single sensory track are uniquely simulated, as in feelings of different colors in vision and different odors in olfaction. Conscious thought in words or pictures seems non-physical, but thought is sensory-inhibited physical feeling—speech-inhibited or vision-inhibited feeling respectively. Emotions are feelings that are simulations of a brain’s limbic system and neurochemical states: fear, anger, depression and so on. Normal conscious experience presents as a unified “movie-like” flow, as in “stream of consciousness.”

    Note that David Chalmers of “hard problem” fame said in an interview that Thomas Nagel’s “what it’s like” to be a particular animal is what it feels like to be that creature. In his critique (“Is There Anything It Is Like to Be a Bat?”) of the “something it is like” phrase, P. M. S. Hacker uses the words “feel” and “feelings” about 60 times. Those two instances alone (and there are many more) would seem to validate a definition of consciousness as sentience.

    Unconscious (non-conscious) brain processes may be involved in the resolution of feelings but should not be mistaken for consciousness. For instance, intelligence (pattern matching) and expectation resolution are not conscious processes.

    Armed with the sentience-centered definition and description we can ask and answer questions about who or what is conscious. As I’ve mentioned in previous “The Splintered Mind” comments, even though we can only be certain about our own consciousness, we can infer consciousness in others based on biostructural, neurochemical/biochemical and DNA similarity. We can very confidently infer that primates and all mammals are conscious. The inference strength for the consciousness of birds and octopuses is very high as well. Rocks and computer laptops are not conscious. The United States is not conscious. We frequently consider the possible consciousness of AI systems, apprehensively or for moral reasons, but AI systems like Star Trek’s Commander Data are not conscious. That’s not to say, however, that we should rule out moral consideration for a non-sentient system that computes itself centered in a world.

    Some philosophical preferences or conceptions would surely take issue with this definition but I believe the objections should be disallowed. GWT and IIT are theories with philosophical abstractions about the creation of consciousness, but workspaces and information relationships are not biological. Panpsychists and neutral monists routinely fail to define consciousness but use the term in a non-biological disembodied way, leading to the conclusion that panpsychism, neutral monism and similar philosophical beliefs are not about biological sentience.

    Take any claim or alternate definition of consciousness, weigh it against the preliminary descriptive definition I’ve provided and let me know how it goes. A group effort would be helpful in refining this definition and overcoming its weaknesses, so jump right in with suggestions for improvements and clarifications.

  18. By the way, this definition was suggested by William James of stream-of-consciousness fame, in a talk of December 1, 1884 as printed in his book The Meaning of Truth. In the biography William James, Robert D. Richardson writes

    “The Function of Cognition,” which might more helpfully have been called “What Cognition Is,” claims only to be “a chapter in descriptive psychology ... not an inquiry into the ‘how it comes,’ but merely into the ‘what it is’ of cognition.” It is, says James, “a function of consciousness,” which “at least implies the existence of a feeling.” He explains that he is using the word “feeling” to “designate generically all states of consciousness,” including those sometimes called “ideas” or “thoughts.” “Feeling” remains for James the most general, most inclusive term for “state of consciousness.”

  19. We seem to be really close on your proposal, Stephen, so I’d like to help. Furthermore after reading the professor’s challenge to Keith Frankish, “Phenomenal Consciousness, Defined and Defended as Innocently as I Can Manage” (thanks to Patrick Glass above), I’d think he’d be open to your “sentience”-based consciousness definition as well, rather than an “intelligence”-based one. (Before this post I hadn’t realized what a bad ass professor S. happens to be! Like Searle before him, he seems to use sensible reasoning to challenge all sorts of pompous intellectuals who use their verbal acumen and charm to “bewitch” many intelligent people into funky positions.)

    One suggestion that I’d make would be for us to reduce your “biology” requirement to something more basic. Notice that evolution should use the properties of physics / chemistry to “blindly engineer” the traits of life. Thus in principle it should be possible for something other than evolution to technologically create a sentient entity using that same physics / chemistry, and yet not also by means of any “evolved biological stuff”.

    Personally I’m not optimistic about humanity building functional conscious robots some day, since our machines seem ridiculously primitive when compared against biological machines. But I also see no reason for us to draw our definitional circle smaller than it technically should be drawn. Accuracy here might even help our cause in the eyes of any sensible sci-fi lovers out there. I suspect that many can’t stand Searle mainly because they falsely presume his opposition to robot consciousness, essentially given his prominent use of the “biology” term. Beyond his strengths, let’s also use the faults of this UC Berkeley professor to help instruct us.

  20. Thanks for the continuing thoughtful comments, everyone!

    jblackmon: Right, I figured! (I do still think it overassimilates to the non-mental problem of the many.)

    Stephen & Phil Eric: While I think Stephen's definition might work reasonably well for someone who is already in broad theoretical agreement with him, I prefer the much less theory-laden definition by example that I offer in the 2016 paper that Phil Eric mentions. It is for instance unclear to me why biology or neurochemistry should be required for consciousness, unless "biology" and "neurochemistry" are construed very broadly to include, for example, AIs that we might create in the future and organized groups and various types of hypothetical (possibly actual) alien entities -- maybe (if we want to engage also with dualists or idealists) even immaterial souls. Such entities might or might not be conscious (or in the case of souls, might or might not exist), but in my view that's a matter to be settled by reasoning and evidence rather than a matter that can be settled by the *definition* of "consciousness".

  21. What if consciousness explains the part-whole relationship? -- in other words, for something to be a part is for us to be conscious of it as a part, and for something to be a whole is for us to think of it as a whole. If that's the order of explanation, it would make sense to me that we will never get anywhere trying to use part/whole logic to explain consciousness.
    [ironically to leave this comment I had to check the squares with traffic lights -- there was a piece of a traffic light in one square and a piece in another square. I had to decide does "a piece of a traffic light" count as a traffic light. Signs and wonders!]

  22. Professor,
    The reason I speculated that you might agree with Stephen’s “sentience” definition for consciousness is that it seems isomorphic with the “phenomenal experience” definition that you championed in that 2016 paper, effectively argued by means of relevant examples. I consider each of us to be referring to the same essential idea here. Hopefully Stephen will be able to reduce his “biology” stipulation to something more basic, at least conceptually, since you and I consider this essential.

    Furthermore beyond your illustration by example (which is clearly a great way to go), I’m interested in your “wonderfulness condition”. I agree that whatever creates phenomenal experience should remain wondrous to us, even after being somewhat explained.

    An analogy lies in Newton’s gravity. He famously left the reason that mass would attract mass open for future natural philosophers to address. If he had instead postulated that his theory was the whole of it, and thus no underlying wonder was left to discover, this would be analogous to the position of global workspace theorists today. It may be that their theory does have some validity to it, but just as Newton postulated something wondrous beyond his base theory itself (a wonder later confirmed by Einstein and others), they should open up this door as well. There’s nothing wondrous about global information processing producing phenomenal experience, any more than mass inherently attracting mass. Furthermore this puts them under fire from accomplished philosophers such as Searle and yourself. (My own “thumb pain” thought experiment has yet to see the academic light of day, though I consider it devastating.)

    Here Bernard Baars and newer proponents might ask, “Okay, what would it effectively take for us to follow the model of Newton and stay right with Eric Schwitzgebel’s ‘wonderfulness condition’?” It seems to me that they’d need to stop claiming that their theory does more than a natural world would permit it to. For example, I personally am quite impressed with the electromagnetic radiation proposal of Johnjoe McFadden, but in any case to deny “wonderfulness” here is to also deny naturalism.

  23. Eric, when SelfAware commented about “systems we intuitively see as conscious … and not,” he wrote that “consciousness is in the eye of the beholder.” You responded “I think there are facts of the matter, independent of our judgments about those facts, about what organisms are or are not conscious.”

    The sentience definition I proposed precisely specifies the facts of the matter of consciousness. I repeat it here for easy reference:

    Consciousness, noun. A biological, embodied simulation in feelings of external and internal sensory events and neurochemical states that is produced by activities of the brain.

    This definition states the facts of the matter for the only consciousness we know of—human consciousness. Respecting evolutionary knowledge, biostructural, bio-neurochemical and DNA similarity, the definition generalizes ‘human’ to ‘biological’ to allow the definition to encompass closely related animals and even distantly related species like the octopus. I believe the remaining elements of the definition are scientifically valid and uncontroversial facts of the matter—activities of the brain resolve sensory events and neurochemical states and produce simulations of them in feelings.

    If biology and neurochemistry shouldn’t be required, as you and PhilEric suggest, the definition would depart from known facts of the matter and be diverted into imaginative territory. The resulting evidence-free, nonfactual undefinition would then truly be in the eye of the beholder, which, in my opinion and perhaps that of SelfAware, is the current confusing state of much thinking and writing about consciousness.

    Eric, if you would like a word to refer to an AI’s computation of itself centered in a world, why not invent one? Philosophy commonly invents vocabulary and with the word consciousness already taken and clearly defined, inventing a new word would avoid creating confusion and promoting nonsense. For the AI case I suggest ‘aiwareness’ (pronounced eyewareness). If alien entities when encountered provide evidence that they are biological organisms similarly structured to ourselves with feeling-based consciousness such as our own, they’d fit the facts-of-the-matter consciousness definition. If not, create another new word for the phenomenon the aliens explain, perhaps ‘alienwareness’. Create another new word to refer to a group alienwareness once we have evidence it exists. And, since consciousness is already taken, why not also create new words for the fundamental ‘sensitivity’ of some kind that panpsychists and neutral monists suppose?

    The existence of ghostly immaterial minds (dualism) and souls (religion) must be demonstrated before investigating their features so, until convincing evidence is provided, a consideration of their sensitivities is meaningless.

    The new vocabulary words would probably not be as popular in philosophical publications as the current equivocation of the word consciousness provides, but I hope that consideration isn’t relevant to Philosophy’s quest to understand.

  24. Eric, sorry to take you back to comments of two weeks ago, but further consideration has left me rather disturbed about your comments in response. The comment thread in question was begun with SelfAware’s comment of 11/12: “I think the reality is, consciousness is in the eye of the beholder.”

    You replied on 11/13 that: “I think there are facts of the matter, independent of our judgments about those facts ...”. SelfAware agreed: “I think for any precise definition of ‘consciousness’ there is a fact of the matter.” On 11/16 you wrote: “... I *hope* that there is a single best natural-kind ‘reference magnet’ once we get our positive and negative examples lined up right -- i.e., a fact in nature about what-it's-like/consciousness/experience that is the natural referent of terms like ‘phenomenal consciousness’ in philosophers' usage.”

    Then on 11/18 I proposed a definition of the word ‘consciousness’ in an unambiguous way that included only facts of the matter. I proposed:

    Consciousness, noun. A biological, embodied simulation in feelings of external and internal sensory events and neurochemical states that is produced by activities of the brain.

    At this point I would include ‘streaming’ as: “A biological, streaming, embodied simulation in feelings ...”, since the streaming characteristic has long been recognized as a “fact of the matter” of consciousness.

    Here is the problem Eric: You replied on 11/19: “While I think Stephen's definition might work reasonably well for someone who is already in broad theoretical agreement with him, I prefer the much less theory-laden definition by example that I offer in the 2016 paper that Phil Eric mentions. It is for instance unclear to me why biology or neurochemistry should be required for consciousness ...”

    However, biology and neurochemistry are indisputably facts of the matter of consciousness. Your comment didn’t dispute any “fact-of-the-matter” characteristic of the elements of my definition, neither did you offer any facts of the matter that I omitted. After considerable thought, Eric, I have realized that your reply reverts to SelfAware’s claim that consciousness is in the eye of the beholder!

    So, I’d appreciate your take on your own “eye of the beholder” implication. Additionally, I would still like to learn your thoughts about the facts of the matter definition I’ve proposed—which elements are not, in your opinion, facts of the matter and what facts of the matter would you add?

    I don’t accept the current muddled state of consciousness discussions that are rendered meaningless without an agreement on the definition of the core terminology. Why can’t we fix that?

  25. SelfAware: I don't think my assertion implies that. Here's an analogy. Suppose we're in the 17th century. You define "stars" as "fixed points in the sky that emit light". I define stars by ostension as "whatever type of things those things are". Your definition has theoretical commitments that mine lacks. But mine doesn't make whether something is a star or not a matter of something in the eye of the beholder.

  26. Hi Eric,
    You addressed your last reply to me, but I think you meant it for Stephen?
    Mike

  27. Stephen and Mike: Whoops -- that's right! Thanks for the catch.

  28. Thanks for replying Eric.

    Your analogy is light on the facts of the matter though. “Fixed points in the sky” is not factual because the points move and “whatever type those things are” is wholly devoid of facts. The analogy seems beside the point as well, which is the construction of a broadly acceptable factual definition of the word ‘consciousness.’

    In the definition of consciousness I have proposed, would you please identify any “theoretical commitments” that you see? I truly fail to see any. My entire intention was to define the term with facts of the matter only, none of which you have disputed. On the other hand, a definition of ‘consciousness’ that includes consciousness in non-biological substrates clearly makes a theoretical commitment that produces entertaining science fiction/fantasy but is out of place in a definition of a biological phenomenon.

    Replies
    1. Yes Stephen, the only sentience that we know of is biological. I’m sure that the professor would agree if you ask explicitly. But I doubt he’ll ever provide a definition for sentient entities (which is to say his and your mutual definition for conscious entities) that mandates biological instantiation exclusively. If there is a physics by which biological entities become sentient, then logic suggests that such physics in a lab setting could create something non-biological that has phenomenal experience as well. Unless you don’t consider sentience to weakly emerge from the physical (and as a naturalist, of course you do!), we agree. So why not acknowledge our agreed upon consciousness definition in the attempt to help academia finally get beyond its current “eye of the beholder” paradigm?

  29. PhilEric, here’s a version of a comment I just posted to SelfAwarePatterns’ blog, which discusses the issue:

    ************

    As to “what about [consciousness] requires that it be biological?”—this is a definition—a statement of the exact meaning of the word. In this case, the exact meaning captures the facts of the matter of consciousness. Biological is specified as a fact of the matter because all reliably observed or inferred instances of consciousness are biological.

    Per Wikipedia, “A term may have many different senses and multiple meanings, and thus allow multiple definitions,” but the definition I’m proposing is the a. definition, one that is broadly acceptable for consciousness studies discussions. (Follow-on b-z definitions of some type of awareness are possible, as in ‘AI consciousness’ and ‘panpsychic consciousness’ but all of them would be explicitly understood as fictitious, imaginative and highly conjectural since none of them can be affirmed as existing).

    As I’ve mentioned before, if a feeling is determined to be a specific Neural Tissue Configuration, perhaps a sheet of neural tissue connected and deformed in a particular way, it would be obvious that a machine replication wouldn’t be possible. But until we know exactly how consciousness works, there’s no reason whatsoever to suppose that a non-biological consciousness is possible. All such suppositions are fictions.

    If a non-biological instance of consciousness were claimed—a non-biological feeling—it would be extremely difficult to confirm because we’re limited to strength-of-inference with regard to the consciousness of other organisms, even another human being. Any inference would seem impossible in a non-biological case.

    Replies
    1. Stephen,
      Thanks for the “heads up” on your commentary over at Mike’s blog.

      Given your theme I wonder which of these sounds better to you — 1) that science presently gains a generally accepted and apparently useful definition for the squishy “consciousness” term that corresponds both with your “sentience” and the professor’s “phenomenal experience” conception, or 2) that science presently remains unable to achieve any such agreement specifically because you demand that a “biology” stipulation be included additionally that others refuse to grant? Would you prefer 1, and so an end to the “eye of the beholder” paradigm that our soft mental and behavioral sciences struggle under, and even without explicitly mentioning “biology” whatsoever, or would you prefer 2 because presumably in the first case great harm would come in that scientists should then erroneously decide that something non-biological might be created in a lab setting which experiences its existence phenomenally? (Note that science would march on in either case, but I wonder which of these you’d consider more healthy today in science, 1 or 2?)

  30. To which I have recently added:

    The whole point is to develop a definition of consciousness that includes only facts of the matter. (I’m tempted to acronymize that but I’ll restrain myself).

    The proposed definition is intended to remedy the “eye of the beholder” equivocation of the term that leads to so much nonsense and misunderstanding in consciousness discussions. Convincingly show that an element of the proposed definition is not a fact of the matter and it will be removed. Convincingly propose a definitional element that is clearly a fact of the matter and it will be added. I welcome any definitional suggestions. If non-biological feelings are ever verifiably created, then that will be a new fact of the matter and the definition would be revised to incorporate that newly discovered fact.

    The definition of a noun is not a fiat. It doesn’t dictate possibilities or impossibilities. It factually defines the term so we all can be certain what we’re talking about in discussions like those using the word ‘consciousness.’ I believe that’s an immeasurable improvement over discussions of ‘what-it’s-like-ness’ ... ;-)

    As for AI consciousness, panpsychism and neutral monism, we’d all know such discussions are not about consciousness definition a. but are, instead, about some proposed form of awareness that the proponents are required to clearly define, but have not.

  31. Actually PhilEric, I seriously doubt that neuroscience objects to consciousness being defined as biological. The issue is with Philosophy, not Science.

    Much of the “Consciousness Studies” branch of Philosophy is rooted in lack of definitional specificity and several fundamental misunderstandings. I doubt if anyone would be discussing AI consciousness had Artificial Intelligence not been so disastrously misnamed. A more accurate term would have been Massive Computational Pattern Matching. Would Philosophy ponder the possibility of MCPM systems developing and/or possessing consciousness? That unfortunate use of the term ‘intelligence,’ the widespread misunderstanding that intelligence is an attribute of consciousness and the incorrect view that computers are “electronic brains” led to a science fictional, movie media fascination with the Artificial Other: Colossus: The Forbin Project, The Terminator and so on. It’s a cultural fascination that has been escalating for decades, and early on it seeped into Consciousness Studies.

    Were it common knowledge that computer processors are essentially routers of high and low voltages—computers do not and can not know anything—perhaps some rationality would return. The other unfortunate and misguided association underlying concerns about AI consciousness is cortical consciousness theory. While the massive connectivity and parallel processing of the cortex is provably related to the production of conscious content, no evidence exists that cortical activity produces consciousness, while significant evidence exists that it does not. But the erroneous widespread belief that a computer system’s massive connectivity and parallel processing is the same as cortical connectivity and processing has led many to incorrectly believe that consciousness can ‘emerge’ in complex computer systems.

    I also suspect that a facts of the matter definition of consciousness would be resisted and rejected by Philosophy’s Consciousness Studies community because of its deleterious effect on the flourishing Commercial Philosophy marketplace.

  32. Be that as it may, Stephen, what would you personally consider more healthy for the institutions of both science and philosophy — 1) that your and the professor’s stated conception of “consciousness” becomes widely accepted today, except with no mention of biology whatsoever, or 2) that consciousness would remain “in the eye of the beholder” for academia in general? It’s fine if you need to make some clarification as well, but I’m asking for you to pick between these two options as if you personally were able to make that be the case right now.

  33. Stephen,
    I realize that I’ve presented you with a difficult choice. Your investments correspond with “2”, though my response to that might be unsettling. Conversely with a response of “1” you’d effectively be aligned with the professor and me, though at the apparent cost of one of your most cherished positions — namely, your desire for “consciousness” to be inherently associated with biological dynamics. So perhaps your best option would be to choose not to decide? (Here I’m reminded of lyrics from the song “Freewill” by Rush.) In any case I’ll now provide a generic response for each option.

    I would of course be pleased if you’d choose option 1. I wonder if you’ve read the professor’s 2016 paper, “Phenomenal Consciousness, Defined and Defended as Innocently as I Can Manage”? (I found this on the professor’s home page, http://www.faculty.ucr.edu/~eschwitz/, which for convenience I’d love for him to add as a standard link right here on his blog.) The paper masterfully challenges the obstructionism of people like Keith Frankish who seem intent upon having consciousness exploration remain forever trapped under an “eye of the beholder” holding pattern. No “hard science” should have been able to achieve its firmness by means of such diverse definitional parameters. The paper’s innocence clause mandates that stipulations such as “biology” not be included, or at least not given modern uncertainty regarding what it is that the brain does to create phenomenal experience. (Furthermore I consider his “wonderfulness” clause to be at least as substantial.) With an innocent but still generally useful definition for consciousness widely agreed upon, our mental and behavioral sciences should finally gain a fighting chance to somewhat harden up. I suspect that your “sentience” as well as the professor’s “phenomenal experience” conception (as established by means of example) could effectively serve in this capacity.

    Then there’s option 2. Your effective reason for defining sentience / consciousness as an exclusively biological dynamic seems to be that we know of nothing beyond the biological brain which is able to create something sentient. (If you can think of any other reasons then please provide them as well!) Regardless we should consider how valid that line of reasoning happens to be. One way to assess this would be to extend it to other situations to see if it holds up in those cases.

    For example, beyond our planet we’ve found no other place in the universe where “biology” emerges (and with “biology” I roughly mean a genetic material based system which uses a host of causal dynamics for replication). Thus shall we define the term “biology” such that it can only emerge on our planet, and so any other places where similar systems might some day be found would be considered to house lower (“b”, “c”, “d”,…) varieties of “biology”, and perhaps we’d even presume that speculation about discovering such dynamics would be “fictitious”, “imaginative”, and “highly conjectural”? Of course not. Given the bajillions of planets out there, surely ours isn’t all that special.

    If sentience does (weakly) emerge through various not yet established dynamics of the brain, then today it would seem presumptuous to decide that such unknown dynamics could never be technologically replicated if we were to grasp something about what our brains do to create it (which is not to say that we ever will). So here we needn’t make any ontological claims about biology being inherent to sentience (any more than our planet would be inherent to “life”), but rather could wait for science itself to weigh in, and hopefully soon with an effective and widely accepted consciousness definition at our disposal.

  34. PhilEric, this isn’t about my “... desire for ‘consciousness’ to be inherently associated with biological dynamics”—it’s about agreeing on a definition that specifies only facts of the matter. A single proven instance of non-biological consciousness would allow striking ‘biological’ from the definition and I would happily make that change at that time.

    Until that happens, however, all of your supporting reasoning is speculation that I would categorize as highly speculative, rather than mildly speculative. As I’ve mentioned, it would be extremely difficult to impossible to recognize a non-biological instance as conscious and the grounds for doing so haven’t been specified. No one has any idea how to create non-biological consciousness and no one is working to achieve it. And expecting consciousness to ‘emerge’ from computer systems is like expecting digestion to emerge from a kitchen sink garbage disposal, which, just like biological systems, takes in foodstuffs and emits waste material.

    As to biology, nothing in that word’s definition as the study of living things requires them to be terrestrial.

    As regards my consciousness definition, only the biological qualification has been disputed. Until objections to the other facts are registered and/or additional facts of the matter are proposed, I have to believe the definition is a viable candidate to eliminate “eye of the beholder” equivocations.

    At this point I’m simply repeating myself to no one’s benefit, but I’ll gladly jump back into the discussion when some substantive change on the issue has been observed.

  35. Stephen,

    “As to biology, nothing in that word’s definition as the study of living things requires them to be terrestrial.”

    Right. But why would it be crucial to saddle a “consciousness” definition with the biology constraint, though not the “biology” definition with a terrestrial constraint, if in each case we merely have no examples to the contrary today? Here you seem not to grasp that I’ve demonstrated a fault in your logic. If you’d like to solidly ground the position that it’s effective to define consciousness as an inherently biological dynamic, then you’ll need to come up with a non-arbitrary reason for us to back that position. Instead you currently seem to be attaching one of your own pet preferences to the term.
