Wednesday, August 04, 2021

On the Apparent Inconceivability of Borderline Cases of Consciousness

Let's call a state or process a borderline case of consciousness if it is in the indeterminate gray zone between being a conscious state or process and being a nonconscious state or process. Consider gastropods, for example. Is there something it's like to be a garden snail? Or is there nothing it's like? If borderline consciousness is possible, then there's a possibility between those two: the possibility that there's kind of something it's like. Consider other vague predicates or properties, such as greenness and baldness. Between determinate baldness and determinate non-baldness is a gray zone of being kind of bald. Between determinate greenness and determinate non-greenness there's a range of kind of greenish shades, with no sharp, dichotomous boundary. Might consciousness be like that? Might some organisms be in that in-betweenish zone?

For a human case, consider waking from anaesthesia or dreamless sleep. Before you wake you are (let's suppose) determinately nonconscious. At some point, you are determinately conscious, though maybe still feeling confused and hazy-minded, still getting your bearings. Must the transition between the nonconscious and the conscious state always be sharp? Or might there be some cases in which you are only kind of conscious, i.e., there's only kind of something it's like to be you?

We need to be careful about the concept of "consciousness" here. You might be only half-aroused and not fully coherent, responsible, or aware of your location in time and space. Such a confused state has a certain familiar phenomenal character. But that is not a borderline case of consciousness in the intended sense. If you are determinately having a stream of confused experience, then you are determinately conscious in the standard philosophical sense of "conscious". (For a fuller definition of "conscious", see here.) An in-between, borderline case would have to be a case in which it's neither quite right to say that you are having confused experiences nor quite right to say that you aren't.

It is commonly objected that borderline cases of consciousness are inconceivable. (Michael Antony and Jonathan Simon offer sophisticated versions of this objection.) We can imagine that there's something it's like to be a particular garden snail at a particular moment, or we can imagine that there's nothing it's like, but it seems impossible to imagine that there's only kind of something it's like. How might such an in-between state feel, for the snail? As soon as we try to answer that question, we seem forced either to say that it wouldn't feel like anything or to contemplate various types of conscious experiences the snail might have. We can imagine the snail's having some flow of experience, however limited, or we can imagine the snail to be an experiential blank. But we can't in the same way imagine some in-between state such that it's neither determinately the case that the snail has conscious experiences nor determinately the case that the snail lacks conscious experiences. The lights are, so to speak, either on or off, and even a dim light is a light.

Similarly, as soon as we try to imagine the transition between dreamless sleep and waking, we start to imagine waking experiences, or confused half-awake experiences, that is, experiences of some sort or other. We imagine that it's like nothing - nothing - nothing - something - something - something. Between nothing and something is no middle ground of half-something. A half-something is already a something. Borderline consciousness, it seems, must already be a kind of consciousness unless it is no consciousness at all.

I'm inclined to defend the existence of borderline consciousness. Yet I grant the intuitive appeal of the reasoning above. Before admitting the existence of borderline cases of consciousness, we want to know what such a borderline state would be like. We want a sense of it, a feel for it. We want to remember some borderline experiences of our own. Before accepting that a snail might be borderline conscious, neither determinately lights-on nor determinately lights-off, we want at least a speculative gesture toward the experiential character of such in-betweenish phenomenology.

Although I feel the pull of this way of thinking, it is a paradoxical demand. It's like the Catch-22 of needing to complete a form to prove that you're incompetent, the completing of which proves that you're competent. It's like demanding that the borderline shade of only-kind-of-green must match some sample of determinate green before you're willing to accept that it's a borderline shade that doesn't match any such sample. An implicit standard of conceivability drives the demand, and that standard is impossible to meet without self-contradiction.

The implicit standard appears to be this: Before granting the existence of borderline consciousness, we want to be able to imagine what it would be like to be in such a state. But of course there is not anything determinate it is like to be in such a state! The more we try to imagine what it would be like, the further we miss our target. If you look through a filter that shows only determinately bald people, you won't see anyone who is borderline bald. But you shouldn't conclude that no borderline bald people exist. The fault is in the filter. The fault is in the imaginative demand.

In another sense, borderline cases of consciousness are perfectly conceivable. They're not like four-sided triangles. There's no self-contradiction in the very idea. If you're unhappy with your inability to imagine them, it could be just that you desire something that you can't reasonably expect to have. The proper response might be to shed the desire.

A philosophically inclined middle-schooler, on their first introduction to imaginary numbers, might complain that they can't conceive of a number whose square is -1. What is this strange thing? It fits nowhere on the number line. You can't hold 3i pebbles. You can't count 3i sheep. So-called "imaginary numbers" might seem to this middle-schooler to be only an empty game with no proper reference. And yet there is no contradiction in the mathematics. We can use imaginary numbers. We can even frame physical laws in terms of them, as in quantum mechanics. In a certain way, imaginary numbers are, despite their name, unimaginable. But the implicit criterion of imagination at work -- picturing 3i sheep, for example -- is inappropriate to the case.
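
To take one standard illustration: the basic dynamical law of quantum mechanics, the time-dependent Schrödinger equation, has i built right into it -- iħ ∂ψ/∂t = Ĥψ, where ψ is the quantum state and Ĥ the energy operator. No one can picture iħ of anything, and yet the equation does real physical work.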

We can conceive of borderline cases of consciousness, in a weaker sense, by forming a positive conception of clear cases of consciousness (such as ordinary waking consciousness and the experience of feeling disoriented after waking), by imagining in a different way, not from the inside, cases in which consciousness is determinately absent (such as dreamless sleep), and then by gesturing toward the possibility of something between. There is, I think, good reason to suppose that there are such in-between, borderline states. Nature is rarely sharply discontinuous. On almost every theory of consciousness, the phenomena of consciousness are grounded in states of the brain that aren't sharp-boundaried. (I'm working on an article that defends this view at length, which I hope to have in circulating shape soon.) This is a fairly abstract way of conceiving of such states, but it is a conception.

If borderline cases were common enough and important enough in human life, we might grow accustomed to the idea and even develop an ordinary language term for them. We might say, "ah yes, one of those jizzy states, in the intermediate zone between consciousness and nonconsciousness." But we have no need for such a concept in everyday life. We care little about borderline cases and needn't track them. We can afford to be loose in talking about gradual awakenings. Similarly for nonhuman animals. For everyday purposes, we can adequately enough imagine them either as determinately conscious or as nonconscious machines. There has never been serious linguistic pressure toward an accurate heterophenomenology of nonhuman animals, much less a heterophenomenology with a dedicated label for in-between conditions.

Thus, if we accept the existence of borderline cases of consciousness on general theoretical grounds, as I'm inclined to think we should, we will need to reconcile ourselves to a certain sort of dissatisfaction. It's incoherent to attempt to imagine, in any determinate way, what it would be like to be in such a state, since there's nothing determinate it would be like. So first-person imaginative transportation and phenomenological memory won't give us a good handle on the idea. Nor do we have a well-developed science of consciousness to explain such states or an ordinary folk concept of them that could make us comfortable with their existence through repeated use.

It's understandable to want more. But from the fact that I cannot seize the culprit and display their physiognomy, it does not follow that the jewels were stolen by no one.


37 comments:

  1. Another excellent post Eric!

    Can't say I'm a fan of the phrase "something it is like". It's one many philosophers seem quite taken with. But it strikes me as the epitome of ambiguity. That ambiguity, I think, masks confusion and disagreement.

    A much better phrase would be "like us". Our own experience is the only frame of reference we can use to think about this, so any consideration we give it will really be in terms of how much like us another system is.

    When put that way, it becomes clear that other healthy mentally complete humans are a lot like us. Great apes are less like us, but much more like us than a dog or mouse. But any mammal is much more like us than a frog, fish, crab, ant, or garden snail.

    All of those are much more like us than a self-driving car. But a self-driving car is a lot more like us than a rock, since the self-driving car takes in information from its environment to make decisions.

    A good source of information that can help refine our intuitions is reading about brain-injured patients. It's very hard to imagine being someone who is aware of their environment yet has no affect toward it (akinetic mutism or abulia), or who has lost the ability to perceive motion. Due to their injury or pathology, many of these patients have become less like us while still being human.

    Mike

  2. Thanks for the thoughtful comment, Mike! I’m not sure about “like us”. There are so many ways in which something can be like or unlike us. A corpse is a lot like a living human in some respects, very unlike in others. Phenomenal consciousness is my target here. I find that many people find the phrase “something it’s like” useful, but the ideas also can be put forward without that phrasing.

  3. I'm with Mike about not being a fan of the "something it is like".

    Part of it is that the "something it is like" for myself as a human encompasses a rather wide range of states.

    To me the nearest borderline state that we regularly experience is the hypnagogic state before falling asleep. You can also achieve a similar state in a float chamber. I've also found it in experimentation with galantamine taken after awakening in the middle of the night and while trying to return to sleep. Sometimes I can persist on and off in a hypnagogic state for long periods.

    However, I don't think this state can be compared to a conscious snail state.

    I guess you are familiar with Koch's consciousness meter and the attempt to introduce objective measurement of brain activity.

  4. Regarding Koch's consciousness meter:

    It seems to me that the suggestion is that consciousness could be measured objectively by a certain threshold amount of neurons firing in sync, probably at rates above 8 hertz. This seems associated with various states of consciousness in humans, including wakefulness and dreaming, but not dreamless sleep, which has firing at lower rates.

    Apparently even in locusts and other insects there appears to be synchronous activity in the 20 hertz range for olfactory learning. I don't know about snails, but there may be something similar.

    That isn't to say all of these states are identical or even necessarily that similar but they might all be conscious in some sense.

  5. Professor,
    I think it’s possible that you’ve been too charitable to your opposition here. I suspect that we can’t imagine a borderline consciousness from the innocent conception that you’ve developed (which I’m a big fan of), because by definition there can’t be any. A phenomenal dynamic should either exist or not given your definition itself, and thus a priori. Conversely Mike’s favored definition (for example) leaves loads of room for “sort of conscious states” since the theme is our conception of humanness. So I’m saying that your charity may have tricked you into saying that “a triangle needn’t be three sided”. On to your reasoning…

    As I understand it, imaginary numbers are effectively placeholder symbols that make no sense in the form of a result. We use them just in case a final answer happens to end up with one or more sets of two of them multiplied together. Each such pair yields a negative 1 and is thus valid. Conversely, an answer that leaves any trace of such a number standing alone will be incoherent.
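
    For a simple worked example of what I mean: (3i) × (3i) = 9 × i² = −9, so a pair of imaginary factors multiplied together yields an ordinary real number, while a lone 3i left standing in a final answer doesn't correspond to any quantity we can use.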

    Ah, but isn’t the “i” effectively used for an important equation in quantum mechanics? It seems to me that in equation form this symbol should remain coherent since equations don’t in themselves yield results. I presume that quantum mechanics hasn’t changed math itself, which is to say that imaginary solutions tell us nothing even in QM.

    Furthermore note that physicists remain quite divided about what’s going on in QM. In order to claim that the observed wave/particle duality is deterministic, some even theorize that each second, bajillions of full universes emerge from each cubic millimeter of our universe. I consider this to substitute one magic for another. Regardless perhaps quantum mechanics isn’t the best role model?

    Sometimes when a person can’t conceive of a middle ground between two different things, like black and white, such grays do still exist. Other times when a person can’t conceive of something, it’s because by definition there’s nothing to conceive. Regardless, I doubt the “conception” part matters either way. I just don’t see how to get around your consciousness definition itself. To me it seems to only yield discrete “yes” and “no” states. I look forward to your coming paper on brain continuity, though surely discrete brain elements exist as well. For example, notice that in certain ways the electromagnetic radiation which the brain is known to produce may be considered “on” and “off”.

  6. Hi Eric,

    Hope you are doing well. I would think that we might want to distinguish two things here: creature consciousness and state consciousness, and explore borderline cases with respect to each of these separately. I think the intuitive pull for borderline cases comes primarily from thinking about creature consciousness. As far as common parlance goes, we say things like, "I am only half awake," perhaps before our first cup of coffee, or "I was almost asleep when a loud noise rang through the neighborhood." Phrases like those suggest that our rough, ready, and common notion of creature consciousness admits of shades. In fact, I think we could say more: creature consciousness admits of shades insofar as we treat creature consciousness as being constituted by a number of distinct properties that don't cooccur by necessity. So, when I am half awake I may be fully subject to the deliveries of my senses, but I might be considerably less attentive to them (even if I really try to attend). Alternatively, when I am almost asleep, I am still sensitive (though in a diminished way) to the world, but I am starting to experience the intrusions of dreaming.

    When it comes to state consciousness, on the other hand, I am less sure about the intuitive pull for borderline cases, especially if we are taking state consciousness to be defined in terms of what-it's-like. (I will add my voice to the chorus of people who expressed their dislike for this characterization above.) While I take your point that there are few sharp distinctions in nature, such a generalization has to be treated carefully. There are, for instance, borderline cases when it comes to hairiness or colorfulness, but those cases already involve an axis along which distinct cases differ with respect to the target property: two determinately non-bald people can still differ in their hairiness with respect to the number of hairs they have or the density of hairs they have, etc., and, crucially, differing in these respects involves one of the people being further away from determinate baldness than the other (even if they are both equally determinately full-headed). The same can be said of two things that are determinately green: one of them can be further away from determinately not-green than the other insofar as they differ with respect to hue or saturation, etc. Yet when it comes to state consciousness, it is harder to imagine an axis along which determinate cases can differ in a way that makes one of the cases further away from being determinately not conscious, which makes it hard to imagine borderline cases of state consciousness.

    All that skepticism being registered, here are a few ways one could try to get at the notion of borderline cases of state consciousness: Two cases of state consciousness might differ with respect to 1) how many phenomenal properties each has, 2) how much of one's "phenomenal field" a particular state "takes up" (on this way of thinking about it, peripheral vision might deliver borderline states of consciousness), 3) how much effort a subject would have to put into expressing or reporting the state, 4) how vivid the phenomenal properties that make up the experience are, etc. A difficulty with all of these, it seems, is that, unlike the kinds of cases discussed in the previous paragraph, differing with respect to 1, 2, 3, or 4 does not seem to make one state more or less conscious than another, even among cases of determinate consciousness. But attempting to motivate the distinction this way does avoid the issue of having to characterize what it is like to be in a state that isn't determinately conscious or not conscious.

  7. I had retinal surgery a couple of months ago and I opted for a local rather than a general anesthetic. I remember hearing some discussion between the surgeon and the anesthesiologist, though not being aware of what they were actually saying. There was some phenomenology during the experience (I think it's accurate to call it an "experience") -- though no pain -- but it wasn't continuous. I would count this as a borderline case of consciousness: some phenomenology (just as a borderline case of baldness involves some hairs) but patchy and not very coherent.

  8. Thanks for these great comments, everyone!

    Jim: I have my doubts about the consciousness meter, but even if it’s correct it ought to allow for gray areas, depending on how close the system is to the required wavelength and how large the synchronized region is — unless we go full IIT and embrace near-panpsychism.

    Phil E: Would you say the same about negative numbers? Make the middle schooler an elementary schooler who can’t picture -3 sheep and the argument repeats. But fundamentally I think your comment at the end suggests we agree that limitations on conception here aren’t necessarily limitations on reality. I’m not sure, though, why my “innocent” definition of consciousness implies the impossibility of borderline cases. I can define artworks and nonartworks by example through clear cases but that doesn’t imply there aren’t borderline cases.

    Ryan: That’s super helpful. Thanks! For borderline states I’m thinking of possibilities such as unattended sensory stimuli or stimuli near the limit of perceivability. I agree that differences in distance from non-X among determinately X cases are a feature of at least some properties with borderline cases, including the three I used as examples. So either I must commit to consciousness not being like that or I must allow some analog for the conscious case. You suggest some good possible bases for the analog, but I’m not sure I want to commit to them yet.

    Frankie: Very interesting! Here I’m inclined to say that if there is determinately some phenomenology, even if only a little, you were determinately conscious in the intended sense. Might there have been some moments when you were in the gray area between having some phenomenology and none at all? Of course, there wouldn’t be anything determinate that it was like, so in a sense it would be ineffable and elusive to grasp.

  9. Eric: Yes, there is determinately some phenomenology in my surgery example, but is it sufficient to count as being determinately conscious, rather than as a borderline case? I was suggesting that we treat the consciousness case more like the baldness case. For (non) baldness the relevant property is having hairs; for consciousness the relevant property is having phenomenology. So, no hair then bald, no phenomenology then not conscious. (This defines one end of the spectrum.) The question then is how many hairs are required to be non-bald, and this is indeterminate. For consciousness, the question concerns the quantity and the quality of the phenomenology. Re quantity, it may be fleeting vs. patchy vs. continuous, but there may not be a clear threshold for counting as fully (determinately) conscious, and re quality it may be hazy, elusive, and effable only in various degrees (the “gray area” you mention above). My surgery example seems to fall below the threshold in both quantity and quality for full consciousness, hence it would count as an indeterminate case.
    Some caveats: (1) it might be objected that hazy and elusive phenomenology is still phenomenology, and so what I am calling a quality requirement on phenomenology is illegitimate. If one takes this line – that any phenomenology is sufficient for determinate consciousness – then there won’t be any borderline cases and the question seems much less interesting. (2) The issue concerns only so-called ‘phenomenal consciousness’ and not what Ned Block calls ‘access consciousness’. I don’t think that there should be a problem about borderline cases of access consciousness but then I don’t fully understand this notion of consciousness.

  10. Thanks for following up on this, Frankie!

    Maybe the state consciousness / creature consciousness distinction is important here. One conception might be that creature consciousness is vague in that the possession of just a tiny bit of (determinate) phenomenal consciousness isn't sufficient for creature consciousness, just like having one hair isn't sufficient to be non-bald, but having enough is sufficient, and there's a gray zone. But for state consciousness, if I'm hearing you correctly, you're denying the possibility of vagueness / gray area cases. (Yes, I take phenomenal consciousness to be the issue here, not access consciousness.) Does that sound right?

    My own view, which might well be a minority view, since many people seem to simply state or assume that it's false, is that there can be vague, gray-area cases even for state consciousness. My arguments for this aren't fully articulated in the post, but generally turn on the plausibility of the idea that the functional and physiological phenomena that give rise to or are identical with consciousness are phenomena that admit of degree rather than being sharp-boundaried (even if there are often sudden phase shifts). I then combine this thought with the thoughts above to defuse what I take to be the most common objection to allowing gray-area cases of states of phenomenal consciousness (i.e., that they are inconceivable or unimaginable). It's convenient for my presentation if I allow that any creature with at least one determinately phenomenally conscious state is creature conscious; but I also allow that there's some attractiveness to the view of creature consciousness you articulate, and I'm not opposed in principle to that view.

  11. Eric - I think we agree on creature consciousness, but my remarks were intended to bear on state consciousness. I think that there *are* grey area/vague cases -- sufficiently confused phenomenology (but phenomenology nonetheless), as in the surgery example -- that don't clearly rise to the level of determinate/full (state) consciousness, and so would count as borderline cases of (state) consciousness. So I think we probably agree on state consciousness too.

  12. This seems to be circling back to 70's style examples of altered states of consciousness - e.g., if I have successfully "emptied my mind" while meditating (as opposed to dropping off to sleep ;)), am I still conscious? If I am in a "flow state" during a physical activity, with "self-talk" completely absent, is my experience close to what a dog might experience? After all, I may not be able to explain exactly what either of those states is like to others, or capture them as a memory, but cerebral O2 consumption was probably much closer to conscious than unconscious.

  13. It does seem a bit like one of those positions so bizarre only a philosopher could hold them. Every single living person is biologically continuous with a bunch of cells in the past that were not conscious. The claim that there is no in-between state is a claim that for every person there was a perfectly defined moment before which they were in no way conscious, and after which they were conscious. Not even a millisecond of in-between time.
    Our increasing knowledge of biology seems to offer lots of analogies, too. Life seems like a fairly black-and-white kind of a thing, until you know about viruses and spores, which seem to occupy an in-between space. Mammals seem like a well-defined group, until you discover platypuses. Genetic inheritance seems like a clear thing, until you learn about mitochondrial DNA.
    So I suppose I'd say: if a particular definition of consciousness is useful for other reasons, and has a side-effect of admitting no fringe cases, then that seems fine, but you'd have to use it knowing that there are likely to be some challenging fringe cases out there that will test your definition; but you should never try to infer facts about the world from the semantic structure of the particular terms you happen to be using.

  14. Eric, thanks for the reply.

    After reading some of the other exchanges in the comments, I was wondering about your argument from the fuzziness of functional and physiological phenomena.

    Let's assume
    1. that conscious states qua consciousness turn out to be (caused by) the set of physio-functional properties S = {P1, ..., Pn},
    2. that for (almost) any physical or functional property Pi, there can be some state of affairs (SoA) such that it is indeterminate whether that SoA instantiates Pi insofar as
    A. it is indeterminate whether one of the properties that that SoA does instantiate is identical to Pi (I include 2A so as to avoid other ways in which it may be indeterminate whether the target SoA instantiates the relevant property).

    What you claim is that because we can construct sorites series involving properties from S, we can do so for the consciousness of a state C. Let C be a determinately conscious state. Because a state’s being conscious (by assumption) is that state instantiating the properties from S, C instantiates all the properties in S. Now, we can generate a series of states starting from C, where each subsequent state less determinately possesses some particular property in S. Since instantiating all the properties in S is (the cause of) consciousness, the series we constructed will involve some state C* such that C* neither determinately instantiates nor determinately fails to instantiate some property in S, and thus is neither determinately conscious nor determinately not conscious. Is that right?

    Let me be presumptuous and assume that is more or less correct. If so, a few concerns: One concern is that the state C* might be neither determinately conscious nor determinately not conscious because it is not determinate whether C* is a mental state at all, or more generally, because it is not determinate that it instantiates other necessary conditions on being a conscious state. Thus, you would have a state that lacks determinacy qua consciousness, but only because it lacks determinacy with respect to some other, more fundamental property. (I think, for instance, one could reply to Chinaphil above in this way, and thus accept the existence of such indeterminate states, while at the same time denying that such states are interesting cases of indeterminate consciousness.) That might be easy to deal with, though: Define S above as the minimal set of the necessary and sufficient properties that are proprietary to consciousness, that is, S contains those properties that a state needs in order to be conscious, but not those properties that a state needs in order to be, for instance, a mental state at all. For example, on my view, for a state to be mental it needs to be (potentially) causally relevant in the production of behavior. This would not be included in S because it is not proprietary to consciousness in the way I am using the term. Now, it seems like the series would generate indeterminate cases of consciousness qua consciousness itself as long as such a set S is possible.

    There is a second concern I have, but I am not yet sure how to articulate it, so I may email you about it once I’ve gotten it down a bit more clearly.

  15. Phil E: Would you say the same about negative numbers? Make the middle schooler an elementary schooler who can’t picture -3 sheep and the argument repeats. But fundamentally I think your comment at the end suggests we agree that limitations on conception here aren’t necessarily limitations on reality. I’m not sure, though, why my “innocent” definition of consciousness implies the impossibility of borderline cases. I can define artworks and nonartworks by example through clear cases but that doesn’t imply there aren’t borderline cases.

    No professor, I wouldn’t say that the imaginary number situation repeats with negative numbers. Just as we can talk about having three sheep with +3, it’s coherent to talk about missing three sheep with -3. Conversely 3i says nothing at all.

    Regardless if your point was that limits on conceptions do not demonstrate limits on reality, to me that’s certainly the case. Why? Because I consider ontological reality to create an epistemic reality where the ontological needn’t match our epistemic conceptions. Like Einstein I’m not arrogant enough to presume that my own capacity to conceive (as with quantum mechanics), corresponds with what’s actually “out there”.

    It’s not that your consciousness definition is argued through example that removes the possibility for borderline cases, or even how innocent it happens to be. Instead it’s the initial dichotomous phrasing. All “somethings” that have implicit “versus nothings” attached, will by definition have no borderline cases. Art could be defined this way too, though in practice we’d need a clear delineation between the two. That doesn’t seem useful in this case.

    Conversely I think a “yes or no” definition should be useful for your provided conception of consciousness, since if there’s anything at all phenomenal that exists (regardless of how minor or confused), then certain “hard problem physics” should be involved in creating that state. We don’t have to grasp such physics to make such a presumption, though naturalism does mandate such an explanation in the end. (Personally I suspect certain electromagnetic parameters associated with neuron firing.)


    You said that nature is rarely discontinuous. I’ll up you there and say that in a causal world nature can never be discontinuous. Here there will always be a reason (and for quantum mechanics too, though we obviously don’t grasp what’s going on and may never). But continuity doesn’t mean we can’t productively phrase things discretely. For us “on” or “off” can exist even given fundamental continuity behind why something becomes that way as we phrase it.

    There’s been plenty of talk here against Nagel’s “something it is like”. Conversely to me this heuristic fits like a glove. Even as a kid I commonly thought about what it might be like to exist as other people and animals in order to hopefully guess their behavior.

    (Earlier I misrepresented Mike’s proposal as “humanness” when it should have been “human like”. I think you’ve corrected me before about that Mike, so I wanted to acknowledge my mistake.)

  16. Philosopher Eric,
    Not sure what I was thinking if I corrected you, since I've used "humanness" myself before and probably would again. So no worries.
    Mike

  17. You had me doubting my memory about that, Mike. My search came up with this here, where you did say that you’ve used “humanness” in the past, but have moved on. This shut me down for a bit, though we did take it up later there with themes very similar to this one. Anyway, perhaps this ties in with your post from today; or, if something is at all painful for us, then given standard biases perhaps it’s also productive to explore? Surely Julia Galef would agree!

  18. Dualism at its worst...
    ...Google and Stanford Encyclopedia of Philosophy prefer consciousness vs physicalism instead of consciousness vs physicalness...

    In defense of 'first-person...noumenological-phenomenological memory won't give us a good handle on the idea.'...

    Here's to discovering real half states in my existence and maybe some being completeness...thanks

  19. And the "science"... https://doi.org/10.1016/j.newideapsych.2017.05.004 ...meanings abound...

  20. @Ryan
    If I'm understanding you right, the kind of situation you're talking about would be analogous to this: can there be a thing that is in an indeterminate state between having and not having colour? Maybe yes, but it's trivial because actually it's in an indeterminate state between being a thing and not being a thing, and it's the thing-indeterminacy that is driving the indeterminacy, not colour-indeterminacy per se?

    It's an interesting idea, but I'm struggling to see how it would hold for consciousness, because consciousness seems like it should be a very fundamental concept, not dependent on other features; or maybe it's just that we haven't yet worked out what other features consciousness may be dependent on. So, for example, it's not obvious to me that consciousness is or requires a mental state; it seems quite possible that the relationship is the other way round, and that mental states depend on consciousness.

    I'm also not convinced that if we could establish such a dependence, and demonstrate that consciousness is indeterminate because of the indeterminacy of an underlying factor, that would be trivial!

  21. This comment has been removed by the author.

  22. This comment has been removed by the author.

  23. 1/2

    (Reposting because I misspelled "sorites")

    @Chinaphil:

    That is the gist of what I am getting at, though I wouldn't target thinghood; I think there are more salient ways to make the point. I take it that, and I think this is the common view (though perhaps panpsychists, Russellian monists, and others of that ilk demur, and I recognize commonness is not evidence), for X to be a conscious state (to be a state that exhibits state consciousness), X must be a mental state. Furthermore, I take it that not all mental states are conscious states. (It sounds like you disagree with me here.) Thus, it seems being a mental state is a necessary but not sufficient condition on being a conscious state. That is enough to get my concern off the ground. One can have indeterminate cases of conscious states that are so because they are indeterminate cases of mental states (more generally, because they are indeterminate cases of satisfying all the necessary conditions on being a candidate for being conscious).

    Imagine that being causally implicated in the production of behavior (or some other similar functional property) is a property in the set S I mention above (let's assume at this point that S has not yet been sieved to produce the minimal, proprietary version that I mention later). One could object that I am assuming that there are some physical/functional properties that constitute (or cause) consciousness. Fair, but I am trying to take Eric up on his argument from the indeterminacy of physical/functional properties to the existence of indeterminate states of consciousness. If one is sympathetic to a dispositional or causal account of mental states, then being causally implicated in the production of behavior is in S not because it is something special about consciousness, but because it is something special about being a mental state.

    Now, we can construct a sorites series beginning with a determinately conscious state C, where each subsequent state in the series is distinguished from the prior state insofar as it is less causally implicated in the production of the behavior of the subject of the state. At some point in the series we will have a state C* that is determinately not causally implicated in the production of behavior, and thus determinately not a conscious state. At some point between the states C and C* on the series, we will have cases where it is not determinate whether those states are conscious or not. But it seems unclear (at least to me) whether this is an interesting/relevant result, since one could note that if a state is indeterminate with respect to being mental, then, for this reason, it can be indeterminate with respect to its being conscious.

  24. 2/2

    This does require that consciousness is not a simple, fundamental property, and that being a conscious state necessitates being some other sort of state (at minimum, being a state at all) but that seems in line with Eric's view about consciousness, and generally, with the antecedent assumptions Eric is making for his argument from the indeterminacy of physical/functional properties to the indeterminacy of conscious states. That all being said, I attempted to offer a way to get around such a worry, though I am not sure how successful it is.

    As far as whether this would make such cases trivial, I think it would in a technical sense. There are indeterminate states of consciousness, but only because there are indeterminate cases of being Φ where being a conscious state requires being a Φ-state. Let's say you and I are arguing about whether there are indeterminate cases of being Ryan or being Chinaphil. I take the position that there are no such cases because I argue being a particular person is a discrete property (perhaps I’ve just gotten out of a medieval philosophy habit and have become sympathetic to a haecceities view). One is either determinately Chinaphil or one is not. You then present a sorites series where on one end there is a case of determinately being Chinaphil, and where each subsequent entry in the series differs from the last insofar as the (average) distance between the molecules that constitute the body is increased by one Planck length. Eventually, we will have a case of a mere swarm of (mostly) organic molecules. Has that taught us something about the property (or concept) of being a particular person P? Perhaps. Perhaps it shows that being a particular person P is dependent on other properties, but I am not sure that it is particularly revealing about the property of being Chinaphil (or being Ryan) itself. I think the same thing holds for cases of consciousness.



    @Phil E:

    I am interested in your view that imaginary (and complex) numbers are effectively place holders. Might I ask why you think that? You mention something about yielding results, but I am not sure exactly what that means. When, for instance, complex numbers are used to quantify values of the Higgs Field, why does that not count as a result as you are using the notion?

  25. Thanks for the continuing comments, folks!

    Frankie: Maybe we do agree, then. However, I'm not sure, since I use "phenomenology" and "consciousness" interchangeably, so when you say there is phenomenology but not determinate/full (state) consciousness, I'm not sure how to translate that into my own manner of thinking. I think of "phenomenology" and "consciousness" as both referring to what's-it-like-ness or qualia or (in my preferred definition by example) the obvious feature that inner speech, visual experience, and felt pain all share in common but which is lacking in "nonconscious" processing.

    David: Yes, I think those are interesting cases and questions! I myself lean toward a relatively "abundant" view in which consciousness is present, but I think it's very methodologically difficult to settle.

    Chinaphil first comment: Yes, exactly my thinking!

    Ryan/Chinaphil second comment: Ryan, that is very helpful. Yes, although I didn't articulate it fully here, the argument you present is a version of the grounds-of-consciousness argument that I have in mind. (Topic of a new blog post soon, I hope.) It relies on the assumption that some naturalistic functionalist and/or physiological story about consciousness is correct, but I don't know that consciousness needs to be a non-simple property (for example if consciousness arises from complex properties in some property-dualist fashion). Interesting idea that the mentality might be fading out (so to speak) rather than the consciousness specifically fading out while mentality remains. To that I have two reactions. First, that would still be borderline consciousness, so maybe I can just accept that as a type of case that establishes what I want. I'm not sure why it would be trivial; but maybe I'm missing something here. I'm more or less a Parfitian about personal identity and don't have any inclination toward haecceities, so I'm fine with being an in-betweenist about personhood too! Second, if we're thinking about brain states and functional states, it seems unlikely that the fade-out will be a fade-out toward causal irrelevancy, if we think about animal cases or developmental cases at least (sleep and anaesthesia cases are maybe less clear). It seems to me that the case for saying a snail, for example, has no mental states, then, would probably be grounded in saying it has no consciousness and requiring consciousness for mentality, rather than the other way around. In a nonconscious (or maybe conscious?) way, it tracks features of the world and strings together somewhat complex behavior in hours-long mating dances. If it did so consciously, presumably its states would be mental states.

    Phil E: You write: "It’s not that your consciousness definition is argued through example that removes the possibility for borderline cases, or even how innocent it happens to be. Instead it’s the initial dichotomous phrasing. All “somethings” that have implicit “versus nothings” attached, will by definition have no borderline cases." Ah! But that was speaking in the voice of my opponent, trying to show how they might think. On the issue of few/no sharp lines in nature and reality possibly exceeding our conception, we agree, and I think that's the main issue here.

    Arnold: I agree, plenty of different ways to use terms here!

  26. Ryan,
    Though imaginary numbers are clearly an important tool in mathematics, my understanding has been that they’re mainly manipulated in order to see if they can be combined to yield a real number result which we can thus practically use. Is that not your understanding? Or if it is, though in quantum mechanics you understand imaginary results to be practically useful in themselves, that would be interesting. But then QM remains pretty mysterious in general so I guess this wouldn’t surprise me too much. Still there’s only so much metaphysical funkiness that I’m able to accept without resorting to Einstein’s perspective on God playing dice!

  27. @Ryan That's taken a moment to digest! I think I've fully understood what you've said, though I haven't yet looked up haecceities!

    It still sounds like you're assuming more than I'd be willing to assume about consciousness, in two ways, I think.
    First is the issue of knowing what preconditions are necessary for consciousness to exist. I definitely don't subscribe to the view that consciousness is a species of mental state. I don't disagree with it, either, I just maintain an indeterminate position. For example, it seems entirely possible to me that mental is only meaningful when there is consciousness. Fish may well have no mental life.
    I don't think I'd be willing to assume that consciousness is dependent on physical matter, either. I believe that consciousness is a physical phenomenon, but the relationship between physical matter and consciousness is likely to be at least as subtle as the relationship between physical matter and life; if we don't understand a relationship, even knowing that it exists wouldn't necessarily tell us anything about consciousness.
    So, given that I'm not willing to make any assumptions about what the substrates of consciousness are, I couldn't yet run into the possibility that you raise.
    Similarly, because I don't yet have even much of an outline around what the concept of consciousness might be, I'm interested in all of its boundaries, and wouldn't call them trivial (as in uninteresting), though I recognise that they might be trivial in the technical sense that you're raising. In your personhood example, while I might be interested in the difference between the persons Ryan and Phil, I'm also very interested in the boundary between Phil and bunch-of-mush. In the case of consciousness, that's because I just don't know where any of the boundaries lie.

  28. This is a genuine showerthought, so excuse me if it's a bit half-baked. But I thought, as an example of the prior conditions to consciousness, I'd probably accept that life is one. I think that anything conscious must necessarily be alive.

    However, if we were to look at a situation in which an indeterminate consciousness case arose because of indeterminate life, that would immediately cause us to reevaluate the situation. This has really happened: In the past, when someone's heart stopped, we had to accept that they were dead. Now, because we know that consciousness persists for a while even after catastrophic heart failure, we no longer accept that a person in that state is dead. Instead, we try to shock them back to life.

    That is, when a situation of indeterminate consciousness because of indeterminate life arose, it wasn't trivial, because the causal or categorical relationships between life and consciousness are not fully determined.

  29. @Eric:

    I was not thinking of property dualist sorts of views when I made my comment about consciousness needing to be non-simple. I haven't spent a lot of time thinking about such a case. But, I take your point.

    Regarding what is indeterminate in cases of indeterminate conscious states, what I am trying to press is 1) that it seems relatively easy to argue for the existence of indeterminate states of consciousness from the indeterminacy of physio-functional properties that underwrite consciousness, insofar as there are, very plausibly, states that will be indeterminately Φ, where determinately being a conscious state requires being determinately Φ. And 2) that it will still have to be argued that these are cases of indeterminate consciousness, that is, cases of indeterminate conscious states qua consciousness, since it is possible that they are cases of being indeterminately mental, for instance. I am not sure to what use you want to put the existence of such states, though it seemed from your post that you were looking for cases of indeterminate consciousness as such, so I thought you might want the stronger conclusion. That being said, I tried to offer a way of restricting the set of properties that underwrite consciousness in such a way that one could plausibly get to a case of indeterminate consciousness.

    My use of the term "trivial" was inaccurate. I mean to point out something about the contours of a dialectic here: Some might happily accept the existence of indeterminate cases of state consciousness, and still deny that there are cases of indeterminate state consciousness qua consciousness. The point I make about personal identity and haecceities—that one can accept indeterminate cases of human beings (and thus, indeterminate cases of Ryan) and deny cases of indeterminate personal identity without contradiction—was my attempt to provide a parallel example from another field.

    Finally, regarding causal irrelevancy, again, I just take your point re: application to a garden snail or other cases from phylogenetic development. I was thinking of thought experiment style cases as opposed to ones that have nomological plausibility.


    @Phil E:

    The reason I chose values of the Higgs field was to avoid QM quandaries. The Higgs field and Higgs boson are just parts of the standard model, and thus, don't require reference to any QM wackiness. If that is correct, does that give complex numbers enough heft to be considered more than placeholders?

    @Chinaphil:

    I think you raise an interesting case with someone whose heart has stopped. I confess that I think that that person is still alive to the extent that their brain is still processing things, which is evidenced by their still enjoying (or, in this case, suffering) consciousness. I also am not sure that I think a creature needs to be alive to enjoy conscious states, but that might just be an artifact of my not knowing what it is to be alive in non-biological cases.

  30. Wow Ryan, I didn’t even consider what the Higgs field happens to reference. Fortunately I only pretend to be a philosopher rather than a physicist! (Though maybe that’s bad enough?)

    Anyway in principle, no I don’t consider a lack of quantum wackiness to matter. It seems to me that math can inform physics, though physics can’t inform math. Why? Because I consider math to essentially exist as a language from which to think (like Spanish for example), not a science from which to potentially discover how things work. It’s of course deductive rather than inductive.

    So if imaginary numbers are mathematically understood to be meaningless in the form of a result (which has been my perception at least), then I wouldn’t think that a theory in physics could teach us that imaginary numbers can actually make sense as a result regarding that theory. Would you say that the standard model is able to use imaginary numbers as effective results in themselves?

    I’ve been in some of the commentary for Sabine Hossenfelder’s recent “Is Math Real?” post, and I now notice that it stemmed from her “Do Complex Numbers Exist?” post. So I just watched the 11-minute video. My take is that the first half said complex numbers generally just provide simpler ways to make certain calculations, though they aren’t essential. I didn’t notice her say that imaginary results make no sense as such, though maybe she figured that goes without saying.

    Then in the second half she discussed a new non-peer-reviewed QM paper which interests her because it suggests that imaginary numbers might provide results that real numbers can’t provide. Hmm… if that holds up and my reasoning above was otherwise sound, it should contradict my position that physics can’t inform math!

  31. Hi Eric,

    I'm also sympathetic to the idea of borderline consciousness, but I have a worry and I wonder if you find it worrying too. It seems as though borderline facts about consciousness lead to borderline facts about value. For example, suppose the snail is on the borderline with respect to being in a conscious state, and the relevant conscious state is a pain state. So it's kinda sorta in a pain state; there's no fact of the matter. Is it *bad* that the snail is in such a state? It seems as though there must be no fact of the matter. That seems pretty weird, but it gets weirder! Presumably there's *also* no fact of the matter whether it's *wrong* to use pesticides which put that snail (or thousands of snails) in this indeterminately-painful state. But how can there be no fact of the matter about *that*?

  32. Thanks for these great continuing comments, all!

    On life and consciousness: Interesting example, chinaphil. We might also consider cases of conscious robots who might not meet some of the standard biological criteria for "life" such as having an evolutionary history, being made of organic matter, and reproducing. That said, my own inclination would be to say that conscious robots would be "alive" in the best way of thinking about what life, or at least "a life" is:
    http://schwitzsplinters.blogspot.com/2018/05/is-c-3po-alive.html

    On imaginary numbers: Thanks for the Higgs boson example, Ryan. I didn't realize that Higgs mathematics also used imaginary numbers. Phil E: Thanks for the links to Hossenfelder. It's of course a delicate philosophical issue what it is for a number to "exist" in the first place.

    On whether the indeterminacy is about whether the states are mental at all vs in the consciousness as such: Maybe that discussion is best saved for after my next post on indeterminacy, probably next week, which will put forward more of the positive case. I'm thinking that the structure of the argument will point toward indeterminacy in consciousness as such, but it's possible that you'll think otherwise.

  33. Dan: Interesting connection! My inclination is to think that it's more natural to treat, say, borderline pain not as having indeterminate value but rather as having intermediate value between no pain and determinate pain. Compare: Someone who is borderline bald might be determinately less good as an example of hairy vigor than someone who is determinately non-bald but also determinately better than someone who is determinately bald.

  34. c-3po-alive link, metabolics aside...

    I'll try functionalness is objectionalness-questionalness-observationalness-...

    Leaving dualism-functionalism as intermidateism, indeterminateism and incompleteism...

    Irresistible...

  35. Thanks Eric, that is helpful. I guess I was imagining the case as one in which (1) it's indeterminate whether the snail is experiencing serious pain or not experiencing anything, but (2) it's determinately *not* the case that the snail is experiencing mild pain.

    Assuming that mild pain has intermediate value, (2) seems to suggest that the snail's state does not have intermediate value. That's one value claim we can rule out! Similarly, if there were a color "Yed" that was indeterminate between red and yellow, but determinately *not* orange, then we could rule out that the complement of this color is blue. I think we'd have to say that its complement is "Vreen", which is indeterminate between violet and green.

    But maybe you think that cases with this kind of structure aren't possible when it comes to consciousness, even if consciousness admits of indeterminacy. Or maybe you think that (2) does not imply that the snail's state does not have intermediate value. Anyway, it's very interesting to think about!

  36. Dan: I like how you describe the situation, and I'll take your (1) and (2) on board. I think it's possible that such a situation is one way of having intermediate negative value, though very different from the mild-pain way.

    Maybe this will work as an analogy. Let's say you have $1,000,000 in the currency of an unstable regime whose future is in doubt (and no means to exchange the currency), and no other money. Your financial situation is intermediate between that of someone who has $1,000,000 in a stable currency and someone who is completely bankrupt. That way of being intermediate is quite different from having $100,000 in a stable currency.

    Alternatively, and relying more directly on vagueness: Suppose we're discussing handsome bald men. We agree that A is extremely handsome but only borderline bald. We agree that B is extremely bald and determinately handsome but not extremely handsome. Finally, we agree that C is extremely bald and extremely handsome and D is determinately neither bald nor handsome. If we were then to have a contest for handsomest bald man, C would win and A and B would be runners up but in different ways.

  37. Very cool. It's interesting to think that, beyond the dimensions of intensity and duration, there's another dimension that Benthamite hedonists have to deal with: determinacy of consciousness!
