• Relativist
    3.5k
The P-zombie case, as specified, would seem to be the very opposite of that, in that the zombie would say that they had seen, heard, felt, tasted, etc., while not having actually had any experience of anything at all.Janus

But the sights, tastes, sounds, etc. had to be detected in some way. That set of detected things will be remembered, and that's what the experience is to them.
  • Janus
    17.8k
So, in the zombie case the sights, sounds, feelings, emotions and so on were detected but never consciously, even though the zombie is able to report on what was detected in as much detail, and with as much nuance, as we are.

In contrast, the "blind experiencer" can detect the sights, sounds and so on, perhaps not as reliably or with as much subtlety of detail as the conscious experiencer, but they cannot report on it because they believe that they have detected nothing. Let's say this is a failure of connections between brain regions or functions.

So, now it looks like the zombie and the blind experiencer are actually similar, except that the zombie, which has no experience at all, nevertheless speaks as though it does, whereas the other consciously believes it has no experience, which amounts to saying that it, like the zombie, has no conscious experience. However, in fact it does have experiences, albeit unconsciously.

The question then seems to be how it would be coherent to say, in the case of the zombie, that all these feelings, sights, sounds and so on can be detected, and yet to say simultaneously that nothing was experienced, when the zombie itself speaks about the experiences.

Another point that comes to mind is that we are probably not consciously aware of almost everything we experience, in the sense of "aware of" as when, for example, we drive on autopilot.
  • Patterner
    1.9k

    I think information is of extreme importance for consciousness. But I don't have a clear idea on various specifics. What do you mean by "consciousness is informational"?
  • Relativist
    3.5k
in the zombie case the sights, sounds, feelings, emotions and so on were detected but never consciouslyJanus
    This depends on how one defines "conscious". If it's defined as a state that necessarily includes qualia, then it's true. But a qualia-absent being could have something very similar.

    Representationalists say that qualia are "representational states": pain represents body damage, with an acuteness proportional to the damage; a physical texture represents some physical property of the object; a visual image represents the surroundings we are within...

If those representations could be made in computable ways, without qualia, this arguably results in a form of consciousness. They could even have unconscious experiences: capturing representations of aspects of the world, but only storing them in memory, not putting them into active use by the executive function.
  • AmadeusD
    3.8k
P-Zombies. They're often held out as a gotcha for those who think consciousness could be separate from brain activity (or at least emergent from it, rather than synonymous with it).
  • hypericin
    2k
    What do you mean by "consciousness is informational"?Patterner

    I mean that consciousness is best understood in terms of information, not physics. Some phenomena should be thought of as material: rocks, gravity, light. But others cannot be understood physically: numbers, ideas, computer programs, novels. I claim that consciousness belongs to the latter category.

    Think of a book, Moby Dick. You could try to understand it physically: "Moby Dick" is this specific arrangement of glyphs on paper. But then you look at another edition, or the book in another language, or an ebook edition, and you are totally flummoxed. You will conclude that analyzing Moby Dick as a physical phenomenon is hopeless.
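The edition analogy can be made concrete with a small sketch: the same sentence stored under two byte encodings (the choice of encodings here is mine, purely for illustration) shares essentially no physical layout, yet each form decodes to identical content.

```python
# Toy illustration: the "same" text in physically different forms.
# The byte patterns (the physical arrangement) differ completely,
# yet each decodes to the identical content.
text = "Call me Ishmael."

utf8_form = text.encode("utf-8")       # one physical encoding
utf16_form = text.encode("utf-16-le")  # a very different byte layout

# The physical representations share almost nothing...
assert utf8_form != utf16_form
# ...but the informational content is identical.
assert utf8_form.decode("utf-8") == utf16_form.decode("utf-16-le") == text
```

Analyzing the two byte strings physically tells you almost nothing about what they have in common; only the informational reading recovers it.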

The same is true for consciousness. Analyzing consciousness physically is hopeless and leads to the hard problem, because consciousness is informational. Evidence?

Does consciousness have mass? Does it have a position, or velocity? What material is it made of? None of these seem answerable. In fact, to answer the last, some want to invent an entirely new substance, with no physical evidence, indeed no evidence at all, other than that consciousness exists and therefore this substance must exist.

On the other hand, what is consciousness, phenomenologically? One thing you can say: each and every conscious moment discloses information. Each of our senses discloses information about the external world, or about our bodies. And every emotion discloses information about our minds.

Consciousness informs; it is informational, not physical. And so to understand it, it must be understood as informational. Only then can we understand how the brain implements it.
  • Patterner
    1.9k
On the other hand, what is consciousness, phenomenologically? One thing you can say: each and every conscious moment discloses information. Each of our senses discloses information about the external world, or about our bodies. And every emotion discloses information about our minds.hypericin
I'm with you. I've been saying information processing is the key. It's the key to life, because it all began with DNA, and involves the processing of the information in DNA to synthesize protein. And it's also the key to consciousness, because (my idea of how it works is obviously speculation) information processing is what makes a system conscious as a unit. Your thinking works fine for me.


    it must be understood as informational. Only then can we understand how the brain implements it.hypericin
    Ok, what's the plan? How do we understand it as informational? What do you have *ahem* in mind?
  • Wayfarer
    25.9k
A thought-experiment I started a thread with, on this very same topic.

    There is a sentry in a watchtower, looking through a telescope. The watchtower stands on top of a headland which forms the northern entrance to a harbour. The sentry’s job is to keep a lookout.

    When the sentry sees a ship on the horizon, he sends a signal about the impending arrival. The signal is sent via a code - a semaphore, comprising a set of flags.

    One flag is for the number of masts the ship has, which provides an indication of the class, and size, of the vessel; another indicates its nationality; and the third indicates its expected time of arrival - before or after noon.

    When he has made this identification he hoists his flags, and then tugs on a rope which sounds a steam-horn. The horn alerts the shipping clerk who resides in an office on the dockside about a mile away. He comes out of his office and looks at the flags through his telescope. Then he writes down what they tell him - three-masted ship is on the horizon; Greek; arriving this afternoon.

    He goes back inside and transmits this piece of information to the harbourmaster’s cottage via Morse code, where it is written in a log-book by another shipping clerk, under ‘Arrivals’.

In this transaction, a single item of information has been relayed by various means: first by semaphore, second by Morse code, and finally in writing. The physical form and the nature of the symbolic code are completely different in each step: the flags are visual, the Morse code auditory, the log-book entry written text. But the same information is represented at each step of the sequence.

    The question I want to explore is: in such a case, what stays the same, and what changes?


    It was an epic thread, but my view is that the physical form changes, while the meaning stays the same, which says something important about the nature of information.
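The relay can be sketched in a few lines. The encodings below are invented for illustration (not the historical semaphore or telegraph codes): one record, three physically unlike carriers, and a decode that recovers the same item of information.

```python
# Toy model of the relay: one item of information, three carriers.
# The encodings here are invented for illustration, not the historical codes.
record = {"masts": 3, "nationality": "Greek", "arrival": "afternoon"}

# Three physically different representations of the same record:
flags = (record["masts"], record["nationality"], record["arrival"])        # semaphore
wire = f"{record['masts']}|{record['nationality']}|{record['arrival']}"    # telegraph line
log_entry = (f"{record['masts']}-masted ship; {record['nationality']}; "
             f"arriving {record['arrival']}")                              # log book

# Decoding the telegraph form recovers the original record intact:
m, n, a = wire.split("|")
assert {"masts": int(m), "nationality": n, "arrival": a} == record
```

What changes between steps is the carrier and the code; what survives every hop is the record itself, which is one way of cashing out "the meaning stays the same".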
  • AmadeusD
    3.8k
    What do you have *ahem* in mind?Patterner

    hehehe. Good.

    Consciousness informs, it is informational, not physical. And so to understand it, it must be understood as informational. Only then can we understand how the brain implements it.hypericin

I'm unsure this is true, but I like it. Consciousness may only appear that way to the experiencer, so in some sense, yeah, that's right - but it might just be the limiting factor to us ever understanding it, rather than an accurate take on it.
  • Patterner
    1.9k

    You may be right. But definitely worth pursuing, and I hope hypericin has something in mind.
  • Relativist
    3.5k
    consciousness is best understood in terms of informationhypericin
    What is information, in the absence of consciousness? Words on a page have to be interpreted by a conscious mind.

    I'm fine with examining aspects of mental activity in terms of information, but information needs to be grounded in something else, to avoid circularity.
  • Patterner
    1.9k

If a walking robot with a mechanical eye is approaching a cliff, and turns to avoid it, was it because there was information? Photons hit the robot's sensor, a signal traveled to the hard drive, which is programmed to turn the body away from any drop greater than X.
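A minimal sketch of that loop, with the threshold "X" and the sensor readings as hypothetical stand-ins: the signal is processed and an action chosen, with no claim that anything is experienced along the way.

```python
# Toy sketch of the cliff-avoiding robot described above.
# DROP_THRESHOLD stands in for "X"; the sensor values are hypothetical.
DROP_THRESHOLD = 0.5  # metres

def choose_action(drop_ahead: float) -> str:
    """Turn away whenever the detected drop exceeds the threshold."""
    return "turn" if drop_ahead > DROP_THRESHOLD else "walk"

assert choose_action(2.0) == "turn"   # cliff detected: avoid it
assert choose_action(0.1) == "walk"   # safe ground: keep going
```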
  • Relativist
    3.5k
    If a walking robot with a mechanical eye is approaching a cliff, and turns to avoid it, was it because there was information?Patterner
Of course, and I agree information is relevant to ongoing mental activity. What I was referring to was understanding the fundamental nature of consciousness - the hardware that produces it. I should have been clearer. Sorry.
  • Patterner
    1.9k

    If information can exist in the presence or absence of consciousness, then consciousness isn't part of its grounding. What is it grounded in?
  • hypericin
    2k
    Ok, what's the plan? How do we understand it as informational? What do you have *ahem* in mind?Patterner

    Principles:
    1. Consciousness is informational
    2. Consciousness is naturalistic. (No woo!)
    3. Consciousness arose due to selective pressure.

    Why?
Given our principles, we can make an educated guess as to why consciousness arose. Consciousness is an extremely efficient means of organizing and processing information. Look at how we phenomenologically experience the information we receive: sight as spatially organized, painted with color that gives surface information; sound as directionally and positionally co-located in space, but otherwise orthogonal to it; smell as non-positional, and orthogonal to both. And so on. And then you have bodily awareness, with its own dimensions of feeling.

    We integrate all of this, into a holistic sense of everything that is happening. And crucially, based on conscious and unconscious decision making, we can attend to a narrow band of the overwhelming amount of information we receive. Our slow-brain (aka conscious) processing of this information is experienced as thoughts, themselves phenomenal, but marked as interior. Experiences and thoughts trigger memories, also phenomenal. We integrate all this, make predictions, and ultimately act.

    Contrast this with an organism trying to manage all this without consciousness. Just electrical signals, without qualitative feel. Imagine, from an engineering standpoint, the complexity of trying to organize a system that can integrate, analyze, and act on such an immense quantity of information. As the bandwidth and the number of streams of information grow, the task would become totally overwhelming.

    TLDR:
    Conscious brains DON'T process all information streams directly.
Conscious brains DO convert streams to conscious experiences, then process those.

    As informational inputs from the environment and the self grow to a certain point, consciousness becomes mandatory.
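A purely schematic sketch of that "convert, then attend" claim, with the stream contents, the summary step, and the salience rule all invented for illustration:

```python
# Schematic sketch of "convert streams to experiences, then process those".
# Stream data and the salience rule are invented for illustration only.
streams = {
    "sight": [0.2, 0.3, 0.9, 0.1],   # raw high-bandwidth samples
    "sound": [0.1, 0.1, 0.2, 0.1],
    "body":  [0.4, 0.5, 0.4, 0.5],
}

# Step 1: reduce each raw stream to one compact summary (the "experience"),
# instead of handing all the raw samples to the decision-maker.
percepts = {name: max(samples) for name, samples in streams.items()}

# Step 2: attend to the single most salient percept; only this narrow
# band reaches "slow" processing.
attended = max(percepts, key=percepts.get)
assert attended == "sight"   # the 0.9 spike wins attention
```

The point of the sketch is only the shape of the pipeline: the decision stage touches summaries, never the full bandwidth of the inputs.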

Who?
    Given this, we can gain a better perspective on who we are. In one sense, we are human animals, we are our bodies. But in another sense, we are, specifically, the portion of the brain tasked with decision making. The portion that makes use of conscious information, attends to it, thinks about it, predicts with it, remembers it, and ultimately, acts. Everything that is not processed as phenomenal consciousness, to us, does not exist. It is unconscious.

We, the 'we' that experiences, that imagines a 'self', are the specific part of the brain that connects to the world, and to our own bodies, by phenomenal consciousness, and nothing more. And so, at the same time, we are imprisoned by it.

    What?
    From our perspective, everything is conscious. To be aware of anything is to be conscious of it, definitionally. It is quite easy to mistake consciousness for reality. It is not. It is the result of intensive work by the brain, processing immense amounts of information so we may integrate and ultimately act on it.

    Consciousness is unreal, where what is unreal exists in the head, but not outside it.
    Consciousness is an illusion, where an illusion is that which presents as something it is not.
    Consciousness is virtual, where the virtual exists only in terms of a system which supports it.

I think these facts are crucial to keep in mind. It is easier to explain something unreal, illusory, and virtual than something real and actual. But the unreal still exists, as unreal. The illusion still exists, as illusion. An explanation is still required.

    How?
    This is all really framing for a revised hard problem:

    How and why does biology's method of organizing information lead to qualitative states? How could any such method lead to qualitative states?

Of course I cannot answer this. But perhaps the preceding offers some context, and clues. We don't have to explain something that exists; only something that exists for us, from our own perspective. We are already familiar with computers, information processing systems that can support arbitrary virtual worlds. I contend that the brain is the ultimate such system.

Still, there is a lot of mind bending to do. Computers can support virtual worlds. But they cannot support them as something experienced for themselves; only for the user. I take it as axiomatic that consciousness is naturalistic: it unproblematically fits into the natural world as an informational phenomenon. But how does it work? Can we build such a machine? What are the principles? Can we program a computer so that it experiences? Or is this a kind of information processing that a computer cannot support?

    Fundamental conceptual leaps still need to be made. But perhaps less fundamental than prodding pink tissue, and wondering how it could make the feels.
  • Relativist
    3.5k
    If information can exist in the presence or absence of consciousness...Patterner
That was part of my point: information does not exist in the absence of (an aspect of) consciousness. Characters on a printed page are not intrinsically information; it's only information to a conscious mind that interprets it, so it's a relational property.
  • hypericin
    2k
That was part of my point: information does not exist in the absence of (an aspect of) consciousness. Characters on a printed page are not intrinsically information; it's only information to a conscious mind that interprets it, so it's a relational property.Relativist

    I think you are talking about meaning, not information. Meaning is interpreted information. Also, there is no necessary involvement of consciousness. Machines can interpret information and derive meaning from it.
  • Relativist
    3.5k
    That sounds reasonable. I stand corrected.
  • hypericin
    2k
Nifty OP. I had pretty much the exact same revelation, though not so artfully told. It led me to a kind of dualist perspective, where the universe consists of matter in all its forms, and information. Although information seems somehow parasitic on matter, in that it needs a material medium in one form or another to exist (notwithstanding "it from bit" theories, which I don't understand).
  • Wayfarer
    25.9k
Thanks! I noticed your musings on that question above. The problem with 'information' is that, as a general term, it doesn't mean anything. It has to specify something, or be about something, to be a meaningful expression. Unlike, say, 'energy', which is 'the capacity to do work' and which is also definable in particular contexts.

There's a well-known and often-quoted aphorism from Norbert Wiener:

    The mechanical brain does not secrete thought 'as the liver does bile,' as the earlier materialists claimed, nor does it put it out in the form of energy, as the muscle puts out its activity. Information is information, not matter or energy. No materialism which does not admit this can survive at the present day. — Cybernetics: Or Control and Communication in the Animal and the Machine

So what are we to do? Admit it! If we admit that information is fundamental, like matter and energy, that goes some way to addressing this insight. But really not that far: as you grasp, designating something 'information' doesn't get us very far on its own.

    Machines can interpret information and derive meaning from it.hypericin

But can they? :chin: I've been interacting with AI since the day it came out - actually three weeks after the essay in the OP was published on Medium - and I think all of the ones I use (ChatGPT, Claude.ai and gemini.google) would query that. I put the question to ChatGPT, which replied:

    It depends what is meant by “interpret” and “derive meaning.” Machines certainly manipulate information and can model the patterns of meaningful discourse. But meaning in the strict sense involves intentionality, normativity, and understanding something as something.

My experience with AI systems strongly suggests they do not possess this. Whatever meaning appears is supplied by the human user in their engagement with the output, not generated in the system itself. But it's an amazingly realistic simulacrum, I'll give you that! And also, it's not a hill I would wish to die on, as it is another of those very divisive issues.

    (I created another thread on that topic, https://thephilosophyforum.com/discussion/16095/artificial-intelligence-and-the-ground-of-reason/p1, which also has a link to a rather good Philosophy Now essay on the subject.)
  • hypericin
    2k
    The problem with 'information' is that, as a general term, it doesn't mean anything.Wayfarer

Interesting; it certainly seems to mean something. It definitely does in everyday conversation, and so it does in the sense we are discussing: as something fundamental in the universe, alongside matter. Of course, as with so many things, pinning down exactly what it means is nontrivial.

    My experience with AI systems strongly suggests they do not possess this.Wayfarer

    I don't think LLMs could function as they do without understanding in some form (of course, without the sentience connotation the word usually caries). 'Intentionality' is out, and I'm not quite sure what 'normativity' is doing here.

    I'll be sure to check out the thread, I like the topic.