• Patterner
    2k
    Coincidentally, i just started listening to Anil Seth's Being You: A New Science of Consciousness on my commute. He says exactly what I think.
    Here’s why the zombie idea is supposed to provide an argument against physicalist explanations of consciousness. If you can imagine a zombie, this means you can conceive of a world that is indistinguishable from our world, but in which no consciousness is happening. And if you can conceive of such a world, then consciousness cannot be a physical phenomenon.

    And here’s why it doesn’t work. The zombie argument, like many thought experiments that take aim at physicalism, is a conceivability argument, and conceivability arguments are intrinsically weak. Like many such arguments, it has a plausibility that is inversely related to the amount of knowledge one has.

    Can you imagine an A380 flying backward? Of course you can. Just imagine a large plane in the air, moving backward. Is such a scenario really conceivable? Well, the more you know about aerodynamics and aeronautical engineering, the less conceivable it becomes. In this case, even a minimal knowledge of these topics makes it clear that planes cannot fly backward. It just cannot be done.

    It’s the same with zombies. In one sense it’s trivial to imagine a philosophical zombie. I just picture a version of myself wandering around without having any conscious experiences. But can I really conceive this? What I’m being asked to do, really, is to consider the capabilities and limitations of a vast network of many billions of neurons and gazillions of synapses (the connections between neurons), not to mention glial cells and neurotransmitter gradients and other neurobiological goodies, all wrapped into a body interacting with a world which includes other brains in other bodies. Can I do this? Can anyone do this? I doubt it. Just as with the A380, the more one knows about the brain and its relation to conscious experiences and behavior, the less conceivable a zombie becomes.

    Whether something is conceivable or not is often a psychological observation about the person doing the conceiving, not an insight into the nature of reality. This is the weakness of zombies. We are asked to imagine the unimaginable, and through this act of illusory comprehension, conclusions are drawn about the limits of physicalist explanation.
    — Seth
    Not to worry. I already disagree with him in other ways.
  • AmadeusD
    4.2k
    This doesn't appear to me as an argument against anything but aesthetic implication (it would be weird, no?).

    It doesn't seem to address the fact that the Hard Problem and P-zombies are exactly meant to invoke the gap science is trying to fill.
    Unless you can fully understand consciousness in physical terms (I do not believe this is possible, but even if it is, we don't have that understanding yet), then p-zombies are coherent until we do (and it excludes that possibility). 180 Proof made a similar error earlier in the thread (though, it was years ago). "Identical" to a 'conscious being' would be a conscious being. Being "physically identical" is the actual case in the TE.

    But I agree with Seth - it's a very weak argument against Physicalism, for sure. It's just that he assumes he's right:

    is to consider the capabilities and limitations of a vast network of many billions of neurons and gazillions of synapses (the connections between neurons), not to mention glial cells and neurotransmitter gradients and other neurobiological goodies, all wrapped into a body interacting with a world which includes other brains in other bodies. Can I do this? Can anyone do this? I doubt it. — Seth

    This presupposes a physicalist account for it to be a decent objection, I think. I also think Seth (among others) overstates the correlation we find between certain parts of the brain and fairly imprecise conscious experience. If the brain is a receiver, nothing here has any real weight on the question/s. But it would certainly rule out an emergent (from neural activity) account of consciousness.
  • RogueAI
    3.5k
    "Can you imagine an A380 flying backward? Of course you can. Just imagine a large plane in the air, moving backward. Is such a scenario really conceivable? Well, the more you know about aerodynamics and aeronautical engineering, the less conceivable it becomes. In this case, even a minimal knowledge of these topics makes it clear that planes cannot fly backward. It just cannot be done."

    I like this. I keep trying to imagine a p-zombie kicking up its feet at the end of a hard day and drinking a couple of beers to take the edge off, and I keep not being able to do it. I can, superficially, but when I try to pair my imagining with a being that has no mental states, it's impossible.
  • RogueAI
    3.5k
    Unless you can fully understand consciousness in physical terms (I do not believe this is possible, but even if it is, we don't have that understanding yet), then p-zombies are coherent until we do (and it excludes that possibility). — AmadeusD

    What would the history of p-zombie world be? Is there a coherent story that could be told where p-zombies evolved like we did and developed language, like we did? How could p-zombie language have any referents to mental states?
  • AmadeusD
    4.2k
    Yeah, that's a bit of a problem. I wasn't under the impression that would need accounting for, though. I can see the evolution side occurring in roughly the same way it has, but I imagine we are still about where we were 250,000 years ago culture-wise (i.e. <1, near zero) and obviously more like several million years ago in terms of actual behavioural capacities. It's a very different world, no doubt, and would take some serious storytelling to get going.
  • Patterner
    2k

    It seems to me that needs accounting for. Why would something that has no subjective experience - something for which there is nothing it is like to be, to itself - ever develop language about these things? If asked "Are you conscious?", why would it say "Yes"?
  • AmadeusD
    4.2k


    Why are we assuming language? That seems a conscious ability, whereas we're talking about physically identical, yet non-conscious entities.
  • Patterner
    2k
    Why are we assuming language? That seems a conscious ability, whereas we're talking about physically identical, yet non-conscious entities.AmadeusD
    That's the scenario we're given. P-zombies are supposed to act exactly like us. We would have no way of knowing that they have no consciousness. So they talk. And they answer questions the same ways we do.
  • AmadeusD
    4.2k
    That's the scenario we're given. P-zombies are supposed to act exactly like us. We would have no way of knowing that they have no consciousness. So they talk. And they answer questions the same ways we do.Patterner

    That is not how I've ever understood any version of the TE.

    p-zombies are physically the same, yet unconscious. No idea why we are assuming they're behaving exactly the same? If I've got that wrong, then I have got that wrong.
  • RogueAI
    3.5k
    That is not how I've ever understood any version of the TE.

    p-zombies are physically the same, yet unconscious. No idea why we are assuming they're behaving exactly the same? If I've got that wrong, then I have got that wrong.
    AmadeusD

    They're supposed to act the same as us: talking, fighting, warring, yelling out "Ouch!" when they smash their toe, crying watching Schindler's List, etc. They wouldn't, of course, which is why they're incoherent.
  • AmadeusD
    4.2k
    They wouldn't, of courseRogueAI

    No, they wouldn't, but I don't understand how it's possible it could be contended that they're 'supposed' to. So, I have no idea where to go with this now :lol:
  • Deleted User
    0
    This user has been deleted and all their posts removed.
  • AmadeusD
    4.2k
    Fair enough; I guess I've misunderstood the TE. Whoops lol.

    I don't really see those elements as relevant (at least certainly not necessary) to the Hard Problem. For my part, when I consider this TE in the HP context, I imagine a being physically exactly the same as a typical human but without conscious experience (i.e. that's the only difference), meaning there is no sadness or happiness. They do not have the experience required to inform that. It can't be 'shown' without the experience. My job is to figure out the difference between the p-zombie I've described and a human with conscious experience.

    I am under the impression that this requires biting the "consciousness is not emergent from neural activity" bullet hard, but nothing else - it only serves to preclude a fully physicalist account of consciousness, and all the interesting questions are still in the air (what, where from, how, why, etc.) about consciousness.
  • Patterner
    2k
    Replace p-zombie with a computer that perfectly simulates human personality. Does the computer feel sadness when it cries? That is basically the question.Lionino
    The difference is that we can program computers to act like us. But there's no reason to think p-zombies would act like us.
  • Deleted User
    0
    This user has been deleted and all their posts removed.
  • Patterner
    2k
    The difference is that we can program computers to act like us. But there's no reason to think p-zombies would act like us.
    — Patterner

    By your own argument, there is. The p-zombie would be biologically wired to act like us.
    Lionino
    That's not my argument. That's the premise, which i dispute.
  • Deleted User
    0
    This user has been deleted and all their posts removed.
  • Patterner
    2k

    My turn to not understand. :grin:
    How would the p-zombies, which do not possess consciousness, come to be programmed to speak and act as though they did?
  • Deleted User
    0
    This user has been deleted and all their posts removed.
  • Patterner
    2k

    Right. If physicalism is absolute, then p-zombies - exact physical duplicates of us, down to the smallest detail - without consciousness are not a possibility. Any physical duplicate would be conscious.

    If there is something like dualism, panpsychism, or whatever other ideas there are, and we remove that from an exact duplicate, so there is only the physical, and there is no consciousness, then there is no reason they would say Yes if asked if they are conscious, or have words for such concepts in their language.
  • Deleted User
    0
    This user has been deleted and all their posts removed.
  • Patterner
    2k

    If I asked a p-zombie if it was conscious, I would think its brain would prompt it to say something like, "What is 'conscious'?" Why would a computer that had no programming or memory related to consciousness think it was conscious, or come up with the idea on its own? If a p-zombie with no consciousness, nothing but stimulus and response, existed, why would it answer other than the way the computer would?
  • Deleted User
    0
    This user has been deleted and all their posts removed.
  • Patterner
    2k
    The premise of p-zombies is that they would not ask that. They act exactly the same as us.Lionino
    Yes. My position is that the premise is not conceivable. Yes, we can write the words "I conceive of a p-zombie with such-and-such characteristics." But that's just writing words. I can write any outlandish thing I want, but that doesn't make it conceivable.
    A square circle that was shaped like a pyramid and made entirely of chocolate flavored whipped cream flew into a black hole, lived there for a year, changed its mind, and flew back out.


    If you train an AI on comments talking about things such as feelings and so on, the AI would talk as if it is conscious.Lionino
    Yes. But if you didn't train it that way, why would it? If you didn't train p-zombies that way, why would they?
  • RogueAI
    3.5k
    I would think its brain would prompt it to say something like, "What is 'conscious'?"
    — Patterner

    The premise of p-zombies is that they would not ask that. They act exactly the same as us.
    Lionino

    But if we are asked if we have attribute x, and we don't have it or don't know what x is (e.g., telepathy), we would either say "no" or "what are you talking about?" We don't (usually) lie and pretend we have x. The p-zombie isn't conscious. Insofar as it knows things, it would know it's not conscious. So when asked if it's conscious, you're saying it would lie? If so, the zombies aren't acting like us. If not, then by their own admission they're not conscious.
  • Deleted User
    0
    This user has been deleted and all their posts removed.
  • AmadeusD
    4.2k
    Yeah, I'm understanding I've gotten the TE wrong - but I also can't work out why it's the way it is.

    It makes little sense because it's importing all of the requirements of success into the experiment. I don't see how that matters - the issue, surely, is whether or not a physically identical being would be conscious. So, a p-zombie, to me, should be conceived as physically absolutely identical but not conscious. To me, that's the bullet to bite. I don't really grok how one could confirm or deny the potential for a being acting fully conscious, yet not being so. It begs the question, surely.
  • Deleted User
    0
    This user has been deleted and all their posts removed.
  • RogueAI
    3.5k
    The zombie does not know anything, does not feel anything, it does not think.Lionino

    So I'm supposed to think my p-zombie doppelganger will be able to do my job effectively and navigate the world without knowing anything and/or thinking? How would that work, exactly?
  • RogueAI
    3.5k
    If asked if it is conscious, it will say "Yes" because that is what we would do.Lionino

    Can a p-zombie lie?
Welcome to The Philosophy Forum!
