• fishfry
    3.4k
    Good attempt but tusks are nothing more than overgrown teeth.TheMadFool

    And consciousness is no more than an epiphenomenon, or doesn't exist, or is merely an emergent property, or some other such philosophical objection to the importance of consciousness.

    You are privileging consciousness (as I do myself); but you are not making the case that it should be privileged; and others make the case that it shouldn't.
  • Christoffer
    1.9k
    Can a P-zombie make choices? If it acts the same as a normal human, it needs to make choices, since humans make choices all the time, even as reactions to events and as variations on those reactions over time. We form choices based on our internal processing of our experience, not on the experience itself. If someone strikes us, we react, but we react differently depending on our current mental state, which is in turn based on our having processed different experiences.

    So I would say that P-zombies aren't possible if simulating a human is the primary function, since a simulation that reacts in exactly the same way all the time, regardless of when, is distinguishable from a human. In order to perfectly mimic a human, it needs to adapt and process experience, which in turn requires an internal mental state, and that goes against the concept of a P-zombie.

    So the very requirement for a P-zombie to exist is what makes it impossible: either it fails to pass as human, or it becomes just another human, and in being that, it is not a P-zombie. A P-zombie cannot act like a human without the thing that informs our acts. A person cannot act, or even simulate complex actions, without a consciousness that processes the reasons for acting.

    Case in point: does Ava in Ex Machina have consciousness? Without the ability to internally adapt and change her behavior, she would be stuck in a feedback loop of easily predictable choices and thus would not act like a human.
  • Harry Hindu
    5k
    IF physicalism is true THEN p-zombies are impossible.TheMadFool
    This is assuming that consciousness isn't physical, hence begging the question.

    Do you know enough about consciousness to assert whether it is physical or not? What does it even mean for something to be physical or not?
  • SolarWind
    207
    Case in point: does Ava in Ex Machina have consciousness? Without the ability to internally adapt and change her behavior, she would be stuck in a feedback loop of easily predictable choices and thus would not act like a human.Christoffer

    That is the right question. And the answer is: We can't know, because we don't have a bridge from the third-person perspective to the first-person perspective. Quite simply, both possibilities are conceivable. Likewise, p-zombies are also conceivable.

    All that remains is the similarity principle: the more similar something is to us, the more likely we are to assume it has a first-person perspective. But the similarity principle is not a law of nature like the others.
  • Christoffer
    1.9k
    That is the right question. And the answer is: We can't know, because we don't have a bridge from the third-person perspective to the first-person perspective. Quite simply, both possibilities are conceivable. Likewise, p-zombies are also conceivable.SolarWind

    But we can draw conclusions from the third-person perspective by studying the subject's choices. Ava can't make choices that adapt over time without having a consciousness. Adaptive behavior requires internal processing and emotional awareness; otherwise we get repetitive behavior that is easily spotted as having no internal thought behind it.

    All that remains is the similarity principle: the more similar something is to us, the more likely we are to assume it has a first-person perspective. But the similarity principle is not a law of nature like the others.SolarWind

    That requires us to attribute something to ourselves that we assume is missing in the subject. If the subject displays all the actions and behaviors that require the same foundation as our own behavior, it is the same as us. If it doesn't, it won't act like us.

    If you copy my body into a robot form that is programmed to act entirely as I do, based on a behavioral prediction algorithm of me, it will mimic me for the first minute, then start repeating itself while I adapt and change my behavior pattern. Without consciousness, without any internal mental processing of experiences, both emotional and systematic, the P-zombie would not be able to behave as me at all, because we can't separate behavior from consciousness.

    A P-zombie is required to uphold the illusion of being human over the course of time, but even the most complex P-zombie robot would not be able to sustain such an illusion for long. So by observing choices and behavioral changes, it would be possible to spot a lack of consciousness; and if no such lack can be spotted, they aren't P-zombies by that definition, because they can't be.
  • SolarWind
    207
    A P-zombie is required to uphold the illusion of being human over the course of time, but even the most complex P-zombie robot would not be able to sustain such an illusion for long.Christoffer

    You are right in the sense that, to date, there is no chatbot that passes the Turing test. Just ask a chatbot what it thinks about climate change and it will answer something like "Ask someone who knows about it." Ridiculous.

    But the question is, what if there was a chatbot that passed the Turing test?
  • Christoffer
    1.9k
    But the question is, what if there was a chatbot that passed the Turing test?SolarWind

    This is why Ex Machina is a good philosophical case study. The whole premise is that a chatbot can already be made to pass the Turing test; the real test is to study a robot you know is a robot and determine whether or not it is conscious.

    We could argue that complex consciousness, and the choices that come out of it, is just a form of synthesis between different inputs. The robot sees coffee for the first time, uses data telling it that coffee is good, smiles and takes a sip, registers the taste, combines that taste with a recorded input from the past in which a similar taste was encountered but spiked with extreme acidity, and concludes that coffee is not good. The reaction to tasting coffee is: "I just remembered, I don't like coffee."
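
    To make the synthesis idea concrete, here is a minimal sketch in Python of the coffee reaction described above. The class, the stored memory, and the matching rule are hypothetical illustrations of the idea, not a claim about how any real system works.

    from dataclasses import dataclass

    @dataclass
    class Memory:
        taste: str    # e.g. "bitter"
        verdict: str  # e.g. "bad" (a similar drink once spiked with acidity)

    def react_to_coffee(current_taste: str, memories: list[Memory]) -> str:
        """Combine the new input with recorded past inputs and pick a verdict."""
        verdict = "good"  # prior data says coffee is good, so start positive
        for m in memories:
            # A remembered similar taste with a negative verdict overrides it.
            if m.taste == current_taste and m.verdict == "bad":
                verdict = "bad"
        if verdict == "bad":
            return "I just remembered, I don't like coffee"
        return "This coffee is good"

    # Usage: one stored memory of a similar bitter drink spiked with acidity.
    print(react_to_coffee("bitter", [Memory(taste="bitter", verdict="bad")]))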

    Such a reaction might seem like a very complex reaction to tasting coffee: a reaction that includes memory and the ability to be wrong in the first decision to taste something that seemingly tastes good. The structure of this reaction sounds like how we perceive memory, but there's no indication that the experience is as we experience it.

    However, the causal web of such internal processing of reactions and choices grows so large that, by the time it forms a foolproof system, its complexity is the same as that of normal consciousness. It cannot, therefore, be less complex than consciousness and still pass as consciousness. By mimicking consciousness, it has already become conscious.

    We have AI systems today that actually do this type of synthesis. All those "art by AI" images are AIs taking existing images and creating something new, and they do this without input as to how they should combine them. But they are not doing so as a reaction to an emotional request. If you ask such a system to paint a house that feels like a morning in spring when you have just fallen in love, it cannot create an interpretation of that request; and even if a more complex system could, it would produce a different version every time you asked, and it would not be able to change after a period of meditation on the nature of love.

    A P-zombie does not survive the Ship of Theseus, since it cannot adapt its behavior after a period of experience without a consciousness that can process that time and experience. A P-zombie is fixed in time and will always fail at the simulation as long as it lacks consciousness.

    Ava smiles in a scene when she is alone, no one observing her, reacting to nature. Why is she smiling?

    Behavior can't exist without consciousness and P-zombies can't exist without behavior.
  • InPitzotl
    880
    It will mimic me for the first minute, then start repeating itself while I adapt and change my behavior pattern.Christoffer
    Just a quick interjection... this statement suggests to me two things: (1) a non-repetitive robot is conscious, (2) a non-repetitive robot is incredibly difficult to build. Both 1 and 2 are dubious.
  • Christoffer
    1.9k
    Just a quick interjection... this statement suggests to me two things: (1) a non-repetitive robot is conscious, (2) a non-repetitive robot is incredibly difficult to build. Both 1 and 2 are dubious.InPitzotl

    Without context, yes, but a non-repetitive robot in this case means non-repetition in adaptive behavior; that is, it doesn't just vary its output at random. It doesn't respond randomly to each similar input; rather, based on the experience of past outputs, it deliberately acts differently. It reflects on past outputs as reactions to the input and doesn't repeat the same output again, but instead adjusts the output based on new experiences and knowledge. Robots today can do this, but always in a quantifiable way; we can always see the iterations, even version them. But when a P-zombie robot mimics a human to the point that we cannot measure any difference from a human, it is by then so conscious that it cannot be a P-zombie.
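
    As an aside, here is a minimal sketch in Python of what "quantifiable, versionable" adaptation might look like, assuming the adjustment rule is nothing more than a counter over past outputs; the class and the rule are hypothetical, purely for illustration. The output never literally repeats itself, yet the whole "inner life" is just an inspectable list.

    class VersionedResponder:
        def __init__(self) -> None:
            self.history: list[str] = []  # every past output, fully inspectable

        def respond(self, stimulus: str) -> str:
            # Deterministic rule: the reply depends on how often this stimulus
            # has already been answered, so repeated inputs get varied outputs...
            count = sum(1 for past in self.history if stimulus in past)
            reply = f"{stimulus} (response v{count + 1})"
            self.history.append(reply)
            return reply

    r = VersionedResponder()
    print(r.respond("greeting"))  # greeting (response v1)
    print(r.respond("greeting"))  # greeting (response v2)
    # ...but we can diff, version, and predict the next output exactly.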

    Non-repetitive behavior does not equal consciousness, but behavior that adjusts over time, leading to deliberately non-repetitive behavior that is unquantifiable over time, should be on the same level as consciousness in a human.

    The point being that in order for this behavior to take the form of a perfect mimicry of a human, it requires the same internal life that a human has; otherwise the behavior will either be repetitive or so different that it cannot be a mimicry of a human, and it would act totally differently, as seen in complex AI experiments.

    Making a P-zombie requires a complexity of internal processes such that, by the time it reaches that level, it is no longer a P-zombie but another human, or a conscious replica of a human.
  • Caldwell
    1.3k
    Explain yourself first.TheMadFool

    Define complexity, please. Examples do not replace definitions.