• frank
    14.6k

    They're conscious by a functionalist definition, aren't they?
  • TheMadFool
    13.8k
    Hi. You dug up a corpse of a thread.

    My point is simple. Imagine an AI (X) that looks exactly like a human, i.e. externally; you don't have the option of examining its innards, which would be a dead giveaway.

    Now picture a p-zombie (Y). Unlike the AI, it is physically indistinguishable from a conscious human, but here too you're not allowed to do a thorough inspection of the specimen.

    Both X and Y will convince you that each is conscious. In other words, if I (assuming I'm not an AI or a p-zombie) stand next to X and Y, and you interview the three of us, you simply won't be able to tell which is which.

    You have two choices:

    1. Declare that all 3 are conscious.

    OR

    2. Declare that all 3 are not conscious.

    If 1, physicalism is true (p-zombies are impossible) BUT you'll have to concede that AI is conscious, and not just in the sense that it can mimic consciousness (pass the Turing test), but that it is actually conscious.

    If 2, physicalism is false (p-zombies are possible) BUT then you'll have to contend with the possibility that other people are p-zombies.

    It's a dilemma: either AI is truly conscious OR other people could be p-zombies.
  • Cabbage Farmer
    301
    You have two choices:

    1. Declare that all 3 are conscious.

    OR

    2. Declare that all 3 are not conscious.
    TheMadFool
    If I had good reason to believe that there were lots of human-like AI robots and lots of p-zombies tooling around the Earth, I would go with a third choice, which you neglected to mention:

    3. Suspend judgment on whether these seeming sentient beings are genuine sentient beings, p-zombies, or mere simulations.

    However, since I as yet have no reason to believe there are any AI systems that seem (physically and behaviorally) just like humans from the outside, and since I have no reason to believe that there is (or could be) any such thing as a p-zombie, I don't bother suspending judgment on this point.

    Ordinarily I infer that the seeming-humans I encounter in my immediate vicinity are genuine human beings. Of course I could be wrong. But this conceivability of error is no different than the conceivability of error that attends all my ordinary perceptual judgments. Ordinarily I infer, further, that genuine human beings are sentient beings -- except perhaps when they seem unconscious, in which case I may suspend judgment on the matter.

    In any case, surely my "declaring" that something is a genuine human (or a genuine dog, star, barn...) doesn't make it so. The facts are the facts, no matter what declarations I may be disposed to heap upon them on the basis of my own perceptual experience.

    If 1, physicalism is true (p-zombies are impossible) BUT you'll have to concede that AI is conscious, and not just in the sense that it can mimic consciousness (pass the Turing test), but that it is actually conscious.

    If 2, physicalism is false (p-zombies are possible) BUT then you'll have to contend with the possibility that other people are p-zombies.

    It's a dilemma: either AI is truly conscious OR other people could be p-zombies.
    TheMadFool
    In keeping with my preceding assessment of your list of "declarations", I'd have to say the rest of your argument doesn't get off the ground.
  • Cabbage Farmer
    301
    They're conscious by a functionalist definition, aren't they?
    frank
    What sort of functionalist definition do you have in mind? And do you mean the p-zombies, the Turing AI, or both?
  • frank
    14.6k
    What sort of functionalist definition do you have in mind? And do you mean the p-zombies, the Turing AI, or both?
    Cabbage Farmer

    A functionalist says there are only functions of consciousness like reportability. There's no extra awareness. IOW, functionalists basically think we're all p-zombies or Turing AIs.
  • TheMadFool
    13.8k
    3. Suspend judgment on whether these seeming sentient beings are genuine sentient beings, p-zombies, or mere simulations.
    Cabbage Farmer

    I'd have to say the rest of your argument doesn't get off the ground.
    Cabbage Farmer

    In other words, you can't tell whether the three (other humans, true AI, p-zombies) are conscious or not. You can't commit or come to a definitive conclusion, because doing so has implications that you're not willing to accept.

    The dilemma, as I stated it earlier:

    If you say other humans are conscious, you'll have to accept that AI and p-zombies are conscious. We can forget about the p-zombie for the moment and focus our attention on AI: they have to be treated as conscious/sentient.

    If you affirm that AI isn't conscious, you must concede that other humans may not be conscious/sentient or that other humans are p-zombies.

    So, either AI is conscious or other humans are p-zombies.

    Thus, I've demonstrated that AI and p-zombies are intimately linked. That amounts to something, right?
  • Cabbage Farmer
    301
    A functionalist says there are only functions of consciousness like reportability. There's no extra awareness. IOW, functionalists basically think we're all p-zombies or Turing AIs.
    frank
    I'm aware that sort of view has been fashionable among hard behaviorists, functionalists, computationalists, eliminative materialists, and their ilk. But I'm not sure all functionalists are committed to that sort of view.

    Consider the definitive doctrine ascribed to the functionalist by the SEP, "that what makes something a mental state of a particular type does not depend on its internal constitution, but rather on the way it functions, or the role it plays, in the system of which it is a part." Couldn't one hold this "doctrine" while remaining agnostic about the "extra awareness" you indicate? I suppose one might adopt a functionalist account of "mental states", and even of "mind", without denying that some or all minds have that "extra" awareness, and perhaps without any interest in that proposition.

    That qualification aside: I agree that if one believes "there's no extra awareness", as you put it, then the distinction between sentient beings and p-zombies collapses.

    So far as I can make out, that would mean there's no sense in talking about p-zombies for them. For the rest of us, it will seem as though their conception of sentience is akin to our conception of the p-zombie. I'm not sure where that leaves their talk of AI. I mean, on what grounds would they require that an AI system pass the Turing test in order to count as "conscious"? I'd expect them to count a much wider range of AI systems as "conscious". I'd ask them to provide some account of their distinction between conscious and nonconscious systems, regardless of whether they are natural or artificial.
  • frank
    14.6k
    Couldn't one hold this "doctrine" while remaining agnostic about the "extra awareness" you indicate? I suppose one might adopt a functionalist account of "mental states", and even of "mind", without denying that some or all minds have that "extra" awareness, and perhaps without any interest in that proposition.
    Cabbage Farmer

    I think so; it's just that the explanations provided by functionalism don't cover qualia. So it's left hanging, so to speak.

    So far as I can make out, that would mean there's no sense in talking about p-zombies for them. For the rest of us, it will seem as though their conception of sentience is akin to our conception of the p-zombie.
    Cabbage Farmer

    The p-zombie was originally just a thought experiment indicating that consciousness doesn't reduce to function, so functionalism can't pass for a complete theory of consciousness.

    A functionalist might take up the idea of the p-zombie to show that functionalism is conceivably adequate for explaining consciousness. But then, nobody has argued that it's inconceivable that we're all p-zombies. It just doesn't square with what most of us experience.

    I mean, on what grounds would they require that an AI system pass the Turing test in order to count as "conscious"?
    Cabbage Farmer

    I don't know. I gather that consciousness is pretty thoroughly deflated for functionalists, so they can say the word applies or doesn't as they like. Maybe they would advise that it's all a matter of language games.