To grant a robot qualia requires a change in its programming, not its matter. — tom
If we want to give a robot subjectivity - i.e. "what it is like" knowledge - we have to program it that way. Swapping out a hard drive or adding more memory is not going to affect the running of the program that achieves this. Which particular hardware constitutes the robot is irrelevant, but panpsychists claim it is relevant! — tom
We know that the robot, as a robot, does not possess subjectivity because we programmed it that way. — tom
The hard problem may indeed be hard, but I think the problem of how to create knowledge - of any kind - is the fundamental problem. — tom
I'm kind of grouping these since they are related.
I think, broadly speaking at least, whether a robot can pick out a red card from all the other colors is not the same sort of thing the hard problem of consciousness is about. We can imagine a philosophical zombie, for instance, being able to identify red cards among all other colors. And the philosophical zombie is already more sophisticated than a robot in that it has all of our functional capacities -- which is (again, broadly speaking) how Chalmers characterizes naturalism -- it just lacks consciousness, the "feel"-iness of first-person experience.
We do not program the robot to have knowledge of qualia. We program it to identify cards which reflect light at such-and-such a wavelength, then to send some kind of indicator that it has done so to us.
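To put that concretely, here is a minimal sketch of what such a program amounts to; the wavelength band, the function names, and the card IDs are illustrative assumptions, not anyone's actual robot:

```python
# Hypothetical sketch: "identifying a red card" here is just a
# numeric comparison on a sensor reading. Nothing in the program
# refers to, or requires, an experience of redness.

RED_BAND_NM = (620.0, 750.0)  # assumed wavelength range for "red", in nanometres

def reads_as_red(wavelength_nm: float) -> bool:
    """Return True if the measured wavelength falls in the red band."""
    low, high = RED_BAND_NM
    return low <= wavelength_nm <= high

def indicate(card_id: str, wavelength_nm: float) -> None:
    """Send the 'red card found' indicator back to the operator."""
    if reads_as_red(wavelength_nm):
        print(f"card {card_id}: red detected at {wavelength_nm} nm")

indicate("A1", 650.0)  # prints: card A1: red detected at 650.0 nm
indicate("B2", 480.0)  # prints nothing: the reading falls outside the band
```

Everything the robot does is exhausted by that comparison and the indicator; whether there is anything it is like to run it is precisely what the program leaves unaddressed.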
Also, I would say that the question of qualia, while certainly related, is different from pan-psychism, in that we could defend pan-psychism without, in turn, defending the more particular notion that qualia exist (at least as entities -- of course, we can use the word 'qualia' simply to refer, in general, to particular instances of subjective experience without committing ourselves to separately existing causal entities called qualia).
How do all the fundamental particle consciousnesses combine to create a unified consciousness, and why does that require a brain? That is, how does a single unified consciousness emerge? This is the same question we have without panpsychism! — tom
I think this is a problem of psychological identity, which is something one can ask regardless of one's stance on pan-psychism.
Even if there is no subjective experience, we have people who profess to have a unified consciousness, and in general we observe that people who make such reports tend to have brains, so we can ask how this phenomenon occurs.
So, I'd just say that what pan-psychism sets out to answer isn't this question.
Are atoms more conscious than fundamental particles? How about mobile phones?
Are humans 'more' conscious than dogs?
Honestly, one reason to adopt pan-psychism is that it gets rid of this question. With emergence we might ask: at what point does a system gain consciousness? Does it come in degrees?
But I think a consistent pan-psychism would simply say that 'more' or 'less' aren't quite applicable here. It's a 'yes/no' question, and the answer is always 'yes', insofar as what we are naming is an entity (since clearly we can also speak of things which do not exist, and which would thereby not be conscious).
It's just that the subjective experience of an electron differs from that of an atom differs from that of a cell-phone differs from that of a robot differs from that of a human.
Why are there no semi-conscious things? Or rather, there must be semi-conscious things; how do we identify them?
Because everything is conscious
:D -- so there is nothing to identify.
Why do I lose consciousness when I'm asleep, given that I am physically the same? Do my fundamental particles also sleep?
I'd have to be a fundamental particle to say whether or not I sleep. By all observations, at least, I'd infer 'no' -- but there's no reason to rule it out, I suppose.
Also, this question hinges on two different meanings of the word 'consciousness'. One such meaning is 'awareness', as in "I am conscious of Matt's feelings for me" meaning the same thing as "I am aware of Matt's feelings for me". When you lose consciousness in your sleep you lose awareness. But you do not lose out on what it is like to sleep. We feel dreams, after all, at least the ones which we happen to remember after waking up. I don't see why we wouldn't feel the ones we don't remember just because we don't remember them, or why sleep, itself, doesn't have a subjective side just because we don't quite remember what it is like afterwards.
It seems to me that, given enough understanding of memory and sleep, we could actually engineer ourselves to retain such memories.