Therefore, wouldn't a p-zombie notice its lack of consciousness and experiences and comment on these, thus not being completely similar in its actions to a human being? — BlueBanana
If this kind of being existed - one that reacted to all its perceptions in the way a human would, but did not have the perception of its conscious experiences or thoughts - what would its reaction to this then be like? — BlueBanana
Take the Conscious Visual experience of the scene the p-Zombie is looking at. Without the Conscious Visual experience the p-Zombie would be Blind and would not be able to move around in the World without bumping into things. The Conscious Visual experience is the final stage of the Visual process. People who think the Conscious experience is not necessary for the p-Zombie to move around in the World are exhibiting Insane Denial of the purpose of the Conscious Visual experience.

However, a p-zombie, despite having no consciousness, reacts to stimuli in exactly the same way a human being would. — BlueBanana
Sure, a p-zombie would notice something missing, just like a blind person notices something missing when they hear others talking about their visual experiences.

However, a p-zombie, despite having no consciousness, reacts to stimuli in exactly the same way a human being would. A human, however, is aware of its own consciousness and sentience, and this awareness in itself is a perception that the human reacts to. Therefore, wouldn't a p-zombie notice its lack of consciousness and experiences and comment on these, thus not being completely similar in its actions to a human being? — BlueBanana
How would a non-conscious being reflect on its own condition? — TheMadFool
Doesn't the term "zombie" specifically deny self-awareness of any kind? — TheMadFool
Depends on how self-awareness is defined, I guess. By the definition of a p-zombie, it should act as if it had self-awareness. — BlueBanana
If you define y as x, then it becomes impossible to inquire into how x and y may be compared. — TheMadFool
Any device that can do what a person or other animal can do has "Consciousness". That's how it does those things, you know. — Michael Ossipoff

No, it does those things through causality causing it to do those things. Doing those things does not imply that consciousness is involved.

Consciousness is the property of being a purposefully-responsive device. — Michael Ossipoff

Where is the consciousness in a mousetrap? How does it arise from its physical nature?
Yes, it’s a vague definition, because our use of the word “Consciousness” is imprecise. Really, “purposeful-responsiveness” is a better, more uniformly-used term. — Michael Ossipoff
some Spiritualist notion of a separate entity, separate and different from the body. — Michael Ossipoff
But Humans don't work like Robots. Humans and probably all Conscious beings have a further processing stage that presents the Visual experience to them. The Visual experience is what we use to move around in the World. When I reach out to pick up my coffee cup I see my Hand in the Conscious Visual experience. If my hand is off track I adjust my hand movement until I can touch the handle and pick up the coffee cup. It would be much more difficult to do this without the Conscious Visual experience. The Conscious Visual experience contains an enormous amount of information that is all packed up into a single thing. The Neural Activity is not enough. We would need far more Neural Activity to equal the efficiency that the Conscious Visual experience provides us.

As a counter-example, robots lack conscious visual experience but manage to react accordingly to information transferred by photons. — BlueBanana
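As an aside, the hand-adjustment SteveKlinko describes is a closed feedback loop, and that part at least can be sketched mechanically. The following is only an illustrative toy (the positions, gain, and tolerance are invented for the example, and nothing in it bears on whether such a loop is conscious):

```python
# Toy sketch of the closed-loop adjustment described above: the loop
# repeatedly compares where the hand is seen to be with where the cup
# handle is, and nudges the hand to shrink the error. All numbers here
# are invented for illustration.

def reach_for_handle(hand, handle, gain=0.5, tolerance=0.01, max_steps=100):
    """hand, handle: (x, y) positions. Returns the hand's path."""
    path = [hand]
    for _ in range(max_steps):
        error = (handle[0] - hand[0], handle[1] - hand[1])
        if abs(error[0]) < tolerance and abs(error[1]) < tolerance:
            break  # close enough to touch the handle
        # adjust the movement in proportion to the observed error
        hand = (hand[0] + gain * error[0], hand[1] + gain * error[1])
        path.append(hand)
    return path

path = reach_for_handle(hand=(0.0, 0.0), handle=(0.30, 0.12))
print(len(path), path[-1])  # ends near (0.30, 0.12) after a few corrections
```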
If a Computer could experience for example the Color Red then I would agree. But a Computer does not Experience anything. A Computer can be programmed to scan pixels in an image to find the Red parts. A Computer will look for pixels with values that are within a certain range of numbers. A Computer never has a Red experience but it can find the Red parts of an image. So just because it can find the Red parts of an image, like a Human can, it does not mean it has a Conscious Red experience while doing this. A Computer works in a different way than a Conscious being does. Science doesn't understand enough about Consciousness yet to design Machines that have Consciousness.

Any device that can do what a person or other animal can do has "Consciousness". That's how it does those things, you know. — Michael Ossipoff
Any device that can do what a person or other animal can do has "Consciousness". That's how it does those things, you know. — Michael Ossipoff
If a Computer could experience for example the Color Red then I would agree. But a Computer does not Experience anything. — SteveKlinko
A Computer can be programmed to scan pixels in an image to find the Red parts. A Computer will look for pixels with values that are within a certain range of numbers. A Computer never has a Red experience but it can find the Red parts of an image.
So just because it can find the Red parts of an image, like a Human can, it does not mean it has a Conscious Red experience while doing this.
A Computer works in a different way than a Conscious being does.
Science doesn't understand enough about Consciousness yet to design Machines that have Consciousness.
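For what it's worth, the pixel-scanning described above really is just number comparison. Here is a minimal, purely illustrative sketch of "look for pixels with values within a certain range" (the RGB thresholds and the tiny image are arbitrary assumptions, not anything specified in the thread):

```python
# Illustrative sketch of the pixel-scanning described above: the program
# only compares stored numbers against a range; the thresholds below are
# arbitrary assumptions.

def find_red_pixels(image):
    """image: list of rows, each row a list of (r, g, b) tuples, 0-255."""
    red_pixels = []
    for y, row in enumerate(image):
        for x, (r, g, b) in enumerate(row):
            # "red" here just means: r is high while g and b are low
            if r > 150 and g < 100 and b < 100:
                red_pixels.append((x, y))
    return red_pixels

# Example: a 2x2 image with one red-ish pixel at (1, 0)
tiny_image = [[(20, 20, 20), (200, 30, 40)],
              [(90, 90, 90), (10, 10, 10)]]
print(find_red_pixels(tiny_image))  # [(1, 0)]
```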
Seriously ... you think a Computer experiences the color Red like we do? You know that the only thing happening in a computer at any instant of time is: Add, Subtract, Multiply, And, Or, Xor, Divide, Shift Left, Shift Right, compare two numbers, move numbers around in memory, plus a few more. If you have 4 cores then this is happening in 4 different places in the computer chip. Which one of these operations experiences the color Red?

Any device that can do what a person or other animal can do has "Consciousness". That's how it does those things, you know. — Michael Ossipoff
If a Computer could experience for example the Color Red then I would agree. But a Computer does not Experience anything. — SteveKlinko
...and you know that....how? If you aren't a computer, then how can you speak for what a computer does or doesn't experience?
What does experience mean? I define "experience" as a purposefully-responsive device's interpretation of its surroundings and events in the context of that device's designed purposes.
By that definition, yes a computer has experience.
As I said, we tend to use "Consciousness" and "Experience" chauvinistically, applying those words only to humans or other animals. That's why I try to cater to that chauvinism by sometimes defining those words in terms of the speaker's perception of kinship with some particular other purposefully-responsive device. — Michael Ossipoff
The question is: do you have a Red experience in any meaningful sense? Think about your Red experience. Think about the Redness of the Red. That Redness is a Property of a Conscious phenomenon. Think about how a Computer works. Add, Subtract, Multiply, etc. There are categorical differences between how the Human Brain functions and how a Computer functions. A Human Brain has billions of Neurons firing simultaneously at any instant of time. A 4 core processor chip only has 4 places where things can happen at any given instant of time. Effectively a 4 core computer chip has only 4 Neurons.

A Computer can be programmed to scan pixels in an image to find the Red parts. A Computer will look for pixels with values that are within a certain range of numbers. A Computer never has a Red experience but it can find the Red parts of an image. — SteveKlinko
When you find the red part of an image, why should I believe that you have a red experience in a meaningful sense in which a computer doesn't?
The computer finds the red part of the image. You find the red part of the image. Period (full-stop).
You wouldn't report the red part of the image if you hadn't experienced it. The same can rightly be said of the computer. — Michael Ossipoff
I showed you how a Machine detects Color. It compares numbers in memory locations. It makes no sense to think that it also has a Red experience. It doesn't need a Red experience to detect colors. Machines and Brains do things using different methods.

So just because it can find the Red parts of an image, like a Human can, it does not mean it has a Conscious Red experience while doing this. — SteveKlinko
You call it a Conscious Experience when it's yours, or of another person, or maybe another animal. ...you or a purposeful-device sufficiently similar to you, with which you perceive some kinship.
A Computer works in a different way than a Conscious being does. — SteveKlinko
...because you define a Conscious Being as something very similar to yourself. — Michael Ossipoff
I'm not defining Consciousness as the ability to pass as Human. Most Birds can probably have a Conscious Red experience.

Science doesn't understand enough about Consciousness yet to design Machines that have Consciousness. — SteveKlinko
...if you're defining "Consciousness" as "ability to pass as human".
Current technology can't yet produce a robot that acts indistinguishably from a human and does any job that a human can do.
Imitating or replacing humans is proving more difficult than expected. Life has evolved over billions of years of natural selection. It wasn't reasonable to expect to just throw together something to imitate or replace us in a few decades.
If such a machine is ever built, some would say that it has Consciousness and Experience (as do we), and some would say that it doesn't (and that it's a philosophical zombie merely claiming to have feelings and experiences).
Of course the former would be right. — Michael Ossipoff
I suppose most of us are familiar with the concept of philosophical zombies, or p-zombies for short: beings that appear and act like humans and are completely indistinguishable from humans but do not have consciousness. — BlueBanana
if there were such machines with the organs and shape of a monkey or of some other non-rational animal, we would have no way of discovering that they are not the same as these animals. But if there were machines that resembled our bodies and if they imitated our actions as much as is morally possible, we would always have two very certain means for recognizing that, none the less, they are not genuinely human. The first is that they would never be able to use speech, or other signs composed by themselves, as we do to express our thoughts to others. For one could easily conceive of a machine that is made in such a way that it utters words, and even that it would utter some words in response to physical actions that cause a change in its organs—for example, if someone touched it in a particular place, it would ask what one wishes to say to it, or if it were touched somewhere else, it would cry out that it was being hurt, and so on. But it could not arrange words in different ways to reply to the meaning of everything that is said in its presence, as even the most unintelligent human beings can do. The second means is that, even if they did many things as well as or, possibly, better than anyone of us, they would infallibly fail in others. Thus one would discover that they did not act on the basis of knowledge, but merely as a result of the disposition of their organs. For whereas reason is a universal instrument that can be used in all kinds of situations, these organs need a specific disposition for every particular action. — Descartes, Discourse on the Method
It must be confessed, moreover, that perception, and that which depends on it, are inexplicable by mechanical causes, that is, by figures and motions. And, supposing that there were a mechanism so constructed as to think, feel and have perception, we might enter it as into a mill. And this granted, we should only find on visiting it, pieces which push one against another, but never anything by which to explain a perception. This must be sought, therefore, in the simple substance, and not in the composite or in the machine. — Leibniz, Monadology
I can't see how it could maintain the pretence of being, well, 'a being', for very long, as all it can do is regurgitate, or combine, various responses and information that has been uploaded into it (how, by the way? Is it a computer? If so, could it pass the Turing Test?) — Wayfarer
But Humans don't work like Robots. — SteveKlinko
The Conscious Visual experience contains an enormous amount of information that is all packed up into a single thing. The Neural Activity is not enough. — SteveKlinko
When I reach out to pick up my coffee cup I see my Hand in the Conscious Visual experience. If my hand is off track I adjust my hand movement until I can touch the handle and pick up the coffee cup. — SteveKlinko
I wonder how Descartes would react to Siri? — BlueBanana
Isn't that what humans do as well? — BlueBanana
You're missing the reality that the Robot would most definitely need the concept of cupness to operate in the general world of things. Knowing the color of the handle of one particular cup might help with that cup. In the real world the Robot would need to understand cupness in order to find a cup in the first place. Then when it finds a cup it can determine what color it is.

But Humans don't work like Robots. — SteveKlinko
Is the converse true? I think a robot works, although in a simplified way, like a human, making it possible for it to replicate the actions of conscious beings.
The Conscious Visual experience contains an enormous amount of information that is all packed up into a single thing. The Neural Activity is not enough. — SteveKlinko
I think the opposite is the case. A conscious experience, whatever its benefits are, cannot be efficient. While containing all of the visual data provided by eyes, it also contains the experience of that data, which is such a rich experience we ourselves can't even begin to comprehend how it is created. The brain also unconsciously organizes and edits that data to a huge extent, filling gaps, causing us to perceive illusions, basically expanding our visual experience beyond what information is provided by the senses. For example,
When I reach out to pick up my coffee cup I see my Hand in the Conscious Visual experience. If my hand is off track I adjust my hand movement until I can touch the handle and pick up the coffee cup. — SteveKlinko
a robot would only need to find a specific kind of group of pixels with a color matching the color of the cup. The conscious mind, for some reason, in a way wastes energy forming an idea of "cupness", equating that cup with other cups and connecting it to its intended usage as well as all the memories (unconscious or conscious) an individual has relating to cups. All that information could be broken down into individual points that a robot could access, but instead the human mind makes something so complex and incomprehensible.
The existence of that idea also allows me to, while seeing a simple cup, appreciate my conscious perception of that cup. I still can't see the evolutionary value of that appreciation, though. — BlueBanana
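The "group of pixels matching the colour of the cup" approach BlueBanana describes can indeed be sketched without any notion of cupness at all: find pixels close to a target colour, then keep the largest connected cluster. This is only an illustrative sketch (the target colour, tolerance, and toy image are invented for the example):

```python
# Minimal sketch of matching a "group of pixels with a colour matching
# the colour of the cup": flag pixels close to a target colour, then
# return the largest 4-connected cluster of them. All values invented.

from collections import deque

def close_to(pixel, target, tol=40):
    return all(abs(p - t) <= tol for p, t in zip(pixel, target))

def largest_matching_cluster(image, target):
    """image: list of rows of (r, g, b) tuples. Returns a set of (x, y)."""
    h, w = len(image), len(image[0])
    matches = {(x, y) for y in range(h) for x in range(w)
               if close_to(image[y][x], target)}
    best, seen = set(), set()
    for start in matches:
        if start in seen:
            continue
        # breadth-first search over adjacent matching pixels
        cluster, queue = set(), deque([start])
        seen.add(start)
        while queue:
            x, y = queue.popleft()
            cluster.add((x, y))
            for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if (nx, ny) in matches and (nx, ny) not in seen:
                    seen.add((nx, ny))
                    queue.append((nx, ny))
        if len(cluster) > len(best):
            best = cluster
    return best

# Example: the "cup" is the blue-ish patch toward the lower right.
img = [[(200, 200, 200), (200, 200, 200), (40, 60, 180)],
       [(200, 200, 200), (40, 60, 180),  (40, 60, 180)]]
print(largest_matching_cluster(img, target=(40, 60, 180)))
```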
Like I said, p-zombies cannot be programmed. They are dead inside. Humans are more like robots, whereas p-zombies are more like a mechanical contraption without any capacity for programming. Humans are programmable. P-zombies are not.

A robot, or a zombie, could be programmed to answer questions about their feelings as if they had any. — BlueBanana
Every time you are asked what it is that is missing when we compare humans to computers, you weasel out of answering the question.

You're taking a lot for granted, and in such matters, that is not wise. — Wayfarer
So, we design a robot with templates - a template for cups, for humans, for dogs, for cars, etc. - just like humans have. We humans have templates stored in our memory for recognizing objects. We end up getting confused, just like a robot would, when an object shares numerous qualities with different templates. The solution is to make a new template, like "spork". What would "sporkness" be? Using the word "cupness" just goes to show what is wrong with philosophical discussions of the mind.

You're missing the reality that the Robot would most definitely need the concept of cupness to operate in the general world of things. Knowing the color of the handle of one particular cup might help with that cup. In the real world the Robot would need to understand cupness in order to find a cup in the first place. Then when it finds a cup it can determine what color it is. — SteveKlinko
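Harry Hindu's template idea can likewise be sketched in toy form. Everything below - the feature names, the templates, the "spork" example - is invented purely to illustrate his point about confusion between overlapping templates:

```python
# Toy sketch of the "templates" idea: objects and templates are just
# feature sets (all invented), and recognition picks the template with
# the most overlapping features. A spork overlaps spoon and fork about
# equally -- the "confusion" mentioned above -- until a dedicated
# "spork" template is added.

TEMPLATES = {
    "cup":   {"has_handle", "holds_liquid", "open_top"},
    "spoon": {"has_handle", "shallow_bowl", "utensil"},
    "fork":  {"has_handle", "tines", "utensil"},
}

def recognize(features, templates):
    scores = {name: len(features & feats) for name, feats in templates.items()}
    best = max(scores.values())
    # every template tying for the best score; more than one means confusion
    return [name for name, score in scores.items() if score == best]

spork = {"has_handle", "shallow_bowl", "tines", "utensil"}
print(recognize(spork, TEMPLATES))   # ['spoon', 'fork'] -- confused

TEMPLATES["spork"] = {"has_handle", "shallow_bowl", "tines", "utensil"}
print(recognize(spork, TEMPLATES))   # ['spork']
```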
You're missing the reality that the Robot would most definitely need the concept of cupness to operate in the general world of things. Knowing the color of the handle of one particular cup might help with that cup. In the real world the Robot would need to understand cupness in order to find a cup in the first place. Then when it finds a cup it can determine what color it is. — SteveKlinko
Like I said, p-zombies cannot be programmed. They are dead inside. Humans are more like robots, where p-zombies are more like a mechanical contraption without any capacity for programming. Humans are programmable. P-zombies are not. — Harry Hindu