Are one’s thoughts on the same constitutive footing as one’s qualia with respect to one’s sense of self, or are one’s thoughts a step removed, or a step “higher”, than one’s qualia? Would I still have a sense of self without any qualia but with my thoughts? And is the role played by my thoughts any more important to, or constitutive of, my sense of self than the role played by my qualia? — Luke
An illusion needs a viewer. The waves of heat coming off the road on a hot day with strong sunshine give the illusion of water on the road, but only if someone is there to see it. If nobody is there to see it, then there is no illusion. Same with a magician who makes a card disappear: there's no illusion if the seats are empty.

Some claim that we are in fact in such a situation, that we don't really experience anything at all but just have the illusion that we do. — Janus
The idea that that which views illusions is, itself, an illusion makes no sense. The idea that an illusion is viewing itself makes no sense. — Patterner
I can't say I understand the argument in any way. Heh.

The argument that consciousness and the self are illusory does not entail that we don't exist. As I understand it, it is rather saying that we imagine consciousness and the self to be persistent entities of some kind, and that this is an illusion of reification. — Janus
The camera will only record what happens in a certain part of the spectrum. It will also record what the magician does with the cards. But it does not see the illusions. It is not amazed because it expected one thing and got another, despite not seeing how it could possibly have happened. Only we see illusions.

Your 'mirage' example, the illusory appearance of water on a road or plain, will "fool" a camera just as it may fool a human. — Janus
True enough.

In any case, I'm not arguing for the position I have (rightly or wrongly) imputed to Dennett; it doesn't make convincing sense to me either, but I acknowledge that making sense is a subjective matter, meaning that what makes sense to me may not make sense to you. — Janus
I can't say I understand the argument in any way. — Patterner
Only we see illusions. — Patterner
Perhaps the answer to that will give the answer to, as Chalmers put it, "Why is the physical processing accompanied by conscious experience? Why does it feel like something from the inside? Why do we have this amazing inner movie going on in our minds all the time?"

I don't know what those who think it is an illusion think it is. Still working on it. — Patterner
I believe they think it is a physical process, just like anything else. Of course, that begs the question as to what exactly "physical" denotes. — Janus
Is ChatGPT conscious? What about a future version that passes the Turing Test? — RogueAI
However, what I found most fascinating is the idea that qualia constitute the self, rather than being something perceived by the self. — Luke
I haven't read through the full thread, so forgive me if you have already done so, but could you point out a specific passage from the article that you interpreted as promoting such a view? — wonderer1
But we still have to address the crucial question: Why? Whatever could have been the biological advantage to our ancestors, and still to us today, of having conscious experience dressed up in this wonderful – and, some philosophers would say, quite unnecessarily exotic – fashion? To quote Fodor again:
Consciousness … seems to be among the chronically unemployed … As far as anybody knows, anything that our conscious minds can do they could do just as well if they weren’t conscious. Why, then, did God bother to make consciousness? What on earth could he have had in mind?
I can’t answer for God. But in answering for natural selection, I think we can and should let first-person intuition be our guide. So, ask yourself: what would be missing from your life if you lacked phenomenal consciousness? If you had blindsight, blind-touch, blind-hearing, blind-everything? Pace Fodor, I’m sure there’s an obvious answer, and it’s the one we touched on when discussing blindsight. It’s that what would be missing would be nothing less than you, your conscious self.
One of the most striking facts about human patients with blindsight is that they don’t take ownership of their capacity to see. Lacking visual qualia – the ‘somethingness’ of seeing – they believe that visual perception has nothing to do with them. Then, imagine if you were to lack qualia of any kind at all, and to find that none of your sensory experience was owned by you? I’m sure your self would disappear. — the article
I don't see why it would be unreasonable to answer Chalmers with, "That's just the way evolution went." — wonderer1
Any macro property is reducible to the properties of particles and the four forces. Individual particles aren't liquid, solid, or gas. But we know how the properties of particles make a group of particles liquid, solid, or gas. Particles don't fly. But we know how the properties of particles give rise to things like aerodynamics and lift, allowing flight. We can't see enough detail to calculate the results of the cue ball hitting the balls on the break, much less calculate every particle's movement in a hurricane. While we have much success calculating what masses of air are going to do, it's all because of the particles.

So, if brain function is necessary for consciousness, what reason do we have for thinking consciousness could be something non-physical? — Janus
BTW, there is another angle on this: if some AI passes the Turing test, meaning that it can convince anyone that it is conscious, would it necessarily follow that it is, in fact, conscious? In other words, if to be conscious is to experience, would an AI's ability to convince us that it is conscious prove that it experiences anything? — Janus
It's true that we will never really know. We can't prove we are conscious to each other. Many will never believe a machine is conscious. If a machine became conscious today, many would still not believe it a hundred years from now.

BTW, there is another angle on this: if some AI passes the Turing test, meaning that it can convince anyone that it is conscious, would it necessarily follow that it is, in fact, conscious? In other words, if to be conscious is to experience, would an AI's ability to convince us that it is conscious prove that it experiences anything? — Janus
I don't think it could convince us that it's conscious. We would always wonder if it really is conscious. Passing the Turing Test is just a milestone; it doesn't confer consciousness. — RogueAI