I'm talking about the website itself. Is the website more than the code? No. — khaled
Of course it is! Is the only thing you discover when you observe this website that it's just computer code? Absurd. When you observe this website you observe philosophical discussions. You cannot claim that this forum/website/location in cyberspace is identical to computer code. That is a necessary, but not sufficient, definition. It totally misses the fact that this is ALSO a place where people meet and discuss philosophy.
↪RogueAI
Is Mary surprised when she sees red?
— RogueAI
Yes. — khaled
Why is Mary surprised? She already knows everything there is to know about seeing red.
Maybe a car is a better analogy. We can say "This car can move at X km/h", without knowing anything about the engine or how cars are built. — khaled
Yes, but you're not claiming the car is identical to "moving at X km/h". I think what you're trying to say is that Hesperus is identical to Phosphorus, so talk of Phosphorus is talk of Hesperus even if the person has never heard of Hesperus. To which I would reply that that can be resolved by simply pointing out the labelling error going on.
Not so with ancient people meaningfully talking about their experiences. If experiences = brain configurations, then talk of experiences is talk of brain configurations, and it's not just a labelling error going on. Ancient peoples had no idea what the brain even did. They were able to communicate meaningfully about their minds without exchanging any other meaningful communication, mislabeled or otherwise. If mental states = configurations of matter, and two people are meaningfully talking about their mental states, there should be meaningful communication about neurons and chemicals and action potentials and whatnot, but of course there isn't. There's communication going on ONLY about mental states, which should not be the case if mental states are identical to anything else.
Me: A car is actually this specific combination of parts
You: So why is a car not this other specific combination of parts?
Does that make sense to you? How would you begin to answer that question? We can agree that a car is a combination of parts and no more, yes? Engine, wheels, steering wheel, etc. Now if someone asks you "Ok, but why is a car not a combination of biscuits, chocolate, and cream?" how do you respond to them?
Explain to me why a car is a combination of parts (engine, wheels, steering wheel, etc) and not (biscuits, chocolate and cream), then I'll explain to you why stubbing your toe is pattern ABC not XYZ ok? — khaled
I think I addressed this with the Hesperus/Phosphorus example.
RogueAI: Are these patterns substrate dependent?
— khaled
No I don't think so, but some define them as such. That's what I meant.
But you're not sure. So how would you go about verifying whether anything other than neurons can be conscious? You have a definition that neural state XYZ is the same as tasting vanilla ice cream. I will grant you there are neural correlates to experience, and that's a definite plus for materialism and a problem for idealism.
So, you start with a prima facie advantage that the brain sure seems involved in consciousness (I think this grants you a prima facie causal connection between mental and physical states, not an identity relationship). But now you have to prove whether brains alone are conscious. And of course you can't. There's no way in principle to verify the consciousness of anything outside yourself. Whatever physicalist theory of consciousness emerges, a scientist is eventually going to point to a machine and say, "That thing is doing the same thing brains do, so it's conscious." But you already admitted you don't know if consciousness is substrate dependent. So how is that scientist going to verify whether the machine that's functionally equivalent to a human brain is conscious or not? She can't. Science cannot give us the answer. I think that has implications. I think the above also answers the part I snipped out.
Let me ask you, on the other hand: suppose consciousness is an immaterial mind. How can you tell that your duplicate has an immaterial mind? You can't make a detector for it, because it's immaterial. So how could you tell? Or can you not tell? — khaled
I can't tell if there is more than one conscious mind or not. That is different from the situation the materialist finds herself in. Not only can she not disprove solipsism, she can't prove the material stuff she thinks brains are made of even exists (it's a non-verifiable belief), and she also can't prove whether a machine duplicate of a brain is conscious or not. I am not in the same boat. I only claim that mind and thought and consciousness exist. Unlike matter, we know that mind and thought and consciousness exist. My only problem is whether solipsism is true or not.