• TheMadFool
    13.8k
    Do you actually think there is no brain activity while you sleep? If so I can't very much help youkhaled

    Of course I don't believe that! However,

    1. NREM sleep -> Brain off/Mind off -> Sleep/Unconscious
    2. REM sleep -> Brain on/Mind on -> Sleep/Conscious but no memory
    3. Awake -> Brain on/Mind on -> Awake/Conscious with memory
    4. Mu -> Brain on/Mind off -> Awake/Unconscious

    If physicalism were true, Brain on/Mind on and Brain off/Mind off is how it should be. The Mu state contradicts that directly!
  • khaled
    3.5k
    Of course I don't believe that! However,

    1. NREM sleep -> Brain off
    TheMadFool

    If you don't believe that then brain not off. Brain always on.
  • TheMadFool
    13.8k
    If you don't believe that then brain not off. Brain always on.khaled

    Let's not get bogged down. I hope to extricate the two of us from this bog ASAP.

    First, if the brain is always on, what happens to consciousness between awake states and sleep states? There is, if some experts are to be believed, a definite change in the level of consciousness between being awake and being asleep. You sleep, surely! I do too, and I can say with a certainty unbecoming of a skeptic like myself that consciousness is altered between these two states (awake/asleep).

    Second, if physicalism is true and if the brain is always on whether one's awake, sleeping, daydreaming, dreaming, whathaveyou, then consciousness doesn't have physical correlates. Is that what you want to say?
  • khaled
    3.5k
    Second, if physicalism is true and if the brain is always on whether one's awake, sleeping, daydreaming, dreaming, whathaveyou, then consciousness doesn't have physical correlates.TheMadFool

    Non sequitur. First off, I think consciousness is a neurological state. It's not an independent existence that "has neurological correlates"; no, it's a pattern of neurological states. Mind is to a brain what an algorithm is to a computer.
  • TheMadFool
    13.8k
    Non sequitur. First off, I think consciousness is a neurological state. It's not an independent existence that "has neurological correlates"; no, it's a pattern of neurological states. Mind is to a brain what an algorithm is to a computer.khaled

    Non sequitur?! You said "...the brain is always on." Consciousness is assuredly not always on. Ergo, the brain state and consciousness correlation coefficient is ZERO. Stop moving the goal post!
  • TheMadFool
    13.8k
    Then you typing this post about your thought of Aphrodite isn't a physical action? What about the statues and paintings of Aphrodite? Those were not produced by physical actions? How can one produce a statue or hit keys on a keyboard spelling out Aphrodite without first having the thought of Aphrodite?Harry Hindu

    I did consider that side of the issue but it doesn't work like that. If my thought about Aphrodite is energy, then that energy should be able to move something of the right mass, but that's something that's not been observed.
  • khaled
    3.5k
    Ergo, the brain state and consciousness correlation coefficient is ZERO.TheMadFool

    Yes, this is precisely the non sequitur. First off, it's clear that you don't know what correlation coefficients are. Just because the 2 variables don't change identically doesn't mean the correlation coefficient is 0. It can still be anywhere from -1 to 1. 0 is when they're completely unrelated, and we don't use correlation coefficients on binary data (on/off) anyway...
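
    A minimal numerical sketch of that point, with made-up numbers purely for illustration: two variables that do not change identically can still have a correlation coefficient far from zero.

```python
# Illustrative only: y tracks x imperfectly, yet the Pearson correlation
# coefficient comes out close to 1, not 0.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([1.0, 2.5, 2.8, 4.2, 4.9, 6.5])  # not identical to x

r = np.corrcoef(x, y)[0, 1]  # off-diagonal entry is the coefficient
print(round(r, 3))           # a value near 1, despite the mismatch
```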

    Consciousness is a certain brain pattern. This brain pattern disappears when you sleep. Even though your brain is still on. Does that make sense?

    If I have a series of blue and red lights, and I call the sequence RBRB “enlightenment”, then change the sequence to RRBB, then “enlightenment” is not occurring despite the lights being on. Capiche?
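
    A toy sketch of the light analogy, assuming "enlightenment" is simply defined as one particular sequence (the names and values here are illustrative):

```python
# Toy version of the light analogy: "enlightenment" is just a label for
# one particular sequence of lights, not an extra ingredient on top of them.
ENLIGHTENMENT = ("R", "B", "R", "B")

def is_enlightenment(lights):
    """True only when the sequence matches the defined pattern."""
    return tuple(lights) == ENLIGHTENMENT

print(is_enlightenment(["R", "B", "R", "B"]))  # True
print(is_enlightenment(["R", "R", "B", "B"]))  # False: lights still on, pattern gone
```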
  • TheMadFool
    13.8k
    Yes, this is precisely the non sequitur.khaled

    This is you going around in circles. It looked like it was fun and I joined in. Not any more. Sorry, you'll have to go on alone from here. Good bye!
  • Manuel
    3.9k
    The physical should be taken to mean everything that is physical. Mass is physical, but so are the quantum vacuum and fields, which are quite "unsubstantial", unlike our common-sense conception of physical stuff.

    The brain is physical, I think no one would doubt that. Without a brain we wouldn't have a mind. Of course, we need a body too: a brain by itself doesn't think or reflect, people do. But what are bodies? Surely they are physical stuff.

    Unless I'm missing something crucial, it should follow that the mind is physical too. Given how "unsubstantial" fields are, which are physical, and given how unsubstantial thoughts are - they both seem to be made of the same underlying stuff.

    This does not mean that physics can explain mind - that's asking too much from physics. What I take it to mean is that the physical is far broader and much stranger than what we usually take it to be.

    But there's no need to postulate "non-physical stuff" or anything else.
  • TheMadFool
    13.8k
    To All

    There seem to be 3 states of consciousness we need to be cognizant of:

    1. Awake
    2. Asleep
    3. Dead

    They all differ from each other in terms of physical activity and consciousness. Allow me to expand a bit more and let objective EEG activity stand for brain on/off and subjective conscious experience represent mind on/off.

    1. Awake.
    Brain on 1 (EEG reports activity E) and mind on, i.e. conscious (you're aware of the environment & yourself)

    2. Asleep.
    a) REM sleep: Brain on 1 (EEG reports activity E) and mind on, i.e. conscious (you're dreaming but unless you're woken up, no memories)
    b) NREM sleep: Brain on 2 (EEG reports activity F) and mind is off, i.e. unconscious (you're not dreaming)

    3. Dead.
    Brain off (EEG reports activity NIL) and mind off, i.e. unconscious (you're dead)

    Now, Khaled seems to be fixated on 2 b) NREM sleep: Brain on 2 (EEG reports activity F) and mind off, i.e. unconscious (you're not dreaming) but fae doesn't seem to realize that the on state of the brain in NREM sleep is indistinguishable from the off state in death. NREM sleep is as good as being dead insofar as consciousness is concerned.

    This implies, being as charitable as possible to physicalism, that the part of the brain that's associated with consciousness is off in NREM sleep, i.e. it's in precisely the state it would be in if the entire brain were off, as in death.

    Therefore, NREM sleep, even if the brain were on during that time, can't be used as a counterpoint against my claim that the brain is off insofar as the part responsible for consciousness is concerned.

    In other words, NREM sleep can be treated as a brain off state and thus I italicized situation 2 b) above.

    What do we have here then? To reiterate after making the necessary corrections,


    1. Awake.
    Brain on (EEG reports activity E) and mind on, i.e. conscious (you're aware of the environment & yourself)

    2. Asleep.
    a) REM sleep: Brain on (EEG reports activity E) and mind on, i.e. conscious (you're dreaming but unless you're woken up, no memories)
    b) NREM sleep: Brain off (EEG reports activity NIL) and mind is off, i.e. unconscious (you're not dreaming)

    3. Dead.
    Brain off (EEG reports activity NIL) and mind off, i.e. unconscious (you're dead)


    Now, take a look at what Mu is,

    4. Mu.
    Brain on (EEG reports activity E) and mind is off, i.e. practically unconscious

    If physicalism is true, brain on must correlate with mind on and brain off must correlate with mind off (a positive binary correlation, if memory serves).

    However, in the Mu mind state, the brain is on and the mind is off.

    This is a major setback for physicalism because it claims

    5. Brain on means mind on

    but, the Mu mind state demonstrates,

    6. Brain on doesn't mean mind on
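
    A minimal sketch of the binary check this argument turns on, assuming we encode each state above as a pair of brain/mind flags (the encoding restates the post's claims, not established data):

```python
# Encode the claimed states as (brain_on, mind_on) pairs and test the
# physicalist expectation (as stated in the post) that the flags always match.
states = {
    "awake":      (True,  True),
    "REM sleep":  (True,  True),
    "NREM sleep": (False, False),  # treated as brain off, per the argument above
    "dead":       (False, False),
    "Mu":         (True,  False),  # the alleged counterexample
}

for name, (brain_on, mind_on) in states.items():
    print(f"{name}: brain_on={brain_on}, mind_on={mind_on}, "
          f"match={brain_on == mind_on}")
# Only "Mu" prints match=False, which is the contradiction being claimed.
```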

    Where does the mind go when in Mu?

    Philosophical Zombie

    I leave it to the reader to connect the dots!
  • NOS4A2
    8.4k



    Thinking is an action. A thought is the act of thinking. We do not gain mass when we perform actions, but the body does perform work.
  • RogueAI
    2.5k
    it's just that there is a pattern, and we call that pattern mind.khaled

    Why do some patterns of brain activity result in conscious awareness while others (the vast majority of what the brain does) don't?
  • RogueAI
    2.5k
    Do you think computers will eventually become conscious (or already are)?
  • RogueAI
    2.5k
    First off, I think consciousness is a neurological state.khaled

    If X(consciousness) = Y(neurological state), then knowledge of X should entail knowledge of Y. For example, knowledge of the behaviors of bachelors would necessarily lead to knowledge of the behaviors of unmarried men, since they're the same thing.
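
    One way to read the inference being drawn here is as an application of Leibniz's law (the indiscernibility of identicals), stated formally as:

    \[
    X = Y \;\rightarrow\; \forall P\,\bigl(P(X) \leftrightarrow P(Y)\bigr)
    \]

    So if "being known by ancient people" were such a property P, knowledge of minds would have to carry over to brains; whether that inference holds for knowledge contexts is exactly what the rest of the exchange disputes.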

    So, that being said, did ancient people who had knowledge of their minds also have knowledge of their brains?
  • khaled
    3.5k
    Why do some patterns of brain activity result in conscious awareness while others (the vast majority of what the brain does) don't?RogueAI

    This again displays a bias in asking the question. Again, certain patterns of brain activity are consciousness. This would be like asking "Why is this vanilla ice cream while that is not vanilla ice cream"? Why, because one is the pattern of vanilla ice cream while the other isn't!

    The only way for your question to even make sense is to conceive of consciousness as something that is "produced by" neurological states. Then it makes sense to ask why this neurological state produces it and that one doesn't. But even then, it would be akin to asking "Why is pi equal to 3.14"? Or "Why does H2O boil at 100 degrees and not 70 degrees under standard conditions"? It just happens that this is the case; there was no necessary reason why it had to be this way.

    Let me actually ask you the question. Why do you think some patterns of brain activity "produce" consciousness and others don't? Stupid question right?

    Do you think computers will eventually become conscious (or already are)?RogueAI

    Well it's definitional. If you define consciousness as an animal or human capacity then obviously no. If you define it by having this or that pattern then I don't see why a similar pattern can't be reproduced in a computer. So yes, computers can eventually become conscious or already are according to your definition.

    So, that being said, did ancient people who had knowledge of their minds also have knowledge of their brains?RogueAI

    Firstly no, knowing that something is a pattern does not grant knowledge of that pattern in the first place. I know the video game I'm playing is a pattern of code, however I do not know the code in any way. I know this site is a pattern of code, however I don't know the code in any way. We can conceive of something as a pattern of something else, and talk about what that pattern does once actualized (allows me to play games/allows me to talk to strangers), without knowing what the pattern itself is.

    Secondly, those same ancient people believed that consciousness is a spirit of some sort, in a dualistic fashion. To them, consciousness =/= neurological state, but rather a spirit, a ghost in the machine. I'm talking here about Descartes, don't know if that's "ancient enough". And you see the remnants of that today.

    It's important to note that most ancient people didn't believe in this dualistic split until Descartes. As for what they actually believed, I'm not an authority on that. But I would venture it was some sort of monism that was not idealism, considering Berkeley was in the 1700s and Descartes in the 1600s (and he wasn't even a monist!). But I don't know much about ancient philosophies.
  • RogueAI
    2.5k
    So, that being said, did ancient people who had knowledge of their minds also have knowledge of their brains?RogueAI
    Firstly no,khaled

    You're claiming ancient people did NOT have knowledge of their own minds?
  • khaled
    3.5k
    You're claiming ancient people did NOT have knowledge of their own minds?RogueAI

    What's up with people here taking quotes blatantly out of context? No I'm claiming that people can have knowledge of their minds but not their brains. Knowledge of the pattern without knowledge of the specifics. Like how you know how to use this site without knowing the code that comprises it.

    Actually read what I'm saying or it's a waste of time for everyone. If you're going to take something out of context at least bother to quote a sentence:

    Firstly no, knowing that something is a pattern does not grant knowledge of that pattern in the first place.khaled

    Or would that make it too difficult to take things out of context?
  • RogueAI
    2.5k
    No I'm claiming that people can have knowledge of their minds but not their brains.khaled

    Then minds are not identical to brains. How are they different?
  • khaled
    3.5k
    Then minds are not identical to brains. How are they different?RogueAI

    Minds are patterns of brains. They are not a separate sort of thing. No one said that minds are identical to brains, not even physicalists. Otherwise we wouldn't have 2 different words.

    A mind is to a brain what an algorithm is to a running program. The algorithm is not a thing in itself. It's a pattern.

    Point is, mind is not a new type of "mental stuff" that is distinct from "physical stuff" which is what idealism and dualism propose ontologically.
  • RogueAI
    2.5k
    Minds are patterns of brains. They are not a separate sort of thing.khaled

    Let's take the mental state: stubbing your toe. Is the brain state that corresponds to "stubbing your toe" identical to the mental state "stubbing your toe"?
  • RogueAI
    2.5k
    No one said that minds are identical to brains, not even physicalists.khaled

    https://plato.stanford.edu/entries/mind-identity/

    "The identity theory of mind holds that states and processes of the mind are identical to states and processes of the brain."

    Either brain states are identical to mental states or they're not. You seem to be claiming a mental state is identical to a brain state "pattern".
  • RogueAI
    2.5k
    Actually, we can ditch the whole mental state/brain state thing. Do you claim that the pain of stubbing your toe is identical to some configuration/pattern of matter?
  • khaled
    3.5k
    "The identity theory of mind holds that states and processes of the mind are identical to states and processes of the brain."RogueAI

    This is not the same sentence as "the mind is identical to the brain" which you falsely attributed to me. Mental states are brain states.

    Now you would very easily know this if you were not in the habit of purposefully taking things out of context:

    "The identity theory of mind holds that states and processes of the mind are identical to states and processes of the brain. Strictly speaking, it need not hold that the mind is identical to the brain."

    Literally the next line in your own link..... I find it hard to give you the benefit of the doubt and think you're doing this by mistake anymore. Just, what do you hope to gain by distorting my view and arguing against a distortion in your own head?

    Is the brain state that corresponds to "stubbing your toe" identical to the mental state "stubbing your toe"?RogueAI

    Correct.

    Yet the brain is not the mind.

    Do you claim that the pain of stubbing your toe is identical to some configuration/pattern of matter?RogueAI

    Yes.
  • RogueAI
    2.5k
    Do you claim that the pain of stubbing your toe is identical to some configuration/pattern of matter?
    — RogueAI

    Yes.
    khaled

    Ok, I have problems with that:
    1. Mary's room: You're committed to saying that Mary can know what it's like to see red without having the mental experience "seeing red". I think that's a huge problem for you. It's certainly counter-intuitive.
    2. The opposite of Mary's room: You're committed to saying that two people meaningfully talking about their mental states are also meaningfully talking about configurations of brain matter, since mental state = certain configuration of brain matter. So, two Ancient Greeks meaningfully talking about what bad days they had and how depressed they are are also meaningfully talking about configurations of brain matter? Again, very counter-intuitive.
    3. You have no explanation for why certain patterns of matter are identical to the pain of stubbing a toe, while other patterns of matter are identical to the joy of a good book, while other patterns are identical to no experience at all.
    4. Are these patterns substrate dependent, and how would you verify whether a non-organic pattern of matter that you conclude is conscious is actually conscious?
  • khaled
    3.5k
    You're committed to saying that Mary can know what it's like to see red without having the mental experience "seeing red".RogueAI

    No because the meaning of "know" in both instances is different. When we tell someone "You don't know X emotion" or X color we mean "You haven't had X emotion" or seen X color, not "You don't know the neurological basis for X emotion". If the latter was what we meant we wouldn't be able to talk about emotions or colors without knowing the neurology, yet we do so all the time. In the same way that you can use this site without knowing the code, so can we talk about emotions without knowing the neurology, and vice versa, EVEN THOUGH the emotion is no more than a neurological pattern (and the site is no more than the code). So no, Mary doesn't know red, even though she knows everything physical about seeing red.

    You're committed to saying that two people meaningfully talking about their mental states are also meaningfully talking about configurations of brain matter, since mental state = certain configuration of brain matterRogueAI

    Same as above. Two people talking about thephilosophyforum need not know about the code that comprises the site. Even though the site is no more than the code, or do we disagree there? Is there something more to this site than its code? Something that you need to add to the code to get thephilosophyforum? I've already mentioned this previously:

    No I'm claiming that people can have knowledge of their minds but not their brains. Knowledge of the pattern without knowledge of the specifics. Like how you know how to use this site without knowing the code that comprises it.khaled

    In other words: Yes two people talking about their mental states are talking about brain states, without knowing about the brain states. Just like programmers can discuss algorithms without knowing about hardware development.

    You have no explanation for why certain patterns of matter are identical to the pain of stubbing a toeRogueAI

    Do you have an explanation for why vanilla ice cream is vanilla ice cream?

    No, this question doesn't make sense for identity theory. The only reason you're able to ask it is, again, that you're coming from a dualistic framework where there is a difference between the pain and the patterns of matter. There is no difference. Once you can explain to me why vanilla ice cream is identical to vanilla ice cream, and not chocolate ice cream, then I'll explain to you why certain patterns of matter are identical to the pain of stubbing a toe and not to the pain of breaking a finger or feeling nothing.

    Again, you've asked this "why is this state of matter corresponding to this state of mind and not that one" question before to me multiple times and each time I ask you to answer it and you provide no response. Why is that? Maybe because the question is nonsensical.

    In your framework, why does this pattern of matter cause the pain of stubbing a toe, rather than the pain of breaking a finger? You have a problem with my framework not being able to explain this, so I assume yours can, yes?

    Are these patterns substrate dependentRogueAI

    Definitional.

    and how would you verify whether a non-organic pattern of matter that you conclude is conscious is actually conscious?RogueAI

    Again with the dualist view, suggesting there is a real object or property called "consciousness" that is added to physical stuff, that we can detect. There is no such thing.

    Actually, again, let me return that question to you along with the last one. You believe there is a non material thing called consciousness right? How do you determine whether a given object possesses it? I hear qualia are private and ineffable so supposedly you're not able to either. So why is it a problem when I'm not able to but not a problem when you're not able to?
  • RogueAI
    2.5k
    No because the meaning of "know" in both instances is different. When we tell someone "You don't know X emotion" or X color we mean "You haven't had X emotion" or seen X color, not "You don't know the neurological basis for X emotion". If the latter was what we meant we wouldn't be able to talk about emotions or colors without knowing the neurology, yet we do so all the time. In the same way that you can use this site without knowing the code, so can we talk about emotions without knowing the neurology, and vice versa, EVEN THOUGH the emotion is no more than a neurological pattern (and the site is no more than the code). So no, Mary doesn't know red, even though she knows everything physical about seeing red.khaled

    At t1, Mary has never seen red before.
    At t2, Mary learns all the physical facts about seeing red.
    At t3, Mary sees red for the first time.
    Is Mary surprised when she sees red?

    Same as above. Two people talking about thephilosophyforum need not know about the code that comprises the site. Even though the site is no more than the code, or do we disagree there? Is there something more to this site than its code? Something that you need to add to the code to get thephilosophyforum? I've already mentioned this previously:khaled

    Are you saying the philosophy forum is identical to a computer code? I don't agree with that. The forum is computer code and a community of people talking about philosophy. Don't you agree that defining the forum as purely computer code is an incomplete definition?

    You have no explanation for why certain patterns of matter are identical to the pain of stubbing a toe
    — RogueAI

    Do you have an explanation for why vanilla ice cream is vanilla ice cream?
    khaled

    I'm not a materialist. I'm not claiming the taste of vanilla ice cream is anything other than the taste of vanilla ice cream. YOU are saying the taste of vanilla ice cream is actually pattern of matter A,B,C. YOU must then provide an explanation for why pattern of matter A,B,C is the taste of vanilla ice cream and not pattern of matter X,Y,Z or E,F,G.

    Are these patterns substrate dependent
    — RogueAI

    Definitional.
    khaled

    Oh, that's easy. OK, consciousness is an immaterial mind.

    Definitional.

    Again with the dualist view, suggesting there is a real object or property called "consciousness" that is added to physical stuff, that we can detect. There is no such thing.khaled

    There is no real property called consciousness??? Are you conscious, Khaled? Yes. Now imagine you have a mechanical duplicate of your own working brain. Is it conscious? If no, why not? If yes, how would you prove it? "Definitional" does not cut it. YOU are asserting that the machine has a property you admit you have: consciousness. YOU need to be able to prove that somehow.
  • khaled
    3.5k
    Is Mary surprised when she sees red?RogueAI

    Yes.

    Are you saying the philosophy forum is identical to a computer code? I don't agree with that. The forum is computer code and a community of people talking about philosophy. Don't you agree that defining the forum as purely computer code is an incomplete definition?RogueAI

    I'm talking about the website itself. Is the website more than the code? No. Can we still talk about it without knowing the code? Such as saying "thephilosophyforum is awesome"? Yes.

    Maybe a car is a better analogy. We can say "This car can move at X km/h", without knowing anything about the engine or how cars are built. You can know things about the pattern without knowing the specifics.

    OK, consciousness is an immaterial mind.RogueAI

    So it must be useless then? That's what you want to commit to? If it's immaterial then it can't interact with the material yes? Otherwise we'd just call it material.

    YOU are saying the taste of vanilla ice cream is actually pattern of matter A,B,C. YOU must then provide an explanation for why pattern of matter A,B,C is the taste of vanilla ice cream and not pattern of matter X,Y,Z or E,F,G.RogueAI

    Me: A car is actually this specific combination of parts

    You: So why is a car not this other specific combination of parts?

    Does that make sense to you? How would you begin to answer that question? We can agree that a car is a combination of parts and no more yes? Engine, wheels, steering wheel, etc. Now if someone asks you "Ok but why is a car not a combination of biscuits, chocolate, and cream" how do you respond to them?

    Explain to me why a car is a combination of parts (engine, wheels, steering wheel, etc) and not (biscuits, chocolate and cream), then I'll explain to you why stubbing your toe is pattern ABC not XYZ ok?

    I'm not a materialist. I'm not claiming the taste of vanilla ice cream is anything other than the taste of vanilla ice cream.RogueAI

    Ah but you're claiming that the taste of vanilla ice cream IS IN FACT the taste of vanilla ice cream! I now ask you this: Why is the taste of vanilla ice cream not the taste of chocolate ice cream!!!!!!!!!!

    Are these patterns substrate dependentkhaled

    No I don't think so, but some define them as such. That's what I meant.

    There is no real property called consciousness?RogueAI

    Ok I misspoke. There is no real object called consciousness, material or immaterial. Consciousness is a pattern, not an object.

    If yes, how would you prove it?RogueAI

    By scanning his brain and finding that he displays the pattern required for consciousness. In the same way that we can distinguish a car from a bus or an ice cream cone, by looking at whether or not it conforms to the structure of "car".

    Let me ask you on the other hand, supposedly consciousness is an immaterial mind. How can you tell that your duplicate possesses an immaterial mind? You can't make a detector for it, because it's immaterial. So how could you tell? Or can you not tell?
  • RogueAI
    2.5k
    I'm talking about the website itself. Is the website more than the code? No.khaled

    Of course it is! Is the only thing you discover when you observe this website that it's just computer code? Absurd. When you observe this website you observe philosophical discussions. You cannot claim that this forum/website/location in cyberspace is identical to computer code. That is a necessary, but not sufficient, definition. It totally misses the fact that this is ALSO a place where people meet and discuss philosophy.

    ↪RogueAI
    Is Mary surprised when she sees red?
    — RogueAI

    Yes.
    khaled

    Why is Mary surprised? She already knows everything there is to know about seeing red.

    Maybe a car is a better analogy. We can say "This car can move at X km/h", without knowing anything about the engine or how cars are built.khaled

    Yes, but you're not claiming the car is identical to "moving at X km/h". I think what you're trying to say is that Hesperus is identical to Phosphorus, so talk of Phosphorus is talk of Hesperus even if the person has never heard of Hesperus. To which I would reply that that can be resolved by simply pointing out the labelling error going on.

    Not so with ancient people meaningfully talking about their experiences. If experiences = brain configurations, then talk of experiences is talk of brain configurations and it's not just a labelling error going on. Ancient peoples had no idea what the brain even did. They were able to communicate meaningfully about their minds without exchanging any other meaningful communication, mislabeled or otherwise. If mental states = configurations of matter, and two people are meaningfully talking about their mental states, there should be meaningful communication about neurons and chemicals and action potentials and what not, but of course there's not. There's communication going on ONLY about mental states, which should not be the case if mental states are identical to anything else.

    Me: A car is actually this specific combination of parts

    You: So why is a car not this other specific combination of parts?

    Does that make sense to you? How would you begin to answer that question? We can agree that a car is a combination of parts and no more yes? Engine, wheels, steering wheel, etc. Now if someone asks you "Ok but why is a car not a combination of biscuits, chocolate, and cream" how do you respond to them?

    Explain to me why a car is a combination of parts (engine, wheels, steering wheel, etc) and not (biscuits, chocolate and cream), then I'll explain to you why stubbing your toe is pattern ABC not XYZ ok?
    khaled

    I think I addressed this with the Hesperus/Phosphorus example.

    Rogue AI: Are these patterns substrate dependent
    — khaled

    No I don't think so, but some define them as such. That's what I meant.

    But you're not sure. So how would you go about verifying whether anything other than neurons can be conscious? You have a definition that neural state XYZ is the same as tasting vanilla ice cream. I will grant you there are neural correlates to experience, and that's a definite plus for materialism and a problem for idealism.

    So, you start with a prima facie advantage that the brain sure seems involved in consciousness (I think this grants you a prima facie causal connection between mental and physical states, and not an identity relationship). But now you have to prove whether brains alone are conscious. And of course you can't. There's no way in principle to verify the consciousness of anything outside yourself. Whatever physicalist theory of consciousness emerges, a scientist is going to eventually point to a machine and say, "that thing is doing the same thing brains do, so it's conscious." But you already admitted you don't know if consciousness is substrate dependent. So how is that scientist going to verify whether the machine that's functionally equivalent to a human brain is conscious or not? She can't. Science cannot give us the answer. I think that has implications. I think the above also answers the part I snipped out.

    Let me ask you on the other hand, supposedly consciousness is an immaterial mind. How can you tell that your duplicate has an immaterial mind? You can't make a detector for it, because it's immaterial. So how could you tell? Or can you not tell?khaled

    I can't tell if there is more than one conscious mind or not. That is different than the situation the materialist finds herself in. Not only can she not disprove solipsism, she can't prove the material stuff she thinks brains are made of even exists (it's a non-verifiable belief), and she also can't prove whether a machine duplicate of a brain is conscious or not. I am not in the same boat. I only claim that mind and thought and consciousness exist. Unlike matter, we know that mind and thought and consciousness exist. My only problem is whether solipsism is true or not.
  • RogueAI
    2.5k
    And I saved the best for last. A blatant appeal to authority:

    "A more serious objection to Mind-Brain Type Identity, one that to this day has not been satisfactorily resolved, concerns various non-intensional properties of mental states (on the one hand), and physical states (on the other). After-images, for example, may be green or purple in color, but nobody could reasonably claim that states of the brain are green or purple."
    https://iep.utm.edu/identity/#H2

    I think they do a good job (far better than I could) explaining it. I'm done for the night! Great discussion, Khaled. I'll reply tomorrow.
  • khaled
    3.5k

    When you observe this website you observe philosophical discussions.RogueAI

    Which are no more than a pattern of letters. Which are no more than a pattern of lights on your screen lighting up. Etc.

    Let's use cars, maybe that's easier:

    Maybe a car is a better analogy. We can say "This car can move at X km/h", without knowing anything about the engine or how cars are built. You can know things about the pattern without knowing the specifics.khaled

    Why is Mary surprised? She already knows everything there is to know about seeing red.RogueAI

    Because she's never seen red before. No new knowledge was gained in the usual sense. Because again, in this case "know" has 2 meanings. There is the know in "know the Pythagorean theorem" and the know in "know red". The latter simply means seeing something red. By the latter meaning, Mary doesn't know red. Even if she knows everything about seeing red in the former meaning. No new knowledge in the former meaning is gained. The surprise comes from seeing red for the first time.

    Yes, but you're not claiming the car is identical to "moving at X km/h". I think what you're trying to say is that Hesperus is identical to Phosphorus, so talk of Phosphorus is talk of Hesperus even if the person has never heard of Hesperus.RogueAI

    No. I'm pointing out that we can talk about Phosphorus despite not knowing what electrons and protons are, even though phosphorus is no more than electrons and protons.

    Similarly, ancients could talk about their mental states, which are no more than brain states, without knowing what neurons are. Just like you can talk about Phosphorus despite not knowing what electrons and protons are. Does that make sense finally?

    You don't need all the details to discuss the pattern as a whole. Another example is computer scientists who talk about algorithms without talking about hardware design. Even though the program in the end is no more than a pattern of electrical signals on a motherboard.

    But you're not sure.RogueAI

    No, I'm sure. It's not substrate dependent. I'm sure about my definition. Also there are people who define it differently.

    So how would you go about verifying whether anything other than neurons can be conscious?RogueAI

    By seeing whether or not it meets the pattern that I defined.

    How do you THINK we actually verify consciousness? Brain scans can tell you if someone's conscious or sleeping or dead. That alone should tell you that consciousness is not an "immaterial mind" because if it were, then brain scans should tell us nothing.

    I think I addressed this with the Hesperus/PhosphRogueAI

    No you haven't. Because you still don't get what I mean there. It has nothing to do with labeling errors, and everything to do with the fact that you can discuss a pattern without knowing the specifics. Ancients can say "I am sad", which is a description of a physical pattern, despite having no clue what neurology is. Just like we can say "Phosphorus is a chemical element with symbol P" without having any clue what electrons or protons are, even though phosphorus is no more than electrons and protons.

    But now you have to prove whether brains alone are conscious. And of course you can't. There's no way in principle to verify the consciousness of anything outside yourself.RogueAI

    Ugh. Again with the implied dualism that makes the question meaningless. No, consciousness is a particular pattern. That pattern is defined by us. It is not an object with its own existence that we detect. How do you identify a car? By seeing whether or not it meets the definition that we set for cars. How do you identify a conscious person? By seeing whether or not they meet the definition that we set for conscious people. Not by looking for ghosts (immaterial minds).

    Not only can she not disprove solipsismRogueAI

    Materialists have as much trouble disproving solipsism as they have disproving the theory that "I possess a car, and I'm not sure of the existence of any other cars". It is simple pattern recognition. Give a materialist an object, and he can check whether or not it's conscious very easily, just as easily as he can check whether or not it's a car. Because both "car" and "consciousness" are patterns of physical stuff.

    It's idealists that generally struggle to tell whether or not anything is conscious, given that to be conscious for them is to possess some "non material secret sauce" that cannot be detected by any means.

    Unlike matter, we know that mind and thought and consciousness exist.RogueAI

    Sure. And materialism is the further proposition that matter is fundamental, and that mind and consciousness are patterns of matter. The reason for this move is the fact that you can easily have the matter stay behind while the mind goes away, for instance when you get knocked out. If consciousness were an immaterial mind, why do physical things impact it so much? Therefore something else must be fundamental, preceding mind, which mind is made of. We call that matter.

    Seeing as they're patterns of matter, there is absolutely no issue when it comes to recognizing whether or not something possesses them. We have no problems with pattern recognition.

    My only problem is whether solipsism is true or not.RogueAI

    Not just that. Your problem is detecting any mind other than your own. You have 0 reason to believe any other minds exist or any way to detect them if they do (since you defined them to be undetectable). Now, you'll note that this is the exact same problem you think is present in materialism. No more and no less:

    Not only can she not disprove solipsism, she can't prove the material stuff she thinks brains are made of even exists (it's a non-verifiable belief), and she also can't prove whether a machine duplicate of a brain is conscious or not.RogueAI

    Literally everything in that quote would apply to you as well. Can't disprove solipsism, can't be sure matter exists, and can't prove whether a machine duplicate is conscious. Additionally, your version of dualism/idealism, whichever it is, comes with the problem that you must conclude that consciousness is useless. After all, it's immaterial; it can't move atoms, or do much of anything. You're right, you're not in the same boat, you're in a worse boat!

    But for some reason it's fine for an idealist/dualist not to be able to do this, but for a materialist it's a fatal flaw and a reason to reject the theory. Additionally, as I've stated above, most of these are not problems in materialism. Being able to define precisely what consciousness is and isn't, and easily being able to detect it, is one of the big advantages of Identity Theory.

    I'm done for the night! Great discussion, Khaled. I'll reply tomorrow.RogueAI

    I'm having fun too. Finally someone that doesn't just bow out after 2 comments of disagreement. Looking forward to it.

    A more serious objection to Mind-Brain Type Identity, one that to this day has not been satisfactorily resolved, concerns various non-intensional properties of mental states (on the one hand), and physical states (on the other). After-images, for example, may be green or purple in color, but nobody could reasonably claim that states of the brain are green or purpleRogueAI

    First time I see this one. My initial thoughts are that it's not a serious objection. After-images themselves are not green or purple. Experiences don't have colors (and yes, experiences are patterns of physical stuff. Patterns don't have colors). Just seems like misuse of language.

    I can say an apple is red, I can't say the sight of the apple is red. I can say "I am seeing a red after-image" but I can't say the sight of a red after-image is red. So yes, no one can claim that the state of my brain while seeing a red after image is red, and neither can they claim that the sight of a red after image is red. I don't see a problem.