• Mental States from Matter but no Matter from Mental States?
    All you are doing is moving the goal posts. Now we need to define pain. What if I defined pain as being informed that you are damaged. — Harry Hindu

    That would not be a good definition of pain. The salient feature of pain is not information about damage to the system. You can have pain without any damage to the system (e.g., phantom limb pain). The main thing about pain is that it hurts, and any definition that doesn't mention this phenomenal aspect of pain is severely lacking. Wouldn't you agree that the main thing about pain is that it feels bad?

    Also, I don't think there's any goalpost moving going on. I might grant you that "is x conscious?" might get bogged down in definitions, but "is x in pain?" won't. Everyone knows what that means. Either a machine can feel pain or it can't. No fancy definition is required. Do you believe that machines will ever be able to feel pain?
  • Integrated Information Theory
    Imo, the only people who can be indifferent about an instance of consciousness, are people who can meditate to a depth of ineffability, where they cannot say / recall anything about their experience. So in a sense they obliterate consciousness. — Pop

    I would agree that people in such states are probably not doing any kind of memory creation.
  • Integrated Information Theory
    I agree. I didn't know what you meant by "emotion" at first.
  • Integrated Information Theory
    I tend to agree. If emotions create consciousness, wouldn't strong emotions create a strong sense of consciousness? Not necessarily, but the implication is there. I'm pretty emotionally neutral at the moment, but I don't feel any less conscious than times I was really happy/sad/scared/etc.

    And the Hard Problem is about how consciousness arises from non-conscious matter and why we are conscious.
  • Mental States from Matter but no Matter from Mental States?
    But there are still people who believe mental states are identical to brain states. For them, a mental state isn't emergent, it just is a physical brain state.
  • Mental States from Matter but no Matter from Mental States?
    bert1
    (the mental just is a physical function)

    "A more serious objection to Mind-Brain Type Identity, one that to this day has not been satisfactorily resolved, concerns various non-intensional properties of mental states (on the one hand), and physical states (on the other). After-images, for example, may be green or purple in color, but nobody could reasonably claim that states of the brain are green or purple."
    https://iep.utm.edu/identity/

    Also: having a song stuck in your head, but no music playing inside your skull. This is one of those cases where materialism goes down a rabbit hole into absurdity.
  • Mental States from Matter but no Matter from Mental States?
    I have an idea what someone might mean, but then that idea falls apart when subjected to logic and reason. The same goes for the word, "god". People use the word without a clear understanding of what it is that they are talking about. We need a definition in order to understand what each other are talking about so that we are not talking past each other. — Harry Hindu

    I don't think we even need to use the word consciousness to poke some serious holes in materialism. For example, if scientists come up with a theory of consciousness and claim that some machine is conscious, instead of worrying about what consciousness means, we can just ask the scientists, "Is it capable of feeling anything, like pain or pleasure?" If the scientists say "yes", then they are still on the hook for proving that that machine can feel pain, and then we're back to the verification problem. People can throw up language barriers to questions like "Are you conscious?", but if they try to do so for something like "are you in pain?" it's not going to work. We all know what is meant by "are you in pain?"

    For example, Kenosha Kid thinks it's possible for consciousness to arise from different substrates, like rocks or ice cream cones (I think he used that example). So, instead of getting bogged down in questions like, "How could a collection of x produce consciousness?", we can ask "how could a collection of x feel pain?" The same absurdity arises (e.g., a collection of rocks feeling pain), there's the same explanatory gap and hard problem (e.g., how could a bunch of rocks feel pain? How does that work?) and we don't even have to mention consciousness.


    Only because we've learned to associate consciousness with behaviors and haven't come up with an explanation of consciousness that allows us to detect consciousness more directly.

    How would you detect consciousness in a machine, even in principle? How would you go about determining that a substrate other than neurons can generate the sensation of pain? I think this is, in principle, impossible to verify.

    I don't know what "physical" means, much less a physical fact. How about just facts, or information? I think it would be easier to figure out what consciousness is without the false dichotomy of "physical" and "mental".

    I'm sympathetic, and I think things are easier if we ditch physicalism altogether, but physicalism's central claim is that there is this non-conscious stuff that exists external to us and that it either causes consciousness or is consciousness. I don't think there's a problem understanding what physicalists mean when they say that. It's a pretty straightforward theory: mindless stuff exists and everything is made of it and it causes all phenomena. That's easy to understand. I happen to think it's wrong, but I don't think there's a meaning problem there.

    I'm not so sure. Are you saying that my feet are conscious like my brain? Are you saying that molecules, as well as the atoms they are composed of, and then the quarks that the atoms are composed of, have points of view? What is a point of view, if not a structure of information?

    In monistic idealism, there is only one cosmic mind, and we are dissociated aspects of it (think dissociative identity disorder, which used to be called multiple personality disorder). So, would my feet be conscious? There's an assumption there that there are these things separate from us called "feet", and that they might be conscious. I don't think anything is separate. I think that separation is an illusion. There's only one thing that is conscious: the one mind. Our own focuses of awareness are, as I said, dissociated aspects of this one cosmic mind.
  • Mental States from Matter but no Matter from Mental States?
    Are you fibberfab? Is your significant other fibberfab? How can you answer those questions without knowing what fibberfab is or is not? — Harry Hindu

    Do you really have no idea what someone is talking about when they ask "are you conscious"? You're not able to grok that sentence?

    You can say that you are conscious, but what makes you conscious?

    Nothing. Consciousness, mind, and ideas are all there is. Idealism makes everything so much easier.

    How can you tell if others are conscious when you can't observe their consciousness, only their actions? Are actions conscious? If not then what is conscious and how can you tell?

    You can't tell, you can only assume. Since we're all built the same way, there's been no problem assuming we're all conscious, but when computers get more sophisticated, and people start claiming things other than brains are conscious, the impossibility of verifying external consciousnesses is going to become a big problem.
    Maybe. Maybe not. Either way, the scientific definition can't contradict other definitions, or else scientists and laymen would be talking about different things. — Harry Hindu

    Well said.

    Can we do the same thing with consciousness? Can you talk about how consciousness appears from consciousness and as it appears from a view from nowhere? — Harry Hindu


    Can you unpack "view from nowhere"? Do you mean a god's eye view of your internal mental states?

    Your consciousness appears as a physical brain that drives various actions from my conscious perspective, which is not how my consciousness appears to me, so how do I know if you or I are actually conscious or not? What is consciousness like from a view from nowhere?

    Suppose we have an unconscious machine that knows all the physical facts about our universe. From that information, could it figure out that this thing called "consciousness" exists?
  • Integrated Information Theory
    Bob is just going to be a lot older than Frank. They'll be able to consult with a physicist to understand why. — frank

    I think there's more to it than that. At time t, Bob and Frank report the same "speed" of consciousness. But if Frank accelerates enough, then at t+whatever, Bob and Frank will differ on how much conscious experience they report has happened to them, and they will both be correct. But that entails that for one (or both) of them, their consciousness did not "flow at the speed it flows, neither faster nor slower".
  • Integrated Information Theory
    Doesn't IIT entail that our consciousness should fluctuate with the amount of information integration going on? For example, sitting in a dark silent room that's neither hot nor cold should result in a severely diminished conscious state compared to doing a stairmaster at a gym, but of course that's not the case.
  • The choice of one's philosophy seems to be more a matter of taste than of truth.
    Axioms can't be proven and I think there's a lot of relativism in our choice of axioms we follow. For instance, it's possible that there's a literal hell that you go to if you displease some god (or simulation programmer), but I find the notion so implausible that I don't entertain it seriously. But maybe I should...
  • Integrated Information Theory
    IIT, originated by Giulio Tononi,
    — frank

    Scott Aaronson debunkificated this a while back. David Chalmers shows up in the comment section.
    fishfry

    I thought this was relevant:

    "To his credit, Tononi cheerfully accepts the panpsychist implication: yes, he says, it really does mean that thermostats and photodiodes have small but nonzero levels of consciousness."
  • Integrated Information Theory
    I think there's a problem for IIT, though. If consciousness "flows at the speed it flows, neither faster nor slower", and Frank travels faster than Bob, then when Frank returns from his space travelling, he and Bob are going to disagree about the "speed" at which their consciousness "flows", and neither will be mistaken. Therefore, someone's consciousness went faster or slower than someone else's.
  • Integrated Information Theory
    We are conscious of very little of what our brain is actually doing, and it's doing a lot of information processing moment by moment. Why does information integration vis-à-vis digestion not result in conscious experience?
  • Integrated Information Theory
    "Intrinsic existence
    Consciousness exists: each experience is actual—indeed, that my experience here and now exists (it is real) is the only fact I can be sure of immediately and absolutely. Moreover, my experience exists from its own intrinsic perspective, independent of external observers (it is intrinsically real or actual)."

    I like this a lot.

    "Consciousness is structured: each experience is composed of multiple phenomenological distinctions, elementary or higher-order. For example, within one experience I may distinguish a book, a blue color, a blue book, the left side, a blue book on the left, and so on."

    I have problems with this. Consciousness is often structured, but it seems possible to clear our minds for short times during meditation and still retain consciousness. In that case, we are experiencing only our own conscious awareness, which would not be an experience composed of multiple phenomenological distinctions. I can also imagine a single thing that is not composed of anything else: a giant red blob. Mostly I agree with this.

    "Consciousness is specific: each experience is the particular way it is—being composed of a specific set of specific phenomenal distinctions—thereby differing from other possible experiences (differentiation). For example, an experience may include phenomenal distinctions specifying a large number of spatial locations, several positive concepts, such as a bedroom (as opposed to no bedroom), a bed (as opposed to no bed), a book (as opposed to no book), a blue color (as opposed to no blue), higher-order “bindings” of first-order distinctions, such as a blue book (as opposed to no blue book), as well as many negative concepts, such as no bird (as opposed to a bird), no bicycle (as opposed to a bicycle), no bush (as opposed to a bush), and so on. Similarly, an experience of pure darkness and silence is the particular way it is—it has the specific quality it has (no bedroom, no bed, no book, no blue, nor any other object, color, sound, thought, and so on). And being that way, it necessarily differs from a large number of alternative experiences I could have had but I am not actually having."

    Is this saying that all experiences are unique and that when an experience is happening there's something it's like to be having that experience, even if it's an experience of pure darkness and silence?

    "Consciousness is unified: each experience is irreducible to non-interdependent, disjoint subsets of phenomenal distinctions. Thus, I experience a whole visual scene, not the left side of the visual field independent of the right side (and vice versa). For example, the experience of seeing the word “BECAUSE” written in the middle of a blank page is irreducible to an experience of seeing “BE” on the left plus an experience of seeing “CAUSE” on the right. Similarly, seeing a blue book is irreducible to seeing a book without the color blue, plus the color blue without the book."

    I'm not sure that this is true...

    "Consciousness is definite, in content and spatio-temporal grain: each experience has the set of phenomenal distinctions it has, neither less (a subset) nor more (a superset), and it flows at the speed it flows, neither faster nor slower. For example, the experience I am having is of seeing a body on a bed in a bedroom, a bookcase with books, one of which is a blue book, but I am not having an experience with less content—say, one lacking the phenomenal distinction blue/not blue, or colored/not colored; or with more content—say, one endowed with the additional phenomenal distinction high/low blood pressure.[2] Moreover, my experience flows at a particular speed—each experience encompassing say a hundred milliseconds or so—but I am not having an experience that encompasses just a few milliseconds or instead minutes or hours.[3]"

    This one is fascinating, and I'm glad I clicked on your link. I want to talk about the bolded. Let's suppose we have three people. Bob is stationary, Frank is accelerating to 99% the speed of light, and Suzie is also motionless, but through a magical telescope, she's able to observe Bob and Frank's brains in real time. Bob's brain should look like a normal functioning brain, but as Frank accelerates, shouldn't Suzie see Frank's brain functions go slower and slower as time dilation kicks in? And let's also say that Suzie's magic telescope can look inside Frank's mind. As Frank accelerates, would his thoughts look slower and slower to Suzie? Would the "speed of his mind" (just go with it) look slower to Suzie? And yet it must, because at the end of Frank's trip, he's going to report that he was conscious for X amount of time, while Bob reports that he was conscious for X+years more than Frank. If Suzie is watching their minds in real time, she's going to observe a divergence, and is it going to look like Frank's consciousness "slowing down"??? What would that be like? Slowing a film down?
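    For concreteness, the disagreement this thought experiment trades on is just standard time dilation. A quick worked number (simplifying by assuming Frank cruises at a steady 0.99c rather than accelerating the whole way):

    ```latex
    \gamma = \frac{1}{\sqrt{1 - v^2/c^2}} = \frac{1}{\sqrt{1 - 0.99^2}} \approx 7.09
    ```

    So for every year of conscious experience Frank reports, Bob reports roughly seven, and both reports are correct in their own frames, which is exactly what puts pressure on "flows at the speed it flows".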
  • Integrated Information Theory
    Yeah, but Newton didn't have a lot of red flags pop up right at the start. The theory he came up with almost perfectly mapped onto reality (except for Mercury's eccentric orbit, which I'm not even sure was discovered in his lifetime) and made excellent predictions. I can already see what look like unsolvable problems with IIT.
  • Integrated Information Theory
    Why would integration have to be all or nothing? How about degrees of it and a threshold for consciousness? — frank

    Then that would be consciousness=(some amount of) integrated information, and vice-versa. That sounds a little ad hoc, but maybe. But by taking a measured approach and setting a minimum amount of information processing that has to go on before consciousness arises (call it X), an opponent of Tononi can claim, "No, no, that's all wrong! It's X-1 [or X+1]. Then you get consciousness." Since there's no way to "get under the hood" and actually see if something is conscious or not, Tononi and his opponent are just going to go around and around with no way to prove their respective cases. It's easier to simply claim consciousness=information processing, but that has problems of its own.
  • Integrated Information Theory
    Good question. Did you see what I said earlier about axioms and postulates? — frank

    I skimmed over it, but this will be real quick. Are you claiming consciousness=integrated information? Because if so, then integrated information=consciousness, hence my question. Or do you mean there's a causal relationship between consciousness and integrating information?
  • Mental States from Matter but no Matter from Mental States?
    Are you conscious? Is your significant(s) other conscious? To not draw this out, I'll answer for you: yes, and yes.
    — RogueAI

    This is precisely what I was talking about before. That sort of wishy-washy 'well, I know what I mean' way of communicating is no good for answering questions about consciousness in a scientific way.
    Kenosha Kid

    I'm not asking questions about consciousness in a scientific way. Are you, Kenosha Kid, conscious??? That's not talking about consciousness in a "scientific way". We're at a pretty basic level when I'm asking you that.

    Now, your claim is that the sophistication of the language has to increase when it comes to determining whether something other than ourselves is conscious. Why? If I can ask "Are you, Kenosha Kid, conscious???" in a meaningful way and get a meaningful answer (which I can), without defining consciousness in a scientific way, why can't I ask a scientist, "Hey, is that machine over there conscious? You say it is. Can it feel pain? Can it be happy? Sad? What is it like to be that machine?" The scientist has to answer those questions. Those aren't questions that are being asked "in a scientific way". Those are ground level questions that a small child can understand.

    So, my point is that the regular folk definitions of consciousness and pain and happiness and "what is it like to be that?" that we all use are perfectly appropriate to inquire meaningfully about consciousness. If that's the case, and some scientist is claiming some machine is conscious (which will eventually happen), someone is going to say, "prove it". What's the scientist going to do in that case? Retreat behind a wall of jargon? Claim he can't prove it because there's a language problem? No, the scientist can't prove a computer is conscious because it's impossible to verify the existence of other consciousnesses.

    Do you dispute the bolded? If so, explain how you can verify that other minds and/or consciousnesses exist. If not, then concede the point that any physicalist theory of consciousness will be unverifiable in principle.

    Before we go on to the possibility of consciousness coming from rocks, I want to close off this point: it's impossible to verify the existence of other consciousnesses. Agreed or not?
  • If you had everything
    I meant no judgement. It just surprised me.
  • Integrated Information Theory
    It doesn't lead to it per IIT. Integrated information is consciousness. — frank

    Every instance of information integration is an instance of consciousness?
  • Mental States from Matter but no Matter from Mental States?
    Absolutely not. We have no common "basic understanding" of consciousness. On this site alone you'll find a new one for every thread on the subject. — Kenosha Kid

    Are you conscious? Is your significant(s) other conscious? To not draw this out, I'll answer for you: yes, and yes.

    Now, did we need a precise definition of consciousness to answer those questions? No. Did those questions and answers make sense to you and me? Yes. I know what you mean when you say you're conscious and vice-versa.

    We all have a basic understanding of consciousness. Claiming otherwise is absurd. The materialist "game" is often to retreat to language difficulties when the going gets tough (you'll notice I never once talked about qualia). You're doing that here.

    There are also some outstanding questions you haven't answered:
    - Is it possible to get consciousness from rocks, yes/no?
    - Is it possible to simulate consciousness, yes/no?
    - Is consciousness substrate independent, yes/no?
  • If you had everything
    You listed two things that you wish you could change when you were 18, and then talked about gay rights. I personally don't care, and I'm not attacking you. It just jumped out at me that you mentioned gay rights as a sort of afterthought. Maybe I read that wrong.
  • If you had everything
    It's late in life for me, and I find I have, or have had, most of what I ever wanted. Some of it is gone, owing to normal processes of aging, death, disease, and so on.

    There are two things I wish I had when I was 18--roughly--that I have now. One is peace of mind. I'm pretty contented. It would have been good to be so calm and collected when I was at the beginning of college, instead of bouncing off the walls.

    The second thing I wish I had had when I was 18 was the technology I use now -- computer, tablet, internet. These three things (and the companies that back them up, like Barnes & Noble or Amazon) would have made study so much more effective.

    Yes, it would have been nice if gay liberation had arrived in the outback where I lived in 1964. All that erotic energy wasted under the cold wet blanket of condemnation and guilt.

    Loads of money? Nope. I never had a lot, but I always had enough money. So far, anyway. All that one needs is a little more than one needs--a margin.
    Bitter Crank

    Technology trumps gay rights?
  • Integrated Information Theory
    This was one of the top comments in a consciousness debate that I was just watching:

    "Am I just in some weird internet bubble, or are tons of atheists (like myself) realizing that consciousness is a serious problem for materialism and becoming anti-materialists?

    And if so, why is the Hard Problem suddenly (as in the last ten years or so) dawning on lots of secular rationalists?

    Also, just as actually reading the Bible is often a great way to notice that religion is incoherent and ridiculous, reading Dennett's _Consciousness Explained_ is what finally made me realize that materialist explanations for consciousness are all incoherent. And ridiculous."


    That could have been me talking! For a lot of my life, I was a hardcore atheist materialist, and that was the paradigm when I was in college in '95. I only bring up this Youtuber nobody because I was talking about physicalism teetering, and I happened to run across their comment.
  • Mental States from Matter but no Matter from Mental States?
    And my point was that you don't need a scientific description of consciousness to tell whether something is conscious or not. In order for a scientist to discover scientifically what water is, yes, she needs a definition of water. If she doesn't know what water is, she can't tell you what's in the glass. Even if she knows what water looks like, she needs to be able to differentiate it from alcohol, or any other transparent liquid. As it happens, you don't need to know _much_ about water to be able to distinguish it perfectly well from not-water (its appearance, fluidity, taste, lack of smell). This is the extent to which the definition of consciousness also needs to be precise: to distinguish it from unconscious things.

    You're arguing my point: you don't need to know _much_ about consciousness to be able to distinguish it perfectly well from non-consciousness. We don't need a rigorous definition of consciousness to determine whether that computer that just passed the Turing Test is conscious or not. We don't need to "know much" about consciousness to pose that question. Our basic understanding of consciousness is sufficient to make sense of the question: is that computer conscious or not? Just like we don't need to know much about water to measure how much is in the glass.

    Agreed?
  • Mental States from Matter but no Matter from Mental States?
    That's correct. Are octopuses conscious? Does that question involve whether computers are conscious or not? No. So the question is not about computers (although a perfectly good example). — Kenosha Kid

    Is it possible for computers to be conscious? If yes, how would you verify whether a specific computer is conscious or not? If computer consciousness is impossible, why is it impossible?
  • Mental States from Matter but no Matter from Mental States?
    Suspicion confirmed. I'm not claiming there's a possible world where consciousness can arise from rocks.

    If there's no possible world where consciousness arises from rocks, then it is impossible for consciousness to arise from rocks. That is to say, no matter what you do with rocks, no matter if the rock-based system is functionally identical to a working conscious brain, if you believe there's no possible world where consciousness can arise from rocks, you CANNOT get consciousness from rocks, no matter what.

    I happen to agree. Is that your claim, though? Because now I'm going to ask you: why isn't a system of rocks that's functionally equivalent to a working brain conscious? What's stopping it from becoming conscious?

    This is tough for the materialist, because on the one hand, if you say, "consciousness from rocks is possible", you open yourself up to a reductio ad absurdum and a bunch of questions about how on Earth you can get consciousness from rocks. But if you say that consciousness from rocks is impossible (as you are now seeming to do), you're making a claim that some substrates won't produce consciousness. So, which substrates besides rocks are off limits and how do you know this? But you have to make a claim one way or the other: either consciousness from rocks is possible or impossible. Which is it?
  • Integrated Information Theory
    OK, but I think you're just pushing the Hard Problem to a different level: why does integrating information lead to conscious experience? How does that work exactly? And, in the case of simulated consciousness, which I think IIT endorses, there are the (very familiar) questions of why a particular series of switching operations should give rise to consciousness, and how that works, exactly. But I think IIT is a step in the right direction. At least people are thinking in non-material terms.
  • Integrated Information Theory
    Daemon
    I'm not sure how you could know that.

    For one, if I sit in a darkened silent room that's neither hot nor cold, I'm not any less conscious, which should be the case if my consciousness depends on sensory input. Also, I can imagine whatever sensory input might go missing. If they all go missing, it might eventually drive me mad, but I don't see why I would go unconscious. Even without sensory input, I would be conscious of my own internal mental states.

    But in any case you are starting from a position where I previously had working sense organs. But suppose I had never had them: I don't think I'd ever have been conscious. And consider this from an evolutionary perspective: consciousness would never have developed at all without sensing, sense organs.

    Good point. Sensory input might be necessary at the start.
  • Mental States from Matter but no Matter from Mental States?
    Why should we assume that consciousness can arise from rocks?
    — RogueAI

    I get the impression from later chat that this clicked: We should NOT assume that consciousness can arise from rocks.

    I agree. Are you still assuming that consciousness arises from neurons?

    This is the Hard Problem
    — RogueAI

    I don't think so. The hard problem allows for a bunch of rocks to be conscious, it just doesn't allow a complete third person description of that consciousness since it will not contain "what it is like to be a conscious bunch of rocks". And when I say "doesn't allow", I mean that Chalmers won't hear of it on grounds of taste.

    "The hard problem of consciousness is the problem of explaining why any physical state is conscious rather than nonconscious."
    https://iep.utm.edu/hard-con/

    I think that definition is fine, and I think that if you're going to argue that there's a possible world where consciousness arises from rocks, you're going to have to explain why that physical state is conscious rather than non-conscious. That's going to involve answering the questions I already posed, that you did not answer: How can consciousness arise from rocks? What is the explanation for how the rocks become conscious? How many rocks are needed? What do you do with the rocks to make the rocks conscious (see what I mean about the absurdity of this)? Why is the act of whatever you do with the rocks important? Why does one set of rock interactions produce experience x, while a different set of rock interactions produces experience y, while a different set produces no experience at all?

    Do you have answers to any of these?

    How could you verify whether such a system is in fact conscious?

    I think this is catastrophic to the physicalist project of explaining consciousness. Functionalism won't help here. Functionalism is the problem! Suppose we make a metal brain that is functionally equivalent to a working organic brain. If functionalism is right, it should be conscious. Time to test it! So, how do we test whether it's conscious or not?
    — RogueAI

    This has nothing to do with non-organic consciousness as far as I can see. This problem already exists for discerning if an animal or even a person is conscious.

    Verifying consciousness has nothing to do with whether computers (which are non-organic) are conscious??? Of course it does. If scientists come up with a theory of consciousness and claim that that non-organic thing over there (computer) is conscious, they need a way to verify it. The problem of verifying consciousness is a problem for BOTH non-organic AND organic systems.

    I don't predict this will be the difficult part given a more comprehensive model of consciousness. The issues here (in my experience) relate primarily to language. The concept of "consciousness" is vague and therefore arguable. For instance, some people don't like the idea of observing anything non-human as conscious, and that vagueness gives sufficient wiggle room to be able to say, "but that's not quite consciousness" about anything. I think this is also partly why people like Chalmers retreat to the first person in these arguments. It's possible to claim that something is lost when you transform to the third person view, as long as that something is suitably wishy-washy.

    You don't need a precise definition of consciousness to verify whether something is conscious. You can verify you are conscious, correct? You're not in doubt about that, I assume. So, verification of consciousness using just folk terminology is possible on a personal level, but somehow the language fouls everything up when we try to verify whether other things are conscious? That's ad hoc. It's not a language problem, it's a verification problem: you can't get outside your own consciousness to verify whether anything external to you is conscious or not. As I said before, this isn't a problem at the moment because we all look like each other. It's going to become a hell of a problem when machines become as smart as us.

    But if you really want to test for consciousness, you have to define in precise terms what consciousness is, not what it isn't.

    Do you need a precise definition of water to tell whether a glass has any water in it? Of course not. If a scientist says, "I have a theory of consciousness, and I say that that computer (or pile of rocks) is conscious!" we all know what he means. The next question for the scientist is: "How do you know it's conscious?" If he replies, "well, what do you mean by consciousness, exactly?", that's a cop-out. So how is a physicalist going to verify whether anything is conscious??? They can't. Positing unverifiable theories isn't science.


    What other physical processes besides rock interactions can produce consciousness?

    N/A
    — RogueAI

    Physical processes besides rocks moving around are not applicable when it comes to producing consciousness? I can't be reading that right. What do you mean here? Do you think computers can be conscious? Because that would involve consciousness coming from "physical processes besides rock interactions", which would make those physical processes very applicable.

    Also: you believe that consciousness is substrate-independent. What evidence do you have for that?
  • Integrated Information Theory
    How would you measure how much PHI a computer has? Does the number of transistors matter? Or how they're arranged? Or both?
  • Integrated Information Theory
    I've read somewhere that they accept that a thermostat is conscious. A thermostat but not the whole brain? And the whole body is involved in consciousness!

    What's the hypothesis and how would it be tested?

    Why is it ok to consider their hypothesis as it is, when it seems to be fatally flawed from the outset?
    Daemon

    Indeed...
  • Integrated Information Theory
    "We can identify it in an abstract sense, but not in a practical sense, as we can with a manmade machine.

    We have "brainoids" now, grown from adult human skin cells. But unless they are connected to sense organs, and yes, things like feet, they can't do what real brains do. There isn't anything for them to be conscious of."

    If all your sense organs stopped working, you would still be conscious.
  • Integrated Information Theory
    This is a fascinating sentence:


    "Note that these postulates are inferences that go from phenomenology to physics, not the other way around. This is because the existence of one’s consciousness and its other essential properties is certain, whereas the existence and properties of the physical world are conjectures, though very good ones, made from within our own consciousness."

    [bolding mine]

    It's Descartes 2.0.

    Physicalism is teetering like a house of cards. Consciousness is primary. The physical world has been relegated to a conjecture (though a very good one). Soon, the parenthetical "though a very good one" will be gone. And then the conjecture of the physical world itself. Positing the existence of mind-independent stuff solves nothing and creates enormous problems.
  • Mental States from Matter but no Matter from Mental States?
    I think the functionalist has to define 'consciousness' in such a way that a function can constitute it. For example, X is conscious if and only if X maps the world and can predict events. Brains can do that, therefore brains are conscious. The trouble is that's not the definition of consciousness that many philosophers are talking about (including me, and I think you). The problem is we can't agree on definitions before we start. This impasse has arisen dozens and dozens of times on this forum and the last. I don't think functionalism is really a theory of consciousness, it's a definition. Most of the time anyway. Sometimes it's a theory, I think, depending on how it's formulated. With the walking and legs analogy, it's a definition. Walking just is how that action is defined. And that's not interesting.

    I think I get what you're trying to say here. Functionalism was what I figured Kenosha Kid would use to answer the questions I posed about "rock consciousness". If I put on my materialist hat, let me see if I can answer some of them:

    1. Why should we assume that consciousness can arise from rocks?
    Because if you make a functionally equivalent working brain, it will be conscious, and we can infer this from our knowledge of consciousness and brains.

    Rocks are nothing like neurons, nothing like mental states, so why is that not an immediate category error?

    I don't think there's a good answer the materialist can give for this. I think the best the materialist can say is, "yeah, but the system is functionally equivalent to a working brain. Who cares what it's made of?" Except, consciousness could very well be substrate-dependent. It might only come about through the interactions of biological matter, for some reason. Since it's impossible to verify whether something other than working brain(s) are conscious, the issue of substrates and consciousness will continue to bedevil physicalists, particularly as AI starts doing stuff like passing Turing Tests.

    2. How can consciousness arise from rocks?

    Because they're in an arrangement and interacting in a way that is functionally equivalent to a working brain.

    What is the explanation for how the rocks become conscious? How many rocks are needed? What do you do with the rocks to make the rocks conscious (see what I mean about the absurdity of this)? Why is the act of whatever you do with the rocks important? Why does one set of rock interactions produce experience x, while a different set of rock interactions produces experience y, while a different set produces no experience at all?

    This is the Hard Problem, and the materialist can't just give a functionalism argument. An explanation has to be given for why matter arrangement X,Y,Z gives rise to conscious experience. The explanation so far is that if you take a bunch of matter and arrange it in some fiendishly complex way, and have it share electrons (or interact in some way), voila! Consciousness! Needless to say, this explanation is lacking, hence the Hard Problem and Explanatory Gap.

    3. How could you verify whether such a system is in fact conscious?

    I think this is catastrophic to the physicalist project of explaining consciousness. Functionalism won't help here. Functionalism is the problem! Suppose we make a metal brain that is functionally equivalent to a working organic brain. If functionalism is right, it should be conscious. Time to test it! So, how do we test whether it's conscious or not? Suppose we eventually build something that can pass a billion Turing Tests simultaneously while composing an opera for the ages. Is it conscious? Just as we'll never know if anything outside of our minds is conscious, we'll never know if anything we build is conscious. This isn't a problem when we all look like each other. We just assume we're all conscious. But a machine? Are we just going to assume advanced AI is conscious without any way to verify it? I see problems with that.

    4. What other physical processes besides rock interactions can produce consciousness?

    I don't think functionalism helps here. I think the problem for the materialist here is that when they claim that consciousness is substrate-independent, they're going to end up at panpsychism (which, along with computation, is all the rage in consciousness studies these days). Because in a physical universe, there's nothing unique about what the brain does. If a conscious moment is neurons X,Y,Z doing A,B,C, you can replace the neurons with anything. And so long as you don't know what it is that the neurons are doing that actually produces consciousness, the materialist is going to be stuck saying that matter arrangement A,B,C (e.g., a bunch of rocks) doing X,Y,Z is conscious if it's functionally equivalent to a consciousness-producing brain-state. OK, so any arrangement of matter that is doing X,Y,Z is conscious? If a bunch of rocks can be conscious, what about a rockslide? Is there a chance the rocks in a rockslide can do X,Y,Z accidentally and produce a moment of consciousness? What about a rainstorm? There's a lot of matter-interaction going on there. Are there conscious moments in storms? Meteor swarms? Are microbes conscious (Christof Koch thinks they are)?

    As for definitions, I think that's a rabbit-hole we don't need to go down. I think we can just use a folk definition of consciousness to lay bare the problems and absurdities of materialism as it pertains to consciousness.