• Integrated Information Theory
    IIT, originated by Giulio Tononi,
    — frank

    Scott Aaronson debunkificated this a while back. David Chalmers shows up in the comment section.
    fishfry

    I thought this was relevant:

    "To his credit, Tononi cheerfully accepts the panpsychist implication: yes, he says, it really does mean that thermostats and photodiodes have small but nonzero levels of consciousness."
  • Integrated Information Theory
    I think there's a problem for IIT, though. If consciousness "flows at the speed it flows, neither faster nor slower", and Frank travels faster than Bob, then when Frank returns from his space travelling, he and Bob are going to disagree about the "speed" at which their consciousness "flows", and neither will be mistaken. Therefore, someone's consciousness went faster or slower than someone else's.
  • Integrated Information Theory
    We are conscious of very little of what our brain is actually doing, and it's doing a lot of information processing moment by moment. Why does information integration vis-à-vis digestion not result in conscious experience?
  • Integrated Information Theory
    "Intrinsic existence
    Consciousness exists: each experience is actual—indeed, that my experience here and now exists (it is real) is the only fact I can be sure of immediately and absolutely. Moreover, my experience exists from its own intrinsic perspective, independent of external observers (it is intrinsically real or actual)."

    I like this a lot.

    "Consciousness is structured: each experience is composed of multiple phenomenological distinctions, elementary or higher-order. For example, within one experience I may distinguish a book, a blue color, a blue book, the left side, a blue book on the left, and so on."

    I have problems with this. Consciousness is often structured, but it seems possible to clear our minds for short times during meditation and still retain consciousness. In that case, we are experiencing only our own conscious awareness, which would not be an experience composed of multiple phenomenological distinctions. I can also imagine a single thing that is not composed of anything else: a giant red blob. Mostly I agree with this.

    "Consciousness is specific: each experience is the particular way it is—being composed of a specific set of specific phenomenal distinctions—thereby differing from other possible experiences (differentiation). For example, an experience may include phenomenal distinctions specifying a large number of spatial locations, several positive concepts, such as a bedroom (as opposed to no bedroom), a bed (as opposed to no bed), a book (as opposed to no book), a blue color (as opposed to no blue), higher-order “bindings” of first-order distinctions, such as a blue book (as opposed to no blue book), as well as many negative concepts, such as no bird (as opposed to a bird), no bicycle (as opposed to a bicycle), no bush (as opposed to a bush), and so on. Similarly, an experience of pure darkness and silence is the particular way it is—it has the specific quality it has (no bedroom, no bed, no book, no blue, nor any other object, color, sound, thought, and so on). And being that way, it necessarily differs from a large number of alternative experiences I could have had but I am not actually having."

    Is this saying that all experiences are unique and that when an experience is happening there's something it's like to be having that experience, even if it's an experience of pure darkness and silence?

    "Consciousness is unified: each experience is irreducible to non-interdependent, disjoint subsets of phenomenal distinctions. Thus, I experience a whole visual scene, not the left side of the visual field independent of the right side (and vice versa). For example, the experience of seeing the word “BECAUSE” written in the middle of a blank page is irreducible to an experience of seeing “BE” on the left plus an experience of seeing “CAUSE” on the right. Similarly, seeing a blue book is irreducible to seeing a book without the color blue, plus the color blue without the book."

    I'm not sure that this is true...

    "Consciousness is definite, in content and spatio-temporal grain: each experience has the set of phenomenal distinctions it has, neither less (a subset) nor more (a superset), and it flows at the speed it flows, neither faster nor slower. For example, the experience I am having is of seeing a body on a bed in a bedroom, a bookcase with books, one of which is a blue book, but I am not having an experience with less content—say, one lacking the phenomenal distinction blue/not blue, or colored/not colored; or with more content—say, one endowed with the additional phenomenal distinction high/low blood pressure.[2] Moreover, my experience flows at a particular speed—each experience encompassing say a hundred milliseconds or so—but I am not having an experience that encompasses just a few milliseconds or instead minutes or hours.[3]"

    This one is fascinating, and I'm glad I clicked on your link. I want to talk about the bolded. Let's suppose we have three people. Bob is stationary, Frank is accelerating to 99% of the speed of light, and Suzie is also motionless, but through a magical telescope, she's able to observe Bob and Frank's brains in real time. Bob's brain should look like a normal functioning brain, but as Frank accelerates, shouldn't Suzie see Frank's brain functions go slower and slower as time dilation kicks in? And let's also say that Suzie's magic telescope can look inside Frank's mind. As Frank accelerates, would his thoughts look slower and slower to Suzie? Would the "speed of his mind" (just go with it) look slower to Suzie? And yet it must, because at the end of Frank's trip, he's going to report that he was conscious for X amount of time, while Bob reports that he was conscious for X plus several years. If Suzie is watching their minds in real time, she's going to observe a divergence, and is it going to look like Frank's consciousness "slowing down"??? What would that be like? Slowing a film down?
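
    To put a number on the thought experiment: the slowdown Suzie would observe follows from the Lorentz factor, γ = 1/√(1 − v²/c²). A quick back-of-the-envelope sketch (plain Python; the 99% figure is just the one from the scenario above):

    ```python
    import math

    def lorentz_factor(beta):
        """Lorentz factor gamma for a velocity given as a fraction of c."""
        return 1.0 / math.sqrt(1.0 - beta ** 2)

    # Frank cruising at 99% of the speed of light:
    gamma = lorentz_factor(0.99)
    print(round(gamma, 2))  # 7.09

    # For every year that elapses for Frank, about 7.09 years elapse for
    # Bob and Suzie, so in Suzie's frame Frank's brain processes would
    # appear slowed by that same factor.
    ```

    So at 0.99c the divergence is roughly sevenfold, which is what makes the "speed of consciousness" claim awkward: both frames are equally valid.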
  • Integrated Information Theory
    Yeah, but Newton didn't have a lot of red flags pop up right at the start. The theory he came up with mapped almost perfectly onto reality (except for Mercury's eccentric orbit, which I'm not even sure was discovered in his lifetime) and made excellent predictions. I can already see what look like insoluble problems with IIT.
  • Integrated Information Theory
    Why would integration have to be all or nothing? How about degrees of it and a threshold for consciousness?
    frank

    Then that would be consciousness = (some amount of) integrated information, and vice versa. That sounds a little ad hoc, but maybe. But once you set a minimum amount of information processing that has to go on before consciousness arises (call it X), an opponent of Tononi can claim, "No, no, that's all wrong! It's X-1 [or X+1]. Then you get consciousness." Since there's no way to "get under the hood" and actually see if something is conscious or not, Tononi and his opponent are just going to go around and around with no way to prove their respective cases. It's easier to simply claim consciousness = information processing, but that has problems of its own.
  • Integrated Information Theory
    Good question. Did you see what I said earlier about axioms and postulates?
    frank

    I skimmed over it, but this will be real quick. Are you claiming consciousness=integrated information? Because if so, then integrated information=consciousness, hence my question. Or do you mean there's a causal relationship between consciousness and integrating information?
  • Mental States from Matter but no Matter from Mental States?
    Are you conscious? Is your significant(s) other conscious? To not draw this out, I'll answer for you: yes, and yes.
    — RogueAI

    This is precisely what I was talking about before. That sort of wishy-washy 'well, I know what I mean' way of communicating is no good for answering questions about consciousness in a scientific way.
    Kenosha Kid

    I'm not asking questions about consciousness in a scientific way. Are you, Kenosha Kid, conscious??? That's not talking about consciousness in a "scientific way". We're at a pretty basic level when I'm asking you that.

    Now, your claim is that the sophistication of the language has to increase when it comes to determining whether something other than ourselves is conscious. Why? If I can ask "Are you, Kenosha Kid, conscious???" in a meaningful way and get a meaningful answer (which I can), without defining consciousness in a scientific way, why can't I ask a scientist, "Hey, is that machine over there conscious? You say it is. Can it feel pain? Can it be happy? Sad? What is it like to be that machine?" The scientist has to answer those questions. Those aren't questions that are being asked "in a scientific way". Those are ground level questions that a small child can understand.

    So, my point is that the regular folk definitions of consciousness and pain and happiness and "what is it like to be that?" that we all use are perfectly appropriate to inquire meaningfully about consciousness. If that's the case, and some scientist is claiming some machine is conscious (which will eventually happen), someone is going to say, "prove it". What's the scientist going to do in that case? Retreat behind a wall of jargon? Claim he can't prove it because there's a language problem? No, the scientist can't prove a computer is conscious because it's impossible to verify the existence of other consciousnesses.

    Do you dispute the bolded? If so, explain how you can verify that other minds and/or consciousnesses exist. If not, then concede the point that any physicalist theory of consciousness will be unverifiable in principle.

    Before we go on to the possibility of consciousness coming from rocks, I want to close off this point: it's impossible to verify the existence of other consciousnesses. Agreed or not?
  • If you had everything
    I meant no judgement. It just surprised me.
  • Integrated Information Theory
    It doesn't lead to it per IIT. Integrated information is consciousness.
    frank

    Every instance of information integration is an instance of consciousness?
  • Mental States from Matter but no Matter from Mental States?
    Absolutely not. We have no common "basic understanding" of consciousness. On this site alone you'll find a new one for every thread on the subject.
    Kenosha Kid

    Are you conscious? Is your significant(s) other conscious? To not draw this out, I'll answer for you: yes, and yes.

    Now, did we need a precise definition of consciousness to answer those questions? No. Did those questions and answers make sense to you and me? Yes. I know what you mean when you say you're conscious and vice-versa.

    We all have a basic understanding of consciousness. Claiming otherwise is absurd. The materialist "game" is often to retreat to language difficulties when the going gets tough (you'll notice I never once talked about qualia). You're doing that here.

    There are also some outstanding questions you haven't answered:
    - Is it possible to get consciousness from rocks, yes/no?
    - Is it possible to simulate consciousness, yes/no?
    - Is consciousness substrate independent, yes/no?
  • If you had everything
    You listed two things that you wish you could change when you were 18, and then talked about gay rights. I personally don't care, and I'm not attacking you. It just jumped out at me that you mentioned gay rights as a sort of afterthought. Maybe I read that wrong.
  • If you had everything
    It's late in life for me, and I find I have, or have had, most of what I ever wanted. Some of it is gone, owing to normal processes of aging, death, disease, and so on.

    There are two things I wish I had when I was 18--roughly--that I have now. One is peace of mind. I'm pretty contented. It would have been good to be so calm and collected when I was at the beginning of college, instead of bouncing off the walls.

    The second thing I wish I had had when I was 18 was the technology I use now -- computer, tablet, internet. These three things (and the companies that back them up, like Barnes & Noble or Amazon) would have made study so much more effective.

    Yes, it would have been nice if gay liberation had arrived in the outback where I lived in 1964. All that erotic energy wasted under the cold wet blanket of condemnation and guilt.

    Loads of money? Nope. I never had a lot, but I always had enough money. So far, anyway. All that one needs is a little more than one needs--a margin.
    Bitter Crank

    Technology trumps gay rights?
  • Integrated Information Theory
    This was one of the top comments in a consciousness debate that I was just watching:

    "Am I just in some weird internet bubble, or are tons of atheists (like myself) realizing that consciousness is a serious problem for materialism and becoming anti-materialists?

    And if so, why is the Hard Problem suddenly (as in the last ten years or so) dawning on lots of secular rationalists?

    Also, just as actually reading the Bible is often a great way to notice that religion is incoherent and ridiculous, reading Dennett's _Consciousness Explained_ is what finally made me realize that materialist explanations for consciousness are all incoherent. And ridiculous."


    That could have been me talking! For a lot of my life, I was a hardcore atheist materialist, and that was the paradigm when I was in college in '95. I only bring up this YouTuber nobody because I was talking about physicalism teetering, and I happened to run across their comment.
  • Mental States from Matter but no Matter from Mental States?
    And my point was that you don't need a scientific description of consciousness to tell whether something is conscious or not. In order for a scientist to discover scientifically what water is, yes, she needs a definition of water. If she doesn't know what water is, she can't tell you what's in the glass. Even if she knows what water looks like, she needs to be able to differentiate it from alcohol, or any other transparent liquid. As it happens, you don't need to know _much_ about water to be able to distinguish it perfectly well from not-water (its appearance, fluidity, taste, lack of smell). This is the extent to which the definition of consciousness also needs to be precise: to distinguish it from unconscious things.

    You're arguing my point: you don't need to know _much_ about consciousness to be able to distinguish it perfectly well from non-consciousness. We don't need a rigorous definition of consciousness to determine whether that computer that just passed the Turing Test is conscious or not. We don't need to "know much" about consciousness to pose that question. Our basic understanding of consciousness is sufficient to make sense of the question: is that computer conscious or not? Just like we don't need to know much about water to measure how much is in the glass.

    Agreed?
  • Mental States from Matter but no Matter from Mental States?
    That's correct. Are octopuses conscious? Does that question involve whether computers are conscious or not? No. So the question is not about computers (although a perfectly good example).
    Kenosha Kid

    Is it possible for computers to be conscious? If yes, how would you verify whether a specific computer is conscious or not? If computer consciousness is impossible, why is it impossible?
  • Mental States from Matter but no Matter from Mental States?
    Suspicion confirmed. I'm not claiming there's a possible world where consciousness can arise from rocks.

    If there's no possible world where consciousness arises from rocks, then it is impossible for consciousness to arise from rocks. That is to say: if you believe there's no possible world where consciousness can arise from rocks, then no matter what you do with the rocks, even if the rock-based system is functionally identical to a working conscious brain, you CANNOT get consciousness from them.

    I happen to agree. Is that your claim, though? Because now I'm going to ask you: why isn't a system of rocks that's functionally equivalent to a working brain conscious? What's stopping it from becoming conscious?

    This is tough for the materialist, because on the one hand, if you say, "consciousness from rocks is possible", you open yourself up to a reductio ad absurdum and a bunch of questions about how on Earth you can get consciousness from rocks. But if you say that consciousness from rocks is impossible (as you now seem to be doing), you're making a claim that some substrates won't produce consciousness. So, which substrates besides rocks are off limits, and how do you know this? But you have to make a claim one way or the other: either consciousness from rocks is possible or impossible. Which is it?
  • Integrated Information Theory
    OK, but I think you're just pushing the Hard Problem to a different level: why does integrating information lead to conscious experience? How does that work, exactly? And, in the case of simulated consciousness, which I think IIT endorses, there are the (very familiar) questions of why a particular series of switching operations should give rise to consciousness, and how that works, exactly. But I think IIT is a step in the right direction. At least people are thinking in non-material terms.
  • Integrated Information Theory
    Daemon
    I'm not sure how you could know that.

    For one, if I sit in a darkened, silent room that's neither hot nor cold, I'm not any less conscious, which I should be if my consciousness depended on sensory input. Also, I can imagine whatever sensory input might go missing. If it all goes missing, it might eventually drive me mad, but I don't see why I would go unconscious. Even without sensory input, I would be conscious of my own internal mental states.

    But in any case you are starting from a position where I previously had working sense organs. But suppose I had never had them: I don't think I'd ever have been conscious. And consider this from an evolutionary perspective: consciousness would never have developed at all without sensing, sense organs.

    Good point. Sensory input might be necessary at the start.
  • Mental States from Matter but no Matter from Mental States?
    Why should we assume that consciousness can arise from rocks?
    — RogueAI

    I get the impression from later chat that this clicked: We should NOT assume that consciousness can arise from rocks.

    I agree. Are you still assuming that consciousness arises from neurons?

    This is the Hard Problem
    — RogueAI

    I don't think so. The hard problem allows for a bunch of rocks to be conscious, it just doesn't allow a complete third person description of that consciousness since it will not contain "what it is like to be a conscious bunch of rocks". And when I say "doesn't allow", I mean that Chalmers won't hear of it on grounds of taste.

    "The hard problem of consciousness is the problem of explaining why any physical state is conscious rather than nonconscious."
    https://iep.utm.edu/hard-con/

    I think that definition is fine, and I think that if you're going to argue that there's a possible world where consciousness arises from rocks, you're going to have to explain why that physical state is conscious rather than non-conscious. That's going to involve answering the questions I already posed, which you did not answer: How can consciousness arise from rocks? What is the explanation for how the rocks become conscious? How many rocks are needed? What do you do with the rocks to make the rocks conscious (see what I mean about the absurdity of this)? Why is the act of whatever you do with the rocks important? Why does one set of rock interactions produce experience x, while a different set of rock interactions produces experience y, while a different set produces no experience at all?

    Do you have answers to any of these?

    How could you verify whether such a system is in fact conscious?

    I think this is catastrophic to the physicalist project of explaining consciousness. Functionalism won't help here. Functionalism is the problem! Suppose we make a metal brain that is functionally equivalent to a working organic brain. If functionalism is right, it should be conscious. Time to test it! So, how do we test whether it's conscious or not?
    — RogueAI

    This has nothing to do with non-organic consciousness as far as I can see. This problem already exists for discerning whether an animal or even a person is conscious.

    Verifying consciousness has nothing to do with whether computers (which are non-organic) are conscious??? Of course it does. If scientists come up with a theory of consciousness and claim that that non-organic thing over there (computer) is conscious, they need a way to verify it. The problem of verifying consciousness is a problem for BOTH non-organic AND organic systems.

    I don't predict this will be the difficult part given a more comprehensive model of consciousness. The issues here (in my experience) relate primarily to language. The concept of "consciousness" is vague and therefore arguable. For instance, some people don't like the idea of observing anything non-human as conscious, and that vagueness gives sufficient wiggle room to be able to say, "but that's not quite consciousness" about anything. I think this is also partly why people like Chalmers retreat to the first person in these arguments. It's possible to claim that something is lost when you transform to the third person view, as long as that something is suitably wishy-washy.

    You don't need a precise definition of consciousness to verify whether something is conscious. You can verify you are conscious, correct? You're not in doubt about that, I assume. So, verification of consciousness using just folk terminology is possible on a personal level, but somehow the language fouls everything up when we try to verify whether other things are conscious? That's ad hoc. It's not a language problem, it's a verification problem: you can't get outside your own consciousness to verify whether anything external to you is conscious or not. As I said before, this isn't a problem at the moment because we all look like each other. It's going to become a hell of a problem when machines become as smart as us.

    But if you really want to test for consciousness, you have to define in precise terms what consciousness is, not what it isn't.

    Do you need a precise definition of water to tell whether a glass has any water in it? Of course not. If a scientist says, "I have a theory of consciousness, and I say that that computer (or pile of rocks) is conscious!" we all know what he means. The next question for the scientist is: "How do you know it's conscious?" If he replies, "well, what do you mean by consciousness, exactly?", that's a copout. So how is a physicalist going to verify whether anything is conscious??? They can't. Positing unverifiable theories isn't science.


    What other physical processes besides rock interactions can produce consciousness?

    N/A
    — RogueAI

    Physical processes besides rocks moving around are not applicable when it comes to producing consciousness? I can't be reading that right. What do you mean here? Do you think computers can be conscious? Because that would involve consciousness coming from "physical processes besides rock interactions", which would make those physical processes very applicable.

    Also: you believe that consciousness is substrate-independent. What evidence do you have for that?
  • Integrated Information Theory
    How would you measure how much Φ (phi) a computer has? Does the number of transistors matter? Or how they're arranged? Or both?
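
    For what it's worth, here is roughly the shape such a measurement takes. This is not Tononi's actual Φ, which requires searching over all partitions of a system's cause-effect structure (the Tononi lab's PyPhi package implements the real calculation, and it's only tractable for very small networks). Below is just a crude toy proxy: mutual information between two halves of a system, as an illustration of what "integration as a single number" looks like. The joint distribution is made up for the example:

    ```python
    import math

    def entropy(dist):
        """Shannon entropy in bits of an iterable of probabilities."""
        return -sum(p * math.log2(p) for p in dist if p > 0)

    # Hypothetical joint distribution over the states of two binary
    # parts (A, B) of a toy system, chosen so the parts are correlated.
    joint = {
        (0, 0): 0.4,
        (0, 1): 0.1,
        (1, 0): 0.1,
        (1, 1): 0.4,
    }

    # Marginal distributions of each part taken on its own.
    pA = [sum(p for (a, b), p in joint.items() if a == x) for x in (0, 1)]
    pB = [sum(p for (a, b), p in joint.items() if b == y) for y in (0, 1)]

    # Mutual information I(A;B) = H(A) + H(B) - H(A,B): a rough stand-in
    # for how much the whole carries beyond its parts taken separately.
    # Zero would mean the parts are independent (no integration at all).
    phi_proxy = entropy(pA) + entropy(pB) - entropy(joint.values())
    print(round(phi_proxy, 3))  # 0.278
    ```

    On this kind of picture, both the number of transistors and how they're wired would matter, since both change the joint distribution. But note that the real Φ is defined over the system's causal structure, not just its statistics, which is exactly why answering your question for an actual computer is so hard.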
  • Integrated Information Theory
    I've read somewhere that they accept that a thermostat is conscious. A thermostat but not the whole brain? And the whole body is involved in consciousness!

    What's the hypothesis and how would it be tested?

    Why is it ok to consider their hypothesis as it is, when it seems to be fatally flawed from the outset?
    Daemon

    Indeed...
  • Integrated Information Theory
    "We can identify it in an abstract sense, but not in a practical sense, as we can with a manmade machine.

    We have "brainoids" now, grown from adult human skin cells. But unless they are connected to sense organs, and yes, things like feet, they can't do what real brains do. There isn't anything for them to be conscious of."

    If all your sense organs stopped working, you would still be conscious.
  • Integrated Information Theory
    This is a fascinating passage:


    "Note that these postulates are inferences that go from phenomenology to physics, not the other way around. This is because the existence of one’s consciousness and its other essential properties is certain, whereas the existence and properties of the physical world are conjectures, though very good ones, made from within our own consciousness."

    It's Descartes 2.0.
    [bolding mine]

    Physicalism is teetering like a house of cards. Consciousness is primary. The physical world has been relegated to a conjecture (though a very good one). Soon, the parenthetical "though a very good one" will be gone. And then the conjecture of the physical world itself. Positing the existence of mind-independent stuff solves nothing and creates enormous problems.
  • Mental States from Matter but no Matter from Mental States?
    I think the functionalist has to define 'consciousness' in such a way that a function can constitute it. For example, X is conscious if and only if X maps the world and can predict events. Brains can do that, therefore brains are conscious. The trouble is that's not the definition of consciousness that many philosophers are talking about (including me, and I think you). The problem is we can't agree on definitions before we start. This impasse has arisen dozens and dozens of times on this forum and the last. I don't think functionalism is really a theory of consciousness, it's a definition. Most of the time anyway. Sometimes it's a theory, I think, depending on how it's formulated. With the walking and legs analogy, it's a definition. Walking just is how that action is defined. And that's not interesting.

    I think I get what you're trying to say here. Functionalism was what I figured Kenosha Kid would use to answer the questions I posed about "rock consciousness". If I put on my materialist hat, let me see if I can answer some of them:

    1. Why should we assume that consciousness can arise from rocks?
    Because if you make a functionally equivalent working brain, it will be conscious, and we can infer this from our knowledge of consciousness and brains.

    Rocks are nothing like neurons, nothing like mental states, so why is that not an immediate category error?

    I don't think there's a good answer the materialist can give for this. I think the best the materialist can say is, "yeah, but the system is functionally equivalent to a working brain. Who cares what it's made of?" Except, consciousness could very well be substrate-dependent. It might only come about through the interactions of biological matter, for some reason. Since it's impossible to verify whether anything other than working brain(s) is conscious, the issue of substrates and consciousness will continue to bedevil physicalists, particularly as AI starts doing things like passing Turing Tests.

    2. How can consciousness arise from rocks?

    Because they're in an arrangement and interacting in a way that is functionally equivalent to a working brain.

    What is the explanation for how the rocks become conscious? How many rocks are needed? What do you do with the rocks to make the rocks conscious (see what I mean about the absurdity of this)? Why is the act of whatever you do with the rocks important? Why does one set of rock interactions produce experience x, while a different set of rock interactions produces experience y, while a different set produces no experience at all?

    This is the Hard Problem, and the materialist can't just give a functionalism argument. An explanation has to be given for why matter arrangement X,Y,Z gives rise to conscious experience. The explanation so far is that if you take a bunch of matter and arrange it in some fiendishly complex way, and have it share electrons (or interact in some way), voila! Consciousness! Needless to say, this explanation is lacking, hence the Hard Problem and Explanatory Gap.

    3. How could you verify whether such a system is in fact conscious?

    I think this is catastrophic to the physicalist project of explaining consciousness. Functionalism won't help here. Functionalism is the problem! Suppose we make a metal brain that is functionally equivalent to a working organic brain. If functionalism is right, it should be conscious. Time to test it! So, how do we test whether it's conscious or not? Suppose we eventually build something that can pass a billion Turing Tests simultaneously while composing an opera for the ages. Is it conscious? Just as we'll never know if anything outside of our minds is conscious, we'll never know if anything we build is conscious. This isn't a problem when we all look like each other. We just assume we're all conscious. But a machine? Are we just going to assume advanced AI is conscious without any way to verify it? I see problems with that.

    4. What other physical processes besides rock interactions can produce consciousness?

    I don't think functionalism helps here. I think the problem for the materialist here is that when they claim that consciousness is substrate-independent, they're going to end up at panpsychism (which, along with computation, is all the rage in consciousness studies these days). Because in a physical universe, there's nothing unique about what the brain does. If a conscious moment is neurons X,Y,Z doing A,B,C, you can replace the neurons with anything. And so long as you don't know what it is that the neurons are doing that actually produces consciousness, the materialist is going to be stuck saying that matter arrangement A,B,C (e.g., a bunch of rocks) doing X,Y,Z is conscious if it's functionally equivalent to a consciousness-producing brain state. OK, so any arrangement of matter that is doing X,Y,Z is conscious? If a bunch of rocks can be conscious, what about a rockslide? Is there a chance the rocks in a rockslide can do X,Y,Z accidentally and produce a moment of consciousness? What about a rainstorm? There's a lot of matter interactions going on there. Are there conscious moments in storms? Meteor swarms? Are microbes conscious (Christof Koch thinks they are)?

    As for definitions, I think that's a rabbit-hole we don't need to go down. I think we can just use a folk definition of consciousness to lay bare the problems and absurdities of materialism as it pertains to consciousness.
  • Mental States from Matter but no Matter from Mental States?
    Do I know you from some other forum? Did we cross paths once and it ended badly?

    Anyway, suppose you built a machine that was functionally equivalent to a working brain. How would you test whether it's conscious or not?
  • Mental States from Matter but no Matter from Mental States?
    I never said you could build a functioning brain out of anything. Your question was regarding whether something with the same function as a brain would be conscious; my answer is yes. It doesn't follow that you can build a functioning brain out of rocks, liquorice or thin air: that is a purely technological problem. But _if_ you built something with the same functioning as a conscious brain out of rocks, then yes, that system would by definition be conscious.

    I figured we would reach this point. I think this is where the materialist position collapses into complete absurdity. I know that's a personal opinion, but I have some questions that you won't be able to answer, and that, I think, illustrate the absurdity of it all.

    1. Why should we assume that consciousness can arise from rocks? Rocks are nothing like neurons, nothing like mental states, so why is that not an immediate category error?
    2. How can consciousness arise from rocks? What is the explanation for how the rocks become conscious? How many rocks are needed? What do you do with the rocks to make them conscious (see what I mean about the absurdity of this)? Why is the act of whatever you do with the rocks important? Why does one set of rock interactions produce experience x, while a different set of rock interactions produces experience y, while a different set produces no experience at all?
    3. How could you verify whether such a system is in fact conscious?
    4. What other physical processes besides rock interactions can produce consciousness?

    (2) lays bare the absurdity of it all, but I think (3) is catastrophic. It's impossible to verify that anything other than you is conscious. That's just a brute fact about our epistemic position in the world. The existence of other consciousnesses is assumed but can never be proven. You can't leave your mind and check for the existence of other minds. That leads to the following problem for materialism: suppose you've got this awesome theory of consciousness and it predicts that that object over there is conscious. How do you prove it? You can't. No physicalist theory of consciousness will ever be verified. It's impossible in principle. No matter how clever the theory is, you can never get inside the object the theory says is (or isn't) conscious to examine its internal mental states or lack thereof. The physicalist project to understand consciousness is doomed to failure.

    Note that this is not a god-of-the-gaps argument. There are a lot of things that cause physicalism to fail with regard to consciousness and none of them have anything to do with god:
    1. The absurdity of consciousness coming from rocks
    2. The total lack of explanation for how consciousness can possibly arise from doing stuff with rocks.
    3. A category error in thinking that non-mental stuff can produce mental events
    4. The impossibility of verification of any physicalist theory of consciousness
  • Mental States from Matter but no Matter from Mental States?
    It’s an interesting question, and I haven’t read the rest of the thread yet, but I think there’s a misunderstanding here. Both your descriptions here assume that both consciousness and a working brain exist.

    Those are safe assumptions. Do you doubt brains and/or consciousness exists?

    Producing a feeling is not the same as producing consciousness, and I’m not sure how you would ‘arrange’ feelings or experiences as you’ve described without a working brain.

    Assuming a working brain is required for consciousness is just that: an assumption. Idealists always concede that point. They shouldn't. I will concede that it appears the brain is a necessary condition for consciousness. Are we justified in assuming appearances are as they seem? Sometimes. Sometimes not. The materialist cannot just assume brains exist and are required for consciousness. They have to argue that what seems to exist external to our minds actually does exist external to our minds. Since we can't leave our minds and verify whether anything external to our minds exists, there's no way to prove materialism. It is simply taken on faith that external stuff exists. It's no different than refuting Berkeley by kicking a rock.

    Anyway, even if brains are required for consciousness, there is still the issue of brain states producing new additional experiences, but experiences incapable of producing additional brain states. What do I mean by experiences? Simple: listen to music while you stub your toe looking at a sunset. Nothing additional is created by that arrangement of experiences. Nothing additional is ever added to the universe by mental states. What do I mean by additional? At time t, there are x number of experiences that have ever happened. At t+10 min, there will be many more additional experiences. That doesn't happen the other way around. Mental states never result in the addition of anything. Nothing physical is added to the universe from mental states. I think a materialist has to argue why that dichotomy exists. I think bringing up entropy is going to lead to substance dualism.

    The ‘feeling of stubbing your toe’ is a complex interrelation of ideas, including notions of ‘self’, ‘body’, ‘toe’, ‘movement’ and ‘impact’ as well as ‘unpleasant’, ‘sharp’ and ‘pain’. Potentially, it can all be rendered as a pattern of electric current through matter without understanding any of these ideas - provided that matter has sufficient experience to recognise and describe the pattern as ‘the feeling of stubbing your toe’. Otherwise how would you confirm this?

    It sounds like you're talking about consciousness from neurons/switches. See my post right above yours for my concerns on that.

    Conversely, one can theoretically arrange all of the above ideas in a particular way to construct a mental state that matches this pattern of electric current - without anyone ever actually stubbing their toe.

    Do you think mental states can exist on their own, without any substrate? That's how I read what you're saying.
  • Mental States from Matter but no Matter from Mental States?
    Those are good points. Instead of consciousness from moving rocks around, what about simulated consciousness? If one believes that consciousness can be simulated, then one believes that a collection of electric switches can produce consciousness. The questions I have about that are:
    1. Why should we assume that consciousness can arise from switches? Why is that not a category error?
    2. How can consciousness arise from switches? What is the explanation for how the switches become conscious? How many switches are needed? In what order? Why is the act of switching important? Why does one set of switching operations produce experience x, while a different set of switching operations produces experience y, while a different set produces no experience at all? Is electricity required?
    3. How could you verify whether such a system is in fact conscious?
    4. What other collections of switches are conscious? Phones? My desktop computer?
    5. What other physical processes besides switching operations can produce consciousness?
  • Mental States from Matter but no Matter from Mental States?
    You just did by posting that, but so what. All we do here is waste time. What is the question you want me to answer?
  • Mental States from Matter but no Matter from Mental States?
    Why doesn't the creation of new mental states violate entropy?
  • Mental States from Matter but no Matter from Mental States?
    Can you name them?

    No, there was a group of computationalist materialists over at the International Skeptics Forum in its heyday who were completely invested in that comic (and Hofstadter's book Gödel, Escher, Bach). The line of thinking was pretty simple to follow: if consciousness can be simulated, then it can be produced through switching operations. Switching operations can occur in lots of different substrates, like a system of pipes, water, pumps, and valves (for some reason, the materialists over at the ISF preferred ropes and pulleys). Such a system, if it was functionally equivalent to a working conscious human brain, would also be conscious.

    None of this so far is controversial. How they got to "you can simulate consciousness by moving rocks around" was exactly what the comic claims: you can build a Turing-complete computer by moving rocks around, if you have enough time and rocks. If you can compute by moving rocks around, and consciousness can be simulated on a computer, then consciousness can be simulated by moving rocks around. They even talked themselves into believing that a bunch of people writing 1s and 0s on pieces of paper could also produce conscious moments.
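    The Turing-completeness step in that chain is real, for what it's worth. The comic's rock computer is essentially a cellular automaton, and Rule 110, one of the simplest such automata, is proven Turing-complete. Here is a minimal sketch in Python of what "computing by moving rocks" amounts to; the rock/empty labels are mine, purely illustrative, and the point is that the update rule never mentions the substrate at all:

    ```python
    # Rule 110: a one-dimensional cellular automaton known to be Turing-complete.
    # Each cell is just "rock present" (1) or "empty" (0); the physics of rocks
    # never enters the computation, only the pattern-update rule below.
    RULE_110 = {
        (1, 1, 1): 0, (1, 1, 0): 1, (1, 0, 1): 1, (1, 0, 0): 0,
        (0, 1, 1): 1, (0, 1, 0): 1, (0, 0, 1): 1, (0, 0, 0): 0,
    }

    def step(row):
        """Compute the next generation from (left, center, right) neighborhoods."""
        n = len(row)
        return [RULE_110[(row[(i - 1) % n], row[i], row[(i + 1) % n])]
                for i in range(n)]

    # Start with a single rock and lay out a few generations.
    row = [0] * 16
    row[8] = 1
    for _ in range(5):
        print("".join("o" if cell else "." for cell in row))
        row = step(row)
    ```

    Whether you run this in silicon, in pipes and valves, or with someone shuffling actual rocks across a desert, the sequence of patterns is identical, which is exactly the substrate-independence premise the ISF crowd was leaning on.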

    I think they're actually correct in that chain of logic. If you're willing to believe in a conscious system of pipes and water, why not rocks being moved around in a certain way? Under materialism, consciousness shouldn't be substrate-dependent, and if you can replicate the computational processes going on in some brain state(s) by moving enough rocks around, that collection of moving rocks, if it's computationally equivalent to those brain state(s), should be conscious. The problem is that at the end of that chain of logic is an absurdity: the possibility that a universe of conscious beings is being simulated by someone moving rocks around.

    I am only asking because I heard many preachers say, "I've met many such and such who said such and such". I think it's a rhetorical device, and I am having a hard time believing it any more. If you met many materialists who said this or that, some names must have stuck in your mind.

    I can name you message board names, but they won't mean anything to you. There was a substantial group of people who did buy into what that comic was saying. I would be surprised if there weren't some materialists here who would agree with it.

    I am fully aware that you can say, "Joe Montague, Harry Griffin, Michele Adieu, Robert Frankovic, Debbi Gaal, and Rosemary Thimble." I ask you to be honest. Did you actually meet MANY materialists who said what you claim they all said?

    I met many materialists who believe that it was possible to simulate consciousness by moving rocks around, yes.

    Now, let me ask you, can consciousness be simulated on a computer?
  • Mental States from Matter but no Matter from Mental States?
    What was the question again? The one about walking and legs?
  • Mental States from Matter but no Matter from Mental States?
    I'm encouraged that you think it's a fairy tale! I have encountered many materialists who agree wholeheartedly with the conclusion in that comic. Maybe you see why I think it's absurd that consciousness could arise from wiring switches together, running a current through them, and turning them on and off in a certain way. If you can't get consciousness from moving rocks around (you can't), why should you get consciousness from turning switches off and on?

    So, do you think it's possible to simulate consciousness?
  • Mental States from Matter but no Matter from Mental States?
    Same question to you, 180. Do you believe it's possible to simulate a universe of conscious beings by moving rocks around? If not, where do you and that comic I linked diverge?
  • Mental States from Matter but no Matter from Mental States?
    A brain doesn't have to be conscious, so I'd word it as: something functionally equivalent to my brain would have the capacity for consciousness. You're conveying incredulity but there's no way this is news to you.

    Let's explore this because this is important. Take a look at this comic:
    https://xkcd.com/505/

    Do you believe it's possible to simulate a universe of conscious beings by moving a bunch of rocks around in a certain way? If not, where do you and that comic diverge?
  • Mental States from Matter but no Matter from Mental States?
    Yes, uncontroversially. This is a philosophy forum, I'm well aware of the difficulty in claiming to know anything beyond that I'm a thinking thing, but as much as one can be certain of anything else, I'm at least certain of that.

    You're more certain that physical matter exists than of pretty much anything else? What do you base this high level of certainty on?

    Also, regarding consciousness, do you believe that something that is functionally equivalent to the brain will be conscious, whatever the substrate? The example that is often given is setting up an enormous system of valves, water, pumps and pipes that is functionally equivalent to a working brain and then running it. Do you think such a system would produce consciousness? How about a system of electric switches opening and closing? Do you think that if you open and close the switches in some way, the system of switches will be conscious? If so, why? Also if so, why would that particular combination of switching actions give rise to a conscious moment of, say, stubbing your toe, while a different set of switching operations give rise to, say, the beauty of a sunset?

    If so, why do you think it's taking so long to come up with an explanation for how the brain produces consciousness?
    — RogueAI

    Those are not related things.

    Sure they are. If science can't solve consciousness, then it's first going to appear as an "explanatory gap" until people realize science isn't equipped to solve it. I think that's where we are at the moment, and why we're seeing people like Christof Koch turn to panpsychism.

    Also, you did not give an explanation for why consciousness has been such a tough nut to crack for so long. In an interview, Paul Davies called it the number one problem in science. I may be going out on a limb with idealism, but you are certainly going out on a limb denying there's a hard problem (which you do later on). Do you think that our brains just aren't equipped to handle the consciousness problem? But then that is ad hoc: we can detect gravitational waves now, but we're still in the dark about how brains produce consciousness? That shouldn't be. That's a problem for materialists.


    There is no necessary cause for a brain to come to understand consciousness. If humans hadn't evolved, perhaps no brain would even have a concept of consciousness. I don't think rats, crows and dolphins spend their time thinking about this stuff.

    But we do spend our time thinking about such stuff, and science prides itself on its explanatory power, and in this one area, there has been a definite lack of progress that is starting to become embarrassing, leading people like Giulio Tononi to speculate, without a shred of proof or way to verify, that consciousness is a result of information processing. That's pretty out there, but IIT is all the rage now.

    For example, suppose 1,000 years from now the Hard Problem remains. Would you reexamine your belief that consciousness arises from matter?
    — RogueAI

    The hard problem is not a problem, it's a protest. It's even worded by Chalmers as such. There is nothing to wait for.

    How do brains produce consciousness? There is no answer, of course, which suggests there is something to wait for: the answer to how brains produce consciousness. If "there's nothing to wait for", why are so many people wasting their time trying to explain it? Your answer is not believable.

    As for running and legs and brains, we have an explanation for running/walking. We have no explanation for the emergence of consciousness from the actions of neurons.
    — RogueAI

    An of-the-gaps fallacy again. Science hasn't explained it yet, therefore it must be God/panpsychism/dualism/whatever other ism I favour.

    You're not reading what I said. The reason the legs/walking case isn't like consciousness emerging is that we have an explanation for walking and running, and walking, running, and legs all belong to the same ontological category. We don't have an explanation for consciousness (we don't even have an agreed-upon definition of it), and mental states and physical states are ontologically different things.


    If you find yourself making this argument, stop, catch yourself, and remember: no one finds this a good argument when it's not used in the service of their pet theory. And more honest people don't think it a good argument period.

    If physical states can cause mental states, why not vice-versa?

    I'm not making a god-of-the-gaps argument. I'm saying materialism will never explain consciousness because there's a category error going on: material things cannot, in principle, give rise to consciousness, just as consciousness cannot give rise to material things.

    I suspect you're going to say that a collection of electric switches, if arranged some particular way and turned on and off some particular way, will produce consciousness. This goes to the heart of the matter. A conscious collection of switches is already an absurdity, and it entails an additional absurdity: that a collection of valves, pipes, and water, if functionally equivalent to those switches that produce a conscious moment, will also be conscious. I think the debate is over when you make that claim. I think it's an obvious absurdity, so my argument against materialism isn't god-of-the-gaps, it's a reductio ad absurdum: physicalism leads to conscious systems of valves and pipes and water (among other things). To which I respond: absurd.
  • Mental States from Matter but no Matter from Mental States?
    His general point stands: legs are a prerequisite for walking; walking does not cause legs. Atomic structure is a prerequisite for materials; material structure is not a prerequisite for atoms. A prerequisite for atoms is massive, charged particles; atoms are not a prerequisite for massive, charged particles. Or, more simply, trees are a prerequisite for forests; forests are not a prerequisite for trees.

    Do you believe the brain is a prerequisite for consciousness? If so, why do you think it's taking so long to come up with an explanation for how the brain produces consciousness? Also, how long would you be willing to wait before giving up? For example, suppose 1,000 years from now the Hard Problem remains. Would you reexamine your belief that consciousness arises from matter? What about 10,000 years from now? Also, would you agree that anything that is functionally equivalent to a working brain should be conscious?

    As for running and legs and brains, we have an explanation for running/walking. We have no explanation for the emergence of consciousness from the actions of neurons. Also, "Running" and "legs" exist in the same ontology, just like "wet" and "water" and "river". No new ontological categories are required for those examples. Not so with physical states and mental states. They are obviously ontologically different things.

    "You have an invalid assumption: that every hierarchical relationship in physics is or ought to be a two-way street. That is not a peculiarity of physics (just your conception of it) so, no, it's a problem for physicalists that consciousness is a function of brains but cannot create brains."

    If physical states can cause mental states, why not vice-versa? In other cases in physics where A causes B but B can't cause A, there's an explanation. What's the physicalist explanation for why matter can produce mental states, but not vice versa?
  • Mental States from Matter but no Matter from Mental States?
    Fear and love are mental states, and they produce all kinds of material consequences, like fights and babies.

    Fair point. I will amend my claim to "you don't get new/additional matter from mental states".