IIT, originated by Giulio Tononi,
— frank
Scott Aaronson debunkificated this a while back. David Chalmers shows up in the comment section. — fishfry
Why would integration have to be all or nothing? How about degrees of it and a threshold for consciousness? — frank
Good question. Did you see what I said earlier about axioms and postulates? — frank
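frank's question about degrees of integration can be made concrete with a toy calculation. The sketch below uses total correlation (multi-information), which is only a crude stand-in for IIT's Φ, not Tononi's actual measure; the `THRESHOLD` value is an arbitrary cutoff, included purely to illustrate how a graded quantity could be combined with a threshold for "conscious".

```python
import math

def entropy(probs):
    """Shannon entropy (in bits) of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def total_correlation(joint):
    """Total correlation TC = H(A) + H(B) - H(A,B) for a joint
    distribution over two variables; zero iff A and B are independent.
    joint: dict mapping (a, b) -> probability."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0) + p
        pb[b] = pb.get(b, 0) + p
    return entropy(pa.values()) + entropy(pb.values()) - entropy(joint.values())

# Two binary variables: fully independent vs. perfectly correlated.
independent = {(a, b): 0.25 for a in (0, 1) for b in (0, 1)}
correlated = {(0, 0): 0.5, (1, 1): 0.5}

print(total_correlation(independent))  # 0.0 bits: no integration
print(total_correlation(correlated))   # 1.0 bit: some integration

THRESHOLD = 0.5  # arbitrary, for illustration only
print(total_correlation(correlated) > THRESHOLD)  # True
```

The point of the toy is just that "integration" naturally comes in degrees, so a theory could in principle impose a cutoff; whether any such cutoff is principled is exactly what the thread is disputing.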
Are you conscious? Is your significant other(s) conscious? To not draw this out, I'll answer for you: yes, and yes.
— RogueAI
This is precisely what I was talking about before. That sort of wishy-washy 'well, I know what I mean' way of communicating is no good for answering questions about consciousness in a scientific way. — Kenosha Kid
It doesn't lead to it per IIT. Integrated information is consciousness. — frank
Absolutely not. We have no common "basic understanding" of consciousness. On this site alone you'll find a new one for every thread on the subject. — Kenosha Kid
It's late in life for me, and I find I have, or have had, most of what I ever wanted. Some of it is gone, owing to normal processes of aging, death, disease, and so on.
There are two things I wish I had when I was 18--roughly--that I have now. One is peace of mind. I'm pretty contented. It would have been good to be so calm and collected when I was at the beginning of college, instead of bouncing off the walls.
The second thing I wish I had had when I was 18 was the technology I use now -- computer, tablet, internet. These three things (and the companies that back them up, like Barnes & Noble or Amazon) would have made study so much more effective.
Yes, it would have been nice if gay liberation had arrived in the outback where I lived in 1964. All that erotic energy wasted under the cold wet blanket of condemnation and guilt.
Loads of money? Nope. I never had a lot, but I always had enough money. So far, anyway. All that one needs is a little more than one needs--a margin. — Bitter Crank
And my point was that you don't need a scientific description of consciousness to tell whether something is conscious or not. In order for a scientist to discover scientifically what water is, yes, she needs a definition of water. If she doesn't know what water is, she can't tell you what's in the glass. Even if she knows what water looks like, she needs to be able to differentiate it from alcohol, or any other transparent liquid. As it happens, you don't need to know _much_ about water to be able to distinguish it perfectly well from not-water (its appearance, fluidity, taste, lack of smell). This is the extent to which the definition of consciousness also needs to be precise: to distinguish it from unconscious things.
That's correct. Are octopuses conscious? Does that question involve whether computers are conscious or not? No. So the question is not about computers (although a perfectly good example). — Kenosha Kid
Suspicion confirmed. I'm not claiming there's a possible world where consciousness can arise from rocks.
— Daemon
I'm not sure how you could know that.
But in any case you are starting from a position where I previously had working sense organs. But suppose I had never had them: I don't think I'd ever have been conscious. And consider this from an evolutionary perspective: consciousness would never have developed at all without sensing, sense organs.
Why should we assume that consciousness can arise from rocks?
— RogueAI
I get the impression from later chat that this clicked: We should NOT assume that consciousness can arise from rocks.
This is the Hard Problem
— RogueAI
I don't think so. The hard problem allows for a bunch of rocks to be conscious, it just doesn't allow a complete third person description of that consciousness since it will not contain "what it is like to be a conscious bunch of rocks". And when I say "doesn't allow", I mean that Chalmers won't hear of it on grounds of taste.
How could you verify whether such a system is in fact conscious?
I think this is catastrophic to the physicalist project of explaining consciousness. Functionalism won't help here. Functionalism is the problem! Suppose we make a metal brain that is functionally equivalent to a working organic brain. If functionalism is right, it should be conscious. Time to test it! So, how do we test whether it's conscious or not?
— RogueAI
This has nothing to do with non-organic consciousness as far as I can see. This problem already exists for discerning if an animal or even a person is conscious.
I don't predict this will be the difficult part given a more comprehensive model of consciousness. The issues here (in my experience) relate primarily to language. The concept of "consciousness" is vague and therefore arguable. For instance, some people don't like the idea of observing anything non-human as conscious, and that vagueness gives sufficient wiggle room to be able to say, "but that's not quite consciousness" about anything. I think this is also partly why people like Chalmers retreat to the first person in these arguments. It's possible to claim that something is lost when you transform to the third person view, as long as that something is suitably wishy-washy.
But if you really want to test for consciousness, you have to define in precise terms what consciousness is, not what it isn't.
What other physical processes besides rock interactions can produce consciousness?
N/A
— RogueAI
I've read somewhere that they accept that a thermostat is conscious. A thermostat but not the whole brain? And the whole body is involved in consciousness!
What's the hypothesis and how would it be tested?
Why is it ok to consider their hypothesis as it is, when it seems to be fatally flawed from the outset? — Daemon
"We can identify it in an abstract sense, but not in a practical sense, as we can with a manmade machine.
We have "brainoids" now, grown from adult human skin cells. But unless they are connected to sense organs, and yes, things like feet, they can't do what real brains do. There isn't anything for them to be conscious of."
[bolding mine] This is a fascinating sentence:
"Note that these postulates are inferences that go from phenomenology to physics, not the other way around. This is because the existence of one’s consciousness and its other essential properties is certain, whereas the existence and properties of the physical world are conjectures, though very good ones, made from within our own consciousness."
It's Descartes 2.0.
I think the functionalist has to define 'consciousness' in such a way that a function can constitute it. For example, X is conscious if and only if X maps the world and can predict events. Brains can do that, therefore brains are conscious. The trouble is that's not the definition of consciousness that many philosophers are talking about (including me, and I think you). The problem is we can't agree on definitions before we start. This impasse has arisen dozens and dozens of times on this forum and the last. I don't think functionalism is really a theory of consciousness, it's a definition. Most of the time anyway. Sometimes it's a theory, I think, depending on how it's formulated. With the walking and legs analogy, it's a definition. Walking just is how that action is defined. And that's not interesting.
I never said you could build a functioning brain out of anything. Your question was regarding whether something with the same function as a brain would be conscious; my answer is yes. It doesn't follow that you can build a functioning brain out of rocks, liquorice or thin air: that is a purely technological problem. But _if_ you built something with the same functioning as a conscious brain out of rocks, then yes, that system would by definition be conscious.
It’s an interesting question, and I haven’t read the rest of the thread yet, but I think there’s a misunderstanding here. Both your descriptions here assume both consciousness and a working brain exist.
Producing a feeling is not the same as producing consciousness, and I’m not sure how you would ‘arrange’ feelings or experiences as you’ve described without a working brain.
The ‘feeling of stubbing your toe’ is a complex interrelation of ideas, including notions of ‘self’, ‘body’, ‘toe’, ‘movement’ and ‘impact’ as well as ‘unpleasant’, ‘sharp’ and ‘pain’. Potentially, it can all be rendered as a pattern of electric current through matter without understanding any of these ideas - provided that matter has sufficient experience to recognise and describe the pattern as ‘the feeling of stubbing your toe’. Otherwise how would you confirm this?
Conversely, one can theoretically arrange all of the above ideas in a particular way to construct a mental state that matches this pattern of electric current - without anyone ever actually stubbing their toe.
Can you name them?
I am only asking because I have heard many preachers say, "I've met many such and such that said such and such". I think it's rhetoric and I am having a hard time believing it any more. If you met many materialists who said this or that, some names must have stuck in your mind.
I am fully aware that you can say, "Joe Montague, Harry Griffin, Michele Adieu, Robert Frankovic, Debbi Gaal, and Rosemary Thimble." I ask you to be honest. Did you actually meet MANY materialists who said what you claim they all said?
A brain doesn't have to be conscious, so I'd word it as: something functionally equivalent to my brain would have the capacity for consciousness. You're conveying incredulity but there's no way this is news to you.
Yes, uncontroversially. This is a philosophy forum, I'm well aware of the difficulty in claiming to know anything beyond that I'm a thinking thing, but as much as one can be certain of anything else, I'm at least certain of that.
If so, why do you think it's taking so long to come up with an explanation for how the brain produces consciousness?
— RogueAI
Those are not related things.
There is no necessary cause for a brain to come to understand consciousness. If humans hadn't evolved, perhaps no brain would even have a concept of consciousness. I don't think rats, crows and dolphins spend their time thinking about this stuff.
For example, suppose 1,000 years from now the Hard Problem remains. Would you reexamine your belief that consciousness arises from matter?
— RogueAI
The hard problem is not a problem, it's a protest. It's even worded by Chalmers as such. There is nothing to wait for.
As for running and legs and brains, we have an explanation for running/walking. We have no explanation for the emergence of consciousness from the actions of neurons.
— RogueAI
An of-the-gaps fallacy again. Science hasn't explained it yet, therefore it must be God/panpsychism/dualism/whatever other ism I favour.
If you find yourself making this argument, stop, catch yourself, and remember: no one finds this a good argument when it's not used in the service of their pet theory. And more honest people don't think it a good argument period.
If physical states can cause mental states, why not vice-versa?
His general point stands: legs are a prerequisite for walking; walking does not cause legs. Atomic structure is a prerequisite for materials; material structure is not a prerequisite for atoms. A prerequisite for atoms is massive, charged particles; atoms are not a prerequisite for massive, charged particles. Or, more simply, trees are a prerequisite for forests; forests are not a prerequisite for trees.
"You have an invalid assumption: that every hierarchical relationship in physics is or ought to be a two-way street. That is not a peculiarity of physics (just your conception of it) so, no, it's a problem for physicalists that consciousness is a function of brains but cannot create brains."
Fear and love are mental states, and they produce all kinds of material consequences, like fights and babies.
