Why would integration have to be all or nothing? How about degrees of it and a threshold for consciousness? — frank
Evolutionary biology might be the answer. Why would we need to answer that definitively at this point? — frank
Is this saying that all experiences are unique and that when an experience is happening there's something it's like to be having that experience, even if it's an experience of pure darkness and silence? — RogueAI
I'm not sure that this is true... — RogueAI
Let's suppose we have three people. Bob is stationary, Frank is accelerating to 99% of the speed of light, and Suzie is also motionless, but through a magical telescope she's able to observe Bob's and Frank's brains in real time. Bob's brain should look like a normal functioning brain, but as Frank accelerates, shouldn't Suzie see Frank's brain functions go slower and slower as time dilation kicks in? And let's also say that Suzie's magic telescope can look inside Frank's mind. As Frank accelerates, would his thoughts look slower and slower to Suzie? Would his consciousness change at all? And yet it must, because at the end of Frank's trip, he's going to report that he was conscious for X amount of time, while Bob reports that he was conscious for years more than X. — RogueAI
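For rough numbers, assuming Frank cruises at a steady 0.99c rather than still accelerating, the standard special-relativistic dilation factor works out to:

$$\gamma = \frac{1}{\sqrt{1 - v^2/c^2}} = \frac{1}{\sqrt{1 - 0.99^2}} \approx 7.1$$

so on this picture Suzie would see roughly one second of Frank's brain activity tick by for every seven seconds of Bob's.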
but only the integrated information can create this moment of consciousness. — Pop
I'm not sure what that means. — frank
We are conscious of very little of what our brain is actually doing, and it's doing a lot of information processing moment by moment. Why does information integration vis-à-vis digestion not result in conscious experience? — RogueAI
Well, I don't totally understand IIT at this point. That's why I started this thread, in hopes of figuring out how it comes together. — frank
IIT, originated by Giulio Tononi... — frank
Scott Aaronson debunkificated this a while back. David Chalmers shows up in the comment section. — fishfry
Bob is just going to be a lot older than Frank. They'll be able to consult with a physicist to understand why. — frank
This is well expressed. But I wonder, can you see how perception (extra information) has to integrate with already established information to form understanding? — Pop
So now we're describing experience itself (the graphic is the view out of one of your eyeballs), and attempting to hypothesize about correlates of that in the neuronal realm. — frank
It's not really a theory of consciousness, in my view, since the hard problem is being ignored, and in ignoring it only half of consciousness is being calculated. It seems more a proposal of a way to calculate cognition. So on that basis I'm not going to analyze it further. — Pop
IIT is a computational theory of consciousness that blocks out the hard problem. — Pop
The felt quality of consciousness is treated as a secondary consideration that is simply explained by equating qualia with consciousness. — Pop
Of course the hard problem would be blocked out, as a felt quality cannot be conceptualized (being felt slightly differently in every end user), so it can never be calculated in any universal sense. — Pop
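To make "calculated" concrete, here is a toy sketch of an integration-style measure: the mutual information between two subsystems of a tiny network. This is nothing like Tononi's actual Φ (which involves minimizing over partitions of cause-effect structure), and the function name and example distributions below are purely illustrative, but it shows the flavour of quantity IIT proposes to compute over a system's states.

```python
import math

def mutual_information(joint):
    """Mutual information I(A;B) in bits, from a joint distribution
    given as a dict {(a, b): probability}. A crude stand-in for an
    'integration' measure; real Phi is far more involved."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    mi = 0.0
    for (a, b), p in joint.items():
        if p > 0:
            mi += p * math.log2(p / (pa[a] * pb[b]))
    return mi

# Two coupled binary units that mostly agree: non-zero "integration".
coupled = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
# Two independent fair coins: zero "integration".
independent = {(a, b): 0.25 for a in (0, 1) for b in (0, 1)}

print(round(mutual_information(coupled), 2))      # ~0.53 bits
print(round(mutual_information(independent), 2))  # 0.0 bits
```

Whether any such number, however refined, captures the felt quality is exactly the point of contention here.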
They say "Oh, but that's just what 'experience' means. There is nothing more to it than that." Which is nonsense. I certainly do not mean 'integrated information' when I talk about consciousness. — bert1
But as you say, it just doesn't touch the basic question of why we should think that integrated information is consciousness, why it creates a first person perspective at all. — bert1