What problem?
Neurology is a discipline that tells us much about how conscious experience happens.
— creativesoul
Yeah, but as Luke in this thread (and Chalmers elsewhere) have pointed out, it doesn't explain why any physical system would be conscious. — Marchesk
Chalmers' "what it's like" rendering is an untenable and rather ill-informed approach. I've argued that at length on this very forum. There is no such
singular thing as "what
it's like" to be human.
Our conscious experience (being human) is an ongoing process that is directly and indirectly influenced by, and consists of, all sorts of different things all the time. It is a largely autonomous process, one over which we have little to no control. That said, each and every moment of our lives counts as "what it's like to be human", and this alone poses a huge problem, obviously enough I would hope, for anyone who aims at defining "what it's like to be human", for being human is not like any single excised duration within our lives. It is like all of them, yet each one is different from the rest. Thus, the notion is incoherent at best. It's untenable. Our conscious experience consists of all the moments of our lives, and each and every duration is unlike the rest, for each consists of some elements that the others do not. Being human is all of them.
Moreover, to labor the point by introducing changes in our thought and belief systems: because the way we think about what's happening changes over time (along with changes in our belief system), and because the way we think about things affects/effects conscious experience, even our experiences involving the same sorts of things change over time, despite the recurrence of some of the elements.
Drinking Maxwell House at time t1 is a much different experience from drinking Maxwell House at time t20,000 if along the way one gradually begins to enjoy the experience less and less, unbeknownst to oneself at first. This will certainly happen if, at some point in their lives, the taster drinks 100% Kona coffee freshly ground and prepared with a French press, and then continues to drink Kona coffee more and more afterwards. We can replace Kona coffee and the preparation process with any other, and the point holds.
All of this places the notion of "what it's like to be a human" under rightful suspicion regarding its ability to even provide an outline of our conscious experience, for what coffee tasting is like at time t1 is not what coffee tasting is like at time t20,000, even without the introduction of Kona coffee. The very same issues arise with any and all conscious experiences of 'X' at different times. Variables fundamentally change the experience.
Our understanding of physics would not predict this if we weren't already conscious. — Marchesk
This seems irrelevant to me, although I'd be happy to entertain an argument for how it is relevant.
Some folk hereabouts seem to think that we cannot acquire knowledge of our own conscious experience, simply because we must use it as the means for doing so. They've adopted this fait accompli attitude about the subject. There's a similar vein of thought pervading philosophy of language and 'getting beneath language'. I've found that that's not an insurmountable problem at all, actually, in either respect. The method of approach matters most in such metacognitive endeavors, and that method must include adequate minimal (universal) standards and criteria, which must be determined first and satisfied accordingly throughout the endeavor.
Unfortunately, attention spans are required, and seem to be lacking...
It's really no different (roughly speaking) from acquiring knowledge about anything that exists (or existed) in its entirety prior to our awareness and/or subsequent accounting practices of it. Conscious experience is one such thing.
A nervous system wouldn't fundamentally be different than a computer with input devices, in that regard. — Marchesk
This broaches another topic, but perhaps it's worth touching upon...
On my view, nervous systems aren't fundamentally conscious. They are most certainly fundamentally different from computers. I would not even go so far as to say that a human being is fundamentally conscious, at least not from the moment of conception through to the first completely autonomous correlation drawn between different things.
This skirts around the issue of where to 'draw the line', so to speak, which again harks back to the aforementioned criteria.
Why do we see colors and feel pain when no other physical system does this, far as we can tell? What would it take for a robot to do so? Did Noonien Soong slip a qualia chip into Data's positronic brain? — Marchesk
Animals do. They are physical systems, in part at least, just like we are.
What would it take for a robot to see colors and feel pain? Probably biological machinery capable of doing so. At least, that's my guess.