On the show, Data is always puzzled by some feature of common human behavior. Maybe he could convince someone he's autistic, except that he can perform calculation and recitation of facts at a superhuman level if asked, and he usually does so unless told not to. — Marchesk
This is the second time I've read the term "superhuman". The fact that something is done at a superhuman level is now posited as an argument against something being conscious. — Benkei
I also don't think being able to reproduce the full range of human emotion should be a prerequisite to be considered conscious. — Benkei
Agreed, but the harder problem is about the epistemic justification for deciding whether a physical system different from our own is conscious. And the argument is that we have no way to really know, because our own consciousness does not tell us what it is about us that makes us conscious. It could be the brain stuff, it could be the functions performed by the brain, it could be both, or it could be that something else like panpsychism is the case. We just can't tell. — Marchesk
To summarize, the harder problem is that human phenomenal concepts do not reveal whether our material makeup or the functional role our neurobiology plays is responsible for consciousness. As such, we have no philosophical justification for saying whether a functional isomorph made up of different material such as the android Data from Star Trek is conscious. Even more confusing, we have no way of telling whether a "mere" functional isomorph is conscious, where "mere" means functional in terms of human folk psychology only, and not in the actual neural functions. — Marchesk
Is this a real problem though? — Benkei
I'm from the "common sense" approach that what's conscious is what people decide it is and it's neither here nor there why. — Benkei
If one morning you wake up feeling dumpy and stupid, just write an article in a philosophy forum and talk about how much you don't know about consciousness; you will feel better. The more you write about this thing that you don't know, the smarter you'll feel. — god must be atheist
Yes, I saw that and agree. I'm not satisfied with anyone's solution to the hard or harder problems. You end up biting one or more bullets no matter which way you go. — Marchesk
On the other hand, Searle doesn't treat consciousness as a ghost in the machine. He treats it, rather, as a state of the brain. The causal interaction of mind and brain can be described thus in naturalistic terms: Events at the micro-level (perhaps at that of individual neurons) cause consciousness. Changes at the macro-level (the whole brain) constitute consciousness. Micro-changes cause and then are impacted by holistic changes, in much the same way that individual football players cause a team (as a whole) to win games, causing the individuals to gain confidence from the knowledge that they are part of a winning team.
He articulates this distinction by pointing out that the common philosophical term 'reducible' is ambiguous. Searle contends that consciousness is "causally reducible" to brain processes without being "ontologically reducible". He hopes that making this distinction will allow him to escape the traditional dilemma between reductive materialism and substance dualism; he affirms the essentially physical nature of the universe by asserting that consciousness is completely caused by and realized in the brain, but also doesn't deny what he takes to be the obvious facts that humans really are conscious, and that conscious states have an essentially first-person nature.
It can be tempting to see the theory as a kind of property dualism, since, in Searle's view, a person's mental properties are categorically different from his or her micro-physical properties. The latter have "third-person ontology" whereas the former have "first-person ontology." Micro-structure is accessible objectively by any number of people, as when several brain surgeons inspect a patient's cerebral hemispheres. But pain or desire or belief are accessible subjectively by the person who has the pain or desire or belief, and no one else has that mode of access. However, Searle holds mental properties to be a species of physical property—ones with first-person ontology. So this sets his view apart from a dualism of physical and non-physical properties. His mental properties are putatively physical. — Biological naturalism
Immediately I would see that the first-person ontology becomes the "ghost in the machine" that he purports to reject. It is exactly the question of how micro-states (third-person) ARE, or BECOME over time, macro-states. Just to say "we have micro-states" and "we have macro-states" is simply to restate and beg the question. — schopenhauer1
The Cartesian conceptualization needs to be rejected entirely, both in whole and in part. There is just ontology that we flesh out in (public) language, whether ordinary or specialized. — Andrew M
Yes, I see this type of phrase about rejecting the "Cartesian" conceptualization a lot. But what exactly does that mean? The hard problem still remains. It seems to me a sort of de facto panpsychism, perhaps. I don't know. — schopenhauer1
A lot of philosophical language is implicitly dualistic. And it can make problems look more intractable or mysterious than they would otherwise be. — Andrew M
It seems to me the "ghost" may a product of conceptual problems that arise from (possibly misleading) introspection. It seems that my mind IS something, so I conceptually treat it as an entity. IMO this leads to a dualist (or quasi-dualist) view of the mind.Then what's an example of a solution? Or do we just not debate philosophy of mind and problem solved? I don't see how the problem is not a problem by using different language, or rather, I don't even see how that language would be employed. When I say "green" as a qualitative state and "green" as a wavelength of light hitting the eye and producing all sorts of neurological states and arrangements, they seem different. How would you suppose to not have the difference without adding the ghost? — schopenhauer1
In answer to your question: the quale "green" is an experience - a representation of a physical attribute that is produced by the visual cortex and then passes into short-term, and then long-term, memory. — Relativist
I think the "ghost" is an illusion of introspection. Rather, the representation of greenness is present because it influences behavior. Some of the more important mental activity that is discussed in theory of mind is that which mediates between stimulus and response. This is important because it is contrary to the notion that color qualia are epiphenomenal.Representation of a physical attribute? That sounds like where you are sneaking in the ghost or the "Cartesian Theater". It usually happens somewhere. — schopenhauer1
Some of the more important mental activity that is discussed in theory of mind is that which mediates between stimulus and response. — Relativist
We perceive (have a subjective experience of) greenness, and having experienced it at least once, we then have a memory of greenness - a memory that is drawn upon when we dream, think, or imagine things that are green. Perceiving color is a functional capacity that we possess, one that confers an ability to tailor our actions based on this quality. The experience of greenness is nonverbal; words cannot convey the experience. What problems are you referring to? — Relativist
The problem is that consciousness isn't limited to perception. Memory, dreams, imagination, feelings, thoughts and hallucinations can all have colors, sounds, etc. — Marchesk
...the harder problem is that human phenomenal concepts... — Marchesk
To summarize, the harder problem is that human phenomenal concepts do not reveal whether our material makeup or the functional role our neurobiology plays is responsible for consciousness. As such, we have no philosophical justification for saying whether a functional isomorph made up of different material such as the android Data from Star Trek is conscious. Even more confusing, we have no way of telling whether a "mere" functional isomorph is conscious, where "mere" means functional in terms of human folk psychology only, and not in the actual neural functions.
So if Data's positronic brain functions differently from our own brain tissue, but still produces reports and behaviors based on things like beliefs, desires and phenomenal experience, we have neither a physical nor a functional basis for deciding whether he is actually conscious, or just simulating it. — Marchesk
Then what's an example of a solution? Or do we just not debate philosophy of mind and problem solved? I don't see how the problem is not a problem by using different language, or rather, I don't even see how that language would be employed. When I say "green" as a qualitative state and "green" as a wavelength of light hitting the eye and producing all sorts of neurological states and arrangements, they seem different. How would you suppose to not have the difference without adding the ghost? — schopenhauer1
I just don’t buy that language is the problem here. I have pain and color experiences, but those aren’t part of the scientific explanations of the world or our biology. And language doesn’t create pain or color experiences. Rather, they are simply part of our experience, which language reflects. This leaves color and pain unexplained, with no way so far for us to reconcile them with science.
Language is dualistic, because that’s our experience of the world. — Marchesk
"Green" in its ordinary public sense is not a qualitative state, it's a property of certain objects that human beings can point to (trees, grass, etc.) There's a qualitative/experiential aspect in the pointing, but not in the objects. — Andrew M
The scientific usage of "green", while related, has a different referent (i.e., we're pointing at something else, namely a range of light wavelengths). — Andrew M
As I see it, problems are solved by differentiating our experiences, developing a public language around them, and generating testable hypotheses. That is what scientists (and to some extent all of us in our everyday lives) do. The philosophers' role is to resolve/dissolve the conceptual problems that arise. — Andrew M
I suggest that there are non-verbal concepts, and this includes qualia like greenness. The "concept" of greenness is that mental image that we perceive. The word "green" refers to this quale. The range of wavelengths associated with greenness are those wavelengths that are associated with this quale. Color-blind humans who lack the ability to distinguish red from green do not know greenness - they only know ABOUT greenness.
Conceptual problems arise sometimes, when there is legitimately no good explanation of how two phenomena that seem different are the same. That is the hard problem. — schopenhauer1
I suggest that there are non-verbal concepts, and this includes qualia like greenness. The "concept" of greenness is that mental image that we perceive. The word "green" refers to this quale. The range of wavelengths associated with greenness are those wavelengths that are associated with this quale. Color-blind humans who lack the ability to distinguish red from green do not know greenness - they only know ABOUT greenness. — Relativist
The experience of greenness is nonverbal; words cannot convey the experience. — Relativist
What problems are you referring to? — Relativist
The "concept" of greenness is that mental image that we perceive. The word "green" refers to this quale. The range of wavelengths associated with greenness are those wavelengths that are associated with this quale. — Relativist
That's not my experience (nor, I think, anyone else's). — Andrew M
What is the difference between those and that which does not count as being those? — creativesoul
Consciousness is not the problem. Our account of it is. — creativesoul
What is the difference between those and that which does not count as being those?
— creativesoul
Phenomenal: color, sound, smell, taste, pain, pleasure, hot, cold, thoughts, beliefs, desires, dreams, feelings.
Non-phenomenal: shape, space, time, composition, number, structure, function, computation, information, empirical. — Marchesk
Consciousness is not the problem. Our account of it is.
— creativesoul
Obviously it's not a problem for nature. It's a problem for humans because we can't figure out what the proper account of consciousness is. And depending on what the proper account is, our ontology or epistemology might need to change to reflect that. — Marchesk