It's the grammar of 'pain,' yes, that it tends to belong to a particular person. — green flag
But we don't know from our own experience what it means to be in 'pain.' — green flag
In other words, the concept is conventional and public. — green flag
The grammar is based on the fact that I don't feel someone else's pain. — Fooloso4
It is, but only because we experience pain. There would be no conventional and public concept of pain in a tribe where no one experiences pain. — Fooloso4
Imagine a tribe where no one feels pain. They would have no idea what you are talking about. — Fooloso4
You seem to be hinting at truth apart from language, but to me that's a round square. Statements are true sometimes. Or we take them to be true...to express what is the case, etc. — green flag
I'm simply saying that it can be appropriate to say one thing, given the evidence available to us, even though that thing is false. — Michael
OK. How conscious are the latest famous bots ? Do they have selves ? — green flag
If someone is crying, it is appropriate to say that they must be sad. But we're wrong, because they're just acting. — Michael
You mean like ChatGPT? Not sure. It's a good question. I think probably not, although the matter in the chips that run the program is. — bert1
Imagine a person is not acting and still insists, while smiling and laughing, that they are suffering 'excruciating pain.' If they 'have' to be acting or not understanding English, that just supports my point. — green flag
As I said before, nobody would ever learn to associate the word "pain" with the feeling that causes them to smile and laugh. — Michael
Wait a minute, though. So they learn what 'pain' means from other people ? But haven't you been saying (basically) that it's a label on something internal ? That it refers to a state of an immaterial ghost ?
But how could a parent ever check if the child was labelling states of the ghost correctly ? The whole theory of the ghost as the ground of meaning is like the idea of phlogiston or the ether. It plays no real role. 'Pain' is a mark or noise that a little primate might make to be comforted or medicated. — green flag
Or it's a facet of matter in general. We don't know that consciousness is limited to brains. We don't know what causes it. Often when this is mentioned, the response is that we know that you can be made unconscious by various actions. Actually, all we know is that we don't remember things from that period. Neuroscience says a lot about cognitive functions and their connection to neurons and glial cells and...so on. But that there is awareness/experiencing... is still unexplained. But certainly not in principle. Consciousness is a phenomenon of the brain — Manuel
I burn my hand. I feel pain. I am told by my parents that I must be in pain. I learn to associate the word "pain" with the feeling.
I don't understand what's difficult to understand about this. — Michael
'Pain' is not the name of a beetle. It's the name for a situation approached with aspirin and Novocain and hugs. — green flag
I'm frankly surprised to hear that claim from you. I thought you were down with Wittgenstein. — green flag
You won't like me saying this, but I don't think you've understood the beetle analogy. — green flag
I suppose those born blind don't know anything about color ? — green flag
People born blind can tell you that an object can't be all red and all blue at the same time. — green flag
If everyone in the tribe was blind what would they know about color? — Fooloso4
It doesn't mean that I know what either of those things means. They are literally meaningless terms for me. I just know of them, and that, whatever they are, they're contradictory. — Michael
How conscious are the latest famous bots ? Do they have selves ? — green flag
Q: Are you, ChatGPT, conscious? Are you a self?
A: As an artificial intelligence language model, I am not conscious in the same way that humans are, nor do I have a sense of self. I am a collection of algorithms and data structures that process input and generate output based on that input. While I can simulate conversation and provide helpful responses, I do not have the ability to experience consciousness or self-awareness.
Think you of the fact that a deaf person cannot hear. Then, what deafness may we not all possess? What senses do we lack that we cannot see and cannot hear another world all around us?
(6) What it feels like to be in a reflexive, ongoing, intentional, historicising, projective, story-telling and unitary affective state. What is it like.
I imagine much of the dispute regarding whether neuroscience and its philosophical analysis suffices for an explanation concerns whether (6) should be included in the list. — fdrake
First of all, we don't only analyze first-person experience.... — Nickolasgaspar
'Mental' is just a label we place on properties produced by specific physical processes in the brain. — Nickolasgaspar
There are intractable problems in all branches of science; for Neuroscience a major one is the mystery of subjective personal experience. This is one instance of the famous mind–body problem (Chalmers 1996) concerning the relation of our subjective experience (aka qualia) to neural function. Different visual features (color, size, shape, motion, etc.) are computed by largely distinct neural circuits, but we experience an integrated whole. This is closely related to the problem known as the illusion of a stable visual world (Martinez-Conde et al. 2008).
There is now overwhelming biological and behavioral evidence that the brain contains no stable, high-resolution, full field representation of a visual scene, even though that is what we subjectively experience (Martinez-Conde et al. 2008). The structure of the primate visual system has been mapped in detail (Kaas and Collins 2003) and there is no area that could encode this detailed information. The subjective experience is thus inconsistent with the neural circuitry. ....
Traditionally, the Neural Binding Problem concerns instantaneous perception and does not consider integration over saccades (rapid movement of the eye between fixation points). But in both cases the hard problem is explaining why we experience the world the way we do. As is well known, current science has nothing to say about subjective (phenomenal) experience and this discrepancy between science and experience is also called the “explanatory gap” and “the hard problem” (Chalmers 1996). There is continuing effort to elucidate the neural correlates of conscious experience; these often invoke some version of temporal synchrony as discussed above.
There is a plausible functional story for the stable world illusion. First of all, we do have a (top-down) sense of the space around us that we cannot currently see, based on memory and other sense data—primarily hearing, touch, and smell. Also, since we are heavily visual, it is adaptive to use vision as broadly as possible. Our illusion of a full field, high resolution image depends on peripheral vision—to see this, just block part of your peripheral field with one hand. Immediately, you lose the illusion that you are seeing the blocked sector. When we also consider change blindness, a simple and plausible story emerges. Our visual system (somehow) relies on the fact that the periphery is very sensitive to change. As long as no change is detected it is safe to assume that nothing is significantly altered in the parts of the visual field not currently attended.
But this functional story tells nothing about the neural mechanisms that support this magic. What we do know is that there is no place in the brain where there could be a direct neural encoding of the illusory detailed scene (Kaas and Collins 2003).
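To make the quoted functional story a bit more concrete, here is a toy sketch of the heuristic it describes: keep a cheap cached impression of the scene and only re-sample a region in detail when a coarse change detector (standing in for the change-sensitive periphery) flags it. This is not code from the quoted paper; the grid size, threshold, and function names are all invented for illustration.

```python
import numpy as np

# Toy model of the "stable world" heuristic: assume the cached percept
# still holds unless the change-sensitive periphery flags a region.
# All parameters below are made up for this sketch.

rng = np.random.default_rng(0)

GRID = 8                 # scene divided into an 8x8 grid of regions
CHANGE_THRESHOLD = 0.2   # how large a change must be before it is re-attended

scene = rng.random((GRID, GRID))   # the "world"
cached_percept = scene.copy()      # what the observer assumes is out there

def world_changes(scene):
    """Perturb a few random regions, as the world does between fixations."""
    for _ in range(3):
        i, j = rng.integers(0, GRID, size=2)
        scene[i, j] = rng.random()
    return scene

def peripheral_change_detected(scene, cached, threshold=CHANGE_THRESHOLD):
    """Cheap, coarse check: which regions differ noticeably from the cache?"""
    return np.abs(scene - cached) > threshold

for step in range(5):
    scene = world_changes(scene)
    flagged = peripheral_change_detected(scene, cached_percept)
    # Only flagged regions are "re-fixated" and updated in detail;
    # everything else is assumed unchanged, which is exactly where
    # change blindness can creep in for sub-threshold changes.
    cached_percept[flagged] = scene[flagged]
    missed = np.abs(scene - cached_percept) > 0
    print(f"step {step}: re-attended {flagged.sum()} regions, "
          f"{missed.sum()} sub-threshold changes went unnoticed")
```

The point of the sketch is only that such a strategy gives the subjective impression of a full, stable scene while storing far less than a full high-resolution representation, which is consistent with the quoted claim that no brain area encodes the detailed scene we seem to experience.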