• Dfpolis
    1.3k
    For those of us who are not physicalists, it is fairly uncontroversial that conscious perception involves awareness of neurally encoded contents. That is certainly how I think of it. Yet, this simple schema turns out to be highly problematic.

    In Descartes' Error, neurologist Antonio Damasio argues that our knowledge of the external world started as neural representations of body state and evolved into representations of the external world as the source of changes in our body state:
    ... to ensure body survival as effectively as possible, nature, I suggest, stumbled on a highly effective solution: representing the outside world in terms of the modifications it causes in the body proper, that is representing the environment by modifying the primordial representations of the body proper whenever an interaction between organism and environment takes place. (p. 230) — Antonio Damasio
    I see three problems with this otherwise plausible hypothesis: (1) It requires one neural state to encode multiple concepts. (2) There seems to be no mechanism by which this "solution" could have evolved. (3) Neural states do not represent as other signs do.

    First, on this view, one physical state (the object's modification of our body/neural state) represents two intelligible states (the object sensed and its effect on us). For example, in seeing an apple our retinal state is modified, so the same neural signal represents both the apple and the fact that our retina has been modified. Consequently, there is no neural basis for distinguishing data on the object from data on the subject.

    The reason neural signals can provide no data by which we can distinguish subject and object is the identity of action and passion first noted by Aristotle: the subject sensing the object is identically the object being sensed by the subject. Given this ontological identity, there is no way to pry apart data on the subject as sensing and the object as sensed.

    Clearly, sensory signals can encode different notes of intelligibility. For example, our complex perception of a ball encodes its matter and form differently. We can see and feel its sphericity, while squeezing it supports the notion of rubbery material. That is not the case with conceptualizing the difference between sensed objects and the concomitant changes in body state. There is no difference, even in principle, between a neural message saying we are seeing a red apple and one saying we are seeing (having our bodily state modified by) a red apple.

    I am not arguing for solipsism. I take as a given that we are conscious of objects other than ourselves. Rather than questioning this datum, I am trying to understand the dynamics making it possible. It seems to me that grasping this difference requires a direct intuition of the object as an object, as other -- and this, or something functionally equivalent, is missing from our model.

    Second, while this is not a problem for behavioral evolution, it is a problem for understanding intellection. Effective behavior can evolve independently of whether an organism responds to its internal state or to externalities. What a neural signal encodes is immaterial as long as the response to it is biologically effective (evolutionarily fit). However, if consciousness of objects is solely due to awareness of neurally encoded content, we can have no basis for thinking objects are distinct from ourselves. To do so we must grasp an intelligible difference between our self and the object, and there is none in the neural signal.

    This problem is similar to that noted by Alvin Plantinga in his evolutionary argument against naturalism. Both note that what gives evolution its traction, the selection of successful structure and behavior, provides no traction in selecting certain features of mental experience. In Plantinga’s case, it provides no traction in selecting true over false beliefs. Here, it provides no traction in distinguishing self-information from information on the other.

    Third, the idea that neural impulses act as signs glosses over and obscures the dynamics of sensory awareness. Signs are means of knowing. Signifiers are only potential signs unless they actually evoke a thought. Smoke, though a potential sign of fire, is operative only when used to indicate fire. Since knowing is relational, so are signs. In his Ars Logica, the Portuguese Dominican John of St. Thomas (John Poinsot, 1589-1644) distinguishes formal and instrumental signs. An instrumental sign requires that we understand the sign’s own nature before it can signify. A formal sign does not. Instrumental signs are things like smoke, writing, road signs, binary codes, rancid smells, etc. In order to grasp what an instrumental sign signifies, we must first grasp what the sign is in itself. For smoke to signify fire, we must grasp that it is smoke, and not dust or a cloud. For writing to signify, we must first discern the shape of letters or pictographs. If we cannot do so, such signs fail to convey meaning, to act as signs.

    It is common (e.g. in representational theories of mind) to confuse mental signs such as ideas and judgements with instrumental signs such as binary codes, but they are different. Ideas are formal signs. We need not grasp the nature of a formal sign for it to signify. We do not need to grasp that the concept apple is an idea before it can signify apples. Rather, understanding that we know apples, we see that we must be employing an instrument in knowing them. Thus we come, retrospectively, to the notion of an apple idea. The whole reality of a formal sign, all that it does and can do, is signifying.

    Neurally encoded data works in neither of these ways. Neurons encode data in their firing rates, yet we grasp their encoded contents without the faintest idea of their firing rates, or indeed of anything about our brain state. In consciousness, neither our brain state in general, nor our neural firing rates in particular, function as ordinary (instrumental) signs. Since conscious awareness of contents does not involve the proprioception and interpretation of brain states, those states do not function as instrumental signs in providing content to consciousness.
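    (Illustrative aside: a minimal, purely hypothetical sketch of rate coding, to make concrete what "encoding data in firing rates" amounts to. The function names, gain, and time window below are assumptions for illustration, not a model of actual neurons.)

    ```python
    # Toy sketch of rate coding: a stimulus intensity is encoded as a spike
    # count in a time window, and a downstream "decoder" recovers an estimate.
    # All names and parameters are illustrative assumptions.
    import random

    def encode_as_spike_count(intensity, gain=50.0, window_s=0.1):
        """Encode a stimulus intensity (0..1) as an approximately Poisson spike count."""
        expected_spikes = intensity * gain * window_s
        n_trials, p = 1000, expected_spikes / 1000.0
        return sum(1 for _ in range(n_trials) if random.random() < p)

    def decode_intensity(spike_count, gain=50.0, window_s=0.1):
        """Invert the assumed code: spike count back to an estimated intensity."""
        return spike_count / (gain * window_s)

    count = encode_as_spike_count(0.8)
    print(count, decode_intensity(count))
    # The count alone says nothing about apples or retinas; its referent
    # lives entirely in the assumed encoding/decoding convention.
    ```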

    Neither are neural firing rates formal signs in consciousness. Remember that the entire reality of a formal sign, all that it does, is to signify. Firing neurons do more than signify, and when their firing rate is used to determine their meaning in third-person observation, they are not operating as formal signs. (If a neuroscientist were to observe the firing of a neuron and discern what it signified, its firing rate would be an instrumental sign.)

    Despite not fitting into prior semiotic categories, neural firing rates do encode data. A great deal of neurophysiological work supports the conclusion that the data they encode provides us with, or at least supports, the contents of consciousness. So, they function as a kind of hybrid sign: one whose intrinsic nature need not be discerned for it to signify, but which, unlike formal signs, does more than signify.

    Like the problem of distinguishing self-data from object-data, this seems to intimate that we have a capacity to grasp intelligibility that is not fully modeled in our present understanding.
  • christian2017
    1.4k


    I would argue that the ability to feel pain or pleasure is proof of either a divine element (or soul), or proof that the entire universe is one living organism and each of us, including bacteria, is just a subset, or, say, a cell, within that giant organism. Are you familiar with the notion of a collective conscience or collective soul? It's like the entire universe has phantom limb syndrome.
  • Galuchat
    809
    First...There is no difference, even in principle, between a neural message saying we are seeing a red apple and one saying we are seeing (having our bodily state modified by) a red apple.
    It seems to me that grasping this difference requires a direct intuition of the object as an object, as other -- and this, or something functionally equivalent, is missing from our model.
    Dfpolis

    Object and subject are an ontological unity, having epistemological distinctions.

    From a Cognitive viewpoint:
    A neural message is a function of sensation (stimulation-response).
    1) Stimulation is exogenous and/or endogenous stimulus (sensory signal) transduction by receptors, causing a response.
    2) Response is the propagation of action potentials in excitable cells.

    Seeing a red apple is a function of perception (sensory interpretation), specifically: vision.

    The brain processes sensation data, and the mind processes perception data (these are incommensurable levels of abstraction).
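    (Illustrative aside: a minimal sketch of the stimulation / response / perception pipeline just described, treating sensation and perception as deliberately separate levels of abstraction. The class, function names, and numbers are hypothetical, not a model of any actual neural code.)

    ```python
    # Toy pipeline: stimulation (transduction) -> response (propagation)
    # -> perception (interpretation at a different level of abstraction).
    from dataclasses import dataclass

    @dataclass
    class Stimulus:
        modality: str      # e.g. "light"
        magnitude: float   # physical quantity at the receptor

    def transduce(stimulus: Stimulus) -> float:
        """Stimulation: a receptor converts a physical stimulus into a firing rate."""
        return min(100.0, stimulus.magnitude * 10.0)  # arbitrary gain, capped

    def propagate(firing_rate: float) -> float:
        """Response: action potentials propagate; the rate is all that is carried."""
        return firing_rate

    def perceive(firing_rate: float, modality: str) -> str:
        """Perception: interpretation. Nothing in the rate itself labels it
        'apple' vs 'retinal change'; the label comes from this level."""
        return f"seeing something red-ish (rate={firing_rate:.0f} Hz, {modality})"

    signal = propagate(transduce(Stimulus("light", 6.5)))
    print(perceive(signal, "vision"))
    ```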

    From an Ecological viewpoint:
    Gibson, James Jerome. 1977. The Theory of Affordances. In R. Shaw & J. Bransford (eds.). Perceiving, Acting, and Knowing: Toward an Ecological Psychology. Hillsdale, NJ: Lawrence Erlbaum.

    Second...What a neural signal encodes is immaterial as long as the response to it is biologically effective (evolutionarily fit). However, if consciousness of objects is solely due to awareness of neurally encoded content, we can have no basis for thinking objects are distinct from ourselves. To do so we must grasp an intelligible difference between our self and the object, and there is none in the neural signal.Dfpolis

    This is a function of self-awareness development, levels 2 & 3.
    Rochat, Philippe. 2003. "Five Levels of Self-Awareness as They Unfold in Early Life". Consciousness and Cognition 12 (2003): 717–731.

    Third, the idea that neural impulses act as signs glosses over and obscures the dynamics of sensory awareness.Dfpolis

    Biosemioticians would classify a "neural impulse" as a signal type of sign.
  • Dfpolis
    1.3k
    Thank you for your comment,

    While I agree that we have an immaterial aspect that makes us subjects in the subject-object relation of knowing (a soul), the fact that each of us is a different subject, with unique experiences, makes it difficult for me to lend credence to the notion of a collective soul. I do think that there is an immaterial God, and that we can be aware of God via rational proof and direct, mystical experience.
  • Dfpolis
    1.3k
    Thank you also for responding,

    Still, I do not see that anything you said resolves the three issues I raised. Did I miss something?
  • Relativist
    2.5k
    Like the problem of distinguishing self-data from object-data, this seems to intimate that we have a capacity to grasp intelligibility that is not fully modeled in our present understanding.Dfpolis
    I agree with this, and suggest this may just mean we have a problematic paradigm. E.g. reference to "information" seems problematic, because information connotes meaning, and meaning entails (conscious) understanding - which seems circular, and it doesn't seem possible to ground these concepts in something physical. That doesn't prove mind is grounded in the nonphysical, it may just be an inapplicable paradigm.

    Consciousness is that which mediates between stimulus and response. As such, we should consider the evolution of consciousness from the simplest (direct stimulus-response), to increasing complexity, and develop a paradigm that can be applied to the development of mediation processes. As far as I know, this has not been done.
  • Dfpolis
    1.3k
    Thanks for commenting.

    it doesn't seem possible to ground these concepts in something physical.Relativist

    I agree. As I argued last year, I do not think that intentional (mental) realities can be reduced to physical realities.

    That doesn't prove mind is grounded in the nonphysical, it may just be an inapplicable paradigm.Relativist

    "Physical" means now the reality it calls to mind now. Its meaning may change over time (and has), but the present paradigms are based on our conceptual space as it now exists. Changing paradigms involves redefining our conceptual space, and a consequent redefinition of terms such as "physical" and "natural."

    Consciousness is that which mediates between stimulus and response.Relativist

    This seems very behaviorist in conception and inadequate to the data of human mental experience.
  • Relativist
    2.5k
    Physical" means now the reality it calls to mind now. Its meaning may change over time (and has), but the present paradigms are based on our conceptual space as it now exists. Changing paradigms involves redefining our conceptual space, and a consequent redefinition of terms such as "physical" and "natural."Dfpolis
    I don't think it requires redefining "physical" and "natural"; it means reconsidering the nature of our thoughts. A visual image is something distinct from the object seen; it's a functionally accurate representation of the object. In general, our conceptual basis for a thought is based on the way things seem to be, but the seemings may be illusory. It seems as if a concept is a mental object, but when employed in a thought, it may be more accurate to describe it as a particular reaction, or memory of a reaction: process and feeling, rather than object.
  • christian2017
    1.4k


    I'm actually in the middle of something right now. I'll get back to you later. I've been drinking a little bit, so I can't reply with a quick answer. Thanks for the reply.
  • Galuchat
    809
    ...if consciousness of objects is solely due to awareness of neurally encoded content, we can have no basis for thinking objects are distinct from ourselves. To do so we must grasp an intelligible difference between our self and the object, and there is none in the neural signal.Dfpolis
    Signals are not only transmitted from environment to body to mind, but also from mind to body, causing change in the environment. The capacity for motor coordination differentiates object (other) and self in the mind of a sentient being.

    E.g. reference to "information" seems problematic, because information connotes meaning, and meaning entails (conscious) understanding - which seems circular, and it doesn't seem possible to ground these concepts in something physical.Relativist
    Communication (including: data, encoding, code, message, transmission, conveyance, reception, decoding, information) is a good analogy for the sensation process if a physical (as opposed to only semantic) type is acknowledged.
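    (Illustrative aside: a hedged sketch of the communication analogy above, Shannon-style: source, encoder, channel, decoder, destination. The codebook and names are assumptions for illustration; they stand in for whatever physical code sensation actually uses.)

    ```python
    # Toy encode -> transmit -> decode chain. "Information" appears only once
    # a decoder applies the shared code; the transmitted signal by itself is
    # just a string of symbols.
    CODEBOOK = {"red_apple": "101", "green_leaf": "110"}   # encoding convention
    REVERSE = {v: k for k, v in CODEBOOK.items()}          # decoding convention

    def encode(message: str) -> str:
        return CODEBOOK[message]

    def transmit(signal: str) -> str:
        # A physical channel; modeled here as a lossless pass-through.
        return signal

    def decode(signal: str) -> str:
        return REVERSE.get(signal, "unrecognized")

    received = transmit(encode("red_apple"))
    print(decode(received))  # -> red_apple
    ```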
  • Relativist
    2.5k
    Communication (including: data, encoding, code, message, transmission, conveyance, reception, decoding, information) is a good analogy for the sensation process if a physical (as opposed to only semantic) type is acknowledged.Galuchat
    It's a useful analogy in some contexts, but it may not be the best analogy for analyzing the ontology of mind. For example, we aren't going to find a physical structure that corresponds to a packet of data (from perception) or of decomposable information (like the logical constructs that define a concept). That is not sufficient grounds to dismiss physicalism; it may just mean we need a different paradigm.
  • Galuchat
    809
    It's a useful analogy in some contexts, but it may not be the best analogy for analyzing the ontology of mind.Relativist

    I think that mind is an integrated set of organism events which produce an individual's automatic and controlled acts, and so an open sub-system of (at least certain) organisms (e.g., those having a central nervous system). But the ontology of mind is off-topic.
  • Dfpolis
    1.3k
    A visual image is something distinct from the object seen, it's a functionally accurate representation of the object.Relativist

    While I tend to agree with this, it does not explain how we distinguish the object from the subject -- which is the problem I have.

    It seems as if a concept is a mental object, but when employed in a thought, it may be more accurate to describe it as a particular reaction, or memory of a reaction: process and feeling, rather than object.Relativist

    I think I agree. I would say that the concept apple, while often conceived of as a "thing," is simply the act of thinking of apples.
  • Dfpolis
    1.3k
    Signals are not only transmitted from environment to body to mind, but also from mind to body to environment. The capacity for motor coordination differentiates object (other) and self in the mind of a sentient being.Galuchat

    I agree, but how does this allow us to distinguish body states from external states?
  • Galuchat
    809

    Do you mean rather, how does this allow us to distinguish body states from the states of other objects?
  • Relativist
    2.5k
    I am not arguing for solipsism. I take as a given that we are conscious of objects other than ourselves. Rather than questioning this datum, I am trying to understand the dynamics making it possibleDfpolis
    I agree, but how does this allow us to distinguish body states from external states?Dfpolis
    I suggest that it's a consequence of the neural connections being different. Consider how we distinguish the location of a pain in the left knee - it's a consequence of the specific connections from peripheral nerves to specific areas of the central nervous system, wherein we become consciously aware of the pain's location. Even after the pain is gone, the memory of the pain is distinct from other conscious experiences. Visual and auditory information are also unique, and processed through unique neural paths, and this maps to conscious experiences that are also unique.
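    (Illustrative aside: a toy sketch of the "different connections" point, sometimes described as labeled-line coding. The pathway names and mapping are hypothetical; the idea is that physically similar spike trains are experienced differently because they arrive over different dedicated pathways.)

    ```python
    # Toy labeled-line model: the identity of the pathway, not the content of
    # the spike train, differentiates the resulting experience.
    PATHWAY_TO_EXPERIENCE = {
        "left_knee_nociceptor": "pain in the left knee",
        "retina_ganglion":      "a visual event",
        "cochlear_nerve":       "a sound",
    }

    def experience(pathway: str, spike_train: list[int]) -> str:
        # The spike train is identical across pathways in this toy example;
        # only which pathway it arrives on changes the experience.
        return PATHWAY_TO_EXPERIENCE.get(pathway, "unlocalized sensation")

    same_spikes = [1, 0, 1, 1, 0, 1]
    print(experience("left_knee_nociceptor", same_spikes))
    print(experience("retina_ganglion", same_spikes))
    ```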

    You referenced Plantinga, so perhaps you're familiar with "properly basic beliefs". Our "beliefs" about the external world are basic, baked into the mechanism (or support structure) that produces (or supports) consciousness. (I'll add that they are properly basic, because they are a consequence of evolutionary development: a functionally accurate grasp of the external world is advantageous. This is the core of my refutation of his EAAN).
  • Dfpolis
    1.3k
    I think that mind is an integrated set of organism events which produce an individual's automatic and controlled acts, and so an open sub-system of (at least certain) organisms (e.g., those having a central nervous system). But the ontology of mind is off-topic.Galuchat

    It seems to me that subjectivity (being a knowing and willing subject) is essential to the experience of mind. Functionalism does not cut it.
  • Dfpolis
    1.3k
    Do you mean rather, how does this allow us to distinguish body states from the states of other objects?Galuchat

    Yes, that is what I said.
  • Dfpolis
    1.3k
    I suggest that it's a consequence of the neural connections being different.Relativist

    Different how? To take your example, how do I distinguish a signal indicating the existence of a condition causing pain from a signal that says only that a pain receptor is firing? Since they are one and the same signal, I do not see how I can.
  • Zelebg
    626
    For those of us who are not physicalists

    Just to make one thing clear. There is no such thing as “immaterial” or “non-physical”, it’s a self-contradiction. Instead, it can be undetectable, either due to our limits or in principle, but it must be made of something; otherwise it is made of nothing, and that means it does not exist.

    So, if there is such a thing as a soul, it must be physical; it’s the only logically valid semantics. It’s just that the substance it is made of is for some reason invisible to us. Any other claim about the ontology of the “immaterial” is a paradox, simply gibberish. Can we all agree?
  • Relativist
    2.5k
    how do I distinguish a signal indicating the existence of a condition causing pain from a signal that says only that a pain receptor is firing? Since they are one and the same signal, I do not see how I can.Dfpolis
    When a pain receptor is fired, the mind experiences it as the quale "pain". That is the nature of the mental experience. In effect, the signal passes through a transducer that converts the physical signal into a mental experience.
  • Zelebg
    626
    In Descartes' Error, neurologist Antonio Damasio argues that our knowledge of the external world started as neural representations of body state and evolved into representations of the external world as the source of changes in our body state:

    It’s most accurate and pragmatic to call it “virtual reality”, a sort of simulation, while keeping in mind that this does not necessarily imply digital computation and computer algorithms as we know them today.


    I see three problems with this otherwise plausible hypothesis: (1) It requires one neural state to encode multiple concepts. (2) There seems to be no mechanism by which this "solution" could have evolved. (3) Neural states do not represent as other signs do.

    From a 3rd person perspective, neural states represent mental content in the form of electromagnetic and chemical signals, just as the virtual reality of simulated content is represented inside a computer in the form of signals between logic gates and other circuits.
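    (Illustrative aside: a toy sketch of the logic-gate analogy. A higher-level "virtual" item here exists only as a pattern over low-level gate outputs plus an interpretive convention; nothing in the sketch is a claim about brains, and all names are hypothetical.)

    ```python
    # A minimal gate-level toy: higher-level operations and "virtual" content
    # exist only as compositions of one low-level gate plus an interpretation.
    def nand(a: int, b: int) -> int:
        return 0 if (a and b) else 1

    def not_(a: int) -> int:
        return nand(a, a)

    def and_(a: int, b: int) -> int:
        return not_(nand(a, b))

    # The "virtual apple" in this toy world is nothing over and above a gate
    # output that an interpreter treats as meaning "apple present".
    apple_bit = and_(1, 1)
    print("apple present" if apple_bit else "no apple")
    ```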

    The mechanics of even the simplest form of the chemical reactions we call “life” are still largely a mystery. We really have no idea how the machine assembled itself, so it’s too optimistic to expect we could yet explain the ghost in the machine. But if you read between the lines of what everyone is talking about and where all the evidence points, this ghost is really just another machine in the machine, but a virtual kind of machine, and that explains everything.
  • Zelebg
    626
    To take your example, how do I distinguish a signal indicating the existence of a condition causing pain from a signal that says only that a pain receptor is firing? Since they are one and the same signal, I do not see how I can.

    Signals, just like words, pictures, and other kinds of representations, are meaningless information by themselves. Meaning comes from the grounding inherent in a decoder / interpreter system, also called personality, identity, ego, self...

    Sentience is a form of understanding, a way of coupling the signal with its meaning, so meanings / feelings are virtual properties, qualia are virtual qualities. Their ontology is virtual like that of Pacman, and in that sense virtual existence offers almost unlimited and arbitrary kinds of different properties or qualities, only visible from the “inside”, or via VR goggles if we ever figure out a way to connect.
  • Dfpolis
    1.3k
    Thanks for your interest.

    Just to make one thing clear. There is no such thing as “immaterial” or “non-physical”, it’s a self-contradiction.Zelebg

    Would you care to show the contradiction? Please define "material" and "existence" and then show that existence entails material. I ask this because on the usual understandings these terms do not mean the same thing.

    Obviously, being immaterial does mean not being made of any kind of matter, but there is no logical reason why something not made of matter can't act, and so exist. For example, my intention to go to the store acts to motivate my motion toward the store. Your argument simply begs the question by assuming, a priori, that everything must be "made of something."

    You might find my discussion "Intentional vs. Material Reality and the Hard Problem" of interest. In it, I show why intentional existence cannot be reduced to physical existence.

    From a 3rd person perspective, neural states represent mental content in the form of electromagnetic and chemical signals, just as the virtual reality of simulated content is represented inside a computer in the form of signals between logic gates and other circuits.Zelebg

    Of course, but what I am discussing is the first person perspective -- how it is that we know the difference between body states and object states.

    so it’s too optimistic to expect we could yet explain the ghost in the machine.Zelebg

    I am not suggesting a ghost in a machine. Rather, unified humans have both physical and intentional operations, and neither is reducible to the other -- just as we cannot reduce the sphericity of a ball to the rubber it is made of.

    Meaning comes from the grounding inherent in a decoder / interpreter system, also called personality, identity, ego, self...Zelebg

    While I agree, this does not solve the problem I am raising.
  • Dfpolis
    1.3k
    When a pain receptor is fired, the mind experiences it as the quale "pain". That is the nature of the mental experience.Relativist

    Yes, it does. How does this allow us to distinguish data on the sensor state from data on the sensed?
  • Zelebg
    626
    For example, my intention to go to the store acts to motivate my motion toward the store.

    Intentions, and other mental states, feelings and qualities, are not immaterial, they are virtual.


    Would you care to show the contradiction? Please define "material" and "existence" and then show that existence entails material. I ask this because on the usual understandings these terms do not mean the same thing.

    Your argument simply begs the question by assuming, a priori, that everything must be "made of something."

    To exist is to be (made of) something rather than nothing. No assumptions, only logic. “Material / physical” is everything that is not nothing, but material existence can also be virtual, not just actual.


    Of course, but what I am discussing is the first person perspective -- how it is that we know the difference between body states and object states.

    While I agree, this does not solve the problem I am raising.

    Can you give examples of what you are talking about?
  • Relativist
    2.5k
    Yes, it does. How does this allow us to distinguish data on the sensor state from data on the sensed?Dfpolis
    As I said, the pain signal (in effect) reaches a transducer which produces the mental state of localized pain. Does this much sound plausible? If so, what is your specific issue?

    If the mind is immaterial, as you assume, the issue seems to be: how do physical, electro-chemical signals produce the related mental states - right? It's not clear what specific issue you're focusing on. I'm just saying there has to be some sort of physical-mental transducer - that's where the magic is (the physical-mental causation).
  • Dfpolis
    1.3k
    Intentions, and other mental states, feelings and qualities, are not immaterial, they are virtual.Zelebg

    I have no idea what this means. "Virtual" usually means "potential." Clearly, my actual intentions are no longer potential.

    To exist is to be (made of) something rather than nothing.Zelebg

    This is begging the question. Clearly, anything that can act in any way exists, and, as I have pointed out, many intentions act to effect motions. Others act to motivate truth claims.

    Can you give examples of what you are talking about?Zelebg

    See the OP. The same signals indicating I am seeing an apple also indicate that my retinal state has changed.
  • Zelebg
    626
    See the OP. The same signals indicating I am seeing an apple also indicate that my retinal state has changed.

    What's the problem?
  • Zelebg
    626
    I have no idea what this means. "Virtual" usually means "potential." Clearly, my actual intentions are no longer potential.

    I said "it’s most accurate and pragmatic to call it “virtual reality”, a sort of simulation".

    https://en.m.wikipedia.org/wiki/Virtual_reality