• Antony Nickles
    1.1k
    the self is a thing just like any other thing. It comes into existence just like every other thing, by being thought of, conceptualized, by a person.T Clark

    I’m not sure whether we just disagree or whether you misunderstand my point. Not everything in the world is an object, like a tree; some are activities, like pointing or apologizing; some are concepts, like justice, truth, etc.; some are logical conceptualizations, like Plato's forms or Kant's thing-in-itself. The idea of a physical or causal "consciousness" is a manifestation of our need for something specific, knowable. The act of naming, as Tzu references, is not the only way language works (we, more generally, particularize, which is I think more to his point). And “consciousness” is not an object. The question is: what matters to us in wanting it to be one? Because, if it is an object, we can know it, be certain about it, and ascribe causality and intention and “thoughts” to it; also, if I have a consciousness, and others do, then we have something certain in “knowing” other minds, say, their pain.

    I don't think consciousness handles intention and judgement, it just attaches meaning, labels to them using language.T Clark

    The idea that “consciousness” “attaches” something to words, or “uses” words, is just the desire to have control over the meaning of what we say by internalizing how language works. It's as if: because we have experiences, and we can choose words, all of language is us putting words to what we are aware of, like labeling it.

    I'll just say that none of the functions of the body amounts to the mathematical, factual/physical certainty that we want for "consciousness"--for ourselves or others; philosophy created this picture, and science chases the image.
  • I like sushi
    4.9k
    My post was clear, precise and gave reasoning and a suggested resource for people to look at. No opinions.

    If people do not wish to take it seriously (like yourself), that's not my issue.

    Bye
  • T Clark
    14k
    No opinions.I like sushi

    This is clearly not true.
  • Isaac
    10.3k
    I think assigning a specific evolutionary purpose to consciousness is unjustified.T Clark

    Can you expand on that? Is this something specific to consciousness, or do you think it equally unjustified to assign an evolutionary purpose to osmosis, or active sodium ion transportation?
  • fdrake
    6.7k
    If one of the many 'consciousness mysterians' were to say that the question of "how/why the brain produces consciousness" is unanswered and then go on to give what would count as an answer from their own definition of function - say "I'm expecting to see how consciousness carries out some function and by 'function' I mean..." - then we'd at least have something to discuss. But as it stands, the discussion still seems little more than "Ohhh, isn't it weird, man".Isaac

    Eh, being able to articulate the problem entirely, or even partially, in functional terms is also contested. You don't need to deny the idea that the body is productive of mental states to be against a functional account of mental states. By my reckoning, the terms you've set out already skew the terrain of debate. I'm not convinced that non-functional/mechanical terms of debate are appropriate, but it's still a distortion of the terrain to demand that the debate occur in your favoured set of background assumptions. Albeit that distortion is also an important move in the debate!
  • Isaac
    10.3k


    You're missing my point, it's not that we must look at this from a functional point of view, that was just an example, it's that we must look at this from some point of view. It's not sufficient to be dissatisfied with answers given from one perspective (functional ones serving as the example here) without saying why or how those accounts are unsatisfactory, what are they missing?

    If neuroscience doesn't explain consciousness by reference to functions, why not? What is it that such an account is missing?

    I'm not asking that question rhetorically (in support of a functionalist account); I'm asking it literally, in support of a coherent account. There needs to be some ground for satisfaction with 'why?' questions. It doesn't matter what it is, but it's nonsensical, in my view, to claim dissatisfaction with an answer without being able to explain what that answer is missing. Anything less is just carping.

    All I have so far from people dissatisfied with the evolutionary, functional account (for example) is a repeat of the question "...but why?" We can repeat that question ad infinitum on absolutely any subject. We don't, because we have grounds on which we are satisfied.

    Without such ground the investigation is anchorless.
  • bert1
    2k
    Sorry, everyone, I'm still catching up. I'm only on page 5. I'll probably be jumping around a bit.

    There are many papers that explain how personal experiences arise from brain function, how pathology, physical injury and intoxication/physical condition can affect their quality, and how we are able to diagnose and repair problematic states of consciousness.Nickolasgaspar

    Yes, every position on consciousness, including the most woo of the magic-woo ones involving invisible ghostly ectoplasm and cosmic fairies, understands that alterations to brain function affect what we experience. Of course they do. No one denies that. Getting drunk affects what we experience, as does getting hit in the head, as does receiving sensory input of any kind, etc., etc. The issue is how the capacity to experience anything at all gets there in the first place. That's the contested bit. One way to bring out this distinction is to contrast consciousness simpliciter with what we are conscious of.
  • bert1
    2k
    I would say that consciousness causes (some) behaviour, not that (some) behaviour is consciousness. As I mentioned before, I can think many things that I never "manifest" in behaviour.Michael

    I broadly agree. Some (I forget the name of the guy I'm thinking of) say that there are no absolutely private facts about your experience. They say it is possible in principle (even if we don't yet have the tech) to access all the facts about your experience by examining brain function. However, the guy I'm thinking of (I'll find out who it is) says that nevertheless, there are two ways of accessing these facts. Of course, I think this 'two ways' is just another acknowledgement of privacy. My way of accessing my experience is private, even if I can't ultimately keep a secret in the face of a mind-scanner.
  • fdrake
    6.7k
    You're missing my point, it's not that we must look at this from a functional point of view, that was just an example, it's that we must look at this from some point of view. It's not sufficient to be dissatisfied with answers given from one perspective (functional ones serving as the example here) without saying why or how those accounts are unsatisfactory, what are they missing?Isaac

    I should have been clearer.

    If neuroscience doesn't explain consciousness by reference to functions, why not? What is it that such an account is missing?Isaac

    There are two other reasons I can think of why any proposed account could fail: internal inconsistency and category error. As far as I know, people like Strawson take the former approach, and people like Chalmers take the latter. A criticism which points out a category error or internal inconsistency doesn't necessarily make someone need to answer the "what's missing?" question to maintain the coherency of their criticism.

    If someone says "It's missing X", then you end up in a discussion about what X is, and I think that's where your comment is usually applied.

    If someone says "It's missing something which plays The Role of X, or allows the role of X to be played", then we end up in a different discussion. You don't need a (token) positive account of X to demonstrate a need for an account of the role of X (type). So someone need not rely on a particular account of qualia, or a particular account of representation, to argue against eg. a functional perspective precluding something which plays the role of qualia (first person experience, private, intrinsic, nonrelational, internal causation). In other domains from qualia, there could be a similar type of dispute between some functionalists and some representationalists - you don't necessarily need to care about intentions for a functionalist account to make sense.

    E.g. a belief that P is a tendency to act as if P vs a belief that P is an attitude toward P. Someone might say about the former account:

    "I understand that the person's behaviour in the first and second case may be the same, however behaviours can be determined by different processes - the first account says that beliefs are entirely behavioural and doesn't speak at all about their content of their belief state. How then can you develop a criterion that two people can behave as if P without invoking a non-behavioural invariant that allows us to tell P from "by all appearances, P".

    Something like that!
  • dimosthenis9
    846


    That is a really, really good argument. Your thread makes a serious point. Nice.

    Maybe consciousness can't be defined as strictly brain function, as in the "walking example", because it is not strictly the brain that generates it.
    First, even the brain has to interact with the external world for consciousness to take place, and also with the function of the whole body.

    So maybe consciousness isn't produced only by the brain but by the whole body, and the brain just plays the central role in all that function, as in everything. The "coordinator role", let's say.
    In fact, not even walking is just "what legs do", if you consider it, because the brain is also involved in it, as are other human organs.

    What I mean is that it might be better to start examining consciousness as a whole-body effect and detach it from the typical theory that connects it only with the brain, and from all that endless mind-brain debate. Well, anyway, it is only a hypothesis.
    The only thing I disagree with is the claim that neuroscience has nothing to say about consciousness.
    Even if the brain is not the only factor, it surely plays a crucial role in consciousness, and neuroscience studies the brain.
    So if not neuroscience, then who would have a better say on that? Let's not be aphoristic.

    But again, nice thread and a great argument about the actual definition of consciousness that exists even in dictionaries.
  • Janus
    16.5k
    If 'orange' is understood to refer to a quale, then to whose quale is it supposed to refer ?plaque flag

    In my view 'orange' doesn't refer to a quale, but to a kind of experience or perception. It's not a precise term, which is obvious if you look at a colour wheel that shows colour gradations. I'd say there is nothing precise about language, but its binaries foster the illusion of precision.

    I don't think reference is dependent on precision, but on our understanding; words refer to things because we understand them to refer, and that's all there is to it.

    A person might then say that 'orange' refers to my quale. And then you get it and link it to yours. So it has two references that might be the same, one can never tell.plaque flag

    If I have seen something orange, say an orange, and I say "That orange is a very intense orange" the sentence refers to an orange I have seen, and the second time the word 'orange' is used in that sentence it refers to the colour of that orange.

    Say the orange is in front of us and you then say "I agree, the orange of that orange is very intense"; then you are also referring to a quality of the colour of the orange. Our actual perceptions are private and can never be compared except via talk, which means they can only vaguely or imprecisely be compared.

    I have no reason to think that your perception of the orange is very much different from mine, and I have only your similar description as reason to think that your experience was like mine. I might ask whether you see the colour of the orange as tending towards yellow or red, and if I see it as tending towards yellow and you say you see it tending towards yellow, then we have more reason to think our perceptions of the orange are similar.

    One can interpret things that way. I don't think it's obvious.plaque flag

    So the sentence " I saw an orange afterimage", just taken as a bare sentence, isn't abstractly referring to a certain kind of experience? What do you take the sentence to mean?

    Yes. And we can just watch interactions. On this forum, I can tell (I am convinced) that other people grasp Wittgenstein's later work the way I do. And we read one philosopher about another too, which possibly changes, all at the same time, what we think about the author, the philosopher being commented upon, and ourselves.plaque flag

    If you can tell something as complex as how others grasp Wittgenstein's later work the "way you do", should it not be much easier to tell if someone's perception of a colour is the way you perceive it?

    I don't even want THC these days. It'd probably be fine, maybe fun, but I don't bother to seek it out.plaque flag

    All drugs, including caffeine, alcohol, THC, psychedelics, etc., alter how we see things. I still find that interesting, although I don't explore these kinds of experiences at the expense of my health.
  • plaque flag
    2.7k
    then we have more reason to think our perceptions of the orange are similar.Janus

    I don't think so, however admittedly tempting this sounds. If 'true orangeness' is hidden, we have no data whatsoever for supporting such a hypothesis. I don't see logic but only a comfortable and familiar prejudice.
  • plaque flag
    2.7k
    So the sentence " I saw an orange afterimage", just taken as a bare sentence, isn't abstractly referring to a certain kind of experience? What do you take the sentence to mean?Janus

    We might ask what 'experience' is supposed to mean. It's the motte / bailey thing. In ordinary language, we have mentalistic talk of how things felt to Jack and Jill. No problem. But 'pure' qualia are problematic. Concepts are etymologically grips. Well, we can't get a grip on them. We can just train kids to make reliable reports. In the presence of 'this' paint sample, we train them to say 'red' or 'blue.' The qualia crowd tends to insist that no one can see through their eyes and check for an inverted spectrum. This sounds right, but it's pretty weird. People want to say what they also say can't be said.
  • plaque flag
    2.7k
    If you can tell something as complex as how others grasp Wittgenstein's later work the "way you do", should it not be much easier to tell if someone's perception of a colour is the way you perceive it?Janus

    In the ordinary sense, of course. Motte and bailey. We need not introduce internal images, though this is tempting in ordinary contexts, given popular metaphors like the mind is a container.

    The point is that 'secret' qualia as referents make for an impossible semantics. It's 'obvious' once one grasps it (switches metaphors?). Such a semantics assumes telepathy without realizing it --- or an equivalent set of immaterial states that are induced in a mass-produced Model T ghost haunting pineal glands when afflicted by various electromagnetic waves. This may be a parody, but how wide of the mark is it?
  • plaque flag
    2.7k
    All drugs, including caffeine, alcohol, THC, psychedelics, etc., alter how we see things.Janus

    :up:

    To me this also points toward that lack in our lifeworld of 'pure' mentality and its shadow 'pure' materiality.
  • T Clark
    14k
    Can you expand on that? Is this something specific to consciousness, or do you think it equally unjustified to assign an evolutionary purpose to osmosis, or active sodium ion transportation?Isaac

    Sorry to take so long to respond. I didn't get a notice.

    There's a long history in evolutionary biology of people off-handedly assigning evolutionary reasons why certain traits were selected with no evidence. A famous example was the claim that mallard ducks' sexual practices include males forcing females to have sex. People claimed that that trait was genetically controlled and could be the reason human males rape human females. So, my criticism wasn't about the particular example you selected, but about the process of assigning evolutionary justifications in general. Obviously, eyes evolved so we could see. Many other traits have much less obvious purposes or even no purposes at all.
  • Janus
    16.5k
    I don't think so, however admittedly tempting this sounds. If 'true orangeness' is hidden, we have no data whatsoever for supporting such a hypothesis. I don't see logic but only a comfortable and familiar prejudice.plaque flag

    The idea that our perceptions of any colour are likely similar (leaving aside that some people are colour-blind) is based not only on questions we might ask each other about whether the colour seems closer to one or the other of the colours on either side of it on the colour spectrum, but also on the well-observed structural commonalities of the human visual system.

    But 'pure' qualia are problematicplaque flag

    I haven't mentioned qualia except to say I don't favour the idea.

    People want to say what they also say can't be said.plaque flag

    What cannot be said cannot be said, obviously. However, that there is that which cannot be said can be said without contradiction or inconsistency. I also happen to think that the fact that there is that which cannot be said is perhaps the most important fact about being human.

    We might ask what 'experience' is supposed to mean.plaque flag

    You might...but why would you when you know what it means as well as you know what any word means?

    We need not introduce internal images, though this is tempting in ordinary contexts, given popular metaphors like the mind is a container.plaque flag

    I think most of us have experienced internal images and internal dialogue, judging from what others have said to me. I don't think of the mind as a container; it doesn't contain sensations, feelings, thoughts and images, it is sensations, feelings, thoughts and images.

    It's 'obvious' once one grasps it (switches metaphors?).plaque flag

    I hope you are not falling into the dogmatic illusion that those who don't agree with you have failed to understand. There are already a few here afflicted with that particular prejudice.

    This may be a parody, but how wide of the mark is it?plaque flag

    Very wide, I'd say. Again all I've said is that our perceptions are not accessible to others other than by means of what we tell them. Perception is private, but it is talked about in a public language; a fact which would only be possible if there were a good degree of commonality. So, most of us feel pain, see colour, taste food, visualize, and so on. These are all experiences, and the only way others can know about them is if we tell them.

    To me this also points toward that lack in our lifeworld of 'pure' mentality and its shadow 'pure' materiality.plaque flag

    Right, but I haven't anywhere claimed, or even suggested, that there is any such thing as "pure mentality" or "pure materiality".
  • plaque flag
    2.7k
    but also on the well-observed structural commonalities of the human visual system.Janus

    But this proves nothing. No data. All we have publicly is word use and structural commonalities. No hidden states can play a role in making this case.
  • plaque flag
    2.7k
    it is sensations, feelings, thoughts and images.Janus

    And what are these? You go on to say:

    haven't anywhere claimed, or even suggested, that there is any such thing as "pure mentality" or "pure materiality".Janus

    If feelings / thoughts / images have no 'purely mental' component, why shouldn't a bot have them simply by meeting the same public criteria? Do you see what I mean?

    Again all I've said is that our perceptions are not accessible to others other than by means of what we tell them.

    So, most of us feel pain, see colour, taste food, visualize, and so on. These are all experiences, and the only way others can know about them is if we tell them.
    Janus

    Why couldn't someone with the right technology know my perception better than I do? Unless this inaccessibility is 'purely mental' or 'immaterial'?

    Perception is private, but it is talked about in a public language; a fact which would only be possible if there were a good degree of commonality.Janus

    Bots already are, or are on the way to being, better conversationalists than we are. So clearly even typical human sense organs are not necessary. You seem to be implying something like the same universal set of nevertheless private referents as what makes cooperative sign use possible, basically repeating Aristotle's assumption. In other words, you seem to assume that we all automatically / directly know what 'pain' means but not when this or that other person is 'in' pain.
  • plaque flag
    2.7k
    I also happen to think that the fact that there is that which cannot be said is perhaps the most important fact about being human.Janus

    :up:

    This is why I'm interested in the question of the meaning of being. This is what I think others are trying to say with the hard problem, though I think they take too much for granted.
  • Isaac
    10.3k


    OK, what you're saying makes a lot of sense, but I'm having trouble applying it to the "Why?" and "How?" questions that were posed originally here.

    If I understand you correctly, you're saying that an account (such as neuroscience might give) might be missing a component which serves a function in the coherence of an account, and we know it's missing by its function, not by its identity?

    Like if I proposed the function...



    ... you'd be able to say there's something missing without being able to say what it is, simply because there needs to be a denominator?

    Yet if one asks "why do we have consciousness?" I think the answer needs to consist of a set of satisfactory reasons, simply by the structure of the question, no?

    And so if a set of reasons is given, they can only be rejected on two grounds: they're not reasons, or they're not satisfactory.

    Evolutionary or functional accounts are clearly reasons, so it must be that they're not satisfactory - which is presumably where this missing type comes in. There's some component a satisfactory reason has (which we might not know the token of in this instance) which is missing.

    The trouble is I'm not even getting what the category is when specifically related to the question "why do we have consciousness?", or "how do neurons produce consciousness?".

    If I could at least get as far as understanding the type of measure of satisfaction missing, that would be progress. The kind of reason that would suffice. But I'm so far missing even that.
  • Isaac
    10.3k
    There's a long history in evolutionary biology of people off-handedly assigning evolutionary reasons why certain traits were selected with no evidenceT Clark

    Yes, a pretty sorry history, particularly in psychology, where just about everything has at one time been justified because "...that's what we evolved to do".

    And yet...

    I don't think anyone would seriously argue that cells eliminate waste ions for anything other than evolved reasons to do with survival within a niche.

    So I suppose the extent to which one is content with an evolutionary frame is the extent to which one is willing to allow for other influence. With behaviour that might be culture. With anything we might have randomness, or God, or our alien simulation managers...

    For me, I think evolutionary psychology is almost all bollocks. I think that because cultural influences are just too obviously at least a possible factor.

    With consciousness, however, I can't really think of that conflicting influence. We could invoke randomness (it just turned up), but then we'd also have to explain why humans who didn't have it weren't easily able to outbreed those that did.

    We could argue, as Dennett does, that it's an illusion, that there's nothing to find a purpose to. But I dislike defining things away.

    I don't dispute the plausibility of non-evolutionary accounts; they just seem far more complicated, have more loose ends, and don't seem to explain anything that isn't covered in a functional account.
  • universeness
    6.3k
    So maybe consciousness isn't produced only by the brain but by the whole body, and the brain just plays the central role in all that function.dimosthenis9

    If we talk to/observe a human with no legs, would we find some difference in their 'level of consciousness' compared to people with legs?

    We can consider the effects on human consciousness of removing parts of the brain.
    What if a human had only the R-complex of the triune brain model, so no limbic system and no cortex?

    I asked ChatGPT about the R-complex and whether a human would still be self-aware and conscious if the brain consisted ONLY of the R-complex. The replies were:

    The R-complex, also known as the reptilian brain, is a term used to describe the most primitive part of the human brain. It is located at the base of the brain and is responsible for regulating basic survival functions such as breathing, heart rate, and body temperature.

    The R-complex is composed of several interconnected brain structures including the brainstem, cerebellum, and basal ganglia. These structures are responsible for controlling and coordinating movements, balance, and posture.

    In addition to regulating basic survival functions, the R-complex is also involved in instincts such as aggression, territorial behavior, and mating behavior. It is often referred to as the "reptilian" brain because it is thought to be the part of the brain that we share with reptiles and other primitive animals.

    Although the R-complex plays an important role in our survival, it is not capable of complex thought processes and is largely reactive rather than proactive. The more advanced regions of the human brain, such as the neocortex, have evolved to enable us to engage in more complex behaviors such as reasoning, problem-solving, and creativity.


    and

    It is difficult to say with certainty whether we would still be conscious and self-aware if the human brain consisted of the R-complex only. While the R-complex is responsible for regulating basic survival functions, it is not capable of supporting higher-level cognitive processes such as self-reflection and introspection.

    Consciousness and self-awareness are thought to emerge from the complex interactions between different regions of the brain, including the prefrontal cortex, the parietal cortex, and the temporal cortex. These regions are involved in a wide range of functions, such as attention, memory, perception, language, and decision-making.

    Without these higher-level cognitive processes, it is unlikely that we would experience the same level of self-awareness and consciousness that we do with our current brain structure. However, it is important to note that this is purely speculative and hypothetical, as the human brain did not evolve to consist of the R-complex only, and it is difficult to predict the exact nature of consciousness and self-awareness under such conditions.


    To me, this kind of angle to the discussion increases my personal credence level that consciousness IS what the brain does. I will, however, offer a little of the 'interesting' evidence from Rupert Sheldrake in my next post, for his 'morphic resonance' and 'morphic fields' proposal.
  • universeness
    6.3k
    Over the years, Sheldrake has amassed a lot of data, based on his own work and the work of others, from observing rat behaviour in mazes. The most basic measure is how fast a rat can get to food morsels placed in the maze: can they remember the path to the food and pass the knowledge on to their offspring/other rats, etc.? Sheldrake has been building this data over 30+ years of experimenting.

    The offspring certainly did get to the food much faster. BUT much more interesting was his finding that when they used a whole new bunch of rats from the same area, who were not involved in the original experiments and who were not related to the original rats, those rats also found the food/got through the maze much faster than the rats used in the initial experiments. Sheldrake then used this data to suggest that information was getting exchanged by the rats due to morphic resonance via morphic fields. A kind of natural telepathy.

    Supporters of Sheldrake's work, and Sheldrake himself, then repeated the experiments over many years in different countries with completely new rats, and the data he gathered suggested the new rats could get through his mazes much faster on their first ever attempt, compared to the rats he first worked with.

    I am not convinced that such evidence proves that information can be passed between 'conscious' creatures via morphic resonance and morphic fields. But even if it does, that does not mean consciousness is not 'what the brain does'; it would mean that perhaps information can be passed/correlated via some quantum phenomena such as entanglement (as Sheldrake himself has suggested).
  • Pantagruel
    3.4k
    I am not convinced that such evidence proves that information can be passed between 'conscious' creatures via morphic resonance and morphic fields. But even if it does, that does not mean consciousness is not 'what the brain does'; it would mean that perhaps information can be passed/correlated via some quantum phenomena such as entanglement (as Sheldrake himself has suggested).universeness

    Yes, and/or that information is a naturalistic feature. If there is an 'information manifold,' however, it would seem to prima facie vastly expand, not contract, the scope of the science of consciousness.
  • dimosthenis9
    846
    If we talk to/observe a human with no legs, would we find some difference in their 'level of consciousness' compared to people with legs?universeness

    Well, they would simply have a different consciousness, as every person does in general. If by difference you mean a lower level of consciousness for those with no legs, then of course not.

    We can consider the effects on human consciousness of removing parts of the brain.universeness

    Yeah, but consider also the brain without a heart to support it.

    The rest you posted is extremely interesting and really informative. I had little knowledge of the R-complex, and now you've made me want to investigate it more.

    I mostly agree with the conclusion that consciousness is a phenomenon of constant interaction between different areas and cannot be located in just one specific place in the brain. I believe that myself.

    My only guess is that this interaction, which makes the phenomenon of consciousness emerge, is among all body functions. And yes, the brain plays a huge role in coordinating them.
    But as I mentioned before, nothing can stand on its own in the human body, not even the brain.
    It is the interaction of everything that makes it happen.
    But it's only my hypothesis. That does not make it true.

    it would mean that perhaps information can be passed/correlated via some quantum phenomena such as entanglement (as Sheldrake himself has suggested).universeness

    And that is also a nice hypothesis.
  • universeness
    6.3k
    Yes, and/or that information is a naturalistic feature. If there is an 'information manifold,' however, it would seem to prima facie vastly expand, not contract, the scope of the science of consciousness.Pantagruel

    Yeah, but even if all that Sheldrake claims eventually turns out to be true, how much would that increase the personal credence level YOU assign to something such as panpsychism?
    For me, my answer would be: not much! I still have a credence level of around 1%.
  • universeness
    6.3k
    Well, they would simply have a different consciousness, as every person does in general. If by difference you mean a lower level of consciousness for those with no legs, then of course not.dimosthenis9

    Then how can there be any consciousness in the body, if we can remove so much of it, without becoming a less conscious creature?

    Yeah, but consider also the brain without a heart to support it.dimosthenis9

    The heart is just a big blood pump. Does a person kept alive with an artificial heart (this can only be done for a short time at the moment) have a reduced experience of consciousness? I mean, do you think their cortex would have a reduced ability to play its role in perception, awareness, thought, memory, cognition, etc., due to having an artificial blood pump instead of a natural one (such as a heart transplant)?

    My only guess is that this interaction, which makes the phenomenon of consciousness emerge, is among all body functions. And yes, the brain plays a huge role in coordinating them.
    But as I mentioned before, nothing can stand on its own in the human body, not even the brain.
    It is the interaction of everything that makes it happen.
    But it's only my hypothesis. That does not make it true.
    dimosthenis9
    Your last two sentences above are true for all of us posting on this thread, so that's a given imo.
    BUT do you therefore think that if, before you die, we could take out your brain and connect it to a fully cybernetic body, there is no way and no sense in which the creature produced would still be you?
    Still be your 'consciousness'?
  • Dawnstorm
    249
    So I suppose the extent to which one is content with an evolutionary frame is the extent to which one is willing to allow for other influence.Isaac

    No, I think by the time you're arbitrating between evolution or alternatives, you've already resolved the problem, or put it on hold. It's no longer relevant. The problem is conceptual and starts before that; at the conceptual stage. [EDIT: I'll leave the previous sentence in; I like it in all its ineptness. It perfectly expresses the muddle I feel when confronted with this problem.]

    It's a very hard-to-grasp concept, which is why we help ourselves with concepts such as p-zombies. A p-zombie and a person with first-person experience would both behave the same, and thus share the same evolution. What sort of test could we devise to tell if one is a p-zombie or not? If p-zombies are impossible, how can we conceptualise evidence for this?

    P-zombies aren't the point. They're a wishy-washy pin-point of some intuitive niggle people have. But the niggle's there. And the problem's bigger: there's a continuum that starts with solipsism and ends in pan-psychism.

    I personally am a mysterian - in the context of science: I don't think it's possible to resolve that, because for first person experience (the ultimate subjectivity possible) there's only ever a sample-size of one; and the sample possible is different for each scientist (always only their own). Normally, talking objectively about subjectivity is not a problem; subjects can be operationalised so you can talk about them. Neuroscience is definitely evidence of that. But you can't do that for the hard problem; the empirical substratum this is about goes away if you do. You assume the outcome one way or another and go on to more interesting questions (the evolution of brainstuff being one of them).

    I think the doubling of bodies as something you have and something you are is relevant here (Helmuth Plessner has talked about that, I think; it's been years and I've forgotten too much). When I say that my keyboard is made up of atoms, I can conceptualise this as a matter of scale. It's easy. When I say that consciousness is made up of neural activity (which is my default working assumption), all I have is a correlation; the nature of the connection eludes me. Given that I tend to figure stuff out by comparison, and given that I'll never be able to entertain more than one first-person-experience at a time, I suspect it'll continue to elude me.

    (For what it's worth, the hard problem is little more than an interesting curiosity to me. When viewed as a problem it's hard, but for me it's hardly a problem.)
  • Pantagruel
    3.4k
    Yeah, but even if all that Sheldrake claims eventually turns out to be true, how much would that increase the personal credence level YOU assign to something such as panpsychism?
    For me, my answer would be: not much! I still have a credence level of around 1%.
    universeness

    All that would have to be true is that somehow information is affecting reality (which it clearly does) and is capable of being stored in such a way that it is not trivially evident, but is accessible and amenable to neural processing. And this is precisely how neural networks function: by processing inputs in hidden layers, with respect to confirming/disconfirming feedback, they exploit abstract relationships without necessarily even identifying what those are. In which case, social consciousness, hive mind, even panpsychism aren't in any way mysterious or non-scientific.
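    To make that "hidden layers exploiting unnamed abstract relationships" point concrete, here is a minimal sketch, assuming only a one-hidden-layer network and XOR standing in for the abstract relationship; the network size, learning rate, and task are illustrative choices, not anything specified in the thread:

```python
# Minimal illustration (a hypothetical example, not anyone's proposed model of consciousness):
# a tiny feedforward network trained with confirming/disconfirming feedback learns XOR,
# a relation that is never named anywhere in the data or the code.
import numpy as np

rng = np.random.default_rng(0)

# Four input patterns and their targets: the XOR of the two input bits.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 4 units, one output unit.
W1 = rng.normal(size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))
b2 = np.zeros(1)

lr = 1.0
for step in range(5000):
    # Forward pass: the hidden layer builds its own intermediate features.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # "Confirming/disconfirming feedback": the output error pushed back through the layers.
    err = out - y                          # gradient of squared error w.r.t. the output
    d_out = err * out * (1 - out)          # back through the output sigmoid
    d_h = (d_out @ W2.T) * h * (1 - h)     # back through the hidden sigmoid

    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)

# Typically converges to roughly [[0], [1], [1], [0]]: the XOR relation has been
# captured by the hidden units without ever being identified as "XOR".
print(np.round(out, 2))
```

    The network is only ever told "right" or "wrong" about its outputs, yet the hidden layer ends up encoding the abstract relation that makes those outputs predictable - which is all the paragraph above requires of "exploiting abstract relationships without identifying them".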