I am glad that the discussion may see some extra use.

Good subject for a term paper! — Wayfarer
You probably allude to the fundamental inconsistency between the pursuit of philosophy and any denial of being. But, as I said, I am not sure that eliminativists are denying the existence of the mind. I think that they deny any distinction between it and nature - they strip it of transcendence. So far, you did not say what your position on pantheism is. Do you oppose materialism, but tolerate pantheism? Because, if you consider the co-extensiveness of matter and mind that eliminativists prescribe appalling, I assume that you feel the same about pantheists.

It might be of relevance that the origin of the term 'ontology' is derived the first person declension of the Greek verb 'to be' (namely, 'I am'); which has somewhat different connotations from today's definition. — Wayfarer
My point was - assuming one treats the existence of the mind, rather than the body, as a starting point, wouldn't the theory I described be equivalent to that of eliminative materialism? If it isn't, what position would that be called?

Solipsism is dissolved by empathy. — Wayfarer
In retrospect, projection may not have been the right term, because it implies some kind of codomain - a space to project onto. But a brain substate is a very primitive and base notion of awareness that doesn't require it to be separate from the body, and corresponds to the assumption of a medical model of psychology. Wouldn't that satisfy at least some eliminative materialists?

I don't think materialists would acknowledge that. And by asking these questions, you're already outside the reductionist circle. — Wayfarer
This assumes that life can only be realized with a mind-body distinction. That is, that the material world is not capable of being the realization of consciousness.

If you think that 'knowing you're alive' is a matter of faith then there's something the matter with your logic. :wink: — Wayfarer
I generally agree with the attitude of the statement, but wanted to remark that some positions have assumptions that could actually be falsified experimentally. At least to the extent to which experimental information can be trusted. Some could even be considered logically inconsistent. Some of them, indeed, cannot be distinguished through consensus observations. And I personally cannot distinguish some even conceptually. It is a separate issue that a position can be adapted to a new variant in order to survive a striking blow.

Both are met with impossible circumstance: the one cannot prove with apodeictic certainty the mind is nothing but illusion, and the other cannot prove its apodeitically certain reality, so they both fall back on insisting they don’t have to. — Mww
If you are an eliminative materialist, you do not admit the possibility of zapping the qualia away. The closest thing you would have to that is harming your body. Since the eliminativist does not believe in a transcendent mind, you can only suppress their qualia by killing them.

Suppose we developed a machine that zaps your qualia away. You'll still function the same but just without any conscious experience. — RogueAI
The assumption that there is such a machine already renders eliminative materialism wrong, which voids the question. If you are asking whether they would take the chance, without knowing - this would be like a "sell me your soul for a dollar" type of child prank. Some people would refuse on principle.

Would eliminative materialists actually use such a machine? Even if you paid them a lot of money? Or would they view it, as I do, as the equivalent of death? I think, when push comes to shove, you'd have to drag them to it, kicking an screaming. — RogueAI
To me - those are not logically unacceptable consequences. I feel obligated to stoically allow their consideration. The only thing I claim at the moment is that neither possibility appears fallacious. I make some speculations, but primarily in order to expand on their logical content.

Any interest shown in this positive matter and I'll happily roll over and tolerate what strike me as more or less unacceptable consequences of an unbounded spectrum... e.g. conscious phones, insects etc. at one end, and literal talk of mental pictures, concepts, beliefs etc. at the other. — bongo fury
I fear that the scope of the discussion will broaden dangerously if we bring epistemology into the mix. My personal opinion, assuming a materialist point of view - equality can be considered a mostly evolutionary cerebral construct, supporting our ability to forecast and infer conditions in our environment, made possible by the local reproducibility of the natural patterns on a global scale of space and time.

Notice something here. 'Mental states = brain states'. Now, I ask you, what kind of physical object is '='? Where in the physical world, where in nature, do you find anything at all remotely resembling "="? You won't find it, because it relies on abstraction, on assigning values to things, and then saying that ‘this means that, therefore this equals that.’ — Wayfarer
For me, this does not yet falsify eliminative materialism, but makes it a theory that awaits further judgement. Isn't that true for most of philosophy?

What kind of 'brain state' could equal 'equal'? And how would you go about finding that out? Even to ask the question, you have to make a lot of judgements about neural images and incredibly complex data - the brain being the most complex thing known to science. — Wayfarer
The mind-body dualism, I thought. Which I believe they refer to (obviously disparagingly) as the "common-sense" mind. But not the entire experience of life as such.

Ask yourself this question - what does eliminative materialism eliminate? — Wayfarer
If the mind is coextensive with its embodiment's behavior, how can it not be real? (Not that I personally claim that the mind coincides with its embodiment, necessarily.) If the person is a metaphysical solipsist, it wouldn't be real. But then he wouldn't be an eliminative materialist at the same time.

The word ‘mind’ doesn’t correspond to anything real: what we take to be ‘mind’ is simply the snap, crackle and pop of billions of neural connections programmed by Darwinian algorithms for the sole purpose of propagation of the genome. That’s all there is to it. — Wayfarer
Is physicalism a repudiation of mental objects after all, or a theory of them? Does it repudiate the mental state of pain or anger in favor of its physical concomitant, or does it identify the mental state with a state of the physical organism (and so a state of the physical organism with the mental state)?
On the other hand, the same philosophers also claimed that common-sense mental states simply do not exist. But critics pointed out that eliminativists could not have it both ways: either mental states exist and will ultimately be explained in terms of lower-level neurophysiological processes or they do not.
This then refers to the Stanford Encyclopedia of Philosophy here, where the following statement is made:

Modern eliminativists have much more clearly expressed the view that mental phenomena simply do not exist and will eventually be eliminated from people's thinking about the brain in the same way that demons have been eliminated from people's thinking about mental illness and psychopathology.
I still cannot fathom the nuance here. Isn't this just a rephrasing with a different attitude? Unless the first group denies experience and existence. But I doubt it.

Given these two different conceptions, early eliminativists would sometimes offer two different characterizations of their view: (a) There are no mental states, just brain states and, (b) There really are mental states, but they are just brain states (and we will come to view them that way).
I see now. First, let's agree that a vehicle and a vessel have some similarities, such as that they carry cargo and passengers. Of course, their method of transportation differs. Let's say that this aspect is fundamental for the purposes of the analogy. Then, for me at least, a human brain is to an insect brain, or to a plant's perception, more like a ship is to a boat, or a raft. A vehicle and a vessel would compare (in the sense that they are considered functionally different here) more like a person's brain compares to a person's leg. The gradual boundary between the two would be difficult to define indeed.

The analogy was, "when does a vehicle become truly automotive i.e. a true automobile?". — bongo fury
Actually, you can switch the participants as many times as you want, as long as they keep notes of their neuronal state and pass them to their replacement.

For example, humans can only survive for about 80 human years. How long could the giant being survive in human years? — TheHedoMinimalist
I am not claiming soundness, only the following implication - that if machines can develop mental states, and since we can build machines out of people, it follows that mental states can be composed from other mental states with separate experiences. This would apply in the context of eliminative materialism, panpsychism, functionalism, etc. Although I fail to distinguish those very well.

It’s hard for me to comment on consciousness in a scenario which is so alien to me. Either way, I’m skeptical that this thought experiment would imply that ecosystems or social systems might have mental activity. — TheHedoMinimalist
You know his propositions better, but isn't he implying that we are self-aware by construction, and not intrinsically?

But he doesn't, really. He says we appear to be subjects, but the appearance of subjectivity is, in reality, the sum of millions of mindless processes. — Wayfarer
I believe that you are claiming some ontological basis for placing human beings (or at least human organisms) in a distinct category here. There are physical hypotheses for this, e.g. the quantum mind. Or it could be a transcendental assumption, which is generally fine, but speaking in the context of my original question, this would not be acceptable for an eliminative materialist.

Do we hope that this society replaces its vague binary (automotive/non-automotive) with an unbounded spectrum, and stops worrying about whether automotivity is achieved in any particular vehicle that it builds, because everything is guaranteed automotive in some degree? — bongo fury
It may or may not be relevant, depending on your angle here, but from a physics standpoint, processes do not recognize an absolute measure of time. Real time is not a concept for the current theories.

It’s kinda hard to imagine such slow responses would be influenced by mental states. Unless, the being experiences time really fast. But, how would experience time fast with such a slow brain. Having a slow brain doesn’t seem to make time go fast. So, I think it’s more plausible to think that the being is simply not conscious. — TheHedoMinimalist
Reading from the quotes that you kindly provided, I am left wondering what "illusion" means in this context. I understand the general sentiment expressed, and I can see how Daniel Dennett might reject subjectivity as its own substance (mind-body dualism) or intrinsic property (panpsychism). But I do not see how the determination and differentiation of one's self can be considered any more an illusion than other biologically compelled emotions - like hunger or willfulness. We don't call those dysfunctional. In the end, I am not sure that Daniel Dennett considers the concept of self dysfunctional either, since he does accept the consequences of being a subject.

The philosophy of mind that is based on this view, is that the mind is simply the harmonised output of billions of neurons that produce the illusion of subjectivity. — Wayfarer
I understand, but what is the alternative? If anything less than a million neurons is declared not conscious according to some version of materialism, then that one neuron somehow introduces an immense qualitative difference. Which is not apprehensible in the materialist world, where the behavior will be otherwise almost unchanged - i.e. there will be no substantial observable effect. At the million scale, normal genetic variations or aging would be sufficient to alter the presence of consciousness, without significant functional changes otherwise.

I always suspect that (replacement of heap/non-heap by as many different grades of heap as we can possibly distinguish) is a step backwards. — bongo fury
But Mars has no analytico-synthetic capacity, just dynamism. Even in its own time scale, it wouldn't appear as sentient as human beings. I do entertain the panpsychic idea, that simple matter possesses awareness, but of very tenuous quality. Negligible by human standards. Mars does manifest adaptation, but it does not engender assumptions of a complex underlying model of reality.

Otherwise, we might as well conclude that a lifeless rock like Mars is conscious because it’s capable of micro-movements like teutonic plate activity. — TheHedoMinimalist
I don't know really. But according to Wikipedia:

Wasn't that Searle's point? That the test was useless already, because an obvious zombie (an old-style symbolic computer) would potentially pass it? — bongo fury
This seems to me to suggest that John Searle wanted to reject machine sentience in general.

The Chinese room argument holds that a digital computer executing a program cannot have a "mind", "understanding" or "consciousness",[a] regardless of how intelligently or human-like the program may make the computer behave.
For me personally, the value of the discussion is the inspection of the logical arguments used for a given position and the examination of its distinguishing qualities. Without some kind of method of validation, meaning - any kind of quality control, it is difficult to commit. I would like a scale that starts at nothing, increases progressively with the analytico-synthetic capacity of the emergent structures, and reaches its limit at a point of total comprehension, or has no limit. It simply would make interpretations easier.

I agree it is interesting to poll our educated guesses (or to dispute) as to where the consciousness "spectrum" begins (and zombie-ness or complete and indisputable non-consciousness ends). I vote mammals.
Related to that, it might be useful to poll our educated guesses (or to dispute) as to where the zombie "spectrum" ends (and consciousness or complete and indisputable non-zombie-ness begins). I vote humans at 6 months. — bongo fury
I understand. I cannot imagine how eliminative materialists would deny the phenomenology of senses, considering that senses are central to logical empiricism, which I thought was their precursor, but I am not qualified to speak.

I would to start by mentioning that bongo furry corrected me in his earlier comment about the definition of eliminative materialism. I would say that my view is more properly called functionalism rather than eliminative materialism. — TheHedoMinimalist
I would like to contribute to my earlier point with a link to a video displaying vine-like climbing of plants on the surrounding trees in the jungle. While I understand that your argument is not only about appearances, and I agree that analytico-synthetic skills greatly surpass plant life, it still seems unfair to me to award not even a fraction of our sentience to those complex beings.

Well, I’m not sure if plants have mental activity of any sort. This is because plants do not seem to be capable of autonomous action or decision making which is remotely similar to that of humans. They also probably do not possess sufficient energy to support something like mental activity. Plants are more likely to have mental activity than dirt though. This is because dirt doesn’t seem to be sufficiently compact to form an embodied entity which could support a mind. — TheHedoMinimalist
My thinking here is probably inapplicable to philosophy, but I always entertain the idea of a hypothetical method of measurement, a system of inference, and conditions for reproducibility. If we were to observe that our muscles strain when we lift things, and conclude that there is a force compelling objects to the ground, this assertion wouldn't be implausible. Yet, it wouldn't have the aforementioned explanative and analytical qualities. But I acknowledge that philosophy is different from natural sciences.

This is difficult to precisely answer but I would make an educated guess and say enough to form a microscopic insect. I don’t think that my theory has to explain everything precisely in order to be a plausible theory. — TheHedoMinimalist
I don't accept any view at present. I am examining the various positions from a logical standpoint. But, speaking out of sentiment, I am leaning more towards a continuum theory.

The same epistemic difficulties exist for the binary view of consciousness which you accept. — TheHedoMinimalist
The peripheral input could be fed in as slowly as necessary to allow a relaxed scale of time that is comfortable for the human beings involved. This doesn't slow the brain down relative to the sense stimuli it receives, only to time proper. But real time does not appear relevant for the experiment.

The reason why I think that the thought experiment you are providing me is absurd is because humans cannot remotely behave like neurons while maintaining their identity as humans or even humanoid creatures. This is because humans would have to carry out interactions as rapidly as neurons do with unrealistically perfect synchronization. — TheHedoMinimalist
Just to explicate something to which you may have alluded here with the vegetarianism remark. If the hypothesis that bivalves can feel pain is true (which doesn't seem particularly implausible in principle, and I wouldn't eat them either), why wouldn't other reactive systems, such as those of plants, feel pain at some reduced level of awareness?

Edit: To avoid starting another thread, I just wanted to bring up that bivalves can feel pain and therefore have consciousness. This, for me, had partially resulted in a crisis of faith as a vegetarian, but I think may posit something useful for anyone concerned with Philosophy of Mind. That a decentralized network can still be conscious has interesting implications for the field. — thewonder
It shows you how philosophically illiterate I am. At least the Wikipedia article doesn't say - "first proposed by ancient Chinese philosophers". :)

Do you here allude to, or have you just re-invented, the China brain? — bongo fury
Interesting. Thank you.

Also relevant, this speculative theory of composition of consciousnesses. Also it attempts to quantify the kind of complexity of processing with which you (likewise) appear to be proposing to correlate a spectrum of increasingly vivid consciousness. — bongo fury
I know the question wasn't posed to me. But albeit not focusing on AI, I actually am a software engineer by trade, so I thought I could interject. :)

Doesn't ascribing consciousness to any machines with "software" set the bar a bit low? Are you at all impressed by Searle's Chinese Room objection? — bongo fury
I am not proficient enough with pantheist theories to speak of them. But I cannot imagine that any mature philosophy would ascribe the property of mindfulness/awareness to a kilogram of matter in itself. A heavy lifeless planet shouldn't be more intelligent than a significantly lighter human being. I presume one would rather ascribe to matter the potentiality of consciousness, which would then manifest to a different degree in layers through emergent structures. For example, the ability to capture information, perform analytical processing, and produce anticipatory responses could be used as criteria for the realization of this potentiality. I don't understand how an eliminative materialist would differ from a pantheist with respect to any such criterion.

Well, a pantheist or panpsychist believes that all things have mental activity while an eliminative materialist might believe that only things and beings capable of decision making and autonomous action has the capacity to experience things. — TheHedoMinimalist
That is completely fair. But then they must, at least in principle (even if currently unknown), hypothesize a function that maps states of matter to degrees of being aware/conscious/sentient, a set of states which are considered non-sentient, and a boundary between the two. There is nothing incoherent in that, but it poses interesting questions.

They do not deny that it is a spectrum but they don’t have to think that it begins on a molecular level or that all objects are part of the spectrum. — TheHedoMinimalist
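Purely as an illustration of the shape such a hypothesis would take (every name, feature, and number below is invented for the sketch, not drawn from any actual theory), the graded view amounts to postulating a real-valued map over physical states, plus a threshold that carves out the non-sentient set and the boundary between the two:

```python
# Hypothetical sketch: a graded "sentience" measure over physical states.
# The state descriptor and the weighting are invented placeholders; the point
# is only the structure: a map into [0, 1], a threshold, and the resulting
# partition into sentient and non-sentient states.

def sentience_degree(state: dict) -> float:
    """Map a crude description of a physical system to a degree in [0, 1]."""
    # Invented proxy features for "analytico-synthetic capacity".
    integration = state.get("integration", 0.0)        # how unified the system is
    responsiveness = state.get("responsiveness", 0.0)  # anticipatory behavior
    degree = integration * responsiveness
    return max(0.0, min(1.0, degree))

THRESHOLD = 0.1  # the postulated boundary; its placement is the hard question

def is_sentient(state: dict) -> bool:
    return sentience_degree(state) > THRESHOLD

rock = {"integration": 0.9, "responsiveness": 0.0}
plant = {"integration": 0.4, "responsiveness": 0.1}
mammal = {"integration": 0.9, "responsiveness": 0.9}

print(is_sentient(rock), is_sentient(plant), is_sentient(mammal))
# → False False True
```

Note that any such THRESHOLD reproduces the sorites worry raised earlier: an arbitrarily small change in the state can flip the verdict while leaving behavior essentially unchanged.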
I would like to illustrate how I think societies and ecosystems are similar with respect to consciousness using a thought experiment. Suppose that we use a person for each neuron in the brain, and give each person orders to interact with the rest like a neuron would, but using some pre-arranged conventional means of human interaction. We instruct each individual what corresponding neuron state it has initially, such that it matches the one from a living brain (taken at some time instant). Then we also feed the peripheral signals to the central nervous system, as the real brain would have experienced them. At this point, would the people collectively manifest the consciousness of the original brain, as a whole, the way it would have manifested inside the person? Or to put it differently, do eliminative materialists allow for consciousness nesting?

Probably not because ecosystems and social systems are not unified systems in the same way that an organism is. An organism is a unified embodied system which is composed of parts called organ systems which are composed of smaller parts called organs. All these organs work very closely together to maintain the organism. The same cannot be said of social systems. People who are part of a social system sometimes contribute to it and sometimes they don’t and they don’t make their entire existence about the social system. The very concept of a social system or ecosystem is a lot more vague than the concept of an organism. — TheHedoMinimalist
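The substitution in the thought experiment can be sketched minimally (a toy threshold network stands in for the brain; the weights and stimuli are arbitrary inventions): each "person" holds one unit's state and applies the same update rule that the unit would, so the collective trajectory is step-for-step identical to the original network's.

```python
# Toy "China brain": replace each unit of a tiny threshold network with a
# person who holds its state and follows the same update rule by hand.

# Arbitrary fixed weights: WEIGHTS[i][j] = influence of unit j on unit i.
WEIGHTS = [
    [0.0, 1.0, -1.0],
    [0.5, 0.0, 0.5],
    [1.0, -1.0, 0.0],
]

def step(states, external):
    """One synchronous update of the threshold network."""
    nxt = []
    for i, row in enumerate(WEIGHTS):
        total = sum(w * s for w, s in zip(row, states)) + external[i]
        nxt.append(1 if total > 0 else 0)
    return nxt

class Person:
    """A person standing in for one unit: holds its state, applies its rule."""
    def __init__(self, index, initial_state):
        self.index = index
        self.state = initial_state
    def update(self, reported_states, external):
        row = WEIGHTS[self.index]
        total = sum(w * s for w, s in zip(row, reported_states)) + external[self.index]
        self.state = 1 if total > 0 else 0

initial = [1, 0, 1]
stimuli = [[0.1, -0.2, 0.0], [0.0, 0.3, -0.1]]  # made-up "peripheral signals"

# Run the original network.
states = list(initial)
for ext in stimuli:
    states = step(states, ext)

# Run the people, exchanging "notes" (reported states) each round.
people = [Person(i, s) for i, s in enumerate(initial)]
for ext in stimuli:
    notes = [p.state for p in people]   # everyone reports before updating
    for p in people:
        p.update(notes, ext)

assert states == [p.state for p in people]  # trajectories coincide exactly
```

By construction nothing in the behavioral trajectory distinguishes the two realizations; whether the people thereby host a nested consciousness is exactly the question the thought experiment poses.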
The term "abstract" was probably inaccurate, but the idea was to be able to describe all types of conscious structures not by exhaustion, but using a principle. In other words - not to name the human condition as conscious, or the animal one, but to use a rule that incorporates structures of various scales and appearances.

Or to put in simpler terms, assuming the position of eliminative materialism, how would they precisely differentiate our sense experience from any other abstract system, simpler or more complex? — simeonz
I don’t fully understand this question. What do you mean by an abstract system? — TheHedoMinimalist
Let me restate my question. Do you think that a properly functioning brain, in all its biological aspects, can exist without manifesting a mind?

Well, clearly the brain can exist without the mind. People die or sink into a permanent vegetative state. The mind is gone, but the brain continues. As for the mind existing without the brain - life cannot exist without chemical processes. Do you think life is just chemistry. Can you tell the difference between chemistry and biology? If not, I doubt you and I will be able to discuss this subject very productively. — T Clark
Yes, but if I were a materialist, I would claim that the mind is not perceived first hand (such as by itself), but is merely attested to by the brain. And the brain does not always attest to externalities. Sometimes it purports intuitions and emotions. A materialist would then argue, the mind is simply a shared sentiment or concept.

"The mind is a widely observed phenomenon" says everything that needs to be said. Everything is "a widely observed phenomenon." That's how they come to exist for the observers. — T Clark
Yes, but the TV is still a system of leds, liquid crystals, capacitors, antennas, electromagnetic events, etc. The image is an aspect of the end result as seen by the viewer, which is one facet produced by underlying processes. A materialist would argue that the "image quality" is just a term or conceptualization. That there is no "image quality", but just a state of the screen crystals and a number of preceding steps that evoke it.

Sorry - but a leaked capacitor is (I imagine) a piece of metal with goo all over it. Poor color quality is a term applied to an image of something when the color of the image doesn't match the color of the original. They're completely different things. Is an iron bar something different from 10E +24 iron atoms? "Hey, please hand me 10E +24 iron atoms." — T Clark
To say that the mind is distinct from the brain, to me at least, implies that the brain can manifest without a mind, or that the mind can exist separate from its physical embodiment. Otherwise, I feel that they will be simply coextensive.

It's clear to me that the mind is different from the brain. — T Clark
That is why I used the term "common-sense" previously. I meant, that albeit privately experienced, the mind is a widely observed phenomenon. But I still struggle to find the scientific value of this statement.

I guess I'd say "obvious," although I acknowledge that what's obvious to one person isn't to another. — T Clark
But since the facets are related, you might be talking about picture quality, but mean a leaked capacitor. How do you differentiate? Unless you can switch the program broadcast or change the TV. But, for the analogical mind-body case, I think this is the real problem, that it cannot be done.

The metaphor I often use is of a television. When I talk about the television device, I talk about LEDs, antennas, and speakers. When I talk about the program I'm watching on the TV, I talk about the sound quality, the colors, the images, and I guess even the basketball game I'm watching. — T Clark
Nothing sounds obvious to me anymore. Epistemologically, that is. :)

Does that seem obvious to you? — T Clark
I think I probably understand your general sentiment, as a practical matter, but I am unclear about some of the details. Do you mean that the mind is coextensive with any collection of animated brain tissue? If the mind is always incidental with a brain, is it distinct from the brain? What about an animal brain, or a brain with a handicap, or an electrical circuit?

I think I probably wasn't clear. We know the mind - what it is and how it works - the same way we know other things, by observing the world, in this case, primarily the behavior of other people, including their words. We also know it from the inside, from our own personal experience. Then, those two get combined as we imaginatively come to understand that other people have internal experiences that are similar to ours. — T Clark
You are saying that the mind is common sense, I suppose. This would be fine if the definition of the term "mind" were technical - as in a collection of empirical facts. But whose facts are those - are they the facts perceived by the very mind that they define?

I don't believe in the mind, I experience it and observe its effects in the behavior of myself and other people. — T Clark
This is what I fail to fully understand. I mean, not just the argument, but the very statement. Is this similar to a relationalistic pantheist position?

I believe some eliminative materialists contend that “consciousness” doesn’t even exist, that it is folk psychology. To them, the concept should or will be eliminated in time and with new neuroscientists discoveries. — NOS4A2
I will, thanks. From a brief reading, I am not sure whether they are trying to localize awareness to physical form or delocalize it, but it is related.

But check out “embodied cognition”, which I believe is superseding the computational theory of mind. — NOS4A2