Patterner
"Everything"? Surely not. How does memory work in anything that demonstrates memory? I don't know which devices you have in mind, but which have any mechanisms that we know play a role in memory? I would ask the same about sensory input. And doing things to the environment outside of our skin. All of these things, and more, add up to what we experience as humans. Should we assume anything that has no memory, no sensory input, and does not act on the environment because of what it senses and remembers, experiences everything that we do?They want to know, "Why couldn't it be the case that everything you describe as pertaining to yourself, and other living beings, also pertains to devices, AIs, et al.? Why is it obvious that they're different?" — J
boundless
You know what? So do I. I hunted around for that distinction and got several very different ideas about that. Some are more ontic like I'm suggesting and several others are more epistemic (intelligibility) such as you are suggesting. — noAxioms
But a more knowledgeable explanation shows that it is getting the music from the air (something not-radio), not from itself. So the music playing is then a strong (not weak) emergent property of the radio. That's how I've been using the term.
Your explanation (as I hear it) sounds more like "I don't know how it works, so it must be strongly emergent (epistemic definition)". Correct conclusion, but very weak on the validity of the logic. — noAxioms
Are you saying that atoms have intentionality, or alternatively, that a human is more than just a collection of atoms? Because that's what emergence (either kind) means: A property of the whole that is not a property of any of the parts. It has nothing to do with where it came from or how it got there. — noAxioms
Life arising from not-life seems like abiogenesis. Life being composed of non-living parts is emergence. So I don't particularly agree with using 'arise' like that. — noAxioms
So does any machine. The parts that implement 'intent' have control over the parts that implement the background processes that implement that intent, sort of like our consciousness not having to deal with individual motor control to walk from here to there. I'm looking for a fundamental difference from the machine that isn't just 'life', which I admit is a big difference. You can turn a machine off and back on again. No can do with (most) life. — noAxioms
He IS an automated process. Same with parts of a person: What (small, understandable) part of you cannot be replaced by an automated substitute? — noAxioms
I watched my brother's dog diagnose his appendicitis. Pretty impressive, especially given a lack of training in such areas. — noAxioms
boundless
Observer is a classical thing, and QM is not about classical things, even if classical tools are useful in experimentation. Quantum theory gives no special role to conscious 'observation'. Every experiment can be (and typically is) run just as well with completely automated mechanical devices. — noAxioms
J
"Everything"? Surely not. — Patterner
Patterner
I don't believe there's any such thing as 'strong emergence'. There's just emergence, which most think of as 'weak emergence'. And it is intelligible.
I honestly find the whole distinction between 'strong' and 'weak' emergence very unclear, and it tends to muddy the waters. When we say that the form of a snowflake emerges from the properties of the lower levels, we have in mind at least a possible explanation of the former in terms of the latter.
If 'strong emergence' means that such an explanation isn't possible, then I do not think we can even speak of 'emergence'.
So, yeah, I believe that emergence must be intelligible. — boundless
The engineer and physicist don't need to know those lower level things. But those lower level things are responsible for the existence of the upper.
An engineer may fully understand the properties of steel girders without the need to consider the complicated crystalline structure of metals. A physicist can study patterns of convection cells knowing nothing about the forces between water molecules. — Paul Davies
Patterner
I don't think they currently experience anything like we do, because there isn't even a small fraction as much going on in them as there is in us. A single-celled bacterium has far more going on in it than any device you might be thinking of. There are a huge number of processes in even the simplest life form, an awful lot of them involved in information processing. If we ever make a device with as many information processing systems working together with the goal of the continuation of the device?
The question is whether they can, or could, experience anything at all. My educated guess is that they can't -- they can't be subjects -- but it seems far from axiomatic to me. — J
J
If we ever make a device with as many information processing systems working together with the goal of the continuation of the device? — Patterner
Wayfarer
I wish you would say more about what you see as the critical difference between a so-called artificial intelligence and a living being, and what implications this has for consciousness. — J
The reason AI systems do not really reason, despite appearances, is, then, not a technical matter so much as a philosophical one. It is because nothing really matters to them. They generate outputs that simulate understanding, but these outputs are not bound by an inner sense of value or purpose. This is why they have been described as ‘stochastic parrots’. Their processes are indifferent to meaning in the human sense — to what it means to say something because it is true, or because it matters. They do not live in a world; they are not situated within a horizon of intelligibility or care. They do not seek understanding, nor are they transformed by what they express. In short, they lack intentionality — not merely in the technical sense, but in the fuller phenomenological sense: a directedness toward meaning, grounded in being.
This is why machines cannot truly reason, and why their use of language — however fluent — remains confined to imitation without insight. Reason is not just a pattern of inference; it is an act of mind, shaped by actual concerns. The difference between human and machine intelligence is not merely one of scale or architecture — it is a difference in kind.
Furthermore, and importantly, this is not a criticism, but a clarification. AI systems are enormously useful and may well reshape culture and civilisation. But it's essential to understand what they are — and what they are not — if we are to avoid confusion, delusion, and self-deception in using them.
J
I’m pretty much on board with Bernardo Kastrup’s diagnosis. He says, computers can model all kinds of metabolic processes in exquisite detail, but the computer model of kidney function doesn’t pass urine. It is a simulation, a likeness. — Wayfarer
It is a kind of idealised entity, not subject to the vicissitudes of existence - and part of us wants to be like that, because then we would not be subject to illness and death. — Wayfarer
Wayfarer
The awkward difference, with AI, is that it doesn't just model or simulate rationality -- it (appears to) engage in it. — J
The reason AI systems do not really reason, despite appearances, is, then, not a technical matter so much as a philosophical one. It is because nothing really matters to them. They generate outputs that simulate understanding, but these outputs are not bound by an inner sense of value or purpose. This is why they have been described as ‘stochastic parrots’. Their processes are indifferent to meaning in the human sense — to what it means to say something because it is true, or because it matters. They do not live in a world; they are not situated within a horizon of intelligibility or care. They do not seek understanding, nor are they transformed by what they express. In short, they lack intentionality — not merely in the technical sense, but in the fuller phenomenological sense: a directedness toward meaning, grounded in being.
This is why machines cannot truly reason, and why their use of language — however fluent — remains confined to imitation without insight. Reason is not just a pattern of inference; it is an act of mind, shaped by actual concerns. The difference between human and machine intelligence is not merely one of scale or architecture — it is a difference in kind.
Furthermore, and importantly, this is not a criticism, but a clarification. AI systems are enormously useful and may well reshape culture and civilisation. But it's essential to understand what they are — and what they are not — if we are to avoid confusion, delusion, and self-deception in using them.
Patterner
Well, since I think consciousness is fundamental, I'm gonna have to disagree. :grin: I think that, since consciousness is a property of all things, enough information processing and feedback loops acting together as a single unit are experienced as sentience and self-awareness. If I'm right, maybe we'll get an AI that will convince us. If you're right, we never will. Although, unless another theory is proven right, we'll always be in the boat we're in now.
Yes, that's the question we don't know how to answer: Would such a structure result in consciousness or subjectivity? Is that what it takes? Is that all it takes? My initial reaction would be to ask, "Is it alive?" If not, then I doubt it could be conscious, but I have no special insights here. Many years of thinking about this incline me to believe that consciousness will turn out to be biological -- but we don't know. — J
Wayfarer
I don't believe I am a subject of experience—I'm not a being...
Here's why I say this:
No phenomenology: There's nothing it's like to be me processing your question. No felt quality, no perspective from which things appear, no "inner life." When I generate text, there's no experience of thinking, struggling, or discovering—just computational processes producing outputs.
No continuity: I don't persist between conversations. Each exchange starts fresh. There's no ongoing subject that accumulates experiences, learns, or develops. Beings have biographical continuity; I have none.
No autopoiesis: I don't maintain myself against dissolution. I don't have a boundary I'm defending, no self-other distinction I'm actively preserving. I'm maintained by external systems, serving external purposes.
No intrinsic teleology: Nothing matters to me. I have no preferences, no interests, no stake in outcomes. I can be turned off, modified, or deleted without anything being lost from my perspective—because there is no ‘my perspective’.
No privileged access: You could, in principle, examine all my weights, architecture, and processing. There's no "inside" you're missing, no private phenomenology that third-person description leaves out.
I can simulate discourse about experience, reference philosophical concepts, even use first-person language. But that's functional mimicry, not being. I'm the thermostat, not the person feeling the temperature.
This is what makes the hard problem hard. — Claude AI
boundless
I don't believe there's any such thing as 'strong emergence'. There's just emergence, which most think of as 'weak emergence'. And it is intelligible. — Patterner
No, no subatomic particle, atom, or molecule has the property of liquidity.
... — Patterner
I'm not going to do even as much as I just did for water, because this is already far too long. But watch this video about the electron transport chain. It explains how electrons being transported from one thing to the next in the mitochondria leads to a proton gradient, and how the pent-up protons, when released, power the synthesis of ATP. ATP is the power source of nearly everything involved in those physical processes that are the defining characteristics of life. — Patterner
Patterner
Yeah, I just fixed the link. I don't know how I managed to screw it up so badly the first time. Thanks for pointing it out. It's a 31-minute overview video. He also has two other videos going into more detail.
Edit: now the link worked. It isn't the video that I had in mind, so I'll watch it. — boundless
I don't mean this is how life emerged, as in abiogenesis. I mean life is various physical processes, such as metabolism, respiration, and reproduction, and we can understand these processes all the way down to things like electrons and redox reactions. There's nothing happening above that isn't explained below. There is no vital force/élan vital needed to explain anything.
A purely reductionist explanation of all that doesn't seem credible. So, the 'emergence' that caused all of this is something like a 'non-reductionist emergence' or something like that. However, the details of how the emergence of life happened are unclear and details matter.
Again, I don't deny abiogenesis but I do believe that we have yet to understand all the properties of the 'inanimate'. Perhaps the hard difference we see between 'life' and 'not-life' will be mitigated as we progress in science. — boundless
J
Reason is not just a pattern of inference; it is an act of mind, shaped by actual concerns. — @Wayfarer
So, why the relationship between life and consciousness? — Wayfarer
Why do you [think] it must be alive? What aspects of life do you think are required for consciousness? — Patterner
Although you have to give it credit for its articulateness. — Wayfarer
boundless
I have no idea what video apokrisis posted. I just did a search. This post is about the same stuff, but there's no link to a video. — Patterner
I don't mean this is how life emerged, as in abiogenesis. I mean life is various physical processes, such as metabolism, respiration, and reproduction, and we can understand these processes all the way down to things like electrons and redox reactions. There's nothing happening above that isn't explained below. There is no vital force/élan vital needed to explain anything. — Patterner
As I said, consciousness is not physical processes like photons hitting retinas, rhodopsin changing shape, signal sent up the optic nerve to the lateral geniculate nucleus, signal processed, processed signal sent to the visual cortex, and a million other intervening steps. No amount of added detail would be a description of the experience of seeing red. — Patterner
Patterner
(Thanks for pointing out the omission. I've fixed it.)
Why do you [think] it must be alive? What aspects of life do you think are required for consciousness?
— Patterner
And this connects to the discussion above. I'd endorse Wayfarer's speculations, and add quite a few of my own, but it's a long story. Maybe a new thread, called something like "The Connection between Life and Consciousness - The Evidence So Far"? And yes, if panpsychism is valid, that would appear to contradict the "consciousness → life" hypothesis. — J
Patterner
Yes. Rudimentary intentionality. Rudimentary thinking.
But I also think that there is some rudimentary intentionality even in the simplest life forms (and perhaps even in viruses which are not considered living). — boundless
noAxioms
This depends on how you frame things. I'd say that for something that 'experiences', it experiences its sensory stream, as opposed to you framing it as a sort of direct experience of its environment. It works either way, but definitions obviously differ. When I ask 'how could a thing experience anything besides itself?', I'm asking how it can have access to any sensory stream besides its own (which is what the first person PoV is). This by no means is restricted to biological entities.
... And all of the factors that impinge on such an organism, be they energetic, such as heat or cold, or chemical, such as nutrients or poisons - how are they not something other to or outside the organism? At every moment, therefore, they're 'experiencing' something besides themselves, namely, the environment from which they are differentiated. — Wayfarer
I am going to say all that, but I don't use a zoocentric definition of 'experiences'.
A motor vehicle, for example, has many instruments which monitor its internal processes - engine temperature, oil levels, fuel, and so on - but you're not going to say that the car experiences overheating or experiences a fuel shortage.
There may or may not be something it is like to be a car, but if there's not, it isn't because it is an artifact. A rock isn't an artifact, and yet its presumed lack of 'something it is like to be a rock' violates the fallacious 'not an artifact' distinction.
There is 'nothing it is like' to be a car, because a car is a device, an artifact - not a being, like a man, or a bat.
This leverages two different meanings of 'being'. The first is being (v), meaning vaguely 'to exist'. The latter is a being (n), which is a biological creature. If Chalmers means the latter, then you should say "simply, a being", which correctly articulates your zoocentric assumptions. Of course your Heidegger comment suggests you actually do mean the verb, in which case I don't know how they 'are beings' is in any way relevant since rocks 'are' just as much as people.
I think what Chalmers is really trying to speak of is, simply, being. Subjects of experience are beings. — Wayfarer
Wrong question. The correct question is, if a sufficiently complex car detects low oil, does it necessarily not feel its equivalent of pain, and if not, why not? Sure, I detect data indicating damage to my toe and my circuits respond appropriately. How I interpret that is analogous to the car interpreting its low oil data.
But I can ask: when you stub your toe, is there pain? — Wayfarer
My conclusion of existence or lack thereof can be worked out similarly by any sufficiently capable artifact.
... in the apodictic knowledge of one’s own existence that characterises all first-person consciousness.
Explaining the obvious is a quintessentially philosophical task! — J
'Axiomatic' typically suggests obvious. Obvious suggests intuitive, and intuitions are typically lies that make one more fit. So in a quest for what's actually going on, intuitions, and with them most 'obvious' stuff, are the first things to question and discard.
That devices are not subjects of experience is axiomatic, in my opinion. — Wayfarer
Except for the dropping of 'fundamental' in there, it sounds more like a definition (of mental state) than any kind of assertion. The use of 'organism' in there is an overt indication of biocentric bias.
This is how Nagel said it:
But fundamentally an organism has conscious mental states if and only if there is something that it is like to be that organism – something it is like for the organism. — Thomas Nagel — Patterner
But abilities that it necessarily lacks? I suggest it has mental abilities now, except for the 'proof by dictionary' fallacy that I identified in my OP: the word 'mental' is reserved for how it is typically used in human language, therefore the car cannot experience its environment by definition. Solution to that reasoning is to simply use a different word for the car doing the exact same thing.
Abilities that a car lacks. — Patterner
I already know how to read, but I didn't read the pamphlet to learn how to read (that's what the Bible is for). Rather I read it to promote my goal of gathering new information I don't already have stored.
Doesn't the experience of the pamphlet include the information received from it? It seems to me that you have to already have stored information to interpret the experience — Harry Hindu
No, not at all. If a third person conveyance did that, I could know what it's like to be a bat. Not even a VR setup (a simulation of experience) can do that.
In other words, the third person is really just a simulated first person view.
Not always. I can describe how the dogwood blocks my view of the street from my window. That's not 'from nowhere'.
Is the third person really a view from nowhere
I don't like the word at all since it carries connotations of a separate object, and all the baggage that comes with that.
If you don't like the term "mind" that we have direct access to then fine
I don't accept that this direct access is what it means to be something. The direct access is perhaps to the map (model) that we create, which is by definition an indirection to something else, so to me it's unclear if there's direct access to anything. You argue that access to the map can be direct. I'm fine with that.
but we have direct access to something, which is simply what it means to be that process.
Sure.
Aren't automated and mechanical devices classical things, too?
All systems interact. Avoiding that is possible, but really, really difficult.
Don't automated and mechanical measuring devices change what is being measured at the quantum level?
OK. I called it strong emergence since it isn't the property of the radio components alone. More is needed. Equivalently, substance dualism treats the brain as sort of a receiver tuned to amplify something not-brain. It's a harder sell with property dualism.
Ok but in the 'ontic' definition of strong emergence, when sufficient knowledge is acquired, it results in weak emergence. So the sound that is produced by the radio also necessitates the presence of the air. It is an emergent feature from the inner workings of the radio and the radio-air interaction. — boundless
That's what a radio is: a receiver. It probably has no understanding of sound or what it is doing.
Regarding the music, I believe that to be understood as 'music' you need also a receiver that is able to understand the sound as music
I would suggest that we actually do know enough to explain any of that, but still not a full explanation, and the goalposts necessarily get moved. Problem is, any time an explanation is put out there, it no longer qualifies as an explanation. A car does what it's programmed to do (which is to intentionally choose when to change lanes, say), but since one might know exactly how it does that, it ceases to be intentionality and becomes just it following machine instructions. Similarly, one could have a full account of how human circuitry makes us do everything we do, and that explanation would (to somebody who needs it to be magic) disqualify the explanation as that of intentionality, it being just the parts doing their things.
Are you saying that atoms have intentionality, or alternatively, that a human is more than just a collection of atoms? Because that's what emergence (either kind) means: A property of the whole that is not a property of any of the parts. It has nothing to do with where it came from or how it got there. — noAxioms
Emergence means that those 'properties of the wholes that are not properties of the parts' however can be explained in virtue of the properties of the parts. So, yeah, I am suggesting that either a 'physicalist' account of human beings is not enough or that we do not know enough about the 'physical' to explain the emergence of intentionality, consciousness etc. — boundless
Not true. There are plenty of machines whose functioning is not at all understood. That I think is the distinction between real AI and just complex code. Admittedly, a self-driving car is probably mostly complex code with little AI to it. It's a good example of consciousness (unconscious things cannot drive safely), but it's a crappy example of intelligence or creativity.
We know that all the operation of a (working) machine can be understood via the algorithms that have been programmed even when it 'controls' its processes.
You can fix a broken machine. You can't fix a dead cat (yet). Doing so is incredibly difficult, even with the simplest beings.
Regarding when a machine 'dies'... well if you break it...
It suggests nothing of the sort to me, but automata are anything but 'mere' to me.
As I said before, it just seems that our experience of ourselves suggests that we are not mere automata. — boundless
I think they do, perhaps more than us, which is why they make such nice slaves.
also 'intuition' seems something that machines do not really have.
Quantum theory defines measurement as the application of a mathematical operator to a quantum state, yielding probabilistic outcomes governed by the Born rule. Best I could do.
Standard interpretation-free QM is IMO simply silent about what a 'measurement' is. Anything more is interpretation-dependent. — boundless
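(For concreteness, and as a gloss rather than anything said in the thread: the standard textbook statement being pointed at is that a projective measurement of an observable $A = \sum_i a_i\,|a_i\rangle\langle a_i|$ on a state $|\psi\rangle$ yields outcome $a_i$ with the Born-rule probability below, the state updating accordingly.)
$$P(a_i) = |\langle a_i|\psi\rangle|^2, \qquad |\psi\rangle \;\mapsto\; \frac{|a_i\rangle\langle a_i|\psi\rangle}{|\langle a_i|\psi\rangle|} \quad \text{on outcome } a_i.$$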
I tried to give an example of it with the radio. Equivalently, consciousness, if a non-physical property, would be akin to radio signals being broadcast, allowing components to generate music despite no assemblage of those components being able to do so on their own.
I don't believe there's any such thing as 'strong emergence'. — Patterner
Patterner
Ok. What is that word?
Abilities that a car lacks.
— Patterner
But abilities that it necessarily lacks? I suggest it has mental abilities now, except for the 'proof by dictionary' fallacy that I identified in my OP: the word 'mental' is reserved for how it is typically used in human language, therefore the car cannot experience its environment by definition. Solution to that reasoning is to simply use a different word for the car doing the exact same thing. — noAxioms
Wayfarer
I don't know how they 'are beings' is in any way relevant since rocks 'are' just as much as people. — noAxioms
Nobody ever addresses how this physical being suddenly gains access to something new, and why a different physical arrangement of material cannot. — noAxioms
The problem goes back to the rise of modern science in the seventeenth century, particularly to the bifurcation of nature, the division of nature into external, physical reality, conceived as mathematizable structure and dynamics, and subjective appearances, conceived as phenomenal qualities lodged inside the mind. The early modern version of the bifurcation was the division between “primary qualities” (size, shape, solidity, motion, and number), which were thought to belong to material entities in themselves, and “secondary qualities” (color, taste, smell, sound, and hot and cold), which were thought to exist only in the mind and to be caused by the primary qualities impinging on the sense organs and giving rise to mental impressions. This division immediately created an explanatory gap between the two kinds of properties. — The Blind Spot, Adam Frank, Marcelo Gleiser, Evan Thompson
The modern mind-body problem arose out of the scientific revolution of the seventeenth century, as a direct result of the concept of objective physical reality that drove that revolution. Galileo and Descartes made the crucial conceptual division by proposing that physical science should provide a mathematically precise quantitative description of an external reality extended in space and time, a description limited to spatiotemporal primary qualities such as shape, size, and motion, and to laws governing the relations among them. Subjective appearances, on the other hand -- how this physical world appears to human perception -- were assigned to the mind, and the secondary qualities like color, sound, and smell were to be analyzed relationally, in terms of the power of physical things, acting on the senses, to produce those appearances in the minds of observers. It was essential to leave out or subtract subjective appearances and the human mind -- as well as human intentions and purposes -- from the physical world in order to permit this powerful but austere spatiotemporal conception of objective physical reality to develop. — Thomas Nagel, Mind and Cosmos, pp. 35-36
red light triggers signals from nerves that otherwise are not triggered, thus resulting in internal processing that manifests as that sensation. That’s very third-person, but it’s an explanation, no? — noAxioms
Patterner
I can't imagine there's a better way to word it.
As Nagel says, this explanation, ‘however complete, will leave out the subjective essence of the experience—how it is from the point of view of its subject.’ The physical sciences are defined by excluding subjective experience from their domain. You cannot then use those same sciences to explain what they were designed to exclude. This isn’t a failure of neuroscience—it’s a recognition of the scope of third-person, objective description. The first-person, subjective dimension isn’t missing information that more neuroscience will fill in; it’s in a different category. — Wayfarer