Experience is a stream of information — punos
↪T Clark That's an interesting Pinker quote, although I myself frequently think in English sentences - not that I regard that as typical or as something everyone would do. Others have said here that there are people who can read and speak perfectly well without ever being aware of a stream of thought in their minds. I think my 'bottom line' with respect to AI (with which I now interact every day) is that LLMs are not subjects of experience or thought. And if you ask any of them - Claude, Gemini, ChatGPT - they will affirm this. They are uncannily like real humans, right down to humour and double entendres, but they're reflecting back at us the distillation of billions of hours of human thought and speech. — Wayfarer
But it’s important to see what’s really happening: the system has to be trained for hours on each subject, with researchers mapping brain activity against known images and then building statistical models to translate those signals back into visuals. — Wayfarer
So what we’re seeing isn’t the brain “projecting” a movie by itself, but a reconstruction produced through a pipeline of human design, training, and interpretation. Without that interpretive layer, the raw neural data wouldn’t 'look like' anything. — Wayfarer
They don’t show that the brain literally contains images — they’re model-based translations of neural activity, not direct readouts of images 'stored' in the neural data. — Wayfarer
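To make that point concrete, here is a minimal, purely illustrative sketch in Python of how such a decoding pipeline can work in principle. The data are simulated and the ridge-regression decoder is an assumption chosen for simplicity, not the method of the studies being discussed; the point is only that the "reconstruction" is a trained model's prediction, not an image read out of the brain.

```python
# Hypothetical sketch of a "brain decoding" pipeline, with simulated data.
# Nothing here is the method of any particular study; it only illustrates
# that the reconstruction comes from a fitted statistical model.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Simulated training set: each viewed image is summarised by a small
# feature vector, and each viewing produces a noisy neural response.
n_trials, n_voxels, n_features = 200, 500, 10
true_encoding = rng.normal(size=(n_features, n_voxels))
image_features = rng.normal(size=(n_trials, n_features))
neural_signals = image_features @ true_encoding + 0.5 * rng.normal(size=(n_trials, n_voxels))

# The interpretive layer: a per-subject model trained to map neural
# activity back onto image features. Without this fitted model the raw
# signal array does not "look like" anything.
decoder = Ridge(alpha=1.0).fit(neural_signals, image_features)

# Decoding a new trial yields predicted image features; a further
# rendering stage would turn them into a picture. It is the model's
# estimate, not an image read directly out of the brain.
new_signal = image_features[:1] @ true_encoding + 0.5 * rng.normal(size=(1, n_voxels))
predicted = decoder.predict(new_signal)
print("correlation with true features:", np.corrcoef(predicted[0], image_features[0])[0, 1])
```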
The information is arranged on a substrate such that the experience cannot be broken down without losing what we call experience (when we see a glass of water, we do not see the neurons acting). It is like when we say that experience is nothing more than neural synapses. But methodologically we have only a one-way path: the association from experience to neural processes, with no return path from processes to experience. — JuanZu
We would then need a machine capable of writing (not just reading) to your brain using your specific encoding. Now, when i look at an image, you would see and experience everything i see. — punos
In fact, this is confirmed in the video you brought: we FIRST have evidence of what experience is, and then we adjust the monitor so that the electrical signals resemble what we see in experience. But we can translate those signals into anything, not necessarily into an image on a monitor. — JuanZu
This raises a question: could we reconstruct experience in a physical way without first knowing what experience is (seeing neither neurons nor electrical signals, just a glass of water) and what it resembles? The answer is no. — JuanZu
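A small, purely hypothetical sketch of that last point, with made-up numbers: the same decoded signal vector can be rendered as a grid of pixels, as a waveform, or as a string of letters, and the choice among them is made by the people building the display, not by the signals.

```python
# Hypothetical sketch: the same decoded signal vector rendered three ways.
# The choice of output medium (image, sound, text) is ours; it is not
# contained in the signals themselves.
import numpy as np

rng = np.random.default_rng(1)
signal = rng.normal(size=64)        # stand-in for a decoded neural feature vector

as_image = signal.reshape(8, 8)     # rendering choice 1: an 8x8 "picture"

t = np.linspace(0.0, 1.0, 8000)
as_sound = np.sum(                  # rendering choice 2: a one-second waveform
    [a * np.sin(2 * np.pi * (200 + 20 * i) * t) for i, a in enumerate(signal)],
    axis=0,
)

as_text = "".join(                  # rendering choice 3: a string of letters
    chr(65 + int(abs(x) * 5) % 26) for x in signal
)

print(as_image.shape, as_sound.shape, as_text[:16])
```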
I don't know what you're asking here. Perhaps you can rephrase it? — punos
Thanks to the association of particular images and recollections, a dog reacts in a similar manner to the similar particular impressions his eyes or his nose receive from this thing we call a piece of sugar or this thing we call an intruder; he does not know what is 'sugar' or what is 'intruder'. — The Cultural Impact of Empiricism
Spinoza's 'conception of substance' refutes this Cartesian (Aristotelian) error; instead, we attribute "mind" only to entities which exhibit 'purposeful behaviors'.
The conscious mind is defined as a substance ... — MoK
A more useful definition of "thinking" is 'reflective inquiry, such as learning/creating from failure' (i.e. metacognition).
The thinking is defined as a process in which we work on known ideas with the aim of creating a new idea.
Circular reasoning fallacy. You conclude only what you assume.
An AI is a mindless thing, so it does not have access to ideas ... Therefore, an AI cannot create a new idea either.
"The definition" does not entail any "fact" – again, MoK, you're concluding what you assume.
So, an AI cannot think, given the definition of thinking and considering the fact that it is mindless.
That's not a good answer. It doesn't address the issue of decomposition or methodology. A good answer would be: We can actually see neural processes first-person, and not only that, but methodologically we have discovered how to create consciousness without needing to be conscious ourselves. — JuanZu
In our experience, we do not see the neural processes that would compose the glass of water. This points to an irreducible qualitative difference, because if we try to break down the glass of water, we do not obtain those neural processes. — JuanZu
Thanks to the association of particular images and recollections, a dog reacts in a similar manner to the similar particular impressions his eyes or his nose receive from this thing we call a piece of sugar or this thing we call an intruder; he does not know what is 'sugar' or what is 'intruder'. — The Cultural Impact of Empiricism
What scientific study does he cite for this empirical claim? If my dog goes and gets a ball when I say "go get your ball," even new balls not previously seen, have I disproved his claim by showing the dog's understanding of categories? If not, what evidence disproves his claim? — Hanover
Each sentence refers to at least one idea, such as a relation, a situation, etc. In your example, we are dealing with a situation.
A car ran over the neighbor's dog.
Does the summary meaning of this sentence comprise an irreducible mental event? It (the idea via sentence) happened, it isn't any more or less than what it means. — Nils Loc
We are dealing with a situation again, no matter how much detail you provide.
Compare:
A 2024 Rapid Red Mustang Mach-E ran over our neighbor's 15-year-old Chiweenie.
Does the summary meaning of this sentence comprise an irreducible mental event? — Nils Loc
They don't know what thinking is, so they cannot design an AI that simulates thinking.
AI simply simulates thinking. — I like sushi
Are you saying that thinking is pattern recognition? I don't think so.
It is built for pattern recognition and has no apparent nascent components to it. — I like sushi
They don't know what thinking is, so they cannot design an AI that simulates thinking. — MoK
Are you saying that thinking is pattern recognition? I don't think so. — MoK