Experience is a stream of information — punos
↪T Clark That's an interesting Pinker quote, although I myself frequently think in English sentences - not that I regard that as typical or as something everyone does. Others have said here that there are people who can read and speak perfectly well without ever being aware of a stream of thought in their minds. I think my 'bottom line' with respect to AI (with which I now interact every day) is that LLMs are not subjects of experience or thought. And if you ask any of them - Claude, Gemini, ChatGPT - they will affirm this. They are uncannily like real humans, right down to humour and double entendres, but they're reflecting back at us the distillation of billions of hours of human thought and speech. — Wayfarer
But it’s important to see what’s really happening: the system has to be trained for hours on each subject, with researchers mapping brain activity against known images and then building statistical models to translate those signals back into visuals. — Wayfarer
So what we’re seeing isn’t the brain “projecting” a movie by itself, but a reconstruction produced through a pipeline of human design, training, and interpretation. Without that interpretive layer, the raw neural data wouldn’t 'look like' anything. — Wayfarer
They don’t show that the brain literally contains images — they’re model-based translations of neural activity, not direct readouts of images 'stored' in the neural data. — Wayfarer
The information is arranged on a substrate such that the experience cannot be decomposed without losing what we call experience (when we see a glass of water, we do not see the neurons acting). It is like when we say that experience is nothing more than neural synapses. But methodologically we have a one-way path: the association from experience to neural processes, with no return path from processes to experience. — JuanZu
We would then need a machine capable of writing (not just reading) to your brain using your specific encoding. Now, when I look at an image, you would see and experience everything I see. — punos
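The write-back idea can be given a toy treatment as well. This is a minimal sketch, assuming each subject's encoding is an unknown linear map (purely hypothetical names and dimensions, nothing like real neurotechnology): to make subject B "see" what A sees, the machine would have to induce the pattern that B's own encoding assigns to the image, which requires knowing B's encoding; simply replaying A's pattern into B carries nothing.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pixels, n_voxels = 64, 256

# Two hypothetical subjects with different private encodings of images.
W_a = rng.normal(size=(n_voxels, n_pixels))
W_b = rng.normal(size=(n_voxels, n_pixels))

image = rng.normal(size=n_pixels)  # what subject A is looking at

# Writing the percept to B means inducing the pattern B's *own* encoding
# assigns to this image; replaying A's pattern is meaningless for B.
pattern_for_b  = W_b @ image
pattern_from_a = W_a @ image

# What B "sees" is the pattern interpreted under B's encoding
# (modelled here by the pseudo-inverse of B's map).
read_b = np.linalg.pinv(W_b)
corr_written  = np.corrcoef(image, read_b @ pattern_for_b)[0, 1]
corr_replayed = np.corrcoef(image, read_b @ pattern_from_a)[0, 1]
print(f"written with B's encoding: r = {corr_written:.2f}")   # ~1.0
print(f"A's pattern replayed:      r = {corr_replayed:.2f}")  # ~0
```

This is why the hypothetical machine would need each brain's "specific encoding": the same signal pattern means different things under different encodings.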
In fact, this is confirmed in the video you brought: we FIRST have evidence of what experience is, and then we adjust the monitor so that the electrical signals resemble what we see in experience. But we could translate those signals into anything, not necessarily into an image on a monitor. — JuanZu
This raises a question: could we reconstruct experience in a physical way without first knowing what experience is (seeing not neurons or electrical signals, but just a glass of water) and what it resembles? The answer is no. — JuanZu
We would then need a machine capable of writing (not just reading) to your brain using your specific encoding. Now, when I look at an image, you would see and experience everything I see. — punos
I don't know what you're asking here. Perhaps you can rephrase it? — punos
Thanks to the association of particular images and recollections, a dog reacts in a similar manner to the similar particular impressions his eyes or his nose receive from this thing we call a piece of sugar or this thing we call an intruder; he does not know what is 'sugar' or what is 'intruder'. — The Cultural Impact of Empiricism
The conscious mind is defined as a substance ... — MoK
Spinoza's 'conception of substance' refutes this Cartesian (Aristotelian) error; instead, we attribute "mind" only to entities which exhibit 'purposeful behaviors'.
The thinking is defined as a process in which we work on known ideas with the aim of creating a new idea. — MoK
A more useful definition of "thinking" is 'reflective inquiry, such as learning/creating from failure' (i.e. metacognition).
An AI is a mindless thing, so it does not have access to ideas ... Therefore, an AI cannot create a new idea either. — MoK
Circular reasoning fallacy. You conclude only what you assume.
So, an AI cannot think, given the definition of thinking and considering the fact that it is mindless. — MoK
"The definition" does not entail any "fact" – again, MoK, you're concluding what you assume.
That's not a good answer. It doesn't address the issue of decomposition or methodology. A good answer would be: We can actually see neural processes first-person, and not only that, but methodologically we have discovered how to create consciousness without needing to be conscious ourselves. — JuanZu
In our experience, we do not see the neural processes that would compose the glass of water. This points to an irreducible qualitative difference. Because if we try to break down the glass of water, we do not obtain those neural processes. — JuanZu
Thanks to the association of particular images and recollections, a dog reacts in a similar manner to the similar particular impressions his eyes or his nose receive from this thing we call a piece of sugar or this thing we call an intruder; he does not know what is 'sugar' or what is 'intruder'.
— The Cultural Impact of Empiricism
What scientific study does he cite for this empirical claim? If my dog goes and gets a ball when I say "go get your ball," even new balls not previously seen, have I disproved his claim by showing the dog's understanding of categories? If not, what evidence disproves his claim? — Hanover
A car ran over the neighbor's dog. Does the summary meaning of this sentence comprise an irreducible mental event? It (the idea via sentence) happened, it isn't any more or less than what it means. — Nils Loc
Each sentence refers to at least one idea, such as a relation, a situation, etc. In your example, we are dealing with a situation.
Compare: A 2024 Rapid Red Mustang Mach E ran over our neighbor's 15 year old Chiweenie. Does the summary meaning of this sentence comprise an irreducible mental event? — Nils Loc
We are dealing with a situation again, no matter how much detail you provide.
AI simply simulates thinking. — I like sushi
They don't know what thinking is, so they cannot design an AI that simulates thinking.
It is built for pattern recognition and has no apparent nascent components to it. — I like sushi
Are you saying that thinking is pattern recognition? I don't think so.
They don't know what thinking is, so they cannot design an AI that simulates thinking. — MoK
Are you saying that thinking is pattern recognition? I don't think so. — MoK
I'm OK with that as edited. — T Clark
Given the definition you suggested, you either don't understand what "objectively exists" means, or you don't know what emergence is. I don't understand why you removed substance from my definition, but something that objectively exists is a substance, as opposed to something that subjectively exists, such as an experience. A neural process cannot give rise to the emergence of a substance, that is, of something that objectively exists.
Of course it can. Life emerges out of chemistry. Chemistry emerges out of physics. Mind emerges out of neurology. Looks like your understanding of emergence is different from mine. — T Clark
Biology, chemistry, etc., are reducible to physics. That means we are dealing with weak emergence in these cases. The emergence of mind, if it were possible, would be strong emergence, which I strongly disagree is possible, for the reasons mentioned in the previous comment.
But that's what it means. As I've said before, if you want to make up definitions for words, it's not really philosophy. You're just playing a little game with yourself. — T Clark
To me, abstraction and imagination are examples of thinking. Remembering, free association, etc. are not.
Spinoza's 'conception of substance' refutes this Cartesian (Aristotelian) error; instead, we attribute "mind" only to entities which exhibit 'purposeful behaviors'. — 180 Proof
He is definitely wrong. Purposeful behaviors are attributes of living creatures, and living creatures have at least a body and a mind.
Circular reasoning fallacy. You conclude only what you assume. — 180 Proof
No. You need to read what I wrote in order to see that it follows; it is not circular.
So, what is thinking? You've, from what I've seen, yet to delineate a clear and concise formula (and resulting definition) for such. — Outlander
I define thinking as a process in which we work on known ideas with the aim of creating a new idea. This definition is oriented toward processes such as abstraction and imagination.
Well, I mean, take the following sentence: Ahaj scenap conopul seretif seyesen — Outlander
You are talking about language here. Of course, this sentence does not mean anything to me, since I cannot relate any of the words you used to something that I know. Language is used to communicate new ideas, which are the result of thinking. When it comes to thinking, we are working with known ideas, so there is no such miscommunication between the conscious and subconscious mind.
Well, it appears to be 'thinking' was my point. It cannot think. It would have been better of me to state that AI models do fool humans into thinking they can think. — I like sushi
Correct!
It simulates speech very effectively now. I certainly do not equate speech with thought, though. I want to be explicit about that! — I like sushi
Correct again! An AI produces meaningful sentences based only on its database and infrastructure.
I was not saying any such thing. I was stating that AI is far more capable of pattern recognition than us. It can sift through masses of data and find patterns it would take us a long, long time to come close to noticing. It is likely these kinds of features of AI are what people mistake for 'thinking', as it seriously outperforms us in this kind of process. — I like sushi
Correct again! :wink: An AI is just much faster than us at pattern recognition since it is silicon-based, though it is specialized for certain tasks. Our brains are, however, huge compared to any neural net used in an AI, and the brain multitasks. A neuron is just very slow.
I define thinking as a process in which we work on known ideas with the aim of creating a new idea. — MoK
Ok. So we have to differentiate between information and experience (Mary's room, then). Because you're not seeing the experience, but rather a reconstruction on a monitor, on a flat screen. A few pixels, but the experience isn't made up of pixels. It is a translation from something to something totally different. — JuanZu
Very accurate!
Finally, the (metaphorical) tender and ignorant flesh is exposed. Now it can be graded properly. Ah, except I note one flaw. And I'm no professional by any means. There is no "we" in this abstract concept. A man can be born alone in the world and he will still think. But perhaps this is a simple habit of speech, a human flaw like we all have, to be ignored, so I shall. Just to give you the benefit of the doubt. :smile: — Outlander
Correct. I should have said "an intelligent creature" instead of "we".
But! Ah, yes, there's a but. Even still. One cannot "know an idea" without the auspices and foreprocesses of thought itself. So, this is defining a concept without explaining its forebear. Your so-called "thinking" is created by the process of involvement with "known ideas". Yet how can an idea exist and be known unless thought of? — Outlander
I don't know the right word for playing with ideas, experiencing them, without any attempt to create a new idea. :wink: For sure, such an activity is different from thinking, given the definition of thinking.