Then it appears that there is no difference between an illusion of consciousness that doesn't have proper semantics and one that does. Semantics is derived from the syntax - from the relationship between the rules and what the rules cause one to do or not do.
The same as the distinction between an illusion of consciousness that (like the Chinese Room) doesn't have a proper semantics, and one that does. — bongo fury
But words are just colored scribbles and sounds. It seems like you'd have a problem defining the nature of words, too.
Harry I don't have a problem defining consciousness and suchlike. Like many words they are defined ostensively.
Wikipedia:
An ostensive definition conveys the meaning of a term by pointing out examples. This type of definition is often used where the term is difficult to define verbally, either because the words will not be understood (as with children and new speakers of a language) or because of the nature of the term (such as colours or sensations) — Daemon
This is all just more information. All causal relations, which include logical entailments, are information.
More seriously, the fundamental stuff of physics like fields, energy, matter, forces, spacetime and all the stuff that's logically entailed by that. — Marchesk
Because it's difficult to derive meaning from anything Banno says. It probably has to do with how he uses words.
Banno-inspired perception-related debates used to go 100+ pages. And it often included talk of apples. — Marchesk
How about why you have experiences at all?
Sorry for the late reply. I've seen this argument a few times and never got the sense of it. We don't need any knowledge of the workings of the brain to understand why I don't experience your sensory input. It is not a neurological question. It's not even a sensible question imo. — Kenosha Kid
Actually it doesn't, which is why it is broken down into functional systems not specific parts of the brain. Either system could be, and likely is, distributed. But certainly parts of the brain are dominant in certain functions. — Kenosha Kid
What the heck does this even mean? What is the difference between unconscious and conscious phenomena, or systems? If the systems are distributed, then how is it that they aren't aware of what is going on in the other parts? How is the brain itself not aware of what its different systems are doing? Can an unconscious system be aware of what the unconscious and conscious systems are doing?
The answer to that is precisely why we labour under the illusion that we make those decisions consciously. Recall that we are not conscious of the unconscious causes of conscious phenomena. Decisions from System 1 are presented to System 2 apparently uncaused (i.e. without System 2 being aware of the process). So from System 2's point of view, decisions originate in System 2. There are lots of published tests for this. — Kenosha Kid
Banno-inspired perception-related debates used to go 100+ pages. And it often included talk of apples. — Marchesk
:rofl:
Have we come to any sort of consensus as to what color is? Or pain?
If it's not qualia, is it ... a model? A language game? A private beetle we can't talk about? — Marchesk
It appears that you've answered your own question.
the piano not perceiving certain inputs from the keyboard? Does it not perceive the meaning of your keystrokes and make the correct sounds for you to listen to? — Daemon
I would say the theory is ideal, in that it's humans creating a map of the territory, while the territory itself might be understood as physical, assuming a physicalist ontology. That does allow for the possibility that the theory is missing something fundamental. A map is only as good as the map makers and their knowledge of the territory. — Marchesk
Humans, maps and territory are all observable, so I don't know what Marchesk means by "ideal" other than that they like the theory, or that it works for them. The fundamental aspect that is missing is causation - of how maps can be about territories.
just means "in terms of observable phenomena". — Janus
Again, what does it mean to feel?
Do you think a piano feels something when you press the keys? — Daemon
Is it the theory that is physical, or what the theory is about (what it points to) that is physical, or both?
When it comes to producing speculative hypotheses regarding the origins of life and consciousness physical theories are all we have, because only they are testable. That doesn't mean you can't speculate idealistically; it just means there is no way to test such speculations. — Janus
No, because this is the primary point of contention, and you keep ignoring the contradiction you are making. What makes the hardware in your head special in that it feels, but computer hardware can't? What does it mean to feel?
No, the computer is not perceiving inputs in the way you and I perceive things. Press a finger against the back of your hand. You feel a sensation. When you press a key on the computer keyboard, the computer doesn't feel a sensation.
Shall we try to agree on this before we move on to the rest of your ideas? — Daemon
So understanding has to do with perceiving meaning? What do you mean by "perceive"? Is the computer not perceiving certain inputs from your mouse and keyboard? Does it not perceive the meaning of your keystrokes and mouse clicks and make the correct characters appear on the screen and windows open for you to look at?
It doesn't have any understanding. It doesn't perceive the intended meaning, it doesn't perceive anything. It isn't equipped to perceive anything. — Daemon
But the semantics weren't ascribed by you. They were ascribed by your teacher(s) who taught you how to translate words. You weren't born knowing any language, much less how to translate between languages. You had to be taught that. You also weren't the one who created languages, defining which scribble or sound refers to which event or thing. You had to be taught that. You used your eyes and ears (your inputs) and your brain (your processor) to learn, to discern the patterns, so that you may survive in this social environment (produce the appropriate outputs as expected by your peers).
Semantics, meaning, is not intrinsic to the physics of my PC. The semantics is ascribed, in this case by me, when I tell it how to translate words and phrases. — Daemon
That's because the only thing it knows is to spit out this scribble when it perceives a certain mouse click or keystroke. It has the same instructions as the man in the room - write this scribble when you perceive this scribble. It doesn't have instructions that actually provide the translation - this word = that word, and then what that word points to outside of the room - which is how you understand it, because that is how you learned it.
The translation tool often produces quite spooky results, it certainly looks like it understands to a naive observer, but it's easy to see that it doesn't understand when you allow it to translate on its own without my intervention (which I never do in practice). — Daemon
Because it doesn't have the same set of instructions that you had when you learned them, nor the need to learn them; but that doesn't mean that it couldn't if it had the need and the correct set of instructions.
One of the author's conclusions is that "linguistic meaning is derived from the role things and people play in everyday life". I said something about this above, using the word "good" and the translation of machine assembly instructions as examples.
If the translation tool's understanding was the same as mine, as you seem to want to believe, then machine translation would be as good as human translation. But it isn't! — Daemon
What role does "hello" play? Does this not mean that that utterance refers to the role that it plays?
At the least, can you consider the possibility that there are parts of language, things we do with words, for which the meaning is not given by the referent, but is instead found in the role these utterances and scribbles play in our day to day lives? — Banno
No. The question is just a different way of framing the hard problem of why there are two very different perspectives on mental processes, but only one type of perspective for everything else, like chairs, mountains and trees.
Because the fact that I am bert1 allows me two different perspectives to examine bert1's mental processes: introspection and extrospection. Whereas other people only have extrospection as a way of observing bert1's mental processes (to the extent that they can do so at all).
Is that question equivalent to "Why am I some particular person, rather than no one in particular?"? — bert1
Isn't that how you learned the translation of a word and then use the translation? Didn't you have to learn (be programmed with) that information via your sensory inputs to then supply that information when prompted? How is the translation tool's understanding different from a brain's understanding?
I don't need the translation tool Harry, I can do the translation on my own, the tool just saves me typing. When I come across a word that isn't in my Translation Memory I add it to the memory, together with the translation. Then the next time that word crops up I just push a button and the translation is inserted. The translation tool doesn't understand anything. — Daemon
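To make the comparison concrete, here is a minimal sketch in Python of the store-and-retrieve mechanism a translation memory like the one described above amounts to. The class, method names and example entries are hypothetical illustrations, not taken from any actual tool.

```python
# Minimal sketch of a translation memory (TM): a stored mapping from
# source-language segments to translations a human supplied earlier.
# The entries below are made-up examples, not real data.

class TranslationMemory:
    def __init__(self):
        self.memory = {}  # source segment -> stored translation

    def add(self, source: str, translation: str) -> None:
        """Store a translation provided by the human translator."""
        self.memory[source] = translation

    def lookup(self, source: str):
        """Return the stored translation, or None if the segment is new."""
        return self.memory.get(source)


tm = TranslationMemory()
tm.add("Guten Morgen", "Good morning")       # learned from the human
print(tm.lookup("Guten Morgen"))             # -> Good morning
print(tm.lookup("Wie geht es dir?"))         # -> None: nothing stored yet
```

Whether this kind of store-and-retrieve rule differs in kind, or only in degree, from a brain recalling a learned translation is exactly what is in dispute here.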
You used the phrase. I thought you knew what you were talking about. I would define it as a kind of working memory that processes sensory information.
A dictionary definition of "understand" is "perceive the intended meaning of". Another dictionary says "to grasp the meaning of".
What do you think conscious experiences are? — Daemon
If you are the translator then why do you need a translation tool? Where do the translations reside - in your brain or in your tool? If you need to look them up in a tool, then the understanding of that particular translation is in the tool, not in your brain.
I'm a translator, I use a computer translation tool — Daemon
These are all unfounded assertions without anything to back them up. What are conscious experiences? What do you mean by "understand"?
You're right that the rules in the room are not those that Chinese speakers use. But that's the point: a computer can't understand language in the way we can. The reason is that we learn meaning through conscious experience. — Daemon
The instructions in the room are written in a language - a different language than Chinese. How did the man in the room come to understand the language the instructions are written in? I've asked this a couple of times now, but you and Apo just ignore this simple, yet crucial, fact.
You're right that the rules in the room are not those that Chinese speakers use. But that's the point: a computer can't understand language in the way we can. The reason is that we learn meaning through conscious experience. — Daemon
I don't see a world of difference between them. Algorithms are a type of constraint.
There is a world of difference between rules as algorithms and rules as constraints. — apokrisis
I'm not clear on how this answers the question.
And what is it about you that provides you with different evidence of your consciousness than I have of your consciousness? — Harry Hindu
I can introspect myself, but others can't. — bert1
Of course life and minds follow rules. You are following the rules of the English language that you learned in grade school when you type your posts. Ever heard of the genetic code? Why do you keep saying things that a simple observation shows to be untrue?
But life and mind don’t “follow rules”. They are not dumb machine processes. They are not algorithmic. Symbols constrain physics. So as a form of “processing”, it is utterly different. — apokrisis
No. To understand language is to possess a set of rules in memory for interpreting particular scribbles and sounds. Like I said, understanding is the possession of a set of rules in memory for interpreting any sensory data. The man in the room has a different set of rules for interpreting the scribbles on the paper than the rules that Chinese people have for interpreting those same symbols. Hence, the instructions in the room are not for understanding Chinese, because they are not the same set of rules that Chinese speakers learned or use. The room does understand something: it understands "write this symbol when you see this symbol." The room also understands the language the instructions are written in. How can that be if the room, or the man, doesn't understand language?
To understand language is to know how to act. That knowing involves constraining the uncertainty and instability of the physical realm to the point that the desired outcome is statistically sure to happen. — apokrisis
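To make the "write this symbol when you see this symbol" rule set above concrete, here is a minimal sketch in Python of a pure symbol-to-symbol lookup. The rule table and placeholder symbols are invented for illustration; they are not actual Chinese and not Searle's own example.

```python
# Sketch of the room's rule book as a pure symbol-to-symbol lookup.
# The symbols are arbitrary placeholders, not real Chinese characters.
RULES = {
    "squiggle-A": "squoggle-B",   # "when you see this scribble, write this scribble"
    "squiggle-C": "squoggle-D",
}

def room_reply(incoming: str) -> str:
    """Apply the rule book with no reference to what the symbols denote outside the room."""
    return RULES.get(incoming, "no rule for this symbol")

print(room_reply("squiggle-A"))   # -> squoggle-B
```

Note that the rules themselves are still written in some language (here English identifiers and comments) that the rule-follower must already understand, which is the point pressed above about the man understanding the language of his instructions.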
It seems like that is your problem to solve. You are the dualist, so you are the one who sees this as a hybrid device. As a monist, I don't see it as such. What is it about carbon that is so special in being the only element capable of producing a hybrid device in the sense that you are claiming here? Why do you think that natural selection is so often mistaken for a smart process (intelligent design), rather than a dumb (blind) process?
So you can’t just hand wave about reconnecting the computer to the physics. You have to show where this now hybrid device is actually doing what biology does rather than still merely simulating the physics required. — apokrisis
What happens when the individuals you are competing with are a resource themselves? Altruism.
1. Cooperation: individuals sharing a resource
2. Competition: individuals fighting over a resource — TheMadFool
I don't know what this means. Present physical states are informative of prior physical states. I don't see how you can have something that is physical yet devoid of information. The only process that is needed for information to be present is the process of causation.
Therefore, it must be physical, not informational, and can only be reproduced with the right mechanical process. — hypericin
Against the AI symbol processing story, Searle points out that a computer might simulate the weather, but simulated weather will never make you wet. Likewise, a simulated carburettor will never drive a car. Simulations have no real effects on the world.
And so it is with the brain and its neurons. It may look like a computational pattern at one level. But that pattern spinning can't be divorced from the environment that is being regulated in realtime. Like the weather or a carburettor, the neural collective is actually pushing and shoving against the real world.
That then is the semantics that breathes life into the syntax. And that is also the semantics that is missing if a brain, a carburettor or the weather is reduced to a mere syntactical simulation. — apokrisis
But that was my point... that there is only one set of rules for understanding Chinese, and both humans and computers would use the same rules for understanding Chinese. I don't see a difference between how computers work and how humans work. We both have sensory inputs and we process those inputs to produce outputs based on logical rules.
I think you're getting things back to front. The room is set up to replicate the way a computer works, the kinds of rules it works with. It's not trying to replicate the way humans work, the kinds of rules we use to understand language. So Searle is showing why a digital computer can't understand language. — Daemon
The problem with the "Chinese" room is that the rules in the room are not for understanding Chinese. Those are not the same rules that Chinese people learned or use to understand Chinese. So Searle is making a category error in labeling it a "Chinese Room".
Searle argues that the thought experiment underscores the fact that computers merely use syntactic rules to manipulate symbol strings, but have no understanding of meaning or semantics. — SEP article
Predictions are simulations in your head, and predictions have causal power. We all run simulations of other minds in our minds as we attempt to determine the reasoning behind some behaviour.
Simulations have no real effects on the world. — apokrisis
None of this explains why I have a different experience of my raw sensory input with memory, motivation, etc. than you have of my raw sensory input with memory, motivation, etc. From my view, I don't experience neurons. I experience the colors, shapes and sounds of the world. From your view, you experience neurons in the format of colors and shapes. How can it be that on one end you point to a visual of neurons, while on the other I point to an experience of a sound?
Well, what use is it in day-to-day life? Isaac, Banno et al would argue that there isn't a meaningful separate phenomenal experience, i.e. it isn't useful at all even if it exists. The transformations and augmentations of raw sensory input with memory, motivation, etc., are sufficient to account for consciousness. And I agree to an extent. In my view, when we talk about qualia, we're talking about these transformations and augmentations, at least as available to access consciousness (which is all we can report on). The usefulness might be summed up as: it is quicker and easier to work with 'lion' than it is to work with an unadorned granular image. — Kenosha Kid
This assumes that consciousness only exists in one part of the brain. How do you know that there are not other consciousnesses in other parts of the brain making those decisions?
That's probably not one thing. I've touched on an example from Kahneman's work earlier. There are decisions we make that are not consciously made, that is we are not conscious of making them in the way we do, but rather, once those decisions are made unconsciously, they are presented to consciousness as if for ratification in such a way that we'll swear blind we did make them consciously. (NB: Kahneman doesn't speak in terms of unconscious and conscious decision making but in terms of System 1 (fast, e.g. pattern-matching) and System 2 (slow, algorithmic). But the implication is there.) Consciously we can change our minds, i.e. System 2 will come up with a different answer. — Kenosha Kid
There is no decision being made, as we always go with "just right". Hot and cold are merely informing you that you are no longer in a state of homeostasis, or a state of "just right".
I am saying that this dualism is always actually a dichotomy, and thus something intrinsically relational.
Hotter is only ever relative to colder. And vice versa. But then a world constructed within that contrast makes possible the new thing of having some particular position on that spectrum of possibilities. You can be a body in an environment where you have this Goldilocks three choices about the temperature you prefer — apokrisis
Effects are about their causes independent of any mind. A mind is not needed to establish that relationship. It is already there. A mind is just another effect of causes, and a cause of many effects. Minds simply focus on the causal relationships that are useful and ignore the rest. That doesn't mean that causal relationships don't exist except when accessed by some mind. Cause and effect is part of everything, including life and non-life. Again, we're merely talking about degrees of complexity of some causal system.
The division - the epistemic cut - lies in the fact that life and mind are how we describe systems organised by symbols. They have a coding machinery like genes, neurons or words that can store memories and so impose a self-centred structure of habits on their environments.
It is pretty easy to recognise that difference between an organism and its backdrop inorganic environment surely? — apokrisis
But the boundary between life and non-life gets blurry. After all, life is just a more complex relationship than non-life, so it stands to reason that non-life would have the most rudimentary, most basic, most fundamental form of the relationships that life has, not that it lacks them entirely. What that thing is is information. Effects are informative of their causes and vice versa. A relationship is informative of its constituents and vice versa. Information is the relationship between cause and effect, and it exists in everything that is a causal relation - like scribbles on a screen and the intent that caused them to be on the screen, or the information molecules have about their atoms.
A holistic or triadic paradigm now explains life. And it is easy to see that it also explains mind, as semiosis already grants life an intentionality and "awareness" at the cellular level ... the subject of the cited paper here. — apokrisis
The hard problem is asking why there are both conscious states and brain states.
states of consciousness are just brain states — Kenosha Kid
What does this really mean? It seems to me that you can always simplify dualism into monism. Dualism is just another way of saying that everything is a relationship. The problem is that there are many relationships between more than two things. Not to mention that dualism seems to be a false dichotomy derived from the idea that the singular "I" itself possesses qualities that everything else is measured against. The world is only hot or cold relative to your own body temperature, large or small relative to your own size, etc. In other words, these sensations are relationships between the state of your body and the state of the environment. I think this is more or less something that you might agree with, and maybe any disagreement we have will be semantics, but then that just means that the real difference between dualism and monism is just semantics.
What is relevant to this thread is the point I have already tried to make. Yes, there is a dualism at the heart of everything in some strong sense. — apokrisis
The accuracy likely is affected, but for one thing it avoids going down wrong paths when looking for or describing something. The way people often talk about human reason and its role, for instance, seems very wrong to me. There was a famous experiment a while ago that showed that neurological behaviour associated with motor responses fired before correlated decision-making processes in the prefrontal cortex. The subjects remember, from their limited but direct phenomenal experience, deciding to act, then acting, when in fact the action appeared to be unconsciously chosen and only consciously ratified -- or rationalised -- after the act. The narrative based on the first person viewpoint is wrong, and this is exceedingly common it seems.
This reminds me very much of Daniel Kahneman's System 1 / System 2 model of the brain and his tests of it. Problems that appear amenable to pattern-matching (the thing that makes it easier to add 5 or 9 to things than 7 or 8), but that pattern-matching would lead to the wrong answer for, follow a similar pattern. Human subjects swear blind they worked out the answer, when in fact they seem to be *receiving* an answer and ratifying it. Badly. That is, System 2 (the so-called rational, algorithmic part of the brain associated with conscious decision-making) receives a putative answer from System 1 (the dumb but hard-working pattern-matching part of the brain that acts without conscious input).
There are all sorts of psychological effects that follow from these sorts of behaviours, some good, some terrible, and those effects can be exploited. It's beneficial to know how your mind operates, what mistakes it makes. For instance, the above suggests to me that the conscious mind is not adept at discerning "We should do this, right?" from "We did this FYI." Besides that, it's just interesting. — Kenosha Kid
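A toy sketch in Python of the propose-and-ratify flow described in the quoted passage. This is only an illustration of the idea, not Kahneman's actual model; the heuristic, function names and numbers are invented.

```python
# Toy illustration of the System 1 / System 2 flow described above:
# a fast heuristic proposes an answer, and a slower deliberate check
# either ratifies the proposal or overrides it with a worked-out answer.

def system1_estimate(a: int, b: int) -> int:
    """Fast, pattern-matching guess: rounds to 'easy' numbers, so it is quick but sometimes wrong."""
    return round(a, -1) + round(b, -1)

def system2_check(a: int, b: int, proposal: int, lazy: bool = True) -> int:
    """Slow, algorithmic check. When lazy, it simply ratifies the proposal it received."""
    if lazy:
        return proposal   # ratification without real computation
    return a + b          # the effortful, exact answer

a, b = 48, 7
proposal = system1_estimate(a, b)                  # arrives "apparently uncaused"
print(system2_check(a, b, proposal))               # lazy ratification -> 60 (wrong)
print(system2_check(a, b, proposal, lazy=False))   # deliberate check  -> 55
```

The tests mentioned in the quote exploit the lazy branch: the answer feels consciously computed even though the "conscious" step only endorsed it.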
The last part doesn't make any sense. If all is replaced, then how can there be anything that remains?
Really? So if you lost a finger you're not you anymore? Which part of the body exactly carries this "I"? How much of a body can you lose or replace to still be the same "I"? Whatever "I" remains after all is replaced or changed, that is "the experiencer". — khaled
What is shape? Information.
Well at least we've established that there is an event. I thought you were one of those people who pretend that the scribble refers to nothing. But I still think "what is this event" is akin to asking "what is shape", It's one of those things you can't simplify further. Why don't you take a crack at it because I can't do it. — khaled
True. That is why I don't really care much for using the term "experience". I was only using it because that is the scribble that you know to refer to the event we are talking about. I have learned that, in order to communicate, you have to use words that your audience understands, not necessarily the words you would use, because it is the thing that we are talking about that is important, not the scribbles that we use.
It's just that when I'm talking definitions with someone I get really nitpicky about words. "experiencing eggs in the fridge" is sort of vague because it can either mean simply seeing eggs in the fridge or somehow literally "Knowing beyond all doubt that there are in fact eggs in the fridge". I just wanted to be specific that we're talking about seeing things here. — khaled
That all depends on what you mean by "conscious", "experience" and "knowledge". If the accuracy of our knowledge is not affected by how direct or indirect the knowledge is, then what is the point of using those terms? I'm only questioning the reliability of the experience by taking what you have said and running with it. If you don't mean to say that the accuracy of our knowledge is affected by the indirect nature of it, then what are you actually saying when you say that our knowledge is "indirect"?
That's true. Which is one reason why people can be mistaken. But if your argument against the idea that we are not conscious of the causes of our consciousness is going to rely on a general doubt about the veracity of anything we experience, you really do have a contradiction on your hands. After all, I'm only rendering knowledge of the causes of our experience indirect. You're questioning the reliability of experience itself, which is going much further (too far imo) down the road. The reliability of experience has nothing to do with my earlier comment; it is the total absence of experience of certain events that underlies my comment. — Kenosha Kid
That's part of the problem - in thinking of these concepts in this way.
This is circular. — Harry Hindu
Of course it is. As is every definition ever (at least of these basic concepts)... — khaled
What is an experiencer? — Harry Hindu
No, because I thought that — Harry Hindu
Whatever "I" is referring to here. — khaled
Telling me that it's an "experience" just tells me what scribble I can use to refer to this event, but what is this event? — Harry Hindu
Oh so you understand what it means now all of a sudden? Yes, it is probably that event you had in mind while writing this (in a literal and metaphorical sense).
A sensible question. But consider this: maybe the reason we use that scribble only and we do not have accurate language to describe what is happening is because we don't know what is happening. — khaled
Well, you know that you are conscious. So you tell me the meter that you used to determine that you are conscious. Your meter seems to simply be how many human beings in your immediate environment use a particular scribble to refer to the event.
Not at all, I wouldn't say it is unimportant, I would say we can't know the answers to these questions. Because this isn't an event we can detect. Show me the "consciousness-o-meter" and then we might be able to answer these questions, or show me how to make one. — khaled
:roll: I was asking what the event is, not what the scribble is. And in asking what the event is, I'm NOT asking what scribble most English speakers use to point to it (unless you're saying that consciousness is a word?). I'm asking about those relationships I spoke about earlier.
Sure you do, you wrote that "experience" is a scribble that refers to an event. — khaled
:roll:
Seeing is a type of experience. However "seeing eggs" =/= "experiencing eggs in the fridge" (whatever that means). You experience a certain image, of there being eggs in a fridge. I don't understand what "experiencing eggs in the fridge" means. That image may or may not reflect reality. — khaled