Difficult (or impossible) as it may be, I'm interested in determining whether there is any evidence that English pronouns are supposed to refer specifically to a person's sex or gender (or both). — McMootch
No, the question only presupposes that we use scribbles to refer to things, not what those scribbles should or should not refer to.
1. The question presupposes that there is a distinction between sex (biological) and gender (social/performative). If you don't affirm the distinction, fair enough, but debating it is not the intent of this post. — McMootch
The reference to each "symbol" becomes a matter of causal fact. Effects "symbolize" their causes. The tree rings in a tree stump don't pretend to be about the age of the tree. The tree rings are about the age of the tree because of how the tree grows throughout the year - cause and effect.
Haha yes, potentially. When implemented as automation. Then the reference of each symbol token becomes a matter of mechanical fact. As when a machine translates a phonetic symbol into a sound. When considered apart from such automation, the syntactic connections may well be made semantically, so that we acknowledge a pretended connection between, say, a written letter and a phoneme, or between one written token of the letter and another. — bongo fury
Your example is too basic and leaves too many questions unanswered. How does consciously observing scribbles on a page provide knowledge of unconscious processes?
If you've ever read a nonfiction book, you have gained knowledge of things you never had conscious experience of. You do not experience Agincourt when you read about it, but you still acquire knowledge about it. Same goes for science. You can learn about things the brain does that we are not conscious of by study, research, education, reading out of interest, etc. I don't really get why this is where the conversation is going. It seems a tad basic. — Kenosha Kid
I'm not one of those asserting that the mind is an illusion, or doesn't exist. What I'm saying is that our view of the world as "physical" boxes containing "non-physical" images and minds is wrong. The boxes are quantified information. There are no "physical" boxes with "non-physical" items in them. It is all information.
If I knew, I'd be famous. Assuming I could explain it to the rest of you bullet-biting p-zombies. — Marchesk
Given that our knowledge and understanding of brains is in the form of conscious visual models, if our minds are illusions, then so is our understanding of brains. All the deniers do is undermine their own theories of how brains work.
One of the strangest things the Deniers say is that although it seems that there is conscious experience, there isn't really any conscious experience: the seeming is, in fact, an illusion. — Olivier5
Then how do you know that minds or images don't literally exist in computers?
How do images "literally" exist inside brains?
— Harry Hindu
I don't know. — Marchesk
It's only a hard problem if you're a dualist. You have to explain how certain hardware contains minds and other hardware doesn't. The problem is thinking in "physical" and "mental" terms - that there are physical boxes that contain these non-physical things we call images and minds.
It's a hard problem. But maybe we'll know in another century. — Marchesk
That's just rephrasing your statement that images are in minds. What does it mean for a mind to produce images? Does your computer produce images on the screen? Where is the image of this web page - in your brain, in your mind, or on the computer monitor?
Produced by minds, part of the makeup of minds, however you wish to phrase it. Mind being a word for consciousness, thinking, intentionality, desire and anything that's difficult to reduce to neurons firing and chemicals flowing. — Marchesk
Well, that was my question: how do minds exist "inside" brains?
I don't know. They exist in our minds, though, and arguably nowhere else. — Marchesk
How do images "literally" exist inside brains?Yep, images and sounds don't literally exist inside computers. They're encoded as information for output devices that create sound and light waves for our eyes and ears. — Marchesk
Your disagreement isn't a valid argument against anything I've said.
So, everywhere. I disagree. — bongo fury
Then semantics/meaning is a fiction?
But it's a special fiction indulged by animals capable of playing along. — bongo fury
The problem is that you are still aware when asleep. You wake up suddenly to loud noises. How could you do that unless you were at least partially aware? Are you conscious while dreaming?
The definition of consciousness I'm going to use here is awareness of the external world and also of oneself. It's quite obvious that this is what is meant by consciousness by most folks, as when these don't occur, e.g. when one is asleep or in a coma, we're said to be unconscious. — TheMadFool
You used the term, "experiences", so I'm asking you how you were using the term.I'm not sure what specifically you're asking. We have brains that react to external stimuli and convert that reaction into what we consciously experience via various transformations and augmentations. What bit of that are you questioning: How things can react to external stimuli (physics)?; Why we have brains that can do this (evolution)?; How brains do this (neurology)?; Or are you just asking about the first-/third-person distinction, e.g. why a stimulated nucleus accumbens feels like pleasure? — Kenosha Kid
What does it mean to be "conscious" of something?
My bad, I used the term 'phenomena' in an inconsistent way. What I meant was that there are _processes_ in the brain that we are not conscious of (e.g. outline detection, pattern-matching, etc.) and processes that we are conscious of (e.g. rational decision-making). — Kenosha Kid
It depends on what you mean by "conscious" and "conscious efforts". How does one come to consciously know that they are unconscious of many processes occurring in the brain? :brow: It sounds like a meaningless contradiction to me.
A child can just ask 'why?' to every answer; that's not interesting conversation. Do you believe that you are conscious of every thing your brain does, including the cited examples of inverting the retinal image, white-shifting colours, outline detection? Do you claim you make a conscious effort to do these things? Do you consciously regulate your breathing at every moment? Consciously produce dopamine when you spot something surprising that you consciously decide is good?
If not, then you already know that you are unconscious of many (indeed most) of the processes occurring in the brain, and your incredulity is less than credible. — Kenosha Kid
Exactly! The relationship between cause and effect is information, and information is a fundamental unit of cognition.
That would be the fundamental unit of cognition - basic cause and effect. The sand acknowledges the pressure of the footprint and gives way accordingly. It's a long way from the complicated cognition we enjoy, but it is the start of it. — Pop
Isn't your footprint information that Daemon passed this way? Doesn't the sand have a memory of your passing - the persistent existence of your footprint in the sand? Once the footprint is washed away, the sand forgets you ever passed this way.
Was he saying that the sand on the beach (for example) was capable of cognition? — Daemon
Then it appears that there is no difference between an illusion of consciousness that doesn't have proper semantics and one that does. Semantics is derived from the syntax - from the relationship between the rules and what the rules cause one to do or not do.
The same as the distinction between an illusion of consciousness that (like the Chinese Room) doesn't have a proper semantics, and one that does. — bongo fury
But words are just colored scribbles and sounds. It seems like you'd have a problem defining the nature of words, too.
Harry I don't have a problem defining consciousness and suchlike. Like many words they are defined ostensively.
Wikipedia:
An ostensive definition conveys the meaning of a term by pointing out examples. This type of definition is often used where the term is difficult to define verbally, either because the words will not be understood (as with children and new speakers of a language) or because of the nature of the term (such as colours or sensations) — Daemon
This is all just more information. All causal relations, which include logical entailments, are information.
More seriously, the fundamental stuff of physics like fields, energy, matter, forces, spacetime and all the stuff that's logically entailed by that. — Marchesk
Because it's difficult to derive meaning from anything Banno says. It probably has to do with how he uses words.
Banno-inspired perception-related debates used to go 100+ pages. And it often included talk of apples. — Marchesk
How about why you have experiences at all?
Sorry for the late reply. I've seen this argument a few times and never got the sense of it. We don't need any knowledge of the workings of the brain to understand why I don't experience your sensory input. It is not a neurological question. It's not even a sensible question imo. — Kenosha Kid
Actually it doesn't, which is why it is broken down into functional systems not specific parts of the brain. Either system could be, and likely is, distributed. But certainly parts of the brain are dominant in certain functions. — Kenosha Kid
What the heck does this even mean? What is the difference between unconscious and conscious phenomena, or systems? If the systems are distributed, then how is it that they aren't aware of what is going on in the other parts? How is the brain itself not aware of what its different systems are doing? Can an unconscious system be aware of what the unconscious and conscious systems are doing?
The answer to that is precisely why we labour under the illusion that we make those decisions consciously. Recall that we are not conscious of the unconscious causes of conscious phenomena. Decisions from System 1 are presented to System 2 apparently uncaused (i.e. without System 2 being aware of the process). So from System 2's point of view, decisions originate in System 2. There are lots of published tests for this. — Kenosha Kid
Banno-inspired perception-related debates used to go 100+ pages. And it often included talk of apples. — Marchesk
:rofl:
Have we come to any sort of consensus as to what color is? Or pain?
If it's not qualia, is it ... a model? A language game? A private beetle we can't talk about? — Marchesk
It appears that you've answered your own question.
the piano not perceiving certain inputs from the keyboard? Does it not perceive the meaning of your keystrokes and make the correct sounds for you to listen to? — Daemon
I would say the theory is ideal, in that it's humans creating a map of the territory, while the territory itself might be understood as physical, assuming a physicalist ontology. That does allow for the possibility that the theory is missing something fundamental. A map is only as good as the map makers and their knowledge of the territory. — Marchesk
Humans, maps and territory are all observable, so I don't know what Marchesk means by "ideal" other than that they like the theory, or that it works for them. The fundamental aspect that is missing is causation - of how maps can be about territories.
just means "in terms of observable phenomena". — Janus
Again, what does it mean to feel?
Do you think a piano feels something when you press the keys? — Daemon
Is it the theory that is physical, or what the theory is about (what it points to) that is physical, or both?
When it comes to producing speculative hypotheses regarding the origins of life and consciousness physical theories are all we have, because only they are testable. That doesn't mean you can't speculate idealistically; it just means there is no way to test such speculations. — Janus
No, because this is the primary point of contention, and you keep ignoring the contradiction you keep making. What makes the hardware in your head special in that it feels, but computer hardware can't? What does it mean to feel?
No, the computer is not perceiving inputs in the way you and I perceive things. Press a finger against the back of your hand. You feel a sensation. When you press a key on the computer keyboard, the computer doesn't feel a sensation.
Shall we try to agree on this before we move on to the rest of your ideas? — Daemon
So understanding has to do with perceiving meaning? What do you mean by "perceive"? Is the computer not perceiving certain inputs from your mouse and keyboard? Does it not perceive the meaning of your keystrokes and mouse clicks and make the correct characters appear on the screen and windows open for you to look at?
It doesn't have any understanding. It doesn't perceive the intended meaning, it doesn't perceive anything. It isn't equipped to perceive anything. — Daemon
But the semantics weren't ascribed by you. They were ascribed by your teacher(s) who taught you how to translate words. You weren't born knowing any language, much less how to translate them. You had to be taught that. You also weren't the one that created languages, to define what scribble and sound refers to what event or thing. You had to be taught that. You used your eyes and ears (your inputs) and your brain (your processor) to learn, to discern the patterns, so that you may survive in this social environment (produce the appropriate outputs as expected by your peers).
Semantics, meaning, is not intrinsic to the physics of my PC. The semantics is ascribed, in this case by me, when I tell it how to translate words and phrases. — Daemon
That's because the only thing it knows is to spit out this scribble when it perceives a certain mouse click or keystroke. It has the same instructions as the man in the room - write this scribble when you perceive this scribble. It doesn't have instructions that actually provide the translation - this word = that word - and what that word points to outside of the room, which is how you understand it, because that is how you learned it.
The translation tool often produces quite spooky results, it certainly looks like it understands to a naive observer, but it's easy to see that it doesn't understand when you allow it to translate on its own without my intervention (which I never do in practice). — Daemon
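A minimal sketch of the contrast being drawn here, with made-up symbols and words (nothing from Searle's actual thought experiment): the room's rulebook only maps one scribble to another, whereas a speaker's rules also tie a word to something outside the room.

```python
# The room's instructions: purely symbol-to-symbol, "write this scribble
# when you perceive this scribble". Tokens are made up for illustration.
room_rules = {"□": "◇", "△": "○"}

def room_reply(symbol):
    return room_rules.get(symbol, "?")  # no idea what either mark means

# A speaker's rules (crudely): the word is also connected to a referent
# outside the room, not just to another word.
speaker_rules = {
    "apple": {"translation": "pomme", "refers_to": "the fruit on the table"},
}

print(room_reply("□"))                       # ◇
print(speaker_rules["apple"]["refers_to"])   # what the word points to
```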
Because it doesn't have the same set of instructions, nor the need to learn them, that you did when you learned them. But that doesn't mean it couldn't, if it had the need and the correct set of instructions.
One of the author's conclusions is that "linguistic meaning is derived from the role things and people play in everyday life". I said something about this above, using the word "good" and the translation of machine assembly instructions as examples.
If the translation tool's understanding was the same as mine, as you seem to want to believe, then machine translation would be as good as human translation. But it isn't! — Daemon
What role does "hello" play? Does this not mean that that utterance refers to the role that it plays?At the least, can you consider the possibility that there are parts of language, things we do with words, for which the meaning is not given by the referent, but is instead found in the role these utterances and scribbles play in our day to day lives? — Banno
No. The question is just a different way of framing the hard problem of why there are two very different perspectives of mental processes, but only one type of perspective for everything else, like chairs, mountains and trees.
Because the fact that I am bert1 allows me two different perspectives to examine bert1's mental processes: introspection and extrospection. Whereas other people only have extrospection as a way of observing bert1's mental processes (to the extent that they can do so at all).
Is that question equivalent to "Why am I some particular person, rather than no one in particular?"? — bert1
Isn't that how you learned the translation of a word and then use the translation? Didn't you have to learn (be programmed with) that information via your sensory inputs to then supply that information when prompted? How is the translation tool's understanding different from a brain's understanding?
I don't need the translation tool Harry, I can do the translation on my own, the tool just saves me typing. When I come across a word that isn't in my Translation Memory I add it to the memory, together with the translation. Then the next time that word crops up I just push a button and the translation is inserted. The translation tool doesn't understand anything. — Daemon
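A minimal sketch of the Translation Memory workflow Daemon describes, with hypothetical names and single words standing in for the segments real CAT tools store: the translator adds each new pairing to the memory, and the tool only looks pairings up on request.

```python
# Hypothetical translation-memory sketch: a plain lookup table the translator fills.
translation_memory = {}

def add_to_memory(source_term, translation):
    """The translator supplies the pairing; the tool merely stores it."""
    translation_memory[source_term] = translation

def suggest(source_term):
    """Return the stored translation if the term has been seen before, else None."""
    return translation_memory.get(source_term)

add_to_memory("Vertrag", "contract")
print(suggest("Vertrag"))  # contract
print(suggest("Haftung"))  # None - the tool has nothing until it is told
```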
You used the phrase. I thought you knew what you were talking about. I would define it as a kind of working memory that processes sensory information.
A dictionary definition of "understand" is "perceive the intended meaning of". Another dictionary says "to grasp the meaning of".
What do you think conscious experiences are? — Daemon
If you are the translator, then why do you need a translation tool? Where do the translations reside - in your brain or in your tool? If you need to look them up in a tool, then the understanding of that particular translation is in the tool, not in your brain.
I'm a translator, I use a computer translation tool — Daemon
These are all unfounded assertions without anything to back them up. What are conscious experiences? What do you mean by "understand"?
You're right that the rules in the room are not those that Chinese speakers use. But that's the point: a computer can't understand language in the way we can. The reason is that we learn meaning through conscious experience. — Daemon
The instructions in the room are written in a language - a different language than Chinese. How did the man in the room come to understand the language the instructions are written in? I've asked this a couple of times now, but you and Apo just ignore this simple, yet crucial, fact.
You're right that the rules in the room are not those that Chinese speakers use. But that's the point: a computer can't understand language in the way we can. The reason is that we learn meaning through conscious experience. — Daemon
I don't see a world of difference between them. Algorithms are a type of constraint.
There is a world of difference between rules as algorithms and rules as constraints. — apokrisis
I'm not clear on how this answers the question.
And what is it about you that provides you with different evidence of your consciousness than I have of your consciousness?
— Harry Hindu
I can introspect myself, but others can't. — bert1
Of course life and minds follow rules. You are following the rules of the English language that you learned in grade school when you type your posts. Ever heard of the genetic code? Why do you keep saying things that a simple observation shows to be untrue?
But life and mind don't "follow rules". They are not dumb machine processes. They are not algorithmic. Symbols constrain physics. So as a form of "processing", it is utterly different. — apokrisis
No. To understand language is to possess a set of rules in memory for interpreting particular scribbles and sounds. Like I said, understanding is the possession of a set of rules in memory for interpreting any sensory data. The man in the room has a different set of rules for interpreting the scribbles on the paper than the rules that Chinese people have for interpreting those same symbols. Hence, the instructions in the room are not for understanding Chinese, because they are not the same set of rules that Chinese speakers learned or use. The room does understand something: it understands "write this symbol when you see this symbol." The room also understands the language the instructions are written in. How can that be if the room, or the man, doesn't understand language?
To understand language is to know how to act. That knowing involves constraining the uncertainty and instability of the physical realm to the point that the desired outcome is statistically sure to happen. — apokrisis
It seems like that is your problem to solve. You are the dualist, so you are the one who sees this as a hybrid device. As a monist, I don't see it as such. What is it about carbon that is so special that it is the only element capable of producing a hybrid device in the sense you are claiming here? Why do you think that natural selection is so often mistaken for a smart process (intelligent design) rather than a dumb (blind) one?
So you can't just hand wave about reconnecting the computer to the physics. You have to show where this now hybrid device is actually doing what biology does rather than still merely simulating the physics required. — apokrisis
