That raises the important question of what understanding is and, more importantly, whether it is something beyond the ability of a computer AI. Speaking from my own experience, understanding/comprehension seems to start off at the very basic level of matching linguistic symbols (words, spoken or written) to their respective referents, e.g. "water" is matched to "cool flowing substance that animals and plants need", etc. This is clearly something a computer can do, right? After such a basic cognitive vocabulary is built, what happens next is simply the recognition of similarities and differences. So, continuing with my example, a crowd of fans moving in the streets will evoke, by its behavior, the word and thus the concept "flowing", or a fire will evoke the word/concept "not cold", and so on. In other words, there doesn't seem to be anything special about understanding: it involves nothing more than symbol manipulation and the ability to discern like/unlike things. — TheMadFool
It strikes me too that people are often not much better than computers: they often master languages (management speak, or whatever) and have no idea what they are saying. — Tom Storm
Nobody masters a language with no idea what they are saying. I don't know why you would suggest such a thing. — frank
Yes, human translators sometimes don't understand what they are translating. Everyone has been baffled by poorly translated product instructions, I guess. And sometimes this is because the human translator does not have experience assembling or using the product, or products like it. — Daemon
Because it is true? But maybe you don't get my meaning and are making it too concrete. I've met dozens of folk who work in government and management who can talk for an hour using ready-to-assemble phrases and current buzzwords without saying anything and - more importantly - without knowing what they are saying. — Tom Storm
It was an aside, Frank - the idea being that it is not only computers that can assemble syntax without connecting to the content. You don't have to agree. — Tom Storm
I have been a professional translator for 20 years. My job is all about understanding. I use a Computer-Assisted Translation (CAT) tool.
The CAT tool suggests translations based on what I have already translated. Each time I pair a word or phrase with its translation, I put that into the "translation memory". The CAT tool sometimes surprises me with its translations; it can feel quite spooky, as if the computer understands. But it doesn't, and it can't.
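A translation memory of the kind Daemon describes can be sketched as a lookup over previously stored segment pairs. The toy version below (the class name, the similarity threshold, and the example sentences are my own illustration, not any real CAT tool) uses Python's `difflib` for fuzzy matching against earlier translations:

```python
from difflib import SequenceMatcher

class TranslationMemory:
    """Toy translation memory: stores source/target segment pairs and
    suggests the stored translation whose source best matches a query."""

    def __init__(self):
        self.pairs = []  # list of (source, target) segments

    def add(self, source, target):
        self.pairs.append((source, target))

    def suggest(self, query, threshold=0.6):
        best = None
        best_score = threshold
        for source, target in self.pairs:
            # Similarity of character sequences only - no meaning involved
            score = SequenceMatcher(None, query.lower(), source.lower()).ratio()
            if score >= best_score:
                best, best_score = target, score
        return best  # None if nothing stored is similar enough

tm = TranslationMemory()
tm.add("Turn the machine off before maintenance.",
       "Schalten Sie die Maschine vor der Wartung aus.")
print(tm.suggest("Turn the machine off before any maintenance."))
```

Note that the "spooky" suggestions come purely from string similarity to segments the human already translated; the tool matches character sequences, not meanings, which is exactly the distinction Daemon goes on to draw.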
I do a wide range of translation work. I do technical translations, operating and maintenance instructions for machines for example. To understand a text like that, you need to have had experience of work like that. Experience is the crucial element the computer lacks. Experience of all facets of our world. For example, to understand fundamental concepts like "up" and "down", "heavy" and "light", you need to have experienced gravity.
I translate marketing texts. Very often my clients want me to make their products sound good, and they want their own customers to feel good about their products and their company. To understand "good" you need to have experienced feelings like pleasure and pain, sadness and joy, frustration and satisfaction.
I translate legal texts, contracts, court documents.
A. The councillors refused to allow the protestors to demonstrate, because they advocated violence.
B. The councillors refused to allow the protestors to demonstrate, because they feared violence.
A computer can't understand that "they" refers to the protestors in A but the councillors in B, because it isn't immersed in our complex world of experience. — Daemon
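Daemon's pair is a Winograd-schema-style example: the two sentences are syntactically identical, so any resolver that relies on sentence form alone must give the same answer for both. The naive rule below ("they" refers to the most recent plural noun"; the rule and the word list are my own illustration) shows that failure concretely:

```python
def resolve_they_naively(sentence):
    """Naive coreference: resolve "they" to the most recent plural noun
    before it - a purely syntactic rule with no world knowledge."""
    plural_nouns = {"councillors", "protestors"}
    words = [w.strip(",.").lower() for w in sentence.split()]
    pronoun_at = words.index("they")
    for w in reversed(words[:pronoun_at]):
        if w in plural_nouns:
            return w
    return None

a = ("The councillors refused to allow the protestors to demonstrate, "
     "because they advocated violence.")
b = ("The councillors refused to allow the protestors to demonstrate, "
     "because they feared violence.")
# The rule answers "protestors" for both, but a human reads "councillors" in B.
print(resolve_they_naively(a), resolve_they_naively(b))
```

Any syntax-only rule you substitute here has the same problem: since A and B differ only in one verb, distinguishing them requires knowing something about councillors, protestors, fear and violence, not about sentence structure.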
My own view on the matter is that semantics is simply what's called mapping: a word X is matched to a corresponding referent, say, Jesus. I'm sure this is possible with computers. It's how children learn to speak, using ostensive definitions. We could start small and build up from there, as it were. — TheMadFool
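TheMadFool's picture of semantics as mapping is, taken literally, a lookup table: each word is paired with a stored stand-in for its referent. A minimal sketch (the entries are taken from the examples in this thread):

```python
# Semantics-as-mapping: a word is paired with a stored "referent".
referents = {
    "water": "cool flowing substance that animals and plants need",
    "flowing": "moving continuously in a stream",
}

def meaning(word):
    return referents.get(word, "unknown word")

print(meaning("water"))
```

Notice that each entry maps a word onto more words, so the chain of definitions never terminates in anything outside the symbol system; that is the gap Daemon's objection targets.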
Each time I pair a word or phrase with its translation, I put that into the "translation memory". The CAT tool sometimes surprises me with its translations, it can feel quite spooky, it feels like the computer understands. — Daemon
Experience is the crucial element the computer lacks. — Daemon
For example, to understand fundamental concepts like "up" and "down", "heavy" and "light", you need to have experienced gravity. — Daemon
He can't understand what it is to "want" something through ostensive definition. He understands that through experiencing wanting, desire. — Daemon
Machines can experience physical phenomena that reflect our perception - from cameras to thermometers, pressure and gyro sensors - there is none of our senses that can't be replicated digitally. This means that fundamental concepts like "up" and "down", "heavy" and "light" can indeed be experienced by computers. — Hermeticus
After all, the statement that there are no physical phenomena corresponding to emotions is false. Strictly speaking, it's all chemistry: qualitatively and quantitatively measurable. — Hermeticus
A camera does not see. A thermometer does not feel heat and cold. A pressure sensor does not feel pressure. — Daemon
That is lumpen materialism. There is a reason why all living beings, even very simple ones, cannot be described in terms of chemistry alone. — Wayfarer
I think it's a question of how mechanistically someone perceives the human organism. — Hermeticus
There is a reason why all living beings, even very simple ones, cannot be described in terms of chemistry alone. It's that they also encode memory which is transmitted in the form of DNA. — Wayfarer
The examples I gave were intended to illustrate that semantics isn't simply mapping!
Mapping is possible with computers; that's how my CAT tool works. But mapping isn't enough: it doesn't provide understanding. My examples were intended to illustrate what understanding is.
Children learn some things by ostensive definition, but that isn't enough to allow understanding. I have a two-year-old here. We've just asked him "do you want to play with your cars, or do some gluing?"
He can't understand what it is to "want" something through ostensive definition. He understands that through experiencing wanting, desire. — Daemon
The field of bionic prosthetics has already managed to send all the right signals to the brain. There are robotic arms that allow the user to feel touch. They are working on artificial eyes hooked up to the optic nerve, and while they're not quite finished yet, the technology has already been proven to work. — Hermeticus
But this can't be done without using the brain! — Daemon
The difference is in the signal that is sent thereafter and how the signal is processed. — Hermeticus
I have a hard time imagining that AI will ever get a good grip on these concepts. — Hermeticus
If and how this could possibly translate to machines perceiving emotion is beyond me. — Hermeticus
On what basis can we say that an artificial brain wouldn't be possible in the future? — Hermeticus
Please tell me,
1. What understanding is, if not mapping? — TheMadFool
Bacteria can swim up or down what is called a chemical gradient. They will swim towards a source of nutrition, and away from a noxious substance. — Daemon
The bacterium does not experience the chemical concentration. — Daemon
My examples were intended to illustrate what understanding is — Daemon
Automated theorem proving (also known as ATP or automated deduction) is a subfield of automated reasoning and mathematical logic dealing with proving mathematical theorems by computer programs. — Wikipedia
Yet, they're doing things (proving theorems) which, in our case, requires understanding. — TheMadFool
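What the Wikipedia definition describes can be illustrated with a minimal forward-chaining prover over propositional Horn clauses. This is my own toy sketch, not any real ATP system; the rule encodes a pre-instantiated "all men are mortal" step:

```python
def forward_chain(facts, rules):
    """Derive everything provable from `facts` by naive forward chaining.
    Each rule is a pair (premises, conclusion): if all premises are
    derived, the conclusion is added."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in derived and all(p in derived for p in premises):
                derived.add(conclusion)
                changed = True
    return derived

# man(socrates) -> mortal(socrates), written as ground propositions
rules = [({"socrates_is_a_man"}, "socrates_is_mortal")]
print("socrates_is_mortal" in forward_chain({"socrates_is_a_man"}, rules))
```

The program mechanically closes a set of symbols under the given rules, which both sides of this exchange can agree on; whether that mechanical closure counts as the machine "proving" anything, or as us using it to prove something, is precisely the disagreement in the next posts.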
We can't, but this is science fiction, not philosophy. I love science fiction, but that's not what I want to talk about here. — Daemon
They aren't doing things; we are using them to do things.
It's the same with an abacus. You can push two beads to one end of the wire, but the abacus isn't then proving that 1 + 1 = 2. — Daemon