Can you tell me how I could get all that stuff about the store shelf and banana bread into my translation memory? — Daemon
I can sketch it out.
You need some bootstrap capabilities outside of dictionaries... things like humans have; e.g.:
A camera does not see. — Daemon
...the ability to see. — InPitzotl
To most people. Why do you keep refusing to accept this? In a place where the councillors are corrupt/vicious why not the opposite. — I like sushi
What exactly are we discussing and for what purpose? I fail to see the profundity or any possible fruit of this topic. Computers/AI ≠ human comprehension. I doubt there was any disagreement at any point. — Outlander
That raises the important question of what understanding is and, more importantly, whether it is something beyond the ability of a computer AI? — TheMadFool
Now, I don't know how you have reduced such an interesting topic as "The important question of what understanding is" into a translation subject! — Alkis Piskas
That raises the important question of what understanding is and, more importantly, whether it is something beyond the ability of a computer AI? — TheMadFool
How would you prove that this extra thing beyond rule following, this 'understanding' exists? — frank
Hm, I can see the point — Outlander
I guess a question would be: how do you know your experiences are similar enough to allow understanding? — frank
What TMF is talking about is a mapping from "banana" to the stuff on the store shelf, the stuff infused within banana bread, the stuff in banana cream pies. — InPitzotl
I guess you wrote these sentences because you seem to be offended that I, and another above, have pointed out they are poorly written. — I like sushi
Here’s a better example: “The chicken is ready to eat” — I like sushi
The point is that it is on the writer to avoid ambiguity in sentences when needed. — I like sushi
Ok so, what's your definition of understanding?
Please don't repeat yourself by saying, "...as illustrated by my examples...". — TheMadFool
If I'm correct, maybe someone can do a better job than I explaining why. — tim wood
Understanding is mapping — Hermeticus
Thank you! — TheMadFool
If we were to talk in hypotheticals: — Hermeticus
Sensation
The physical principles behind these sensors and the senses of our body are literally the same. — Hermeticus
We already have this. — Hermeticus
And as for representations - computers are literally built on it. They're a representational system. Everything you see on your browser is a representation of a programming language. The programming language is a representation of another programming language (machine code). Machine code is a representation of bit manipulation. Bits are a representation of electric current. — Hermeticus
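Hermeticus's chain of representations can be traced concretely. A minimal Python sketch, following one character down through the layers (the layer names in the comments are illustrative, not technical terms):

```python
# Tracing one on-screen character down the chain of representations.
char = "A"                        # what you see in the browser
code_point = ord(char)            # higher-level representation: Unicode code point
bits = format(code_point, "08b")  # machine-level representation: a bit pattern

print(char)        # A
print(code_point)  # 65
print(bits)        # 01000001

# Each layer only "means" something to whatever interprets it: the pattern
# 01000001 is "A" to a text renderer, 65 to an arithmetic unit, and just
# eight voltage levels to the transistors underneath.
```

The same pattern is read differently at each level, which is the sense in which computers are "built on" representation.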
I've argued that it's absolutely possible for an AI to have the same experiences we have with our senses — Hermeticus
If you just want to talk about what understanding in general is, I'm totally with TheMadFool here. Understanding is mapping. Complex chains of association between sensations and representations. — Hermeticus
Yet, they're doing things (proving theorems) which, in our case, require understanding. — TheMadFool
Please tell me,
1. What understanding is, if not mapping? — TheMadFool
On what basis can we say that an artificial brain wouldn't be possible in the future? — Hermeticus
The field of bionic prosthetics has already managed to send all the right signals to the brain. There are robotic arms that allow the user to feel touch. They are working on artificial eyes hooked up to the optic nerve, and while they're not quite finished yet, the technology is already proven to work. — Hermeticus
There is a reason why all living beings, even very simple ones, cannot be described in terms of chemistry alone. It's that they also encode memory which is transmitted in the form of DNA. — Wayfarer
Machines can experience physical phenomena that reflect our perception: from cameras to thermometers to pressure and gyro sensors, there is no sense of ours that can't be adopted digitally. This means that fundamental concepts like "up" and "down", "heavy" and "light" can indeed be experienced by computers. — Hermeticus
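Hermeticus's claim can be sketched as a toy program that bins raw sensor readings into those fundamental concepts. The sensor values, function names, and thresholds below are made up for illustration:

```python
# A sketch: deriving "up/down" and "heavy/light" from raw sensor readings.
# Values and thresholds are hypothetical, chosen only to illustrate the idea.

def orientation(accel_z):
    """Sign of gravity along the device's z-axis, as an accelerometer reads it."""
    return "up" if accel_z > 0 else "down"

def heft(grams, threshold=1000):
    """A load-cell reading binned into the heavy/light distinction."""
    return "heavy" if grams >= threshold else "light"

print(orientation(9.81))  # up
print(heft(120))          # light
print(heft(5000))         # heavy
```

Whether binning a number counts as *experiencing* "up" is, of course, exactly what the thread is disputing; the sketch only shows the mapping is mechanically trivial.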
My own view on the matter is that semantics is simply what's called mapping: a word X is matched to a corresponding referent, say, Jesus. I'm sure this is possible with computers. It's how children learn to speak, using ostensive definitions. We could start small and build up from there, as it were. — TheMadFool
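TheMadFool's "semantics as mapping" picture can be sketched as a toy program: words are keys, referents are stand-ins for things pointed at, and "learning" is adding entries ostensively. The function names and referent strings here are hypothetical:

```python
# A toy "ostensive definition" learner: semantics as a word -> referent mapping.
# Referents are just string stand-ins for things in the world.

lexicon = {}

def ostend(word, referent):
    """Point at a referent while uttering a word, i.e. add a mapping."""
    lexicon[word] = referent

def understand(word):
    """On the mapping picture, 'understanding' is a successful lookup."""
    return lexicon.get(word, "<no mapping: not understood>")

ostend("banana", "the yellow fruit on the store shelf")
ostend("Jesus", "the historical person referred to by the name")

print(understand("banana"))   # the yellow fruit on the store shelf
print(understand("freedom"))  # <no mapping: not understood>
```

The sketch also makes Daemon's objection visible: the "referent" the machine holds is another symbol, not the stuff on the store shelf.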
Because it is true? But maybe you don't get my meaning and are making it too concrete. I've met dozens of folk who work in government and management who can talk for an hour using ready-to-assemble phrases and current buzzwords without saying anything and, more importantly, not knowing what they are saying. — Tom Storm
There have been many studies where the mapped neural networks have been fed to the computer and the code was able to replicate the activity.
That is, if I were to, say, touch an apple with my finger, a set of localised neurons would fire in my brain; that is recorded with an fMRI machine, and the data set is transferred to a computer in hopes of replication. — TheSoundConspirator
AI (and neural nets) is just a showy way of talking about laptops and PCs. Can you point to an "AI" that isn't just a digital computer? — Daemon
Quantum computers? I'm not sure. Also, why do you ask? — TheMadFool
Voltage Tolerance of TTL Gate Inputs
TTL gates operate on a nominal power supply voltage of 5 volts, +/- 0.25 volts. Ideally, a TTL “high” signal would be 5.00 volts exactly, and a TTL “low” signal 0.00 volts exactly.
However, real TTL gate circuits cannot output such perfect voltage levels, and are designed to accept “high” and “low” signals deviating substantially from these ideal values.
“Acceptable” input signal voltages range from 0 volts to 0.8 volts for a “low” logic state, and 2 volts to 5 volts for a “high” logic state.
“Acceptable” output signal voltages (voltage levels guaranteed by the gate manufacturer over a specified range of load conditions) range from 0 volts to 0.5 volts for a “low” logic state, and 2.7 volts to 5 volts for a “high” logic state: — https://www.allaboutcircuits.com/textbook/digital/chpt-3/logic-signal-voltage-levels/
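The quoted thresholds can be turned into a small checker. A sketch using exactly the input-level figures from the excerpt (standard TTL; anything between the low and high bands is undefined behavior for the gate):

```python
# Classify a voltage against the quoted standard-TTL *input* thresholds:
# low: 0.0 V to 0.8 V; high: 2.0 V to 5.0 V; anything in between is undefined.

def ttl_input_state(volts):
    if 0.0 <= volts <= 0.8:
        return "low"
    if 2.0 <= volts <= 5.0:
        return "high"
    return "indeterminate"  # the gate's response is not guaranteed here

for v in (0.3, 0.8, 1.4, 2.7, 5.0):
    print(v, ttl_input_state(v))
# 0.3 low, 0.8 low, 1.4 indeterminate, 2.7 high, 5.0 high
```

Note the asymmetry the excerpt describes: guaranteed *output* bands (0–0.5 V, 2.7–5 V) are narrower than acceptable *input* bands, which is what gives TTL its noise margin.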
I'm referring to AI, at the moment hypothetical, but that doesn't mean we don't know what it should be like: us, fully autonomous (able to think for itself, among other things).
For true AI, there's only one way of making it self-governing: the autonomy has to be coded. But then that's like commanding (read: no option) the AI to be free. Is it really free then? After all, it slavishly follows the line in the code that reads: You (the AI) are "free". Such an AI, paradoxically, disobeys, yes, but only because it obeys the command to disobey. This is getting a bit too much for my brain to handle; I'll leave it at that. — TheMadFool