Tanquam ex ungue leonem (I recognize the lion by his claw). — Johann Bernoulli
Memory is just data storage. Pattern recognition is the beginning of cognition: knowing, consciousness. Pattern recognition sees the invisible (meaningful) links between isolated bits of information. Human intelligence is far ahead of AI in its ability to do more than just mimic. Plus the human mind uses a variety of cognitive processes -- beyond pure logic (e.g. emotional & visceral & muscle memory) -- to add nuance to sensation. :brow:
That's the power of memory.
That's the power of pattern recognition. — Agent Smith
How do art aficionados/experts identify the provenance of a painting? — Agent Smith
First, document search. Sales records.
Second, paint used. Many forgers get caught using paint that was not available at the time. — Jackson
I recommend reading about Artificial Neural Networks:
Neural networks learn (or are trained) by processing examples, each of which contains a known "input" and "result," forming probability-weighted associations between the two, which are stored within the data structure of the net itself. The training of a neural network from a given example is usually conducted by determining the difference between the processed output of the network (often a prediction) and a target output. This difference is the error. The network then adjusts its weighted associations according to a learning rule and using this error value. Successive adjustments will cause the neural network to produce output which is increasingly similar to the target output. After a sufficient number of these adjustments the training can be terminated based upon certain criteria. This is known as supervised learning.
Such systems "learn" to perform tasks by considering examples, generally without being programmed with task-specific rules. For example, in image recognition, they might learn to identify images that contain cats by analyzing example images that have been manually labeled as "cat" or "no cat" and using the results to identify cats in other images. They do this without any prior knowledge of cats, for example, that they have fur, tails, whiskers, and cat-like faces. Instead, they automatically generate identifying characteristics from the examples that they process. — Relativist
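The training loop described above (output, error, weight adjustment, repeat) can be sketched in a few lines of Python. This is a minimal illustration of the idea, not any particular library's implementation; the AND-gate dataset, learning rate, and epoch count are arbitrary choices made for the example.

```python
# A minimal sketch of supervised learning in a single artificial neuron.
# The labeled examples, learning rate, and epoch count are illustrative.

def step(weighted_sum):
    """Threshold activation: the neuron fires (1) if the weighted sum is positive."""
    return 1 if weighted_sum > 0 else 0

# Toy labeled examples: (input features, target output).
# The target here is the logical AND of the two inputs.
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

weights = [0.0, 0.0]
bias = 0.0
learning_rate = 0.1

for epoch in range(20):  # successive adjustment passes over the examples
    for (x1, x2), target in examples:
        output = step(weights[0] * x1 + weights[1] * x2 + bias)
        error = target - output  # difference between output and target
        # Learning rule (perceptron rule): nudge weights toward the target.
        weights[0] += learning_rate * error * x1
        weights[1] += learning_rate * error * x2
        bias += learning_rate * error

# After training, the neuron reproduces the labels it was shown.
for (x1, x2), target in examples:
    print((x1, x2), step(weights[0] * x1 + weights[1] * x2 + bias), target)
```

The neuron is never told what AND "means"; like the cat-image example, it only sees labeled inputs and adjusts its weighted associations until its output matches the targets.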
Interesting subject, AS! Let me take one shot at the goal. Human memory functions very differently from computer memory. If an image is projected onto our retina, a corresponding neural structure is activated. The world is full of patterns and forms whose parts have no causal connection to the whole. Diametrically opposed parts of a circle don't influence one another directly, but still the circle (or the spherical form of the sun) stays a circle. All parts have a common cause and together form the circle. — Hillary
Ie ie! Yokoso! (No, no! Welcome!)
You're welcome. — Gnomon
There's the real possibility that brain function is radically unlike that of computers. We'll have to wait for (neuro)science to tell us how, as I have a feeling this matter is still not as cut-and-dried as we would've liked. — Agent Smith
Count me among those who think it’s a mistake to treat the mind as a computational device and neurons as 1’s and 0’s. I hew to enactivist cognitive psychology, which rejects computationalism and representationalism when it comes to modelling human perception.
Relating this back to your distinction between memory and pattern recognition, I would argue that the neural activity of the brain is constantly changing in response both to external stimuli and to its own activity. This means that memory is not stored patterns that remain unchanged until accessed. Meanwhile, what is perceived comes already pre-interpreted based on prior expectations. So memory, in the form of expectations, co-determines what counts as data in the first place. All perception is recognition because of this contribution of anticipatory neural activity to perception at even the lowest levels. — Joshs
Yes. That's how AI chess players beat humans: they have instant access to thousands of historical games and situational plays. The only thing that keeps humans in the game today is creativity: to do what hasn't been done before, hence is not yet in memory. :smile: — Gnomon
However, pure memory seems adequate to appear intelligent. You could, for instance, memorize every question and its answer and pass yourself off as a genius, but are you? — Agent Smith
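The "memorize every question" genius is, in programming terms, just a lookup table. A tiny sketch (the question/answer pairs are invented purely for illustration):

```python
# A "genius" built from pure memory: a lookup table of memorized answers.
# The question/answer pairs here are invented for illustration.
memorized = {
    "What is the capital of France?": "Paris",
    "What is 2 + 2?": "4",
}

def memorized_genius(question):
    # Appears intelligent on anything it has seen before,
    # but has nothing to say about a genuinely new question.
    return memorized.get(question, "I don't know")

print(memorized_genius("What is 2 + 2?"))   # prints "4" -- looks smart
print(memorized_genius("What is 3 + 3?"))   # prints "I don't know" -- novelty defeats pure memory
```

Which is Agent Smith's point: memory alone passes until the question isn't in memory yet, and only creativity covers that gap.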
In my humble opinion, any noncomputational model of mind reads like gobbledygook. Maybe that's just me; I'm not smart, you see. — Agent Smith
Yes. That's how AI chess players beat humans: they have instant access to thousands of historical games and situational plays. The only thing that keeps humans in the game today is creativity: to do what hasn't been done before, hence is not yet in memory. — Gnomon
The brain doesn't compute. It simulates. The mind computes. — Hillary
DOES NOT COMPUTE! — Agent Smith
Yes. But the human mind evolved for quick back-of-the-envelope solutions to pattern-recognition problems: tiger or bush? The computer was developed & dedicated specifically for maze-running expertise. Just think how dumb humans will feel when Quantum AI learns to play war games like SkyNet. :smile: — Gnomon
Isn't a computer Go world champion? — Hillary
Actually, the first so-called "computers" were women mathematicians. And their primary advantage over their male competitors was that they were able to sit still and focus on numbers for hours on end. Meanwhile, the men would get restless, their minds would wander, and they were made to look like fools by the very females who were not supposed to be "good with numbers". Unfortunately for those number-crunching gals, the digital computer is even more focused & relentless. But dumb! If they divided by zero, they would keep on crunching until kingdom come, or the machine burst into flames, whichever came first. :joke: — Gnomon
This generates a kinda sorta paradox where a fool (computer) beats a sage (a person, relatively speaking, that is). — Agent Smith
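Gnomon's divide-by-zero image reflects how human and mechanical "computers" actually divided: by repeated subtraction. With a divisor of zero the remainder never shrinks, so the procedure never terminates on its own. A sketch (the step cap is added only so this example halts; a machine without one really would grind on):

```python
def divide_by_repeated_subtraction(dividend, divisor, max_steps=100_000):
    """Long-hand division of nonnegative integers: subtract the divisor
    until it no longer fits. max_steps is a safety cap added for this
    sketch; with divisor == 0 the loop would otherwise run forever,
    'until kingdom come'."""
    quotient, remainder = 0, dividend
    while remainder >= divisor and quotient < max_steps:
        remainder -= divisor
        quotient += 1
    return quotient, remainder

print(divide_by_repeated_subtraction(17, 5))   # (3, 2): 17 = 3 * 5 + 2
print(divide_by_repeated_subtraction(17, 0))   # quotient hits the cap; never finishes naturally
```

The fool/sage paradox in miniature: the machine executes the rule perfectly and tirelessly, and that is exactly why it never notices the rule has become pointless.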
Just think how dumb humans will feel when Quantum AI learns to play war games like SkyNet — Gnomon
But the computer does! It computes big X times as fast as we do, on big Y times as much data as we do. Simulating intelligence. The brain is a universe in small. Everything there is in the world, we can resonate with. While walking the streets you constantly resonate with the world, and your inner world and your body in turn shape the world. From conception to last breath, no, from big bang to last breath: one ongoing process. No on or off button. Well, a final off button maybe... — Hillary
Actually, the first so-called "computers" were women mathematicians. And their primary advantage over their male competitors was that they were able to sit still and focus on numbers for hours on end. Meanwhile, the men would get restless, their minds would wander, and they were made to look like fools by the very females who were not supposed to be "good with numbers". Unfortunately for those number-crunching gals, the digital computer is even more focused & relentless. But dumb! If they divided by zero, they would keep on crunching until kingdom come, or the machine burst into flames, whichever came first. — Gnomon
"Sit, and focus!"
"I tell you something, stick tha numbers on your p$n$s! Oh no, too big a number!" — Hillary
the all-purpose problem-solving abilities of humans that can come up with good solutions to almost any problem they encounter. — Gnomon