My contemplation of a math problem involves no qualia, but would be impossible without consciousness. — J
Very well said. I would say that thoughts are also a form of Qualia. — MoK
Sorry, our math contemplations do contain a lot of fine qualia that are maybe not so prominent as other, stronger qualia, but can still very much be sensed: i.e. rapture, elation, insight, direction, similarity - all of these are qualia feels, too. :)
We could posit that basically ALL of the contents of the conscious aware process are different levels of qualia, actually... (?) — Ulthien
Or if we insist on some such description, then we're talking to the humans who invented the program. — J
not really. The programmers gave them only the framework to learn, — Ulthien
Sorry, our math contemplations do contain a lot of fine qualia that are maybe not so prominent as other, stronger qualia, but can still very much be sensed: i.e. rapture, elation, insight, direction, similarity - all of these are qualia feels, too. — Ulthien
Bit of an odd reply on my part perhaps, and for that I apologize, — Outlander
Qualia — MoK
To me, Qualia are the texture of the experience. So it is the texture when it is applied to the experience.
Is there really no term or concept (even if it's not a simple one- or two-word term) synonymous with "Qualia"? — Outlander
It is the texture.
It's an invented term, presumably because no word suited what whoever coined it presumes or otherwise postulates it describes. Is there really no single word synonymous beyond the definition? — Outlander
Experience, to me, is a mental event: the result of the mind perceiving a substance. I have a thread on substance dualism where I discussed this. Physicalism is out of the discussion; I have a thread on "Physical cannot be the cause of its own change". Idealism is out of the discussion as well, since it cannot answer why ideas are coherent.
Is it not "experience" (perhaps as it relates to the brain-mind model)? — Outlander
I don't suspect an abacus is a conscious unit. While I suspect consciousness is everywhere, in all things, I don't think everything that humans view as physical units necessarily are conscious units. I think the unit must be processing information in order to be a conscious unit. — Patterner
You possess something that instruments don’t, namely, organic unity. — Wayfarer
Is "organic unity" not a collection of material components? Because as far as I'm aware, organic matter is matter. — Michael
The issue is more, what is it that is being named by "qualia"?
But what else should we substitute? — J
See the present thread for samples of the confusion they incur. — Banno
Very well said. I would say that thoughts are also a form of Qualia. — MoK
Yeah, probably a losing battle on my part. But I'd like to see more pushback against the easy acceptance of the fiction that a program is an entity or even an agent. With a name! Who starts sentences with "I . . . "! — J
Could you do that without giving it a power system and adding the rules so the abacus would manipulate the beads correctly? — Patterner
An abacus can be used to process information - it's a primitive computer. There's no real difference in principle between the abacus and a computer; the difference is one of scale. The NVIDIA chips that drive AI have billions of transistors embedded in a patch of silicon. You could in principle reproduce that technology with an abacus, although it would probably be the size of a city, and it would take long periods of time to derive a result. But in principle, it's the same process. — Wayfarer
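[A minimal sketch, not from the thread, of Wayfarer's "abacus as primitive computer" point: adding numbers rod by rod with carries is a purely mechanical rule, yet it processes information, which is the same thing transistor logic does at vastly greater scale. The function name and digit encoding are my own illustration.]

```python
def abacus_add(a: str, b: str) -> str:
    """Add two decimal numbers the way an abacus does: rod by rod, carrying overflow."""
    rods_a = [int(d) for d in reversed(a)]   # least-significant rod first
    rods_b = [int(d) for d in reversed(b)]
    result, carry = [], 0
    for i in range(max(len(rods_a), len(rods_b))):
        beads = carry
        beads += rods_a[i] if i < len(rods_a) else 0
        beads += rods_b[i] if i < len(rods_b) else 0
        result.append(beads % 10)   # beads left standing on this rod
        carry = beads // 10         # overflow pushed to the next rod
    if carry:
        result.append(carry)
    return "".join(str(d) for d in reversed(result))

print(abacus_add("478", "394"))  # → 872
```

Nothing in the rule "knows" arithmetic; the answer falls out of bead positions and a carry rule, which is the in-principle equivalence being claimed.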
I understand you are asking something, but it is not at all clear to me, what. — Banno
The problem is that qualia are no more clearly defined than is consciousness, and so are not all that helpful. — Banno
Organisms operate by different principles to non-organic matter. — Wayfarer
The point is they are qualities of experience and therefore precisely what eludes objective description. — Wayfarer
The issue is more, what is it that is being named by "qualia"? — J
The idea was that philosophers define consciousness in terms of qualia. The problem is that qualia are no more clearly defined than is consciousness, and so are not all that helpful. — Banno
How is "qualities of experience" clearer than that? — Banno
The really hard problem of consciousness is the problem of experience. When we think and perceive, there is a whir of information-processing, but there is also a subjective aspect. As Nagel (1974) has put it, there is something it is like to be a conscious organism. This subjective aspect is experience. When we see, for example, we experience visual sensations: the felt quality of redness, the experience of dark and light, the quality of depth in a visual field. Other experiences go along with perception in different modalities: the sound of a clarinet, the smell of mothballs. Then there are bodily sensations, from pains to orgasms; mental images that are conjured up internally; the felt quality of emotion, and the experience of a stream of conscious thought. What unites all of these states is that there is something it is like to be in them. All of them are states of experience. — Facing Up to the Problem of Consciousness, David Chalmers
This definition is used in academic philosophy, especially in A-level and university-level discussions of consciousness and the mind. — Ulthien
Yes, computers are physical systems. But an abacus is not a computer. It can't process information unless you give it a power system and add the rules so it manipulates the beads correctly. IOW, unless you turn it into a computer. — Patterner
Yes, it would need all of that - but the point being, computers are still physical systems. — Wayfarer
Do you think they use "learn" and "teach" inappropriately in this article?
(Notice my careful avoidance of the term "learn"! :wink: There is no entity here that can learn anything.) — J
In March of last year, Google's (Menlo Park, California) artificial intelligence (AI) computer program AlphaGo beat the best Go player in the world, 18-time champion Lee Se-dol, in a tournament, winning 4 of 5 games. At first glance this news would seem of little interest to a pathologist, or to anyone else for that matter. After all, many will remember that IBM's (Armonk, New York) computer program Deep Blue beat Garry Kasparov—at the time the greatest chess player in the world—and that was 19 years ago. So, what's so significant about a computer winning another board game?
The rules of the several-thousand-year-old game of Go are extremely simple. The board consists of 19 horizontal and 19 vertical black lines. Players take turns placing either black or white stones on vacant intersections of the grid with the goal of surrounding the largest area and capturing their opponent's stones. Once placed, stones cannot be moved again. Despite the simplicity of its rules, Go is a mind-bogglingly complex game—far more complex than chess. A game of 150 moves (approximately average for a game of Go) can involve 10^360 possible configurations, “more than there are atoms in the Universe.” As complex as it is, chess is vastly less complex than Go, and chess is amenable to “brute force” algorithmic computer approaches for beating expert chess players like Kasparov. To beat Kasparov, Deep Blue analyzed possible moves and evaluated outcomes to decide the best move.
Go's much higher complexity and intuitive nature prevents computer scientists from using brute force algorithmic approaches for competing against humans. For this reason, Go is often referred to as the “holy grail of AI research.” To beat Se-dol, Google's AlphaGo program used artificial neural networks that simulate mammalian neural architecture to study millions of game positions from expert human–played Go games. But this exercise would, at least theoretically, only teach the computer to be on par with the best human players. To become better than the best humans, AlphaGo then played against itself millions of times, over and over again, learning and improving with each game—an exercise referred to as reinforcement learning. By playing itself and determining which moves lead to better outcomes, AlphaGo literally learns by teaching itself. And the unsettling thing is that we don't understand what AlphaGo is thinking. In an interview with FiveThirtyEight, one computer scientist commented, “It is a mystery to me why the program plays as well as it does.” In the same article, an expert Go player said, “It makes moves that no human, including the team who made it, understands,” and “AlphaGo is the creation of humans, but the way it plays is not.” It is easy to see how some viewed AlphaGo's victory over Se-dol as a turning point in the history of humanity—we have created machines that truly think and, at least in some areas like Go, they are smarter, much smarter, than we are. — Scott R. Granter, MD
But an abacus is not a computer. — Patterner
Do you think they use "learn" and "teach" inappropriately in this article? — Patterner
But this exercise would, at least theoretically, only teach the computer to be on par . . . etc. — Scott R. Granter, MD
AlphaGo then played against itself millions of times, over and over again, learning and improving with each game — Scott R. Granter, MD
AlphaGo literally learns by teaching itself. — Scott R. Granter, MD
we have created machines that truly think and, at least in some areas like Go, they are smarter, much smarter, than we are. — Scott R. Granter, MD