• Ulthien
    34
    My contemplation of a math problem involves no qualia, but would be impossible without consciousness.
    — J

    Sorry, our math contemplations do contain a lot of fine qualia that are maybe not as prominent as other, stronger qualia, but can still very much be sensed: e.g. rapture, elation, insight, direction, similarity - all of these are qualia feels, too. :)

    We could posit that basically ALL of the contents of the conscious aware process are different levels of qualia, actually... (?)
  • MoK
    1.8k
    Sorry, our math contemplations do contain a lot of fine qualia that are maybe not as prominent as other, stronger qualia, but can still very much be sensed: e.g. rapture, elation, insight, direction, similarity - all of these are qualia feels, too. :)

    We could posit that basically ALL of the contents of the conscious aware process are different levels of qualia, actually... (?)
    Ulthien
    Very well said. I would say that thoughts are also a form of Qualia.
  • J
    2.1k
    Or if we insist on some such description, then we're talking to the humans who invented the program.
    — J

    not really. The programmers gave them only the framework to learn,
    Ulthien

    I'm actually happier with leaving out the whole "talking to" description, partially because if we try to stretch it, as I did, to generously include the human programmers, then your point becomes relevant -- it is a stretch, considering how the program runs. (Notice my careful avoidance of the term "learn"! :wink: There is no entity here that can learn anything.)

    Sorry, our math contemplations do contain a lot of fine qualia that are not so maybe prominent as other stronger qualia, but can still very much be sensed: i.e. rapture, elation, insight, direction, similarity - all of these are qualia feels, too.
    — Ulthien

    A quale is usually defined as a sense perception, not a "feel," so that's how I used it.

    Bit of an odd reply on my part perhaps, and for that I apologize,
    — Outlander

    Not at all. You'll be hard pressed to find any two philosophers who agree on how to discuss consciousness!

    The main point here is that I'm recommending making a distinction between consciousness and the contents of consciousness. (How firm and/or clear such a distinction will turn out to be, remains to be seen.) So qualia and other objects of thought or perception are in one bucket, and subjectivity or consciousness is what thinks or perceives them.

    As you point out, consciousness itself can also be an object of consciousness -- "thinking about thinking," self-consciousness. I myself don't believe that's a necessary element of subjectivity; probably very few animals other than humans have it, whereas consciousness is surely widespread throughout the animal kingdom.
  • Outlander
    2.6k
    Qualia
    — MoK

    Is there really no term or concept (even if it's not a simple one- or two-word term) synonymous with "Qualia"? It's an invented term, presumably because no existing word suited what whoever coined it postulates it describes. Is there really no single word synonymous beyond the definition? Is it not "experience" (perhaps as it relates to the brain-mind model)? Why or why not?
  • MoK
    1.8k
    Is there really no term or concept (even if it's not a simple one or two length word) synonymous with "Qualia"?
    — Outlander
    To me, Qualia are the texture of the experience. So it is the texture when it is applied to the experience.

    It's an invented term, presumably because no word suited what whomever coined it presumes or otherwise postulates it describes. Is there really no single word synonymous beyond the definition?
    — Outlander
    It is the texture.

    Is it not "experience" (perhaps as it relates to the brain-mind model)?
    — Outlander
    Experience, to me, is a mental event: the result of the mind perceiving a substance. I have a thread on substance dualism where I discussed this. Physicalism is out of the discussion; I have a thread on "Physical cannot be the cause of its own change". Idealism is out of the discussion as well, since it cannot answer why ideas are coherent.
  • Wayfarer
    25.2k
    I don't suspect an abacus is a conscious unit. While I suspect consciousness is everywhere, in all things, I don't think everything that humans view as physical units necessarily are conscious units. I think the unit must be processing information in order to be a conscious unit.
    — Patterner

    An abacus can be used to process information - it's a primitive computer. There's no real difference in principle between the abacus and a computer. The difference is one of scale. The NVidia chips that drive AI have billions of transistors embedded in a patch of silicon. You could in principle reproduce that technology with the abacus, although it would probably be the size of a city, and it would take long periods of time to derive a result. But in principle, it's the same process.
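    The point that bead arithmetic and silicon arithmetic are the same process at different scales can be illustrated with a small sketch (my own illustration, not from the thread): each abacus rod is a decimal digit register, and carrying a bead to the next rod is the same rule a hardware adder applies in binary.

```python
# A minimal sketch (assumptions mine): model each abacus rod as a decimal
# digit. "Computation" here is just rule-governed bead movement, which is
# all a transistor-based adder does at vastly greater scale and speed.

def abacus_add(a: str, b: str) -> str:
    """Add two non-negative integers digit by digit, with carries,
    exactly as an operator would slide beads on an abacus."""
    rods_a = [int(d) for d in reversed(a)]
    rods_b = [int(d) for d in reversed(b)]
    result, carry = [], 0
    for i in range(max(len(rods_a), len(rods_b))):
        da = rods_a[i] if i < len(rods_a) else 0
        db = rods_b[i] if i < len(rods_b) else 0
        total = da + db + carry          # slide beads, note any overflow
        result.append(total % 10)        # beads left standing on this rod
        carry = total // 10              # carry one bead to the next rod
    if carry:
        result.append(carry)
    return "".join(str(d) for d in reversed(result))

print(abacus_add("347", "85"))  # → 432
```

    Nothing in the procedure depends on beads rather than voltages, which is the sense in which the difference is one of scale, not of principle.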

    You possess something that instruments don’t, namely, organic unity.
    — Wayfarer

    Is "organic unity" not a collection of material components? Because as far as I'm aware, organic matter is matter.
    Michael

    Organisms operate by different principles to non-organic matter. They grow, heal, maintain homeostasis, and reproduce. None of those behaviours can be observed in matter (crystals grow, but they don't exhibit any of the other characteristics). None of the parts of inorganic aggregates are functionally related to the other parts, whereas the cells in a multicellular organism are differentiated in accordance with their functions in the various organs, as optic cells, kidney cells, etc. When they begin as stem cells, they are able to assume any of those functions depending on where in the organism they're located (hence the effectiveness of stem cell therapy).
  • Banno
    28.5k
    But what else should we substitute?J
    The issue is more, what is it that is being named by "qualia"?

    The idea was that philosophers define consciousness in terms of qualia. The problem is that qualia are no more clearly defined than is consciousness, and so are not all that helpful.

    See the present thread for samples of the confusion they incur.
  • Outlander
    2.6k
    See the present thread for samples of the confusion they incur.
    — Banno

    How can one know confusion (that they are confused) without knowing clarity (that they are not confused)? If one does not know clarity, it is simply a difference of opinion. So, please, as I've requested multiple times now, provide such.
  • Ulthien
    34
    Very well said. I would say that thoughts are also a form of Qualia.
    — MoK

    I would say that thoughts are a sequence of qualia (feels of concepts) that follow in quick succession.
    On brain scans, we can follow these for a few seconds, and then the brain rests for a few - evaluating "the feel of it" & then it triggers another thought.

    This cycle never ends :)

    That is how our cybernetic modelling regulator, the brain, works.

    Patanjali in his Yogasutras calls this Cittavrti aka mind-spinning.
  • Ulthien
    34
    (Notice my careful avoidance of the term "learn"! :wink: There is no entity here that can learn anything.)
    — J

    in tech, we do call it "training".

    Colloquially, learning :)
  • J
    2.1k
    Yeah, probably a losing battle on my part. But I'd like to see more pushback against the easy acceptance of the fiction that a program is an entity or even an agent. With a name! Who starts sentences with "I . . . "!
  • frank
    17.9k
    Yeah, probably a losing battle on my part. But I'd like to see more pushback against the easy acceptance of the fiction that a program is an entity or even an agent. With a name! Who starts sentences with "I . . . "!
    — J

    But if you're talking to a computer, you aren't talking to the program. The program is formal. Strictly speaking, your voice is being sampled and that data is being manipulated by the hardware according to directions in the software. It's all dynamic. It's actually so similar to what happens with a real human, that the only thing missing is awareness.

    Computers were originally developed to take the place of humans (with regard to basic math calculations). I think going forward, the dividing line will become more and more blurred.
  • Banno
    28.5k
    I understand you are asking something, but it is not at all clear to me, what.
  • Patterner
    1.6k
    An abacus can be used to process information - it's a primitive computer. There's no real difference in principle between the abacus and a computer. The difference is one of scale. The NVidia chips that drive AI have billions of transistors embedded in a patch of silicon. You could in principle reproduce that technology with the abacus, although it would probably be the size of a city, and it would take long periods of time to derive a result. But in principle, it's the same process.
    — Wayfarer
    Could you do that without giving it a power system and adding the rules so the abacus would manipulate the beads correctly?
  • Outlander
    2.6k
    I understand you are asking something, but it is not at all clear to me, what.
    — Banno

    What, to the best of your ability to explain, is whoever you're referencing "confused" about, and why? And what does this allegedly professed "knowledge", this so-called guiding near-absolute wisdom you possess which they seemingly cannot grasp, contain? In the simplest terms. This isn't hard, so stop making it as if it were.
  • Banno
    28.5k


    Do they like coffee, as their behaviour indicates, or do they really dislike coffee, despite their behaviour?

    It's a clear comparative, not dependent on some absolute notion of real...

    Puzzled.
  • Wayfarer
    25.2k
    Yes, would need all of that - but the point being, computers are still physical systems.

    The problem is that qualia are no more clearly defined than is consciousness, and so are not all that helpful.
    — Banno

    The point is they are qualities of experience and therefore precisely what eludes objective description. So even though you can't define them, exactly, we all know directly what 'quality of experience' means.

    If you situate them in the context of quantitative measurement as distinct from qualitative experience, you can see the point more clearly. A piece of medical equipment can provide a quantitative description of some physical condition, right down to the molecular level. But only the subject can feel the condition.

    I don't see what is obtuse or controversial about that.
  • Banno
    28.5k
    What, in the best of your ability, are whoever you're referencing "confused" about.
    — Outlander
    Qualia.
  • Michael
    16.4k
    Organisms operate by different principles to non-organic matter.
    — Wayfarer

    Perhaps, but organic matter is still a collection of material components. So if we have a reason to believe that organic matter can be conscious then we have a reason to believe that a collection of material components can be conscious.
  • Banno
    28.5k
    The point is they are qualities of experience and therefore precisely what eludes objective description.
    — Wayfarer

    You drop this sentence as if it were clear what a "quality of experience" is - and indeed, if it is to serve as a way of understanding consciousness, as if it were clearer than "consciousness".

    Here's a definition stolen from Google: consciousness refers to a person's awareness of themselves and their environment, encompassing wakefulness, alertness, and the ability to respond to stimuli.

    How is "qualities of experience" clearer than that?
  • Ulthien
    34
    The issue is more, what is it that is being named by "qualia"?

    The idea was that philosophers define consciousness in terms of qualia. The problem is that qualia are no more clearly defined than is consciousness, and so are not all that helpful.
    Banno

    Here's one widely accepted formulation:

    Qualia are intrinsic and non-intentional phenomenal properties that are introspectively accessible.

    Let’s break that down:

    Intrinsic: They are part of the experience itself, not dependent on anything external.

    Non-intentional: They aren’t about anything (unlike beliefs or desires).

    Phenomenal properties: They are the felt qualities of experience—what it’s like to see red or feel pain.

    Introspectively accessible: You can become aware of them by turning your attention inward.

    This definition is used in academic philosophy, especially in A-level and university-level discussions of consciousness and the mind.
  • Wayfarer
    25.2k
    How is "qualities of experience" clearer than that?
    — Banno

    Let's go back to the source.

    The really hard problem of consciousness is the problem of experience. When we think and perceive, there is a whir of information-processing, but there is also a subjective aspect. As Nagel (1974) has put it, there is something it is like to be a conscious organism. This subjective aspect is experience. When we see, for example, we experience visual sensations: the felt quality of redness, the experience of dark and light, the quality of depth in a visual field. Other experiences go along with perception in different modalities: the sound of a clarinet, the smell of mothballs. Then there are bodily sensations, from pains to orgasms; mental images that are conjured up internally; the felt quality of emotion, and the experience of a stream of conscious thought. What unites all of these states is that there is something it is like to be in them. All of them are states of experience.
    — Facing Up to the Problem of Consciousness, David Chalmers

    The whole 'problem' is intended to demonstrate the sense in which objective, third-party description, the basic currency of the natural sciences, doesn't capture the first-person nature of experience (also known as 'being'). So, as such, it's only a 'problem' within that context.
  • Banno
    28.5k
    This definition is used in academic philosophy, especially in A-level and university-level discussions of consciousness and the mind.
    — Ulthien

    Sure - it's questioned therein. See for example the Stanford article, where four differing uses are listed, each with variations and qualifications. It is not universally accepted that the term makes sense. Yours is an appeal to authority.
  • Ulthien
    34
    Yours is an appeal to authority.
    — Banno

    more, to common sense :)
  • Banno
    28.5k
    Some stuff from the thread Nothing to do with Dennett's, and referring to Quining Qualia:

    Intuition pump #1: watching you eat cauliflower.
    There is a way this cauliflower tastes to you right now. Well, no: the taste changes even as you eat it, even as the texture changes as you chew.

    Intuition pump #2: the wine-tasting machine.
    As a tool for convincing those who disagree, this strikes me as singularly useless. Dennett will say there is nothing missing from the machine description; advocates of qualia will say that there is...

    Except that they cannot say what it is that is missing; qualia are after all ineffable. But this never stops their advocates from talking about them...

    Intuition pump #3: the inverted spectrum.
    Undergrad speculation.

    Intuition pump #4: the Brainstorm machine. Qualia gain no traction here, either.

    Intuition pump #5: the neurosurgical prank. Back to Wittgenstein: how could you tell that your qualia had been inverted, so that what was once blue is now red, as opposed to, say, your memory having changed, so that what you always saw as red you now recall, erroneously, previously seeing as blue?

    Intuition pump #6: alternative neurosurgery.

    Intuition pump #7: Chase and Sanborn. They have the same decreased liking for the coffee they taste; but is it the coffee that is faulty, or is it the capacity to taste that has changed? The difference between this example and 4-6 is the removal of memory as a participant.

    Whence the boundary of the white triangle? In the perception or in the judgement?

    Hence, intuition pump #8: the gradual post-operative recovery; is the recovery in the quality of the qualia or in the judgement that ensues? And if you cannot tell, then what is the point of introducing qualia?

    Intuition pump #9: the experienced beer drinker. This is similar to 7 & 8 in playing on the supposed difference between the qualia and the judgement of that qualia. What is added is a seeming rejection of a split between the taste of the beer and the appreciation of the beer...

    Intuition pump #10: the world-wide eugenics experiment. How to make sense of the qualia of secondary properties... Someone who says phenol-thio-urea is tasteless is not wrong.

    Intuition pump #11: the cauliflower cure. The cauliflower tastes exactly the same, but is now delicious...

    Intuition pump #12: visual field inversion created by wearing inverting spectacles. The point here seems to be that even if there were qualia, they need not count as intrinsic to consciousness. Needs more consideration.

    Intuition pump #13: the osprey cry. There's danger here of following Kripke rather than Wittgenstein. However the point must stand, that recognising the rule one is following consists at least in part in being able to carry on with the rule; but nothing in a single instance allows for this. Hence, if a qual (singular of qualia) cannot by its very nature recur, there can be no grounds for claiming that some rule has been followed; if that be so, there can be no basis for differentiating a qual; hence, no qual and no qualia.

    Intuition pump #14: the Jello box. This seems to be about the information content of the notion of qualia; if I've understood it aright, one side of the Jello box holds the ineffable qualia, the other side, corresponding exactly, the effable, public content of our everyday discourse. But if the contents are identical, what is pointed at by the notion of the qualia of, say, the taste of coffee that is not also pointed at by the usual conversation about the taste of coffee? What additional information is to be found in qualia?

    And intuition pump #15: the guitar string. Arguably we have here three qualia; the first open E, the harmonic, and the second open E. Is the point here that as the ineffable becomes the subject of discussion, the qualia is less ineffable...?

    Here's my question for those who would have us talk of qualia: what is added to the conversation by their introduction? If a qual is the taste of milk here, now, why not just talk of the taste of milk here, now?

    The pretence that Qualia are a given is misguided.
  • Patterner
    1.6k
    Yes, would need all of that - but the point being, computers are still physical systems.
    — Wayfarer
    Yes, computers are physical systems. But an abacus is not a computer. It can't process information unless you give it a power system and add the rules so it manipulates the beads correctly. IOW, unless you turn it into a computer.


    (Notice my careful avoidance of the term "learn"! :wink: There is no entity here that can learn anything.)
    — J
    Do you think they use "learn" and "teach" inappropriately in this article?

    In March of last year, Google's (Menlo Park, California) artificial intelligence (AI) computer program AlphaGo beat the best Go player in the world, 18-time champion Lee Se-dol, in a tournament, winning 4 of 5 games. At first glance this news would seem of little interest to a pathologist, or to anyone else for that matter. After all, many will remember that IBM's (Armonk, New York) computer program Deep Blue beat Garry Kasparov—at the time the greatest chess player in the world—and that was 19 years ago. So, what's so significant about a computer winning another board game?

    The rules of the several-thousand-year-old game of Go are extremely simple. The board consists of 19 horizontal and 19 vertical black lines. Players take turns placing either black or white stones on vacant intersections of the grid with the goal of surrounding the largest area and capturing their opponent's stones. Once placed, stones cannot be moved again. Despite the simplicity of its rules, Go is a mind-bogglingly complex game—far more complex than chess. A game of 150 moves (approximately average for a game of Go) can involve 10^360 possible configurations, “more than there are atoms in the Universe.” As complex as it is, chess is vastly less complex than Go, and chess is amenable to “brute force” algorithmic computer approaches for beating expert chess players like Kasparov. To beat Kasparov, Deep Blue analyzed possible moves and evaluated outcomes to decide the best move.

    Go's much higher complexity and intuitive nature prevents computer scientists from using brute force algorithmic approaches for competing against humans. For this reason, Go is often referred to as the “holy grail of AI research.” To beat Se-dol, Google's AlphaGo program used artificial neural networks that simulate mammalian neural architecture to study millions of game positions from expert human–played Go games. But this exercise would, at least theoretically, only teach the computer to be on par with the best human players. To become better than the best humans, AlphaGo then played against itself millions of times, over and over again, learning and improving with each game—an exercise referred to as reinforcement learning. By playing itself and determining which moves lead to better outcomes, AlphaGo literally learns by teaching itself. And the unsettling thing is that we don't understand what AlphaGo is thinking. In an interview with FiveThirtyEight, one computer scientist commented, “It is a mystery to me why the program plays as well as it does.” In the same article, an expert Go player said, “It makes moves that no human, including the team who made it, understands,” and “AlphaGo is the creation of humans, but the way it plays is not.” It is easy to see how some viewed AlphaGo's victory over Se-dol as a turning point in the history of humanity—we have created machines that truly think and, at least in some areas like Go, they are smarter, much smarter, than we are.
    — Scott R. Granter, MD
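    The self-play loop the article describes can be made concrete with a toy sketch (entirely my own illustration; AlphaGo itself combines deep neural networks with Monte Carlo tree search, which this does not attempt). Here a tabular learner plays a miniature game against itself and nudges its move values toward whatever won, which is the essence of "learning and improving with each game":

```python
import random

# Toy self-play reinforcement learning (my own minimal sketch, far simpler
# than AlphaGo). Game: single-pile Nim with 10 stones -- players alternately
# take 1 or 2 stones, and whoever takes the last stone wins.

def train(episodes: int = 20000, epsilon: float = 0.1, alpha: float = 0.5):
    random.seed(0)
    q = {}  # (stones_left, move) -> estimated value for the player to move
    for _ in range(episodes):
        stones, history = 10, []
        while stones > 0:
            moves = [m for m in (1, 2) if m <= stones]
            if random.random() < epsilon:
                move = random.choice(moves)   # occasionally explore
            else:                             # otherwise exploit current policy
                move = max(moves, key=lambda m: q.get((stones, m), 0.0))
            history.append((stones, move))
            stones -= move
        # The player who took the last stone won; credit alternates backwards
        # through the game, since the two "players" share the same table.
        reward = 1.0
        for state, move in reversed(history):
            old = q.get((state, move), 0.0)
            q[(state, move)] = old + alpha * (reward - old)
            reward = -reward  # the previous mover was the opponent
    return q

q = train()
best = max((1, 2), key=lambda m: q.get((10, m), 0.0))
print("preferred first move from 10 stones:", best)
```

    Every name here is invented for the sketch; the point is only the shape of the loop: play against yourself, score the outcome, and shift move preferences toward what won, millions of times over.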
  • Patterner
    1.6k
    Not sure what I'm doing wrong with that. The link is:
    https://meridian.allenpress.com/aplm/article/141/5/619/194217/AlphaGo-Deep-Learning-and-the-Future-of-the-Human

    The authors are:
    Scott R. Granter, MD; Andrew H. Beck, MD, PhD; David J. Papke, Jr, MD, PhD
  • Wayfarer
    25.2k
    But an abacus is not a computer.
    — Patterner

    It’s not a digital computer, but it is a device used for calculations. The rhetorical point was simply that computers no more intend than does the abacus. And notice that, in the question you posed, you placed ‘learn’ and ‘teach’ in quotation marks.

    By the way - I might draw your attention to an AEON article from a few years ago - now a book - The Blind Spot. It is a relevant criticism of the form of panpsychism (of the Harris/Goff variety) that you’re pursuing.
  • J
    2.1k
    Do you think they use "learn" and "teach" inappropriately in this article?
    — Patterner

    That's a really useful question. Let's see . . .

    But this exercise would, at least theoretically, only teach the computer to be on par . . . etc. — Scott R. Granter, MD

    Yes, this is inaccurate. Teach "the computer"? Which computer? Surely they don't mean some actual piece of hardware. So what or who is being taught?

    AlphaGo then played against itself millions of times, over and over again, learning and improving with each game — Scott R. Granter, MD

    On the fence here. Do we require learning to be something the so-called "AlphaGo" is doing under that description? In other words, can something be learned according to a 3rd person point of view, but not from the 1st person PoV of the learner? I don't think there's a right answer to this. If we decide to say that we can recognize learning even though a program cannot, then yes, AG can be said to be learning.

    AlphaGo literally learns by teaching itself. — Scott R. Granter, MD

    No. Nothing like this could be "literally" happening. A computer program is running, and responding. Where do we find the "itself"?

    we have created machines that truly think and, at least in some areas like Go, they are smarter, much smarter, than we are. — Scott R. Granter, MD

    You didn't ask about "think," but my 2 cents is: Yes, we should be generous and agree that an LLM program simulates algorithmic human thinking so successfully that, if we use this metric for what "thinking" means, thinking is indeed happening.

    I'm interested in how you see this issue. Are you more inclined to grant an agent-like status to the AG program and others of similar sophistication?
  • AmadeusD
    3.6k
    I see that plenty of objections are being ignored. Such is life...