@Luke,
@Isaac,
@Kenosha Kid
This is mostly taken from here, Dennett's summary of his book The Intentional Stance.
I think the role mental entities play in Dennett's philosophy of mind is a tightrope walk. On the one hand, he does not want to deny the efficacy of explanation styles which use mental furniture; on the other, he does want to deny some of the ontological commitments which might be taken to explain that efficacy.
So take "I enjoy spicy food", I believe Dennett would see that as quite unproblematic. I can taste things, I can have taste preferences. I have a taste preference for spicy food. But what he would see as problematic is an unrestricted commitment to the existence of tastes, spiciness feelings and so on. As if spiciness, enjoyment as we typically conceive of them are somehow instantiated in my mind and body.
But how can he see "I enjoy spicy food" as unproblematic if he also believes that there is, in some sense of the word, no spiciness experience? I think it is a difficult question, but he has written on it. It seems to boil down to this: attributing mental states to myself and others is effective at explaining, describing and predicting how we think, feel, sense and behave at a certain level of abstraction. That's the intentional stance idea.
The intentional stance is an explanatory style in which purposive states are attributed to systems in order to predict, explain or describe their behaviour. If I write "2+2" and hit return in the R software environment I have open, it will output "4". One way of explaining that is "my computer added the number 2 to the number 2 and outputted the number 4". That's not what the computer's internal systems did - that involved a lot of electronics and software-hardware interactions I just don't understand - but I can describe and predict its behaviour with that understanding. My computer never literally had the natural number 2 in it, but it did have some systemic pattern that behaves sufficiently like having the natural number 2 in it that "my computer added the number 2 to the number 2 and outputted the number 4" works as a predictive explanation. You can tell it works as a prediction because I'm familiar with the software, I think about it as if it is really adding the natural number 2 to the natural number 2, and it reliably produces the output "4". The elements in my description correspond to functional patterns in the computer.
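To make that concrete, the entire exchange at the R console is just this (the ">" is R's input prompt, and "[1] 4" is how R prints a single result):

    > 2 + 2
    [1] 4

Every element of the intentional description - "added", "the number 2", "outputted the number 4" - latches onto that two-line exchange and the software patterns behind it, not onto any literal number sitting inside the machine.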
Another example: seeing the red light on a printer that signals it is out of paper, I've thought "oh, the printer wants fed" - "wants". Just to be super specific about it, attributing "wants" to the printer there makes a lot of sense: the procedure of printing requires paper, it currently has no paper, and in order for it to be able to print again its paper supply must be refilled. By attributing "wants" to the printer, I have summarized those patterns in it and analogized it to having an unfulfilled desire (for paper; it is hungry).
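As a toy sketch of what that attribution summarizes - the names and the state representation here are mine, invented for illustration, not anything a real printer runs - the "want" can be read straight off a functional state, written in R:

    # hypothetical model: the printer's "desire" is just a functional state
    printer <- list(sheets_of_paper = 0)

    # "wants fed" summarizes: printing requires paper, and there is none
    printer_wants_fed <- function(p) {
      p$sheets_of_paper == 0
    }

    printer_wants_fed(printer)
    # [1] TRUE - the red light's condition, read as a "want"

Saying "the printer wants fed" adds nothing over that check; it packages the same functional pattern in a familiar idiom of desire.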
Dennett's perspective seems to be that we take exactly the same approach in attributing mental content to people. If you took the intentional stance towards me in trying to understand why I've written this post, you might think some things like "fdrake wants to clear up an ambiguity he sees" or "fdrake wants to steer discussion in the thread" or "fdrake wants to contribute to the discussion" and so on. I'd describe my motivation as involving those and other things.
And just as I don't have to commit to the printer having human desires and needs - hunger, wants - by reacting to its paper requirement for printing with "oh, the printer wants fed", why should I have to commit to the literal existence of any constituent of an explanation I construct when adopting the intentional stance?
I'm not trying to say that "oh, the printer wants fed" and "fdrake wants to clear up the ambiguity he sees" use "wants" with exactly the same denotation and connotations - that ambiguity of commitment is precisely what the intentional stance leverages.
Even if we experience the same subjective flavours, how do I really know what you mean by "nutty"? Does it taste like a particular type of nut? Do all peanuts taste the same, for example? And what sort of bitterness are we talking about? There are many shades of difference here which language cannot easily capture. We could go on endlessly trying to refine it. I think this is what Dennett criticises (or what qualia advocates are referring to) when he speaks of the ""homogeneity" or "atomicity to analysis" or "grainlessness" that characterizes the qualia of philosophical tradition." A picture is worth a thousand words in other words, and language has difficulty doing justice to the sight before our eyes (or the taste on our tongue, etc.), especially when attempting to convey it to others in high fidelity. — Luke
I realise that what I'm about to say isn't directly about Quining Qualia's argument, but it is related to the above and to the intentional stance. Adopting the intentional stance towards a system renders one relatively insensitive, for explanatory purposes, to fine grained distinctions between the constitutive elements of the considered system. Take "fdrake enjoys spicy food": when I write that, I've got a few memories associated with it, and I'm attributing a pattern of behaviour and sensation to myself. I've made a whole type out of "spicy food", but in particular I had some memories of flavours from a vindaloo I'd had a few years ago and the burrito I'd described previously. The particulars of the flavour memories didn't really matter (I can give both more and different "supporting evidence" for the statement), as I'm summarising my engagement with an aggregate of foods, feelings and eating behaviours with discriminable characteristics (sensations, flavour profiles, event memories and so on).
Instead of attributing a quality of ineffability to a particular experience, ineffability can be seen as a result of the indifference of intentional stance explanations to the particular details of their constituents: ineffability of experience as a feature of the descriptive strategies we adopt regarding experience, rather than of the abstract entities we are committed to when using those strategies. Analogously, the computer's exact reaction to my call of "2+2" is also practically ineffable - there are thousands of transistors switching on and off, there are allocation patterns for memory and so on - and not because it's trying to express the natural number 2 added to the natural number 2 producing the natural number 4 through the flawed media of binary representations and changes in transistor voltage states.