No, which is why I didn't say you were committed to that view. I said you claimed something you didn't mean.
Again, why do you think I am committed to the view that we are not aware of things? — Bartricks
And yet, you are distinguishing them. That leads to my response to this:
my claim that states of awareness are introspectively indiscernible from otherwise identical states that lack representative contents. — Bartricks
...which was this:
Thus if we are bot built we will not know anything.
We do know we exist and a whole lot else, of course. — Bartricks
...in the case of Van Gogh, maybe we can pull out a magnifying glass. Maybe we can carbon date. Maybe we can check certificates of authenticity. But here you claim that "of course" we know versus 'know', which suggests we just naturally, introspectively know it. That leads to the first challenge.
Of course not. By your own admission you cannot even introspectively tell if you know things. So how could it possibly be obvious enough to say "of course"? — InPitzotl
...and in the case of Van Gogh, if you cannot introspectively distinguish the genuine from the fake you ipso facto are not introspectively aware.
Where did I say that I am not introspectively aware of things? — Bartricks
Because:
Why are you telling me what introspectively indiscernible means? — Bartricks
...your claim ipso facto introspectively discerns two things (belief and 'belief') you claim are introspectively indiscernible. Because:
What I am arguing is that if all of our faculties are bot-built, then they won't create any beliefs, just 'beliefs' (where a 'belief' is introspectively indiscernible from a belief, but nevertheless isn't one). — Bartricks
...you pretend not to realize that claiming something is introspectively indiscernible means you cannot introspectively be aware of it.
Where did I say that I am not introspectively aware of things? — Bartricks
LOL! And round and round and round we go!
What are you on about?? — Bartricks
Sure I do. Indiscernible means not able to discern. Introspectively is an adverb, meaning by means of introspection.
You clearly don't understand English — Bartricks
Denial, contradiction and disinformation are key ingredients to gaslighting.
You clearly don't understand English. — Bartricks
That's not what the problem is. The problem is:
Again, no rephrasing necessary. You don't understand words.
...you absolutely refuse to clarify. It is not your intent to be clear. You'd rather be dramatic than do a simple reasonable thing. I won't speculate as to why, but you're making at least these things crystal clear.
I don't know what grunts and howls would do the trick — Bartricks
Yeah yeah... Plantinga is a total amateur.
Greatest thinker since Plantinga?!? Now that's an insult! — Bartricks
Ahem...
To recap: I have never, ever, ever said that we are not introspectively aware of things or not aware of things generally. — Bartricks
...and that means, well, what it says it means.
(where a 'belief' is introspectively indiscernible from a belief, but nevertheless isn't one). — Bartricks
So you just said something you didn't mean. Maybe you should rephrase it.
Read the OP! I think we ARE aware of things. — Bartricks
Nope. It has something to do with your allergy to conceding even that which would benefit you, for who knows why.
It has everything to do with you not reading carefully or not understanding what you read — Bartricks
Yes, it did. Exactly as I said last post:
It said 'if', matey. If. — Bartricks
There's the if, right before the underlined antecedent, the italicized consequent, and the bolded parenthetical.
What I am arguing is that if all of our faculties are bot-built, then they won't create any beliefs, just 'beliefs' (where a 'belief' is introspectively indiscernible from a belief, but nevertheless isn't one). — Bartricks
Oh what narratives!
And you ignored that. Willfully ignored it, or didn't understand its significance. — Bartricks
...oh what poetic drama!
If the latter then I cannot rephrase my argument in a way you'd understand. I don't know what grunts and howls would do the trick — Bartricks
Christ, you are either very stupid or you can't read. — Bartricks
There's the gaslighting that has zero chance of working...
Have you got this far in life without knowing? — Bartricks
...and the fantasizing, right on cue.
InPenetrablyS: "so I can go" — Bartricks
Sure. Here's your whole sentence:
Quote the whole sentence. — Bartricks
What I am arguing is that if all of our faculties are bot-built, then they won't create any beliefs, just 'beliefs' (where a 'belief' is introspectively indiscernible from a belief, but nevertheless isn't one). — Bartricks
Indeed it does.
It says 'If'. — Bartricks
"If" introduces an antecedent.
What does that mean? — Bartricks
That's irrelevant, because in the "full quote" above, the antecedent (p) is "all of our faculties are bot-built", and the consequent (q) is "they won't create any beliefs, just 'beliefs'". "(where a 'belief' is introspectively indiscernible from a belief, but nevertheless isn't one)" is a parenthetical phrase. That parenthetical phrase is not part of the consequent.
What's the difference between saying "If p, then q" and "q"? — Bartricks
Where did I say that I am not introspectively aware of things? — Bartricks
In the quote I underlined. This one: ======vvvv
(where a 'belief' is introspectively indiscernible from a belief, but nevertheless isn't one). — Bartricks
^^^===== It's right there. It's underlined.
There's no other reasonable meaning of "introspectively indiscernible" except that one cannot discern using introspection.
Every time I describe what 'would' be the case if our faculties were bot built, you read that as me saying that that's what is actually the case. — Bartricks
Don't care. If you are so bad at communicating that you say opposite things, that's not on me. Introspectively indiscernible means one cannot discern using introspection.
No. I am arguing the exact opposite, as the OP makes clear. — Bartricks
A purer form of the No True Scotsman fallacy I have never seen.
What I am arguing is that if all of our faculties are bot-built, then they won't create any beliefs, just 'beliefs' (where a 'belief' is introspectively indiscernible from a belief, but nevertheless isn't one). — Bartricks
Of course not. By your own admission you cannot even introspectively tell if you know things. So how could it possibly be obvious enough to say "of course"?
Thus if we are bot built we will not know anything.
We do know we exist and a whole lot else, of course. — Bartricks
I have always been able to tell I use my ears. I can also tell when someone has had "gender reassignment" — Andrew4Handel
I smell shifting the goalposts here. I also smell a black-and-white fallacy here.
I find it hard to believe that 99.9% of people are unable to differentiate between a male and female. — Andrew4Handel
That would have been who I guessed you meant.
I meant Laverne Cox. Yikes. — Andrew4Handel
Not sure why you're asking me this question.
How did humans create 7 billion of themselves? — Andrew4Handel
What is the antecedent to "it" there? We were just talking about sexual size dimorphism. I specifically cited sexual size dimorphism in spiders to contrast. Now suddenly you're talking about all sexual dimorphism.
It is not an average it is a huge majority of humans displaying sexual dimorphism. — Andrew4Handel
There's nothing normative about evolution. There is no "scientific mandate" to reproduce.
Puberty blockers, hormones and genital surgeries lead to the evolutionary dead end of infertility. — Andrew4Handel
To intentionally use the pun, science has made controlling for all sorts of errors in personal accounts a science. Scientific observations would employ said controls. Your personal observations are most certainly not controlled.
Well you have just found a problem with the notion of science. In what sense is any observation not a personal account? — Andrew4Handel
Sure; humans are sexually dimorphic. But note that you're immediately jumping to averages, because the dimorphism in heights isn't all that extreme. There are plenty of short males and tall females. This is child's play compared to the sexual size dimorphism found in spiders.
But there is science as well. For example men are taller on average than women. — Andrew4Handel
This is also anecdotal. Incidentally, are you sure you really mean Lauren Laverne?
Lauren Laverne identifies as female and is much taller than Chase Strangio who identifies as male. — Andrew4Handel
It is by definition an anecdote. I gave you the definition. What part of that definition does not apply?
Being able to identify a male from a female is not an anecdote and if you claim so you are just outright lying about reality. — Andrew4Handel
Science does not rely on anecdotes.
Science has to assume that some human faculties are accurate. — Andrew4Handel
Yes, I'm seriously claiming I cannot always tell.
Are you seriously claiming you cannot identify who is male and female? — Andrew4Handel
Okay, but that does not entail that science has to rely on anecdotes, nor does it suffice to show that anecdotal evidence is scientific. Neither of those things is true.
The start of categorisation of entities is based on the reliability of the human senses. — Andrew4Handel
I crossed out the superfluous part. To the best of my knowledge, there's no current ethical technology to produce a human offspring without involving a male and a female, though there are potential unethical technologies. But this is simply factual; there's nothing normative here.
Science shows ~~and every day experience~~ that only a man and woman can produce a children. — Andrew4Handel
"Insane dystopia" is a political term. As for the science, there's no scientific theory I'm aware of that states that it is impossible to give a woman such a penis. The absolute best you can say is that there's no extant technology to pull this off.
They have yet to create the insane dystopia of trying to give a woman a penis implant and implant a womb in a man. — Andrew4Handel
In the definitive sense:
In what sense is it anecdotal? — Andrew4Handel
anecdotal
1. (of an account) not necessarily true or reliable, because based on personal accounts rather than facts or research
https://www.oxfordify.com/meaning/anecdotal
That doesn't sound very scientific to me. Using your eyes and ears to tell you how reality works is what nearly everyone does. Also, scientists don't tend to go around declaring whose eyes and ears they are going to trust and whose eyes and ears they are going to distrust.
I have survived 45 years using my ears and eyes to tell me how reality works. I am not going to distrust my senses based on someone else's self i.d. in their mind. — Andrew4Handel
Given you have opened this thread talking about science, what does the science say?
I thought gender dysphoria was a mental illness now we are told it's not — Andrew4Handel
Having such surgeries is scientifically possible; but since when is bodies wanting to be things scientific?
Having surgery to mutilate your genitalia and spending a lifetime on wrong sex hormones and other body damaging chemicals is not changing sex it is forcing your body to be something it doesn't want to be and once hormones stop it will revert back to its natural self. — Andrew4Handel
Not always.
You cannot tell the difference between a man or a woman? — Andrew4Handel
That sounds anecdotal, not scientific. Silly me, but by your OP I thought you were complaining about the acceptance of something that was scientifically impossible.
I have always been able to tell I use my ears. I can also tell when someone has had "gender reassignment" — Andrew4Handel
Okay, so how do you tell?
No. I have never needed to know about someone's chromosomes to know whether they are male or female. — Andrew4Handel
Let's walk through this. Presumably, sex is a matter of chromosomes (it's not exactly; but that's close enough for government work). So let's call a person who is XY male-sexed, and a person who is XX female-sexed.
How is it possible.
It isn't from a scientific perspective. How has it become so accepted as a concept? — Andrew4Handel
...and here we fail. It doesn't appear that it is technologically feasible to change a person from male-sexed to female-sexed.
How has it become so accepted as a concept? — Andrew4Handel
Your argument. I have mentioned that several times BTW.
What on earth are you on about? — Bartricks
Okay. So what backs up that claim?
Here's my claim: our faculties need to have been designed to provide us with information before they can be said to generate states with representative content. — Bartricks
Wrong!! See above. My problem is with your argument. Your claim does not follow from your argument. Incidentally, this makes everything below this line:
You're trying to show this is false with an example of something that has been designed to give us information and is successfully doing so!! — Bartricks
...irrelevant.
But anyway, that will do nothing whatever to help you. — Bartricks
Well... yes. You were the one who offered the Bartricks-bot argument; the logic I teased out from your argument when applied to Garmin shows that the Bartricks-bot argument doesn't follow. Now, as far as I'm concerned, you're just whining because I'm forcing you to do the work you claimed to have done in the first place.
"Oh, but, but, but, bots - bots are designed and you used bots to make your case. Bots. Garmin. Bots. Bots." — Bartricks
Okay... are you saying Garmin is not a bot then? If so, why not? What makes Bartricks-bot a bot and Garmin not one? Incidentally, I'm not asking you because I'm consulting the great wizard. I'm asking you because this is your argument you're supposed to be making.
Bots are not designed to give information. They are designed to randomly generate 'messages'. — Bartricks
...and that doesn't cut it here. Nobody was trying to convey to me that I have reached my destination. Whatever "Garmin is designed to give me information" means, Garmin is nevertheless not trying to do anything, because despite being designed, Garmin is not an agent. I don't care that Garmin was designed; you're the one telling me Garmin is distinct. But your argument does not provide this distinction.
no one was trying to convey to you that there was a pie in the oven — Bartricks
It's your exact logic! You have a problem with Garmin that you don't have with Bartricks.
Once more: how does your example challenge my case? — Bartricks
Was Bartricks-bot designed to impart information? Funny the question never came up. With Bartricks you started with the premise it was a bot, and ended concluding there was no representative content, explaining why. All of those why's apply to Garmin, btw, despite it being designed.
You are trying to challenge that with an example of something that is designed to impart information. — Bartricks
Not really. It's premise 1:
I am arguing that our faculties need to have been designed to do what they do in order for them to be capable of generating states with representative contents. — Bartricks
...that you're trying to argue for. But you're giving a particular argument that alleges to do so. That this argument supports that premise is the question.
1. If our faculties of awareness are wholly the product of unguided evolutionary forces, then they do not give us an awareness of anything — Bartricks
Good question. Here's what you just got finished saying about a bot:
How, exactly, does that work? — Bartricks
This isn't a message if I am a bot, right? Explain that without vindicating my argument — Bartricks
So we have scenario 1. In this scenario there is some sign s that some entity x produced. In this case, s is a post, and x is Bartricks. You just said above that if x is a bot, then s is not a message. You just said above that if x is a bot, x doesn't have a mind; x isn't trying to communicate because it doesn't have a mind, x doesn't have goals, purposes, and desires. You just said above that therefore ("therefore" being a translation of "So.....") the alleged message won't be a message, and that it won't have any representative contents.
It doesn't have a mind. It isn't 'trying' to communicate, because it doesn't have a mind - so it doesn't have goals, purposes, desires.
So.....the message won't be a message at all. It won't have any 'representative contents' — Bartricks
Yes. But:
Has Garmin been designed to give you information? — Bartricks
...the destination was not trying to communicate with me; likewise for the Garmin.
for it nevertheless remains the case that the pie was not trying to communicate with you (likewise for the clouds the pie created). — Bartricks
Still no answer to my question. Maybe I can get to this through another angle. You see, here you're obsessed about making a point that messages have to be made by agents, and as a result you're having us play pretend that you are a bot.
No you're not. See argument. — Bartricks
...so you're being asked to follow through. If your pie-in-the-oven sky writing is proving we aren't aware of something because an agent didn't intentionally try to tell us pie is in the oven, then there must be something we aren't aware of with Garmin when it tells me "you have reached your destination", because Garmin isn't intentionally trying to tell us we've reached our destination either. Garmin is a bot if there ever was one.
1. If our faculties of awareness are wholly the product of unguided evolutionary forces, then they do not give us an awareness of anything — Bartricks
No no no... you stopped too early. You stopped at your message point and didn't relate it to awareness (remember premise 1?).
Now just apply that moral more generally and you get my position. — Bartricks
It doesn't have a mind; it's not trying to communicate; it doesn't have goals, purposes, desires, and therefore, we (who do have minds, have goals, purposes, and desires) cannot be aware of... what?
It doesn't have a mind. It isn't 'trying' to communicate, because it doesn't have a mind - so it doesn't have goals, purposes, desires. — Bartricks
But aren't we aware of it?
This isn't a message if I am a bot, right? Explain that without vindicating my argument — Bartricks
...and you tie it in thusly:
1. If our faculties of awareness are wholly the product of unguided evolutionary forces, then they do not give us an awareness of anything — Bartricks
What explains this failure to know is the fact that no one was trying to convey to you that there was a pie in the oven by means of your dream states. ...
So, in essence if our faculties of awareness - or rather, 'faculties of awareness' - are wholly the product of unguided evolutionary forces, then none of us are 'perceiving' reality at all. — Bartricks
You do realize you're trying to pass off the rehearsal of prejudices as reasoning.
Can there be desires without a desirer? No. Can there be thoughts without a thinker? No. Can there be percepts without a perceiver? No. Can there be representations without a representer? No. — Bartricks
...given you've chosen to open this can of worms, what does that make 250-stone me with my 15-stone cat?
Yes, a very lightweight opponent. — Bartricks
It seems to me that what's preventing you from acquiring knowledge in this sort of case is that you have acquired a true belief from an 'apparent' representation, not a real one. — Bartricks
If unguided - by which I mean, unguided by any agency - natural forces produced those shapes in the sky, then it was not imparting information to you. It was just pure fluke that, to you, the clouds appeared to be trying to tell you something. — Bartricks
Just a few interesting notes regarding this profound and beautiful argument.
Then I refute the idea that reliability has anything to do with whether something is representing or not. — Bartricks
You do realize you're fantasizing again.
Like I say, you don't have a case. You just know that Anscombe is supposed to have used the example of a speak-your-weight machine to refute an argument made by C.S. Lewis. — Bartricks
And I've explained numerous times why it works. So if the number of times one explains things is a factor in how true something is, then we're about even in that department, so you had better get another metric.
I have now explained numerous times. — Bartricks
And I might care, were it the fact that all you're arguing is that agency is involved somehow in semantics. But that's not what you were arguing. You were arguing that symbols must be intentionally given by an agent in order for them to represent something.
The weighing machine is designed. — Bartricks
I don't deny it's designed. The problem is:
Now, final time, the weighing machine example is shit. Why? Because it's DESIGNED. — Bartricks
...there's no representer (in the sense you mean it).
I am arguing that in order for something - be it a mental state, a picture, some squiggles - to be said to be 'representing' something to be the case (as opposed to appearing to represent something to be the case) there needs to be a representer. — Bartricks
Sure it is, because the scale is not a representer.
So it ain't a counterexample. — Bartricks
Yes. Incidentally, I am an agent that speaks English.
You think the fact the machine enables you reliably to know the cat's weight is what's doing the trick, yes? — Bartricks
You haven't refuted anything except in your imagination. We're still left with the symbols 15 that my scale displayed, and the fact that this indicates to me that my cat weighs 15. Somehow you got it into your mind that if you tell me a story about a leaf that by a fluke blows into my window with the number 15 on it, then it means that my scale isn't indicating my cat's weight. I have no idea how you came up with such a silly idea, but it's clearly wrong.
I keep refuting that with examples you don't understand. — Bartricks
Reliability is critical. If the symbols have nothing to do with what my cat weighs, they can't possibly represent my cat's weight.
It has nothing to do with reliability. — Bartricks
"Begging the question" does not apply here. Begging the question is a logical fallacy where you assume the conclusion of your argument.
Again, you are begging the question throughout by just helping yourself to the idea of a representation, — Bartricks
So what's the problem? 15 on the digital scale successfully represents my cat's weight. The 15 on the leaf blowing through the window does not represent my cat's weight. You may as well have my cat knock over a deck of Tarot cards in such a way that when I draw the top card it happens to be a 15. Your particular idea of the causal relationship to the symbol via the leaf is simply the wrong idea (and it's just a rehash of your pie in the oven).
when what it takes for something successfully to represent is precisely what's at issue. — Bartricks
Don't worry too much about that... other people know what it means.
Not sure what that means, — Bartricks
There's no question begging here; only your confusion. In fact, you agreed I formed a justified true belief that my cat weighs 15 pounds. I formed that belief by reading and interpreting the symbols "15". So something about those symbols justifies my belief that the cat weighs 15.
"The symbols '15' represents the weight of my cat. My cat's weight was conveyed to me." — InPitzotl
Question begging. See OP and other representations of the argument above. — Bartricks
It might help if you understood why I say 15 on the digital scale represents my cat's weight.
It isn't superfluous because although I have other examples that illustrate the same point, they don't seem to have conveyed it to you, and thus I keep coming up with variations in the hope that by about example 7 or 8 you might get the point. — Bartricks
The leaf is not even apparently making a representation. Incidentally, it's worth noting that "the symbols representing a thing" has suddenly mutated into "the surface they're written on making a representation of the thing".
The leaf is 'apparently' making a representation, but isn't actually. — Bartricks
Are you sure? Because you don't seem to know what you're trying to adjust for when you're tightening the relation.
And no amount of tightening the causal relation between what it appears to be making a representation of and the truth-maker of your belief is going magically to make it start representing successfully. — Bartricks
I laud the approach... this is much better than repeating yet another silly thing with 15 on it. But it misses.
I can perhaps make the point in another way. — Bartricks
So let's start here. You are a sentient entity that understands English. So you have mental representations. You are capable of using your agency to translate mental representations of agentive world models (including hypothetical ones) into strings. The digital scale I referred to is not an agent, and does not have mental models, but nevertheless its display can generate strings... strings like "15".
Imagine I want to convey to you what your cat's weight is. — Bartricks
Agreed. It represents a mental model you've formed about a shared world model. But it doesn't represent my cat's weight. It just "apparently" represents my cat's weight.
You read the note, which says 'your cat weighs 15 stone'. Is that a representation? Yes. — Bartricks
Agreed. It conveys information about a mental model you have. But it doesn't properly inform me of what my cat's weight is.
Is information from me being conveyed to you? Yes. — Bartricks
In other words, it does not represent the weight of my cat.
Yet the mechanism I have employed is about as unreliable as it is possible to be. — Bartricks
...okay.
Now go back to my leaf. — Bartricks
...this doesn't seem to relate to what that 15 on the leaf represents.
and by purest fluke its markings cause you to believe that you are being told — Bartricks
Not sure why, but okay.
Now imagine that the connection between the leaf coming through the window and your cat's weight is very tight, such that if your cat did not weigh 15 stone it would not have come through the window. — Bartricks
Nope.
That's not going to make a difference, is it? — Bartricks
No, you're just crowing in a pathetic attempt to gaslight me.
No, you are just showing that you don't really know your stuff. — Bartricks
Words aren't concepts.
Not wordplay, it's just about grasping the concept. — Bartricks
The symbols "15" represents the weight of my cat. My cat's weight was conveyed to me.
Once more, you have acquired a true belief. But no information was conveyed to you. For no representation was made. — Bartricks
There's an infinite number of imagined scenarios where I can see the symbols 15 in such a way that they have no bearing on the weight of my cat. But they have no bearing on the fact that the scale's display showing 15 means my cat weighs 15.
Imagine a leaf floats in through the window and the markings on the leaf look like the number 15. — Bartricks
It sounds like you're confusing two things. "15" and "the cat is on the mat" are strings of symbols, written in a medium. We can call those signs. These signs exist on screens, displays, notes and the like. We form mental states from signs by reading them; but the signs don't require us to read them to be the signs they are. My scale would still show 15 if my cat stepped on it even if nobody read the display.
Mental states with representative contents are essential to perception. — Bartricks
With said caveats, sure.
For a mental state to have representative contents (and this is a vulgar way to speak, of course, for no mental state itself represents anything to be the case) it needs to be being used by an agent for the purposes of representing those contents to its bearer. — Bartricks
Your leaf example is superfluous. You already have a pie in the oven, and it doesn't refute my cat's weighing 15. I don't get why you think introducing a leaf with a 15 stamp is going to help you any.
The leaf that floated in through the window with 15 on it — Bartricks
You're just playing games. How you define a word is arbitrary. If I want to say a brainless creature with nerves perceives something, I might want a weaker definition.
First, perception goes by way of mental states with representative contents. You say you're willing to grant this, like there's an option to deny it. No, they're essential. — Bartricks
And yet, my cat weighs 15. There's no information giver here. So either this is a lingual quibble or it's wrong.
Second, 'conveying' information - as opposed just to acquiring a true belief - requires an information giver and an information receiver. — Bartricks
If the digital scale my cat steps on shows you're wrong, it's pointless to keep running back to your cloud writing.
And in the case where the sky writing — Bartricks
And yet, my cat weighs 15. That 15 was not conveyed to me by any mind. And yet, my cat weighs 15.
You then proceed to beg the question by supposing that it is somehow the squiggles that are doing the representing. No, they're not. Minds represent 'by' using squiggles to convey something to another. — Bartricks
If you bake that into your concept of perception, which is fair, then sure.
In order to be able to perceive a world one needs to be subject to mental states with representative contents, yes? — Bartricks
Sure.
in order for a mental state to be said (vulgarly) to 'represent' something to be the case, there would need to be an agent who is doing the representing in question. — Bartricks
This doesn't work. That you're trying to tell me about a cat isn't in question, so let's grant that immediately. But for you to succeed in your intent to inform me there is a cat via that note, you have to have written symbols on that paper that would convey that notion. Not all symbols do that; only particular symbols do that.
The note is not telling you anything; I am telling you about the cat via the note. — Bartricks
Well yeah, because you made a point regarding truth in the OP with respect to the sky writing (truth by fluke). But you were also talking about information being conveyed. So consider "the cat is on the mat". That's just a bunch of letters. But those letters have a meaning according to the rules of English; it's about some cat being "on" some mat. What it means for that statement to be true is for the semantic content behind those symbols to have valid referents. What it means for that statement to convey information regarding its truth to us (in the usual sense) is for those symbols to convey those semantic contents to us.
What you are doing, it seems to me, is focussing on the fact that we can nevertheless acquire accurate and justified beliefs — Bartricks
For you to convey "the cat is on the mat" to me as a true statement, it is insufficient for you to intend to tell me the same. You must also somehow be aware of the referenced cat's being on a mat.
The representing is done via them, but not by them. — Bartricks
Yes, I can tell how loose it is.
They have to be being used - used by an agent - for that purpose or a sufficiently closely related one before they can be said to be 'representing' something to be the case (and again, even then, this is loose talk, for the state itself does not do any representing). — Bartricks
That's what perception does. There's an image on your retina. Something happens, and lo and behold... a mental state is formed about a cat, such that you tend to have it if there is a cat there and not have it if there is no cat there. That is a mental state of "seeing a cat".
So we can have two states that are introspectively indiscernible, and one can be representing something to be the case, and the other not. In order for us to be perceiving a world, our mental states - some of them - need to be representing there to be a world. — Bartricks
Sure; hallucinating cats isn't seeing cats.
It is not sufficient that they be introspectively indiscernible from such states. They need actually to be representing something to be the case. — Bartricks
There's the question begging again.
And they will not be doing this unless an agent got them to arise in us for that very purpose. — Bartricks
That's a difference without a meaning.
If that is not the case - if our faculties have been forged by unguided natural forces - then although we will still acquire true beliefs about the world we are living in from them, we will not be perceiving the world, even though our situation would be introspectively indiscernible from what would be the case if we were. — Bartricks
I think this counteranalysis misses two major points.
At first the analysis we might give here is that the reason you don't 'know' that there is a pie in your oven is that it was just coincidental that the clouds formed those shapes and that the belief these shapes caused you to acquire was in fact true. — Bartricks
JTBs are TBs, but TBs aren't necessarily JTBs, so:
Er, yes. A justified true belief is still a true belief. So your 'no' was incorrect. — Bartricks
...my no correctly refutes that wrong part.
You acquire a true belief about your cat's weight, that's all. — Bartricks
And yes, the belief is justified. Relevance? — Bartricks
The symbols "15" produced by the scale represent the weight of my cat because my cat's weighing 15 causally relates to the symbols "15" being produced on that display. The symbols "250" on your weird plant thing are unrelated to my weight being 250. So the fluke note does not represent my weight. The "15" on my digital scale by contrast does represent my cat's weight.
And when you step on it it emits a seed that is paper-like and has squiggles on it that look, by fluke, like 'your weight is 250'. — Bartricks
Nope. But digital scales can show representations of weights using symbols.
Do weight machines greet you now? — Bartricks
No, I acquire a justified true belief about my cat's weight.
You acquire a true belief about your cat's weight, that's all. — Bartricks
And yet, the scale produces the symbols 15; and those symbols represent the weight of my cat. So apparently all those things the scale isn't doing, and doesn't have, don't have anything to do with the symbols representing the weight of my cat, since 15 does in fact represent the weight of my cat.
When I write you a note, the note isn't telling you anything. It doesn't have a little mouth or desires that you know things. — Bartricks
Yes, you're begging the question. We'll get to that later.
I am arguing that your faculties need to have been designed to tell you about the world if you are to be told about the world via them. — Bartricks
The note in this case is "15". It was produced when my cat stepped onto the scale. But apparently it cannot tell me anything. Nevertheless, 15 represents the weight of my cat.
If I write you a note saying "The cat is on the mat" is the note telling you that the cat is on the mat? No. I am. By means of the note. The note is telling you nothing. I am telling you something by means of the note. — Bartricks
No. I am challenging your messed-up notions of semantics here. I quoted the same exact quote where you messed it up in this thread.
And you are trying to challenge that with a weighing machine that is designed to give you information about your weight?!? — Bartricks