Of course there are exceptions where we need to verify someone's account of things, but my point is that there are many instances of knowing that don't involve the perspective of science. I'm saying that sometimes I have verification apart from science or experiment. — Sam26
But my life doesn't depend on guessing correctly; if it did, then things would be much different in terms of what we know. — Sam26
If I explain what's in my backyard, isn't that most likely a good explanation of what's in my yard, or do you need to test it? — Sam26
No. It is a simple enough materialist account. — apokrisis
Thank you for these tutorials in the philosophy of science. But you might want to check your facts. — apokrisis
Of course. In the same way that all theories have to be motivated by a counterfactual framing - one which could even in principle have a yes/no answer.
So are all minds the result of a mush of complicated neurology found inside skulls? As a first step towards a natural philosophy account of consciousness, does this feel 99% certain to you?
If not, why not? Where is your evidence to the contrary? — apokrisis
Does poking this delicate mush with a sharp stick cause predictable damage to consciousness? Well ask any lobotomy patient.
And so we can continue - led by the hand - to where neuroscience has actually got to in terms of its detailed theories, and the evidence said to support them. — apokrisis
All good moral questions. How do you answer them? — apokrisis
I thought it was because we all act the same way. Roughly. Within engineering tolerances.
You might need a neuroscience degree, along with an MRI machine, to tell if a person is indeed built the same way.
You know. Verified scientific knowledge and not merely social heuristics. — apokrisis
And do the tests claim the theory is true? Or do they make the more modest epistemic claim that the theory seems pragmatically reliable in terms of the purposes you had in mind? — apokrisis
Which of these standards do you want to hold mind science to? — apokrisis
Science promises pragmatism. And so one suggested test of artificial consciousness is the Turing proposal. Interact with the machine and see if it behaves exactly like all the other meat puppets that surround you - the people you might call your family and friends, and to whom you pragmatically grant the gift of being conscious. — apokrisis
You will never know whether it is actually true that your Mom has a mind. But for all practical purposes, I'm sure you act as if you believe that to be the case. — apokrisis
Does science, in principle, verify or falsify its hypotheses? — apokrisis
And would neuroscience talk about the feelings of insects in terms of them being composed of similar matter to humans - some matching proportion of carbon, hydrogen, oxygen, nitrogen, phosphorus, other trace elements? Or would the arguments have to be made in terms of having significantly similar "neural structure"? — apokrisis
China is just starting to act like we do. Yes, that could be a frightening prospect. — T Clark
The argument "science has failed to explain consciousness" against science's ability to explain consciousness is common enough, although I don't think that's Chalmers' argument. — Kenosha Kid
I guess, in a nutshell, it is hard for a country or superpower to get as big as the US without being a bit corrupt/"evil" when doing so. — dclements
I already explained how and why the US military and her allies follow something along the lines of the "Hitler doctrine" where they very aggressively (or even over aggressively) seek to hinder the expansion of power of ANY country that might be a threat to them. — dclements
If my intuitions are to be trusted about these cases, then it seems that if you (in an epistemically responsible way) acquire stolen goods but then do something to them that destroys their original value, you do not owe the original owner anything. — Bartricks
The thief doesn't have that right, but it doesn't necessarily mean ownership isn't vested in the new buyer, as long as he can demonstrate good faith and it doesn't concern a registered good.
Children cannot enter into valid contracts because they do not have the necessary will for offer and acceptance.
And there's no problem, it's been working fine for at least two centuries. — Benkei
This is a common mistake found in posts in perception/phenomenology threads.
Try to be clear about what it is you're referring to. — Caldwell
Do first person experiences count as phenomena or are they experiences of phenomena? — Janus
Is that an actual quote from Dennett? Did he actually say that?
If he did say exactly that, then the obvious critique would be that a third person account is not a first person account; so by definition a third person account cannot include a first person account without being something more or other than just a third person account. So, I cannot see how Dennett could be claiming that a third person account could include a first person account; I doubt he would claim something so obviously absurd, so I conclude that he must mean something else, and we would need to see the context to find out what that is. — Janus
'Pain' seems to be a word reserved to describe the experience had by the experiencer of a human. It would be a lie to say that I feel pain, in the context of this topic, since, lacking an experiencer, I cannot by definition feel pain any more than can a robot with damage sensors. Again, I may use the word in casual conversation (outside the context of this topic) not because I'm lying, but because I lack alternative vocabulary to describe what the pure physical automaton does, something which by your definition cannot feel pain since it lacks this experiencer of it. — noAxioms
I don't feel pain. I'm a zombie, remember? — noAxioms
We do not postulate anything. If you can see and touch a thing, you have to be far off to even think about the possibility that it might not "exist". That is the problem with undirected reflections and witty, but mindless, efforts. If, e.g., social constructivism tells us that we can construct the "reality" of things, it is clear that we can construct an idea of things that makes it impossible to say anything about it. Given that we can - why should we do it?
Where is step B? Where is the negation of the negation? What should the difference between empirical science and philosophy be, if it loses itself to its objects (e.g. "truth")? — Heiko
Who says it's unsolvable? — frank
OK, I was thinking more in terms of answers to questions, but in any case, "learning to accept my intrusive thoughts and not fight them" is a material change of behavior, isn't it? I mean instead of sitting or lying there and ruminating, don't you go and do something else? — Janus
All solutions to problems are material solutions. What other kind(s) of solutions can you offer an example of? — Janus
The point of course would be, how could you tell the mental state apart from a programmed response? I don't think, in theory, you could. — Sam26
Yes, and this is why I said, "...they lack the internal subjective experiences of a real self," which was meant to mean they are not conscious. It's difficult to know if such a zombie would really act like a conscious being. It seems that you could in theory make them respond just like us. It would be like playing a game, say, World of Warcraft, and not knowing if you're talking with a real person or not. — Sam26
If we're not conscious in the way that philosophers like Chalmers claim we are, then qualia would count as such a word in our universe. Idealism would be another. Platonism would be yet another. Not to conflate those three terms, but it demonstrates that if the world is physical, it doesn't prevent us from coming up with non-physical words. — Marchesk
One possible answer is that the zombie is just programmed to say these kinds of things. If, for example, our reality is a kind of program of sorts, then it's quite possible that some being (what we refer to as a person) might just be part of the program. They act like us, they talk like us, but they lack the internal subjective experiences of a real self. It's certainly possible, but unless you were able to remove yourself from the program, it would be difficult if not impossible to tell the difference. — Sam26
Actually Chalmers touches on this because, if his zombie-twin has an "inverted spectrum" of any conscious experience, for example (e.g. sees blue where the other sees red), then there necessarily will be different "causal histories" of that type of experience, even if the experiences themselves are the same. So isomorphic mapping of history can be problematic. — Pantagruel
I'll say no. Will you answer my question now? — InPitzotl
You said that before. Do you mean something besides "pain is painful"? — Srap Tasmaner
