> Two experiencing entities. — Daemon

Alright, I think we're talking past each other a bit. The two (mind you, not one) experiencing entities are a result of corpus callosotomy. The notion that experience is what makes you an entity cannot account for the fact that a corpus callosotomy should make two entities. Agentive integration, by contrast, explains why you are a single entity, and it does account for the fact that a corpus callosotomy should make two. Once again, experience is doing no work for you here; it's an epicycle.
> But I don't really think the effects of cutting the corpus callosum are as straightforward as they are sometimes portrayed, for example, the person had a previous life with connected hemispheres. — Daemon

No idea what you're saying here. Are you suggesting there are two individuals before the corpus callosotomy?
> I don't think the "corpus callosum/AHS" argument addresses this. — Daemon

Quite the opposite; see above. We can take an external view as well:
> Cutting the corpus callosum creates two entities. — Daemon

So if experience is what makes us an entity, how could that possibly happen?
> But hidden variables have been ruled out by experiment, yes? — tim wood

Yes. Experimental results violate Bell Inequalities. (FYI, there are "outs" for HVTs, but they require giving up something like locality, realism, etc.; some choose to do so.)
> All right, then, the game is not fair. QED. Is that the point? — tim wood

No:
> The point of this game is that it is Bell's Theorem in disguise — InPitzotl

What I can possibly do to rig the game is analogous to a Hidden Variable Theory. The "real" goal here is to explain the 1/4 probability (the "win" thing is just to encourage working the classical probabilities).
> Yes, I am a super determinist. — Philosophim

Are you sure?
> Yes, I am a super determinist. Once some type of existence is in play, it will act and react the same way identically each time. — Philosophim

That's not what superdeterminism means.
> That's because probability requires a certainty of certain facts for formulation. As soon as you said, "I might not be necessarily being fair," you remove the ability to make an accurate assessment of odds. — Philosophim

I think you're misreading the game. I can be unfair, but I can't change the game being played. All I can do is be maximally unfair while following all of the rules.
> With this, we can calculate the likelihood of standard deviation. — Philosophim

You're trying too hard. We're not talking about "in a given run". We're talking about this: I set up a casino, you come play, and I have a viable business model where your funds slowly drain into my casino.
> I read the discussion between you and the others after posting this, so you can be sure this was my personal and honest view, and not influenced by the other conversations. — Philosophim

And we're not talking about a puzzle you have to guess right at, either. This is open ended. You can look up the answer. You can have other people do the work. I'll work it out myself, and you can use my workbook.
> Lets use an easier model to digest, as odds work the same no matter the complexity. — Philosophim

...the puzzle stays open. Quantum mechanics would tell us the probabilities of this sort of match are 1/4. But classical probability can only bring us down to 1/3. Experiment appears to confirm quantum mechanics; that is, that Bell Inequalities are violated, as per Bell's Theorem.
> I've looked at this many times, and thought about it, but I just can't see why you think it is significant. — Daemon

I think you took something descriptive as definitive. What is happening here that isn't happening with the thermostat is deference to world states.
> But also there's a more fundamental point that I don't believe you have addressed, which is that a robot or a computer is only an "entity" or a "system" because we choose to regard it that way. — Daemon

You're just ignoring me then, because I did indeed address this.
> The computer or robot is not intrinsically an entity, in the way you are. — Daemon

I think you have some erroneous theories of being an entity. AHS (alien hand syndrome) can be induced by corpus callosotomy. In principle, given a corpus callosotomy, your entity can be sliced into two independent pieces. AHS demonstrates that the thing that makes you an entity isn't fundamental; it's emergent. It demonstrates that the thing that makes you an entity isn't experience; it's integration.
> I was thinking about this just now when I saw this story "In Idyllwild, California a dog ran for mayor and won and is now called Mayor Max II". — Daemon

Not sure what you're trying to get at here. Are you saying that dogs aren't entities? There's nothing special about a dog not running for mayor; that could equally well be a fictional character or a living celebrity not intentionally in the running.
> Given three cards, each either R or B, there are eight possible arrangements of R and B. — tim wood

That's correct. Eight isn't a large number, so let's list them. The possible arrangements are BBB, BBR, BRB, BRR, RBB, RBR, RRB, and RRR.
> And there are three ways of choosing two of three cards. That is, 24 possibilities. — tim wood

That's also correct. But I think you're missing this:
> I always shuffle the deck (incidentally, I am not necessarily being fair; take that into account). — InPitzotl

So there are 24 possibilities here, but that doesn't mean they're equally likely. I could be stacking the deck. So pretend you're me, maybe. How would you rig the odds? Well, in the BBB and RRR cases, you're guaranteed to win... so maybe I just never give you those deals.
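The deck-stacking intuition can be checked by brute force. This is a sketch under an assumption about the rules (that the player picks two of the three cards at random and "wins" on a colour match); the enumeration itself is just the eight arrangements listed earlier:

```python
from itertools import combinations, product

# For every possible deal of three R/B cards, count what fraction of
# the three possible two-card picks are a colour match.
fractions = {}
for deal in product("RB", repeat=3):
    pairs = list(combinations(deal, 2))      # the 3 ways to pick 2 of 3
    matches = sum(a == b for a, b in pairs)  # pairs with the same colour
    fractions["".join(deal)] = matches / 3

for deal, p in sorted(fractions.items()):
    print(deal, p)

# BBB and RRR match on every pair; every mixed deal matches on exactly
# one of the three pairs. So even a maximally unfair dealer who never
# deals BBB or RRR can't push the match rate below 1/3.
print(min(fractions.values()))  # 1/3
```

The point of the exercise: no classical stacking of the deck gets below 1/3, while the quantum-mechanical analogue of the match probability is 1/4.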
> I await your revealing the error in this reasoning. — tim wood

It's not really that kind of puzzle. The whole point of this puzzle is that it looks fishy. It's more relevant that it looks fishy than that you solve it. (It's also not new, though it's slightly in disguise here.)
> Lets use an easier model to digest, as odds work the same no matter the complexity. — Philosophim

Not... exactly.
> Does that mean the cards don't follow causality? — Philosophim

You tell me. I'm still asking you what your concept of causality is. It appears to me that you are indeed committing to sufficiency here, though.
> If that did not explain what you were asking, please try to rephrase the question with a deck of cards example. — Philosophim

Hmmm... that might be interesting. Okay.
> Yes, you've nailed it. — Philosophim

So to me, it sounds like your notion of causality is similar to that of "reason" in the Principle of Sufficient Reason, with the exception that I've yet to hear a commitment to sufficiency. I'd now like to explore sufficiency.
> I've been on a computer chips kick in my posts, so I suppose I'll continue with them. — Philosophim

I'm still not sure you answered my question.
> A transistor can either be on, or off. If it is on, the electricity will travel through the gate. When it is off, the electricity is cut off. Imagine that we have power constantly running to the transistor. Now imagine that the circuit is complete. We have electricity traveling that circuit. What caused electricity to travel the entirety of the circuit? At a particular scale we can say, "The gate was on". Or we could be more detailed and say, "And the electricity was on." — Philosophim

So let's go the other way. There's no electricity flowing out of the transistor. Can we ask what caused no electricity to flow out of the circuit? Can the answer be "the gate was off" and/or "the electricity was off"?
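The transistor example reduces to a plain conjunction, which makes the "other way" question easy to see. A minimal sketch (the function name is mine, purely illustrative):

```python
def current_flows(power_on: bool, gate_on: bool) -> bool:
    """Electricity flows out of the circuit only when both the supply
    and the gate are on (a plain AND, per the transistor description)."""
    return power_on and gate_on

# Asking "what caused no electricity to flow?" admits either answer,
# since either input being off suffices to block the output:
print(current_flows(power_on=False, gate_on=True))   # False: "the electricity was off"
print(current_flows(power_on=True, gate_on=False))   # False: "the gate was off"
```

Both absences are equally good "causes" of the non-flow, which is what the question above is probing.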
> I feel at this point you have something you want to say. Feel free to. Once I understand the larger point, I think we can get all of your questions out of the way at once — Philosophim

Honestly, no, I'm still trying to analyze this. I can still see what you possibly mean branching off in a few different directions, and I don't quite know which one you'll take. I reserve the right to make a point later, if I have one to make; but for now, I'm just trying to figure out where you're coming from.
> Its about things being a state captured in time, another state captured later in time, and an explanation for why the state of the later is different form the former. — Philosophim

Different from the former, as opposed to the same as the former?
> I am not trying to put my own spin on force here. Yes. All of these are forces in physics. — Philosophim

I'm just trying to capture what you mean by causing something to exist. It sounds like it would be less confusing to just drop the "exists" part... at this point I'm not sure what the difference is between "cause things to exist" and just "cause things".
> My apologies if I've been confusing. The state of the cue ball in its new velocity is not the same as the cue ball without velocity. This is a "new" state caused by the cue ball's collision. Without the cue balls collision, or an equally placed force, the 8 ball would not be in its new state of velocity. — Philosophim

Would gravity be a force? Magnetism? The Higgs mechanism?
> Yes, the 8 ball in a state of velocity is different from the 8 ball in a state of zero velocity. — Philosophim

I'm not clear how this is answering the question. Are you comparing the 8 ball before the cue ball hits it to the 8 ball after the cue ball hits it, or the 8 ball after the cue ball hits it to what the 8 ball would be were the cue ball not to hit it? And how does this relate to my question... what new thing was caused to exist?
> The reason it is in the state of velocity — Philosophim

This means nothing to me until you tell me what new thing was caused to exist.
> Depending on the scale of measurement, — Philosophim

You're a bit ahead of yourself here. I'm trying to figure out what you mean by causing something to exist, and you're having me pick scales for some reason or another.
> I believe the argument isn't concerned with scale, — Philosophim

Curious language... isn't this your argument? I would have thought you would be the authority on what was meant.
> Sure, the usual example in philosophy is a cue ball hitting an 8 ball. — Philosophim

Example of what? This sounds like a typical example of causality per se. My question is about what you mean by causing something to exist.
> The 8 ball exists in a new velocity state — Philosophim

Is there a new thing that exists when the 8 ball exists in a new velocity state?
> You could go plot the life of the entire ball up to its creation in the factory if you wanted. — Philosophim

Sure... would that be a new thing existing?
> 1. Either all things have a prior cause for their existence, or there is at least one first cause of existence from which a chain of events follows. — Philosophim

Could I get an example of a thing causing something to exist?
> This may be off topic, but that's one definition of intentionality, but not the phenomenological one. — Joshs

I don't think it's off topic, but just to clarify, I am not intending to give this as a definition of intentionality. I'm simply saying there's aboutness here.
> Don't or can't? — noAxioms

Irrelevant. This is ostensive.
> Would it make it not the same toaster if the name got scratched off — noAxioms

Yes.
> I'm talking about it actually being the toaster in question or not. — noAxioms

If you say so, but all I can talk about is what I mean by "being the same".
> That's not the story being pushed — noAxioms

That's David Chalmers' story. I'm not David.
> Not how I'm using it when I make a distinction. — noAxioms

You said the toaster feels warm. It doesn't matter how you're using the word "I"... the toaster doesn't feel warm. It lacks the parts.
> Age of five eh? — noAxioms

Yes; by that age, most humans learn theory of mind.
> Does that imply you were a zombie until some sufficient age? — noAxioms

No, it implies that you can, for example, pass the Sally–Anne test. Theory of mind has nothing to do with p-zombies.
> A computer can't tell you it's conscious? — frank

I'm a bit lost. This is what a zombie is according to Chalmers:

> A zombie is physically identical to a normal human being, but completely lacks conscious experience. — Chalmers (http://consc.net/zombies-on-the-web/)
> That's broadly what I meant when I said that it's experience that makes you an individual, but you seem to think we disagree. — Daemon

My use of "mind" here is metaphorical (a reference to the idiom "of one mind").
> Think about the world before life developed: there were no entities or individuals then. — Daemon

I don't think this is quite our point of disagreement. You and I would agree that we are entities. You also experience your individuality. I'm the same in this regard; I experience my individuality as well. Where we differ is that you think your experience of individuality is what makes you an individual. I disagree. I am an individual for other reasons; I experience my individuality because I sense myself being one. I experience my individuality like I see an apple; the experience doesn't make the apple, it just makes me aware of the apple.
> Suppose instead of buying bananas we asked the robot to control the temperature of your central heating: would you say the thermostat is only metaphorically trying to control the temperature, but the robot is literally trying? — Daemon

Yes; the thermostat is only metaphorically trying; the robot is literally trying.
> Could you say why, or why not? — Daemon

Sure.
> The word "meaning" comes from the same Indo-European root as the word "mind". Meaning takes place in minds. — Daemon

The word "atom" comes from the Latin atomus, which is an indivisible particle, which traces to the Greek atomos, meaning indivisible. But we've split the thing. The word "oxygen" derives from the Greek "oxys", meaning sharp, and "genes", meaning formation; in reference to the acidic principle of oxygen (formation of sharpness, aka acidity)... which has been abandoned.
> Bad analogy. In the case in question, nobody is ostensively using a term. — noAxioms

Of course they are. This is why they tend to say we have these properties, but these things over here, they don't. They are ostensively pointing to the properties, and they are formulating an incomplete theory in an attempt to explain the properties they are pointing to. And I even agree it's a bad theory about what they're ostensively including.
> The only way I can parse it, it is the followers of Chalmers that are making the error you point out, where a human is privileged in being allowed to call something water/cold/wet, but anything else (a sump pump moving the stuff) doing the exact same thing is not allowed to use such privileged language (the pump moves a substance which could be interpreted as water). — noAxioms

The notion that either we have an immaterial driver in the driver's seat experiencing things, or the toaster feels warmth, sounds like a false dichotomy to me.
> Is it the same rock, — noAxioms

Yes, it's the same rock...
> or merely a different arrangement of matter in the universe — noAxioms

...and it's probably that too. Most of the toaster's mass is in gluons, which are constantly obliterating and reforming. And the next grand TOE may do something even weirder with the ontologies.
> You can't point to your subjective feeling of warmth and assert the toaster with thermostat doesn't feel anything analogous. Sure, it's a different mechanism, but not demonstrably fundamentally different. — noAxioms

Actually, yes, I can. The toaster reacts to warmth. "Legal me" reacts to warmth as well. But "legal me" also reacts to an increase in blood acidity.
> This seems to be an example of the privileged language mentioned above. What I see as the 'bad theory' asserts privileged status to humans, raising them above a mere physical arrangement of matter, and assigns language reserved only for objects with this privileged status. — noAxioms

Your sales pitch here is a dud. I can play Doom on this computer. I might could even play Doom on my Keurig. But I cannot play Doom on this bottle of allergy pills.
> My son has one of those 'hey google' devices sitting on its table, and it might reply to a query with "I cannot find that song" or some such. — noAxioms

There are particular arrangements of physical matter that come in individual "toaster"-like bodies, which are embedded in their environments and must navigate them, and which regularly participate in conversations of various sorts with other entities. "Hey google" is not one of these things. But noAxioms is one of these things.
> I am not sure how to take this. Is this just a generic putdown, or did you mean something more specific? What am I missing? — SophistiCat

It was not a put-down; I'm not just using generic braggart language here. You're literally one step behind. The water example is a response to the response you just gave, and that response does not negate it. We did not discard the notion of water when we discarded the classical elements, and there is a good reason we did not. That we discarded phlogiston on replacing it with a better theory does not negate this good reason not to discard water when dropping classical element theory.
> Well, referring to the phlogiston theory as a theory of heat transfer was perhaps clumsy, but you have ignored the substance of my response in favor of capitalizing on this nitpick. — SophistiCat

That's not quite the clumsiness I was referring to. "X is a bad theory of Y" is to be understood in the sense of X being an explanans and Y an explanandum. In this sense, phlogiston theory is not a theory of phlogiston, because phlogiston is an explanans. The explanandum here is combustion; so phlogiston theory in this sense is a theory of combustion. When we got rid of phlogiston theory, we did get rid of phlogiston (explanans), but we did not get rid of combustion (explanandum).
> An eliminativist about personal identity could hold the phlogiston as a counterexample. — SophistiCat

I am pretty sure you're at least one step behind, not ahead of, the post you just replied to.
> But the preferred solution, at least in the case of the phlogiston, was not to come up with a better theory of the phlogiston, but to drop the stuff altogether as part of a better theory that accounts for the manifest reality of heat transfer. — SophistiCat

This is clumsily phrased. Phlogiston theory is a theory about combustion. It was replaced by oxidation theory, a better theory about combustion. We dropped the notion of phlogiston, but not the notion of combustion.
> The "I" on the other hand refers to the experiencer of a conscious thing, something which gives it a true identity that doesn't supervene on the physical. — noAxioms

I cry foul here. Imagine a believer in the classical elements telling you that he just fetched a pail of water from the well. When you ask the guy what water is, he explains that it is the element that is cold and wet. Analogously, you object... there is no "water"; for "water" refers to an element that is cold and wet, and we don't have such things. The problem is, the guy did in fact fetch the stuff from the well. This, I believe, is your error.
> No, that's the legal 'me' doing that. Any toaster has one of those. Any automaton can type a similar response in a thread such as this. — noAxioms

I've no idea what you mean by the legal "me", but the ostensive "I" to which humans refer is not something a toaster has. I can't comment on the automaton... the term's too flexible.
> It's experience that makes you an individual. — Daemon

No, being agentively integrated is what makes me (and you) an individual. We might could say you're an individual because you are "of one mind".
> In ordinary everyday talk we all anthropomorphise. The thermostat is trying to maintain a temperature of 20 degrees. The hypothalamus tries to maintain a body temperature around 37 degrees. The modem is trying to connect to the internet. The robot is trying to buy bananas. But this is metaphorical language. — Daemon

Not in the robot case. This is no mere metaphor; it is literally the case that the robot is trying to buy bananas.
> My contention is that a computer or a robot cannot understand language, because understanding requires experience, which computers and robots lack. — Daemon

And my contention has been, throughout, that you're just adding baggage on.
> There's no 'I' (a thing with an identity say) that's being me. — noAxioms

This phrase sounds suspicious. There's a me, but there's no I being me?
> My brain hurts now. I'll admit to having difficulties with the p-zombie argument when it comes time for the zombies to talk about consciousness. — Marchesk

Yeah, that's the real problem here. If qualia are epiphenomenal, how can we talk about them?
> The cat wants something. — Daemon

I'm not sure what "want" means at the level of precision you're asking for. The implication here is that every agentive action involves an agent that wants something. Give me some examples... my cat sits down and starts licking his paw. What does my cat want that drives him to lick his paw? It sounds a bit anthropomorphic to say he "wants to groom" or "wants to clean himself".
> The robot is not capable of wanting. — Daemon

The robot had better be capable of "trying to shop and get bananas", or it's never going to pull it off.
> You're wrong because the robot doesn't have a goal. — Daemon

Ah, finally... the right question. But why not?
> But a robot buying bananas is? — Daemon

Why not?
> But where do the goals come from, if not from "mere thought"? — Daemon

In terms of explaining agentive acts, I don't think we care. I don't have to answer the question of what my cat is thinking when he's following me around the house. It suffices that his movements home in on where I'm going. That is agentive action. Now, I don't think all directed actions are agentive... a heat-seeking missile isn't really trying to attain a goal in an agentive way... but the proper question to address is what constitutes a goal, not what my cat is thinking that leads him to follow me.
> It might be better to take a clearer case, as you drinking the coffee is agentive, which muddies the water a little. — Daemon

I've no idea why you think it muddies the water... I think it's much clearer to explain why shaking after drinking coffee isn't agentive yet shaking while I dance is. Such an explanation gets closer to the core of what agency is. Here (shaking because I'm dancing vs. shaking because I drank too much coffee) we have the same action, or at least the same descriptive for actions; but in one case it is agentive, and in the other case it is not.
> Agency is the capacity of an actor to act. Agency is contrasted to objects reacting to natural forces involving only unthinking deterministic processes. — Daemon

Agentive action is better thought of, IMO, as goal directed rather than merely as "thought". In a typical case an agent's goal, or intention, is a world state that the agent strives to attain. When acting intentionally, the agent is enacting behaviors selected from schemas based on said agent's self models; as the act is carried out, the agent utilizes world models to monitor the action and tends to accommodate the behaviors in real time to changes in the world models, which implies that the agent is constantly updating the world models, including when the agent is acting.
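The description above (a goal as a target world state, with behavior accommodated in real time against a continually updated world model) can be caricatured as a control loop. Everything here — the `World` class, `act`, the one-number "position" world — is an illustrative toy of mine, not a claim about any real agent architecture:

```python
class World:
    """A toy world whose entire state is a single number."""
    def __init__(self, position):
        self.position = position

def act(world, goal, max_steps=100):
    """Strive toward a goal world state: sense, compare, accommodate."""
    for _ in range(max_steps):
        model = world.position          # update the world model by sensing
        if model == goal:               # the goal is a world state, not a behavior
            return True
        # accommodate behavior in real time to the current world model
        world.position += 1 if model < goal else -1
    return False

print(act(World(0), goal=5))  # True: the agent homes in on the goal state
```

The contrast with the thermostat, on this sketch, is not the loop itself but the deference to world states: the loop terminates on what the world model says, not on a fixed behavior.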
> That isn't a difficult question. — Daemon

So answer it.
> Only conscious entities can be agentive, but not everything conscious entities do is agentive. — Daemon

...doesn't answer this question.
> You seem to be contradicting yourself. — Daemon

Just to remind you what you said exactly one post prior:

> But neither does a robot. — Daemon

Of course the robot interacts with bananas. It went to the store and got bananas.
> What is this "do" you're talking about? I program computers, I go to the store and buy bananas, I generate a particular body temperature, I radiate in the infrared, I tug the planet Jupiter using a tiny force, I shake when I have too much coffee, I shake when I dance... are you talking about all of this, or just some of it? — InPitzotl

...this is what you quoted. This was what the question actually was. But you didn't answer it. You were too busy "not counting" the robot:
> They aren't agents because they aren't conscious, in other words they don't have experience. — Daemon

I'm conscious. I experience... but I do not agentively do any of those underlined things.
> When your robot runs amok in the supermarket and tears somebody's head off, it won't be the robot that goes to jail. — Daemon

Ah, how human-centric... if a tiger runs amok in the supermarket and tears someone's head off, we won't send the tiger to jail. Don't confuse agency with personhood.
> If some code you write causes damage, it won't be any good saying "it wasn't me, it was this computer I programmed" — Daemon

If I let the tiger into the shop, I'm morally culpable for doing so, not the tiger. Nevertheless, the tiger isn't acting involuntarily. Don't confuse agency with moral culpability.
> I think you know this, really. — Daemon

I think you're dragging a lot of baggage into this that doesn't belong.
> Would Chalmer's P-zombie twin also have the same evolutionary history as Chalmer? — RogueAI

Zombies are functionally equivalent to conscious entities. Generically, distinct entities have different evolutionary histories (because "you count to two when you count them"), but given the functional-equivalence clause in the definition, any treatment of p-Chalmers as saying something Chalmers says is by definition fair game.
> Now, Fate is defined as: "the development of events beyond a person's control, regarded as determined by a supernatural power." And, as for Free-Will, this is defined as: "the power of acting without the constraint of necessity or fate; the ability to act at one's own discretion." — Lindsay

Generically, I reject fate outright. I'm agnostic about determinism. And I'm agnostic about free will. This definition of fate roughly fits the concept of fate that I reject. I have conceptual issues with this particular definition of free will, as is requisite to fit my free-will agnosticism. So I don't quite mesh well with the fate/free-will yin/yang concept here.
> You seem to be contradicting yourself. — Daemon

I'm pretty sure that if you understood what I was saying, you would see there's no contradiction. So if you are under the impression there's a contradiction, you're missing something.
> The other day you had a robot understanding things, now you say a computer doesn't know what a banana is. — Daemon

> the CAT tool still wouldn't know what a banana is. — InPitzotl

Your CAT tool doesn't interact with bananas.
> I've been saying from the start that computers don't do things (like calculate, translate), we use them to do those things. — Daemon

What is this "do" you're talking about? I program computers, I go to the store and buy bananas, I generate a particular body temperature, I radiate in the infrared, I tug the planet Jupiter using a tiny force, I shake when I have too much coffee, I shake when I dance... are you talking about all of this, or just some of it?
> I thought some examples of Gricean Implicature might amusingly illustrate what computers can't understand (and why) — Daemon

I think you're running down the garden path.