• The imperfect transporter
    if a one-particle difference is all it takes to remove identity, then identity is lost every moment anyway
  • On emergence and consciousness
    Personally, I think you're not taking the emergent possibility seriously enough. The possibility that consciousness really does emerge from certain large-scale physical arrangements and interactions. I think the idea seems alien to you - which is fair, it's by no means easy to grasp - and so your reflex is to go for something that's at least apparently more intuitive.
  • An unintuitive logic puzzle
    how are you getting those probabilities?
  • To What Extent is Panpsychism an Illusion?
    Illusion is the wrong word. It's either correct or incorrect. If it's incorrect, it's no more illusory than the illusion that 2+2=6 (I don't think that's an illusion at all, it's just the wrong answer)
  • An unintuitive logic puzzle
    yeah, nice nested logic there. I think that's right.
  • An unintuitive logic puzzle
    unfortunately that's always been the only possibility that matters anyway. From the beginning, we already know everyone can see everyone else's eye colours, just not their own - the only thing that matters is that one unknown.
  • An unintuitive logic puzzle
    so all that says is that, other than the guru, there can't be 2 non brown non blue eyed people. So? There can still be 1.
  • An unintuitive logic puzzle
    so can you phrase it better now? Because I still don't get what reasoning you're offering.
  • An unintuitive logic puzzle
    Next: if there were two or more islanders that had neither blue nor brown eyes, then there would have to be 98 or less people with either brown or blue eyes instead of 99 (other than the guru), and any islander could see that that is not the case.
    — ToothyMaw

    I don't get this paragraph. There's a green eyed person, and everyone who doesn't have green eyes sees her.
  • An unintuitive logic puzzle
    there's steps in there that you didn't really explain
  • An unintuitive logic puzzle
    So, what is the point of the guru's comment?
    — L'éléphant

    That's half the puzzle.

    Can the islanders not know by counting how many islanders present and how many blue eyes and brown eyes? (I get it that each one of them will end up counting 99 and 100) But is it just us who know this fact?
    — L'éléphant

    They don't know their own eye colour. They can count everyone's colours except their own, but counting 99 and 100 doesn't tell them their own. They can't just assume it's an even split.

    3. In what context is the "on what night" the islanders leave? Do we respond, the first, the second, the third, and so on?
    — L'éléphant

    You could say, "the islanders don't rely on the guru saying anything, everyone leaves the island on the third night" or "the islanders leave the island on the second night after the guru speaks." The thing that happens every night, once a night, is that the ferry comes. Maybe they can't figure it out the first night, but somehow waiting a day gives them extra information.
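    The "waiting a day gives them extra information" structure is the standard induction on the number of blue-eyed islanders. Here's a toy sketch of my own (the helper name `departure_night` is mine, not from the thread):

    ```python
    def departure_night(n_blue):
        """Night (after the guru speaks) on which all blue-eyed islanders leave.

        Base case: a lone blue-eyed islander sees zero blue eyes, so the
        guru's announcement can only be about her; she leaves on night 1.
        Inductive step: an islander who sees k blue-eyed people waits to
        see whether they all leave on night k.  When nobody leaves, the
        only remaining possibility is that she is the (k+1)th blue-eyed
        person, so she leaves on night k+1.
        """
        if n_blue == 1:
            return 1
        return departure_night(n_blue - 1) + 1
    ```

    For the puzzle's 100 blue-eyed islanders this gives night 100: each night that passes without a departure eliminates exactly one hypothesis.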
  • On emergence and consciousness
    I don't know at what point of complexity I think an entity must attain before its subjective experience can be causal
    — Patterner

    That's a pretty big problem. Everything else fundamental is also fundamentally causal. It's not fundamental now, causal later - it's causal at a fundamental level. If consciousness isn't causal at a fundamental level, but only becomes causal at some scale of complexity... then the whole idea, in my opinion, crumbles
  • Referential opacity
    If I misread your lack of further comment, that'd be pleasing.
    — Banno

    Your comment (the one I replied directly to) sounded like it was explicitly agreeing with me. That's all
  • Referential opacity
    right
    — flannel jesus

    Not convinced?
    — Banno

    Why does me saying "right" lead to you saying "not convinced?"? "That's right" is the kind of thing someone says when they are agreeing with something...
  • An unintuitive logic puzzle
    If we follow it through, then if I'm an islander with red eyes, I will still erroneously conclude on day 100 that I have blue eyes and get thrown off the ferry.
    — Mijin

    based on what?
  • Referential opacity
    “if x and y are the same object, then x and y have the same properties"

    A typical example involves Lois Lane believing that Superman can fly, but she doesn't believe Clark Kent can. Yet Superman=Clark Kent.

    a. Superman is Clark Kent. Major
    b. Lois believes that Superman can fly. Minor
    c. ∴ Lois believes that Clark Kent can fly. a, b =E
    — IEP
    — frank

    Allow me to risk being idiotic, but perhaps part of the solution lies in thinking "Lois believes that Superman can fly" is not a property of Superman. It's a fact that you can say, but it's not a property as such.

    Seems more like that statement is about a property of Lois Lane.
  • An unintuitive logic puzzle
    you're clearly confused.

    Just because it's true that there's 100 blue eyed people doesn't mean any individual blue eyed person knows there's 100 blue eyed people. They don't have that fact available to them. There could be 101 brown eyed people for all they know.

    You can't use information available to us, from outside the island, as if it's necessarily available to them.
  • An unintuitive logic puzzle
    how does he know there's 100? Nobody said that.
  • An unintuitive logic puzzle
    So, through logical deduction, that person must have blue eyes since there are only 100 people in total with blue eyes, that person must conclude that he/she is the 100th person with blue eyes.
    — night912

    What's the logical deduction?
  • On emergence and consciousness
    Sure you can. You can measure its effect on everything else.
    — noAxioms

    I'm also curious about this. Effects are measured in physical change. You measure a physical change, how do you determine that it was fundamental consciousness that caused that rather than something else? Some other physical cause?
  • The Christian narrative
    hasa diga eebowai!
  • The imperfect transporter
    because it kinda means you're constantly dying
  • On emergence and consciousness
    okay, you've obviously developed your entire unique language for talking about this, that uses the same terms other people use but with entirely new meanings unique to you. I don't think I can wade through it.
  • On emergence and consciousness
    then you're not talking about properties

    When someone says consciousness is a property of certain complex systems, they're not saying "consciousness is a specific shape that the system can take". Properties are not shapes.
  • On emergence and consciousness
    I don't know why you think that or where you got that from. You think properties can be exhausted, you think functions can be exhausted, I think you've invented this whole idea of exhausting properties, I don't think it comes from anybody who knows what they're talking about when it comes to emergence.

    Think about a Turing complete system. You can technically write any program in any Turing complete system - the size and capabilities of that program are limited by the number of units in your Turing complete system, but if you increase the units (like individual logic gates and storage capacity), you increase the number of things it can do.

    Even though a particular logic gate may have a remarkably small set of properties, when you combine many logic gates, the number of new possible programs - with new possible system-level properties - increases rapidly. "Exponentially" is probably an understatement. More rapidly than that.

    So the number of possible system-level properties isn't just limited by the number of properties of the components - it also increases exponentially with the number of those components. You seem to think that if you count the properties of the components, you can somehow figure out a specific number of properties any system made of those components can have, without taking this fact about Turing complete systems into account. There's genuinely no hard limit from just the properties of the components - add more components in the right ways, and the higher-level systems can have more and more properties. Once you have Turing completeness, there's genuinely no limit.

    So where are you getting these ideas about exhausting properties from? Did you just make it up?
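    The combinatorial point is easy to demonstrate with a toy sketch of my own (not anything from the thread): start from nothing but input wires and a single gate type, NAND, and keep wiring gates together. The set of reachable behaviours grows to every boolean function of those inputs - 2**(2**n) of them - even though one NAND gate by itself has a single fixed, tiny truth table.

    ```python
    from itertools import product

    def nand_closure(n_inputs):
        """Truth tables reachable by wiring NAND gates over n_inputs wires.

        A truth table is a tuple of outputs over all 2**n input rows.
        Start from the bare input wires and keep adding NAND(a, b) for
        every pair of already-reachable tables, until nothing new appears.
        """
        rows = list(product((0, 1), repeat=n_inputs))
        tables = {tuple(r[i] for r in rows) for i in range(n_inputs)}
        while True:
            new = {tuple(1 - (a & b) for a, b in zip(t, u))
                   for t in tables for u in tables} - tables
            if not new:
                return tables
            tables |= new

    # One gate type with one fixed behaviour, yet with 2 inputs the
    # closure reaches all 2 ** (2 ** 2) = 16 boolean functions.
    ```

    The count of possible system-level behaviours here is doubly exponential in the number of inputs, which is the sense in which "exponentially" understates it.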
  • On emergence and consciousness
    because the question doesn't even make sense. It's like a Christian asking an atheist, "oh yeah, well how many angels are there?" What the fuck do you mean how many angels are there? I'm a fucking atheist. YOU tell ME how many angels there are.
  • On emergence and consciousness
    otherwise tell me the number of functions
    — MoK

    This is a completely inappropriate question for you to ask me. This is the question YOU have to answer. I never said anything about how the functions or properties are exhausted, YOU did. For you to know that, you have to somehow have a full list of all those properties and all those functions, and a proof that there can't be any more functions. My position has no such constraint.

    Your approach here has been really weird. You're saying now that the functions don't have to be one to one, but you said before that the available properties have been exhausted. That statement only makes sense if you think a property can "be exhausted" by being used in one function and then not being able to be used again, with other properties in some other function.

    Your entire approach here is completely bizarre. I don't think you have any idea what you're talking about. Weak emergence, strong emergence, and properties being functions of other properties - you're talking about all three of those things seemingly unaware of how drastically differently everyone else uses those terms. You're in way over your head here.
  • On emergence and consciousness
    you haven't shown that anything is complete though. You say "exhausted", it seems like you just want me to take your word for it. You're not making a case for it.
  • On emergence and consciousness
    I'm failing to see how any of that is an argument for "a function of" meaning a one to one mapping. I think that's just a deep confusion of yours, and unfortunately it doesn't seem like you want to hear that. In a mathematical sense, if you have "a function of" many variables (like a function of lower level properties), that function doesn't just have a one to one relationship between the many inputs and the output. That's not what anybody else means but you. You seem pretty locked into that though and I can't pull you out.
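    A purely illustrative toy example of my own makes the mathematical point: a "system-level" value defined as a function of many component-level values is typically many-to-one, not one-to-one.

    ```python
    def system_parity(component_bits):
        """A system-level property computed from many component properties.

        Many distinct component configurations yield the same system-level
        value, so the mapping is many-to-one, not one-to-one.
        """
        return sum(component_bits) % 2

    # Different component configurations, same system-level property:
    assert system_parity([1, 1, 0]) == system_parity([0, 0, 0]) == 0
    ```

    Nothing about "f is a function of x, y, z" requires each output to pair off with one specific input property.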
  • On emergence and consciousness
    and who says the functions have to be one to one? Why does it have to be "a specific property that relates to a specific property"? I just don't think that's true at all - I think you've invented this conception of how a function has to work and you've imposed it too strictly.

    Any number of properties can be combined in any number of ways to create any number of system-level properties. It's not a property-to-property one-to-one mapping.

    Think about a high level property in Conway's game of life - a glider has the property that it travels diagonally. This property doesn't come about because of a one to one mapping with some specific property of the little pieces, this property comes about because of the interactions of many of the properties of many little pieces.

    It's not one to one at all, and nobody but you is talking about properties being functions of other properties like it has to be one to one.
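    The glider example can be run directly. Here's a minimal sketch of the standard Life rules (my own code, nothing from the thread) showing that after four steps the glider reappears shifted one cell diagonally - a pattern-level property produced by the interactions of many cells, not a one-to-one mapping from any single cell's properties:

    ```python
    from collections import Counter

    def step(live):
        """One Conway's Game of Life step on a set of live (x, y) cells."""
        # Count how many live neighbours each cell has.
        counts = Counter((x + dx, y + dy)
                         for (x, y) in live
                         for dx in (-1, 0, 1)
                         for dy in (-1, 0, 1)
                         if (dx, dy) != (0, 0))
        # Birth on exactly 3 neighbours; survival on 2 or 3.
        return {c for c, n in counts.items()
                if n == 3 or (n == 2 and c in live)}

    glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
    g = glider
    for _ in range(4):
        g = step(g)
    # The whole pattern has moved one cell down-right.
    assert g == {(x + 1, y + 1) for (x, y) in glider}
    ```

    No individual cell "travels diagonally"; only the pattern does.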
  • On emergence and consciousness
    exhausts all functions? What does that mean?

    It seems like you think every property of a system has to have a 1-to-1 mapping to a property of the components, and that somehow you know that all the available mappings have been taken before consciousness can be accounted for. I don't see why you think either of those things are true. Have you mapped ALL the system properties before consciousness? And where are you getting this 1-to-1 idea from?
  • On emergence and consciousness
    what I'm trying to get at is, the way you've described both strong and weak emergence, the higher level property is "a function of" what's happening at a lower level in either case.
  • The imperfect transporter
    I'm not equating lag with the classic understanding of memories. Or to put it another way, the definition of the term "real time" is from the perspective of the individual, not a third person observer with a stopwatch.
    — LuckyR

    Sure, and I think that makes sense as well
  • On emergence and consciousness
    strong emergence, by which they mean that the experience is the result of the properties of matter in the brain only
    — MoK

    Up here you say experience is strong emergence because it's the result of the properties of matter in the brain only. That's "a function of". Why do you think that's not "a function of"?
  • The imperfect transporter
    ironically though, it turns out studies from neuroscience tell us that our experience of "the present" is constructed with a slight lag, and so your present ACTUALLY IS composed of some memories - things that happened to you at least a few moments ago, but which you still perceive as "effectively now".

    Though that's still obviously very different from memories from a few days ago. You're not confusing your "smeared present" with memories from last Wednesday unless you have serious neurological problems.
  • On emergence and consciousness
    okay, well at least I've done my part in informing you that you're using those words in a way other philosophers who are familiar with them are likely to misunderstand. To most philosophers who use them, "strong" and "weak" emergence don't mean what you mean by them.

    Even the way you use the phrase "a function of", now that I've realised what you've been saying the whole time, turns out to be off from how everyone else uses it.
  • On emergence and consciousness
    I truly think that you've got entirely turned around on what the difference is between strong and weak emergence. In your OP, you worded certain things in a way that made it sound like you got it right, but since then you seem to have doubled down into what look to be interpretations that are the direct opposite of what those two terms mean.

    Even in this last post, you say strong emergence is "properties of matter in the brain only" and weak is "a function of the properties of the brain". Something is mixed up for you.