• Nobody's talking about the Aliens


    Right, it doesn't fit the bill for Hume. What I never understood though is how he could maintain his view on miracles when his thoughts on the problem of induction would seem to lead us to conclude that miracles are impossible to rule out, and can only be argued against by circular appeals to induction.
  • "Why I don't believe in God" —Greta Christina
    I've been writing an essay on this for a while. As someone who grew up in a militantly atheist household and has also spent a lot of time at Evangelical churches (which preach conversion and the Great Commission above all else), I think there is a fundamental disconnect in how the Church writ large appeals to people. It is generally concerned with emotional appeals to people who grew up at least culturally Christian, and this is a diminishing share of the population. It doesn't address the two main issues I see, both raised in the OP.

    1. The idea that Christianity or religion in general is incompatible with naturalism, or requires belief in superstitions. I don't believe this is true, but it is a common conviction among both atheists and the faithful alike.

    2. That there is no explanation for religious pluralism within religious traditions themselves. There is only "well they are all wrong/lacked the Holy Spirit, we are right." Actually, the Catechism of the Catholic Church has a surprisingly ecumenical section on other sects; I'll try to find it. But polysemy was very big with the Church Fathers, and such pluralism is not surprising if one assumes that God works, unfolds in God's immanent form, through world history.

    The dialectical churn of faith and reason seems to be what is needed to drive humanity towards goals like freedom, contemplation of the Absolute, self-development, etc. And indeed the Bible is an example where God starts off commanding from on high, then teaches laterally through fellow men in the Gospels and Acts, and then moves towards an indwelling, internal mode of self-development with the advent of Pentecost.

    The main barrier re naturalism, I feel, is that people have a fairly inaccurate view of what science says the world "must be like": the limits on speculation, where empirical fact begins to cull possibilities. While in the upper reaches of the sciences and philosophy it has long been accepted that 19th-century corpuscular, reductive materialism has major problems, and I don't think even modified versions remain among the more popular conceptions there, it remains popular writ large. It is far and away the most popular layman's interpretation of what science says "the world is like." Thus, even more sophisticated presentations of faith for the curious tend to result in people talking past each other, because you need vast detours into other areas to set up the groundwork on which such arguments are made.

    Over time, people simplify and crystallize metaphysical views of the world, but this process has stagnated because no one paradigm has come to replace the one popular 140 years ago or so. Thus, you have a bit of an idiosyncratic grab bag floating out there.
  • Literary writing process


    Right, and it doesn't have a good solution I can think of. That's part of the problem. I tried to lighten it by making the modern plotline about budget disinfo merchants full of levity, a bit absurd, but I don't know if it works. It makes for stuff that is more fun to write at times though.

    Well, if we need inspiration…

    She shuffled some books on her desk, found what she was looking for, a small rectangular package. The label on the front of the package was a gold-on-orange holographic image. From one angle it showed a muscular, bearded man in a toga, rolling a stone up a steep hill. Depending on how you tilted the package, you could make the stone roll up or down the hill, in endless repetition. But, if you tilted it far enough, a totally new image would appear, the face of a man, eyes comically red. Many customers didn’t know it, but this was the face of the French existentialist, Albert Camus. Above his face popped out the words:

    “Absurdly Good Weed™”

    Then, below the face:
    “One must imagine Sisyphus stoned.”

    She opened the package and pulled out a joint.

    And what sort of story would a disinfo merchant fall for?

    Hilde looked back down at the books cluttering her desk. Her eyes locked on Plato’s dialogue on the immortality of the soul, the Phaedo.




    Still, Chris wasn’t naive about what most people would say about their work. Purveyors of misinformation. Disinfo merchants. Propagandists. Liars. Trolls.

    Or, as one journalist for the Des Moines Register had put it, Nigel was “a rotund British cancer on the American body politic, not talented enough to metastasize, but hardly benign.”

    “Fucking self indulgent purple prose — who does this asshole think he is writing for?” Nigel had fumed, showing the article to Chris. “Not talented enough? I turn down bigger jobs all the time. I keep a low profile because I’m not a moron like this absolute pleb.”

    This had been during the phase when Nigel was using “pleb” as his go-to insult. The insult held no classist connotations when wielded by Nigel. He frequently painted billionaires and officials in high office with the label.

    “Pleb,” for Nigel, was short for “plebeian of the soul,” a term he had adopted after being turned on to the works of the ultra-conservative, caste-system-advocating esotericist Julius Evola. He had come across the fascism-adjacent wizard, or sorcerer, or what-the-fuck-ever people who do “magic” call themselves, via some godforsaken VR community that Nigel had been frequenting back then.

    Evola had convinced Nigel that he was an “aristocrat of the soul.” From that it followed that his enemies were “the plebs,” the low-class mob hoping to drag others down to their level of “spiritual mediocrity.” This was worse than Economic Marxism — worse than Cultural Marxism even — this was… Spiritual Marxism.

    Nigel had been particularly insufferable during this period, frequently accusing Chris of “Spiritual Marxism” and its attendant ills, whenever Chris had pushed back on his increasingly unhinged ideas. For Chris, the turn had been evidence that even his boss, so astute in fathoming the psychology of the masses, was not immune to the lure of intrigue, controversy, and self-flattery.

    It had also been a period of significant “biohacking,” Nigel’s preferred term. Biohacking was “the rational and intentional alteration of one’s own neurochemistry to help maximize productivity, achieve one’s goals, and fully realize one’s potential.” It was, “better living through science,” “the use of entheogens to achieve a fit-to-purpose physicochemistry conducive to the demands of the modern workplace.”

    Biohacking, per Nigel, was a premier example of “the application of Logos to Psyche,” the “triumph of Gnosis over Eros.”

    Chris had secretly thrown out the man’s cocaine stash, a key “biohacking reagent,” after he had, only half-jokingly, referred to it as “Aristocrat’s Powder.”

    In retrospect, this inflation of the man’s eccentricities had foreshadowed his downfall, the end of the first company, and his fourteen-month, all-expenses-paid “vacation” to the Yazoo City Federal Corrections Complex. He had been a bubble ready to pop, destined for the “Zoo.”
  • A question for Christians


    "Peak experience," is sort of an anachronism. It's from Maslow, working in the 1960s. It just means a powerful, intense experience that becomes a foundation for identity, defining, and highly memorable — a lens through which the world is viewed. Many mystics describe only a handful of such ecstasies, Boehme for example.

    Eckhart wasn't excommunicated, but his teachings were suppressed (half-heartedly), so you have a point there. However, he was never even officially condemned, just passages from his work. He did have a trial in Germany that tried to condemn him, but as a Dominican he wasn't under their jurisdiction, so it was a bit of an exercise. Only his university in Paris or the Pope could try him, and Paris demurred. He appealed to Avignon and a Papal committee reviewed his work. They turned up like 126 suspect statements, then dropped all but 28. Eckhart died mid-trial, at 67-68, not an uncommon age to die then.

    Normally, the issue would have been dropped, but the Pope was facing multiple mystical challenges, particularly a feud with the Franciscans over whether Christ and the Apostles owned property, and decided to release a bull condemning some of Eckhart's statements as "error or heresy." Not necessarily heresy. Eckhart himself never really was forced to recant; his recantation, which he gave readily, just says "I reject anyone who misreads my work as not being catholic teaching and orthodox." Funny stuff.

    Really, he was more someone who got caught up in political feuds and a larger wave of, in some cases, obviously unorthodox mysticism.

    He was still buried with the full rites and honors of his position and has since been rehabilitated. That so much of his writing survived, and that he influenced Boehme, Hegel, etc. so much, shows the condemnation didn't really have much influence in the end; people saw it as the political gesture it was.

    Not that the Church wasn't burning people for heresy then; the same Pope who started the inquest on Eckhart had four Franciscans burnt over the question of Christ's poverty around the same time period. It's just that Eckhart's inward-looking mysticism never aroused the same political passions.
  • Literary writing process


    Thanks for the feedback. That's exactly the stuff I like to hear! That's why it's rough, not so much not having been edited for grammar, but I'm thinking I may need to break up the exposition. The main critique I've always heard of Bakker is that there is too much exposition up front. Same for Game of Thrones; I found the whole first book to be a bit sloggy. But I can turn the exposition into dialogue easily enough. I've been rereading the Black Company because I think Cook does a good job at this, even though his story and setting are much less complex.

    It's the second chapter (really third, since I do interludes between each chapter), because it's a bit more abstract, less "grabby." But it's the one chapter where magic is front and center, because the Refuge is the one place in the world where it's common. I think I said in an earlier chapter that the entire HRE stand-in has about 8 million people and can muster 1,600 sorcerers of varying ability, so fairly uncommon elsewhere, 0.02% or so.

    In any event, I'm sort of paused on this project because I started another one about people living for generations in an infinite house (labyrinth of rooms in every direction), and searching for a way out, interspersed with some modern story lines. It allows for a lot more dialogue and humor, less "genre fiction," and so I figured it likely has wider appeal.
  • Reading "The Laws of Form", by George Spencer-Brown.


    Interesting. You would think that a process view would tend to collapse the distinction between abstract and physical. Maybe not.
  • Literary writing process


    lol, it's only accessible through the Emerald Tablet and ascending the Sephirot! Or... I've forgotten to include the link and just added it. :cool:
  • A question for Christians


    It makes sense to me. What does not make sense is the idea that faith is preferable to knowledge, or that knowledge cannot replace faith. This is the anti-mystical idea that for me undermines the credibility of the church's dogma and alienates modern thinkers.

    Learning anything requires a certain degree of faith but the idea of learning what must always remain merely a faith, and is merely a faith even to those who teach it, will be unappealing to a rational person.

    I don't know if that is necessarily a common teaching; certainly, it is not universal. Faith is multifaceted. Do we choose what it is that we believe? To be sure, our beliefs are reinforced by willful acts. E.g., I come to agree with some position in biology through my choice to study it more thoroughly (where I am at with EES actually). But in an important sense, beliefs are beyond our control. My car is blue; I cannot have faith that it is really red. But there is faith that and faith in, the latter being a sort of "moral regard for," and this is more controlled by the will. Notably, the Greek commonly translated as "faith" in Acts and Paul's letters can mean "arguments in favor of."

    The goal is to build up both sorts of faith, through knowledge on the one hand, and experience on the other. Bonaventure mentions "three books" that we learn about God from: the Book of Nature, the books of Holy Scripture, and the Book of Mystical Experience. Jean Gerson, writing in the late 1300s, puts a more apt label on the mystical experience than William James' more influential effort, which is too focused on "peak experiences," IMO.

    Gerson distinguishes between intellectual knowledge of a person -- the knowledge of the physician and the biographer -- and personal knowledge of that same person -- the knowledge a child or spouse might have. Mystical theology is simply the way we come to know God in that latter way, through experience. No visions or ecstasies required (and this is where people get tripped up). And indeed, many mystics, Thomas Merton, Bernard of Clairvaux, etc., do not seem to have had any Jamesian "peak experiences" (while others well worth studying, like Saint Hildegard, the Sibyl of the Rhine, obviously do).

    It's worth noting that when speaking of supernatural gifts that might strengthen faith, Paul says:

    Love never ends; as for prophecies, they will pass away; as for tongues, they will cease; as for knowledge, it will pass away.

    For our knowledge is imperfect and our prophecy is imperfect;

    but when the perfect comes, the imperfect will pass away.

    When I was a child, I spoke like a child, I thought like a child, I reasoned like a child; when I became a man, I gave up childish ways.

    For now, we see in a mirror dimly, but then face to face. Now I know in part; then I shall understand fully, even as I have been fully understood. So faith, hope, love abide, these three; but the greatest of these is love.

    - Saint Paul of Tarsus - First Epistle to the Corinthians 13:8-13

    Signs and wonders strengthen faith, but Christ tells Thomas in John 20, "because thou hast seen Me, thou hast believed. Blessed are they that have not seen and yet have believed." Similar points are made elsewhere; for example, Jesus chides a man in John 4:48, saying "except ye see signs and wonders, ye will not believe." If we are not to see wonders, then our faith must be built up through an intentional search of the "Three Books."

    Knowledge, intellectual ascertainment of Truth, only passes away, only has secondary status, because we come to experience such Truth directly, in a way that is perfectly simple and complete, an unmediated whole, Saint Denis' divine "Darkness Above the Light."

    I think the early Augustine's commentary is instructive here.

    6.13 Reason is the soul’s contemplative gaze. But it does not follow that everyone who contemplates sees. Rightful and perfect contemplation, from which vision follows, is called virtue. For virtue is rightful and perfect reason. But even though the soul may have healthy eyes, the contemplative gaze itself cannot turn toward the light unless these three [virtues] have become permanent: faith by which it believes the reality which it gazes upon can, when seen, make us blessedly happy; hope by which it trusts that it will see if only it contemplates intently; love, by which it yearns to see and to enjoy. Then the vision of God flows from the contemplative gaze. This vision is the true goal of our contemplation, not because the contemplative gaze no longer exists, but because it has nothing further to strive toward...

    7.14 Therefore, let us reflect on whether these three are still necessary once the soul succeeds in seeing (that is, knowing) God. Why should faith be needed since now it sees? Why hope, since it already grasps its hope? But as for love, not only will nothing be taken away, but rather much will be added. For when the soul sees that unique and true Beauty, it will love all the more deeply. But unless it fixes its eye upon it with surpassing love and never withdraws its gaze, it will not be able to continue in that most blessed vision.

    -Saint Augustine of Hippo - The Soliloquies
  • Literary writing process


    I finally put a sample chapter up, if you or anyone else is interested.

    The Darkness Above the Light is an epic fantasy novel (think Game of Thrones or The Darkness That Comes Before). The setting is a mix between Reformation Europe and the Wars of Religion, a setting that will allow us to explore theological intricacies, and the early Italian Renaissance, an interesting period for the evolution of warfare, with the advent of cannons and the rise of large mercenary companies. A main conceit of the novel is that its sorcery is based on the esoteric traditions of this and earlier periods.

    https://medium.com/@tkbrown413/a-fantasy-novel-based-on-real-world-esoteric-systems-the-darkness-above-the-light-sample-chapter-53da1fe4de48
  • is the following argument valid (but maybe not sound)?
    Wouldn't this be more:

    For all AP, KM (all appearances are known mediately).
    AC is not KM (action is not known mediately).
    Thus, AC is not AP (action is not an appearance).

    Same as:
    All men are mortal.
    Zeus is not mortal.
    Thus, Zeus is not a man.

    "If anything is an appearance it is known mediately,
    The individual knows that he (or she) acts non-mediately
    Thus, action cannot be an appearance."

    It's a containment relationship that fails to obtain. Or we can define it through membership. Action is not in the set of "things known mediately," while "all appearances" are members of that set. Thus, on pain of contradiction, action cannot be a member of the set of appearances, as this would entail that it is an element of the set of things that are known mediately (which is rejected in P2).

    We could thus set this up as a proof by contradiction by assuming our premises and assuming that "action IS appearance." This results in a contradiction where action both is and is not a member of the set of "things known mediately" if it is a member of the set of "all appearances." If it is not a member of S(Appearances) then we have no problems at all; action is simply not a member of either.
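
    For what it's worth, the whole thing can be put in first-order form (my notation, not the OP's: A for "is an appearance," K for "is known mediately," a for action); the conclusion is just modus tollens on the instantiated universal:

    ```latex
    \forall x\,(A(x) \rightarrow K(x)) \qquad \text{(P1: all appearances are known mediately)}
    \neg K(a) \qquad \text{(P2: action is not known mediately)}
    \therefore\ \neg A(a) \qquad \text{(instantiate P1 at } a\text{, then modus tollens)}
    ```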

    I would question if the premises hold up though. Work on brain injuries would suggest knowledge of actions is known mediately and incompletely, varying with attention, cognitive resources, etc.
  • A question for Christians


    Perhaps it is strange to say that men and women who willingly faced death were cowards, but perhaps someone like Nietzsche would say that this is proof of their rejection of life.

    The problem with Nietzsche's philosophy becomes obvious when you seek to generalize it. How are we all to become overmen, revaluing all values? Cannot one man adopt values such that he finds it good to deprive others of freedom? So then, it seems that if we have some successful overmen, nothing precludes most men being thralls, unfree, slaves. And indeed Nietzsche seems to allow this. Part of the conceit of reading Nietzsche is that the reader is part of a spiritual elite, a technique utilized later by Evola and Guenon and punched up by shallow, dismissive rants against the whole prior corpus of philosophy.

    The question then is, is this the ideal solution? The many unfree so that the free can be free? Further, can those who are free in such a system derive recognition from their inferiors, given that they have been reduced to an "other?" Or have we just recreated Hegel's Lord/Bondsman dialectic? Is the overman unable to be truly free because he cannot adopt a view that would reduce his status vis-à-vis the masses without risking his overman status? Is he like the Romans of Augustine's City of God, unable to relinquish violent rule lest it be turned on them by new tyrants ready to fill the vacuum?

    Why was it acceptable for God to wage war against the wicked in heaven and somehow impermissible for his faithful son and servants here on earth? Is it a double standard or is it something deeper? Maybe Christ didn't have a dog in the fights that happen down here on earth, but what are we to do? Should we fight when faced with an evil enemy, like Michael, or should we do as Christ did and lay down our lives for the ones we love, because we are taught by him to love our enemies?

    Christianity is, at its core, a religion about overcoming the world through internal transformation, not through Manichean struggle between good and evil.

    "Vengeance is mine, I shall repay," says the LORD in Deuteronomy. It is not the Christians duty to fight. What can man add to God if God wants to destroy something? Is God weak that he cannot accomplish his own aims? Have "[we] and arm like God. Or can [we] thunder with a voice like His?" as God asks Job. Can we "then adorn yourself with majesty and splendor, And array yourself with glory and beauty. Then I will admit that your own hand can save you."

    But the Christian is not meant to judge. I would argue that they are asked to judge no one, to earnestly hope that ALL are saved.

    "For if ye forgive men their trespasses, your heavenly Father will also forgive you:

    But if ye forgive not men their trespasses, neither will your Father forgive your trespasses." - Matthew 6:14-15

    ----

    I think I can best answer your questions by showing how I think freedom is essential to the message of the Faith.

    Freedom leads to happiness because a free man will not freely choose to be unhappy.

    This does not mean we will always be ecstatic about everything we do. Being free to become certain things, to take on certain roles, means being free to accept the duties that come with these roles. If we are to become a “good doctor,” a “nurturing father,” or a “loving husband,” there are surely unpleasant things we must do and pleasing things we must give up. We do not have to find pleasure in all that these duties entail to find happiness in our roles and responsibilities.

    Freedom requires both negative freedom, freedom from external constraints, and positive freedom, freedom to become what one authentically desires to be and to control one’s own drives and desires.

    The Role of Positive Freedom

    Of these two, positive freedom is harder to foster. You need self-discipline to get what you want in this world. Most self-help work stops at this point. But more important still is the freedom over one’s base instincts, the ability to want what you want to want. Self-control alone is not freedom; it can become its own sort of life-denying slavery. Freedom is self-control directed towards what Harry Frankfurt calls “second-order volitions,” things that you “want to want to do,” in essence.

    Freedom then, is not easy to achieve. Like Saint Paul, we must be “at war with the members of our body.” On all sides we are driven on by desire, instinct, and drive. We are free only when we fathom what we want to do, why we want to do it, and act in accordance with our reason. This is why Leibniz thought the Principle of Sufficient Reason, the idea that “everything happens for a causal reason,” in a word, “determinism,” was a prerequisite for freedom, not antithetical to it.

    For us to be free, actions must have defined consequences, and we must want to do what we want for a reason. A world where our actions aren’t deterministic is simply a world where our actions are arbitrary. Arbitrariness isn’t freedom.

    Free men are like Hegel’s state, “act[ing] in accordance with known ends,” and “know[ing] what it wills.”

    Reason then, is essential, as is “reason-out-in-the-world,” “natural law,” cause, Logos. As Paul says in Romans 7, he dies a death of autonomy and personhood when sin lives through him, when he is driven on by desire and instinct. He then talks about how he is resurrected in this life, to personhood by Christ, the Logos. Christ, who casts out the “legion within,” the demons that strive to control us, to rob us of our freedom.

    So then, we must develop reason, but also authenticity. This is what the existentialists get right. One must discover one's true self. Where the existentialists err is in elevating the Copernican Principle into a dogma and denying the Logos, claiming the universe is absurd, even as the Logos burns bright in the order within all things.

    The Importance of Social Freedom

    We are also social creatures. We compete with one another, even as we compete for one another. And so freedom also has a social element. Freedom requires a state that promotes freedom, one which shapes individuals' interests such that they have an incentive to promote each other’s development and freedom.

    This is Hegel’s insight and vision. We progress towards this goal via the dialectical evolution of history, a sort of selection process where states that promote freedom survive because they promote human welfare, technological innovation, and a greater ability to muster resources, and because they will be defended with greater zeal by their citizens. If thinking of "natural selection" in terms of intentionality bothers you, simply think of the "selection" at work in Hebbian "fire together, wire together" neuronal development, where the developing brain prunes away a huge share of its neurons and synapses as function is sculpted through selection, a process that both involves and causes intentionality in patterns of cyclic feedback.

    The Essence of Freedom is What is Essential to It
    Freedom then, includes duty, self-control, knowledge — gnosis.

    We have a duty to be free. This is why criminals have a right to be punished. We do not punish merely to deter crime. To do this is to treat another human being like an animal to be domesticated.

    Freedom requires knowledge of nature, and so we must study the sciences. We are natural creatures and must understand nature to understand ourselves. Likewise, we must master nature, “subdue it and have dominion over it,” in order to enact our will.

    Freedom requires knowledge of the Logos, and so we must study philosophy, logic, and mathematics.

    Freedom requires knowledge of the self, and so we must study psychology, the great works of art, etc.

    And as we drain the Cup of Gnosis we shall find three things at the bottom:

    The external world, the symbols through which it is known, and the I that is ourselves. And these three we shall know to be an image of three others: the Father, the Son, and the Holy Spirit, reflected in the very necessary nature of coherent being.

    And then happy consciousness shall give praise to that which formed it, chanting “glory to the Father, the Son, and the Holy Spirit, both now and ever, and unto ages of ages, amen.”

    We are the midwives of the Absolute. We are Mary, the theotokos, giving birth to the Body of Christ, his Church. As the Blessed Virgin served to create his first physical body, so we now construct his immanent body through world history. We come together to form the Church and strive to fulfill its Marian mission of the creation of the Body of Christ (in this world, as an emergent, dynamical, historical process).

    And yet the Church is also the Bride of Christ, of whom the Canticle of Canticles speaks. This is a mystery, but one we can fathom. For the Bride and the Bridegroom are to become one flesh, one body. And so we are the immanent body forming in this world. This is part of what is meant by the “Kingdom being near.”


    And so, we find our authentic place in the world. A Christian man is lord of all, subject to none. And yet he is servant to all, lording over none. This is the mystery, just as the Gospel is vast and concise, as Saint Denis says.

    Thus, the Faith is not world-denying; rather it denies the inessential, seeking freedom perfected. The most direct translation of the Lord's Prayer would be "give us our super-essential bread." It is the super-essential, then, that is sought.

    Too late have I loved you, O Beauty so ancient, O Beauty so new.
    Too late have I loved you! You were within me but I was outside myself, and there I sought you!
    In my weakness, I ran after the beauty of the things you have made.
    You were with me, and I was not with you.
    The things you have made kept me from you – the things which would have no being unless they existed in you!
    You have called, you have cried, and you have pierced my deafness.
    You have radiated forth, you have shined out brightly, and you have dispelled my blindness.
    You have sent forth your fragrance, and I have breathed it in, and I long for you.
    I have tasted you, and I hunger and thirst for you.
    You have touched me, and I ardently desire your peace...

    You have made us for yourself, O Lord, and our heart is restless until it rests in you.

    -Saint Augustine of Hippo
  • A question for Christians


    Biblical literalism is a vocal minority view, strongest in American Evangelical Protestantism, which is a small minority of the faithful no matter how much it tends to deny this fact to itself.

    Mainline Protestants and Catholics are still citing Bonaventure's six-fold, seven-step, three-mode journey of the Mind (mens) into God, moving outward to God's vestiges in the book of nature (Francis of Assisi, brother sun and sister moon), then inward to psychological reflection (Augustine), and finally upwards (Denis).

    Boehme is less common, but still explored by some Lutherans. Augustine and his mysticism are everywhere in the Latin Rite, and have even had a resurgence in the Orthodox churches. Denis still suffuses the Eastern Tradition. Granted, these have their own ultra-conservative movements, but they rely less on literalism.

    Anti-naturalism seems to be more an unresolved issue for the small subset of American churches that has begun rapidly disintegrating due to culture war politics, hemorrhaging members since 2010 and seeing their median age shoot upwards like a rocket. It's not that the other sects have completely resolved this issue, but they have been working on it for a long time and have not been afraid to get their hands dirty doing metaphysics. They're also less cut off from the Church's roots in some of the greatest thinkers in the philosophical tradition, and can regularly draw on minds like Augustine or Eckhart for ammunition.


    If you opt to not believe, then much of the teaching by Jesus and his Church are likely not going to make a whole lot of sense to you. If you opt to believe, it isn't that everything will fall into place and make perfect sense.


    It doesn't make "perfect sense." Faith is a journey. Even Peter doubts when Christ allows him to walk on water and begins to sink. Only Christ grabbing him saves him. Likewise, he draws a sword to protect Jesus even at the end when the authorities come for him, and Jesus must rebuke him then. But then we see him finally understanding and mirroring Christ in Acts, as with the raising of Tabitha. There, it is the people who do not yet understand, they call on Peter thinking he is special, not realizing the Spirit "shall be poured out on all flesh." (Joel 2/Acts 2)

    What church or synagogue isn't filled with debate? What religious life isn't filled with doubts and seasons? Israel itself means "one who wrestles with God." Jacob sees a ladder ascending to the heavens, not an escalator.
  • Quantum Entanglement is Holistic?


    Ah, even less can it match up then. Behold!

    [attached image: the Virgin Mary apparition in a tree]

    I think they put the flowers there after the barky miracle. To me, it looks more like she is holding a pumpkin than a baby, or praying, though.

    But that's the whole thing about signs, they only convey meaning based on what has come before. To the right eye, there is plenty there, I'm sure. Semiosis is sort of spooky and magical in general. I have been going to a non-denominational, Protestant Church for a long time, and they don't really have the same "eye" for the Blessed Virgin that Catholics and the Orthodox do.
  • Quantum Entanglement is Holistic?
    Counterpoint: Jesus with a halo praying carved into the universe with a nebula.

    [attached image: Cone Nebula (NGC 2264), star-forming pillar of gas and dust]

    Or you have the Virgin Mary showing up in the streaks from glass cleaning, burnt toast, and other chaotic phenomena.
  • Nobody's talking about the Aliens
    There was an even more alien-looking six-inch-tall mummy with a conical head that was dug up in Chile a few years back. Upon testing, though, it turned out to be the remains of a human with a severe birth defect.
  • To be an atheist, but not a materialist, is completely reasonable
    Energy being physical is fairly well established. If you want to get into a wonkier question, there is the matter of whether information is physical (Landauer's Principle), and there remains some hot debate on that.
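
    For a sense of scale, here is a quick check of the Landauer bound, the minimum energy required to erase one bit; the room temperature figure is my assumption, while the formula E = k_B T ln 2 is the principle itself:

    ```python
    import math

    k_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)
    T = 300.0           # assumed room temperature in kelvin

    E_bit = k_B * T * math.log(2)
    print(f"Landauer limit at {T} K: {E_bit:.3e} J per bit erased")
    # ~2.87e-21 J: minuscule, but nonzero. Erasing information carries a physical cost.
    ```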

    But, if information becomes essential for explaining cause in a way that people do not think is somehow an epistemic artifact, I imagine we'd see widespread acceptance of information as physical (it's already a majority opinion, I would think).

    The problem with "if it has causal powers it is physical" is that it would simply mean that if ghosts and magic are real, we just need to accept the physical reality of ectoplasm, djinns, etc. (Hempel's Dilemma). I think in general physicalists would like to go further, but that's where the interesting problems come up.

    Is saying that there is no intentionality behind the behavior of the universe writ large necessary for physicalism? Is saying that mind is not essential to being necessary for physicalism? Can we say some things about the nature of the physical beyond the scope of scientific realism or simple naturalism?

    In general, I think ontic structural realism, the idea that the mathematical structures of physics are themselves the ontological basement, the constituents on which all cause depends and from which all being emerges, doesn't sound like physicalism. We don't tend to think of mathematical entities and processes as physical, rather they are abstractions. But I'm also not sure if it's necessarily disallowed. Certainly there are theories that do advance structural realism as physicalism.

    But it's not like idealism, in its broadest form, is that much different in this respect. In some ways it's defined largely by what it says "no" to. So, say what you will about dualism, but when you have two distinct types of being, there is a lot more you can say of them, since there is at least some comparison to define them through. (I am not a dualist BTW; a theory being more interesting, less "flat," doesn't make it necessarily more true lol).
  • To be an atheist, but not a materialist, is completely reasonable


    The first thing to stress would be that composition in computation doesn't work like composition in supervenience metaphysics. Salt is salt because of how Na and Cl interact. 20 grains of salt is salt in the very same way that 1,000,000 grains of salt is salt. The output comes from the causal properties of fundamental units, which may arguably be unpredictable from the properties of these units themselves (classical emergence).

    But 50 is not an output of 5 * 10 in the way that NaCl is an output of Na + Cl. You can add grains of salt to salt and it remains salt. If you add multiples of 5 or 10 to 50, you get a different number. More importantly, there are limitless ways to write an arithmetical function that will output 50, and so the output cannot be uniquely defined by the inputs in the way NaCl is defined by its component particles.
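
    A toy sketch of the disanalogy (my own example, not a formal argument): the same output is realized by endlessly many distinct functions, and "adding more of an ingredient" does not preserve it the way adding salt to salt preserves salt:

    ```python
    def f1() -> int:
        return 5 * 10

    def f2() -> int:
        return 100 // 2

    def f3() -> int:
        return sum([20, 30])

    # All three "decompositions" yield the same output, so the output does not
    # fix its components the way NaCl fixes Na and Cl...
    assert f1() == f2() == f3() == 50

    # ...and adding another "grain" of an input changes the result entirely:
    assert f1() + 5 == 55  # a different number; salt plus more salt is still salt
    ```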

    Against this view, we can consider that, if all of physics was unified into one thing, if the fundamental forces and space-time itself were unified, and we could say: "yes, there is one undifferentiated substance that forms all these building blocks from different processes," then the difference elucidated above looks to be in trouble. However, if this was the case, "substance" as a concept now fails to do any explanatory lifting at all. All phenomena are generated from a term that applies equally to all things, and so it is only the processes that actually have causal power.

    Emergence was developed by a number of British philosophers in the 19th century with old-style materialism in mind. Substrate-independent emergence, the example of material formed into a wheel, is a later innovation, and I would argue that it is better explained via a process metaphysics. From this start, "emergence" largely developed up to the 1990s in line with popular ideas of supervenience-based physicalist metaphysics. "Classical emergence" is just emergence that accepts substance metaphysics.

    Thus, one of the big arguments in emergence tended to be whether "strong emergence," or "true emergence," is even possible, or whether emergence just represents opportunities for what is essentially data compression. If the latter holds, then all phenomena can still be fully (and often, most accurately) described by simply ignoring the emergence and instead fully describing any physical system via the sum of its fundamental components. Or, at least this idea is believed to be true "in theory"; however, plenty of people accept that, barring the advent of some Laplacean Demon capable of almost supernatural computations, emergence might still make sense as a concept to use from a pragmatic perspective.

    More recently, it has been common to argue that "strong emergence" appears to be impossible within a substance metaphysics, but, so the argument goes, this is simply more evidence that we must move to a process-based metaphysics.

    House of Cards?

    The most influential critiques of ontological emergence theories target these notions of downward causality and the role that the emergent whole plays with respect to its parts. To the extent that the emergence of a supposedly novel higher-level phenomenon is thought to exert causal influence on the component processes that gave rise to it, we might worry that we risk double-counting the same causal influence, or even falling into a vicious regress error — with properties of parts explaining properties of wholes explaining properties of parts. Probably the most devastating critique of the emergentist enterprise explores these logical problems. This critique was provided by the contemporary American philosopher Jaegwon Kim in a series of articles and monographs in the 1980s and 1990s, and is often considered to be a refutation of ontological (or strong) emergence theories in general, that is, theories that argue that the causal properties of higher-order phenomena cannot be attributed to lower-level components and their interactions. However, as Kim himself points out, it is rather only a challenge to emergence theories that are based on the particular metaphysical assumptions of substance metaphysics (roughly, that the properties of things inhere in their material constitution), and as such it forces us to find another footing for a coherent conception of emergence.

    The critique is subtle and complicated, and I would agree that it is devastating for the conception of emergence that it targets. It can be simplified and boiled down to something like this: Assuming that we live in a world without magic (i.e., the causal closure principle, discussed in chapter 1), and that all composite entities like organisms are made of simpler components without residue, down to some ultimate elementary particles, and assuming that physical interactions ultimately require that these constituents and their causal powers (i.e., physical properties) are the necessary substrate for any physical interaction, then whatever causal powers we ascribe to higher-order composite entities must ultimately be realized by these most basic physical interactions. If this is true, then to claim that the cause of some state or event arises at an emergent higher-order level is redundant. If all higher-order causal interactions are between objects constituted by relationships among these ultimate building blocks of matter, then assigning causal power to various higher-order relations is to do redundant bookkeeping. It’s all just quarks and gluons — or pick your favorite ultimate smallest unit — and everything else is a gloss or descriptive simplification of what goes on at that level. As Jerry Fodor describes it, Kim’s challenge to emergentists is: “why is there anything except physics?”

    The concept at the center of this critique has been a core issue for emergentism since the British emergentists’ first efforts to precisely articulate it. This is the concept of supervenience...

    Effectively, Kim’s critique utilizes one of the principal guidelines for mereological analysis: defining parts and wholes in such a way as to exclude the possibility of double-counting. Carefully mapping all causal powers to distinctive non-overlapping parts of things leaves no room to find them uniquely emergent in aggregates of these parts, no matter how they are organized...

    Terrence Deacon - Incomplete Nature

    But there is a powerful argument against mereological substance metaphysics: such discrete parts only appear at the quantum scale through large-scale statistical smoothing. In many cases, fundamental parts with static properties don't seem to exist, and even those that are put forth can form into new, fundamental entities (e.g., Humphreys's notion of fusion).

    This is not meant to suggest that we should appeal to quantum strangeness in order to explain emergent properties, nor would I suggest that we draw quantum implications for processes at human scales. However, it does reflect a problem with simple mereological accounts of matter and causality that is relevant to the problem of emergence.

    A straightforward framing of this challenge to a mereological conception of emergence is provided by the cognitive scientist and philosopher Mark Bickhard. His response to this critique of emergence is that the substance metaphysics assumption requires that at base, “particles participate in organization, but do not themselves have organization.” But, he argues, point particles without organization do not exist (and in any case would lead to other absurd consequences) because real particles are the somewhat indeterminate loci of inherently oscillatory quantum fields. These are irreducibly processlike and thus are by definition organized. But if process organization is the irreducible source of the causal properties at this level, then it “cannot be delegitimated as a potential locus of causal power without eliminating causality from the world.” It follows that if the organization of a process is the fundamental source of its causal power, then fundamental reorganizations of process, at whatever level this occurs, should be associated with a reorganization of causal power as well.

    Terrence Deacon - Incomplete Nature

    I have posted relevant parts of some of Bickhard's analysis here in an earlier post:
    https://thephilosophyforum.com/discussion/comment/826619

    But since, in examples like the Game of Life you mentioned, "more is different," I don't even know if the concept of emergence can be fundamental under a process view. To be sure, it's useful for higher-level fields, where speaking of substance is a fine stand-in for describing long-term stabilities in process. However, the only thing it makes sense to decompose computations - the transformation of input into output - into is the intervening states between S1 and SF. But if you're defining composition by transitions of states over time, you aren't talking substance anymore, that's process, and so the "emergence" part is redundant, since a different process is a different process. We can have morphisms between processes, but it doesn't make sense to say that F(x) = 100 is somehow emergent, in the same way it isn't really useful to say "lines are emergent from points," or "planes are emergent from dimensions."
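
    To make the "decomposition into intervening states" point concrete, here is a minimal Game of Life step (my sketch; the blinker is just the simplest oscillator). The "computation" is nothing over and above the succession of grid states S1 -> S2 -> ... -> SF that repeated application of the rule produces:

    ```python
    from collections import Counter

    def step(live: set[tuple[int, int]]) -> set[tuple[int, int]]:
        """One Game of Life transition: return the next set of live cells."""
        counts = Counter(
            (x + dx, y + dy)
            for (x, y) in live
            for dx in (-1, 0, 1)
            for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)
        )
        # A cell lives next step if it has 3 live neighbors, or 2 and is live now.
        return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

    # A "blinker": three cells oscillating between a column and a row.
    state = {(0, 0), (0, 1), (0, 2)}
    for t in range(3):
        print(t, sorted(state))
        state = step(state)
    ```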
  • To be an atheist, but not a materialist, is completely reasonable


    Bickhard and Deacon have some good explanations of this I will try to find. The SEP article is rather lacking.

    Nested functions are part of code, static instructions on how to run a computation. Of course you can concatenate functions, and in this sense it is quite possible to decompose more complex functions. But the relationship between nested functions is not analogous to the way supervenience relations work in metaphysics (what I was trying to get at, which perhaps wasn't clear). Computation is substrate independent. If we change out the tape in a Turing Machine for some other brand of tape with a different chemical composition, the computations it runs remain the same.

    In any event, the "computation" is the actual process of transforming the input into the output. To say nested functions are the computation is a bit like saying thought is neurons, rather than what the neurons do, or that the computation in a Turing Machine is the symbols on the tape plus the state instructions in the head (why run the machine then?). Now, could we say the computation is all the states the computer transitions through from input -> output? Maybe. But this is a process, prior states dictating future ones.
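
    A minimal illustration of that distinction (a hypothetical machine of my own, not from any source): the transition table below is the static "code," while the computation is the run itself, the succession of configurations, and it is indifferent to what the tape is made of:

    ```python
    # Unary increment: scan right over the 1s, write a 1 on the first blank, halt.
    TABLE = {
        ("scan", "1"): ("scan", "1", 1),  # keep moving right over the 1s
        ("scan", "_"): ("halt", "1", 0),  # first blank: write a 1 and halt
    }

    def run(tape: list[str]) -> list[str]:
        """Run the machine; the returned tape is the end state of the process."""
        state, head = "scan", 0
        while state != "halt":
            symbol = tape[head] if head < len(tape) else "_"
            state, write, move = TABLE[(state, symbol)]
            if head == len(tape):
                tape.append("_")
            tape[head] = write
            head += move
        return tape

    print(run(list("111_")))  # ['1', '1', '1', '1']: the run, not the table, is the computation
    ```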
  • The Identity of Indiscernibles and the Principle of Irrelevance


    I may indeed be stuck in empiricism. I think rationality is something we primarily experience for instance.

    But the way I meant that was more: "our world is very complex and interconnected, and so we may err by simply abstracting items out of it and assuming that this works." I find it possible that scientific findings may eventually convince many people that two iron balls alone in a universe is actually metaphysically impossible. That is, if we could describe our whole universe mathematically, we might discover that iron balls are the type of thing that can't exist alone. They might need a "void" that works like our very, very weird "void" that is a sort of seething ocean of strange activity and condensates, in which case all sorts of interactions that differentiate the two balls vs one open up.

    I think pure rationalism works well enough for toy universes because we can wrap our minds around everything in them though. This is why they're cleaner than abstracting everyday items into a void.

    Which I'll admit, is maybe taking the question too seriously, but given the published responses to it invoke geodesic space-time and the like, at least I'm in good company.

    What about redescribing the situation as one ball in two locations?

    That could work too! Or an infinite wrap around of balls. Or maybe only our phenomenal idea of the balls can ever exist.
  • The Identity of Indiscernibles and the Principle of Irrelevance


    I do not see where you get the idea that what you would call "a discernment", could be anything other than the product of an act of discernment, which is the act of a subject. Because of this, I do not see how you propose the possibility of a discernment which is not a subjective discernment. Each and every discernment is produced by a subject, therefore all possible discernments (by induction only) are subjective discernments.

    You might propose a form of discernment which is not subjective, but this would violate inductive reasoning, rendering it as a useless tool within your argument, so that your whole argument which is based on induction would be undermined, by allowing that a very strong inductive principle could be violated.

    Sorry, I didn't mean "the set of discernments which are not subjective." I meant, "the set of all discernments (which are necessarily made by subjects) is a set, an abstract entity," and abstract entities are generally not considered to be subjective.

    For example, we could have the set of all experiences where people experience red. The experiences are subjective, the set is an abstract object.

    Leibniz' law does not leave open the possibility of differences which make no difference. Instead, you ought to recognize that what the law intends, is that there is no such thing as a difference which makes no difference, this itself would be contradictory. If an observer notices something as a difference, then by that very fact that the difference has been noticed as a difference, the difference has already, necessarily, made a difference to that observer.

    Right, but the converse is generally not accepted. "If no observer notices something as a difference, then by the very fact that no difference has been noticed as a difference, the difference has, necessarily, made no difference to any observer... and so is not a difference." In general, people admit the possibility of differences that may not have made a difference yet (and might not ever make a difference). And indeed, these sorts of differences come up in the philosophy of language and then tie back into arguments vis-a-vis events/states of affairs/propositions.

    The law does not speak of possibilities, and I think that is where you misrepresent it. It is based in an impossibility, which is an exclusion of possibility. This is the impossibility that an entity which could only be identified as itself, could also be identified as something else.

    Given that LL is often applied to metaphysics writ large, that it is not used simply as a rule in a specific formalized context, I think it's fine to discuss it within the terms of possibility. That's how its author intended it (not as necessarily modal, but rather as a wider metaphysical claim). You can't really make statements about "being as being," and then say "no, the logic that I'm using doesn't allow for that aspect of being," right? If someone has a good argument for why we should eliminate possibility from metaphysical consideration, I'm happy to entertain that (it would be interesting). But I don't see the point in saying, "possibility exists, but this rule isn't in a system that includes it, so it's off limits."

    "This is the impossibility that an entity which could only be identified as itself, could also be identified as something else," isn't the only way LL is used. It's used in the context of, "when can we say that two things are different." That is, the problem of "if two things share all their properties, are they actually the same thing or numerically distinct identical objects. This comes up in terms of haeccitism. It is generally denied that fundamental particles have haecciety in light of LL, because the principles of QM make it such that it is not possible for us to distinguish between the electron me measure now and either of the two electrons we were working at some prior time. This is why John Wheeler suggested conceptualizing just one electron existing in all the universe, one electron that can be many places at once.

    But, per some largely defunct theories of physics, there are definitely multiple electrons. We are epistemically blocked from ever knowing this, but "real inaccessible differences exist."

    Really, I am just looking for a good argument that says "positing inaccessible differences is sort of nonsensical."
  • To be an atheist, but not a materialist, is completely reasonable


    It depends on how you define emergence I suppose. I do not mean classical emergence, where combinations of different substances somehow generate new terms that did not exist before. I think Jaegwon Kim dealt classic, substance-based emergence a virtual death blow.

    Perhaps emergence in the "more is different" sense you see at work in cellular automata. But then it's not really clear to me if this warrants the name emergence, or if it just obviates the idea of emergence, consigning it to the dust bin of history.

    After all, it doesn't make sense to think of computations as being "composed of" smaller computations. To be sure, we have a step-wise element in computation (although steps can run in parallel), but this is necessarily change, a process occurring over a timelike dimension. √81 doesn't "emerge" from smaller units of composition; it is its own process.
  • Reading "The Laws of Form", by George Spencer-Brown.


    I didn't mean feedback necessarily, just the view that process might be seen as fundamental, not substance. That what a thing is might not be best defined by "what it is made of." For example, heat is a measure of average motion, not a thing. Fire is a chemical reaction, not its own substance. So these are best understood as processes, not things, but I was always given the view that at any deeper level substances must define reality.

    But I suppose the basics of feedback loops are important too. I do feel like I was exposed to that early on. For example, we sweat because we're hot, we get cool from evaporation, and then we stop being hot: negative feedback. Positive feedback exists too.
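
    That sweating example fits in a few lines of toy model (numbers mine, purely illustrative): the correction is proportional to the error, so the loop pulls the temperature back toward the set point:

    ```python
    set_point, temp, gain = 37.0, 40.0, 0.5  # degrees C; we start out overheated

    for t in range(6):
        error = temp - set_point  # how far above the set point we are
        temp -= gain * error      # sweating cools in proportion to the error
        print(f"t={t}: temp={temp:.2f}")

    # The error halves each step: negative feedback is self-stabilizing. Flip
    # the sign of the correction and the same loop runs away from the set
    # point instead: positive feedback.
    ```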
  • To be an atheist, but not a materialist, is completely reasonable


    I prefer more descriptive terms like e.g. immaterial or disembodied or nonphysical or spiritual or magical ... to the umbrella term "supernatural".

    Disembodied or incorporeal would be my least favorite here. If someone talks about "the text of War and Peace, but not the books it is printed on or the hard drives it is saved on," that seems like something that is "disembodied," but very different from the "magical." Economic recessions would be another example; they lack a body, but can be an object of scientific inquiry and we can attribute causes to them (e.g. "layoffs picked up in 2009 because of the recession.")

    I think a process/computational/complex systems view works quite well to recover our intuition about some incorporeal entities, e.g. "the Japanese language," existing, even if there can be no well-defined supervenience relationship between them and a discrete set of physical components.

    "Everything" which causes changes is material, ergo "energy" is material, no?

    Interestingly enough, I'm starting to think that this proposition is what is at stake as the sciences, particularly physics, try to define, and find a place for, the concept of information. The question of "can what is not there be causally important," or can "properties that a system lacks" be essential for explaining phenomena. The range of possibilities seems essential for explaining things like the heat-carrying capabilities of metals, or life, even though this range is not actual.

    I think the thinking around it gets dicey, and very muddy, because there is a tendency to want to reduce relations to objects, whereas it seems like the process view is more relevant here. In the context of a process, what doesn't occur is important. It's like how you can't encode a message in just 1s; you need the possibility of 0s in a medium.
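
    The 1s-and-0s point is just Shannon entropy in miniature; a quick check (my example):

    ```python
    import math

    def entropy_bits(p_one: float) -> float:
        """Entropy per symbol of a binary source with P(1) = p_one."""
        if p_one in (0.0, 1.0):
            return 0.0  # no alternative is possible, so no information is carried
        p_zero = 1.0 - p_one
        return -(p_one * math.log2(p_one) + p_zero * math.log2(p_zero))

    print(entropy_bits(1.0))  # 0.0 bits: a medium that can only say "1" says nothing
    print(entropy_bits(0.5))  # 1.0 bit: maximal when 0 is just as possible as 1
    ```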
  • Reading "The Laws of Form", by George Spencer-Brown.
    I found this summary fairly interesting: https://www.projectenportfolio.nl/images/1/16/Robertson-Laws_of_Form.pdf

    Axioms are considered primitive assumptions beyond questions of truth or falsity. The remainder of a system is then developed formally from these primitives. In contrast, Spencer-Brown’s axioms seem to be indisputable conclusions about the deepest archetypal nature of reality. They formally express the little we can say about something and nothing...

    Once bitten, twice shy—mathematicians became much more concerned with abstraction and formality. They separated what they knew in their mathematical world from what scientists asserted about the physical world. Mathematics was supposed to be the science which dealt with the formal rules for manipulating meaningless signs. Spencer-Brown’s attempt to develop...

    Interesting vis-a-vis the original thread that sparked this one. Is there a "logic-like reality that exists outside the minds of individuals," etc.

    Sometimes I wonder if the discoveries of the early 20th century should have been taken as a warning against strict bivalence and "truth as objectivity," rather than as an argument for deflating truth (as they generally were).


    I.e.:

    Meanwhile, if the fear of falling into error introduces an element of distrust into science, which without any scruples of that sort goes to work and actually does know, it is not easy to understand why, conversely, a distrust should not be placed in this very distrust, and why we should not take care lest the fear of error is not just the initial error. As a matter of fact, this fear presupposes something, indeed a great deal, as truth, and supports its scruples and consequences on what should itself be examined beforehand to see whether it is truth. It starts with ideas of knowledge as an instrument, and as a medium; and presupposes a distinction of ourselves from this knowledge. More especially it takes for granted that the Absolute stands on one side, and that knowledge on the other side, by itself and cut off from the Absolute, is still something real; in other words, that knowledge, which, by being outside the Absolute, is certainly also outside truth, is nevertheless true — a position which, while calling itself fear of error, makes itself known rather as fear of the truth.

    G.W.F. Hegel, Phenomenology of Spirit, §74

    The idea of imaginary numbers as oscillations is interesting too. I have always seen them described as a number line running orthogonal to the real number line instead. Imaginary numbers are interesting in general because they seem to be a move to admit some sort of paraconsistency into mathematics for pragmatic expedience. I assume they have since been grounded in mathematical logic somehow? I just recall from mathematical histories that they were initially accepted on the grounds that "it works, don't it?" as with zero as well.
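
    (On the grounding question: yes, they were later given a rigorous construction, e.g. as ordered pairs of reals in Hamilton's treatment.) And the orthogonal-axis picture and the oscillation picture are really two views of the same fact: multiplying by i is a quarter-turn rotation, and repeated rotation traces an oscillation. A quick illustration:

    import cmath

    z = 1 + 0j
    for _ in range(4):
        print(z)   # 1 -> i -> -1 -> -i: each multiplication by i is a 90-degree turn
        z *= 1j

    # Euler's formula, e^(it) = cos(t) + i*sin(t), links rotation to oscillation:
    # the real part of a uniformly rotating point traces out a cosine wave.
    for t in (0.0, 0.5, 1.0, 1.5):
        print(cmath.exp(1j * t).real)  # equals cos(t)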

    Interesting quote from Varela, who expanded Brown's system to include self-reference as a third mark, a move intended to make it more usable in biology, where self-reference is central.

    When [Norbert] Wiener brought the feedback idea to the foreground, not only did it become immediately recognized as a fundamental concept, but it also raised major philosophical questions as to the validity of the cause-effect doctrine.…the nature of feedback is that it gives a mechanism, which is independent of particular properties, of components, for constituting a stable unit. And from this mechanism, the appearance of stability gives a rationale to the observed purposive behavior of systems and a possibility of understanding teleology.…Since Wiener, the analysis of various types of systems has borne this same generalization: Whenever a whole is identified, its interactions turn out to be circularly interconnected, and cannot be taken as linear cause-effect relationships if one is not to lose the system’s characteristics...

    Reading Terrence Deacon's Incomplete Nature right now, it's clear this thread has been developed a great deal, but not resolved. Deacon tries to explain how purposefulness emerges by reintroducing Aristotle's formal cause via thermodynamics and an explicitly process-based, as opposed to substance-based, metaphysics.

    All very interesting, but damn hard to wrap one's mind around. I do wonder why it is that it has taken so long for the process view to take over. Is it necessarily less intuitive, or is the problem that we drill a sort of naive corpuscularism, a substance metaphysics, into kids for the first 14-18 years of their education? It certainly seems less intuitive. I sort of buy into Donald Hoffman's argument that we evolved to want to focus on concrete objects (thus excluding the "nothing").


    Side note: it's interesting that Brown was working on network issues. I've seen some articles on information-theoretic/categorical models of quantum mechanics that attempt to explain physics as a network. This, in turn, allows us to recreate standard QM in a different language, but also explains entanglement in a more intuitive network-based model (or so the author claimed; I did not find anything intuitive about the paper lol). I do find the idea of modeling reality as networks or possibility trees interesting though. But again, it's easier to conceptualize the network as a fundamental thing, rather than to see that the network simply is a model of process and relation, which seems to be the true basic entity!
  • The von Neumann–Wigner interpretation and the Fine Tuning Problem


    The article reviews a few of them.

    Quantum mechanics is a scientific theory. It describes aspects of our world. Our world includes consciousness. That doesn't mean there is a specific, direct connection between QM and consciousness.

    True. Although given the ways we've already found that life has adapted to take advantage of quantum effects, I figure it will probably come to play some sort of role. Obviously life uses quantum phenomena in that all chemistry is quantum phenomena, but it seems likely that adaptations for molecule-level cellular machinery taking advantage of non-classical effects will be something we continue to find. After all, life evolved in our real world, not the abstraction we call the "classical scale world," and if optimal solutions involve quantum effects then life could easily have chanced upon them over 4 billion years.

    You already have neat little experiments like this: https://www.sciencealert.com/study-suggests-spins-of-brain-water-could-mean-our-minds-use-quantum-computation

    https://iopscience.iop.org/article/10.1088/2399-6528/ac94be

    There has been a decent trickle of these, some related to how microtubules and tubulins re-emit trapped light, etc.

    My guess is that, if these are verified, we will see some big headlines about "the quantum brain discovered," but that's about it. It's not going to answer any big questions. It won't mean much of anything. It'll just be more evidence that the idea of a "classical world" is just a useful abstraction. If anything, it will mean it's going to be even harder to unpack how the brain works, not that we'll get some sort of "quantum leap," if you will, in understanding.
  • The von Neumann–Wigner interpretation and the Fine Tuning Problem


    That was sort of my feel too. It felt like he had an axe to grind at some points, although it does seem like some people intentionally went out of their way to destroy others' careers to keep their theory from being challenged, which also isn't a good thing. Evolutionary biology has a very similar thing going on right now re: how central the "gene pool" really is to adaptation.

    Plus, he totally skips retrocausality and information based approaches. The former is interesting, and the latter is one of the most popular versions.

    The spontaneous collapse versions do make slightly different predictions and have been tested in some forms. I posted a link to those above.

    Information-based versions that claim the universe is computable are falsifiable; we have ideas for experiments that might confirm them (and arguably strong emergence), but lack the technology to pull them off.

    But verifiable experiments have been born from this sort of work. For example, tests of Bell's Inequalities came out of work in foundations and are important. The delayed-choice quantum eraser experiment came out of Wheeler's work in foundations.

    I am less familiar with work on quantum gravity and attempts to unify physics, but my understanding is that theories about "what is really going on behind the measurement outcomes," have at least some implications for thinking up ways to unify physics and ways to test said theories.

    So even if the interpretations can't currently be falsified, work on them does indeed produce testable hypotheses. The idea of decoherence, given short shrift in Becker's book, is probably the biggest of these. It has had a huge impact, and it wouldn't exist without considerations of what "collapse" is.
  • Bell's Theorem


    This confuses me. What does it mean that communication takes place instantaneously but no information can be transmitted? I would have thought that "communication" means the transfer of information. I have to do more reading

    It's not you, it's a confusing solution that is invoked to save relativity's "speed limit." Non-locality suggests that causal influences can move faster than the speed of light (although there are other interpretations like retro-causality, superdeterminism, etc.). But relativity originally said that isn't possible. The theory is saved by a move whereby we say it is information that cannot move faster than light. Information of course has many definitions, but the one here is crafted with preserving the speed limit in mind.

    Practically, you can't use this phenomenon to send messages faster than light.
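
    A toy simulation makes the distinction concrete. This is just a sketch that bakes in the ideal singlet-state statistics, E(a,b) = -cos(a-b); the angles and sample size are arbitrary:

    import math, random

    def singlet_round(a, b):
        # One measurement round at analyzer angles a and b.
        # Alice's outcome is locally random; Bob's is anti-correlated
        # with hers with the quantum probability cos^2((a-b)/2).
        alice = random.choice((+1, -1))
        bob = -alice if random.random() < math.cos((a - b) / 2) ** 2 else alice
        return alice, bob

    a, n = 0.0, 200_000
    for b in (0.0, math.pi / 3, math.pi):   # Bob varies his setting
        rounds = [singlet_round(a, b) for _ in range(n)]
        corr = sum(x * y for x, y in rounds) / n
        marginal = sum(x for x, _ in rounds) / n
        print(f"b={b:.2f}: correlation={corr:+.3f}, Alice's average={marginal:+.3f}")
    # The correlation swings with Bob's choice of setting, but Alice's own
    # statistics stay 50/50 regardless - correlated outcomes, yet nothing Bob
    # does shows up on Alice's side alone, so no message can be sent.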

    But we already knew that relativity is not consistent with quantum mechanics, so this wasn't completely surprising. Einstein himself was deeply troubled by non-locality.

    There are plenty of neat little experiments that punch tiny holes in "physical laws." You can perform a trick with cesium gas to get apparent faster-than-light behavior. In some experiments conservation of mass/energy seems to be violated (open to interpretation), and in some phenomena we seem to have very short periods where conservation is out of whack (more accepted). Part of the hope for any sort of new big paradigm shift is that it will explain all the little oddities that pop up in a way that is more intuitive.

    I think non-locality is just a case where it's more helpful to say "yes, it's counterintuitive; causal influence seems to be instantaneous across distances," at least in terms of basic explanations.
  • The Identity of Indiscernibles and the Principle of Irrelevance


    Yes, this problem seems to me a special case of the general problem about whether there is a reality that exists independently of observers.

    This seems to me embedded in our language and thought, except possibly in sub-atomic physics, and that's a special case because the act of observation directly affects what happens next.

    But the idea of an unobservable reality seems absurd or pointless.

    Yes, that's the idea. It's a special case of what Berkeley is talking about. He is saying it is pointless to talk about things that aren't perceived.

    The Principle is more modest, saying that it's pointless to talk about things that are necessarily not perceived.

    But, it probably hinges on what one considers metaphysically and physically possible. In a world with no observers, can we still talk about things in terms of the possibility of observers? In a universe where observers are not physically possible, can we still say they are metaphysically possible?

    The universe of just two spheres seems like a case where the Principle works less ambiguously. By the very definition of the toy universe it is metaphysically impossible for there to be observers. If there were observers then it wouldn't be a universe of just two spheres, it would be a universe with two spheres and an observer. So maybe it just works to rule out these sorts of metaphysical thought experiments with toy universes that don't admit observers?

    But if we follow Kripke on essentialism and nature having the properties it does for intrinsic reasons, then it seems like a universe where observers aren't physically possible is also a universe where observers aren't metaphysically possible. This would have implications for the metaphysics of a multiverse, where most universes cannot support observers, or the Fine Tuning Problem.
  • The Complexities of Abortion


    Are those special terms? I just meant that we can't have a principle by which "what goes in and comes out of our bodies is inviolably our right" and still have laws most of us like.

    Just on a basic level, laws against public defecation or laws against exhibitionist public sexual acts are, by definition, restrictions on that sort of thing. But I think they're plenty supportable.

    More clear-cut are statutes against consensual sexual relations between adults and minors up to some "age of consent," which is generally set to early adulthood or at least late adolescence in much of the world. The codification of statutory rape is itself a government restriction on what goes into people's bodies. While such statutes are sometimes abused, as when they are used to prosecute relations between two people of close age, I think we can generally support these sorts of laws for instances when one party is well into adulthood, a teacher, etc.

    But they are cases where society is making these sorts of restrictions and saying one cannot consent to x or y, regardless of what one says.

    Laws against the use of illegal drugs work on similar grounds. And while we might not agree that prohibition in its current format works, I think most people would support some state restrictions on who is allowed to put these substances in their body. E.g., we shouldn't sell heroin to 12 year olds, or maybe we shouldn't even sell cigarettes to those under 21.

    Conscription is just a de facto violation of these sorts of principles, because in many contexts a great many of the people conscripted for some task are going to end up with things in their body that they don't want there. Naval aviators and submarine crews in World War Two often had fatality rates of 70-90%, water and shells ending up where they didn't want them.
  • The Complexities of Abortion


    I'm pro-choice and find that in the inviolability of our physical integrity. I choose what goes in and comes out of my body.

    ?
  • Fractal Geometry in the Natural Selection
    Or, from the horse's mouth:

    https://www.nature.com/articles/514161a

    Charles Darwin conceived of evolution by natural selection without knowing that genes exist. Now mainstream evolutionary theory has come to focus almost exclusively on genetic inheritance and processes that change gene frequencies.

    Yet new data pouring out of adjacent fields are starting to undermine this narrow stance. An alternative vision of evolution is beginning to crystallize, in which the processes by which organisms grow and develop are recognized as causes of evolution.

    Some of us first met to discuss these advances six years ago. In the time since, as members of an interdisciplinary team, we have worked intensively to develop a broader framework, termed the extended evolutionary synthesis1 (EES), and to flesh out its structure, assumptions and predictions. In essence, this synthesis maintains that important drivers of evolution, ones that cannot be reduced to genes, must be woven into the very fabric of evolutionary theory.

    We believe that the EES will shed new light on how evolution works. We hold that organisms are constructed in development, not simply ‘programmed’ to develop by genes. Living things do not evolve to fit into pre-existing environments, but co-construct and coevolve with their environments, in the process changing the structure of ecosystems...

    The core of current evolutionary theory was forged in the 1930s and 1940s. It combined natural selection, genetics and other fields into a consensus about how evolution occurs. This ‘modern synthesis’ allowed the evolutionary process to be described mathematically as frequencies of genetic variants in a population change over time — as, for instance, in the spread of genetic resistance to the myxoma virus in rabbits.

    In the decades since, evolutionary biology has incorporated developments consistent with the tenets of the modern synthesis. One such is ‘neutral theory’, which emphasizes random events in evolution. However, standard evolutionary theory (SET) largely retains the same assumptions as the original modern synthesis, which continues to channel how people think about evolution.

    The story that SET tells is simple: new variation arises through random genetic mutation; inheritance occurs through DNA; and natural selection is the sole cause of adaptation, the process by which organisms become well-suited to their environments. In this view, the complexity of biological development — the changes that occur as an organism grows and ages — are of secondary, even minor, importance.

    In our view, this ‘gene-centric’ focus fails to capture the full gamut of processes that direct evolution. Missing pieces include how physical development influences the generation of variation (developmental bias); how the environment directly shapes organisms’ traits (plasticity); how organisms modify environments (niche construction); and how organisms transmit more than genes across generations (extra-genetic inheritance). For SET, these phenomena are just outcomes of evolution. For the EES, they are also causes.

    Valuable insight into the causes of adaptation and the appearance of new traits comes from the field of evolutionary developmental biology (‘evo-devo’). Some of its experimental findings are proving tricky to assimilate into SET. Particularly thorny is the observation that much variation is not random because developmental processes generate certain forms more readily than others...



    SET explains such parallels as convergent evolution: similar environmental conditions select for random genetic variation with equivalent results. This account requires extraordinary coincidence to explain the multiple parallel forms that evolved independently in each lake. A more succinct hypothesis is that developmental bias and natural selection work together4,5. Rather than selection being free to traverse across any physical possibility, it is guided along specific routes opened up by the processes of development5,6...

    Another kind of developmental bias occurs when individuals respond to their environment by changing their form — a phenomenon called plasticity. For instance, leaf shape changes with soil water and chemistry. SET views this plasticity as merely fine-tuning, or even noise. The EES sees it as a plausible first step in adaptive evolution. The key finding here is that plasticity not only allows organisms to cope in new environmental conditions but to generate traits that are well-suited to them. If selection preserves genetic variants that respond effectively when conditions change, then adaptation largely occurs by accumulation of genetic variations that stabilize a trait after its first appearance5,6. In other words, often it is the trait that comes first; genes that cement it follow, sometimes several generations later5.

    Studies of fish, birds, amphibians and insects suggest that adaptations that were, initially, environmentally induced may promote colonization of new environments and facilitate speciation5,6. Some of the best-studied examples of this are in fishes, such as sticklebacks and Arctic char. Differences in the diets and conditions of fish living at the bottom and in open water have induced distinct body forms, which seem to be evolving reproductive isolation, a stage in forming new species. The number of species in a lineage does not depend solely on how random genetic variation is winnowed through different environmental sieves. It also hangs on developmental properties that contribute to the lineage’s ‘evolvability’.

    In essence, SET treats the environment as a ‘background condition’, which may trigger or modify selection, but is not itself part of the evolutionary process. It does not differentiate between how termites become adapted to mounds that they construct and, say, how organisms adapt to volcanic eruptions. We view these cases as fundamentally different7...




    Whereas many of the rebuttals (in this article or this one: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5329086/) focus on "predictive power."


    Finally, diluting what Laland and colleagues deride as a ‘gene-centric’ view would de-emphasize the most powerfully predictive, broadly applicable and empirically validated component of evolutionary theory. Changes in the hereditary material are an essential part of adaptation and speciation. The precise genetic basis for countless adaptations has been documented in detail, ranging from antibiotic resistance in bacteria to camouflage coloration in deer mice, to lactose tolerance in humans.

    The problem here is that genes only reproduce by virtue of the bodies they are in. Bodies do the replicating. Thus, if it were as easy to catalog and quantify variance in phenotype, in all observable properties of an organism, across a population (which would of course include all differences in genotype, since the genotype is part of the body) as it is for genes, it is prima facie reasonable to assume that such models would be more predictive than gene-based models. We don't tend to do modeling based on phenotypes in this way because it's incredibly difficult and you can't trace lineages the same way. That's not a good argument against their relevance though. This particular counterargument is like saying the keys must be under the streetlight because you couldn't see them if they fell anywhere else.

    And you can make predictive models based on a host of factors; it's unclear that "most important" = "most predictive," given problems with data collection and accurate modeling.

    Anyhow, my inclination is to think that, if different levels of a hierarchical phenomenon can all recommend themselves as "the/a big mover," then what you really might have is a fractal-type problem where the same pattern is reasserting itself at different levels. Each one can show up in a model as predictive because it is following the same pattern as the other levels. Then it's the overall pattern you really want to look at in the end. Whether or not this is feasible for experimental science is another question.
  • Fractal Geometry in the Natural Selection


    I think these days it is fairly widely understood, amongst those who have looked into the subject beyond high school biology, that there are selection effects that take place through changes in DNA outside the boundaries of genes. (Gene expression promoting regions of DNA, which are not themselves part of a gene, for example.)

    So there is a sense in which defining evolution as change in allele frequency over time is simplistic. However, perhaps when looked at on geological time scales, changes in allele frequency are such a dominant factor that such simplistic definitions are pragmatic for introducing people to the subject?

    Right, that's what the debate is generally about; is the simplification pragmatically warranted or does it obscure important facts. It's pretty rare to see a denial of the fact that group selection can occur. It's not generally an argument about absolutes, but rather one about "what is most fundamental?" and what is "interesting, but not a central variation on the process."

    That said, arguments about selection on the basis of form, defined broadly as "developing echolocation" or "developing the ability to fly," do seem fairly controversial. At least part of the fear here is that it introduces too much teleology into biology, making it seem like purposeful development. But I've certainly seen arguments for selection on the basis of broad form/function made in ways that don't seem, at first glance, to be teleological at all. Generally, they're framed in terms of evolution as a scan of a sample space, with broad functional/formal adaptations being attractor regions in that sample space.

    So, answers like this are pretty common:

    Richard Dawkins likes to couch this discussion in terms of replicators and vehicles. Replicators are any entities of which copies are made; selection will favor replicators with the highest replication rate. Vehicles are survival machines: organisms are vehicles for replicators and selection will favor vehicles that are better at propagating the replicators that reside within them. There is a hierarchy of both replicators and vehicles. The key issues are that 1) the "unit" of selection is one that is potentially immortal: organisms die, but their genes could be passed on indefinitely. The heritability of a gene is greater than that of a chromosome is > that of a cell > organism > and so on. But, because of linkage we should not think of individual genes as the units; it is the stretch of chromosome upon which selection can select, given certain rates of recombination. Issue 2) is that selection acts on phenotypes that are the product of the replicators, not on the replicators themselves, but the vehicles have lower heritability and immortality than replicators. What then is the unit of selection? All of them, just of different strengths and effects at different levels.

    But they seem messy. A sort of fractal model seems like it could address how different levels can look more primary depending on how you do your analysis.
  • The irreducibility of phenomenal experiences does not refute physicalism.


    I don't see any real problem. Panpsychism seems like nothing more than an unfalsifiable hypothesis that has no significant explanatory value, and Ockham razor seems like sufficient justification for dismissing panpsychism. From my perspective panpsychism doesn't seem to present any more challenge than solipsism.

    If everything intrinsically has some form of first-person subjective experience, that would explain why there is first-person subjective experience. We would still need to explain why some entities have more depth of experience than others, but not experience itself, since it is an unanalyzable primitive. That would seem to be the explanatory value. Knowledge of the brain already does shed much light on why it is that different people experience subjective life differently, so this seems like a far more tractable problem. At least that's the argument panpsychists give; I am not terribly convinced.


    This seems to me, more a matter of unrealistic expectations on the part of critics of physicalism, than it seems a problem for physicalism. Brains are enormously complex, and I say this as an electrical engineer who routinely deals with highly complex systems. Yes there is a huge way to go in developing a understanding of how brains instantiate minds, and no guarantee that human minds are up to the task of developing something approaching an ultimate explanatory theory. However, substantial explanatory progress has been made over my lifetime, and that progress is ongoing. I don't see how anything similar can be claimed for panpsychism.

    I think you've misread my point. My point was that physicalism/panpsychism isn't a mutually exclusive dichotomy. If we discovered some sort of empirically observable psyche particle or property of mass/energy that suffuses the universe, and we were able to associate it with the emergence of first-person experience on a level with our own, we would say "aha, that's the physical entity related to consciousness."

    Panpsychism doesn't posit a sui generis substance responsible for consciousness; most formulations just say that subjective experience is a property of physical substance, period. So the problem isn't that I expect physicalism to debunk panpsychism; it's that, if the physicalist wants to say "panpsychism is not commensurate with physicalism," they have to explain why. On the face of it, there doesn't seem to be any ontological reason this should be so. But it's much harder to explain what consciousness can't be caused by if you don't know what it is caused by.

    I agree with the rest of what you said. And perhaps this critique just reduces to Hempel's Dilemma. After all, if we had solid scientific evidence of psychic powers or ghosts, physicalists would probably also want to point to the mechanisms by which we found those phenomena to work and say "see, look, ghosts are physical, it's the physical ectoplasm that explains it." But then the problem is that physicalism has just turned out to be "whatever there is widespread support for." The key problem there is that, at least in physics, and at least for those that publish metaphysically minded papers and books, it seems that the scientists who should be guiding "scientific realism" towards physicalism have a tendency to advocate for ontologies that don't seem very physicalist (e.g. "It From Bit," ontic structural realism/Platonism, etc.)

    In any case, I'm interested in hearing more about what you see as "massive problems" for physicalism.

    See: https://plato.stanford.edu/entries/physicalism/

    But note that these are philosophical problems with coherently defining physicalism, not empirical arguments against it. Physicalism can move along so well despite these because they aren't issues that concern most people. Also, part of the reason it has received so many wounds is simply because it is popular. If another ontology became as popular it would probably also have more people analyzing it, which would then lead to more problems being identified.

    Consider:

    1. Physicalism is true at a possible world w iff any world which is a physical duplicate of w is a duplicate of w simpliciter.

    2. Physicalism is true at a possible world w iff every property instantiated at w is necessitated by a physical property.

    But the most influential objection to supervenience physicalism (and to modal formulations generally) is what might be called the sufficiency problem. This alleges that, while (1) articulates a necessary condition for physicalism it does not provide a sufficient condition. The underlying rationale is that, intuitively one thing can supervene on another and yet be of a completely different nature. To use Fine’s famous (1994) example, consider the difference between Socrates and his singleton set, the set that contains only Socrates as a member. The facts about the set supervene on the facts about Socrates; any world that is like ours in respect of the existence of Socrates is like ours in respect of the existence of his singleton set. And yet the set is quite different from Socrates. This in turn raises the possibility that something might be of a completely different nature from the physical and nevertheless supervene on it.

    One may bring out this objection further by considering positions in philosophy which entail supervenience and yet deny physicalism. A good example is necessitation dualism, which is an approach that weaves together elements of both physicalism and its traditional rival, dualism. On the one hand, the necessitation dualist wants to say that mental facts and physical facts are metaphysically distinct—just as a standard dualist does. On the other hand, the necessitation dualist wants to agree with the physicalist that mental facts are necessitated by, and supervene on, the physical facts. If this sort of position is coherent, (1) does not articulate a sufficient condition for physicalism. For if necessitation dualism is true, any physical duplicate of the actual world is a duplicate simpliciter. And yet, if dualism of any sort is true, including necessitation dualism, physicalism is false.

    Further, consider that if the mental supervenes on all related physical events, we could as easily flip the script and say that the physical supervenes on all related mental events. And yet physicalism generally wants to say that only one set is relevant for causal explanations; thus we need something more than mere supervenience.

    Also, supervenience itself seems unable to deal with a process-based metaphysics. It is an idea born of substance thinking. However, the natural sciences have overwhelmingly tended to move away from substance explanations: heat is now thought of in terms of average motion, not caloric; combustion is a process, not the substance phlogiston; atoms have a beginning and an end and will eventually decay, being patterns of mass-energy rather than primary substances; "fundamental" particles are now often thought of as mere patterns in a field; etc.

    A final topic that I will consider is that of supervenience. The intuition of supervenience is that higher level phenomena cannot differ unless their supporting lower-level phenomena also differ. There may be something correct in this intuition, but a process metaphysics puts at least standard ways of construing supervenience into question too.

    Most commonly, a supervenience base — that upon which some higher-level phenomena are supposed to be supervenient — is defined in terms of the particles and their properties, and perhaps the relations among them, that are the mereological constituents of the supervenient system [Kim, 1991; 1998]. Within a particle framework, and so long as the canonical examples considered are energy well stabilities, this might appear to make sense.

    But at least three considerations overturn such an approach. First, local versions of supervenience cannot handle relational phenomena — e.g., the longest pencil in the box may lose the status of being longest pencil even though nothing about the pencil itself changes. Just put a longer pencil into the box. Being the longest pencil in the box is not often of crucial importance, but other relational phenomena are. Being in a far from equilibrium relation to the environment, for example, is a relational kind of property that similarly cannot be construed as being locally supervenient. And it is a property of fundamental importance to much of our worlds — including, not insignificantly, ourselves.

    A second consideration is that far from equilibrium process organizations, such as a candle flame, require ongoing exchanges with that environment in order to maintain their far from equilibrium conditions. There is no fixed set of particles, even within a nominally particle view, that mereologically constitutes the flame.

    A third, related consideration is the point made above about boundaries. Issues of boundary are not clear with respect to processes, and not all processes have clear boundaries of several differentiable sorts - and, if they do have two or more of them, they are not necessarily co-extensive. But, if boundaries are not clear, then what could constitute a supervenience base is also not clear.

    Supervenience is an example of a contemporary notion that has been rendered in particle terms, and that cannot be simply translated into a process metaphysical framework [Bickhard, 2000; 2004]. More generally, a process framework puts many classical metaphysical assumptions into question.

    Mark Bickhard - Systems and Process Metaphysics - The North Holland Handbook of the Philosophy of Science: The Philosophy of Complex Systems

    Of course, not all physicalism is supervenience physicalism, but it is what most people generally mean by the term. Further, if what is physically not present has causal power, as in information-theoretic ontologies, absential phenomena, etc., then this seems to violate the causal closure principle as it is commonly put forth for physicalism (although it might be recoverable through reformulation). Really, if information is "the ontological basement" as some physicists contend, or even if it is merely fundamental, "coequal with energy" as others assert, it is hard to see how classical physicalism's causal closure principle works, even if reformulated in process terms.

    IMO, it's unclear if a "process physicalism" is worthy of the name. Physicalism always struck me as a substance metaphysics, partly because of how it came to define itself historically in terms of an opposition to substance dualism.

    Or consider:

    A third problem, which we mentioned briefly above, is the problem of abstracta (Rabin 2020). This concerns the status within physicalism of abstract objects, i.e., entities apparently not located in space and time, such as numbers, properties and relations, or propositions.

    To see the problem, suppose that abstract objects, if they exist, exist necessarily, i.e., in all possible worlds. If physicalism is true, then the facts about such objects must either be physical facts, or else bear a particular relation (grounding, realisation) to the physical. But on the face of it, that is not so. Can one really say that 5+7=12, for example, is realised in, or holds in virtue of, some arrangement of atoms and void? Or can one say that it itself is a physical fact or a fundamental physical fact? If not, physicalism is false: the property of being such that 5+7=12 obtains in the actual world but is neither identical to, nor grounded in or realized by, any physical property. (Sometimes the problem of abstracta is formulated as concerning, not abstract objects such as numbers or properties, but the grounding or realization facts themselves; see, e.g., Dasgupta 2015. We will set this aside here.)

    There are a number of responses to this problem in the literature; for an overview, see Rabin 2020, see also Dasgupta 2015 and Bennett 2017; for more general discussion of physicalism and abstracta, see Montero 2017, Schneider 2017, and Witmer 2017.

    One response points out that, while the problem of abstracta confronts many different versions of physicalism, it does not arise for supervenience physicalism. After all, since numbers exist in all possible worlds, facts about them trivially supervene on the physical; any world identical to the actual world in physical respects will be identical to it in respect of whether 5+7=12, because any world at all is identical to the actual world in that respect! But the difficulty here is that supervenience physicalism seems, as we saw above, too weak anyway. Indeed, one might think that the example of abstracta is simply a different way to bring out that it is too weak.

    Another option is to adopt a version of nominalism, and deny the existence of abstracta entirely. The problem with this option is that defending nominalism about mathematics is no easy matter, and in any case nominalism and physicalism are normally thought of as distinct commitments.

    A third view, which seems more attractive than either of the two mentioned so far, is to expand the notion of a physical property that is in play in formulations of physicalism. For example, one might treat the properties of abstract objects as topic-neutral in something like the sense discussed in connection with Smart and reductionism above (see section 3.1). Topic-neutral properties have the interesting feature that, while they themselves are not physical, they are capable of being instantiated in what is intuitively a completely physical world, or indeed what is intuitively a completely spiritual world or a world entirely made of water. If so, it becomes possible to understand physicalism so that the reference to ‘physical properties’ within it is understood more correctly as ‘physical or topic-neutral properties’.

    But of course if there are "physical" and "topic-neutral properties" then we actually have two types of things.
  • Fractal Geometry in the Natural Selection
    For a bit more context:

    Dawkins describes genes as replicators. The suffix “-or” suggests that genes are in some sense the locus of this replication process (as in a machine designed for a purpose like a refrigerator or an applicator), or else an agent accomplishing some function (such as an investigator or an actor). This connotation is a bit misleading. DNA molecules only get replicated with the aid of quite elaborate molecular machinery, within living cells or specially designed laboratory devices. But there is a sense in which they contribute indirectly to this process: if there is a functional consequence for the organism to which a given DNA nucleotide sequence contributes, it will improve the probability that that sequence will be replicated in future generations. [Dawkins describes] genes as active replicators for this reason, though the word “active” is being used rhetorically...

    Replicator theory thus treats the pattern embodied in the sequence of bases along a strand of DNA as information, analogous to the bit strings entered into digital computers to control their operation. Like the bit strings stored in the various media embodying this manuscript, this genetic information can be precisely copied again and again with minimal loss because of its discrete digital organization. This genetic data is transcribed into chemical operations of a body analogous to the way that computer bit strings can be transcribed into electrical operations of computer circuits. In this sense, genes are a bit like organism software.

    Replicators are, then, patterns that contribute to getting themselves copied. Where do they get this function? According to the standard interpretation, they get it simply by virtue of the fact that they do get replicated.

    But of course, phenotypes also get replicated, as do broad forms like wings, eyes, etc. Ants represent 20% of terrestrial animal biomass. That's a lot of replicated form, function, and phenotype in our world. Trees all share key formal features (hence how we can define them), represent 80% of all terrestrial biomass, period, and shape the entire atmosphere's chemistry to a large degree. Trees have phenotypes that construct the larger planetary environment that allows their genes to reproduce.

    The qualifier “active” introduces an interesting sort of self-referential loop, but one that seems to impute this capacity to the pattern itself, despite the fact that any such influence is entirely context-dependent. Indeed, both sources of action — work done to change things in some way — are located outside the reputed replicator. DNA replication depends on an extensive array of cellular molecular mechanisms, and the influence that a given DNA base sequence has on its own probability of replication is mediated by the physiological and behavioral consequences it contributes to in a body, and most importantly how these affect how well that body reproduces in its given environmental context. DNA does not autonomously replicate itself; nor does a given DNA sequence have the intrinsic property of aiding its own replication — indeed, if it did, this would be a serious impediment to its biological usefulness. In fact, there is a curious irony in treating the only two totally passive contributors to natural selection — the genome and the selection environment — as though they were active principles of change.

    But where is the organism in this explanation? For Dawkins, the organism is the medium through which genes influence their probability of being replicated. But as many critics have pointed out, this inverts the location of agency and dynamics. Genes are passively involved in the process while the chemistry of organism bodies does the work of acquiring resources and reproducing. The biosemiotician Jesper Hoffmeyer notes that, “As opposed to the organism, selection is a purely external force while mutation is an internal force, engendering variation. And yet mutations are considered to be random phenomena and hence independent of both the organism and its functions.”

    By this token the organism becomes, as Claus Emmeche says, “the passive meeting place of forces that are alien to itself.” So the difficulty is not that replicator theory is in error — indeed, highly accurate replication is necessary for evolution by natural selection — it’s that replicators, in the way this concept has generally been used, are inanimate artifacts. Although genetic information is embodied in the sequence of bases along DNA molecules and its replication is fundamental to biological evolution, this is only relevant if this molecular structure is embedded within a dynamical system with certain very special characteristics. DNA molecules are just long, stringy, relatively inert molecules otherwise. The question that is begged by replicator theory, then, is this: What kind of system properties are required to transform a mere physical pattern embedded within that system into information that is both able to play a constitutive role in determining the organization of this system and constraining it to be capable of self-generation, maintenance, and reproduction in its local environment? These properties are external to the patterned artifact being described as a replicator, and are far from trivial... [It] can’t be assumed that a molecule that, under certain very special conditions, can serve as a template for the formation of a replica of itself exhibits these properties. Even if this were to be a trivially possible molecular process, it would still lack the means to maintain the far-from-equilibrium dynamical organization that is required to persistently generate and preserve this capacity. It would be little more than a special case of crystallization.



  • The Identity of Indiscernibles and the Principle of Irrelevance


    You're quite right here. I wasn't really sure how to formulate the Principle exactly. It has a modal component in that it's about what is necessarily indistinguishable, as opposed to what is contingently so. It seems possible that there may be contingent states of affairs that, if they obtain, will result in two entities becoming indiscernible. This isn't what I was trying to get at, though.

    Rather, I was thinking more along these lines, which you have formulated better than I:


    "If X is such that necessarily there does not exist an observer O such that possibly there exists (a distinction of X from Y for O) then X is indiscernible from Y."

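    In modal notation, one loose rendering would be the following, where D(O, X, Y) is just a stand-in predicate for "O discerns X from Y" and nothing hangs on the exact symbolization:

    \Box\, \neg \exists O\, \Diamond\, D(O, X, Y) \;\rightarrow\; \mathrm{Ind}(X, Y)

    with Ind(X, Y) read as "X is indiscernible from Y."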

    But maybe it's vacuous? The problem of vacuousness seems to hinge on the proposition that the set of all possible ontological differences between entities is in fact different from the set of all possible observable differences.

    However, I think these are indeed different sets. We can easily posit real ontological differences in properties that necessarily never result in any phenomenal differences. The Principle just says that we shouldn't bother doing this, since whether claims of this sort are true or false will necessarily be a matter of indifference to us.
  • The Identity of Indiscernibles and the Principle of Irrelevance


    I forgot this part!

    The second problem which is more to the point, is that each observer is oneself, a unique and particular individual, according to the law of identity. Because of this, the observational apparatus and perspective of the observer is also unique to the individual. This makes it highly improbable that two distinct observers will ever precisely describe the very same thing in the exact same way. Accordingly, the criteria for "X", which needs to be the same description provided by all observers, will never be fulfilled, and "X=Y" will refer to nothing.

    This is a formidable challenge. Do you think this makes Leibniz Law untenable entirely?

    Or can we talk about entities' properties without any reference to an observer? If we can, can't we do the same sort of abstraction and apply the Principle to the set of all possible discernments? That is, within the set of all possible discernments, there is no case in which x ≠ y, thus x = y. All possible discernments are not "subjective discernments," as such a set would be an entity itself (if we allow that such abstract entities as sets exist). If we are realists, it doesn't seem that this should be a necessarily fatal problem; phenomena are entities, as are sets of said phenomena. Perhaps this trivially reduces the Principle to Leibniz Law, but I don't think it does, because Leibniz Law leaves open the possibility of bare haecceities of difference, differences that never make any possible phenomenological difference, which is what the Principle denies.

    Also, note that it is not necessary that all possible entities perceive or describe X in exactly the same way. It is only necessary that all possible entities cannot distinguish between X versus Y being the case. That is a key difference. I.e., "for all possible cases, no entity can distinguish between X and Y being a state of affairs that obtains or fails to obtain," not that "all possible entities view X and Y identically." There is a morphism between the ability to discern between X and Y for all entities, not between their experiences of X (and Y if it exists).
  • The Identity of Indiscernibles and the Principle of Irrelevance


    Sure. But we've already stayed the hand holding the razor to allow unobservable noumena to exist.



    I think it was glass balls because I remember one getting a scratch on it, or maybe that's a later version. IMO that whole series of articles seems to make a misstep by assuming a "classical universe of just two balls" is something that could necessarily exist. How do the balls get there? You need stars to go supernova to create glass (or iron), right?

    In our world we can clearly distinguish between one ball versus two because the causal history of one or the other situation is different. Apparently, this universe has no causal history. If you remove all observers, and all causal history, it's unclear if you're left with something that makes sense though. Like, linguistically, the premise makes sense, I can imagine it. But we've copied and pasted different elements of our world into a foreign abstract landscape where the things we can say about them based on our world break down. The "geodesic space-time" explanation that claims that there is only one ball is particularly funny because, if we're assuming a "classical set of balls," why not go all in and just assume "absolute space and time," to simplify things? I mean, we left the constraints of lived reality behind a long time ago in these examples, so we've made things so malleable that you can make the case for all sorts of interpretations.
  • The irreducibility of phenomenal experiences does not refute physicalism.


    Panpsychism has always been a problem for physicalism because it seems to be decidedly not what physicalists want to posit, but at the same time it is in no way ruled out by mainstream physicalism, partly because no physicalism that precludes panpsychism has been developed that doesn't seem to spawn massive problems for the theorist. It isn't easy to say "mind exists, but it can only exist in some places" without knowing what it is that "causes" mind. But that's exactly the unfortunate position a physicalist who wants to deny panpsychism finds themselves in.

    To be honest, it's really weird to me how physicalism is the most popular ontology writ large, but in the context of metaphysics as a specialty it's like a battleship that's taken multiple direct torpedo hits, is listing to one side, has had its magazine blow, and looks liable to break in half. I think what that tells us is just how unattractive the alternatives are lol. It might be sinking, but the lifeboats are filled with holes too.
  • The irreducibility of phenomenal experiences does not refute physicalism.


    But I am talking about the information contents of the actual image, you are talking about features of the physical object the image has been projected on. I can produce the same image on different paper or have it on a digital screen and identify the contents of the image; those contents are not directly related to the physical composition of a photograph you can hold in your hand and cannot be reduced to it, which is the main point.

    Yes, it's true that that image is not totally independent of other factors; after all, the type of camera and resolution etc will have an effect on the image but these largely still come from the same interactions during the photo-taking process by which the image of Everest was stored - it is still information of the image which is independent of the physical medium an image is projected on and so cannot be reduced to it.

    I don't disagree that there is a useful distinction to be made between the "image" and the physical photograph. We can think of the image as the "Shannon Entropy," a collection of variable discrete differences that is substrate independent.
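
    As a toy illustration of that substrate independence (the "image" here is just an arbitrary bit pattern, realized once as screen pixels and once as ink marks):

    import math
    from collections import Counter

    def entropy_bits(seq):
        # Shannon entropy of the symbol distribution, in bits per symbol
        counts = Counter(seq)
        n = len(seq)
        return sum(-(c / n) * math.log2(c / n) for c in counts.values())

    pixels = [0, 0, 1, 1, 0, 1, 0, 1]                  # bits on a screen
    ink = ["blank", "blank", "dot", "dot",
           "blank", "dot", "blank", "dot"]             # marks on paper
    print(entropy_bits(pixels), entropy_bits(ink))     # identical: 1.0 1.0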

    But physicalism says that any such information supervenes on the physical. That is, all representation is representation only in virtue of the physical properties of the system that holds the representation. Thus, while we can abstract the picture from the photograph, and we can say that there are isomorphisms between different copies of the same image, these abstractions are causally irrelevant. All causes can be explained in purely physical terms: the causal closure principle.

    So what you're describing seems to be more an argument against physicalism than a way to save it. Physicalism without causal closure and supervenience doesn't seem to be physicalism. Physicalism says that everything that can be known about seeing red is physical. There is nothing else. Perhaps experiencing red is a different thing from knowing "how red is experienced." This is fine, but it's going to lead you to physicalism with type or predicate dualism (which may or may not be physicalism, depending on who you ask).

    Now, you do have scientific theories where information is essential: "it from bit" views in physics, Deacon's "absential phenomena," which are born of what "is not physically present," etc. But generally information-based ontologies, at least those that say information is ontologically primitive, are taken to be forms of immaterialism, not physicalism.

    If physicalism isn't going to fall to Hempel's dilemma and define itself as "just whatever currently has evidential support," it seems like it has to pick a hill to die on, and supervenience is the most obvious hill.

    To be fair, I think similar sorts of problems show up for idealism. I am inclined to think that the problem might be substance metaphysics writ large, with both physicalism and idealism making the cardinal error of following Parmenides in thinking of static being-as-substance, instead of Heraclitus' being-as-flux-shaped-through-Logos. Maybe this even helps get at cosmological issues because, while stabilities of matter have a beginning and an end, the Logos is necessarily without beginning or end, as cause and effect is the ground from which before and after can even exist (for a bit of a non sequitur).

Count Timothy von Icarus
