• Banning AI Altogether
    Surely, discussing AI capabilities, flaws and impacts, as well as the significance this technology has for the philosophy of mind and of language (among other things) should be allowed, and illustrating those topics with properly advertised examples of AI outputs should be allowed.Pierre-Normand

    The A.I.-derived OP’s are likely to be better thought-out than many non-A.I. efforts. Banning A.I. is banning background research that will become built into the way we engage with each other.Joshs

    This is the reality. The tool is now ubiquitous. Every intellectual is going to have to factor it into their practice. Time to learn what that means.

    If you need to cheat to pass your exams or publish your research, then in the end it is you who suffers. But if AI can be used in a way that actually expands your brain, then that ought to be encouraged.

    PF seems a suitably low stakes place to evolve some social norms.
  • Banning AI Altogether
    I guess I’m naïve or maybe just not very perceptive, but I haven’t recognized any posts definitely written by AI.T Clark

    I’m definitely seeing posters who are suddenly injecting chunks of more organised and considered material into their responses. There are AI tools to detect the giveaway changes in rhythm, vocab and style. But if you know the poster, even if they’ve done some rewriting, it is already jarring enough.

    So sure. AI as a tool will change things in ways that are the usual mix of better and worse. And all my life I have seen nothing but that kind of change.

    I remember life before and after Google. The internet before and after it was just academics and geeks on it. The world as it once was when I had to fill out cards at the British Library and wait several days for obscure tomes to arrive at my desk, brought by porters with clanking metal trolleys.

    Being Luddite never works. Listservs were once the greatest intellectual medium ever invented - the ideal combination of book and conference. But the internet got overrun and personal blogs took over. They didn’t last long themselves - or tried to evolve into substacks or whatever. I had already lost interest in that line of development. YouTube was the next medium to become actually useful.

    If anyone values PF for some reason, they ought to think about why and how to respond to AI from that point of view. Banning it is just going to increase the disguised use of it. Folk can already Google and then can’t help but get an AI response from it as the first hit. So would one ban search engines too?

    There was once a moment when PF went in for social media likes and dislikes. PF is already socially gamified and some got into that while others deplored it. I think the change in platform might have simply failed to support the necessary like button. I vaguely remember an ignore function that also bit the dust.

    Anyway, the point is there is always change and its tempo is only increasing. And what even is PF’s mission? What would you miss most if it upped and vanished? That should inform any policies on AI.

    Are we here for erudition or the drama? And what would AI’s impact be on either?
  • Banning AI Altogether
    And AI agrees. :razz:

    AI poses several dangers to ordinary human intellectual debate, primarily through the erosion of critical thinking, the amplification of bias, and the potential for large-scale misinformation. Instead of fostering deeper and more informed discourse, AI can undermine the very human skills needed for a robust and productive exchange of ideas.

    Erosion of critical thinking and independent thought: By outsourcing core intellectual tasks to AI, humans risk a decline in the mental rigor necessary for debate.

    Cognitive offloading: People may delegate tasks like research and analysis to AI tools, a process called cognitive offloading. Studies have found a negative correlation between heavy AI use and critical thinking scores, with younger people showing a greater dependence on AI tools for problem-solving.

    Reduced analytical skills: Over-reliance on AI for quick answers can diminish a person's ability to engage in independent, deep analysis. The temptation to let AI generate arguments and counterarguments can bypass the human-centered process of careful reasoning and evaluation.

    Stagnation of ideas: If everyone relies on the same algorithms for ideas, debate can become repetitive and less creative. True intellectual debate thrives on the unpredictable, human-driven generation of novel thoughts and solutions.

    Amplification of bias and groupthink: AI systems are trained on human-created data, which often contains pre-existing biases. Algorithms can create "filter bubbles" and "echo chambers" by feeding users content that reinforces their existing beliefs. In a debate, this means participants may be intellectually isolated, only encountering information that confirms their own point of view, and they may be less exposed to diverse perspectives.

    Erosion of authenticity: As AI-generated content becomes indistinguishable from human-generated content, it can breed a pervasive sense of distrust. In a debate, it becomes harder for participants to trust the authenticity of arguments, eroding the foundation of good-faith discussion
  • Against Cause
    My personal worldview is built upon what I call the BothAnd principle*1 of Complementarity or the Union of Opposites. Instead of an Either/Or reductive analysis, I prefer a Holistic synthesis. We seem to be coming from divergent directions, with different vocabularies, but eventually met somewhere in the middle of the Aperion.Gnomon

    There are many many versions of this in world culture as it is simply what is obvious once you think about how anything could come to have existence. Unless you go for some Big Daddy in the Sky divine creator figure, you are going to have to posit an ultimate stuff so vague it is just the potential for stuff, which then becomes something by dividing against itself in the complementary fashion that allows it to evolve into the many kinds of things we find.

    One such philosopher whom you might like to add to your list is Schelling and his Ungrund.

    A quick AI summary…..

    Schelling's theory of the Ungrund (non-ground) posits a primal, ungrounded principle that precedes and underlies all existence, including the rational mind. This "ungrounded ground" is a chaotic, indeterminate, and free force that is the source from which all reality and consciousness emerge, a concept that departs from purely rationalistic systems and emphasizes the importance of the unconscious and irrational.

    Key aspects of the Ungrund

    Primal, undetermined principle: The Ungrund is an "unfathomable" and "incomprehensible" starting point that has no prior cause or ground itself. It is a pure, indifferent identity that exists before the separation of subject and object, logic and existence.

    Source of freedom and creativity: Because it is not bound by pre-existing structures or reason, the Ungrund is inherently free and allows for the possibility of change and development. This freedom is the basis for creativity and action in both nature and the human being.

    Precedes reason: For Schelling, reason and rational structures are not the ultimate source of reality but rather emerge from this ungrounded source. The world contains a "preponderant mass of unreason," with the rational being merely secondary.

    A link between philosophies: The Ungrund serves as a bridge between Schelling's early philosophy of identity and his later division into negative and positive philosophies. It is introduced to explain the origin of difference and existence from a prior, non-dialectical unity.

    Connection to the divine: Schelling also uses the concept of the Ungrund in a theological context, suggesting that God has an inner ground that precedes existence, but that God is also the principle that gives rise to this ground, as seen in his discussions on freedom and God
  • Hume and legitimate beliefs
    Perhaps I should consider myself lucky I have a sketchy grounding in formal logic.Janus

    Correct. I had a shudder at the mention of Tooley and Sosa’s Causation. Made me think how lucky I was to have an instinctive aversion to reductionism from as early as I remember. :grin:
  • Models and the test of consciousness
    Friston and his followers often claim that subjective experience (qualia) is not the target of the theory.
    But to describe the brain as a predictive, inferential, or representational system is already to invoke the phenomenal domain.
    Wolfgang

    It would be guilty of the dualism of Cartesian representationalism if it were not instead the enactive story of a self in interaction with its world. The triadic thing of a semiotic modelling relation where “reality” emerges as the mediating truth of an “Umwelt”.

    So the question is not how to explain mental subjectivity in physically objective terms - the Cartesian hard problem we are all familiar with. It is instead the different question of how an Umwelt can emerge that connects a model and a world in a mediated relation. How can such a thing be? And why would it so naturally evolve?

    Your critique is based on attacking metaphysical dualism. And that is fair enough. Do plenty of that myself.

    But then Peircean semiotics in particular understood that dualism in nature is really the symmetry breaking of a dichotomy which then resolves itself in the recursive order of a hierarchy. Sort of like Hegelian dialectics, but with the added holism of recursive scale.

    Anyway, a theory of Umwelts would be the target here, not a theory of subjectivity. An Umwelt is not a representation of the real world. It is the state of an interpretation which puts an organism into a pragmatic accommodation with the environment required to sustain its being.

    That is sort of like looking at something as if it is really there. But as the enactivists would say, what is seen is a panorama of affordances. All the little levers and buttons we might pull or push to get things done in a way we might desire.

    The Umwelt that arises as the mediating connection between self and world is not merely a model. It is a model of a world as it would be with a self existing at its semantic core. The world as a sweeping entropic flow rushing heedlessly towards its heat death. But now also with a central “us” included in this panorama - the bit that makes the sunset look so rosily beautiful and the apple so appealing to the bite. That is, until the model is updated because an ugly wormhole has been noted and expectations have formed about the grub shortly to be discovered in that first juicy crunch.

    So Friston’s dualism is not in fact a dualism at all. It is a model of this semiotic modelling relation. It is a theory not of subjectivity as a special kind of mind stuff but of the Umwelt as the mediating connection by which a system of semantics can engage with a system of thermodynamics and immediately - habitually and reflexively - see it all in terms of the availability of an exergy or capacity for work. The shortest available path to achieving whatever it is we might happen to desire.
  • Hume and legitimate beliefs
    I think the problem is that if you train in philosophy and then specialise in logic, you can't help but become a metaphysical-strength reductionist. That is the habit of thought being ground into you. There seems no way to think yourself back out of this hole.

    But science at least has to cover both bases. It has to be holistic to some degree, even if that is disguised. As in calling the evolved habits of Nature its "laws". Cosmic regularities that simply exist for some reason or other, not because they might have had some Hegelian history of rational evolution.
  • Hume and legitimate beliefs
    To suggest that laws (so defined) come and go over time is ad hoc, because there's no evidence for this. Types of particulars may come into or out of existence, but if they exist - the associated laws will necessarily exist.Relativist

    I agree with you... unenlightened is blowing the smoke of mere logical possibility.Janus

    Yep. The growing block universe is a way to see both sides of the story. The fact that the future is free, and so can evolve in unpredicted ways. And yet also the past is what has definitely happened and so removed some vast number of such degrees of freedom. The future is free, and yet also strongly constrained.

    So there is truth that the future is open. But also that this openness has become increasingly constrained over time.

    What Peirce called the growth of cosmic habit. The Universe as an evolving process where the regularity of natural laws emerges from an initial state of pure chance via the universal metaphysical tendency to form habits.

    A rather Bayesian thing. Certainty emerging at the end of the trail of a process of uncertainty constraining or possibility eliminating. But never a terminus of absolute certainty. Only the pragmatic thing of arriving at a reasonable limit on doubt.
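    In Bayesian notation the point is only the familiar one of sequential updating - offered here as the standard schema, not as anything Peirce himself wrote:

```latex
% Sequential Bayesian updating: each new piece of evidence e_t reshapes the prior,
% so the posterior over hypotheses h narrows as constraints accumulate - though never to certainty.
p(h \mid e_{1:t}) \;\propto\; p(e_t \mid h)\, p(h \mid e_{1:t-1})
```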
  • Models and the test of consciousness
    Free energy minimization gives you a framework where you can write down the equations describing the conditions for a system to maintain its own existence over time. That might not be interesting for a rock. But I think thats quite interesting for more complicated self-organizing systems. Its a framework for describing what complex self-organizing systems do, like choosing to Describe physical systems as following paths of least action.

    ...as a unifying theory of self-organization, it does exactly what it says on the tin, and its impossible for it to be any more precise empirically because the notion of a self-organizing system is far to general to have any specific empirical consequences. Exactly the same for a "general system's theory". Nonetheless, this theory is fundamentally describing in the most general sense what self-organizing systems do, and gives you a formal framework to talk about them which you can flesh out with your own specific models in specific domains.
    Apustimelogist

    Precisely. It is not a theory of consciousness but a meta-theory of self-organisation. And one large enough to encompass the meta-theories of self-organisation that had arisen already in the different metaphysical settings of physicalism and semantics. So Wolfgang is comparing apples and oranges. :up:
  • We Are Entirely Physical Beings
    That’s why I said “usually.” As I understand it, engineering mechanics is the science of phenomena that can be constructed using the principles of lower levels of organization.T Clark

    Sure. But I am drawing attention to the tricky fact that Nature is organised by both “complicity” and “simplexity” as Stewart and Cohen waggishly put it.

    So there is simplicity that applies across all scales of being. And there is complexity that arises because topological order cuts across the smooth change to create its abrupt phase transitions.

    You get a world being organised by two apparently quite different types of hierarchical cause. Hence the word play of complicity versus simplexity as an attempt at unifying these oppositions - making them both fundamental in their own right as complementary ways to slice up the hierarchical order of Being.
  • Against Cause
    For example, "the dichotomising action of apokrisis" meant nothing to me, until Google revealed some associated concepts that I was already familiar with.Gnomon

    AI gives a nice summary….

    Anaximander used the term apokrisis (separation off) to explain how the world and its components emerged from the apeiron—the boundless, indefinite, and eternal origin of all things. In his cosmology, this process involved the separation of opposites, such as hot and cold or wet and dry, from the undifferentiated primordial substance.

    The process of apokrisis
    A contrast to Thales: Anaximander's teacher, Thales, had proposed that water was the fundamental principle (archē) of all things. Anaximander disagreed, arguing that if any one of the specific elements (like water) were infinite and dominant, it would have destroyed the others long ago due to their opposing qualities.

    The function of the apeiron: To resolve this issue, Anaximander proposed the apeiron as a neutral, limitless, and inexhaustible source. The apeiron is not itself any of the known elements and is therefore capable of giving rise to all of them through an eternal motion without being depleted or overpowered.

    Cosmic differentiation: The apokrisis, or "separating off," is the key mechanism by which the universe comes into being. Anaximander held that an eternal, probably rotary, motion in the apeiron caused the pairs of opposites to separate from one another.

    Formation of the cosmos: This separation led to the formation of the world. For instance, the hot and the cold separated, with a sphere of fire forming around the cold, moist earth and mist. This ball of fire later burst apart to form the heavenly bodies. This dynamic interplay of opposites is regulated by a sense of cosmic justice, with each opposite "paying penalty and retribution to one another for their injustice," according to the "disposition of time"

    You might note that your own AI prompt ends up referencing a lot of my own PF posts. :lol:
  • Models and the test of consciousness
    When you speak of a “bridge mechanism,” you already presuppose that there is a level of description where semantics becomes physics.Wolfgang

    Sure. The molecule must be a message. But it is also a molecule. And there must be a message.

    The issue is not semantics becoming physics. It is semantics regulating physics in a way that builds more of the semantics. An organism that lives as it entropifies in some purposeful and self-controlled fashion.

    But semantic reference, intentionality, or subjective experience are not additional physical phenomena that arise through complexity. They are descriptions that belong to a different epistemic domain.Wolfgang

    Now you have reified things in the way Ryle criticises. Or at least you don’t see that you have jumped a level of semiosis to talk about the socially-constructed level of mind. Animals are conscious in an immediate and embodied fashion. Humans add self-consciousness as they regulate their consciousness through a collective socialised notion of what it is like to have a mind which has semantic reference, intentionality, and subjective experience.

    These kinds of thoughts have no way of forming in the mind of an animal. But they are the jargon that a self-regulating human is expected to employ to regulate their behaviour as part of a higher level social narrative.

    So connecting consciousness to neurobiology is one thing. Connecting self awareness as a narrative habit that can then regulate human behaviour according to socially-evolved norms is another thing.

    This is why neuroscience would not talk much about “consciousness” but about attentional processing and predictive modelling and the other functional aspects appropriate to a neurobiological level account. If you want a model of the stuff you are talking about, call in the social constructionist. The issue is about how words and numbers organise the human mind, not how neurons and genes organise the biological level of mind.

    So when you say that biophysics “has already provided the bridge,” I would say: it has provided the conditions of correlation, not the transition itself. What you call a “bridge” is in truth an interface of perspectives, not a mechanism.Wolfgang

    Read the link I provided and you will see you are off the mark. The issue was how information could even affect entropy flows. The critical finding was that all the various physical forces converge to the same scale at the quasi-classical nanoscale and so can be switched from one form to another at “no cost” by a semantic network of molecular machinery.

    A cell was once thought of as just a bag of autocatalytic chemistry - toss in enzymes at the right moment and watch the metabolism run. But now the model has completely changed to a biosemiotic one.

    Modern philosophy has thus taken on the character of a stage performance. When David Chalmers, with long hair and leather jacket, walks onto a conference stage, the “wow effect” precedes the argument. Add a few enigmatic questions — “How does mind arise from matter?” — and the performance is complete.
    Koch and Friston follow a similar pattern: their theories sound deep precisely because almost no one can truly assess them.
    Wolfgang

    But I was there. I had lunch with Chalmers and Koch the day they launched their projects of “the hard problem” and “the neural correlates of consciousness”. I quizzed them and continued to do so. I agree that each was shallow.

    And likewise, I spent time with Friston when he was just “the statistics guy”. I could see he was in a completely different class of seriousness. Which is why I object to your OP that couldn’t tell the two apart.

    The field of “consciousness studies” attracts every kind of crackpot and chancer. Everyone reasons, well they are “conscious” so already they must be an authority on the subject. It is thus important to know who is doing the serious work.

    The second is subtler: it assumes that mind must arise from matter, when in fact it arises from life.
    If you reduce a physical system, you end up with particles.
    If you reduce a living system, you end up with autocatalytic organization — the self-maintaining network of chemical reactions that became enclosed by a membrane and thus capable of internal coherence.
    That is the true basis of life: the emergence of a causal core within recursive, self-referential processes.
    Wolfgang

    Yes I agree we need to reduce to the point where life begins. But now you illustrate the mistake of talking just in terms of the physics - cells as bags of metabolism - and ignoring the semantics that must somehow be organising the chemistry.

    The biophysical surprise is that the interface that produces this “organic state of being” is so thoroughly mechanical. A story of molecular switches, ratchets, clamps and motors. And that this molecular machinery is “mining” the possibilities of physics at the quantum level while existing in the midst of a battering thermodynamical storm.

    If you are still thinking of bags of autocatalytic chemistry, you are stuck back in the 1980s, when the issue was how to square this new complexity-theory model of self-organising physics with the informational mechanism of the genome - which needed some kind of semantic bridge to harness that type of physical potential. Information needed to do less work if physics could organise itself. But it still had to do some absolutely critical work.

    Hence biosemiosis. The search for how self-information could connect to dissipative structure. And the answer has turned out to be molecular machinery doing quantum biology.
  • Models and the test of consciousness
    Elite batting or return tasks show anticipatory control because neural and sensorimotor delays demand feed-forward strategies. That is perfectly compatible with many control-theoretic explanations (internal models, Smith predictors, model predictive control, dynamical systems) that do not require Bayesian inference or a universal principle of “uncertainty minimization.” From “organisms behave as if anticipating” it does not follow that they literally minimize epistemic uncertainty.Wolfgang

    So you accept the principle of feedforward in general, just not in Friston’s particular case? And yet Friston is generalising the feedforward story as the particular thing he is doing? :chin:

    Sliding between these domains without a strict translation rule is a category error.Wolfgang

    Or instead trying to unify the two perspectives that need to be unified.

    The issue in the 1990s was the question of which paradigm was the best to model neurobiology. Was it dynamical systems theory of some kind, or a computational neural network of some kind? Both seemed important, but it was a little mysterious as to which way to jump as a theorist.

    Friston for example was interested in Scott Kelso’s coordination networks and neuronal transients as representing the strictly physicalist approach - dynamical self organisation. But also in generative AI models like Helmholtz machines as the alternative of an informational approach.

    So a lot of us were torn in this way. Was the brain a fundamentally analog and physical device, or instead better understood as fundamentally digital and computational? Was the brain trafficking in terms of entropy or information?

    I found my answer to this conundrum in biosemiosis - a movement in theoretical biology where hierarchy theorists were just discovering the useful connection between dissipative structure theory and Peircean semiotics.

    Friston found his resolution in Bayesian mechanics - a more austere and mathematical treatment that boiled the connection down to differential equations. But saying essentially the same thing.
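    For anyone wanting the shape of that mathematics, the usual textbook statement of the variational free energy he builds on runs as follows - offered here only as a pointer, not as his full apparatus:

```latex
% Variational free energy F for a model q(s) of hidden states s, given observations o.
% Minimising F tightens the model (the KL term) while bounding surprise, -ln p(o).
F = \mathbb{E}_{q(s)}\big[\ln q(s) - \ln p(o,s)\big]
  = D_{\mathrm{KL}}\big[\,q(s)\,\|\,p(s \mid o)\,\big] - \ln p(o)
```

    Minimising F thus both improves the model and, averaged over time, keeps the organism within its expected, low-surprise states - which is where the thermodynamic reading comes in.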

    So what you see as the bug is what I see as the feature. Finding a way to tie together the physical dynamics and the information processing into the one unified paradigm.

    Of course Friston could be accused of just being too sparse and general in offering a bare algorithm and not a larger metaphysics. And I would agree. But I also see it as still being part of the same important project that I just described.

    For myself, I am concerned with how this semiotic connection is actually made. And that has become its own exciting story with the rapid advances in biophysics, as I outlined in this post some years back - https://thephilosophyforum.com/discussion/comment/679203

    Between physics and semantics there can be no bridge law, only correlation.Wolfgang

    A sweeping statement. As I argue, what is needed is not a law but a bridging mechanism. And that is what biophysics has provided.

    If “organisms minimize uncertainty” is to be an empirical claim rather than a post-hoc description, it must yield a pre-specified, falsifiable prediction that (i) distinguishes the FEP/predictive-coding framework from alternative non-Bayesian control models, (ii) is measurable in the organism itself (not only in our statistical model), and (iii) could in principle fail.Wolfgang

    Yeah. I just see this as missing the point as to what the game is about. It is not about the best model of predictive coding. It is about how to bridge between the control model - implemented as flesh and blood biology - and the entropic world that it, as an actual organism, is meant to be controlling.

    Let’s not forget this is a problem of biology and not computer science. How do you get consciousness out of genes and biochemistry? What does a modelling relation look like in those terms?

    Which is again why the Bayesian Brain approach is an advantage by being generalised to a level beyond the choice of “hardware”.

    The issue is the illegitimate conceptual leap from physical energy flow to semantic uncertainty, and from probabilistic modelling to biological reality. That’s precisely the confusion I am objecting to.Wolfgang

    And I say it is the leap that in the 1990s only a relative few understood was the leap that needed to be made. Friston in particular shaped my view on this.

    I was talking to Chalmers, Block, Baars, Koch and many, many others too. But there was a reason that when Friston’s name was mentioned, serious neuroscientists gave a knowing nod that he was quietly in a different league. The one to watch.
  • We Are Entirely Physical Beings
    Usually you cannot use the principles of one level of organization to predict—construct—phenomena at another level.T Clark

    Fluid dynamics, thermodynamics, and other maths of complexity do a good job of modelling physical processes over all scales. A vortex is a vortex from the level of a Bose-Einstein condensate to a black hole accretion disk.

    Similarly, semiosis covers all scales of life and mind from cell metabolism to human political and economic organisation.

    Levels are of course different in a qualitative sense. But the maths describing the constraints can be basically the same.
  • Models and the test of consciousness
    It claims that organisms minimize uncertainty. This claim cannot be empirically confirmed.Wolfgang

    But this is just a basic principle of cognitive science. There is already abundant evidence for it. The mind is a predictive model. Some of the most direct demonstrations come from sports science. Getting elite athletes to return tennis serves or bat cricket balls. Showing how even the best trained players can’t beat the built in lag of neural processing delays. Anticipation can build up a state of preparation up until a fifth of a second before bat must hit ball in precise fashion. But that last 200 milliseconds is effectively flying blind.

    So Friston had abundant reason to focus on this principle. I myself talked to him about the sports science and other relevant lab results. He was working closely with Geoff Hinton on “Helmholtz machines” and the general theory of generative neural networks. It was already known that this was basic to any theory of “consciousness”. The flesh and blood human nervous system simply could not function in any other way but to be based on an anticipatory logic. It takes 50 milliseconds just to conduct signals from the retina to the brain. So minimising uncertainty is the most uncontroversial of assumptions - for anyone familiar with the real world constraints on human wetware.

    Physical energy and semantic uncertainty belong to entirely different descriptive levels; they have nothing to do with each other.Wolfgang

    Again a naive opinion. Shannon information is a useful formalism exactly because it sets up this inverse relation between noise and signal. And brains are all about the kind of “information processing” that allows an organism to create the free energy that does useful work - the entropy flow that results in homeostatic stability of the entropifying structure. The energy liberated to repair and reproduce the organism in a fashion that preserves its structural integrity.
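    That inverse relation is there already in Shannon’s own channel capacity formula - quoted here just as the standard textbook result, not as anything specific to neurobiology:

```latex
% Shannon-Hartley capacity of a noisy channel with bandwidth B,
% signal power S and noise power N: the more the noise, the less information gets through.
C = B \log_2\!\left(1 + \frac{S}{N}\right) \quad \text{bits per second}
```

    Applied to brains it is an analogy rather than a derivation, of course, but it shows why talk of signal, noise and entropy belong in the same conversation.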

    So some of this is Friston being tongue in cheek. Framing a psychological theory in physicalist terms. But also that is genuinely what is being claimed. That the mind does connect to the physics as a biosemiotic modelling relation. An anticipatory model that liberates free energy so that an organism can homeostatically maintain a state of persistent identity.

    And you will note that Friston doesn’t make the simple claim that consciousness just is this free energy principle. He is explicit that it is a general theory of life and mind. And perhaps even AI. He is not even trying to play the game of Tononi and others who might be mentioned.

    He shared a lab with Tononi under the famous egotist Gerald Edelman. Another amusing topic when it came to the great neural correlates hunt that Koch and Chalmers launched in the mid-1990s. A stunt that gave scientific cover for a whole rash of the kind of non-theories that you rightly deplore.

    I’m simply saying, Friston was never part of that bandwagon. Even though of course he also sat right at its centre as the guru on how properly to calculate the correlations resulting from brain imaging research.
  • Against Cause
    I like the Britney version for being more like how the words would echo in their own confusion. Serious, but not taken seriously. :cool:
  • Against Cause
    This is related to this:

    differentiation emerge from a state of uniformity
    JuanZu

    Reading your post again, it does sound like you are stressing that my view would be based on immanence rather than transcendence in terms of any "first cause" or first symmetry breaking. And that would be right.

    And between Plato and Aristotle, this would be more Aristotle (although the Timaeus tantalises us with its notion of chora, or the "receptacle" that the forms need to become actualised in material being).

    So in this thread, I have argued for the immanent and hylomorphic view of causality. The systems science view. And that is a metaphysics that has even come into vogue – because of gauge symmetry – in recent philosophy as Ontic Structural Realism. It has also come into vogue in fundamental physics as more and more is understood about the maths of topological order – and so how the gauge physics of quantum field theory can be properly generalised to cover, for instance, condensed matter and other emergent material phenomena.

    Likewise, the idea of dimensionality as it might account for a hot Big Bang is leading to a dissipative structure approach – as first pushed in cosmology by David Layzer. So a thermodynamics, but one more appropriate to a cosmos as a self-organising structure of dissipation. One that gets beyond the roadblock of the Second Law and its world built on systems already gone to the equilibrium of a Heat Death. Again a tale of immanence where even "entropy" emerges or evolves in topological fashion.

    So it is all about symmetry and symmetry-breaking. But then also about that as itself an evolving hierarchy of topological order.

    Our current universe is in its very complex – and yet also very simple – state. This seems an odd thing to say, but that itself stresses we are dealing with a logic of dichotomies. Things start to happen when two complementary things are happening at once. This is the thought that breaks the logjam of metaphysics. And has done so ever since Anaximander figured out the logic of the Apeiron split by the dichotomising action of apokrisis.

    Anyway, our current universe has achieved a state that is matter dominated – as the radiation background has already cooled itself to causal irrelevance. The CMB has been redshifted out of sight and may as well be a literal a-causal void of Newtonian fancy. And also, all the anti-matter that did exist has been almost completely annihilated – fizzled to join that CMB because of a charge-parity violation buried in the fine print of gauge symmetry and the Standard Model hierarchy of particles it produced.

    So as I said, SU(3) exists but is balled up into protons and neutrons and doesn't organise the universe at any more general level than being the strong nuclear force making atomic-level matter possible. SU(2) hangs around as the weak force which allows Standard Model particles to rotate their way down its thermal ladder and become the final most massless versions of their type. Pretty much everything has degenerated to up and down quarks, electrons and neutrinos, by now.

    And U(1) runs the show in conjunction with the gravitational degrees of freedom embodied in this simple collection of Dirac particles. Charge is permanently broken as protons and electrons can't be rotated into each other anymore. As quark matter and lepton matter, the temperature of the Universe is too cold and they are now locked into that final lowest level slot on the Standard Model's gauge structure.

    So charge is permanently broken, but then also permanently in a dynamical state of dichotomised balance. Protons and electrons have to be arranged into neutralising atomic forms just to settle things down enough for matter to be locally electrostatically bound and globally gravitationally organised as the stars and galaxies of the great Cosmic Web of dynamically swirling mass.

    It is dances within dances. Symmetries broken at one level that need to be healed to create the symmetry to be broken the next. The story of topological order. First the universe breaks everything down so that all that seems to be left in the void is a residual dust of positive protons and negative electrons. But then that broken charge must restore its lost symmetry however it can. Hence the emergence of what we actually call matter. Atoms of hydrogen, helium and lithium.

    And then because these atoms have mass as well, you get the clumping of clouds that reheats this dusty matter to the point it catches fire and becomes a star. A fusion furnace that is self-organised for longevity as it exists as a yo-yo balance of gravity and its heat. Its own weight collapses it, but its own heat expands it. And so it can hang in space, radiating into the void, neatly balanced at its critical point.

    A lucky fact as fusion starts making heavier elements. And then the exhaustion of the fuel and final collapse of the star is an explosion that loads up every possible slot of the periodic table with all the atoms its symmetries represent. The symmetries determined now by electron shells or orbitals.

    And so it goes on. Dichotomies all the way up and so all the way back down. Simplicity creating complexity. Symmetries get broken and create some new brand of complexity. That in itself is the cause for some new re-simplification which fixes what just got broken. In a world broken by U(1) charge, structure must arise that can neutralise that destabilising fact. Order must be restored. Protons and electrons must get bound into neutral atomic forms.

    If no charge symmetry had been broken, then the Cosmos could have rested its self-organising immanence right there. The only matter to talk of would be dark matter – cold clouds of non-interacting particles just dancing the dance of a gravitating dust. And too cold to even make interesting emergent patterns doing that.

    The Big Bang might be a fireball – for quantum uncertainty reasons, as Planck hot as it was Planck small – but most of that disorganised potential got spent quickly under the doubling~halving expansion and cooling of a Minkowski spacetime metric – the Poincare group organisation of Special Relativity.

    So the spacetime container that emerges from relativistic symmetry was disposing of the heat as fast as it could run. And running at the speed of light – in being at first just a ball of radiation, a chaotic soup of relativistic excitations – that was so fast it would have gone to zero about now. Or as the CMB, it has now fallen to 2.7 degrees K, and will halve that temperature once again in another 14 billion years.

    (Or in fact 10 billion years, as dark energy has shown up to hurry things along. A further topological complication to add to the long list of things that are immanently emergent due to symmetry breakings we still need to figure out.)

    One final point to toss in – as I was trying to give a feeling for why gauge symmetry-breaking is the cause of the "complex simplicity" which is our current "atoms in a void" universe – is that it should be noted how the symmetries are (dichotomously) divided between the real number symmetries of relativity and the complex number symmetries of quantum theory.

    The U in the SU(3)xSU(2)xU(1) formula means a unitary matrix. A matrix description in which rotation and translation can be entangled. Or indeed, re-entangled. The view of reality as it was before rotation and translation got broken and so what it looks like again when that symmetry is restored.

    Likewise the key symmetry at the heart of relativity is the Lorentz group SO(3,1). An orthogonal matrix group – orthogonal meaning it is fully broken at a dimensional level and so described in disentangled real number values. No complex number mixing. Just the three orthogonal rotations of the 3D spin group SO(3). But with the addition of a time~energy conservation "direction" that is created by the speed of light as the universal limit on interactions with directions. This then makes it the 4D spacetime group of SO(3,1).

    The Poincare group is then this Lorentz group of relativistic “boosts” added to the standard symmetries of flat space and time to construct the 10D Poincare group – not literally ten dimensional, but that is how many degrees of freedom you have to package together in a self-constraining fashion. You have the 6 Euclidean or flat space degrees of freedom (three translations, three rotations), plus time translation and the three boosts – the boosts being what demand the contractions and dilations that make the resulting metric properly Minkowski and speed of light restricted.
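    The standard bookkeeping, for anyone who wants to check the arithmetic (the same ten generators, just grouped as Lorentz plus translations):

```latex
% Degrees of freedom (generators) of the Poincare group:
% the Lorentz group SO(3,1) plus the four spacetime translations.
\underbrace{3}_{\text{rotations}} \;+\; \underbrace{3}_{\text{boosts}} \;+\; \underbrace{4}_{\text{translations of } t,\,x,\,y,\,z} \;=\; 10
```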

    Phew. So gauge symmetry comes in forms that are real number to speak to the globally-broken symmetry of spacetime as the dimensional container. And then also the complex number or unitary gauge symmetries to describe what remains as the open freedoms to be expressed at every point of this spacetime container. Just constraining the metric to a collection of points itself sets up an “inner space” of intrinsic quantum spin that has now been set free to add its topological order to the whole mix. You get all the kinds of local excitations or particles that become possible under the hierarchy of SU(3)xSU(2)xU(1).

    At least once inflation is over and has dumped a load of reheated vector particles into the vacuum. Which also must have the different thing of a scalar Higgs field – although that has to be SU(2) to later mix quantumly with the electroweak force and thus break it to become the weak force plus electromagnetism.

    Then dark matter could be anything. Black holes, a condensed matter effect that results in "2D" anyon particles, all sorts of exotic stuff – which would somehow have to also be explained in Platonic structural terms. There are always more symmetry groups to be pulled out of the maths bag.

    And dark energy! If it exists as some kind of quantum uncertainty effect within dimensionality itself and isn't just an optical effect of viewing the universe from some locally underdense "gravitational well".

    But immanence rules. And it rules as possibility can't help but stumble its way into orderly patterns. And it doesn't do this just once. Order always creates its own new possibilities which then have to stumble into new patterns.

    Symmetry thinking is then just the way to discover this kind of immanent pattern making. The invariant structures that can't help but emerge when everything is trying to happen all together and all at once.

    Shake up possibility and it settles down into whatever conformity that is its stable equilibrium state. A state where differences can no longer make a difference as the same old pattern just keeps re-emerging. The pattern that was immanent and then emerged dynamically to become something that looked permanent and even fore-destined.

    But every such pattern then becomes its own state of free possibility. A higher level symmetry to be broken at a higher level – shaken and shaken until it too settles into its own dynamical balance.

    A Cosmos is more complex from the start as it has to be symmetry-breaking in two complementary directions at once. It is itself already a symmetry-breaking in progress. A Big Bang that is a mix of its relativistic metric doing the expanding and a hot quantum content doing the other thing of a cooling. A plasma fireball that is turning itself from a radiation soup into a matter dust via a series of topological phase transitions as the container expands and the contents spread out and dilute.

    So good job we have both the real number matrices and the complex number matrices to keep track of both sides of this symmetry equation. Poincare invariance married to gauge invariance by little particle creating tricks such as that SU(2) is the double cover of SO(3). A Minkowski metric point can harbour the further inner complexity of a half-spin fermion that has to rotate 720 degrees to look like a 360 degree revolution. The mirror reflection trick that allows fermions to come as the creation~annihilation pairs of matter and antimatter.

    The symmetry that gets broken to arrive at a world made of just matter and a void of long-spent radiation. The relic antimatter gone to join all dark matter and other stuff like neutrinos that has become causally irrelevant in this ruthless game of cosmic Darwinism. Or self-organising immanence.

    And so it goes on. And on. Symmetry-breaking out to the furthest horizon. :wink:
  • Against Cause
    It is against to the thesis that matter is a passive receptacle for external and transcendent forms (first cause), while symmetry breaks give matter (to which they are immanent) the ability to generate forms without external intervention.JuanZu

    I’m not sure what you mean there. But the fact that fundamental physics is rooted in the maths of symmetry is rather Platonic and hylomorphic. It is a pointer to a strong version of structuralism.

    See SEP on https://plato.stanford.edu/entries/structural-realism/

    So in talking about the prime cause of Being, the shape of Nature gets imposed by the constraints that it can’t help but generate in its Becoming, to use the Aristotelean model.

    If we start with the idea of pure unformed potential - fluctuations without directions; an infinity of dimensions without cohesion - then what kind of dimensionality could begin to cohere out of that fundamental vagueness?

    Well symmetry seems to tell us that there is only one dimensionality that is the possible outcome of any such striving to become organised in some exact and balanced fashion. Only 3D could be where actuality begins as it is only in three dimensions that the number of rotational degrees of freedom matches the number of translational degrees of freedom. As a metric, only 3D has the property that then produces physics as we know it. A dimensionality with the basic Newtonian principles of the conservation of momentum as both translations and rotations, and so the holism of the Galilean group of symmetries.

    Also only 3D has a doubling-halving story built into it in the fashion which gives gravity and force their inverse square law and so sets up a metric that can expand geometrically while also diluting at the same rate. You can have a metric driven by the “explosion” of its hot content, but that explosion then lasts forever as the hot content is getting cooled by that expansion and so the whole thing takes until the end of time to cool to zero and come to a complete stop.
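    The elementary counting behind both claims, just to make the arithmetic explicit:

```latex
% Translations vs independent rotation planes in n spatial dimensions:
n \quad \text{and} \quad \binom{n}{2} = \frac{n(n-1)}{2}, \qquad
n = \frac{n(n-1)}{2} \iff n = 3 \ (\text{for } n \geq 1).
% And spreading a conserved flux over a sphere's surface in 3D gives the inverse square law:
F \;\propto\; \frac{1}{4\pi r^{2}}.
```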

    So you have a story where there is symmetry and it’s breaking that starts right from some ultimate state of vague potential. A symmetry where everything was possible as nothing had yet started to happen in any cohesive sense. An infinity of dimensions that was both an everythingness yet also less than nothing.

    Then all this potential could be poured into the generation of something. It could become broken by the fact that 3D was a stabilising solution. A Platonic form. If any kind of physical geometry was going to exist, it had to be this one.

    Of course you then have to explain the other aspects of the Big Bang. Like how this 3D receptacle started Planck small and so Planck hot because of quantum mechanics and its gauge symmetries. And how it was also the start of time in any proper sense as the speed of light got added to give the metric its 4D relativistic symmetry - its Poincare group of symmetries.

    It all gets quickly complex in topological fashion.

    But this picture says we have a metaphysics where anything was possible, and yet that everythingness was immediately being constrained by mathematical principles. To become actualised, it had to strike on a structure that made the most geometric sense. It had to be a hot 4D spacetime speck that would instantly begin to expand and cool. A speck whose dimensionality was defined in unit 1 terms by its three critical constants of c, G and h.

    That is, the strength of gravity to define the flatness of its translations, the quanta of action to define its fundamental unit of particle spin or intrinsic rotation, and the speed of light as the unifying rate at which the relativistic metric and its quantum contents could thermally decohere and start becoming a realm of material particles.

    The gauge constraints could kick in and start to fashion raw fluctuations into the vectors and spinors that are the zoo of Standard Model “matter”. The shapes of the excitations we know as the electrons, protons, neutrons and photons of a world organised under the ultimate gauge simplicity of U(1) electromagnetism and the messy hierarchy of mass terms added to particle fields by the Higgs mechanism. The further symmetry breaking that turned on gravity by breaking 4D spacetime into an effectively 3D story of an inertial rest frame of co-moving matter particles.

    So in a nutshell, the Universe exists in a complex fashion by striving to become as simple and balanced as possible in symmetry terms.

    It only arrived at its current state by eliminating infinite possibility and boiling itself down to what remained as the simplest possible state with its now locked-in order. It eventually arrived at its destination - a dust of gravitating and electrically neutral atoms in a void of uniformly scattered 2.7 degree K photons. And even that stage will pass with the ultimate concrete simplicity of a Heat Death. Just a last baseline sizzle of absolute zero photons radiated by the cosmic event horizon. The comoving spacetime metric finally reaching its doubling-halving halt.
  • Against Cause
    This is not talking about Symmetry in the traditional mirror-image sense.Gnomon

    But that is exactly what gauge symmetry does. It explains why particles are created and annihilated in matter-antimatter pairs. You break a chiral or mirror symmetry into its two halves and then these annihilate in a burst of energy when they come back together again.

    That’s why fermions are spin-1/2 and charge can exist. It is why anything exists to start making the Universe a complex place of material structure.
  • We Are Entirely Physical Beings
    For example, the genetic code (A–T–G–C) is just chemistry, but evolution selected the combinations that could store and replicate information.Copernicus

    So physics did not organise this new situation; evolution did. Thus your simple complexity thesis has a sudden hole in it.

    The key difference lies in the informational architecture, not the physics underneath it. So life and mind aren’t exceptions to physical law — they’re extensions of it. The universe, in a way, learning how to encode itself.Copernicus

    Sure. But this isn’t just more physics. And you are now needing to invoke informational architecture rather than entropic architecture. Semiotic complexity rather than merely physical complexity. Evolution rather than emergence.

    If biology starts at the point where “a molecule can be a message,” then that’s the threshold where matter becomes reflexive — where it starts encoding its own persistence.Copernicus

    Or where organisms first arise as not a new state of matter but a novel form of organisation.

    In short, codes aren’t supernatural — they’re emergent designs within physics.Copernicus

    Yes, there is nothing supernatural here. But it is wrong to minimise things by saying life and mind are merely physically emergent. If you don’t deal with what changes at the level of molecular biology then you really start getting into a mess by the time you are dealing with neurobiology. A small metaphysical misstep turns into a hugely handwaving one.

    To science, this matters. Well it matters to biologists and neuroscientists who like to feel they are getting to tackle big questions too. :smile:
  • We Are Entirely Physical Beings
    Can you ask in simpler terms exactly what your objection was?Copernicus

    Probably not.

    But what is it, in terms of a simple continuum of physical complexity, that lets a DNA base carry a semiotic meaning? Try to make that connection. How does a scrap of chemistry make the leap to being a scrap of information?

    In what sense is that just more physics and not something now more complex than just physics? In what sense is it - as you claim - physics organising itself?
  • We Are Entirely Physical Beings
    Human cognition exists along a continuum of increasing physical complexity:

    • Sentience – the ability to feel or experience; found widely among animals with nervous systems.
    • Sapience – higher reasoning, foresight, and abstraction; a hallmark of human cortical evolution.
    • Consciousness – awareness of the environment and oneself; emerging from multi-level neural feedback loops.
    • Conscience – moral awareness; the social and reflective layer of consciousness shaped by empathy and memory.

    Each is built upon physical substrates—neurons, synapses, chemical gradients—yet each transcends its parts through emergent organization.
    Copernicus

    Life and mind depend on the emergence of codes. The information processing possibilities of genes, neurons, words and numbers. So how do codes “just emerge” from more complex physics?

    Biology starts where a molecule can be a message. Is that simply “more physics”? A property of matter that simply follows from a continuing continuum of complexity?

    Or is it something a little more novel?
  • Hume and legitimate beliefs
    Caught red-handed!
  • Hume and legitimate beliefs
    Aha! AI now writing your posts. That will solve one of your problems. :lol:

    Now you just need to learn to write honest prompts.

    Anyways, as AI replies….

    Friston's Bayesian mechanics learns from experience by using prediction errors to update its internal models of the world, a core component of the Bayesian Brain Theory and Active Inference. Incoming sensory data is compared to the system's predictions, and any discrepancies drive changes to the probabilistic beliefs and generative models, allowing the system to adapt to its environment and improve its simulations of reality.

    These prediction errors are crucial because they are used to update the internal models and beliefs. This process of updating probabilistic models based on new sensory evidence is the core of Bayesian inference. Through this ongoing process of prediction, comparison, and updating, the organism constructs and refines its "reality model," which enables adaptive behavior in a complex environment.

    So you forgot the power of recursion.

    And also the power of attention. The ability to shift from responding habitually for as long as the future is resembling the past, to paying attention the moment it no longer does.

    Friston's Bayesian brain model explains attentional processes by proposing that attention acts to estimate and manage uncertainty during hierarchical inference, thereby controlling the flow of sensory information. By dynamically adjusting the "precision" of prediction errors (how much weight is given to sensory input vs. prior beliefs), the brain can focus processing resources on the most informative parts of the sensory scene, a process which naturally accounts for phenomena like salience and selection

    So priors can be suppressed and new ones rapidly prototyped by dynamically fixating on some narrowed part of the information space. Letting that spotlit part of the world now be what constrains the system’s abductive reasoning.
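    A minimal sketch of what that precision-weighting amounts to, assuming nothing fancier than a single Gaussian belief nudged by prediction errors - the function and variable names are purely illustrative, not Friston's own notation:

```python
# Minimal sketch: precision-weighted prediction-error updating of a single belief.
# All names and numbers are illustrative; real predictive-coding models are hierarchical.

def update_belief(mu, sensory_input, prior_precision, sensory_precision, lr=1.0):
    """Nudge the belief mu towards the input, weighted by relative precision."""
    prediction_error = sensory_input - mu
    # High sensory precision means "trust the data"; high prior precision means
    # "trust the habit" and largely explain the surprise away.
    gain = sensory_precision / (sensory_precision + prior_precision)
    return mu + lr * gain * prediction_error


belief = 0.0
for observation in [0.1, 0.0, 0.2, 5.0, 5.1, 4.9]:  # the world changes abruptly at the 4th sample
    belief = update_belief(belief, observation,
                           prior_precision=4.0,      # turning this up suppresses updating (habit)
                           sensory_precision=1.0)    # turning this up is "attention" to the input
    print(round(belief, 2))
```

    Raising the sensory precision on some selected slice of the input space is, in this picture, what “attention” does: it lets new evidence rapidly overwrite the prior instead of being explained away.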

    Brains are very clever in their design. Might be worth focusing on how they actually function for a change.
  • Against Cause
    Brilliant! Hysterical!

    Not Death Valley Girls, but perhaps valley girls. And the bombastic rap treatment works.

    I’m now seeing Britney Spears in front of a chalk board of equations. An audience of enthusiastic wizened professors thumping the benches.

    Or is that too literal?
  • Against Cause
    Is this psych rock?:PoeticUniverse

    It's pretty shit. :razz:

    I was thinking more a pastiche of the Death Valley Girls. But only as this immediately sprang to mind. I kind of dread the AI take on any decent music.

    However, you go at it your way. An interesting project.

    We will surely in future not have photo albums but instead instantly generated rock operas to tell the tale of our lives. Chatbots will supplant PF in another year. Just so much to look forward to.

  • Hume and legitimate beliefs
    You don't actually say anything here about why I'm wrong.Banno

    I already pointed out the issue of priors. Others have noted your failure to cash out semantics in stabilising ontic commitments.

    You can't be claiming that Bayesian calculus is not about belief. So, what?Banno

    It is about the openness of beliefs closed under ontic commitment. Best inference constrained by the reality to be encountered at its end.

    Your arguments are so sloppy. You point to a SEP page and say “see!”. You mumble to your class about maybe having to start a new thread on that and then womble off to lunch. Apparently forget that immediately and wander back in chewing on a sandwich.

    But anyways. Bayesian reasoning + dissipative structure theory = a biosemiotic model of life and mind. Friston’s Bayesian mechanics.

    That is what Bayesianism closed under thermodynamics looks like. A world that sets the weights on a mind’s priors in useful fashion.

    You should really try to catch up with where epistemology is at.
  • Against Cause
    Any choice of music genre? Does AI do psych rock?
  • Hume and legitimate beliefs
    Bayesian calculus deals with our beliefs, such that given some group of beliefs we can calculate their consistency, and put bets on which ones look good. But it doesn't guarantee truth. So what it provides is rational confidence, not metaphysical certainty. It's in line with Hume's scepticism.Banno

    I think you neatly demonstrate the pitfalls of relying on words when doing serious metaphysics - as in here, trying to say something useful about epistemic method without merely restating the bleeding obvious.

    You’ve just twisted a lot of words to meet your sociological ends. And I’m sure you feel that is a watertight verbal construction. There is some chain of entailment that was constructed to close your linguistic sketch.

    But oh what a leaky boat. It never left the safe harbour of self-centred idealism. Which would be the only reason it felt like it floats.
  • Hume and legitimate beliefs
    I could have written 'invariances' instead of "laws of nature". Do you think it is reasonable to say that if the past constrains the future it follows that nature's invariances do not suddenly or randomly alter?Janus

    If you are asking me, then invariance is the globally constraining symmetry of the Cosmos. And so what we mean by talking about Nature’s laws.

    And there is evolution of these laws as constraints produce freedoms. And those freedoms can reorganise the general state of those constraints, so producing a new state of cosmic order.

    Furthermore, the change is in fact often abrupt. As in the phase transitions we see when the temperature or pressure drops, causing steam to turn to water and then ice. A change in topological state from gas to liquid to solid.

    The Big Bang was a story of at least five or six such major reorganisations in just its first billionth of a second. Events like inflation, a reheating dump, the Higgs crack, the CP violation phase. All added constraints to the previous physics to lead us towards the world as we now know it.

    It took three minutes to get to the state of a hot atomic plasma bathed in electromagnetic radiation. Then 380,000 years for the next major phase change - the cooling of that plasma, that radiation soup, to the point that it broke into a cosmic microwave background filled with the condensed crud of a gravitating dust of atomic matter. Electromagnetism was neutralised and so gravity was able to start sweeping the atomic dust into stars and galaxies.

    So the laws of nature did evolve through many stages. There was the one prevailing evolutionary logic. The Big Bang was a fireball cooling itself by expanding. But that same cooling and expanding exposed new ways for a suitably cooled and spaced-out state of material being to become thermally organised. As a dissipative structure, it could reorganise itself into richer forms with their ever more detailed or localised laws.

    Luckily for the existence of us. As otherwise we wouldn’t be here.

    And get to do things persistently in that rich state. At least until the current high-water mark of cosmic complexity starts to eat even itself and erode back to the generalised nothingness of a Heat Death void. The ultimate inversion of the Hot Big Bang where it all started.

    So the story of the laws of nature is that they started ultimately simple, became interestingly complex, and then eventually are going to degenerate back to the ultimately simple. An ultimate simplicity that is kind of the same, just the dichotomous or inverted form of that initial symmetry. The antithesis to the thesis.

    In the end, nothing will be left. But it will also be so eternally everywhere. And its rules will be as simple as possible, as by then possibility will likewise have become as simple as it can get.
  • Against Cause
    That’s pretty impressive if you just whipped it up. :grin:
  • Against Cause
    Since I got into philosophy only after retirement from the practical world, I have skipped most of the post-Platonic academic argumentation.Gnomon

    But the unity of opposites is pre-Socratic.

    the smallest units of matter are not physical objects in the ordinary sense, they are formsGnomon

    And not any old forms but gauge symmetries. Special relativity zeroes the spacetime metric to a set of local points under the invariance of the Poincare group of symmetries. But then the "inside" of these zeroed spacetime points can contain the something further of their intrinsic spin symmetries. Gauge structures such as the trio of SU(3), SU(2) and U(1) that generate the Standard Model of particle physics once the vacuum cools sufficiently for such structure to crystallise out and become a thing. A flood of excitations of that form.

    So quantum field theory gives you your Platonic structure. And that theory is now mathematically precise. And experimentally verified.

    You can look to Aristotle and Plato for the basic metaphysics. They were trying to sum up what Greek philosophy had already spent three centuries discussing. The logic of the Unity of Opposites.

    Aristotle and Plato were applying that basic dialectical approach to reasoning by trying to boil it down to the fundamental dichotomy of form and matter. Or what in modern terms we could call information and entropy. Global constraints and local degrees of freedom.

    This provided the holistic paradigm of two complementary notions of being in interaction. Mathematical structure in interaction with material fluctuation. Or in quantum terms, a sea of fluctuation shaped by the emergence of constraining order. The kind of global order made precise by the maths of symmetry and symmetry-breaking.

    The Universe as we now know it. A dimensionality that has the Poincare group structure of a 4D spacetime, and which is then filled with the energy of its gauge group local excitations. The bosons and fermions that are generated by SU(3), SU(2) and U(1) as the local spin resonances which an excited vacuum can't help but ring with.

    If you fashion a bell, that makes a cavity that has to then ring with certain frequencies. And that is basically the Universe. A 4D spacetime cavity that echoes with its own violent shaking. SU(3), SU(2) and U(1) are the frequencies at which this global whole can resonate in terms of its mix of locally particular excitations. The Big Bang is the hard strike that starts as a shattering confusion of resonances and then subsides to the low fading hum that we can hear as the state of our 2.7 K Universe today.

    SU(3) effectively disappears from sight when the quarks get rolled up into protons – a U(1) state of electric charge. SU(2) also gets broken into the weak force as a short-range decay story and electromagnetism as the U(1) photon, which is what is finally left as the last weak, but all-pervasive, hum of the void.
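
    For what it's worth, the usual textbook shorthand for that cascade (my gloss, in the standard hypercharge convention) is:

    $$
    SU(3)_C \times SU(2)_L \times U(1)_Y \;\xrightarrow{\ \text{Higgs}\ }\; SU(3)_C \times U(1)_{\mathrm{EM}},
    \qquad Q = T_3 + \tfrac{Y}{2}
    $$

    with $SU(3)_C$ then confining its quarks into colour-neutral hadrons, so that the long-range $U(1)_{\mathrm{EM}}$ photon is what is left still ringing.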

    That is a considerable simplification of particle physics. But you get the basic idea.

    There is no real matter involved. Just a cascade of echoing frequencies that lose their energy and become reduced to ever more simplified forms. Quantum excitations shaped by their spacetime container and winding up as simple as possible.

    The Heat Death of the Big Bang arrives when all that remains is the eternalised low hum of "black body" photons with a wavelength that matches the scale of the visible universe. Or a frequency that has a temperature as close to absolute zero as it can get – given quantum uncertainty and the holographic principle.
  • Models and the test of consciousness
    But you think consciousness is real.bert1

    I hear people talking about it all the time. Just not very meaningfully. And certainly not at all scientifically.
  • Against Cause
    I apologize for harping on the notion of Holism & Original Cause, but it's essential to my personal philosophical worldview.Gnomon

    And so my reply was precisely about that. The holistic view of a first cause. The unit 1 story of the first symmetry-breaking. The unit 1 story of a unity of opposites.
  • Models and the test of consciousness
    when a system does such-and-such, you declare it to be conscious.bert1

    I thought I was arguing against using a reifying term such as consciousness. I thought I was saying this is where folk already went off track. The call for a “theory of consciousness” is already turning phenomenology into the hunt for a substrate.

    So I can recognise life and mind as processes to be explained. And biosemiosis as the best general physicalist account of that.

    I would endorse Friston in particular for developing a model along those lines.

    Others like Varela, Dehaene and Baars are really just talking about attentional processing in contrast to habit processing. And more in terms of the description of a functional anatomy than a general functional logic. Which is why they say little about the “hard problem”.

    But you are welcome to keep popping up with your strawman attack that never goes anywhere. :up:
  • Against Cause
    Again, you are talking about practical (useful) Science, instead of theoretical (reasonable) Philosophy. Except that the notion of "constants" is a generalization & abstraction from specific & concrete instances of physical changes. Likewise, the notions of Unity and Absolute are never observed in the real world, but inferred from multiple instances.Gnomon

    So it seems I am both not talking about philosophy yet talking about philosophy in your book?

    Ought one consider where logic sits in all this at this point?
  • Hume and legitimate beliefs
    To repeat, I wouldn't put it that way but instead "the future will most likely resemble the past, because the future has, as far as we know, always resembled the past".Janus

    I prefer to say that the past constrains the future. It has already eliminated a huge range of possibilities. That is what makes the future so predictable. But it also leaves it full of contingency.

    If I ate the cake this morning, I can be sure I won't be eating it this evening. Nor will anyone else. But if I didn't do so, I could eat it at any future moment. Unless someone else beats me to it. That sort of thing.
  • Hume and legitimate beliefs
    Such mock humility. :up: