Comments

  • Peirce's categories: what's the big deal?
    All thought is in signs, so all mind is semiosis.aletheist

    Yes. But then so is all matter ... in the pansemiotic view.

    That is, if we want the ultimate category that underpins reality, it is a logic of relations. A logic that is triadic and can thus account for its own rationalising existence, its own universality.

    That was a big trick to pull off.

    Merely affirming traditional idealist concepts of spirit and mind would have been no achievement at all. There would be nothing new worth discussing.

    According to Peirce's own testimony, he is above all else a synechist.aletheist

    He proclaimed himself "above all" all kinds of things. Above all, a Spinozian. Above all, an anti-Hegelian. In his scattered writings in which he explored his own twists and turns of thought, there are any number of hostages to fortune for those who want to quote mine.

    I am not concerned with the "true Peirce". He is allowed his uncertainties and contradictions. I take the systems thinking perspective and point to where his thought reaches mathematical-strength conception, and so says something robust and useful.

    Synechism and tychism form a dichotomy, joined at the hip as each other's "other". So to defend either, you must bring along both.

    My OP spelt out the confusion that can then arise - the confusion of the triadic category which encapsulates both the diachronic and synchronic view of things.

    From a developmental perspective, Firstness or Vagueness must be the starting point - the blooming, buzzing confusion of Tychism. But then - pace Aristotle and Hegel - Synechism is the finality that draws possibility into its actuality by dint of necessity. So Synechism is just as primary once you take the full holistic view. It is just "present" in terms of its marked absence at the beginning.
  • Peirce's categories: what's the big deal?
    Sure, anything cognitive is basically 3ns. However, an important principle to keep in mind is that the categories are never really isolated from each other in our experience, only as artifacts of analysis that result from a kind of abstraction. We prescind 2ns from 3ns, and we prescind 1ns from both 2ns and 3ns.aletheist

    That treats the categories as an accurate model of human perception. The brain models its world by imposing cognitive structure top down. Even to see that an object stands still, we have to have eyes that dance and impose “our” motion on the scene.

    So as a model of epistemology, this is right. And it is also anti-phenomenology in method. Bare qualities exist for us only within cognitive frames. Thus they don’t actually “exist”. Psychologically, it is signs all the way down for our minds.

    But there is a reason why a logic of vagueness was a great ambition of Peirce. There is a reason why tychism was ontically fundamental. Firstness is primordial - the start of “thingness” - when it comes to his ontology.
  • Peirce's categories: what's the big deal?
    After all, Peirce considered phenomenology to be the first positive science, on which all the others depend for principles, and explicitly affirmed (objective) idealism in the sense that the psychical law is primordial while the physical law is derived and special, such that matter is a peculiar sort of mind--mere specialized and partially deadened mind.aletheist

    It makes sense to start with an analysis of the structure of experience as an epistemic necessity. Kant shows that. It becomes nonsense to then claim mind is ontologically primordial.

    Peirce certainly makes idealist sounding statements. Yet he is, in the end, the pragmatist and so all about the epistemology of how we mentally model the ontic structure of the world.

    We have to then find ourselves in this world as real creatures. So in the end, the purely logical category - everything is semiosis - has to be able to transcend other metaphysical statements like everything is matter, or everything is mind.
  • Peirce's categories: what's the big deal?
    Of course it is a metaphysic, and idealist to boot.Wayfarer

    So the categories are an attempt to complete the statement: Everything is …

    How do you get from “Everything is mind”, to “Everything is semiosis”?
  • Peirce's categories: what's the big deal?
    Strictly speaking, his categories are phenomenological rather than metaphysical, the irreducible elements of whatever is or could be present to any mind in any way.aletheist

    Surely they have to transcend the phenomenal to avoid a mere reduction to idealism. This is where the pragmatism comes in. Experience can be organised in ways that move beyond “concrete” qualities. We can imagine the world in mathematical/logical terms and thus measure it in terms of numbers.

    That surely is the point of the pragmatic method. Moving past the givenness of the “phenomenal”.
  • Peirce's categories: what's the big deal?
    "Firstness is that mode of being of that which is such as it is, positively and without reference to anything else."

    "Secondness is that mode of being of that which is such as it is, with respect to a second but regardless of any third."

    "Thirdness is that mode of being of that which is such as it is, in bringing a second and third into relation with each other."
    Manuel

    This emphasises the triad as a series of relations of expanding dimensionality. It is a reductionist version in the spirit of Euclidean geometry or Newton's laws of motion.

    Take a 0D point, let it trace a 1D line, then expand that to a 2D plane, etc. Or take the zeroth derivative of motion, move on to the first and second ... third, fourth, and as many more as you like.

    So it misses the point in my view. The holistic argument starts from the unlimitedness of an infinity of dimensionality and finds that the fewest dimensions that describe a reality is three.

    Firstness as vagueness is an everythingness that can only become the kind of firstness described here - the bare particular of some spontaneous and contextless action - when retrospectively seen as such from a Thirdness that finds this Firstness in its betrayal as a Secondness.

    It's a hard point to get across, but - as in a particle detector - you only know that there is a bare particular, an atomic fact, when an interaction has been able to happen within a suitably arranged environmental context. So this description - even by Peirce - gets you off on quite the wrong foot in my opinion.

    He says that "typical ideas of firstness are qualities of feeling, or mere appearances. That scarlet... the quality itself ..." he also speaks about the idea of "hardness" being an example of firstness.Manuel

    It gets worse. :grin:

    Sure, Peirce tried to take this phenomenological tack. And I like Peirce because he argues from the psychology and epistemology to the cosmology and ontology. This builds in the bridge between mind and world while the mind builds its model of that world.

    But that smart approach leads eventually to the logical generality of semiosis - the sign relation, or pansemiotic modelling relation, by which even the Cosmos can organise itself into being. That's the big idea that Peirce gets to.

    However talking about qualia is another misstep. It is like talking about other atomic facts such as geometric points or physical events. It starts things off in a standard reductionist fashion where something is already determinate, or concretely in stable existence. Then everything else becomes an upward act of construction - one thing added on top of the last thing.

    You appear to apply these categories as widely as possible, which was likely his intent.Manuel

    This is true. Semiosis is a logic of everything. In Nature, everything is a system - a form of dissipative structure. So it doesn't matter whether you are talking physics or neurology or economics. Everything that can develop and persist in the world, does so because it expresses the same general triadic structure.

    I've always thought using an empirical example would be extremely helpful, as in, speaking about a red ball in a game of dodgeball so I can better visualize the categories:

    For instance seeing the red of a ball is an instance of firstness, me reacting to someone throwing the ball at me and feeling the rubber of the ball would be secondness and me thinking about whom to hit in this game would be thirdness.
    Manuel

    But in this game, what does the redness matter? What could it be a sign of? Why mention it?

    Maybe the rule of the game is that only being hit by a red ball counts.

    So this is an example of semiosis in the minds of some group who have constructed a Thirdness that is the habits that define the game. The interpretant. And the fact that you got hit by the red ball is the Firstness of the object. Then the Secondness of accepting that this means you are "out" - the fact you curse and step away from the game - is the sign standing for the relation between the Third and the First.

    Now it might be clearer. The game is the context that makes it possible for the event of being hit by a red ball to count as some particular event. It is not a random and meaningless accident. It is the game's most meaningful fact. The game needs to exist to define it as a fact. And - pragmatically - you have to show you share the interpretation of the event in the same fashion by leaving the game.

    But all this is an example of semiosis as epistemology - a theory of meaning within a system of communication, or better yet, of a communal modelling of the world.

    As psychology, it is already a brilliant analysis.

    But where Peirce really wins is in generalising semiosis to the degree it is mathematically general enough to lay a formal foundation for logic, and also ... perhaps, to a degree ... act as a general story on ontology. It can ground an account of the Universe as a self-organised rational structure as well.
  • Physical Constants & Geometry
    Yes, and it seems as though gravity alone can induce ordered states, such as the supermassive black-hole at the center of our galaxy.Shawn

    Remember that cosmology relies on the kinetic explosion of the Big Bang being exactly balanced by the gravitational potential of all the crud being flung apart. So the Big Bang actually creates as much negentropy as entropy in the bigger picture.

    Black holes are local collapses that use up that negentropy. But there is also the negentropic potential of the cosmological event horizons that would be released if the universe reversed and headed into a Big Crunch.

    The notions of order and disorder applied to the universe as a whole get tricky. You can take the view that in the end no entropy is ever created. The heat of the Big Bang is a source of energy or negentropy that is simply swapped for an equivalent amount of spatiotemporal order.

    This is another example of what I was saying about reality being a product of its own symmetry breaking division.

    Order and disorder are produced in equal amounts. So the Big Bang doesn’t require an energy source to make it happen.

    Of course there are open questions here - like whether inflation was a thing, or whether dark energy is an external acceleration force.
  • Physical Constants & Geometry
    That happens usually when you attack the orthodoxy.LaRochelle

    If you want to attack the foundations of physics, get published in a major journal. A physics forum is not going to be the place. It is for explaining technicalities and keeping abreast of interesting news.
  • Physical Constants & Geometry
    The scientific method does not exist.LaRochelle

    It certainly does. In a pragmatic sense.

    you only have a superficial and a naive layman's knowledge of the model we are talking about here.LaRochelle

    You're the one that gets banned from physics forums, not me.

    And this is a philosophy forum. The ability to separate epistemology from ontology is how you make the step up from lay naivety, whether you are talking particle physics or anything else.
  • Physical Constants & Geometry
    OK. So what's your Causal Model or God Metaphor?Gnomon

    I've talked about it in every post ever. From Anaximander to Peirce to modern systems science. The immanence of self-organisation. The triadic process of a vagueness, a dichotomisation, then the synthesis which is global laws that constrain local freedoms.

    If folk need to squeeze the God word into this causal metaphysics, they talk about some kind of divine immanence or pantheism.

    I don't feel that need.

    So, my personal model is the relatively simple algorithm of Hegelian dialectic : the world progresses toward the future along a zig-zag path of positive & negative causes, which tend to sum to a Middle Way (Buddha) or Moderation (Aristotle).Gnomon

    Hegel did seem to break it down into a series of stages. But read more carefully and he becomes more Peircean – seeing the dialectic as a hierarchy of increasingly recursive involution. Or what you might call, enformation.

    Though also, Peirce believed Schelling and Duns Scotus to be still closer to the mark. And for reasons that escape me, perhaps Spinoza most of all.

    Anyway, Fichte so corrupted the popular understanding of Hegel that I don't usually find it useful to reference it.

    Careful!! I'm not sure what you are saying here, but it sounds like putdowneryGnomon

    It was a joke. Perhaps at his expense, but he ought to appreciate it as he himself has confessed to all his bannings everywhere else, and yet here on PF, he can keep spamming us with sockpuppet accounts to rejoin the forum multiple times a day.

    I'm not even sure that this was his first incarnation of what must by now be 50 or so.

    And despite that, for quite a few of us I think, he fits the spirit of the place. Another curiosity like shawn or banno or SX.

    Most folk don't even notice it is him again until some clue like the Higgs mechanism comes up. The crackpot energy on PF is generally quite high.
  • Physical Constants & Geometry
    Modern physics is reduced to a mathematical exercise without being in touch with the reality of the stuff it describes,LaRochelle

    You misunderstand the pragmatism on which the scientific method is based.

    I asked questions about both on a physics forum and the response was, as usual, axiomatic. With almost an instant ban following. But axioms are there to be broken.LaRochelle

    Hah. I got philosophy banned on Physics Forum and yet was not banned myself. :grin:

    All physics is just modelling. You have to accept that and then move on from debates about whose intuitive axioms represent what is "really real", as opposed to being a measurably pragmatic reference frame.
  • Physical Constants & Geometry
    Not quite. But intuitively pleasing. It's again the popular naive view of the layman. A quantum vacuum is not involved in interaction. Well, minorly.LaRochelle

    As usual, you agree even though you disagree. You first assert your position as the expert, and then start mumbling into your fingers.

    The exception being the gauge for the weak interaction, giving rise to massive ones, which only indicates that the invoked symmetry breaking is an unphysical state of affairs, though mathematically satisfying.LaRochelle

    Like a cat sniffing its vomit, you keep circling back to your anti-Higgs unhappiness. Is this where your career went off the tracks?

    I mean, I would be interested if you had a good reason to reject the Higgs mechanism. But the way you present your thoughts here just comes across as another bullshitting crackpot.

    Meanwhile in another few hours you will be banned again, then reappear from another little cloud of sockpuppet identities. :up:

    It is like you are the forum's own virtual particle, forever erupting and self-annihilating from the cyber void. Your contributions exist because the PF vacuum expectation value must manifest its daily quota of crackpottery.

    But if you can organise a real argument, I would recognise it.
  • Physical Constants & Geometry
    By assumption the sides are of a determinate length to which, say, an integer value is assigned.tim wood

    Sure. You call the length 2. If it is 1.99999... in actuality, you also call that 2. And if 2 is in fact only 2.0000... - that is, a fact still to be absolutely determined - you again shrug off the inherent vagueness of the "integer" value you wish to assign.

    Any argument can be won by making your conclusion also the premise.

    And as that value can be carried out to an arbitrary degree of precision, it would seem that "vague" itself would need some qualification.tim wood

    You don't seem to understand that my view doesn't challenge the success of the mathematical operation. It justifies it.

    Peirce founded pragmatism after all. Descartes started the business of infinite doubt. Peirce fixed that by saying that we model the world for a reason. And so an arbitrary degree of precision is the definition of pragmatic success. We have eliminated doubt for all practical purposes - and that is what constitutes a belief.

    So science is all about finding the maths that can ground acts of measurement. We don't have to grab hold of the thing in itself. We only have to box that ultimate form of uncertainty into a space so small we no longer have to worry what it "actually is".

    So science is quite right to apply mathematical models that work to build belief by eliminating uncertainty.

    I just point out that there is still a vagueness trapped in the box. That is where metaphysical speculation again must kick in if we might want to dig even deeper.
  • Physical Constants & Geometry
    Simplest example in the complex plane.jgill

    Ok. You give me an example from maths and not from physics. :roll:

    But the complex plane is a good example of how maths works its way backwards by removing the constraints, while the Cosmos must have developed - out of vagueness - by adding constraints.

    Euclid constructs the world by allowing a point to trace a line, then a line to make a plane, a plane a volume, and so on. The number line does the same, and then looks to see how much algebraic structure remains as it ups its dimensionality.

    And what do you know? It turns out the structure rapidly vanishes. You get normed division algebra in 1, 2, 4 and 8 dimensions.

    But these resonances get weaker, vaguer, and then you are into the spotty faint echoes of algebraic structure that continue into the distance of the exceptional Lie groups, and the real last gasp of the monsters, before the true vagueness of infinite dimensional "number" - completely unconstrained mathematical being - closes over.

    And also what do you know? Physics appears to have arrived at the structure of our actual world by coming back at us from this far distance. Our Standard Model world smashes its way through all possible symmetry groups - including the residual quantum uncertainty of imaginary numbers - to arrive at a classical, well-behaved, Euclidean realm where particles are just points carving linear trajectories.

    Do you see the connection?
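    The 1, 2, 4, 8 claim is easy to check numerically. A minimal sketch in plain Python (my own illustration, not anything from the thread) verifying that the quaternion norm is multiplicative - the property that Hurwitz's theorem shows survives only in dimensions 1, 2, 4 and 8:

```python
import math

def quat_mul(a, b):
    """Hamilton product of quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def norm(q):
    return math.sqrt(sum(c*c for c in q))

a = (0.5, -1.2, 2.0, 0.7)
b = (1.1, 0.4, -0.3, 2.2)

# |ab| == |a||b|: the multiplicative norm that exists only in
# dimensions 1, 2, 4 and 8 (reals, complexes, quaternions, octonions).
assert math.isclose(norm(quat_mul(a, b)), norm(a) * norm(b))
```

    No comparable product exists in, say, 3 dimensions - the cross product loses the norm property - which is the sense in which the algebraic structure "runs out" between those dimensions.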
  • Physical Constants & Geometry
    Oh it's just you again "Marco". :razz:

    Let's hear again how you waste your education with a turn towards crackpottery....

    Now you are putting words in my mind. I don't have a binary view, however long your array of words may be to explain that. I simply stated the reality of gravitons. That may seem binary to you but it surely isn't. It's kind of unitary! But luckily I pull this view within the boundaries of a tertiary system.LaRochelle

    Yup. The expected panicked flurry of words, but no organised argument.

    A virtual particle is just a particle not abiding to the usual energy-momentum relation. It takes care for interaction and as such cannot be directly observed. That's why it's called virtual. It mediates between real particles.LaRochelle

    So we know they are real because they are possible contributions to the particle's state ... contributions from the quantum vacuum. Contributions from the background.

    Have you stopped to think about what I said rather than just hitting insta-response and regurgitating what you thought you already knew?

    The accounting trick is to start with a bare particle and then build up its full state by including its off-shell contributions. We pretend that there is a particle that is not entangled with its world, and then make that work by adding back all the ways that it has to be still entangled with that world - still vague and not disambiguated as "a real particle".

    Reality is only ever relatively split or divided into event and context. You know that from quantum decoherence and the whole collapse issue. But what makes physics wonderful is its tricks to get around that - stuff like sum over history path integrals or the taking into account of off-shell contributions.

    We glue together the absolute counterfactual definiteness of classical dynamics with the equally absolute counterfactual indeterminacy of quantum dynamics and - hey presto - we come up with a mathematical framework that can be refined to as many decimal places as we find useful.

    It is then only the metaphysics that troubles us. How could the Big Bang quantum tunnel its way out of literally nothing? How can we accept a kluge like renormalisation when there has to be something that actually produces an exact number as a QG calculation?

    It is this clash between the functionality of the models, and yet the patent failure of their metaphysical base, that drives so many "physicists" to crackpottery. What the textbooks say both works in the most splendid fashion but also is so obviously rooted in irreality. All kinds of quite acceptable madness results, like Block Universes, Clashing Branes, and Many Worlds.
  • Physical Constants & Geometry
    Virtual particles are said to be mathematical tools, that's a common naive approach, made by people who don't care to investigate further.LaRochelle

    But to call them real could be just to compound the error of calling them unreal. That way lies a sterile debate where both sides are wrong because they frame the issue as a simple binary.

    What I am arguing is the larger logic of Peirce - a triadic valued logic where that which becomes the definite, the actual, does so by emerging out of the murk of a vague potential via a dichotomous act of symmetry breaking.

    And this is what the Planck triad of constants directly encodes. The facts stare us in the face. The Planck scale is both the extreme of smallness, and of hotness.

    In simple terms, the less spatiotemporal context that frames any energetic event, the more uncertain is its energy state. Or in geometrodynamical terms, the more things are contracted to a point, the less sure you can be that the metric is flat and not curved.
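    That coincidence of extremes can be read straight off the constants themselves. A rough sketch (SI values rounded; purely illustrative arithmetic, not from the thread):

```python
import math

# Planck triad: G (gravity), hbar (quantum), c (relativity). SI, rounded.
G    = 6.674e-11   # m^3 kg^-1 s^-2
hbar = 1.055e-34   # J s
c    = 2.998e8     # m s^-1
k_B  = 1.381e-23   # J K^-1  (Boltzmann, to convert energy to temperature)

l_planck = math.sqrt(hbar * G / c**3)        # ~1.6e-35 m : extreme of smallness
t_planck = math.sqrt(hbar * G / c**5)        # ~5.4e-44 s
T_planck = math.sqrt(hbar * c**5 / G) / k_B  # ~1.4e32 K  : extreme of hotness

# The same three constants fix both the smallest meaningful scale and the
# hottest meaningful temperature: shrink the frame and the energy blows up.
```
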

    So for a graviton to be a graviton, it needs a global metric of which it can be the local fluctuation. It can definitely exist only by virtue of there being the dichotomy where there is an actual difference between some event and its background.

    If we shrink the scale factor to the Planck scale, we arrive at the realm where the accepted laws of physics say the fluctuations are the same size as their metric. We do seem to have a dichotomy of the local and global. But both are the same size. And so the distinction has just become moot. The PNC fails to apply. In Peirce's logic - and Peirce founded modern logic, even if Frege gets the credit for sociological reasons - that means we have reached the event horizon where the critical distinction has become fundamentally vague. Fluctuation and metric are indistinguishable.

    Conventional thought can try to push on past the Planck scale by talking about the quantum foam - virtual fluctuation with no spacetime metric. Or it can allow relativistic spacetime to dissolve into a "space" of disconnected black holes and a "time" composed of innumerable wormholes.

    Particle physics also can push on to talk about anomalies/singularities like monopoles and branes. Every extrapolation can produce its pathologies.

    So there are choices. You can either remain bogged down in pursuing some ultimate ground of definite being - the bottom turtle of the infinite stack - or you can start to think organically in terms of a vagueness that can beget a crispness or definiteness in terms of a logical symmetry breaking - a dichotomy which divides pure possibility into precisely opposed directions.

    A virtual particle is a calculational device that allows us to talk about the kinds of particles or fluctuations that could be realised, if only a matrix or background manifold also existed to be their contrasting context.

    So they are certainly real - as possibilities ... if their matching spatiotemporal context can also be realised as the "other" of the backdrop void or vacuum that is in turn relatively lacking in particles or fluctuations.

    That is, what makes gravitons real (as quantum vacuum fluctuations) is that there is also the generalised coherence of a connected spacetime metric to stand as a contrast. There is a place in which such an event could happen.

    So a virtual particle is its own free possibility. Yet that possibility is strictly tied to the vacuum expectation value. There has to be a context in which things are starting to become more cool, more expanded, and so everything is no longer a Planck-scale fluctuation of the same size as the metric. A fluctuation can actually register as an event that happened at a time and place within a backdrop starting to become cold enough, and large enough, to stand as the "other" of its historical context.

    Oh, and there is the little difference that makes gravitons hard to observe against the actual spatiotemporal backdrop. Photons are spin 1 objects, and gravitons would be spin 2. So one would stick out like a sore thumb, the other rather blends into the backdrop fabric.
  • Physical Constants & Geometry
    Just a minor point, but in dynamical systems this is not necessarily so.jgill

    Do you mean in the models of dynamical systems? Which models exactly?

    Chaos theory is a good example of how this issue is fudged. The shadowing lemma is needed to smooth over the embarrassment of claiming an actual infinity of diverging trajectories in a strange attractor, for instance.
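    A minimal sketch of the underlying problem, using the logistic map at r = 4 as a standard chaotic example (my own illustration): a perturbation near the limit of double precision swamps the trajectory within a few dozen steps, which is why something like the shadowing lemma is needed before any finite-precision orbit can be trusted at all.

```python
def logistic_orbit(x, r=4.0, n=50):
    """Iterate the logistic map x -> r*x*(1-x), returning the orbit."""
    orbit = [x]
    for _ in range(n):
        x = r * x * (1.0 - x)
        orbit.append(x)
    return orbit

a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-12)   # nudge near machine precision

# Sensitive dependence: the error roughly doubles each step (Lyapunov
# exponent ln 2), so after ~40 steps the two orbits are uncorrelated.
divergence = max(abs(p - q) for p, q in zip(a, b))
assert divergence > 0.1
```
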

    Do you, by chance, have such a degree?jgill

    My background is biology and neuroscience. Through that, I became pretty expert in complexity modelling and thermodynamics. I then went back over fundamental physics as a hobby interest.
  • Physical Constants & Geometry
    Why should gravitons have a running coupling constant?LaRochelle

    Are gravitons even real, rather than virtual calculational devices?

    And gravity increases in effective strength as you shrink the scale factor. As the spacetime metric is shrunk towards the Planck scale, it gains energy density - also up to the Planck scale. So things run nicely up to that event horizon. Then we have to decide what really happens beyond.

    I am contrasting the options of singularities (the naive choice from the metaphysics of mathematical objects like irrational numbers) and vagueness (the logical choice, and the one physics on the whole seeks in current interpretations like asymptotic safety, loop quantum gravity, Hartle-Hawking imaginary time, etc.)
  • Physical Constants & Geometry
    When it's at home, I'm sure the square root of two knows where its pipe and matches are, and its footrest.tim wood

    Sure. We can construct a machinery that homes in on a fixed point, a singularity. The square root operation can be modelled as an infinite series of increasingly more refined acts of measurement. It all promises to converge to a point. Yet also, the very notion of an irrational number tells us that point lies at infinity.
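    That converging machinery is easy to exhibit. A sketch using Newton's method in exact rational arithmetic (an illustration, not Peirce's own construction): each step refines the "measurement" and the error shrinks without bound, yet every iterate remains rational - the fixed point itself is never reached.

```python
from fractions import Fraction

def refine(x):
    """One Newton step for x^2 = 2; roughly doubles the correct digits."""
    return (x + 2 / x) / 2

x = Fraction(3, 2)
for _ in range(5):
    x = refine(x)

# Arbitrarily precise for all practical purposes...
assert abs(x * x - 2) < Fraction(1, 10**20)
# ...yet the limit is never landed on: every iterate stays rational.
assert x * x != 2
```
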

    So that gives two choices. Either there really is a singularity sitting at infinity or there must be something that cuts off safely before that kind of abhorrent breakdown occurs.

    It is the same as the Big Bang. A naked singularity where the laws of nature disappear up their own arse must be avoided. A cut off must be introduced to protect them so that their structure could in fact emerge in the first place. And the Planck scale provides that safe horizon. A theory of quantum gravity would cloud the scale beyond the Planck minimum in the vagueness of quantum uncertainty. The singularity would be avoided by all the physical structure dissolving into the pure potential of a quantum foam, or geometrodynamics, or causal dynamical triangulation, or imaginary coordinates, or a holographic information horizon, or whatever other equivalent picture physics plays with to talk about a singularity-avoiding beginning to everything.

    What I am describing is pretty standard thinking. But no one has really nailed it mathematically as yet. So we get a lot of handwaving versions.

    And then in maths, vagueness is not taken seriously at all. It doesn’t even seem something that could be mathematised. So physics ain’t getting much help there.

    Even Peirce never properly wrote up his logic of vagueness. You have to do a lot of reconstruction to understand the direction his project was taking.

    But the point is that if you can construct a machinery of asymptotic approach to a fixed point, then the inverse of that mathematical operation has to be able to pop back out of that point too. The value of the irrational number that stands at the end of the square root of 2 must also be able to generate only that particular constructing operation. From the answer, you would have to be able to directly derive the question.

    But a singularity is where such reversibility breaks down. A singularity points back equally in all directions. Like falling into a black hole, the information would be lost forever. And that is unphysical. We need an event horizon - the notion of a grounding vagueness - to protect our maths from the breakdown of its reversible structure.

    That is, the irrationality is not in the thing itself, but its relationship to other things.tim wood

    Vagueness is the “thing” that stands at the limit of thingness. This was Peirce’s formal definition. Vagueness is that to which the PNC fails to apply. And that then makes it the “thing” from which the constraint of the PNC could emerge. If there is a lack of something so specific, then it becomes what could specifically be the big change that gets a new, more orderly and structured world - such as the one we exist in - going.

    A singularity is a dead end. A vagueness is the open possibility of everything that it is not.

    It is fine for maths to talk about irrationals as points that exist at infinity for some mathematical operation - an endless asymptotic approach to some fixed point on the number line. What we know is that the operation, like a square root, will continue to hold true without ever breaking down. This exactness is built into the maths itself. And we know we will never care in practice beyond the first few million places of the decimal expansion. Or even the first dozen in any real world calculation.

    But physics has the need to formally introduce a cut off point where the laws of physics dissolve into a safe vagueness. The quantum and relativistic description of nature must be joined in a way that smooths over their essential differences. Gravity must run its couplings as the scale factor shrinks towards the Planck scale, but it can’t be allowed then to keep on going to infinity, as a spacetime with infinite curvature can make no physical sense.
  • Physical Constants & Geometry
    I think your thinking is seeing only one side of a two-sided coin. My model is both Mechanical (scientific) and Organic (philosophical).Gnomon

    Sure. So is semiosis. Code leads to mechanisation. A system of logical switching behaviour is imposed on the entropic flows of the world.

    In the Enformationism metaphor, the real world was originally an idea in the Mind of G*D, with the infinite possibilities of Omniscience, that was realized by an act of Will.Gnomon

    But even as a metaphor, that is quite the wrong kind of causal model for the kind of self-organising immanence I’m talking about. We diverge big time there.
  • Physical Constants & Geometry
    Unbound = eternal?? . . . .Gnomon

    It is unbound possibility. So not about an actualised duration.

    Vagueness is defined as that to which the principle of noncontradiction fails to apply. There just is no fact to the matter. Anything might be the case when nothing has yet concretely happened.

    rationalizing structure = Logos??Gnomon

    Yep. Heraclitus spoke about the dichotomy of logos and flux. The Pythagoreans framed it as apeiron ("Unlimited") and peras ("Limit"). Peirce as tychism and synechism.

    Everyone uses a different jargon to say essentially the same thing.

    Dissipation & Entropy seem to be necessary adjuncts to Integration & Energy in the program of Evolution.Gnomon

    A vortex is a dissipative structure - the emergence of order in the service of disordering.

    What would be a big conceptual shift is to see dissipative structure (a "far from equilibrium" system) as not secondary to the ordinary Second Law definition of an equilibrium system, but instead the more generic case.

    So instead of nature being already dead - entropically closed - which makes it a mystery why anything ever happens at all, nature is seen as a self-organising vortex (metaphorically). It organises the deeper thing of a vague potential into an entropic gradient. It creates the heat sink which is what then also creates the heat that fills it.

    The Big Bang was a symmetry breaking which gave action a direction. Action became something countable - as information/entropy, or local degrees of freedom. And the expanding and cooling also became countable as the density of the action became as spaced out as possible.

    So you are always dealing with duality. And dissipative structure is the order out of chaos story. Organisation develops to maximise the production of entropy.

    We don't generally see that because we enter the story at the point in the Universe's history where it is only a few degrees from its Heat Death and it is full of material crud that is going to take forever to round up in black holes and radiate away.

    Semiotics seems to imply that Meaning is inherent to the system of evolution. The question is : meaningful to whom?Gnomon

    Semiotics is indeed the science of meaning. But having a strong definition of meaning is also what allows you to define the meaningless.

    So sure, any complex dissipative structure that also has a coding mechanism and can thus implement an epistemic cut, or have a modelling relation with its world, is all about truly meaningful action. Even a bacterium is semiotic as it has genes to run the show and model its world in useful fashion.

    But a tornado lacks a model of its world. It has no code to support a selfish point of view. It is a dissipative structure, but now the information informing its behaviour is spread out in its environment. It moves across the landscape in pursuit of gradients of hot air to feed its upward spiralling existence. But we wouldn't credit it with any actual capacity to store a purpose and pursue an outcome.

    Information theory lacks a division - an epistemic cut - between a-biotic and biotic dissipative structures. Semiotics can speak to both the continuity of the mental and physical "realms", as well as put a precise finger on the difference between the two.

    Negentropy is what Aristotle called "entelechy" and what I call "enformy" in my Enformationsim thesis.Gnomon

    Negentropy is materialised constraint, entelechy is the potentiality of having a purpose. So one looks to the structured future, the other is that structured future embodied.

    So the fact that Reality contains creatures capable of semiotics and extraction of meaning would seem to deny the "essential meaningless of reality"Gnomon

    Semiotic systems construct meaning. A code-based modelling relation with the environment - such as the ones supported by genes, neurons, words, numbers - is how a first person point of view arises within a "third person" physical world.

    So sure, information theory is now dual with statistical mechanics. The same maths underpins both. But that glosses over the semiotic story. Information loses its everyday meaning and becomes simply a count of symbols or marks. I can count how many bits a channel can reliably transmit. That doesn't mean the pattern of marks is a language in which something is trying to be said. It could be just random noise.

    What you call "Apeiron" is similar to what I call "Enfernity" : the unbounded realm of Eternity and Infinity, which is an unformed ocean of Possibility. Which I also call BEING, the eternal power to be, the essence of existence.Gnomon

    Why invent another jargon to describe something that already has so many names?

    The Tao, Brahman, Apeiron, Hyle, Quintessence, Bosenazelo, Hunabku, Manitu, Orenda, Wakonda, Wakan, Mana. Peirce called it both Firstness and Vagueness. The Kabbalah calls it Tohu wa-bohu.

    All cultures have some version of it as the way a cosmos could arise organically as order triumphing over raw chaotic possibility. The more advanced versions, like Yin-Yang, Buddhist dependent co-arising, or the Greek Unity of Opposites, also get the duality that breaks the symmetry of the unformed potential. But the next step has to be to develop a fully logical, mathematical, model of what everyone can understand intuitively.

    Peirce is certainly important on that score. As is modern hierarchy theory and dissipative structure science.

    So do we need yet further jargon like Enfernity? And in fact, it seems wrong because the vagueness of a pure potentiality is that which lies beyond the concreteness of time and space. It is where time and space would arise as complementary limits on being. The birth of an organised cosmos is a structure that becomes a measure of properties like duration and extent. The extremisation of those properties as eternity and infinity then takes these physical directions and points to a future (but not a past!) that is unlimited, yet definite.

    So infinity and eternity are a pair of concepts that come from a quite different metaphysics - one based on a mechanical understanding of the world. It leads to an Atomist conception of a Cosmos that simply exists forever and has an unending extent.

    The organic metaphysics I'm talking about is all about the self-organising creation of the Cosmos as the growth of focusing limits on unconstrained "everythingness". So the Cosmos is not eternal. It has a birth and death of time, space and material content. The Heat Death may last "forever", but that is essentially a state of ultimate void. It may always have a matter content, but it will be just that of a quantum vacuum - a featureless rustle of virtual particle excitations.

    So I would say your thinking goes in the wrong direction here. It re-embraces the mechanical model of reality that an organic conception is intent on rejecting. :chin:
  • Physical Constants & Geometry
    Well, 1/3 is rational and has an infinite decimal expansion.Heiko

    And even 1 is 1.00000.... (and claimed to be indistinguishable from 0.9999....) :smile:
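    As a throwaway sketch (my own illustration): 0.999... is just the geometric series 9/10 + 9/100 + ..., and the gap after n nines is exactly 10^-n, which is why the limit is indistinguishable from 1.

```python
# Partial sums of 0.9 + 0.09 + 0.009 + ... approach 1; the gap after
# n nines is 10**-n, vanishing in the limit.
def nines(n):
    return sum(9 * 10.0 ** -(k + 1) for k in range(n))

print(nines(3))       # 0.999 (up to float rounding)
print(1 - nines(12))  # a gap of about 10**-12
```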

    Thinking about it, it is questionable if the idea of the number line is even justified: A line is a spatial object as opposed to a number (i.e. a "count of things"). Writing the "1" somewhere on the line tries to synthesize two very different things and "flaws" the pure space with the pitfalls of "counting".Heiko

    All modelling is questionable. But the main question that needs to be asked is: "Does it work, is it useful?".

    Obviously, I want to ask the metaphysical question of what reality actually is. But I also accept the epistemic constraint that all our accounts of that are going to be constructed models.

    So it is not a problem that maths might use a rather paradoxical construct like an unbroken line that is also an infinity of broken points. That construct doesn't have to be the truth of the reality it models.

    The problem only arises when the maths starts to be read as the indubitably real, and that is used to either make obviously silly claims about reality (such as the Block Universe) or to close off metaphysical inquiry into other reality models.
  • Physical Constants & Geometry
    Does that relationship between Symmetry and physical Constants, imply that the Big Bang Singularity was also perfectly symmetrical and unchanging (e.g. eternal), until some perturbation (outside force) broke the symmetry, resulting in our dynamic and evolving world? I ask that strange question because I just wrote a review of a book that reaches Anthropic conclusions from the : "unique “initial conditions” and “fine-tuned constants” that seemed arbitrarily selected to produce a world with living & thinking creatures."Gnomon

    I really like Wheeler as a bold and holistic thinker. The anthropic principle is also an obviously powerful argument when it comes to the cosmological problem. And I even agree - as Peirce argued - that the cosmos arose from unbound possibility as the inevitable growth of a rationalising structure. Wheeler also got that right with his geometrodynamics.

    But the issue is where do you insert some cut-off point between what is truly a product of Platonic-strength inevitability and that which is just some kind of residual chance, spontaneity or indeterminism.

    The Peircean view is that arbitrariness, or vagueness, must always exist in the system as Platonic order exists only to suppress or constrain it to the degree it matters pragmatically. Existence is statistical and so based on the stability that arises once a state of affairs can become indifferent to further change - an equilibrium condition in other words.

    So anthropic thinking doesn't have to squeeze all the arbitrariness out of cosmic history so that it is seen to arrive at the singular conclusion that is a "self-aware human", let alone carbon-based life. These can be left as local meaningless accidents of history - especially as the basic state of the Universe seems to have been fully locked in at the Big Bang, and the long-run destiny is for it to become a generalised Heat Death - a void with a temperature of absolute zero and the only material action being the black body radiation emitted by the holographic boundaries of an anti-de Sitter spacetime metric. That is, the faintest possible sizzle of photons with wavelengths that are redshifted to the size of the visible event horizon itself.

    So sure, the Big Bang baked in this story of an eternally cooling~expanding dissipative structure - a cosmos developing in the form of its own heat sink. And then more complex dissipative structures can live for a time on top of that, so as to break down particular accidental entropic blockages.

    It was inevitable the Higgs field would get tangled up in the gauge particles and create a degree of gravitating crud that needed to be returned back to the general spreading CMB radiation bath somehow. Stars collected the hydrogen and helium into burning balls of fusion to start that process, but then blew up in supernovae that left behind new levels of crud - all the complex atomic elements. That in turn led to tectonic planets and eventually life as new levels of dissipative structure.

    Anthropically, if these higher levels of dissipative structure could happen, they had to happen. The initial conditions of the Big Bang already mandated that. And given that carbon and water are materials that had to emerge if every combination of atomic matter was going to be tried out, then life was going to happen as these materials came with such rich possibilities.

    Peirce gives the argument for why semiotics is then itself an inevitable organising informational arrangement. We get to the point where information (as genes, neurons, words, numbers) must become a thing ... because there is a sufficiently rich stock of local negentropy to be dissipated. The heat of a sea floor thermal vent, the radiation of the sun, petawatts of fossil fuel trapped in underground deposits.

    So the fact of a hierarchy of dissipative structure was foreshadowed by the very fact the cosmos was based on dissipation as its fundamental principle. Everything is constrained by the laws of thermodynamics. But the same laws only work because of the way they can gloss over the detail.

    That is why we can model reality using statistical patterns. Every mountain range on every planet will look fractal - a tectonic flow caught at some snapshot moment between its negentropic building up and its entropic erosion. Humans might want to celebrate any peak which seems especially high, or unusually pointy. But for nature, these are the predictable accidents. Local spontaneity is built into the model along with the global necessity. There is a cut-off - and it is what the maths of chaos and criticality make measurable in terms of things like fractal dimensions and Lyapunov exponents.
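    To make the Lyapunov point concrete, here is a toy computation of my own (not tied to anything in the thread): the chaotic logistic map x → 4x(1−x) has a known Lyapunov exponent of ln 2, and a positive value is exactly the measure of how fast "local spontaneity" outruns prediction.

```python
import math

# Estimate the Lyapunov exponent of the logistic map x -> r*x*(1-x) at r=4
# by averaging log|f'(x)| along a trajectory. A positive exponent means
# nearby trajectories diverge exponentially - measurable spontaneity.
def lyapunov_logistic(r=4.0, x0=0.2, n=50_000, burn=1_000):
    x = x0
    for _ in range(burn):  # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1 - 2 * x)))  # log of local stretching
        x = r * x * (1 - x)
    return total / n

print(lyapunov_logistic())  # should land near ln 2, about 0.693
```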

    To bring it back to your particular concerns, this view does see information as the structure of constraints that limit the arbitrary. So holography has become a big deal as light cones bound the degrees of freedom available to some defined region of spacetime. Information thinking is just a better description of nature as it captures the physicality of a constraining context. You can talk about the material world in terms of a relational structure. And you can add measurement to such a conception by using information entropy as the fundamental unit. You can imagine the dialectical extremes of absolute order and absolute disorder - information vs entropy - and measure the one in terms of the absence of its other. The reciprocal relation of a dichotomy, as I have earlier said.

    So there is good reason for science to be shifting generally towards a metaphysics of order out of chaos - an information theoretic framework. But entropy descriptions are still ones that presume an essential meaningless of reality, so we need the next level of information theory - the Peircean semiotic one – to begin to talk about dissipative structure with the extra machinery that makes it alive and mindful.

    Then on your first question - was the Big Bang Singularity perfectly symmetrical and unchanging - that is another long answer.

    But in brief, it does all come back to symmetry and asymmetry principles – and how these themselves might be the two complementary faces of an even deeper, hence triadic, story of developing structure.

    So there just is no singularity, as there is instead just a vagueness that becomes a somethingness as soon as it starts to become a structure of relations. The Big Bang would have arisen from an Apeiron - an unbounded and formless "sea" of fluctuation. A chaos of impulse. A fundamental incoherence that could thus start to become something by becoming structurally constrained in some developing fashion.

    In a sense, a chaos of dimensional fluctuations is a perfect symmetry. Any and everything can be happening. It is also the definition of unchanging as even its ceaseless change is no change. Everything remains as confused as before, which is confused as possible.

    But the anthropic principle tells us that something had to happen, something had to change. The Apeiron's very lack of any limits or character would be the grounds from which limitation and characterisation must have been able to arise.

    So the breaking of the "Big Bang singularity" would be like a phase change where disorder learns to constrain itself. But all this would be in the form of an internally caused shift. We have to note that the Big Bang described in its purest sense would be just an adiabatically spreading and cooling radiation bath. There would be nothing really happening except an expansion making more spacetime, driven by a cooling which made the general energy density ever more spread out.

    The universe is a hot point tumbling into the vastness of its own heat sink. And it is a closed system from start to finish (once we clean up details like dark energy - the true source of the cosmological constant). So it is both a pair of extreme changes - a plunge from the Planck Heat to Absolute Zero, an expansion from the Planck Length to the de Sitter holographic solution that spells the practical end of time. But also, all this is just a swapping of the tiniest/hottest radiation bath for the coldest/largest radiation bath.

    If we count the total number of degrees of freedom - the total entropy or information - the same number is there at the Big Bang as at the Heat Death.

    This again is where we seem to need a deeper level of description to measure what seems to us some real physical event - namely the birth and eternalised death of a Cosmos. And that is where being able to measure this tricky thing of Peircean vagueness, or Anaximander's apeiron, would look to have a role to play in fundamental theory.

    Information is the brave new metric. But an even more general metric looks called for.
  • Physical Constants & Geometry
    If the ancient Greeks were left doing only arithmetic they would've never encounterd irrationalsTheMadFool

    So geometry then injects just enough physical reality into the mathematical abstraction to raise the problem?

    Zeno's paradoxes were another route into the same issue. As an object of the mathematical imagination, the number line claims to be both continuous yet also infinitely divided. That is a useful quality for modelling/measuring the world, but in what way is it realistic?

    I'm with Peirce and those who argue that reality is at root vague, and then this vagueness can be organised towards opposing dichotomous limits of being. So reality is organic rather than mechanical. It has to develop the counterfactuality attributed to rational structure.

    When it comes to the number line, this means that it is neither continuous nor discrete in the absolute sense usually claimed. Instead, it is only relatively connected or divided. It can approach either of these claimed states "in the limit". But the limits are themselves unreached in actuality. It is only in the mathematical model that it is useful to think this way.

    So mathematicians tend to want to treat the closest approach to these paired limits on possibility as being true actualities. The infinite and the infinitesimal are both real objects in the mathematical imagination. They stand for the concepts of the continuous and the discrete, the line and the point, being extremised - boundary states which can actually exist in categorical fashion.

    But the alternative view - based on Peirce's organic logic of vagueness - is that the infinite and the infinitesimal are the two ends of a process of dichotomising. Each is yoked to the other in a reciprocal relation. Infinity = 1/infinitesimal, and infinitesimal = 1/infinity. So as quantities, each is formally quantified not in terms of what it is, but in terms of how little of its other it contains.

    The rational numbers stand far enough back from the fray that it seems quite easy to treat a continuous line as an ordered series of points. As an object, it can paradoxically be the two things at once. But then as mathematicians go deeper, they have to keep expanding the notion of continuity to come up with a transfinite hierarchy of infinities. Likewise, the ability to cut the number line ever finer leads to a hierarchy of divisions. We encounter the infinite decimal expansions of the irrationals.

    This maths is useful, so people believe in it. People get used to "taking the limit" and that allows them to knit together models of reality that marry dichotomies like lines vs curves. We can do calculus and other tricks. We can employ constants like pi, e and phi that are treated as exact values - even though they are at root vague values, being irrational numbers.

    So there is no dispute the maths works. But the point is that it works by simplifying reality. It models the fact that reality arises in this symmetry-breaking, dichotomous fashion - the good old unity of opposites - but instead of treating the two halves of the deal as co-defining limits, it treats them as two categorical states of being.

    A Peircean approach adds a further dimension to the picture - the organic and developmental one. It says at root is the vague. And so the infinite decimal expansion of an irrational value is another way of accepting this, without openly admitting it. We are happy to have the first four or five digits of pi. That is enough for almost all our practical purposes. The rest of the long line of digits can be allowed to disappear into the mists of the unknowable. We don't have to make a big metaphysical deal out of the fact we can never actually arrive at a final number. Instead, we create a metaphysics which claims that the infinite expansion actually exists ... in some Platonia.

    But I am coming at this from the side of physics where vagueness might be a useful thing to be able to model and measure. And where the difference between a constant being produced by a closed symmetry operation, vs a constant reflecting an endlessly convergent series, seems like an important dichotomy to understand.

    It isn't too much of a stretch then to posit there's something geometric about the irrationals. Helium - Sunnish; Irrationals - Geometric.TheMadFool

    I have no idea how you make that connection. Was that some other post?

    No matter what you plug into that equation as the value of x, you will always miss out some points (incommensurable/noncomputable/transcendental numbers) i.e. the line will actually be discontinuous.TheMadFool

    Well, the more points you plot, the smoother the line will become. So as a construction, you are progressively limiting the possibility of the line turning rough, jagged or fractal in between the points you have so far plotted.

    It depends which way round you want to view the situation. But the mathematical object appearing on the page is becoming both more a collection of points and more a continuous line at the same time.

    So symmetry produces neat rational constants in your opinion? However, I maybe wrong of course, these values (spin and the other one whatever it is) don't show up as physical constants in Wikipedia.TheMadFool

    That is the idea I would explore.

    The dimensionless physical constants of the Standard Model are made a fuss of because they seem to be arbitrary values. They are numbers that could have been anything as they don't seem to follow from some fundamental symmetry that would have to be enforced on any form of material being. They are irritatingly patternless as well as irrational as actual numbers.

    On the other hand, values like quantum spin are well behaved because they fall directly out of symmetry principles. There is an exactness imposed on them in the way that resonances in a cavity must fall into countable whole numbers - the Pythagorean discovery of harmonics, the music of the spheres, which was the pleasant metaphysical surprise the ancient Greeks celebrated, alongside their disgust at finding numbers could also be irrational or matchingly discordant.

    So if we really dig into the issue from the side of fundamental physics, I see three classes of constants.

    There are the arbitrary numbers (dimensionless ratios of measured values) that give us things like the odd difference in mass between an up and down quark. Why does one weigh 2.01 megaelectron-volts and the other 4.79 (give or take current measurement error)? A theory of everything would want to find a way to calculate such values from first principles rather than have no good explanation at all.

    Then there are the properties of particles that follow directly from closure of symmetry relations - things like the Lie groups or permutation symmetries that underpin gauge symmetry breaking. It is quite natural to see why these are rational numbers, being counts of transformations that leave things unchanged. A perfect triangle maps onto itself three times with every full rotation. We are counting its resonant states. In the same way, quarks and leptons - as products of Platonic-strength symmetries, and indeed quantum harmonic oscillators - have no other choices, no possible intermediate states, as they are metaphorically, but effectively, like reverberations in fixed cavities.

    The third class of constants are the magic triad of the Planck constants - c, G and h. Okun's cube shows how they anchor all of physical theory, and a theory of quantum gravity will finally unite all three values in the one fully relativised quantum field theory. As constants, these have measured values. But really, it makes just as much sense to set all their values to 1 as they are arranged to create a set of reciprocal equations. Each represents a different kind of physical limit - a limit on connection, curvature and certainty - and so are just absolutes. We can presume that like the gauge symmetries written into the Standard Model, they have no specific material value as they are just pure forms - the expression of pure relations, an ontological structure that would have to be the same for any universe with a self-organisingly "resonant" geometry.

    So my overview is that the notion of a physical constant covers, first, that which is fundamental because of closed symmetry principles - change that doesn't make a change; then that which is fundamental as a measured and apparently arbitrary ratio (why should up and down quarks lack symmetry in their relative coupling strengths?); and finally, underpinning everything, the Planck triad that is itself a kind of decomposed ultimate constant, breaking reality in three absolute ways.

    G defines flatness (in terms of a lack of curvature). h defines uncertainty or vagueness (in terms of a lack of counterfactual definiteness). And c defines the rate at which a thermalising coherence of the two is achieved - the universe arriving at a state that is as flat and decohered as it can get at any particular moment in its thermodynamic trajectory.
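    As a hedged numerical aside (my own illustration, using rough SI values and hbar rather than h as an assumption): the way the triad anchors physical scale can be seen by combining c, G and hbar into the Planck units, which is also why setting all three to 1 is a natural move.

```python
# Forming the Planck units from the c, G, hbar triad (approximate SI values).
# Each unit mixes all three constants - none is derivable from fewer.
c = 2.998e8       # speed of light, m/s
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34  # reduced Planck constant, J s

planck_length = (hbar * G / c**3) ** 0.5  # about 1.6e-35 m
planck_time = (hbar * G / c**5) ** 0.5    # about 5.4e-44 s
planck_mass = (hbar * c / G) ** 0.5       # about 2.2e-8 kg

print(planck_length, planck_time, planck_mass)
```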

    Again I stress this is my speculative understanding and not even my final opinion. It is how the current physics looks once you see its maths as a reductionist mask and you start to interpret things instead from the point of view of a holistic systems science, or Peircean metaphysics, approach.
  • Good luck
    I think that we human beings were not created all at once, all at one stroke. I think that we all evolved slowly over a period of 350 million years.Ken Edwards

    If you are interested, this is why biologists distinguish between the mortal body and the immortal germ line. It was a feature of sexual reproduction and basic to being able to start building complex multicellular life about 2 billion years ago.

    Nick Lane's The Vital Question covers this well.
  • Physical Constants & Geometry
    There is no clear question in the OP, but I'm happy to speculate. :grin:

    As a minor point, geometry and algebra are dual descriptions of nature as Michael Atiyah argues across a number of addresses.

    So yes, geometry seems spatial, and algebra seems temporal (or about the sequential order of operations). But in the end, they are mirrored descriptions of the same deeper thing. Just as Einstein had to unite spacetime, so physics generally has developed algebraic geometry as a way to unify a dichotomous description of nature.

    Descartes started it by showing curves were algebraic functions in a coordinate space. It continues right up to the central place that permutation symmetry has in particle physics. If it can be said one way in geometry, it can be said the other way in algebra. And Atiyah was the one to really hammer this home.

    Then the issue of constants - and why they may or may not be irrational values.

    Again, dualities or dichotomies rule. And symmetry principles.

    A constant speaks to a lack of change. Change can be lacking either because the world is frozen, static, inert. Or because further change can no longer make a measurable difference. And the world being a dynamic and change-filled place - not the kind of timeless realm of the Block Universe or Many Worlds Interpretation – would have to be defined by constants of the second kind.

    So imagine a disk of paper, perfectly circular and unmarked. By virtue of its symmetry, you can't tell if it is spinning or still, let alone how fast or in which direction. As with quantum spin, you are dealing in whole numbers because the symmetry is such that every change maps the circle back onto itself. Any amount of change winds up as no meaningful change. Until you break the symmetry by marking the perimeter of the circle with a dot, motion of the disk makes no difference.

    So one way to arrive at a constant in a dynamic world is perfect symmetry. And that will produce a simple rational value. With quantum spin, the values are 1, 0 or -1. Or when it comes to the electromagnetic charge of quarks with their more complex rotational symmetry, rational fractions like 1/3 and 2/3.

    It is also why you get the inertial constants of motion. Newton's law/Noether's theorem telling us that rotation and translation are both costless actions. A rolling ball would roll forever in a frictionless world.

    And it is why in mathematics you have identity elements. Zero is the identity element of addition/subtraction operations, one is the identity element of multiplication and division. These particular numbers are the only numbers that make no difference, and so serve as the conceptual anchor for all numbers that then could make for a difference.

    One times anything leaves that anything unchanged. Anything divided by one is likewise left unchanged. Zero added or subtracted to anything again leaves it unchanged. Apply the same identity element as often as you like, and like a spinning circle, nothing will be measurably different. The identity element underwrites the mathematical closure of the system.
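    A trivial sketch of my own to pin down the identity-element point: the identity can be applied any number of times, like spinning the unmarked disk, and nothing is measurably different.

```python
# 0 and 1 as identity elements: repeated application leaves every value
# unchanged, which is what anchors "a difference" for all other numbers.
values = [3, -7.5, 42]
for v in values:
    x = v
    for _ in range(100):  # apply the identities over and over
        x = (x + 0) * 1
    assert x == v  # no amount of identity operations makes a difference

print("identity elements make no measurable change")
```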

    Then there is the other way to arrive at an unchanging constant in a dynamic world - the path that is open-endedly infinite, yet also an asymptotic approach to a fixed value. This arises in maths, and also likely in physics, because it is the extremisation of an asymmetry rather than the extremisation of a symmetry.

    So pi and e are efforts to relate opposing kinds of things. They produce irrational values as they are imagined as the final convergence of the discrete and the continuous - that ambiguous place where a line becomes an infinitesimally separated pair of points. A Dedekind cut.

    Pi is what you get from trying to map lines on to curves. A radius is a line. The circumference is a curve. The ratio of the line to the curve is then the attempt to map two incommensurate things. You can approximate the (straight line) length of the circumference by laying an infinite number of tiniest lengths around it - ie: an infinity of points. But in the end, you are trying to bridge a category difference, an asymmetry. The discrete and the continuous, the straight and the curved, are dichotomous concepts. Ratios of two things metaphysically defined in terms of not actually being the other.
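    To see that line-on-curve mapping in action, here is a small sketch of my own in the spirit of Archimedes: double the sides of a polygon inscribed in a unit circle and the half-perimeter creeps towards pi without ever arriving.

```python
import math

# Archimedes-style side doubling: straight chords laid around a curve.
# Starting from a hexagon (half-perimeter exactly 3), each doubling uses
# s_new = sqrt(2 - sqrt(4 - s**2)) and converges asymptotically on pi.
def pi_by_polygons(doublings):
    n, s = 6, 1.0  # regular hexagon inscribed in a unit circle
    for _ in range(doublings):
        s = math.sqrt(2 - math.sqrt(4 - s * s))  # side of doubled polygon
        n *= 2
    return n * s / 2  # half-perimeter approximates pi

for d in range(6):
    print(d, pi_by_polygons(d))
```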

    To be discrete is to lack all discernible trace of continuity, and vice versa. To be straight is to lack curvature. To be in inertial motion is to lack acceleration. Etc.
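The pi construction above can be sketched numerically - a toy illustration using the classical Archimedean move of doubling the sides of a polygon inscribed in a unit circle, so that an ever larger number of tiniest straight lengths approaches the curve without ever reaching it:

```python
import math

# Start with a hexagon inscribed in a unit circle (side length 1),
# then repeatedly double the number of straight segments. The
# half-perimeter converges on pi asymptotically - ever closer to
# the curve, never actually arriving.
s, n = 1.0, 6
for _ in range(20):
    # numerically stable form of the chord-doubling identity
    s = s / math.sqrt(2 + math.sqrt(4 - s * s))
    n *= 2

approx = n * s / 2
print(approx)  # ~3.14159265...
assert abs(approx - math.pi) < 1e-9
```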

    Likewise e does the same trick, except it contrasts geometric growth with its antithesis. That which is growing has to grow from whatever exact value it had at some point of time. But it was growing then as well, so had no exact value.

    However in all these cases you can integrate. You can asymptotically approach a fixed and stable value that is irrational. The answer will converge towards the same place, even if it does so in a never-arriving and open-ended fashion ... and thus the resulting constant represents a fundamental, locked-down degree of asymmetry which can be used as an identity element.
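The e case can be sketched the same way - a toy illustration, not a derivation, of growth compounded over ever finer steps converging on an irrational limit:

```python
import math

# Compound growth over ever finer intervals: each refinement changes
# the answer by less and less, converging on the irrational constant e
# without ever arriving at it exactly.
for n in [1, 10, 1000, 1_000_000]:
    print(n, (1 + 1 / n) ** n)

assert abs((1 + 1 / 10**7) ** 10**7 - math.e) < 1e-6
```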

    The famous arbitrary constants of the Standard Model look to be fundamental in this second sense. They arise because the couplings giving rise to mass and energy values for particles are quantum sums over histories. Each is an open-ended sum of all the possible contributions from interactions - as in the most famous case of the QED calculation of the magnetic dipole moment. But the sum also converges because, as the interactions become more complex and rare, they rapidly contribute less and less.

    So somehow - and this is at the speculative edge now – nature sums over every possible asymmetry of a particle's symmetry breaking and winds up converging on a value. This reflects the fact that we are trying to force a metaphysics on the world - the particle as the discrete point, the quantum vacuum as the continuous whole. The incommensurability of these two categories then leads to an irrationally-valued constant for the same reason the number line itself ends up suspended between the notions of discreteness and continuity. The number line can only converge on the fictitious place where one category - the continuity of a line - becomes its other, the zero dimensionality of a disconnected point.

    Anyway, constants can emerge either because of a symmetry that is unbreakable by change - like an unmarked disk, or any kind of relativistic coordinate change - or because of an asymmetry that is asymptotically convergent, locked in on a fixed value to be found at an infinitely distant horizon.

    Geometry doesn't have anything special to do with it, except that it too illustrates the general duality of nature. And the importance of algebraic geometry to modern physics shows how a convergence from all directions on symmetry and its asymmetric breaking is also only to be expected.

    Duality is everywhere. But then final theories are where it seems to get blushingly hidden from sight, safely wrapped up either as an emergent symmetry constant, or a convergent asymmetry constant. Or - whoops - a combination of the two. The combination of fundamental laws and fundamental constants.
  • Are there sports where nothing is open to subjective interpretation?
    For instance, is there any subjective interpretation involved in calling a 1v1 tennis match?Cidat

    Examples of things I won't allow: Figure skatingT Clark

    Like science, this is us humans reaching for the objectivity of a measurement. So all games have rules and scoring.

    Even figure skating has scoring. A triple axel beats a double axel. Even the briefest touch of hand to ice is a deduction.

    And like science, objectivity is just an aspiration. Even if we reduce measurement to a number on a dial, it can get blurry when the needle hovers between numbers. A choice of whether to round up or round down has to be imposed.

    Of philosophical interest might be why we strive for these objective measures of human performance. And why is it no surprise that the rationalising Greeks and industrialising English seem to have led the way in the invention of formal sport?
  • How can one remember things?
    The brain is no digital computer. Most things leave traces in the brain. There are more possible traces in the brain than there are elementary particles in the universe.GraveItty

    Yep. And unlike the conventional notion of a computer – the representational understanding - meaning arises semiotically. What is significant is the brain's ability to eliminate all that "information".

    To recognise is to whittle a near infinity of possibilities down to some useful act of identification. In a split second, any number of less well fitting states of interpretation are discarded.

    So computers place high value on storing information. Brains place high value on how much can instead be ignored.

    Recognition is something forced on our attention by our inability to otherwise look past some aspect of our environment.
  • How can one remember things?
    So there is a kind of comparison made, as I see now. The face is drawn to the trace, so to speak. But there is no litteral comparison.GraveItty

    So simulated annealing? Hebbian gradients?

    Do you want pointers to scientific models of recognition? What you seem to want to say is just the usual way of understanding the associative structure of the brain's neural networks.
  • How can one remember things?
    It's a fact that the memory doesn't function like storing data on a computer. It's nonsense to claim the memory contains a zillion bytes of information. So there is no comparison.GraveItty

    No comparison to what? I agree, no comparison of any merit to a Turing Machine. But why not of some comparison to a neural network?

    And even if there was, how does a comparison constitute a memory. I can say that two faces are the same, but that's no memory.GraveItty

    If your argument is about rejecting the term "memory", then I would tend to agree.

    A first clarification would be that brains work not on stored memories but active anticipations. They are designed not to remember the past but to predict the future. So the comparison is between what is expected to be the case, and what turns out to be the case.

    When we recognise people, things and places, it is in the context of some forward looking expectation. If I see you in the street, I am mildly surprised to see someone I know. I was mostly expecting to see people I don't know. So your presence pops out as emotionally significant, attentionally salient. A fact promoted to focal consciousness.

    That focus then brings with it a fresh "computation" of my expectations. I am now flush with all the likely and unlikely possibilities that may characterise my interaction with this "you". The forward prediction of "my" world is maintained.

    So the story is not about knitting together the present and the past, but about updating my general orientation to my immediate future.

    The second clarification is that humans have language and so can do more than just simple recognition. We can linguistically construct states of recollection. We can learn the habit of seeing ourselves as selves that exist in the past and so imagine ourselves reacting to situations at other moments of time. We can construct autobiographical narratives - such as remembering you when I last saw you as some sort of episodic memory.

    Eyewitness research and other psychology tells us how unreliable and constructed such narratives are. But still, this gives a whole other level to any discussion of what memory is.

    As we layer the social skill of recollection on top of the animal skill of recognition, we move further away from any simplistic and mechanical notion of the comparison of active states of experience with passive stores of data.
  • How can one remember things?
    I explained though that it's not computer memory-like.GraveItty

    You mean not like a Turing Machine, or not like any kind of machine architecture, including neural networks that try to mimic the brain?

    I'm not looking for a scientific explanation. I already have one. I'm looking for a philosophical one.GraveItty

    It is unclear what you seek. But neuroscience would aim for a more sophisticated account than a Cartesian strawman such as comparing a memory of a face with an experience of a face.

    So philosophically speaking, we would want to leave behind a representationalist framing of the issue and move towards an enactive or semiotic one.
  • How can one remember things?
    If I see a face, I don't compare it to a stored memory and (consciously or unconsciously) to the memory of the face I have.GraveItty

    Is there a good reason to claim this? Cognitive neuroscience would tell us that the ability to recognise - the ability to make a qualitative judgement of familiar~novel, or match~mismatch - is pretty central to everything the brain does.

    So a sense of things being known or unknown is built into the process of perception as a critical contrast.
  • Is global warming our thermodynamic destiny?
    The green is trees and plants and they have a knack of storing energy in the form of oil, coal and other hydrocarbons. This will never do, it is cheating the universe from getting to its natural, evenly temperate state as quickly as it might.TheVeryIdea

    The OP is heading in the right direction. But life is in fact entrained to the second law and serves it by accelerating universal entropification. Bare dirt re-radiates solar energy at a higher temperature than a mature ecosystem does - 60 degrees C vs 20 degrees C as a rule of thumb.

    So life pays for its negentropic existence by actively degrading energy. Life exists because it serves the thermodynamic imperative.

    So is our control of the energy gradient, which allows us to live anywhere on the planet because we can heat ourselves and cook otherwise inedible foods, a natural selection pressure that has lead ultimately to humanity being stuck on a path to burning hydrocarbons and in a tiny way speeding up the process of the universe reaching its natural end point?TheVeryIdea

    Or you could see Nature being stuck with this crap-load of hydrocarbons that got trapped in inaccessible rock seams. A lot of buried energy. And if it were possible for an organism to evolve that could dissipate it, there would be a big pressure for that to happen.

    So along came humans. And eventually they invented a new technology based way of life. That led to an explosive rise of the industrial revolution and a social/economic system geared to burning hydrocarbon.

    It was just bad luck that the heat produced couldn’t go straight out into space. The burning also produced a crap-load of atmospheric carbon.

    So in general, you can see global warming as a consequence of the second law. And humanity like an algal bloom.

    Sometimes life does too much too quick. You get explosive growth and an extinction event.
  • Realism
    Speaking very roughly, just to get started, realism holds that ...stuff... is independent of what we say about it; anti-realism, that it isn't.Banno

    Same old chestnut. Same old answer.

    Realism is wrong to the degree it neglects that all such speech acts have some pragmatic purpose. So the world isn’t understood in some mind-independent fashion. That wouldn’t even be useful.

    And anti-realism is wrong to the degree it might pretend there is no world independent of this minding, or mindfulness, or however you choose to describe a semiotic modelling relation.
  • Synchronicity, Chance and Intention
    Can we then say that simplicity ... is some kind of telos for the natural world.TheMadFool

    As a telos, it would be a material tendency rather than a sentient purpose - what Salthe calls teleomaty rather than teleology.

    And rather than just being a drive towards simplicity, it would be a drive towards generality.

    To be simple is merely to lack a mess of particulars. To be generic is when every particular ceases to make a difference. So the general is a limit on change not because change is halted, but because it becomes a matter of indifference.

    An equilibrium system fluctuates, but the fluctuations all average out.

    A disc sitting on a surface could be rotating at any speed, or even be at rest. Unless the disc has its symmetry broken by some kind of tell-tale mark, all we will see is that it is circular. The particulars of its rotation are absorbed into the generality of its rotational symmetry.

    (A triangle looks exactly the same every third of a rotation, a hexagon every sixth of a rotation. A circle in effect has an infinite number of edges, so always looks the same. A triangle in fact makes a worse wheel than a pentagon for that reason ... but is the simplest answer for producing strong structures or describing networks of relations.)


    If yes, how does that relate to synchronicity?TheMadFool

    My comments have nothing to do with synchronicity. Although physics certainly has good models of synchrony.
  • Synchronicity, Chance and Intention
    More on this please. Gracias.TheMadFool

    If in a general way you take an evolutionary or process view of reality - what exists is what is self-stabilising - then symmetry principles explain what is likely to be the case because it is the most stable and persistent outcome of symmetry breaking.

    So a wheel emerges as the shape that exemplifies rotational symmetry. If you want something to roll smoothly and with the least friction, then a circle is as simple as it gets. It doesn't get simpler. The circle is the limit towards which all else tends.

    This is the general story behind all physics - the search for the ultimate simplicity in terms of breaking possibilities down to the point where they have got as simple as it is possible to be. At that point, flux becomes stability.

    So in a state of thermal equilibrium, all the particles are in busy motion. But it no longer makes a difference. The distribution of the momenta has converged on a stable Gaussian distribution. The system has a stable temperature and pressure.
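As a toy illustration of that equilibrium story (assuming a Gaussian momentum distribution from the outset, rather than simulating collisions), individual values keep fluctuating while the collective averages sit still:

```python
import random

# Sample a large ensemble of particle momenta. Each value is different
# and "restless", but the collective statistics - mean momentum and
# mean squared momentum (a stand-in for temperature) - converge on
# stable values. The fluctuations all average out.
random.seed(1)
p = [random.gauss(0, 1) for _ in range(100_000)]

mean = sum(x for x in p) / len(p)
temp = sum(x * x for x in p) / len(p)

print(round(mean, 2), round(temp, 2))  # near 0 and 1
assert abs(mean) < 0.05 and abs(temp - 1) < 0.05
```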

    Or if we are talking about Newtonian mechanics, reality boils down to the simplicity of zero D points that then have the irreducible freedoms of translation and rotation. Point particles are constrained to a location, but remain free to move inertially in a straight line or spin on the spot.

    Symmetry principles - Noether's conservation symmetries - predict the limits of geometric constraint. You can limit the motion of a ball in many ways, but - in a frictionless world - you can't stop it rolling in a straight line forever.

    Gauge or permutation symmetry in particle physics explains why protons and electrons exist. Again, starting with all possible arrangements, only some particular arrangement winds up being the simplest achievable. Once you arrive at that state, you can't go further. There is no north of the north pole, as they say.

    Existence is change meeting its match in the shape of a limiting state of indifference. Change might continue, but it makes no real difference.

    The particles of a gas at equilibrium are as restless as ever. But their distribution remains the same in terms of its collective average.

    A wheel might wear with use, but it doesn't continue to evolve into another shape.

    The problem for a metaphysics of order out of chaos is explaining why the evolution of unbound possibility arrives at a bounded terminus. Symmetry maths explains that. Things get simple to the point that fluctuations can't produce an arrangement that is any simpler.
  • How can chance be non-deterministic?
    There is a real ‘realm of possibilities’ which is defined by the wave function, very precisely, as a distribution of probabilities of possible outcomes.Wayfarer

    Yep. That is the creation of concrete possibilities by the preparation of a system. It is like carving a die with six sides. You constrain things so that outcomes are limited to a particular range of choices.

    Vagueness would be a deeper state of indeterminacy. The wavefunction of the universe would be so broad as not to either rule in or rule out the existence of any particular electron and its history.

    So put another way, the answer to the question ’does the object exist?’ is the equation, isn’t it? You can’t say ‘yes it exists’ until the measurement has been taken. So the object is not unambiguously real until it’s measured.Wayfarer

    The problem is that the collapse isn’t part of the formalism. So there isn’t a good ground for claiming some kind of definite transition that promotes the particle from some kind of existence as a probability to a state of being real.

    I don’t have a hard position on the issue for that reason. But decoherence at least lets the thermal environment be the “observer”. We can do without an actual collapse because the uncertainty reduces asymptotically towards a definable limit.

    I like the term, almost surely, in probability theory - https://en.wikipedia.org/wiki/Almost_surely

    A probability of 1 isn’t absolutely certain. But close enough for all practical purposes.
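A toy sketch of "almost surely": the running frequency of fair coin flips converges on 0.5 with probability 1, even though no finite run makes that absolutely certain - an all-heads sequence remains logically possible:

```python
import random

# Law of large numbers in miniature: the observed frequency of heads
# converges on 0.5 "almost surely" - with probability 1, but never
# with absolute logical certainty for any finite run.
random.seed(0)
flips = [random.random() < 0.5 for _ in range(200_000)]
freq = sum(flips) / len(flips)

print(freq)  # close to 0.5
assert abs(freq - 0.5) < 0.01
```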

    The demand for a collapse is another example of the backwards metaphysics that infects the quantum vs classicality discussions. The holistic view says uncertainty is merely being constrained. Reality doesn’t actually have to be made certain to exist. Being highly constrained gives it enough of a definite counterfactuality to amount to the same thing.
  • Synchronicity, Chance and Intention
    Those presuppositions may be unprovable but they are not arbitrary, they are objectively verified every single time we use them and most importantly they have predictive and practical value (instrumental value).Nickolasgaspar

    I’m fine with methodological naturalism as a fallback position. But it is a little much to claim the virtues of being both objective and instrumental.

    A pragmatist would remind that we are only after all modelling the world, and doing that with a vested interest.
  • How can chance be non-deterministic?
    What about Peirce's 'tychism'? Didn't he see chance as basic?Wayfarer

    Yep. But unfortunately a further dichotomy is built into that - one that Peirce was still working on.

    Just as there is an Aristotelian distinction between potentials and possibilities, there is a distinction between vagueness and fluctuation.

    So there is “chance” that is basic in terms of being a logical vagueness - anything might be the case. And then there is “chance” in the sense of some definite spontaneous event - a tychic “sporting”.

    One is about the generality of potential being. The other is about the particularity of some accident of being - a definite possibility that is logically crisp in the sense of being a counterfactual.

    Again, this speaks to a holistic systems view of nature as concrete chance only exists by virtue of the counterfactuality of some matchingly definite context. It is a local-global deal. The radioactive particle is still there or it just spontaneously decayed.

    But the quantum vacuum is a much vaguer beast - a generic indeterminacy. It is both full of fluctuations, and yet they are “virtual”. The vacuum needs a constraining context to make its zero-point uncertainty manifest. You need an apparatus like two Casimir plates to turn a vague potential into definite possibilities.

    What about the strange attractors in chaos theory - they produce patterns arising from apparently minute fluctuations - which seems a way of conceptualising something which is both a product of chance but also subject to laws?Wayfarer

    Attractors are produced by correlated interactions. So rather than trajectories exploring the world with complete freedom, they become entrained to emergent patterns.

    Draining water forms a spiral. A vortex is a simple point attractor. Rather than every molecule having to find its own random path to the plug hole exit - which could take for bloody ever - they get sucked into the most efficient possible path that solves the collective problem.

    Order out of chaos, as they say. All the minute individual fluctuations are overwhelmed by mob forces.

    The butterfly effect is then the widely misunderstood converse of the story. If we try to figure out which minute fluctuation began the general plug hole spiral, we might pick one wee fellow that seemed to mark the right angle of attack first. It was the spontaneous fluctuation that broke the symmetry and so set up the giant “tropical storm in a distant land” that became the gurgling vortex.

    But really, the cause of the vortex was the general shape of the system - the boundary conditions that set up a bath full of water where the plug had suddenly been pulled. After that, any old fluctuation could have been the first panicked lurch that set the whole crowd stampede off.