Comments

  • Definitions
    If you say that isn’t the logical corollary of your proposition, then you are agreeing the statement was vague.

    Sounds legit.
  • Definitions
    Sure. But not every thing we do with words is pointing.Banno

    So the corollary is that every thing we do is largely pointing? Harry is thus largely correct?
  • "Turtles all the way down" in physics
    It confirms my interpretation that when you evoke "a sea of U1 photons", you are talking about the unreachable limits between which time and the universe happen.Olivier5

    I’m not sure what you mean there. For what I have in mind, refer to figure 6.2 in this excellent paper - https://www.mso.anu.edu.au/~charley/papers/LineweaverChap_6.pdf
  • "Turtles all the way down" in physics
    I was trying for a simple explanation. Obviously too simple.

    As the gif illustrates, the plane of rotation is orthogonal to the plane of translation. The sine wave is then observed as a trace on the plane of translation. So it is a helical path mapped on to a side view using complex numbers. A combination of the two kinds of motion taking place and the projection of that onto a plane of observation.
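
    The projection can be checked numerically. Here is a minimal sketch (Python; the unit radius, speed and frequency are illustrative): a point rotating in the complex plane while translating along a perpendicular axis traces a helix, and its side-view shadow is exactly a sine wave.

```python
import cmath
import math

# A helix: a point rotating in the complex plane (the plane of rotation)
# while translating along a perpendicular axis (the plane of translation).
# The complex phase e^{i*omega*t} encodes the rotation; projecting onto
# one real axis of the side view leaves a sine wave.
def helix_point(t, radius=1.0, speed=1.0, omega=1.0):
    spin = radius * cmath.exp(1j * omega * t)  # rotation as a complex phase
    x = speed * t                              # translation along the track
    return x, spin.imag, spin.real             # (along-track, seen, hidden)

# With unit radius, speed and frequency, the side-view trace is y = sin(x).
for t in (0.0, 0.5, 1.0, 2.0, 3.0):
    x, y, _ = helix_point(t)
    assert abs(y - math.sin(x)) < 1e-12
```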
  • "Turtles all the way down" in physics
    A dot on a circle spinning along a straight line traces a cycloid.Olivier5

    The disc rotates around, and rolls along, its origin. Check the gif I linked to.
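
    The two curves being discussed can be put side by side in a toy sketch (Python; unit disc, parameters illustrative): a dot on the rim of a disc rolling without slipping traces a cycloid, while the height of a dot on a disc that spins about its own translating centre traces a sine wave.

```python
import math

# Two ways a dot can move as a unit disc travels along a line:
def cycloid(t):
    # Dot on the rim of a disc rolling without slipping: the classic cycloid.
    return t - math.sin(t), 1 - math.cos(t)

def spin_and_slide(t):
    # Disc rotating about its own centre while the centre translates:
    # the dot's height above the centre line is a pure sine wave.
    return t, math.sin(t)

# The traces differ: the cycloid touches the line once per turn with a cusp,
# while the sine trace crosses it, and their peak heights differ too.
assert abs(cycloid(2 * math.pi)[1]) < 1e-12            # cusp: back on the line
assert abs(spin_and_slide(math.pi)[1]) < 1e-12         # sine zero-crossing
assert abs(cycloid(math.pi)[1] - 2.0) < 1e-12          # cycloid peak height 2
assert abs(spin_and_slide(math.pi / 2)[1] - 1.0) < 1e-12  # sine peak height 1
```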
  • Is space/vacuum a substance?
    This is exactly what I was talking about. If you take the laws of non-contradiction and excluded middle out of context, remove them from their relationship with the law of identity, you no longer have anything to ground truth or falsity in, no substance. Without identity truth and falsity is not relevant.Metaphysician Undercover

    You are reading it backwards. A logical definition of vagueness (and generality) is what helps ground your desired "truth-telling" apparatus. It tells you the conditions under which the laws of thought will fail - ensuring you do what is needed to fix those holes.

    So you have to establish that you are dealing with a concrete case where a binary judgement can apply. The thing in question has to be that thing and no other thing. You can't simply presume it. You have to check it.

    But that is then why you need a pragmatic definition of "truth". One that has measurable consequences.

    Theism routinely by-passes that constraint on logicism. God becomes a concept so general that nothing is impossible of Him, a concept so vague that anything can be taken as evidence of Him.

    There is evil in the world? It's put there as a test. You recovered from your heart attack? It was the power of prayer. But your dead neighbour prayed too? God probably knew he was a paedo.

    You are treating the laws of thought as if they are Platonic abstractions. Peirce was concerned with rooting them in the reality of the world. And so defining when a rule does not apply is necessary to being able to define when it actually does.

    We can never simply assume that the law of identity has been truthfully applied, Peirce was correct in this, and it's the starting point for skepticism.Metaphysician Undercover

    Exactly. But having started skepticism going, we then need to rein it in appropriately. And that is what this is about.

    We want to avoid the two errors of credulity and unbounded skepticism. We want to be like scientists and say, as far as our model goes in terms of the measurements it suggests, the theory is probably true.

    Peirce also did critical work on probability theory so that exact numbers could be put on the relative likelihood of something being false rather than true. His was a system of logic with methodological consequences.

    It makes no sense to conclude that the law of identity cannot be applied, because that just demonstrates a lack of effort.Metaphysician Undercover

    Again, yes. And what does that effort look like?

    (Reveal: Pragmatism rather than theism!)
  • "Turtles all the way down" in physics
    The way I understand your take, this perfect simplicity at the beginning and end of time is still an unreachable limit, a state of affairs that never actually happened at any point in time.Olivier5

    It is more complicated. The simplicity is about the Universe being in a state of perfect thermal equilibrium. That means its expansion - or its "cooling by expanding" - is "adiabatic". The system grows, but does so in such a smooth and even way that its internal balance isn't broken or disturbed. It retains its simplest possible state.

    At the very moment of the Big Bang - given we are assuming it is represented accurately by the Planck limits - it would have had that thermal balance. But it almost immediately got disrupted by the quick succession of symmetry-breaking steps represented by the Standard Model of particle physics - SU(3), SU(2), U(1), and the Higgs. And so the smooth flow - the development of the Universe as a spreading space containing a cooling material - got disrupted.

    This is important for how we even imagine "time". A universe that is just the simplest thing of a spreading bath of radiation is essentially timeless. All action happens at the speed of light - c. And so there just isn't anything different to measure that temporal rate against.

    It is only when particles become "massive" - once the Higgs field in particular gets switched on by the messy symmetry-breaking and particles begin to "weigh" something - that time has the kind of meaning it has for us conventionally. Mass makes it possible for particles to go slower than c. They can even be "at rest" from the right inertial frame perspective.
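
    The "timeless at c" point can be made concrete with the invariant interval. In this rough sketch (units where c = 1; the numbers are purely illustrative), a photon's worldline accumulates zero proper time, while anything slower has a clock of its own.

```python
import math

C = 1.0  # work in units where the speed of light is 1

def proper_time(dt, dx):
    """Proper time along a straight worldline covering coordinate intervals
    (dt, dx). In these units: tau^2 = dt^2 - (dx/c)^2."""
    tau_sq = dt ** 2 - (dx / C) ** 2
    if tau_sq < 0:
        raise ValueError("spacelike separation: no worldline connects these events")
    return math.sqrt(tau_sq)

# A photon moves at c, so no proper time elapses along its path.
assert proper_time(10.0, 10.0) == 0.0

# A massive particle at rest: proper time equals coordinate time.
assert proper_time(10.0, 0.0) == 10.0

# A massive particle at 0.8c: its clock runs at 60% rate (gamma = 5/3).
assert abs(proper_time(10.0, 8.0) - 6.0) < 1e-12
```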

    In effect, mass makes particles fall out of the general adiabatic flow - the spreading bath of mass-less radiation. There is some part of the initial heat of the Big Bang now lagging behind as crumbs of matter. A gravitating dust, as cosmology puts it. And that - emergently - gives us a new kind of temporal potential to be unwound.

    The mass is moving about slowly as lumps of energy density. It is taking a variable time to do things - have interactions like lumps crashing into each other - while radiation, forming the constant backdrop, continues to move at its single rate of c.

    Ultimately - to erase this particular complexity and return to the maximal simplicity of a bath of adiabatically expanding radiation - all the lagging mass particles need to be swept up and boiled away into radiation by black holes. Time as we know it - a spectrum of possible rates between "rest" and c - will then disappear. There will be only a simpler kind of time that is the universal rate of an unchanging thermal equilibrium - a world of event horizons formed by light cones.

    Now we get into the really head-spinning topic of de Sitter models. So I won't start that.

    But the point is that "time" is defined by change. Or the ability for change to happen at a variety of different rates within the one world. And time as we know it only exists from soon after the Big Bang until about the Heat Death - that being the period of Cosmic history during which particles could be massive and so go relatively slower or relatively faster within the gradient of rates defined by the opposing limits of "rest" and c.

    At the Heat Death, this kind of gradient will have been erased. Only a continuing c-rate flow will continue. But that is a kind of frozen state of no effective change. A vanilla and featureless state. So a state that is both eternal and timeless - at least viewed relative to our current "timeful" view of things where differences in rate are a thing that can matter.

    Yet when viewed overall - as a trajectory from a point-like Big Bang to a de Sitter light cone Heat Death space that "freezes out" at a scale of 36 billion light years in diameter - there is clearly some other notion of "time as a global change in state" to be had here.

    Something did happen. One view of it is that a lot of "hot stuff" - the initial Big Bang energy density - got exchanged for a matching amount of "cold stuff", all the vast empty space that could act as a sink in which that heat content could be "wasted".

    But even that is a rather too simple description of the actual deeper simplicity in which the two things of spacetime and energy density would be unified under the description of a theory of quantum gravity or "theory of everything".

    Time, as we conventionally imagine it - a Cartesian axis marked by divisions, an unbounded sequence of instants - gets radically rewritten as we go along here. Time becomes merely an effective "dimension", an emergent distinction. Time is only another way of talking about the possibility of change. And change is always relative to the possibility of stasis.

    The simpler the state of the Universe, the less meaningful difference there is between change and stasis. The definition of an equilibrium is a state where differences don't make a difference. In an ideal gas, every particle changes place freely. But overall, the temperature and pressure stay the same. The gas is effectively in a timeless state.
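
    That equilibrium point can be caricatured in a few lines (a toy 1-D "gas"; numbers and the seed are illustrative): elastic swaps churn the microstate endlessly while the macrostate - total kinetic energy, standing in for temperature - never budges.

```python
import random

random.seed(0)

# Toy 1-D "gas": particles with random velocities. Microstate = the list of
# velocities; macrostate = total kinetic energy (a stand-in for temperature).
velocities = [random.uniform(-1, 1) for _ in range(100)]

def energy(vs):
    return sum(v * v for v in vs) / 2

e0 = energy(velocities)
micro0 = list(velocities)

# Elastic head-on collisions between equal masses just swap velocities.
for _ in range(1000):
    i, j = random.randrange(100), random.randrange(100)
    velocities[i], velocities[j] = velocities[j], velocities[i]

# The microstate has churned, but the macrostate is untouched: a state
# where the differences make no difference.
assert velocities != micro0                   # particles "changed place"
assert abs(energy(velocities) - e0) < 1e-9    # temperature proxy unchanged
```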

    So our very notion of time has concealed complexity. It is not a physically simplest state as we normally conceive it - living in a world that has lumpy mass blundering about at any old speed between rest and c.

    At least with space, we accept it has the complexity of three dimensions. Maybe many more with the higher symmetry states modelled by String Theory.

    But with time, we brush all its complexity under the carpet by just modelling it as a single extra "dimension" against which everything can be measured in some abstracted fashion.

    This is why time is the issue to unpick in arriving at a final theory. And thermodynamics - as the probabilistic view of nature, the laws that deal with emergent statistics - would be the key to that.
  • Is space/vacuum a substance?
    Further, the laws of non-contradiction and excluded middle, provide guidelines as to what we can truthfully say about any identified object.Metaphysician Undercover

    That is rather the point. Peirce was highlighting the presumption you have “truthfully” identified an object. Some concrete particular under the first law. And he was drawing out the logical implications of the corollary - the case when the principle of identity doesn’t apply.

    Perhaps a more scientific pair of definitions would be that anything is 'general' in so far as the principle of the excluded middle does not apply to it and is 'vague' in so far as the principle of contradiction does not apply to it.

    Thus, although it is true that "Any proposition you please, 'once you have determined its identity', is either true or false"; yet 'so long as it remains indeterminate and so without identity', it need neither be true that any proposition you please is true, nor that any proposition you please is false.

    So likewise, while it is false that "A proposition 'whose identity I have determined' is both true and false", yet until it is determinate, it may be true that a proposition is true and that a proposition is false.

    C.S. Peirce, 'Collected Papers', CP 5.448
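
    Peirce's own triadic logic differs in detail, but Kleene's strong three-valued tables give a quick feel for the distinction (a sketch, not Peirce's formalism): admit a third "indeterminate" value and both classical laws go missing for exactly the indeterminate cases.

```python
# Strong Kleene three-valued logic: T (true), F (false), U (indeterminate).
# An illustration of the idea only - not Peirce's own triadic system.
T, U, F = 1.0, 0.5, 0.0

def NOT(p): return 1.0 - p
def AND(p, q): return min(p, q)
def OR(p, q): return max(p, q)

# For determinate propositions the classical laws hold:
for p in (T, F):
    assert OR(p, NOT(p)) == T    # excluded middle
    assert AND(p, NOT(p)) == F   # non-contradiction

# For an indeterminate proposition both laws fail to apply:
assert OR(U, NOT(U)) == U   # "p or not-p" is not true
assert AND(U, NOT(U)) == U  # "p and not-p" is not false
```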
  • Is space/vacuum a substance?
    If you can show me how knowing the truth is possible when the PNC is violated, then I might give up that necessary presumption.Metaphysician Undercover

    The PNC is not about "truth". It is about "validity". Or indeed, merely about "computability".

    So let's take the deflationary tack here.

    The PNC could apply to a world of definite particulars - a mechanical realm of being. It just is the case (it is the ontological truth) that identity has this binary quality of having to be one thing and not its "other". If that is how we find reality, the PNC is a good metaphysical model. We might build in that strong presumption as a given.

    But it is quite reasonable to question the claim that the world is in fact divided quite so crisply. Indeed, that is the very thing that quantum indeterminism has challenged in the most fundamental way. If two particles are entangled, there is no fact of the matter as to their individual identity. They happily embody contradictory identities - until the further thing of a wavefunction collapse. A thermal measurement.

    So right there is a canonical modern example of how reality is vague (a quantum potential in which identity is accepting of contradictions). But then - emergently - it can also evolve a binary crispness. The PNC now applies. A definite measurement one way, and not the "other", can be entered in the ledger of history.
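
    The entanglement point can be checked with a few lines of linear algebra (a standard textbook calculation, sketched here with NumPy): a Bell pair is a perfectly definite joint state, yet each particle on its own is maximally mixed - no fact of the matter about its individual state.

```python
import numpy as np

# Bell state |Phi+> = (|00> + |11>) / sqrt(2): a definite *joint* state.
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho = np.outer(phi, phi)  # density matrix of the pair

# Partial trace over the second particle gives the first particle's state.
rho_a = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

# The pair as a whole is pure (purity Tr(rho^2) = 1)...
assert np.isclose(np.trace(rho @ rho), 1.0)

# ...but each particle alone is maximally mixed - the identity matrix over 2,
# carrying no definite individual identity before measurement.
assert np.allclose(rho_a, np.eye(2) / 2)
```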

    So a logic of vagueness, in which the PNC becomes an emergent feature of classical reality, has direct empirical proof now. Peirce was right in his willingness to question some ancient metaphysical "truths".

    The PNC remains a useful tool because we also know that wavefunctions do get collapsed. Well, that is if you can move past the no-collapse quantum interpretations and accept a thermal decoherence model of time itself. :wink:

    But anyway, wavefunctions do collapse and so the PNC does apply from a classical perspective. Yet we then need a logic of vagueness to account for how the PNC could emerge from a ground of being, a ground of quantum indeterminism, where it patently doesn't.
  • Is space/vacuum a substance?
    Your approach is, who cares if this naturalist metaphysics leads us into contradiction...Metaphysician Undercover

    It only contradicts some assumptions you take as axiomatic to your theism. The PNC is a case in point. A belief in some Newtonian and non-thermal model of time being another.

    It is good that your theism is constrained by the attempt at a self-justifying metaphysics - a rational logical structure. And I agree that conventional scientific metaphysics - being overly reductionist - fails palpably to have this kind of causal closure.

    But that is why pragmatism - particularly in the Peircean sense - is the royal route to "truth". It combines that causal closure of the formal metaphysical model with the empirical checks that are needed to be able to say the resulting metaphysical model indeed predicts the world as we can observe or measure it.

    Your reaction to Peirce's relaxation of the PNC is telling. He makes the PNC an emergent limit whereas you cling to it as a brute fact. You need it as an input to construct your system. Peirce showed it to be a natural outcome of any kind of systematic development of a "rational cosmos".

    Sure, you can have an argument against that. But it has to be better than: "I don't like the challenge it creates for my necessary presumptions".
  • "Turtles all the way down" in physics
    though i'm unclear about the U1 part.Olivier5

    U(1) is just the simplest possible symmetry group. It is the symmetry of a rotating circle. And nothing is more symmetric than a circular object.

    If you have a sphere, it always looks the same no matter how you rotate it. That is why the Greek Atomists imagined atoms as little spheres - the simplest material form.

    A triangle (or tetrahedron) would have a more complex symmetry. The smallest turn of a triangle makes a visible difference. You can see right away something has moved. It is only after a 120 degree rotation that the triangle maps back on to itself as if nothing in fact changed.

    Compare that to spinning a circular disc - one that has no marks to give the game away. Nothing visible ever changes no matter how furiously it is turned. The disc could be standing still for all you can tell.
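
    The contrast is easy to verify numerically (a toy sketch; the tolerance and angles are illustrative): the triangle betrays every rotation except multiples of 120 degrees, while an unmarked circle betrays none.

```python
import math

def rotate(points, angle):
    """Rotate a list of 2-D points about the origin."""
    c, s = math.cos(angle), math.sin(angle)
    return [(c * x - s * y, s * x + c * y) for x, y in points]

def same_shape(ps, qs, tol=1e-9):
    """True if every point of ps coincides with some point of qs."""
    return all(any(math.hypot(px - qx, py - qy) < tol for qx, qy in qs)
               for px, py in ps)

# Equilateral triangle: three vertices spaced 120 degrees on the unit circle.
triangle = [(math.cos(2 * math.pi * k / 3), math.sin(2 * math.pi * k / 3))
            for k in range(3)]

# A small turn of the triangle is visible; a 120-degree turn is not.
assert not same_shape(rotate(triangle, 0.1), triangle)
assert same_shape(rotate(triangle, 2 * math.pi / 3), triangle)

# A circle has no such preferred angles: any rotation keeps every point
# at unit distance from the centre, so an unmarked disc reveals nothing.
point = (math.cos(0.3), math.sin(0.3))
for angle in (0.01, 0.5, 1.7):
    (x, y), = rotate([point], angle)
    assert abs(math.hypot(x, y) - 1.0) < 1e-12
```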

    Photons - as avatars of electromagnetism - have this simplest rotational symmetry. A sine wave is the trace carved out by a mark on the circumference of a disc that rotates as it travels through one length. So a photon - understood as a ray with a frequency - is just the simplest way to break the simplest state of symmetry.

    It makes use of the two irreducible freedoms of nature under Noether's Theorem - rotational and translational symmetry. A photon rotates once and rolls one length - as the minimal definition of its existence.

    At the Planck scale, such an electromagnetic event - a U(1)-expressing rotation + roll that marks a single wave-like beat of "hot action", something energetic happening - clearly happens in an unusual place.

    Being confined to a spin and roll limited to a single Planck distance, it would also be the shortest, hence hottest, frequency event to ever exist. And energy being equivalent to mass, it would also be the most gravitationally massive possible material event - so would curve the spacetime around it to a black hole extreme.
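
    The self-consistency being claimed can at least be checked dimensionally (rough CODATA values; the exact factor of 2 is just the standard Schwarzschild relation, not a statement about quantum gravity): a quantum confined to the Planck length has the Planck mass, and the Schwarzschild radius of that mass is again of Planck-length order.

```python
import math

# Rough CODATA values in SI units - only the scaling point matters here.
h_bar = 1.054571817e-34  # J s
c = 2.99792458e8         # m/s
G = 6.67430e-11          # m^3 kg^-1 s^-2

# Planck length and Planck mass from dimensional analysis.
l_p = math.sqrt(h_bar * G / c ** 3)
m_p = math.sqrt(h_bar * c / G)

# A quantum confined to l_p has a Compton wavelength of exactly l_p,
# i.e. a mass equal to the Planck mass...
assert abs((h_bar / (m_p * c)) / l_p - 1.0) < 1e-9

# ...and the Schwarzschild radius of that mass is of the same order as l_p:
# the event is as hot and massive as its own pocket of spacetime can contain.
r_s = 2 * G * m_p / c ** 2
assert abs(r_s / l_p - 2.0) < 1e-9
```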

    So it all becomes self-defining. To break the simplest symmetry takes the simplest asymmetry - the combination of a spin and a roll that creates the mark, the trace, that is a spacetime-filling and energetic event. A single hot beat. The heat of that event defines the size of that spacetime (due to gravitational closure). And the size of that available spacetime in turn defines the heat that that event must have (due to the severest shortening of its frequency).

    Ah, good point: there is a mathematical limit in terms of mass to infinite splitting, a limit that is equal to 0 mass, just as there is a solution in the form of a mathematical limit in Zeno's paradox.Olivier5

    So what I have just described is different in that instead the zero is about the zero sum game by which we can get "something from nothing" due to a symmetry-breaking that is based on a reciprocal balancing act.

    A photon expresses the world of the circle. We can't tell if a circle is rotating. So that means that if reality is constrained by a generalised demand for maximum symmetry, then the ultimate best solution to that demand is to arrive at the shape of a circle. It is the most stable shape in that it must always look the same.

    A circle has translational symmetry as well because - without the help of outside reference marks - we can't tell if it is rolling along. This is standard relativity. Motion is only detectable if the symmetry of the reference frame is broken in some way.

    And as I say, putting a dot on the edge of a circle free to rotate + roll then counts as the most minimal mark, the simplest symmetry-breaking. The result is the "energetic event" of a spacetime frame that now contains a single sine wave.

    We thus have a toy world described in reciprocal limits. There is both near perfect symmetry (U(1)) and near perfect symmetry-breaking - the dot on the circumference that reveals the still unconstrained "Noether" freedoms of the ability to spin, the ability to roll. The ability to thus mark an empty space constrained to perfect circularity with a sine wave event that the constraints can't in fact eliminate. And what can't be eliminated, must happen.

    Real physics is more complex as the real Big Bang could not access the great simplicity of a U(1) world so directly. It actually had to constrain all the other possible symmetries - the many higher or more complex symmetries of group theory - and remove them from the fray first.

    That created the shower of other particles with more complex rotational actions – the particles of SU(3) and SU(2) symmetries, according to the Standard Model. And even to get to U(1) perfection involves the Higgs kluge that cracks SU(2).

    But the basic picture of what reality is seeking to achieve is to arrive at its greatest state of simplicity - as defined by the complementary limits of a perfect U(1) symmetry broken by a matchingly-perfect least form of asymmetry. The slightest blemish on the Cosmic cheek. :grin:

    The Heat Death tells us that this perfection is where we will arrive in the future.

    Given the discovery of Dark Energy (a new unexplained ingredient in the story), we at least know that the Universe is coasting towards a destiny where spacetime will be best described by a reciprocal U(1) structure of holographic event horizons and their "as cold as possible" black body radiation. That is, a de Sitter cosmology.

    Spacetime will be devoid of matter. Black holes will have gobbled up all remaining gravitating matter and spat it out as electromagnetic radiation. So spacetime will be empty with an average temperature of zero kelvin. But it will also be filled with the even radiance of a cosmic bath of photons produced by the quantum holographic mechanism - the Unruh effect.

    These would be photons that - in effect - span the width of the visible universe in just a single wavelength. Their wavelength would be measured in multi-billions of lightyears. A single rotation + roll that spans the gap that the speed of light can traverse.
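
    A back-of-envelope version of that claim (the ~16 billion light year horizon radius is purely illustrative): one wavelength per horizon width means one beat every ~16 billion years and an absurdly tiny photon energy.

```python
# Back-of-envelope for a photon whose single wavelength spans the cosmic
# event horizon. The ~16 billion light year radius here is illustrative.
c = 2.99792458e8          # speed of light, m/s
h = 6.62607015e-34        # Planck constant, J s
light_year = 9.4607e15    # metres per light year

wavelength = 16e9 * light_year               # ~1.5e26 m
frequency = c / wavelength                   # Hz
energy = h * frequency                       # J
period_years = (1 / frequency) / 3.156e7     # seconds per year

print(f"frequency ~ {frequency:.1e} Hz")
print(f"one cycle ~ {period_years:.1e} years")
print(f"photon energy ~ {energy:.1e} J")

# As cold as a photon can get in this picture: one beat per ~16e9 years.
assert period_years > 1e10
assert energy < 1e-50
```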

    So spot the connection. The beginning of spacetime - the Big Bang - and the Heat Death are mirror images.

    Both are defined by that single U(1) based rotation + roll deal. Except at the Big Bang, the spacetime extent is the smallest possible, making the energy of the frequency as hot as possible. And at the Heat Death, it all has unwound to arrive at the complementary state of the largest possible spacetime extent and thus the coldest possible photon, the lowest possible energy wavelength.

    Simplicity is always the goal. But because complexity has to be constrained first - all the other available higher symmetry states have to be got rid of along the way - it is only by the end of time that U(1) perfection (in terms of a simple circle and its irreducible symmetry breakings) is achieved.

    That is why it isn't turtles all the way down. Existence is a push towards the limiting extreme that is simplicity. And that push is self-terminating in that the constraints (an insistence on arriving at maximal symmetry) contain within them their own terminus - those irreducible symmetry breakings.

    Every kind of difference can be eliminated by U(1) circular symmetry, except a rotation and a roll. So already, the necessary blemish is built in to break that symmetry (in the simplest way).

    Reality can go no further as there is no further splintering of the system arrived at. The constraining towards a symmetrical nothingness gets hung up on an irreducible grain of being. Things can go that far and no further - leaving reality as the coldest-possible fizzle of holographic event horizon radiation. Photons with the physical wavelength of the visible universe - that sea I speak of.

    Charlie Lineweaver at ANU has written a bunch of decent papers about all this.

    And as a caveat, Dark Energy remains a fly in the ointment. It is necessary to explain why spacetime expansion will get truncated by the de Sitter event horizon mechanism. But we need some further bit of machinery - another Higgs-like field or irreducible entanglement - to fold that into the final theory of everything.

    As someone once said, explanations ought to be as simple as possible. But not too simple.

    U(1) is the simplest possible story. But getting there was not a simple process as all other symmetries had the possibility of being the case. And the way they would then interact and entangle with each other becomes part of the story of where things actually got hung up in practice.

    The world of quark/SU(3) symmetry and lepton/SU(2) symmetry, plus the Higgs mechanism, is how we are all still hung up at that more complex level of things at the moment. The Universe is still breaking its way down through all those entanglements along the ultimate path.

    The more complex symmetries have more complex spin states - chiral spin. And they thus have their own equivalent irreducible rotational symmetries. Higher level Noether freedoms that can't be eliminated directly.

    By rights, in a symmetric world, matter ought to be annihilated by anti-matter leaving only radiation. But these complex spins produce uneven outcomes. So some matter survives. Quarks can then protect themselves by forming triplet structures like protons and neutrons.

    And so that complexity could last forever. Proton and neutron crud messing up the empty perfection of a cooling and expanding void. A flood of ghostly neutrinos as well, messing up reality with their pointless SU(2) weak force interactions.

    So long as black holes perform as advertised - hoovering up the crud and evaporating it into photons - the universe can get there in the end. SU(3) and SU(2) will be rendered relic memories. Maybe surprisingly, the Cosmos will arrive at the mathematically ultimate state of simplicity in terms of its symmetry - and the symmetry-breaking events, the holographic U(1) photons, needed to reveal that that symmetry in fact "exists".

    The edge of the disc has to be marked to reveal the world within which it can rotate + roll. The blemish is needed to complete the deal that conjures "something from nothing".
  • Is space/vacuum a substance?
    The scientific community hijacks and restricts the use of "time" to conform to their empirical observations, i.e. they define time in relation to the material world.Metaphysician Undercover

    Yes, we can consider this a contest between pragmatic naturalism and dogmatic theism if you like. One holds consequences here in the real world. The other not so much.
  • Mathematics as a way to verify metaphysical ideas?
    And there are logically consistent but mutually inconsistent theories out there.fishfry

    But Euclidean geometry was merely shown to be a special limit case of non-Euclidean geometry.

    The two were consistent. The advance was to find a parameter - a constraint on parallel lines - that could be relaxed to arrive at an even more symmetric or generalised mathematical structure.

    And in that, the maths rather exactly mirrored the physics.

    The everyday world looks flat and Euclidean - at the inertial scale we are likely to be measuring it. That only proves to be the special case as we become able to step back and factor in gravitational and quantum "curvature". The flatness becomes relative to the dynamics of spacetime and energy density.

    So the abstraction in terms of mathematics - the relaxation of some critical constraint on the model - mirrors the actual thermal evolution of the hot Big Bang. Go "further back" towards the Planck scale and relativistic and quantum effects intrude. The geometry loses its cold and expanded Euclidean flatness. It becomes the chaos of quantum gravity - equal parts black hole strength curvature and quantum strength uncertainty.
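
    A toy numeric version of "flatness as the special limit case" (unit sphere; the triangle areas are illustrative), using Girard's theorem for the angle sum of a spherical triangle:

```python
import math

def angle_sum(area, radius):
    """Angle sum (radians) of a geodesic triangle on a sphere, via Girard's
    theorem: the excess over pi equals area / radius^2."""
    return math.pi + area / radius ** 2

R = 1.0

# A big triangle - one octant of the sphere - has three right angles:
# an angle sum of 270 degrees, flagrantly non-Euclidean.
octant_area = 4 * math.pi * R ** 2 / 8
assert abs(angle_sum(octant_area, R) - 1.5 * math.pi) < 1e-12

# Shrink the triangle and the excess over 180 degrees dies away: at small
# enough scales the curved geometry is indistinguishable from Euclid's.
excesses = [angle_sum(a, R) - math.pi for a in (1.0, 1e-3, 1e-6)]
assert excesses[0] > excesses[1] > excesses[2] > 0
```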

    Mathematics can not say what the truth of the universe is.fishfry

    What the steps from Newton to Einstein and so on tell us is that the Ancient Greeks were already on the right track with Euclid. His geometric model of space was the correct starting point. It was simply over-specified in not having time and energy explicitly included in the dynamics.

    Once the symmetries were expanded to include these - once space could be bendy, and then bend as a precise reciprocal of its energy density - we had stumbled into the model that could describe a GR world.

    Quantum field theory adds another dimension of plasticity to the frozen Euclidean realm. Instead of being all bendy, now the spacetime manifold is all "grainy" - composed of fluctuations that need to be collapsed.

    QG is already clear as the final step that would unite GR and QFT in a still higher state of geometric symmetry. Bendy space would be unified with grainy space as two sides of the one dynamical coin.

    The great metaphysical project has been working out really well. Ancient Greeks got the ball rolling. Modern mathematical science has now got the hang of its deep logic.

    All aboard for the ride. :up:
  • Is space/vacuum a substance?
    Again, you are ignoring the contradiction involved in "time emerges". Time must already be passing for anything to emerge, so time is necessarily prior to emergence.Metaphysician Undercover

    If time is what is emergent, then it is necessary that nothing be happening before it gets started. The idea of "before" becomes the incoherent claim here.

    You presume time to be eternal. Thus there is always a "before". Hence time is proven to be eternal. Your argument is a simple tautology.

    A thermal model of time is about the emergence of a global asymmetry - an arrow of time pointed from the now towards the after - the present towards the future. So the past, the before, is a backwards projection. It is imagining the arrow reversed. And reversed to negative infinity.

    Yet the reality - according to science - is that time travel (in a backward direction) is unphysical. And the Big Bang was an origin point for a thermal arrow of time.

    Yes, we can still ask where the heat to drive that great spatial expansion and thus create an arrow of time, a gradient of change, could have come from. What was "before" that?

    But this is no longer a conventional notion of a temporal "before" anymore than it is a conventional notion of "what could have been hotter" than the Planck heat, or "shorter" than the Planck distance, or "slower" than the speed of light.

    Every such conventional notion fuses at the Planck scale - the scale of physical unification. The asymmetries are turned back into a single collective symmetry. There is no longer a before, a shorter, a hotter, a slower. All such definite coordinates are lost in the symmetry of a logical vagueness. That to which the principle of non-contradiction (PNC) now fails to apply.

    Before the PNC applied, there is a time when it didn't. That is the "before" here. :wink:

    You are not properly distinguishing the active from the passive.Metaphysician Undercover

    Another of the many co-ordinates that are erased if you wind back from their current state of divided asymmetry to recover their initial perfect symmetry. At which point the PNC fails to apply. The logic you want to argue with suddenly runs out of the road it felt it was travelling down.

    In the beginning, the active and the passive (along with the stable and the plastic, the necessary and the accidental, the global and the local, etc, etc) were a symmetrical unity. Both halves of the dichotomy had the same scale and so were indistinguishable as being different. The PNC might feel as though it ought to apply, but - being indistinguishable - it can't.

    It is only as they grow apart that a proper asymmetric distinction can develop. The passive part of nature is that which is less active. And vice versa. Taken to the limit, you get the passive part of nature as that with the least possible activity. Or the reciprocal relation where passive = 1/active. And vice versa. The active = 1/passive.

    The immaterial is separate and distinct from the material in the very same way that the future is separate and distinct from the past.Metaphysician Undercover

    That can only be meant ... as a religious and unphysical belief.

    It is a claim of a theistic model. And a naturalistic model has become the one that has produced all the useful physics here.

    What we understand, in a mysticism based metaphysics, is that the entire material universe is created anew with each passing moment of time. This is a necessary conclusion derived from the nature of freewill. The freewill has the power to interfere with the continuity of material existence at any moment in timeMetaphysician Undercover

    Epicycles to explain away a metaphysics that is demonstrably unphysical. It feels like an explanation being expanded but it is a confusion being compounded.
  • Mathematics as a way to verify metaphysical ideas?
    Some might say that String theory is metaphysics, others might disagree.jgill

    This is a good example of the tensions now developing because mathematical research looks to be cutting deeper than empirical research.

    We simply can't recreate the Big Bang to test out our theories. But we can computer simulate the Big Bang - or at least check the self-consistency of a mathematical model.

    So the tensions are about the trivial thing of academic careers. Are your skills obsolete if CERN has no new super-collider, but Stanford has this new supercomputer simulation team? You will be quick to start name-calling - "That's not real science! That's just metaphysics!" - if your own job is on the line.

    Academia is a social game. People have to construct their in-groups by "othering". Calling someone a metaphysician can seem like the worst kind of insult.

    It is not helped by the fact that many then take advantage of the social notion that metaphysics means "unbounded speculation" rather than logically rigorous argumentation. Any crackpot will claim to be a metaphysician as convenient cover.

    But it is all games. No need to worry. The simple fact is that the traditions of metaphysics, maths and science have always been fused at the cutting edge of human inquiry. There are few examples of great scientists who didn't combine the three in productive fashion.
  • Mathematics as a way to verify metaphysical ideas?
    Could it be possible to translate metaphysical concepts into matematical language so that we would be able to prove some theories as valid and refute other?Eremit

    I certainly find that any good metaphysics has a mathematical clarity. So there is a deep connection. But not a simple one.

    In general, good metaphysics provides a putative model of reality in terms of some self-consistent logical argument. So it is mathematical in the sense of having some logic at work. And that logic can be checked out in terms of the mechanics of how it maps the inputs of an argument to its outputs. There is a system of rules. And so an argument can be checked for validity or internal consistency under those rules.
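    The point about checking an argument for validity under a system of rules can be made mechanical. A toy sketch: brute-forcing the truth table of modus ponens, ((p → q) ∧ p) → q, is the simplest case of "valid under the rules" (the helper name `implies` is just an illustration).

    ```python
    from itertools import product

    # Material implication: "a implies b" is false only when a is true and b is false.
    def implies(a, b):
        return (not a) or b

    # Modus ponens is valid iff ((p -> q) and p) -> q holds for every
    # assignment of truth values to p and q.
    valid = all(
        implies(implies(p, q) and p, q)
        for p, q in product([True, False], repeat=2)
    )
    print(valid)   # -> True
    ```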

    So - as with maths - you can prove the validity of the argument. But then if metaphysics wants to be saying something fundamentally true of reality - ontology being its ultimate goal - it has to start engaging with science and empiricism.

    The wrinkle is that maths can now be regarded as itself the science of patterns. It is actual research into the nature of being - if Being itself is constrained by a generalised demand of self-consistency.

    That is why fundamental physics and the maths of symmetry are in such tight alignment. Metaphysically, both treat their "realities" - the worlds they seek to explain - as "systems". And symmetries are the key that unlocks that door.

    So that is an example of how science, maths and metaphysics become joined at the hip once they seek out a particular line of self-consistent logic - a holism that makes maths "unreasonably effective" for the physicist.

    There are lots of domains of maths that have very little metaphysical application. They lack the kind of logical (hence causal) closure that metaphysicians seek in making their ontological models of reality.

    And it is not generally helpful to think it is just a case of translating verbal descriptions into mathematical language. Metaphysics is generally going to be in advance of some mathematical notation. The mathematical concepts have to come first. And when first understood, they may still be hazy. But good metaphysics always has that quality of someone describing an essentially mathematical structure. There is some holistic arrangement of parts that forms a holistic process. And the solidity of that discovered structure - as with a symmetry operation - is something that can then be turned into a standardised mathematical notation.

    Do you think that we could describe metaphysical theories by geometric constructions?Eremit

    I am arguing that all good metaphysics is basically about the holism of structures. So that is geometric in that you would have to "picture" both the relata and the relations.

    But every good geometric idea can be described algebraically (and vice versa). So it is not that geometry is primary. The key again is the presumption that we are trying to discover the hidden structure of reality - its deep skeleton. And the maths that is about the science of patterns, the logic of structures, is naturally going to be treading the same path.

    It will be making concrete the useful mathematical language that can then be employed by mathematical physicists to write their fundamental theories. And the fundamental physicists are today's actual metaphysical community.

    The academics working in philosophy departments can contribute plenty to a history of metaphysical ideas, but not much to the development of new ideas beyond some useful commentary from the sidelines.

    In summary, metaphysics, maths and science are in practice already fused at the cutting edge of human inquiry. Mathematically strong thought has always stood behind the best metaphysics. And science created the empirical connection between such metaphysical models and the reality they might claim to model. A lot of bad metaphysics fell by the wayside as a result. And the mathematical scientists were left as the ones closest to the new action.
  • Is space/vacuum a substance?
    The only way that the thing could come into existence as the thing it is, and not some random other thing, is that it's material existence is preceded by its form.Metaphysician Undercover

    I agree with that argument too. Which is why I say the matter of origination can only be solved by adding a logic of vagueness to our metaphysical tool kit.

    Both formal and material cause have to arise in the same moment. They in fact must emerge as the two aspects of a shared symmetry breaking. And time (as spacetime) also emerges.

    Big Bang cosmology describes that. At the Planck scale, matter and spacetime are clearly dual. The smallest coherent distance also marks the greatest energy density: being so confined, it can contain only a single beat of the shortest possible wavelength. And that is the hottest thing possible. A material event of the highest energy.

    So the duality of matter and spacetime is written into the heart of physics by the reciprocal mathematics of the Planck scale. Material cause and formal cause are two halves of the same symmetry. All that happens is that the Cosmos expands and cools from there.
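    That reciprocal mathematics can be sanity-checked numerically. A minimal sketch, assuming standard SI values for the constants: the energy of a single wave beat confined to the Planck length works out to exactly the Planck energy, because both are built from the same combination of ħ, G and c.

    ```python
    import math

    # Standard SI constants (assumed CODATA-style values)
    hbar = 1.054571817e-34   # reduced Planck constant, J.s
    G    = 6.67430e-11       # gravitational constant, m^3 kg^-1 s^-2
    c    = 2.99792458e8      # speed of light, m/s
    kB   = 1.380649e-23      # Boltzmann constant, J/K

    # Planck length: the "smallest coherent distance"
    l_P = math.sqrt(hbar * G / c**3)
    # Energy of a wave beat confined to that distance: E = hbar*c / lambda
    E_confined = hbar * c / l_P
    # Planck energy, defined directly from the constants
    E_P = math.sqrt(hbar * c**5 / G)
    # Planck temperature: "the hottest thing possible"
    T_P = E_P / kB

    print(f"Planck length      : {l_P:.3e} m")
    print(f"Planck energy      : {E_P:.3e} J")
    print(f"Planck temperature : {T_P:.3e} K")
    # The reciprocity: maximal confinement just is maximal energy density
    assert abs(E_confined - E_P) / E_P < 1e-9
    ```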

    There is then no time before this first moment as time is part of the onset of metric expansion and thermal cooling. There is change with an emergently coherent direction.

    The Hartle-Hawking no boundary story is based on that. The Planck scale is a general cutoff as it is the point where energy density and spacetime are indistinguishable. They are a symmetry not yet broken. Vagueness rules until they each establish the mutually reinforcing directions to grow apart from the other.

    Energy density can become energy density by virtue of thinning and cooling. Spacetime can become spacetime by expanding and becoming a frame on energy densities. Crisp difference can become possible as not everything needs to be all the same temperature and all the same size any longer.

    So the key is to stop asking the usual question of what came first. Hylomorphism starts already as a package deal where both material and formal cause exist, doing their job, as the complementary aspects of a holistic transition from a vague everythingness to a crisp somethingness.

    Without accepting this principle, that form is prior in time to material existence (and this is the principle which necessitates the proposal of divinity), you cannot claim that your metaphysics is consistent with Aristotle's.Metaphysician Undercover

    It is an over-interpretation to claim Aristotle was consistent himself. What I say is that he still broke the story apart into the many elements that are still useful today.

    And the conceptual tool he really lacked was a notion of vagueness (as opposed to crispness). This leads to problems where one half of a dichotomy must always precede the other half. And of course, that never can be the case if each half is effectively the cause of the other in being its Hegelian “other”.
  • Definitions
    Are you dismissing what I say without argument?unenlightened

    Yep. Guess so.
  • Definitions
    Where was your question?

    Were you wanting a definition of force as a term of art in modern physics as opposed to the many other ways of using the same word in colloquial language?

    Seems like you just wanted to rant and blow off steam.
  • Definitions
    To a technician, every word is a technical term, but to a philosopher, every word it a gateway to a universe.unenlightened

    Hmm. Is this philosophy as practiced by academia or the kind of “philosophy” that believes in crystals and scented candles?

    Seriously. Show me the philosopher who treats every word as a gateway to a universe.
  • Is space/vacuum a substance?


    The term 'inertia' is often used to describe a kind of irrational resistance to change in individuals or institutions.

    http://www.cosmosandhistory.org/index.php/journal/article/viewFile/464/778

    The social definition of inertia; demonstrated here as “meaning is use”. :lol:
  • Is space/vacuum a substance?
    Next demonstrate your comprehension of today’s set word by using it successfully in a whole sentence. Tomorrow, we can try using it in a paragraph. In a blue moon, you can employ it as the subject of a reply adequate to the discussion. :clap:
  • Is space/vacuum a substance?
    Then he demonstrated with the cosmological argument, that it is impossible for any potential to be eternal.Metaphysician Undercover

    Do you have a source where it is clear that is the argument?

    The Stanford article I cited on the prime matter issue fits with my view that Aristotle never fully worked it out, even if he left us with most of the essential tools.

    In other passages too Aristotle seems to leave the question of whether or not there is prime matter deliberately open.

    The issue with respect to "matter" is that matter is itself just an idea. This might be hard for you to grasp, because "matter" is exactly what we assign to the physical world as what is independent from us, and therefore not an idea. But as "matter", is simply how we represent the physical world. It is our idea of temporal continuity, what persists unchanged in time, represented in science as inertia, mass, energy, etc.. In reality, what exists independent from us is changing forms, and we represent the aspects which are consistent, constant, as "matter", and this is the basis of the temporal continuity which is called "Being",Metaphysician Undercover

    I agree with the first part but not the second. In my semiotic view, time as a continuous thread of Being is also emergent.

    And physics supports this. The Cosmos has a thermal history that locks in its future direction. It is a space of possibilities that becomes increasingly constrained as it expands and hence cools.

    So yes, we apply psychological models that see a world divided into matter and void (the spacetime continuum). With Newtonian modelling, this becomes a system of laws and measurements. We have an Aristotelean division into material and formal causes.

    But physics has kept marching on until matter and void, space and time, etc, are all unified as aspects of a universal substance - a theory of quantum gravity, if we can pull that off.

    And spacetime would have to be emergent in that scheme, just as would mattergy - relativistic mass.

    When he supposedly refuted idealism, by denying that potential could be eternal, he also refuted materialism, because materialism is actually just a twisted form of idealism, substantiated by the concept of "matter".Metaphysician Undercover

    Is this your interpretation? I don’t think he had the mission of refuting idealism as even Plato is not really an idealist - especially by the Timaeus.

    Instead I would say his project was resolving the issue of hylomorphic substance - how substance could be the co-production of formal and material causality. Or as systems science would put it, bottom-up construction in interaction with top-down constraints.

    you'll find a similar concept in the Hartle-Hawking no-boundary proposal. This eternal infinite regress is logically repugnant for a number of reasons, best demonstrated by the absurdities produced by the principle of plenitude which dictates that in an infinite amount of time, all possibilities have been actualized.Metaphysician Undercover

    Peirce’s logic of vagueness resolves this initial conditions issue as I have outlined before. I realise you don’t agree.

    Here we have divergent courses of study. You would say that we ought to put aside this notion "God did it", stick with the demonstrably deficient and faulty scientific conceptions of temporal continuity, and ignore the vast wealth of accumulated theological knowledge of this subject. Thus you adhere to that prejudice which assumes a "naturalistic resolution" is possible, regardless of the mounting evidence against this possibility. On the other hand, we can take Aristotle's lead and proceed toward understanding the teleological nature of the universe, discovering the completely different understanding of temporal continuity, Being, which is explored in Neo-Platonism and early Christian theology.Metaphysician Undercover

    Yes. That is why I wanted to check how much scholasticism you are projecting onto what Aristotle actually says (as much as we can rely on the curated version passed down by history).
  • Is space/vacuum a substance?
    , mass is better understood (more useful...)Banno

    Lol. Why are you trying so hard to avoid talking about inertia? What is that when it is at home in your naive realism paradise?
  • Qualia and Quantum Mechanics, the Reality Possibly
    The biosemiosis perspective is similar to mine because it is based on the same empirical evidence. The slight difference is that I view this underlying substrate as not unformed, homogeneous chaos, but a substance with complex patterns of supradimensional flux we have not yet even approached modelingEnrique

    That seems more like a huge difference. :grin:

    but that amounts to trillions and trillions of pockets of quantum causality in an Earth lifeform, which make nonlocality the predominant ingredient in many facets of the organic world, a reality we have not yet deeply tapped into scientifically and technologically.Enrique

    I get the impression from a small amount I've read about semiconductors that the mechanism of "wave" propagation might amount to quantum tunneling/entanglement.Enrique

    Do you see how one contradicts the other? We have tapped into quantum tunnelling/entanglement in a big way with our technology. So it has certainly been delved into deeply.
  • Is space/vacuum a substance?
    A more fruitful approach might be to look at mass rather than substance.Banno

    Err no. Mass is not a simple matter of weighing a kilo of stuff once you get beyond schoolboy physics. It is defined in terms of inertia - resistance to acceleration. You need to get out your ruler and stopwatch. And that is while you are still working within a Newtonian metaphysics where reference frames are inertial.

    It is bad to spread this kind of silliness just because of some ancient Scientistic prejudice against “metaphysics”. It betrays a lack of familiarity with both physics and metaphysics as academic disciplines.
  • Definitions
    “What a thing means is simply what habits it involves.” (CP 5.400) :up:
  • Definitions
    First, reductionist is a technical term. Second, you are well recognised as the name caller and scoffer in chief around these parts. :hearts:
  • Definitions
    How many times do we have to scratch the same itch. :smile:

    The process of “definition” is circular. For a reductionist like Banno, that is a bad thing. But for a holist, the circularity is actually hierarchical or cybernetic. The iterations don’t lead you around in a meaningless chase. They should zero in on a functional state of meaning.

    So this is pragmatism 101. Meaning is use. Our understanding of a word (as a meaningful sign) is uncertain.

    I say to you “frepp”. Whatever could I mean?

    We would discover that by the extent to which it pragmatically constrains my behaviour.

    So we could never completely eliminate your uncertainty about all the ways frepp could be defined, all its possible connotations. But we could certainly constrain that uncertainty to a degree that is reasonable and pragmatic.

    As a start, you might ask “animal, vegetable or mineral”. You would work your way down from the most general constraints towards the sharpest distinctions.

    So “frepp” is inherently vague - capable of meaning anything as all signs are. And we can corral its meaning by the binary exercise of asking “what is frepp?” by virtue of its logical corollary - “what then is not-frepp”.

    We seek a definition in terms of the differences that make a difference. And we are satisfied our interpretation is sufficiently constrained when the differences no longer make any practical difference.
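    The narrowing game described above is essentially binary search over a space of possible meanings. A toy sketch (all the candidate "meanings" and predicates are made up for illustration): each well-chosen yes/no question halves the remaining uncertainty, so about log₂(N) questions suffice to corral a definition.

    ```python
    import math

    # Hypothetical candidate meanings for "frepp" - pure illustration.
    candidates = {"cat", "eel", "sparrow", "oak", "fern", "moss", "quartz", "granite"}

    def constrain(candidates, predicate):
        """Keep only the candidates consistent with a yes answer."""
        return {c for c in candidates if predicate(c)}

    animals = {"cat", "eel", "sparrow"}

    # "Animal, vegetable or mineral?" - the most general constraint first
    step1 = constrain(candidates, lambda c: c in animals)
    # Then sharper distinctions - "does it have fur?"
    step2 = constrain(step1, lambda c: c not in {"eel", "sparrow"})

    print(sorted(step2))                           # -> ['cat']
    # Information-theoretic floor: log2(N) good binary questions
    print(math.ceil(math.log2(len(candidates))))   # -> 3
    ```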

    It is all bog standard semiotics. No mystery even if Banno wants to recycle it as some great metaphysical quandary for the nth time.
  • Qualia and Quantum Mechanics, the Reality Possibly
    Molecular machinery internal to an axon of course must differ from the properties of for instance a copper wire, but some kind of transport chain including tunneling and entanglement is probably involved, similar to photosynthesis, not solely the diffusion of ions.Enrique

    I’m unclear as to what you mean. But tunneling and entanglement don’t seem relevant even if axons were understood as copper wires with a flow of electrons.

    Classical wave mechanics gives the conceptual model for how it works. The wire is like a tube stuffed with charged particles. Give it a shove at one end and the disturbance will propagate. Each electron - or ion - will move a bit, like happens with waves in water or air, and then that collective motion itself becomes a wave of change travelling elastically at speed.

    In a good conductor like copper, the electrons themselves only creep along at a drift speed of well under a millimetre per second, while the resulting wave or pulse travels at a large fraction of the speed of light. This is all explained in mechanical terms rather than by invoking any quantum properties (and so any putative qualia weirdness).
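    To put a number on how slowly the charges themselves creep, the textbook drift-velocity relation v = I/(nqA) can be applied to illustrative values for a 1 mm² copper wire carrying 1 A (the figures here are standard textbook assumptions, not measurements).

    ```python
    # How fast do the electrons themselves move in a copper wire?
    # Illustrative textbook values - assumptions, not measurements.
    n = 8.5e28        # free-electron density of copper, per m^3
    q = 1.602e-19     # elementary charge, C
    A = 1.0e-6        # cross-section of a 1 mm^2 wire, m^2
    I = 1.0           # current, A

    # Drift velocity: current divided by (carriers per volume * charge * area)
    v_drift = I / (n * q * A)      # metres per second
    print(f"{v_drift * 1000:.3f} mm/s")   # -> roughly 0.07 mm/s
    ```

    The pulse itself, by contrast, propagates as a collective disturbance at a large fraction of c - the contrast between the two speeds is the whole point of the wave picture.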

    In axons, you of course have an even more clearly classical mechanical story in that the charged ions are sluggish and not in such a conductive medium. And to create any kind of speedy travelling wave as a collective phenomenon requires further levels of actual molecular machinery.

    The structure of the wire is critical. It is like a series of switches - or a chain of rat traps that trigger each other. Nothing quantum about the triggering or conduction. And speed is created by spacing the gaps between the sites where membrane depolarisation is happening. This is saltatory conduction. Myelin insulation spaces out the nodes and so - which may be the point you were making - the signal is carried a distance by an elastic ripple of conductive disturbance in a “tube” packed with charged ions.

    So the whole story is complex. However that is because of all the extra biological engineering that constructs a classical machinery for propagating a signal. It is not about amplifying some quantum signal. It is about constructing the possibility of a mechanical signal out of materials whose quantum uncertainties have been suitably constrained.

    The structure of computers is based on models of information processing, so in that case it is an apt term, but analogy to brains could be flawed. I'd be interested to get your definition of information in the context of biosemiosis.Enrique

    Yes, I agree that simple notions of information are flawed. The difference with biosemiosis is that instead of assuming that the substrate of nature is this mechanically definite stuff - stable substantial atoms of matter - the ground of being is instead a fundamental uncertainty or chaos. A quantum potential. So machinery is something that has to be built on wobbly foundations. Machinery in fact exists only if it can constrain or stabilise its own foundations.

    So that is the difference. The information processing is all about imposing a classical order on an underlying unruly chaos. A computer of course is imagined as a device with no entropic connection to reality. That is why it is free to compute abstract patterns. It doesn’t even know it needs to be plugged into a socket to work.

    But life and mind are computational patterns that are plugged directly into the job of stabilising the world that is their entropic power supply. All the information processing is tied to that basic purpose - maximising an entropy flow. The machinery exists to regulate material instability and create an organically structured process of growth.

    That perspective is why biosemiosis can both accept the quantum basis of everything - uncertainty is the fundamental ground - but then build in the expectation that the machinery of life and mind exists as mechanism to stabilise this quantum ground.

    Panpsychism on the other hand wants to treat the quantum realm as a definite substance. It sees it as a weird nonlocal fluid - a spread out coherent field with substantial properties, including qualia. And the machinery of life and mind would somehow have to be doing the job of amplifying the weak or dilute signal contained in the field effect.

    So biosemiosis is about the top-down regulation or constraint of uncertainty as the paradigm. And the new biophysics gives the scientific story of how molecular machinery manages the delicate quasi-classical interface where the thermal decoherence takes place.

    Panpsychism is about the supposed amplification of a weak signal. And in its quantum version, it relies on early versions of quantum theory where nonlocal coherence seemed the new loophole in nature. But now thermal decoherence tells us that, in practice, everywhere that loophole is closed. The universe itself prevents any quantum instability from running out of hand.

    Biosemiosis has been confirmed by science. Its prediction that the quantum is regulated is what we find.

    Quantum consciousness must predict the opposite. More and more unlikely ways must be found to explain how quantum coherent states escape their own inherent instability. Biology must somehow channel things so that thermal decoherence is held at bay. And yet not only is biology generally bad at that on the macro scale, we have found it is actually designed to manage decoherence on the nano scale.

    So yes, the quantum realm is harnessed in many ways. But by an “information processing” system of molecular engineering. Photosynthesis, respiratory chains, sensory receptors and everything else are proteins able to micro-manage quantum instability for purposes encoded in a system of semiosis - the regulatory habits encoded in levels of genetic, neural and (in humans) linguistic machinery.
  • Is space/vacuum a substance?
    This is what Aristotle claims to refute with the "cosmological argument", the idea of "emergent actuality".Metaphysician Undercover

    Surely what he wanted to refute was an efficient first cause to the Cosmos. And this led him to claim that the actuality of Being must therefore be eternal.

    So he got something wrong. We now know our Universe started in a Big Bang. There is a data point to be dealt with.

    But his own theory of substance includes finality - a prime mover. And if you put aside the suggestions that “God did it”, then his contrast of immobile celestial spheres and an actuality that is thus driven in circular motion is not too bad a stab at some kind of naturalistic resolution. It is a fact of quantum theory that spin exists as a fundamental degree of freedom because the classical spacetime universe provides the motionless reference frame that makes it so.

    Noether’s theorem at work.

    I wouldn’t exaggerate the foreshadowing. But Aristotle was heading in the right direction.

    So Peircean firstness, and the metaphysics which follows from it, is not at all consistent with Aristotle's metaphysics, because it adopts the very principle which Aristotle claims to have refuted.Metaphysician Undercover

    An efficient cause is only so if it is efficient. And a fluctuation is defined by being a difference that doesn’t make a difference. Or only the weakest imaginable difference.

    So Peirce was making an argument along the lines of modern symmetry breaking and chaos theory thinking. The old butterfly wing effect. If things are poised and ready to tip, then even the least disturbance, any old action no matter how small or undirected, will cause the system to go in its finality-serving direction.
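    The butterfly-wing point can be made concrete with the usual toy system, the logistic map in its chaotic regime: two trajectories separated by the least imaginable difference end up fully divergent. This is an illustrative sketch of sensitive dependence, not a model of Peirce's argument.

    ```python
    # Sensitive dependence on initial conditions in the logistic map
    # x -> r*x*(1-x). Purely an illustration of the "butterfly wing" point.
    r = 4.0                       # fully chaotic regime
    x, y = 0.2, 0.2 + 1e-9        # a difference too small to "make a difference"

    diffs = []
    for _ in range(40):
        x = r * x * (1 - x)
        y = r * y * (1 - y)
        diffs.append(abs(x - y))

    # The gap roughly doubles per step, so a 1e-9 fluctuation saturates
    # to order-one divergence well within 40 iterations.
    print(f"initial gap 1e-09, largest later gap {max(diffs):.3f}")
    ```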

    You can’t really attribute some grand causal power to that fluctuation as any old fluctuation could have done the same trick. And yet a fluctuation was also a necessary ingredient. The first accident. The first difference to make a difference.

    You really can't just overlook the fact that Aristotle replaced the concept of "prime matter" with "prime mover", as the foundation of his ontologyMetaphysician Undercover

    He rejected a first efficient cause in that particular argument against Atomism and the claim that the Cosmos could be created rather than eternal.

    But here we were talking about prime matter - that is material cause, not efficient cause (even if I agree the two must be related).

    So for example we have this in the Stanford article I cited...

    Another key passage where Aristotle has been thought to commit himself more decisively to prime matter is Metaphysics vii 3. Here we are told:

    By “matter” I mean that which in itself is not called a substance nor a quantity nor anything else by which being is categorized. For it is something of which each of these things is predicated, whose being is different from each of its predicates (for the others are predicated of substance, and substance is predicated of matter). Therefore this last is in itself neither substance nor quantity nor anything else. Nor is it the denials of any of these; for even denials belong to things accidentally. (1029a20–26)

    Although the word “prime” does not occur here, Aristotle is evidently talking about prime matter. A natural way to read this passage is that he is saying there is a wholly indeterminate underlying thing, which he calls “matter”, and it is not a substance. Those who wish to avoid attributing a doctrine of prime matter to Aristotle must offer a different interpretation: that if we were to make the mistake of regarding matter, as opposed to form, as substance, we would be committed (absurdly) to the existence of a wholly indeterminate underlying thing.
  • Is space/vacuum a substance?
    Conversation?

    I asked you a question that you can’t/won’t answer. I supplied the reference to how modern physics would quantify substance.

    Play or go home. Whining is undignified.
  • Is space/vacuum a substance?
    You had your chance to reply. And...

    Banno didn't.Banno
  • Is space/vacuum a substance?
    I wouldn't. Banno did.

    What else did you think I was picking out here?....

    What are your units of measurement now? Are you going to rely on a pair of scales or a stopwatch and ruler?apokrisis
  • Is space/vacuum a substance?
    ...and that leaves me thinking that there is something disingenuous about your repliesBanno

    You mean that I just exposed your own disingenuous game here. Yes, you can buzz off now.