• Banno
    25k
    So this was interesting...

    Which came first, the chicken or the egg? According to a report in Physical Review Letters, a team of physicists from the University of Queensland and Grenoble’s Institut NÉEL have come up with lab evidence concerning the direction of causality that would have left Aristotle speechless. Dr Jacqui Romero from the ARC Centre for Engineered Quantum Systems explains that quantum physics offers a strange alternative to clear cut causes and effects: “The weirdness of quantum mechanics means that events can happen without a set order… This is called ‘indefinite causal order’ and it isn’t something that we can observe in our everyday life.” To observe this effect in the lab, the scientists devised a photonic quantum switch. “By measuring the polarisation of the photons at the output of the quantum switch, we were able to show the order of transformations on the shape of light was not set.” In other words, the transformation events were not taking place in a fixed order. Dr Fabio Costa sees much potential in the findings: “This is just a first proof of principle, but on a larger scale indefinite causal order can have real practical applications, like making computers more efficient or improving communication.”
    Link

    and then I searched for this...
    Quantum mechanics allows events to happen with no definite causal order: this can be verified by measuring a causal witness, in the same way that an entanglement witness verifies entanglement. Here, we realize a photonic quantum switch, where two operations Â and B̂ act in a quantum superposition of their two possible orders. The operations are on the transverse spatial mode of the photons; polarization coherently controls their order. Our implementation ensures that the operations cannot be distinguished by spatial or temporal position—further it allows qudit encoding in the target. We confirm our quantum switch has no definite causal order by constructing a causal witness and measuring its value to be 18 standard deviations beyond the definite-order bound.
    Link
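
    To make the abstract's claim a little more concrete in symbols: a minimal sketch of a quantum-switch state, in my own notation (the paper's actual states and encodings differ in detail). The polarization control coherently selects the order in which Â and B̂ act on the target:

    ```latex
    % A sketch, not the paper's exact state: polarization control |H>/|V>
    % entangled with the order in which the operations hit the target.
    \[
      |\Psi\rangle \,=\, \tfrac{1}{\sqrt{2}}\Bigl(
          |H\rangle_c \otimes \hat{B}\hat{A}\,|\psi\rangle_t
        \;+\; |V\rangle_c \otimes \hat{A}\hat{B}\,|\psi\rangle_t \Bigr)
    \]
    % Measuring the control in the diagonal basis interferes the two orderings;
    % the causal witness certifies that no single definite order fits the data.
    ```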

    Well, why not? Why shouldn't a cause happen after the event?

    Me, too.


    Edit:
    Strictly, the experiment shows that we cannot know if event A caused event B, or B caused A. The meaning of "cause" breaks down here.
  • tim wood
    9.3k
    And all this happens on the scale of the really, really small. It (quantum weirdness, a useful catch-all) could happen with cars and battleships, but it's just really, really unlikely. Now, I'm sure there's a perfectly good commonsensical explanation for all this quantum weirdness, but I just don't happen to know what it is.
  • Banno
    25k
    And all this happens on the scale of the really, really small.tim wood

    But what if we link the quantum switch to Schrodinger's catbox? Then there is no truth to the suggestion that the death of the cat was caused by event A and not event B.
  • Andrew M
    1.6k
    Well, why not? Why shouldn't a cause happen after the event?Banno

    There's no need to give up causality. The paper isn't saying that the cause can happen after the event. It's instead saying that a photon can have an indefinite causal history. The experimenters send a photon through an interferometer where one path has event A followed by event B and the other path has event B followed by event A. The paths are recombined and measurements of the photon match the predictions of quantum mechanics rather than classical mechanics (where the photon travels only one of the paths).
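
    A toy calculation may help here. This is only a sketch, with made-up 2x2 matrices standing in for the two transformations (the real experiment acts on transverse spatial modes); the point is just that when A and B don't commute, the superposed-order output differs from either fixed ordering:

    ```python
    import numpy as np

    # Made-up stand-ins for the two transformations (illustration only).
    A = np.array([[0, 1], [1, 0]])     # a bit flip
    B = np.array([[1, 0], [0, 1j]])    # a phase shift

    psi = np.array([1, 0])             # target input state

    a_then_b = B @ A @ psi             # one arm: A first, then B
    b_then_a = A @ B @ psi             # other arm: B first, then A

    # Recombining the arms and projecting the polarization control onto the
    # diagonal basis leaves the target in a sum over BOTH orderings:
    superposed = (a_then_b + b_then_a) / 2

    print("A then B alone:", a_then_b)     # [0, 1j]
    print("B then A alone:", b_then_a)     # [0, 1]
    print("both orders:   ", superposed)   # [0, 0.5+0.5j]
    ```

    Neither fixed ordering reproduces the recombined amplitude; the measured statistics depend on both arms at once.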
  • Marchesk
    4.6k
    The experimenters send a photon through an interferometer where one path has event A followed by event B and the other path has event B followed by event A. The paths are recombined and measurements of the photon match the predictions of quantum mechanics rather than classical mechanics (where the photon travels only one of the paths).Andrew M

    Is the indefinite history only a product of thinking of a photon as a particle instead of a wave?
  • Andrew M
    1.6k
    Is the indefinite history only a product of thinking of a photon as a particle instead of a wave?Marchesk

    No. The experiment can also be considered at a macro scale using Schrodinger's cat as Banno suggests above. If QM holds true at a macro level (as most physicists expect) then the cat's history would similarly be indefinite.

    It's really a matter of how one views counterfactual definiteness. If a measurement were taken after the first event on each path (whether event A or B) then a definite result (and thus history) would be obtained. It's similar to the double-slit experiment in that respect.
  • Marchesk
    4.6k
    No. The experiment can also be considered at a macro scale using Schrodinger's cat as Banno suggests above.Andrew M

    Wouldn't the cat be doing the equivalent of taking a measurement, creating a definite result? I never understood why the cat could be in a superposition, but the scientists conducting the experiments were not.
  • Andrew M
    1.6k
    Wouldn't the cat be doing the equivalent of taking a measurement, creating a definite result?Marchesk

    Yes.

    I never understood why the cat could be in a superposition, but the scientists conducting the experiments were not.Marchesk

    They can be. It just depends on which joint system is being considered.
  • Andrew M
    1.6k
    Strictly, the experiment shows that we cannot know if event A caused event B, or B caused A. The meaning of "cause" breaks down here.Banno

    I would say it's actually classical physical explanations that break down rather than causality.

    Events A and B are independent transformations which can be ordered differently. So, for example, on one arm of the interferometer the photon is rotated left then down, on the other arm the photon is rotated down then left.

    Classically, it is expected that the measured result would be consistent with the photon having traveled along only one of the interferometer arms and thus there being a definite event ordering. But each result instead indicates a combination of both orderings as if the single photon traveled along both arms simultaneously.

    This is analogous to the double-slit experiment where the detected photons don't build up behind the two slits as one would expect on classical assumptions. They instead build up an interference pattern as if each photon goes through both slits simultaneously.

    This is still understood causally. It's just that both arms of the interferometer necessarily contribute to the result not just one arm as per a classical explanation.
  • bert1
    2k
    Isn't this all solved by panpsychism?
  • Jake
    1.4k
    Perhaps this is helpful...

    I watched a documentary which explained that time runs at different speeds at different locations. Not a theory, proven fact.

    Apparently matter affects time. A large body like a planet creates small but measurable differences in the rate at which time unfolds. So, for instance, time runs at a different rate at sea level than it does at the top of a mountain (farther from the center of the Earth). GPS satellites have to take this factor into account or the data they produce would be way off.
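
    The effect is tiny but easy to estimate. A back-of-the-envelope sketch (weak-field approximation; the height here is illustrative, not the actual GPS correction):

    ```python
    # Weak-field gravitational time dilation: a clock raised by height h runs
    # fast relative to sea level by a fractional amount of roughly g*h/c^2.
    g = 9.81        # m/s^2, surface gravity
    c = 2.998e8     # m/s, speed of light
    h = 3000.0      # m, an illustrative mountain summit above sea level

    fractional = g * h / c**2
    print(f"fractional rate difference: {fractional:.2e}")     # ~3.3e-13
    print(f"gain per day: {fractional * 86400 * 1e9:.0f} ns")  # ~28 ns
    ```

    GPS satellite clocks accumulate a much larger net offset (on the order of tens of microseconds per day, combining gravitational and velocity effects), which is why the system corrects for relativity by design.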

    On a human scale the time-rate difference is so small that it goes unnoticed. This is a good example of how phenomena can be seen inaccurately if one doesn't have sufficient perspective.
  • Metaphysician Undercover
    13.1k
    Strictly, the experiment shows that we cannot know if event A caused event B, or B caused A. The meaning of "cause" breaks down here.Banno

    Right, the conceptions of time and space utilized by physicists are inadequate, such that they cannot distinguish the temporal order of such events. Physicists have no standard principles whereby they can get beyond the deficiencies of special relativity, which sees simultaneity as reference-dependent. It appears that some physicists take Einstein's relativity theories as the be-all and end-all of understanding the relationship between space and time.
  • yazata
    41
    Banno asks:

    Well, why not? Why shouldn't a cause happen after the event?

    That raises the problem of time asymmetry. Why is there a distinction between the past and future, in a way that there isn't for left and right? The most obvious difference seems to be that causation appears to only work in the past => future direction.

    https://en.wikipedia.org/wiki/Arrow_of_time

    So, why doesn't retrocausation occur? Why doesn't the future determine the past just as much as the past seems to determine the future?

    https://en.wikipedia.org/wiki/Retrocausality

    One difficulty that might arise if that happened is that we would get paradoxical loops such as those imagined in time-travel science fiction. So if causation behaved in a temporally symmetrical fashion, reality might take the form of a cosmic-scale superposition of possibility states. (Primordial chaos.)

    Another consideration: It seems that the 'laws of physics' are almost all time-symmetrical. They work just as well in the future => past direction as in the past => future direction. So (perhaps) the asymmetry of time isn't inherent in the underlying physical 'laws'. That suggests that perhaps time asymmetry is the result of initial conditions.

    So maybe (speculatively) the reason that our universe exists at all is that something, some local condition, forced all the causal chains nearby to propagate in the same temporal direction. The 'Big Bang' seems to fulfill that requirement. All causal chains in our universe seem to propagate away from it, creating conditions favorable for a universe to crystallize into actuality.

    But conceivably (and speculatively) causation can still propagate in the pastward direction for very short intervals. So causal loops still occur on the microscale with superpositions of probability states. Maybe that's why quantum mechanics seems weird and why there's a distinction between physics on the microscale and the macroscale.

    It's all just speculation, of course. (I'm a longtime science fiction reader.)

    More speculation: Perhaps our universe is akin to a shockwave, propagating away from whatever caused it (the 'Big Bang'). Behind the shockwave lies the past, determined and frozen in amber. Ahead of the shockwave lies a space containing many superimposed possibilities. And perhaps the shockwave itself is the present, 'now', and we are kind of surfing on a giant 'collapse of the wave function' as it expands into the future. Which would accord very nicely with our intuitions about time.

    If there was any truth to any of this, it would seem to suggest an expanding-block model of time.
  • creativesoul
    11.9k
    Well, why not? Why shouldn't a cause happen after the event?Banno

    Do you mean "after the effect"?

    Some events are causes. Causal events influence subsequent events; they are called "causal chains of events" because of that. We call the aforementioned influence the "effect" of the cause. We did not arrive at causality by virtue of inventing and/or imagining it. We arrived at causality by virtue of witnessing it happen... over and over and over again...
  • Blue Lux
    581
    Witnessing causality or imposing upon witnessing causality?
  • creativesoul
    11.9k


    Not interested. I don't even know what you meant.

    Clearly we can be wrong when we attribute causality. We can also be right.
  • apokrisis
    7.3k
    We arrived at causality by virtue of witnessing it happen... over and over and over again...creativesoul

    But with quantum mechanics, what is witnessed is violations of this simple classical model of causality "over and over and over again".

    Why did the neutron decay? If its propensity to decay is steadfastly random, any moment being as good as another, then how could you assign a cause to that effect? It is a spontaneous event and so causeless in any specific triggering sense.
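
    "Any moment being as good as another" is exactly the memoryless property of exponential decay. A standard textbook line, not something from the experiment:

    ```latex
    % Survival law P(T > t) = e^{-\lambda t}: a neutron that has already
    % survived for time s is statistically identical to a fresh one.
    \[
      P(T > s + t \mid T > s)
        \;=\; \frac{e^{-\lambda (s+t)}}{e^{-\lambda s}}
        \;=\; e^{-\lambda t}
        \;=\; P(T > t)
    \]
    ```

    No amount of past history singles out the moment of decay, which is why a specific triggering cause is so hard to assign.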

    And the retrocausality implied by quantum eraser effects is just as big a challenge to classical locality. The decision the experimenter makes in the future becomes a constraint that affects the probabilities taking shape in the past. There is something spooky acting backwards in time - again, not as a triggering cause, but still as a probabilistic constraint on what is then observed to happen.

    Entanglement happens across time as well as space. And the OP-cited experiment is another example of QM challenging any simplistic cause~effect model of events "over and over and over again".

    So sure, causes being followed by their effects is a model we might impose on reality quite successfully at a classical macroscale of observation. But with QM, we are still seeking to find some other way of understanding causality.

    And we already know it must be the more fundamental model, classicality merely being the emergent description.
  • Hanover
    12.9k
    I sent this message, then typed it in. True story. Things have been happening out of order with me for a while now.
  • Marchesk
    4.6k
    I sent this message, then typed it in. True story. Things have been happening out of order with me for a while now.Hanover

    You might have some kind of superpower. I would check into it. You could be investing successfully or winning the lottery before you use your money!
  • creativesoul
    11.9k
    But with quantum mechanics, what is witnessed...apokrisis

    That's where you went wrong.
  • Christoffer
    2k
    Well, why not? Why shouldn't a cause happen after the event?Banno

    So far as we know, the universe is deterministic, and our known universe has a set axis of time. However, it seems to be based on the properties of probability. The larger the object or space, the more probable the consequences of causes become. The smaller you go, down to the quantum level, the less probable they get. Now, by "large objects" I don't mean suns and galaxies, but even ourselves. You need to go down to extremely small scales before quantum behaviour starts breaking things down.

    But here lies a contradiction of sorts. If the universe is deterministic, how do we know that quantum randomness isn't a cause in a chain reaction of events? If you could predict every particle in the universe, that would mean you could predict all outcomes of all movements and states; but since you cannot predict quantum-level events, because they are random, how do we know that the consequences aren't affected by randomization at the quantum level?

    If a large object in space, say an asteroid, speeds through space, it gets attracted and accelerated in different directions throughout its journey, changing its path only slightly, but enough that further down the line it has changed course by millions of light years. If random events on a quantum level change that asteroid's trajectory by a margin that is almost not measurable, it would still have changed its location billions of light years of travel later.

    Now, I'm pulling hypotheses from educated guesswork here, since I'm not a theoretical physicist. Maybe the quantum level is random but cannot change the deterministic nature of the larger world, since the only way for it to truly change the course of the asteroid is by expanding the randomness up to the level of observation, into levels of probability at which the outcome is so overwhelmingly probable that the randomness won't change the trajectory. Maybe the randomness and low probability of the quantum level, through the process of going from 0% probable to 99.999...% probable (with infinite decimals), is part of how causality and entropy work, and therefore the deterministic universe is still solid. If the randomness on the quantum level cannot affect the movement of mass, it won't move particles of mass, but only change their state.

    So, as said, hypothetical guesswork here. I still don't know enough about things like Higgs fields and particles, and there's also that little thing called a unified theory that we haven't solved. However, while it's chaos on the quantum level, it doesn't affect us on the larger scale. The general laws of the universe start breaking down at the quantum level, but the laws prevent things from moving backwards in time on any larger scale. I mean, we could also talk about the state of light, in which only we experience the passage of light. Light in itself has no concept of time: at the start of its journey it has already reached its destination, at the same instant, in "its own perspective". Because everything else is slower than the speed of light, we witness things going slower; but if you were travelling at the speed of light, you would be at the start and the end of the journey at the same time, since time stops at that speed, and it would have been like that since the dawn of our universal laws.

    Both at the quantum level and at the speed of light, or at a gravitational force strong enough to trap light, things break down and the laws of the universe cease to work in the way we perceive them. The big question is: if we had the means, could we perceive things outside of the perception we are slaves under now? Or are we forced to only understand as far as our perception goes? Even if we proved things like tachyons, would we fully understand them? Or would all the data get scrambled into a mess, since we have no framework in our universe to explain even the basics of them?
    If we live at our scale, at a slower speed than light and under normal gravitational conditions, the probability of events following the laws of the universe is 99.9999...% with infinite decimals. If there is a slim chance of an effect preceding its cause, it seems never to occur, because it is infinitely unlikely to happen. It's that mind-boggling thing in math where there is a chance of something, but a version of "infinite" makes it infinitely unlikely, even though it is possible.

    But I'd rather defer to a theoretical physicist on all of this; I can barely do basic math :sweat:
  • Metaphysician Undercover
    13.1k
    You might have some kind of superpower. I would check into. You could be investing successfully or winning the lottery before you use your money!Marchesk

    I've got a better idea, spend millions and then win the lottery.
  • TheMadFool
    13.8k
    Causality is defined in terms of cause preceding effect in time. Anyway that's the definition I'm familiar with.

    What sort of a definition of causality did the folks who claim that cause follows effect use? Thanks.


    One way I can make sense of this quantum weirdness is in terms of potential possibilities. The future has possibilities, but they ''must be'' finite in number. So, in a way, the future does determine (''cause'') the past.
  • Andrew M
    1.6k
    But with quantum mechanics, what is witnessed is violations of this simple classical model of causality "over and over and over again".

    Why did the neutron decay? If its propensity to decay is steadfastly random, any moment being as good as another, then how could you assign a cause to that effect? It is a spontaneous event and so causeless in any specific triggering sense.
    apokrisis

    Unpredictability doesn't imply a violation of causality. Without knowledge or control of the underlying physical causes, coin flips are also unpredictable.

    The Schrodinger equation is deterministic and so, in principle, can predict when a particular neutron will decay. For a more practical experiment, the Schrodinger equation predicts that a beam of light sent through a Mach-Zehnder interferometer (with equal optical path lengths) will always arrive at the same detector, with certainty. Per Wikipedia:

    In Fig. 3, in the absence of a sample, both the sample beam SB and the reference beam RB will arrive in phase at detector 1, yielding constructive interference. ... At detector 2, in the absence of a sample, the sample beam and reference beam will arrive with a phase difference of half a wavelength, yielding complete destructive interference. ... Therefore, when there is no sample, only detector 1 receives light.Mach–Zehnder interferometer

    As is usual in quantum experiments, the same result occurs when only one photon at a time is sent through the interferometer. Neither classical explanations nor randomness can account for that result. Both predict that a single photon should arrive at detector 1 or detector 2 with equal probability.
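
    A few lines of linear algebra reproduce that prediction. A sketch, assuming the common symmetric beamsplitter convention in which a reflection picks up a factor of i (which detector counts as "detector 1" depends on the convention chosen):

    ```python
    import numpy as np

    # Balanced Mach-Zehnder: two 50/50 beamsplitters, equal path lengths.
    BS = (1 / np.sqrt(2)) * np.array([[1, 1j],
                                      [1j, 1]])

    photon_in = np.array([1, 0])        # single photon enters one port
    photon_out = BS @ BS @ photon_in    # no relative phase between the arms

    print(np.abs(photon_out) ** 2)      # -> [0., 1.]: every photon reaches
                                        # one detector, none the other
    ```

    A one-path-only picture, or bare randomness, would give 50/50 between the detectors; interference instead sends every photon to the same one.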
  • apokrisis
    7.3k
    Unpredictability doesn't imply a violation of causality. Without knowledge or control of the underlying physical causes coin flips are also unpredictable.Andrew M

    Right. So what I am arguing is that there are two models of causality here - the conventional atomistic/mechanical one, and a holistic constraints-based one. And there doesn't have to be a metaphysical-strength "violation" if the mechanical story is understood as the emergent limit of the underlying holistic constraints story.

    In a nutshell, all events are the constraint on some space of probabilities. An "observation" is some set of constraints that restricts outcomes to a fairly definite and counterfactual result. So contextuality rules. And you can have relatively loosely constrained states - like entangled ones - or very tightly constrained ones, such as when the whole course of events is being closely "watched".

    Atomistic causality presumes that everything is counterfactually definite from the get-go. Any uncertainty is epistemic. As with a coin flip, it is because you toss the coin without watching closely that you don't see the micro-deterministic story of how it rotates and eventually lands.

    But a holistic causality says uncertainty or indeterminacy is the ontological ground zero. Then it is the degree to which a process is "watched" - contextually constrained by a decohering thermal environment - that places restrictions on that uncertainty. Effectively, in a cold and expanded spacetime, there is such a heavy weight of context that there is pretty much zero scope for quantum uncertainty. It all gets squished out of the system in practice and classical causal sequence rules.

    So there is no violation of the classical picture from taking the holistic route. It simply says that the classical picture was never fundamental, only ever emergent.

    Conceptually, that is a big shift though. It means that cause and effect are entangled in root fashion. When we come to talking about time as being a universal direction for change, a passage from past to future, we are talking about the emergent thermal view. The effective bulk condition. On the quantum microscale, past and future are "talking" to each other in a nonlocal fashion. Decisions an experimenter might make about which constraints to impose on the evolution of an event a million years in the future will then "act backwards" to restrict the possibilities as they looked to have taken shape a million years ago in the past.

    Of course, respecting relativity, this retrocausal impact of constraints on probabilities can't be used to actually do any causal signalling. Time - as an emergent bulk property - does have a conventional causal structure in that sense. But it is a property that is emergent, not fundamental. That is the "violation" of conventional ontology.

    The Schrodinger equation is deterministic and so, in principle, can predict when a particular neutron will decay.Andrew M

    It is only deterministic because some definite constraints have been put in place to limit some set of probabilities. The big problem for conventional causality is that the constraints can be imposed at some distant date in the far future, as with a quantum eraser scenario - while also having to be within the lightcone of those "initial conditions". (So the lightcone structure is itself another highly generalised constraint condition on all "eventing" - causality is never some wild free-for-all.)

    Another quantum result is the quantum zeno effect. Just like a watched pot never boils, continually checking to see if a particle has decayed is going to stop it from decaying. Observation becomes a constraint on its usual freedom.

    This is another "weirdness" from the point of view of causality. But it illustrates my key point. Neutrons that are left alone exhibit one extreme of possibility - completely free and "uncaused" decay. And the same neutron, if constantly monitored, will exhibit the opposite kind of statistics. Now it can't decay because it is no longer free to be spontaneous. It is being held in place as it is by a context of observation.
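
    The Zeno statistics are easy to sketch numerically. A toy model of my own construction, not drawn from any cited paper: a two-level system rotating from "undecayed" to "decayed", checked n times during the same total evolution:

    ```python
    import numpy as np

    # Toy quantum Zeno effect: each check projects the system back onto
    # "undecayed". Splitting a fixed rotation theta into n watched segments
    # gives survival probability cos(theta/n)^(2n), which -> 1 as n grows.
    theta = np.pi / 2   # unwatched, this rotation means certain decay

    for n in [1, 2, 10, 100, 1000]:
        survival = np.cos(theta / n) ** (2 * n)
        print(f"{n:5d} checks -> survival probability {survival:.4f}")
    ```

    The heavily watched limit never decays: observation acting as a constraint, as above.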

    So a mechanical view of causality presumes an ontology of separability. The OP experiment's demonstration of indefinite causal order shows that causal non-separability is a more fundamental physical condition. It is direct evidence for quantum holism. Spontaneity rules, but counterfactuality is what emerges, as an environment of constraints gets built up.

    Quantum computing is bringing the issue into focus. Ordinary causality can be described in terms of familiar logic circuits, where everything is strictly determined to follow a "normal" causal sequence. But quantum computing is now developing the kind of process matrix formalism which this latest experiment illustrates. If you relax the constraints and allow paths to be logically entangled, then you get the kind of causal indeterminism reported.
  • Blue Lux
    581
    Do we witness causality, or do we impose upon our witnessing, or is causality imposed upon our witnessing by some means or in some way? One could say that causality is imposed because it is an objective fact of witnessing, derived through the experience of witnessing, making observations and conclusions; but what is the basis of this? Has it not been shown time after time that what the world is and how it can be understood relates to manifestations of the human mind, in which one could possibly be capable of relating to something 'outside of oneself'?
    Causality is not a law of the universe whose priority we must adopt; we should not assume that it is any better, or any more, than any human creation or imagination, which adds to and complements our opposing will to power.
    The idea of an epistemological acquiescence or inheritance renders human knowledge a passive action. This is absolutely absurd with regard to philosophy. Knowledge could never be a passivity, for in terms of existence it relates to something absolutely beyond the passive and active. Knowledge must be in some form knowledge of existence, whose will is of an intention, something indisputably active. And so this activity is an illusory activity, only a simulation of passivity, rendering what is only already before it? The activity of an epistemological intention is the reaction, the counterpart of which would be termed passive, or capable of being apprehended. The whole of knowledge relates to imposing upon experience, and consequently upon our understanding of existence, a priori postulates of sorts; these contain exploration and knowledge. Schools of thought are born out of this... inseparable divisions, the roots of which are excommunicated in relation to one another.

    The will to knowledge is the apprehension of passivity; a spiral the result of which a closed system has gained another dimensionality.
  • Cheshire
    1.1k
    Well, why not? Why shouldn't a cause happen after the event?Banno

    If you're talking mechanics, it's best to preserve the definition of cause for the sake of coherence. But in the sphere of discontent apes such as myself, the cause often comes after the event. Hence, inductive reasoning has a purpose. Knowing what an action taken in the present will result in later is often the cause of the action.
  • apokrisis
    7.3k
    Continuing a bit, I take the view that existence, and thus causality, is fundamentally probabilistic. Atomism is emergent. And we have two formal statistical models - the classical and the quantum - that capture that fact.

    An irony is that Boltzmann settled the argument in favour of atomism by establishing a statistical mechanics view of reality. His famous dictum was “If you can heat it, it has microstructure.”

    The equipartition law says there is a direct link between macroscopic and microscopic physics: if you know the total thermal energy of a body and its temperature, you can calculate the number of microscopic degrees of freedom it must contain. Avogadro's constant.
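
    As a concrete instance of that link (a textbook sketch, not Boltzmann's own numbers): for a monatomic ideal gas, equipartition says U = (f/2) N k_B T with f = 3, so a measured molar heat capacity pins down the particle count:

    ```python
    # Equipartition: U = (f/2) * N * k_B * T, so C_V = (f/2) * N * k_B.
    # From a measured molar heat capacity, infer how many microscopic
    # degrees of freedom -- and hence particles -- a mole must contain.
    k_B = 1.380649e-23      # J/K, Boltzmann constant
    R = 8.314462618         # J/(mol K), gas constant

    C_V = 1.5 * R           # molar heat capacity of a monatomic ideal gas
    N = 2 * C_V / (3 * k_B) # f = 3 translational degrees of freedom per atom
    print(f"particles per mole: {N:.3e}")   # ~6.022e23, Avogadro's number
    ```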

    So atomism was "proved" by spacetime having a well-behaved statistics. A given volume could contain a given number of degrees of freedom. And then came the ontological leap of faith: by observable degrees of freedom, we would be talking about actual definite particles ... as that is what our causal interpretation most naturally wants to assume.

    But who in particle physics believes in "actual particles" anymore? What we actually know to exist is the statistical formalism that describes the prototypically classical situation. We have equations that cough out results in terms of countable microstates or degrees of freedom.

    So the classical picture and the quantum picture are pretty much aligned on that score. They boil down to the kind of statistics to expect given a physical system with certain global or macro constraints on local possibilities. Going beyond the statistics to talk about "actual particles" - conventional atomism - is a reach.

    So in this way, quantum weirdness should cause us to go back and revisit the classical tale. Classical thermodynamics had already created an approach where atoms were modelled as the limit of states of constraint. The basic degrees of freedom of a system - the very "stuff" it was supposed to be constructed from - were emergent.

    And getting back to the quantum level of the story, Thanu Padmanabhan is pursuing this way of thinking as a way to understand dark energy and spacetime geometry -
    http://nautil.us/issue/53/monsters/the-universe-began-with-a-big-melt-not-a-big-bang

    So Boltzmann's argument - if it can be heated, it has "atoms" - can be used to impute a quantumly grainy structure to spacetime itself.

    But it is not that spacetime is actually composed of fundamental causal particles. Instead, it is the reverse story that regular spatiotemporal causal structure has a smallest limit. There is not enough contextuality to continue to imprint its regularity on events once you arrive at the Planck scale. You are foiled by all directions turning symmetric at that point - principally in the sense that there is no thermal temporal direction in which events can move by dissipating their localised heat.

    So again, what we read off our successful statistical descriptions is the literal existence of hard little atomistic parts. Our conventional notions of causality encourage that. Possibility itself is understood atomistically - which is what makes an added degree of quantum uncertainty rather a mystery when it starts to manifest ... and eventually completely erases any definite atoms by turning everything in sight vanilla symmetric. A quark-gluon fluid or whatever describes a primal state of material being.

    But we can turn it around so that atoms are always emergent. And classical atoms reflect another step towards maximal counterfactual constraint - one that takes a step beyond a looser quantum level of constraint, but then even a quantum level is still pretty constrained.

    It is exactly the story with algebras. Normal classical number systems operate as points on a 1D line. Quantum number systems operate in a realm one step more complex and less constrained: the 2D imaginary (complex) numbers. Yet there are further algebras beyond - the 4D quaternions and 8D octonions, and then eventually right off into the barely constrained structures of the even higher-dimensional exceptionals.

    So classical counting uses fundamental particles - 0D points on 1D lines. The emergent limit case, if you were constraining the freedom of the act of counting. But then quantum counting leaves you chasing your number around a 2D plane, which winds up behaving like an added rotation. When it comes to actual particles - like an electron - you have to in some sense count its spin twice to arrive at its spin number. To fix its state with classical counterfactual definiteness, you have to add back an extra constraint that eliminates the extra quantum degree of freedom it has from "inhabiting" a larger background space of probability.
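
    The "count its spin twice" remark has a concrete counterpart, a standard spin-1/2 fact sketched here for illustration: rotating a spinor by 2π does not return it to itself; only a 4π rotation does:

    ```python
    import numpy as np

    # Rotation of a spin-1/2 state about z by `angle`:
    # diag(e^{-i angle/2}, e^{+i angle/2}).
    def rotate(angle):
        return np.diag([np.exp(-1j * angle / 2), np.exp(1j * angle / 2)])

    spin_up = np.array([1, 0])
    print(np.round(rotate(2 * np.pi) @ spin_up))  # [-1, 0]: a full turn flips the sign
    print(np.round(rotate(4 * np.pi) @ spin_up))  # [ 1, 0]: two full turns restore it
    ```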

    Everywhere you look in modern fundamental physics, this is what you find. Classicality is emergent - where you arrive at the end of a trail of increasing constraint on free possibility. So causality needs to be understood now in these same terms.

    And when it comes to quantum mechanics, it isn't even really that "weird", as it is already far more constrained in its dimensionality than the unconstrained dimensional systems that could lie beyond it in "algebra-space". Quantum mechanics just has ordinary classical time baked into it at a background axiomatic level. That is why it is possible to calculate a deterministic wavefunction statistics for any given initial conditions. A definite basis has been assumed to get the modelling started.

    But to move beyond QM, to get to quantum gravity, it seems clear that time itself must become an output of the model, not an input. And if you give up time as being fundamental, if you presume it to be merely the emergent limit, then of course conventional notions of causality are dead - except as useful macroscopic statistical descriptions of nature.