You’re saying that classical physics approaches counterfactuality, just as it approaches locality. But QM doesn’t actually say whether one, the other, or neither is a basic property. — noAxioms
But QM sets it all up. It says there are two questions that could be asked that would fully dichotomise your coordinate system – your basis of measurement. The catch is – on the finest scale of resolution – you can't ask them both at the same time. The issue of operator ordering kicks in: the two observables don't commute.
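That non-commuting structure can be seen directly in a finite matrix sketch of the position and momentum operators (this is just an illustration in the harmonic oscillator ladder-operator basis with ℏ = m = ω = 1; the truncation size is my own arbitrary choice):

```python
import numpy as np

N = 10  # truncation dimension (illustrative choice)

# Lowering operator a in the number basis: a|n> = sqrt(n)|n-1>
a = np.diag(np.sqrt(np.arange(1, N)), k=1)

# Position and momentum built from the ladder operators (hbar = m = omega = 1)
x = (a + a.conj().T) / np.sqrt(2)
p = 1j * (a.conj().T - a) / np.sqrt(2)

# The commutator [x, p] = xp - px
comm = x @ p - p @ x

# Away from the truncation edge, [x, p] = i (times the identity), so x and p
# cannot be simultaneously diagonalised -- you can't ask both questions at once.
print(comm[0, 0])    # i on the interior of the basis
print(comm[-1, -1])  # an artefact of the finite truncation
```

The order of the product matters: `x @ p` and `p @ x` differ by a fixed imaginary amount, which is exactly the "commutative order catch" being described.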
So QM stands for the division of reality into its complementary extremes – the standard move of metaphysical logic since Anaximander and even before. You have position and momentum as your two crucial measurements that define "something actually happening in the spacetime vacuum".
Or in a less clearly defined fashion – as time remains outside the current quantum formalism – you have the complementarity of time and energy as the coordinate system for measuring quantum action. Another way of looking at "something actually happening in the spacetime vacuum", one that then fuzzes out on the fine-grained view due to Heisenberg uncertainty.
So QM sets up the bivalent metric that is needed to measure a hierarchically organised cosmos – one which is defined in terms of the classical local~global scale distinction.
Note how position speaks to the local invariance that derives from spin – one arm of Noether's/Newton's conservation of angular momentum principle. And momentum speaks to the other arm of translational coordinate invariance – the matching global view when it comes to measuring some classical difference that ain't in fact a difference, being simply a first derivative inertial freedom, and thus a ground zero as your measurement basis ... in a world that is now explicitly dynamical as rotation and translation are its ground states.
And note how QM sets even this up as the quantum vacuum is never empty, just has some dynamical balance as its ground state. Time remaining outside the formalism is how the world starts already energetically closed ... making QFT a little semi-classical and in need of QG to unite it with the fundamentally open perspective of GR. Another more basic level of cosmic coordinate defining.
Anyway, side-tracked as usual. QM poses its dichotomous question with its commutative order catch. Classical mechanics – the notion of a quantum collapse – then delivers some counterfactually definite measurement.
This is easy to do, due to thermal decoherence, when you stand right in the middle of the local~global divide in terms of measurement scale. Newtonian mechanics is what you see in a low temperature and inertially constrained reference frame. You can measure position and momentum in a way that seems to give you concrete initial conditions and so a deterministic trajectory for every event ... after the "retrocausal" principle of least action has been built into your Newtonianism as Lagrangian mechanics.
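The least-action point can be made concrete with a toy numerical sketch (my own construction, not anything from the discussion: a free particle with fixed endpoints, the path found by gradient descent on a discretised action):

```python
import numpy as np

# Discretised action for a free particle of unit mass:
# S = sum of 0.5 * ((x[i+1] - x[i]) / dt)**2 * dt over the steps
def action(x, dt):
    v = np.diff(x) / dt
    return 0.5 * np.sum(v**2) * dt

# Fixed endpoints x(0) = 0 and x(1) = 1; the interior of the path is free.
n, dt = 51, 1.0 / 50
x = np.zeros(n)
x[-1] = 1.0
x[1:-1] = np.random.default_rng(0).uniform(0, 1, n - 2)  # random initial path

# Gradient descent on the interior points only (the endpoints stay pinned).
for _ in range(20000):
    grad = np.zeros(n)
    grad[1:-1] = (2 * x[1:-1] - x[:-2] - x[2:]) / dt
    x[1:-1] -= 0.01 * grad[1:-1]

# The minimiser is the straight line -- the classical Newtonian trajectory
# falls out of a "global" condition on the whole path at once.
straight = np.linspace(0, 1, n)
print(np.max(np.abs(x - straight)))
```

The point of the sketch is that nothing here evolves step by step from initial conditions; the trajectory is selected by a condition on the path as a whole, which is the "retrocausal"-flavoured way Lagrangian mechanics restates Newton.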
But as you head down to the Planck scale, it all gets too small, hot and fuzzy. Your classical coordinate system falls apart.
Well, at least to a degree, as you can still answer one question at a time, if not two at once. That would be the advantage of QM not actually including time as another moving part of its story, just parking it on the semi-classical sidelines as an informal time~energy kluge. That kluge is also quite useful over all scales where time does seem to have a linear lightcone flow – where the general thermal arrow prevails and the fine-scale retrocausal corrections don't cause enough minor temporal eddies to matter.
It is in the sun’s past light cone ... The sun is only sort of in the future light cone of IOK-1. — noAxioms
Yep. There is an asymmetry in the scale terms I just described. Time is the great big flowing river with its irreversible thermal history. The fact that it has all these tiny retrocausal eddies is something that gets washed away in the general big picture view. It is only once you get down to wanting to measure the most local grain of events – as in some set-up like the quantum eraser – that you can measure this other face of time.
Each individual act of thermalisation is its own bit of history. It might take billions of years for a distant galaxy to complete the photonic interaction that allows it to cool down at its end and the sun to heat up by the same amount at this end. Almost all of the radiation by an IOK-1 would be absorbed by some far more local particle. Probably interstellar dust not even light years away.
So really long-distance retrocausality would be matchingly rare as well. The time it took for an IOK-1 photon to reach us would have no real impact on the overall statistical flow of the cosmic thermal arrow.
And even then, the arriving photon would look red-shifted by its long journey. We would see that QM had balanced its accounts. The metric expansion of space is included in the equation. That is why radiation gives you extra bang for your thermalising buck. The hot photon is a very cold photon by the time it has retrocausally connected two very distant locales in spacetime and so dissipated some quanta of energy in a decoherently definite, quasi-classical, fashion.
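For scale, the bookkeeping can be checked against the reported redshift of IOK-1, z ≈ 6.96 (treat the exact figures as assumptions for this back-of-envelope; photon energy scales as 1/(1+z) under metric expansion):

```python
z = 6.96  # reported redshift of IOK-1 (assumed here)

# Photon energy scales as 1 / (1 + z); wavelength stretches by (1 + z).
energy_ratio = 1.0 / (1.0 + z)  # E_observed / E_emitted
lyman_alpha_nm = 121.567        # emitted Lyman-alpha wavelength in nm
observed_nm = lyman_alpha_nm * (1.0 + z)

print(round(energy_ratio, 4))  # only ~12.6% of the emitted energy arrives
print(round(observed_nm, 1))   # stretched from the UV into the near infrared
```

So the "hot" ultraviolet photon really does arrive as a much colder one: nearly 90% of its energy has gone into the stretch of the intervening metric by the time the two locales are connected.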
Everett’s interpretation is completely deterministic, but not empirically deterministic since there is no way to predict what you’ll have measured in tomorrow’s observation.
BM on the other hand is deterministic in both ways, and in that interpretation, the sun exists relative at best to the universe, and the relation to IOK-1’s light cones is irrelevant. — noAxioms
Why offer BM and MWI as your orienting dichotomy of interpretations? Both are really old hat sounding these days.
I say it is better to treat collapse and collapseless ontologies as simply mapping out the limits of the real story – the one where there may be no actual true collapse, but then indeed an effective collapse due to thermal decoherence and a relational understanding of QG.
Reality is always contextual and so "collapse" becomes a matter of degree – determined by the scale of observation.
On the finest grain, no collapse can be found. You just have the two questions you would have to answer to give you your bearings in a classically-imagined cosmos.
On larger scales – ones where the spacetime metric is larger and cooler, where lightcones have the time and space to have their equilibrating effect on questions about position and momentum – then a sharp sense of classical reality emerges from the quantum vagueness and uncertainty.
Time appears to flow like a constant c-rate thermal arrow. Space appears to remain as gravitationally flat and thermally even as it ever was – at least on the scale of galactic structure where it all should settle down to a conformal or scalefree metric.
I was going to ask, have you checked out Penrose's twistor model which is an attempt to map everything to exactly this kind of conformal metric – a lightcone view of spacetime?
From IOK-1’s point of view, that’s a counterfactual statement. It’s not meaningful in a local interpretation. — noAxioms
Another little point here. The fact that the photon was absorbed by a detector at one particular point in all the points that it could have hit on the same lightcone is where you find the counterfactuality in the local view.
From IOK-1's point of view, does it give a stuff where its emitted photon lands? It sprayed the wavefunction in every possible direction. There was some probability of it going off in the precisely opposite direction. And even hitting the general vicinity of the experimenter's lab still leaves a lot of scope for narrowing things down.
So the degree of counterfactuality is only maximised – collapsed to its limit – in the sense that the photon landed "here", and not any other "there", on the holographic boundary that is the surface of your lightcone that defines the particle's "past".
BM has that kind of retrocausality as well. — noAxioms
BM is explicitly nonlocal. The problem is that it isn't relativistic without fudging the Born rule. So it has fatal shortcomings.
This is why I lean towards interpretations that are "kind of nonlocal" in a way that is complementary to the way they are "kind of local". That is, dichotomistic interpretations where each aspect emerges as the limit of its "other". So causal sets and other emergent spacetime models like that.
Adlam speaks of the new "all at once" interpretation where accepting nonlocality in both time and space – going further than BM for instance – allows you then to recover your local view in terms of the resulting sum over possibilities.
The principle of least action finally becomes an element of reality, not some spooky teleology that has to be invoked to make the results come out right in the measurable world.