I don't really have any objection to any of this — Seppo
If you're going to be so damn reasonable then I have to rescind that dogmatic comment. Bugger.
:smile:
If the current projects/paradigms (string theory, supersymmetry, etc) were going to bear fruit, you would have hoped it would have happened by now... and that just hasn't happened, we've been spinning our wheels for decades. — Seppo
Yep.
I tend to be a bit more conservative in sticking to what is the widely held view of people with actual formal expertise on the subject, hence my comments here sticking to what I guess is sort of the party line on the topic RE quantum gravity and early Big Bang cosmology. — Seppo
Again, that's fair.
It is just that the party line too often feels like the party members papering over their own divisions and confusions so the general public/taxpayer funders don't catch on to what a mess they might be in.
In fact I parked particle physics and cosmology a decade ago to give them time to catch up with themselves and see if some actual new consensus might emerge. Loop and condensed matter approaches were encouraging at the time, but also starting to fall apart like strings did.
I think what did it for me was the loop guys suddenly promoting bounce cosmology as the kind of "theory" that a new multi-billion euro collider might just be able to test. Suddenly there was a new party line to be built around a ginormous funding application ... and let's not look too closely at its scientific merits.
But in case you are interested in where I am coming from, there was this really good blog post by the "Hammock Physicist", Johannes Koelman, in 2010. He was so on the money for me that it was no surprise he appeared to give up his academic ambitions and turn to making a living in industry soon after.
http://www.science20.com/hammock_physicist/physical_reality_less_more
I wrote up a precis at the time which I can simply paste here just in case it has value.
Preamble: Most modern metaphysics presumes the laws of reality, the structure of the cosmos, to be contingent. The laws are just whatever they are, with no real explanation other than some kind of anthropic accident. This is the view that drives Tegmark and his multiverse speculation, and other expressions of modal realism.
But physics itself appears to be closing in on a tale of mathematical necessity, a tale of symmetries and symmetry breaking, which now in metaphysics is also inspiring new schools of thought like Ladyman and Ross’s ontic structural realism - http://www.amazon.com/Every-Thing-Must-Metaphysics-Naturalized/dp/0199573093
So this is a new “emergent Platonism”. It is not that there is a realm of infinite forms – a Platonic ideal for every possible particular entity from triangles to jam jars – but rather that there is a general mathematical inevitability to the structure of nature. Given a starting point of unlimited material freedoms, some kind of prime matter, apeiron or entropic gradient to shape, a world must then self-organise according to certain intelligible principles. And this is what fundamental physics has quietly been doing from Newtonian Mechanics right up to string theory and loop quantum gravity today – systematically following the path leading back to the deep mathematics, the ur-pattern shaping nature.
So this is a post about the unification of physics project. And this excellent blog post by Johannes Koelman gives the guts of the argument - http://www.science20.com/hammock_physicist/physical_reality_less_more
I will use it as a jumping off point, particularly this Venn diagram of how the theories form a three-cornered hierarchy of generalisation....
Planck constant triad: Where does it all start? With the idea of symmetry and symmetry-breaking. Or the birth of scale, the birth of difference within what was “the same as itself”. And so it is about a special kind of reciprocal dualism or asymmetric dichotomisation where the same becomes different by moving away from itself across local~global scale.
Now this is an unfamiliar idea to most even if it is very ancient – the basis of Anaximander’s cosmology, the very first true metaphysical system. But briefly, it is about inverse relationships. If you take a classical metaphysical dichotomy like flux~stasis, chance~necessity, discrete~continuous, etc, you can see how each pole defines itself as the reciprocal of the other. Stasis is the state where there is no flux, or the least possible flux. So stasis = 1/flux. That is, the larger you imagine flux to be, the smaller or more fractional the quantity of it you will find within stasis. And the converse applies. Flux = 1/stasis. The larger the amount of “no change” imagined, the less of that there is to be found in flux, and so the more “changeable” flux becomes. All regardless of any actual measurement or quantification.
So this is a special mathematical relationship that emerged in Ancient Greek metaphysics – the dialectic manoeuvre that drove its speculative twists and turns. And it has re-emerged centre stage in modern physics as symmetry-breaking and the various dualities or complementary relationships that are a feature of high-level theories.
Now on to those theories. As Koelman makes clear, it starts with Newtonian Mechanics (NM) where the local~global relationship, the primal dichotomisation of physical scale, was first properly quantified – but in an actually broken apart way.
NM presumed a fixed space and time backdrop and then defined the rules for quantifying local events within that absolute reference frame. That "broken apart" classical view of nature certainly worked at the human scale of observation, where we are so far from the bounding limits of the cosmos. But as science developed particle accelerators and radio telescopes, physics had to expand its view too. It had to develop post-classical theories that included an account of the global container as well as the local contents.
And in brief, that has turned out to mean bringing a triad of dimensional constants inside the picture we have of reality – the three Planck constants of h, G and c. (h = Planck’s constant that scales the quantum uncertainty of things, c = the “speed of light” or the constant that scales causal interaction, and G = Newton’s gravitational constant that scales mass/spacetime curvature.)
There are only these three critical “numbers”, all somehow tied to the most fundamental level of symmetry-breaking. And with NM, we start with them all outside the physical theory as values that have to be empirically measured – which is an immensely tricky and approximate story in itself. Then the story of modern physics has been about pulling the constants inside the theories, first in ones, then in twos, and finally, hopefully, with the ultimate Theory of Everything (ToE), getting all three inside the picture of nature together at once. At which point, physical theory would become completely rational, drawing up the ladder on the need for empirical measurement as the mathematical structure would be able to account for itself entirely.
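As an aside, the way the three constants define a "most fundamental" level of symmetry-breaking can be made concrete by combining them into the Planck scales. This is just a quick numerical sketch using the familiar SI values; the constants here are exactly the "empirically measured from outside" inputs the precis is talking about.

```python
import math

# Empirical SI values - the three constants that start out *outside* the theory.
h = 6.62607015e-34      # Planck's constant, J*s (exact by SI definition)
hbar = h / (2 * math.pi)
G = 6.674e-11           # Newton's gravitational constant, m^3 kg^-1 s^-2
c = 299792458.0         # speed of light, m/s (exact by SI definition)

# Combining all three yields the natural "corner" scales where quantum,
# gravitational and relativistic effects all matter at once.
planck_length = math.sqrt(hbar * G / c**3)   # ~1.6e-35 m
planck_time   = planck_length / c            # ~5.4e-44 s
planck_mass   = math.sqrt(hbar * c / G)      # ~2.2e-8 kg

print(planck_length, planck_time, planck_mass)
```

Note that each Planck scale needs all three constants (or two plus a derived one), which is one way of seeing why a cGh theory is the natural end point of the triad story.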
This is because the constants will have been defined in the same self-explanatory way as the old metaphysical dichotomies like flux~stasis. The constants represent the action that breaks a symmetry, but now in both its “directions”. Asymmetrically or reciprocally across actual hierarchically-organised scale. What this means should become clearer as we see how physics has developed since Newton.
Newtonian Mechanics: As Koelman points out, NM was based on reducing the empirical measurement of reality to four quantities – distance, duration, force and inertia. The brilliant idea at the heart of the scheme was to disconnect the local scale from the global scale by imagining the global scale to be a fixed, flat and eternal backdrop. Space and time were made a static symmetry – you could go backwards or forwards in space’s three global dimensions or time’s single global dimension and they “didn’t care”. It was all the same, and so symmetrical.
This was of course a view of nature directly inspired by Ancient Greek atomism and its notion of an a-causal void, where similarly, all causality, all action or symmetry-breaking, involved local parts. Only material/efficient cause was real, making formal and final cause a fiction projected onto the emergent regularity of atoms contingently at play. And given that space and time were defined in this absolute sense – a matter of brute and immutable fact – this legitimated the use of clocks and rulers as universal measuring yardsticks. You could create a standard unit of distance or duration because such a human construct was underpinned by the concreteness of space and time themselves. At any place, in any era, and at any scale, these clocks and rulers would continue to function reliably because they were measuring something unimpeachably real.
So Newton – as a metaphysical premise – created an unchanging backdrop against which the kind of change we are most interested in, the middle-scale realm of lumpy objects, could be crisply measured. Now he only had to model a localised symmetry-breaking – the one between mass and force, or between the material property intrinsic to a body and the web of interactions between such bodies. This led to his three laws of motion.
Newton's first law defines the inertia of bodies. Massive objects can have a "resistance to change" in their motion if that motion does not break a global symmetry. So a ball can roll forever in a straight line due to translational symmetry, and it can spin forever due to rotational symmetry. The combination of mass and velocity gives a body a momentum value. A force - as then defined by the second law, F=ma - is a rate of change of momentum imposed from without. Like by getting smashed into by another object. With absolute space and time as a fixed reference frame, these "hidden" local quantities of force and inertia could be read off the world in terms of localised symmetry breakings - a curving path or a change in velocity.
Then Newton’s third law of action~reaction restored the broken symmetry by creating a global energy conservation principle. Everything that got pushed, pushed back equally, leading to a net zero force at the global scale. Nothing happened to disturb the static stage upon which the mechanics played itself out.
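The third-law bookkeeping can be seen in a toy 1D elastic collision: the equal-and-opposite forces during the bounce leave total momentum (and, for an elastic bounce, kinetic energy) exactly conserved. All the masses and velocities below are made-up illustrative numbers, not anything from the original post.

```python
# Toy 1D elastic collision illustrating Newton's third law: action~reaction
# means the momentum exchanged nets to zero at the global scale.
m1, v1 = 2.0, 1.0    # heavy ball moving right
m2, v2 = 1.0, -1.0   # light ball moving left

# Standard 1D elastic-collision formulas (derived from momentum + energy conservation).
v1_after = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
v2_after = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)

p_before = m1 * v1 + m2 * v2
p_after  = m1 * v1_after + m2 * v2_after
ke_before = 0.5 * m1 * v1**2 + 0.5 * m2 * v2**2
ke_after  = 0.5 * m1 * v1_after**2 + 0.5 * m2 * v2_after**2

print(p_before, p_after)    # identical: nothing disturbs the static global stage
```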
Newtonian gravity: It was a neat system. But of course it only dealt with objects banging into each other. And Newton had to make another big leap of the imagination to deal with gravity – a global force that “acted at a distance”. His peers like Descartes had tried to make sense of gravity’s pull as a jostling of spatial atoms. The sun and planets were swirled around in circular orbits because they were caught in the flow of super-fine corpuscles. But Newton boldly posited gravity as an intrinsic property of mass. And one that scaled (inversely!) with the square of the distance. The greater the separation, the more weakly the gravity of a body was felt.
So Newton again used an absolute backdrop as a way to localise the notion of an action – the material cause of change, the thing that breaks a physical symmetry. And as the other forces like electro-magnetism became recognised, the same general mechanics could be used with them as well. They could be quantified as vectors – a small push or pull in a direction acting to disturb the inertial motion of a body.
Post-Newton theory: The Newtonian model worked so well because it homed in on symmetry-breaking on the human scale - where we are 33 orders of magnitude distant from the smallness/hotness of the quantum scale and 28 orders of magnitude away from the bigness/coldness of the visible Universe, the relativistic scale. Action or change might be taking place at the extremes of scale, but the Universe would still look like a flat and unchanging backdrop because either the change up at the relativistic limit was so large and slow that it was beyond our field of view, or equally, down at the quantum limit, so small and rapid that it became an unbroken-looking blur.
But eventually it was realised that the Universe was dynamical over all its scales. For one thing, it was born in a Big Bang and is spreading/cooling towards an entropic Heat Death. So the extremes of scale had to be brought inside the general model and made subject to the same laws ruling change.
In prescient fashion, it was Planck in 1899 who saw that all mechanics could be boiled down to three constants – three dimensions quantifying the actions that break even the most global symmetries. Well, Planck thought it would be four as he included Boltzmann’s constant, k. But by the 1930s, Matvei Bronstein had clarified that physics was looking for the magic trio of cGh. In fact – showing that it is only retrospectively that physics understood the deep logic of its own progress – the much more famous names of Gamow, Ivanenko and Landau first cooked up this little insight as a joke, allegedly to impress a girl. Then 50 years later Okun, another Russian, rediscovered it and finally popularised the way cGh anchored all physical theory as “Okun’s cube”. Here are a couple of blog posts on this history.
http://backreaction.blogspot.com/2011/05/cube-of-physical-theories.html
http://blogs.scientificamerican.com/guest-blog/2011/07/14/why-is-quantum-gravity-so-hard-and-why-did-stalin-execute-the-man-who-pioneered-the-subje
So OK, it was not so obvious at the time. But it does explain why physics ended up organised like Koelman’s Venn diagram, a systematic attempt to turn three empirical and apparently arbitrary measurements into three reciprocally self-defining and so mathematically necessary global symmetry-breakings.
Special Relativity: Ticking through these quickly, first came Einstein’s Special Relativity (SR) which incorporated c into mechanics as a general yo-yo factor.
First space and time (representing stasis and flux in terms of locatedness and change) became glued together to become the one thing, a global scale symmetry balance that Koelman dubs “spacetime-extent”. Then c scales any breaking of this balance with extent multiplied by c = distance^2 and extent divided by c = duration^2. So in this way spacetime is changed from being a static backdrop to a dynamic dimensionality where the baseline of “no action” is effectively redefined as the expanding sphere of an event horizon. Events are physically separated by a distance and a duration in the way that the sun may have vanished four minutes ago, but it will take another four minutes before we can know about it. So to break spacetime in such a way as only to see “a distance”, you have to multiply by c to allow “enough time” for the distance to “happen”. And conversely, to recover “a duration”, locate it within a purely temporal dimension, you have to divide by c to remove the space over which it has spread.
A simpler way to understand this is considering spacetime from the points of view of a massive particle and a light ray. Even if it is stationary, not moving in space, the particle is now moving in time. It is “travelling” into a future where the distance to any event horizon is getting constantly c-times more expanded. And conversely, the light ray may be moving at c through space, but now – being already “at” the event horizon – it is “stationary” in respect of the global dimension of time. So there are two ways to be standing still and two ways to be moving. And which way round you read off the symmetry breaking is scaled by c or 1/c.
Thus globally, spacetime was scaled by a reciprocal action. Then locally, the same was done to the Newtonian version of the action. Energy and momentum became glued together as a “general stuff” – spacetime-content – and this symmetry again broken by a yo-yo inverse relation. By E=mc^2, mass could be converted into a “times-c” amount of energy, while energy could be converted into a “times-1/c” amount of mass. The material contents of the Universe could be viewed either in terms of an energy density located to a spatial point, or smeared across a temporal sphere. The two were dichotomous ways of looking at the same thing. So “where you were” as an observer within the system had to be specified by an inertial reference frame if you wanted your Newtonian rulers and clocks to read off the same distances and durations. Nothing was absolute, but in a dimensionless way, you could still distinguish c from 1/c as the generalised limits on reality.
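The "times-c^2, divided-by-c^2" yo-yo on spacetime-content can be made numerically concrete. This is a sketch with made-up inputs (1 kg, a particle at 0.6c) using the standard textbook relations, not anything specific to Koelman's post; the second part also checks the frame-independent invariant E^2 = (pc)^2 + (mc^2)^2 that makes "where you are" a matter of bookkeeping rather than substance.

```python
import math

c = 299792458.0            # speed of light, m/s

# The yo-yo: mass read off as energy, then converted back again.
mass = 1.0                 # kg (arbitrary illustrative value)
energy = mass * c**2       # ~9e16 J of "spacetime-content" viewed as energy
mass_back = energy / c**2  # and the same content viewed as mass again

# For a moving particle, energy and momentum mix frame-dependently,
# but the invariant mass is recovered whatever the inertial frame.
v = 0.6 * c
gamma = 1 / math.sqrt(1 - (v / c)**2)
E = gamma * mass * c**2                                # total energy
p = gamma * mass * v                                   # relativistic momentum
invariant_mass = math.sqrt(E**2 - (p * c)**2) / c**2   # recovers 1 kg

print(energy, mass_back, invariant_mass)
```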
General Relativity: SR incorporated one of the three Planck constants, c, but left out h and G. So the quantum uncertainty and gravitational curvature of the Universe remained “measurements from outside” the system being measured. This didn’t matter outside the middleground scale of classical objects – lumpy masses bumping about in a cool/large void. But it did matter if SR was going to continue to measure the world accurately as it approached these two other limits of nature.
Einstein of course took the next step of extending SR by incorporating G alongside c to give General Relativity (GR). Spacetime was now made bendy, defined in local fashion by its energy density. Instead of distance and duration being flat and even dimensions – Euclidean as presumed by Newton – they were free to adjust their geometry according to the density differences in their material contents. Or being a bit more technical, spacetime and its contents became unified as flipsides of the same thing – the Einstein-Hilbert action. The reciprocal nature of the deal was again made explicit in the maths, spacetime being scaled by G/c^4 while the mass/energy content was scaled by c^4/G.
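For what it's worth, that reciprocal scaling sits right on the surface of the standard textbook form of Einstein's field equations (my addition here, not a quote from Koelman):

```latex
G_{\mu\nu} \;=\; \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
\qquad\Longleftrightarrow\qquad
T_{\mu\nu} \;=\; \frac{c^{4}}{8\pi G}\, G_{\mu\nu}
```

Read left to right, a given energy density buys only a tiny G/c^4-sized amount of curvature; read right to left, a given curvature implies an enormous c^4/G-sized energy content. The same relation, scaled reciprocally depending on which "direction" you break the symmetry.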
Quantum Mechanics and Quantum Field Theory: The other big revolution going on was Quantum Mechanics (QM). It had been discovered that reality is scaled by an uncertainty relation – a yo-yo deal centred on h as its “physical” value. In QM they called this complementarity, making the connection with Eastern metaphysical thinking like Yin-Yang (which of course is another version of the same thing the Ancient Greeks were talking about with dichotomies). But anyway, what QM said was that reality is fundamentally vague or indeterminate.
Measurements need to be made from a fixed point of view to have definiteness. And when you get down to nature’s essential symmetries (and the asymmetries that break them in local~global scale fashion) then trying to pin down one end to a crisp value sends the other off to unknowable indefiniteness. So ask about a particle or event’s position and its momentum goes off the other end of the scale. Zero in on it in terms of time, and its energy could have any value.
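In standard textbook notation (again my addition, not Koelman's), those two complementary trade-offs are:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2},
\qquad
\Delta E \,\Delta t \;\gtrsim\; \frac{\hbar}{2}
```

The position-momentum relation is a theorem about non-commuting observables; the energy-time version is more heuristic (time is not an operator in QM), but both express the same yo-yo: pinning one pole to a crisp value sends its reciprocal partner off to indefiniteness.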
By including this yo-yo measurement issue in the physics, QM made explicit the way classical mechanics had been coarse-graining reality. When things are cold and large – far from h as a limit – then the Newtonian picture works well as reality is near as dammit determinate in its behaviour. But approach the complementary limits of the hot or the small, then the classical crispness breaks down in a well modelled exponential fashion.
So QM pulled h inside the mothership of Platonic physics, but like SR, it left the two other constants dangling – c and G. This was fixed by Quantum Field Theory (QFT) which repeated GR’s trick, this time combining h and c.
QFT is a relativised version of QM and it did this by treating particles as excitations in a field. So there was the jump from a Newtonian strict location of a symmetry and its breaking (a particle, its properties, the forces that might impinge on it) to a field view where everything becomes global and contextual. The key calculational breakthrough was Feynman’s concept of path integrals, or sum-over-histories. Uncertainty could be quantified by summing over all the paths that a particle might have taken quantum mechanically, with the path “actually” taken being the one of stationary – in practice least – action. So all the energy values the particle might have had (given the scale of the action) could be averaged across. And relativistic effects, like what speed does to mass and time, could be included as contributions to the final result as well, giving a picture of an action zeroed to some definite reference frame.
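The stationary-action idea at the heart of the path integral can be sketched in a toy calculation. This is a minimal illustration with made-up units, not a real path integral: for a free particle travelling between two fixed endpoints, we scan over simple two-segment "bent" paths and confirm the action is minimised on the straight classical path (the one the stationary-phase argument picks out).

```python
# Toy stationary-action check for a free particle going from
# (t=0, x=0) to (t=T, x=X). Candidate paths are two straight legs
# through a varied midpoint (T/2, y). Units and values are arbitrary.
m, T, X = 1.0, 2.0, 1.0

def action(y):
    """Kinetic action S = integral of (m/2) v^2 dt along the bent path."""
    v1 = (y - 0.0) / (T / 2)   # velocity on the first leg
    v2 = (X - y) / (T / 2)     # velocity on the second leg
    return 0.5 * m * v1**2 * (T / 2) + 0.5 * m * v2**2 * (T / 2)

midpoints = [i / 100.0 for i in range(-100, 201)]  # scan y from -1.0 to 2.0
best = min(midpoints, key=action)

print(best)  # the straight-line midpoint, X/2
```

In the full quantum treatment every path contributes a phase exp(iS/ħ); paths far from the stationary one cancel by interference, which is why the classical trajectory dominates when the action is large compared to h.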
Cartan Gravity: SR, GR, QM and QFT are the familiar fab four. But far less well known is that there is a third leg to this story of the grand consolidation of the theories. As Koelman points out, logic demands there was also Cartan Gravity – an effort to match SR/c and QM/h with a generalisation of Newtonian mechanics that dealt solely with G. And then following that, even a Cartan Quantum Gravity that unified G and h.
Now the Cartan notion of space is based on torsion (as opposed to, say, curvature). But I confess I am not clear why it is not a big deal like quantum theory and relativity. Perhaps combining G and NM has little technological value (QM especially has been the basis of valuable everyday application). Certainly Newtonian gravity deals with the classical scale of interaction perfectly adequately, given that massive gravitational fields are not the kind of thing we can bring to bear on nature in the same way we can with c- or h-scaled phenomena.
Anyway, it is said Cartan theory may yet come into its own as the basis for loop quantum gravity or other ultimate theories where forces have to be modelled as twists in space. And certainly it is a necessary third leg of the theory unification process. It is logical that this route up the same mountain should also be possible.
Theory of Everything: So that then just leaves one final step – a ToE or Quantum Gravity (QG) theory that hoovers up all three Planck constants, cGh, into the mothership of reciprocal dimensional maths.
As Koelman says, this seems to require a further extension of the sum-over-histories approach where QFT is enlarged to include G or spacetime curvature. As well as averaging across the uncertain energy levels of a particle and any global relativistic contributions, the calculation would have to average across any local uncertainty in the spacetime the particle is meant to be travelling through (or the excitation and the field it is “happening” in). QFT can in fact give approximate answers of that kind by imposing a cut-off on local gravitational contributions, but a properly elegant way of doing this – one which shows how the three dichotomies can be both internalised and also connected to each other as some kind of fundamental geometric relationship – is still a work in progress.