How is it that the normal distribution occurs all the time? It seems at the macro-level, at least, the more likely events occur more of the time.
At the scale of the very small, that rule seems violated. Which may be no more than a case of different rules - very different rules. Or no rules at all. Or a third case: rules, but not that we can determine because of fundamental limits to our ability to determine rules - at least so far. — tim wood
The Galton board is a good example. But doesn’t it illustrate the way that micro chance and macro determinism are yoked together?
The board engineers things so that every peg gives a 50-50 probability of deflecting a falling ball to its left or right. The randomness is deliberately maximised at this level - or else it is a loaded board. We can argue that no board could ever be so perfectly engineered. Each peg might be infinitesimally biased. But the point of the exercise is to approach the limit of pure randomness at this level.
Then given a perfect board, it will produce a perfectly determined probability distribution. At the macro level, you can be absolutely certain of a nice and tidy Gaussian distribution emerging from enough trials.
Each ball hits 7 pegs on the way down, and each deflection is a 50-50 split, so a ball ends up in one of 8 bins. There is only one way to reach each outside bin - 7 left or 7 right deflections in a row. But there are 70 ways to land in the central two bins - 35 paths each, from the near-even 4-3 and 3-4 mixes of left and right deflections.
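The path counting above can be checked with a short Monte Carlo sketch of a 7-row board. The bin a ball lands in is simply its count of rightward deflections, so the simulated bin fractions should track the binomial coefficients C(7, k)/2^7 (the row count and trial count here are just illustrative choices).

```python
import random
from math import comb

ROWS = 7           # pegs each ball hits on the way down
TRIALS = 100_000   # balls dropped

# Each deflection is an independent 50-50 event; the bin index is
# the number of rightward deflections (0..7).
counts = [0] * (ROWS + 1)
for _ in range(TRIALS):
    rights = sum(random.random() < 0.5 for _ in range(ROWS))
    counts[rights] += 1

# Expected fraction of balls in bin k is C(7, k) / 2**7:
# 1, 7, 21, 35, 35, 21, 7, 1 paths out of 128.
for k in range(ROWS + 1):
    expected = comb(ROWS, k) / 2 ** ROWS
    print(f"bin {k}: simulated {counts[k] / TRIALS:.3f}  expected {expected:.3f}")
```

The two central bins dominate because they have 35 paths each, while each outside bin has exactly one - micro chance at every peg, but a macro histogram that is all but determined.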
So the individual pegs provide the pure chance. But the board as a whole imposes a sequential history on what actually happens - a certainty about the number of 50-50 events and the number of different histories, or paths through the maze, that describe the one final outcome.
So in a Platonically perfect world, the micro and the macro scale are engineered to represent the opposing ideals of the accidental and the determined. The system isn’t either the one or the other in some deeper metaphysical sense. It is designed to represent the dialectic of accidental versus determined as being the proper model of a reality that is probabilistic.
Chance and determinism are yoked in a reciprocal relation as the opposing limits of nature. Micro-level chance and macro-level determinism are how we get a system with a stochastic character.
Then of course, the problem is that the real world may not be amenable to such perfect engineering. This is where chaos and quantum effects impact on things.
Chaos is about non-linearity. It is written into our assumptions about the pegs and the board that we can keep any imprecision in our engineering within linear bounds. Any bias or error in the construction will itself be averaged away in Gaussian fashion. But if there is non-linearity of some kind - maybe the pegs are springy in a way that sets up reverberations - then errors of prediction will compound at an exponential rate. The attempt to engineer a perfect distinction between local randomness and global determinism will go off course because of the emergence of non-linear divergences that lead to new kinds of internal correlations, or synchronised behaviour.
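The exponential compounding can be illustrated in a few lines. The logistic map below is a stand-in for any non-linear feedback (springy pegs, say), not a model of an actual board: two trajectories that start with an engineering "bias" of one part in ten billion are tracked together, and the gap between them blows up rather than averaging away.

```python
# Two copies of the logistic map in its chaotic regime (r = 4),
# differing only by a tiny initial discrepancy. In a linear system
# the gap would stay tiny; here it doubles on average each step.
r = 4.0
x, y = 0.3, 0.3 + 1e-10   # a one-part-in-10-billion "engineering error"

max_gap = 0.0
for step in range(60):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    max_gap = max(max_gap, abs(x - y))

print(max_gap)   # the gap has grown by many orders of magnitude
```

Within a few dozen iterations the discrepancy saturates at the scale of the system itself, which is the sense in which non-linearity defeats any attempt to keep construction errors within linear, self-averaging bounds.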
Then quantum uncertainty also affects our perfect engineering. If the Galton board is very small or very hot, then it is going to start to misbehave. Everything from the balls, to the pegs, to the board as a whole, will be fluctuating in ways that undermine both the clean randomness of each deflection event and the clean determinism of the countable ensemble of paths as a whole.
Again the classical picture of a world cleanly split between absolute chance and absolute constraint will lose its linearity and become subject to an excess of divergence and/or an excess of correlation.
We will arrive at the quantum weirdness of a physical system that either diverges at every event to create a many world ensemble of separate histories, or we have to accept the other available interpretation - that there are spooky non-local correlations limiting the chaos.
So what I am arguing is that the classical picture demands some kind of monistic commitment - either reality is fundamentally based on determinism or chance. But our best models of randomness or probability are intrinsically dichotomistic. It is essential to construct a system - whether it is a die, a coin, a Galton board, a random number generating algorithm - that exemplifies indifferent chance at the micro scale and constraining history at the macro scale.
Then we learn in fact that physical reality can’t be so perfectly engineered. We can approach linearity, but only by suppressing non-linearity. To achieve our Platonic image of the ideal gaming device, we have to do work to eliminate both its potential for divergence - too much local independence in terms of accumulating history - as well as the opposite peril of a system with too much internal correlation, or too many emergent intermediate-scale interactions.