Gravity is just a name for the acceleration of any two masses towards each other. — Banno
What’s causing precise acceleration? — ucarr
Respective masses curving spacetime. — 180 Proof
Are you talking about gravitational attraction?
You hold this equation in contempt? — ucarr
Saying gravity causes acceleration is just saying the acceleration between two masses causes the acceleration between two masses. — Banno
Gravity and acceleration-due-to-gravity are, in a certain sense, as one. They are conjoined as a unified concept: gravity-and-acceleration. Thus cause and effect are, in the same sense, as one, save one stipulation: temporal sequencing. — ucarr
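A minimal numerical sketch of the point at issue, assuming the equation under discussion is Newton's law of gravitation (an assumption on my part; the thread never displays the equation itself): each body's acceleration toward the other is fixed by the other mass and the separation, which is why "gravity" and "acceleration due to gravity" can be treated as one quantity with two names.

```python
# Minimal sketch, assuming Newton's law of gravitation F = G * m1 * m2 / r**2
# (an assumption; the thread does not quote the equation it refers to).
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def mutual_accelerations(m1, m2, r):
    """Acceleration of each mass toward the other at separation r (in metres)."""
    force = G * m1 * m2 / r**2       # magnitude of the attractive force on either body
    return force / m1, force / m2    # a1 = G*m2/r^2, a2 = G*m1/r^2

# A 1 kg object and the Earth, separated by one Earth radius:
a_object, a_earth = mutual_accelerations(1.0, 5.972e24, 6.371e6)
print(a_object)  # ~9.8 m/s^2: "gravity" here just names this acceleration
print(a_earth)   # ~1.6e-24 m/s^2: the Earth's (tiny) acceleration toward the object
```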
..."time" is neither "temporal" nor a "phenomenon". (I think you're confusing (your) maps with the territory.) — 180 Proof
What’s the critical operation between cause and effect when considered as conjunction: time? — ucarr

No. IMO, wrong, or incoherent, question (i.e. misuse of terms). — 180 Proof
Are there any observable boundaries time cannot merge? — ucarr
More incoherence. "Time" is a metric (i.e. parameter), ucarr, not a force or agent. — 180 Proof
(I think you're confusing (your) maps with the territory.) — 180 Proof
(I model mathematical causal chains as compositions of functions. A result (effect) at a time t is, say, z. The next temporal step is to compute s, where s=f(z), then after that, r, where r= g(s), and so on. There's a whole theory herein. But I think it more realistic to assume several functions act on z, not just one. Like differing forces. So each step - and these are associated with intervals of time - has as outcome the influence of a number of "forces", rather than a single function.) — jgill
I model mathematical causal chains as compositions of functions. — jgill
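A minimal sketch of the model jgill describes, as I read it: a causal chain as iterated composition of functions, with several "forces" acting over each time interval rather than a single one. The particular functions below are invented purely for illustration.

```python
from functools import reduce

# Illustrative "forces" only; nothing here is tied to a specific physical model.
def f(z): return z + 1.0        # a constant push
def g(s): return 0.5 * s        # a damping influence
def h(r): return r * r          # a nonlinear interaction

def step(state, forces):
    """One time interval: the outcome reflects several functions acting in turn."""
    return reduce(lambda acc, force: force(acc), forces, state)

# The chain z -> f(z) -> g(f(z)) -> ... extended over four intervals:
state = 2.0
history = [state]
for _ in range(4):
    state = step(state, [f, g, h])   # several influences per step, not just one
    history.append(state)

print(history)
```

Whether the influences should compose in sequence, as here, or combine in some other way (additively, say) is exactly the kind of modelling choice the quoted remark leaves open.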
"Forward-flowing" is a cognitive illusion and intuitive way of talking about asymmetric change. "History" represents time-as-past-tense-narrative (i.e. a ghost story). Particle physicists refer to worldlines (or many-worlds branchings) and statistical mechanics refer to entropy gradients. I still don't see what your musings, ucarr, have to do with philosophy. What's the philosophical itch you're trying to get us to scratch? State it plainly.forward-flowing of history — ucarr
I still don't see what your musings, ucarr, have to do with philosophy. What's the philosophical itch you're trying to get us to scratch? State it plainly. — 180 Proof
Do you think the forward-flowing of history comprises the physical phenomena populating our empirical experiences? — ucarr
"Forward-flowing" is a cognitive illusion and intuitive way of talking about asymmetric change. "History" represents time-as-past-tense-narrative (i.e. a ghost story). Particle physicists refer to worldlines (or many-worlds branchings) and statistical mechanics refer to entropy gradients. — 180 Proof
Particle physicists refer to worldlines (or many-worlds branchings) and statistical mechanics refer to entropy gradients — 180 Proof
"what is causing galaxies to deviate from the predictions of our models?" Such causes get posited as new elements of a model a in many subfields uncovering the nature of these causes becomes a major, or the major topic of research, e.g. dark matter and dark energy. — Count Timothy von Icarus
The gist of my claim herein is that the above quote describes our fluidly transforming world as an ongoing continuity of boundary crossings, boundary mergers, Venn-diagram overlaps, and transcendence of boundaries. — ucarr
So for example, someone in another thread suggested to me that we could model an atom as a system. However, the natural state of atoms is to exist within complex molecules, where parts (electrons for example) are shared. If two atoms share an electron, and the atoms themselves are being modeled as distinct systems, then in each model, the shared electron is both an internal part of the inertial continuity of the system, and also a part of the other system, thereby acting as a causal force of change on that same system. In other words, from this 'systems' perspective, the electron must be understood as both a part of the inertial continuity of the system, and a causal force of change to the system (being a part of an external system), at the same time. — Metaphysician Undercover
I think the arbitrary nature of system boundaries is akin to other problems in the sciences and even humanities. For example, in semiotic analysis/communications, a physical entity, say a group of neurons, might act as object, symbol, and interpretant during the process, depending on the level of analysis that is used. But at a certain point, the ability of any one component to convey aspects of the total message breaks down. E.g., a single logic gate can't, by itself, hold the number "8." Certain relationships only exist at higher levels of emergence, like your example of shared electrons. — Count Timothy von Icarus
I don't know wtf I'm talking about, jgill, but somebody with real QM chops is bound to come along who can talk mathematical physics to a mathematician. :sweat: — 180 Proof
Again take a box with a partition in it, with gas A on one side, gas B on the other side, and both gases are at the same temperature and pressure. If gas A and B are different gases, there is an entropy that arises once the gases are mixed. If the gases are the same, no additional entropy is calculated. The additional entropy from mixing does not depend on the character of the gases; it only depends on the fact that the gases are different. The two gases may be arbitrarily similar, but the entropy from mixing does not disappear unless they are the same gas - a paradoxical discontinuity...
As a central example in Jaynes' paper points out, one can develop a theory that treats two gases as similar even if those gases may in reality be distinguished through sufficiently detailed measurement. As long as we do not perform these detailed measurements, the theory will have no internal inconsistencies. (In other words, it does not matter that we call gases A and B by the same name if we have not yet discovered that they are distinct.) If our theory calls gases A and B the same, then entropy does not change when we mix them. If our theory calls gases A and B different, then entropy does increase when they are mixed. This insight suggests that the ideas of "thermodynamic state" and of "entropy" are somewhat subjective.
In the case of two distinct gases, an act of mixing is required, and this requires time and energy. In the case of the gases being the same, it appears as if the gases have already mixed as soon as the separation is removed. That's just an illusion: mixing has not occurred, as marking the molecules would reveal.
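A small numerical illustration of the discontinuity the quoted passage describes, using the standard ideal-gas entropy-of-mixing expression, ΔS = -nR(x_A ln x_A + x_B ln x_B) when the species are treated as distinguishable and zero when they are treated as the same gas; whether the two samples count as "different" is a modelling decision, which is Jaynes' point about the partly subjective character of the thermodynamic description.

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def mixing_entropy(n_a, n_b, distinguishable):
    """Entropy of mixing two ideal-gas samples at equal temperature and pressure.

    If the theory treats the samples as the same gas, no mixing entropy is
    assigned; if it treats them as different, the full term appears, however
    slight the physical difference. That jump is the paradoxical discontinuity.
    """
    if not distinguishable:
        return 0.0
    n = n_a + n_b
    x_a, x_b = n_a / n, n_b / n      # mole fractions once the partition is removed
    return -n * R * (x_a * math.log(x_a) + x_b * math.log(x_b))

print(mixing_entropy(1.0, 1.0, distinguishable=True))   # ~11.5 J/K
print(mixing_entropy(1.0, 1.0, distinguishable=False))  # 0.0: same gas, nothing to mix
```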
I guess I give up, having not been able to follow what it is you might be claiming. — Banno
What you have to say is too muddled to have any reverberation. — Banno