Actually, one definition of Entropy is "the arrow of time". Time is a measure of Change, and Entropy is the direction of Change (e.g. from hot to cold). So, Entropy and Time are necessarily entangled, just like Space-Time. For example, Entropy is the effect of Time on Spatial objects. The notion of Temporal Entropy, then, is not trivial, but essential to the "way of the world". Moreover, Evolution would have vanished into nothing eons ago, if not for positive Natural Selection opposing the negative effects of Random Mutation. Hence, Time moves in a partly positive direction (progress), despite negative Entropy, owing to the complex workings of Space-Time-Entropy-Energy (i.e. four dimensions). :smile:
Would it be useful to consider a four dimensional (i.e. time inclusive) form of entropy? — Count Timothy von Icarus
Yes. Classical macro Time is intuitive, hence easy to understand. It's just the measure of cycles that are meaningful to humans: day/night, moon phases, etc. Quantum micro Time, not so much. We are not normally aware of cycles at subatomic scales: Energy field phases & Radioactive Decay. Also, Cosmic cycles are measured in billions of Earth years, hence not perceivable in a human lifetime. Plus, Einstein's Block Time has no cycles at all. So, the concept of Time that we take for granted is just one way of measuring Change in the physical world. That's not to mention the subjective experience of Time that varies due to emotional states. Consequently, like everything else in Einstein's worldview, Time is relative to the Observer.
The problem, which has been proposed since Boltzmann's time, is that time and entropy are not necessarily correlated. They are contingently so. This is known as Loschmidt's paradox. — Count Timothy von Icarus
"Enformy" is my own term for what physicists refer to as "Negentropy". But that scientific nomenclature, taking Entropy as primary, makes it sound like a Fatalistic Force. On the contrary, "to enform" means "to give meaningful form to . . ." In other words, it's a creative force in nature. And Evolution is the record of an ongoing series of emergent forms, from a formless beginning (the abstract mathematical Singularity).Interesting. Enformy seems like it would be an emergent factor from : The laws of physics being what they are and allowing for complexity to emerge. — Count Timothy von Icarus
Would it be useful to consider a four dimensional (i.e. time inclusive) form of entropy?
Entropy is often defined as the number of possible microstates (arrangements of particles) consistent with an observed macrostate.
Time entropy would be the number of possible past states consistent with an observed present state. Is this potentially useful? — Count Timothy von Icarus
It's worth noting that for use in some physics problems, entropy's definition is altered to be: the total possible number of microstates consistent with all the information we have about a system, such that, as you complete more measurements and gain information about a system, the "entropy" goes down because your information continually rules out certain microstates.
This may be a better definition because the number of potential microstates for a fully unobserved system is obviously infinite. Observing a "macrostate" is getting information about a system, which is then reducing the possible configurations the system can have. So while the definitions seem different, it isn't clear to me that they actually are. The idea of naively observed "macrostates" may be one of those concepts we inherited from "common sense," that work well enough for some problems, but hurt us in the long run. — Count Timothy von Icarus
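To keep the two definitions straight (my own labels and restatement, not anything from the quoted posts): the first is the familiar counting form,

S = k_B \ln W(M),

where W(M) is the number of microstates compatible with the observed macrostate M; the second conditions on everything measured so far,

S = k_B \ln W(D),

where W(D) is the number of microstates compatible with the full set of observations D. Each new measurement adds to D and can only shrink W(D), so on this reading entropy is indexed to the observer's information.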
Perhaps what you have in mind are past microstates that, when evolved into the present, would be consistent with the present macrostate? In other words, past microstates that evolve into any of the microstates that make up the present macrostate.
Now, let me backtrack a bit and reexamine your definition of entropy. A given macrostate induces a particular statistical distribution of microstates. When a system is at thermodynamic equilibrium (and thus at its maximum entropy), all its microstates have the same probability of occurrence. Then and only then can we calculate the entropy simply by counting the number of microstates. In all other cases* entropy can be calculated, per Gibbs' definition, as a weighted sum over the probabilities of the microstates in a statistical ensemble. Given a time discretization, we can then sum over the past microstate distributions that lead to the present distribution to obtain your "time entropy."
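In symbols (this is just the standard textbook form, nothing specific to this thread):

S = -k_B \sum_i p_i \ln p_i,

which reduces to the simple count S = k_B \ln W when all W microstates are equally probable (p_i = 1/W). Your "time entropy" at a given discretized step would then, I take it, be this same expression evaluated over the distribution of past microstates compatible with the present macrostate.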
Am I on the right track?
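To make the counting concrete, here is a minimal toy sketch. Every detail is my own invented example, purely for illustration: eight binary cells, an arbitrary non-reversible update rule (step), and a "macrostate" that is just the number of occupied cells (macro). It is not meant as a physical model.

```python
from itertools import product
from math import log

N_CELLS = 8

def step(state):
    """One time step of an arbitrary toy dynamics: each cell becomes the
    logical OR of itself and its right neighbour (periodic boundary)."""
    return tuple(state[i] | state[(i + 1) % N_CELLS] for i in range(N_CELLS))

def macro(state):
    """Observed macrostate: just the total number of occupied cells."""
    return sum(state)

ALL_STATES = list(product((0, 1), repeat=N_CELLS))

observed_macro = 5  # the macrostate we "measure" in the present

# Boltzmann-style count: present microstates compatible with the observation.
present = [s for s in ALL_STATES if macro(s) == observed_macro]

# "Time entropy" count: past microstates that evolve, in one step,
# into some microstate compatible with the present observation.
past = [s for s in ALL_STATES if macro(step(s)) == observed_macro]

print("present microstates:", len(present), "-> S ~", round(log(len(present)), 3))
print("possible past microstates:", len(past), "-> S_time ~", round(log(len(past)), 3))
```

Running it just compares the log of the ordinary microstate count against the log of the count of one-step-earlier states, which is, as far as I can tell, the "time entropy" being proposed.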