• Count Timothy von Icarus
    1.9k
    Hello friends,

    Long time no see. Been very busy with work and the birth of my son. However, a nagging idea I had has brought me back:

    Would it be useful to consider a four dimensional (i.e. time inclusive) form of entropy?

    Entropy is often defined as the number of possible microstates (arrangements of particles) consistent with an observed macrostate.

    Time entropy would be the number of possible past states consistent with an observed present state. Is this potentially useful?

    (I searched for this idea existing already but couldn't find it. I have little doubt someone else has thought of it, though, so I would love to know what it is called. I will call it "time-entropy" for now.)
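    Here is a minimal toy sketch of the counting definition above (four two-state "particles", Boltzmann's constant set to 1; the system and numbers are invented purely for illustration): the macrostate is the total number of heads, and the entropy is the log of the number of microstates consistent with it.

    ```python
    from itertools import product
    from math import log

    N = 4  # a toy "gas" of 4 two-state particles (coins)

    # Every possible arrangement of heads (1) and tails (0) is a microstate.
    microstates = list(product([0, 1], repeat=N))

    def entropy_of_macrostate(heads):
        # S = ln W, where W is the number of microstates consistent
        # with the observed macrostate "total number of heads".
        W = sum(1 for m in microstates if sum(m) == heads)
        return log(W)

    for k in range(N + 1):
        print(f"{k} heads: S = {entropy_of_macrostate(k):.3f}")
    ```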


    That's the question, below are some thoughts and clarifications:

    It's worth noting that for use in some physics problems, entropy's definition is altered to: the total number of possible microstates consistent with all the information we have about a system. As you complete more measurements and gain information about the system, the "entropy" goes down, because your information continually rules out certain microstates.

    This may be a better definition, because the number of potential microstates for a fully unobserved system is obviously infinite. Observing a "macrostate" is getting information about a system, which then reduces the possible configurations the system can have. So while the definitions seem different, it isn't clear to me that they actually are. The idea of naively observed "macrostates" may be one of those concepts we inherited from "common sense": they work well enough for some problems, but hurt us in the long run.
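    A minimal sketch of that information-based reading, using the same toy coin system as above (all names and numbers are illustrative): take the entropy as the log of the count of microstates still consistent with everything measured so far, and note that each new measurement can only shrink that count.

    ```python
    from itertools import product
    from math import log

    N = 4
    microstates = set(product([0, 1], repeat=N))

    def entropy(consistent):
        # Log of the number of microstates still compatible with what we know.
        return log(len(consistent))

    print(f"no information:       S = {entropy(microstates):.3f}")

    # Measurement 1: we learn the macrostate "exactly 2 heads".
    known = {m for m in microstates if sum(m) == 2}
    print(f"after macrostate:     S = {entropy(known):.3f}")

    # Measurement 2: we additionally learn the first coin shows heads.
    known = {m for m in known if m[0] == 1}
    print(f"after a second probe: S = {entropy(known):.3f}")
    ```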

    What is interesting here is that, for any system, time-entropy would increase going back further in time. After all, a system a second before an observation might have only one possible configuration, but if you go further back, it is possible that multiple different configurations could have led to the present outcome.
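    A toy illustration of that backwards proliferation (the update rule here is invented purely for the example and is not meant to model any real dynamics): under a non-invertible rule, several states can share the same successor, so the count of possible pasts consistent with the observed present grows as you look further back.

    ```python
    from itertools import product

    N = 6
    states = list(product([0, 1], repeat=N))

    def step(s):
        # Toy, non-invertible update: each bit becomes the OR of itself and
        # its right neighbour (wrapping around). Information is lost, so a
        # state can have several possible predecessors.
        return tuple(s[i] | s[(i + 1) % N] for i in range(N))

    def evolve(s, t):
        for _ in range(t):
            s = step(s)
        return s

    present = (1,) * N  # the observed present microstate

    for t in range(1, 6):
        # "Time-entropy" counting: how many states could the system have
        # been in t steps ago, given that it is in `present` now?
        pasts = [s for s in states if evolve(s, t) == present]
        print(f"{t} step(s) back: {len(pasts)} possible past states")
    ```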

    For example, if a biologist spots a new insect that looks a lot like a known species, it is possible they share a close common ancestor, but it is also possible that the similarities are the result of convergent evolution (which entails many more possible prior states).

    What I find interesting about a potential "time-entropy" measure is that it necessarily stays the same or increases with time, whereas the tendency towards increasing normal entropy in the universe seems ad hoc; it doesn't have to be that way. A universe with a starting point starts with one configuration. Over time it can take on new configurations or it can stay the same, but it can never see a reduction in the number of potential states that would lead to its given present state, or in the number of actual states it has had.

    This might be begging the question though, as it assumes an arrow of time that we currently use the increase in normal entropy to define. But perhaps it is getting at an essential reason for why time is the way it is?

    I know one objection here is that, if the universe is deterministic, then there is, in fact, only one set of possible past states for every observation and one set of future states. I will allow that, but this is also entirely true for current, run-of-the-mill spatial entropy. If everything is deterministic, then there is, in fact, just one possible microstate for every system: the one it actually has. The illusion of possibilities only comes from us having finite/partial information about a system in the first place. The mathematical surprise of any microstate that actually exists in a deterministic universe is, in fact, zero for someone who has all the information.

    I don't think this is actually a problem. If information is physical, then it is impossible to have all the information about the universe. Hell, it is impossible to encode the movements of even one mole of hydrogen gas right now, since you have Avogadro's number of particles, each with multiple values associated with different forces to consider in relation to one another, and this requires an enormous amount of storage and computation to simulate. Physics is about how the world is to physical observers; magical entities cause all sorts of problems.

    Another possible issue though is that, if you assume a block time universe, the future already exists. In this case, time-entropy for an observer increases going further into the past and further into the future. The further you go from the present, the more states are consistent with an observed present state. However, I don't think this is a fatal problem either. This doesn't change the fact that the total entropy in terms of all the states the universe has had can only ever go up or stay the same with the forward passage of time. It is merely finite information that causes possible states to proliferate the further you get from the present.

    Time-entropy would increase with the passage of time even if spatial entropy had a tendency to decrease over time (a shrinking universe), because you're still generating more configurations with any change. Question-begging might come up in block universes, though, as the "starting side" of time could be said to be arbitrary. Indeed, if you took a random moment in the middle of time and decided to claim that time flowed outwards in two directions, you'd have the number of states increasing in either direction, so maybe the key concept to take away is that possibilities increase in all directions from any given observation point-event. Not sure if this is trivial.
  • Gnomon
    3.5k
    Would it be useful to consider a four dimensional (i.e. time inclusive) form of entropy?
    Count Timothy von Icarus
    Actually, one definition of Entropy is "the arrow of time". Time is a measure of Change, and Entropy is the direction of Change (e.g. from hot to cold). So, Entropy and Time are necessarily entangled, just like Space-Time. For example, Entropy is the effect of Time on Spatial objects. The notion of Temporal Entropy, then, is not trivial, but essential to the "way of the world". Moreover, Evolution would have vanished into nothing eons ago, if not for positive Natural Selection opposing the negative effects of Random Mutation. Hence, Time moves in a partly positive direction (progress), despite negative Entropy, due to the complex workings of Space-Time-Entropy-Energy (i.e. four dimensions). :smile:

    Entropy and Time :
    The idea that entropy is associated with the “arrow of time” has its roots in Clausius’s statement on the Second Law: “Entropy of the Universe always increases.” However, the explicit association of the entropy with time’s arrow arises from Eddington. In this article, we start with a brief review of the idea that the “increase in entropy” is somehow associated with the direction in which time increases. Then, we examine three different, but equivalent definitions of entropy. We find that none of these definitions indicate any hint of a relationship between entropy and time. We can, therefore, conclude that entropy is a timeless quantity.
    https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7516914/
    Entropy, as a mathematical concept, is indeed timeless. But the effect of Entropy on Spatial objects is what we measure as "Time". So, together, Space-Time-Entropy-Energy is what we know as Cosmic Evolution. The positive twin of evil Entropy is what I call Enformy, which is another name for causal Energy.
  • Count Timothy von Icarus
    1.9k


    The problem, which has been recognized since Boltzmann's time, is that time and entropy are not necessarily correlated. They are only contingently so. This is known as Loschmidt's paradox.

    Boltzmann's logic for thinking that entropy will increase in the future is very sound. If you took the set of all trajectories of every particle in the universe consistent with the current macrostate, the vast, vast majority would trend towards higher entropy. That said, there are indeed potential sets of trajectories that would lead to lower entropy, so we can't know for sure that entropy will always increase, but it's a safe enough bet. This isn't a big issue, because the number of trajectories that trend towards lower entropy is so much smaller than the number that trend towards higher entropy.
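    The counting fact behind that bet can be shown with a toy two-state system (numbers purely illustrative): as the number of particles grows, the fraction of all microstates lying close to the maximum-entropy macrostate rushes towards one, which is the sense in which a "typical" trajectory overwhelmingly ends up in higher-entropy macrostates.

    ```python
    from math import comb

    # For N two-state particles, the macrostate "k heads" contains C(N, k)
    # microstates. The share of microstates whose head-fraction is within
    # 5% of the equilibrium value 1/2 approaches 1 as N grows.
    for N in (10, 100, 1000):
        total = 2 ** N
        near_equilibrium = sum(
            comb(N, k) for k in range(N + 1) if abs(k / N - 0.5) <= 0.05
        )
        print(f"N = {N:5d}: fraction near equilibrium = {near_equilibrium / total:.6f}")
    ```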

    However, there is a deeper problem here. For any one observed state of "medium" entropy, i.e. what we see around us (a universe at neither maximal nor minimal entropy), there are vastly more possible prior states with higher entropy than with lower entropy. That is, Boltzmann's logic works exactly the same in the other direction. Sitting outside the universe, looking at a frozen macrostate from our current moment, we should predict that the entropy will be higher in the future, but also that it would have been higher in the past. This is due to the fact that the past has vastly more ways to be high entropy than low entropy.

    Boltzmann tried to fix this with his H-theorem, but it turned out that the theorem's assumptions (its "molecular chaos" postulate in particular) subtly presuppose a direction of time to begin with.

    That's sort of where I was going with the whole time-entropy thing. It's just a way of flipping the spatial entropy concept on its head to get at mysteries/problems with current statistical mechanics, namely "why the past had a lower entropy." But the neato thing about the time-entropy concept is that, if the start of the universe is not an arbitrary point, this measure does necessarily increase or stay the same with time, unlike our current measure.

    -----

    On a side note, I can see why a "time entropy" could be redundant due to Laplace's Demon. The concept of Laplace's Demon is that an entity, given the exact relative position, velocity, mass, etc. of every particle in the universe, can predict everything that will happen in the future and retrodict everything that has happened in the past from that information. This is because information is conserved (hopefully; some experimental data seems to show violations of the First Law of Thermodynamics, but that's beside the point). So complete knowledge of a microstate limits the number of potential past and future states to just one set (given a theoretical, truly closed system).
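    A tiny sketch of the demon's situation (a random permutation stands in, purely illustratively, for a deterministic, information-conserving dynamics): with the exact microstate in hand there is exactly one past and one future, and multiple possible pasts only appear once knowledge of the present is coarse-grained.

    ```python
    import random

    random.seed(0)
    states = list(range(16))

    # A fixed permutation of the state space: deterministic and invertible,
    # so information about the microstate is conserved.
    forward = states[:]
    random.shuffle(forward)                      # s -> forward[s]
    backward = {forward[s]: s for s in states}   # unique preimage of each state

    present = 7
    print("unique future state:", forward[present])
    print("unique past state:  ", backward[present])

    # With only coarse-grained knowledge ("the present state is even"),
    # many pasts are consistent with what we know.
    coarse_present = {s for s in states if s % 2 == 0}
    possible_pasts = {backward[s] for s in coarse_present}
    print("pasts consistent with the coarse observation:", len(possible_pasts))
    ```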

    What gets me is that this seems to imply that time-entropy, the collection of all past microstates consistent with an observed state, has exactly the same information content as spatial entropy. They both tell you the same thing, namely every possible state of a system. But if they are the same information, why does one seem like it should increase in both directions whilst the other can only increase or stay the same?
  • Gnomon
    3.5k
    The problem, which has been recognized since Boltzmann's time, is that time and entropy are not necessarily correlated. They are only contingently so. This is known as Loschmidt's paradox.
    Count Timothy von Icarus
    Yes. Classical macro Time is intuitive, hence easy to understand. It's just the measure of cycles that are meaningful to humans: day/night; moon phases, etc. Quantum micro Time, not so much. We are not normally aware of cycles at subatomic scales: Energy field phases & Radioactive Decay. Also, Cosmic cycles are measured in billions of Earth years, hence not perceivable in a human lifetime. Plus, Einstein's Block Time has no cycles at all. So, the concept of Time that we take for granted is just one way of measuring Change in the physical world. That's not to mention the subjective experience of Time that varies due to emotional states. Consequently, like everything else in Einstein's worldview, Time is relative to the Observer.

    Likewise, the general concept of Entropy is intuitive, as we are familiar with hot coffee that gets cold in a short time. But technically, Entropy is a measure of predictability into the future, which varies both with degrees of physical order/disorder, and with the subjective perception of the observer. So, we have two moving targets to shoot at, when we define Time and Entropy. And that's not to mention the philosophical paradoxes of Time Reversal, which in theory should be possible, but is never actually observed in practice.

    Therefore, I have come to realize that both Time & Entropy are correlated to a third function of Change, which we define in various ways depending on our frame of reference. I call that third aspect of Causation/Change : Enformy. It seems that Time & Entropy & Enformy are interrelated at all levels of reality : microstates, macrostates, and cosmicstates, that are never static -- except in mental snapshots -- but always changing, evolving and emerging. :nerd:


    Time's Errant Arrow :
    Time is arguably among the most primitive concepts we have—there can be no action or movement, no memory or thought, except in time. . . . .
    "What is time? If nobody asks me, I know; but if I were desirous to explain it to one that should ask me, plainly I know not." ___Augustine . . . .
    "... Philosophers tend to be divided into two camps. On one side there are those who regard the passage of time as an objective feature of reality, and interpret the present moment as the marker or leading edge of this advance. Some members of this camp give the present ontological priority, as well, sharing Augustine's view that the past and the future are unreal. Others take the view that the past is real in a way that the future is not, so that the present consists in something like the coming into being of determinate reality. .... Philosophers in the opposing camp regard the present as a subjective notion, often claiming that now is dependent on one's viewpoint in much the same way that here is. Just as "here" means roughly "this place", so "now" means roughly "this time", and in either case what is picked out depends where the speaker stands. In this view there is no more an objective division of the world into the past, the present, and the future than there is an objective division of a region of space into here and there. . . .
    Often this is called the block universe view, the point being that it regards reality as a single entity of which time is an ingredient, rather than as a changeable entity set in time."

    http://www.scholarpedia.org/article/Time%27s_arrow_and_Boltzmann%27s_entropy

    Enformy :
    In the Enformationism theory, Enformy is a hypothetical, holistic, metaphysical, natural trend or force, that counteracts Entropy & Randomness to produce complexity & progress. . . . .
    1. I'm not aware of any "supernatural force" in the world. But the Enformation theory postulates that there is a meta-physical force behind Time's Arrow and the positive progress of evolution. Just as Entropy is sometimes referred to as a "force" causing energy to dissipate (negative effect), Enformy is the antithesis, which causes energy to agglomerate (additive effect).
    2. Of course, neither of those phenomena is a physical Force, or a direct Cause, in the usual sense. But the term "force" is applied to such holistic causes as a metaphor drawn from our experience with physics.

    http://blog-glossary.enformationism.info/page8.html
  • Count Timothy von Icarus
    1.9k
    Interesting. Enformy seems like it would be an emergent factor from:

    The laws of physics being what they are and allowing for complexity to emerge.

    The universe starting in a low entropy state due to conditions during/preceding/shortly after the Big Bang (The Past Hypothesis)

    The fact that this tendency from a low entropy past to a high entropy future creates natural selection effects on far from equilibrium physical systems.

    The fact that the mathematics/physics of self-organizing / self-replicating systems works the way it does and synchronization can occur in disorganized, chaotic systems.

    The fact that information about the environment confers adaptive advantage on a replicating system.

    The fact that fundamental information in the universe can be encoded at higher and higher levels of emergence in a sort of fractal self-similarity.

    ---

    On an unrelated note, I'm realizing now that this measure of entropy is sort of dumb for any finite, closed system. The Poincaré recurrence theorem dictates that a finite, isolated/closed system will eventually return arbitrarily close to states it has already occupied, so, given a long enough time, nearly every accessible state will have been a past state of the system at some point.
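    A minimal sketch of why that happens in the finite, deterministic, reversible case (a random permutation stands in for the dynamics here; everything in the example is a toy assumption): iterating an invertible rule on a finite state space traces out a cycle, so the system is guaranteed to come back to where it started.

    ```python
    import random

    random.seed(1)
    n = 20
    rule = list(range(n))
    random.shuffle(rule)   # an invertible, deterministic update: s -> rule[s]

    start = 0
    state, steps, past_states = start, 0, set()
    while True:
        past_states.add(state)
        state = rule[state]
        steps += 1
        if state == start:
            break

    print(f"recurred to the starting state after {steps} steps")
    print(f"distinct past states accumulated along the way: {len(past_states)}")
    ```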

    I was unaware of the Gibbs Paradox and came up with the same problem myself, but the folks at the Physics Stack Exchange helped me out. I was not aware that there were multiple entropy formulas, or that the measure is somewhat subjective. I feel like every subject is an onion with no bottom.
  • Gnomon
    3.5k
    Interesting. Enformy seems like it would be an emergent factor from: The laws of physics being what they are and allowing for complexity to emerge.
    Count Timothy von Icarus
    "Enformy" is my own term for what physicists refer to as "Negentropy". But that scientific nomenclature, taking Entropy as primary, makes it sound like a Fatalistic Force. On the contrary, "to enform" means "to give meaningful form to . . ." In other words, it's a creative force in nature. And Evolution is the record of an ongoing series of emergent forms, from a formless beginning (the abstract mathematical Singularity).

    Since Energy is the universal Causal Force in nature, we could conclude that energy is also Enformy. But Energy is both creative and corrosive, both constructive Enformy and destructive Entropy. Also, the Big Bang -- imagined as an explosion -- would seem to be a deconstructive event. Instead, it began to self-organize into matter/antimatter, and thence into Darwin's "endless forms most beautiful"*1. Therefore, I view Energy simply as the Potential for Change, which can be construed as good or bad, positive or negative. Hence, Enformy and Entropy are emergent factors of Energy, as positive & negative forms of Causation (i.e. the defining & organizing laws of physics).

    Ironically, we usually think of Evolution as progressive from simple elements to complex compounds. But it is also digressive, in that most of its new forms don't survive the life or death competition of Random Change. So, if it were not for the "law of nature" we call Natural Selection, the nascent universe would have blinked-out long long ago. And I view Enformy as the embodiment of that natural tendency to progressive change.

    From a viewpoint outside our universe, the Arrow of Time would seem to be pointing downward toward a cold-dark-heat-death. But, from the local perspective of living & thinking beings inside the Causal Train, Time's Arrow appears to be pointing upward, toward greater organization & complexity, hence Creativity. So, the Big Bang can be viewed as both a destructive explosion of Entropy, and a constructive expansion of Enformy. Personally, I prefer the more uplifting worldview. :cool:


    *1. Darwin :
    “Thus, from the war of nature, from famine and death, the most exalted object which we are capable of conceiving, namely, the production of the higher animals, directly follows. There is grandeur in this view of life, with its several powers, having been originally breathed into a few forms or into one; and that, whilst this planet has gone cycling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved.”
    https://www.goodreads.com/quotes/3895-thus-from-the-war-of-nature-from-famine-and-death

    Negentropy is reverse entropy. It means things becoming more in order. By 'order' is meant organisation, structure and function: the opposite of randomness ...
    https://simple.wikipedia.org/wiki/Negentropy

    Enformy :
    The BothAnd principle is a corollary of the Enformationism thesis. It views the world as a process motivated and guided by antagonistic-yet-complementary powers. For example, Energy is the motive force for all physical actions, but its positive effects are offset by the, less well known, antithetical force of Disorganization, in the great dialectical process of evolution. The overall effect of Change in the universe is detrimental, as encapsulated in the concept of Entropy (negative transformation). Yet, by balancing destructive Entropy with constructive Enformy (self-organization), evolution has proven to be a creative process. However, since the existence of “Enformy” has not yet been accepted by mainstream science --- except in the crude concept of “negentropy” --- any worldview based on such a flimsy foundation is likely to be dismissed by either/or empiricists as a bunch of Woo. Yet, all scientific & philosophical speculation inevitably begins with a leap of imagination. And this hybrid world-view is one such leap into the unknown.
    http://www.bothandblog.enformationism.info/page17.html

    BIG BANG : EXPLOSION or EXPANSION --- DIVERGENT or EMERGENT ?
  • SophistiCat
    2.2k
    Interesting topic. Entropy, macro- and microstates are notoriously tricky subjects, even to specialists.

    Would it be useful to consider a four dimensional (i.e. time inclusive) form of entropy?

    Entropy is often defined as the number of possible microstates (arrangements of particles) consistent with an observed macrostate.

    Time entropy would be the number of possible past states consistent with an observed present state. Is this potentially useful?
    Count Timothy von Icarus

    It is not very clear to me what you mean by this formulation. A microstate is consistent with a macrostate in the sense that it is consistent with the macroscopic variables that make up the macrostate, such as temperature and pressure. In general, those state variables change over time, so that past microstates will not be consistent with the present macrostate in the same sense in which the present microstate partition is consistent with it (except in the static limit).

    Perhaps what you have in mind are past microstates that, when evolved into the present, would be consistent with the present macrostate? In other words, past microstates that evolve into any of the microstates that partition the present macrostate.

    Now, let me backtrack a bit and reexamine your definition of entropy. A given macrostate induces a particular statistical distribution of microstates. When a system is at thermodynamic equilibrium (and thus at its maximum entropy), all its microstates have the same probability of occurrence. Then and only then can we calculate the entropy simply by counting the number of microstates. In all other cases* entropy can be calculated, per Gibbs' definition, as a weighted sum over the probabilities of the microstates in a statistical ensemble. Given a time discretization, we can then add up past microstate distributions leading to the present distribution to obtain your "time entropy."

    Am I on the right track?

    * By "all other cases" I mean a rather restricted class of pseudo-equilibrium states of matter, where the system is weakly interacting with its surroundings, and changes are relatively slow. This is the context for all talk of entropy, macro- and microstates.
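    A minimal numerical sketch of the Gibbs definition mentioned above (Boltzmann's constant set to 1, probabilities invented for illustration): the entropy is the probability-weighted sum -Σ p_i ln p_i, which collapses to the simple count ln W exactly when all W microstates are equally likely.

    ```python
    from math import log

    def gibbs_entropy(probs):
        # S = -sum_i p_i * ln(p_i), with Boltzmann's constant set to 1.
        return -sum(p * log(p) for p in probs if p > 0)

    # At equilibrium, all W microstates are equally probable and the Gibbs
    # formula reduces to the counting definition ln W.
    W = 6
    print(gibbs_entropy([1 / W] * W), "==", log(W))

    # Away from equilibrium the distribution is uneven and the entropy is lower.
    print(gibbs_entropy([0.5, 0.2, 0.1, 0.1, 0.05, 0.05]))
    ```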

    It's worth noting that for use in some physics problems, entropy's definition is altered to: the total number of possible microstates consistent with all the information we have about a system. As you complete more measurements and gain information about the system, the "entropy" goes down, because your information continually rules out certain microstates.

    This may be a better definition, because the number of potential microstates for a fully unobserved system is obviously infinite. Observing a "macrostate" is getting information about a system, which then reduces the possible configurations the system can have. So while the definitions seem different, it isn't clear to me that they actually are. The idea of naively observed "macrostates" may be one of those concepts we inherited from "common sense": they work well enough for some problems, but hurt us in the long run.
    Count Timothy von Icarus

    This epistemic take on entropy comes from information theory. Information-theoretic and physical entropies are related, but they are not the same - the differences stemming mainly from their uses in their respective fields. Physical entropy - the kind that enters physical equations - is not a function of the information that we have about a system at a given time. The choice of macroscopic observables and microscopic degrees of freedom is subjective, to a degree. However, once the choice is made at a high level, the rest objectively follows.
  • Count Timothy von Icarus
    1.9k


    Perhaps what you have in mind are past microstates that, when evolved into the present, would be consistent with the present macrostate? In other words, past microstates that evolve into any of the microstates that partition the present macrostate.

    Now, let me backtrack a bit and reexamine your definition of entropy. A given macrostate induces a particular statistical distribution of microstates. When a system is at thermodynamic equilibrium (and thus at its maximum entropy), all its microstates have the same probability of occurrence. Then and only then can we calculate the entropy simply by counting the number of microstates. In all other cases* entropy can be calculated, per Gibbs' definition, as a weighted sum over the probabilities of the microstates in a statistical ensemble. Given a time discretization, we can then add up past microstate distributions leading to the present distribution to obtain your "time entropy."

    Am I on the right track?

    Yup, that's the main way I thought of it. I also thought of it in terms of the macroscopic variables in the past consistent with those in the present, but this seems to run into immediate problems:

    1. No system is actually fully closed/isolated, so backtracking the macroscopic variables seems impossible in practice on any scale that would be interesting.

    2. If I'm thinking this through right, an idealized isolated system's macroscopic variables shouldn't change at all over time unless the system is contracting or expanding.

    The "past microstates that, when evolved into the present, would be consistent with the present macrostate" conception seemed like it could be more interesting when I wrote the thread. For example, the whole idea behind Leplace's Demon is that the full information about the actual microstate also gives you total information about all past states. This makes me think that you should be able to show that knowledge about the distribution of possible states in the present should contain all the information about the possible past states, but I'm not sure if that actually holds up.