Anyways, a good book on entropy is "Understanding Non-Equilibrium Thermodynamics" by Georgy Lebon, David Jou and Jose Casas-Vazquez.
https://b-ok.cc/book/508021/aad3be
From the preface:
Besides being an introductory text, our objective is to present an overview, as general as possible, of the more recent developments in non-equilibrium thermodynamics, especially beyond the local equilibrium description. This is partially a terra incognita, an unknown land, because basic concepts as temperature, entropy, and the validity of the second law become problematic beyond the local equilibrium hypothesis. The answers provided up to now must be considered as partial and provisional, but are nevertheless worth to be examined.
Right, so non-equilibrium thermodynamics is a terra incognita, a no man's land, well a no woman's land as well, to be politically correct, and not to be accused of sexism.
From chapter 2:
An important question is whether a precise definition can be attached to the notion of entropy when the system is driven far from equilibrium. In equilibrium thermodynamics, entropy is a well-defined function of state only in equilibrium states or during reversible processes. However, thanks to the local equilibrium hypothesis, entropy remains a valuable state function even in non-equilibrium situations. The problem of the definition of entropy and corollary of intensive variables as temperature will be raised as soon as the local equilibrium hypothesis is given up.
By material body (or system) is meant a continuum medium of total mass m and volume V bounded by a surface Σ. Consider an arbitrary body, outside equilibrium, whose total entropy at time t is S. The rate of variation of this extensive quantity may be written as the sum of the rate of exchange with the exterior deS/dt and the rate of internal production, diS/dt:
dS/dt = deS/dt + diS/dt (2.7)
So, the rate of change of the total entropy of the system under consideration is the sum of the rate of entropy produced internally and the rate of entropy exchanged with its surroundings.
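To make the bookkeeping of equation 2.7 concrete, here is a minimal Python sketch; the rates are made-up numbers, purely for illustration:

```python
# Entropy balance, equation 2.7: dS/dt = deS/dt + diS/dt
# deS/dt: rate of entropy exchange with the surroundings (can have either sign)
# diS/dt: rate of internal entropy production (never negative, per the 2nd law)

def total_entropy_rate(deS_dt: float, diS_dt: float) -> float:
    """Rate of change of the system's total entropy."""
    return deS_dt + diS_dt

# e.g. a system being heated from outside while also producing entropy internally
print(total_entropy_rate(deS_dt=0.3, diS_dt=0.1))  # 0.4 > 0: total entropy grows
```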
Once entropy is defined, it is necessary to formulate the second law, i.e. to specify which kinds of behaviours are admissible in terms of the entropy behaviour. The classical formulation of the second law due to Clausius states that, in isolated systems, the possible processes are those in which the entropy of the final equilibrium state is higher or equal (but not lower) than the entropy of the initial equilibrium state. In the classical theory of irreversible processes, one introduces an even stronger restriction by requiring that the entropy of an isolated system must increase everywhere and at any time, i.e. dS/dt ≥ 0. In non-isolated systems, the second law will take the more general form
diS/dt > 0 (for irreversible processes) (2.10a)
diS/dt = 0 (for reversible processes or at equilibrium) (2.10b)
It is important to realize that inequality (2.10a) does not prevent that open or closed systems driven out of equilibrium may be characterized by dS/dt < 0; this occurs for processes for which deS/dt < 0 and larger in absolute value than diS/dt. Several examples are discussed in Chap. 6.
Therefore, equations 2.10a and 2.10b, which, as the text says, express the 2nd law of thermodynamics in a more general form, refer to the internal entropy production of the system: the internal entropy production is always positive or, at equilibrium, zero. If the system is isolated, meaning there is no exchange whatsoever with the surroundings, then the term deS/dt of equation 2.7 is zero, and therefore dS/dt = deS/dt + diS/dt = 0 + diS/dt = diS/dt >= 0. This is the form of the 2nd law of thermodynamics for isolated systems: the rate of change of its entropy equals the rate of internal entropy production, so its entropy remains constant (at equilibrium) or increases with time (when not in equilibrium).
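The same bookkeeping for the isolated case, again with a made-up production rate:

```python
# Isolated system: nothing crosses the boundary, so deS/dt = 0 and
# equation 2.7 reduces to dS/dt = diS/dt >= 0.

deS_dt = 0.0    # no exchange with the surroundings
diS_dt = 0.05   # internal production, e.g. heat flowing from a hot part to a cold part

dS_dt = deS_dt + diS_dt
assert dS_dt == diS_dt >= 0.0
print(f"dS/dt = {dS_dt}")  # can only be zero (equilibrium) or positive
```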
For systems, however, whether open or closed, that are driven out of equilibrium, the total entropy may well decrease with time, and the 2nd law has no say in this: it happens whenever the rate of entropy exchange deS/dt is negative and larger in absolute value than the internal entropy production diS/dt. In other words, the entropy of a non-isolated system can do whatever it pleases, when not in equilibrium.
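Here is that last point with invented numbers: the 2nd law only pins down the sign of diS/dt, so a system exporting enough entropy to its surroundings can see its total entropy fall:

```python
# Non-isolated system driven out of equilibrium, e.g. one being cooled so that it
# exports entropy to its surroundings faster than it produces entropy internally.

diS_dt = +0.2   # internal production: always >= 0, which is all the 2nd law demands
deS_dt = -0.5   # entropy flowing out across the boundary, |deS/dt| > diS/dt

dS_dt = deS_dt + diS_dt
print(f"dS/dt = {dS_dt:+.1f}")  # -0.3: total entropy decreases, the 2nd law untouched
```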
It is also important to note that all of the above can be said for systems where the local equilibrium hypothesis holds, so what does this hypothesis state? Again from the text:
According to it, the local and instantaneous relations between thermodynamic quantities in a system out of equilibrium are the same as for a uniform system in equilibrium. To be more explicit, consider a system split mentally in a series of cells, which are sufficiently large for microscopic fluctuations to be negligible but sufficiently small so that equilibrium is realized to a good approximation in each individual cell. The size of such cells has been a subject of debate, on which a good analysis can be found in Kreuzer (1981) and Hafskjold and Kjelstrup (1995). The local equilibrium hypothesis states that at a given instant of time, equilibrium is achieved in each individual cell or, using the vocabulary of continuum physics, at each material point.
And then they go on to give a more technical description of the hypothesis, as well as a justification for adopting it. The local equilibrium hypothesis is therefore a rather good approximation for describing, thermodynamically and in terms of entropy, a system which is known to be out of thermodynamic equilibrium, by assuming that at each instant of time, and at each point, the system behaves as if it were in fact in equilibrium.
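In code, the local equilibrium hypothesis amounts to assigning each cell the equilibrium entropy of its own local state and summing over the cells. The sketch below assumes, just for illustration, an incompressible material with constant specific heat c, for which the specific entropy relative to a reference temperature is c*ln(T/T_ref); the cell masses and temperatures are made up:

```python
import math

# Local equilibrium hypothesis: split the body into cells, assume each cell is in
# equilibrium at its own local temperature, and add up the cell entropies.
# Model assumption (mine, not the book's): an incompressible material with constant
# specific heat c, so the specific entropy is s(T) - s(T_ref) = c * ln(T / T_ref).

c = 4200.0      # J/(kg K), roughly water, purely illustrative
T_ref = 300.0   # K, arbitrary reference state
cells = [       # (mass in kg, local temperature in K): a made-up non-uniform profile
    (0.1, 290.0),
    (0.1, 300.0),
    (0.1, 310.0),
    (0.1, 320.0),
]

S = sum(m * c * math.log(T / T_ref) for m, T in cells)
print(f"Total entropy relative to the reference state: {S:.2f} J/K")
```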
But it just so happens that there are systems where this hypothesis has to be given up, because the fluctuations away from equilibrium are too large, and the time scales on which things happen are too short for even defining a local entropy at each instant. Once it is given up, the 2nd law of thermodynamics becomes highly problematic, to the point that we are no longer able to ascribe a temperature, or even to say that heat flows from hot to cold, a fundamental tenet of this law. And so physicists have to devise new concepts, and to reformulate the 2nd law in terms of more general "transport laws":
...As a consequence, when working at short timescales or high frequencies, and correspondingly at short length scales or short wavelengths, the generalized transport laws must include memory and non-local effects. The analysis of these generalized transport laws is one of the main topics in modern non-equilibrium thermodynamics, statistical mechanics, and engineering. Such transport laws are generally not compatible with the local equilibrium hypothesis and a more general thermodynamic framework must be looked for. — chapter 7
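The classic textbook example of a transport law with memory is the Maxwell-Cattaneo equation for the heat flux, tau*dq/dt + q = -lambda*dT/dx, which reduces to Fourier's law when the relaxation time tau goes to zero; it is the kind of generalized law the book develops in its later chapters. The toy integration below, with made-up parameter values, just shows the flux relaxing towards the Fourier value instead of following the temperature gradient instantaneously:

```python
# Maxwell-Cattaneo heat transport: tau * dq/dt + q = -lam * grad_T
# In the limit tau -> 0 this reduces to Fourier's law, q = -lam * grad_T.
# Explicit Euler integration with made-up parameters, purely illustrative.

tau = 1.0e-2     # relaxation time of the heat flux [s]
lam = 0.5        # thermal conductivity [W/(m K)]
grad_T = -100.0  # suddenly imposed temperature gradient [K/m]

q = 0.0          # the flux starts at zero when the gradient is switched on
dt = 1.0e-3      # time step [s]

for step in range(1, 51):
    dq_dt = (-lam * grad_T - q) / tau   # Cattaneo equation solved for dq/dt
    q += dq_dt * dt
    if step % 10 == 0:
        print(f"t = {step * dt:.3f} s   q = {q:6.2f} W/m^2   (Fourier value: {-lam * grad_T:.2f})")
```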
And all this happens in the laboratory, for well-known chemical and biological processes that exhibit such out-of-equilibrium behavior. What is there to say, then, for the thermodynamics of the universe, where gravitational phenomena kick in, along with hypothetical dark matter and dark energy, about whose entropy we know absolutely nothing? I mean, how on earth do you extrapolate ignorance that you have, that you know that you have, from a local level to a global one, and produce certain and definite conclusions, beyond a reasonable doubt, about the fate or the state of the universe?? That's .. that's just mad! Why do that? Why put yourself in such a position? Oh, I guess it's just the need to mythologize; like the mythical beings that we are, to tell you the truth, I have the same urge. But I think it's better to be more practical and fight the 2nd law instead, this "law" of decay and decadence, rather than embrace it.