Pointer, please.
Thanks.
Not so interested in the "which came first" question. It simply isn't an important question in this context.
Let me try a different path that might make the "word salad" less obscure.
In computer science we have the important concept of increment: X = X + 1
From a traditional logic perspective, this is absurd. It only makes sense in a context that allows state, i.e., allows the variable (register) X to change values (instantiations) over time. Ignoring the original seed, the current instantiation of X depends on the prior instantiation of X. There is causality. So, if X has a value x, then there was an X with the value x - 1. Yes, ultimately, this iterates all the way back to the seed (ignoring overflow, if you know what that means in this context).
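One way to make the "instantiations over time" reading concrete is to record each instantiation of X explicitly instead of overwriting it. A minimal sketch (the names and representation are mine, just for illustration):

```python
# Model each instantiation of X as an entry in a history list, starting
# from a seed value. "Incrementing" appends a new instantiation that is
# causally derived from the prior one, rather than asserting X = X + 1
# as a (logically absurd) timeless equation.

def increment(history):
    """Append a new instantiation of X, derived from the prior one."""
    history.append(history[-1] + 1)

history = [0]            # seed X with 0
for _ in range(3):
    increment(history)

print(history)           # the full causal chain of instantiations

# The causal claim: for any value x in the chain above the seed,
# the value x - 1 appears earlier in the chain.
x = history[-1]
assert x - 1 in history[:-1]
```

Here the usual `X = X + 1` is just this process with the history thrown away: only `history[-1]` is kept.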
Would you state this as X -> X - 1? (Or X <- X + 1? Or X < X + 1? Or ≡? Or ⊃? Or ⇒, or ⇔...) If this conditional is true, isn't the contrapositive ¬(X - 1) -> ¬X also true?
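Whichever symbol is used, the intended reading seems checkable over a recorded trace: "X held the value x" implies "X held the value x - 1" (for x above the seed), and the contrapositive "X never held x - 1" implies "X never held x". A throwaway sketch of that check (names mine, treating the conditional as material implication):

```python
def held(trace, v):
    """True if the variable held value v at some point in the trace."""
    return v in trace

def implies(p, q):
    """Material implication: p -> q is false only when p and not q."""
    return (not p) or q

trace = [0, 1, 2, 3]           # instantiations of X over time, seed = 0
seed = trace[0]

for x in range(seed + 1, 10):  # check values above the seed
    # X -> (X - 1): if x was ever reached, x - 1 was reached too.
    assert implies(held(trace, x), held(trace, x - 1))
    # Contrapositive: ¬(X - 1) -> ¬X.
    assert implies(not held(trace, x - 1), not held(trace, x))
```

Note that `held` quantifies over the whole trace, so this captures the "there was an X with value x - 1" reading; the ordering (that x - 1 came strictly earlier) would need a temporal index rather than mere membership.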
When we allow general logical variables to change state, and we create situations where the state change is causal (I can't think of a case where this couldn't be true), we have a situation where the normal logic rules seem to apply, but not really. Without going into the details of "where I'm going" with this (which is far more complex), I am trying to figure out if there is a body of knowledge/rules/techniques for dealing with such situations. I'm guessing that there must be, because almost all life is based upon non-static agents. I can't find it. I hope that I'm simply looking in the wrong places and asking the wrong people.