A “system” is a finite set of memory components interrelated through causative event maps.
Phew, that was a mouthful! What does that mean?
Memory is the ability of matter to change state and maintain that state for a non-zero period of time. At the smallest scales of existence, atoms have memory when, for instance, chemical changes influence the electron configuration of those atoms. The ability of paper to hold graphite markings throughout its lifetime is also a form of memory.
An event is a directional transfer of energy from one memory component to another, from source to target, in a way that induces a state change in the target which lasts for a non-zero period of time. A transfer counts as an event only if it alters the memory configuration of its target. An event map is a set of source/target associations. Causality is the study of the effects of event maps upon their state-absorbing targets.
To study a system is to study a well-defined, finite set of memory components and the causative event maps which affect those components. For every system under study, there exists that which is outside of that system which we call the system’s environment. Causative events flow from system to environment, and from environment to system, composing a causative event map called a feedback loop.
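The definitions so far can be sketched in code. This is a minimal toy model of my own devising, not anything prescribed by the text; the names `MemoryComponent`, `Event`, and `fire` are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class MemoryComponent:
    """A piece of matter that can change state and hold that state."""
    name: str
    state: str = "initial"

@dataclass
class Event:
    """A directional transfer from source to target that induces a state change."""
    source: MemoryComponent
    target: MemoryComponent
    new_state: str

    def fire(self) -> None:
        # A transfer counts as an event only if it alters the target's memory.
        self.target.state = self.new_state

# An event map is a set of source/target associations: here, a pencil
# marking paper, whose held graphite marking is a form of memory.
paper = MemoryComponent("paper", state="blank")
pencil = MemoryComponent("pencil")
event_map = [Event(pencil, paper, new_state="marked")]

for event in event_map:
    event.fire()

print(paper.state)  # marked
```

The system under study here is just `{pencil, paper}` plus its event map; everything else is the environment.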
Entropy is the degree to which a system has been affected by its causative event map. Low entropy implies that a system has “room” to absorb new state changes in an unambiguous way. A set of aligned, untoppled dominoes has low entropy. High positive entropy implies that a system has attained a degree of ambiguity with regard to its ability to absorb specific kinds of changes. A set of toppled dominoes has a high degree of entropy relative to “toppling” events. One can attempt to topple already-toppled dominoes, but the result is ambiguous in that it is more difficult to leave evidence of a toppling event (a finger push) than it was prior to toppling. Negative entropy is a condition in which a system is to some degree “reset” so that it can once again, unambiguously, absorb more events than it could before. To induce negative entropy into a system of toppled dominoes is to set them back up again to be retoppled.
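The domino example can be made concrete with a small sketch. Again this is an illustrative model of my own, assuming we measure entropy only relative to "toppling" events.

```python
# Toy model: entropy as ambiguity toward "toppling" events.
dominoes = ["standing"] * 5   # low entropy: room to absorb topplings

def topple(row, i):
    """A toppling event leaves unambiguous evidence only on a standing domino."""
    absorbed = row[i] == "standing"
    row[i] = "toppled"
    return absorbed

print(topple(dominoes, 0))  # True: the event left clear evidence of the push
print(topple(dominoes, 0))  # False: ambiguous, the domino was already toppled

# Negative entropy: "reset" the system so it can absorb toppling events again.
dominoes = ["standing"] * 5
```

The second `topple` call is the ambiguity the text describes: the push happened, but the system could not record it unambiguously.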
All physical systems tend to increase in measures of entropy over time. They do so because they have memory and exhibit hysteresis. To memorize a change is to freeze that change in time. Changes induced by previous events interfere with the ability of new events to be absorbed. A thermodynamically hot system imparts kinetic events to the cold systems it is connected to, at the cost of the energy stored in its own memory. Slowly, the cold systems absorb the kinetic energy of the hot until a point is reached at which the cold memory systems reach capacity, or become saturated. Such a point of memory capacity saturation is called "equilibrium". If the cold system had no memory, for instance if it were a vacuum, it would never have increased in temperature, and the hot system would eventually have become absolutely cold, since it would be connected to a system with an infinite capacity to absorb events.
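The hot/cold picture above can be simulated crudely. This is a sketch under my own assumptions (unit-sized transfers, a single `cold_capacity` number standing in for finite memory), not a physical model.

```python
# Toy model: a hot system connected to a cold system with finite memory capacity.
hot_energy = 100.0
cold_energy = 0.0
cold_capacity = 40.0  # finite memory: the cold system can only absorb so much

# Kinetic events flow from hot to cold until the cold side saturates.
while cold_energy < cold_capacity and hot_energy > 0:
    hot_energy -= 1.0
    cold_energy += 1.0

print(hot_energy, cold_energy)  # 60.0 40.0 -- "equilibrium" at saturation

# With an infinite capacity to absorb events (the vacuum case), the cold side
# never "warms" in any recorded sense and the hot system drains completely.
hot_energy, sink = 100.0, 0.0
while hot_energy > 0:
    hot_energy -= 1.0
    sink += 1.0
print(hot_energy)  # 0.0 -- absolutely cold
```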
As noted by Erwin Schrödinger, life in general has a "habit" of reversing entropy and in fact could be defined by this single, dominant habit. Lifeless physical systems tend towards maximum positive entropy and tend to remain that way. Life, on the other hand, does its damnedest to reverse entropy. For life, it is not enough merely to keep entropy from increasing. Like any system, life that is saturated to its limit of information capacity can fail to adapt to a changing environment. Life is a process through which its subsystems are continually de-saturated in order to make room for new information. Life depends on entropy reversal.
This is not to say that entropy reversal does not happen to lifeless systems; entropy may be reversed here and there and for short periods of time. Random, isolated reversals of entropy in any system however are always—even in the case of life—compensated for by an increase of entropy in the outer environment. Ultimately, the Great Environment we call the Universe is continually losing more and more of its ability to unambiguously absorb new events. The arrow of time since the Big Bang is the story of how the memory components of the Universe are reaching capacity saturation.
The metaphor of the economic transaction is useful for describing the flow of events leading to entropy reversal. Financial transactions follow the same entropy build-up and subsequent decrease. Even in the simplest of cases, financial participants form a “memory system” which saturates before it collapses. Work is done between participants before money is exchanged. The exchange of money allows the information of the transaction to “compress”, and entropy to reverse in the well-defined, temporary system of the particular transaction. This entropy reversal occurs, of course, at the expense of the outer environment. Quantum transactions also follow the same build-up and tear-down in terms of the memory capacities of participating elements of matter.
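The transaction metaphor can be sketched as a pair of logs. This is my own illustrative framing: the string entries and the "audit log" environment are assumptions, chosen only to show the build-up, compression, and export to the environment that the paragraph describes.

```python
# Toy model: a transaction's ledger of uncompressed work "saturates",
# then settlement compresses it, reversing entropy within the temporary
# system of the transaction at the expense of the outer environment.
work_log = []           # the temporary transaction system's memory
environment_log = []    # the outer environment absorbs the exported records

for step in ["quote", "deliver goods", "invoice"]:
    work_log.append(step)       # entropy builds up inside the transaction

# Settlement: the exchange of money compresses the transaction's information.
environment_log.extend(work_log)        # exported to the environment
work_log = ["paid: $100"]               # local entropy reversed

print(len(work_log), len(environment_log))  # 1 3
```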
For true de-saturation to occur within a system, a system's memory must be irreversibly erased. If memory erasure were reversible, then memory would not have been erased and the system would have remained saturated. "Reversible" memory loss is not true memory loss, but an illusion, a shuffling, a card trick. Irreversibility, however, comes at a price for a system. One can shuffle sand in a sandbox from one side to another, but to truly increase the capacity of a sandbox one must expend energy to remove sand from it and return that sand to the outer environment. "Irreversibility", however, is not some separate, measurable feature of entropy reversal, but is a necessary part of its definition. If a transaction is reversible, then entropy was not reversed. If entropy has not been reversed, either partially or completely, then the transaction metaphor does not apply. Irreversibility is a necessary test to determine the appropriateness of the transaction metaphor.
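The sandbox distinction can be shown in a few lines. A sketch under my own assumptions, with `capacity` standing in for the sandbox's total memory capacity:

```python
import random

sandbox = ["sand"] * 10
capacity = 10

# "Reversible" erasure: shuffling merely rearranges memory; no capacity is gained.
random.shuffle(sandbox)
print(capacity - len(sandbox))  # 0 -- a card trick, the system stays saturated

# Irreversible erasure: removing sand to the outer environment costs energy
# (the pop and the transfer) but genuinely frees room to absorb new events.
environment = []
environment.append(sandbox.pop())
print(capacity - len(sandbox))  # 1 -- true de-saturation
```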