3.1. An Action Principle for Biology
The difficulty in understanding the natural phenomenal dynamics of living systems stems from the limitations of the traditional reductionist approach when it is applied to complex systems that must nevertheless remain subject to basic physical laws. Employing a unique action principle that arises from the logical inference of the basic information processing procedures occurring within biological dynamics may be the way forward for analyzing these complex open systems. In the latter part of the 20th century, the physical chemists Ilya Prigogine and Lars Onsager were each awarded the Nobel Prize for their analyses of the nonequilibrium thermodynamics of open systems [20,21]. Joseph Kestin formalized these ideas for open attractor systems as the Law of Stable Equilibrium, or the Unified Principle of Thermodynamics [22,23]. This unified principle states that attractor systems that are stable in the Lyapunov sense will naturally oppose applied gradients and move back toward an equilibrated steady-state condition. For living systems acting as open attractors, this adaptive recoil toward the steady state is triggered by divergent information within the purview of the organism's perception.
An alternative derivative of the Jaynes methodology discussed above that considers such divergent information within a system was formulated by Solomon Kullback as the Principle of Minimum Discrimination Information [24]. This principle delineates the procedural dynamics required to reconcile informational differences as the system state moves toward an equilibrated steady-state condition. As demonstrated by Jaynes, this inference-driven impetus for change is equivalent to the entropic forces of the 2nd Law of Thermodynamics [8]. The Kullback information minimization derivation utilizes the concept of relative information as described in the Kullback-Leibler (D_KL) formulation of system information divergence between a new distribution f and an original reference distribution f_0:

D_{KL}(f \parallel f_0) = \sum_{i} f_i \ln\left( \frac{f_i}{f_{0,i}} \right)
This divergence is the average of the "surprisal difference" between the original condition, with its Shannon information entropy probabilities, and the new status of the system. It therefore measures the information difference between a new system state and the prior reference state of the system. In the terminology of Shannon information, where p is the current probability distribution and q is the distribution that is the entropy-driven end target of the change dynamic as directed by the Jaynes principle, the equation becomes:

D_{KL}(p \parallel q) = \sum_{i} p_i \ln\left( \frac{p_i}{q_i} \right)
By way of this solution principle, incoming information is used to infer a new parameter probability distribution that minimizes the discrimination from the original distribution f_0 as much as possible. Because phenomenal dynamics are based on relational interactions, the physicist Carlo Rovelli considers the relative nature of the D_KL metric to be the best physical equivalent of the Shannon information entropy measurement [25]. Through the incorporation of the Boltzmann constant in the calculation, physical thermodynamic entropy measures represent a basic physical property of the system. Statistical determinations of physical thermodynamic entropy then become a special use case of inferential information processing. By contrast, information can be formulated as a strictly relational dimension connecting the elements and processes of any system. The information entropy is therefore mainly epistemological, while physical thermodynamic entropy can be considered more ontological. Kullback's Principle of Minimum Discrimination Information applies an axiomatic inference procedure that minimizes all relative information entropy differences in a way that globally maximizes the system entropy subject to the system constraints.
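To make this minimum-discrimination update concrete, the following sketch is offered as an illustration only; the uniform reference distribution f0, the observable x, and the newly observed mean of 3.0 are assumptions introduced here and do not appear in the cited works. The code finds the distribution p that matches the new constraint while minimizing D_KL(p || f0); the minimizer takes the familiar exponential-tilting form, with p_i proportional to f0_i exp(lam x_i), and the multiplier lam is recovered by bisection.

import numpy as np

def kl_divergence(p, f0):
    """Relative entropy D_KL(p || f0) = sum_i p_i * ln(p_i / f0_i)."""
    p, f0 = np.asarray(p, float), np.asarray(f0, float)
    mask = p > 0                      # 0 * ln(0) is taken as 0
    return float(np.sum(p[mask] * np.log(p[mask] / f0[mask])))

def min_discrimination_update(f0, x, target_mean, lam_lo=-50.0, lam_hi=50.0):
    """Return the distribution p with expected value <x>_p = target_mean that
    minimizes D_KL(p || f0).  The minimizer has the exponential-tilting form
    p_i proportional to f0_i * exp(lam * x_i); lam is found by bisection,
    which works because the tilted mean increases monotonically with lam."""
    f0, x = np.asarray(f0, float), np.asarray(x, float)

    def tilted(lam):
        w = f0 * np.exp(lam * x)
        return w / w.sum()

    for _ in range(100):              # bisection on the Lagrange multiplier
        lam = 0.5 * (lam_lo + lam_hi)
        if np.dot(tilted(lam), x) < target_mean:
            lam_lo = lam
        else:
            lam_hi = lam
    return tilted(lam)

# Assumed example: a uniform prior over four states and an observable x whose
# newly observed mean (3.0) differs from the prior mean (2.5).
f0 = np.array([0.25, 0.25, 0.25, 0.25])
x = np.array([1.0, 2.0, 3.0, 4.0])
p = min_discrimination_update(f0, x, target_mean=3.0)

print("updated distribution p:", np.round(p, 4))
print("D_KL(p || f0):", round(kl_divergence(p, f0), 4))

In this toy case the update concentrates probability on the higher-valued states just enough to satisfy the new observation, which is the minimal-commitment adjustment that the principle describes.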
Living organisms continually observe new information within their biocontinuum for the purpose of system adaptation toward a state of stability and survival [26]. This process is essentially equivalent to the minimization of free energy in a complex system as it naturally moves to a steady state [27]. Therefore, by the Principle of Minimum Discrimination Information, the new probability distribution of the system information entropy is inferred in a way that minimizes the discrimination from the original distribution as much as possible. The trajectory of this inference in living systems during the processing of biological adaptations can be projected and characterized by integrating Shannon information entropy into the operations of a standard replicator dynamic as first described by R.A. Fisher [28,29]. The derivation of this integration and its conversion into a Kullback-Leibler information divergence format (D_KL) has been previously described in detail [26,30] and is summarized as:

I(p,q) = \sum_{i} p_i \ln\left( \frac{p_i}{q_i} \right)

where:
I(p,q) is the information state;
q is the target state with a fixed probability distribution;
p is the time-dependent probability distribution of the current state.
When q is a target goal state with a fixed probability distribution and only the probability distribution p is time dependent, then:

I(p(t), q) = \sum_{i} p_i(t) \ln\left( \frac{p_i(t)}{q_i} \right)
Therefore, differentiating I(p,q) with respect to time results in the following expression:

\frac{dI}{dt} = \sum_{i} \dot{p}_i \left[ \ln\left( \frac{p_i}{q_i} \right) + 1 \right]
where \dot{p}_i (the rate of change of probability p_i) is defined by the replicator equation as:

\dot{p}_i = p_i \left[ f_i(P) - \langle f(P) \rangle \right], \qquad \langle f(P) \rangle = \sum_{j} p_j f_j(P)
Substituting this replicator expression into our derivative dI/dt results in:

\frac{dI}{dt} = \sum_{i} p_i \left[ f_i(P) - \langle f(P) \rangle \right] \left[ \ln\left( \frac{p_i}{q_i} \right) + 1 \right]

where f_i(P) is the fitness of each type i in the population, with fitness being a survival likelihood or probability characteristic in the context of the conditions of the environment.
Since the probabilities p_i sum to one (so that \sum_i \dot{p}_i = 0) and \langle f(P) \rangle is the same for every i, the expression reduces to:

\frac{dI}{dt} = \sum_{i} p_i f_i(P) \ln\left( \frac{p_i}{q_i} \right) - \langle f(P) \rangle \sum_{i} p_i \ln\left( \frac{p_i}{q_i} \right)
If we consider \sum_i p_i f_i(P) \ln(p_i/q_i) as the kinetic term and \langle f(P) \rangle \sum_i p_i \ln(p_i/q_i) as the potential term, then the ACTION functional (kinetic minus potential) over all i elements is described as:

A = \int \left[ \sum_{i} p_i f_i(P) \ln\left( \frac{p_i}{q_i} \right) - \langle f(P) \rangle \sum_{i} p_i \ln\left( \frac{p_i}{q_i} \right) \right] dt
Therefore, the ACTION is defined as the time-integral summation of the Lagrangian integrand (the difference between the kinetic and potential terms of the information) and is naturally minimized in accordance with Kullback's Principle of Minimum Discrimination Information. This integral also serves as a functional that determines the trajectory of the dynamics in the biological system.
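As a numerical check on these steps, the short sketch below is offered as an illustration only; the three constant fitness values, the fixed target distribution q, and the starting distribution p are arbitrary assumptions introduced here, and the kinetic/potential split follows the expressions given above. The code integrates the replicator equation with a simple Euler step, evaluates the kinetic and potential terms along the way, and accumulates the action integral; because the Lagrangian integrand equals dI/dt in this formulation, the accumulated action approximates the net change in I(p,q) along the trajectory.

import numpy as np

# Assumed setup: three types with constant fitness values, a fixed target
# distribution q, and an arbitrary starting distribution p.
f = np.array([1.0, 1.5, 2.0])        # f_i(P), here taken as constants
q = np.array([0.10, 0.30, 0.60])     # fixed target (goal) distribution
p = np.array([0.60, 0.30, 0.10])     # time-dependent current distribution

def info_state(p, q):
    """I(p,q) = sum_i p_i * ln(p_i / q_i), the Kullback-Leibler divergence."""
    return float(np.sum(p * np.log(p / q)))

dt, steps = 0.01, 2000
action = 0.0                         # accumulates the integral of (kinetic - potential)
I_start = info_state(p, q)

for _ in range(steps):
    mean_f = float(np.dot(p, f))                     # <f(P)>
    kinetic = float(np.sum(p * f * np.log(p / q)))   # sum_i p_i f_i(P) ln(p_i/q_i)
    potential = mean_f * info_state(p, q)            # <f(P)> sum_i p_i ln(p_i/q_i)
    action += (kinetic - potential) * dt             # Lagrangian integrand = dI/dt

    p = p + dt * p * (f - mean_f)                    # replicator update (Euler step)
    p = p / p.sum()                                  # guard against numerical drift

print("final p:", np.round(p, 4))
print("change in I(p,q):", round(info_state(p, q) - I_start, 4))
print("accumulated action:", round(action, 4))

With the values assumed here the accumulated action is negative and closely tracks the decline of the information divergence I(p,q) as the population shifts toward the fitter types.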