The theoretical framework derived for the analysis of biological systems is structured to broadly describe the homeorhetic functioning of these systems from first principles, including the information processing that adaptively reconciles entropy divergences within the biocontinuum in order to maintain stability and survival. These divergences provide the entropic drive for the system dynamics, as determined by the inference procedure of the Kullback Principle of Minimum Information Discrimination and in the context of the inherent constraints of the biology.
3.1. Derivation of Equations of Entropic Dynamics for the Biosystem
While the basic driving forces of entropic dynamics are the same in biology as in any physical system, the use of a particular form of Jaynesian entropic inference (Kullback's Principle of Minimum Information Discrimination) and the adaptive constraints of living systems are unique. These distinctive features are incorporated into a mathematical framework for the analysis of biosystem dynamics. The mechanics of the derived mathematical expressions are also founded on the logical essence of the biological natural selection process as modeled by Fisher's equations of replicator dynamics [28]. For a general population (P) of organisms with variations of type (i), the Lotka–Volterra form of the replicator equation describes a very general rule for how the population numbers $n_i$ can change with time and is defined as [32]:

$$\frac{dn_i}{dt} = n_i f_i ,$$

where $f_i$ is the fitness of type $i$.
However, rather than considering the probability distribution of a population, the expression can also be employed to describe the dynamics of the possible states of the biocontinuum of a single organism. As an iterative dynamic in time, there is a continuous cyclic renewal and propagation of the state of that biocontinuum for the singular living system. In this expression, the relative sustainability of the $i$th possible state for an organism is determined by the difference between that state's ability to endure [fitness: $f_i$] and that of the other possible states, and serves as a prescribed system constraint. The full range of these fitness functions determines the fitness landscape of the biocontinuum. Therefore, an organism that is most adapted to the conditions of the biocontinuum, with the greatest probability of survival, naturally emerges.
In systems control theory, the fitness function is also called an adaptation evaluation function and is used in relation to the error function, which determines the difference between actual and a priori predicted solutions [33]. System adaptation that minimizes error also minimizes the system's internal energy, as information that is gained is assimilated into the system's structure and function. Using the systems control approach to fitness evaluation of the single organism allows dynamic models of the constraints of very complex living-system homeorhetic and adaptive functioning to be integrated into the dynamics.
The Lotka–Volterra expression is made more practical by normalizing the probabilities of all possible states: let $x_i = n_i / \sum_j n_j$ be defined as the probability fraction of the $i$th state within the entire spectrum of possible states. In this configuration the values of $x_i$ are also considered as probabilities, where $x_i$ is the probability that a randomly chosen constituent of the possible states is of the $i$th type. These values of $x_i$ lie between 0 and 1 and sum to 1. In this mathematical framework, and by the quotient rule of calculus for derivatives, the Lotka–Volterra equation becomes the usual form of the replicator equation:

$$\frac{dx_i}{dt} = x_i \left( f_i - \bar{f} \right),$$
where $\bar{f} = \sum_j x_j f_j$ is the mean fitness of all the possible system states.
And if we consider that each fitness $f_i$ depends on the fraction of each possible state, then the replicator equation simplifies to a more statistically usable form for information theory as:

$$\frac{dx_i}{dt} = x_i \left[ f_i(x) - \sum_j x_j f_j(x) \right].$$
This expression then determines the change of the probability fraction of a possible biocontinuum state at a rate proportional to the fitness of that state minus the mean fitness. With normalized probability distributions of the possible states, the derived state expression will include information probability and knowledge uncertainty based on Shannon information theory. This information state is defined as the amount of average information required to determine the probability spectrum of possible states. It is also a measure of the uncertainty an observer has about the system state. This measure is the summation of the surprisals and is defined as:

$$H = -\sum_i x_i \ln x_i .$$
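The replicator dynamics and the Shannon measure above can be illustrated numerically. The following sketch (the three-state system, fitness values, and step size are arbitrary choices for illustration, not taken from the text) evolves the probability fractions with a simple Euler scheme and tracks the observer's uncertainty:

```python
import numpy as np

def replicator_step(x, f, dt=0.01):
    """One Euler step of dx_i/dt = x_i (f_i - fbar), renormalized."""
    fbar = np.dot(x, f)           # mean fitness of all possible states
    x = x + dt * x * (f - fbar)
    return x / x.sum()            # guard against numerical drift

def shannon(x):
    """Summation of surprisals: H = -sum_i x_i ln x_i."""
    return -np.sum(x * np.log(x))

# three hypothetical biocontinuum states with fixed fitness values
x = np.array([1/3, 1/3, 1/3])
f = np.array([1.0, 2.0, 3.0])
H0 = shannon(x)
for _ in range(2000):
    x = replicator_step(x, f)
# the fittest state comes to dominate and the uncertainty H decreases
```

The renormalization after each step is only a guard against discretization drift; the continuous-time dynamics conserve $\sum_i x_i = 1$ exactly.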
Incorporating these information theory metrics into the replicator expression then allows for the analysis of entropic dynamics in living systems. This framework of uncertainty in the differentiation of the "signals" of information concerning the state of the biocontinuum environment is also consistent with known biological knowledge acquisition processes, as the system gains information and adapts to reflect the context of its space. Therefore, the rate of change in this information gain and uncertainty loss (the entropic dynamics) for the living system is given by the equations:

$$\frac{dH}{dt} = -\sum_i \frac{dx_i}{dt}\left(1 + \ln x_i\right) = -\sum_i x_i \left( f_i - \bar{f} \right) \ln x_i ,$$

where the second equality follows because $\sum_i dx_i/dt = 0$.
From this expression it appears that the information gain is mainly dependent on the differential assessment of the relative fitness of the organism to the biocontinuum condition associated with that information. And that fitness is determined by the integrity and robustness of the inherent complex homeorhetic processes and functioning of the living organism. This is because the structural constraints and entropic dynamics of the living system are actively adapted toward the objective of stability and a sustained existence based on this assessment.
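The entropy-rate expression can be checked numerically. In this sketch the distributions and fitness values are arbitrary illustrative choices; the analytic rate is compared against a finite-difference estimate:

```python
import numpy as np

x = np.array([0.5, 0.3, 0.2])   # current probability fractions (illustrative)
f = np.array([1.0, 2.0, 3.0])   # fitness of each possible state (illustrative)
fbar = np.dot(x, f)

xdot = x * (f - fbar)            # replicator rates of change; they sum to 0
dHdt = -np.sum(xdot * np.log(x)) # the (1 + ln x_i) factor loses its 1
                                 # because sum_i xdot_i = 0

# finite-difference check of the analytic rate
eps = 1e-6
H  = -np.sum(x * np.log(x))
H2 = -np.sum((x + eps * xdot) * np.log(x + eps * xdot))
fd = (H2 - H) / eps              # should closely match dHdt
```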
In the dynamics of any system, there is a continuous transition from the current global state to an optimal or goal state. This transition requires a reconciliation of divergent conditions toward that optimal state across the whole of the system continuum. Physical phenomena within such systems are fundamentally based on relational interactions and are described by the relative information concerning those interactions. Divergences in the entropic information between the current and desired states are best measured by the Kullback–Leibler divergence metric for relative information. The Kullback–Leibler information divergence of P from Q (denoted $D_{KL}(Q \| P)$ or $I(q,p)$) is the measure in which P is the prior or current probability distribution of types and Q is the distribution that is the optimal end state. By incorporating this measure into the functioning of the replicator expression, the use of prior information within the living system regarding the state of the biocontinuum is possible using a Bayesian inference approach. The Bayesian updating during the iterative procedure with time also accounts for the Landauer erasure of information required for balancing thermodynamic entropy [34]. The information differential between q and p at any point in the dynamic transition is the remaining information to be learned. The equation for the Kullback–Leibler information divergence is then given by:

$$D_{KL}(q \| p) = \sum_i q_i \ln \frac{q_i}{p_i} .$$
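The divergence measure is straightforward to compute; a minimal sketch follows, with the goal-state and current distributions chosen arbitrarily for illustration:

```python
import numpy as np

def kl_divergence(q, p):
    """D_KL(q || p) = sum_i q_i ln(q_i / p_i): the remaining information
    to be learned in moving the current distribution p toward the goal q."""
    q, p = np.asarray(q, dtype=float), np.asarray(p, dtype=float)
    return float(np.sum(q * np.log(q / p)))

q = np.array([0.1, 0.2, 0.7])   # optimal goal-state distribution (illustrative)
p = np.array([1/3, 1/3, 1/3])   # current (prior) distribution
d = kl_divergence(q, p)          # non-negative; zero only when q == p
```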
For $D_{KL}(q \| p) = \sum_i q_i \ln (q_i / p_i)$, where $q$ is a target goal state with a fixed probability distribution and only $p$ is time dependent, then:

$$\frac{d}{dt} D_{KL}(q \| p) = -\sum_i \frac{q_i}{p_i} \frac{dp_i}{dt} ,$$

and $dp_i/dt$ is the rate of change of the probability $p_i$, defined by the replicator equation as:

$$\frac{dp_i}{dt} = p_i \left( f_i - \bar{f} \right).$$
Substituting this expression into our derivative equation results in:

$$\frac{d}{dt} D_{KL}(q \| p) = -\sum_i q_i \left( f_i - \bar{f} \right).$$

Since the probabilities $q_i$ sum to one, the equation becomes:

$$\frac{d}{dt} D_{KL}(q \| p) = \bar{f} - \sum_i q_i f_i ,$$

where the mean fitness $\bar{f}$ demarcated in the biocontinuum is the same for all states $i$.
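The derived rate of the Kullback–Leibler divergence under replicator dynamics can be verified numerically. The distributions and fitness values below are arbitrary illustrative choices; the identity itself is the one just derived:

```python
import numpy as np

q = np.array([0.1, 0.2, 0.7])   # fixed target distribution (illustrative)
p = np.array([0.5, 0.3, 0.2])   # current, time-dependent distribution
f = np.array([1.0, 2.0, 3.0])   # fitness values (illustrative)
fbar = np.dot(p, f)

pdot = p * (f - fbar)            # replicator rates for p
rate = fbar - np.dot(q, f)       # derived rate: fbar - sum_i q_i f_i

# finite-difference check against the definition of D_KL(q || p)
eps = 1e-6
D  = np.sum(q * np.log(q / p))
D2 = np.sum(q * np.log(q / (p + eps * pdot)))
fd = (D2 - D) / eps              # should closely match rate
```

Here the rate is negative whenever the target distribution $q$ has higher expected fitness than the current mean, so the remaining divergence shrinks as the system adapts.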
If the potential information $I$, as the Kullback–Leibler divergence of the biocontinuum, is defined by:

$$I = D_{KL}(q \| p) = \sum_i q_i \ln \frac{q_i}{p_i} ,$$

and the kinetic information, defined as the change of the Kullback–Leibler divergence during the procedure of information being assimilated by the adaptive processes of the living system, is described by:

$$\frac{dI}{dt} = \frac{d}{dt} D_{KL}(q \| p) ,$$
then the action measure for all $i$ elements is:

$$A = \int_{t_0}^{t_1} \left( \frac{dI}{dt} - I \right) dt ,$$
as the integral summation of the Lagrangian integrand, which is the difference between the kinetic and potential information at each phase of the change transition. This expression determines the overall evolution and trajectory of the living system as it moves to its final state. However, this unique action principle and functional for living systems is not fundamental but arises from the more basic physical dynamics of entropic inference, subject to the constraints of the organism's structure and processes as they are geared for stability and survival. While the inherent dynamics, fitness function, and adapting constraints of the replicator equation naturally reduce the system's information divergences, the broader axiomatic behavior of this process is grounded in the entropic drive of Kullback's Principle of Minimum Information Discrimination.
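Taking the potential information as the Kullback–Leibler divergence and the kinetic information as its rate of change, as above, the action can be sketched by accumulating the discrete Lagrangian along a simulated replicator trajectory. All numerical choices here (distributions, fitness values, step size, and horizon) are illustrative assumptions, not values from the text:

```python
import numpy as np

def kl(q, p):
    """D_KL(q || p): the potential information remaining to be learned."""
    return np.sum(q * np.log(q / p))

q  = np.array([0.1, 0.2, 0.7])   # goal-state distribution (illustrative)
p  = np.array([1/3, 1/3, 1/3])   # initial current distribution
f  = np.array([1.0, 2.0, 3.0])   # fitness values (illustrative)
dt = 0.01

action = 0.0
I_prev = kl(q, p)                # potential information at the start
for _ in range(100):
    p = p + dt * p * (f - np.dot(p, f))   # replicator step for p
    p = p / p.sum()
    I = kl(q, p)                 # potential information
    T = (I - I_prev) / dt        # kinetic information (discrete rate)
    action += (T - I) * dt       # accumulate the Lagrangian integrand
    I_prev = I
# over this horizon the divergence from the goal state shrinks as p adapts
```

Over this short horizon the current distribution moves close to the goal state, so the accumulated action is dominated by the steadily decreasing potential term.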