In this paper, we derive the Jeffreys divergence, the generalized Fisher divergence, and the corresponding De Bruijn identities for space-time random fields. First, we establish the relation between the Fisher information of a space-time random field at a given space-time point and the ratio of the Jeffreys divergence between the field at two distinct space-time positions to the squared coordinate difference. In addition, we find identities between the partial derivatives of the Jeffreys divergence with respect to the space-time variables and the generalized Fisher divergence, i.e. De Bruijn identities, for two space-time random fields obtained from the same Fokker-Planck equations with different parameters. At the end of the paper, we present three examples of Fokker-Planck equations for space-time random fields, identify their density functions, and derive the corresponding Jeffreys divergence, generalized Fisher information, generalized Fisher divergence, and accompanying De Bruijn identities.
Keywords:
Subject: Computer Science and Mathematics - Applied Mathematics
1. Introduction
Information entropy and Fisher information are quantities that measure the information content of random objects, and entropy divergences are derived from information entropy to measure the difference between two probability distributions. Formally, we can construct straightforward definitions of entropy divergence and Fisher information for a space-time random field based on the classical definitions. The density function appearing in these definitions can be obtained in many different ways. In this paper, the density function of a space-time random field is obtained from Fokker-Planck equations. The traditional Fokker-Planck equation is a partial differential equation that describes the probability density function of a random process[1]; it describes how the density function evolves in time. However, Fokker-Planck equations for random fields, particularly space-time random fields, do not yet have a well-established form. The classical equation needs to be generalized because the underlying variable changes from time alone to space-time.
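For reference, the classical one-dimensional Fokker-Planck equation has the following standard textbook form (cf. [1]); the drift coefficient $D^{(1)}$ and diffusion coefficient $D^{(2)}$ written here are our own notation, not symbols taken from the original display:
\[
\frac{\partial p(x,t)}{\partial t}
= -\frac{\partial}{\partial x}\Big[D^{(1)}(x,t)\,p(x,t)\Big]
+ \frac{\partial^{2}}{\partial x^{2}}\Big[D^{(2)}(x,t)\,p(x,t)\Big].
\]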
In this paper, we mainly obtain the relation between the Jeffreys divergence and the generalized Fisher information for space-time random fields generated by Fokker-Planck equations. The Jeffreys divergence is a symmetric entropy divergence that generalizes the Kullback-Leibler divergence (KL divergence). In information theory and statistics, the Jeffreys divergence is often used to measure the distance between predicted and true distributions, but its drawback is that when the two distributions have little overlap, the result becomes infinite. To avoid divergent values, we consider the relationship between the Jeffreys divergence and the generalized Fisher information for a space-time random field with small differences in the space-time parameters.
Moreover, the classical De Bruijn identity describes the relationship between the differential entropy and the Fisher information of a Gaussian channel[2], and it can be generalized to other settings[3,4,5,6,7]. Building on these works and following their ideas, we obtain De Bruijn identities relating the Jeffreys divergence and the generalized Fisher information of two space-time random fields whose density functions satisfy Fokker-Planck equations.
1.1. Space-time random field
Random fields were first studied by Kolmogorov[8,9,10], and the theory was developed further by Yaglom[11,12,13] in the middle of the last century. A random field in several variables can be expressed as
where . We call (1) a generalized random field, or a multiparameter stochastic process. In some practical applications, we often use the concept of a space-time random field. A space-time random field on a d-dimensional space is expressed as
where are the space-time variables. It has many applications in statistics, finance, signal processing, stochastic partial differential equations and other fields[14,15,16,17,18,19,20,21,22,23,24,25,26,27].
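As a minimal sketch of the notation we have in mind (the exact symbols of the original display are not reproduced here, so this is only an assumed reading), a space-time random field can be written as
\[
X = \big\{\, X(t,\mathbf{x}) \;:\; t \ge 0,\ \mathbf{x} = (x_1,\dots,x_d) \in \mathbb{R}^{d} \,\big\},
\]
where $t$ is the time variable and $\mathbf{x}$ collects the spatial variables.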
1.2. Kramers-Moyal expansion and Fokker-Planck equation
In the literature on stochastic processes, the Kramers-Moyal expansion refers to a Taylor series expansion of the master equation, named after Kramers and Moyal[28,29]. The Kramers-Moyal expansion is an infinite-order partial differential equation
where is the density function and
is the n-th order conditional moment. Here is the transition probability rate. The Fokker-Planck equation is obtained by keeping only the first two terms of the Kramers-Moyal expansion. In statistical mechanics, the Fokker-Planck equation usually describes the time evolution of the probability density function of the velocity of a particle under the influence of drag and random forces, as in the famous Brownian motion. This equation is also often used to find the density function of an Itô stochastic differential equation[1].
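For reference, the standard form of the Kramers-Moyal expansion for a scalar process $X(t)$, in our own notation (the original display is not reproduced), reads
\[
\frac{\partial p(x,t)}{\partial t}
= \sum_{n=1}^{\infty}\Big(-\frac{\partial}{\partial x}\Big)^{n}\Big[D^{(n)}(x,t)\,p(x,t)\Big],
\qquad
D^{(n)}(x,t) = \frac{1}{n!}\lim_{\tau\to 0}\frac{1}{\tau}\,
\mathbb{E}\big[(X(t+\tau)-X(t))^{n}\,\big|\,X(t)=x\big],
\]
and truncating the sum at $n=2$ yields the classical Fokker-Planck equation.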
1.3. Differential entropy and De Bruijn identity
The entropy of a continuous distribution, known as differential entropy, was proposed by Shannon in 1948[30]:
where represents the differential entropy and is the probability density function of X. However, differential entropy is not easy to calculate, and a closed form seldom exists. There have been related studies on the entropy of stochastic processes and continuous systems[31,32,33,34]. If we consider a classical one-dimensional Gaussian channel model
where X is the input signal, G is standard Gaussian noise, is the strength, and is the output, the density of satisfies the following Fokker-Planck equation
Further, by calculating the differential entropy of and differentiating it with respect to t, we can get
where
is the Fisher information of . Equation (8) is the De Bruijn identity. The De Bruijn identity connects the differential entropy and the Fisher information , showing that they are different aspects of the concept of “information”.
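Since the displayed formulas above are not reproduced, we recall the classical statements being referred to in standard notation (our symbols, such as $Y_t$ and $p(y,t)$, may differ from the original): with $Y_t = X + \sqrt{t}\,G$ and $G\sim\mathcal{N}(0,1)$ independent of $X$,
\[
h(Y_t) = -\int p(y,t)\ln p(y,t)\,dy,
\qquad
\frac{\partial p(y,t)}{\partial t} = \frac{1}{2}\frac{\partial^{2} p(y,t)}{\partial y^{2}},
\]
\[
\frac{d}{dt}h(Y_t) = \frac{1}{2}J(Y_t),
\qquad
J(Y_t) = \int p(y,t)\Big(\frac{\partial}{\partial y}\ln p(y,t)\Big)^{2}dy,
\]
which is the classical De Bruijn identity[2].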
1.4. Entropy Divergence
In information theory and statistics, an entropy divergence is a statistical distance generated from information entropy to measure the difference between two probability distributions. There are various divergences generated by information entropy, such as the Kullback-Leibler divergence[35], the Jeffreys divergence[36], the Jensen-Shannon divergence[37], the Rényi divergence[38], etc. These measures have been applied in a variety of fields such as finance, economics, biology, signal processing, pattern recognition and machine learning[39,40,41,42,43,44,45,46,47,48,49]. In this paper, we mainly focus on the Jeffreys divergence of two distributions, of the form
where is a measure of x.
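For reference, the classical Kullback-Leibler and Jeffreys divergences between densities $p$ and $q$ with respect to a measure $\mu$ are, in standard notation (our own symbols, since the original display is not reproduced),
\[
D_{\mathrm{KL}}(p\,\|\,q) = \int p\,\ln\frac{p}{q}\,d\mu,
\qquad
J(p,q) = D_{\mathrm{KL}}(p\,\|\,q) + D_{\mathrm{KL}}(q\,\|\,p)
= \int (p-q)\,\ln\frac{p}{q}\,d\mu .
\]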
2. Notations, Definitions and Propositions
2.1. Notations and Assumptions
In this paper, we use the following notations and definitions:
Given a probability space , two real-valued space-time random fields are denoted as , or , , where and , , are space-time variables.
The probability density functions of P and Q are denoted as p and q. With , is the density value at of X and is the density value at of Y.
Suppose that our density functions , i.e. and , are twice differentiable with respect to u and once differentiable with respect to or , respectively.
In this paper, we write and for d-dimensional real vectors that differ only in the k-th coordinate, with k-th coordinates and , respectively, .
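As an illustration of this notation (the exact symbols of the original are not shown above, so the letters below are only a hypothetical choice), such a pair of vectors would read
\[
\mathbf{x} = (x_1,\dots,x_{k-1},\,x_k,\,x_{k+1},\dots,x_d),
\qquad
\tilde{\mathbf{x}} = (x_1,\dots,x_{k-1},\,\tilde{x}_k,\,x_{k+1},\dots,x_d),
\qquad k\in\{1,\dots,d\},
\]
agreeing in every coordinate except the k-th.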
2.2. Definitions
To obtain the generalized De Bruijn identities between Jeffreys divergence and Fisher divergence, we need to introduce some new definitions and propositions.
The first and most important information quantity is the Kullback-Leibler divergence for random fields. In analogy with the classical Kullback-Leibler divergence, we easily arrive at Definition 1.
Definition 1: The Kullback-Leibler divergence between two space-time random fields and , , with density functions and , is defined as
Similar to the classical Kullback-Leibler divergence, Kullback-Leibler divergence on random fields is not symmetrical, i.e.
Following the classical definition of Jeffreys divergence on two random variables, we mainly consider Jeffreys divergence for random fields in this paper.
Definition 2: The Jeffreys divergence between two space-time random fields and , , with density functions and , is defined as
Here, we replace “||” by “,” in the divergence notation to emphasize the symmetry.
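A minimal sketch of how Definitions 1 and 2 could read pointwise, under the assumption that the two marginal densities are compared at the same space-time point $(t,\mathbf{x})$ (this is our reconstruction, since the displayed formulas are not reproduced above):
\[
D_{\mathrm{KL}}(P\,\|\,Q)(t,\mathbf{x}) = \int_{\mathbb{R}} p(u;t,\mathbf{x})\,
\ln\frac{p(u;t,\mathbf{x})}{q(u;t,\mathbf{x})}\,du,
\qquad
J(P,Q)(t,\mathbf{x}) = D_{\mathrm{KL}}(P\,\|\,Q)(t,\mathbf{x}) + D_{\mathrm{KL}}(Q\,\|\,P)(t,\mathbf{x}).
\]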
Another significant measure of information is Fisher information. In this paper, we consider the generalized Fisher information of space-time random field.
Definition 3: The generalized Fisher information of a space-time random field , , with density function , defined by a nonnegative function , is given by
In particular, when , is the Fisher information in the usual case. In addition to equation (14), we have other similar forms of generalized Fisher information
and
for . Obviously, (15) and (16) are generalized Fisher informations with respect to the space-time variables. Regarding the generalized Fisher information (14), we arrive at the following simple proposition.
Proposition 4: For an arbitrary nonnegative function , assume that the generalized Fisher information of a random variable X
is well defined, where represents the probability density. Then we have the generalized Fisher information inequality
When , is the Fisher information in the usual case.
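For orientation, in the classical case (and for independent random variables X and Y) the corresponding statement is Stam's Fisher information inequality[2]; the following is a sketch of the usual form, not necessarily the exact statement intended above:
\[
\frac{1}{J(X+Y)} \;\ge\; \frac{1}{J(X)} + \frac{1}{J(Y)},
\qquad
J(X) = \int p_X(x)\Big(\frac{d}{dx}\ln p_X(x)\Big)^{2}dx,
\]
with equality when X and Y are Gaussian.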
Proof: Denote , , and let represent the densities, i.e.
and derivative function
If , and never vanish,
is the conditional expectation of for a given z. Similarly, we can obtain
and , , we find that
Hence, we have
with equality only if
with probability 1 whenever and we have
Averaging both sides over the distribution of z
i.e.
Setting and , we obtain
Starting from Definition 3, we can formulate further generalized versions of the Fisher information.
Definition 5: The generalized cross Fisher information of two space-time random fields and , , with density functions and , defined by a nonnegative function , is given by
Similar to the concept of cross-entropy, it is easy to verify that (30) is symmetric in P and Q.
Definition 6: The generalized Fisher divergence between space-time random fields and , for , with density functions and , defined by a nonnegative function , is given by
In particular, when , is the Fisher divergence in the usual case.
Obviously, the generalized Fisher divergence of random fields is not a symmetric divergence. To obtain a symmetric formula, we generalize (31) in another direction.
Definition 7: The generalized Fisher divergence between space-time random fields and , for , with density functions and , defined by nonnegative functions and , is given by
In particular, when , is the generalized Fisher divergence for random fields generated by a single function. In general, is asymmetric with respect to P and Q, i.e.
If we suppose that f and g are functions determined only by P and Q, i.e.
where is an operator, the generalized Fisher divergence can be rewritten as
and we can easily get
In this case, we call (35) the symmetric Fisher divergence for random fields generated by the operator , and denote it as
Proposition 8 (Kramers-Moyal expansion)[28,29]: Suppose that the random process has moments of all orders. Then the probability density function satisfies the Kramers-Moyal expansion
where
is the n-th order conditional moment. Here is the transition probability rate.
Proposition 9 (Pawula theorem)[50,51]: If the limit of the conditional moments of a random process
exists for all , and the limit equals 0 for some even order, then the limits equal 0 for all .
The Pawula theorem states that there are only three possible cases for the Kramers-Moyal expansion:
1) The Kramers-Moyal expansion is truncated at , meaning that the process is deterministic;
2) The Kramers-Moyal expansion stops at , in which case the resulting equation is the Fokker-Planck equation, which describes diffusion processes;
3) The Kramers-Moyal expansion contains all the terms up to .
In this paper, we focus on the case of Fokker-Planck equation.
3. Main Results and Proofs
Lemma 1: Suppose f, p, q are continuously differentiable functions defined on . Then
holds whenever both sides are well defined.
Proof:
Notice that
i.e.
then
So we can get the result
Lemma 2: Suppose that the Fokker-Planck equation for the density function is
We use variable substitution or to convert the equation to
where , , and are functions of t and . Then we get the density
Proof: Using the Fourier transform, we can obtain the solution of the Fokker-Planck equation (50) with respect to v
It is worth noting that while p is a probability density function in u, it is not a probability density function in v. The transformation, or , is introduced only in the process of solving the equation and does not mean that v is a random process. Therefore, the integral of p with respect to v need not equal 1. However, we can obtain
and
where is the support of with respect to u.
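As a generic illustration of the Fourier-transform technique used in this proof, consider a drift-diffusion equation whose coefficients do not depend on v; the coefficients $a(t)$ and $b(t)\ge 0$ below are placeholders for illustration and are not the paper's notation:
\[
\frac{\partial p}{\partial t} = -a(t)\frac{\partial p}{\partial v} + b(t)\frac{\partial^{2} p}{\partial v^{2}} .
\]
Writing $\hat p(\xi,t)=\int p(v,t)e^{-i\xi v}\,dv$, the equation becomes $\partial_t \hat p = \big(-i\xi a(t) - \xi^{2} b(t)\big)\hat p$, so
\[
\hat p(\xi,t) = \hat p(\xi,0)\,
\exp\!\Big(-i\xi\!\int_0^t a(s)\,ds \;-\; \xi^{2}\!\int_0^t b(s)\,ds\Big),
\]
i.e. the solution is the initial profile convolved with a Gaussian kernel of mean $\int_0^t a(s)\,ds$ and variance $2\int_0^t b(s)\,ds$.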
Theorem 3: The probability density function of space-time random field , , , satisfies the following Fokker-Planck equations
where
here
are the n-th order conditional moments and is the standard orthonormal basis of .
Proof: , we can compute the increment of the density function in the time variable
where
is the n-th order conditional moment. Then the partial derivative of the density function with respect to t is
The Pawula theorem implies that if the Kramers-Moyal expansion stops after the second term, we obtain the Fokker-Planck equation in the time variable t
where
Similarly, considering the increment of the space variable , we can obtain the Fokker-Planck equations in the space variables as
where
here
is the standard orthonormal basis of , .
Theorem 4: Suppose that and are rapidly decreasing functions of the space-time random field , so that the boundary terms arising from integration by parts vanish, . If there are two infinitesimals and satisfying
where , are the k-th coordinate of x, , such that
holds for all meaningful . Then we have
here and are generalized Fisher information on space-time variables and
are infinitesimal on space-time variables, .
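The classical one-parameter analogue of this statement, which the theorem extends to space-time increments, is the standard local expansion of divergences in terms of Fisher information (a sketch in our own notation): for a smooth parametric family $p_\theta$,
\[
D_{\mathrm{KL}}(p_\theta\,\|\,p_{\theta+\varepsilon}) = \tfrac12 I(\theta)\,\varepsilon^{2} + o(\varepsilon^{2}),
\qquad
J(p_\theta,p_{\theta+\varepsilon}) = I(\theta)\,\varepsilon^{2} + o(\varepsilon^{2}),
\]
so that
\[
\lim_{\varepsilon\to 0}\frac{J(p_\theta,p_{\theta+\varepsilon})}{\varepsilon^{2}} = I(\theta),
\qquad
I(\theta) = \int p_\theta(u)\Big(\frac{\partial}{\partial\theta}\ln p_\theta(u)\Big)^{2}du .
\]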
Proof: First, we consider . Notice that
then we can get
where .
Recalling the assumptions, there are two infinitesimals and
such that
holds for all meaningful , then we can get
so there exists an infinitesimal as such that
Similarly, we can obtain the bounds of the difference quotients of the Jeffreys divergence for the space coordinates
for .
Theorem 5: Suppose that and are rapidly decreasing functions of the space-time random fields and , so that the boundary terms arising from integration by parts vanish, . Then the Jeffreys divergence satisfies the generalized De Bruijn identities
Similarly, for , the generalized De Bruijn identities for the space variables are
where
then we get the conclusion.
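For orientation, a classical analogue of such identities, sketched under the assumption that both densities evolve under the one-dimensional heat semigroup $\partial_t p = \tfrac12\partial_u^2 p$ (a particular Fokker-Planck equation; this is not necessarily the exact setting of the theorem), reads
\[
\frac{d}{dt}D_{\mathrm{KL}}(p_t\,\|\,q_t)
= -\frac12\int p_t\Big(\frac{\partial}{\partial u}\ln\frac{p_t}{q_t}\Big)^{2}du,
\qquad
\frac{d}{dt}J(p_t,q_t)
= -\frac12\int \big(p_t+q_t\big)\Big(\frac{\partial}{\partial u}\ln\frac{p_t}{q_t}\Big)^{2}du,
\]
i.e. the time derivative of the Jeffreys divergence equals minus one half of a symmetrized relative Fisher information.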
4. Three Fokker-Planck Random Fields and Their Corresponding Information Measures
In this section, we list three types of Fokker-Planck equations, obtain their density functions, and compute the corresponding information measures: the Jeffreys divergence, the generalized Fisher information, and the Fisher divergence. Starting from these quantities, we illustrate the two results stated in Theorem 4 and Theorem 5. On the one hand, we compute the quotient of the Jeffreys divergence of the same Fokker-Planck space-time random field at different space-time points by the squared space-time increment, and compare it with the generalized Fisher information. On the other hand, we obtain the De Bruijn identities relating the Jeffreys divergence and the generalized Fisher divergence for space-time random fields at the same space-time position whose densities satisfy Fokker-Planck equations of the same type with different parameters.
4.1. A Trivial Equation
If we let
are continuously differentiable functions independent of u and is the initial density, then the Fokker-Planck equations are simple parabolic equations and the solution can be obtained by the Fourier transform
To write the above results in a unified form, we suppose that
holds for , so that we can obtain a unified formula. This suggests that we need to find functions and subject to the total differential
i.e. , corresponding to the partial derivatives of and with respect to the space-time variables, respectively. In this way, we can obtain the probability density function
Actually, there are many examples whose Fokker-Planck equations fit this form. Let be a Brownian sheet, that is, a centered continuous Gaussian process indexed by real, positive parameters and taking its values in [52,53]. Moreover, its covariance structure is given by
for , , where denotes the minimum of two numbers. We can easily get
where is the coordinate product of , and the density function takes the form
Moreover, the Fokker-Planck equations are
with the initial condition as .
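One consistent explicit reconstruction of the quantities just described, assuming the standard normalization of the Brownian sheet on $(t,\mathbf{x})\in[0,\infty)\times[0,\infty)^{d}$ (our own notation; the original displays are not reproduced), is
\[
\mathbb{E}\big[W(t,\mathbf{x})\,W(s,\mathbf{y})\big] = (t\wedge s)\prod_{i=1}^{d}(x_i\wedge y_i),
\qquad
W(t,\mathbf{x}) \sim \mathcal{N}\Big(0,\ t\prod_{i=1}^{d}x_i\Big),
\]
so the marginal density
\[
p(u;t,\mathbf{x}) = \frac{1}{\sqrt{2\pi\, t\prod_i x_i}}\,
\exp\!\Big(-\frac{u^{2}}{2\,t\prod_i x_i}\Big)
\]
satisfies the parabolic equations
\[
\frac{\partial p}{\partial t} = \frac{\prod_i x_i}{2}\,\frac{\partial^{2} p}{\partial u^{2}},
\qquad
\frac{\partial p}{\partial x_k} = \frac{t\prod_{i\neq k} x_i}{2}\,\frac{\partial^{2} p}{\partial u^{2}},
\qquad k=1,\dots,d,
\]
with $p\to\delta(u)$ as the variance $t\prod_i x_i \to 0$.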
Following the construction of the Brownian bridge from Brownian motion[53], we call
a Brownian sheet bridge on the cube , where is the Brownian sheet. Obviously, is Gaussian, and
we can get
and density function of is
On the other hand, the Fokker-Planck equations are
with the initial condition as and we get the solution (97).
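A consistent reconstruction of this bridge construction, in our own notation (the original displays are not reproduced, so the normalization below is an assumption), is
\[
B(t,\mathbf{x}) = W(t,\mathbf{x}) - t\prod_{i=1}^{d}x_i \; W(1,\mathbf{1}),
\qquad (t,\mathbf{x})\in[0,1]^{1+d},
\qquad \mathbf{1}=(1,\dots,1),
\]
a centered Gaussian field with
\[
\operatorname{Var} B(t,\mathbf{x}) = t\prod_{i=1}^{d}x_i\Big(1 - t\prod_{i=1}^{d}x_i\Big),
\]
so its marginal density is Gaussian with this variance and degenerates to $\delta(u)$ whenever $t\prod_i x_i\in\{0,1\}$.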
Now, we have the density functions (92) and (97) and can compute their respective Jeffreys divergences and the generalized De Bruijn identities. Actually, we can easily see that the Jeffreys divergence of (89) at different space-time points takes the form
and the Fisher divergence of and at the same space-time points
here .
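Since the densities in this subsection are centered Gaussians, the relevant closed forms reduce to the standard expressions for two centered normal laws $\mathcal{N}(0,\sigma_1^{2})$ and $\mathcal{N}(0,\sigma_2^{2})$ (a sketch in our own notation):
\[
D_{\mathrm{KL}}\big(\mathcal{N}(0,\sigma_1^{2})\,\|\,\mathcal{N}(0,\sigma_2^{2})\big)
= \frac12\Big(\frac{\sigma_1^{2}}{\sigma_2^{2}} - 1 - \ln\frac{\sigma_1^{2}}{\sigma_2^{2}}\Big),
\qquad
J = \frac12\Big(\frac{\sigma_1^{2}}{\sigma_2^{2}} + \frac{\sigma_2^{2}}{\sigma_1^{2}}\Big) - 1
= \frac{(\sigma_1^{2}-\sigma_2^{2})^{2}}{2\,\sigma_1^{2}\sigma_2^{2}} .
\]
For the fields of this subsection, the two variances would be those of the field at the two space-time points being compared.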
Substituting the density function of the Brownian sheet into (99), we can get the Jeffreys divergence of the Brownian sheet at different space-time points
and the generalized Fisher information on space-time variables are
.
Then we can compute the quotients of the Jeffreys divergence by the squared differences of the space-time variables
and then we can get the relation between quotients and generalized Fisher information
for . If we let the space-time points and approach each other, the final result (104) agrees with the conclusion of Theorem 4.
Similarly, we can get the Jeffreys divergence of the Brownian sheet bridge at different space-time points
and the generalized Fisher information on space-time variables
Further, we can easily compute the quotients of the Jeffreys divergence by the squared differences of the space-time variables
and then we can get the relation between quotients and generalized Fisher information
for . Likewise, the result (108) is also consistent with Theorem 4.
Next, we consider the Jeffreys divergence between (92) and (97) at the same space-time points. Noting that the density function of the Brownian sheet bridge has a bounded domain, we consider only the space-time region .
with the remainder terms , . Furthermore, we can get the generalized De Bruijn identities
4.2. A Nontrivial Equation
If we let
are continuously differentiable functions and
is the transformation. Then the Fokker-Planck equations can be rewritten as
Using the Fourier transform, we can obtain the solution
where is the logarithm of the initial value. Then we need to suppose that there are functions subject to
and, taking into , we can obtain a unified density function formula.
As an example, we let
and , the Fokker-Planck equations are
and i.e. , we can obtain
with the solution
and
Then we suppose that there is a function subject to the total differential
and the density function is
Similar to the idea of Section 4.1, if we consider different and in (123), we can get density functions and , then we can get the Jeffreys divergence
and generalized Fisher information
then the quotients
and we can easily get the relation
for . Likewise, the result (127) also corroborates Theorem 4.
Furthermore, if we consider different and in (123), we can get density functions and , then the generalized Fisher divergence at the same space-time points is
with the remainder terms , . Then the generalized De Bruijn identities are
4.3. An Interesting Equation
If we let
are continuously differentiable functions and
where to are nonnegative functions independent of u and we omit .
Thus, we can rewrite the Fokker-Planck equations as
where and we can obtain the Fokker-Planck equations formed as
with initial value .
Using the formula , we can rewrite the Fokker-Planck equations
with the solution
and the variable substitution is .
Similarly, we suppose that there is a function subject to the total differential
then the density is
Starting from the density function (142), if we consider different and in (142), we can obtain density functions and ; then we can compute the Jeffreys divergence and the generalized Fisher information
and
then the quotients
Obviously, we can easily get
for . Likewise, the result (146) corroborates Theorem 4.
Furthermore, if we consider different and in (142), we can get density functions and , then the generalized Fisher divergence at the same space-time points is
with the remainder terms , . Then the generalized De Bruijn identities are
5. Conclusions
In this paper, we generalize the classical definitions of entropy, divergence and Fisher information, and obtain these measures for space-time random fields. In addition, we derive the Fokker-Planck equations (55) for space-time random fields and obtain their density functions. Moreover, we compute the Jeffreys divergence of a space-time random field at different space-time positions, and show that the ratio of the Jeffreys divergence to the squared space-time coordinate difference approximates the generalized Fisher information (68). Further, using the Jeffreys divergence of two space-time random fields governed by Fokker-Planck equations of the same type but with different parameters, we obtain generalized De Bruijn identities (77) and establish the relation between the Jeffreys divergence of space-time random fields and the generalized Fisher divergence. Finally, we give three examples of Fokker-Planck equations together with their solutions, calculate the corresponding Jeffreys divergence, generalized Fisher information and Fisher divergence, and obtain the De Bruijn identities. These results encourage further research into the entropy divergence of space-time random fields, which advances the related topics of information entropy, Fisher information, and De Bruijn identities.
Acknowledgments
The author would like to thank Prof. Pingyi Fan for providing relevant references and helpful discussions on topics related to this work.
Conflicts of Interest
The authors declare no conflict of interest.
Abbreviations
The following abbreviations are used in this manuscript:
KL: Kullback-Leibler divergence
FI: Fisher information
CFI: Cross Fisher information
FD: Fisher divergence
sFD: symmetric Fisher divergence
JD: Jeffreys divergence
References
Risken, H. The Fokker-Planck equation: methods of solution and applications; Springer Heidelberg: Berlin, 1984.
Stam, A. J. Some inequalities satisfied by the quantities of information of Fisher and Shannon. Inf. Control 1959, 2, 101–112.
Barron, A. R. Entropy and the central limit theorem. Ann. Probab. 1986, 14, 336–342.
Johnson, O. Information theory and the central limit theorem; Imperial College Press: London, U.K., 2004.
Guo, D. Relative entropy and score function: New information estimation relationships through arbitrary additive perturbation. In Proc. IEEE Int. Symp. Inf. Theory, Seoul, South Korea, Jun./Jul. 2009; pp. 814–818.
Toranzo, I. V.; Zozor, S.; Brossier, J.-M. Generalization of the De Bruijn Identity to General ϕ-Entropies and ϕ-Fisher Informations. IEEE Trans. Inform. Theory 2018, 64, 6743–6758.
Kharazmi, O.; Balakrishnan, N. Cumulative residual and relative cumulative residual Fisher information and their properties. IEEE Trans. Inform. Theory 2021, 67, 6306–6312.
Kolmogorov, A. N. The local structure of turbulence in incompressible viscous fluid for very large Reynolds numbers. Dokl. Akad. Nauk SSSR 1941, 30, 299–303.
Kolmogorov, A. N. On the degeneration of isotropic turbulence in an incompressible viscous fluid. Dokl. Akad. Nauk SSSR 1941, 31, 538–542.
Kolmogorov, A. N. Dissipation of energy in isotropic turbulence. Dokl. Akad. Nauk SSSR 1941, 32, 19–21.
Yaglom, A. M. Some classes of random fields in n-dimensional space, related to stationary random processes. Theory Probab. Its Appl. 1957, 2, 273–320.
Yaglom, A. M. Correlation theory of stationary and related random functions. Volume I: basic results; Springer-Verlag: New York, 1987.
Yaglom, A. M. Correlation theory of stationary and related random functions. Volume II: supplementary notes and references; Springer-Verlag: Berlin, 1987.
Bowditch, A.; Sun, R. The two-dimensional continuum random field Ising model. Ann. Probab. 2022, 50, 419–454.
Bailleul, I.; Catellier, R.; Delarue, F. Propagation of chaos for mean field rough differential equations. Ann. Probab. 2021, 49, 944–996.
Wu, L.; Samorodnitsky, G. Regularly varying random fields. Stoch. Process. Their Appl. 2020, 130, 4470–4492.
Koch, E.; Dombry, C.; Robert, C. Y. A central limit theorem for functions of stationary max-stable random fields on Rd. Stoch. Process. Their Appl. 2020, 129, 3406–3430.
Ye, Z. On entropy and ε-entropy of random fields. Ph.D. dissertation, Cornell University, 1989.
Ye, Z.; Berger, T. A new method to estimate the critical distortion of random fields. IEEE Trans. Inform. Theory 1992, 38, 152–157.
Ye, Z.; Berger, T. Information Measures for Discrete Random Fields; Science Press: Beijing/New York, 1998.
Ye, Z.; Yang, W. Random Field: Network Information Theory and Game Theory; Science Press: Beijing, 2023. (In Chinese)
Ma, C. Stationary random fields in space and time with rational spectral densities. IEEE Trans. Inform. Theory 2007, 53, 1019–1029.
Hairer, M. A theory of regularity structures. Invent. Math. 2014, 198, 269–504.
Hairer, M. Solving the KPZ equation. Ann. Math. 2013, 178, 559–664.
Kremp, H.; Perkowski, N. Multidimensional SDE with distributional drift and Lévy noise. Bernoulli 2022, 28, 1757–1783.
Beeson, R.; Namachchivaya, N. S.; Perkowski, N. Approximation of the filter equation for multiple timescale, correlated, nonlinear systems. SIAM J. Math. Anal. 2022, 54(3), 3054–3090.
Song, Z.; Zhang, J. A note for estimation about average differential entropy of continuous bounded space-time random field. Chinese J. Electron. 2022, 31, 793–803.
Kramers, H. A. Brownian motion in a field of force and the diffusion model of chemical reactions. Physica 1940, 7, 284–304.
Moyal, J. E. Stochastic processes and statistical physics. J. R. Stat. Soc. Series B Stat. Methodol. 1949, 11, 150–210.
Shannon, C. E. A mathematical theory of communication. Bell System Technical Journal 1948, 27, 379–423, 623–656.
Neeser, F. D.; Massey, J. L. Proper complex random processes with applications to information theory. IEEE Trans. Inform. Theory 1991, 39, 1293–1302.
Ihara, S. Information theory for continuous systems; World Scientific: Singapore, 1993.
Gray, R. M. Entropy and information theory; Springer: Boston, 2011.
Bach, F. Information Theory With Kernel Methods. IEEE Trans. Inform. Theory 2023, 69, 752–775.
Kullback, S.; Leibler, R. A. On information and sufficiency. Ann. Math. Stat. 1951, 22, 79–86.
Jeffreys, H. An invariant form for the prior probability in estimation problems. Proc. R. Soc. Lond. A 1946, 186, 453–461.
Fuglede, B.; Topsøe, F. Jensen-Shannon divergence and Hilbert space embedding. In Proceedings of the IEEE International Symposium on Information Theory (ISIT), Chicago, IL, USA, 27 June–2 July 2004; p. 31.
Rényi, A. On measures of entropy and information. In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, 1961, 1, 547–561.
Liu, S.; She, R.; Zhu, Z.; Fan, P. Storage Space Allocation Strategy for Digital Data with Message Importance. Entropy 2020, 22, 591.
She, R.; Liu, S.; Fan, P. Attention to the Variation of Probabilistic Events: Information Processing with Message Importance Measure. Entropy 2019, 21, 439.
Wan, S.; Lu, J.; Fan, P.; Letaief, K. B. Information Theory in Formation Control: An Error Analysis to Multi-Robot Formation. Entropy 2018, 20, 618.
She, R.; Liu, S.; Fan, P. Recognizing Information Feature Variation: Message Importance Transfer Measure and Its Applications in Big Data. Entropy 2018, 20, 401.
Nielsen, F. An Elementary Introduction to Information Geometry. Entropy 2020, 22, 1100.
Nielsen, F. On the Jensen–Shannon Symmetrization of Distances Relying on Abstract Means. Entropy 2019, 21, 485.
Nielsen, F.; Nock, R. Generalizing skew Jensen divergences and Bregman divergences with comparative convexity. IEEE Signal Process. Lett. 2017, 24, 1123–1127.
Furuichi, S.; Minculete, N. Refined Young Inequality and Its Application to Divergences. Entropy 2021, 23, 514.
Pinele, J.; Strapasson, J. E.; Costa, S. I. The Fisher-Rao Distance between Multivariate Normal Distributions: Special Cases, Bounds and Applications. Entropy 2020, 22, 404.
Reverter, F.; Oller, J. M. Computing the Rao distance for Gamma distributions. J. Comput. Appl. Math. 2003, 157, 155–167.
Pawula, R. F. Generalizations and extensions of the Fokker-Planck-Kolmogorov equations. IEEE Trans. Inform. Theory 1967, 13, 33–41.
Pawula, R. F. Approximation of the linear Boltzmann equation by the Fokker-Planck equation. Phys. Rev. 1967, 162, 186–188.
Khoshnevisan, D.; Shi, Z. Brownian Sheet and Capacity. Ann. Probab. 1999, 27(3), 1135–1159.
Revuz, D.; Yor, M. Continuous Martingales and Brownian Motion, 2nd ed.; Springer-Verlag: New York, 1999.