Preprint Article

Direct Link Between Energy and Information via Fisher’s Measure

This version is not peer-reviewed. Submitted: 23 January 2024; Posted: 07 February 2024.
Abstract
Energy and information are fundamental concepts that play crucial roles in the foundations of science, each contributing to different aspects of our understanding of the natural world. Here we show that Fisher's measure is able to link them. We also investigate some interesting relations between Fisher's information, the Poisson distribution, and thermal quantifiers in a grand canonical ensemble. These connections permit one to visualize interesting novel features of statistical mechanics. We clearly see them emerge in discussing the ideal gas. For example, we find that the Poisson distribution is intrinsically involved in the physics of the ideal gas, and we are able to give a Fisher estimation of the difference in informational content between the grand canonical and canonical ensembles.
Subject: Physical Sciences - Mathematical Physics

1. Introduction

The relationship between energy and information continues to be a topic of exploration in both theoretical and applied sciences. Understanding their interplay contributes to a more comprehensive framework for describing and predicting the behavior of physical systems, from the microscopic to the cosmic scale. Here we highlight a direct relation between energy and information through Fisher's measure. Fisher's measure is a concept from information theory and statistics that quantifies how much information a random variable carries about an unknown parameter [1]. If one can establish a connection between energy and information using Fisher's measure, this suggests a potential link between 1) physical or energetic properties and 2) the information content of a system. This kind of interdisciplinary research, combining concepts from physics and information theory, can lead to valuable insights and applications in fields such as physics, information science, and technology. This investigation may thus be expected to contribute to a further understanding of fundamental principles in science.
As just stated, Fisher information (FI) is a concept from the field of statistics and information theory, particularly associated with the work of Sir Ronald A. Fisher [1]. The Fisher information measure quantifies the sensitivity of a statistical model or experiment to the parameters being estimated. Some prior notions related to our endeavor are sketched in the subsections below. In this work we will look at:
(1) The relationship between i) Fisher's measure F evaluated for the Poisson distribution and ii) the FI I evaluated for the grand canonical ensemble. Two distinct Fisher parameters are in play here, namely, the inverse temperature β and the mean particle number ⟨N⟩.
(2) The connections between these two measures (F or I) and the energy, for the first parameter above (β).
(3) Ditto for the inverse of the mean particle number, in the case of the second parameter.
(4) Relations between thermal uncertainty relations and Fisher measures.

2. Sketch Concerning the Notions Involved here

2.1. Poisson’s Distribution

The Poisson distribution (see details below) is important in physics and various other fields due to its ability to model the probability of a given number of events occurring in a fixed interval of time or space when these events happen independently and at a constant average rate [2]. Its applications in physics and other sciences are widespread, and here are some areas where the Poisson distribution is particularly relevant: particle and nuclear physics, photon counting, traffic flows, economics and finance, biophysics, etc. The Poisson distribution is versatile and widely applicable whenever the probability of rare events occurring independently at a constant rate needs to be described. Its simplicity and generality make it a valuable tool in physics and other scientific disciplines. We use it here in connection to statistical mechanics.

2.2. The Likelihood Function

Mathematically, the likelihood function is defined as the probability of observing the given data under a specific set of parameter values. For a continuous distribution, the likelihood function is the probability density function (pdf) evaluated at the observed data points. For a discrete distribution, it is the probability mass function (pmf). In turn, the log-likelihood function is a fundamental concept in statistics, used to estimate the parameters of a statistical model based on observed data. Given a statistical model with a set of parameters θ and a likelihood function L(θ | data), the log-likelihood function is the natural logarithm of the likelihood function. A Maximum Likelihood Estimator (MLE) is a method used in statistics to estimate the parameters of a statistical model. It is one of the most widely used methods for parameter estimation and is based on the concept of likelihood. Here is how the MLE works. Given a statistical model and a set of observed data, the likelihood function is constructed: it measures how well the model's parameters explain the observed data, by calculating the probability of observing the given data under various parameter values. The goal of MLE is to find the parameter values that maximize the likelihood function; in other words, it seeks the values of the parameters that make the observed data most probable. To simplify the calculations, the likelihood function is often transformed into the log-likelihood function by taking the natural logarithm. This does not change the position of the maximum, but it makes the calculations more manageable [1].
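The MLE procedure just described can be sketched numerically. The following is a minimal illustration (the synthetic data, the assumed true rate 4.2, and the use of NumPy/SciPy are assumptions, not part of the original text): we estimate a Poisson rate by numerically maximizing the log-likelihood, and the optimum reproduces the closed-form MLE, which is the sample mean.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import gammaln

rng = np.random.default_rng(0)
data = rng.poisson(lam=4.2, size=10_000)  # synthetic counts with an assumed true rate

def neg_log_likelihood(lam):
    # Poisson log-pmf summed over the sample: sum_k [k ln(lam) - lam - ln(k!)]
    return -np.sum(data * np.log(lam) - lam - gammaln(data + 1))

res = minimize_scalar(neg_log_likelihood, bounds=(0.1, 20.0), method="bounded")
mle = res.x
print(mle, data.mean())  # the two values agree: the MLE of a Poisson rate is the sample mean
```

Maximizing the log-likelihood rather than the likelihood itself, as noted above, leaves the maximizer unchanged while avoiding numerical underflow of the product of many small probabilities.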

2.3. Grand Canonical Ensemble

The grand canonical ensemble (GCE) [2] is a statistical mechanical ensemble that describes a system in equilibrium with a particle and heat reservoir, allowing for the exchange of both particles and energy with it. This ensemble is commonly used to study systems with a varying number of particles, such as gases at fixed temperature and chemical potential. In the grand canonical ensemble, the system is characterized by its temperature T, volume V, and chemical potential μ. The ensemble allows for fluctuations in the number of particles N in the system, with an average particle number ⟨N⟩. We will have a lot to say about this quantity.

2.4. Important Fisher Properties

In the context of statistical estimation, Fisher's information measures how much information a random sample of data contains about an unknown parameter [1]. It is a measure of the amount of uncertainty or variability in the data with respect to the parameter being estimated. The Fisher information matrix is often used to quantify this information. Mathematically, if one has a probability distribution f(x; θ) for some parameter θ, the Fisher information I(θ) is given by the expected value of the square of the score function, which is the derivative of the log-likelihood function with respect to the parameter [1].
The Fisher information measure has several important properties [1]:
  • Information accumulation: it quantifies how much information about a parameter is accumulated by collecting more data.
  • Cramér–Rao inequality: the Fisher information is related to the precision of parameter estimation. The Cramér–Rao inequality states that the variance of any unbiased estimator is bounded from below by the inverse of the Fisher information [1].
  • Efficiency of estimators: it helps compare different estimators for efficiency, with more efficient estimators having higher Fisher information.
In summary, Fisher information (FI) is a fundamental concept in statistics that provides a quantitative measure of the amount of information contained in a sample of data about the parameters of a statistical model. FI plays a crucial role in the theory of statistical estimation and hypothesis testing.
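The Cramér–Rao property can be checked by simulation. For a Poisson distribution the Fisher information about the rate λ is I(λ) = 1/λ, so an efficient estimator built from n counts should attain variance λ/n. The rate, sample size, and number of trials below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, n, trials = 3.0, 500, 4000

# The MLE of a Poisson rate is the sample mean; repeat the n-count experiment many times
estimates = rng.poisson(lam, size=(trials, n)).mean(axis=1)

# Cramér-Rao bound: Var(estimator) >= 1/(n I(lam)); for Poisson, I(lam) = 1/lam,
# so the bound is lam/n, and the sample mean (an efficient estimator) attains it.
print(estimates.var(), lam / n)  # the two values are close
```

This also illustrates the "information accumulation" property: the bound, and the attained variance, shrink as 1/n as more data are collected.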

3. Mathematical Background Details

3.1. Formalism of the Grand Canonical Ensemble

This ensemble describes a system in contact with a reservoir with which it can exchange energy and particles. Let us consider a system of N identical particles, in equilibrium at a temperature T and confined to a volume V. The classical Hamiltonian for such a system is a function of the phase point (x, p), that is, H(x, p) [2]. The number of particles is not fixed. The resulting Boltzmann probability distribution of the system is [2]

$$\rho(x,p) = \frac{z^N e^{-\beta H(x,p)}}{\mathcal{Z}(\beta, z, V)}. \tag{1}$$
The parameter β is defined as β = 1/k_B T, where k_B is Boltzmann's constant. The symbol z represents the fugacity of the system, and $\mathcal{Z} \equiv \mathcal{Z}(\beta, z) = \sum_{N=0}^{\infty} z^N Q_N(\beta)$ denotes the grand partition function, where the range of N is $0 \le N < \infty$ [2]. The well known canonical partition function for this system is given by [2]

$$Q_N(\beta) = \int d\Omega \, e^{-\beta H}, \tag{2}$$

with $d\Omega = d^{3N}x \, d^{3N}p / (N!\, h^{3N})$ the element of volume of the phase space. The average of the particle number in the grand canonical ensemble is given by [2]
$$\langle N \rangle = z \left( \frac{\partial \ln \mathcal{Z}}{\partial z} \right)_{V,T}, \tag{3}$$

while the mean energy is [2]

$$\langle H \rangle = -\left( \frac{\partial \ln \mathcal{Z}}{\partial \beta} \right)_{z,V}. \tag{4}$$

The mean-square fluctuation in the energy $U = \langle H \rangle$ of a system in the grand canonical ensemble is [2]

$$\langle (\Delta U)^2 \rangle = \langle (\Delta U)^2 \rangle_{can} + \left( \frac{\partial U}{\partial \langle N \rangle} \right)_{T,V}^2 \langle (\Delta N)^2 \rangle, \tag{5}$$

which is equal to the fluctuation in the canonical ensemble plus a contribution due to the fact that the particle number N fluctuates. That contribution is governed by $\langle (\Delta N)^2 \rangle = \langle N \rangle$ [2].

3.2. The Above Ideas in the Classical Ideal Gas [2]

Let us consider N identical particles with momenta $\mathbf{p}_i$, $i = 1, \ldots, N$. The classical Hamiltonian for such a system is given by $H = \sum_{i=1}^{N} p_i^2 / 2m$, where m represents the mass of the particles. The canonical partition function now adopts the appearance [2]

$$Q_N(\beta) = \frac{1}{N!} \left( \frac{V}{\lambda^3} \right)^N, \tag{6}$$

where $\lambda = (2\pi\hbar^2 / m k_B T)^{1/2}$ is the particles' mean thermal wavelength. Therefore, the grand canonical partition function becomes [2]

$$\mathcal{Z} = \exp\left( \frac{zV}{\lambda^3} \right). \tag{7}$$

The average of the particle number in the grand canonical ensemble is given by [2]

$$\langle N \rangle = \frac{zV}{\lambda^3} = \ln \mathcal{Z}, \tag{8}$$

while the mean energy is [2]

$$\langle H \rangle = \frac{3zV}{2\beta\lambda^3} = \frac{3}{2} \langle N \rangle k_B T. \tag{9}$$
The thermodynamic entropy is by definition [2]

$$S = k_B \left( \beta \langle H \rangle - \beta \langle N \rangle \mu + \ln \mathcal{Z} \right), \tag{10}$$

so that, by Equations (8) and (9), one arrives at the celebrated Sackur–Tetrode (ST) equation [2]

$$S = k_B \langle N \rangle \ln \left( \frac{e^{5/2} V}{\lambda^3 \langle N \rangle} \right), \tag{11}$$

which depends on ⟨N⟩, T, and V.
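The ST equation can be evaluated directly. The sketch below, with CODATA constants and helium at room temperature and atmospheric pressure chosen as illustrative assumptions, yields an entropy per particle S/(N k_B) of about 15, i.e. a molar entropy close to the measured standard value for helium gas.

```python
import math

# Physical constants (CODATA values, assumed here for illustration)
h  = 6.62607015e-34   # Planck constant, J s
kB = 1.380649e-23     # Boltzmann constant, J/K
m  = 6.6464731e-27    # mass of a helium-4 atom, kg

T, p = 300.0, 101325.0                        # room temperature, 1 atm
lam = h / math.sqrt(2 * math.pi * m * kB * T)  # thermal wavelength, = (2*pi*hbar^2/(m kB T))^(1/2)
v = kB * T / p                                 # volume per particle from the ideal gas law

# Sackur-Tetrode entropy per particle in units of kB: 5/2 + ln(v / lam^3)
s_per_particle = 5 / 2 + math.log(v / lam**3)
print(s_per_particle)  # about 15.2, i.e. ~126 J/(mol K), close to the measured value for He
```

Note that the result depends on N, T, and V only through the intensive combination v/λ³, as the ST equation requires.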

4. Poisson Distribution and GCE [2]

A discrete random variable X (referring to a discrete number of occurrences k) is said to have a Poisson distribution with positive parameter λ > 0 if it has a probability mass function given by

$$f(k; \lambda) = \Pr(X = k) = \frac{\lambda^k e^{-\lambda}}{k!}. \tag{12}$$

An essential fact for us is that the grand canonical ensemble is closely related to the special Poisson distribution [1] in which the parameter is the mean number of particles ⟨N⟩, with k being the actual number of particles N. Accordingly, our special Poisson distribution $P_N$ here refers to the probability of encountering N particles if we know beforehand what ⟨N⟩ is. One has [1]

$$P_N = \frac{\langle N \rangle^N e^{-\langle N \rangle}}{N!}, \qquad 0 \le N < \infty, \tag{13}$$

where ⟨N⟩ is the average number of particles.
A Poisson distribution is used to describe the number of events that occur in a fixed interval of time or space, when the events occur independently at a constant rate. Also, it is often used to model situations where events are rare and random, such as radioactive decay or the arrival of particles at a detector. In the context of the grand canonical ensemble, the Poisson distribution can arise as a result of the probabilistic nature of particle number fluctuations. Specifically, in the grand canonical ensemble the average particle number is not fixed, but rather fluctuates around a mean value determined by the chemical potential of the reservoir [2].
When the fluctuations in the particle number are relatively small and the average particle number is large, the grand canonical distribution of particle numbers can be well approximated by the Poisson distribution (PD) [2]. This occurs because the Poisson distribution naturally arises as a limit of the binomial distribution when the number of trials (particle exchanges) becomes large and the probability of success (particle exchange) becomes small. In such a scenario the PD helps to describe the statistical behavior of the system and provides insights into how possible particle numbers in equilibrium are distributed [2].
Accordingly, the Poisson distribution can provide a useful approximation in the grand canonical ensemble for systems with a large average particle number, where the fluctuations are small and the particle exchanges with the reservoir are rare events occurring independently [2]. One example where the Poisson distribution can be used as an approximation to the grand canonical distribution is in the context of photon statistics in quantum optics [2].
In this approximation, the average photon number N in the Poisson distribution is equal to the average photon number in the grand canonical distribution [2]. By using the Poisson distribution, one can make calculations and predictions about photon statistics in equilibrium with a reservoir without explicitly considering the full grand canonical distribution, simplifying the analysis while still capturing the essential statistical behavior of the system [2].
The Poisson distribution has several properties. We mention some of them [1]:
  • $P_N$ is normalized:
    $$\sum_{N=0}^{\infty} P_N = 1. \tag{14}$$
  • The expected value of N is
    $$\langle N \rangle = \sum_{N=0}^{\infty} P_N N. \tag{15}$$
  • The variance of N behaves in rather peculiar fashion,
    $$\langle (\Delta N)^2 \rangle = \langle N^2 \rangle - \langle N \rangle^2 = \langle N \rangle, \tag{16}$$
    where
    $$\langle N^2 \rangle = \sum_{N=0}^{\infty} P_N N^2 = \langle N \rangle + \langle N \rangle^2. \tag{17}$$
In addition, a useful quantity that we will use later is the scaled variance, or Fano factor, which is an intensive measure of fluctuations [4]. It is defined as [2]

$$\omega = \frac{\langle (\Delta N)^2 \rangle}{\langle N \rangle}. \tag{18}$$
It is evident that for the Poisson distribution we have ω = 1. For a Fano factor ω < 1 one speaks of a sub-Poissonian instance, and for ω > 1 of a super-Poissonian one.
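The Fano factor is easy to estimate from samples. In the sketch below (sample sizes and parameters are illustrative assumptions), Poissonian counts give ω close to 1, while a binomial particle-number distribution with a fixed maximum, a sub-Poissonian case, gives ω = 1 − p < 1.

```python
import numpy as np

rng = np.random.default_rng(42)

# Poissonian counts: variance equals the mean, so the Fano factor is 1
samples = rng.poisson(lam=50.0, size=200_000)
fano = samples.var() / samples.mean()
print(fano)  # close to 1

# Binomial counts (fixed maximum number of particles) are sub-Poissonian: omega = 1 - p
sub = rng.binomial(n=100, p=0.5, size=200_000)
print(sub.var() / sub.mean())  # close to 0.5
```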

5. Fisher’s Information and Poisson Distribution

We begin at this point to develop our vision regarding an energy-information connection. The relation between Fisher’s information and Poisson distribution is essential in the search for such a direct energy-information link.
The discrete Fisher information for the Poisson distribution, denoted as $F_\beta$, is defined as [3]

$$F_\beta = \sum_{N=0}^{\infty} \frac{(\partial_\beta P_N)^2}{P_N}, \tag{19}$$

where we use the abbreviated notation $\partial_\beta P_N = \partial P_N / \partial \beta$. Considering the Poisson distribution $P_N$ given by Equation (13), we can derive the relation $\partial_\beta P_N = P_N \left( N/\langle N \rangle - 1 \right) \partial_\beta \langle N \rangle$. Therefore, we have for the Fisher information measure F associated with the Poisson distribution

$$F_\beta = \left( \frac{\partial_\beta \langle N \rangle}{\langle N \rangle} \right)^2 \sum_{N=0}^{\infty} P_N \left( N^2 - 2N\langle N \rangle + \langle N \rangle^2 \right). \tag{20}$$

By using Equations (14)–(16), and setting ω = 1, we can calculate the above sum to obtain the result

$$F_\beta = \frac{1}{\langle N \rangle} \left( \partial_\beta \langle N \rangle \right)^2. \tag{21}$$
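Equation (21) can be checked numerically without performing the sum analytically: differentiate $P_N$ with respect to β by finite differences, for some assumed dependence ⟨N⟩(β) (below, an ideal-gas-like ⟨N⟩ ∝ β^(-3/2) at fixed fugacity, a pure illustration), and compare the discrete Fisher sum with the closed form.

```python
import numpy as np
from scipy.special import gammaln

def poisson_pmf(mean, N):
    # pmf computed via logs for numerical stability
    return np.exp(N * np.log(mean) - mean - gammaln(N + 1))

# Assumed ideal-gas-like dependence <N>(beta) ~ beta^(-3/2) at fixed fugacity
mean_of_beta = lambda b: 100.0 * b**-1.5

beta, h = 2.0, 1e-6
N = np.arange(0, 150)  # truncation of the infinite sum; the tail is negligible here

# Discrete Fisher sum with a central finite-difference derivative of P_N
P = poisson_pmf(mean_of_beta(beta), N)
dP = (poisson_pmf(mean_of_beta(beta + h), N) - poisson_pmf(mean_of_beta(beta - h), N)) / (2 * h)
F_numeric = np.sum(dP**2 / P)

# Closed form of the text: F_beta = (d<N>/dbeta)^2 / <N>
dmean = (mean_of_beta(beta + h) - mean_of_beta(beta - h)) / (2 * h)
F_closed = dmean**2 / mean_of_beta(beta)
print(F_numeric, F_closed)  # the two values agree
```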

5.1. Discrete Fisher Information and the ⟨N⟩-Parameter

On the other hand, the discrete Fisher information [3] for the ⟨N⟩-parameter of the Poisson distribution $P_N$, denoted by $F_{\langle N \rangle}$, is given by

$$F_{\langle N \rangle} = \sum_{N=0}^{\infty} \frac{(\partial_{\langle N \rangle} P_N)^2}{P_N}, \tag{22}$$

where we use the abbreviated notation $\partial_{\langle N \rangle} P_N = \partial P_N / \partial \langle N \rangle$. Using Equation (13) for $P_N$ and performing the sum as described earlier, we find

$$F_{\langle N \rangle} = \frac{\langle (\Delta N)^2 \rangle}{\langle N \rangle^2}, \tag{23}$$

where we have considered that $\partial_{\langle N \rangle} P_N = P_N \left( N/\langle N \rangle - 1 \right)$. Notice that $F_{\langle N \rangle}$ can also be written as

$$F_{\langle N \rangle} = \frac{1}{\langle N \rangle}, \tag{24}$$

since it is well known, and we saw it above, that for the Poisson distribution one has a Fano factor ω = 1. The above relation associates Fisher information with the inverse of the particle number. It is clear that increasing the number of particles increases our ignorance.
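Equation (24) can also be verified by brute force: with $\partial_{\langle N \rangle} P_N = P_N(N/\langle N \rangle - 1)$, the Fisher sum collapses to the variance divided by ⟨N⟩², which for a Poisson distribution is exactly 1/⟨N⟩. The mean value 12 and the truncation point below are illustrative assumptions.

```python
import numpy as np
from scipy.special import gammaln

mean = 12.0
N = np.arange(0, 120)  # truncation; the Poisson tail beyond this is negligible
P = np.exp(N * np.log(mean) - mean - gammaln(N + 1))  # Poisson pmf via logs

# dP_N/d<N> = P_N (N/<N> - 1), so sum (dP)^2 / P = Var(N)/<N>^2 = 1/<N>
F = np.sum(P * (N / mean - 1.0) ** 2)
print(F, 1.0 / mean)  # both equal 1/12
```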

5.2. Grand Canonical Distribution

Connections between Fisher’s measures and energy fluctuations are important in understanding the relationship between statistical physics and information theory. Fisher’s measures provide a way to quantify the amount of information in a system and how it changes over time. Energy fluctuations, on the other hand, are a key aspect of the behavior of physical systems and can be used to understand their thermodynamic properties. By studying the connections between Fisher’s measures and energy fluctuations, researchers can gain insights into the fundamental principles that govern the behavior of complex systems. This can lead to new developments in fields such as statistical mechanics, machine learning, and data analysis.
The general Fisher information measure for the grand canonical distribution, denoted by $I_\beta$, is defined as [1]

$$I_\beta = \sum_{N=0}^{\infty} \int d\Omega \, \rho(x,p) \left( \frac{\partial \ln \rho(x,p)}{\partial \beta} \right)^2, \tag{25}$$

where ρ(x, p) stands for the grand canonical distribution given by Equation (1). After integrating in phase space and performing the sum as indicated in Section 6.3 of [2], we arrive at

$$I_\beta = \langle (\Delta U)^2 \rangle. \tag{26}$$

Now, repeating the previous procedure but using the canonical distribution $\rho_{can}(x,p) = e^{-\beta H} / Q_N(\beta)$, the canonical Fisher measure is defined as [5]

$$I_\beta^{can} = \int d\Omega \, \rho_{can}(x,p) \left( \frac{\partial \ln \rho_{can}(x,p)}{\partial \beta} \right)^2, \tag{27}$$

where the superscript indicates that we are considering the distribution of the canonical ensemble. Performing the integral, we find

$$I_\beta^{can} = \langle (\Delta U)^2 \rangle_{can}, \tag{28}$$
which coincides with the energy fluctuations of the canonical ensemble [5]. As promised above, we now see, by studying the connections between Fisher's measures and energy fluctuations (in [5] for the canonical ensemble, and here for the grand canonical one), that the entire amount of information regarding the system is contained in the energy fluctuations.
The equivalence between Fisher’s information measure in the grand canonical (and canonical) ensemble and the energy fluctuation is a profound and significant result in statistical mechanics. Fisher’s information quantifies the precision of parameter estimation, and in the context of our two ensembles it precisely captures the uncertainty in estimating the inverse temperature, a key thermodynamic parameter. The fact that this information measure aligns exactly with the energy fluctuation underscores a deep connection between statistical precision and the inherent variability of energy in the system. This equivalence implies that as the precision of our knowledge about the system’s temperature increases, the energy fluctuations become more constrained. In practical terms, this result provides valuable insights into the relationship between information content and the thermodynamic behavior of a system, offering a bridge between the statistical and thermodynamic perspectives. It also holds implications for experimental design, suggesting that enhancing precision in parameter estimation is directly linked to a better understanding and control of energy fluctuations in the grand canonical ensemble. Overall, this result sheds light on the intricate interplay between information theory and thermodynamics, deepening our comprehension of the fundamental principles governing statistical mechanics.

5.3. Evaluating the ⟨N⟩-Parameter

The general Fisher information measure that has ⟨N⟩ as a parameter, denoted $I_{\langle N \rangle}$, is defined as

$$I_{\langle N \rangle} = \sum_{N=0}^{\infty} \int d\Omega \, \rho(x,p) \left( \frac{\partial \ln \rho(x,p)}{\partial \langle N \rangle} \right)^2. \tag{29}$$

From Equation (1) we obtain

$$\frac{\partial \ln \rho(x,p)}{\partial \langle N \rangle} = \frac{N}{z} \frac{\partial z}{\partial \langle N \rangle} - \frac{1}{\mathcal{Z}} \frac{\partial \mathcal{Z}}{\partial \langle N \rangle}, \tag{30}$$

assuming that both z and $\mathcal{Z}$ depend on ⟨N⟩. Introducing now Equation (30) into Equation (29), we find

$$I_{\langle N \rangle} = \frac{1}{\mathcal{Z}} \sum_{N=0}^{\infty} Q_N z^N \left( \frac{N}{z} \frac{\partial z}{\partial \langle N \rangle} - \frac{1}{\mathcal{Z}} \frac{\partial \mathcal{Z}}{\partial \langle N \rangle} \right)^2. \tag{31}$$

6. Applications to the Ideal Gas

Poisson Connections between Distinct Fisher Measures

From Equation (8) we get $\partial_\beta \langle N \rangle = -3\langle N \rangle/(2\beta) = -(3/2)\langle N \rangle k_B T$. Thus, Poisson's Equation (21) becomes the Fisher measure, with the inverse temperature as parameter, associated with the Poisson distribution:

$$F_\beta = \left( \frac{3}{2} k_B T \right)^2 \langle N \rangle, \tag{32}$$

and we realize that the right-hand side above is built with the mean energy per particle of the ideal gas, as determined using the Poisson distribution. Thus, $F_\beta$ is ⟨N⟩ times the square of the single-particle mean energy.
In the specific scenario of the ideal gas in the grand canonical ensemble, where $U = \langle H \rangle$ is determined by Equation (9), we find that $\partial U / \partial \langle N \rangle = 3k_B T/2$. Based on the previous considerations, we conclude that

$$\langle (\Delta U)^2 \rangle = \langle (\Delta U)^2 \rangle_{can} + \left( \frac{3}{2} k_B T \right)^2 \langle N \rangle, \tag{33}$$

so that, taking into account (32), we arrive at the interesting Poisson result

$$\langle (\Delta U)^2 \rangle = \langle (\Delta U)^2 \rangle_{can} + F_\beta, \tag{34}$$

leading to the Poisson relation

$$F_\beta = \langle (\Delta U)^2 \rangle - \langle (\Delta U)^2 \rangle_{can}, \tag{35}$$

which indicates that the discrete Fisher information for the β-parameter, $F_\beta$, evaluated for the Poisson distribution, is equal to the difference in energy fluctuations between the grand canonical and canonical statistical ensembles. More importantly, we appreciate the fact that Fisher's information measure is here an energy variance. Reinforcing this idea, for the ideal gas, using Equations (26), (28), and (35), we obtain a generalization of the above idea, from the Poisson instance to the grand canonical ideal-gas Fisher measure:

$$F_\beta = I_\beta - I_\beta^{can}. \tag{36}$$

This suggests that considering the grand canonical ensemble provides more information about the inverse temperature than the canonical ensemble alone. We also see that the Poisson distribution is intrinsically linked to the physics of the ideal gas.
The observation that the Fisher information associated with temperature estimation is larger in the grand canonical ensemble compared to the canonical ensemble for an ideal gas holds significant implications in the realm of statistical mechanics and thermodynamics. The grand canonical ensemble allows for the exchange of particles with a reservoir, introducing an additional level of flexibility in the system that enhances the precision of estimating the temperature. The larger Fisher information suggests that the grand canonical ensemble provides more informative data about the temperature parameter, enabling more accurate and reliable temperature estimates. This has practical consequences in experimental settings where the precision of temperature measurements is crucial. The result underscores the advantage of considering systems with variable particle numbers when aiming for precise thermodynamic parameter estimations. Understanding the nuances of information content in different ensembles contributes not only to the theoretical foundations of statistical mechanics but also guides experimental strategies for optimizing temperature control and measurement accuracy in the study of ideal gases. In essence, the larger Fisher information in the grand canonical ensemble signifies a richer source of data for temperature estimation, emphasizing the importance of ensemble selection in designing experiments and interpreting thermodynamic behavior.
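The fluctuation bookkeeping behind Equation (36) can be laid out explicitly for the ideal gas, in units where k_B T = 1. The canonical input, (ΔU)²_can = (3/2)⟨N⟩(k_B T)², follows from (ΔU)²_can = k_B T² C_V with C_V = (3/2)N k_B; this standard textbook result is used here as an assumption, and the particle number is illustrative.

```python
# Units: kB*T = 1. Assumed canonical input: (dU)^2_can = (3/2)<N>, from kB T^2 C_V.
mean_n = 1.0e4

I_can = 1.5 * mean_n             # canonical Fisher measure = canonical energy fluctuation
F_poisson = 1.5**2 * mean_n      # Poisson Fisher measure F_beta = (3 kB T / 2)^2 <N>
I_grand = I_can + F_poisson      # grand canonical energy fluctuation = grand canonical Fisher measure

print(I_grand - I_can == F_poisson)  # F_beta = I_beta - I_beta^can -> True
print(I_grand / mean_n)  # 3.75 (kB T)^2 per particle in the grand canonical ensemble
```

The grand canonical measure per particle, 15/4 in these units, exceeds the canonical 3/2, making explicit the "extra information" contributed by particle-number fluctuations.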
In addition, from Equation (30) we obtain

$$\frac{\partial \ln \rho(x,p)}{\partial \langle N \rangle} = \frac{N}{\langle N \rangle} - 1. \tag{37}$$

Introducing this into Equation (29), after integrating and solving the sum, one has, when the Fisher parameter is now ⟨N⟩,

$$I_{\langle N \rangle} = \frac{1}{\langle N \rangle}, \tag{38}$$

where we have taken into account that

$$\sum_{N=0}^{\infty} N z^N Q_N = \langle N \rangle e^{\langle N \rangle} \tag{39}$$

and

$$\sum_{N=0}^{\infty} N^2 z^N Q_N = \langle N \rangle \left( 1 + \langle N \rangle \right) e^{\langle N \rangle}. \tag{40}$$

By comparing Equations (24) and (38), we find

$$F_{\langle N \rangle} = I_{\langle N \rangle}. \tag{41}$$
Thus, the Fisher measures are identical for both the Poisson and grand canonical distributions, being inversely proportional to the average number of particles. This again shows how strongly the Poisson distribution is linked to the physics of the ideal gas. All information available for the parameter ⟨N⟩ is already Poisson-predetermined.
We see that the interpretation of the Fisher measure for a parameter θ depends on what this parameter stands for. If it is β, the Fisher measure is an energy variance; if it is ⟨N⟩, the Fisher measure is the inverse of the mean number of particles.

7. Thermodynamic Uncertainty Relation

Uffink and van Lith asserted in Ref. [5] that temperature and energy are complementary, in the same way as position and momentum in quantum mechanics. They argued that a system can only have a specific temperature if it is in contact with a heat bath, leading to unavoidable energy fluctuations. In analogy to the relation $\Delta x \, \Delta p \ge \hbar$, they conjecture that [5]

$$\Delta U \, \Delta\!\left( \frac{1}{T} \right) \ge k_B, \tag{42}$$

where $k_B$ is Boltzmann's constant and $\Delta(1/T)$ is the temperature fluctuation, so that

$$\Delta U \, \Delta \beta \ge 1, \tag{43}$$

where, as usual, $\beta = 1/k_B T$.
For a system in thermal contact with a heat bath at temperature T, in the canonical ensemble, we also have the inequality for the canonical-ensemble Fisher measure [5]

$$I_\beta^{can} \, \Delta \beta \ge 1, \tag{44}$$

with $I_\beta^{can}$ the Fisher information given by (28) and $\Delta \beta = \Delta(1/T)/k_B$. Using Equation (36) one can derive the link with the grand canonical Fisher measure $I_\beta$:

$$I_\beta^{can} = I_\beta - F_\beta. \tag{45}$$

Substituting this into inequality (44), we get the result

$$I_\beta \, \Delta \beta \ge 1 + F_\beta \, \Delta \beta, \tag{46}$$

which can also be written as

$$\left( I_\beta - F_\beta \right) \Delta \beta \ge 1. \tag{47}$$
We see that there is complementarity, when the parameter is β, between the difference I − F and the fluctuation in the inverse temperature. The Poisson-associated Fisher measure can never equal the grand canonical Fisher measure; in fact, it is always smaller.

8. Conclusions

We have studied here interesting aspects of the relation between the following three concepts: 1) Fisher's information measure (FIM) for a given parameter θ, 2) the Poisson distribution, and 3) the grand canonical ensemble (GCE). Some of these aspects have been further specialized to the case of the ideal gas. As Fisher parameters we have used the inverse temperature β and the mean particle number ⟨N⟩. We call F the Poisson-associated Fisher measure and I its grand canonical counterpart.
  • Interestingly enough, we have discovered that F and I are directly related to one another.
  • This entails that the Poisson distribution is involved in the physics of the ideal gas.
  • Indeed, we have established the extant relationship between F and I for our two parameters.
  • We also determined the connections between our two Fisher measures and the energy variance for the parameter β.
  • We established the link between F and I when the Fisher parameter is the mean particle number. The Fisher measure turns out here to be the inverse of the particle number: the larger the mean particle number, the smaller the Fisher information.
  • We have scrutinized thermal uncertainty relations through the lens of the Fisher measure. We found as a consequence that, always, I ≥ F.
Fisher information is commonly used to quantify the uncertainty or variability in a statistical model. If it is connected to an energy variance, it could provide a new way to quantify uncertainty in physical systems. This may have applications in physics, engineering, and other scientific disciplines where understanding and managing uncertainty are crucial. The connection between Fisher information and energy variance could have implications for statistical mechanics and thermodynamics. Our finding could contribute to a deeper understanding of how information is encoded in physical systems and how it relates to the fundamental properties of those systems.
As for the FIM relation to the particle number one can make some comments. We saw that FIM is a measure of the precision or the amount of information contained in a statistical distribution. If it is inversely related to the mean particle number, it suggests that as the number of particles increases, the precision or uncertainty in the system decreases. This could have implications for understanding the behavior of systems with varying particle numbers. In some contexts, the number of particles in a system can be considered a resource. The inverse relation with Fisher information could imply a trade-off between the availability of this resource (particle number) and the system’s ability to encode information about its state.
It is worth insisting upon one of our present results: the equivalence between Fisher's information measure in the grand canonical (and canonical) ensemble and the energy fluctuations. As discussed in Section 5.2, this equivalence bridges statistical precision and the inherent thermodynamic variability of energy, with direct implications for experimental design and for our understanding of the interplay between information theory and thermodynamics.

References

  1. Frieden, B.R. Science from Fisher Information; Cambridge University Press: Cambridge, UK, 2004.
  2. Pathria, R.K. Statistical Mechanics, 2nd ed.; Butterworth-Heinemann: Oxford, UK.
  3. Sánchez-Moreno, P.; Yáñez, R.J.; Dehesa, J.S. Discrete Densities and Fisher Information. In Difference Equations and Applications: Proceedings of the 14th International Conference on Difference Equations and Applications; Uğur University Publishing Company: Istanbul, Turkey, 2009; pp. 291–298. ISBN 978-975-6437-80-3.
  4. Kuznietsov, V.A.; Savchuk, O.; Gorenstein, M.I.; Koch, V.; Vovchenko, V. Critical point particle number fluctuations from molecular dynamics. Physical Review C 2022, 105, 044903.
  5. Uffink, J.; van Lith, J. Thermodynamic Uncertainty Relations. Foundations of Physics 1999, 29.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.