Preprint Article

On the Rényi Entropy Functional, Tsallis Distributions and Lévy Stable Distributions

Submitted: 27 January 2024; Posted: 29 January 2024 (this version is not peer-reviewed)
Abstract
The Rényi entropy functional is optimised under various sets of constraints, yielding the novel findings of this investigation. The findings suggest that the resulting generalised t-distributions cover the complete spectrum of Lévy stable distributions. Additionally, it is shown that the Lévy distribution generalises the Tsallisian distribution, rather than the reverse. The current study strongly generalises an existing research work in the literature.
Subject: Computer Science and Mathematics - Probability and Statistics

Introduction and literature review

The most justifiable version of information entropy is the Rényi entropy, with a free Rényi non-extensivity parameter $q$; the Tsallis entropy can be thought of as a linear approximation [1] to the Rényi entropy as $q \to 1$. When $q \to 1$, the Boltzmann-Shannon entropy functional replaces both other entropy functionals. When the Rényi entropy functional is subjected to the principle of maximum entropy (MEP), the result for an isolated system is the microcanonical (homogeneous) distribution. The Boltzmann entropy functional replaces the Rényi entropy functional in this situation, which supports the universality of Boltzmann's principle of statistical mechanics, regardless of the value of the Rényi parameter $q$.
The need for non-extensive statistics based on Tsallis' information entropy is critical, given its rapid development. The one-parameter family of Rényi entropies (or simply the Rényi entropy) seems to be the most rational one [2]. The well-known Boltzmann-Shannon entropy functional replaces the Rényi entropy functional when the Rényi parameter $q$ is equal to unity. The non-extensive Tsallis entropy functional is produced by linearizing the Rényi entropy functional in the vicinity [2] of the point $q = 1$.
When the principle of maximum entropy (MEP) is applied to the Rényi entropy functional of an isolated system, the Rényian functional reduces to the Boltzmannian functional, thus enforcing Boltzmann's principle, from which all thermodynamic properties of extensive and non-extensive Hamiltonian systems can be deduced.
The measure of information in such an incomplete statistical description of a system by a probability distribution is called the information entropy functional, or simply the entropy, of the distribution $p = \{p_i\}$, $0 \le p_i \le 1$, $i = 1, \ldots, n$.
The Boltzmann-Shannon representation of the entropy functional is the best known; it reads:
$$H_B = -K_B \sum_{i=1}^{n} p_i \ln p_i \qquad (1)$$
The entropy $H_B$ coincides with the thermodynamic entropy in the situation where the distribution $p_i$ describes the system's macroscopic equilibrium state and the subscript $i$ labels dynamic microstates in the Gibbs phase space.
This entropy functional was justified by Khinchin and Shannon on the basis [5] of a system of axioms presented in theorem form. Their axioms were analyzed in [2,5], where it was shown that the uniquely determined Boltzmann-Shannon entropic form rests on a rather artificial axiom concerning the form of the conditional entropy functional (that is, the entropy functional of a subsystem of a system in a prescribed state). Uffink [5] examined several papers on this topic and found that the Shore and Johnson [6] axiom system results in the Rényi entropy functional [7], which reads:
$$H_R^{(q)}(p) = \frac{K_B}{1-q} \ln \sum_{i=1}^{n} p_i^{q}, \qquad \sum_{i=1}^{n} p_i = 1, \qquad (2)$$
The Rényi entropy functional is a mathematical measure used to quantify the amount of information or disorder in a system; the parameter $q$ in (2) must be strictly positive. The properties and characteristics of the Rényi entropy functional are further explored in the related literature [7,8,9]. Among its basic properties, we may mention positivity ($H_R^{(q)} \ge 0$), concavity for $q \le 1$, and $\lim_{q \to 1} H_R^{(q)} = H_B$.
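As a quick illustration (a minimal numerical sketch, with $K_B = 1$ and an arbitrary four-state distribution chosen purely for this example), the limit $\lim_{q \to 1} H_R^{(q)} = H_B$ can be checked directly:

```python
# Rényi entropy vs. Boltzmann-Shannon entropy (K_B = 1 throughout)
import numpy as np

def shannon(p):
    """H_B = -sum_i p_i ln p_i, eq. (1)."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p))

def renyi(p, q):
    """H_R^(q) = ln(sum_i p_i^q) / (1 - q), eq. (2); q > 0, q != 1."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p**q)) / (1.0 - q)

p = np.array([0.5, 0.25, 0.125, 0.125])   # arbitrary normalized example
for q in (0.5, 0.9, 0.999, 1.001, 2.0):
    print(f"q = {q}: H_R = {renyi(p, q):.6f}")
print(f"H_B = {shannon(p):.6f}")          # H_R approaches H_B as q -> 1
```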
In the case $\left|1 - \sum_i p_i^q\right| \ll 1$ (which, in view of the normalization of the distribution $p_i$, corresponds to the condition $|1-q| \ll 1$), one can restrict oneself to the linear term of the expansion of the logarithm in this difference, $\ln \sum_i p_i^q = \ln\bigl(1 - (1 - \sum_i p_i^q)\bigr) \approx -\bigl(1 - \sum_i p_i^q\bigr)$, and $H_R^{(q)}(p)$ changes to
$$H_T^{(q)}(p) = \frac{K_B}{q-1}\left(1 - \sum_{i=1}^{n} p_i^{q}\right) \qquad (3)$$
Havrda and Charvát [10] and Daróczy [11] proposed such a linearization of the Rényi entropy functional long before Tsallis' entropy [12] came into existence.
The entropy functional ceases to be extensive as a result of the linearization of the logarithm. Tsallisian followers have extensively exploited this property to examine a range of non-extensive systems [12,13,14,15,16,17,18,19,20]. In doing so, the constraint $|1-q| \ll 1$ mentioned above is typically ignored.
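Continuing the numerical sketch above, the linearization can be checked directly: the Tsallis value tracks the Rényi value near $q = 1$ (both tend to $H_B$), while the two visibly diverge once $|1-q|$ is no longer small:

```python
# Tsallis entropy, eq. (3), as the linearization of Rényi entropy near q = 1
def tsallis(p, q):
    """H_T^(q) = (1 - sum_i p_i^q) / (q - 1), eq. (3); K_B = 1."""
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p**q)) / (q - 1.0)

for q in (0.5, 0.9, 0.99, 1.01, 1.1, 2.0):
    print(f"q = {q}: H_R = {renyi(p, q):.6f}, H_T = {tsallis(p, q):.6f}")
# The difference H_R - H_T is O(1 - q) and vanishes as q -> 1.
```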
According to the MEP, when describing a system statistically, its distribution function should accurately reproduce the average quantities observed in the system; beyond that, the distribution function should be as indeterminate as possible. This approach has been widely used in constructing equilibrium statistical thermodynamics for isolated or weakly interacting thermodynamic systems.
Following the research conducted by Jaynes [21], Gibbs ensembles have been widely used as a statistical approach to accurately represent the average quantities observed in a system, and are commonly employed in constructing equilibrium statistical thermodynamics for isolated or weakly interacting thermodynamic systems. The information entropy functional, commonly referred to as the Boltzmann-Shannon entropy functional, is traditionally used to quantify the "disorder" or uncertainty in a system.
This introduces a new physical interpretation: the Rényi entropy functional is more general than both the Shannon and Tsallis functionals, since it reduces to the Shannon case as the parameter $q \to 1$, and its linearization in the neighbourhood of the point $q = 1$ is the Tsallisian entropy functional. This leads to new ground in information theory, which we represent by the following diagram:
$$H_R^{(q)} \xrightarrow{\;q \to 1\;} H_B, \qquad H_R^{(q)} \xrightarrow{\;\text{linearization in the neighbourhood of } q = 1\;} H_T^{(q)}$$
In other words, employing the Rényi entropy functional to research any concept generates the Shannon case as the special case $q \to 1$, and reduces to the Tsallis case upon linearization of the Rényi entropy functional in the neighbourhood of the point $q = 1$.
Shannon entropy, sometimes referred to as Gibbs entropy in statistical physics, is a measure of "disorder" in a system. As an alternative to Gibbs entropy, Tsallis developed a non-extensive entropy [13,15], indexed by $q$, which results in an infinite family of Tsallis non-extensive entropies. While Tsallis non-extensive entropy produces "type II generalised Lévy stable distributions" with heavy tails that obey power laws, Gibbs entropy produces exponential distributions. It is important to remember that Tsallis entropy is equivalent to Havrda-Charvát's structural $q$-entropy [14], although the non-extensive mechanics community frequently ignores this relationship. Additionally, Tsallis distributions are derived from Lévy distributions, rather than the other way around.

Analysis

We discuss how four different types of generalised t-distributions can be derived from the Rényi entropy functional. These results are reached using the Lagrangian equation of the calculus of variations, a mathematical technique for optimising functionals subject to constraints. To the author's knowledge, the results presented here are new.

Result 1

The Rényi entropy functional (2) can be optimized under the normalization constraint
$$\sum_{i=1}^{n} p_i = 1 \qquad (4)$$
together with the constraint
$$E\left[1 + a_1 |x|^{b}\right] = \text{constant} \qquad (5)$$
The Lagrangian function $L$ is:
$$L = \frac{K_B}{1-q} \ln \sum_{i=1}^{n} p_i^{q} - \alpha_1\left(\sum_{i=1}^{n} p_i - 1\right) - \beta_1\left(\sum_{i=1}^{n} p_i\left(1 + a_1 |x_i|^{b}\right) - \text{constant}\right) \qquad (6)$$
$\partial L / \partial p_i = 0$ implies
$$\frac{q K_B}{1-q} \frac{p_i^{\,q-1}}{\sum_{i=1}^{n} p_i^{q}} - \alpha_1 - \beta_1\left(1 + a_1 |x_i|^{b}\right) = 0 \qquad (7)$$
i.e.,
$$\frac{q K_B}{1-q} \frac{p_i^{\,q-1}}{\sum_{i=1}^{n} p_i^{q}} = \alpha_1 + \beta_1 + \beta_1 a_1 |x_i|^{b} = (\alpha_1 + \beta_1)\left(1 + \frac{\beta_1 a_1}{\alpha_1 + \beta_1} |x_i|^{b}\right) \qquad (8)$$
Define $a = \dfrac{\beta_1 a_1}{\alpha_1 + \beta_1}$, $\;c^{\,q-1} = \dfrac{(\alpha_1 + \beta_1)(1-q)}{q K_B} \sum_{i=1}^{n} p_i^{q}$, $\;d = \dfrac{1}{1-q}$. $\qquad (9)$
Hence, it clearly follows that
$$p_i = c\left(1 + a |x_i|^{b}\right)^{-d} \qquad (10)$$
where $-\infty < x < \infty$, $a, b, c, d > 0$, $bd > 0$, $1 > q > 0$.
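As a sanity check (not part of the derivation above), the stationarity condition behind (10) can be probed numerically: on a finite grid, maximise (2) subject to (4) and (5) and verify that $p_i^{\,q-1}$ is an affine function of $|x_i|^{b}$, which is equivalent to the form (10). The grid and the values of $q$, $a_1$, $b$, and the constraint level below are illustrative choices.

```python
# Numerical sanity check of Result 1 on a finite grid
import numpy as np
from scipy.optimize import minimize

x = np.linspace(-5.0, 5.0, 81)
q, a1, b = 0.5, 1.0, 2.0
target = 2.0                              # assumed value for E[1 + a1*|x|^b]

def neg_renyi(p):                         # -H_R^(q), with K_B = 1
    return -np.log(np.sum(np.clip(p, 1e-12, None)**q)) / (1.0 - q)

cons = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},                              # (4)
    {"type": "eq", "fun": lambda p: np.sum(p * (1 + a1 * np.abs(x)**b)) - target}, # (5)
]
p0 = np.full(x.size, 1.0 / x.size)
res = minimize(neg_renyi, p0, method="SLSQP", constraints=cons,
               bounds=[(1e-9, 1.0)] * x.size, options={"maxiter": 500})

# (10) holds iff p^(q-1) is affine in |x|^b: fit and inspect the residual
u, v = np.abs(x)**b, res.x**(q - 1.0)
slope, intercept = np.polyfit(u, v, 1)
resid = v - (slope * u + intercept)
print("max relative residual:", np.max(np.abs(resid)) / np.max(np.abs(v)))
```

A small residual confirms that the numerically optimal $p_i$ matches the closed form (10), up to the rescaled multipliers in (9).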
In the following, we optimize the Rényi entropy functional (2) by replacing constraint (5) with an absolute-moment constraint, keeping constraint (4).

Result 2

We have two constraints: the normalization (4),
$$\sum_{i=1}^{n} p_i = 1,$$
subject to
$$\sum_{i=1}^{n} |x_i|^{b}\, p_i^{q} = \text{constant} \qquad (11)$$
The Lagrangian function $L$ is:
$$L = \frac{K_B}{1-q} \ln \sum_{i=1}^{n} p_i^{q} - \alpha_2\left(\sum_{i=1}^{n} p_i - 1\right) + \beta_2\left(\sum_{i=1}^{n} |x_i|^{b}\, p_i^{q} - \text{constant}\right) \qquad (12)$$
$\partial L / \partial p_i = 0$ implies
$$\frac{q K_B}{1-q} \frac{p_i^{\,q-1}}{\sum_{i=1}^{n} p_i^{q}} - \alpha_2 + q \beta_2 |x_i|^{b}\, p_i^{\,q-1} = 0 \qquad (13)$$
Hence
$$\frac{q K_B}{(1-q) \sum_{i=1}^{n} p_i^{q}} \left(1 + \frac{\beta_2 (1-q) \sum_{i=1}^{n} p_i^{q}}{K_B}\, |x_i|^{b}\right) p_i^{\,q-1} = \alpha_2 \qquad (14)$$
Thus, defining
$$a = \frac{\beta_2 (1-q)}{K_B} \sum_{i=1}^{n} p_i^{q}, \qquad c^{\,q-1} = \frac{\alpha_2 (1-q)}{q K_B} \sum_{i=1}^{n} p_i^{q}, \qquad d = \frac{1}{q-1},$$
it follows that
$$p_i = c\left(1 + a |x_i|^{b}\right)^{-d} \qquad (15)$$
where $-\infty < x < \infty$, $a, b, c, d > 0$, $bd > 0$, $\frac{2b+1}{b+1} > q > 1$. Because the parameters are partially related through (4) and (11), equation (15) generates the Tsallis distribution [13] for $b = 2$.
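As context for the $b = 2$ case (a standard observation, not part of the derivation above): with $d = 1/(q-1)$, (15) has the q-Gaussian form of the Tsallis distribution, and a rescaling identifies it with a Student-t density whenever $1 < q < 3$:

```latex
% For b = 2, eq. (15) reads p(x) = c (1 + a x^2)^{-1/(q-1)}.
% With \nu = 2d - 1 = (3-q)/(q-1) and \sigma^2 = 1/(a\nu):
p(x) = c\left(1 + a x^{2}\right)^{-\frac{1}{q-1}}
     \propto \left(1 + \frac{(x/\sigma)^{2}}{\nu}\right)^{-\frac{\nu+1}{2}},
\qquad \nu = \frac{3-q}{q-1}, \quad \sigma^{2} = \frac{1}{a\nu},
```

i.e., a Student-t distribution with $\nu$ degrees of freedom; the admissible range $\frac{2b+1}{b+1} = \frac{5}{3} > q > 1$ from this result sits inside $1 < q < 3$.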

Result 3

Following the same approach as above, subject to the two constraints: the normalization (4),
$$\sum_{i=1}^{n} p_i = 1,$$
and the prescribed absolute moment
$$\sum_{i=1}^{n} |x_i|^{b}\, p_i = \text{constant} \qquad (16)$$
the reader can check that, after a few algebraic steps, the solution has the closed-form representation
$$p_i = c\left(1 + a |x_i|^{b}\right)^{-d}, \qquad (17)$$
$$c^{\,q-1} = \frac{\alpha_3 (1-q)}{q K_B} \sum_{i=1}^{n} p_i^{q}, \qquad a = \frac{\beta_3}{\alpha_3}, \qquad d = \frac{1}{1-q}, \qquad (18)$$
where $-\infty < x < \infty$, $a, b, c, d > 0$, $bd > 0$, $1 > q > \frac{1}{b+1}$, and $\alpha_3, \beta_3$ are the Lagrangian multipliers. For $b = 2$, constraint (16) defines the variance.
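For completeness, here is a brief sketch of the omitted algebra (the same steps as in Result 1, with the definitions (18)):

```latex
\frac{\partial L}{\partial p_i}
  = \frac{q K_B}{1-q}\,\frac{p_i^{\,q-1}}{\sum_{i=1}^{n} p_i^{\,q}}
    - \alpha_3 - \beta_3 |x_i|^{b} = 0
\;\Longrightarrow\;
p_i^{\,q-1}
  = \frac{\alpha_3 (1-q) \sum_{i=1}^{n} p_i^{\,q}}{q K_B}
    \left(1 + \frac{\beta_3}{\alpha_3}\,|x_i|^{b}\right)
  = c^{\,q-1}\left(1 + a |x_i|^{b}\right),
```

so that $p_i = c\,(1 + a|x_i|^{b})^{1/(q-1)} = c\,(1 + a|x_i|^{b})^{-d}$ with $d = \frac{1}{1-q}$, which is (17).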

Result 4

Following the same approach as above, subject to the more complex set of constraints (4), (11), and (16):
$$\sum_{i=1}^{n} p_i = 1, \qquad \sum_{i=1}^{n} |x_i|^{b}\, p_i^{q} = \text{constant}, \qquad \sum_{i=1}^{n} |x_i|^{b}\, p_i = \text{constant}.$$
After a few algebraic steps, the reader can check that optimizing the Rényi entropy functional subject to constraints (4), (11), and (16) yields the closed-form solution
$$p_i = c\left(\frac{1 + a |x_i|^{b}}{1 + a' |x_i|^{b}}\right)^{-d} \qquad (19)$$
$$c^{\,q-1} = \frac{\alpha_4 (1-q)}{q K_B} \sum_{i=1}^{n} p_i^{q}, \qquad a = \frac{\gamma_4}{\alpha_4}, \qquad a' = \frac{\beta_4 (1-q)}{K_B} \sum_{i=1}^{n} p_i^{q}, \qquad d = \frac{1}{q-1},$$
where $-\infty < x < \infty$, $a, b, c, d > 0$, $bd > 0$, $\frac{2b+1}{b+1} > q > 1$, and $\alpha_4, \beta_4, \gamma_4$ are the Lagrangian multipliers. For $b = 2$, constraint (16) defines the variance. We can see that for $a' \to 0$ and $b = 2$, (19) reduces to the Tsallisian distribution [13] as a special case. By (19), we have for small values of $|x|$,
$$p_i = c\left(\frac{1 + a |x_i|^{b}}{1 + a' |x_i|^{b}}\right)^{-d} \sim c\left(1 + (a - a') |x_i|^{b}\right)^{-d} \qquad (20)$$
Carrying out the same analysis, we have for large values of | x | ,
$$p_i \sim c\left((a - a') |x_i|^{b}\right)^{-d} \qquad (21)$$
This is summarized in the more compact form:
$$p_i(x) \sim \begin{cases} c\left(1 + (a - a')|x|^{b}\right)^{-d} = p_1(x), & \text{for small } |x| \\[4pt] c\left((a - a')|x|^{b}\right)^{-d} = p_2(x), & \text{for large } |x| \end{cases} \qquad (22)$$
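The two regimes in (20)-(22) can be checked numerically (a small sketch with illustrative parameter values; $a > a'$ is assumed so that the approximants are positive):

```python
# Small-|x| and large-|x| behaviour of (19)-(22); unnormalized shapes
import numpy as np

a, a_prime, b, d, c = 2.0, 0.5, 2.0, 3.0, 1.0   # illustrative, a > a'

def p19(x):        # eq. (19), up to normalization
    xb = np.abs(x)**b
    return c * ((1 + a * xb) / (1 + a_prime * xb))**(-d)

def p1(x):         # eq. (20): small-|x| approximation
    return c * (1 + (a - a_prime) * np.abs(x)**b)**(-d)

def p2(x):         # eq. (21): power-law form of p1 at large |x|
    return c * ((a - a_prime) * np.abs(x)**b)**(-d)

for x in (0.01, 0.1):                 # p1 tracks (19) near the origin
    print(x, p19(x), p1(x))
for x in (10.0, 100.0):               # p2/p1 -> 1 as |x| grows
    print(x, p1(x), p2(x))
```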
The probability density functions (PDFs) (10), (15), (17), and (19) are generalized t-distributions, which behave polynomially for small values of $|x|$ and exhibit power-law tails for large values of $|x|$. These distributions encompass the entire range of Lévy stable distributions, which are commonly used to model extreme events and heavy-tailed phenomena. Specifically, for the PDF (10) we have
$$p(x) \sim \begin{cases} c\left(1 - a d |x|^{b}\right) = p_1(x), & \text{for small } |x| \\[4pt] c\left(a |x|^{b}\right)^{-d} = p_2(x), & \text{for large } |x| \end{cases} \qquad (23)$$
where $bd > 1$.
The outcomes for (15), (17), and (19) are comparable. Since the Lévy stable distributions can be represented by the generalized t-distributions (10), (15), (17), and (19), the class of density functions obtainable from the Rényi entropy functional is even broader. As the analysis reported in (23) shows, the same tail behaviour remains valid for (19).
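To illustrate the tail claim numerically (a rough sketch with illustrative parameters, using SciPy's `levy_stable`): a symmetric Lévy stable density with index $\alpha \in (0, 2)$ decays like $|x|^{-(\alpha+1)}$, so matching the generalized-t tail exponent requires $bd = \alpha + 1$, i.e. $1 < bd < 3$:

```python
# Tail-exponent comparison: generalized t vs. symmetric Lévy stable
import numpy as np
from scipy.stats import levy_stable

alpha = 1.5                        # stable index, 0 < alpha < 2
b, d = 2.0, (alpha + 1.0) / 2.0    # chosen so that b*d = alpha + 1
x = np.array([20.0, 40.0, 80.0])

stable_pdf = levy_stable.pdf(x, alpha, 0.0)   # beta = 0: symmetric
gen_t_pdf = (1.0 + np.abs(x)**b)**(-d)        # unnormalized tail shape

# Log-log slopes; both should approach -(alpha + 1) = -2.5
print(np.diff(np.log(stable_pdf)) / np.diff(np.log(x)))
print(np.diff(np.log(gen_t_pdf)) / np.diff(np.log(x)))
```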

Conclusion

Optimising the Rényi entropy functional under distinct sets of constraints, we revealed fresh results in this study. The outcomes produce generalised t-distributions covering the whole family of Lévy stable distributions. It is found that the Lévy distribution generalizes the Tsallisian case, not the other way around. The work of [22], where the Shannon and Havrda-Charvát entropy functionals were utilised, is strongly generalised here.

References

  1. I. A. Mageed, "The Consistency Axioms of The Stable M/G/1 Queue’s Z(a,b) Non-Extensive Maximum Entropy Formalism with M/G/1 Theory Applications to 6G Networks and Multimedia Applications," 2023 International Conference on Computer and Applications (ICCA), Cairo, Egypt, 2023, pp. 1-6. [CrossRef]
  2. I. A. Mageed, et al, "M/G/1 queue with Balking Shannonian Maximum Entropy Closed Form Expression with Some Potential Queueing Applications to Energy," 2022 Global Energy Conference (GEC), Batman, Turkey, 2022, pp. 105-110. [CrossRef]
  3. I. A. Mageed and Q. Zhang, "An Information Theoretic Unified Global Theory For a Stable M/G/1 Queue With Potential Maximum Entropy Applications to Energy Works," 2022 Global Energy Conference (GEC), Batman, Turkey, 2022, pp. 300-305. [CrossRef]
  4. I. A. Mageed and Q. Zhang, "Inductive Inferences of Z-Entropy Formalism (ZEF) Stable M/G/1 Queue with Heavy Tails," 2022 27th International Conference on Automation and Computing (ICAC), Bristol, United Kingdom, 2022, pp. 1-6. [CrossRef]
  5. I.A. Mageed, Q. Zhang, “Inductive Inferences of Z-Entropy Formalism (ZEF) Stable M/G/1 Queue with Heavy Tails,”In 2022 27th IEEE International Conference on Automation and Computing (ICAC), 2022,pp. 1-6.
  6. I.A. Mageed, Q.Zhang, “Threshold Theorems for the Tsallisian and Rényian (TR) Cumulative Distribution Functions (CDFs) of the Heavy-Tailed Stable M/G/1 Queue with Tsallisian and Rényian Entropic Applications to Satellite Images (SIs), “ electronic Journal of Computer Science and Information Technology. 2023, vol 9, no 1, pp. 41-7.
  7. I. A. Mageed, “ Where the mighty trio meet: Information Theory (IT), Pathway Model Theory (PMT) and Queueing Theory (QT), ” In 39th Annual UK Performance Engineering Workshop, 2023, p. 8.
  8. I.A. Mageed, “The Entropian Threshold Theorems for the Steady State Probabilities of the Stable M/G/1 Queue with Heavy Tails with Applications of Probability Density Functions to 6G Networks, ” electronic Journal of Computer Science and Information Technology, 2023, vol 9, no 1, p. 24-30.
  9. I. A. Mageed, D. Kouvatsos, "Non-Extensive Maximum Entropy Formalisms and Inductive Inferences of Stable M/G/1 Queue with Heavy Tails," Advanced Trends in Queueing Theory 2 (2021).
  10. I. A. Mageed, “The Consistency Axioms of The Stable M/G/1 Queue’s Z(a,b) Non-Extensive Maximum Entropy Formalism with M/G/1 Theory Applications to 6G Networks and Multimedia Applications,” In 2023 IEEE International Conference on Computer and Applications (ICCA), pp. 1-6.
  11. I. A. Mageed, et al, “Towards a Revolutionary Info-Geometric Control Theory with Potential Applications of Fokker Planck Kolmogorov (FPK) Equation to System Control, Modelling and Simulation," 2023 28th International Conference on Automation and Computing (ICAC), 2023.
  12. I. A. Mageed, "A Unified Information Data Length (IDL) Theoretic Approach to Information-Theoretic Pathway Model Queueing Theory (QT) with Rényi entropic applications to Fuzzy Logic." In 2023 IEEE International Conference on Computer and Applications (ICCA), 2023, pp. 1-6.
  13. I. A. Mageed, “Rényi’s Maximum Entropy Formalism of Heavy-Tailed Queues with Hurst Exponent Heuristic Mean Queue Length Combined With Potential Applications of Hurst Exponent to Engineering" In 39th Annual UK Performance Engineering Workshop,2023, p. 21.
  14. I. A. Mageed, et al, “ The Linearity Theorem of Rényian and Tsallisian Maximum Entropy Solutions of The Heavy-Tailed Stable M/G/1 Queueing System entailed with Potential Queueing-Theoretic Applications to Cloud Computing and IoT, ” electronic Journal of Computer Science and Information Technology, 9(1), 15-23.
  15. I. A. Mageed, “Consistency Axioms of Choice for Ismail’s Entropy Formalism (IEF) Combined with Information-Theoretic (IT) Applications to advance 6G Networks,” European Journal of Technique (EJT), 2023, vol. 13, no. 2, pp. 207-213.
  16. I.A.Mageed, “ Info- Geometric Analysis of the Stable G/G/1 Queue Manifold Dynamics With G/G/1 Queue Applications to E-health, “ Preprints 2024, 2024011813. [CrossRef]
  17. I.A.Mageed, “ A Theory of Everything: When Information Geometry Meets the Generalized Brownian Motion and the Einsteinian Relativity, “ Preprints 2024, 2024011827. [CrossRef]
  18. I.A.Mageed, “Effect of the root parameter on the stability of the Non-stationary D/M/1 queue’s GI/M/1 model with PSFFA applications to the Internet of Things (IoT), “Preprints 2024, 2024011835. [CrossRef]
  19. I.A.Mageed, et al, Z(a,b) of the Stable Five-Dimensional M/G/1 Queue Manifold Formalism's Info-Geometric Structure with Potential Info-Geometric Applications to Human Computer Collaborations and Digital Twins." 2023 28th International Conference on Automation and Computing (ICAC). IEEE, 2023.
  20. I.A.Mageed, A.Becheroul, "The Threshold Theorems of Generalized Z-entropy (GZE) fractal dimension (FD) Combined with Influential Applications of FD to Biotechnology Engineering." 39th Annual UK Performance Engineering Workshop. 2023.
  21. I.A. Mageed, A. H. Bhat, “GENERALIZED Z-ENTROPY (GZE) AND FRACTAL DIMENSIONS”, Fractal Analysis - Applications and Updates [Working Title]. IntechOpen, Jul. 14, 2023. [CrossRef]
  22. P. N. Rathie, S. da Silva, “Shannon, Lévy, and Tsallis: A Note,” Applied Mathematical Sciences, Vol. 2, 2008, no. 28, pp. 1359-1363.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.