Divergences have become a widely used tool for measuring the similarity (or dissimilarity) between probability distributions. Depending on the field of application, a more suitable measure may be required. In this paper we introduce a family of divergences, called γ-Divergences, based on the convexity property of the functions that generate them. We demonstrate that these divergences satisfy all the usually required properties, and we extend them to weighted probability distributions. In addition, we define a generalised entropy closely related to the γ-Divergences. Finally, we apply our findings to the analysis of simulated and real time series.
Subject: Physical Sciences - Acoustics