Preprint Review Version 1 Preserved in Portico This version is not peer-reviewed

Applications of Entropy in Data Analysis and Machine Learning: A Review

Version 1 : Received: 2 October 2024 / Approved: 2 October 2024 / Online: 3 October 2024 (08:19:20 CEST)

How to cite: Sepúlveda-Fontaine, S. A.; Amigó, J. M. Applications of Entropy in Data Analysis and Machine Learning: A Review. Preprints 2024, 2024100213. https://doi.org/10.20944/preprints202410.0213.v1

Abstract

Since its origin in the thermodynamics of the 19th century, the concept of entropy has also permeated other fields of physics and mathematics, such as Classical and Quantum Statistical Mechanics, Information Theory, Probability Theory, Ergodic Theory and the Theory of Dynamical Systems. Specifically, we are referring to the classical entropies: the Boltzmann-Gibbs, von Neumann, Shannon, Kolmogorov-Sinai and topological entropies. In addition to their common name, which is historically justified (as we briefly describe in this review), another commonality of the classical entropies is the important role that they have played, and are still playing, in the theory and applications of their respective fields and beyond. It is therefore not surprising that, in the course of time, many other instances of the overarching concept of entropy have been proposed, most of them tailored to specific purposes. Following current usage, we will refer to all of them, whether classical or new, simply as entropies. The subject of this review is precisely their applications in data analysis and machine learning. The reason for these particular applications is that entropies are very well suited to characterizing probability mass distributions, typically generated by finite-state processes or symbolized signals. Therefore, we will focus on entropies defined as positive functionals on probability mass distributions and provide an axiomatic characterization that goes back to Shannon and Khinchin. Given the plethora of entropies in the literature, we have selected a representative group, including the classical ones. The applications summarized in this review illustrate the power and versatility of entropy in data analysis and machine learning.
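To make the notion of "entropy as a positive functional on probability mass distributions" concrete, the following is a minimal sketch of the Shannon entropy, the prototypical such functional. The function name and validation tolerance are our own illustrative choices, not part of the review itself.

```python
import math

def shannon_entropy(pmf, base=2):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i) of a probability
    mass distribution, with the usual convention 0 * log 0 = 0.

    `pmf` is a sequence of non-negative weights summing to 1; `base=2`
    gives the entropy in bits.
    """
    if any(p < 0 for p in pmf) or abs(sum(pmf) - 1.0) > 1e-9:
        raise ValueError("pmf must be non-negative and sum to 1")
    # Terms with p = 0 are skipped, implementing the 0 * log 0 = 0 convention.
    return -sum(p * math.log(p, base) for p in pmf if p > 0)

# The uniform distribution on 4 outcomes maximizes the entropy: log2(4) bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
# A degenerate (certain) outcome carries no information: entropy 0.
print(shannon_entropy([1.0, 0.0, 0.0]))
```

Non-negativity and maximality on the uniform distribution, exercised above, are two of the Shannon-Khinchin axioms mentioned in the abstract.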

Keywords

entropy; entropic measures; data analysis; machine learning; deep learning

Subject

Computer Science and Mathematics, Applied Mathematics


