We define the degree of dependence of two events A and B in a probability space by using the Boltzmann-Shannon entropy function of an appropriate probability distribution produced by these events and depending on one parameter (the probability of the intersection of A and B) varying within a closed interval I. The entropy function attains its global maximum when the events A and B are independent. The important particular case of a discrete uniform probability space motivates this definition in the following way. The entropy function has a minimum at the left endpoint of I exactly when one of the events and the complement of the other are related by inclusion (maximal negative dependence). It has a minimum at the right endpoint of I exactly when one of these events is included in the other (maximal positive dependence). Moreover, the deviation of the entropy from its maximum equals the average information that one of the binary trials defined by A and B carries about the other. As a consequence, the degree of dependence of A and B can be expressed in terms of information theory and is invariant with respect to the choice of the unit of information. Using this formalism, we give a complete description of screening tests and their reliability, measure the efficacy of a vaccination, quantify the impact of certain financial-market events on other events, etc.
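A minimal Python sketch of the construction described above, under the assumption (suggested by the abstract but not spelled out here) that the "appropriate probability distribution" is the four-point joint law (p, P(A)-p, P(B)-p, 1-P(A)-P(B)+p) of the two binary trials, parametrized by p = P(A∩B) ranging over I. The function and variable names are illustrative, not the authors' notation. The deviation of the entropy from its maximum at p = P(A)P(B) is exactly the mutual information of the two trials.

```python
import numpy as np

def joint_entropy(pA, pB, p):
    """Shannon entropy (in nats) of the joint distribution
    (p, pA - p, pB - p, 1 - pA - pB + p) induced by events A and B,
    where p = P(A ∩ B)."""
    probs = np.array([p, pA - p, pB - p, 1.0 - pA - pB + p])
    probs = probs[probs > 0]                 # convention: 0 * log 0 = 0
    return -np.sum(probs * np.log(probs))

def entropy_deviation(pA, pB, p):
    """Deviation of the entropy from its maximum, attained at independence
    (p = pA * pB); this equals the mutual information of the binary trials."""
    return joint_entropy(pA, pB, pA * pB) - joint_entropy(pA, pB, p)

# Illustrative values: P(A) = 0.5, P(B) = 0.4, so I = [0, 0.4].
pA, pB = 0.5, 0.4
lo, hi = max(0.0, pA + pB - 1.0), min(pA, pB)
for p in (lo, pA * pB, hi):
    print(f"p = {p:.2f}: H = {joint_entropy(pA, pB, p):.4f}, "
          f"deviation = {entropy_deviation(pA, pB, p):.4f}")
```

At the endpoints of I the deviation is largest (maximal negative or positive dependence), and at p = P(A)P(B) it vanishes, reflecting the independence of A and B.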
Keywords:
Subject: Computer Science and Mathematics - Algebra and Number Theory
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.