Maximum Independence and Mutual Information
MEO, Rosa
2002-01-01
Abstract
If I1, I2, ..., Ik are random Boolean variables and the joint probabilities up to the (k-1)th order are known, the values of the kth-order probabilities maximizing the overall entropy have been defined as the maximum independence estimate. In this article, some contributions deriving from the definition of the maximum independence probabilities are proposed. First, it is shown that the maximum independence values are reached when the product of the probabilities of the minterms i1* i2* ... ik* containing an even number of complemented variables is equal to the product of the probabilities of the other minterms. Second, a new definition of group mutual information, as the difference between the maximum independence entropy and the real entropy, is proposed and discussed. Finally, the new concept of mutual information is applied to the determination of dependencies in data mining problems.
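The following minimal Python sketch illustrates the three ideas of the abstract for k = 3 Boolean variables. The function names and the use of iterative proportional fitting are illustrative assumptions, not taken from the paper; IPF is simply one standard way of reaching the maximum-entropy joint that matches the given (k-1)th-order marginals, at which point the even/odd minterm-product condition holds and the group mutual information can be read off as an entropy difference.

import numpy as np

def max_independence_joint(p, tol=1e-12, max_iter=10000):
    # Iterative proportional fitting (a standard technique; the paper may
    # derive the solution differently): find the maximum-entropy joint over
    # k Boolean variables whose (k-1)th-order marginals match those of p.
    k = p.ndim
    q = np.full_like(p, 1.0 / p.size)     # start from the uniform joint
    for _ in range(max_iter):
        q_prev = q.copy()
        for axis in range(k):             # match each leave-one-out marginal
            target = p.sum(axis=axis, keepdims=True)
            current = q.sum(axis=axis, keepdims=True)
            q = q * target / np.where(current > 0, current, 1.0)
        if np.max(np.abs(q - q_prev)) < tol:
            break
    return q

def entropy(p):
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

# A joint over (I1, I2, I3) with a pure three-way (XOR-like) dependency:
p = np.array([0.20, 0.05, 0.05, 0.20,
              0.05, 0.20, 0.20, 0.05]).reshape(2, 2, 2)
q = max_independence_joint(p)

# Even/odd minterm-product condition: a 0 in position j of a minterm index
# means variable Ij is complemented, so the complemented count is 3 - sum(i).
even = np.prod([q[i] for i in np.ndindex(2, 2, 2) if (3 - sum(i)) % 2 == 0])
odd  = np.prod([q[i] for i in np.ndindex(2, 2, 2) if (3 - sum(i)) % 2 == 1])
print(np.isclose(even, odd))          # True at the maximum independence point

# Group mutual information: maximum independence entropy minus real entropy.
print(entropy(q) - entropy(p))        # ~0.278 bits of three-way dependency

On this XOR-like example the pairwise marginals are all uniform and so carry no information about the dependency; the roughly 0.28 bits of group mutual information come entirely from the genuine third-order interaction, which is exactly the kind of hidden dependency the data mining application is meant to surface.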