First-Order Dependence Trees with Cumulative Residual Entropy



Sutcu M., Abbas A. E.

Conference on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Amboise, France, 21-26 September 2014, vol.1641, pp.512-521

  • Publication Type: Conference Paper / Full Text
  • Volume: 1641
  • DOI: 10.1063/1.4906017
  • City: Amboise
  • Country: France
  • Page Numbers: pp. 512-521
  • Abdullah Gül University Affiliated: No

Abstract

This paper presents a method for approximating discrete joint probability distributions using first-order dependence trees and the recent concept of cumulative residual entropy. A first-order dependence tree is one in which each variable is conditioned on at most one other variable. The cumulative residual entropy measure is the entropy functional applied to the survival function instead of the probability measure. We formulate the cumulative residual Kullback-Leibler (KL) divergence and the cumulative residual mutual information measures in terms of the survival function. We then show that the optimal first-order dependence tree approximation of the joint distribution under the cumulative residual KL divergence is the one with the largest sum of pairwise cumulative residual mutual information terms. The results parallel Chow and Liu's approximation of joint probability distributions using the traditional Kullback-Leibler divergence and mutual information, but applied to survival functions. We compare the approximation results with those of Chow and Liu using the traditional entropy measure. Using a Monte Carlo simulation, we show that the two approximations perform almost equally well but are not identical.
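
For concreteness, the quantities the abstract refers to can be sketched as follows. This is a hedged reading: it uses the direct survival-function analogues of the standard definitions, starting from the discrete cumulative residual entropy of Rao et al. (2004); the paper's exact formulations may differ (some cumulative KL variants, for instance, include an extra correction term that keeps the divergence nonnegative).

```latex
% Survival function: \bar{F}(x) = P(X > x).
% Discrete cumulative residual entropy (Rao et al., 2004):
\mathcal{E}(X) = -\sum_{x} \bar{F}(x)\,\log \bar{F}(x)

% Direct survival-function analogues of the KL divergence and of
% mutual information (assumed forms, not necessarily the paper's):
\mathrm{CKL}(\bar{F}\,\|\,\bar{G}) = \sum_{x} \bar{F}(x)\,\log\frac{\bar{F}(x)}{\bar{G}(x)},
\qquad
\mathrm{CMI}(X;Y) = \sum_{x,y} \bar{F}(x,y)\,\log\frac{\bar{F}(x,y)}{\bar{F}_X(x)\,\bar{F}_Y(y)},
\quad \bar{F}(x,y) = P(X > x,\, Y > y)
```

Under that reading, the procedure parallels Chow and Liu's: estimate the pairwise cumulative residual mutual information from data and keep a maximum-weight spanning tree over the variables. The Python sketch below illustrates this with the assumed CMI form above; the helper names (cr_mutual_information, max_spanning_tree) are illustrative, not from the paper.

```python
import numpy as np
from itertools import combinations

def cr_mutual_information(x, y):
    """Cumulative residual mutual information between two discrete samples.

    Assumed survival-function analogue of mutual information (the paper's
    exact formulation may differ):
        sum_{a,b} Fbar(a,b) * log(Fbar(a,b) / (Fbar_X(a) * Fbar_Y(b)))
    where Fbar(a,b) = P(X > a, Y > b).
    """
    xs, ys = np.unique(x), np.unique(y)
    # Empirical joint and marginal survival functions on the value grids.
    Fxy = np.array([[np.mean((x > a) & (y > b)) for b in ys] for a in xs])
    Fx = np.array([np.mean(x > a) for a in xs])
    Fy = np.array([np.mean(y > b) for b in ys])
    denom = Fx[:, None] * Fy[None, :]
    mask = (Fxy > 0) & (denom > 0)
    ratio = np.ones_like(Fxy)                      # ratio = 1 off-mask -> log 0
    np.divide(Fxy, denom, out=ratio, where=mask)
    return float(np.sum(Fxy * np.log(ratio)))

def max_spanning_tree(weights, n):
    """Kruskal's algorithm on edge weights {(i, j): w}, maximizing weight."""
    parent = list(range(n))
    def find(i):                                   # union-find, path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    tree = []
    for (i, j), w in sorted(weights.items(), key=lambda kv: -kv[1]):
        ri, rj = find(i), find(j)
        if ri != rj:                               # edge joins two components
            parent[ri] = rj
            tree.append((i, j, w))
    return tree

# Toy usage: three discrete variables, X1 a noisy copy of X0.
rng = np.random.default_rng(0)
data = rng.integers(0, 4, size=(2000, 3))
data[:, 1] = np.minimum(data[:, 0] + rng.integers(0, 2, size=2000), 3)
weights = {(i, j): cr_mutual_information(data[:, i], data[:, j])
           for i, j in combinations(range(3), 2)}
print(max_spanning_tree(weights, 3))
```

On this toy data the strongly coupled pair (X0, X1) carries the largest weight, so the recovered tree conditions X1 on X0, which is the qualitative behavior the abstract describes.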