First-Order Dependence Trees with Cumulative Residual Entropy



Sutcu M., Abbas A. E.

Conference on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Amboise, France, 21 - 26 September 2014, vol. 1641, pp. 512-521

  • Publication Type: Conference Paper / Full-Text Paper
  • Volume: 1641
  • DOI: 10.1063/1.4906017
  • City of Publication: Amboise
  • Country of Publication: France
  • Page Numbers: pp. 512-521
  • Affiliated with Abdullah Gül University: No

Abstract

This paper presents a method for approximating discrete joint probability distributions using first-order dependence trees and the recent concept of cumulative residual entropy. A first-order dependence tree is one in which each variable is conditioned on at most one other variable. The cumulative residual entropy is the entropy functional applied to the survival function rather than to the probability measure. We formulate the cumulative residual Kullback-Leibler (KL) divergence and the cumulative residual mutual information measures in terms of the survival function. We then show that the optimal first-order dependence tree approximation of the joint distribution under the cumulative residual KL-divergence is the one with the largest sum of pairwise cumulative residual mutual information. The results parallel Chow and Liu's approximation of joint probability distributions using the traditional Kullback-Leibler divergence and mutual information, but applied to survival functions. We compare the approximation results with those of Chow and Liu using the traditional entropy measure. Using a Monte Carlo simulation, we show that the two approximations perform almost equally well, but they are not the same.
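The abstract describes the construction but gives no formulas. As context (not taken verbatim from the paper), the cumulative residual entropy of Rao et al. replaces the probability mass function in the Shannon entropy functional with the survival function; for a discrete non-negative variable X a natural form is

    \mathcal{E}(X) = -\sum_{x} \bar{F}(x) \log \bar{F}(x), \qquad \bar{F}(x) = P(X > x),

and, reading the abstract's definitions the same way, the cumulative residual mutual information between X and Y would be the cumulative residual KL-divergence from the joint survival function to the product of the marginal survival functions,

    I_{CR}(X;Y) = \sum_{x,y} \bar{F}(x,y) \log \frac{\bar{F}(x,y)}{\bar{F}_X(x)\,\bar{F}_Y(y)}.

Under those assumed forms, the Chow-Liu-style procedure the abstract states (choose the first-order dependence tree that maximizes the sum of pairwise cumulative residual mutual information) can be sketched in Python as below. This is a minimal illustration under the assumptions above, not the paper's code; all function names are invented for the example.

    import itertools
    import numpy as np

    def joint_survival(p):
        """Survival function Fbar(x, y) = P(X > x, Y > y) of a 2-D pmf."""
        m, n = p.shape
        F = np.zeros((m, n))
        for x in range(m):
            for y in range(n):
                F[x, y] = p[x + 1:, y + 1:].sum()
        return F

    def cr_mutual_information(p):
        """Cumulative residual mutual information of a 2-D pmf, using the
        assumed survival-function analog of mutual information."""
        Fxy = joint_survival(p)
        px, py = p.sum(axis=1), p.sum(axis=0)
        Fx = np.array([px[x + 1:].sum() for x in range(px.size)])
        Fy = np.array([py[y + 1:].sum() for y in range(py.size)])
        prod = np.outer(Fx, Fy)
        mask = Fxy > 0  # convention: 0 log 0 = 0; prod > 0 wherever Fxy > 0
        return float(np.sum(Fxy[mask] * np.log(Fxy[mask] / prod[mask])))

    def optimal_dependence_tree(P):
        """Chow-Liu-style tree: maximum-weight spanning tree over pairwise
        cumulative residual mutual information; P is a d-dimensional pmf."""
        d = P.ndim
        W = np.zeros((d, d))
        for i, j in itertools.combinations(range(d), 2):
            other = tuple(k for k in range(d) if k not in (i, j))
            W[i, j] = W[j, i] = cr_mutual_information(P.sum(axis=other))
        in_tree, edges = {0}, []  # Prim's algorithm on the complete graph
        while len(in_tree) < d:
            _, u, v = max((W[u, v], u, v)
                          for u in in_tree for v in range(d) if v not in in_tree)
            edges.append((u, v))
            in_tree.add(v)
        return edges

    # Example: a random joint pmf over three binary variables.
    rng = np.random.default_rng(0)
    P = rng.random((2, 2, 2))
    P /= P.sum()
    print(optimal_dependence_tree(P))  # tree edges, rooted at variable 0

Replacing cr_mutual_information with the classical Shannon mutual information in the same procedure recovers the Chow-Liu tree that the abstract compares against.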