Statistical entropy estimation

Statistical estimation of information and statistical distances

SMU has been conducting research into entropy estimation for a number of years. The entropy of a random variable is the expected information gained by observing a realisation of that random variable. To estimate the entropy of a random vector, the naive approach is to partition the sample space into a finite number of cells, estimate the density in each cell by the proportion of sample points falling into that cell, and then estimate the entropy as that of the associated empirical density function. For non-uniform distributions, fixed partitions lead to low occupancy numbers in sparse regions and poor resolution in dense regions. In a seminal paper, Kozachenko and Leonenko (1989) proposed an alternative approach to entropy estimation, based on the expected distance between a point and its nearest neighbour in the sample. Nearest-neighbour relations define an adaptive partition of the sample space, which allows control over the number of points in each spatial cell, and hence over the computation time of the algorithm.
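To make the nearest-neighbour idea concrete, the sketch below gives a minimal Python implementation of the standard Kozachenko-Leonenko form of the estimator, using NumPy and SciPy; the function name kl_entropy and the use of a k-d tree for the neighbour search are illustrative choices of ours, not taken from the paper. Writing eps_i for the distance from the i-th of N sample points to its nearest neighbour in d dimensions, the estimate is psi(N) - psi(1) + log(V_d) + (d/N) * sum_i log(eps_i), where V_d is the volume of the d-dimensional unit ball and psi is the digamma function.

import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(samples):
    """Nearest-neighbour (Kozachenko-Leonenko) entropy estimate, in nats.

    samples: array of shape (n, d), holding n independent realisations of a
    d-dimensional random vector. Duplicate points would give a zero nearest-
    neighbour distance and break the logarithm, so they are assumed absent.
    """
    x = np.asarray(samples, dtype=float)
    n, d = x.shape
    # Distance from each point to its nearest neighbour in the sample
    # (k=2 because the closest point returned by the tree is the point itself).
    tree = cKDTree(x)
    eps = tree.query(x, k=2)[0][:, 1]
    # Log-volume of the d-dimensional unit ball: pi^(d/2) / Gamma(d/2 + 1).
    log_vd = 0.5 * d * np.log(np.pi) - gammaln(0.5 * d + 1.0)
    return digamma(n) - digamma(1) + log_vd + (d / n) * np.sum(np.log(eps))

For example, on a few thousand draws from a two-dimensional standard normal distribution the estimate should be close to the true entropy log(2*pi*e), roughly 2.84 nats.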

Relative entropy and mutual information extend the notion of entropy to two or more random variables. Evans (2008a) showed that estimators based only on nearest-neighbour relations in the marginal space can be computationally expensive, and showed how computational efficiency can be maintained by considering nearest-neighbour relations in the joint probability space. Leonenko et al. (2008) presented a more general class of estimators for Rényi entropy and divergence, and showed that these estimators satisfy a Strong Law of Large Numbers. Evans (2008b) showed that such a law also holds for a broad class of nearest-neighbour statistics. Whether such statistics also satisfy a Central Limit Theorem is currently a subject of active research at Cardiff.
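As an illustration of working in the joint space, the sketch below implements the widely used Kraskov-Stögbauer-Grassberger (KSG) estimator of mutual information in Python; it is offered only as a representative of this family of nearest-neighbour estimators and is not the specific construction analysed in Evans (2008a). For each sample point, the distance to its k-th nearest neighbour is found in the joint (X, Y) space under the max-norm, and the number of points falling strictly inside that distance is then counted in each marginal space.

import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mutual_information(x, y, k=3):
    """KSG estimate of the mutual information I(X; Y), in nats.

    x, y: arrays of shape (n, d_x) and (n, d_y) containing paired samples.
    k: number of neighbours used in the joint space (a small integer).
    """
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n = len(x)
    joint = np.hstack([x, y])
    # Max-norm distance to the k-th nearest neighbour in the joint space
    # (k + 1 neighbours are requested because the first is the point itself).
    eps = cKDTree(joint).query(joint, k=k + 1, p=np.inf)[0][:, k]
    # Shrink the radii slightly so the marginal counts are strictly inside eps.
    eps = np.nextafter(eps, 0)
    x_tree, y_tree = cKDTree(x), cKDTree(y)
    nx = np.array([len(x_tree.query_ball_point(pt, r, p=np.inf)) - 1
                   for pt, r in zip(x, eps)])
    ny = np.array([len(y_tree.query_ball_point(pt, r, p=np.inf)) - 1
                   for pt, r in zip(y, eps)])
    return digamma(k) + digamma(n) - float(np.mean(digamma(nx + 1) + digamma(ny + 1)))

For jointly Gaussian samples with correlation rho, the estimate should approach the true value -0.5 * log(1 - rho^2).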

Estimation of entropy and divergence (Shannon entropy and Kullback-Leibler divergence) is a central problem in image processing, with many applications in image compression, segmentation, calibration and registration. Mutual information, which is closely related to Shannon entropy and Kullback-Leibler divergence, is a widely used measure of similarity between images. We study the statistical inference theory for entropies, epsilon-entropies and statistical distances.
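In the image setting, the simplest version of this idea is the plug-in estimate described above: the joint intensity histogram of two equally sized greyscale images serves as an empirical joint distribution, from which I(X; Y) = H(X) + H(Y) - H(X, Y) is computed. The sketch below (in Python with NumPy; the bin count of 64 is an arbitrary illustrative choice) shows this calculation; in registration, such a score would be maximised over candidate spatial transformations of one image relative to the other.

import numpy as np

def histogram_mutual_information(img_a, img_b, bins=64):
    """Plug-in mutual information (in nats) between two equally sized greyscale images."""
    a = np.ravel(img_a)
    b = np.ravel(img_b)
    # Joint intensity histogram, normalised to an empirical joint distribution.
    counts, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = counts / counts.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    # I(X; Y) = sum over cells of pxy * log(pxy / (px * py)),
    # taken over cells with positive mass.
    outer = np.outer(px, py)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / outer[nz])))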

Cardiff Investigators

Collaborators

Dr. Luc Pronzato, Dr. Oleg Seleznev, Dr. Denis Denisov.

Selected publications