Note: Includes bibliographical references.
Contents: Preliminaries -- Large deviations, hypothesis testing -- Large deviations via types -- Hypothesis testing -- I-projections -- f-Divergence and contingency tables -- Iterative algorithms -- Iterative scaling -- Alternating divergence minimization -- The EM algorithm -- Universal coding -- Redundancy -- Universal codes for certain classes of processes -- Redundancy bounds -- I-radius and channel capacity -- Optimality results -- Redundancy and the MDL principle -- Codes with sublinear redundancy growth -- The minimum description length principle -- Summary of process concepts -- References.
Abstract: Information Theory and Statistics: A Tutorial is concerned with applications of information theory concepts in statistics, in the finite alphabet setting. The topics covered include large deviations, hypothesis testing, maximum likelihood estimation in exponential families, analysis of contingency tables, and iterative algorithms with an "information geometry" background. An introduction is also provided to the theory of universal coding, and to statistical inference via the minimum description length principle motivated by that theory. The tutorial does not assume the reader has an in-depth knowledge of information theory or statistics.