Note: Includes bibliographical references (pages 199-206) and index.
Contents: Information Theory and the Central Limit Theorem; Preface; Contents; 1. Introduction to Information Theory; 2. Convergence in Relative Entropy; 3. Non-Identical Variables and Random Vectors; 4. Dependent Random Variables; 5. Convergence to Stable Laws; 6. Convergence on Compact Groups; 7. Convergence to the Poisson Distribution; 8. Free Random Variables; Appendix A: Calculating Entropies; Appendix B: Poincaré Inequalities; Appendix C: de Bruijn Identity; Appendix D: Entropy Power Inequality; Appendix E: Relationships Between Different Forms of Convergence; Bibliography; Index.
Summary: This book provides a comprehensive description of a new method of proving the central limit theorem, using apparently unrelated results from information theory. It gives a basic introduction to the concepts of entropy and Fisher information, and collects standard results concerning their behaviour. It brings together results from a number of research papers, as well as unpublished material, showing how these techniques give a unified view of limit theorems.
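The phrase "convergence in relative entropy" can be made precise. As a hedged sketch (the notation below is standard but assumed, not taken from this record): for i.i.d. random variables $X_i$ with mean $0$, variance $1$ and density, write $f_n$ for the density of the standardized sum $S_n = (X_1 + \dots + X_n)/\sqrt{n}$, and $\phi$ for the standard normal density. The information-theoretic central limit theorem then asserts

$$
D(f_n \,\|\, \phi) = \int f_n(x) \log \frac{f_n(x)}{\phi(x)}\, dx \;\longrightarrow\; 0 \quad \text{as } n \to \infty,
$$

which is stronger than convergence in distribution, since by Pinsker's inequality $\|f_n - \phi\|_{L^1}^2 \le 2\, D(f_n \,\|\, \phi)$. The companion quantity in such arguments is the Fisher information $I(f) = \int \bigl(f'(x)/f(x)\bigr)^2 f(x)\, dx$, whose convergence to that of the Gaussian is typically established first.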