New Developments in Statistical Information Theory Based on Entropy and Divergence Measures

Author : Leandro Pardo
Publisher : MDPI
Total Pages : 344
Release : 2019-05-20
ISBN-10 : 3038979368
ISBN-13 : 9783038979364
Rating : 4/5 (368 Downloads)

Book Synopsis New Developments in Statistical Information Theory Based on Entropy and Divergence Measures by : Leandro Pardo

Download or read book New Developments in Statistical Information Theory Based on Entropy and Divergence Measures, written by Leandro Pardo and published by MDPI. This book was released on 2019-05-20 with a total of 344 pages. Available in PDF, EPUB and Kindle. Book excerpt: This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from a theoretical and applied point of view, for different statistical problems with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald's statistics, likelihood ratio statistics and Rao's score statistics, share several optimum asymptotic properties but are highly non-robust in cases of model misspecification or in the presence of outlying observations. It is well known that a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. Specifically, this book presents a robust version of the classical Wald statistical test, for testing simple and composite null hypotheses for general parametric models, based on minimum divergence estimators.
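
As a purely illustrative sketch (not taken from the book), the following Python code shows the kind of minimum divergence estimator the synopsis refers to: the minimum density power divergence estimator of Basu et al. (1998) for a normal model fitted to a contaminated sample. The function names, the tuning parameter alpha = 0.5, and the data are assumptions made for this example; they illustrate the robustness-to-outliers point, not the book's own constructions.

# Minimal sketch: minimum density power divergence estimation for N(mu, sigma^2).
# All names and the value alpha = 0.5 are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(params, x, alpha):
    # Empirical density power divergence objective:
    #   H_n(theta) = integral of f_theta^(1+alpha) - (1 + 1/alpha) * mean(f_theta(x_i)^alpha)
    mu, log_sigma = params
    sigma = np.exp(log_sigma)                     # keep sigma positive
    f = norm.pdf(x, mu, sigma)
    # Closed form of the integral for the normal density.
    integral = (2 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    return integral - (1 + 1 / alpha) * np.mean(f ** alpha)

def mdpde(x, alpha=0.5):
    # Minimum density power divergence estimate of (mu, sigma).
    start = np.array([np.median(x), np.log(np.std(x))])
    res = minimize(dpd_objective, start, args=(x, alpha), method="Nelder-Mead")
    mu_hat, log_sigma_hat = res.x
    return mu_hat, np.exp(log_sigma_hat)

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95), np.full(5, 10.0)])  # 5% gross outliers
print("MLE of the mean (non-robust):", x.mean())
print("MDPDE of the mean (robust):  ", mdpde(x, alpha=0.5)[0])

As alpha tends to 0 the density power divergence tends to the Kullback-Leibler divergence and the estimator tends to the maximum likelihood estimator; larger alpha trades some efficiency for robustness, which is the efficiency-versus-robustness trade-off the synopsis emphasises.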


New Developments in Statistical Information Theory Based on Entropy and Divergence Measures Related Books

New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
Language: en
Pages: 344
Authors: Leandro Pardo
Categories: Social Science
Type: BOOK - Published: 2019-05-20 - Publisher: MDPI

This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from a theoretical and applied point of view, for different statistical problems with special emphasis on efficiency and robustness.
Statistical Inference Based on Divergence Measures
Language: en
Pages: 512
Authors: Leandro Pardo
Categories: Mathematics
Type: BOOK - Published: 2018-11-12 - Publisher: CRC Press

The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. …
Concepts and Recent Advances in Generalized Information Measures and Statistics
Language: en
Pages: 432
Authors: Andres M. Kowalski, Raul D. Rossignoli and Evaldo M. F. Curado
Categories: Science
Type: BOOK - Published: 2013-12-13 - Publisher: Bentham Science Publishers

Since the introduction of the information measure widely known as Shannon entropy, quantifiers based on information theory and concepts such as entropic forms …
Information Theory and Statistics
Language: en
Pages: 460
Authors: Solomon Kullback
Categories: Mathematics
Type: BOOK - Published: 2012-09-11 - Publisher: Courier Corporation

Highly useful text studies logarithmic measures of information and their application to testing statistical hypotheses. Includes numerous worked examples and problems.
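
As a hedged illustration of the logarithmic measures of information Kullback studies (not an excerpt from the book), the sketch below computes the Kullback-Leibler divergence between two discrete distributions and shows its familiar link to the likelihood-ratio (G-squared) statistic for a multinomial model; the probabilities and counts are invented for the example.

# Illustrative sketch: Kullback-Leibler divergence and the G^2 statistic.
# The probabilities and counts below are made-up example values.
import numpy as np
from scipy.stats import entropy

p0 = np.array([0.5, 0.3, 0.2])         # distribution specified by the null hypothesis
q = np.array([0.4, 0.4, 0.2])          # an alternative distribution
print("D(p0 || q) =", entropy(p0, q))  # sum_i p0_i * log(p0_i / q_i)

# For multinomial counts, the likelihood-ratio statistic G^2 equals
# 2 * n * D(p_hat || p0), where p_hat is the empirical distribution.
counts = np.array([48, 37, 15])
n = counts.sum()
p_hat = counts / n
print("G^2 =", 2 * n * entropy(p_hat, p0))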