Stochastic Convergence and Probability Inequalities


Detailed Description

Bibliographic Details
Main Authors: Sen, Pranab K., Singer, Julio M., Pedroso de Lima, Antonio C.
Format: Book chapter
Language: English
Subjects:
Online Access: Full text
Description
Summary: Introduction. Unbiasedness, efficiency, sufficiency, and ancillarity, as outlined in Chapters 2 and 3, are essentially finite-sample concepts, whereas consistency refers to indefinitely increasing sample sizes and thus has an asymptotic nature. In general, finite-sample optimality properties of estimators and tests hold essentially for a small class of probability laws, mostly related to the exponential family of distributions; consistency, however, holds under much less restrictive setups, as we will see. Moreover, even when finite-sample optimal statistical procedures exist, they may not admit closed-form expressions and/or may involve a substantial computational burden. These problems are not as bothersome when we adopt an asymptotic point of view and use the corresponding results to obtain good approximations of such procedures for large (although finite) samples. This is accomplished with the incorporation of probability inequalities, limit theorems, and other tools that will be developed in this and subsequent chapters. In this context, a minimal requirement for a good statistical decision rule is its increasing reliability with increasing sample sizes (consistency). For an estimator, consistency relates to an increasing closeness to its population counterpart as the sample size becomes larger. In view of its stochastic nature, this closeness needs to account for the fluctuation of the estimator around the parameter it estimates and thus requires an appropriate adaptation of the definitions usually considered in nonstochastic setups. Generally, a distance function or norm of this stochastic fluctuation is incorporated in the formulation of this closeness, and consistency refers to the convergence of this norm to 0 in some well-defined manner.
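As an editorial illustration of the notion described in the summary (not text from the chapter itself), the standard definition of weak consistency, i.e., convergence in probability, can be written as follows; the symbols $T_n$, $\theta$, and $\varepsilon$ are generic placeholders:

$$
T_n \xrightarrow{\;P\;} \theta
\quad\Longleftrightarrow\quad
\lim_{n\to\infty} P\big(|T_n - \theta| > \varepsilon\big) = 0
\quad\text{for every } \varepsilon > 0,
$$

where $T_n$ denotes an estimator computed from a sample of size $n$ and $\theta$ is the parameter it estimates; the absolute value $|T_n - \theta|$ plays the role of the distance function or norm of the stochastic fluctuation mentioned above.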
DOI:10.1017/CBO9780511806957.007