Linear Spaces and Unbiased Estimation--Application to the Mixed Linear Model
Published in: The Annals of Mathematical Statistics, 1970-10, Vol. 41 (5), pp. 1735-1748
Format: Article
Language: English
Online Access: Full text
Abstract: Exemplification of the theory developed in [9] using a linear space of random variables other than linear combinations of the components of a random vector, and unbiased estimation for the parameters of a mixed linear model using quadratic estimators, are the primary reasons for the considerations in this paper. For a random vector $Y$ with expectation $X\beta$ and covariance matrix $\sum_i \nu_i V_i$ (where $\nu_1, \cdots, \nu_m$ and $\beta$ denote the parameters), interest centers upon quadratic estimability for parametric functions of the form $\sum_{i \le j} \gamma_{ij}\beta_i\beta_j + \sum_k \gamma_k\nu_k$ and upon procedures for obtaining quadratic estimators of such parametric functions. Special emphasis is given to parametric functions of the form $\sum_k \gamma_k\nu_k$; unbiased estimation of variance components is the main reason for quadratic estimability considerations regarding parametric functions of this form. Concerning variance component models, Airy, in 1861 (Scheffé [6]), appears to have been the first to introduce a model with more than one source of variation. Such a model is also implied (Scheffé [6]) by Chauvenet in 1863. Fisher [1], [2] reintroduced variance component models and discussed, apparently for the first time, unbiased estimation in such models. Since Fisher's introduction and discussion of unbiased estimation in models with more than one source of variation, a considerable literature has been published on the subject. One of these papers is a description by Henderson [5] which popularized three methods (now known as Henderson's Methods I, II, and III) for obtaining unbiased estimates of variance components. We mention these methods since they seem to be commonly used in the estimation of variance components. For a review as well as a matrix formulation of the methods, see Searle [7]. Among the several pieces of work which have dealt with Henderson's methods, only that of Harville [4] seems to have been concerned with the consistency of the equations leading to the estimators and with the existence of unbiased (quadratic) estimators under various conditions. Harville, however, treats only a completely random two-way classification model with interaction. One other result dealing with the existence of unbiased quadratic estimators in a completely random model is given by Graybill and Hultquist [3]. In Section 2 the form we assume for a mixed linear model is introduced and the pertinent quantities needed for the application of the results in [9] are obtained. Definitions, terminology, and notation are consistent with the usage in [9].
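The quadratic estimability conditions referred to in the abstract can be made concrete with a standard identity for quadratic forms; the following sketch is not quoted from the paper itself, but follows directly from the model as stated, taking $Y'AY$ with $A$ symmetric as a generic quadratic estimator:
\[
E(Y'AY) \;=\; \beta' X'AX\,\beta \;+\; \sum_{i=1}^{m} \nu_i \operatorname{tr}(A V_i).
\]
Consequently $Y'AY$ is an unbiased estimator of $\sum_k \gamma_k \nu_k$ precisely when
\[
X'AX = 0 \quad\text{and}\quad \operatorname{tr}(A V_k) = \gamma_k, \qquad k = 1, \cdots, m,
\]
and, more generally, it is unbiased for $\sum_{i \le j} \gamma_{ij}\beta_i\beta_j + \sum_k \gamma_k \nu_k$ when $\beta' X'AX\,\beta = \sum_{i \le j} \gamma_{ij}\beta_i\beta_j$ for all $\beta$ and $\operatorname{tr}(A V_k) = \gamma_k$ for each $k$.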
ISSN: 0003-4851, 2168-8990
DOI: 10.1214/aoms/1177696818