Parallel restricted maximum likelihood estimation for linear models with a dense exogenous matrix

Bibliographic Details
Published in: Parallel Computing 2002-02, Vol. 28 (2), p. 343-353
Author: Malard, Joël M.
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Abstract: Restricted maximum likelihood (REML) estimation of variance–covariance matrices is an optimization problem that has both scientific and industrial applications. Parallel REML gradient algorithms are presented and compared for linear models whose covariance matrix is large, sparse and possibly unstructured. These algorithms are implemented using publicly available toolkits and demonstrate that REML estimates of large, sparse covariance matrices can be computed efficiently on multicomputers with hundreds of processors by using an effective mixture of data distributions together with a mixture of dense and sparse linear algebra kernels.
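
For context only (standard notation, not taken from the article itself): the REML optimization problem the abstract refers to is commonly written, for a linear model y = Xβ + e with Var(e) = V(θ), as maximizing the restricted log-likelihood

\ell_R(\theta) = -\tfrac{1}{2}\Big(\log\lvert V(\theta)\rvert + \log\lvert X^{\top} V(\theta)^{-1} X\rvert + y^{\top} P(\theta)\, y\Big) + \text{const},
\qquad P(\theta) = V^{-1} - V^{-1} X \big(X^{\top} V^{-1} X\big)^{-1} X^{\top} V^{-1},

whose gradient with respect to each covariance parameter \theta_i is

\frac{\partial \ell_R}{\partial \theta_i} = -\tfrac{1}{2}\Big(\operatorname{tr}\!\big(P\,\tfrac{\partial V}{\partial \theta_i}\big) - y^{\top} P\,\tfrac{\partial V}{\partial \theta_i}\, P\, y\Big).

In this standard formulation, X corresponds to the dense exogenous matrix named in the title and V(θ) to the large, sparse covariance matrix being estimated; evaluating the trace and quadratic-form terms is presumably where the mixture of dense and sparse linear algebra kernels mentioned in the abstract is applied.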
ISSN: 0167-8191
1872-7336
DOI: 10.1016/S0167-8191(01)00143-0