Parallel restricted maximum likelihood estimation for linear models with a dense exogenous matrix
Restricted maximum likelihood (REML) estimation of variance–covariance matrices is an optimization problem that has both scientific and industrial applications. Parallel REML gradient algorithms are presented and compared for linear models whose covariance matrix is large, sparse and possibly unstructured.
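As general background to the abstract (standard notation, not drawn from this record): for a linear model $y = X\beta + e$ with dense exogenous matrix $X$ and $\mathrm{Var}(y) = V(\theta)$, the REML objective maximized over the variance parameters $\theta$ can be written, up to an additive constant, as

$$
\ell_R(\theta) = -\tfrac{1}{2}\left[\,\log\lvert V(\theta)\rvert \;+\; \log\lvert X^{\top} V(\theta)^{-1} X\rvert \;+\; y^{\top} P(\theta)\, y\,\right],
\qquad
P(\theta) = V^{-1} - V^{-1} X \left(X^{\top} V^{-1} X\right)^{-1} X^{\top} V^{-1}.
$$

Gradient-based REML algorithms of the kind described in the abstract iterate on $\theta$ using derivatives of $\ell_R$; the specific parallel schemes compared in the article are not reproduced here.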
Published in: Parallel Computing 2002-02, Vol. 28 (2), p. 343-353
Format: Article
Language: English
Online access: Full text